Incremental comprehension of spoken quantifier sentences: Evidence from brain potentials.
Freunberger, Dominik; Nieuwland, Mante S
2016-09-01
Do people incrementally incorporate the meaning of quantifier expressions to understand an unfolding sentence? Most previous studies concluded that quantifiers do not immediately influence how a sentence is understood, based on the observation that online N400 effects differed from offline plausibility judgments. Those studies, however, used serial visual presentation (SVP), which involves unnatural reading. In the current ERP experiment, we presented spoken positive and negative quantifier sentences ("Practically all/practically no postmen prefer delivering mail, when the weather is good/bad during the day"). Different from results obtained in a previously reported SVP study (Nieuwland, 2016), sentence truth-value N400 effects occurred in positive and negative quantifier sentences alike, reflecting fully incremental quantifier comprehension. This suggests that the prosodic information available during spoken language comprehension supports the generation of online predictions for upcoming words and that, at least for quantifier sentences, comprehension of spoken language may proceed more incrementally than comprehension during SVP reading. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Space station needs, attributes, and architectural options study. Volume 3: Cost and benefits
NASA Technical Reports Server (NTRS)
1983-01-01
Cost and schedule, cost/incremental capability, and schedule impact analyses are presented. Quantifiable benefits, space technology, non-quantifiable benefits, and space commercialization are addressed.
When three is not some: on the pragmatics of numerals.
Shetreet, Einat; Chierchia, Gennaro; Gaab, Nadine
2014-04-01
Both numerals and quantifiers (like some) have more than one possible interpretation (i.e., weak and strong interpretations). Some studies have found similar behavior for numerals and quantifiers, whereas others have shown critical differences. It is, therefore, debated whether they are processed in the same way. A previous fMRI investigation showed that the left inferior frontal gyrus is linked to the computation of the strong interpretation of quantifiers (derived by a scalar implicature) and that the left middle frontal gyrus and the medial frontal gyrus are linked to processing the mismatch between the strong interpretation of quantifiers and the context in which they are presented. In the current study, we attempted to characterize the similarities and differences between numbers and quantifiers by examining brain activation patterns related to the processing of numerals in these brain regions. When numbers were presented in a mismatch context (i.e., where their strong interpretation did not match the context), they elicited brain activations similar to those previously observed with quantifiers in the same context type. Conversely, in a match context (i.e., where both interpretations of the scalar item matched the context), numbers elicited a different activation pattern than the one observed with quantifiers: Left inferior frontal gyrus activations in response to the match condition showed decrease for numbers (but not for quantifiers). Our results support previous findings suggesting that, although they share some features, numbers and quantifiers are processed differently. We discuss our results in light of various theoretical approaches linked to the representation of numerals.
Energetic arousal and language: predictions from the computational theory of quantifiers processing.
Zajenkowski, Marcin
2013-10-01
The author examines the relationship between energetic arousal (EA) and the processing of sentences containing natural-language quantifiers. Previous studies and theories have shown that energy may differentially affect various cognitive functions. Recent investigations devoted to quantifiers strongly support the theory that various types of quantifiers involve different cognitive functions in the sentence-picture verification task. In the present study, 201 students were presented with a sentence-picture verification task consisting of simple propositions containing a quantifier that referred to the color of a car on display. Color pictures of cars accompanied the propositions. In addition, the level of participants' EA was measured before and after the verification task. It was found that EA and performance on proportional quantifiers (e.g., "More than half of the cars are red") are in an inverted U-shaped relationship. This result may be explained by the fact that proportional sentences engage working memory to a high degree, and previous models of EA-cognition associations have been based on the assumption that tasks that require parallel attentional and memory processes are best performed when energy is moderate. The research described in the present article has several applications, as it shows the optimal human conditions for verbal comprehension. For instance, it may be important in workplace design to control the level of arousal experienced by office staff when work is mostly related to the processing of complex texts. Energy level may be influenced by many factors, such as noise, time of day, or thermal conditions.
Behavioral Inhibition and Risk for Developing Social Anxiety Disorder: A Meta-Analytic Study
ERIC Educational Resources Information Center
Clauss, Jacqueline A.; Blackford, Jennifer Urbano
2012-01-01
Objective: Behavioral inhibition (BI) has been associated with increased risk for developing social anxiety disorder (SAD); however, the degree of risk associated with BI has yet to be systematically examined and quantified. The goal of the present study was to quantify the association between childhood BI and risk for developing SAD. Method: A…
Quantifying the high-velocity, low-amplitude spinal manipulative thrust: a systematic review.
Downie, Aron S; Vemulpad, Subramanyam; Bull, Peter W
2010-09-01
The purpose of this study was to systematically review studies that quantify the high-velocity, low-amplitude (HVLA) spinal thrust, to qualitatively compare the apparatus used and the force-time profiles generated, and to critically appraise studies involving the quantification of thrust as an augmented feedback tool in psychomotor learning. A search of the literature was conducted to identify the sources that reported quantification of the HVLA spinal thrust. MEDLINE-OVID (1966-present), MANTIS-OVID (1950-present), and CINAHL-EBSCO host (1981-present) were searched. Eligibility criteria included that thrust subjects were human, animal, or manikin and that the thrust type was a hand-delivered HVLA spinal thrust. Data recorded were single force, force-time, or displacement-time histories. Publications were in English language and after 1980. The relatively small number of studies, combined with the diversity of method and data interpretation, did not enable meta-analysis. Twenty-seven studies met eligibility criteria: 17 studies measured thrust as a primary outcome (13 human, 2 cadaver, and 2 porcine). Ten studies demonstrated changes in psychomotor learning related to quantified thrust data on human, manikin, or other device. Quantifiable parameters of the HVLA spinal thrust exist and have been described. There remain a number of variables in recording that prevent a standardized kinematic description of HVLA spinal manipulative therapy. Despite differences in data between studies, a relationship between preload, peak force, and thrust duration was evident. Psychomotor learning outcomes were enhanced by the application of thrust data as an augmented feedback tool. Copyright © 2010 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.
Temporal Coherence: A Model for Non-Stationarity in Natural and Simulated Wind Records
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rinker, Jennifer M.; Gavin, Henri P.; Clifton, Andrew
We present a novel methodology for characterizing and simulating non-stationary stochastic wind records. In this new method, non-stationarity is characterized and modelled via temporal coherence, which is quantified in the discrete frequency domain by probability distributions of the differences in phase between adjacent Fourier components. Temporal coherence can also be used to quantify non-stationary characteristics in wind data. Three case studies are presented that analyze the non-stationarity of turbulent wind data obtained at the National Wind Technology Center near Boulder, Colorado, USA. The first study compares the temporal and spectral characteristics of a stationary wind record and a non-stationary wind record in order to highlight their differences in temporal coherence. The second study examines the distribution of one of the proposed temporal coherence parameters and uses it to quantify the prevalence of non-stationarity in the dataset. The third study examines how temporal coherence varies with a range of atmospheric parameters to determine what conditions produce more non-stationarity.
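As an illustration of the phase-difference statistic described above, the following minimal sketch computes the differences in phase between adjacent Fourier components of a synthetic wind record and summarizes their concentration; the sampling rate, the surrogate record, and the use of the mean resultant length as a summary are illustrative assumptions, not the parameters of the study.

```python
import numpy as np

def adjacent_phase_differences(u):
    """Phase differences between adjacent Fourier components of a wind record."""
    U = np.fft.rfft(u - np.mean(u))          # one-sided spectrum of the fluctuation
    phase = np.angle(U[1:])                  # drop the zero-frequency component
    dphi = np.diff(phase)                    # difference between adjacent components
    # wrap to (-pi, pi] so the distribution is well defined
    return (dphi + np.pi) % (2.0 * np.pi) - np.pi

# Example: a synthetic 10-minute record sampled at 20 Hz (stationary surrogate)
rng = np.random.default_rng(0)
t = np.arange(0, 600, 0.05)
u = 8.0 + 1.5 * rng.standard_normal(t.size)

dphi = adjacent_phase_differences(u)
# A broad (near-uniform) distribution of dphi suggests a stationary record;
# concentration of dphi indicates temporal coherence (non-stationarity).
R = np.abs(np.mean(np.exp(1j * dphi)))       # mean resultant length as a concentration measure
print(f"concentration of adjacent phase differences R = {R:.3f}")
```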
Scope Interpretation in First and Second Language Acquisition: Numeral Quantifiers and Negation
ERIC Educational Resources Information Center
Kwak, Hye-Young
2010-01-01
The present study investigates the interpretation of scopally ambiguous sentences containing a numeral quantifier and negation, such as (1) and (2), with a view to examining the interpretive preferences for Korean manifested by Korean-speaking children and adults, and the interpretive preferences for English manifested by Korean-speaking second…
Evaluation of equipment and methods to map lost circulation zones in geothermal wells
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonald, W.J.; Leon, P.A.; Pittard, G.
A study and evaluation of methods to locate, characterize, and quantify lost circulation zones are described. Twenty-five methods of mapping and quantifying lost circulation zones were evaluated, including electrical, acoustical, mechanical, radioactive, and optical systems. Each tool studied is described. The structured, numerical evaluation plan used as the basis for comparing the 25 tools, and the resulting ranking of the tools, are presented.
The effect of vortex formation on left ventricular filling and mitral valve efficiency.
Pierrakos, Olga; Vlachos, Pavlos P
2006-08-01
A new mechanism for quantifying the filling energetics in the left ventricle (LV) and past mechanical heart valves (MHV) is identified and presented. This mechanism is attributed to vortex formation dynamics past MHV leaflets. Recent studies support the conjecture that the natural healthy LV performs in an optimum, energy-preserving manner by redirecting the flow with high efficiency, yet to date no quantitative proof has been presented. In support of this hypothesis, the present work provides quantitative results and validation of a theory based on the dynamics of vortex ring formation, which is governed by a critical formation number (FN), the dimensionless time at which the vortex ring has reached its maximum circulation content. Herein, several parameters (vortex ring circulation, vortex ring energy, critical FN, hydrodynamic efficiencies, vortex ring propagation speed) have been quantified and presented as a means of connecting the physics of vortex formation to filling in the LV. The diastolic hydrodynamic efficiencies were found to be 60, 41, and 29%, respectively, for the porcine, anti-anatomical, and anatomical valve configurations. This assessment provides quantitative evidence that vortex formation, which depends on valve design and orientation, is an important flow characteristic associated with LV energetics. Time-resolved digital particle image velocimetry with kilohertz sampling rate was used to study the ejection of fluid into the LV and resolve the spatiotemporal evolution of the flow. The clinical significance of this study lies in quantifying vortex formation and the critical FN, which can potentially serve as a parameter to quantify the LV filling process and the performance of heart valves.
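The critical formation number referred to above can be illustrated with a short calculation of the dimensionless formation time from an ejection-velocity program. This is a hedged sketch: the half-sine filling pulse, the orifice diameter, and the commonly cited critical value of about 4 are illustrative assumptions, not the measured data or thresholds of this study.

```python
import numpy as np

def formation_time(u_jet, t, d_orifice):
    """Dimensionless formation time T* = (1/D) * integral of the ejection velocity."""
    dt = np.diff(t)
    u_mid = 0.5 * (u_jet[1:] + u_jet[:-1])   # trapezoidal integration
    return np.sum(u_mid * dt) / d_orifice

# Illustrative early-diastolic filling pulse (half-sine), not measured data
t = np.linspace(0.0, 0.25, 500)              # s, duration of the filling wave
u_jet = 0.6 * np.sin(np.pi * t / 0.25)       # m/s, transmitral jet velocity
d_mv = 0.025                                 # m, effective mitral orifice diameter

T_star = formation_time(u_jet, t, d_mv)
FN_CRITICAL = 4.0                            # commonly cited critical formation number
print(f"formation time T* = {T_star:.2f}")
if T_star > FN_CRITICAL:
    print("vortex ring pinches off before filling ends; a trailing jet carries the excess circulation")
else:
    print("all ejected fluid can be entrained into a single growing vortex ring")
```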
Huijbregts, Mark A J; Gilijamse, Wim; Ragas, Ad M J; Reijnders, Lucas
2003-06-01
The evaluation of uncertainty is relatively new in environmental life-cycle assessment (LCA). It provides useful information to assess the reliability of LCA-based decisions and to guide future research toward reducing uncertainty. Most uncertainty studies in LCA quantify only one type of uncertainty, i.e., uncertainty due to input data (parameter uncertainty). However, LCA outcomes can also be uncertain due to normative choices (scenario uncertainty) and the mathematical models involved (model uncertainty). The present paper outlines a new methodology that quantifies parameter, scenario, and model uncertainty simultaneously in environmental life-cycle assessment. The procedure is illustrated in a case study that compares two insulation options for a Dutch one-family dwelling. Parameter uncertainty was quantified by means of Monte Carlo simulation. Scenario and model uncertainty were quantified by resampling different decision scenarios and model formulations, respectively. Although scenario and model uncertainty were not quantified comprehensively, the results indicate that both types of uncertainty influence the case study outcomes. This stresses the importance of quantifying parameter, scenario, and model uncertainty simultaneously. The two insulation options studied were found to have significantly different impact scores for global warming, stratospheric ozone depletion, and eutrophication. The thickest insulation option has the lowest impact on global warming and eutrophication, and the highest impact on stratospheric ozone depletion.
Quantifying the Thermal Fatigue of CPV Modules
NASA Astrophysics Data System (ADS)
Bosco, Nick; Kurtz, Sarah
2010-10-01
A method is presented to quantify thermal fatigue in the CPV die-attach from meteorological data. A comparative study between cities demonstrates a significant difference in the accumulated damage. These differences are most sensitive to the number of larger (ΔT) thermal cycles experienced for a location. High frequency data (<1/min) may be required to most accurately employ this method.
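A rough sketch of the general idea, counting temperature swings in a meteorological record and weighting each by a power of its range, is shown below; the crude peak-to-peak cycle counting and the Coffin-Manson-type exponent are illustrative stand-ins for the authors' actual damage model, not the method itself.

```python
import numpy as np

def cycle_ranges(cell_temp):
    """Peak-to-peak temperature swings between successive local extrema (a crude
    stand-in for full rainflow counting)."""
    t = np.asarray(cell_temp, dtype=float)
    sign = np.sign(np.diff(t))
    turning = np.where(np.diff(sign) != 0)[0] + 1      # indices of local extrema
    extrema = np.concatenate(([t[0]], t[turning], [t[-1]]))
    return np.abs(np.diff(extrema))                    # one Delta-T per half-cycle

def accumulated_damage(cell_temp, exponent=2.5):
    """Relative die-attach damage: sum of (Delta-T)^n over counted cycles.
    The exponent is an illustrative Coffin-Manson-type value, not a fitted one."""
    dT = cycle_ranges(cell_temp)
    return np.sum(dT ** exponent)

# Illustrative one-day cell temperature traces for two sites (deg C)
hours = np.arange(0, 24, 0.25)
site_a = 25 + 30 * np.clip(np.sin(np.pi * (hours - 6) / 12), 0, None)        # clear-sky day
site_b = site_a - 15 * (np.random.default_rng(1).random(hours.size) > 0.8)   # passing clouds

print("relative damage, site A:", accumulated_damage(site_a))
print("relative damage, site B:", accumulated_damage(site_b))
```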
O'Connor, B.L.; Hondzo, Miki; Harvey, J.W.
2009-01-01
Traditionally, dissolved oxygen (DO) fluxes have been calculated using the thin-film theory with DO microstructure data in systems characterized by fine sediments and low velocities. However, recent experimental evidence of fluctuating DO concentrations near the sediment-water interface suggests that turbulence and coherent motions control the mass transfer, and the surface renewal theory gives a more mechanistic model for quantifying fluxes. Both models involve quantifying the mass transfer coefficient (k) and the relevant concentration difference (ΔC). This study compared several empirical models for quantifying k based on both thin-film and surface renewal theories, as well as presents a new method for quantifying ΔC (dynamic approach) that is consistent with the observed DO concentration fluctuations near the interface. Data were used from a series of flume experiments that includes both physical and kinetic uptake limitations of the flux. Results indicated that methods for quantifying k and ΔC using the surface renewal theory better estimated the DO flux across a range of fluid-flow conditions. © 2009 ASCE.
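The two mass-transfer models contrasted above both reduce to a flux of the form J = k·ΔC; a minimal sketch comparing the resulting fluxes is given below, where the diffusive sublayer thickness, renewal time, and concentration difference are illustrative values rather than the flume measurements.

```python
# Dissolved-oxygen flux across the sediment-water interface, J = k * dC,
# with k estimated from either the thin-film or the surface-renewal theory.
import math

D = 2.1e-9          # m^2/s, molecular diffusivity of O2 in water (~20 C)

def k_thin_film(delta):
    """Thin-film theory: k = D / delta, with delta the diffusive sublayer thickness."""
    return D / delta

def k_surface_renewal(t_renewal):
    """Surface-renewal theory (Higbie/Danckwerts form): k ~ sqrt(D / t_r)."""
    return math.sqrt(D / t_renewal)

# Illustrative values only
delta = 0.8e-3                    # m, sublayer thickness for a low-velocity flow
t_r = 5.0                         # s, renewal time set by near-bed turbulence
dC = 3.0                          # g/m^3 (mg/L), O2 difference across the interface

for name, k in [("thin film", k_thin_film(delta)),
                ("surface renewal", k_surface_renewal(t_r))]:
    J = k * dC                    # g m^-2 s^-1
    print(f"{name:15s}: k = {k:.2e} m/s, flux = {J*86400:.2f} g m^-2 d^-1")
```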
How the brain learns how few are “many”: An fMRI study of the flexibility of quantifier semantics
Heim, Stefan; McMillan, Corey T.; Clark, Robin; Baehr, Laura; Ternes, Kylie; Olm, Christopher; Min, Nam Eun; Grossman, Murray
2015-01-01
Previous work has shown that the meaning of a quantifier such as “many” or “few” depends in part on quantity. However, the meaning of a quantifier may vary depending on the context, e.g. in the case of common entities such as “many ants” (perhaps several thousands) compared to endangered species such as “many pandas” (perhaps a dozen). In a recent study (Heim et al. 2015 Front. Psychol.) we demonstrated that the relative meaning of “many” and “few” may be changed experimentally. In a truth value judgment task, displays with 40% of circles in a named color initially had a low probability of being labeled “many”. After a training phase, the likelihood of accepting 40% as “many” increased. Moreover, the semantic learning effect also generalized to the related quantifier “few”, which had not been mentioned in the training phase. Thus, fewer 40% arrays were considered “few.” In the present study, we tested the hypothesis that this semantic adaptation effect was supported by cytoarchitectonic Brodmann area (BA) 45 in Broca’s region, which may contribute to semantic evaluation in the context of language and quantification. In an event-related fMRI study, 17 healthy volunteers performed the same paradigm as in the previous behavioral study. We found a relative signal increase when comparing the critical, trained proportion to untrained proportions. This specific effect was found in left BA 45 for the trained quantifier “many”, and in left BA 44 for both quantifiers, reflecting the semantic adjustment for the untrained but related quantifier “few.” These findings demonstrate the neural basis for processing the flexible meaning of a quantifier, and illustrate the neuroanatomical structures that contribute to variable meanings that can be associated with a word when used in different contexts. PMID:26481678
Martin, Bryn A.; Kalata, Wojciech; Shaffer, Nicholas; Fischer, Paul; Luciano, Mark; Loth, Francis
2013-01-01
Elevated or reduced velocity of cerebrospinal fluid (CSF) at the craniovertebral junction (CVJ) has been associated with type I Chiari malformation (CMI). Thus, quantification of hydrodynamic parameters that describe the CSF dynamics could help assess disease severity and surgical outcome. In this study, we describe the methodology to quantify CSF hydrodynamic parameters near the CVJ and upper cervical spine utilizing subject-specific computational fluid dynamics (CFD) simulations based on in vivo MRI measurements of flow and geometry. Hydrodynamic parameters were computed for a healthy subject and two CMI patients both pre- and post-decompression surgery to determine the differences between cases. For the first time, we present the methods to quantify longitudinal impedance (LI) to CSF motion, a subject-specific hydrodynamic parameter that may have value to help quantify the CSF flow blockage severity in CMI. In addition, the following hydrodynamic parameters were quantified for each case: maximum velocity in systole and diastole, Reynolds and Womersley number, and peak pressure drop during the CSF cardiac flow cycle. The following geometric parameters were quantified: cross-sectional area and hydraulic diameter of the spinal subarachnoid space (SAS). The mean values of the geometric parameters increased post-surgically for the CMI models, but remained smaller than the healthy volunteer. All hydrodynamic parameters, except pressure drop, decreased post-surgically for the CMI patients, but remained greater than in the healthy case. Peak pressure drop alterations were mixed. To our knowledge this study represents the first subject-specific CFD simulation of CMI decompression surgery and quantification of LI in the CSF space. Further study in a larger patient and control group is needed to determine if the presented geometric and/or hydrodynamic parameters are helpful for surgical planning. PMID:24130704
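Two of the hydrodynamic parameters listed above can be computed directly from MRI-derived quantities; the sketch below shows the Reynolds and Womersley numbers obtained from a hydraulic diameter, peak velocity, and cardiac frequency, with all numerical values illustrative rather than patient data.

```python
import math

# CSF properties (approximately those of water at body temperature)
rho = 1000.0        # kg/m^3
mu = 0.7e-3         # Pa*s
nu = mu / rho       # kinematic viscosity, m^2/s

def hydraulic_diameter(area, perimeter):
    """D_h = 4A / P for the annular spinal subarachnoid space cross-section."""
    return 4.0 * area / perimeter

def reynolds(u_peak, d_h):
    return u_peak * d_h / nu

def womersley(d_h, heart_rate_hz):
    omega = 2.0 * math.pi * heart_rate_hz
    return (d_h / 2.0) * math.sqrt(omega / nu)

# Illustrative values for one axial slice (not the patients' data)
area = 1.5e-4            # m^2, SAS cross-sectional area
perimeter = 0.12         # m, wetted perimeter of the SAS
u_peak = 0.03            # m/s, peak systolic CSF velocity
hr = 1.2                 # Hz, cardiac frequency (72 bpm)

d_h = hydraulic_diameter(area, perimeter)
print(f"hydraulic diameter = {d_h*1000:.1f} mm")
print(f"peak Reynolds number = {reynolds(u_peak, d_h):.0f}")
print(f"Womersley number     = {womersley(d_h, hr):.1f}")
```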
Quantifying VOC emissions from polymers: A case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schulze, J.K.; Qasem, J.S.; Snoddy, R.
1996-12-31
Evaluating residual volatile organic compound emissions emanating from low-density polyethylene can pose significant challenges. These challenges include quantifying emissions from: (a) multiple process lines with different operating conditions; (b) several different comonomers; (c) variations of comonomer content in each grade; and (d) over 120 grades of LDPE. This presentation is a Case Study outlining a project to develop grade-specific emission data for low-density polyethylene pellets. This study included extensive laboratory analyses and required the development of a relational database to compile analytical results, calculate the mean concentration and standard deviation, and generate emissions reports.
Stanko, Jason P; Easterling, Michael R; Fenton, Suzanne E
2015-07-01
Studies that utilize the rodent mammary gland (MG) as an endpoint for assessing the developmental toxicity of chemical exposures typically employ either basic dimensional measurements or developmental scoring of morphological characteristics as a means to quantify MG development. There are numerous means by which to report these developmental changes, leading to inconsistent translation across laboratories. The Sholl analysis is a method historically used for quantifying neuronal dendritic patterns. The present study describes the use of the Sholl analysis to quantify MG branching characteristics. Using this method, we were able to detect significant differences in branching density in MG of peripubertal female Sprague Dawley rats that had been exposed to vehicle or a potent estrogen. These data suggest the Sholl analysis can be an effective tool for quantitatively measuring an important characteristic of MG development and for examining associations between MG growth and density and adverse effects in the breast. Published by Elsevier Inc.
This presentation, Particle-Resolved Simulations for Quantifying Black Carbon Climate Impact and Model Uncertainty, was given at the STAR Black Carbon 2016 Webinar Series: Changing Chemistry over Time held on Oct. 31, 2016.
Aircraft Radiation Shield Experiments--Preflight Laboratory Testing
NASA Technical Reports Server (NTRS)
Singleterry, Robert C., Jr.; Shinn, Judy L.; Wilson, John W.; Maiden, Donald L.; Thibeault, Sheila A.; Badavi, Francis F.; Conroy, Thomas; Braby, Leslie
1999-01-01
In the past, measurements onboard a research Boeing 57F (RB57-F) aircraft have demonstrated that the neutron environment within the aircraft structure is greater than that in the local external environment. Recent studies onboard Boeing 737 commercial flights have demonstrated cabin variations in radiation exposure up to 30 percent. These prior results were the basis of the present study to quantify the potential effects of aircraft construction materials on the internal exposures of the crew and passengers. The present study constitutes preflight measurements using an unmoderated Cf-252 fission neutron source to quantify the effects of three current and potential aircraft materials (aluminum, titanium, and graphite-epoxy composite) on the fast neutron flux. Conclusions about the effectiveness of the three selected materials for radiation shielding must wait until testing in the atmosphere is complete; however, it is clear that for shielding low-energy neutrons, the composite material is an improved shielding material over aluminum or titanium.
Quantifying real-gas effects on a laminar n-dodecane - air premixed flame
NASA Astrophysics Data System (ADS)
Gopal, Abishek; Yellapantula, Shashank; Larsson, Johan
2015-11-01
With the increasing demand for higher efficiencies in aircraft gas-turbine engines, there has been a progressive march towards high pressure-ratio cycles. Under these conditions, the aviation fuel, Jet A, is injected into the combustor at supercritical pressures. In this work, we study and quantify the effects of transcriticality on a 1D freely propagating laminar n-dodecane-air premixed flame. The impact of the constitutive state relations arising from the ideal-gas equation of state (EOS) and the Peng-Robinson EOS on flame structure and propagation is presented. The effects of real-gas models of transport properties, such as viscosity, on laminar flame speed are also presented.
Quifer-Rada, Paola; Martínez-Huélamo, Miriam; Lamuela-Raventos, Rosa M
2017-07-19
Phenolic compounds are present in human fluids (plasma and urine) mainly as glucuronidated and sulfated metabolites. Up to now, due to the unavailability of standards, enzymatic hydrolysis has been the method of choice in analytical chemistry to quantify these phase II phenolic metabolites. Enzymatic hydrolysis procedures vary in enzyme concentration, pH and temperature; however, there is a lack of knowledge about the stability of polyphenols in their free form during the process. In this study, we evaluated the stability of 7 phenolic acids, 2 flavonoids and 3 prenylflavanoids in urine during enzymatic hydrolysis to assess the suitability of this analytical procedure, using three different concentrations of β-glucuronidase/sulfatase enzymes from Helix pomatia. The results indicate that enzymatic hydrolysis negatively affected the recovery of the precursor and free-form polyphenols present in the sample. Thus, enzymatic hydrolysis does not seem an ideal analytical strategy to quantify glucuronidated and sulfated polyphenol metabolites.
Chemical evolution via beta decay: a case study in strontium-90
NASA Astrophysics Data System (ADS)
Marks, N. A.; Carter, D. J.; Sassi, M.; Rohl, A. L.; Sickafus, K. E.; Uberuaga, B. P.; Stanek, C. R.
2013-02-01
Using 90Sr as a representative isotope, we present a framework for understanding beta decay within the solid state. We quantify three key physical and chemical principles, namely momentum-induced recoil during the decay event, defect creation due to physical displacement, and chemical evolution over time. A fourth effect, that of electronic excitation, is also discussed, but this is difficult to quantify and is strongly material dependent. The analysis is presented for the specific cases of SrTiO3 and SrH2. By comparing the recoil energy with available threshold displacement data we show that in many beta-decay situations defects such as Frenkel pairs will not be created during decay as the energy transfer is too low. This observation leads to the concept of chemical evolution over time, which we quantify using density functional theory. Using a combination of Bader analysis, phonon calculations and cohesive energy calculations, we show that beta decay leads to counter-intuitive behavior that has implications for nuclear waste storage and novel materials design.
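The recoil-versus-threshold comparison described above can be made concrete with the maximum recoil energy from momentum conservation, E_R(max) = (Q² + 2Q·m_e c²)/(2Mc²); the sketch below uses nominal literature values for the 90Sr Q-value and a representative displacement threshold, quoted for illustration only.

```python
# Maximum nuclear recoil energy for beta decay of Sr-90, compared with typical
# threshold displacement energies. Momentum conservation with a relativistic
# electron gives E_R(max) = (Q^2 + 2*Q*m_e*c^2) / (2*M*c^2).
Q = 0.546              # MeV, beta endpoint of Sr-90 -> Y-90 (nominal literature value)
me_c2 = 0.511          # MeV, electron rest energy
M_c2 = 89.9 * 931.494  # MeV, rest energy of the A = 90 daughter nucleus (nominal)

E_recoil_max = (Q**2 + 2.0 * Q * me_c2) / (2.0 * M_c2) * 1.0e6   # eV
print(f"maximum recoil energy: {E_recoil_max:.1f} eV")

# Typical threshold displacement energies in oxides and hydrides are tens of eV,
# so most decays deposit too little recoil energy to create a Frenkel pair.
E_threshold = 25.0     # eV, illustrative order of magnitude
print("recoil below displacement threshold:", E_recoil_max < E_threshold)
```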
Global Persistent Attack: A Systems Architecture, Process Modeling, and Risk Analysis Approach
2008-06-01
develop an analysis process for quantifying risk associated with the limitations presented by a fiscally constrained environment. The second step...previous independent analysis of each force structure provided information for quantifying risk associated with the given force presentations, the
Modeling Soot Oxidation and Gasification with Bayesian Statistics
Josephson, Alexander J.; Gaffin, Neal D.; Smith, Sean T.; ...
2017-08-22
This paper presents a statistical method for model calibration using data collected from literature. The method is used to calibrate parameters for global models of soot consumption in combustion systems. This consumption is broken into two different submodels: first for oxidation where soot particles are attacked by certain oxidizing agents; second for gasification where soot particles are attacked by H2O or CO2 molecules. Rate data were collected from 19 studies in the literature and evaluated using Bayesian statistics to calibrate the model parameters. Bayesian statistics are valued in their ability to quantify uncertainty in modeling. The calibrated consumption model with quantified uncertainty is presented here along with a discussion of associated implications. The oxidation results are found to be consistent with previous studies. Significant variation is found in the CO2 gasification rates.
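A minimal sketch of this kind of Bayesian calibration is shown below, fitting an Arrhenius-type rate expression to synthetic rate data with a random-walk Metropolis sampler; the synthetic data, flat priors, and likelihood width are illustrative assumptions, not the paper's dataset or statistical model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "literature" rate data: log k = log A - Ea/(R*T) + noise
Ea_true, logA_true = 1.64e5, 18.0           # J/mol and ln(pre-exponential), illustrative
R = 8.314
T = rng.uniform(1200.0, 1800.0, size=30)    # K
logk_obs = logA_true - Ea_true / (R * T) + rng.normal(0.0, 0.3, size=T.size)

def log_posterior(theta):
    logA, Ea = theta
    if not (0.0 < logA < 40.0 and 5.0e4 < Ea < 3.0e5):   # flat priors on a broad box
        return -np.inf
    resid = logk_obs - (logA - Ea / (R * T))
    return -0.5 * np.sum((resid / 0.3) ** 2)             # Gaussian likelihood, sigma = 0.3

# Random-walk Metropolis sampling of the posterior
theta = np.array([15.0, 1.0e5])
lp = log_posterior(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [0.2, 2.0e3])
    lp_prop = log_posterior(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
samples = np.array(samples[5000:])                       # drop burn-in

mean, sd = samples.mean(axis=0), samples.std(axis=0)
print(f"ln A = {mean[0]:.2f} +/- {sd[0]:.2f}")
print(f"Ea   = {mean[1]:.3g} +/- {sd[1]:.2g} J/mol")
```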
Assessment of Runoff Toxicity from Coated Surfaces
Presented in this paper are results from a field and laboratory study of the potential runoff toxicity from coated surfaces. The study results qualified and quantified the types and concentrations of pollutants in runoff from surfaces sealed with a variety of products. Coatings a...
NASA Astrophysics Data System (ADS)
Owusu Twumasi, Jones; Le, Viet; Tang, Qixiang; Yu, Tzuyang
2016-04-01
Corrosion of steel reinforcing bars (rebars) is the primary cause for the deterioration of reinforced concrete structures. Traditional corrosion monitoring methods such as half-cell potential and linear polarization resistance can only detect the presence of corrosion but cannot quantify it. This study presents an experimental investigation of quantifying degree of corrosion of steel rebar inside cement mortar specimens using ultrasonic testing (UT). A UT device with two 54 kHz transducers was used to measure ultrasonic pulse velocity (UPV) of cement mortar, uncorroded and corroded reinforced cement mortar specimens, utilizing the direct transmission method. The results obtained from the study show that UPV decreases linearly with increase in degree of corrosion and corrosion-induced cracks (surface cracks). With respect to quantifying the degree of corrosion, a model was developed by simultaneously fitting UPV and surface crack width measurements to a two-parameter linear model. The proposed model can be used for predicting the degree of corrosion of steel rebar embedded in cement mortar under similar conditions used in this study up to 3.03%. Furthermore, the modeling approach can be applied to corroded reinforced concrete specimens with additional modification. The findings from this study show that UT has the potential of quantifying the degree of corrosion inside reinforced cement mortar specimens.
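One plausible reading of the two-parameter linear model mentioned above is sketched below, expressing degree of corrosion as a linear combination of the relative UPV drop and the surface crack width; the data values are fabricated for illustration and the study's exact model form may differ.

```python
import numpy as np

# Illustrative measurements (not the study's data): ultrasonic pulse velocity (m/s),
# surface crack width (mm), and gravimetric degree of corrosion (%).
upv   = np.array([4150.0, 4080.0, 4010.0, 3950.0, 3890.0, 3820.0, 3760.0])
crack = np.array([0.00, 0.05, 0.12, 0.20, 0.31, 0.45, 0.60])
doc   = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])

# Two fitted parameters: degree of corrosion as a linear combination of the
# relative UPV decrease and the crack width (one plausible model form).
features = np.column_stack([(upv[0] - upv) / upv[0],    # relative UPV decrease
                            crack])                     # surface crack width
coef, *_ = np.linalg.lstsq(features, doc, rcond=None)
a, b = coef
print(f"DoC ~ {a:.1f} * (relative UPV drop) + {b:.2f} * (crack width in mm)")

# Predict degree of corrosion for a hypothetical new specimen
upv_new, crack_new = 3900.0, 0.28
doc_pred = a * (upv[0] - upv_new) / upv[0] + b * crack_new
print(f"predicted degree of corrosion: {doc_pred:.2f} %  (valid only up to ~3 % here)")
```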
Novelli, M D; Barreto, E; Matos, D; Saad, S S; Borra, R C
1997-01-01
The authors present experimental results from the computerized quantification of tissue structures involved in the reparative process of colonic anastomoses performed with manual suture and with a biofragmentable ring. The quantified variables in this study were: oedema fluid, myofiber tissue, blood vessels, and cellular nuclei. Image processing software developed at the Laboratório de Informática Dedicado à Odontologia (LIDO) was used to quantify the pathognomonic alterations of the inflammatory process in colonic anastomoses performed in 14 dogs. As a counterproof measure, the results were compared with those obtained through traditional diagnosis by two pathologists. The criteria for these diagnoses were defined in levels (absent, light, moderate, and intense), which were compared to the analysis performed by the computer. There was a statistically significant difference between the two techniques: the biofragmentable ring technique exhibited less oedema fluid, more organized myofiber tissue, and a higher number of elongated cellular nuclei in relation to the manual suture technique. The analysis of histometric variables through computational image processing was considered efficient and powerful for quantifying the main tissue inflammatory and reparative changes.
Sun, Meng; Bloom, Alexander B.; Zaman, Muhammad H.
2015-01-01
Metastatic cancers aggressively reorganize collagen in their microenvironment. For example, radially orientated collagen fibers have been observed surrounding tumor cell clusters in vivo. The degree of fiber alignment, as a consequence of this remodeling, has often been difficult to quantify. In this paper, we present an easy-to-implement algorithm for accurate detection of collagen fiber orientation in a rapid, pixel-wise manner. This algorithm quantifies the alignment of both computer-generated and actual collagen fiber networks of varying degrees of alignment to within 5°. We also present an alternative, easy method to calculate the alignment index directly from the standard deviation of fiber orientation. Using this quantitative method for determining collagen alignment, we demonstrate that the number of collagen fiber intersections has a negative correlation with the degree of fiber alignment. This decrease in intersections of aligned fibers could explain why cells move more rapidly along aligned fibers than unaligned fibers, as previously reported. Overall, our paper provides an easier, more quantitative, and quicker way to quantify fiber orientation and alignment, and presents a platform for studying the effects of matrix and cellular properties on fiber alignment in complex 3D environments. PMID:26158674
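A minimal sketch of the pixel-wise orientation and alignment-index idea is given below; the gradient-based orientation estimate and the use of the mean resultant length of doubled angles as the alignment index are assumptions standing in for the paper's exact algorithm.

```python
import numpy as np

def pixelwise_fiber_orientation(image):
    """Local fiber orientation in radians, in [0, pi), from intensity gradients.
    The fiber axis is taken to be perpendicular to the local gradient direction."""
    gy, gx = np.gradient(image.astype(float))
    grad_angle = np.arctan2(gy, gx)
    return (grad_angle + 0.5 * np.pi) % np.pi

def alignment_index(theta, mask=None):
    """Mean resultant length of the doubled angles: 1 = perfectly aligned, 0 = isotropic."""
    if mask is not None:
        theta = theta[mask]
    return np.abs(np.mean(np.exp(2j * theta)))

# Synthetic test image: parallel "fibers" at 30 degrees plus noise
rng = np.random.default_rng(0)
y, x = np.mgrid[0:256, 0:256]
alpha = np.radians(30)
img = np.cos(2 * np.pi * (x * np.cos(alpha) + y * np.sin(alpha)) / 12.0)
img += 0.2 * rng.standard_normal(img.shape)

theta = pixelwise_fiber_orientation(img)
strong = np.hypot(*np.gradient(img)) > 0.2        # ignore low-contrast pixels
ai = alignment_index(theta, strong)
sigma_deg = np.degrees(np.sqrt(-0.5 * np.log(ai)))  # circular spread of orientations
print(f"alignment index = {ai:.2f}, orientation spread ~ {sigma_deg:.1f} deg")
```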
NASA Astrophysics Data System (ADS)
French, N. H. F.; Prichard, S.; McKenzie, D.; Kennedy, M. C.; Billmire, M.; Ottmar, R. D.; Kasischke, E. S.
2016-12-01
Quantification of emissions of carbon during combustion relies on knowing three general variables: how much landscape is impacted by fire (burn area), how much carbon is in that landscape (fuel loading), and fuel properties that determine the fraction that is consumed (fuel condition). These variables also determine how much carbon remains at the site in the form of unburned organic material or char, and therefore drive post-fire carbon dynamics and pools. In this presentation we review the importance of understanding fuel type, fuel loading, and fuel condition for quantifying carbon dynamics properly during burning and for measuring and mapping fuels across landscapes, regions, and continents. Variability in fuels has been shown to be a major driver of uncertainty in fire emissions, but has had little attention until recently. We review the current state of fuel characterization for fire management and carbon accounting, and present a new approach to quantifying fuel loading for use in fire-emissions mapping and for improving fire-effects assessment. The latest results of a study funded by the Joint Fire Science Program (JFSP) are presented, where a fuel loading database is being built to quantify variation in fuel loadings, as represented in the Fuel Characteristic Classification System (FCCS), across the conterminous US and Alaska. Statistical assessments of these data at multiple spatial scales will improve tools used by fire managers and scientists to quantify fire's impact on the land, atmosphere, and carbon cycle.
Quantification of Carbohydrates in Grape Tissues Using Capillary Zone Electrophoresis
Zhao, Lu; Chanon, Ann M.; Chattopadhyay, Nabanita; Dami, Imed E.; Blakeslee, Joshua J.
2016-01-01
Soluble sugars play an important role in freezing tolerance in both herbaceous and woody plants, functioning in both the reduction of freezing-induced dehydration and the cryoprotection of cellular constituents. The quantification of soluble sugars in plant tissues is, therefore, essential in understanding freezing tolerance. While a number of analytical techniques and methods have been used to quantify sugars, most of these are expensive and time-consuming due to complex sample preparation procedures which require the derivatization of the carbohydrates being analyzed. Analysis of soluble sugars using capillary zone electrophoresis (CZE) under alkaline conditions with direct UV detection has previously been used to quantify simple sugars in fruit juices. However, it was unclear whether CZE-based methods could be successfully used to quantify the broader range of sugars present in complex plant extracts. Here, we present the development of an optimized CZE method capable of separating and quantifying mono-, di-, and tri-saccharides isolated from plant tissues. This optimized CZE method employs a column electrolyte buffer containing 130 mM NaOH, pH 13.0, creating a current of 185 μA when a separation voltage of 10 kV is employed. The optimized CZE method provides limits-of-detection (an average of 1.5 ng/μL) for individual carbohydrates comparable or superior to those obtained using gas chromatography–mass spectrometry, and allows resolution of non-structural sugars and cell wall components (structural sugars). The optimized CZE method was successfully used to quantify sugars from grape leaves and buds, and is a robust tool for the quantification of plant sugars found in vegetative and woody tissues. The increased analytical efficiency of this CZE method makes it ideal for use in high-throughput metabolomics studies designed to quantify plant sugars. PMID:27379118
Quantifying functional mobility progress for chronic disease management.
Boyle, Justin; Karunanithi, Mohan; Wark, Tim; Chan, Wilbur; Colavitti, Christine
2006-01-01
A method for quantifying improvements in functional mobility is presented based on patient-worn accelerometer devices. For patients with cardiovascular, respiratory, or other chronic disease, increasing the amount of functional mobility is a large component of rehabilitation programs. We have conducted an observational trial on the use of accelerometers for quantifying mobility improvements in a small group of chronic disease patients (n=15, 48 - 86 yrs). Cognitive impairments precluded complex instrumentation of patients, and movement data was obtained from a single 2-axis accelerometer device worn at the hip. In our trial, movement data collected from accelerometer devices was classified into Lying vs Sitting/Standing vs Walking/Activity movements. This classification enabled the amount of walking to be quantified and graphically presented to clinicians and carers for feedback on exercise efficacy. Presenting long term trends in this data to patients also provides valuable feedback for self managed care and assisting with compliance.
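A hedged sketch of a simple epoch-based classifier for a 2-axis hip accelerometer is shown below: the gravity component on the vertical axis separates lying from upright postures, and short-window movement intensity separates sitting/standing from walking. The thresholds, window length, and sampling rate are illustrative, not the values used in the trial.

```python
import numpy as np

FS = 20                      # Hz, assumed sampling rate
WINDOW = FS * 5              # classify 5-second epochs

def classify_epochs(ax, ay):
    """Label each 5 s epoch as 'lying', 'sitting/standing', or 'walking/activity'.
    ax: axis aligned with the body's vertical when upright; ay: anterior-posterior."""
    labels = []
    for start in range(0, len(ax) - WINDOW + 1, WINDOW):
        a_v = ax[start:start + WINDOW]
        a_h = ay[start:start + WINDOW]
        tilt_vertical = np.abs(np.mean(a_v))            # gravity seen on the vertical axis (g)
        movement = np.std(a_v) + np.std(a_h)            # crude movement intensity
        if tilt_vertical < 0.5:                         # gravity mostly off the vertical axis
            labels.append("lying")
        elif movement < 0.15:                           # little dynamic acceleration
            labels.append("sitting/standing")
        else:
            labels.append("walking/activity")
    return labels

# Synthetic 15 s recording: 5 s lying, 5 s standing, 5 s walking (units of g)
rng = np.random.default_rng(3)
lying    = np.stack([0.10 + 0.02 * rng.standard_normal(WINDOW),
                     0.95 + 0.02 * rng.standard_normal(WINDOW)])
standing = np.stack([1.00 + 0.02 * rng.standard_normal(WINDOW),
                     0.05 + 0.02 * rng.standard_normal(WINDOW)])
t = np.arange(WINDOW) / FS
walking  = np.stack([1.0 + 0.3 * np.sin(2 * np.pi * 2.0 * t) + 0.05 * rng.standard_normal(WINDOW),
                     0.2 * np.sin(2 * np.pi * 2.0 * t + 1.0) + 0.05 * rng.standard_normal(WINDOW)])
ax = np.concatenate([lying[0], standing[0], walking[0]])
ay = np.concatenate([lying[1], standing[1], walking[1]])

print(classify_epochs(ax, ay))    # expected: ['lying', 'sitting/standing', 'walking/activity']
```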
Visual Attention and Quantifier-Spreading in Heritage Russian Bilinguals
ERIC Educational Resources Information Center
Sekerina, Irina A.; Sauermann, Antje
2015-01-01
It is well established in language acquisition research that monolingual children and adult second language learners misinterpret sentences with the universal quantifier "every" and make quantifier-spreading errors that are attributed to a preference for a match in number between two sets of objects. The present Visual World eye-tracking…
Mueller, Katharina Felicitas; Meerpohl, Joerg J; Briel, Matthias; Antes, Gerd; von Elm, Erik; Lang, Britta; Motschall, Edith; Schwarzer, Guido; Bassler, Dirk
2016-12-01
To systematically review methodological articles which focus on nonpublication of studies and to describe methods of detecting and/or quantifying and/or adjusting for dissemination in meta-analyses. To evaluate whether the methods have been applied to an empirical data set for which one can be reasonably confident that all studies conducted have been included. We systematically searched Medline, the Cochrane Library, and Web of Science, for methodological articles that describe at least one method of detecting and/or quantifying and/or adjusting for dissemination bias in meta-analyses. The literature search retrieved 2,224 records, of which we finally included 150 full-text articles. A great variety of methods to detect, quantify, or adjust for dissemination bias were described. Methods included graphical methods mainly based on funnel plot approaches, statistical methods, such as regression tests, selection models, sensitivity analyses, and a great number of more recent statistical approaches. Only few methods have been validated in empirical evaluations using unpublished studies obtained from regulators (Food and Drug Administration, European Medicines Agency). We present an overview of existing methods to detect, quantify, or adjust for dissemination bias. It remains difficult to advise which method should be used as they are all limited and their validity has rarely been assessed. Therefore, a thorough literature search remains crucial in systematic reviews, and further steps to increase the availability of all research results need to be taken. Copyright © 2016 Elsevier Inc. All rights reserved.
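One of the funnel-plot regression tests reviewed above, Egger's test, can be written in a few lines; the effect sizes and standard errors below are made-up illustration data.

```python
import numpy as np
from scipy import stats

# Made-up meta-analysis data: study effect sizes (log odds ratios) and standard errors
effect = np.array([-0.42, -0.35, -0.60, -0.10, -0.55, -0.25, -0.75, -0.05])
se     = np.array([ 0.10,  0.15,  0.25,  0.30,  0.35,  0.20,  0.40,  0.45])

# Egger's test: regress the standardized effect (z = effect/se) on precision (1/se);
# a non-zero intercept suggests funnel-plot asymmetry (possible dissemination bias).
z = effect / se
precision = 1.0 / se
slope, intercept, r, p_slope, stderr = stats.linregress(precision, z)

# p-value for the intercept via the usual t-test on a regression intercept
n = len(effect)
resid = z - (intercept + slope * precision)
s2 = np.sum(resid**2) / (n - 2)
se_intercept = np.sqrt(s2 * (1.0 / n + np.mean(precision)**2 / np.sum((precision - np.mean(precision))**2)))
t_stat = intercept / se_intercept
p_intercept = 2 * stats.t.sf(abs(t_stat), df=n - 2)

print(f"Egger intercept = {intercept:.2f}, p = {p_intercept:.3f}")
print("asymmetry suggested" if p_intercept < 0.10 else "no clear asymmetry")
```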
Montgomery, Jill D; Hensler, Heather R; Jacobson, Lisa P; Jenkins, Frank J
2008-07-01
The aim of the present study was to determine if the Alpha DigiDoc RT system would be an effective method of quantifying immunohistochemical staining as compared with a manual counting method, which is considered the gold standard. Two readers were used to count 31 samples by both methods. The results obtained using the Bland-Altman for concordance deemed no statistical difference between the 2 methods. Thus, the Alpha DigiDoc RT system is an effective, low cost method to quantify immunohistochemical data.
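The Bland-Altman concordance analysis mentioned above amounts to computing the bias and 95% limits of agreement between paired measurements; a minimal sketch with fabricated counts follows.

```python
import numpy as np

# Fabricated staining scores for the same 10 samples counted by the two methods
manual    = np.array([12, 35, 48, 20, 55, 30, 41, 18, 60, 25], dtype=float)
automated = np.array([14, 33, 50, 19, 57, 28, 43, 20, 58, 27], dtype=float)

diff = automated - manual
mean_pair = (automated + manual) / 2.0         # x-axis of the Bland-Altman plot

bias = diff.mean()
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)     # 95% limits of agreement

print(f"bias = {bias:.2f}")
print(f"limits of agreement = [{loa[0]:.2f}, {loa[1]:.2f}]")
# In a Bland-Altman plot, diff is plotted against mean_pair; the methods are considered
# interchangeable if the limits of agreement are narrow enough for the application.
```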
Quantifying errors without random sampling.
Phillips, Carl V; LaPole, Luwanna M
2003-06-12
All quantifications of mortality, morbidity, and other health measures involve numerous sources of error. The routine quantification of random sampling error makes it easy to forget that other sources of error can and should be quantified. When a quantification does not involve sampling, error is almost never quantified and results are often reported in ways that dramatically overstate their precision. We argue that the precision implicit in typical reporting is problematic and sketch methods for quantifying the various sources of error, building up from simple examples that can be solved analytically to more complex cases. There are straightforward ways to partially quantify the uncertainty surrounding a parameter that is not characterized by random sampling, such as limiting reported significant figures. We present simple methods for doing such quantifications, and for incorporating them into calculations. More complicated methods become necessary when multiple sources of uncertainty must be combined. We demonstrate that Monte Carlo simulation, using available software, can estimate the uncertainty resulting from complicated calculations with many sources of uncertainty. We apply the method to the current estimate of the annual incidence of foodborne illness in the United States. Quantifying uncertainty from systematic errors is practical. Reporting this uncertainty would more honestly represent study results, help show the probability that estimated values fall within some critical range, and facilitate better targeting of further research.
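A minimal sketch of the Monte Carlo approach described above is given below: several non-sampling uncertainty sources, each expressed as a judgment-based distribution, are propagated through a toy incidence calculation and summarized as an interval. The distributions and the calculation are illustrative, not the foodborne-illness estimate itself.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Toy incidence calculation: cases = population * episode_rate * food_fraction * adjust.
# Each input is uncertain for reasons other than random sampling, so each is given a
# judgment-based distribution rather than a standard error.
population    = 2.9e8                                         # treated as known
episode_rate  = rng.triangular(0.5, 0.8, 1.2, N)              # acute GI episodes per person-year
food_fraction = rng.uniform(0.2, 0.5, N)                      # fraction attributable to food
adjust        = rng.lognormal(mean=0.0, sigma=0.15, size=N)   # residual correction factor

cases = population * episode_rate * food_fraction * adjust

lo, med, hi = np.percentile(cases, [2.5, 50, 97.5])
print(f"median estimate: {med:.2e} foodborne illnesses/year")
print(f"95% uncertainty interval: {lo:.2e} to {hi:.2e}")
```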
Rail Transportation Requirements for Coal Movement in 1985
DOT National Transportation Integrated Search
1978-12-01
This study of transportation requirements for coal movements through 1985 is one of the series conducted for the U.S. Department of Transportation to identify and quantify future transportation requirements for energy materials. This report presents ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jing; Toloczko, Mychailo B.; Kruska, Karen
Accelerator-based ion beam techniques have been used to study radiation effects in materials for decades. Although carbon contamination induced by ion beam in target materials is a well-known issue, it has not been fully characterized nor quantified for studies in ferritic/martensitic (F/M) steels that are candidate materials for applications such as core structural components in advanced nuclear reactors. It is an especially important issue for this class of material because of the effect of carbon level on precipitate formation. In this paper, the ability to quantify carbon contamination using three common techniques, namely time-of-flight secondary ion mass spectroscopy (ToF-SIMS), atom probe tomography (APT) and transmission electron microscopy (TEM), is compared. Their effectiveness and short-comings in determining carbon contamination will be presented and discussed. The corresponding microstructural changes related to carbon contamination in ion irradiated F/M steels are also presented and briefly discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jing; Toloczko, Mychailo B.; Kruska, Karen
Accelerator-based ion beam irradiation techniques have been used to study radiation effects in materials for decades. Although carbon contamination induced by ion beams in target materials is a well-known issue in some material systems, it has not been fully characterized nor quantified for studies in ferritic/martensitic (F/M) steels that are candidate materials for applications such as core structural components in advanced nuclear reactors. It is an especially important issue for this class of material because of the strong effect of carbon level on precipitate formation. In this paper, the ability to quantify carbon contamination using three common techniques, namely time-of-flight secondary ion mass spectroscopy (ToF-SIMS), atom probe tomography (APT), and transmission electron microscopy (TEM), is compared. Their effectiveness and shortcomings in determining carbon contamination are presented and discussed. The corresponding microstructural changes related to carbon contamination in ion irradiated F/M steels are also presented and briefly discussed.
Wind driven erosion and the effects of particulate electrification
NASA Astrophysics Data System (ADS)
Merrison, J. P.; Bak, E.; Finster, K.; Gunnlaugsson, H. P.; Holstein-Rathlou, C.; Knak Jensen, S.; Nørnberg, P.; Rasmussen, K. R.
2012-09-01
Several related aspects of aeolian activity are presently being studied in the laboratory, and the most recent advances in this field will be presented. These include simulating wind-driven erosion in the laboratory, quantifying erosion rates, and studying mineral change due to mechanical activation. Advances are also being made in our understanding of the electrification of sand/dust particles and of how this phenomenon affects their behavior.
Modeling noisy resonant system response
NASA Astrophysics Data System (ADS)
Weber, Patrick Thomas; Walrath, David Edwin
2017-02-01
In this paper, a theory-based model replicating empirical acoustic resonant signals is presented and studied to understand sources of noise present in acoustic signals. Statistical properties of empirical signals are quantified, and a noise amplitude parameter, which models frequency- and amplitude-based noise, is created, defined, and presented. This theory-driven model isolates each phenomenon and allows for parameters to be independently studied. Using seven independent degrees of freedom, the model accurately reproduces qualitative and quantitative properties measured from laboratory data. Results are presented and demonstrate success in replicating qualitative and quantitative properties of experimental data.
Interpreting Quantifier Scope Ambiguity: Evidence of Heuristic First, Algorithmic Second Processing
Dwivedi, Veena D.
2013-01-01
The present work suggests that sentence processing requires both heuristic and algorithmic processing streams, where the heuristic processing strategy precedes the algorithmic phase. This conclusion is based on three self-paced reading experiments in which the processing of two-sentence discourses was investigated, where context sentences exhibited quantifier scope ambiguity. Experiment 1 demonstrates that such sentences are processed in a shallow manner. Experiment 2 uses the same stimuli as Experiment 1 but adds questions to ensure deeper processing. Results indicate that reading times are consistent with a lexical-pragmatic interpretation of number associated with context sentences, but responses to questions are consistent with the algorithmic computation of quantifier scope. Experiment 3 shows the same pattern of results as Experiment 2, despite using stimuli with different lexical-pragmatic biases. These effects suggest that language processing can be superficial, and that deeper processing, which is sensitive to structure, only occurs if required. Implications for recent studies of quantifier scope ambiguity are discussed. PMID:24278439
Many applications analyze quantified transcript-level abundances to make inferences. Having completed this computation across the large sample set, the CTD2 Center at the Translational Genomics Research Institute presents the quantified data in a straightforward, consolidated form for these types of analyses.
Quantifying Water Stress Using Total Water Volumes and GRACE
NASA Astrophysics Data System (ADS)
Richey, A. S.; Famiglietti, J. S.; Druffel-Rodriguez, R.
2011-12-01
Water will follow oil as the next critical resource leading to unrest and uprisings globally. To better manage this threat, an improved understanding of the distribution of water stress is required today. This study builds upon previous efforts to characterize water stress by improving both the quantification of human water use and the definition of water availability. Current statistics on human water use are often outdated or inaccurately reported nationally, especially for groundwater. This study improves these estimates by defining human water use in two ways. First, we use NASA's Gravity Recovery and Climate Experiment (GRACE) to isolate the anthropogenic signal in water storage anomalies, which we equate to water use. Second, we quantify an ideal water demand by using average water requirements for the domestic, industrial, and agricultural water use sectors. Water availability has traditionally been limited to "renewable" water, which ignores large, stored water sources that humans use. We compare water stress estimates derived using either renewable water or the total volume of water globally. We use the best-available data to quantify total aquifer and surface water volumes, as compared to groundwater recharge and surface water runoff from land-surface models. The work presented here should provide a more realistic image of water stress by explicitly quantifying groundwater, defining water availability as total water supply, and using GRACE to more accurately quantify water use.
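A heavily simplified sketch of the stress calculation described above follows, taking "use" as the long-term depletion trend in GRACE total water storage anomalies and comparing it against both renewable supply and total stored volume; the synthetic anomaly series, the basin numbers, and the simple deseasonalize-and-fit-a-trend step are illustrative assumptions, not the study's processing chain.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic monthly GRACE total water storage anomaly for one basin (cm of water
# equivalent over the basin area): seasonal cycle + human-driven depletion + noise.
months = np.arange(120)
seasonal = 5.0 * np.sin(2 * np.pi * months / 12.0)
depletion_trend = -0.08 * months                      # cm/month of storage loss
tws_anomaly = seasonal + depletion_trend + rng.normal(0, 1.0, months.size)

# Isolate the long-term (assumed anthropogenic) signal as the linear trend after
# removing the mean seasonal cycle -- a simplification of the paper's approach.
climatology = np.array([tws_anomaly[months % 12 == m].mean() for m in range(12)])
deseasoned = tws_anomaly - climatology[months % 12]
trend_cm_per_month = np.polyfit(months, deseasoned, 1)[0]

basin_area_km2 = 1.5e5
use_km3_per_yr = -trend_cm_per_month * 12 * 1e-5 * basin_area_km2   # cm depth -> km^3

# Water availability as total stored volume (illustrative), vs. renewable supply only
total_stored_km3 = 500.0
renewable_km3_per_yr = 15.0

print(f"estimated water use: {use_km3_per_yr:.1f} km^3/yr")
print(f"stress vs. renewable supply : {use_km3_per_yr / renewable_km3_per_yr:.2f}")
print(f"years to deplete total store: {total_stored_km3 / use_km3_per_yr:.0f}")
```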
Morgan, Brianna; Gross, Rachel; Clark, Robin; Dreyfuss, Michael; Boller, Ashley; Camp, Emily; Liang, Tsao-Wei; Avants, Brian; McMillan, Corey; Grossman, Murray
2011-01-01
Quantifiers are very common in everyday speech, but we know little about their cognitive basis or neural representation. The present study examined comprehension of three classes of quantifiers that depend on different cognitive components in patients with focal neurodegenerative diseases. Patients evaluated the truth-value of a sentence containing a quantifier relative to a picture illustrating a small number of familiar objects, and performance was related to MRI grey matter atrophy using voxel-based morphometry. We found that patients with corticobasal syndrome (CBS) and posterior cortical atrophy (PCA) are significantly impaired in their comprehension of Cardinal Quantifiers (e.g. “At least three birds are on the branch”), due in part to their deficit in quantity knowledge. MRI analyses related this deficit to temporal-parietal atrophy found in CBS/PCA. We also found that patients with behavioral variant frontotemporal dementia (bvFTD) are significantly impaired in their comprehension of Logical Quantifiers (e.g. “Some of the birds are on the branch”), associated with a simple form of perceptual logic, and this correlated with their deficit on executive measures. This deficit was related to disease in rostral prefrontal cortex in bvFTD. These patients were also impaired in their comprehension of Majority Quantifiers (e.g. “At least half of the birds are on the branch”), and this too was correlated with their deficit on executive measures. This was related to disease in the basal ganglia interrupting a frontal-striatal loop critical for executive functioning. These findings suggest that a large-scale frontal-parietal neural network plays a crucial role in quantifier comprehension, and that comprehension of specific classes of quantifiers may be selectively impaired in patients with focal neurodegenerative conditions in these areas. PMID:21930136
An international pilot study has been developed to explore the possibility of quantifying and assessing environmental condition, processes of land degradation, and subsequent impacts on natural and human resources. The purpose of the study is to foster a framework for scientific...
Issues in Retrospective Conversion for a Small Special Collection: A Case Study.
ERIC Educational Resources Information Center
Hieb, Fern
1997-01-01
Small special collections present unique problems for retrospective conversion of catalogs to machine-readable form. Examines retrospective conversion using the Moravian Music Foundation as a case study. Discusses advantages to automation, options for conversion process, quantifying conversion effort, costs, in-house conversion, national standards…
Quantitative Assessment of Interutterance Stability: Application to Dysarthria
ERIC Educational Resources Information Center
Cummins, Fred; Lowit, Anja; van Brenk, Frits
2014-01-01
Purpose: Following recent attempts to quantify articulatory impairment in speech, the present study evaluates the usefulness of a novel measure of motor stability to characterize dysarthria. Method: The study included 8 speakers with ataxic dysarthria (AD), 16 speakers with hypokinetic dysarthria (HD) as a result of Parkinson's disease, and…
Incidence of Dysarthria in Children with Cerebellar Tumors: A Prospective Study
ERIC Educational Resources Information Center
Richter, S.; Schoch, B.; Ozimek, A.; Gorissen, B.; Hein-Kropp, C.; Kaiser, O.; Hovel, M.; Wieland, R.; Gizewski, E.; Timmann, D.
2005-01-01
The present study investigated dysarthric symptoms in children with cerebellar tumors. Ten children with cerebellar tumors and 10 orthopedic control children were tested prior and one week after surgery. Clinical dysarthric symptoms were quantified in spontaneous speech. Syllable durations were analyzed in syllable repetition and sentence…
A retrospective analysis of aeromedical certification denial actions : January 1961 - December 1967.
DOT National Transportation Integrated Search
1968-05-01
The study quantifies several unknowns and/or uncertainties with respect to medical and general descriptive attributes of airmen denied medical certification. Data are presented concerning age, sex, occupation, total flying time, and medical character...
Quantitative analysis of thoria phase in Th-U alloys using diffraction studies
NASA Astrophysics Data System (ADS)
Thakur, Shital; Krishna, P. S. R.; Shinde, A. B.; Kumar, Raj; Roy, S. B.
2017-05-01
In the present study, quantitative phase analysis of bulk Th-U alloys, namely Th-52 wt% U and Th-3 wt% U, was performed on data obtained from both X-ray diffraction and neutron diffraction using the Rietveld method implemented in the FULLPROF software. Quantifying the thoria (ThO2) phase present in the bulk of the sample by X-ray diffraction is limited by surface oxidation and the low penetration of X-rays in high-Z materials. A neutron diffraction study probing the bulk of the samples is therefore presented in comparison with the X-ray diffraction study.
Transboundary Contributions To Surface Ozone In California's Central Valley
NASA Astrophysics Data System (ADS)
Post, A.; Faloona, I. C.; Conley, S. A.; Lighthall, D.
2014-12-01
Rising concern over the impacts of exogenous air pollution in California's Central Valley has prompted the establishment of a coastal, high-altitude monitoring site at the Chews Ridge Observatory (1550 m) approximately 30 km east of Point Sur in Monterey County, under the auspices of the Monterey Institute for Research in Astronomy. Two and a half years of continuous ozone data are presented in the context of long-range transport and its potential impact on surface air quality in the San Joaquin Valley (SJV). Past attempts to quantify the impact of transboundary ozone on surface levels have relied on uncertain model estimates, or have been limited to weekly ozonesonde data. Here, we present an observationally derived quantification of the contribution of free tropospheric ozone to surface sites in the San Joaquin Valley throughout three ozone seasons (June through September, 2012-2014). The diurnal ozone patterns at Chews Ridge, and their correlations with ozone aloft over the Valley, have been presented previously. Furthermore, reanalysis data of geopotential heights indicate consistent flow from Chews Ridge to the east, directly over the SJV. In a related airborne project we quantify the vertical exchange, or entrainment, rate over the southern SJV from a series of focused flights that measured ozone concentrations during peak photochemical hours in conjunction with local meteorological data, in order to construct an ozone budget for the area. By applying the entrainment rates observed in that study, we are able here to quantify the seasonal contributions of free tropospheric ozone measured at Chews Ridge to surface sites in the San Joaquin Valley, and compare prior model estimates to our observationally derived values.
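To make the budget arithmetic concrete, the sketch below (Python) shows the entrainment term of a simple mixed-layer ozone budget, i.e. how an entrainment velocity and the ozone difference across the boundary-layer top translate into a surface-relevant tendency. All variable names and values are illustrative assumptions, not the authors' data or code.

```python
# Sketch (not the authors' code): estimate the entrainment contribution of
# free-tropospheric ozone to a boundary-layer ozone budget.

def entrainment_tendency(w_e_m_s, o3_ft_ppb, o3_bl_ppb, bl_depth_m):
    """Entrainment term of a mixed-layer ozone budget, in ppb per hour.

    w_e_m_s    : entrainment velocity (m/s), e.g. from airborne budget flights
    o3_ft_ppb  : ozone just above the boundary layer (ppb)
    o3_bl_ppb  : mean boundary-layer ozone (ppb)
    bl_depth_m : boundary-layer depth (m)
    """
    return w_e_m_s * (o3_ft_ppb - o3_bl_ppb) / bl_depth_m * 3600.0

# Example: 2 cm/s entrainment, 65 ppb aloft vs 50 ppb below, 800 m deep layer
print(round(entrainment_tendency(0.02, 65.0, 50.0, 800.0), 2), "ppb/h")
```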
Many applications analyze quantified transcript-level abundances to make inferences. Having completed this computation across the large sample set, the CTD2 Center at the Translational Genomics Research Institute presents the quantified data in a straightforward, consolidated form for these types of analyses.
Food waste quantification in primary production - The Nordic countries as a case study.
Hartikainen, Hanna; Mogensen, Lisbeth; Svanes, Erik; Franke, Ulrika
2018-01-01
Our understanding of food waste in the food supply chain has increased, but very few studies have been published on food waste in primary production. The overall aims of this study were to quantify the total amount of food waste in primary production in Finland, Sweden, Norway and Denmark, and to create a framework for how to define and quantify food waste in primary production. The quantification of food waste was based on case studies conducted in the present study and estimates published in scientific literature. The chosen scope of the study was to quantify the amount of edible food (excluding inedible parts like peels and bones) produced for human consumption that did not end up as food. As a result, the quantification was different from the existing guidelines. One of the main differences is that food that ends up as animal feed is included in the present study, whereas this is not the case for the recently launched food waste definition of the FUSIONS project. To distinguish the 'food waste' definition of the present study from the existing definitions and to avoid confusion with established usage of the term, a new term 'side flow' (SF) was introduced as a synonym for food waste in primary production. A rough estimate of the total amount of food waste in primary production in Finland, Sweden, Norway and Denmark was made using SF and 'FUSIONS Food Waste' (FFW) definitions. The SFs in primary production in the four Nordic countries were an estimated 800,000 tonnes per year with an additional 100,000 tonnes per year from the rearing phase of animals. The 900,000 tonnes per year of SF corresponds to 3.7% of the total production of 24,000,000 tonnes per year of edible primary products. When using the FFW definition proposed by the FUSIONS project, the FFW amount was estimated at 330,000 tonnes per year, or 1% of the total production. Copyright © 2017 Elsevier Ltd. All rights reserved.
RGB-NDVI colour composites for visualizing forest change dynamics
NASA Technical Reports Server (NTRS)
Sader, S. A.; Winne, J. C.
1992-01-01
The study presents a simple and logical technique to display and quantify forest change using three dates of satellite imagery. The normalized difference vegetation index (NDVI) was computed for each date of imagery to define high and low vegetation biomass. Color composites were generated by combining each date of NDVI with either the red, green, or blue (RGB) image planes in an image display monitor. Harvest and regeneration areas were quantified by applying a modified parallelepiped classification, creating an RGB-NDVI image with 27 classes that were grouped into nine major forest change categories. Aerial photographs and stand history maps are compared with the forest changes indicated by the RGB-NDVI image. The utility of the RGB-NDVI technique for supporting forest inventories and updating forest resource information systems is presented and discussed.
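A minimal sketch of the RGB-NDVI compositing step is given below, assuming an array-based workflow (NumPy). The band handling, scaling, and classification details are simplifications of the published procedure, and the array names are assumptions.

```python
# Sketch of the RGB-NDVI idea (not the authors' code): compute NDVI for three
# image dates and assign each date to one display channel.
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index, guarding against divide-by-zero."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def rgb_ndvi(dates_nir_red):
    """dates_nir_red: list of three (nir, red) array pairs, oldest to newest.
    Returns an H x W x 3 composite with one NDVI date per RGB channel,
    rescaled to 0-255 for display."""
    layers = [ndvi(nir, red) for nir, red in dates_nir_red]
    stack = np.stack(layers, axis=-1)            # H x W x 3, NDVI in [-1, 1]
    return ((stack + 1.0) / 2.0 * 255).astype(np.uint8)

# Each channel then reflects biomass on one date; channel combinations
# (e.g. high NDVI early but low NDVI late) flag harvest or regeneration areas.
```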
Estimating plant distance in maize using Unmanned Aerial Vehicle (UAV).
Zhang, Jinshui; Basso, Bruno; Price, Richard F; Putman, Gregory; Shuai, Guanyuan
2018-01-01
Distances between rows and between plants are essential parameters that affect the final grain yield in row crops. This paper presents the results of research intended to develop a novel method to quantify the distance between maize plants at field scale using an Unmanned Aerial Vehicle (UAV). Using this method, we can recognize maize plants as objects and calculate the distance between plants. We initially developed our method by training an algorithm in an indoor facility with plastic corn plants. Then, the method was scaled up and tested in a farmer's field with maize plant spacing that exhibited natural variation. The results of this study demonstrate that it is possible to precisely quantify the distance between maize plants. We found that the accuracy of the measurement of the distance between maize plants depended on the height above ground level at which UAV imagery was taken. This study provides an innovative approach to quantify plant-to-plant variability and, thereby, to improve final crop yield estimates.
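Once plants have been detected as objects and georeferenced, the spacing calculation itself is simple geometry. The sketch below shows that final step under the assumption that plant centroids for one row are already available in ground coordinates; it is not the published detection algorithm.

```python
# Illustrative sketch (not the published algorithm): given detected plant
# centroids in ground coordinates (metres), measure the spacing between
# neighbouring plants along a row.
import numpy as np

def plant_spacings(centroids_xy):
    """centroids_xy: (N, 2) array of plant centres for one row, in metres.
    Returns distances between consecutive plants after sorting along the row."""
    pts = np.asarray(centroids_xy, dtype=float)
    order = np.argsort(pts[:, 0])               # assume the row runs roughly along x
    pts = pts[order]
    return np.linalg.norm(np.diff(pts, axis=0), axis=1)

row = [(0.00, 0.01), (0.21, 0.00), (0.39, 0.02), (0.62, 0.01)]
print(plant_spacings(row))   # e.g. approximately [0.21 0.18 0.23] m
```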
Detecting Breech Presentation Before Labour: Lessons From a Low-Risk Maternity Clinic.
Ressl, Bill; O'Beirne, Maeve
2015-08-01
Evaluation of fetal position is an important part of prenatal care. A woman with a breech presentation may need referral for external cephalic version, for assisted breech delivery, or to schedule a Caesarean section. In many centres, a breech presentation undetected until labour will result in an emergency Caesarean section, a less desirable alternative for both the mother and the health care system. The anecdotal reports of undiagnosed breech presentations at a busy maternity clinic prompted a study to quantify the missed breech presentations and to evaluate the effectiveness of the current detection process, with the aim of allowing no more than 1% of breech presentations to remain undetected until labour. We performed a retrospective analysis of 102 breech deliveries over a 14 month period to quantify missed breech presentations, and used a prospective physician survey documenting how fetal presentation was determined at 186 prenatal visits over four months to analyze the current detection process. We found that approximately 8% of breech presentations were undetected until labour. We concluded that within the limitations of the small sample size evaluated, the current practice of using a vaginal examination to verify fetal presentation determined by abdominal palpation (Leopold's manoeuvres) may not be more accurate than abdominal palpation alone. The current detection process resulted in an unacceptably high rate of missed breech presentations. The results of this study prompted the clinic's acquisition of bedside ultrasound capability to assess fetal position.
A temporal analysis of urban forest carbon storage using remote sensing
Soojeong Myeong; David J. Nowak; Michael J. Duggin
2006-01-01
Quantifying the carbon storage, distribution, and change of urban trees is vital to understanding the role of vegetation in the urban environment. At present, this is mostly achieved through ground study. This paper presents a method based on the satellite image time series, which can save time and money and greatly speed the process of urban forest carbon storage...
Quantifying coordination among the rearfoot, midfoot, and forefoot segments during running.
Takabayashi, Tomoya; Edama, Mutsuaki; Yokoyama, Erika; Kanaya, Chiaki; Kubo, Masayoshi
2018-03-01
Because previous studies have suggested that there is a relationship between injury risk and inter-segment coordination, quantifying coordination between the segments is essential. Even though the midfoot and forefoot segments play important roles in dynamic tasks, previous studies have mostly focused on coordination between the shank and rearfoot segments. This study aimed to quantify coordination among rearfoot, midfoot, and forefoot segments during running. Eleven healthy young men ran on a treadmill. The coupling angle, representing inter-segment coordination, was calculated using a modified vector coding technique. The coupling angle was categorised into four coordination patterns. During the absorption phase, rearfoot-midfoot coordination in the frontal planes was mostly in-phase (rearfoot and midfoot eversion with similar amplitudes). The present study found that the eversion of the midfoot with respect to the rearfoot was comparable in magnitude to the eversion of the rearfoot with respect to the shank. A previous study has suggested that disruption of the coordination between the internal rotation of the shank and eversion of the rearfoot leads to running injuries such as anterior knee pain. Thus, these data might be used in the future to compare to individuals with foot deformities or running injuries.
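The modified vector coding calculation can be sketched as follows. The bin edges used to categorise coupling angles follow one common convention in the coordination literature and may differ from the paper's exact definitions; the pattern labels are likewise assumptions.

```python
# Sketch of a modified vector coding calculation for inter-segment coordination.
import numpy as np

def coupling_angles(theta_proximal, theta_distal):
    """Coupling angle (deg, 0-360) between consecutive samples of two segment
    angle time series (e.g. rearfoot vs. midfoot frontal-plane angles)."""
    dp = np.diff(np.asarray(theta_proximal, float))
    dd = np.diff(np.asarray(theta_distal, float))
    gamma = np.degrees(np.arctan2(dd, dp))
    return np.mod(gamma, 360.0)

def classify(gamma):
    """Map one coupling angle to one of four coordination patterns
    (bin edges follow one common convention, assumed here)."""
    g = gamma % 360.0
    if 22.5 <= g < 67.5 or 202.5 <= g < 247.5:
        return "in-phase"
    if 112.5 <= g < 157.5 or 292.5 <= g < 337.5:
        return "anti-phase"
    if 67.5 <= g < 112.5 or 247.5 <= g < 292.5:
        return "distal-dominant"
    return "proximal-dominant"

rearfoot = [2.0, 3.1, 4.0, 4.8]      # synthetic eversion angles (deg)
midfoot = [1.5, 2.6, 3.6, 4.5]
print([classify(g) for g in coupling_angles(rearfoot, midfoot)])
```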
A subjective study and an objective metric to quantify the granularity level of textures
NASA Astrophysics Data System (ADS)
Subedar, Mahesh M.; Karam, Lina J.
2015-03-01
Texture granularity is an important visual characteristic that is useful in a variety of applications, including analysis, recognition, and compression, to name a few. A texture granularity measure can be used to quantify the perceived level of texture granularity. The granularity level of the textures is influenced by the size of the texture primitives. A primitive is defined as the smallest recognizable repetitive object in the texture. If the texture has large primitives then the perceived granularity level tends to be lower as compared to a texture with smaller primitives. In this work we present a texture granularity database, referred to as GranTEX, which consists of 30 textures with varying levels of primitive sizes and granularity levels. The GranTEX database consists of both natural and man-made textures. A subjective study is conducted to measure the perceived granularity level of textures present in the GranTEX database. An objective metric that automatically measures the perceived granularity level of textures is also presented as part of this work. It is shown that the proposed granularity metric correlates well with the subjective granularity scores.
ERIC Educational Resources Information Center
Oldham, Mary; Kellett, Stephen; Miles, Eleanor; Sheeran, Paschal
2012-01-01
Objective: Rates of nonattendance for psychotherapy hinder the effective delivery of evidence-based treatments. Although many strategies have been developed to increase attendance, the effectiveness of these strategies has not been quantified. Our aim in the present study was to undertake a meta-analysis of rigorously controlled studies to…
Religious Studies as a Test-Case For Computer-Assisted Instruction In The Humanities.
ERIC Educational Resources Information Center
Jones, Bruce William
Experiences with computer-assisted instructional (CAI) programs written for religious studies indicate that CAI has contributions to offer the humanities and social sciences. The usefulness of the computer for presentation, drill and review of factual material and its applicability to quantifiable data is well accepted. There now exist…
Quantifying soil profile change caused by land use in central Missouri loess hillslopes
Samuel J. Indorante; John M. Kabrick; Brad D. Lee; Jon M. Maatta
2014-01-01
Three major challenges are present when studying anthropogenic impacts on soil profile properties: (i) site selection; (ii) sampling and modeling native and cultivated soil-landscape relationships; and (iii) graphically and statistically comparing native and cultivated sites to model soil profile changes. This study addressed those challenges by measuring and modeling...
Godoy, L; Garrido, D; Martínez, C; Saavedra, J; Combina, M; Ganga, M A
2009-04-01
To evaluate the coumarate decarboxylase (CD) and vinylphenol reductase (VR) activities in Dekkera bruxellensis isolates and study their relationship to the growth rate, protein profile and random amplified polymorphic DNA (RAPD) molecular pattern. CD and VR activities were quantified, and the growth rate, intracellular protein profile and RAPD molecular pattern were determined in 12 isolates of D. bruxellensis. All the isolates studied showed CD activity, but only some showed VR activity. Those isolates with the greatest growth rate did not present a different protein profile from the others. The FASC showed a relationship between RAPD molecular patterns and VR activity. CD activity is common to all of the D. bruxellensis isolates. This was not the case with VR activity, which was detected at a low percentage in the analysed micro-organisms. A correlation was observed between VR activity and the RAPD patterns. This is the first study that quantifies the CD and VR enzyme activities in D. bruxellensis, demonstrating that these activities are not present in all isolates of this yeast.
DOT National Transportation Integrated Search
2016-12-01
This report presents the findings of a valuation study recently conducted in Florida to quantify freight users' willingness to pay (WTP) for the improvement of transportation-related attributes, particularly reliability. A stated preference...
Pendulum Underwater--An Approach for Quantifying Viscosity
ERIC Educational Resources Information Center
Leme, José Costa; Oliveira, Agostinho
2017-01-01
The purpose of the experiment presented in this paper is to quantify the viscosity of a liquid. Viscous effects are important in the flow of fluids in pipes, in the bloodstream, in the lubrication of engine parts, and in many other situations. In the present paper, the authors explore the oscillations of a physical pendulum in the form of a long…
An Approach to the Classification of Potential Reserve Additions of Giant Oil Fields of the World
Klett, T.R.; Tennyson, Marilyn E.
2008-01-01
This report contains slides and notes for slides for a presentation given to the Committee on Sustainable Energy and the Ad Hoc Group of Experts on Harmonization of Fossil Energy and Mineral Resources Terminology on 17 October 2007 in Geneva, Switzerland. The presentation describes the U.S. Geological Survey study to characterize and quantify petroleum-reserve additions, and the application of this study to help classify the quantities.
Universal Quantification in a Constraint-Based Planner
NASA Technical Reports Server (NTRS)
Golden, Keith; Frank, Jeremy; Clancy, Daniel (Technical Monitor)
2002-01-01
Constraints and universal quantification are both useful in planning, but handling universally quantified constraints presents some novel challenges. We present a general approach to proving the validity of universally quantified constraints. The approach essentially consists of checking that the constraint is not violated for any member of the universe. We show that this approach can sometimes be applied even when variable domains are infinite, and we present some useful special cases where this can be done efficiently.
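For a finite universe the core check reduces to a simple loop, as in the sketch below. The planner itself handles constraint networks and infinite domains far more cleverly; the state representation and names here are illustrative only.

```python
# Minimal sketch of the "check every member of the universe" idea for a
# universally quantified constraint (not the planner's actual representation).

def forall_holds(universe, constraint, state):
    """True iff constraint(x, state) is violated for no x in the universe."""
    return all(constraint(x, state) for x in universe)

# Example: "every package is at some location" over a small finite universe.
state = {"p1": "depot", "p2": "truck1"}
packages = ["p1", "p2"]
print(forall_holds(packages, lambda p, s: s.get(p) is not None, state))  # True
```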
Quantifying swallowing function for healthy adults in different age groups using acoustic analysis
NASA Astrophysics Data System (ADS)
Leung, Man-Yin
Dysphagia is a medical condition that can lead to devastating complications including weight loss, aspiration pneumonia, dehydration, and malnutrition; hence, timely identification is essential. Current dysphagia evaluation tools are either invasive, time consuming, or highly dependent on the experience of an individual clinician. The present study aims to develop a non-invasive, quantitative screening tool for dysphagia identification by capturing acoustic data from swallowing and mastication. The first part of this study explores the feasibility of using acoustic data to quantify swallowing and mastication. This study then further identifies mastication and swallowing trends in a neurotypical adult population. An acoustic capture protocol for dysphagia screening is proposed. Finally, the relationship among speaking, lingual and mastication rates are explored. Results and future directions are discussed.
Are Water-lean Solvent Systems Viable for Post-Combustion CO 2 Capture?
Heldebrant, David J.; Koech, Phillip K.; Rousseau, Roger; ...
2017-08-18
Here we present an overview of water-lean solvents that compares their projected costs and performance to aqueous amine systems, emphasizing critical areas of study needed to evaluate their performance against their water-based brethren. The work presented here focuses on bridging these knowledge gaps. Because the majority of water-lean solvents are still at the lab scale, substantial studies are still needed to model their performance at scale. This presents a significant challenge, as each formulation has different physical and thermodynamic properties and behavior, and these differences must be quantified in conventional absorber-stripper configurations or accommodated by new configurations specific to a solvent's signature behavior. We identify critical areas of study that are needed, and our efforts (e.g. custom infrastructure, molecular models) to predict, measure, and model these behaviors. Such findings are critical for determining the rheology required for heat exchanger design; absorber designs and packing to accommodate solvents with gradient changes (e.g. viscosity, contact angle, surface tension); and stripper configurations without direct steam utilization or water reflux. Another critical area of research need is to understand the molecular structure of the liquid interface and bulk as a function of CO2 loading, and to assess whether conventional film theories accurately quantify solvent behavior, or if thermodynamic models adequately quantify activity coefficients of ions in solution. We conclude with an assessment of our efforts to aid in bridging the knowledge gaps in understanding water-lean solvents, and suggestions of what is needed to enable large-scale demonstrations to meet the United States Department of Energy's year 2030 goal.
A novel pneumatic micropipette aspiration method using a balance pressure model.
Zhao, Qili; Wu, Ming; Cui, Maosheng; Qin, Yanding; Yu, Jin; Sun, Mingzhu; Zhao, Xin; Feng, Xizeng
2013-12-01
This paper presents a novel micropipette aspiration (MA) method based on a common pneumatic micro-injection system. This method is the first to quantify the influence of the capillary effect on aspiration pressure using a balance pressure model and, in return, uses the capillary effect to quantify the aspiration pressure. Subsequently, the seal between the cell and the micropipette is detected to judge and exclude ineffective MA attempts. The rationality of the balance pressure model is validated by the designed micropipette-filling experiments. Through application to elasticity determination of cells of different sizes, the feasibility and versatility of this MA method are proved. With the ability to quantify aspiration pressures and detect the seal between the cell and the micropipette, our method is expected to advance the application of the commercial pneumatic injector in the MA of cells. Moreover, with the quantified volume of the liquid entering the micropipette during the MA process, our method also has potential applicability to the study of the permeability of the cell membrane in the future.
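The abstract does not give the balance pressure model itself. As a hedged illustration of how a capillary term enters such a pressure balance, the Young-Laplace pressure for a liquid meniscus in a cylindrical micropipette can be computed as below; this is not the authors' model, and all parameter values are assumptions.

```python
# Illustration only: the capillary pressure term that a balance-pressure model of
# micropipette aspiration would need to account for, via the Young-Laplace relation.
import math

def capillary_pressure(surface_tension_n_m, contact_angle_deg, pipette_radius_m):
    """Young-Laplace capillary pressure (Pa) inside a cylindrical micropipette."""
    return 2.0 * surface_tension_n_m * math.cos(math.radians(contact_angle_deg)) / pipette_radius_m

# Water-like medium (72 mN/m), 30 degree contact angle, 5 micrometre pipette radius
print(round(capillary_pressure(0.072, 30.0, 5e-6)), "Pa")   # roughly 2.5e4 Pa
```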
Analyzing Idioms and Their Frequency in Three Advanced ILI Textbooks: A Corpus-Based Study
ERIC Educational Resources Information Center
Alavi, Sepideh; Rajabpoor, Aboozar
2015-01-01
The present study aimed at identifying and quantifying the idioms used in three ILI "Advanced" level textbooks based on three different English corpora; MICASE, BNC and the Brown Corpus, and comparing the frequencies of the idioms across the three corpora. The first step of the study involved searching the books to find multi-word…
Children's interpretations of general quantifiers, specific quantifiers, and generics
Gelman, Susan A.; Leslie, Sarah-Jane; Was, Alexandra M.; Koch, Christina M.
2014-01-01
Recently, several scholars have hypothesized that generics are a default mode of generalization, and thus that young children may at first treat quantifiers as if they were generic in meaning. To address this issue, the present experiment provides the first in-depth, controlled examination of the interpretation of generics compared to both general quantifiers ("all Xs", "some Xs") and specific quantifiers ("all of these Xs", "some of these Xs"). We provided children (3 and 5 years) and adults with explicit frequency information regarding properties of novel categories, to chart when "some", "all", and generics are deemed appropriate. The data reveal three main findings. First, even 3-year-olds distinguish generics from quantifiers. Second, when children make errors, they tend to be in the direction of treating quantifiers like generics. Third, children were more accurate when interpreting specific versus general quantifiers. We interpret these data as providing evidence for the position that generics are a default mode of generalization, especially when reasoning about kinds. PMID:25893205
Roberti, Joshua A.; SanClements, Michael D.; Loescher, Henry W.; Ayres, Edward
2014-01-01
Even though fine-root turnover is a highly studied topic, it is often poorly understood as a result of uncertainties inherent in its sampling, e.g., quantifying spatial and temporal variability. While many methods exist to quantify fine-root turnover, use of minirhizotrons has increased over the last two decades, making sensor errors another source of uncertainty. Currently, no standardized methodology exists to test and compare minirhizotron camera capability, imagery, and performance. This paper presents a reproducible, laboratory-based method by which minirhizotron cameras can be tested and validated in a traceable manner. The performance of camera characteristics was identified and test criteria were developed: we quantified the precision of camera location for successive images, estimated the trueness and precision of each camera's ability to quantify root diameter and root color, and also assessed the influence of heat dissipation introduced by the minirhizotron cameras and electrical components. We report detailed and defensible metrology analyses that examine the performance of two commercially available minirhizotron cameras. These cameras performed differently with regard to the various test criteria and uncertainty analyses. We recommend a defensible metrology approach to quantify the performance of minirhizotron camera characteristics and determine sensor-related measurement uncertainties prior to field use. This approach is also extensible to other digital imagery technologies. In turn, these approaches facilitate a greater understanding of measurement uncertainties (signal-to-noise ratio) inherent in the camera performance and allow such uncertainties to be quantified and mitigated so that estimates of fine-root turnover can be more confidently quantified. PMID:25391023
Reed, Derek D; Kaplan, Brent A; Becirevic, Amel; Roma, Peter G; Hursh, Steven R
2016-07-01
Many adults engage in ultraviolet indoor tanning despite evidence of its association with skin cancer. The constellation of behaviors associated with ultraviolet indoor tanning is analogous to that in other behavioral addictions. Despite a growing literature on ultraviolet indoor tanning as an addiction, there remains no consensus on how to identify ultraviolet indoor tanning addictive tendencies. The purpose of the present study was to translate a behavioral economic task more commonly used in substance abuse to quantify the "abuse liability" of ultraviolet indoor tanning, establish construct validity, and determine convergent validity with the most commonly used diagnostic tools for ultraviolet indoor tanning addiction (i.e., mCAGE and mDSM-IV-TR). We conducted a between-groups study using a novel hypothetical Tanning Purchase Task to quantify intensity and elasticity of ultraviolet indoor tanning demand and permit statistical comparisons with the mCAGE and mDSM-IV-TR. Results suggest that behavioral economic demand is related to ultraviolet indoor tanning addiction status and adequately discriminates between potential addicted individuals from nonaddicted individuals. Moreover, we provide evidence that the Tanning Purchase Task renders behavioral economic indicators that are relevant to public health research. The present findings are limited to two ultraviolet indoor tanning addiction tools and a relatively small sample of high-risk ultraviolet indoor tanning users; however, these pilot data demonstrate the potential for behavioral economic assessment tools as diagnostic and research aids in ultraviolet indoor tanning addiction studies. © 2016 Society for the Experimental Analysis of Behavior.
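Demand intensity and elasticity from a purchase task are typically obtained by fitting a demand equation to price-consumption pairs. The sketch below uses the exponentiated demand model of Koffarnus et al. (2015) as one plausible choice; the abstract does not state which equation the authors fit, and the prices and session counts are invented.

```python
# Sketch: fitting one widely used behavioural-economic demand equation to
# purchase-task data to obtain demand intensity (Q0) and elasticity (alpha).
import numpy as np
from scipy.optimize import curve_fit

def exponentiated_demand(price, q0, alpha, k=2.0):
    """Consumption at a given price; k is the log-range constant (held fixed here)."""
    return q0 * 10.0 ** (k * (np.exp(-alpha * q0 * price) - 1.0))

prices = np.array([0.0, 1.0, 3.0, 5.0, 10.0, 20.0])      # hypothetical prices per session
sessions = np.array([8.0, 7.5, 6.0, 4.0, 1.5, 0.3])       # hypothetical tanning sessions

(q0_hat, alpha_hat), _ = curve_fit(exponentiated_demand, prices, sessions,
                                   p0=[8.0, 0.01], maxfev=10000)
print(f"intensity Q0 ~ {q0_hat:.1f}, elasticity alpha ~ {alpha_hat:.4f}")
```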
Observation and quantification of the quantum dynamics of a strong-field excited multi-level system.
Liu, Zuoye; Wang, Quanjun; Ding, Jingjie; Cavaletto, Stefano M; Pfeifer, Thomas; Hu, Bitao
2017-01-04
The quantum dynamics of a V-type three-level system, whose two resonances are first excited by a weak probe pulse and subsequently modified by another strong one, is studied. The quantum dynamics of the multi-level system is closely related to the absorption spectrum of the transmitted probe pulse and its modification manifests itself as a modulation of the absorption line shape. Applying the dipole-control model, the modulation induced by the second strong pulse to the system's dynamics is quantified by eight intensity-dependent parameters, describing the self and inter-state contributions. The present study opens the route to control the quantum dynamics of multi-level systems and to quantify the quantum-control process.
NASA Technical Reports Server (NTRS)
Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina
2004-01-01
A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
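One standard way to aggregate several experts' distributions into a single probability distribution is a weighted linear opinion pool, sketched below. The report's own ten-phase calibration and aggregation procedure is more involved, so the weights and distributions here are illustrative assumptions only.

```python
# Sketch of a weighted linear opinion pool over expert-specific Monte Carlo samples.
import numpy as np

def linear_opinion_pool(expert_samples, weights, n=10000, seed=0):
    """Pool expert sample sets into one distribution by resampling each expert
    in proportion to its (normalised) weight."""
    weights = np.asarray(weights, float)
    weights = weights / weights.sum()
    rng = np.random.default_rng(seed)
    pooled = [rng.choice(s, size=int(round(n * w)), replace=True)
              for s, w in zip(expert_samples, weights)]
    return np.concatenate(pooled)

# Three experts give triangular beliefs about an input parameter (e.g. a mass, kg)
rng = np.random.default_rng(1)
experts = [rng.triangular(900, 1000, 1200, 5000),
           rng.triangular(950, 1100, 1300, 5000),
           rng.triangular(850, 950, 1100, 5000)]
pooled = linear_opinion_pool(experts, weights=[0.5, 0.3, 0.2])
print(np.percentile(pooled, [5, 50, 95]).round(0))   # pooled uncertainty summary
```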
Influence of Noise Barriers on Near-Road and On-Road Air Quality: Results from Phoenix
The presentation describes field study results quantifying the impact of roadside barriers under real-world conditions in Phoenix, Arizona. Public health concerns regarding adverse health effects for populations spending significant amounts of time near high-traffic roadways have ...
DOT National Transportation Integrated Search
2004-02-01
This report explores the effectiveness of relying on commercial radio as a source of traveler information, and presents an approach to quantify mobility benefits from radio traffic advisories. The study, conducted for the Washington, DC, metropolitan...
Currie, Jens J; Stack, Stephanie H; McCordic, Jessica A; Kaufman, Gregory D
2017-08-15
Marine debris poses considerable threat to biodiversity and ecosystems and has been identified as a stressor for a variety of marine life. Here we present results from the first study quantifying the amount and type of debris accumulation in Maui leeward waters and relate this to cetacean distribution to identify areas where marine debris may present a higher threat. Transect surveys were conducted within the 4-island region of Maui, Hawai'i from April 1, 2013 to April 15, 2016. Debris was found in all areas of the study region with higher concentrations observed where the Au'au, Kealaikahiki, and Alalakeiki channels converge. The degree of overlap between debris and cetaceans varied among species but was largest for humpback whales, which account for the largest portion of reported entanglements in the 4-island region of Maui. Identifying areas of high debris-cetacean density overlap can facilitate species management and debris removal efforts. Copyright © 2017 Elsevier Ltd. All rights reserved.
Chinese air pollution embodied in trade
NASA Astrophysics Data System (ADS)
Davis, S. J.
2014-12-01
Rapid economic development in China has been accompanied by high levels of air pollution in many areas of China. Although researchers have applied a range of methods to monitor and track pollutant emissions in the atmosphere, studies of the underlying economic and technological drivers of this pollution have received considerably less attention. I will present results of a series of studies that have quantified the air pollutants embodied in goods being traded both within China and internationally. The results show that trade is facilitating the concentration of pollution in less economically developed areas, which in turn export pollution-intensive goods to more affluent areas. However, the export-related pollution itself is sometimes transported long distances; for instance, we have quantified the impacts of the Chinese pollution embodied in internationally-exported goods on air quality in the US. These findings have important implications for Chinese efforts to curb CO2 emissions and improve air quality. The research to be presented reflects the efforts of a multiple-year, ongoing collaboration among interdisciplinary researchers in China, the US and the UK.
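The accounting behind "emissions embodied in trade" is typically environmentally extended input-output analysis. The toy, single-region sketch below shows the core Leontief-inverse step; the actual studies use multi-regional tables and empirically derived coefficients, and all numbers here are made up.

```python
# Minimal single-region sketch: total (direct + upstream) emissions required to
# deliver an export final demand are f (I - A)^(-1) y.
import numpy as np

A = np.array([[0.10, 0.20],       # inter-industry technical coefficients (toy values)
              [0.30, 0.05]])
f = np.array([2.0, 0.5])          # direct pollutant intensity per unit output (kt per unit)
y_export = np.array([10.0, 4.0])  # final demand served by exports (units of output)

leontief = np.linalg.inv(np.eye(2) - A)
embodied = f @ leontief @ y_export
print(round(float(embodied), 2), "kt embodied in exports")
```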
Quantifying Reinforcement Value and Demand for Psychoactive Substances in Humans
Heinz, Adrienne J.; Lilje, Todd C.; Kassel, Jon D.; de Wit, Harriet
2013-01-01
Behavioral economics is an emerging cross-disciplinary field that is providing an exciting new contextual framework for researchers to study addictive processes. New initiatives to study addiction under a behavioral economic rubric have yielded variable terminology and differing methods and theoretical approaches that are consistent with the multidimensional nature of addiction. The present article is intended to provide an integrative overview of the behavioral economic nomenclature and to describe relevant theoretical models, principles and concepts. Additionally, we present measures derived from behavioral economic theories that quantify demand for substances and assess decision making processes surrounding substance use. The sensitivity of these measures to different contextual elements (e.g., drug use status, acute drug effects, deprivation) is also addressed. The review concludes with discussion of the validity of these approaches and their potential for clinical application and highlights areas that warrant further research. Overall, behavioral economics offers a compelling framework to help explicate complex addictive processes and it is likely to provide a translational platform for clinical intervention. PMID:23062106
NASA Technical Reports Server (NTRS)
Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter, III; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina;
2016-01-01
This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify the end-to-end system performance for high-contrast starlight suppression. This pathfinder system will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.
Wang, Jing; Toloczko, Mychailo B.; Kruska, Karen; ...
2017-11-17
Accelerator-based ion beam irradiation techniques have been used to study radiation effects in materials for decades. Although carbon contamination induced by ion beams in target materials is a well-known issue in some material systems, it has not been fully characterized nor quantified for studies in ferritic/martensitic (F/M) steels that are candidate materials for applications such as core structural components in advanced nuclear reactors. It is an especially important issue for this class of material because of the strong effect of carbon level on precipitate formation. In this paper, the ability to quantify carbon contamination using three common techniques, namely time-of-flight secondary ion mass spectrometry (ToF-SIMS), atom probe tomography (APT), and transmission electron microscopy (TEM), is compared. Their effectiveness and shortcomings in determining carbon contamination are presented and discussed. The corresponding microstructural changes related to carbon contamination in ion irradiated F/M steels are also presented and briefly discussed.
Hausberger, Anna; Lamanna, William C; Hartinger, Martin; Seidl, Andreas; Toll, Hansjoerg; Holzmann, Johann
2016-06-01
Filgrastim is a recombinant, non-glycosylated form of human granulocyte colony-stimulating factor, used to stimulate leukocyte proliferation in patients suffering from neutropenia. Since the expiration of patents associated with Amgen's filgrastim biopharmaceutical, Neupogen(®), in 2006, a number of filgrastim products have been marketed; however, a detailed characterization and comparison of variants associated with these products have not been publically reported. The objective of this study was to identify and quantify product-related variants in filgrastim reference products and biosimilars thereof that are presently available in highly regulated markets. In this study, we used intact and top-down mass spectrometry to identify and quantify product-related variants in filgrastim products. Mass spectrometry has become the method of choice for physicochemical characterization of biopharmaceuticals, allowing accurate and sensitive characterization of product-related variants. In addition to modifications ubiquitously present in biopharmaceuticals, such as methionine oxidation and asparagine/glutamine deamidation, we identified six different low-level, product-related variants present in some, but not all, of the tested products. Two variants, an acetylated filgrastim variant and a filgrastim variant containing an additional C-terminal tryptophan extension, are newly identified variants. This study demonstrates that filgrastim products already in widespread clinical use in highly regulated markets differ in low-level, product-related variants present at levels mostly below 1 % relative abundance. This study provides a comprehensive catalog of minor differences between filgrastim products and suggests that the filgrastim product-related variants described here are not clinically relevant when present at low abundance.
Plasmonic imaging of protein interactions with single bacterial cells.
Syal, Karan; Wang, Wei; Shan, Xiaonan; Wang, Shaopeng; Chen, Hong-Yuan; Tao, Nongjian
2015-01-15
Quantifying the interactions of bacteria with external ligands is fundamental to the understanding of pathogenesis, antibiotic resistance, immune evasion, and mechanism of antimicrobial action. Due to inherent cell-to-cell heterogeneity in a microbial population, each bacterium interacts differently with its environment. This large variability is washed out in bulk assays, and there is a need of techniques that can quantify interactions of bacteria with ligands at the single bacterium level. In this work, we present a label-free and real-time plasmonic imaging technique to measure the binding kinetics of ligand interactions with single bacteria, and perform statistical analysis of the heterogeneity. Using the technique, we have studied interactions of antibodies with single Escherichia coli O157:H7 cells and demonstrated a capability of determining the binding kinetic constants of single live bacteria with ligands, and quantify heterogeneity in a microbial population. Copyright © 2014 Elsevier B.V. All rights reserved.
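Binding kinetic constants for a 1:1 interaction are commonly extracted by fitting an observed association rate at each ligand concentration and then regressing it against concentration. Whether the authors used exactly this scheme is not stated in the abstract; the concentrations and rates below are synthetic.

```python
# Sketch: the classic pseudo-first-order analysis, where the observed association
# rate satisfies k_obs = kon * C + koff for a 1:1 binding model.
import numpy as np

conc = np.array([2e-9, 5e-9, 10e-9, 20e-9])          # antibody concentrations (M), assumed
k_obs = np.array([1.4e-3, 2.0e-3, 3.0e-3, 5.0e-3])   # fitted observed rates (1/s), synthetic

k_on, k_off = np.polyfit(conc, k_obs, 1)              # slope = kon, intercept = koff
print(f"kon ~ {k_on:.2e} 1/(M s), koff ~ {k_off:.2e} 1/s, KD ~ {k_off / k_on:.2e} M")
```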
Sensors: Views of Staff of a Disability Service Organization
Wolbring, Gregor; Leopatra, Verlyn
2013-01-01
Sensors have become ubiquitous in their reach and scope of application. They are a technological cornerstone for various modes of health surveillance and participatory medicine, such as quantifying oneself; they are also employed to track people whose ability differences are perceived as impairments. This paper presents quantitative and qualitative data from an exploratory, non-generalizable study into the perceptions, attitudes and concerns of staff of a disability service organization, which mostly serves people with intellectual disabilities, towards the use of various types of sensor technologies that might be used by and with their clients. In addition, perspectives on various types of privacy issues linked to sensors, as well as data regarding the concept of the quantified self, were obtained. Our results highlight the need to involve disabled people and their support networks in sensor and quantified-self discourses, in order to prevent undue disadvantages. PMID:25562409
Infrastructure Vulnerability Assessment Model (I-VAM).
Ezell, Barry Charles
2007-06-01
Quantifying vulnerability to critical infrastructure has not been adequately addressed in the literature. Thus, the purpose of this article is to present a model that quantifies vulnerability. Vulnerability is defined as a measure of system susceptibility to threat scenarios. This article asserts that vulnerability is a condition of the system and it can be quantified using the Infrastructure Vulnerability Assessment Model (I-VAM). The model is presented and then applied to a medium-sized clean water system. The model requires subject matter experts (SMEs) to establish value functions and weights, and to assess protection measures of the system. Simulation is used to account for uncertainty in measurement, aggregate expert assessment, and to yield a vulnerability (Omega) density function. Results demonstrate that I-VAM is useful to decisionmakers who prefer quantification to qualitative treatment of vulnerability. I-VAM can be used to quantify vulnerability to other infrastructures, supervisory control and data acquisition systems (SCADA), and distributed control systems (DCS).
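A rough sketch of the style of calculation described (SME-elicited weights and value functions, with simulation over assessment uncertainty) follows. The weights, value functions, score ranges, and the mapping from aggregated protection to the vulnerability measure Omega are all assumptions made for illustration, not the article's actual model.

```python
# Sketch of a weighted value-function aggregation with Monte Carlo uncertainty,
# in the spirit of I-VAM (all numbers and functional forms are assumed).
import numpy as np

rng = np.random.default_rng(3)
weights = np.array([0.4, 0.35, 0.25])              # SME-elicited weights, sum to 1

def value_function(scores):
    """Map raw 0-10 protection scores to 0-1 value (linear here for simplicity)."""
    return np.clip(scores / 10.0, 0.0, 1.0)

# SME assessments of three protection measures, with measurement uncertainty
mean_scores = np.array([6.0, 4.5, 7.0])
score_sd = np.array([1.0, 1.5, 0.8])

n = 20000
draws = rng.normal(mean_scores, score_sd, size=(n, 3))
protection = value_function(draws) @ weights       # aggregated protection per draw
omega = 1.0 - protection                            # vulnerability taken as the complement
                                                    # of protection (an assumption)
print("Omega median and 90% interval:", np.percentile(omega, [5, 50, 95]).round(2))
```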
X-ray photoelectron spectroscopy of select multi-layered transition metal carbides (MXenes)
NASA Astrophysics Data System (ADS)
Halim, Joseph; Cook, Kevin M.; Naguib, Michael; Eklund, Per; Gogotsi, Yury; Rosen, Johanna; Barsoum, Michel W.
2016-01-01
In this work, a detailed high resolution X-ray photoelectron spectroscopy (XPS) analysis is presented for select MXenes, a recently discovered family of two-dimensional (2D) carbides and carbonitrides. Given their 2D nature, understanding their surface chemistry is paramount. Herein we identify and quantify the surface groups present before, and after, sputter-cleaning as well as freshly prepared vs. aged multi-layered cold pressed discs. The nominal compositions of the MXenes studied here are Ti3C2Tx, Ti2CTx, Ti3CNTx, Nb2CTx and Nb4C3Tx, where T represents surface groups that this work attempts to quantify. In all the cases, the presence of three surface terminations, -O, -OH and -F, in addition to OH-terminations relatively strongly bonded to H2O molecules, was confirmed. From XPS peak fits, it was possible to establish the average sum of the negative charges of the terminations for the aforementioned MXenes. Based on this work, it is now possible to quantify the nature of the surface terminations. This information can, in turn, be used to better design and tailor these novel 2D materials for various applications.
New dual in-growth core isotopic technique to assess the root litter carbon input to the soil
USDA-ARS?s Scientific Manuscript database
The root-derived carbon (C) input to the soil, whose quantification is often neglected because of methodological difficulties, is considered a crucial C flux for soil C dynamics and net ecosystem productivity (NEP) studies. In the present study, we compared two independent methods to quantify this C...
Week Long Topography Study of Young Adults Using Electronic Cigarettes in Their Natural Environment.
Robinson, R J; Hensel, E C; Roundtree, K A; Difrancesco, A G; Nonnemaker, J M; Lee, Y O
2016-01-01
Results of an observational, descriptive study quantifying topography characteristics of twenty first-generation electronic nicotine delivery system users in their natural environment for a one-week observation period are presented. The study quantifies inter-participant variation in puffing topography between users and the intra-participant variation for each user observed during one week of use in their natural environment. Puff topography characteristics presented for each user include mean puff duration, flow rate and volume for each participant, along with descriptive statistics of each quantity. Exposure characteristics including the number of vaping sessions, total number of puffs and cumulative volume of aerosol generated from ENDS use (e-liquid aerosol) are reported for each participant for a one-week exposure period and an effective daily average exposure. Significant inter-participant and intra-participant variation in puff topography was observed. The observed range of natural use environment characteristics is used to propose a set of topography protocols for use as command inputs to drive machine-puffed electronic nicotine delivery systems in a controlled laboratory environment.
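The per-puff quantities reported (duration, mean flow rate, volume) follow directly from a sampled flow-rate trace. The sketch below shows that arithmetic with a synthetic puff; the detection threshold, sampling interval, and waveform are assumptions, not the study's instrument settings.

```python
# Sketch: deriving per-puff duration, mean flow and volume from a flow-rate trace.
import numpy as np

def puff_metrics(flow_ml_s, dt_s=0.01, threshold_ml_s=1.0):
    """Return (duration_s, mean_flow_ml_s, volume_ml) for one puff, defined as the
    samples where flow exceeds a detection threshold."""
    flow = np.asarray(flow_ml_s, float)
    active = flow > threshold_ml_s
    duration = active.sum() * dt_s
    volume = flow[active].sum() * dt_s            # rectangle-rule integration of flow
    mean_flow = volume / duration if duration > 0 else 0.0
    return duration, mean_flow, volume

t = np.arange(0, 3, 0.01)
flow = 30 * np.exp(-((t - 1.5) / 0.5) ** 2)       # synthetic puff, peak 30 mL/s
print([round(x, 2) for x in puff_metrics(flow)])  # [duration s, mean mL/s, volume mL]
```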
Methods for microbiological quality assessment in drinking water: a comparative study.
Helmi, K; Barthod, F; Méheut, G; Henry, A; Poty, F; Laurent, F; Charni-Ben-Tabassi, N
2015-03-01
The present study aimed to compare several methods for quantifying and discriminating between the different physiological states of a bacterial population present in drinking water. Flow cytometry (FCM), solid-phase cytometry (SPC), epifluorescence microscopy (MSP) and culture method performances were assessed by comparing the results obtained for different water samples. These samples, including chlorinated and non-chlorinated water, were collected in a drinking water treatment plant. Total bacteria were quantified by using SYBR Green II (for FCM) and 4',6-diamidino-2-phenylindole (DAPI) (for MSP), viable and non-viable bacteria were distinguished by using SYBR Green II and propidium iodide dual staining (for FCM), and active cells were distinguished by using CTC (for MSP) and Chemchrome V6 (for FCM and SPC). Under our conditions, counts using microscopy and FCM were significantly correlated regarding total bacteria and active cells. Conversely, counts from solid-phase cytometry and FCM did not agree significantly for active bacteria. Moreover, the R2A medium showed that bacterial culturability could be recovered after chlorination. This study highlights that FCM appears to be a useful and powerful technique for drinking water production monitoring.
NASA Astrophysics Data System (ADS)
Kort, E. A.; Gvakharia, A.; Smith, M. L.; Conley, S.; Frauhammer, K.
2017-12-01
Nitrous oxide (N2O) is a crucial atmospheric trace gas that drives 21st century stratospheric ozone depletion and substantively impacts climate. Anthropogenic emissions drive the global imbalance and annual growth of N2O, and the dominant anthropogenic source is fertilizer production and application, both of which have large uncertainties. In this presentation we will discuss the FEAST campaign, a study designed to demonstrate new approaches to quantify N2O emissions from fertilizer production and usage with aircraft measurements. In the FEAST campaign we deployed new instrumentation along with flight-proven sensors onboard the Scientific Aviation Mooney aircraft to make 40 hours of continuous 1 Hz measurements of N2O, CO2, CO, H2O, CH4, O3, temperature, and winds. The Mississippi River Valley provided an optimal target, as this location includes significant fertilizer production facilities as well as large cropland areas (dominated by corn, soy, rice, and cotton) with substantive fertilizer application. By leveraging our payload and unique airborne capabilities we directly observe and quantify N2O emissions from individual fertilizer production facilities (as well as CO2 and CH4 emissions from these same facilities). We are also able to quantify N2O fluxes from large cropland areas (~100s of km) employing a mass balance approach, a first for N2O, and will show results highlighting differences between crop types and amounts of applied fertilizer. The ability to quantify fluxes of croplands at the 100 km scale enables new understanding of the processes controlling emissions at spatial scales that have eluded prior studies, which either rely on extrapolation of small-scale measurements (flux chambers, towers) or work on 1,000+ km spatial scales (regional-global inversions from atmospheric measurements).
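A simplified version of the aircraft mass-balance calculation is sketched below. The real analysis integrates the enhancement and the perpendicular wind over a sampled downwind "wall" of the plume, whereas this sketch collapses that integral to single mean values; every number and variable name is illustrative.

```python
# Sketch of an aircraft mass-balance emission estimate: source strength equals the
# enhancement above background times the wind component perpendicular to the flight
# track, integrated over the downwind plume cross-section (here, mean values only).

def mass_balance_flux(enhancement_ppb, wind_perp_m_s, width_m, depth_m,
                      air_molar_density_mol_m3=41.6, molar_mass_g_mol=44.0):
    """Point-estimate source strength (g/s) from mean plume properties.

    enhancement_ppb : mean N2O enhancement over background in the plume
    wind_perp_m_s   : mean wind component perpendicular to the flight track
    width_m, depth_m: horizontal and vertical extent of the sampled plume
    """
    mole_fraction = enhancement_ppb * 1e-9
    mol_per_s = mole_fraction * air_molar_density_mol_m3 * wind_perp_m_s * width_m * depth_m
    return mol_per_s * molar_mass_g_mol

# Example: 2 ppb enhancement, 4 m/s wind, 5 km wide by 500 m deep plume
print(round(mass_balance_flux(2.0, 4.0, 5000.0, 500.0), 1), "g N2O per second")
```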
SCIENTIFIC UNCERTAINTIES IN ATMOSPHERIC MERCURY MODELS II: SENSITIVITY ANALYSIS IN THE CONUS DOMAIN
In this study, we present the response of model results to different scientific treatments in an effort to quantify the uncertainties caused by the incomplete understanding of mercury science and by model assumptions in atmospheric mercury models. Two sets of sensitivity simulati...
Quantifying plant age and available water effects on soybean leaf conductance
USDA-ARS?s Scientific Manuscript database
In this study, we present data characterizing the effects of soil moisture levels on total leaf conductance for two distinct determinate soybean (Glycine max (L.) Merr.) genotypes and subsequently use these data to formulate and validate a model that characterizes total leaf conductance. Conductanc...
Binge Drinking in Young Adults: Data, Definitions and Determinants
ERIC Educational Resources Information Center
Courtney, Kelly E.; Polich, John
2009-01-01
Binge drinking is an increasingly important topic in alcohol research, but the field lacks empirical cohesion and definitional precision. The present review summarizes findings and viewpoints from the scientific binge-drinking literature. Epidemiological studies quantify the seriousness of alcohol-related problems arising from binge drinking, with…
HYDRAULIC REDISTRIBUTION IN A DOUGLAS-FIR FOREST: LESSONS FROM SYSTEM MANIPULATIONS
Hydraulic redistribution (HR) occurs in many ecosystems; however, key questions remain about its consequences at the ecosystem level. The objectives of the present study were to quantify seasonal variation in HR and its driving force, and to manipulate the soil-root system to e...
NASA Astrophysics Data System (ADS)
Tarancón, A.; García, J. F.; Rauret, G.
2004-01-01
Plastic scintillation has recently been shown to be a powerful alternative to liquid scintillation and Cherenkov techniques in radionuclide determination due to the good values obtained for the measurement parameters and the low amount of waste generated. The present study evaluated the capability of plastic scintillation beads and polyethylene vials for routine measurements of beta emitters (90Sr, 14C, 3H). Results show that high- and medium-energy beta emitters can be quantified with relative errors of less than 5% in low-activity aqueous samples, whereas low-energy beta emitters can only be quantified in medium-activity samples.
Peterson, Courtney M; Apolzan, John W; Wright, Courtney; Martin, Corby K
2016-11-01
We conducted two studies to test the validity, reliability, feasibility and acceptability of using video chat technology to quantify dietary and pill-taking (i.e. supplement and medication) adherence. In study 1, we investigated whether video chat technology can accurately quantify adherence to dietary and pill-taking interventions. Mock study participants ate food items and swallowed pills, while performing randomised scripted 'cheating' behaviours to mimic non-adherence. Monitoring was conducted in a cross-over design, with two monitors watching in-person and two watching remotely by Skype on a smartphone. For study 2, a twenty-two-item online survey was sent to a listserv with more than 20 000 unique email addresses of past and present study participants to assess the feasibility and acceptability of the technology. For the dietary adherence tests, monitors detected 86 % of non-adherent events (sensitivity) in-person v. 78 % of events via video chat monitoring (P=0·12), with comparable inter-rater agreement (0·88 v. 0·85; P=0·62). However, for pill-taking, non-adherence trended towards being more easily detected in-person than by video chat (77 v. 60 %; P=0·08), with non-significantly higher inter-rater agreement (0·85 v. 0·69; P=0·21). Survey results from study 2 (n = 1076 respondents; ≥5 % response rate) indicated that 86·4 % of study participants had video chatting hardware, 73·3 % were comfortable using the technology and 79·8 % were willing to use it for clinical research. Given the capability of video chat technology to reduce participant burden and outperform other adherence monitoring methods such as dietary self-report and pill counts, video chatting is a novel and promising platform to quantify dietary and pill-taking adherence.
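Two of the summary statistics reported, sensitivity to scripted non-adherence and inter-rater agreement, can be computed as below. The abstract does not say whether agreement was raw per cent agreement or a chance-corrected index, so Cohen's kappa is shown as one common choice; all event counts and ratings are invented.

```python
# Sketch: detection sensitivity and Cohen's kappa for two binary raters (0/1 codes).
import numpy as np

def sensitivity(detected, total_events):
    """Share of scripted non-adherent events that a monitor detected."""
    return detected / total_events

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two binary raters coded 0/1."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    po = np.mean(a == b)                                   # observed agreement
    pe = np.mean(a) * np.mean(b) + (1 - np.mean(a)) * (1 - np.mean(b))  # chance agreement
    return (po - pe) / (1 - pe)

print(sensitivity(39, 50))                                 # e.g. 0.78 of events caught remotely
print(round(cohens_kappa([1, 1, 0, 1, 0, 1, 1, 0],
                         [1, 1, 0, 1, 1, 1, 0, 0]), 2))    # e.g. 0.47 agreement beyond chance
```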
Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food.
Jacobs, Rianne; van der Voet, Hilko; Ter Braak, Cajo J F
Insight into risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems with which the risk assessment of nanoparticles is faced is the lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Due to the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a better understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.
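The integrated probabilistic approach described above separates variability from uncertainty; a minimal two-dimensional (nested) Monte Carlo sketch of that idea follows. All distributions, parameter names and the exposure:hazard ratio endpoint are illustrative assumptions rather than the study's actual model.

```python
# Nested Monte Carlo sketch: the outer loop resamples parameters to represent
# uncertainty (bootstrap-style), the inner loop samples individuals to
# represent variability. Every distribution here is an assumption.
import numpy as np

rng = np.random.default_rng(0)
n_uncertainty, n_individuals = 200, 1000

high_exposure_ratios = []
for _ in range(n_uncertainty):
    # Outer loop: uncertain parameters (e.g. mean log-exposure, benchmark dose)
    mu_log_exposure = rng.normal(loc=0.0, scale=0.3)
    benchmark_dose = rng.lognormal(mean=2.0, sigma=0.5)

    # Inner loop: variability between consumers
    individual_exposure = rng.lognormal(mean=mu_log_exposure, sigma=0.8,
                                        size=n_individuals)
    ratio = individual_exposure / benchmark_dose        # exposure:hazard ratio
    high_exposure_ratios.append(np.percentile(ratio, 99))  # high-exposure individual

# Uncertainty interval around the 99th-percentile exposure:hazard ratio
print(np.percentile(high_exposure_ratios, [2.5, 50, 97.5]))
```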
Is hyporheic flow an indicator for salmonid spawning site selection?
NASA Astrophysics Data System (ADS)
Benjankar, R. M.; Tonina, D.; Marzadri, A.; McKean, J. A.; Isaak, D.
2015-12-01
Several studies have investigated the role of hydraulic variables in the selection of spawning sites by salmonids. Some recent studies suggest that the intensity of the ambient hyporheic flow (i.e., the flow present without a salmon egg pocket) is a cue for spawning site selection, but others have argued against it. We tested this hypothesis by using a unique dataset of field-surveyed spawning site locations and an unprecedented meter-scale resolution bathymetry of a 13.5 km long reach of Bear Valley Creek (Idaho, USA), an important Chinook salmon spawning stream. We used a two-dimensional surface water model to quantify stream hydraulics and a three-dimensional hyporheic model to quantify the hyporheic flows. Our results show that the intensity of ambient hyporheic flows is not a statistically significant variable for spawning site selection. Conversely, the intensity of the water surface curvature and the habitat quality, quantified as a function of stream hydraulics and morphology, are the most important variables for salmonid spawning site selection. KEY WORDS: Salmonid spawning habitat, pool-riffle system, habitat quality, surface water curvature, hyporheic flow
Kokaly, R.F.; Asner, Gregory P.; Ollinger, S.V.; Martin, M.E.; Wessman, C.A.
2009-01-01
For two decades, remotely sensed data from imaging spectrometers have been used to estimate non-pigment biochemical constituents of vegetation, including water, nitrogen, cellulose, and lignin. This interest has been motivated by the important role that these substances play in physiological processes such as photosynthesis, their relationships with ecosystem processes such as litter decomposition and nutrient cycling, and their use in identifying key plant species and functional groups. This paper reviews three areas of research to improve the application of imaging spectrometers to quantify non-pigment biochemical constituents of plants. First, we examine recent empirical and modeling studies that have advanced our understanding of leaf and canopy reflectance spectra in relation to plant biochemistry. Next, we present recent examples of how spectroscopic remote sensing methods are applied to characterize vegetation canopies, communities and ecosystems. Third, we highlight the latest developments in using imaging spectrometer data to quantify net primary production (NPP) over large geographic areas. Finally, we discuss the major challenges in quantifying non-pigment biochemical constituents of plant canopies from remotely sensed spectra.
Structures and materials technology needs for communications and remote sensing spacecraft
NASA Technical Reports Server (NTRS)
Gronet, M. J.; Jensen, G. A.; Hoskins, J. W.
1995-01-01
This report documents trade studies conducted from the perspective of a small spacecraft developer to determine and quantify the structures and structural materials technology development needs for future commercial and NASA small spacecraft to be launched in the period 1999 to 2005. Emphasis is placed on small satellites weighing less than 1800 pounds for two focus low-Earth orbit missions: commercial communications and remote sensing. The focus missions are characterized in terms of orbit, spacecraft size, performance, and design drivers. Small spacecraft program personnel were interviewed to determine their technology needs, and the results are summarized. A systems-analysis approach for quantifying the benefits of inserting advanced state-of-the-art technologies into a current reference, state-of-the-practice small spacecraft design is developed and presented. This approach is employed in a set of abbreviated trade studies to quantify the payoffs of using a subset of 11 advanced technologies selected from the interview results. The 11 technology development opportunities are then ranked based on their relative payoff. Based on the strong potential for significant benefits, recommendations are made to pursue development of 8 of the 11 technologies. Other important technology development areas identified are recommended for further study.
Fractal Analysis in Agrophysics
USDA-ARS?s Scientific Manuscript database
The geometric irregularity is an intrinsic property of soils and plants. This geometric irregularity is easy to perceive and observe, but quantifying it has long presented a daunting challenge. Such quantifying is imperative because the geometric irregularity is the cause and the reflection of spati...
van den Noort, Josien C; Verhagen, Rens; van Dijk, Kees J; Veltink, Peter H; Vos, Michelle C P M; de Bie, Rob M A; Bour, Lo J; Heida, Ciska T
2017-10-01
This proof-of-principle study describes the methodology and explores and demonstrates the applicability of a system, consisting of miniature inertial sensors on the hand and a separate force sensor, to objectively quantify hand motor symptoms in patients with Parkinson's disease (PD) in a clinical setting (off- and on-medication condition). Four PD patients were measured in the off- and on-dopaminergic medication conditions. Finger tapping, rapid hand opening/closing, hand pro/supination, tremor during rest, mental task and kinetic task, and wrist rigidity movements were measured with the system (called the PowerGlove). To demonstrate applicability, various outcome parameters of measured hand motor symptoms of the patients in the off- vs. on-medication conditions are presented. The methodology described and results presented show applicability of the PowerGlove in a clinical research setting, to objectively quantify hand bradykinesia, tremor and rigidity in PD patients, using a single system. The PowerGlove measured a difference in off- vs. on-medication condition in all tasks in the presented patients with most of its outcome parameters. Further study into the validity and reliability of the outcome parameters is required in a larger cohort of patients, to arrive at an optimal set of parameters that can assist in clinical evaluation and decision-making.
Frequency of discriminative sensory loss in the hand after stroke in a rehabilitation setting.
Carey, Leeanne M; Matyas, Thomas A
2011-02-01
Somatosensory loss following stroke is common, with negative consequences for functional outcome. However, existing studies typically do not include quantitative measures of discriminative sensibility. The aim of this study was to quantify the proportion of stroke patients presenting with discriminative sensory loss of the hand in the post-acute rehabilitation phase. Prospective cohort study of stroke survivors presenting for rehabilitation. Fifty-one consecutive patients admitted to a metropolitan rehabilitation centre over a continuous 12-month period who met selection criteria. Quantitative measures of touch discrimination and limb position sense, with high re-test reliability, good discriminative test properties and objective criteria of abnormality, were employed. Both upper limbs were tested, in counterbalanced order. Impaired touch discrimination was identified in the hand contralateral to the lesion in 47% of patients, and in the ipsilesional hand in 16%. Forty-nine percent showed impaired limb position sense in the contralesional limb and 20% in the ipsilesional limb. Sixty-seven percent demonstrated impairment of at least one modality in the contralesional limb. Ipsilesional impairment was less severe. Discriminative sensory impairment was quantified in the contralesional hand in approximately half of stroke patients presenting for rehabilitation. A clinically significant number also experienced impairment in the ipsilesional "unaffected" hand.
Quantification of mouse in vivo whole-body vibration amplitude from motion-blur using x-ray imaging.
Hu, Zhengyi; Welch, Ian; Yuan, Xunhua; Pollmann, Steven I; Nikolov, Hristo N; Holdsworth, David W
2015-08-21
Musculoskeletal effects of whole-body vibration on animals and humans have become an intensely studied topic recently, due to the potential of applying this method as a non-pharmacological therapy for strengthening bones. It is relatively easy to quantify the transmission of whole-body mechanical vibration through the human skeletal system using accelerometers. However, this is not the case for small-animal pre-clinical studies because currently available accelerometers have a large mass, relative to the mass of the animals, which causes the accelerometers themselves to affect the way vibration is transmitted. Additionally, live animals do not typically remain motionless for long periods, unless they are anesthetized, and they are required to maintain a static standing posture during these studies. These challenges provide the motivation for the development of a method to quantify vibrational transmission in small animals. We present a novel imaging technique to quantify whole-body vibration transmission in small animals using 280 μm diameter tungsten carbide beads implanted into the hind limbs of mice. Employing time-exposure digital x-ray imaging, vibrational amplitude is quantified based on the blurring of the implanted beads caused by the vibrational motion. Our in vivo results have shown this technique is capable of measuring vibration amplitudes as small as 0.1 mm, with precision as small as ±10 μm, allowing us to distinguish differences in the transmitted vibration at different locations on the hindlimbs of mice.
Quantification of mouse in vivo whole-body vibration amplitude from motion-blur using x-ray imaging
NASA Astrophysics Data System (ADS)
Hu, Zhengyi; Welch, Ian; Yuan, Xunhua; Pollmann, Steven I.; Nikolov, Hristo N.; Holdsworth, David W.
2015-08-01
Musculoskeletal effects of whole-body vibration on animals and humans have become an intensely studied topic recently, due to the potential of applying this method as a non-pharmacological therapy for strengthening bones. It is relatively easy to quantify the transmission of whole-body mechanical vibration through the human skeletal system using accelerometers. However, this is not the case for small-animal pre-clinical studies because currently available accelerometers have a large mass, relative to the mass of the animals, which causes the accelerometers themselves to affect the way vibration is transmitted. Additionally, live animals do not typically remain motionless for long periods, unless they are anesthetized, and they are required to maintain a static standing posture during these studies. These challenges provide the motivation for the development of a method to quantify vibrational transmission in small animals. We present a novel imaging technique to quantify whole-body vibration transmission in small animals using 280 μm diameter tungsten carbide beads implanted into the hind limbs of mice. Employing time-exposure digital x-ray imaging, vibrational amplitude is quantified based on the blurring of the implanted beads caused by the vibrational motion. Our in vivo results have shown this technique is capable of measuring vibration amplitudes as small as 0.1 mm, with precision as small as ±10 μm, allowing us to distinguish differences in the transmitted vibration at different locations on the hindlimbs of mice.
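A hedged sketch of the geometric idea behind the motion-blur measurement follows: assuming a time-exposure image of a sinusoidally vibrating bead produces a streak whose extent equals the bead diameter plus the peak-to-peak displacement, the amplitude is half of the excess. The simple thresholding step and the numbers are illustrative assumptions, not the authors' image-processing pipeline.

```python
# Sketch of amplitude estimation from a blurred-bead intensity profile.
# Assumes streak extent = bead diameter + peak-to-peak displacement.
import numpy as np

def blur_amplitude(profile, pixel_size_um, bead_diameter_um=280.0):
    """Estimate vibration amplitude (um) from a 1-D intensity profile of a
    blurred bead along the vibration axis."""
    mask = profile > 0.5 * profile.max()          # crude streak segmentation
    streak_um = mask.sum() * pixel_size_um        # streak extent in microns
    peak_to_peak = max(streak_um - bead_diameter_um, 0.0)
    return peak_to_peak / 2.0                     # amplitude = half of p-p

# Synthetic example: a 350 um streak sampled at 10 um per pixel
profile = np.zeros(100)
profile[30:65] = 1.0                              # 35 pixels -> 350 um streak
print(blur_amplitude(profile, pixel_size_um=10))  # ~35 um amplitude
```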
We present the results of monthly sediment and water quality surveys to evaluate the impact of intermittent, seasonal hypoxia on benthic habitat condition. This study was conducted in the Pensacola Bay (Florida) estuary across nine sites extending from the mouth of the Escambia ...
Source Apportionment of the Summer Time Carbonaceous Aerosol at Nordic Rural Background Sites
In the present study, natural and anthropogenic sources of particulate organic carbon (OCp) and elemental carbon (EC) have been quantified based on weekly filter samples of PM10 (particles with aerodynamic diameter <10 µm) collected at four Nordic rural backgro...
The paper presents a new approach to quantifying emissions from fugitive gaseous air pollution sources. Computed tomography (CT) and path-integrated optical remote sensing (PI-ORS) concentration data are combined in a new field beam geometry. Path-integrated concentrations are ...
Quantifying landfill biogas production potential in the U.S.
USDA-ARS?s Scientific Manuscript database
This study presents an overview of the biogas (biomethane) availability in U.S. landfills, calculated from EPA estimates of landfill capacities. This survey concludes that the volume of landfill-derived methane in the U.S. is 466 billion cubic feet per year, of which 66 percent is collected and onl...
There are reported insecticide residues present in food, water, and surfaces such as carpets treated for flea control. However, no studies (except those we currently have in place) have quantified the transferable flea control insecticide residues which occur on pets (the majo...
Education: M.S., Soil Science, Colorado State University, 2006; B.S., Soil Science/Environmental Studies. "Ash in Algal Biomass," NREL Technical Report (2013). "Nitrates in soil and plant systems across..."; "Quantifying nitrate leaching below root zone across site-specific management zones," presented at Soil
NASA Astrophysics Data System (ADS)
Diffenbaugh, N. S.
2017-12-01
Severe heat provides one of the most direct, acute, and rapidly changing impacts of climate on people and ecosystems. Theory, historical observations, and climate model simulations all suggest that global warming should increase the probability of hot events that fall outside of our historical experience. Given the acute impacts of extreme heat, quantifying the probability of historically unprecedented hot events at different levels of climate forcing is critical for climate adaptation and mitigation decisions. However, in practice that quantification presents a number of methodological challenges. This presentation will review those methodological challenges, including the limitations of the observational record and of climate model fidelity. The presentation will detail a comprehensive approach to addressing these challenges. It will then demonstrate the application of that approach to quantifying uncertainty in the probability of record-setting hot events in the current climate, as well as in periods with lower and higher greenhouse gas concentrations than the present.
Quantifying factors for the success of stratified medicine.
Trusheim, Mark R; Burgess, Breon; Hu, Sean Xinghua; Long, Theresa; Averbuch, Steven D; Flynn, Aiden A; Lieftucht, Alfons; Mazumder, Abhijit; Milloy, Judy; Shaw, Peter M; Swank, David; Wang, Jian; Berndt, Ernst R; Goodsaid, Federico; Palmer, Michael C
2011-10-31
Co-developing a drug with a diagnostic to create a stratified medicine - a therapy that is targeted to a specific patient population on the basis of a clinical characteristic such as a biomarker that predicts treatment response - presents challenges for product developers, regulators, payers and physicians. With the aim of developing a shared framework and tools for addressing these challenges, here we present an analysis using data from case studies in oncology and Alzheimer's disease, coupled with integrated computational modelling of clinical outcomes and developer economic value, to quantify the effects of decisions related to key issues such as the design of clinical trials. This illustrates how such analyses can aid the coordination of diagnostic and drug development, and the selection of optimal development and commercialization strategies. It also illustrates the impact of the interplay of these factors on the economic feasibility of stratified medicine, which has important implications for public policy makers.
ERIC Educational Resources Information Center
Baker, Mohammad A. Abu; Emerson, Sara E.; Brown, Joel S.
2015-01-01
We present a practical field exercise for ecology and animal behavior classes that can be carried out on campus, using urban wildlife. Students document an animal's feeding behavior to study its interactions with the surrounding environment. In this approach, an animal's feeding behavior is quantified at experimental food patches placed within its…
Measuring Social Motivation Using Signal Detection and Reward Responsiveness.
Chevallier, Coralie; Tonge, Natasha; Safra, Lou; Kahn, David; Kohls, Gregor; Miller, Judith; Schultz, Robert T
2016-01-01
Recent trends in psychiatry have emphasized the need for a shift from categorical to dimensional approaches. Of critical importance to this transformation is the availability of tools to objectively quantify behaviors dimensionally. The present study focuses on social motivation, a dimension of behavior that is central to a range of psychiatric conditions but for which a particularly small number of assays currently exist. In Study 1 (N = 48), healthy adults completed a monetary reward task and a social reward task, followed by completion of the Chapman Physical and Social Anhedonia Scales. In Study 2 (N = 26), an independent sample was recruited to assess the robustness of Study 1's findings. The reward tasks were analyzed using signal detection theory to quantify how much reward cues bias participants' responses. In both Study 1 and Study 2, social anhedonia scores were negatively correlated with change in response bias in the social reward task but not in the monetary reward task. A median split on social anhedonia scores confirmed that participants with high social anhedonia showed less change in response bias in the social reward task compared to participants with low social anhedonia. This study confirms that social anhedonia selectively affects how much an individual changes their behavior based on the presence of socially rewarding cues and establishes a tool to quantify social reward responsiveness dimensionally.
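Response bias in probabilistic reward tasks is often quantified with a signal-detection log-b statistic; the sketch below shows that common formulation. Whether this exact formula matches the study's analysis is an assumption, and the trial counts are invented for illustration.

```python
# Hedged sketch of a log-b response-bias calculation from signal detection
# theory, as commonly used in probabilistic reward tasks.
import math

def log_response_bias(rich_correct, rich_incorrect, lean_correct, lean_incorrect):
    """log b: positive values indicate a bias toward the more frequently
    rewarded ('rich') stimulus."""
    # Add 0.5 to every cell to avoid division by zero (a standard correction)
    rc, ri = rich_correct + 0.5, rich_incorrect + 0.5
    lc, li = lean_correct + 0.5, lean_incorrect + 0.5
    return 0.5 * math.log((rc * li) / (ri * lc))

early_block = log_response_bias(70, 30, 60, 40)
late_block  = log_response_bias(85, 15, 50, 50)
delta_bias  = late_block - early_block   # change in bias as reward cues accrue
print(early_block, late_block, delta_bias)
```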
In-vivo quantification of primary microRNA processing by Drosha with a luciferase based system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allegra, Danilo; Cooperation Unit 'Mechanisms of Leukemogenesis', B061, DKFZ, Im Neuenheimer Feld 280, 69120 Heidelberg; Mertens, Daniel, E-mail: daniel.mertens@uniklinik-ulm.de
2011-03-25
Research highlights: • Posttranscriptional regulation of miRNA processing is difficult to quantify. • Our in-vivo processing assay can quantify Drosha cleavage in live cells. • It is based on luciferase reporters fused with pri-miRNAs. • The assay validates the processing defect caused by a mutation in pri-16-1. • It is a sensitive method to quantify pri-miRNA cleavage by Drosha in live cells. -- Abstract: The RNAse III Drosha is responsible for the first step of microRNA maturation, the cleavage of primary miRNA to produce the precursor miRNA. Processing by Drosha is finely regulated and influences the amount of mature microRNA in a cell. We describe in the present work a method to quantify Drosha processing activity in-vivo, which is applicable to any microRNA. With respect to other methods for measuring Drosha activity, our system is faster and scalable, can be used with any cellular system and does not require cell sorting or use of radioactive isotopes. This system is useful to study regulation of Drosha activity in physiological and pathological conditions.
Quantifying parametric uncertainty in the Rothermel model
S. Goodrick
2008-01-01
The purpose of the present work is to quantify parametric uncertainty in the Rothermel wildland fire spread model (implemented in software such as the widely used fire spread models in the United States). This model consists of a non-linear system of equations that relates environmental variables (input parameter groups...
U.S. Military Aircraft For Sale: Crafting an F-22 Export Policy
2000-06-01
present on virtually every piece of hardware on the aircraft.” Quantifying Risk: Protection measures, however, are only part of the equation for...production of its parts and components. Lockheed Martin uses such a matrix for quantifying risk. The
Preferential flow from pore to landscape scales
NASA Astrophysics Data System (ADS)
Koestel, J. K.; Jarvis, N.; Larsbo, M.
2017-12-01
In this presentation, we give a brief personal overview of some recent progress in quantifying preferential flow in the vadose zone, based on our own work and that of other researchers. One key challenge is to bridge the gap between the scales at which preferential flow occurs (i.e. pore to Darcy scales) and the scales of interest for management (i.e. fields, catchments, regions). We present results of recent studies that exemplify the potential of 3-D non-invasive imaging techniques to visualize and quantify flow processes at the pore scale. These studies should lead to a better understanding of how the topology of macropore networks controls key state variables like matric potential and thus the strength of preferential flow under variable initial and boundary conditions. Extrapolation of this process knowledge to larger scales will remain difficult, since measurement technologies to quantify macropore networks at these larger scales are lacking. Recent work suggests that the application of key concepts from percolation theory could be useful in this context. Investigation of the larger Darcy-scale heterogeneities that generate preferential flow patterns at the soil profile, hillslope and field scales has been facilitated by hydro-geophysical measurement techniques that produce highly spatially and temporally resolved data. At larger regional and global scales, improved methods of data-mining and analyses of large datasets (machine learning) may help to parameterize models as well as lead to new insights into the relationships between soil susceptibility to preferential flow and site attributes (climate, land uses, soil types).
The Hypersonic Inflatable Aerodynamic Decelerator (HIAD) Mission Applications Study
NASA Technical Reports Server (NTRS)
Bose, David M.; Winski, Richard; Shidner, Jeremy; Zumwalt, Carlie; Johnston, Christopher O.; Komar, D. R.; Cheatwood, F. M.; Hughes, Stephen J.
2013-01-01
The objective of the HIAD Mission Applications Study is to quantify the benefits of HIAD infusion to the concept of operations of high priority exploration missions. Results of the study will identify the range of mission concepts ideally suited to HIADs and provide mission-pull to associated technology development programs while further advancing operational concepts associated with HIAD technology. A summary of Year 1 modeling and analysis results is presented covering missions focusing on Earth and Mars-based applications. Recommended HIAD scales are presented for near term and future mission opportunities and the associated environments (heating and structural loads) are described.
Imaging Alzheimer's disease pathophysiology with PET
Schilling, Lucas Porcello; Zimmer, Eduardo R.; Shin, Monica; Leuzy, Antoine; Pascoal, Tharick A.; Benedet, Andréa L.; Borelli, Wyllians Vendramini; Palmini, André; Gauthier, Serge; Rosa-Neto, Pedro
2016-01-01
Alzheimer's disease (AD) has been reconceptualised as a dynamic pathophysiological process characterized by preclinical, mild cognitive impairment (MCI), and dementia stages. Positron emission tomography (PET) associated with various molecular imaging agents reveals numerous aspects of dementia pathophysiology, such as brain amyloidosis, tau accumulation, neuroreceptor changes, metabolism abnormalities and neuroinflammation in dementia patients. In the context of a growing shift toward presymptomatic early diagnosis and disease-modifying interventions, PET molecular imaging agents provide an unprecedented means of quantifying the AD pathophysiological process, monitoring disease progression, ascertaining whether therapies engage their respective brain molecular targets, as well as quantifying pharmacological responses. In the present study, we highlight the most important contributions of PET in describing brain molecular abnormalities in AD. PMID:29213438
NASA Astrophysics Data System (ADS)
Cruz, C.
The characterization of quantum information quantifiers has attracted considerable attention from the scientific community, since they are a useful tool to verify the presence of quantum correlations in a quantum system. In this context, in the present work we present a theoretical study of some quantifiers, such as entanglement witness, entanglement of formation, Bell’s inequality violation and geometric quantum discord, as a function of the diffractive properties of neutron scattering. We provide one path toward identifying the presence of quantum correlations and quantum nonlocality in a molecular magnet modelled as a Heisenberg spin-1/2 dimer, using diffractive properties typically obtained via neutron scattering experiments.
Bach, Benoit; Cleroux, Marilyn; Saillen, Mayra; Schönenberger, Patrik; Burgos, Stephane; Ducruet, Julien; Vallat, Armelle
2016-12-15
The concentrations of α/β-thujone and the bitter components of Artemisia absinthium were quantified from alcoholic wormwood extracts during four phenological stages of their harvest period. A solid-phase micro-extraction method coupled to gas chromatography-mass spectrometry was used to determine the concentration of the two isomeric forms of thujone. In parallel, the combination of ultra-high pressure liquid chromatography and high resolution mass spectrometry allowed the quantification of the compounds absinthin, artemisetin and dihydro-epi-deoxyarteannuin B. The present study aimed to help absinthe producers determine the best harvesting period. Copyright © 2016 Elsevier Ltd. All rights reserved.
Merrifield, R C; Stephan, C; Lead, J R
2018-02-20
Quantifying metal and nanoparticle (NP) biouptake and distribution on an individual cellular basis has previously been impossible, given available techniques which provide qualitative data that are laborious to acquire and prone to artifacts. Quantifying metal and metal NP uptake and loss processes in environmental organisms will lead to mechanistic understanding of biouptake and improved understanding of potential hazards and risks of metals and NPs. In this work, we present a new technique, single cell inductively coupled plasma mass spectrometry (SC-ICP-MS), which allows quantification of metal concentrations on an individual cell basis down to the attogram (ag) per cell level. We present data validating the novel method, along with the mass of metal per cell. Finally, we use SC-ICP-MS, with ancillary cell counting methods, to quantify the biouptake and strong sorption and distribution of both dissolved Au and Au NPs in a freshwater alga (Cryptomonas ovata). The data suggest differences between dissolved and NP uptake and loss. In the case of NPs, there was a dose- and time-dependent uptake, but with individual cellular variation; at the highest realistic exposure conditions used in this study, up to 40-50% of cells contained NPs, while 50-60% of cells did not.
NASA Astrophysics Data System (ADS)
Felbauer, Lucia; Pöppl, Ronald
2016-04-01
Global warming results in an ongoing retreat of glaciers in the Alps, leaving behind large amounts of easily erodible sediments. In addition, the debuttressing of rock-walls and the decay of permafrost in the high mountain regions facilitate mass movements with potentially disastrous consequences, such as rock falls, landslides and debris flows. Therefore, it is highly important to quantify the amount of sediments that are supplied from the different compartments and to investigate how glacial retreat influences sediment dynamics in proglacial areas. In the present work, glacier retreat and associated sediment dynamics were investigated in the Kromer valley (Silvretta Alps, Austria) by analyzing remote sensing data. Glacial retreat from the period of 1950 to 2012 was documented by interpreting aerial photographs. By digitizing the different stages of the glaciers for six time frames, changes in glacier area and length were mapped and quantified. In order to identify, characterize and quantify sediment dynamics in the proglacial areas, a high-resolution DEM of difference (DoD) between 2007 and 2012 was created and analyzed, further differentiating between different zones (e.g. valley bottom, hillslope) and types of geomorphic processes (e.g. fluvial, gravitational). First results will be presented at the EGU General Assembly 2016.
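A minimal sketch of the DEM-of-difference step described above follows: two elevation rasters are differenced, changes below a detection threshold are discarded, and erosion and deposition volumes are summed. The cell size, threshold and synthetic rasters are assumptions for illustration, not the study's data.

```python
# DEM-of-difference (DoD) sketch with a simple level-of-detection threshold.
import numpy as np

def dem_of_difference(dem_new, dem_old, cell_size_m, min_detect_m=0.2):
    """Return erosion and deposition volumes (m^3) between two DEM rasters."""
    dod = dem_new - dem_old                       # elevation change per cell
    dod[np.abs(dod) < min_detect_m] = 0.0         # ignore change below detection
    cell_area = cell_size_m ** 2
    deposition = dod[dod > 0].sum() * cell_area
    erosion = -dod[dod < 0].sum() * cell_area
    return erosion, deposition

# Synthetic 100 x 100 m test rasters at 1 m resolution
dem_2007 = np.random.default_rng(1).normal(2400.0, 5.0, (100, 100))
dem_2012 = dem_2007 + np.random.default_rng(2).normal(0.0, 0.3, (100, 100))
print(dem_of_difference(dem_2012, dem_2007, cell_size_m=1.0))
```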
X-ray photoelectron spectroscopy of select multi-layered transition metal carbides (MXenes)
Halim, Joseph; Cook, Kevin M.; Naguib, Michael; ...
2015-12-01
A detailed high resolution X-ray photoelectron spectroscopy (XPS) analysis is presented in this work for select MXenes—a recently discovered family of two-dimensional (2D) carbides and carbonitrides. Given their 2D nature, understanding their surface chemistry is paramount. Thus we identify and quantify the surface groups present before, and after, sputter-cleaning as well as freshly prepared vs. aged multi-layered cold pressed discs. The nominal compositions of the MXenes studied here are Ti3C2Tx, Ti2CTx, Ti3CNTx, Nb2CTx and Nb4C3Tx, where T represents surface groups that this work attempts to quantify. In all the cases, the presence of three surface terminations, -O, -OH and -F, in addition to OH-terminations relatively strongly bonded to H2O molecules, was confirmed. Moreover, from XPS peak fits, it was possible to establish the average sum of the negative charges of the terminations for the aforementioned MXenes. Based on this work, it is now possible to quantify the nature of the surface terminations. This information can, in turn, be used to better design and tailor these novel 2D materials for various applications.
Hart, Andy; Hoekstra, Jeljer; Owen, Helen; Kennedy, Marc; Zeilmaker, Marco J; de Jong, Nynke; Gunnlaugsdottir, Helga
2013-04-01
The EU project BRAFO proposed a framework for risk-benefit assessment of foods, or changes in diet, that present both potential risks and potential benefits to consumers (Hoekstra et al., 2012a). In higher tiers of the BRAFO framework, risks and benefits are integrated quantitatively to estimate net health impact measured in DALYs or QALYs (disability- or quality-adjusted life years). This paper describes a general model that was developed by a second EU project, Qalibra, to assist users in conducting these assessments. Its flexible design makes it applicable to a wide range of dietary questions involving different nutrients, contaminants and health effects. Account can be taken of variation between consumers in their diets and also other characteristics relevant to the estimation of risk and benefit, such as body weight, gender and age. Uncertainty in any input parameter may be quantified probabilistically, using probability distributions, or deterministically by repeating the assessment with alternative assumptions. Uncertainties that are not quantified should be evaluated qualitatively. Outputs produced by the model are illustrated using results from a simple assessment of fish consumption. More detailed case studies on oily fish and phytosterols are presented in companion papers. The model can be accessed as web-based software at www.qalibra.eu. Copyright © 2012. Published by Elsevier Ltd.
Quantifying the effects of pesticide exposure on annual reproductive success of birds (presentation)
The Markov chain nest productivity model (MCnest) was developed for quantifying the effects of specific pesticide‐use scenarios on the annual reproductive success of simulated populations of birds. Each nesting attempt is divided into a series of discrete phases (e.g., egg ...
2018-01-01
The morphology of an animal’s face will have large effects on the sensory information it can acquire. Here we quantify the arrangement of cranial sensory structures of the rat, with special emphasis on the mystacial vibrissae (whiskers). Nearly all mammals have vibrissae, which are generally arranged in rows and columns across the face. The vibrissae serve a wide variety of important behavioral functions, including navigation, climbing, wake following, anemotaxis, and social interactions. To date, however, there are few studies that compare the morphology of vibrissal arrays across species, or that describe the arrangement of the vibrissae relative to other facial sensory structures. The few studies that do exist have exploited the whiskers’ grid-like arrangement to quantify array morphology in terms of row and column identity. However, relying on whisker identity poses a challenge for comparative research because different species have different numbers and arrangements of whiskers. The present work introduces an approach to quantify vibrissal array morphology regardless of the number of rows and columns, and to quantify the array’s location relative to other sensory structures. We use the three-dimensional locations of the whisker basepoints as fundamental parameters to generate equations describing the length, curvature, and orientation of each whisker. Results show that in the rat, whisker length varies exponentially across the array, and that a hard limit on intrinsic curvature constrains the whisker height-to-length ratio. Whiskers are oriented to “fan out” approximately equally in dorsal-ventral and rostral-caudal directions. Quantifying positions of the other sensory structures relative to the whisker basepoints shows remarkable alignment to the somatosensory cortical homunculus, an alignment that would not occur for other choices of coordinate systems (e.g., centered on the midpoint of the eyes). We anticipate that the quantification of facial sensory structures, including the vibrissae, will ultimately enable cross-species comparisons of multi-modal sensing volumes. PMID:29621356
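The exponential trend in whisker length noted above can be illustrated with a simple curve fit; the functional form and the synthetic data below are assumptions for illustration, not the paper's actual equations or measurements.

```python
# Hedged sketch: fitting an exponential model of whisker length against
# basepoint position to illustrate the kind of relationship described above.
import numpy as np
from scipy.optimize import curve_fit

def whisker_length(rostrocaudal_pos, a, b):
    """Length (mm) modelled as exponential in rostro-caudal basepoint position."""
    return a * np.exp(b * rostrocaudal_pos)

# Synthetic basepoint positions (arbitrary units) and lengths (mm)
pos = np.linspace(0.0, 1.0, 25)
lengths = 10.0 * np.exp(1.5 * pos) + np.random.default_rng(3).normal(0.0, 1.0, 25)

(a_fit, b_fit), _ = curve_fit(whisker_length, pos, lengths, p0=(5.0, 1.0))
print(a_fit, b_fit)   # recovered parameters of the exponential trend
```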
The Development of Voiceless Sibilant Fricatives in Putonghua-Speaking Children
ERIC Educational Resources Information Center
Li, Fangfang; Munson, Benjamin
2016-01-01
Purpose The aims of the present study are (a) to quantify the developmental sequence of fricative mastery in Putonghua-speaking children and discuss the observed pattern in relation to existing theoretical positions, and (b) to describe the acquisition of the fine-articulatory/acoustic details of fricatives in the multidimensional acoustic space.…
We present results from a monthly study in the Pensacola Bay estuary (FL) designed to evaluate the response and recovery in benthic habitats to intermittent, seasonal hypoxia (DO < 2 mg L-1). Samples were collected monthly from June 2015 through October 2017 at seven to nine s...
Speech Errors in Progressive Non-Fluent Aphasia
ERIC Educational Resources Information Center
Ash, Sharon; McMillan, Corey; Gunawardena, Delani; Avants, Brian; Morgan, Brianna; Khan, Alea; Moore, Peachie; Gee, James; Grossman, Murray
2010-01-01
The nature and frequency of speech production errors in neurodegenerative disease have not previously been precisely quantified. In the present study, 16 patients with a progressive form of non-fluent aphasia (PNFA) were asked to tell a story from a wordless children's picture book. Errors in production were classified as either phonemic,…
USDA-ARS?s Scientific Manuscript database
Historic baselines are important in developing our understanding of ecosystems and species dynamics in the face of rapid global change. While a number of studies have sought to elucidate the historic abundance of exploited marine populations, there are few that confidently quantify patterns of abund...
Sex Differences in Dichotic Listening
ERIC Educational Resources Information Center
Voyer, Daniel
2011-01-01
The present study quantified the magnitude of sex differences in perceptual asymmetries as measured with dichotic listening. This was achieved by means of a meta-analysis of the literature dating back from the initial use of dichotic listening as a measure of laterality. The meta-analysis included 249 effect sizes pertaining to sex differences and…
We present a novel approach to quantifying estuarine habitat use by fish using stable isotopes. In brief, we further developed and evaluated an existing stable isotope turnover model to estimate the time American shad, an anadromous clupeid, spend in various river habitats durin...
A Comparison of Mean Phase Difference and Generalized Least Squares for Analyzing Single-Case Data
ERIC Educational Resources Information Center
Manolov, Rumen; Solanas, Antonio
2013-01-01
The present study focuses on single-case data analysis specifically on two procedures for quantifying differences between baseline and treatment measurements. The first technique tested is based on generalized least square regression analysis and is compared to a proposed non-regression technique, which allows obtaining similar information. The…
Fractions as a Foundation for Algebra within a Sample of Prospective Teachers
ERIC Educational Resources Information Center
Zientek, Linda Reichwein; Younes, Rayya; Nimon, Kim; Mittag, Kathleen Cage; Taylor, Sharon
2013-01-01
Improving the mathematical skills of the next generation of students will require that elementary and middle school teachers are competent and confident in their abilities to perform fraction operations and to solve algebra equations. The present study was conducted to (a) quantify relationships between prospective teachers' abilities to perform…
Postfire shrub-cover dynamics: a 70-year fire history in big sagebrush communities.
USDA-ARS?s Scientific Manuscript database
Land managers use prescribed fire to meet rangeland management objectives. This study was conducted to quantify, from present conditions, the effect of time since last burn (TSLB) on shrub cover over 70 yr of fire history. We sampled mountain big sagebrush communities at the USDA, ARS, U.S. Sheep ...
Quantifying Security Threats and Their Potential Impacts: A Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aissa, Anis Ben; Abercrombie, Robert K; Sheldon, Frederick T
In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper, we illustrate this infrastructure by means of an e-commerce application.
Matthew P. Thompson; Patrick Freeborn; Jon D. Rieck; Dave Calkin; Julie W. Gilbertson-Day; Mark A. Cochrane; Michael S. Hand
2016-01-01
We present a case study of the Las Conchas Fire (2011) to explore the role of previously burned areas (wildfires and prescribed fires) on suppression effectiveness and avoided exposure. Methodological innovations include characterisation of the joint dynamics of fire growth and suppression activities, development of a fire line effectiveness framework, and...
Frank H. Koch; Denys Yemshanov; Manuel Colunga-Garcia; Roger D. Magarey; William D. Smith
2011-01-01
International trade is widely acknowledged as a conduit for movement of invasive species, but few studies have directly quantified the invasion risk confronting individual locations of interest. This study presents estimates of the likelihood of successful entry for alien forest insect species at more than 3,000 urban areas in the contiguous United States (US). To...
NASA Astrophysics Data System (ADS)
Cullin, J. A.; Ward, A. S.; Cwiertny, D. M.; Barber, L. B.; Kolpin, D. W.; Bradley, P. M.; Keefe, S. H.; Hubbard, L. E.
2013-12-01
Contaminants of emerging concern (CECs) are an unregulated suite of constituents possessing the potential to cause a host of reproductive and developmental problems in humans and wildlife. CECs are frequently detected in environmental waters. Degradation pathways of several CECs are well-characterized in idealized laboratory settings, but CEC fate and transport in complex field settings is poorly understood. In the present study we used a multi-tracer solute injection study to quantify physical transport, photodegradation, and sorption in a wastewater effluent-impacted stream. Conservative tracers were used to quantify physical transport processes in the stream. Use of reactive fluorescent tracers allows for isolation of the relative contribution of photodegradation and sorption within the system. Field data were used to calibrate a one-dimensional transport model, allowing us to use forward modeling to predict the transport of sulfamethoxazole, an antibiotic documented to be present in the wastewater effluent and in Fourmile Creek, and which is susceptible to both sorption and photolysis. Forward modeling will predict both the temporal persistence and the spatial extent of sulfamethoxazole in Fourmile Creek.
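The forward-modelling idea can be sketched very simply by lumping photolysis and sorption into a single first-order loss coefficient applied over the travel time obtained from the conservative tracer; the study's one-dimensional transport model is more complete, and all parameter values below are assumed.

```python
# First-order decay along stream travel time x / v, as a rough stand-in for
# the calibrated one-dimensional transport model described above.
import numpy as np

def downstream_concentration(c0, velocity_m_s, k_per_s, distances_m):
    """Relative concentration after first-order loss over travel time x / v."""
    travel_time_s = np.asarray(distances_m) / velocity_m_s
    return c0 * np.exp(-k_per_s * travel_time_s)

distances = [0, 500, 1000, 2000, 5000]            # metres downstream of effluent
print(downstream_concentration(c0=1.0, velocity_m_s=0.3,
                               k_per_s=5e-5, distances_m=distances))
```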
A laboratory study of subjective annoyance response to sonic booms and aircraft flyovers
NASA Technical Reports Server (NTRS)
Leatherwood, Jack D.; Sullivan, Brenda M.
1994-01-01
Three experiments were conducted to determine subjective equivalence of aircraft subsonic flyover noise and sonic booms. Two of the experiments were conducted in a loudspeaker-driven sonic boom simulator, and the third in a large room containing conventional loudspeakers. The sound generation system of the boom simulator had a frequency response extending to very low frequencies (about 1 Hz) whereas the large room loudspeakers were limited to about 20 Hz. Subjective equivalence between booms and flyovers was quantified in terms of the difference between the noise level of a boom and that of a flyover when the two were judged equally annoying. Noise levels were quantified in terms of the following noise descriptors: Perceived Level (PL), Perceived Noise Level (PNL), C-weighted sound exposure level (SELC), and A-weighted sound exposure level (SELA). Results from the present study were compared, where possible, to similar results obtained in other studies. Results showed that noise level differences depended upon the descriptor used, specific boom and aircraft noise events being compared and, except for the PNL descriptor, varied between the simulator and large room. Comparison of noise level differences obtained in the present study with those of other studies indicated good agreement across studies only for the PNL and SELA descriptors. Comparison of the present results with assessments of community response to high-energy impulsive sounds made by Working Group 84 of the National Research Council's Committee on Hearing, Bioacoustics, and Biomechanics (CHABA) showed good agreement when boom/flyover noise level differences were based on SELA. However, noise level differences obtained by CHABA using SELA for aircraft flyovers and SELC for booms were not in agreement with results obtained in the present study.
Topographical and geological amplification: case studies and engineering implications
Celebi, M.
1991-01-01
Topographical and geological amplification that occurred during past earthquakes is quantified using spectral ratios of recorded motions. Several cases are presented from the 1985 Chilean and Mexican earthquakes as well as the 1983 Coalinga (California) and 1987 Superstition Hills (California) earthquakes. The strong motions recorded in Mexico City during the 1985 Michoacan earthquake are supplemented by ambient motions recorded within Mexico City to quantify the now well-known resonating frequencies of the Mexico City lakebed. Topographical amplification in Canal Beagle (Chile), Coalinga and Superstition Hills (California) is quantified using the ratios derived from the aftershocks following the earthquakes. A special dense array was deployed to record the aftershocks in each case. The implications of both geological and topographical amplification are discussed in light of current code provisions. The observed geological amplification has already influenced the code provisions. Suggestions are made to the effect that the codes should include further provisions to take the amplification due to topography into account. © 1991.
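A standard site-to-reference spectral-ratio calculation, of the general kind used above, is sketched below with synthetic traces; the Hanning window and the synthetic 0.5 Hz resonance are assumptions for illustration, not the recorded motions discussed in the study.

```python
# Spectral ratio of a "site" record against a "reference" (rock) record.
import numpy as np

def spectral_ratio(site_trace, reference_trace, dt):
    """Amplitude-spectrum ratio site/reference on a common frequency axis."""
    n = min(len(site_trace), len(reference_trace))
    window = np.hanning(n)
    site_amp = np.abs(np.fft.rfft(site_trace[:n] * window))
    ref_amp = np.abs(np.fft.rfft(reference_trace[:n] * window))
    freqs = np.fft.rfftfreq(n, d=dt)
    return freqs, site_amp / np.maximum(ref_amp, 1e-12)

dt = 0.01                                   # 100 samples per second
t = np.arange(0.0, 20.0, dt)
reference = np.random.default_rng(4).normal(0.0, 1.0, t.size)
site = reference + 4.0 * np.sin(2 * np.pi * 0.5 * t)   # added 0.5 Hz resonance
freqs, ratio = spectral_ratio(site, reference, dt)
print(freqs[np.argmax(ratio)])              # peak of the ratio, near 0.5 Hz
```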
Gregg, Chelsea L; Recknagel, Andrew K; Butcher, Jonathan T
2015-01-01
Tissue morphogenesis and embryonic development are dynamic events challenging to quantify, especially considering the intricate events that happen simultaneously in different locations and time. Micro- and more recently nano-computed tomography (micro/nanoCT) has been used for the past 15 years to characterize large 3D fields of tortuous geometries at high spatial resolution. We and others have advanced micro/nanoCT imaging strategies for quantifying tissue- and organ-level fate changes throughout morphogenesis. Exogenous soft tissue contrast media enables visualization of vascular lumens and tissues via extravasation. Furthermore, the emergence of antigen-specific tissue contrast enables direct quantitative visualization of protein and mRNA expression. Micro-CT X-ray doses appear to be non-embryotoxic, enabling longitudinal imaging studies in live embryos. In this chapter we present established soft tissue contrast protocols for obtaining high-quality micro/nanoCT images and the image processing techniques useful for quantifying anatomical and physiological information from the data sets.
SAFE LOCALIZATION FOR PLACEMENT OF PERCUTANEOUS PINS IN THE CALCANEUS.
Labronici, Pedro José; Pereira, Diogo do Nascimento; Pilar, Pedro Henrique Vargas Moreira; Franco, José Sergio; Serra, Marcos Donato; Cohen, José Carlos; Bitar, Rogério Carneiro
2012-01-01
To determine the areas presenting risk in six zones of the calcaneus, and to quantify the risks of injury to the anatomical structures (artery, vein, nerve and tendon). Fifty-three calcanei from cadavers were used, divided into three zones and each subdivided into two areas (upper and lower) by means of a longitudinal line through the calcaneus. The risk of injury to the anatomical structures in relation to each Kirschner wire was determined using a graded system according to the Licht classification. The total risk of injury to the anatomical structures through placement of more than one wire was quantified using the additive law of probabilities and the product law for independent events. The injury risk calculation according to the Licht classification showed that the highest risk of injury to the artery or vein was in zone IA (43%), compared with the risks of injury to nerves and tendons (13% and 0%, respectively). This study made it possible to identify the most vulnerable anatomical structures and quantify the risk of injury to the calcaneus.
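The product law for independent events mentioned above combines per-wire probabilities as P = 1 − Π(1 − p_i), i.e. the chance that at least one wire injures a given structure. A minimal sketch follows; the probabilities are illustrative.

```python
# Combined injury risk for several independently placed wires.
def combined_injury_risk(per_wire_probabilities):
    """P(at least one injury) = 1 - product of per-wire 'no injury' probabilities."""
    p_no_injury = 1.0
    for p in per_wire_probabilities:
        p_no_injury *= (1.0 - p)
    return 1.0 - p_no_injury

# e.g. three wires placed in a zone where each carries a 43 % arterial/venous risk
print(combined_injury_risk([0.43, 0.43, 0.43]))   # ~0.81
```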
Quantifying Human Visible Color Variation from High Definition Digital Images of Orb Web Spiders.
Tapia-McClung, Horacio; Ajuria Ibarra, Helena; Rao, Dinesh
2016-01-01
Digital processing and analysis of high resolution images of 30 individuals of the orb web spider Verrucosa arenata were performed to extract and quantify human visible colors present on the dorsal abdomen of this species. Color extraction was performed with minimal user intervention using an unsupervised algorithm to determine groups of colors on each individual spider, which was then analyzed in order to quantify and classify the colors obtained, both spatially and using energy and entropy measures of the digital images. Analysis shows that the colors cover a small region of the visible spectrum, are not spatially homogeneously distributed over the patterns and from an entropic point of view, colors that cover a smaller region on the whole pattern carry more information than colors covering a larger region. This study demonstrates the use of processing tools to create automatic systems to extract valuable information from digital images that are precise, efficient and helpful for the understanding of the underlying biology.
Quantifying Human Visible Color Variation from High Definition Digital Images of Orb Web Spiders
Ajuria Ibarra, Helena; Rao, Dinesh
2016-01-01
Digital processing and analysis of high resolution images of 30 individuals of the orb web spider Verrucosa arenata were performed to extract and quantify human visible colors present on the dorsal abdomen of this species. Color extraction was performed with minimal user intervention using an unsupervised algorithm to determine groups of colors on each individual spider, which was then analyzed in order to quantify and classify the colors obtained, both spatially and using energy and entropy measures of the digital images. Analysis shows that the colors cover a small region of the visible spectrum, are not spatially homogeneously distributed over the patterns and from an entropic point of view, colors that cover a smaller region on the whole pattern carry more information than colors covering a larger region. This study demonstrates the use of processing tools to create automatic systems to extract valuable information from digital images that are precise, efficient and helpful for the understanding of the underlying biology. PMID:27902724
Bouvignies, Guillaume; Hansen, D Flemming; Vallurupalli, Pramodh; Kay, Lewis E
2011-02-16
A method for quantifying millisecond time scale exchange in proteins is presented based on scaling the rate of chemical exchange using a 2D (15)N, (1)H(N) experiment in which (15)N dwell times are separated by short spin-echo pulse trains. Unlike the popular Carr-Purcell-Meiboom-Gill (CPMG) experiment where the effects of a radio frequency field on measured transverse relaxation rates are quantified, the new approach measures peak positions in spectra that shift as the effective exchange time regime is varied. The utility of the method is established through an analysis of data recorded on an exchanging protein-ligand system for which the exchange parameters have been accurately determined using alternative approaches. Computations establish that a combined analysis of CPMG and peak shift profiles extends the time scale that can be studied to include exchanging systems with highly skewed populations and exchange rates as slow as 20 s(-1).
Supervised and Unsupervised Learning Technology in the Study of Rodent Behavior
Gris, Katsiaryna V.; Coutu, Jean-Philippe; Gris, Denis
2017-01-01
Quantifying behavior is a challenge for scientists studying neuroscience, ethology, psychology, pathology, etc. Until now, behavior was mostly considered as qualitative descriptions of postures or labor intensive counting of bouts of individual movements. Many prominent behavioral scientists conducted studies describing postures of mice and rats, depicting step by step eating, grooming, courting, and other behaviors. Automated video assessment technologies permit scientists to quantify daily behavioral patterns/routines, social interactions, and postural changes in an unbiased manner. Here, we extensively reviewed published research on the topic of the structural blocks of behavior and proposed a structure of behavior based on the latest publications. We discuss the importance of defining a clear structure of behavior to allow professionals to write viable algorithms. We presented a discussion of technologies that are used in automated video assessment of behavior in mice and rats. We considered advantages and limitations of supervised and unsupervised learning. We presented the latest scientific discoveries that were made using automated video assessment. In conclusion, we proposed that the automated quantitative approach to evaluating animal behavior is the future of understanding the effect of brain signaling, pathologies, genetic content, and environment on behavior. PMID:28804452
Quantifying Behavior Driven Energy Savings for Hotels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, Bing; Wang, Na; Hooks, Edward
2016-08-12
Hotel facilities present abundant opportunities for energy savings. In the United States, there are around 25,000 hotels that spend an average of $2,196 on energy costs per room each year. This amounts to about 6% of the total annual hotel operating cost. However, unlike offices, there are limited studies on establishing appropriate baselines and quantifying hotel energy savings given the variety of services and amenities, unpredictable customer behaviors, and the around-the-clock operation hours. In this study, we investigate behavior-driven energy savings for three medium-size (around 90,000 square feet) hotels that offer similar services in different climate zones. We first used the Department of Energy Asset Scoring Tool to establish baseline models. We then conducted energy saving analysis in EnergyPlus based on a behavior model that defines the upper bound and lower bound of customer and hotel staff behavior. Lastly, we presented a probabilistic energy savings outlook for each hotel. The analysis shows behavior-driven energy savings of up to 25%. We believe this is the first study to incorporate behavioral factors into energy analysis for hotels. It also demonstrates a procedure to quickly create tailored baselines and identify improvement opportunities for hotels.
Brett, Jonathan; Elshaug, Adam G; Bhatia, R Sacha; Chalmers, Kelsey; Badgery-Parker, Tim; Pearson, Sallie-Anne
2017-05-03
Growing imperatives for safety, quality and responsible resource allocation have prompted renewed efforts to identify and quantify harmful or wasteful (low-value) medical practices such as test ordering, procedures and prescribing. Quantifying these practices at a population level using routinely collected health data allows us to understand the scale of low-value medical practices, measure practice change following specific interventions and prioritise policy decisions. To date, almost all research examining health care through the low-value lens has focused on medical services (tests and procedures) rather than on prescribing. The protocol described herein outlines a program of research funded by Australia's National Health and Medical Research Council to select and quantify low-value prescribing practices within Australian routinely collected health data. We start by describing our process for identifying and cataloguing international low-value prescribing practices. We then outline our approach to translate these prescribing practices into indicators that can be applied to Australian routinely collected health data. Next, we detail methods of using Australian health data to quantify these prescribing practices (e.g. prevalence of low-value prescribing and related costs) and their downstream health consequences. We have approval from the necessary Australian state and commonwealth human research ethics and data access committees to undertake this work. The lack of systematic and transparent approaches to quantification of low-value practices in routinely collected data has been noted in recent reviews. Here, we present a methodology applied in the Australian context with the aim of demonstrating principles that can be applied across jurisdictions in order to harmonise international efforts to measure low-value prescribing. The outcomes of this research will be submitted to international peer-reviewed journals. Results will also be presented at national and international pharmacoepidemiology and health policy forums such that other jurisdictions have guidance to adapt this methodology.
Wondimu, Taddese; Wang, Rui; Ross, Brian
2014-09-01
The discovery that hydrogen sulphide (H2S) acts as a gasotransmitter when present at very low concentrations (sub-parts per billion (ppbv)) has resulted in the need to quickly quantify trace amounts of the gas in complex biological samples. Selected ion flow tube mass spectrometry (SIFT-MS) is capable of real-time quantification of H2S but many SIFT-MS instruments lack sufficient sensitivity for this application. In this study we investigate the utility of combining thermal desorption with SIFT-MS for quantifying H2S in the 0.1-1 ppbv concentration range. Human orally or nasally derived breath, and background ambient air, were collected in sampling bags and dried by passing through CaCl2, and H2S was pre-concentrated using a sorbent trap optimised for the capture of this gas. The absorbed H2S was then thermally desorbed and quantified by SIFT-MS. H2S concentrations in ambient air, nasal breath and oral breath collected from 10 healthy volunteers were 0.12 ± 0.02 (mean ± SD), 0.40 ± 0.11 and 3.1 ± 2.5 ppbv, respectively, and in the oral cavity H2S, quantified by SIFT-MS without pre-concentration, was present at 13.5 ± 8.6 ppbv. The oral cavity H2S correlates well with oral breath H2S but not with nasal breath H2S, suggesting that oral breath H2S derives mainly from the oral cavity whereas nasal breath H2S is likely pulmonary in origin. The successful quantification of such low concentrations of H2S in nasal air using a rapid analytical procedure paves the way for the straightforward analysis of H2S in breath and may assist in elucidating the role that H2S plays in biological systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, Justin Matthew
These are the slides for a graduate presentation at Mississippi State University. It covers the following: the BRL Shaped-Charge Geometry in PAGOSA, mesh refinement study, surrogate modeling using a radial basis function network (RBFN), ruling out parameters using sensitivity analysis (equation of state study), uncertainty quantification (UQ) methodology, and sensitivity analysis (SA) methodology. In summary, a mesh convergence study was used to ensure that solutions were numerically stable by comparing PDV data between simulations. A Design of Experiments (DOE) method was used to reduce the simulation space to study the effects of the Jones-Wilkins-Lee (JWL) parameters for the Composition B main charge. Uncertainty was quantified by computing the 95% data range about the median of simulation output using a brute force Monte Carlo (MC) random sampling method. Parameter sensitivities were quantified using the Fourier Amplitude Sensitivity Test (FAST) spectral analysis method where it was determined that detonation velocity, initial density, C1, and B1 controlled jet tip velocity.
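As an illustration of the brute-force Monte Carlo step described above, the sketch below draws uniform samples over hypothetical parameter ranges, pushes them through a placeholder surrogate (standing in for the PAGOSA/RBFN pipeline, which is not reproduced here), and reports the 95% data range about the median of the output. All ranges, the surrogate form, and the output scale are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical JWL-like parameter ranges (placeholders, not the study's values).
param_ranges = {"detonation_velocity": (7.5e3, 8.5e3),    # m/s
                "initial_density":     (1.6e3, 1.8e3),    # kg/m^3
                "C1":                  (4.5e11, 5.5e11),  # Pa
                "B1":                  (5.0e9, 9.0e9)}    # Pa

def run_surrogate(p):
    """Placeholder surrogate for jet tip velocity (km/s); only two of the
    four sampled parameters enter this toy response for brevity."""
    return 8.0 + 1e-4 * (p["detonation_velocity"] - 8.0e3) \
               - 2e-3 * (p["initial_density"] - 1.7e3)

samples = []
for _ in range(10_000):                                   # brute-force MC sampling
    p = {k: rng.uniform(*v) for k, v in param_ranges.items()}
    samples.append(run_surrogate(p))
samples = np.array(samples)

median = np.median(samples)
lo, hi = np.percentile(samples, [2.5, 97.5])              # 95% data range about the median
print(f"median jet tip velocity: {median:.3f} km/s, 95% range: [{lo:.3f}, {hi:.3f}]")
```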
NASA Technical Reports Server (NTRS)
Wilson, L. B., III; Sibeck, D. G.; Breneman, A.W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.
2014-01-01
We present a detailed outline and discussion of the analysis techniques used to compare the relevance of different energy dissipation mechanisms at collisionless shock waves. We show that the low-frequency, quasi-static fields contribute less to ohmic energy dissipation, (-j · E ) (minus current density times measured electric field), than their high-frequency counterparts. In fact, we found that high-frequency, large-amplitude (greater than 100 millivolts per meter and/or greater than 1 nanotesla) waves are ubiquitous in the transition region of collisionless shocks. We quantitatively show that their fields, through wave-particle interactions, cause enough energy dissipation to regulate the global structure of collisionless shocks. The purpose of this paper, part one of two, is to outline and describe in detail the background, analysis techniques, and theoretical motivation for our new results presented in the companion paper. The companion paper presents the results of our quantitative energy dissipation rate estimates and discusses the implications. Together, the two manuscripts present the first study quantifying the contribution that high-frequency waves provide, through wave-particle interactions, to the total energy dissipation budget of collisionless shock waves.
Numerical Relativity Simulations of Compact Binary Populations in Dense Stellar Environments
NASA Astrophysics Data System (ADS)
Glennon, Derek Ray; Huerta, Eliu; Allen, Gabrielle; Haas, Roland; Seidel, Edward; NCSA Gravity Group
2018-01-01
We present a catalog of numerical relativity simulations that describe binary black hole mergers on eccentric orbits. These simulations have been obtained with the open source, Einstein Toolkit numerical relativity software, using the Blue Waters supercomputer. We use this catalog to quantify observables, such as the mass and spin of black holes formed by binary black hole mergers, as a function of eccentricity. This study is the first of its kind in the literature to quantify these astrophysical observables for binary black hole mergers with mass-ratios q<6, and eccentricities e<0.2. This study is an important step in understanding the properties of eccentric binary black hole mergers, and informs the use of gravitational wave observations to confirm or rule out the existence of compact binary populations in dense stellar environments.
NASA Astrophysics Data System (ADS)
He, Jingjing; Wang, Dengjiang; Zhang, Weifang
2015-03-01
This paper presents an experimental and modeling study for damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using signal features of correlation coefficient, amplitude change, and phase change. In addition, a probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufacturers and under different loading conditions.
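The abstract does not specify the functional form of the POD model; a common choice for hit/miss inspection data is a logistic model in log crack size. The sketch below, using invented example data and scikit-learn, fits such a curve and reads off a90, the crack size detected with 90% probability. The data, threshold, and model form are illustrative assumptions, not the authors' procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical hit/miss inspection data: crack length (mm) and detection outcome.
crack_mm = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.5, 3.0, 4.0, 5.0, 6.0])
detected = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1,   1,   1])

X = np.log(crack_mm).reshape(-1, 1)            # log-logistic: logistic in ln(a)
model = LogisticRegression().fit(X, detected)

def pod(a_mm):
    """Probability of detection for crack size(s) a_mm."""
    return model.predict_proba(np.log(np.atleast_1d(a_mm)).reshape(-1, 1))[:, 1]

# a90: smallest crack detected with 90% probability (grid search for simplicity).
grid = np.linspace(0.5, 6.0, 1000)
above = grid[pod(grid) >= 0.9]
a90 = above[0] if above.size else np.nan
print(f"POD(2 mm) = {pod(2.0)[0]:.2f}, a90 ≈ {a90:.2f} mm")
```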
Drifter observations of submesoscale flow kinematics in the coastal ocean
NASA Astrophysics Data System (ADS)
Ohlmann, J. C.; Molemaker, M. J.; Baschek, B.; Holt, B.; Marmorino, G.; Smith, G.
2017-01-01
Fronts and eddies identified with aerial guidance are seeded with drifters to quantify submesoscale flow kinematics. The Lagrangian observations show mean divergence and vorticity values that can exceed 5 times the Coriolis frequency. Values are the largest observed in the field to date and represent an extreme departure from geostrophic dynamics. The study also quantifies errors and biases associated with Lagrangian observations of the underlying velocity strain tensor. The greatest error results from undersampling, even with a large number of drifters. A significant bias comes from inhomogeneous sampling of convergent regions that accumulate drifters within a few hours of deployment. The study demonstrates a Lagrangian sampling paradigm for targeted submesoscale structures over a broad range of scales and presents flow kinematic values associated with vertical velocities of O(10) m h-1 that can have profound implications for ocean biogeochemistry.
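One standard way to recover divergence and vorticity from a drifter cluster, assumed here purely for illustration rather than as the authors' exact procedure, is a least-squares fit of a locally linear velocity field to the drifter positions and velocities.

```python
import numpy as np

def flow_kinematics(x, y, u, v):
    """Least-squares fit of u = u0 + ux*x + uy*y and v = v0 + vx*x + vy*y
    to a drifter cluster; returns horizontal divergence and relative vorticity."""
    A = np.column_stack([np.ones_like(x), x, y])
    (u0, ux, uy), *_ = np.linalg.lstsq(A, u, rcond=None)
    (v0, vx, vy), *_ = np.linalg.lstsq(A, v, rcond=None)
    divergence = ux + vy
    vorticity = vx - uy
    return divergence, vorticity

# Toy example: a purely divergent flow with div = 2e-4 s^-1 and zero vorticity.
rng = np.random.default_rng(1)
x, y = rng.uniform(-500, 500, 20), rng.uniform(-500, 500, 20)   # drifter positions, m
u, v = 1e-4 * x, 1e-4 * y                                        # drifter velocities, m/s
div, vort = flow_kinematics(x, y, u, v)

f = 1e-4                                 # typical mid-latitude Coriolis frequency, s^-1
print(f"div/f = {div / f:.2f}, vort/f = {vort / f:.2f}")
```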
Hingerl, Ferdinand F.; Yang, Feifei; Pini, Ronny; ...
2016-02-02
In this paper we present the results of an extensive multiscale characterization of the flow properties and structural and capillary heterogeneities of the Heletz sandstone. We performed petrographic, porosity and capillary pressure measurements on several subsamples. We quantified mm-scale heterogeneity in saturation distributions in a rock core during multi-phase flow using conventional X-ray CT scanning. Core-flooding experiments were conducted under reservoir conditions (9 MPa, 50 °C) to obtain primary drainage and secondary imbibition relative permeabilities, and residual trapping was analyzed and quantified. We provide parameters for relative permeability, capillary pressure and trapping models for further modeling studies. A synchrotron-based microtomography study complements our cm- to mm-scale investigation by providing links between the micromorphology and mm-scale saturation heterogeneities.
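The abstract does not state which relative permeability, capillary pressure, or trapping models were parameterized; the sketch below uses the common Brooks-Corey (Burdine) and Land forms with placeholder parameters purely as an illustration of the kind of models such parameters feed.

```python
import numpy as np

def brooks_corey(sw, swr=0.2, lam=2.0, pe_kpa=5.0):
    """Brooks-Corey capillary pressure and Burdine relative permeabilities."""
    se = np.clip((sw - swr) / (1.0 - swr), 1e-6, 1.0)   # effective wetting saturation
    pc = pe_kpa * se ** (-1.0 / lam)                     # capillary pressure, kPa
    krw = se ** ((2.0 + 3.0 * lam) / lam)                # wetting-phase rel. perm.
    krn = (1.0 - se) ** 2 * (1.0 - se ** ((2.0 + lam) / lam))  # non-wetting rel. perm.
    return pc, krw, krn

def land_trapped(sni, c_land=1.5):
    """Land (1968) model for the residually trapped non-wetting saturation."""
    return sni / (1.0 + c_land * sni)

sw = np.linspace(0.25, 1.0, 4)
pc, krw, krn = brooks_corey(sw)
print("Sw  :", np.round(sw, 2))
print("Pc  :", np.round(pc, 2), "kPa")
print("krw :", np.round(krw, 3))
print("krn :", np.round(krn, 3))
print("trapped saturation after drainage to Sni = 0.4:", round(land_trapped(0.4), 3))
```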
NASA Astrophysics Data System (ADS)
Huang, M.
2016-12-01
Earth System models (ESMs) are effective tools for investigating the water-energy-food system interactions under climate change. In this presentation, I will introduce research efforts at the Pacific Northwest National Laboratory towards quantifying impacts of LULCC on the water-energy-food nexus in a changing climate using an integrated regional Earth system modeling framework: the Platform for Regional Integrated Modeling and Analysis (PRIMA). Two studies will be discussed to showcase the capability of PRIMA: (1) quantifying changes in terrestrial hydrology over the Conterminous US (CONUS) from 2005 to 2095 using the Community Land Model (CLM) driven by high-resolution downscaled climate and land cover products from PRIMA, which was designed for assessing the impacts of and potential responses to climate and anthropogenic changes at regional scales; (2) applying CLM over the CONUS to provide the first county-scale model validation of simulated crop yields and to assess the associated impacts on the water and energy budgets. The studies demonstrate the benefits of incorporating and coupling human activities into complex ESMs, and the critical need to account for the biogeophysical and biogeochemical effects of LULCC in climate impacts studies and in designing mitigation and adaptation strategies at a scale meaningful for decision-making. Future directions in quantifying LULCC impacts on the water-energy-food nexus under a changing climate, as well as feedbacks among climate, energy production and consumption, and natural/managed ecosystems using an Integrated Multi-scale, Multi-sector Modeling framework will also be discussed.
Quantitative pilomotor axon reflex test: a novel test of pilomotor function.
Siepmann, Timo; Gibbons, Christopher H; Illigens, Ben M; Lafo, Jacob A; Brown, Christopher M; Freeman, Roy
2012-11-01
Cutaneous autonomic function can be quantified by the assessment of sudomotor and vasomotor responses. Although piloerector muscles are innervated by the sympathetic nervous system, there are at present no methods to quantify pilomotor function. To quantify piloerection using phenylephrine hydrochloride in humans. Pilot study. Hospital-based study. Twenty-two healthy volunteers (18 males, 4 females) aged 24 to 48 years participated in 6 studies. Piloerection was stimulated by iontophoresis of 1% phenylephrine. Silicone impressions of piloerection were quantified by number and area. The direct and indirect responses to phenylephrine iontophoresis were compared on both forearms after pretreatment with topical and subcutaneous lidocaine and iontophoresis of normal saline. Iontophoresis of phenylephrine induced piloerection in both the direct and axon reflex–mediated regions, with similar responses in both arms. Topical lidocaine blocked axon reflex–mediated piloerection post-iontophoresis (mean [SD], 66.6 [19.2] for control impressions vs 7.2 [4.3] for lidocaine impressions; P < .001). Subcutaneous lidocaine completely blocked piloerection. The area of axon reflex–mediated piloerection was also attenuated in the lidocaine-treated region post-iontophoresis (mean [SD], 46.2 [16.1] cm2 vs 7.2 [3.9] cm2; P < .001). Piloerection was delayed in the axon reflex region compared with the direct region. Normal saline did not cause piloerection. Phenylephrine provoked piloerection directly and indirectly through an axon reflex–mediated response that is attenuated by lidocaine. Piloerection is not stimulated by iontophoresis of normal saline alone. The quantitative pilomotor axon reflex test (QPART) may complement other measures of cutaneous autonomic nerve fiber function.
Method for obtaining chromosome painting probes
Lucas, Joe N.
2000-01-01
A method is provided for determining a clastogenic signature of a sample of chromosomes by quantifying a frequency of a first type of chromosome aberration present in the sample; quantifying a frequency of a second, different type of chromosome aberration present in the sample; and comparing the frequency of the first type of chromosome aberration to the frequency of the second type of chromosome aberration. A method is also provided for using that clastogenic signature to identify a clastogenic agent or dosage to which the cells were exposed.
Fatigue In Continuous-Fiber/Metal-Matrix Composites
NASA Technical Reports Server (NTRS)
Johnson, William S.
1992-01-01
Report describes experimental approaches to quantification of fatigue damage in metal-matrix composites (MMC's). Discusses number of examples of development of damage and failure along with associated analytical models of behavior of MMC. Objectives of report are twofold. First, present experimental procedures and techniques for conducting meaningful fatigue tests to detect and quantify fatigue damage in MMC's. Second, present examples of how fatigue damage initiates and grows in various MMC's. Report furnishes some insight into what type of fatigue damage occurs and how damage is quantified.
NASA Astrophysics Data System (ADS)
Carpenter, Matthew H.; Jernigan, J. G.
2007-05-01
We present examples of an analysis progression consisting of a synthesis of the Photon Clean Method (Carpenter, Jernigan, Brown, Beiersdorfer 2007) and bootstrap methods to quantify errors and variations in many-parameter models. The Photon Clean Method (PCM) works well for model spaces with large numbers of parameters proportional to the number of photons; therefore, a Monte Carlo paradigm is a natural numerical approach. Consequently, PCM, an "inverse Monte-Carlo" method, requires a new approach for quantifying errors as compared to common analysis methods for fitting models of low dimensionality. This presentation will explore the methodology and presentation of analysis results derived from a variety of public data sets, including observations with XMM-Newton, Chandra, and other NASA missions. Special attention is given to the visualization of both data and models including dynamic interactive presentations. This work was performed under the auspices of the Department of Energy under contract No. W-7405-Eng-48. We thank Peter Beiersdorfer and Greg Brown for their support of this technical portion of a larger program related to science with the LLNL EBIT program.
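A minimal sketch of the bootstrap step is given below, assuming a generic scalar estimator applied to a toy photon-energy list rather than the full PCM model space; the data and estimator are placeholders chosen only to show the resampling mechanics.

```python
import numpy as np

def bootstrap_ci(data, estimator, n_boot=2000, alpha=0.05, seed=0):
    """Nonparametric bootstrap percentile confidence interval for an estimator."""
    rng = np.random.default_rng(seed)
    stats = np.array([estimator(rng.choice(data, size=data.size, replace=True))
                      for _ in range(n_boot)])
    lo, hi = np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return estimator(data), (lo, hi)

# Toy photon-energy list (keV); in PCM each photon carries its own model parameters.
photon_energies = np.random.default_rng(2).gamma(shape=2.0, scale=1.5, size=500)
est, (lo, hi) = bootstrap_ci(photon_energies, np.mean)
print(f"mean energy = {est:.2f} keV, 95% bootstrap CI = [{lo:.2f}, {hi:.2f}]")
```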
Sodium and T1ρ MRI for molecular and diagnostic imaging of articular cartilage†
Borthakur, Arijitt; Mellon, Eric; Niyogi, Sampreet; Witschey, Walter; Kneeland, J. Bruce; Reddy, Ravinder
2010-01-01
In this article, both sodium magnetic resonance (MR) and T1ρ relaxation mapping aimed at measuring molecular changes in cartilage for the diagnostic imaging of osteoarthritis are reviewed. First, an introduction to the structure of cartilage, its degeneration in osteoarthritis (OA) and an outline of diagnostic imaging methods in quantifying molecular changes and early diagnostic aspects of cartilage degeneration are described. The sodium MRI section begins with a brief overview of the theory of sodium NMR of biological tissues and is followed by a section on multiple quantum filters that can be used to quantify both bi-exponential relaxation and residual quadrupolar interaction. Specifically, (i) the rationale behind the use of sodium MRI in quantifying proteoglycan (PG) changes, (ii) validation studies using biochemical assays, (iii) studies on human OA specimens, (iv) results on animal models and (v) clinical imaging protocols are reviewed. Results demonstrating the feasibility of quantifying PG in OA patients and comparison with that in healthy subjects are also presented. The section concludes with the discussion of advantages and potential issues with sodium MRI and the impact of new technological advancements (e.g. ultra-high field scanners and parallel imaging methods). In the theory section on T1ρ, brief descriptions of (i) principles of measuring T1ρ relaxation, (ii) pulse sequences for computing T1ρ relaxation maps, (iii) issues regarding radio frequency power deposition, (iv) mechanisms that contribute to T1ρ in biological tissues and (v) effects of exchange and dipolar interaction on T1ρ dispersion are given. Correlation of T1ρ relaxation rate with macromolecular content and biomechanical properties in cartilage specimens subjected to trypsin and cytokine-induced glycosaminoglycan depletion and validation against biochemical assay and histopathology are presented. Experimental T1ρ data from osteoarthritic specimens, animal models, healthy human subjects as well as from osteoarthritic patients are provided. The current status of T1ρ relaxation mapping of cartilage and future directions are also discussed. PMID:17075961
Nonlinear time series analysis of normal and pathological human walking
NASA Astrophysics Data System (ADS)
Dingwell, Jonathan B.; Cusumano, Joseph P.
2000-12-01
Characterizing locomotor dynamics is essential for understanding the neuromuscular control of locomotion. In particular, quantifying dynamic stability during walking is important for assessing people who have a greater risk of falling. However, traditional biomechanical methods of defining stability have not quantified the resistance of the neuromuscular system to perturbations, suggesting that more precise definitions are required. For the present study, average maximum finite-time Lyapunov exponents were estimated to quantify the local dynamic stability of human walking kinematics. Local scaling exponents, defined as the local slopes of the correlation sum curves, were also calculated to quantify the local scaling structure of each embedded time series. Comparisons were made between overground and motorized treadmill walking in young healthy subjects and between diabetic neuropathic (NP) patients and healthy controls (CO) during overground walking. A modification of the method of surrogate data was developed to examine the stochastic nature of the fluctuations overlying the nominally periodic patterns in these data sets. Results demonstrated that having subjects walk on a motorized treadmill artificially stabilized their natural locomotor kinematics by small but statistically significant amounts. Furthermore, a paradox previously present in the biomechanical literature that resulted from mistakenly equating variability with dynamic stability was resolved. By slowing their self-selected walking speeds, NP patients adopted more locally stable gait patterns, even though they simultaneously exhibited greater kinematic variability than CO subjects. Additionally, the loss of peripheral sensation in NP patients was associated with statistically significant differences in the local scaling structure of their walking kinematics at those length scales where it was anticipated that sensory feedback would play the greatest role. Lastly, stride-to-stride fluctuations in the walking patterns of all three subject groups were clearly distinguishable from linearly autocorrelated Gaussian noise. As a collateral benefit of the methodological approach taken in this study, some of the first steps at characterizing the underlying structure of human locomotor dynamics have been taken. Implications for understanding the neuromuscular control of locomotion are discussed.
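A compact, Rosenstein-style sketch of the finite-time maximum Lyapunov estimate is given below; the embedding parameters, Theiler window, fitting region, and toy signal are all assumptions for illustration and do not reproduce the authors' exact processing of walking kinematics.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding of a 1-D series into `dim` dimensions with lag `tau`."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def max_lyapunov(x, dim=5, tau=10, theiler=50, horizon=100, dt=1.0, fit_len=20):
    """Rosenstein-style estimate of the largest finite-time Lyapunov exponent."""
    Y = delay_embed(np.asarray(x, dtype=float), dim, tau)
    n = len(Y)
    d = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=2)
    idx = np.arange(n)
    # Exclude self-matches and temporally close neighbours (Theiler window).
    d[np.abs(idx[:, None] - idx[None, :]) <= theiler] = np.inf
    nn = np.argmin(d, axis=1)
    log_div = []
    for k in range(1, horizon):                # mean log divergence of neighbour pairs
        valid = (idx + k < n) & (nn + k < n)
        sep = np.linalg.norm(Y[idx[valid] + k] - Y[nn[valid] + k], axis=1)
        log_div.append(np.mean(np.log(sep[sep > 0])))
    t = np.arange(1, horizon) * dt
    slope, _ = np.polyfit(t[:fit_len], np.array(log_div)[:fit_len], 1)
    return slope                               # slope of the initial linear region

# Toy quasi-periodic "gait-like" signal sampled at 50 Hz with a little noise.
dt = 0.02
t = np.arange(0, 20, dt)
signal = np.sin(2 * np.pi * t) + 0.05 * np.random.default_rng(3).standard_normal(t.size)
print(f"lambda_max ≈ {max_lyapunov(signal, dt=dt):.3f} 1/s")
```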
Zhao, Chang; Sander, Heather A
2015-01-01
Studies that assess the distribution of benefits provided by ecosystem services across urban areas are increasingly common. Nevertheless, current knowledge of both the supply and demand sides of ecosystem services remains limited, leaving a gap in our understanding of the balance between ecosystem service supply and demand that restricts our ability to assess and manage these services. The present study seeks to fill this gap by developing and applying an integrated approach to quantifying the supply and demand of a key ecosystem service, carbon storage and sequestration, at the local level. This approach follows three basic steps: (1) quantifying and mapping service supply based upon Light Detection and Ranging (LiDAR) processing and allometric models, (2) quantifying and mapping demand for carbon sequestration using an indicator based on local anthropogenic CO2 emissions, and (3) mapping a supply-to-demand ratio. We illustrate this approach using a portion of the Twin Cities Metropolitan Area of Minnesota, USA. Our results indicate that 1735.69 million kg carbon are stored by urban trees in our study area. Annually, 33.43 million kg carbon are sequestered by trees, whereas 3087.60 million kg carbon are emitted by human sources. Thus, the carbon sequestration service provided by urban trees in the study location plays a minor role in combating climate change, offsetting approximately 1% of local anthropogenic carbon emissions per year, although avoided emissions via storage in trees are substantial. Our supply-to-demand ratio map provides insight into the balance between carbon sequestration supply in urban trees and demand for such sequestration at the local level, pinpointing critical locations where higher levels of supply and demand exist. Such a ratio map could help planners and policy makers to assess and manage the supply of and demand for carbon sequestration.
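The roughly 1% offset quoted above follows directly from the reported study-area totals; a minimal calculation of the annual supply-to-demand ratio:

```python
# Values taken from the abstract (million kg carbon; fluxes are per year).
storage_stock = 1735.69       # carbon stored in urban trees
sequestration = 33.43         # annual carbon sequestration by trees
emissions = 3087.60           # annual anthropogenic carbon emissions

supply_to_demand = sequestration / emissions
print(f"supply-to-demand ratio = {supply_to_demand:.4f} "
      f"(≈ {100 * supply_to_demand:.1f}% of annual emissions offset)")
```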
NASA Astrophysics Data System (ADS)
Schumacher, Sandra; Pierau, Roberto; Wirth, Wolfgang
2017-04-01
In recent years, the development of geothermal plants in Germany has increased significantly due to a favorable political setting and resulting financial incentives. However, most projects are developed by local communities or private investors, which cannot afford a failed project. To cover the risk of total loss if the geothermal well should not provide the energy output necessary for an economically viable project, investors try to procure insurance for this worst-case scenario. In order to issue such insurance, the insurance companies insist on so-called probability-of-success studies (POS studies), in which the geological risk of not achieving the necessary temperatures and/or flow rates for an economically successful project is quantified. Quantifying the probability of reaching a minimum temperature, which has to be defined by the project investors, is relatively straightforward as subsurface temperatures in Germany are comparatively well known due to tens of thousands of hydrocarbon wells. Moreover, for the German Molasse Basin a method to characterize the hydraulic potential of a site based on pump test analysis has been developed and refined in recent years. However, quantifying the probability of reaching a given flow rate with a given drawdown is much more challenging in areas where pump test data are generally not available (e.g. the North German Basin). Therefore, a new method based on log- and core-derived porosity and permeability data was developed to quantify the geological risk of reaching a determined flow rate in such areas. We present both methods for POS studies and show how subsurface data such as pump tests or log and core measurements can be used to predict the chances of a potential geothermal project from a geological point of view.
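For illustration only, the sketch below turns assumed log/core-derived permeability and thickness distributions into a probability of exceeding a target flow rate via a Monte Carlo productivity proxy; the distributions, productivity factor, and target are placeholders and this is not the method or the values developed for the North German Basin.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Hypothetical log/core-derived distributions (placeholders, not basin-specific values).
perm_mD = rng.lognormal(mean=np.log(150.0), sigma=0.8, size=n)   # permeability, mD
thickness = rng.normal(40.0, 8.0, size=n).clip(min=5.0)          # net aquifer thickness, m

# Crude productivity proxy: achievable rate at fixed drawdown taken proportional to k*h.
C = 0.01                                 # hypothetical productivity factor, (L/s)/(mD*m)
rate_Ls = C * perm_mD * thickness        # simulated achievable flow rate, L/s

target_rate = 50.0                       # flow rate required for economic viability, L/s
pos = np.mean(rate_Ls >= target_rate)
print(f"probability of success (rate >= {target_rate:.0f} L/s): {pos:.2f}")
```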
Steinlin, Christine; Bogdal, Christian; Pavlova, Pavlina A; Schwikowski, Margit; Lüthi, Martin P; Scheringer, Martin; Schmid, Peter; Hungerbühler, Konrad
2015-12-15
We present results from a chemical fate model quantifying incorporation of polychlorinated biphenyls (PCBs) into the Silvretta glacier, a temperate Alpine glacier located in Switzerland. Temperate glaciers, in contrast to cold glaciers, are glaciers where melt processes are prevalent. Incorporation of PCBs into cold glaciers has been quantified in previous studies. However, the fate of PCBs in temperate glaciers has never been investigated. In the model, we include melt processes, inducing elution of water-soluble substances and, conversely, enrichment of particles and particle-bound chemicals. The model is validated by comparing modeled and measured PCB concentrations in an ice core collected in the Silvretta accumulation area. We quantify PCB incorporation between 1900 and 2010, and discuss the fate of six PCB congeners. PCB concentrations in the ice core peak in the period of high PCB emissions, as well as in years with strong melt. While for lower-chlorinated PCB congeners revolatilization is important, for higher-chlorinated congeners, the main processes are storage in glacier ice and removal by particle runoff. This study gives insight into PCB fate and dynamics and reveals the effect of snow accumulation and melt processes on the fate of semivolatile organic chemicals in a temperate Alpine glacier.
Quantifying Evaporation in a Permeable Pavement System
Studies quantifying evaporation from permeable pavement systems are limited to a few laboratory studies and one field application. This research quantifies evaporation for a larger-scale field application by measuring the water balance from lined permeable pavement sections. Th...
Assessment of Vulnerability to Extreme Flash Floods in Design Storms
Kim, Eung Seok; Choi, Hyun Il
2011-01-01
There has been an increase in the occurrence of sudden local flooding of great volume and short duration caused by heavy or excessive rainfall intensity over a small area, which presents a grave potential threat to the natural environment, human life, public health and property, etc. Such flash floods have rapid runoff and debris flow that rises quickly with little or no advance warning to prevent flood damage. This study develops a flash flood index through the average of same-scale relative severity factors quantifying characteristics of hydrographs generated from a rainfall-runoff model for the long-term observed rainfall data in a small ungauged study basin, and presents regression equations between rainfall characteristics and the flash flood index. The aim of this study is to develop flash flood index-duration-frequency relation curves by combining the rainfall intensity-duration-frequency relation and the flash flood index from probability rainfall data in order to evaluate vulnerability to extreme flash floods in design storms. This study is an initial effort to quantify the flash flood severity of design storms for both existing and planned flood control facilities to cope with residual flood risks due to extreme flash floods that have occurred frequently in recent years. PMID:21845165
Lai, I-Chien; Lee, Chon-Lin; Huang, Hu-Ching
2016-03-01
Transboundary transport of air pollution is a serious environmental concern as pollutants affect both human health and the environment. Many numerical approaches have been utilized to quantify the amounts of pollutants transported to receptor regions, based on emission inventories from possible source regions. However, sparse temporal-spatial observational data and uncertainty in emission inventories might make the transboundary transport contribution difficult to estimate. This study presents a conceptual quantitative approach that uses transport pathway classification in combination with curve fitting models to simulate an air pollutant concentration baseline for pollution background concentrations. This approach is used to investigate the transboundary transport contribution of atmospheric pollutants to a metropolitan area in the East Asian Pacific rim region. Trajectory analysis categorized pollution sources for the study area into three regions: East Asia, Southeast Asia, and Taiwan cities. The occurrence frequency and transboundary contribution results suggest the predominant source region is the East Asian continent. This study also presents an application to evaluate heavy pollution cases for health concerns. This new baseline construction model provides a useful tool for the study of the contribution of transboundary pollution delivered to receptors, especially for areas deficient in emission inventories and regulatory monitoring data for harmful air pollutants. Copyright © 2015 Elsevier Ltd. All rights reserved.
Chien, Jung Hung; Eikema, Diderik-Jan Anthony; Mukherjee, Mukul; Stergiou, Nicholas
2014-12-01
Feedback based balance control requires the integration of visual, proprioceptive and vestibular input to detect the body's movement within the environment. When the accuracy of sensory signals is compromised, the system reorganizes the relative contributions through a process of sensory recalibration, for upright postural stability to be maintained. Whereas this process has been studied extensively in standing using the Sensory Organization Test (SOT), less is known about these processes in more dynamic tasks such as locomotion. In the present study, ten healthy young adults performed the six conditions of the traditional SOT to quantify standing postural control when exposed to sensory conflict. The same subjects performed these six conditions using a novel experimental paradigm, the Locomotor SOT (LSOT), to study dynamic postural control during walking under similar types of sensory conflict. To quantify postural control during walking, the net Center of Pressure sway variability was used. This corresponds to the Performance Index of the center of pressure trajectory, which is used to quantify postural control during standing. Our results indicate that dynamic balance control during locomotion in healthy individuals is affected by the systematic manipulation of multisensory inputs. The sway variability patterns observed during locomotion reflect similar balance performance with standing posture, indicating that similar feedback processes may be involved. However, the contribution of visual input is significantly increased during locomotion, compared to standing in similar sensory conflict conditions. The increased visual gain in the LSOT conditions reflects the importance of visual input for the control of locomotion. Since balance perturbations tend to occur in dynamic tasks and in response to environmental constraints not present during the SOT, the LSOT may provide additional information for clinical evaluation on healthy and deficient sensory processing.
Computed tomography-based volumetric tool for standardized measurement of the maxillary sinus
Giacomini, Guilherme; Pavan, Ana Luiza Menegatti; Altemani, João Mauricio Carrasco; Duarte, Sergio Barbosa; Fortaleza, Carlos Magno Castelo Branco; Miranda, José Ricardo de Arruda
2018-01-01
Volume measurements of maxillary sinus may be useful to identify diseases affecting paranasal sinuses. However, literature shows a lack of consensus in studies measuring the volume. This may be attributable to different computed tomography data acquisition techniques, segmentation methods, focuses of investigation, among other reasons. Furthermore, methods for volumetrically quantifying the maxillary sinus are commonly manual or semiautomated, which require substantial user expertise and are time-consuming. The purpose of the present study was to develop an automated tool for quantifying the total and air-free volume of the maxillary sinus based on computed tomography images. The quantification tool seeks to standardize maxillary sinus volume measurements, thus allowing better comparisons and determinations of factors that influence maxillary sinus size. The automated tool utilized image processing techniques (watershed, threshold, and morphological operators). The maxillary sinus volume was quantified in 30 patients. To evaluate the accuracy of the automated tool, the results were compared with manual segmentation that was performed by an experienced radiologist using a standard procedure. The mean percent differences between the automated and manual methods were 7.19% ± 5.83% and 6.93% ± 4.29% for total and air-free maxillary sinus volume, respectively. Linear regression and Bland-Altman statistics showed good agreement and low dispersion between both methods. The present automated tool for maxillary sinus volume assessment was rapid, reliable, robust, accurate, and reproducible and may be applied in clinical practice. The tool may be used to standardize measurements of maxillary volume. Such standardization is extremely important for allowing comparisons between studies, providing a better understanding of the role of the maxillary sinus, and determining the factors that influence maxillary sinus size under normal and pathological conditions. PMID:29304130
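A minimal sketch of the threshold-plus-morphology step for separating air from tissue inside an already-segmented sinus region is shown below; the HU threshold, toy volume, and voxel size are assumptions, and the code is not the published tool.

```python
import numpy as np
from scipy import ndimage

def sinus_air_volume(hu_volume, mask, voxel_mm3, air_threshold=-400):
    """Total and air-free volume (mm^3) of a pre-segmented sinus region.

    hu_volume : 3-D array of CT numbers (HU)
    mask      : boolean 3-D array marking the sinus region
    """
    total_mm3 = mask.sum() * voxel_mm3
    air = (hu_volume < air_threshold) & mask
    air = ndimage.binary_opening(air, iterations=1)   # remove isolated speckle
    air_mm3 = air.sum() * voxel_mm3
    return total_mm3, total_mm3 - air_mm3             # total, air-free

# Toy volume: a spherical "sinus" that is mostly air with a tissue-like patch.
shape = (64, 64, 64)
zz, yy, xx = np.indices(shape)
mask = (zz - 32) ** 2 + (yy - 32) ** 2 + (xx - 32) ** 2 < 30 ** 2
hu = np.full(shape, 40.0)                             # surrounding soft tissue
hu[mask] = -1000.0                                    # air inside the sinus
hu[mask & (xx > 50)] = 30.0                           # mucosa-like patch in the sinus

total, air_free = sinus_air_volume(hu, mask, voxel_mm3=0.5 ** 3)
print(f"total = {total / 1000:.1f} cm^3, air-free = {air_free / 1000:.1f} cm^3")
```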
Breeding chorus indices are weakly related to estimated abundance of boreal chorus frogs
Paul Stephen Corn; Erin Muths; Amanda M. Kissel; Rick D. Scherer
2011-01-01
Call surveys used to monitor breeding choruses of anuran amphibians generate index values that are frequently used to represent the number of male frogs present, but few studies have quantified this relationship. We compared abundance of male Boreal Chorus Frogs (Pseudacris maculata), estimated using capture-recapture methods in two populations in Colorado, to call...
USDA-ARS?s Scientific Manuscript database
We have previously reported on the low lipid bioaccessibility from almond seeds during digestion in the upper gastrointestinal tract (GIT). In the present study, we quantified the lipid released during artificial mastication from four almond meals: natural raw almonds (NA), roasted almonds (RA), roa...
Digital Avionics Information System (DAIS): Impact of DAIS Concept on Life Cycle Cost. Final Report.
ERIC Educational Resources Information Center
Goclowski, John C.; And Others
Designed to identify and quantify the potential impacts of the Digital Avionics Information System (DAIS) on weapon system personnel requirements and life cycle cost (LCC), this study postulated a typical close-air-support (CAS) mission avionics suite to serve as a basis for comparing present day and DAIS configuration specifications. The purpose…
Climate Change to the Year 2000: A Survey of Expert Opinion.
ERIC Educational Resources Information Center
Institute for the Future, Menlo Park, CA.
This survey of expert opinion was conducted by the National Defense University, Washington, D.C. to quantify the likelihood of significant changes in climate and their practical consequences. The major objectives of the study are embodied in four tasks. This publication presents the results of the first task only: the definition and estimation of…
Ten-year response of competing vegetation after oak shelterwood treatments in West Virginia
Gary W. Miller; James N. Kochenderfer; Jeffrey D. Kochenderfer; Kurt W. Gottschalk
2014-01-01
Successful oak regeneration depends on the relative status of advanced oak reproduction and associated competing woody vegetation present when harvests or other stand-replacing disturbances occur. This study was installed to quantify the effect of microsite light availability and deer browsing on the development of advanced northern red oak (Quercus rubra...
Hydraulic redistribution in a Douglas-fir forest: lessons from system manipulations.
J. Renée Brooks; Frederick C. Meinzer; Jeffery M. Warren; Jean-Christophe Domec; Rob Coulombe
2006-01-01
Hydraulic redistribution (HR) occurs in many ecosystems; however, key questions remain about its consequences at the ecosystem level. The objectives of the present study were to quantify seasonal variation in HR and its driving force, and to manipulate the soil-root system to elucidate physiological components controlling HR and utilization of redistributed water. In...
EPA and NIST have collaborated to establish the necessary procedures for establishing the required NIST traceability of commercially-provided Hg0 and HgCl2 reference generators. This presentation will discuss the approach of a joint EPA/NIST study to accurately quantify the tru...
Training Situation Analysis: Conducting a Needs Analysis for Teams and New Systems.
ERIC Educational Resources Information Center
Dell, Jay; Fox, John; Malcolm, Ralph
1998-01-01
The United States Coast Guard uses training situation analysis (TSA) to develop quantified training requirements, collect training and non-training performance data, and overcome turf issues to focus on performance outcomes. Presents the 1947 MLB (Motor Lifeboat Project) as a case study. Outlines 11 steps in the TSA needs analysis for teams and…
ERIC Educational Resources Information Center
Kenny, John; Fluck, Andrew; Jetson, Tim
2012-01-01
This paper presents a detailed case study of the development and implementation of a quantifiable academic workload model in the education faculty of an Australian university. Flowing from the enterprise bargaining process, the Academic Staff Agreement required the implementation of a workload allocation model for academics that was quantifiable…
2006-03-01
1989) present an innovative approach to quantifying risk. Their approach is to utilize linguistic terms or words and to systematically assign a... Together, these 15 factors were a first step in the problem of quantifying risk. These factors, and the four categories within which they fall, are
Peterson, Courtney M.; Apolzan, John W.; Wright, Courtney; Martin, Corby K.
2017-01-01
We conducted a pair of studies to test the validity, reliability, feasibility, and acceptability of using video chat technology as a novel method to quantify dietary and pill-taking (i.e., supplement and medication) adherence. In the first study, we investigated whether video chat technology can accurately quantify adherence to dietary and pill-taking interventions. Mock study participants ate food items and swallowed pills while performing randomized scripted “cheating” behaviors designed to mimic non-adherence. Monitoring was conducted in a crossover design, with two monitors watching in-person and two watching remotely by Skype on a smartphone. For the second study, a 22-question online survey was sent to an email listserv with more than 20,000 unique email addresses of past and present study participants to assess the feasibility and acceptability of the technology. For the dietary adherence tests, monitors detected 86% of non-adherent events (sensitivity) in-person versus 78% of events via video chat monitoring (p=0.12), with comparable inter-rater agreement (0.88 vs. 0.85; p=0.62). However, for pill-taking, non-adherence trended towards being more easily detected in-person than by video chat (77% vs. 60%; p=0.08), with non-significantly higher inter-rater agreement (0.85 vs. 0.69; p=0.21). Survey results from the second study (N=1,076 respondents; at least a 5% response rate) indicated that 86.4% of study participants had video chatting hardware, 73.3% were comfortable using the technology, and 79.8% were willing to use it for clinical research. Given the capability of video chat technology to reduce participant burden and to outperform other adherence monitoring methods such as dietary self-report and pill counts, video chatting is a novel and highly promising platform to quantify dietary and pill-taking adherence. PMID:27753427
Heymsfield, Steven B.; Ebbeling, Cara B.; Zheng, Jolene; Pietrobelli, Angelo; Strauss, Boyd J.; Silva, Analiza M.; Ludwig, David S.
2015-01-01
Excess adiposity is the main phenotypic feature that defines human obesity and that plays a pathophysiological role in most chronic diseases. Measuring the amount of fat mass present is thus a central aspect of studying obesity at the individual and population levels. Nevertheless, a consensus is lacking among investigators on a single accepted “reference” approach for quantifying fat mass in vivo. While the research community generally relies on the multicomponent body-volume class of “reference” models for quantifying fat mass, no definable guide discerns among different applied equations for partitioning the four (fat, water, protein, and mineral mass) or more quantified components, standardizes “adjustment” or measurement system approaches for model-required labeled water dilution volumes and bone mineral mass estimates, or firmly establishes the body temperature at which model physical properties are assumed. The resulting differing reference strategies for quantifying body composition in vivo lead to small but, under some circumstances, important differences in the amount of measured body fat. Recent technological advances highlight opportunities to expand model applications to new subject groups and measured components such as total body protein. The current report reviews the historical evolution of multicomponent body volume-based methods in the context of prevailing uncertainties and future potential. PMID:25645009
Live cell interferometry quantifies dynamics of biomass partitioning during cytokinesis.
Zangle, Thomas A; Teitell, Michael A; Reed, Jason
2014-01-01
The equal partitioning of cell mass between daughters is the usual and expected outcome of cytokinesis for self-renewing cells. However, most studies of partitioning during cell division have focused on daughter cell shape symmetry or segregation of chromosomes. Here, we use live cell interferometry (LCI) to quantify the partitioning of daughter cell mass during and following cytokinesis. We use adherent and non-adherent mouse fibroblast and mouse and human lymphocyte cell lines as models and show that, on average, mass asymmetries present at the time of cleavage furrow formation persist through cytokinesis. The addition of multiple cytoskeleton-disrupting agents leads to increased asymmetry in mass partitioning which suggests the absence of active mass partitioning mechanisms after cleavage furrow positioning.
Visual degradation in Leonardo da Vinci's iconic self-portrait: A nanoscale study
NASA Astrophysics Data System (ADS)
Conte, A. Mosca; Pulci, O.; Misiti, M. C.; Lojewska, J.; Teodonio, L.; Violante, C.; Missori, M.
2014-06-01
The discoloration of ancient paper, due to the development of oxidized groups acting as chromophores in its chief component, cellulose, is responsible for severe visual degradation in ancient artifacts. By adopting a non-destructive approach based on the combination of optical reflectance measurements and time-dependent density functional theory ab-initio calculations, we describe and quantify the chromophores affecting Leonardo da Vinci's iconic self-portrait. Their relative concentrations are very similar to those measured in modern and ancient samples aged in humid environments. This analysis quantifies the present level of optical degradation of Leonardo da Vinci's self-portrait which, compared with future measurements, will allow its degradation rate to be assessed. This is fundamental information for planning appropriate conservation strategies.
The Role and Quality of Software Safety in the NASA Constellation Program
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor R.; Zelkowitz, Marvin V.
2010-01-01
In this study, we examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Obtaining an accurate, program-wide picture of software safety risk is difficult across multiple, independently-developing systems. We leverage one source of safety information, hazard analysis, to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. The goal of this research is two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to quantify the level of risk presented by software in the hazard analysis. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. To quantify the importance of software, we collected metrics based on the number of software-related causes and controls of hazardous conditions. To quantify the level of risk presented by software, we created a metric scheme to measure the specificity of these software causes. We found that 49-70% of hazardous conditions in the three systems could be caused by software or that software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. Furthermore, 10-12% of all controls were software-based. There is potential for inaccuracy in these counts, however, as software causes are not consistently scoped, and the presence of software in a cause or control is not always clear. The application of our software specificity metrics also identified risks in the hazard reporting process. In particular, we found that a number of traceability risks in the hazard reports may impede verification of software and system safety.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harman-Ware, Anne E.; Sykes, Robert; Peter, Gary F.
Terpenoids, naturally occurring compounds derived from isoprene units present in pine oleoresin, are a valuable source of chemicals used in solvents, fragrances, flavors, and have shown potential use as a biofuel. This paper describes a method to extract and analyze the terpenoids present in loblolly pine saplings and pine lighter wood. Various extraction solvents were tested over different times and temperatures. Samples were analyzed by pyrolysis-molecular beam mass spectrometry before and after extractions to monitor the extraction efficiency. The pyrolysis studies indicated that the optimal extraction method used a 1:1 hexane/acetone solvent system at 22°C for 1 h. Extracts from the hexane/acetone experiments were analyzed using a low thermal mass modular accelerated column heater for fast-GC/FID analysis. The most abundant terpenoids from the pine samples were quantified, using standard curves, and included the monoterpenes, α- and β-pinene, camphene, and δ-carene. Sesquiterpenes analyzed included caryophyllene, humulene, and α-bisabolene. In conclusion, diterpenoid resin acids were quantified in derivatized extractions, including pimaric, isopimaric, levopimaric, palustric, dehydroabietic, abietic, and neoabietic acids.
Explanation of Change Cost and Schedule Growth Study Interim Status Briefing
NASA Technical Reports Server (NTRS)
Croonce, Thomas; Bitten, Bob; Emmons, Debra
2010-01-01
This slide presentation reviews the study to understand the changes in cost and schedule growth for NASA projects. A second goal was to determine the percentage of growth that was outside the control of the project. The study examined project documentation, conducted interviews with key project personnel, and allocated growth events to an Explanation of Change (EoC) tree to quantify the reasons for growth in the scheduled time. This briefing reviews the results of the study of the first 20 missions.
2015-01-01
Do negative quantifiers like “few” reduce people’s ability to rapidly evaluate incoming language with respect to world knowledge? Previous research has addressed this question by examining whether online measures of quantifier comprehension match the “final” interpretation reflected in verification judgments. However, these studies confounded quantifier valence with its impact on the unfolding expectations for upcoming words, yielding mixed results. In the current event-related potentials study, participants read negative and positive quantifier sentences matched on cloze probability and on truth-value (e.g., “Most/Few gardeners plant their flowers during the spring/winter for best results”). Regardless of whether participants explicitly verified the sentences or not, true-positive quantifier sentences elicited reduced N400s compared with false-positive quantifier sentences, reflecting the facilitated semantic retrieval of words that render a sentence true. No such facilitation was seen in negative quantifier sentences. However, mixed-effects model analyses (with cloze value and truth-value as continuous predictors) revealed that decreasing cloze values were associated with an interaction pattern between truth-value and quantifier, whereas increasing cloze values were associated with more similar truth-value effects regardless of quantifier. Quantifier sentences are thus understood neither always in 2 sequential stages, nor always in a partial-incremental fashion, nor always in a maximally incremental fashion. Instead, and in accordance with prediction-based views of sentence comprehension, quantifier sentence comprehension depends on incorporation of quantifier meaning into an online, knowledge-based prediction for upcoming words. Fully incremental quantifier interpretation occurs when quantifiers are incorporated into sufficiently strong online predictions for upcoming words. PMID:26375784
Incremental generation of answers during the comprehension of questions with quantifiers.
Bott, Oliver; Augurzky, Petra; Sternefeld, Wolfgang; Ulrich, Rolf
2017-09-01
The paper presents a study on the online interpretation of quantified questions involving complex domain restriction, for instance, are all triangles blue that are in the circle. Two probe reaction time (RT) task experiments were conducted to study the incremental nature of answer generation while manipulating visual contexts and response hand overlap between tasks. We manipulated the contexts in such a way that the incremental answer to the question changed from 'yes' to 'no' or remained the same before and after encountering the extraposed relative clause. The findings of both experiments provide evidence for incremental answer preparation but only if the context did not involve the risk of answer revision. Our results show that preliminary output from incremental semantic interpretation results in response priming that facilitates congruent responses in the probe RT task. Copyright © 2017 Elsevier B.V. All rights reserved.
Civil helicopter propulsion system reliability and engine monitoring technology assessments
NASA Technical Reports Server (NTRS)
Murphy, J. A.; Zuk, J.
1982-01-01
A study to reduce operating costs of helicopters, particularly directed at the maintenance of the propulsion subsystem, is presented. The tasks of the study consisted of problem definition refinement, technology solutions, diagnostic system concepts, and emergency power augmentation. Quantifiable benefits (reduced fuel consumption, on-condition engine maintenance, extended drive system overhaul periods, and longer oil change intervals) would increase the initial cost by $43,000, but the benefit of $24.46 per hour would result in breakeven at 1758 hours. Other benefits not capable of being quantified but perhaps more important include improved aircraft availability due to reduced maintenance time, potential for increased operating limits due to continuous automatic monitoring of gages, and less time and fuel required to make engine power checks. The most important improvement is the on-condition maintenance program, which will require the development of algorithms, equipment, and procedures compatible with all operating environments.
Random Constructions in Bell Inequalities: A Survey
NASA Astrophysics Data System (ADS)
Palazuelos, Carlos
2018-01-01
Initially motivated by their relevance in foundations of quantum mechanics and more recently by their applications in different contexts of quantum information science, violations of Bell inequalities have been extensively studied in recent years. In particular, an important effort has been made in order to quantify such Bell violations. Probabilistic techniques have been heavily used in this context with two different purposes. First, to quantify how common the phenomenon of Bell violations is; and second, to find large Bell violations in order to better understand the possibilities and limitations of this phenomenon. However, the strong mathematical content of these results has discouraged some of the potentially interested readers. The aim of the present work is to review some of the recent results in this direction by focusing on the main ideas and removing most of the technical details, so as to make this body of work more accessible to a wide audience.
Doran, Kara S.; Howd, Peter A.; Sallenger,, Asbury H.
2016-01-04
Recent studies, and most of their predecessors, use tide gage data to quantify sea level (SL) acceleration, ASL(t). In the current study, three techniques were used to calculate acceleration from tide gage data, and of those examined, it was determined that the two techniques based on sliding a regression window through the time series are more robust compared to the technique that fits a single quadratic form to the entire time series, particularly if there is temporal variation in the magnitude of the acceleration. The single-fit quadratic regression method has been the most commonly used technique in determining acceleration in tide gage data. The inability of the single-fit method to account for time-varying acceleration may explain some of the inconsistent findings between investigators. Properly quantifying ASL(t) from field measurements is of particular importance in evaluating numerical models of past, present, and future SL rise (SLR) resulting from anticipated climate change.
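A minimal sketch of the sliding-window quadratic regression, applied to a synthetic tide-gage-like record in which acceleration switches on partway through; the window length, noise level, and synthetic signal are assumptions for illustration only.

```python
import numpy as np

def sliding_acceleration(t_years, sl_mm, window_years=30):
    """Time-varying SL acceleration from a quadratic fit in a sliding window.

    The model in each window is sl = a + b*(t - tc) + 0.5*c*(t - tc)**2,
    so the acceleration is c (mm/yr^2)."""
    half = window_years / 2.0
    centers, accel = [], []
    for tc in t_years:
        in_win = np.abs(t_years - tc) <= half
        if in_win.sum() < window_years * 12:    # keep only (near-)full monthly windows
            continue
        p2, p1, p0 = np.polyfit(t_years[in_win] - tc, sl_mm[in_win], 2)
        centers.append(tc)
        accel.append(2.0 * p2)                  # leading coefficient is c/2
    return np.array(centers), np.array(accel)

# Synthetic monthly record: 2 mm/yr trend plus an acceleration that switches on in 1950.
t = np.arange(1900, 2015, 1 / 12)
rng = np.random.default_rng(5)
sl = 2.0 * (t - 1900) + 0.01 * np.clip(t - 1950, 0, None) ** 2 + rng.normal(0, 3, t.size)

centers, accel = sliding_acceleration(t, sl)
for year in (1930, 1990):
    print(f"acceleration near {year}: "
          f"{accel[np.argmin(np.abs(centers - year))]:+.3f} mm/yr^2")
```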
Kannan, Ashwin; Karumanchi, Subbalakshmi Latha; Krishna, Vinatha; Thiruvengadam, Kothai; Ramalingam, Subramaniam; Gautam, Pennathur
2014-01-01
Colonization of surfaces by bacterial cells results in the formation of biofilms. There is a need to study the factors that are important for formation of biofilms since biofilms have been implicated in the failure of semiconductor devices and implants. In the present study, the adhesion force of biofilms (formed by Pseudomonas aeruginosa) on porous silicon substrates of varying surface roughness was quantified using atomic force microscopy (AFM). The experiments were carried out to quantify the effect of surface roughness on the adhesion force of biofilm. The results show that the adhesion force increased from 1.5 ± 0.5 to 13.2 ± 0.9 nN with increase in the surface roughness of silicon substrate. The results suggest that the adhesion force of biofilm is affected by surface roughness of substrate. © 2014 Wiley Periodicals, Inc.
A novel framework for virtual prototyping of rehabilitation exoskeletons.
Agarwal, Priyanshu; Kuo, Pei-Hsin; Neptune, Richard R; Deshpande, Ashish D
2013-06-01
Human-worn rehabilitation exoskeletons have the potential to make therapeutic exercises increasingly accessible to disabled individuals while reducing the cost and labor involved in rehabilitation therapy. In this work, we propose a novel human-model-in-the-loop framework for virtual prototyping (design, control and experimentation) of rehabilitation exoskeletons by merging computational musculoskeletal analysis with simulation-based design techniques. The framework allows the design and control algorithm of an exoskeleton to be iteratively optimized in simulation. We introduce biomechanical, morphological, and controller measures to quantify the performance of the device for the optimization study. Furthermore, the framework allows one to carry out virtual experiments for testing specific "what-if" scenarios to quantify device performance and recovery progress. To illustrate the application of the framework, we present a case study wherein the design and analysis of an index-finger exoskeleton is carried out using the proposed framework.
Predicting Crystallization of Amorphous Drugs with Terahertz Spectroscopy.
Sibik, Juraj; Löbmann, Korbinian; Rades, Thomas; Zeitler, J Axel
2015-08-03
There is a controversy about the extent to which the primary and secondary dielectric relaxations influence the crystallization of amorphous organic compounds below the glass transition temperature. Recent studies also point to the importance of fast molecular dynamics on picosecond-to-nanosecond time scales with respect to the glass stability. In the present study we provide terahertz spectroscopy evidence on the crystallization of amorphous naproxen well below its glass transition temperature and confirm the direct role of Johari-Goldstein (JG) secondary relaxation as a facilitator of the crystallization. We determine the onset temperature Tβ above which the JG relaxation contributes to the fast molecular dynamics and analytically quantify the level of this contribution. We then show there is a strong correlation between the increase in the fast molecular dynamics and onset of crystallization in several chosen amorphous drugs. We believe that this technique has immediate applications to quantify the stability of amorphous drug materials.
NASA Astrophysics Data System (ADS)
Ohtaki, Yasuaki; Arif, Muhammad; Suzuki, Akihiro; Fujita, Kazuki; Inooka, Hikaru; Nagatomi, Ryoichi; Tsuji, Ichiro
This study presents an assessment of walking stability in elderly people, focusing on local dynamic stability of walking. Its main objectives were to propose a technique to quantify local dynamic stability using nonlinear time-series analyses and a portable instrument, and to investigate their reliability in revealing the efficacy of an exercise training intervention for elderly people for improvement of walking stability. The method measured three-dimensional acceleration of the upper body and computed Lyapunov exponents, thereby directly quantifying the local stability of the dynamic system. Straight level walking of young and elderly subjects was investigated in the experimental study. We compared Lyapunov exponents of young and elderly subjects, and of groups before and after the exercise intervention. Experimental results demonstrated that the exercise intervention improved local dynamic stability of walking. The proposed method was useful in revealing the effects and efficacy of the exercise intervention for elderly people.
Quantifying entanglement with witness operators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brandao, Fernando G.S.L.
2005-08-15
We present a unifying approach to the quantification of entanglement based on entanglement witnesses, which includes several already established entanglement measures such as the negativity, the concurrence, and the robustness of entanglement. We then introduce an infinite family of new entanglement quantifiers, having as its limits the best separable approximation measure and the generalized robustness. Gaussian states, states with symmetry, states constrained to super-selection rules, and states composed of indistinguishable particles are studied under the view of the witnessed entanglement. We derive new bounds to the fidelity of teleportation d_min, for the distillable entanglement E_D, and for the entanglement of formation. A particular measure, the PPT-generalized robustness, stands out due to its easy calculability and provides sharper bounds to d_min and E_D than the negativity in most of the states. We illustrate our approach studying thermodynamical properties of entanglement in the Heisenberg XXX and dimerized models.
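As a concrete illustration of one of the established quantifiers recovered by this framework (the negativity), here is a small numpy sketch for a two-qubit state; it is a generic textbook computation, not code from the paper.

```python
import numpy as np

def negativity(rho, d_a=2, d_b=2):
    """Negativity of a bipartite density matrix: (||rho^{T_B}||_1 - 1) / 2."""
    r = np.asarray(rho).reshape(d_a, d_b, d_a, d_b)
    rho_pt = r.transpose(0, 3, 2, 1).reshape(d_a * d_b, d_a * d_b)  # partial transpose on B
    return (np.abs(np.linalg.eigvalsh(rho_pt)).sum() - 1.0) / 2.0

# A Bell state is maximally entangled: negativity = 0.5
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)
print(negativity(np.outer(phi, phi)))
```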
Quantifying Heuristic Bias: Anchoring, Availability, and Representativeness.
Richie, Megan; Josephson, S Andrew
2018-01-01
Construct: Authors examined whether a new vignette-based instrument could isolate and quantify heuristic bias. Heuristics are cognitive shortcuts that may introduce bias and contribute to error. There is no standardized instrument available to quantify heuristic bias in clinical decision making, limiting future study of educational interventions designed to improve calibration of medical decisions. This study presents validity data to support a vignette-based instrument quantifying bias due to the anchoring, availability, and representativeness heuristics. Participants completed questionnaires requiring assignment of probabilities to potential outcomes of medical and nonmedical scenarios. The instrument randomly presented scenarios in one of two versions: Version A, encouraging heuristic bias, and Version B, worded neutrally. The primary outcome was the difference in probability judgments for Version A versus Version B scenario options. Of 167 participants recruited, 139 enrolled. Participants assigned significantly higher mean probability values to Version A scenario options (M = 9.56, SD = 3.75) than Version B (M = 8.98, SD = 3.76), t(1801) = 3.27, p = .001. This result remained significant analyzing medical scenarios alone (Version A, M = 9.41, SD = 3.92; Version B, M = 8.86, SD = 4.09), t(1204) = 2.36, p = .02. Analyzing medical scenarios by heuristic revealed a significant difference between Version A and B for availability (Version A, M = 6.52, SD = 3.32; Version B, M = 5.52, SD = 3.05), t(404) = 3.04, p = .003, and representativeness (Version A, M = 11.45, SD = 3.12; Version B, M = 10.67, SD = 3.71), t(396) = 2.28, p = .02, but not anchoring. Stratifying by training level, students maintained a significant difference between Version A and B medical scenarios (Version A, M = 9.83, SD = 3.75; Version B, M = 9.00, SD = 3.98), t(465) = 2.29, p = .02, but not residents or attendings. Stratifying by heuristic and training level, availability maintained significance for students (Version A, M = 7.28, SD = 3.46; Version B, M = 5.82, SD = 3.22), t(153) = 2.67, p = .008, and residents (Version A, M = 7.19, SD = 3.24; Version B, M = 5.56, SD = 2.72), t(77) = 2.32, p = .02, but not attendings. Authors developed an instrument to isolate and quantify bias produced by the availability and representativeness heuristics, and illustrated the utility of their instrument by demonstrating decreased heuristic bias within medical contexts at higher training levels.
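The headline comparisons above are independent-samples t-tests of Version A versus Version B probability judgments; a minimal scipy sketch of that type of test, with made-up numbers rather than the study data, is:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
version_a = rng.normal(9.6, 3.8, size=900)   # hypothetical Version A judgments
version_b = rng.normal(9.0, 3.8, size=900)   # hypothetical Version B judgments

t, p = stats.ttest_ind(version_a, version_b)  # two-sample t-test
df = version_a.size + version_b.size - 2
print(f"t({df}) = {t:.2f}, p = {p:.3g}")
```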
Chappell, Nick A; Jones, Timothy D; Tych, Wlodek
2017-10-15
Insufficient temporal monitoring of water quality in streams or engineered drains alters the apparent shape of storm chemographs, resulting in shifted model parameterisations and changed interpretations of the solute sources that have produced episodes of poor water quality. This so-called 'aliasing' phenomenon is poorly recognised in water research. Using advances in in-situ sensor technology, it is now possible to monitor sufficiently frequently to avoid the onset of aliasing. A systems modelling procedure is presented allowing objective identification of the sampling rates needed to avoid aliasing within strongly rainfall-driven chemical dynamics. In this study aliasing of storm chemograph shapes was quantified by changes in the time constant parameter (TC) of transfer functions. As a proportion of the original TC, the onset of aliasing varied between watersheds, ranging from 3.9-7.7 to 54-79 %TC (or 110-160 to 300-600 min). However, a minimum monitoring rate could be identified for all datasets if the modelling results were presented in the form of a new statistic, ΔTC. For the eight H+, DOC and NO3-N datasets examined from a range of watershed settings, an empirically-derived threshold of 1.3(ΔTC) could be used to quantify minimum monitoring rates within sampling protocols to avoid artefacts in subsequent data analysis. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
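To make the time-constant (TC) idea concrete, the sketch below fits a simple first-order discrete transfer function to a synthetic chemograph at two sampling rates and forms a ΔTC-like statistic; the model form, rainfall input, and numbers are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_time_constant(u, y, dt):
    """Least-squares fit of a first-order discrete transfer function
    y[k] = a*y[k-1] + b*u[k-1]; TC is recovered as -dt / ln(a)."""
    X = np.column_stack([y[:-1], u[:-1]])
    a, b = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    return -dt / np.log(a)

# Synthetic rainfall-driven chemograph sampled every 5 min, true TC = 150 min
dt, tc_true = 5.0, 150.0
t = np.arange(0.0, 3000.0, dt)
u = (np.sin(t / 150.0) > 0.9).astype(float)          # hypothetical rainfall pulses
y = np.zeros_like(t)
for k in range(1, len(t)):
    y[k] = y[k - 1] + dt / tc_true * (u[k - 1] - y[k - 1])

tc_5min = fit_time_constant(u, y, dt)                 # close to the true TC
tc_60min = fit_time_constant(u[::12], y[::12], dt * 12)
delta_tc = abs(tc_60min - tc_5min) / tc_5min          # a Delta-TC-like statistic
print(tc_5min, tc_60min, delta_tc)
```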
Bartz, Faith E.; Lickness, Jacquelyn Sunshine; Heredia, Norma; Fabiszewski de Aceituno, Anna; Newman, Kira L.; Hodge, Domonique Watson; Jaykus, Lee-Ann; García, Santos
2017-01-01
To improve food safety on farms, it is critical to quantify the impact of environmental microbial contamination sources on fresh produce. However, studies are hampered by difficulties achieving study designs with powered sample sizes to elucidate relationships between environmental and produce contamination. Our goal was to quantify, in the agricultural production environment, the relationship between microbial contamination on hands, soil, and water and contamination on fresh produce. In 11 farms and packing facilities in northern Mexico, we applied a matched study design: composite samples (n = 636, equivalent to 11,046 units) of produce rinses were matched to water, soil, and worker hand rinses during two growing seasons. Microbial indicators (coliforms, Escherichia coli, Enterococcus spp., and somatic coliphage) were quantified from composite samples. Statistical measures of association and correlations were calculated through Spearman's correlation, linear regression, and logistic regression models. The concentrations of all microbial indicators were positively correlated between produce and hands (ρ range, 0.41 to 0.75; P < 0.01). When E. coli was present on hands, the handled produce was nine times more likely to contain E. coli (P < 0.05). Similarly, when coliphage was present on hands, the handled produce was eight times more likely to contain coliphage (P < 0.05). There were relatively low concentrations of indicators in soil and water samples, and a few sporadic significant associations were observed between contamination of soil and water and contamination of produce. This methodology provides a foundation for future field studies, and results highlight the need for interventions surrounding farmworker hygiene and sanitation to reduce microbial contamination of farmworkers' hands. IMPORTANCE This study of the relationships between microbes on produce and in the farm environment can be used to support the design of targeted interventions to prevent or reduce microbial contamination of fresh produce with associated reductions in foodborne illness. PMID:28363965
Bartz, Faith E; Lickness, Jacquelyn Sunshine; Heredia, Norma; Fabiszewski de Aceituno, Anna; Newman, Kira L; Hodge, Domonique Watson; Jaykus, Lee-Ann; García, Santos; Leon, Juan S
2017-06-01
To improve food safety on farms, it is critical to quantify the impact of environmental microbial contamination sources on fresh produce. However, studies are hampered by difficulties achieving study designs with powered sample sizes to elucidate relationships between environmental and produce contamination. Our goal was to quantify, in the agricultural production environment, the relationship between microbial contamination on hands, soil, and water and contamination on fresh produce. In 11 farms and packing facilities in northern Mexico, we applied a matched study design: composite samples (n = 636, equivalent to 11,046 units) of produce rinses were matched to water, soil, and worker hand rinses during two growing seasons. Microbial indicators (coliforms, Escherichia coli, Enterococcus spp., and somatic coliphage) were quantified from composite samples. Statistical measures of association and correlations were calculated through Spearman's correlation, linear regression, and logistic regression models. The concentrations of all microbial indicators were positively correlated between produce and hands (ρ range, 0.41 to 0.75; P < 0.01). When E. coli was present on hands, the handled produce was nine times more likely to contain E. coli (P < 0.05). Similarly, when coliphage was present on hands, the handled produce was eight times more likely to contain coliphage (P < 0.05). There were relatively low concentrations of indicators in soil and water samples, and a few sporadic significant associations were observed between contamination of soil and water and contamination of produce. This methodology provides a foundation for future field studies, and results highlight the need for interventions surrounding farmworker hygiene and sanitation to reduce microbial contamination of farmworkers' hands. IMPORTANCE This study of the relationships between microbes on produce and in the farm environment can be used to support the design of targeted interventions to prevent or reduce microbial contamination of fresh produce with associated reductions in foodborne illness. Copyright © 2017 American Society for Microbiology.
Preliminary Characterization of Erythrocytes Deformability on the Entropy-Complexity Plane
Korol, Ana M; D’Arrigo, Mabel; Foresto, Patricia; Pérez, Susana; Martín, Maria T; Rosso, Osualdo A
2010-01-01
We present an application of wavelet-based Information Theory quantifiers (Normalized Total Shannon Entropy, MPR-Statistical Complexity and the Entropy-Complexity plane) to the characterization of red blood cell membrane viscoelasticity. These quantifiers exhibit important localization advantages provided by Wavelet Theory. The present approach produces a clear characterization of this dynamical system, revealing an evident manifestation of a random process in the red cell samples of healthy individuals and a sharp reduction of randomness when analyzing samples from a human haematological disease, β-thalassaemia minor. PMID:21611139
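A minimal sketch of one of the quantifiers named above, the Normalized Total (Shannon) wavelet entropy, using PyWavelets; the wavelet family, decomposition level, and test signal are assumptions for illustration, not the study's settings.

```python
import numpy as np
import pywt

def normalized_wavelet_entropy(signal, wavelet="db4", level=5):
    """Normalized total Shannon entropy of the relative wavelet energies of a 1-D signal."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs[1:]])  # detail-band energies
    p = energies / energies.sum()                              # relative wavelet energy
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / np.log(len(energies)))

# Example: entropy of a noisy oscillation (illustrative only)
t = np.linspace(0, 1, 2048)
x = np.sin(2 * np.pi * 12 * t) + 0.5 * np.random.default_rng(1).normal(size=t.size)
print(normalized_wavelet_entropy(x))
```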
Rapid method for measuring clastogenic fingerprints using fluorescence in situ hybridization
Lucas, Joe N.
2000-01-01
A method is provided for determining a clastogenic signature of a sample of chromosomes by quantifying a frequency of a first type of chromosome aberration present in the sample; quantifying a frequency of a second, different type of chromosome aberration present in the sample; and comparing the frequency of the first type of chromosome aberration to the frequency of the second type of chromosome aberration. A method is also provided for using that clastogenic signature to identify a clastogenic agent or dosage to which the cells were exposed.
Method for detecting a pericentric inversion in a chromosome
Lucas, Joe N.
2000-01-01
A method is provided for determining a clastogenic signature of a sample of chromosomes by quantifying a frequency of a first type of chromosome aberration present in the sample; quantifying a frequency of a second, different type of chromosome aberration present in the sample; and comparing the frequency of the first type of chromosome aberration to the frequency of the second type of chromosome aberration. A method is also provided for using that clastogenic signature to identify a clastogenic agent or dosage to which the cells were exposed.
Anaphoric Reference to Quantified Antecedents: An Event-Related Brain Potential Study
ERIC Educational Resources Information Center
Filik, Ruth; Leuthold, Hartmut; Moxey, Linda M.; Sanford, Anthony J.
2011-01-01
We report an event-related brain potential (ERP) study examining how readers process sentences containing anaphoric reference to quantified antecedents. Previous studies indicate that positive (e.g. "many") and negative (e.g. "not many") quantifiers cause readers to focus on different sets of entities. For example in "Many of the fans attended the…
Murray C. Richardson; Carl P. J. Mitchell; Brian A. Branfireun; Randall K. Kolka
2010-01-01
A new technique for quantifying the geomorphic form of northern forested wetlands from airborne LiDAR surveys is introduced, demonstrating an unprecedented ability to characterize wetland geomorphic form using high-resolution digital topography. Two quantitative indices are presented, including the lagg width index (LWI), which objectively...
Mass spectroscopic apparatus and method
Bomse, David S.; Silver, Joel A.; Stanton, Alan C.
1991-01-01
The disclosure is directed to a method and apparatus for ionization modulated mass spectrometric analysis. Analog or digital data acquisition and processing can be used. Ions from a time variant source are detected and quantified. The quantified ion output is analyzed using a computer to provide a two-dimensional representation of at least one component present within an analyte.
Design of an environmental field observatory for quantifying the urban water budget
Claire Welty; Andrew J. Miller; Kenneth T. Belt; James A. Smith; Lawrence E. Band; Peter M. Groffman; Todd M. Scanlon; Juying Warner; Robert J. Ryan; Robert J. Shedlock; Michael P. McGuire
2007-01-01
Quantifying the water budget of urban areas presents special challenges, owing to the influence of subsurface infrastructure that can cause short-circuiting of natural flowpaths. In this paper we review some considerations for data collection and analysis in support of determining urban water budget components, with a particular emphasis on groundwater, using Baltimore...
Use of the alpha shape to quantify finite helical axis dispersion during simulated spine movements.
McLachlin, Stewart D; Bailey, Christopher S; Dunning, Cynthia E
2016-01-04
In biomechanical studies examining joint kinematics, the most common measurement is range of motion (ROM), yet other techniques, such as the finite helical axis (FHA), may elucidate changes in 3D motion pathology more effectively. One of the deficiencies of the FHA technique lies in quantifying the axes generated throughout a motion sequence. This study attempted to solve this issue via a computational geometric technique known as the alpha shape, which bounds a set of point data within a closed boundary similar to a convex hull. The purpose of this study was to use the alpha shape as an additional tool to visualize and quantify FHA dispersion between intact and injured cadaveric spine movements and to compare these changes to gold-standard ROM measurements. Flexion-extension, axial rotation, and lateral bending were simulated with five C5-C6 motion segments using a spinal loading simulator and an Optotrak motion tracking system. Specimens were first tested intact, followed by a simulated injury model. ROM and the FHAs were calculated post hoc, with alpha shapes and convex hulls generated from the anatomic planar intercept points of the FHAs. While both ROM and the boundary shape areas increased with injury (p<0.05), no consistent geometric trends in the alpha shape growth were identified. The alpha shape area was sensitive to the alpha value chosen, and alpha values below 2.5 created more than one closed boundary. Ultimately, the alpha shape presents as a useful technique to quantify sequences of joint kinematics described by scatter plots, such as FHA intercept data. Copyright © 2015. Published by Elsevier Ltd.
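To illustrate the alpha-shape idea, the sketch below computes a 2-D alpha-shape area from a Delaunay triangulation (keeping triangles whose circumradius is below 1/alpha, one common convention) and compares it with the convex-hull area; the points and alpha value are placeholders, not the study's FHA intercept data.

```python
import numpy as np
from scipy.spatial import ConvexHull, Delaunay

def alpha_shape_area(points, alpha):
    """Area of a 2-D alpha shape: sum of Delaunay triangles with circumradius < 1/alpha."""
    tri = Delaunay(points)
    area = 0.0
    for ia, ib, ic in tri.simplices:
        a, b, c = points[ia], points[ib], points[ic]
        la, lb, lc = (np.linalg.norm(b - c),
                      np.linalg.norm(a - c),
                      np.linalg.norm(a - b))
        s = 0.5 * (la + lb + lc)
        t_area = max(s * (s - la) * (s - lb) * (s - lc), 0.0) ** 0.5  # Heron's formula
        if t_area == 0.0:
            continue
        if la * lb * lc / (4.0 * t_area) < 1.0 / alpha:               # circumradius test
            area += t_area
    return area

# Placeholder scatter standing in for FHA planar intercept points
pts = np.random.default_rng(0).normal(size=(200, 2))
print(alpha_shape_area(pts, alpha=0.5), ConvexHull(pts).volume)       # alpha-shape vs hull area
```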
Jung, Christian; Drummer, Karl; Oelzner, Peter; Figulla, Hans R; Boettcher, Joachim; Franz, Marcus; Betge, Stefan; Foerster, Martin; Wolf, Gunter; Pfeil, Alexander
2015-01-01
Systemic sclerosis (SSc) is a systemic, autoimmune connective tissue disease characterized by vasculopathy and microvascular changes. Fluorescence Optical Imaging (FOI) is a technique used to assess inflammation in patients with arthritis; in this study FOI is used to quantify inflammation in the hand. Endothelial microparticles (EMP) can reflect damage or activation of the endothelium but also actively modulate processes of inflammation, coagulation and vascular function. The aim of the present study was to quantify EMP and FOI findings and to determine the association of these microparticles with inflammation and with endothelial function. EMP were quantified in plasma samples of 25 patients (24 female, 1 male, age: 41 ± 9 years) with SSc using flow cytometry. EMP were defined as CD31+/CD42- MP and CD62+ MP. Perivascular inflammation was assessed using fluorescence optical imaging (FOI) of the hand. Macrovascular endothelial function was non-invasively estimated using the Endopat system. Plasma levels of CD31+/CD42- EMP and CD62+ EMP were lower in patients with SSc compared to controls (both p < 0.05). Impaired endothelial function with an increased hyperemia index was observed. A strong association could be demonstrated between CD62+ EMP and perivascular soft tissue inflammation as assessed by the FOI global score (Spearman, p = 0.002, r = 0.61). EMP indicate molecular vascular damage in SSc; in this study a strong association between EMP and perivascular inflammation as quantified by FOI is demonstrated. Consequently, EMP, in combination with FOI, may be a potential marker supporting the diagnosis and therapy monitoring of patients with SSc and associated Raynaud's phenomenon.
Information transfer satellite concept study. Volume 3: Appendices
NASA Technical Reports Server (NTRS)
Bergin, P.; Kincade, C.; Kurpiewski, D.; Leinhaupel, F.; Millican, F.; Onstad, R.
1971-01-01
Briefly reviewed are the various documents which supply background information for the information transfer satellite study. The sixteen papers reviewed are evaluated in terms of: (1) the category of service or demand being treated; (2) the extent to which information transfer predictions are quantified; (3) the degree to which the data supplied is adequate for the purposes of system synthesis; (4) some assessment as to the overall utility of the data presented in the paper.
2016-08-31
crack initiation and SCG mechanisms (initiation and growth versus resistance). 2. Final summary: Here, we present a hierarchical form of multiscale... prismatic faults in α-Ti: a combined quantum mechanics/molecular mechanics study; 2. Nano-indentation and slip transfer (critical in understanding crack... initiation); 3. An extended finite element framework (XFEM) to study SCG mechanisms; 4. Atomistic methods to develop a grain and twin boundaries database.
Quantifier Comprehension in Corticobasal Degeneration
ERIC Educational Resources Information Center
McMillan, Corey T.; Clark, Robin; Moore, Peachie; Grossman, Murray
2006-01-01
In this study, we investigated patients with focal neurodegenerative diseases to examine a formal linguistic distinction between classes of generalized quantifiers, like "some X" and "less than half of X." Our model of quantifier comprehension proposes that number knowledge is required to understand both first-order and higher-order quantifiers.…
Studies on quantifying evaporation in permeable pavement systems are limited to a few laboratory studies that used a scale to weigh evaporative losses and a field application with a tunnel-evaporation gauge. A primary objective of this research was to quantify evaporation for a la...
Carl W. Adkins
1995-01-01
The Fire Image Analysis System is a tool for quantifying flame geometry and relative position at selected points along a spreading line fire. At present, the system requires uniform terrain (constant slope). The system has been used in field and laboratory studies for determining flame length, depth, cross sectional area, and rate of spread.
Gary W. Miller; James N. Kochenerfer; Kurt W. Gottschalk
2004-01-01
Successful oak regeneration is related to the size and number of advanced seedlings present when harvests occur. This study was installed to quantify the effect of microsite light availability and deer on the development of advanced northern red oak (Quercus rubra L.) reproduction in mesic Appalachian hardwood stands. Microsite light was manipulated...
An Attempt to Quantify the Economic Benefits of Scientific Research, Science Policy Studies No. 4.
ERIC Educational Resources Information Center
Byatt, I. C. R.; Cohen, A. V.
This paper presents a possible methodology for measuring and predicting the future course of the long-range economic benefits of "curiosity-oriented" research. The basic premise is that much pure research tends to give rise to major industries in about one generation. Each industry will have some total economic benefit which can be…
S. P. Urbanski
2013-01-01
In the US, wildfires and prescribed burning present significant challenges to air regulatory agencies attempting to achieve and maintain compliance with air quality regulations. Fire emission factors (EF) are essential input for the emission models used to develop wildland fire emission inventories. Most previous studies quantifying wildland fire EF of temperate...
Massot-Cladera, Malen; Franch, Àngels; Castellote, Cristina; Castell, Margarida; Pérez-Cano, Francisco J.
2013-01-01
Previous studies have reported that a diet containing 10% cocoa, a rich source of flavonoids, has immunomodulatory effects on rats and, among other effects, is able to attenuate immunoglobulin (Ig) synthesis in both systemic and intestinal compartments. The present study focused on investigating whether these effects are attributable exclusively to the flavonoid content or to other compounds present in cocoa. To this end, eight-week-old Lewis rats were fed, for two weeks, either a standard diet or one of three isoenergetic diets containing increasing proportions of cocoa flavonoids from different sources: one with 0.2% polyphenols from conventional defatted cocoa, and two others with 0.4% and 0.8% polyphenols, respectively, from non-fermented cocoa. Diet intake and body weight were monitored, and fecal samples were obtained throughout the study to determine fecal pH, IgA, bacteria proportions, and IgA-coated bacteria. Moreover, IgG and IgM concentrations in serum samples collected during the study were quantified. At the end of the dietary intervention, no clear changes in serum IgG or IgM concentrations were quantified, showing few effects of the cocoa polyphenol diets at the systemic level. However, in the intestine, all cocoa polyphenol-enriched diets attenuated the age-related increase of both fecal IgA and IgA-coated bacteria, as well as the proportion of bacteria in feces. As these effects were not dependent on the dose of polyphenol present in the diets, other compounds and/or the precise polyphenol composition of the cocoa raw material used for the diets could be key factors in this effect. PMID:23966108
PolNet: A Tool to Quantify Network-Level Cell Polarity and Blood Flow in Vascular Remodeling.
Bernabeu, Miguel O; Jones, Martin L; Nash, Rupert W; Pezzarossa, Anna; Coveney, Peter V; Gerhardt, Holger; Franco, Claudio A
2018-05-08
In this article, we present PolNet, an open-source software tool for the study of blood flow and cell-level biological activity during vessel morphogenesis. We provide an image acquisition, segmentation, and analysis protocol to quantify endothelial cell polarity in entire in vivo vascular networks. In combination, we use computational fluid dynamics to characterize the hemodynamics of the vascular networks under study. The tool enables, to our knowledge for the first time, a network-level analysis of polarity and flow for individual endothelial cells. To date, PolNet has proven invaluable for the study of endothelial cell polarization and migration during vascular patterning, as demonstrated by two recent publications. Additionally, the tool can be easily extended to correlate blood flow with other experimental observations at the cellular/molecular level. We release the source code of our tool under the Lesser General Public License. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Uncertainties in the palaeoflood record - interpreting geomorphology since 12 500 BP
NASA Astrophysics Data System (ADS)
Moloney, Jessica; Coulthard, Tom; Freer, Jim; Rogerson, Mike
2017-04-01
Recent floods in the UK have reinvigorated the national debate within academic and non-academic organisations about how we quantify risk and improve the resilience of communities to flooding. One critical aspect of that debate is to better understand and quantify the frequency of extreme floods. The research presented in this study explores the challenges and uncertainties of using longer term palaeoflood data records to improve the quantification of flood risk. The frequency of floods has been studied on short (under 100 years) and long (over 200 years) time scales. Long-term flood frequency records rely on the radiocarbon dating and interpretation of geomorphological evidence within fluvial depositional environments. However, there are limitations with the methods used to do this. Notably, the use of probability distribution functions of fluvial deposit dates does not consider any other information, such as the geomorphological context of the material and/or the type of depositional environment. This study re-analyses 776 radiocarbon-dated fluvial deposits from the UK, which have been compiled into a database, to interpret the geomorphological flood record. Initial findings indicate that even this large number of samples may be unsuitable for probabilistic methods and show an unusual sensitivity to the number of records present in the database.
Quantifying landscape linkages among giant panda subpopulations in regional scale conservation.
Qi, Dunwu; Hu, Yibo; Gu, Xiaodong; Yang, Xuyi; Yang, Guang; Wei, Fuwen
2012-06-01
Understanding habitat requirements and identifying landscape linkages are essential for the survival of isolated populations of endangered species. Currently, some of the giant panda populations are isolated, which threatens their long-term survival, particularly in the Xiaoxiangling mountains. In the present study, we quantified niche requirements and then identified potential linkages of giant panda subpopulations in the most isolated region, using ecological niche factor analysis and a least-cost path model. Giant pandas preferred habitat with conifer forest and gentle slopes (>20 to ≤30°). Based on spatial distribution of suitable habitat, linkages were identified for the Yele subpopulation to 4 other subpopulations (Liziping, Matou, Xinmin and Wanba). Their lengths ranged from 15 to 54 km. The accumulated cost ranged from 693 to 3166 and conifer forest covered over 31%. However, a variety of features (e.g. major roads, human settlements and large unforested areas) might act as barriers along the linkages for giant panda dispersal. Our analysis quantified giant panda subpopulation connectivity to ensure long-term survival. © 2012 ISZS, Blackwell Publishing and IOZ/CAS.
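A minimal least-cost path sketch in the spirit of the linkage analysis, using scikit-image's route_through_array over a hypothetical resistance raster; the raster, endpoints, and cost values are assumptions, not the study's habitat-suitability surfaces.

```python
import numpy as np
from skimage import graph

# Hypothetical resistance raster: low cost in suitable habitat, high cost across a road corridor
resistance = np.ones((200, 200))
resistance[:, 90:110] = 10.0

start, end = (20, 20), (180, 180)   # hypothetical subpopulation patch centroids (row, col)
path, cost = graph.route_through_array(resistance, start, end,
                                       fully_connected=True, geometric=True)
print(len(path), cost)              # linkage length in cells and accumulated cost
```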
NASA Astrophysics Data System (ADS)
Coddington, O. M.; Vukicevic, T.; Schmidt, K. S.; Platnick, S.
2017-08-01
We rigorously quantify the probability of liquid or ice thermodynamic phase using only shortwave spectral channels specific to the National Aeronautics and Space Administration's Moderate Resolution Imaging Spectroradiometer, Visible Infrared Imaging Radiometer Suite, and the notional future Plankton, Aerosol, Cloud, ocean Ecosystem imager. The results show that two shortwave-infrared channels (2135 and 2250 nm) provide more information on cloud thermodynamic phase than either channel alone; in one case, the probability of ice phase retrieval increases from 65 to 82% by combining 2135 and 2250 nm channels. The analysis is performed with a nonlinear statistical estimation approach, the GEneralized Nonlinear Retrieval Analysis (GENRA). The GENRA technique has previously been used to quantify the retrieval of cloud optical properties from passive shortwave observations, for an assumed thermodynamic phase. Here we present the methodology needed to extend the utility of GENRA to a binary thermodynamic phase space (i.e., liquid or ice). We apply formal information content metrics to quantify our results; two of these (mutual and conditional information) have not previously been used in the field of cloud studies.
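A toy illustration of the mutual-information metric mentioned above, computed from a joint probability table of (cloud phase, discretized measurement); the counts are invented and this is not the GENRA code.

```python
import numpy as np

def mutual_information_bits(joint):
    """Mutual information (bits) between two discrete variables given their
    joint probability table (rows: cloud phase, columns: measurement bins)."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Hypothetical joint counts: rows = (liquid, ice), columns = reflectance-ratio bins
counts = np.array([[30, 10, 5],
                   [5, 15, 35]])
print(mutual_information_bits(counts))
```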
Kinematics of the Asal rift (Djibouti) determined from the deformation of Fieale volcano.
De Chabalier, J B; Avouac, J P
1994-09-16
Because of its subaerial exposure the Asal rift segment provides an exceptional opportunity to quantify the deformation field of an active rift and assess the contribution of tectonics and volcanism to rifting processes. The present topography of the Asal rift results from the tectonic dismemberment during the last 100,000 years of a large central volcanic edifice that formed astride the rift zone 300,000 to 100,000 years ago. Three-dimensional deformation of this volcano has been quantified from the combined analysis of the topography and geology. The analysis indicates that spreading at 17 to 29 millimeters per year in a N40 degrees +/- 5 degrees E direction accounts for most of the separation between Arabia and Somalia. The small topographic subsidence relative to extension suggests that tectonic thinning of the crust has been balanced by injection and underplating of magmatic material of near crustal density. The methodology developed in this study could also be applied to quantify deformation in relatively inaccessible areas where the main available information is topography or bathymetry.
A standardized test battery for the study of synesthesia
Eagleman, David M.; Kagan, Arielle D.; Nelson, Stephanie S.; Sagaram, Deepak; Sarma, Anand K.
2014-01-01
Synesthesia is an unusual condition in which stimulation of one modality evokes sensation or experience in another modality. Although discussed in the literature well over a century ago, synesthesia slipped out of the scientific spotlight for decades because of the difficulty in verifying and quantifying private perceptual experiences. In recent years, the study of synesthesia has enjoyed a renaissance due to the introduction of tests that demonstrate the reality of the condition, its automatic and involuntary nature, and its measurable perceptual consequences. However, while several research groups now study synesthesia, there is no single protocol for comparing, contrasting and pooling synesthetic subjects across these groups. There is no standard battery of tests, no quantifiable scoring system, and no standard phrasing of questions. Additionally, the tests that exist offer no means for data comparison. To remedy this deficit we have devised the Synesthesia Battery. This unified collection of tests is freely accessible online (http://www.synesthete.org). It consists of a questionnaire and several online software programs, and test results are immediately available for use by synesthetes and invited researchers. Performance on the tests is quantified with a standard scoring system. We introduce several novel tests here, and offer the software for running the tests. By presenting standardized procedures for testing and comparing subjects, this endeavor hopes to speed scientific progress in synesthesia research. PMID:16919755
An energy-based body temperature threshold between torpor and normothermia for small mammals.
Willis, Craig K R
2007-01-01
Field studies of use of torpor by heterothermic endotherms suffer from the lack of a standardized threshold differentiating torpid body temperatures (Tb) from normothermic Tb's. This threshold can be more readily observed if metabolic rate (MR) is measured in the laboratory. I digitized figures from the literature that depicted simultaneous traces of MR and Tb from 32 respirometry runs for 14 mammal species. For each graph, I quantified the Tb measured when MR first began to drop at the onset of torpor (Tb-onset). I used a general linear model to quantify the effect of ambient temperature (Ta) and body mass (BM) on Tb-onset. For species lighter than 70 g, the model was highly significant and was described by the equation Tb-onset = (0.055 ± 0.014)BM + (0.071 ± 0.031)Ta + (31.823 ± 0.740). To be conservative, I recommend use of these model parameters minus 1 standard error, which modifies the equation to Tb-onset - 1 SE = (0.041)BM + (0.040)Ta + 31.083. This approach provides a standardized threshold for differentiating torpor from normothermia that is based on use of energy, the actual currency of interest for studies of torpor in the wild. Few laboratory studies have presented the time-course data required to quantify Tb-onset, so more data are needed to validate this relationship.
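The recommended conservative threshold can be applied directly; a one-function sketch (per the abstract, valid for species lighter than 70 g, with mass in grams and temperatures in degrees Celsius):

```python
def torpor_onset_threshold(body_mass_g, ambient_temp_c):
    """Conservative body-temperature threshold (deg C) separating torpor from
    normothermia: the published regression minus 1 SE (species < 70 g)."""
    return 0.041 * body_mass_g + 0.040 * ambient_temp_c + 31.083

# Example: a 25 g mammal at an ambient temperature of 10 deg C
print(torpor_onset_threshold(25, 10))   # ~32.5 deg C
```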
Zu Ermgassen, Philine S. E.; Spalding, Mark D.; Blake, Brady; Coen, Loren D.; Dumbauld, Brett; Geiger, Steve; Grabowski, Jonathan H.; Grizzle, Raymond; Luckenbach, Mark; McGraw, Kay; Rodney, William; Ruesink, Jennifer L.; Powers, Sean P.; Brumbaugh, Robert
2012-01-01
Historic baselines are important in developing our understanding of ecosystems in the face of rapid global change. While a number of studies have sought to determine changes in extent of exploited habitats over historic timescales, few have quantified such changes prior to late twentieth century baselines. Here, we present, to our knowledge, the first ever large-scale quantitative assessment of the extent and biomass of marine habitat-forming species over a 100-year time frame. We examined records of wild native oyster abundance in the United States from a historic, yet already exploited, baseline between 1878 and 1935 (predominantly 1885–1915), and a current baseline between 1968 and 2010 (predominantly 2000–2010). We quantified the extent of oyster grounds in 39 estuaries historically and 51 estuaries from recent times. Data from 24 estuaries allowed comparison of historic to present extent and biomass. We found evidence for a 64 per cent decline in the spatial extent of oyster habitat and an 88 per cent decline in oyster biomass over time. The difference between these two numbers illustrates that current areal extent measures may be masking significant loss of habitat through degradation. PMID:22696522
Complexity and Approximability of Quantified and Stochastic Constraint Satisfaction Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, H. B.; Marathe, M. V.; Stearns, R. E.
2001-01-01
Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S and T be arbitrary finite sets of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SATc(S)). Here, we study simultaneously the complexity of decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. We present simple yet general techniques to characterize simultaneously the complexity or efficient approximability of a number of versions/variants of the problems SAT(S), Q-SAT(S), S-SAT(S), MAX-Q-SAT(S), etc., for many different such D, C, S, T. These versions/variants include decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. Our unified approach is based on the following two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Some of the results extend earlier results in [Pa85, LMP99, CF+93, CF+94]. Our techniques and results reported here also provide significant steps towards obtaining dichotomy theorems for a number of the problems above, including the problems MAX-Q-SAT(S) and MAX-S-SAT(S). The discovery of such dichotomy theorems for unquantified formulas has received significant recent attention in the literature [CF+93, CF+94, Cr95, KSW97].
New Protocol Based on UHPLC-MS/MS for Quantitation of Metabolites in Xylose-Fermenting Yeasts
NASA Astrophysics Data System (ADS)
Campos, Christiane Gonçalves; Veras, Henrique César Teixeira; de Aquino Ribeiro, José Antônio; Costa, Patrícia Pinto Kalil Gonçalves; Araújo, Katiúscia Pereira; Rodrigues, Clenilson Martins; de Almeida, João Ricardo Moreira; Abdelnur, Patrícia Verardi
2017-12-01
Xylose fermentation is a bottleneck in second-generation ethanol production. As such, a comprehensive understanding of xylose metabolism in naturally xylose-fermenting yeasts is essential for prospection and construction of recombinant yeast strains. The objective of the current study was to establish a reliable metabolomics protocol for quantification of key metabolites of xylose catabolism pathways in yeast, and to apply this protocol to Spathaspora arborariae. Ultra-high performance liquid chromatography coupled to tandem mass spectrometry (UHPLC-MS/MS) was used to quantify metabolites, and afterwards, sample preparation was optimized to examine yeast intracellular metabolites. S. arborariae was cultivated using xylose as a carbon source under aerobic and oxygen-limited conditions. Ion pair chromatography (IPC) and hydrophilic interaction liquid chromatography-tandem mass spectrometry (HILIC-MS/MS) were shown to efficiently quantify 14 and 5 metabolites, respectively, in a more rapid chromatographic protocol than previously described. Thirteen and eleven metabolites were quantified in S. arborariae under aerobic and oxygen-limited conditions, respectively. This targeted metabolomics protocol is shown here to quantify a total of 19 metabolites, including sugars, phosphates, coenzymes, monosaccharides, and alcohols, from xylose catabolism pathways (glycolysis, pentose phosphate pathway, and tricarboxylic acid cycle) in yeast. Furthermore, to our knowledge, this is the first time that intracellular metabolites have been quantified in S. arborariae after xylose consumption. The results indicated that fine control of oxygen levels during fermentation is necessary to optimize ethanol production by S. arborariae. The protocol presented here may be applied to other yeast species and could support yeast genetic engineering to improve second generation ethanol production.
NASA Astrophysics Data System (ADS)
Owers, Christopher J.; Rogers, Kerrylee; Woodroffe, Colin D.
2018-05-01
Above-ground biomass represents a small yet significant contributor to carbon storage in coastal wetlands. Despite this, above-ground biomass is often poorly quantified, particularly in areas where vegetation structure is complex. Traditional methods for providing accurate estimates involve harvesting vegetation to develop mangrove allometric equations and quantify saltmarsh biomass in quadrats. However broad scale application of these methods may not capture structural variability in vegetation resulting in a loss of detail and estimates with considerable uncertainty. Terrestrial laser scanning (TLS) collects high resolution three-dimensional point clouds capable of providing detailed structural morphology of vegetation. This study demonstrates that TLS is a suitable non-destructive method for estimating biomass of structurally complex coastal wetland vegetation. We compare volumetric models, 3-D surface reconstruction and rasterised volume, and point cloud elevation histogram modelling techniques to estimate biomass. Our results show that current volumetric modelling approaches for estimating TLS-derived biomass are comparable to traditional mangrove allometrics and saltmarsh harvesting. However, volumetric modelling approaches oversimplify vegetation structure by under-utilising the large amount of structural information provided by the point cloud. The point cloud elevation histogram model presented in this study, as an alternative to volumetric modelling, utilises all of the information within the point cloud, as opposed to sub-sampling based on specific criteria. This method is simple but highly effective for both mangrove (r2 = 0.95) and saltmarsh (r2 > 0.92) vegetation. Our results provide evidence that application of TLS in coastal wetlands is an effective non-destructive method to accurately quantify biomass for structurally complex vegetation.
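As an illustration of a point-cloud elevation-histogram descriptor (not the authors' exact model), the numpy sketch below converts a TLS point cloud into the fraction of returns per height bin, a feature vector that could then be regressed against harvested biomass; the bin width and synthetic cloud are assumptions.

```python
import numpy as np

def elevation_histogram(points_xyz, bin_width=0.05):
    """Fraction of TLS returns per height bin above the lowest return (heights in metres)."""
    z = points_xyz[:, 2] - points_xyz[:, 2].min()
    bins = np.arange(0.0, z.max() + bin_width, bin_width)
    counts, edges = np.histogram(z, bins=bins)
    return counts / counts.sum(), edges

# Example with a synthetic point cloud standing in for a scanned mangrove/saltmarsh plot
pts = np.random.default_rng(0).uniform([0, 0, 0], [2, 2, 1.5], size=(5000, 3))
fractions, edges = elevation_histogram(pts)
print(fractions[:5], edges[:6])
```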
Shoga, Janty S; Graham, Brian T; Wang, Liyun; Price, Christopher
2017-10-01
Articular cartilage is an avascular tissue; diffusive transport is critical for its homeostasis. While numerous techniques have been used to quantify diffusivity within porous, hydrated tissues and tissue engineered constructs, these techniques have suffered from issues regarding invasiveness and spatial resolution. In the present study, we implemented and compared two separate correlation spectroscopy techniques, fluorescence correlation spectroscopy (FCS) and raster image correlation spectroscopy (RICS), for the direct, and minimally-invasive quantification of fluorescent solute diffusion in agarose and articular cartilage. Specifically, we quantified the diffusional properties of fluorescein and Alexa Fluor 488-conjugated dextrans (3k and 10k) in aqueous solutions, agarose gels of varying concentration (i.e. 1, 3, 5%), and in different zones of juvenile bovine articular cartilage explants (i.e. superficial, middle, and deep). In agarose, properties of solute diffusion obtained via FCS and RICS were inversely related to molecule size, gel concentration, and applied strain. In cartilage, the diffusional properties of solutes were similarly dependent upon solute size, cartilage zone, and compressive strain; findings that agree with work utilizing other quantification techniques. In conclusion, this study established the utility of FCS and RICS as simple and minimally invasive techniques for quantifying microscale solute diffusivity within agarose constructs and articular cartilage explants.
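For context, a minimal scipy sketch of how an FCS autocorrelation curve is typically fitted to recover a diffusion time, using a generic single-component 2-D model that may differ from the model applied in the study:

```python
import numpy as np
from scipy.optimize import curve_fit

def g_2d(tau, n, tau_d):
    """Generic single-component 2-D FCS autocorrelation: G(tau) = (1/N) / (1 + tau/tau_D)."""
    return (1.0 / n) / (1.0 + tau / tau_d)

tau = np.logspace(-6, 0, 80)                    # lag times (s), synthetic
g = g_2d(tau, 5.0, 1e-3)
g += 2e-4 * np.random.default_rng(0).normal(size=tau.size)

popt, _ = curve_fit(g_2d, tau, g, p0=[1.0, 1e-4])
n_fit, tau_d_fit = popt
# Given a known focal-spot radius w0, the diffusion coefficient follows as D = w0**2 / (4 * tau_d_fit)
print(n_fit, tau_d_fit)
```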
Feng, Jingwen; Lin, Jie; Zhang, Pengquan; Yang, Songnan; Sa, Yu; Feng, Yuanming
2017-08-29
High-content screening is commonly used in studies of the DNA damage response. The double-strand break (DSB) is one of the most harmful types of DNA damage lesions. The conventional method used to quantify DSBs is γH2AX foci counting, which requires manual adjustment and preset parameters and is usually regarded as imprecise, time-consuming, poorly reproducible, and inaccurate. Therefore, a robust automatic alternative method is highly desired. In this manuscript, we present a new method for quantifying DSBs which involves automatic image cropping, automatic foci-segmentation and fluorescent intensity measurement. Furthermore, an additional function was added for standardizing the measurement of DSB response inhibition based on co-localization analysis. We tested the method with a well-known inhibitor of DSB response. The new method requires only one preset parameter, which effectively minimizes operator-dependent variations. Compared with conventional methods, the new method detected a higher percentage difference of foci formation between different cells, which can improve measurement accuracy. The effects of the inhibitor on DSB response were successfully quantified with the new method (p = 0.000). The advantages of this method in terms of reliability, automation and simplicity show its potential in quantitative fluorescence imaging studies and high-content screening for compounds and factors involved in DSB response.
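A minimal scikit-image sketch of automatic foci segmentation and intensity measurement of the kind described (Otsu threshold, small-object removal, connected-component counting); the single tunable size parameter here is a hypothetical stand-in for the paper's one preset parameter.

```python
import numpy as np
from skimage import filters, measure, morphology

def count_foci(gray_image, min_area=4):
    """Count foci and sum their fluorescence in a single-nucleus image:
    Otsu threshold, remove small objects, label connected components."""
    mask = gray_image > filters.threshold_otsu(gray_image)
    mask = morphology.remove_small_objects(mask, min_size=min_area)
    labels = measure.label(mask)
    return int(labels.max()), float(gray_image[mask].sum())

# Example on a synthetic image with two bright foci over a dim background
img = np.zeros((64, 64))
img[10:14, 10:14] = 1.0
img[40:45, 30:35] = 0.8
img += 0.05 * np.random.default_rng(0).random(img.shape)
print(count_foci(img))   # expected: (2, <total focus intensity>)
```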
Li, Maoyin; Butka, Emily; Wang, Xuemin
2014-10-10
Soybean seeds are an important source of vegetable oil and biomaterials. The content of individual triacylglycerol species (TAG) in soybean seeds is difficult to quantify in an accurate and rapid way. The present study establishes an approach to quantify TAG species in soybean seeds utilizing electrospray ionization tandem mass spectrometry with multiple neutral loss scans. Ten neutral loss scans were performed to detect the fatty acyl chains of TAG, including palmitic (P, 16:0), linolenic (Ln, 18:3), linoleic (L, 18:2), oleic (O, 18:1), stearic (S, 18:0), eicosadienoic (20:2), gadoleic (20:1), arachidic (20:0), erucic (22:1), and behenic (22:0). The abundances of the ten fatty acyl chains at 46 TAG masses (mass-to-charge ratio, m/z) were determined after isotopic deconvolution and correction by adjustment factors at each TAG mass. The direct sample infusion and multiple internal standards correction allowed a rapid and accurate quantification of TAG species. Ninety-three TAG species were resolved and their levels were determined. The most abundant TAG species were LLL, OLL, LLLn, PLL, OLLn, OOL, POL, and SLL. Many new species were detected and quantified. As a result, this shotgun lipidomics approach should facilitate the study of TAG metabolism and genetic breeding of soybean seeds for desirable TAG content and composition.
Chen, Ru; Pan, Sheng; Cooke, Kelly; Moyes, Kara White; Bronner, Mary P.; Goodlett, David R.; Aebersold, Ruedi; Brentnall, Teresa A.
2008-01-01
Objectives Pancreatitis is an inflammatory condition of the pancreas. However, it often shares many molecular features with pancreatic cancer. Biomarkers present in pancreatic cancer frequently occur in the setting of pancreatitis. The efforts to develop diagnostic biomarkers for pancreatic cancer have thus been complicated by the false-positive involvement of pancreatitis. Methods In an attempt to develop protein biomarkers for pancreatic cancer, we previously used quantitative proteomics to identify and quantify the proteins from pancreatic cancer juice. Pancreatic juice is a rich source of proteins that are shed by the pancreatic ductal cells. In this study, we used a similar approach to identify and quantify proteins from pancreatitis juice. Results In total, 72 proteins were identified and quantified in the comparison of pancreatic juice from pancreatitis patients versus pooled normal control juice. Nineteen of the juice proteins were overexpressed, and 8 were underexpressed in pancreatitis juice by at least 2-fold compared with normal pancreatic juice. Of these 27 differentially expressed proteins in pancreatitis, 9 proteins were also differentially expressed in the pancreatic juice from pancreatic cancer patients. Conclusions Identification of these differentially expressed proteins from pancreatitis juice provides useful information for future study of specific pancreatitis-associated proteins and for eliminating potential false-positive biomarkers for pancreatic cancer. PMID:17198186
Chen, Ru; Pan, Sheng; Cooke, Kelly; Moyes, Kara White; Bronner, Mary P; Goodlett, David R; Aebersold, Ruedi; Brentnall, Teresa A
2007-01-01
Pancreatitis is an inflammatory condition of the pancreas. However, it often shares many molecular features with pancreatic cancer. Biomarkers present in pancreatic cancer frequently occur in the setting of pancreatitis. The efforts to develop diagnostic biomarkers for pancreatic cancer have thus been complicated by the false-positive involvement of pancreatitis. In an attempt to develop protein biomarkers for pancreatic cancer, we previously used quantitative proteomics to identify and quantify the proteins from pancreatic cancer juice. Pancreatic juice is a rich source of proteins that are shed by the pancreatic ductal cells. In this study, we used a similar approach to identify and quantify proteins from pancreatitis juice. In total, 72 proteins were identified and quantified in the comparison of pancreatic juice from pancreatitis patients versus pooled normal control juice. Nineteen of the juice proteins were overexpressed, and 8 were underexpressed in pancreatitis juice by at least 2-fold compared with normal pancreatic juice. Of these 27 differentially expressed proteins in pancreatitis, 9 proteins were also differentially expressed in the pancreatic juice from pancreatic cancer patients. Identification of these differentially expressed proteins from pancreatitis juice provides useful information for future study of specific pancreatitis-associated proteins and for eliminating potential false-positive biomarkers for pancreatic cancer.
Special Education in the City: How Has the Money Been Spent and What Do We Have To Show for It?
ERIC Educational Resources Information Center
Parrish, Thomas B.; Bitter, Catherine Sousa
2003-01-01
This article discusses how the concept of efficiency in special education services is translated into specific practices. A paradigm is presented that quantifies a measure of student need, ties school allocations to student needs, tracks actual expenditures on special education, and links those expenditures to quantifiable measures of student…
Regional distribution and dynamics of coarse woody debris in Midwestern old-growth forests
Martin A. Spetich; Stephen R. Shifley; George R. Parker
1999-01-01
Old-growth forests have been noted for containing significant quantities of deadwood. However, there has been no coordinated effort to quantify the deadwood component of old-growth remnants across large regions of temperate deciduous forest. We present results of a regional inventory that quantifies and examines regional and temporal trends for deadwood in upland old-...
Quantifying uncertainty in climate change science through empirical information theory.
Majda, Andrew J; Gershgorin, Boris
2010-08-24
Quantifying the uncertainty for the present climate and the predictions of climate change in the suite of imperfect Atmosphere Ocean Science (AOS) computer models is a central issue in climate change science. Here, a systematic approach to these issues with firm mathematical underpinning is developed through empirical information theory. An information metric to quantify AOS model errors in the climate is proposed here which incorporates both coarse-grained mean model errors as well as covariance ratios in a transformation invariant fashion. The subtle behavior of model errors with this information metric is quantified in an instructive statistically exactly solvable test model with direct relevance to climate change science including the prototype behavior of tracer gases such as CO2. Formulas for identifying the most sensitive climate change directions using statistics of the present climate or an AOS model approximation are developed here; these formulas just involve finding the eigenvector associated with the largest eigenvalue of a quadratic form computed through suitable unperturbed climate statistics. These climate change concepts are illustrated on a statistically exactly solvable one-dimensional stochastic model with relevance for low frequency variability of the atmosphere. Viable algorithms for implementation of these concepts are discussed throughout the paper.
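The "most sensitive direction" computation described above reduces to a standard symmetric eigenproblem; a numpy sketch with a placeholder quadratic form standing in for the unperturbed climate statistics:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(6, 6))
Q = A @ A.T                           # placeholder symmetric quadratic form (not real climate data)

w, v = np.linalg.eigh(Q)              # eigh returns ascending eigenvalues for symmetric Q
most_sensitive_direction = v[:, -1]   # eigenvector of the largest eigenvalue
print(w[-1], most_sensitive_direction)
```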
Garland, Ellen C; Rendell, Luke; Lilley, Matthew S; Poole, M Michael; Allen, Jenny; Noad, Michael J
2017-07-01
Identifying and quantifying variation in vocalizations is fundamental to advancing our understanding of processes such as speciation, sexual selection, and cultural evolution. The song of the humpback whale (Megaptera novaeangliae) presents an extreme example of complexity and cultural evolution. It is a long, hierarchically structured vocal display that undergoes constant evolutionary change. Obtaining robust metrics to quantify song variation at multiple scales (from a sound through to population variation across the seascape) is a substantial challenge. Here, the authors present a method to quantify song similarity at multiple levels within the hierarchy. To incorporate the complexity of these multiple levels, the calculation of similarity is weighted by measurements of sound units (lower levels within the display) to bridge the gap in information between upper and lower levels. Results demonstrate that the inclusion of weighting provides a more realistic and robust representation of song similarity at multiple levels within the display. This method permits robust quantification of cultural patterns and processes that will also contribute to the conservation management of endangered humpback whale populations, and is applicable to any hierarchically structured signal sequence.
Optical studies of oxidative stress in pulmonary artery endothelial cells
NASA Astrophysics Data System (ADS)
Ghanian, Zahra; Sepehr, Reyhaneh; Eis, Annie; Kondouri, Ganesh; Ranji, Mahsa
2015-03-01
Reactive oxygen species (ROS) play an essential role in facilitating signal transduction processes within the cell and in modulating cellular injury. However, the generation of ROS is tightly controlled both spatially and temporally within the cell, making the study of ROS dynamics particularly difficult. This study presents a novel protocol to quantify the dynamics of mitochondrial superoxide as a precursor of reactive oxygen species. To regulate the mitochondrial superoxide level, metabolic perturbation was induced by administration of potassium cyanide (KCN). The presented method was able to monitor and measure the superoxide production rate over time. Our results demonstrated that the metabolic inhibitor KCN induced a significant increase in the rate of superoxide production in the mitochondria of fetal pulmonary artery endothelial cells (FPAEC). The presented method sets the stage for studying different ROS-mediated injuries in vitro.
NMR-Metabolic Methodology in the Study of GM Foods
Sobolev, Anatoly P.; Capitani, Donatella; Giannino, Donato; Nicolodi, Chiara; Testone, Giulio; Santoro, Flavio; Frugis, Giovanna; Iannelli, Maria A.; Mattoo, Autar K.; Brosio, Elvino; Gianferri, Raffaella; D’Amico, Irene; Mannina, Luisa
2010-01-01
The 1H-NMR methodology used in the study of genetically modified (GM) foods is discussed. Transgenic lettuce (Lactuca sativa cv "Luxor") over-expressing the Arabidopsis KNAT1 gene is presented as a case study. Twenty-two water-soluble metabolites (amino acids, organic acids, sugars) present in leaves of conventional and GM lettuce were monitored by NMR and quantified at two developmental stages. The NMR spectra did not reveal any difference in metabolite composition between the GM lettuce and the wild type counterpart. Statistical analyses of metabolite variables highlighted metabolism variation as a function of leaf development as well as the transgene. A main effect of the transgene was in altering sugar metabolism. PMID:22253988
Stabilization of benthic algal biomass in a temperate stream draining agroecosystems.
Ford, William I; Fox, James F
2017-01-01
Results of the present study quantified carbon sequestration due to algal stabilization in low-order streams, which has not been considered previously in stream carbon ecosystem studies. The authors used empirical mode decomposition of an 8-year carbon elemental and isotope dataset to quantify carbon accrual and fingerprint carbon derived from algal stabilization. The authors then applied a calibrated, process-based stream carbon model (ISOFLOC) that elicits further evidence of algal stabilization. Data and modeling results suggested that processes of shielding and burial during an extreme hydrologic event enhance algal stabilization. Given that previous studies assumed stream algae are turned over or sloughed downstream, the authors performed scenario simulations of the calibrated model in order to assess how changing environmental conditions might impact algal stabilization within the stream. Results from modeling scenarios showed an increase in algal stabilization as mean annual water temperature increases, ranging from 0 to 0.04 tC km⁻² °C⁻¹ for the study watershed. The dependence of algal stabilization on temperature highlighted the importance of accounting for the benthic fate of carbon in streams under projected warming scenarios. This finding contradicts the evolving paradigm that net efflux of CO2 from streams increases with increasing temperatures. Results also quantified sloughed algae that is transported and potentially stabilized downstream and showed that benthos-derived sloughed algae was on the same order of magnitude as, and at times greater than, phytoplankton within downstream water bodies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Investigating the Conceptual Variation of Major Physics Textbooks
NASA Astrophysics Data System (ADS)
Stewart, John; Campbell, Richard; Clanton, Jessica
2008-04-01
The conceptual problem content of the electricity and magnetism chapters of seven major physics textbooks was investigated. The textbooks presented a total of 1600 conceptual electricity and magnetism problems. The solution to each problem was decomposed into its fundamental reasoning steps. These fundamental steps were then used to quantify the distribution of conceptual content among the set of topics common to the texts. The variation of the distribution of conceptual coverage within each text is studied, as is the variation between the major groupings of the textbooks (conceptual, algebra-based, and calculus-based). A measure of the conceptual complexity of the problems in each text is presented.
Waneesorn, Jarurin; Panyasai, Sitthichai; Kongthai, Kanyakan; Singboottra, Panthong; Pornprasert, Sakorn
2011-01-01
Hb Constant Spring [Hb CS; α142, Term→Gln (TAA>CAA in α2)] is often missed by routine laboratory testing, since its mRNA as well as its gene product are unstable and present at a low level in peripheral blood. This study aimed to analyze the efficacy of capillary electrophoresis (CE) and high performance liquid chromatography (HPLC) for detecting and quantifying Hb CS in 19 heterozygotes and 14 homozygotes for Hb CS, as well as 10 Hb H-CS disease subjects, all identified by molecular analysis. In the CE electropherogram, Hb CS appeared in zone 2 and was observed in all samples, while an Hb CS peak was found on the HPLC chromatogram in only 26.32% of heterozygotes, 42.86% of homozygotes and 90% of Hb H-CS disease subjects. In addition, the Hb CS levels in each group of subjects quantified by CE were significantly higher than those quantified by HPLC. Based on the CE method, the lowest Hb CS level was found in the heterozygotes, whereas the highest level was found in the Hb H-CS disease patients. Therefore, the CE method was superior to the HPLC method for detecting Hb CS. Furthermore, the level of Hb CS quantified by CE proved useful in screening heterozygotes and homozygotes for Hb CS as well as Hb H-CS disease.
Vicino, Greg A; Marcacci, Emily S
2015-01-01
To the authors' knowledge there is currently no discrete index to measure the integrated intensity of a play bout in mammals, despite the potential for using the intensity and duration of play bouts as a measure of physical activity and welfare. This study was developed to test an equation that quantifies the intensity and duration of play bouts in a particularly gregarious mammal, the African elephant (Loxodonta africana), housed at the San Diego Zoo Safari Park in Escondido, CA. To quantify these behaviors, we created a scale of intensity and a subsequent equation that produces an index value, giving each unique bout a score. A compilation of these scores provides a range of intensity of play behavior that is a representative value for that particular herd at that point in time, and thus a database to which later bouts can be compared. Play behavior is arguably an indicator of positive welfare and, if quantifiable, we believe it can be used as an additional measure of positive welfare in zoo-housed animals. Here we present the methods and technique used to calculate a standardized Integrated Play Index (IPI) that has potential for use in other socially living species that are known to exhibit play behavior. © 2015 Wiley Periodicals, Inc.
Templeton, Justin P.; Struebing, Felix L.; Lemmon, Andrew; Geisert, Eldon E.
2014-01-01
The present article introduces a new and easy-to-use counting application for the Apple iPad. The application "ImagePAD" takes advantage of the advanced user interface features offered by the Apple iOS® platform, simplifying the rather tedious task of quantifying features in anatomical studies. For example, the image under analysis can be easily panned and zoomed using iOS-supported multi-touch gestures without losing the spatial context of the counting task, which is extremely important for ensuring count accuracy. This application allows one to quantify up to five different types of objects in a single field and output the data in a tab-delimited format for subsequent analysis. We describe two examples of the use of the application: quantifying axons in the optic nerve of the C57BL/6J mouse and determining the percentage of cells labeled with NeuN or ChAT in the retinal ganglion cell layer. For the optic nerve, contiguous images at 60× magnification were taken and transferred onto an Apple iPad®. Axons were counted by tapping on the touch-sensitive screen using ImagePAD. Nine optic nerves were sampled and the number of axons per nerve ranged from 38,872 to 50,196, with an average of 44,846 axons per nerve (SD = 3,980 axons). PMID:25281829
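For readers who want to post-process tab-delimited counts of the kind described above, a minimal sketch is shown below; the file name and the two-column layout (field identifier, count) are assumptions for illustration and do not reproduce the actual ImagePAD export format.

```python
# Sketch: summarize per-field object counts exported as tab-delimited text
# (assumed format: one row per image field, "field_id<TAB>count").
import csv
import statistics

def summarize_counts(path):
    counts = []
    with open(path, newline="") as fh:
        for row in csv.reader(fh, delimiter="\t"):
            counts.append(int(row[1]))          # second column: object count
    return sum(counts), statistics.mean(counts), statistics.stdev(counts)

total, mean, sd = summarize_counts("nerve_01_counts.tsv")   # hypothetical file
print(f"total = {total}, mean per field = {mean:.1f}, SD = {sd:.1f}")
```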
Zhen, Zonglei; Yang, Zetian; Huang, Lijie; Kong, Xiang-Zhen; Wang, Xu; Dang, Xiaobin; Huang, Yangyue; Song, Yiying; Liu, Jia
2015-06-01
Face-selective regions (FSRs) are among the most widely studied functional regions in the human brain. However, individual variability of the FSRs has not been well quantified. Here we use functional magnetic resonance imaging (fMRI) to localize the FSRs and quantify their spatial and functional variabilities in 202 healthy adults. The occipital face area (OFA), posterior and anterior fusiform face areas (pFFA and aFFA), posterior continuation of the superior temporal sulcus (pcSTS), and posterior and anterior STS (pSTS and aSTS) were delineated for each individual with a semi-automated procedure. A probabilistic atlas was constructed to characterize their interindividual variability, revealing that the FSRs were highly variable in location and extent across subjects. The variability of FSRs was further quantified on both functional (i.e., face selectivity) and spatial (i.e., volume, location of peak activation, and anatomical location) features. Considerable interindividual variability and rightward asymmetry were found in all FSRs on these features. Taken together, our work presents the first effort to characterize comprehensively the variability of FSRs in a large sample of healthy subjects, and invites future work on the origin of the variability and its relation to individual differences in behavioral performance. Moreover, the probabilistic functional atlas will provide an adequate spatial reference for mapping the face network. Copyright © 2015 Elsevier Inc. All rights reserved.
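The probabilistic atlas described above can be illustrated with a minimal sketch: given binary region masks already registered to a common space, the atlas value at each voxel is the fraction of subjects whose mask covers that voxel. The array shapes and toy data below are assumptions; this is not the authors' pipeline.

```python
# Sketch of a probabilistic atlas from binary ROI masks in a common space.
import numpy as np

def probabilistic_atlas(masks):
    """masks: array of shape (n_subjects, x, y, z) with values 0/1."""
    masks = np.asarray(masks, dtype=float)
    return masks.mean(axis=0)        # per-voxel proportion of subjects

# toy example: 5 "subjects", 4x4x4 volume of random binary masks
rng = np.random.default_rng(0)
toy_masks = rng.integers(0, 2, size=(5, 4, 4, 4))
atlas = probabilistic_atlas(toy_masks)
print(atlas.shape, atlas.max())      # values in [0, 1]
```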
Huo, Yinghe; Vincken, Koen L; van der Heijde, Desiree; de Hair, Maria J H; Lafeber, Floris P; Viergever, Max A
2017-11-01
Objective: Wrist joint space narrowing is a main radiographic outcome of rheumatoid arthritis (RA). Yet, automatic radiographic wrist joint space width (JSW) quantification for RA patients has not been widely investigated. The aim of this paper is to present an automatic method to quantify the JSW of three wrist joints that are least affected by bone overlapping and are frequently involved in RA. These joints are located around the scaphoid bone, viz. the multangular-navicular, capitate-navicular-lunate, and radiocarpal joints. Methods: The joint space around the scaphoid bone is detected by using consecutive searches of separate path segments, where each segment location aids in constraining the subsequent one. For joint margin delineation, first the boundary not affected by X-ray projection is extracted, followed by a backtrace process to obtain the actual joint margin. The accuracy of the quantified JSW is evaluated by comparison with the manually obtained ground truth. Results: Two of the 50 radiographs used for evaluation of the method did not yield a correct path through all three wrist joints. The delineated joint margins of the remaining 48 radiographs were used for JSW quantification. It was found that 90% of the joints had a JSW deviating less than 20% from the mean JSW of manual indications, with the mean JSW error less than 10%. Conclusion: The proposed method is able to automatically quantify the JSW of radiographic wrist joints reliably. The proposed method may aid clinical researchers to study the progression of wrist joint damage in RA studies.
Systematic changes in position sense accompany normal aging across adulthood.
Herter, Troy M; Scott, Stephen H; Dukelow, Sean P
2014-03-25
Development of clinical neurological assessments aimed at separating normal from abnormal capabilities requires a comprehensive understanding of how basic neurological functions change (or do not change) with increasing age across adulthood. In the case of proprioception, the research literature has failed to conclusively determine whether or not position sense in the upper limb deteriorates in elderly individuals. The present study was conducted a) to quantify whether upper limb position sense deteriorates with increasing age, and b) to generate a set of normative data that can be used for future comparisons with clinical populations. We examined position sense in 209 healthy males and females between the ages of 18 and 90 using a robotic arm position-matching task that is both objective and reliable. In this task, the robot moved an arm to one of nine positions and subjects attempted to mirror-match that position with the opposite limb. Measures of position sense were recorded by the robotic apparatus in hand- and joint-based coordinates, and linear regressions were used to quantify age-related changes and percentile boundaries of normal behaviour. For clinical comparisons, we also examined influences of sex (male versus female) and test-hand (dominant versus non-dominant) on all measures of position sense. Analyses of hand-based parameters identified several measures of position sense (Variability, Shift, Spatial Contraction, Absolute Error) with significant effects of age, sex, and test-hand. Joint-based parameters at the shoulder (Absolute Error) and elbow (Variability, Shift, Absolute Error) also exhibited significant effects of age and test-hand. The present study provides strong evidence that several measures of upper extremity position sense decline with age. Furthermore, these data provide a basis for quantifying when changes in position sense are related to normal aging or, alternatively, to pathology.
Using multidimensional scaling to quantify similarity in visual search and beyond
Godwin, Hayward J.; Fitzsimmons, Gemma; Robbins, Arryn; Menneer, Tamaryn; Goldinger, Stephen D.
2017-01-01
Visual search is one of the most widely studied topics in vision science, both as an independent topic of interest and as a tool for studying attention and visual cognition. A wide literature exists that seeks to understand how people find things under varying conditions of difficulty and complexity, and in situations ranging from the mundane (e.g., looking for one's keys) to those with significant societal importance (e.g., baggage or medical screening). Primary determinants of the ease and probability of success during search are the similarity relationships that exist in the search environment, such as the similarity between the background and the target, or the likeness of the non-targets to one another. A sense of similarity is often intuitive, but it is seldom quantified directly. This presents a problem in that similarity relationships are imprecisely specified, limiting the capacity of the researcher to examine their influence adequately. In this article, we present a novel approach to overcoming this problem that combines multidimensional scaling (MDS) analyses with behavioral and eye-tracking measurements. We propose a method whereby MDS can be repurposed to quantify the similarity of experimental stimuli, thereby opening up theoretical questions in visual search and attention that cannot currently be addressed. These quantifications, in conjunction with behavioral and oculomotor measures, allow for critical observations about how similarity affects performance, information selection, and information processing. We provide a demonstration and tutorial of the approach, identify documented examples of its use, discuss how complementary computer vision methods could also be adopted, and close with a discussion of potential avenues for future application of this technique. PMID:26494381
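A minimal sketch of the MDS step is given below, assuming pairwise dissimilarity ratings have already been collected for the stimuli; the 4 × 4 matrix is invented and scikit-learn's MDS with a precomputed dissimilarity matrix stands in for whatever specific MDS implementation the authors used.

```python
# Sketch: place search stimuli in a low-dimensional similarity space via MDS.
import numpy as np
from sklearn.manifold import MDS

# Invented pairwise dissimilarities (0 = identical, 1 = maximally different).
dissim = np.array([
    [0.0, 0.2, 0.8, 0.9],
    [0.2, 0.0, 0.7, 0.8],
    [0.8, 0.7, 0.0, 0.3],
    [0.9, 0.8, 0.3, 0.0],
])

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)   # 2-D coordinates, one row per stimulus
print(coords)
# Distances between these coordinates can then serve as quantified
# target-distractor similarity predictors for accuracy, RT, or eye movements.
```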
Current challenges in quantifying preferential flow through the vadose zone
NASA Astrophysics Data System (ADS)
Koestel, John; Larsbo, Mats; Jarvis, Nick
2017-04-01
In this presentation, we give an overview of current challenges in quantifying preferential flow through the vadose zone. A review of the literature suggests that current generation models do not fully reflect the present state of process understanding and empirical knowledge of preferential flow. We believe that the development of improved models will be stimulated by the increasingly widespread application of novel imaging technologies as well as future advances in computational power and numerical techniques. One of the main challenges in this respect is to bridge the large gap between the scales at which preferential flow occurs (pore to Darcy scales) and the scale of interest for management (fields, catchments, regions). Studies at the pore scale are being supported by the development of 3-D non-invasive imaging and numerical simulation techniques. These studies are leading to a better understanding of how macropore network topology and initial/boundary conditions control key state variables like matric potential and thus the strength of preferential flow. Extrapolation of this knowledge to larger scales would require support from theoretical frameworks such as key concepts from percolation and network theory, since we lack measurement technologies to quantify macropore networks at these large scales. Linked hydro-geophysical measurement techniques that produce highly spatially and temporally resolved data enable investigation of the larger-scale heterogeneities that can generate preferential flow patterns at pedon, hillslope and field scales. At larger regional and global scales, improved methods of data-mining and analyses of large datasets (machine learning) may help in parameterizing models as well as lead to new insights into the relationships between soil susceptibility to preferential flow and site attributes (climate, land uses, soil types).
Serum levels of perfluoroalkyl compounds in human maternal and umbilical cord blood samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Monroy, Rocio; Morrison, Katherine; Teo, Koon
2008-09-15
Perfluoroalkyl compounds (PFCs) are end-stage metabolic products of industrial fluorochemicals used in the manufacture of plastics, textiles, and electronics that are widely distributed in the environment. The objective of the present study was to quantify exposure to perfluorooctane sulfonate (PFOS), perfluorooctanoate (PFOA), perfluorodecanoic acid (PFDeA), perfluorohexane sulfonate (PFHxS), perfluoroheptanoic acid (PFHpA), and perfluorononanoic acid (PFNA) in serum samples collected from pregnant women and the umbilical cord at delivery. Pregnant women (n=101) presenting for second trimester ultrasound were recruited, and PFC residue levels were quantified in maternal serum at 24-28 weeks of pregnancy, at delivery, and in umbilical cord blood (UCB; n=105) by liquid chromatography-mass spectrometry. Paired t-tests and multiple regression analysis were performed to determine the relationship between the concentrations of each analyte at the different sample collection time points. PFOA and PFOS were detectable in all serum samples analyzed, including the UCB. PFOS serum levels (mean±S.D.) were significantly higher (p<0.001) in second trimester maternal serum (18.1±10.9 ng/mL) than maternal serum levels at delivery (16.2±10.4 ng/mL), which in turn were higher than the levels found in UCB (7.3±5.8 ng/mL; p<0.001). PFHxS was quantifiable in 46/101 (45.5%) maternal and 21/105 (20%) UCB samples, with mean concentrations of 4.05±12.3 and 5.05±12.9 ng/mL, respectively. There was no association between serum PFCs at any time point studied and birth weight. Taken together, our data demonstrate that although there is widespread exposure to PFCs during development, these exposures do not affect birth weight.
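The paired comparison reported above (e.g., maternal serum at delivery versus umbilical cord blood) can be reproduced in outline with a paired t-test; the sketch below uses invented placeholder values, not the study data.

```python
# Sketch of a paired t-test on matched maternal and cord-blood PFOS levels.
import numpy as np
from scipy import stats

maternal_delivery = np.array([16.0, 12.5, 20.3, 9.8, 18.7])   # ng/mL, invented
cord_blood        = np.array([ 7.1,  5.9,  9.4, 4.2,  8.8])   # ng/mL, invented

t, p = stats.ttest_rel(maternal_delivery, cord_blood)
print(f"paired t = {t:.2f}, p = {p:.4f}")
```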
Li, Xiaoqiong; Jensen, Bent Borg; Højberg, Ole; Noel, Samantha Joan; Canibe, Nuria
2018-06-16
Olsenella scatoligenes is the only skatole-producing bacterium isolated from the pig gut. Skatole, produced by microbial degradation of L-tryptophan, is the main contributor to boar taint, an off-odor and off-flavor released upon heating meat from some entire male pigs. An appropriate method for quantifying O. scatoligenes would help to investigate the relationship between O. scatoligenes abundance and skatole concentration in the pig gut. Thus, the present study aimed at developing a TaqMan-MGB probe-based, species-specific qPCR assay for rapid quantification of O. scatoligenes. The use of an MGB probe allowed discriminating O. scatoligenes from other closely related species. Moreover, the assay allowed quantifying down to three target gene copies per PCR reaction using genomic DNA-constructed standards, or 1.5 × 10³ cells/g digesta using O. scatoligenes-spiked digesta samples as reference standards. The developed assay was applied to assess the impact of dietary chicory roots on O. scatoligenes in the hindgut of pigs. Olsenella scatoligenes made up <0.01% of the microbial population in the pig hindgut. Interestingly, the highest number of O. scatoligenes was found in young entire male pigs fed high levels of chicory roots. This indicates that the known effect of chicory roots in reducing skatole production is not due to inhibition of the growth of this skatole-producing bacterium in the pig hindgut. Accordingly, the abundance of O. scatoligenes in the hindgut does not seem to be an appropriate indicator of boar taint. The present study is the first to describe a TaqMan-MGB probe qPCR assay for detection and quantification of O. scatoligenes in pigs.
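Absolute quantification from a qPCR standard curve of the kind described above can be sketched as follows; the Cq values, dilution series, and efficiency calculation are generic illustrations, not the published assay parameters.

```python
# Sketch: fit Cq against log10(copy number) for a dilution series of standards,
# then invert the fit to estimate copies per reaction for an unknown sample.
import numpy as np

std_copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5])       # copies per reaction
std_cq     = np.array([33.1, 29.8, 26.4, 23.0, 19.7])   # illustrative Cq values

slope, intercept = np.polyfit(np.log10(std_copies), std_cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1                   # amplification efficiency

def copies_from_cq(cq):
    return 10 ** ((cq - intercept) / slope)

print(f"efficiency ~ {efficiency:.1%}")
print(f"Cq 27.5 ~ {copies_from_cq(27.5):.0f} copies/reaction")
```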
Experimental study of the constituents of space wash water
NASA Technical Reports Server (NTRS)
Putnam, D. F.; Colombo, G. V.
1975-01-01
This report presents experimental data, obtained under controlled conditions, which quantify the various constituents of human origin that may be expected in space wash water. The experiments were conducted with a simulated crew of two male and two female subjects. The data show that the expected wash water contaminants originating from human secretions are substantially lower than theoretical projections indicated. The data presented are immediately useful and may have considerable impact on the tradeoff comparisons among various unit processes and systems under consideration by NASA for recycling space wash water.
Sidorovskaia, Natalia A; Ackleh, Azmy S; Tiemann, Christopher O; Ma, Baoling; Ioup, Juliette W; Ioup, George E
2016-01-01
The Gulf of Mexico is a region densely populated by marine mammals that must adapt to living in a highly active industrial environment. This paper presents a new approach to quantifying the anthropogenic impact on the marine mammal population. The results for sperm and beaked whales of a case study of regional population dynamics trends after the Deepwater Horizon oil spill, derived from passive acoustic-monitoring data gathered before and after the spill in the vicinity of the accident, are presented.
Dibble, Edward; Zivanovic, Aleksandar; Davies, Brian
2004-01-01
This paper presents the results of several early studies relating to human haptic perception sensitivity when probing a virtual object. A 1 degree-of-freedom (DoF) rotary haptic system, designed and built for this purpose, is also presented. The experiments were designed to assess the maximum forces applied in a minimally invasive surgery (MIS) procedure, to quantify the compliance sensitivity threshold when probing virtual tissue, and to identify the haptic system loop rate necessary for haptic feedback to feel realistic.
Recent advances in catchment hydrology
NASA Astrophysics Data System (ADS)
van Meerveld, I. H. J.
2017-12-01
Despite the consensus that field observations and catchment studies are imperative to understand hydrological processes, to determine the impacts of global change, to quantify the spatial and temporal variability in hydrological fluxes, and to refine and test hydrological models, there is a decline in the number of field studies. This decline and the importance of fieldwork for catchment hydrology have been described in several recent opinion papers. This presentation will summarize these commentaries, describe how catchment studies have evolved over time, and highlight the findings from selected recent studies published in Water Resources Research.
Quantifying risk of transfusion in children undergoing spine surgery.
Vitale, Michael G; Levy, Douglas E; Park, Maxwell C; Choi, Hyunok; Choe, Julie C; Roye, David P
2002-01-01
The risks and costs of transfusion are a great concern in the area of pediatric spine surgery, because it is a blood-intensive procedure with a high risk for transfusion. Therefore, determining the predictors of transfusion in this patient population is an important first step and has the potential to improve upon the current approaches to reducing transfusion rates. In this study, we reveal several predictors of transfusion in a pediatric patient population undergoing spine surgery. In turn, we present a general rule of thumb ("rule of two's") for gauging transfusion risk, thus enhancing the surgeon's approach to avoiding transfusion in certain clinical scenarios. This study was conducted to determine the main factors of transfusion in a population of pediatric patients undergoing scoliosis surgery. The goal was to present an algorithm for quantifying the true risk of transfusion for various patient groups that would highlight patients "at high risk" for transfusion. This is especially important in light of the various risks associated with undergoing a transfusion, as well as the costs involved in maintaining and disposing of exogenous blood materials. This is a retrospective review of a group of children who underwent scoliosis surgery between 1988 and 1995 at an academic institution. A total of 290 patients were analyzed in this study, of which 63 were transfused and 227 were not. No outcomes measures were used in this study. A retrospective review of 290 patients presenting to our institution for scoliosis surgery was conducted, with a focus on socioclinical data related to transfusion risk. Univariate analysis and logistic regression were used to quantify the determinants of transfusion risk. Univariate analysis identified many factors that were associated with the risk of transfusion. However, it is clear that several of these factors are dependent on each other, obscuring the true issues driving transfusion need. We used multivariate analysis to control for the various univariate predictors of transfusion. Our logistic regression model suggested that the type of scoliosis (odds ratio [OR], 2.02; 95% confidence interval [CI], 1.07 to 3.82), degree of curvature (OR, 1.012/degree curve; 95% CI, 1.01 to 1.03), and use of erythropoietin (OR, 0.29; 95% CI, 0.14 to 0.62) were the main determinants of transfusion risk for our population. The main risk factors of transfusion were used to formulate a simple algorithm, which can be used to quantify transfusion risk and to guide efforts to avoid transfusion in children undergoing spinal surgery. Given a 10% baseline risk for transfusion, our "rule of two's" indicates that each risk factor approximately doubles the chance of transfusion, whereas the administration of recombinant human erythropoietin roughly halves the risk of transfusion.
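The "rule of two's" lends itself to a small worked example. The sketch below treats the doubling and halving as acting on the odds of transfusion (an assumption made here so that estimated probabilities stay below 1), starting from the stated 10% baseline risk.

```python
# Sketch of the "rule of two's": each risk factor roughly doubles the odds of
# transfusion, erythropoietin roughly halves them; baseline risk ~10%.
def transfusion_risk(baseline=0.10, n_risk_factors=0, epo=False):
    odds = baseline / (1 - baseline)
    odds *= 2 ** n_risk_factors
    if epo:
        odds *= 0.5
    return odds / (1 + odds)

for k in range(4):
    print(f"{k} risk factors: {transfusion_risk(n_risk_factors=k):.0%}")
print(f"2 risk factors + EPO: {transfusion_risk(n_risk_factors=2, epo=True):.0%}")
```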
NASA Astrophysics Data System (ADS)
Broučková, Zuzana; Trávníček, Zdeněk; Šafařík, Pavel
2014-03-01
This study introduces two physical effects known from beverages: the effect of sinking bubbles and the hot chocolate sound effect. The paper presents two simple "kitchen" experiments. The first and second effects are indicated by means of a flow visualization and microphone measurement, respectively. To quantify the second (acoustic) effect, sound records are analyzed using time-frequency signal processing, and the obtained power spectra and spectrograms are discussed.
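The time-frequency analysis mentioned above can be sketched with a standard spectrogram; the file name, mono conversion, and window length below are assumptions, and the expected output (a tapping tone whose dominant frequency rises as bubbles dissolve) follows the described effect rather than the authors' exact processing.

```python
# Sketch: spectrogram of a microphone recording of spoon taps on a mug,
# tracking the dominant frequency per time bin.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

fs, audio = wavfile.read("hot_chocolate_taps.wav")   # hypothetical recording
if audio.ndim > 1:
    audio = audio.mean(axis=1)                       # collapse stereo to mono

f, t, Sxx = spectrogram(audio, fs=fs, nperseg=4096)
dominant = f[np.argmax(Sxx, axis=0)]                 # dominant frequency per bin
print(dominant[:10])                                 # expected to drift upward
```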
Characterization of flame radiosity in shrubland fires
Miguel G. Cruz; Bret W. Butler; Domingos X. Viegas; Pedro Palheiro
2011-01-01
The present study is aimed at quantifying the flame radiosity vertical profile and gas temperature in moderate to high intensity spreading fires in shrubland fuels. We report on the results from 11 experimental fires conducted over a range of fire rate of spread and frontal fire intensity varying respectively between 0.04-0.35ms-1 and 468-14,973kWm-1. Flame radiosity,...
Alba Argerich; Roy Haggerty; Eugènia Martí; Francesc Sabater; Jay Zarnetske
2011-01-01
Water transient storage zones are hotspots for metabolic activity in streams although the contribution of different types of transient storage zones to the whole�]reach metabolic activity is difficult to quantify. In this study we present a method to measure the fraction of the transient storage that is metabolically active (MATS) in two consecutive reaches...
Towards simulating and quantifying the light-cone EoR 21-cm signal
NASA Astrophysics Data System (ADS)
Mondal, Rajesh; Bharadwaj, Somnath; Datta, Kanan K.
2018-02-01
The light-cone (LC) effect causes the Epoch of Reionization (EoR) 21-cm signal T_b (\\hat{n}, ν ) to evolve significantly along the line-of-sight (LoS) direction ν. In the first part of this paper, we present a method to properly incorporate the LC effect in simulations of the EoR 21-cm signal that includes peculiar velocities. Subsequently, we discuss how to quantify the second-order statistics of the EoR 21-cm signal in the presence of the LC effect. We demonstrate that the 3D power spectrum P(k) fails to quantify the entire information because it assumes the signal to be ergodic and periodic, whereas the LC effect breaks these conditions along the LoS. Considering a LC simulation centred at redshift 8 where the mean neutral fraction drops from 0.65 to 0.35 across the box, we find that P(k) misses out ˜ 40 per cent of the information at the two ends of the 17.41 MHz simulation bandwidth. The multifrequency angular power spectrum (MAPS) C_{ℓ}(ν_1,ν_2) quantifies the statistical properties of T_b (\\hat{n}, ν ) without assuming the signal to be ergodic and periodic along the LoS. We expect this to quantify the entire statistical information of the EoR 21-cm signal. We apply MAPS to our LC simulation and present preliminary results for the EoR 21-cm signal.
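For reference, a commonly used definition of the multifrequency angular power spectrum is sketched below in LaTeX; the notation (spherical-harmonic coefficients a_{ℓm}(ν) of the brightness-temperature fluctuation) is generic and may differ in detail from the conventions adopted in the paper.

```latex
\delta T_b(\hat{n}, \nu) = \sum_{\ell, m} a_{\ell m}(\nu)\, Y_{\ell m}(\hat{n}),
\qquad
C_{\ell}(\nu_1, \nu_2) = \big\langle a_{\ell m}(\nu_1)\, a^{*}_{\ell m}(\nu_2) \big\rangle .
```

Under the assumption of ergodicity and periodicity along the line of sight, C_ℓ(ν_1, ν_2) would depend only on Δν = ν_2 - ν_1 and could be Fourier-transformed along Δν to recover P(k); the LC effect breaks exactly that assumption, which is why MAPS retains information that P(k) misses.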
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liangzhe Zhang; Anthony D. Rollett; Timothy Bartel
2012-02-01
A calibrated Monte Carlo (cMC) approach, which quantifies grain boundary kinetics within a generic setting, is presented. The influence of misorientation is captured by adding a scaling coefficient in the spin flipping probability equation, while the contribution of different driving forces is weighted using a partition function. The calibration process relies on the established parametric links between Monte Carlo (MC) and sharp-interface models. The cMC algorithm quantifies microstructural evolution under complex thermomechanical environments and remedies some of the difficulties associated with conventional MC models. After validation, the cMC approach is applied to quantify the texture development of polycrystalline materials with influences of misorientation and inhomogeneous bulk energy across grain boundaries. The results are in good agreement with theory and experiments.
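A generic Potts-model-style acceptance rule with a misorientation-dependent scaling factor, in the spirit of the calibration described above, is sketched below; `mobility_scale`, `delta_e`, and `kT` are placeholder names and the rule is illustrative rather than the authors' calibrated equation.

```python
# Sketch: Metropolis-type spin (grain-orientation) flip with a scaling
# coefficient that modulates the acceptance probability.
import math
import random

def accept_flip(delta_e, kT=1.0, mobility_scale=1.0):
    """Return True if a proposed grain-orientation flip is accepted."""
    if delta_e <= 0.0:
        p = mobility_scale                       # scaled acceptance for downhill moves
    else:
        p = mobility_scale * math.exp(-delta_e / kT)
    return random.random() < p

print(accept_flip(delta_e=0.3, kT=0.5, mobility_scale=0.7))
```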
Sodium and T1rho MRI for molecular and diagnostic imaging of articular cartilage.
Borthakur, Arijitt; Mellon, Eric; Niyogi, Sampreet; Witschey, Walter; Kneeland, J Bruce; Reddy, Ravinder
2006-11-01
In this article, both sodium magnetic resonance (MR) and T1rho relaxation mapping aimed at measuring molecular changes in cartilage for the diagnostic imaging of osteoarthritis are reviewed. First, an introduction to the structure of cartilage and its degeneration in osteoarthritis (OA) is given, together with an outline of diagnostic imaging methods for quantifying molecular changes and early diagnostic aspects of cartilage degeneration. The sodium MRI section begins with a brief overview of the theory of sodium NMR of biological tissues and is followed by a section on multiple quantum filters that can be used to quantify both bi-exponential relaxation and residual quadrupolar interaction. Specifically, (i) the rationale behind the use of sodium MRI in quantifying proteoglycan (PG) changes, (ii) validation studies using biochemical assays, (iii) studies on human OA specimens, (iv) results on animal models and (v) clinical imaging protocols are reviewed. Results demonstrating the feasibility of quantifying PG in OA patients and comparison with that in healthy subjects are also presented. The section concludes with a discussion of the advantages and potential issues with sodium MRI and the impact of new technological advancements (e.g. ultra-high field scanners and parallel imaging methods). In the theory section on T1rho, (i) principles of measuring T1rho relaxation, (ii) pulse sequences for computing T1rho relaxation maps, (iii) issues regarding radio frequency power deposition, (iv) mechanisms that contribute to T1rho in biological tissues and (v) effects of exchange and dipolar interaction on T1rho dispersion are briefly discussed. Correlation of the T1rho relaxation rate with macromolecular content and biomechanical properties in cartilage specimens subjected to trypsin- and cytokine-induced glycosaminoglycan depletion, and validation against biochemical assay and histopathology, are presented. Experimental T1rho data from osteoarthritic specimens, animal models, healthy human subjects as well as osteoarthritic patients are provided. The current status of T1rho relaxation mapping of cartilage and future directions are also discussed. Copyright 2006 John Wiley & Sons, Ltd.
Electronic Cigarette Topography in the Natural Environment
Robinson, R. J.; Hensel, E. C.; Morabito, P. N.; Roundtree, K. A.
2015-01-01
This paper presents the results of a clinical, observational, descriptive study to quantify the use patterns of electronic cigarette users in their natural environment. Previously published work regarding puff topography has been widely indirect in nature, and qualitative rather than quantitative, with the exception of three studies conducted in a laboratory environment for limited amounts of time. The current study quantifies the variation in puffing behaviors among users as well as the variation for a given user throughout the course of a day. Puff topography characteristics computed for each puffing session by each subject include the number of subject puffs per puffing session, the mean puff duration per session, the mean puff flow rate per session, the mean puff volume per session, and the cumulative puff volume per session. The same puff topography characteristics are computed across all puffing sessions by each single subject and across all subjects in the study cohort. Results indicate significant inter-subject variability with regard to puffing topography, suggesting that a range of representative puffing topography patterns should be used to drive machine-puffed electronic cigarette aerosol evaluation systems. PMID:26053075
NASA Astrophysics Data System (ADS)
Junior, Benedito Roberto Alvarenga; Soares, Frederico Luis Felipe; Ardila, Jorge Armando; Durango, Luis Guillermo Cuadrado; Forim, Moacir Rossi; Carneiro, Renato Lajarim
2018-01-01
The aim of this work was to quantify B-complex vitamins in pharmaceutical samples by the surface-enhanced Raman spectroscopy (SERS) technique using a gold colloid substrate. Synthesis of gold nanoparticles was performed according to an adapted Turkevich method. Initial assays suggested the orientation of the molecules on the gold nanoparticle surface. A central composite design was performed to obtain the highest SERS signal for nicotinamide and riboflavin. The parameters evaluated in the experimental design were the volume of AuNPs, the concentration of vitamins, and the sodium chloride concentration. The best condition for nicotinamide was NaCl 2.3 × 10⁻³ mol L⁻¹ and 700 μL of AuNPs colloid, and this same condition proved adequate to quantify thiamine. The experimental design for riboflavin showed the best condition at NaCl 1.15 × 10⁻² mol L⁻¹ and 2.8 mL of AuNPs colloid. It was possible to quantify thiamine and nicotinamide in the presence of other vitamins and excipients in two solid multivitamin formulations using the standard addition procedure. The standard addition curves presented R² higher than 0.96 for both nicotinamide and thiamine, at orders of magnitude of 10⁻⁷ and 10⁻⁸ mol L⁻¹, respectively. The nicotinamide content in a cosmetic gel sample was also quantified by direct analysis, presenting R² of 0.98. A Student's t-test showed no significant difference with respect to the HPLC method. Despite the experimental design performed for riboflavin, it was not possible to quantify it in the commercial samples.
A field method for soil erosion measurements in agricultural and natural lands
Y.P. Hsieh; K.T. Grant; G.C. Bugna
2009-01-01
Soil erosion is one of the most important watershed processes in nature, yet quantifying it under field conditions remains a challenge. The lack of soil erosion field data is a major factor hindering our ability to predict soil erosion in a watershed. We present here the development of a simple and sensitive field method that quantifies soil erosion and the resulting...
Jens T. Stevens; Hugh D. Safford; Malcolm P. North; Jeremy S. Fried; Andrew N. Gray; Peter M. Brown; Christopher R. Dolanc; Solomon Z. Dobrowski; Donald A. Falk; Calvin A. Farris; Jerry F. Franklin; Peter Z. Fulé; R. Keala Hagmann; Eric E. Knapp; Jay D. Miller; Douglas F. Smith; Thomas W. Swetnam; Alan H. Taylor; Julia A. Jones
2016-01-01
Quantifying historical fire regimes provides important information for managing contemporary forests. Historical fire frequency and severity can be estimated using several methods; each method has strengths and weaknesses and presents challenges for interpretation and verification. Recent efforts to quantify the timing of historical high-severity fire events in forests...
Zhao, Chang; Sander, Heather A.
2015-01-01
Studies that assess the distribution of benefits provided by ecosystem services across urban areas are increasingly common. Nevertheless, current knowledge of both the supply and demand sides of ecosystem services remains limited, leaving a gap in our understanding of the balance between ecosystem service supply and demand that restricts our ability to assess and manage these services. The present study seeks to fill this gap by developing and applying an integrated approach to quantifying the supply and demand of a key ecosystem service, carbon storage and sequestration, at the local level. This approach follows three basic steps: (1) quantifying and mapping service supply based upon Light Detection and Ranging (LiDAR) processing and allometric models, (2) quantifying and mapping demand for carbon sequestration using an indicator based on local anthropogenic CO2 emissions, and (3) mapping a supply-to-demand ratio. We illustrate this approach using a portion of the Twin Cities Metropolitan Area of Minnesota, USA. Our results indicate that 1735.69 million kg of carbon are stored by urban trees in our study area. Annually, 33.43 million kg of carbon are sequestered by trees, whereas 3087.60 million kg of carbon are emitted by human sources. Thus, the carbon sequestration service provided by urban trees in the study location plays a minor role in combating climate change, offsetting approximately 1% of local anthropogenic carbon emissions per year, although avoided emissions via storage in trees are substantial. Our supply-to-demand ratio map provides insight into the balance between carbon sequestration supply in urban trees and demand for such sequestration at the local level, pinpointing critical locations where higher levels of supply and demand exist. Such a ratio map could help planners and policy makers to assess and manage the supply of and demand for carbon sequestration. PMID:26317530
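Step (3) above reduces to a per-cell division of two co-registered rasters; a minimal sketch with toy numbers is given below (real inputs would be the LiDAR/allometry supply map and the emissions-based demand map).

```python
# Sketch: per-cell supply-to-demand ratio from two co-registered rasters
# (values in kg C per cell per year; the arrays below are invented).
import numpy as np

supply = np.array([[120.0,  40.0], [   0.0,  75.0]])   # sequestration
demand = np.array([[9000.0, 500.0], [2000.0,  60.0]])   # anthropogenic emissions

ratio = np.divide(supply, demand, out=np.full_like(supply, np.nan),
                  where=demand > 0)
print(ratio)   # values near or above 1 flag cells where supply meets demand
```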
Cerebral Microcirculation during Experimental Normovolaemic Anemia
Bellapart, Judith; Cuthbertson, Kylie; Dunster, Kimble; Diab, Sara; Platts, David G.; Raffel, O. Christopher; Gabrielian, Levon; Barnett, Adrian; Paratz, Jenifer; Boots, Rob; Fraser, John F.
2016-01-01
Anemia is accepted among critically ill patients as an alternative to elective blood transfusion. This practice has been extrapolated to head injury patients with only one study comparing the effects of mild anemia on neurological outcome. There are no studies quantifying microcirculation during anemia. Experimental studies suggest that anemia leads to cerebral hypoxia and increased rates of infarction, but the lack of clinical equipoise, when testing the cerebral effects of transfusion among critically injured patients, supports the need of experimental studies. The aim of this study was to quantify cerebral microcirculation and the potential presence of axonal damage in an experimental model exposed to normovolaemic anemia, with the intention of describing possible limitations within management practices in critically ill patients. Under non-recovered anesthesia, six Merino sheep were instrumented using an intracardiac transeptal catheter to inject coded microspheres into the left atrium to ensure systemic and non-chaotic distribution. Cytometric analyses quantified cerebral microcirculation at specific regions of the brain. Amyloid precursor protein staining was used as an indicator of axonal damage. Animals were exposed to normovolaemic anemia by blood extractions from the indwelling arterial catheter with simultaneous fluid replacement through a venous central catheter. Simultaneous data recording from cerebral tissue oxygenation, intracranial pressure, and cardiac output was monitored. A regression model was used to examine the effects of anemia on microcirculation with a mixed model to control for repeated measures. Homogeneous and normal cerebral microcirculation with no evidence of axonal damage was present in all cerebral regions, with no temporal variability, concluding that acute normovolaemic anemia does not result in short-term effects on cerebral microcirculation in the ovine brain. PMID:26869986
Kodama, Naomi; Kimura, Toshifumi; Yonemura, Seiichiro; Kaneda, Satoshi; Ohashi, Mizue; Ikeno, Hidetoshi
2014-01-01
Earthworms are important soil macrofauna inhabiting almost all ecosystems. Their biomass is large, and their burrowing and ingestion of soils alters soil physicochemical properties. Because of their large biomass, earthworms are regarded as an indicator of "soil health". However, primarily because of the difficulties in quantifying their behavior, the extent of their impact on soil material flow dynamics and soil health is poorly understood. Image data, with the aid of image processing tools, are powerful for quantifying the movements of objects. Image data sets are often very large and time-consuming to analyze, especially when continuously recorded and manually processed. We aimed to develop a system to quantify earthworm movement from video recordings. Our newly developed program successfully tracked the two-dimensional positions of three separate parts of the earthworm and simultaneously output the change in its body length. From the output data, we calculated the velocity of the earthworm's movement. Our program processed the image data three times faster than the manual tracking system. To date, there are no existing systems to quantify earthworm activity from continuously recorded image data. The system developed in this study will reduce input time by a factor of three compared with manual data entry and will reduce errors involved in quantifying large data sets. Furthermore, it will provide more reliable measured values, although the program is still a prototype that needs further testing and improvement. Combined with other techniques, such as measuring metabolic gas emissions from earthworm bodies, this program could provide continuous observations of earthworm behavior in response to environmental variables under laboratory conditions. In the future, this standardized method will be applied to other animals, and the quantified earthworm movement will be incorporated into models of soil material flow dynamics or of behavior in response to chemical substances present in the soil.
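Converting tracked positions into a movement speed, as the program does, can be sketched as a finite-difference calculation; the frame rate, pixel calibration, and toy track below are assumptions, not the program's internals.

```python
# Sketch: per-frame speed from tracked (x, y) positions of one body part.
import numpy as np

def speed_from_track(xy, fps=30.0, mm_per_px=0.1):
    """xy: array of shape (n_frames, 2) in pixels; returns speed in mm/s."""
    xy = np.asarray(xy, dtype=float)
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)   # px moved per frame
    return step * mm_per_px * fps

track = [[10, 10], [12, 11], [15, 13], [19, 16]]          # invented positions
print(speed_from_track(track))                            # mm/s between frames
```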
NASA Astrophysics Data System (ADS)
Hoose, C.; Hande, L. B.; Mohler, O.; Niemand, M.; Paukert, M.; Reichardt, I.; Ullrich, R.
2016-12-01
Between 0 and -37°C, ice formation in clouds is triggered by aerosol particles acting as heterogeneous ice nuclei. At lower temperatures, heterogeneous ice nucleation on aerosols can occur at lower supersaturations than homogeneous freezing of solutes. In laboratory experiments, the ice nucleation ability of different aerosol species (e.g., desert dusts, soot, biological particles) has been studied in detail and quantified via various theoretical or empirical parameterization approaches. For experiments in the AIDA cloud chamber, we have quantified the ice nucleation efficiency via a temperature- and supersaturation-dependent ice nucleation active site density. Here we present a new empirical parameterization scheme for immersion and deposition ice nucleation on desert dust and soot based on these experimental data. The application of this parameterization to the simulation of cirrus clouds, deep convective clouds and orographic clouds will be shown, including the extension of the scheme to the treatment of freezing of rain drops. The results are compared to other heterogeneous ice nucleation schemes. Furthermore, an aerosol-dependent parameterization of contact ice nucleation is presented.
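The ice nucleation active site (INAS) density framework referred to above is commonly written as follows; the symbols are generic (f_ice is the frozen fraction, A_aer the ice-nucleating surface area per particle), and the specific temperature-dependent fit to the AIDA data is not reproduced here.

```latex
f_{\mathrm{ice}}(T) = 1 - \exp\!\big[-n_{s}(T)\, A_{\mathrm{aer}}\big]
\quad\Longleftrightarrow\quad
n_{s}(T) = -\,\frac{\ln\!\big(1 - f_{\mathrm{ice}}(T)\big)}{A_{\mathrm{aer}}} .
```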
de Oliveira Mendes, Thiago; Porto, Brenda Lee Simas; Bell, Maria José Valenzuela; Perrone, Ítalo Tuler; de Oliveira, Marcone Augusto Leal
2016-12-15
Adulteration of milk with whey is difficult to detect because these two have similar physical and chemical characteristics. The traditional methodologies to monitor this fraud are based on the analysis of caseinomacropeptide. The present study proposes a new approach to detect and quantify this fraud using the fatty acid profiles of milk and whey. Fatty acids C14:0, C16:0, C18:0, C18:1, C18:2 and C18:3 were selected by gas chromatography associated with discriminant analysis to differentiate milk and whey, as they are present in quite different amounts. These six fatty acids were quantified within a short time by capillary zone electrophoresis in a set of adulterated milk samples. The correlation coefficient between the true values of whey addition and the experimental values obtained by this technique was 0.973. The technique is thus useful for the evaluation of milk adulteration with whey, contributing to the quality control of milk in the dairy industry. Copyright © 2016. Published by Elsevier Ltd.
Determination of terpenoid content in pine by organic solvent extraction and fast-GC analysis
Harman-Ware, Anne E.; Sykes, Robert; Peter, Gary F.; ...
2016-01-25
Terpenoids, naturally occurring compounds derived from isoprene units present in pine oleoresin, are a valuable source of chemicals used in solvents, fragrances, and flavors, and have shown potential use as a biofuel. This paper describes a method to extract and analyze the terpenoids present in loblolly pine saplings and pine lighter wood. Various extraction solvents were tested over different times and temperatures. Samples were analyzed by pyrolysis-molecular beam mass spectrometry before and after extractions to monitor the extraction efficiency. The pyrolysis studies indicated that the optimal extraction method used a 1:1 hexane/acetone solvent system at 22°C for 1 h. Extracts from the hexane/acetone experiments were analyzed using a low thermal mass modular accelerated column heater for fast-GC/FID analysis. The most abundant terpenoids from the pine samples were quantified using standard curves, and included the monoterpenes α- and β-pinene, camphene, and δ-carene. Sesquiterpenes analyzed included caryophyllene, humulene, and α-bisabolene. In addition, diterpenoid resin acids were quantified in derivatized extractions, including pimaric, isopimaric, levopimaric, palustric, dehydroabietic, abietic, and neoabietic acids.
Stereology techniques in radiation biology
NASA Technical Reports Server (NTRS)
Kubinova, Lucie; Mao, XiaoWen; Janacek, Jiri; Archambeau, John O.; Nelson, G. A. (Principal Investigator)
2003-01-01
Clinicians involved in conventional radiation therapy are very concerned about the dose-response relationships of normal tissues. Before proceeding to new clinical protocols, radiation biologists involved with conformal proton therapy believe it is necessary to quantify the dose response and tolerance of the organs and tissues that will be irradiated. An important focus is on the vasculature. This presentation reviews the methodology and format of using confocal microscopy and stereological methods to quantify tissue parameters, cell number, tissue volume and surface area, and vessel length using the microvasculature as a model tissue. Stereological methods and their concepts are illustrated using an ongoing study of the dose response of the microvessels in proton-irradiated hemibrain. Methods for estimating the volume of the brain and the brain cortex, the total number of endothelial cells in cortical microvessels, the length of cortical microvessels, and the total surface area of cortical microvessel walls are presented step by step in a way understandable for readers with little mathematical background. It is shown that stereological techniques, based on a sound theoretical basis, are powerful and reliable and have been used successfully.
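As a concrete illustration of the design-based estimators alluded to above, the Cavalieri volume estimator and a derived total-number estimate can be written as below; the symbols follow common stereological usage (T: section spacing, a/p: area associated with each grid point, P_i: points hitting the structure on section i, N_V: numerical density) and are not copied from the article.

```latex
\hat{V} = T \cdot \frac{a}{p} \cdot \sum_{i} P_i ,
\qquad
\hat{N} = N_V \cdot \hat{V} .
```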
Quantifying Sustainability in Puerto Rico – A Scientific Discussion
The presentation introduces the symposium and gives an overview of work on sustainability metrics research in Puerto Rico. The presentation starts broadly with the focus of the Office of Research and Development on sustainability and systems thinking, then drills down to the how ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karmi, S.
1996-03-18
The United States Air Force (Air Force) has prepared this Remedial investigation/Feasibility Study (RI/FS) report as part of the Installation Restoration Program (IRP) to present results of RI/FS activities at five sites at the Bullen Point radar installation. The IRP provides for investigating, quantifying, and remediating environmental contamination from past waste management activities at Air Force installations throughout the United States.
Parametric study of statistical bias in laser Doppler velocimetry
NASA Technical Reports Server (NTRS)
Gould, Richard D.; Stevenson, Warren H.; Thompson, H. Doyle
1989-01-01
Analytical studies have often assumed that LDV velocity bias depends on turbulence intensity in conjunction with one or more characteristic time scales, such as the time between validated signals, the time between data samples, and the integral turbulence time-scale. These parameters are presently varied independently, in an effort to quantify the biasing effect. Neither of the post facto correction methods employed is entirely accurate. The mean velocity bias error is found to be nearly independent of data validation rate.
Sarnat, Jeremy A; Wilson, William E; Strand, Matthew; Brook, Jeff; Wyzga, Ron; Lumley, Thomas
2007-12-01
Examining the validity of exposure metrics used in air pollution epidemiologic models has been a key focus of recent exposure assessment studies. The objective of this work has been, largely, to determine what a given exposure metric represents and to quantify and reduce any potential errors resulting from using these metrics in lieu of true exposure measurements. The current manuscript summarizes the presentations of the co-authors from a recent EPA workshop, held in December 2006, dealing with the role and contributions of exposure assessment in addressing these issues. Results are presented from US and Canadian exposure and pollutant measurement studies as well as theoretical simulations to investigate what both particulate and gaseous pollutant concentrations represent and the potential errors resulting from their use in air pollution epidemiologic studies. Quantifying the association between ambient pollutant concentrations and corresponding personal exposures has led to the concept of defining attenuation factors, or alpha. Specifically, characterizing pollutant-specific estimates for alpha was shown to be useful in developing regression calibration methods involving PM epidemiologic risk estimates. For some gaseous pollutants such as NO2 and SO2, the associations between ambient concentrations and personal exposures were shown to be complex and still poorly understood. Results from recent panel studies suggest that ambient NO2 measurements may, in some locations, be serving as surrogates to traffic pollutants, including traffic-related PM2.5, hopanes, steranes, and oxidized nitrogen compounds (rather than NO2).
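The attenuation factor alpha discussed above is usually framed by splitting personal exposure into an ambient-generated component, scaled by alpha, and a non-ambient component; the notation below is generic and intended only to make that decomposition explicit, not to reproduce the workshop's formulation.

```latex
E_{\mathrm{personal}} = \alpha\, C_{\mathrm{ambient}} + E_{\mathrm{nonambient}},
\qquad 0 \le \alpha \le 1 ,
```

where α absorbs infiltration and time-activity effects, which is what makes regression-calibration corrections of PM epidemiologic risk estimates possible in principle.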
Analysis of vegetation changes in Cidanau watershed, Indonesia
NASA Astrophysics Data System (ADS)
Khairiah, R. N.; Kunihiko, Y.; Prasetyo, L. B.; Setiawan, Y.
2018-05-01
Vegetation change detection is needed to conserve water quality and the water cycle in the Cidanau watershed. NDVI was applied to quantify vegetation changes in the Cidanau watershed for three different years: 1989, 2001, and 2015. Using NDVI, we mapped the reflectance from chlorophyll and distinguished varying amounts of vegetation at the pixel level by index. In the present study, as a preliminary study, we propose a vegetation change detection analysis based on NDVI from 1989 through 2015. Multi-temporal satellite data, i.e., Landsat imagery with 30 m spatial resolution, are used in the present study. Agroforestry land exhibited the greatest reductions in the highly dense vegetation class in 1989-2001 and in the moderate vegetation class in 2001-2015. This means that the amount of vegetation present on agroforestry land has decreased year by year.
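The NDVI computation underlying the analysis is simple enough to sketch directly; the toy reflectance arrays below are invented, and a real workflow would substitute the corresponding Landsat red and near-infrared bands for each year.

```python
# Sketch: NDVI = (NIR - Red) / (NIR + Red) per pixel, then change between dates.
import numpy as np

def ndvi(nir, red):
    nir = nir.astype(float)
    red = red.astype(float)
    return np.divide(nir - red, nir + red,
                     out=np.zeros_like(nir),
                     where=(nir + red) != 0)

nir_1989, red_1989 = np.array([[0.45, 0.30]]), np.array([[0.10, 0.12]])
nir_2015, red_2015 = np.array([[0.35, 0.25]]), np.array([[0.12, 0.14]])

change = ndvi(nir_2015, red_2015) - ndvi(nir_1989, red_1989)
print(change)     # negative values indicate a loss of green vegetation
```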
Assessing biomass accumulation in second growth forests of Puerto Rico using airborne lidar
NASA Astrophysics Data System (ADS)
Martinuzzi, S.; Cook, B.; Corp, L. A.; Morton, D. C.; Helmer, E.; Keller, M.
2017-12-01
Degraded and second growth tropical forests provide important ecosystem services, such as carbon sequestration and soil stabilization. Lidar data measure the three-dimensional structure of forest canopies and are commonly used to quantify aboveground biomass in temperate forest landscapes. However, the ability of lidar data to quantify second growth forest biomass in complex, tropical landscapes is less understood. Our goal was to evaluate the use of airborne lidar data to quantify aboveground biomass in a complex tropical landscape, the Caribbean island of Puerto Rico. Puerto Rico provides an ideal place for studying biomass accumulation because of the abundance of second growth forests in different stages of recovery, and the high ecological heterogeneity. Puerto Rico was almost entirely deforested for agriculture until the 1930s. Thereafter, agricultural abandonment resulted in a mosaic of second growth forests that have recovered naturally under different types of climate, land use, topography, and soil fertility. We integrated forest plot data from the US Forest Service, Forest Inventory and Analysis (FIA) Program with recent lidar data from NASA Goddard's Lidar, Hyperspectral, and Thermal (G-LiHT) airborne imager to quantify forest biomass across the island's landscape. The G-LiHT data consisted of targeted acquisitions over the FIA plots and other forested areas representing the environmental heterogeneity of the island. To fully assess the potential of the lidar data, we compared the ability of lidar-derived canopy metrics to quantify biomass alone, and in combination with intensity and topographic metrics. The results presented here are a key step for improving our understanding of the patterns and drivers of biomass accumulation in tropical forests.
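A common way to relate plot biomass to lidar canopy metrics, of the kind implied above, is a log-linear regression; the metric names and values in this sketch are invented and do not represent the FIA or G-LiHT data.

```python
# Sketch: fit ln(biomass) against lidar canopy metrics and back-transform.
import numpy as np

mean_height = np.array([ 5.2,  8.1, 12.4, 15.0, 18.3])   # m, lidar mean canopy height
cover       = np.array([0.35, 0.55, 0.70, 0.80, 0.90])   # canopy cover fraction
biomass     = np.array([ 18.,  45.,  95., 140., 210.])   # Mg/ha, plot estimates

X = np.column_stack([np.ones_like(mean_height), mean_height, cover])
coef, *_ = np.linalg.lstsq(X, np.log(biomass), rcond=None)
predicted = np.exp(X @ coef)                              # biomass back on Mg/ha scale
print(coef)
print(predicted)
```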
Precise measurement of the performance of thermoelectric modules
NASA Astrophysics Data System (ADS)
Díaz-Chao, Pablo; Muñiz-Piniella, Andrés; Selezneva, Ekaterina; Cuenat, Alexandre
2016-08-01
The potential exploitation of thermoelectric modules in mass-market applications such as exhaust gas heat recovery in combustion engines requires an accurate knowledge of their performance. Further expansion of the market will also require confidence in the results provided by suppliers to end-users. However, large variation in performance and maximum operating point is observed for identical modules when tested by different laboratories. Here, we present the first metrological study of the impact of mounting and testing procedures on the precision of thermoelectric module measurements. Variability in the electrical output due to mechanical pressure or type of thermal interface material is quantified for the first time. The respective contributions of the temperature difference and the mean temperature to the variation in the output performance are quantified. The contribution of these factors to the total uncertainties in module characterisation is detailed.
Quantifying uncertainty in discharge measurements: A new approach
Kiang, J.E.; Cohn, T.A.; Mason, R.R.
2009-01-01
The accuracy of discharge measurements using velocity meters and the velocity-area method is typically assessed based on empirical studies that may not correspond to conditions encountered in practice. In this paper, a statistical approach for assessing uncertainty based on interpolated variance estimation (IVE) is introduced. The IVE method quantifies all sources of random uncertainty in the measured data. This paper presents results employing data from sites where substantial over-sampling allowed for the comparison of IVE-estimated uncertainty and observed variability among repeated measurements. These results suggest that the IVE approach can provide approximate estimates of measurement uncertainty. The use of IVE to estimate the uncertainty of a discharge measurement would provide the hydrographer an immediate determination of uncertainty and help determine whether there is a need for additional sampling in problematic river cross sections. © 2009 ASCE.
Human eyeball model reconstruction and quantitative analysis.
Xing, Qi; Wei, Qi
2014-01-01
Determining the shape of the eyeball is important for diagnosing eye diseases such as myopia. In this paper, we present an automatic approach to precisely reconstruct the three-dimensional geometric shape of the eyeball from MR images. The model development pipeline involved image segmentation, registration, B-Spline surface fitting and subdivision surface fitting, none of which required manual interaction. From the high resolution resultant models, geometric characteristics of the eyeball can be accurately quantified and analyzed. In addition to the eight metrics commonly used by existing studies, we proposed two novel metrics, Gaussian Curvature Analysis and Sphere Distance Deviation, to quantify the cornea shape and the whole eyeball surface, respectively. The experimental results showed that the reconstructed eyeball models accurately represent the complex morphology of the eye. The ten metrics parameterize the eyeball among different subjects, which can potentially be used for eye disease diagnosis.
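The abstract does not spell out how the Sphere Distance Deviation metric is computed, so the following Python sketch shows one plausible reading: fit a least-squares sphere to the reconstructed surface vertices and report the RMS radial deviation from it. The function names and the synthetic point cloud are assumptions for illustration only.

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit: returns (center, radius)."""
    P = np.asarray(points, dtype=float)
    A = np.c_[2.0 * P, np.ones(len(P))]
    b = np.sum(P ** 2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + np.dot(center, center))
    return center, radius

def sphere_distance_deviation(points):
    """RMS deviation of surface vertices from the best-fit sphere --
    one plausible realization of a 'sphere distance deviation' metric."""
    center, radius = fit_sphere(points)
    r = np.linalg.norm(np.asarray(points) - center, axis=1)
    return np.sqrt(np.mean((r - radius) ** 2))

# hypothetical vertices sampled from a mildly prolate eyeball-like surface
rng = np.random.default_rng(0)
u, v = rng.uniform(0, np.pi, 2000), rng.uniform(0, 2 * np.pi, 2000)
pts = np.c_[12.0 * np.sin(u) * np.cos(v),
            12.0 * np.sin(u) * np.sin(v),
            13.0 * np.cos(u)]            # semi-axes ~12-13 mm
print(f"deviation = {sphere_distance_deviation(pts):.3f} mm")
```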
Yildiz, Leyla; Başkan, Kevser Sözgen; Tütem, Esma; Apak, Reşat
2008-10-19
This study aims to identify the essential antioxidant compounds present in parsley (Petroselinum sativum) and celery (Apium graveolens) leaves belonging to the Umbelliferae (Apiaceae) family, and in stinging nettle (Urtica dioica) belonging to the Urticaceae family, to measure the total antioxidant capacity (TAC) of these compounds with CUPRAC (cupric ion reducing antioxidant capacity) and ABTS spectrophotometric methods, and to correlate the TAC with high performance liquid chromatography (HPLC) findings. The CUPRAC spectrophotometric method of TAC assay using copper(II)-neocuproine (2,9-dimethyl-1,10-phenanthroline) as the chromogenic oxidant was developed in our laboratories. The individual antioxidant constituents of plant extracts were identified and quantified by HPLC on a C18 column using a modified mobile phase of gradient elution comprising MeOH-0.2% o-phosphoric acid and UV detection for polyphenols at 280 nm. The TAC values of HPLC-quantified antioxidant constituents were found, and compared for the first time with those found by CUPRAC. The TAC of HPLC-quantified compounds accounted for a relatively high percentage of the observed CUPRAC capacities of plant extracts, namely 81% of nettle, 60-77% of parsley (in different hydrolyzates of extract and solid sample), and 41-57% of celery leaves (in different hydrolyzates). The CUPRAC total capacities of the 70% MeOH extracts of the studied plants (in units of mmol trolox g⁻¹ plant) were in the order: celery leaves > nettle > parsley. The TAC calculated with the aid of HPLC-spectrophotometry did not account for 100% of the CUPRAC total capacities, because all flavonoid glycosides subjected to hydrolysis were either not detectable with HPLC, or not converted to the corresponding aglycons (i.e., easily detectable and quantifiable with HPLC) during the hydrolysis step.
Roberts, Tawna L; Kester, Kristi N; Hertle, Richard W
2018-04-01
This study presents test-retest reliability of optotype visual acuity (OVA) across 60° of horizontal gaze position in patients with infantile nystagmus syndrome (INS). Also, the validity of the metric gaze-dependent functional vision space (GDFVS) is shown in patients with INS. In experiment 1, OVA was measured twice in seven horizontal gaze positions from 30° left to 30° right in 10° steps in 20 subjects with INS and 14 without INS. Test-retest reliability was assessed using the intraclass correlation coefficient (ICC) in each gaze. OVA area under the curve (AUC) was calculated with horizontal eye position on the x-axis and logMAR visual acuity on the y-axis, and then converted to GDFVS. In experiment 2, validity of GDFVS was determined over 40° of horizontal gaze by applying the 95% limits of agreement from experiment 1 to pre- and post-treatment GDFVS values from 85 patients with INS. In experiment 1, test-retest reliability for OVA was high (ICC ≥ 0.88), as the difference in test-retest was on average less than 0.1 logMAR in each gaze position. In experiment 2, as a group, INS subjects had a significant increase (P < 0.001) in the size of their GDFVS that exceeded the 95% limits of agreement found during test-retest. OVA is a reliable measure in INS patients across 60° of horizontal gaze position. GDFVS is a valid clinical method to quantify OVA as a function of eye position in INS patients. This method captures the dynamic nature of OVA in INS patients and may be a valuable measure to quantify visual function in patients with INS, particularly in quantifying change as part of clinical studies.
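A minimal sketch of the area-under-the-curve step, assuming hypothetical logMAR acuities at the seven gaze positions; how the AUC is rescaled into the GDFVS metric is not specified here, so only the AUC is computed.

```python
import numpy as np

# hypothetical logMAR acuities at 7 horizontal gaze positions (deg), from
# 30 deg left (-30) to 30 deg right in 10-deg steps, for one patient with INS
gaze_deg = np.array([-30, -20, -10, 0, 10, 20, 30], dtype=float)
logmar   = np.array([0.80, 0.62, 0.45, 0.30, 0.40, 0.58, 0.75])

# trapezoidal area under the logMAR-vs-gaze curve over the 60-deg range;
# lower values mean acuity is better maintained across gaze positions
auc = np.sum((logmar[1:] + logmar[:-1]) / 2.0 * np.diff(gaze_deg))
print(f"OVA AUC over 60 deg of gaze: {auc:.1f} logMAR*deg")
```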
Interindividual Responses of Appetite to Acute Exercise: A Replicated Crossover Study.
Goltz, Fernanda R; Thackray, Alice E; King, James A; Dorling, James L; Atkinson, Greg; Stensel, David J
2018-04-01
Acute exercise transiently suppresses appetite, which coincides with alterations in appetite-regulatory hormone concentrations. Individual variability in these responses is suspected, but replicated trials are needed to quantify them robustly. We examined the reproducibility of appetite and appetite-regulatory hormone responses to acute exercise and quantified the individual differences in responses. Fifteen healthy, recreationally active men completed two control (60-min resting) and two exercise (60-min fasted treadmill running at 70% peak oxygen uptake) conditions in randomized sequences. Perceived appetite and circulating concentrations of acylated ghrelin and total peptide YY (PYY) were measured immediately before and after the interventions. Interindividual differences were explored by correlating the two sets of response differences between exercise and control conditions. Within-participant covariate-adjusted linear mixed models were used to quantify participant-condition interactions. Compared with control, exercise suppressed mean acylated ghrelin concentrations and appetite perceptions (all ES = 0.62-1.47, P < 0.001) and elevated total PYY concentrations (ES = 1.49, P < 0.001). For all variables, the standard deviation of the change scores was substantially greater in the exercise versus control conditions. Moderate-to-large positive correlations were observed between the two sets of control-adjusted exercise responses for all variables (r = 0.54-0.82, P ≤ 0.036). After adjusting for baseline measurements, participant-condition interactions were present for all variables (P ≤ 0.053). Our replicated crossover study allowed, for the first time, the interaction between participant and acute exercise response in appetite parameters to be quantified. Even after adjustment for individual baseline measurements, participants demonstrated individual differences in perceived appetite and hormone responses to acute exercise bouts beyond any random within-subject variability over time.
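A toy Python simulation of the replicated-crossover logic: with two control and two exercise visits per participant, the control-adjusted exercise responses from the two replicates can be correlated, and the spread of change scores compared between conditions. All numbers below are synthetic, and the study's covariate-adjusted mixed models are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 15  # participants

# hypothetical appetite change scores (post minus pre) for two control and
# two exercise conditions; real data would come from the replicated trials
individual_effect = rng.normal(-15, 6, n)             # per-person exercise effect
control_1, control_2 = rng.normal(0, 4, n), rng.normal(0, 4, n)
exercise_1 = individual_effect + rng.normal(0, 4, n)
exercise_2 = individual_effect + rng.normal(0, 4, n)

# control-adjusted exercise responses for each replicate
resp_1 = exercise_1 - control_1
resp_2 = exercise_2 - control_2

# (1) reproducibility of the individual response: correlation between replicates
r = np.corrcoef(resp_1, resp_2)[0, 1]

# (2) SDs of the change scores: a larger spread under exercise than control is
#     consistent with true interindividual response differences
sd_ex = np.std(np.r_[exercise_1, exercise_2], ddof=1)
sd_ctl = np.std(np.r_[control_1, control_2], ddof=1)
print(f"replicate correlation r = {r:.2f}; SD exercise = {sd_ex:.1f}, SD control = {sd_ctl:.1f}")
```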
Study Quantifies Physical Demands of Yoga in Seniors
A recent NCCAM-funded study measured the physical demands of seven standing poses commonly taught in senior yoga classes, including Chair, Wall Plank, Tree, and Warrior II.
Wang, Bin; Zhou, Xiaozhou; Price, Christopher; Li, Wen; Pan, Jun; Wang, Liyun
2012-01-01
Osteocytes, the most abundant cells in bone, are critical in maintaining tissue homeostasis and orchestrating bone’s mechanical adaptation. Osteocytes depend upon load-induced convection within the lacunar-canalicular system (LCS) to maintain viability and to sense their mechanical environment. Using the fluorescence recovery after photobleaching (FRAP) imaging approach, we previously quantified the convection of a small tracer (sodium fluorescein, 376Da) in the murine tibial LCS for an intermittent cyclic loading (Price et al., 2011. JBMR 26:277-85). In the present study we first expanded the investigation of solute transport using a larger tracer (parvalbumin, 12.3kDa), which is comparable in size to some signaling proteins secreted by osteocytes. Murine tibiae were subjected to sequential FRAP tests under rest-inserted cyclic loading while the loading magnitude (0, 2.8, or 4.8N) and frequency (0.5, 1, or 2 Hz) were varied. The characteristic transport rate k and the transport enhancement relative to diffusion (k/k0) were measured under each loading condition, from which the peak solute velocity in the LCS was derived using our LCS transport model. Both the transport enhancement and solute velocity increased with loading magnitude and decreased with loading frequency. Furthermore, the solute-matrix interactions, quantified in terms of the reflection coefficient through the osteocytic pericellular matrix (PCM), were measured and theoretically modeled. The reflection coefficient of parvalbumin (σ=0.084) was derived from the differential fluid and solute velocities within loaded bone. Using a newly developed PCM sieving model, the PCM’s fiber configurations accounting for the measured interactions were obtained for the first time. The present study provided not only new data on the micro-fluidic environment experienced by osteocytes in situ, but also a powerful quantitative tool for future study of the PCM, the critical interface that controls both outside-in and inside-out signaling in osteocytes during normal bone adaptation and in pathological conditions. PMID:23109140
Quiroz Arita, Carlos; Yilmaz, Özge; Barlak, Semin; Catton, Kimberly B; Quinn, Jason C; Bradley, Thomas H
2016-12-01
The microalgae biofuel life cycle assessments (LCAs) present in the literature have excluded the effects of direct land use change (DLUC) from facility construction under the assumption that DLUC effects are negligible. This study seeks to model the greenhouse gas (GHG) emissions of microalgae biofuels including DLUC by quantifying the CO2 equivalence of carbon released to the atmosphere through the construction of microalgae facilities. The locations and types of biomass and Soil Organic Carbon that are disturbed through microalgae cultivation facility construction are quantified using geographical models of microalgae productivity potential including consideration of land availability. The results of this study demonstrate that previous LCAs of microalgae-to-biofuel processes have overestimated the GHG benefits of microalgae-based biofuels production by failing to include the effect of DLUC. Previous estimations of microalgae biofuel production potential have correspondingly overestimated the volume of biofuels that can be produced in compliance with U.S. environmental goals. Copyright © 2016 Elsevier Ltd. All rights reserved.
Effect of Metakaolin on Strength and Efflorescence Quantity of Cement-Based Composites
Weng, Tsai-Lung; Lin, Wei-Ting; Cheng, An
2013-01-01
This study investigated the basic mechanical and microscopic properties of cement produced with metakaolin and quantified the production of residual white efflorescence. Cement mortar was produced at various replacement ratios of metakaolin (0, 5, 10, 15, 20, and 25% by weight of cement) and exposed to various environments. Compressive strength, efflorescence quantity (using MATLAB (Matrix Laboratory) image analysis and the curettage method), scanning electron microscopy, and X-ray diffraction analysis were reported in this study. Specimens with metakaolin as a replacement for Portland cement present higher compressive strength and greater resistance to efflorescence; however, the addition of more than 20% metakaolin has a detrimental effect on strength and efflorescence. This may be explained by the microstructure and hydration products. The quantity of efflorescence determined using MATLAB image analysis is close to the result obtained using the curettage method. The results demonstrate that replacing Portland cement with metakaolin is most effective at a 15% replacement ratio by weight. PMID:23737719
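The original image analysis was done in MATLAB; the short Python sketch below illustrates the underlying idea, quantifying efflorescence as the fraction of pixels brighter than a chosen threshold on a grayscale surface image. The threshold value and the synthetic image are assumptions for illustration.

```python
import numpy as np

def efflorescence_fraction(gray, threshold=0.8):
    """Fraction of pixels brighter than `threshold` (0-1 grayscale), taken as
    white efflorescence deposits on the darker mortar surface."""
    gray = np.asarray(gray, dtype=float)
    return float(np.mean(gray > threshold))

# hypothetical 100x100 surface image: mostly mortar (~0.4) with a bright patch
rng = np.random.default_rng(2)
img = np.full((100, 100), 0.4) + 0.05 * rng.standard_normal((100, 100))
img[20:40, 30:70] = 0.95                  # simulated efflorescence deposit
print(f"efflorescence area fraction: {efflorescence_fraction(img):.1%}")
```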
Quantifying, Visualizing, and Monitoring Lead Optimization.
Maynard, Andrew T; Roberts, Christopher D
2016-05-12
Although lead optimization (LO) is by definition a process, process-centric analysis and visualization of this important phase of pharmaceutical R&D have been lacking. Here we describe a simple statistical framework to quantify and visualize the progression of LO projects so that the vital signs of LO convergence can be monitored. We refer to the resulting visualizations generated by our methodology as the "LO telemetry" of a project. These visualizations can be automated to provide objective, holistic, and instantaneous analysis and communication of LO progression. This enhances the ability of project teams to more effectively drive the LO process, while enabling management to better coordinate and prioritize LO projects. We present the telemetry of five LO projects comprising different biological targets and different project outcomes, including clinical compound selection, termination due to preclinical safety/tox, and termination due to lack of tractability. We demonstrate that LO progression is accurately captured by the telemetry. We also present metrics to quantify LO efficiency and tractability.
Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification
NASA Technical Reports Server (NTRS)
Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.
2016-01-01
Decision makers need to know quantified simulation credibility to make simulation-based critical decisions, and other users of simulations need it to use simulations effectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.
NASA Astrophysics Data System (ADS)
Qu, Yueqiao; He, Youmin; Zhang, Yi; Ma, Teng; Zhu, Jiang; Miao, Yusi; Dai, Cuixia; Silverman, Ronald; Humayun, Mark S.; Zhou, Qifa; Chen, Zhongping
2017-02-01
Age-related macular degeneration and keratoconus are two ocular diseases occurring in the posterior and anterior eye, respectively. In both conditions, the mechanical elasticity of the respective tissues changes during the early onset of disease. It is necessary to detect these changes and treat the diseases in their early stages. Acoustic radiation force optical coherence elastography (ARF-OCE) is a method of elasticity mapping using confocal ultrasound waves for excitation and Doppler optical coherence tomography for detection. We report on an ARF-OCE system that uses modulated compression-wave-based excitation signals, and detects the spatial and frequency responses of the tissue. First, all components of the system are synchronized and triggered such that the signal is consistent between frames. Next, phantom studies are performed to validate and calibrate the relationship between the resonance frequency and the Young's modulus. Then the frequency responses of the anterior and posterior eye are detected for porcine and rabbit eyes, and the results correlated to the elasticity. Finally, spatial elastograms are obtained for a porcine retina. Layer segmentation and analysis are performed and correlated to the histology of the retina, where five distinct layers are recognized. The elasticities of the tissue layers will be quantified according to the mean thickness and displacement response for the locations on the retina. This study is a stepping stone to future in-vivo animal studies, where the elastic modulus of the ocular tissue can be quantified and mapped out accordingly.
Quantifying and minimizing entropy generation in AMTEC cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendricks, T.J.; Huang, C.
1997-12-31
Entropy generation in an AMTEC cell represents inherent power loss to the AMTEC cell. Minimizing cell entropy generation directly maximizes cell power generation and efficiency. An internal project is on-going at AMPS to identify, quantify and minimize entropy generation mechanisms within an AMTEC cell, with the goal of determining cost-effective design approaches for maximizing AMTEC cell power generation. Various entropy generation mechanisms have been identified and quantified. The project has investigated several cell design techniques in a solar-driven AMTEC system to minimize cell entropy generation and produce maximum power cell designs. In many cases, various sources of entropy generation are interrelated such that minimizing entropy generation requires cell and system design optimization. Some of the tradeoffs between various entropy generation mechanisms are quantified and explained and their implications on cell design are discussed. The relationship between AMTEC cell power and efficiency and entropy generation is presented and discussed.
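As a worked example of a single mechanism, the snippet below evaluates the entropy generated by parasitic heat conduction across a finite temperature difference and the associated lost work (a Gouy-Stodola-type estimate). The numbers are hypothetical and the cell's full entropy-generation model is not reproduced.

```python
# Entropy generated by steady heat flow Q across a finite temperature drop,
# S_gen = Q * (1/T_cold - 1/T_hot); the lost work is roughly T_cold * S_gen.
# Values are hypothetical and illustrate only one of several mechanisms
# (conduction/radiation shunts, electrode losses, pressure losses, ...).
Q = 50.0        # W, heat leaking across a parasitic path
T_hot = 1100.0  # K, hot-side (evaporator) temperature
T_cold = 600.0  # K, cold-side (condenser) temperature

S_gen = Q * (1.0 / T_cold - 1.0 / T_hot)     # W/K
lost_power = T_cold * S_gen                  # W of potential output destroyed
print(f"S_gen = {S_gen * 1000:.1f} mW/K, lost power ~ {lost_power:.1f} W")
```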
Kojima, Tsuyoshi; Van Deusen, Mark; Jerome, W. Gray; Garrett, C. Gaelyn; Sivasankar, M. Preeti; Novaleski, Carolyn K.; Rousseau, Bernard
2014-01-01
Because the vocal folds undergo repeated trauma during continuous cycles of vibration, the epithelium is routinely susceptible to damage during phonation. Excessive and prolonged vibration exposure is considered a significant predisposing factor in the development of vocal fold pathology. The purpose of the present study was to quantify the extent of epithelial surface damage following increased time and magnitude doses of vibration exposure using an in vivo rabbit phonation model. Forty-five New Zealand white breeder rabbits were randomized to nine groups and received varying phonation time-doses (30, 60, or 120 minutes) and magnitude-doses (control, modal intensity phonation, or raised intensity phonation) of vibration exposure. Scanning electron microscopy and transmission electron microscopy were used to quantify the degree of epithelial surface damage. Results revealed a significant reduction in microprojection density, microprojection height, and depth of the epithelial surface with increasing time and phonation magnitude doses, signifying increased epithelial surface damage risk with excessive and prolonged vibration exposure. Destruction of the epithelial cell surface may provide significant insight into the disruption of cell function following prolonged vibration exposure. One important goal achieved in the present study was the quantification of epithelial surface damage using objective imaging criteria. These data provide an important foundation for future studies of long-term tissue recovery from excessive and prolonged vibration exposure. PMID:24626217
A novel method for visualising and quantifying through-plane skin layer deformations.
Gerhardt, L-C; Schmidt, J; Sanz-Herrera, J A; Baaijens, F P T; Ansari, T; Peters, G W M; Oomens, C W J
2012-10-01
Skin is a multilayer composite and exhibits highly non-linear, viscoelastic, anisotropic material properties. In many consumer product and medical applications (e.g. during shaving, needle insertion, patient re-positioning), large tissue displacements and deformations are involved; consequently large local strains in the skin tissue can occur. Here, we present a novel imaging-based method to study skin deformations and the mechanics of interacting skin layers of full-thickness skin. Shear experiments and real-time video recording were combined with digital image correlation and strain field analysis to visualise and quantify skin layer deformations during dynamic mechanical testing. A global shear strain of 10% was applied to airbrush-patterned porcine skin (thickness: 1.2-1.6mm) using a rotational rheometer. The recordings were analysed with ARAMIS image correlation software, and local skin displacement, strain and stiffness profiles through the skin layers were determined. The results of this pilot study revealed inhomogeneous skin deformation, characterised by a gradual transition from a low (2.0-5.0%; epidermis) to high (10-22%; dermis) shear strain regime. Shear moduli ranged from 20 to 130kPa. The method presented here will be used for more extended studies on viable human skin, and is considered a valuable foundation for further development of constitutive models which can be used in advanced finite element analyses of skin. Copyright © 2012 Elsevier Ltd. All rights reserved.
Boal, S; Miguel Carreira, L
2015-09-01
Degenerative joint disease (DJD) is a progressive, chronic joint disease with an inflammatory component promoting an acute phase protein (APP) response. C-reactive protein (CRP) is one of the most important APPs, used as an inflammation marker in human, but not veterinary, medicine. The study was developed in a sample of 48 dogs (n = 48) with DJD and aimed to: 1) identify and quantify the synovial fluid CRP (SFCRP) in these specimens using a validated ELISA test for serum CRP (SCRP) detection and quantification; and 2) study the possible relationship between variations in SCRP and SFCRP levels in DJD patients, evaluating the influence of physical parameters such as gender, body weight, pain level, DJD grade, and the physical activity (PA) of the patients. Results were considered statistically significant at p values <0.05. Our study showed that it is possible to detect and quantify SFCRP levels in DJD patients using a previously validated canine SCRP ELISA test, allowing us to propose a preliminary reference value for SFCRP in patients with DJD. Individuals with DJD presented SCRP values within the normal reference range, although their SFCRP levels were always lower. Obesity, pain, and the DJD grade presented by the patients are conditions that seem to influence SCRP levels but not SFCRP levels.
NASA Astrophysics Data System (ADS)
Parker, L. K.; Morris, R. E.; Zapert, J.; Cook, F.; Koo, B.; Rasmussen, D.; Jung, J.; Grant, J.; Johnson, J.; Shah, T.; Pavlovic, T.
2015-12-01
The Colorado Air Resource Management Modeling Study (CARMMS) was funded by the Bureau of Land Management (BLM) to predict the impacts from future federal and non-federal energy development in Colorado and Northern New Mexico. The study used the Comprehensive Air Quality Model with extensions (CAMx) photochemical grid model (PGM) to quantify potential impacts from energy development from BLM field office planning areas. CAMx source apportionment technology was used to track the impacts from multiple (14) different emissions source regions (i.e. field office areas) within one simulation, as well as to assess the cumulative impact of emissions from all source regions combined. The energy development emissions estimates were for the year 2021 for three different development scenarios: (1) low; (2) high; (3) high with emissions mitigation. Impacts on air quality (AQ), including ozone, PM2.5, PM10, NO2, SO2, and air quality related values (AQRVs) such as atmospheric deposition, regional haze and changes in Acid Neutralizing Capacity (ANC) of lakes, were quantified and compared to established threshold levels. In this presentation, we present a brief summary of how the emission scenarios were developed, we compare the emission totals for each scenario, and then focus on the ozone impacts for each scenario to assess: (1) the difference in potential ozone impacts under the different development scenarios and (2) the sensitivity of the ozone impacts to different emissions levels. Region-wide ozone impacts will be presented as well as impacts at specific locations with ozone monitors.
Climate Change Accuracy: Requirements and Economic Value
NASA Astrophysics Data System (ADS)
Wielicki, B. A.; Cooke, R.; Mlynczak, M. G.; Lukashin, C.; Thome, K. J.; Baize, R. R.
2014-12-01
Higher than normal accuracy is required to rigorously observe decadal climate change. But what level is needed? How can this be quantified? This presentation will summarize a new, more rigorous and quantitative approach to determining the required accuracy for climate change observations (Wielicki et al., 2013, BAMS). Most current global satellite observations cannot meet this accuracy level. A proposed new satellite mission to resolve this challenge is CLARREO (Climate Absolute Radiance and Refractivity Observatory). CLARREO is designed to achieve advances of a factor of 10 for reflected solar spectra and a factor of 3 to 5 for thermal infrared spectra (Wielicki et al., Oct. 2013 BAMS). The CLARREO spectrometers are designed to serve as SI traceable benchmarks for the Global Satellite Intercalibration System (GSICS) and to greatly improve the utility of a wide range of LEO and GEO infrared and reflected solar passive satellite sensors for climate change observations (e.g. CERES, MODIS, VIIRS, CrIS, IASI, Landsat, SPOT, etc). Providing more accurate decadal change trends can in turn lead to more rapid narrowing of key climate science uncertainties such as cloud feedback and climate sensitivity. A study has been carried out to quantify the economic benefits of such an advance as part of a rigorous and complete climate observing system. The study concludes that the economic value is US$12 trillion in net present value for a nominal discount rate of 3% (Cooke et al. 2013, J. Env. Sys. Dec.). A brief summary of these two studies and their implications for the future of climate science will be presented.
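A minimal sketch of the net-present-value arithmetic at a 3% discount rate, with placeholder benefit streams; it does not use the Cooke et al. (2013) inputs and will not reproduce the $12 trillion figure.

```python
# Net present value of a hypothetical stream of annual climate-information
# benefits B_t at discount rate r: NPV = sum_t B_t / (1 + r)**t.
# The cash flows below are placeholders, not the published study's inputs.
r = 0.03
benefits = [0.0] * 10 + [0.2e12] * 40        # $/yr: none for 10 yr, then $0.2T/yr
npv = sum(b / (1.0 + r) ** t for t, b in enumerate(benefits, start=1))
print(f"NPV at {r:.0%} discount rate: ${npv / 1e12:.1f} trillion")
```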
Muralidhar, Gautam S; Channappayya, Sumohana S; Slater, John H; Blinka, Ellen M; Bovik, Alan C; Frey, Wolfgang; Markey, Mia K
2008-11-06
Automated analysis of fluorescence microscopy images of endothelial cells labeled for actin is important for quantifying changes in the actin cytoskeleton. The current manual approach is laborious and inefficient. The goal of our work is to develop automated image analysis methods, thereby increasing cell analysis throughput. In this study, we present preliminary results on comparing different algorithms for cell segmentation and image denoising.
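As an illustration of the kind of pipeline being compared, the sketch below pairs a simple denoiser (Gaussian smoothing) with Otsu thresholding and connected-component labelling on a synthetic fluorescence-like image, using scikit-image. These are baseline choices, not necessarily the algorithms evaluated in the study.

```python
import numpy as np
from skimage.filters import gaussian, threshold_otsu
from skimage.measure import label

# synthetic "fluorescence" image: bright blobs (cells) on a noisy background
rng = np.random.default_rng(3)
img = np.zeros((256, 256))
yy, xx = np.mgrid[0:256, 0:256]
for cy, cx in rng.integers(20, 236, size=(12, 2)):
    img += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 8.0 ** 2))
noisy = img + rng.normal(0, 0.15, img.shape)

# denoise (Gaussian smoothing), then segment with Otsu's global threshold
smoothed = gaussian(noisy, sigma=2)
mask = smoothed > threshold_otsu(smoothed)
labels = label(mask)
print(f"detected {labels.max()} connected objects")
```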
The Value of Distributed Solar Electric Generation to San Antonio
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Nic; Norris, Ben; Meyer, Lisa
2013-02-14
This report presents an analysis of value provided by grid-connected, distributed PV in San Antonio from a utility perspective. The study quantified six value components, summarized in Table ES-1. These components represent the benefits that accrue to the utility, CPS Energy, in accepting solar onto the grid. This analysis does not treat the compensation of value, policy objectives, or cost-effectiveness from the retail consumer perspective.
Maughan, Curtis; Martini, Silvana
2012-02-01
The objectives of this study were to use a meat flavor lexicon to identify and quantify flavor differences among different types of meats such as beef, chicken, lamb, pork, and turkey, and to identify and quantify specific flavor attributes associated with "beef flavor" notes. A trained descriptive panel with 11 participants used a previously developed meat lexicon composed of 18 terms to evaluate the flavor of beef, chicken, pork, turkey, and lamb samples. Results show that beef and lamb samples can be described by flavor attributes such as barny, bitter, gamey, grassy, livery, metallic, and roast beef. Inversely related to these samples were pork and turkey and those attributes that were closely related to them, namely brothy, fatty, salty, sweet, and umami. Chicken was not strongly related to the other types of meats or the attributes used. The descriptive panel also evaluated samples of ground beef mixed with chicken to identify and quantify flavor attributes associated with a "beef flavor." Meat patties for this portion consisted of ground beef mixed with ground chicken in varying amounts: 0%, 25%, 50%, 75%, and 100% beef, with the remainder made up of chicken. Beef and beef-rich patties (75% beef) were more closely related to flavor attributes such as astringent, bloody, fatty, gamey, metallic, livery, oxidized, grassy, and roast beef, while chicken was more closely associated with brothy, juicy, sour, sweet, and umami. This research provides information regarding the specific flavor attributes that differentiate chicken and beef products and provides the first set of descriptors that can be associated with "beefy" notes. POTENTIAL APPLICATION: The use of a standardized flavor lexicon will allow meat producers to identify specific flavors present in their products. The impact is to identify and quantify negative and positive flavors in the product with the ultimate goal of optimizing processing or cooking conditions and improving the quality of meat products. © 2012 Institute of Food Technologists®
Quantitative characterization of the aqueous fraction from hydrothermal liquefaction of algae
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maddi, Balakrishna; Panisko, Ellen; Wietsma, Thomas
Aqueous streams generated from hydrothermal liquefaction contain approximately 30% of the total carbon present in the algal feed. Hence, this aqueous carbon must be utilized to produce liquid fuels and/or specialty chemicals for economic sustainability of hydrothermal liquefaction on an industrial scale. In this study, aqueous fractions produced from the hydrothermal liquefaction of fresh water and saline water algal cultures were analyzed using a wide variety of analytical instruments to determine their compositional characteristics. This study will also inform researchers designing catalysts for down-stream processing such as high-pressure catalytic conversion of organics in aqueous phase, catalytic hydrothermal gasification, and biological conversions. Organic chemical compounds present in all eight aqueous fractions were identified using two-dimensional gas chromatography equipped with time-of-flight mass spectrometry. Identified compounds include organic acids, nitrogen compounds and aldehydes/ketones. Conventional gas chromatography and liquid chromatography methods were utilized to quantify the identified compounds. Inorganic species in the aqueous stream of hydrothermal liquefaction of algae were identified using ion chromatography and inductively coupled plasma optical emission spectrometry. The concentrations of organic chemical compounds and inorganic species are reported. The quantified carbon accounted for 45 to 72% of the total carbon in the aqueous fractions.
Pedroso, Amanda P; Souza, Adriana P; Dornellas, Ana P S; Oyama, Lila M; Nascimento, Cláudia M O; Santos, Gianni M S; Rosa, José C; Bertolla, Ricardo P; Klawitter, Jelena; Christians, Uwe; Tashima, Alexandre K; Ribeiro, Eliane B
2017-04-07
Programming of hypothalamic functions regulating energy homeostasis may play a role in intrauterine growth restriction (IUGR)-induced adulthood obesity. The present study investigated the effects of IUGR on the hypothalamus proteome and metabolome of adult rats submitted to 50% protein-energy restriction throughout pregnancy. Proteomic and metabolomic analyses were performed by data independent acquisition mass spectrometry and multiple reaction monitoring, respectively. At age 4 months, the restricted rats showed elevated adiposity, increased leptin and signs of insulin resistance. In total, 1356 proteins were identified and 348 quantified, while 127 metabolites were quantified. The restricted hypothalamus showed down-regulation of 36 proteins and 5 metabolites and up-regulation of 21 proteins and 9 metabolites. Integrated pathway analysis of the proteomics and metabolomics data indicated impairment of hypothalamic glucose metabolism, increased flux through the hexosamine pathway, deregulation of the TCA cycle and the respiratory chain, and alterations in glutathione metabolism. The data suggest IUGR modulation of energy metabolism and redox homeostasis in the hypothalamus of male adult rats. The present results indicated deleterious consequences of IUGR on hypothalamic pathways involved in pivotal physiological functions. These results provide guidance for future mechanistic studies assessing the role of intrauterine malnutrition in the development of metabolic diseases later in life.
NASA Technical Reports Server (NTRS)
Chamberlin, Phillip
2008-01-01
The Flare Irradiance Spectral Model (FISM) is an empirical model of the solar irradiance spectrum from 0.1 to 190 nm at 1 nm spectral resolution and on a 1-minute time cadence. The goal of FISM is to provide accurate solar spectral irradiances over the vacuum ultraviolet (VUV: 0-200 nm) range as input for ionospheric and thermospheric models. The seminar will begin with a brief overview of the FISM model, and also how the Solar Dynamics Observatory (SDO) EUV Variability Experiment (EVE) will contribute to improving FISM. Some current studies will then be presented that use FISM estimations of the solar VUV irradiance to quantify the contributions of the increased irradiance from flares to Earth's increased thermospheric and ionospheric densities. Initial results will also be presented from a study looking at the electron density increases in the Martian atmosphere during a solar flare. Results will also be shown quantifying the VUV contributions to the total flare energy budget for both the impulsive and gradual phases of solar flares. Lastly, an example of how FISM can be used to simplify the design of future solar VUV irradiance instruments will be discussed, using the future NOAA GOES-R Extreme Ultraviolet and X-Ray Sensors (EXIS) space weather instrument.
Focusing cosmic telescopes: systematics of strong lens modeling
NASA Astrophysics Data System (ADS)
Johnson, Traci Lin; Sharon, Keren q.
2018-01-01
The use of strong gravitational lensing by galaxy clusters has become a popular method for studying the high redshift universe. While diverse in computational methods, lens modeling techniques have established means for determining statistical errors on cluster masses and magnifications. However, the systematic errors have yet to be quantified, arising from the number of constraints, availability of spectroscopic redshifts, and various types of image configurations. I will be presenting my dissertation work on quantifying systematic errors in parametric strong lensing techniques. I have participated in the Hubble Frontier Fields lens model comparison project, using simulated clusters to compare the accuracy of various modeling techniques. I have extended this project to understanding how changing the quantity of constraints affects the mass and magnification. I will also present my recent work extending these studies to clusters in the Outer Rim Simulation. These clusters are typical of the clusters found in wide-field surveys, in mass and lensing cross-section. These clusters have fewer constraints than the HFF clusters and thus are more susceptible to systematic errors. With the wealth of strong lensing clusters discovered in surveys such as SDSS, SPT, DES, and in the future, LSST, this work will be influential in guiding the lens modeling efforts and follow-up spectroscopic campaigns.
Quantifiers are incrementally interpreted in context, more than less
Urbach, Thomas P.; DeLong, Katherine A.; Kutas, Marta
2015-01-01
Language interpretation is often assumed to be incremental. However, our studies of quantifier expressions in isolated sentences found N400 event-related brain potential (ERP) evidence for partial but not full immediate quantifier interpretation (Urbach & Kutas, 2010). Here we tested similar quantifier expressions in pragmatically supporting discourse contexts (Alex was an unusual toddler. Most/Few kids prefer sweets/vegetables…) while participants made plausibility judgments (Experiment 1) or read for comprehension (Experiment 2). Control Experiments 3A (plausibility) and 3B (comprehension) removed the discourse contexts. Quantifiers always modulated typical and/or atypical word N400 amplitudes. However, only the real-time N400 effects in Experiment 2 mirrored offline quantifier and typicality crossover interaction effects for plausibility ratings and cloze probabilities. We conclude that quantifier expressions can be interpreted fully and immediately, though pragmatic and task variables appear to impact the speed and/or depth of quantifier interpretation. PMID:26005285
NASA Astrophysics Data System (ADS)
Broothaerts, Nils; López-Sáez, José Antonio; Verstraeten, Gert
2017-04-01
Reconstructing and quantifying human impact is an important step to understand human-environment interactions in the past. Quantitative measures of human impact on the landscape are needed to fully understand the long-term influence of anthropogenic land cover changes on the global climate, ecosystems and geomorphic processes. Nevertheless, quantifying past human impact is not straightforward. Recently, multivariate statistical analysis of fossil pollen records has been proposed to characterize vegetation changes and to get insights into past human impact. Although statistical analysis of fossil pollen data can provide useful insights into anthropogenically driven vegetation changes, it still cannot be used as an absolute quantification of past human impact. To overcome this shortcoming, in this study fossil pollen records were included in a multivariate statistical analysis (cluster analysis and non-metric multidimensional scaling (NMDS)) together with modern pollen data and modern vegetation data. The information on the modern pollen and vegetation dataset can be used to get a better interpretation of the representativeness of the fossil pollen records, and can result in a full quantification of human impact in the past. This methodology was applied in two contrasting environments: SW Turkey and Central Spain. For each region, fossil pollen data from different study sites were integrated, together with modern pollen data and information on modern vegetation. In this way, arboreal cover, grazing pressure and agricultural activities in the past were reconstructed and quantified. The data from SW Turkey provide new integrated information on changing human impact through time in the Sagalassos territory, and show that human impact was most intense during the Hellenistic and Roman Period (ca. 2200-1750 cal a BP) and decreased and changed in nature afterwards. The data from central Spain show for several sites that arboreal cover decreases below 5% from the Feudal period onwards (ca. 850 cal a BP), related to increasing human impact in the landscape. At other study sites arboreal cover remained above 25% despite significant human impact. Overall, the presented examples from two contrasting environments show how cluster analysis and NMDS of modern and fossil pollen data can help to provide quantitative insights into anthropogenic land cover changes. Our study extensively discusses and illustrates the possibilities and limitations of statistical analysis of pollen data to quantify human-induced land use changes.
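A compact sketch of the statistical workflow described above, assuming a toy matrix of pollen percentages: Bray-Curtis dissimilarities, average-linkage clustering, and non-metric multidimensional scaling with scikit-learn. The data and parameter choices are placeholders, not the study's datasets or settings.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.manifold import MDS

# rows = pollen samples (fossil + modern), columns = taxa counts; the matrix
# below is a toy stand-in for real pollen data
samples = np.array([
    [55, 20, 10, 15],   # high arboreal cover
    [50, 25, 10, 15],
    [35, 30, 20, 15],
    [10, 15, 40, 35],   # open, grazed landscape
    [ 8, 12, 45, 35],
    [ 5, 10, 45, 40],
], dtype=float)
samples /= samples.sum(axis=1, keepdims=True)        # relative abundances

dist = pdist(samples, metric="braycurtis")           # ecological dissimilarity
clusters = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")

nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed",
           random_state=0)
coords = nmds.fit_transform(squareform(dist))        # 2-D ordination
print(clusters)
print(coords.round(2))
```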
Lahiri, Uttama; Trewyn, Adam; Warren, Zachary; Sarkar, Nilanjan
2011-01-01
Children with Autism Spectrum Disorder are often characterized by deficits in social communication skills. While evidence suggests that intensive individualized interventions can improve aspects of core deficits in Autism Spectrum Disorder, at present numerous potent barriers exist related to accessing and implementing such interventions. Researchers are increasingly employing technology to develop more accessible, quantifiable, and individualized intervention tools to address core vulnerabilities related to autism. The present study describes the development and preliminary application of a Virtual Reality technology aimed at facilitating improvements in social communication skills for adolescents with autism. We present preliminary data from the usability study of this technological application for six adolescents with autism and discuss potential future development and application of adaptive Virtual Reality technology within an intervention framework.
Uncertainty analysis of trade-offs between multiple responses using hypervolume
Cao, Yongtao; Lu, Lu; Anderson-Cook, Christine M.
2017-08-04
When multiple responses are considered in process optimization, the degree to which they can be simultaneously optimized depends on the optimization objectives and the amount of trade-offs between the responses. The normalized hypervolume of the Pareto front is a useful summary to quantify the amount of trade-offs required to balance performance across the multiple responses. In order to quantify the impact of uncertainty of the estimated response surfaces and add realism to what future data to expect, 2 versions of the scaled normalized hypervolume of the Pareto front are presented. To demonstrate the variation of the hypervolume distributions, we explore a case study for a chemical process involving 3 responses, each with a different type of optimization goal. Our results show that the global normalized hypervolume characterizes the proximity to the ideal results possible, while the instance-specific summary considers the richness of the front and the severity of trade-offs between alternatives. Furthermore, the 2 scaling schemes complement each other and highlight different features of the Pareto front and hence are useful to quantify what solutions are possible for simultaneous optimization of multiple responses.
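For two responses that are both to be maximized, the normalized hypervolume can be computed with a simple sweep over the sorted Pareto points, as in the sketch below; the front, nadir, and ideal points are hypothetical, and the paper's two scaled variants are not reproduced.

```python
import numpy as np

def hypervolume_2d(front, ref):
    """Hypervolume dominated by a 2-D Pareto front (both responses maximized)
    relative to a reference (nadir) point, summed as rectangular slabs."""
    pts = np.asarray(front, dtype=float)
    pts = pts[np.argsort(-pts[:, 0])]          # sort by first objective, descending
    hv, prev_y = 0.0, ref[1]
    for x, y in pts:
        if y > prev_y:                         # only non-dominated steps add area
            hv += (x - ref[0]) * (y - prev_y)
            prev_y = y
    return hv

# hypothetical Pareto front of two maximization-oriented responses
front = [(0.9, 0.2), (0.7, 0.5), (0.5, 0.7), (0.2, 0.9)]
nadir, ideal = (0.0, 0.0), (1.0, 1.0)
hv = hypervolume_2d(front, nadir)
hv_norm = hv / ((ideal[0] - nadir[0]) * (ideal[1] - nadir[1]))   # in [0, 1]
print(f"normalized hypervolume = {hv_norm:.3f}")
```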
Recurrence quantity analysis based on matrix eigenvalues
NASA Astrophysics Data System (ADS)
Yang, Pengbo; Shang, Pengjian
2018-06-01
Recurrence plots are a powerful tool for visualization and analysis of dynamical systems. Recurrence quantification analysis (RQA), based on point density and diagonal and vertical line structures in the recurrence plots, provides alternative measures to quantify the complexity of dynamical systems. In this paper, we present a new measure based on the recurrence matrix to quantify the dynamical properties of a given system. Matrix eigenvalues can reflect the basic characteristics of complex systems, so we show the properties of the system by exploring the eigenvalues of the recurrence matrix. Considering that Shannon entropy has been defined as a complexity measure, we propose the definition of entropy of matrix eigenvalues (EOME) as a new RQA measure. We confirm that EOME can be used as a metric to quantify the behavior changes of the system. As a given dynamical system changes from a non-chaotic to a chaotic regime, the EOME will increase as well. Larger EOME values imply higher complexity and lower predictability. We also study the effect of some factors on EOME, including data length, recurrence threshold, the embedding dimension, and additional noise. Finally, we demonstrate an application in physiology. The advantage of this measure lies in a high sensitivity and simple computation.
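One way to realize such a measure, sketched in Python: build the binary recurrence matrix of a delay-embedded series, take the eigenvalues of this symmetric matrix, normalize their absolute values into a distribution, and compute its Shannon entropy. The embedding parameters, threshold, and normalization are illustrative choices and may differ from the paper's exact definition.

```python
import numpy as np

def recurrence_matrix(x, dim=3, delay=1, eps=0.5):
    """Binary recurrence matrix of a time-delay-embedded scalar series."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dists < eps).astype(float)

def eome(R):
    """Shannon entropy of the normalized absolute eigenvalue spectrum of the
    (symmetric) recurrence matrix -- one reading of an 'EOME'-type measure."""
    lam = np.abs(np.linalg.eigvalsh(R))
    p = lam / lam.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(4)
t = np.linspace(0, 20 * np.pi, 1000)
signals = {"sine": np.sin(t), "noise": rng.standard_normal(1000)}
for name, sig in signals.items():
    R = recurrence_matrix((sig - sig.mean()) / sig.std())
    print(name, round(eome(R), 3))
```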
Yang, Bin; Jan, Ning-Jiun; Brazile, Bryn; Voorhees, Andrew; Lathrop, Kira L; Sigal, Ian A
2018-04-06
Collagen fibers play a central role in normal eye mechanics and pathology. In ocular tissues, collagen fibers exhibit a complex 3-dimensional (3D) fiber orientation, with both in-plane (IP) and out-of-plane (OP) orientations. Imaging techniques traditionally applied to the study of ocular tissues only quantify IP fiber orientation, providing little information on OP fiber orientation. Accurate description of the complex 3D fiber microstructures of the eye requires quantifying full 3D fiber orientation. Herein, we present 3dPLM, a technique based on polarized light microscopy developed to quantify both IP and OP collagen fiber orientations of ocular tissues. The performance of 3dPLM was examined by simulation and experimental verification and validation. The experiments demonstrated an excellent agreement between extracted and true 3D fiber orientation. Both IP and OP fiber orientations can be extracted from the sclera and the cornea, providing previously unavailable quantitative 3D measures and insight into the tissue microarchitecture. Together, the results demonstrate that 3dPLM is a powerful imaging technique for the analysis of ocular tissues. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Automated Tracking and Quantification of Autistic Behavioral Symptoms Using Microsoft Kinect.
Kang, Joon Young; Kim, Ryunhyung; Kim, Hyunsun; Kang, Yeonjune; Hahn, Susan; Fu, Zhengrui; Khalid, Mamoon I; Schenck, Enja; Thesen, Thomas
2016-01-01
The prevalence of autism spectrum disorder (ASD) has risen significantly in the last ten years, and today, roughly 1 in 68 children has been diagnosed. One hallmark set of symptoms in this disorder is stereotypical motor movements. These repetitive movements may include spinning, body-rocking, or hand-flapping, amongst others. Despite the growing number of individuals affected by autism, an effective, accurate method of automatically quantifying such movements remains unavailable. This has negative implications for assessing the outcome of ASD intervention and drug studies. Here we present a novel approach to detecting autistic symptoms using the Microsoft Kinect v.2 to objectively and automatically quantify autistic body movements. The Kinect camera was used to film 12 actors performing three separate stereotypical motor movements each. Visual Gesture Builder (VGB) was implemented to analyze the skeletal structures in these recordings using a machine learning approach. In addition, movement detection was hard-coded in Matlab. Manual grading was used to confirm the validity and reliability of the VGB and Matlab analyses. We found that both methods were able to detect autistic body movements with high probability. The machine learning approach yielded the highest detection rates, supporting its use in automatically quantifying complex autistic behaviors with multi-dimensional input.
Information Uncertainty to Compare Qualitative Reasoning Security Risk Assessment Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavez, Gregory M; Key, Brian P; Zerkle, David K
2009-01-01
The security risk associated with malevolent acts such as those of terrorism is often void of the historical data required for a traditional PRA. Most information available to conduct security risk assessments for these malevolent acts is obtained from subject matter experts as subjective judgements. Qualitative reasoning approaches such as approximate reasoning and evidential reasoning are useful for modeling the predicted risk from information provided by subject matter experts. Absent from these approaches is a consistent means to compare the security risk assessment results. Associated with each predicted risk reasoning result is a quantifiable amount of information uncertainty which can be measured and used to compare the results. This paper explores using entropy measures to quantify the information uncertainty associated with conflict and non-specificity in the predicted reasoning results. The measured quantities of conflict and non-specificity can ultimately be used to compare qualitative reasoning results, which is important in triage studies and ultimately in resource allocation. Straightforward extensions of previous entropy measures are presented here to quantify the non-specificity and conflict associated with security risk assessment results obtained from qualitative reasoning models.
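As a hedged illustration of one such measure, the snippet below evaluates the generalized Hartley measure of non-specificity for a hypothetical basic probability assignment over a three-level risk frame; the paper's own reasoning model and its specific conflict measure are not reproduced here.

```python
import math

# Basic probability assignment over a 3-level risk frame {low, med, high}:
# mass on non-singleton sets expresses imprecision in the expert judgement.
# (Hypothetical numbers; not the paper's reasoning model.)
m = {
    frozenset({"high"}): 0.5,
    frozenset({"med", "high"}): 0.3,          # "medium or high"
    frozenset({"low", "med", "high"}): 0.2,   # "don't know"
}

# Generalized Hartley measure of non-specificity: N(m) = sum m(A) * log2|A|.
# 0 bits = fully specific; log2(3) ~ 1.58 bits = total ignorance on this frame.
nonspecificity = sum(mass * math.log2(len(A)) for A, mass in m.items())
print(f"non-specificity = {nonspecificity:.2f} bits")
```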
Local and global evaluation for remote sensing image segmentation
NASA Astrophysics Data System (ADS)
Su, Tengfei; Zhang, Shengwei
2017-08-01
In object-based image analysis, how to produce accurate segmentation is usually a very important issue that needs to be solved before image classification or target recognition. The study of segmentation evaluation methods is key to solving this issue. Almost all of the existing evaluation strategies focus only on global performance assessment. However, these methods are ineffective when two segmentation results with very similar overall performance have very different local error distributions. To overcome this problem, this paper presents an approach that can both locally and globally quantify segmentation incorrectness. In doing so, region-overlapping metrics are utilized to quantify each reference geo-object's over- and under-segmentation error. These quantified error values are used to produce segmentation error maps, which have effective illustrative power to delineate local segmentation error patterns. The error values for all of the reference geo-objects are aggregated using area-weighted summation, so that global indicators can be derived. An experiment using two scenes of very different high resolution images showed that the global evaluation part of the proposed approach was almost as effective as two other global evaluation methods, and the local part was a useful complement for comparing different segmentation results.
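The sketch below shows one common way to define region-overlap over- and under-segmentation errors for a single reference geo-object against a labelled segmentation; the exact formulas used in the paper may differ, and the toy arrays are for illustration only. A global score would then be an area-weighted sum of these per-object errors.

```python
import numpy as np

def over_under_errors(ref_mask, seg_labels):
    """Region-overlap over-/under-segmentation errors for one reference
    geo-object (boolean mask) against a labelled segmentation (int array)."""
    ref_area = ref_mask.sum()
    # segments that intersect the reference object
    seg_ids = np.unique(seg_labels[ref_mask])
    seg_ids = seg_ids[seg_ids != 0]
    overlaps = np.array([np.logical_and(ref_mask, seg_labels == s).sum()
                         for s in seg_ids], dtype=float)
    seg_areas = np.array([(seg_labels == s).sum() for s in seg_ids], dtype=float)
    # under-segmentation: overlapping segments spill outside the reference
    under = np.sum(overlaps / ref_area * (1.0 - overlaps / seg_areas))
    # over-segmentation: the reference is split across several segments
    over = 1.0 - overlaps.max() / ref_area
    return over, under

# toy example: one rectangular reference object; the segmentation splits it in
# two, and one of the pieces leaks outside the object
seg = np.zeros((20, 20), dtype=int)
seg[2:10, 2:12], seg[10:18, 2:18] = 1, 2
ref = np.zeros((20, 20), dtype=bool)
ref[2:18, 2:12] = True
over, under = over_under_errors(ref, seg)
print(f"over-segmentation = {over:.2f}, under-segmentation = {under:.2f}")
```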
Avti, Pramod K; Hu, Song; Favazza, Christopher; Mikos, Antonios G; Jansen, John A; Shroyer, Kenneth R; Wang, Lihong V; Sitharaman, Balaji
2012-01-01
In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (µg)] of single-walled carbon nanotubes (SWCNTs) in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Optical-resolution (OR) and acoustic-resolution (AR) PAM were employed to detect, map and quantify the SWCNTs in a variety of tissue histological specimens and compared with other optical techniques (bright-field optical microscopy, Raman microscopy, near infrared (NIR) fluorescence microscopy). Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs.
Gates, Kristin; Petterson, Stephen; Wingrove, Peter; Miller, Benjamin; Klink, Kathleen
2016-12-01
Research suggests that 13-25% of primary care patients who present with physical complaints have underlying depression or anxiety. The goal of this paper is to quantify and compare the frequency of the diagnosis of depression and anxiety in patients with a somatic reason for visit among primary care physicians across disciplines. Data obtained from the National Ambulatory Medical Care Survey (NAMCS) from 2002 to 2010 was used to quantify primary care patients with somatic presentations who were given a diagnosis of depression or anxiety. The Patient Health Questionnaire (PHQ)-15, Somatic Symptom Scale, and the Child Behavior Checklist for Ages 6-18 were used to define what constituted a somatic reason for visit in this study. Of the patients presenting with a somatic reason for visit in this nationally representative survey, less than 4% of patients in family or internal medicine were diagnosed with depression or anxiety. Less than 1% of patients were diagnosed with depression or anxiety in pediatrics or obstetrics and gynecology. Less than 2% of patients with somatic reasons for visit in any primary care specialty had documented screening for depression. The rates of diagnosis of depression and anxiety in patients presenting with somatic reasons for visit were significantly less than the prevalence reported in the literature across primary care disciplines. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Yoon, Donhee; Lee, Dongkun; Lee, Jong-Hyeon; Cha, Sangwon; Oh, Han Bin
2015-01-30
Quantifying polymers by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOFMS) with a conventional crystalline matrix generally suffers from poor sample-to-sample or shot-to-shot reproducibility. An ionic-liquid matrix has been demonstrated to mitigate these reproducibility issues by providing a homogeneous sample surface, which is useful for quantifying polymers. In the present study, we evaluated the use of an ionic liquid matrix, i.e., 1-methylimidazolium α-cyano-4-hydroxycinnamate (1-MeIm-CHCA), to quantify polyhexamethylene guanidine (PHMG) samples that pose a critical health hazard when inhaled in the form of droplets. MALDI-TOF mass spectra were acquired for PHMG oligomers using a variety of ionic-liquid matrices including 1-MeIm-CHCA. Calibration curves were constructed by plotting the sum of the PHMG oligomer peak areas versus PHMG sample concentration with a variety of peptide internal standards. Compared with the conventional crystalline matrix, the 1-MeIm-CHCA ionic-liquid matrix had much better reproducibility (lower standard deviations). Furthermore, by using an internal peptide standard, good linear calibration plots could be obtained over a range of PHMG concentrations of at least 4 orders of magnitude. This study successfully demonstrated that PHMG samples can be quantitatively characterized by MALDI-TOFMS with an ionic-liquid matrix and an internal standard. Copyright © 2014 John Wiley & Sons, Ltd.
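A minimal sketch of the internal-standard calibration step, with hypothetical concentrations and peak-area ratios; fitting in log-log space is one common way to handle a range spanning four orders of magnitude and is not necessarily the fit used in the paper.

```python
import numpy as np

# Hypothetical calibration data: PHMG concentration (ng/mL) vs. the summed
# oligomer peak area divided by the internal-standard peptide peak area.
conc = np.array([1e0, 1e1, 1e2, 1e3, 1e4])          # four orders of magnitude
area_ratio = np.array([0.011, 0.098, 1.05, 9.8, 101.0])

# fit in log-log space to weight the wide concentration range evenly
slope, intercept = np.polyfit(np.log10(conc), np.log10(area_ratio), 1)

def quantify(ratio):
    """Back-calculate concentration from a measured area ratio."""
    return 10 ** ((np.log10(ratio) - intercept) / slope)

print(f"slope = {slope:.3f}; unknown with ratio 2.5 -> {quantify(2.5):.0f} ng/mL")
```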
Homann, Stefanie; Hofmann, Christian; Gorin, Aleksandr M.; Nguyen, Huy Cong Xuan; Huynh, Diana; Hamid, Phillip; Maithel, Neil; Yacoubian, Vahe; Mu, Wenli; Kossyvakis, Athanasios; Sen Roy, Shubhendu; Yang, Otto Orlean
2017-01-01
Transfection is one of the most frequently used techniques in molecular biology that is also applicable for gene therapy studies in humans. One of the biggest challenges in investigating protein function and interaction in gene therapy studies is to have reliable monospecific detection reagents, particularly antibodies, for all human gene products. Thus, a reliable method that can optimize transfection efficiency based on not only expression of the target protein of interest but also the uptake of the nucleic acid plasmid can be an important tool in molecular biology. Here, we present a simple, rapid and robust flow cytometric method that can be used as a tool to optimize transfection efficiency at the single cell level while overcoming limitations of previously established methods that quantify transfection efficiency. By using optimized ratios of transfection reagent and a nucleic acid (DNA or RNA) vector directly labeled with a fluorochrome, this method can be used as a tool to simultaneously quantify cellular toxicity of different transfection reagents, the amount of nucleic acid plasmid that cells have taken up during transfection as well as the amount of the encoded expressed protein. Finally, we demonstrate that this method is reproducible, can be standardized and can reliably and rapidly quantify transfection efficiency, reducing assay costs and increasing throughput while increasing data robustness. PMID:28863132
Quantum correlations in a family of bipartite separable qubit states
NASA Astrophysics Data System (ADS)
Xie, Chuanmei; Liu, Yimin; Chen, Jianlan; Zhang, Zhanjun
2017-03-01
Quantum correlations (QCs) in some separable states have been proposed as a key resource for certain quantum communication tasks and quantum computational models without entanglement. In this paper, a family of nine-parameter separable states, obtained from an arbitrary mixture of two sets of bi-qubit product pure states, is considered. QCs in these separable states are studied analytically or numerically using four QC quantifiers: measurement-induced disturbance (Luo in Phys Rev A 77:022301, 2008), ameliorated MID (Girolami et al. in J Phys A Math Theor 44:352002, 2011), quantum dissonance (DN) (Modi et al. in Phys Rev Lett 104:080501, 2010), and new quantum dissonance (Rulli in Phys Rev A 84:042109, 2011). First, an inherent symmetry of the separable states under consideration is revealed: any of the nine-parameter separable states considered in this paper can be transformed to a three-parameter kernel state via a certain local unitary operation. Then, four different QC expressions are concretely derived with the four QC quantifiers. Furthermore, comparative studies of the QCs are presented and analyzed, and some distinct features are exposed. We find that, in the framework of all four QC quantifiers, the more mixed the original two pure product states, the larger the QCs of the resulting separable states. Our results reveal some intrinsic features of QCs in separable systems in quantum information.
Revisiting the assessment of semen viscosity and its relationship to leucocytospermia.
Flint, M; du Plessis, S S; Menkveld, R
2014-10-01
With infertility challenges posing an obstacle to many couples, the extension of variables used to assess male fertility is an important line of research. At the Reproductive Biology Unit where the study was undertaken, a considerable proportion of male patients seeking fertility assessment presented with hyperviscous semen samples and elevated concentrations of leucocytes. Although viscosity is included as part of a routine spermiogram, its semiquantitative assessment raises considerable concern. The study was undertaken to evaluate the quantification of semen viscosity in centipoise (cP) and to investigate whether a correlation exists between hyperviscosity and leucocytospermia. A total of 200 semen samples were assessed from a sample cohort of two population groups: 162 male patients undergoing fertility assessment and 38 volunteer donors. Semen viscosity was determined by measuring the filling time of a capillary-loaded Leja chamber and quantifying the viscosity in cP. Leucocytes were identified histochemically with a leucocyte peroxidase test. The viscosity when quantified in cP was significantly higher in the peroxidase-positive sample group (9.01 ± 0.49 vs. 7.39 ± 0.23 cP; P < 0.005). The introduction of a more accurate method of quantifying viscosity may help to identify, diagnose and treat patients suffering from leucocytospermia and ultimately enhance their fertility potential. © 2013 Blackwell Verlag GmbH.
Katharopoulos, Efstathios; Touloupi, Katerina; Touraki, Maria
2016-08-01
The present study describes the development of a simple and efficient screening system that allows identification and quantification of nine bacteriocins produced by Lactococcus lactis. Cell-free L. lactis extracts presented a broad spectrum of antimicrobial activity against Gram-negative bacteria, Gram-positive bacteria, and fungi. Characterization of their sensitivity to pH and heat showed that the extracts retained their antibacterial activity at extreme pH values and over a wide temperature range. The loss of antibacterial activity following treatment of the extracts with lipase or protease suggests a lipoproteinaceous nature of the produced antimicrobials. The extracts were subjected to a purification protocol that employs a two-phase extraction using ammonium sulfate precipitation and organic solvent precipitation, followed by ion exchange chromatography, solid phase extraction and HPLC. In the nine fractions that presented antimicrobial activity, bacteriocins were quantified by the turbidimetric method using a standard curve of nisin and by the HPLC method with nisin as the external standard, with both methods producing comparable results. Turbidimetry is well suited to the qualitative determination of bacteriocins, but the only method able to both separate and quantify the bacteriocins with increased sensitivity, accuracy, and precision is HPLC. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Knightes, C. D.; Bouchard, D.; Zepp, R. G.; Henderson, W. M.; Han, Y.; Hsieh, H. S.; Avant, B. K.; Acrey, B.; Spear, J.
2017-12-01
The unique properties of engineered nanomaterials have led to their increased production and potential release into the environment. Currently available environmental fate models developed for traditional contaminants are limited in their ability to simulate nanomaterials' environmental behavior. This is due to an incomplete understanding and representation of the processes governing nanomaterial distribution in the environment and to scarce empirical data quantifying the interaction of nanomaterials with environmental surfaces. The well-known Water Quality Analysis Simulation Program (WASP) was updated to incorporate nanomaterial-specific processes, specifically hetero-aggregation with particulate matter. In parallel with this effort, laboratory studies were used to quantify parameter values necessary for the governing processes in surface waters. This presentation will discuss the recent developments in the new architecture for WASP8 and the newly constructed Advanced Toxicant Module. The module includes advanced algorithms for increased numbers of state variables: chemicals, solids, dissolved organic matter, pathogens, temperature, and salinity. This presentation will focus specifically on the incorporation of nanomaterials, with applications to the fate and transport of hypothetical releases of Multi-Walled Carbon Nanotubes (MWCNT) and Graphene Oxide (GO) into the headwaters of a southeastern US coastal plains river. While this presentation focuses on nanomaterials, the Advanced Toxicant Module can also simulate metals and organic contaminants.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, R.; Gibson, D.
This paper draws heavily on the results of case studies in Bolivia, Costa Rica, and Ecuador to explain how sectoral policies have tilted land use decisions against forestry and in favor of agriculture, and to present estimates of the economic development effects of those decisions. The paper summarizes information on forests and forest industries of the three countries, and it describes the framework within which policies are designed. It presents the effects of sectoral policies on land use and forest management, and then quantifies and discusses economic costs of relevant sectoral policies. Conclusions and recommendations for policy reform are offered.
The aggregate complexity of decisions in the game of Go
NASA Astrophysics Data System (ADS)
Harré, M. S.; Bossomaier, T.; Gillett, A.; Snyder, A.
2011-04-01
Artificial intelligence (AI) research is fast approaching, or perhaps has already reached, a bottleneck whereby further advancement towards practical human-like reasoning in complex tasks needs further quantified input from large studies of human decision-making. Previous studies in psychology, for example, often rely on relatively small cohorts and very specific tasks. These studies have strongly influenced some of the core notions in AI research such as reinforcement learning and the exploration versus exploitation paradigm. With the goal of contributing to this direction in AI development, we present our findings on the evolution towards world-class decision-making across large cohorts of subjects in the formidable game of Go. Some of these findings directly support previous work on how experts develop their skills, but we also report on several previously unknown aspects of the development of expertise that suggest new avenues for AI research to explore. In particular, at the level of play that has so far eluded current AI systems for Go, we are able to quantify the lack of 'predictability' of experts and how this changes with their level of skill.
The scale invariant generator technique for quantifying anisotropic scale invariance
NASA Astrophysics Data System (ADS)
Lewis, G. M.; Lovejoy, S.; Schertzer, D.; Pecknold, S.
1999-11-01
Scale invariance is rapidly becoming a new paradigm for geophysics. However, little attention has been paid to the anisotropy that is invariably present in geophysical fields in the form of differential stratification and rotation, texture and morphology. In order to account for scaling anisotropy, the formalism of generalized scale invariance (GSI) was developed. Until now there has existed only a single fairly ad hoc GSI analysis technique valid for studying differential rotation. In this paper, we use a two-dimensional representation of the linear approximation to generalized scale invariance, to obtain a much improved technique for quantifying anisotropic scale invariance called the scale invariant generator technique (SIG). The accuracy of the technique is tested using anisotropic multifractal simulations and error estimates are provided for the geophysically relevant range of parameters. It is found that the technique yields reasonable estimates for simulations with a diversity of anisotropic and statistical characteristics. The scale invariant generator technique can profitably be applied to the scale invariant study of vertical/horizontal and space/time cross-sections of geophysical fields as well as to the study of the texture/morphology of fields.
Gut instinct: a diagnostic tool?
Iqbal, I Z; Kara, N; Hartley, C
2015-04-01
It is generally accepted that with experience clinicians develop the ability to identify patients who present with malignancy prior to a formal diagnosis. This ability cannot be quantified, nor is it a plausible substitute for investigation. This study aimed to evaluate the association between instinct and head and neck cancer diagnosis. A prospective study of patients requiring urgent diagnostic procedures for suspected cancer between August and December 2010 was performed. Risk factors, symptoms, signs and the clinician's impression were recorded. These were graded and subsequently correlated with histology findings. Twenty-seven patients, with a mean age of 62.2 years, underwent a diagnostic procedure. Thirty per cent of patients were referred under the two-week pathway and 18.5 per cent had a previous history of head and neck cancer. A diagnosis of cancer was made in 37 per cent of patients. There was a positive correlation between clinical suspicion and cancer diagnosis (Kendall's tau-b = 0.648749). This study highlights the importance of clinical suspicion in cancer diagnosis. Although clinical suspicion cannot be quantified, it should be regarded as an integral part of patient assessment.
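For readers unfamiliar with the statistic reported above, the following sketch shows how a Kendall's tau-b correlation between an ordinal clinical-suspicion grade and a binary histology outcome can be computed with SciPy. The data are illustrative only and are not the study's cohort.

```python
from scipy.stats import kendalltau

# Illustrative data only: clinician suspicion graded 1-5 and histology
# outcome coded 0 (benign) / 1 (malignant) for a small cohort.
suspicion = [1, 2, 2, 3, 4, 5, 5, 3, 1, 4]
cancer =    [0, 0, 0, 0, 1, 1, 1, 0, 0, 1]

tau, p_value = kendalltau(suspicion, cancer)   # SciPy's default variant is tau-b
print(f"Kendall's tau-b = {tau:.3f}, p = {p_value:.3f}")
```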
Biesiekierski, J R; Rosella, O; Rose, R; Liels, K; Barrett, J S; Shepherd, S J; Gibson, P R; Muir, J G
2011-04-01
Wholegrain cereals and grains contain a wide range of potentially protective factors that are relevant to gastrointestinal health. The prebiotics best studied are fructans [fructooligosaccharides (FOS), inulin] and galactooligosaccharides (GOS). These and other short-chain carbohydrates can also be poorly absorbed in the small intestine (named fermentable oligo-, di- and monosaccharides and polyols; FODMAPs) and may have important implications for the health of the gut. In the present study, FODMAPs, including fructose in excess of glucose, FOS (nystose, kestose), GOS (raffinose, stachyose) and sugar polyols (sorbitol, mannitol), were quantified using high-performance liquid chromatography with an evaporative light scattering detector. Total fructan was quantified using an enzymic hydrolysis method. Fifty-five commonly consumed grains, breakfast cereals, breads, pulses and biscuits were analysed. Total fructans were the most common short-chain carbohydrates present in cereal grain products and ranged (g per portion as eaten) from 1.12 g in couscous to 0 g in rice; 0.6 g in dark rye bread to 0.07 g in spelt bread; 0.96 g in wheat-free muesli to 0.11 g in oats; and 0.81 g in muesli fruit bar to 0.05 g in potato chips. Raffinose and stachyose were most common in pulses. Composition tables including FODMAPs and prebiotics (FOS and GOS) that are naturally present in food will greatly assist research aimed at understanding their physiological role in the gut. © 2011 The Authors. Journal compilation © 2011 The British Dietetic Association Ltd.
NASA Astrophysics Data System (ADS)
Rodigast, Maria; Mutzel, Anke; Herrmann, Hartmut
2017-03-01
Methylglyoxal forms oligomeric compounds in the atmospheric aqueous particle phase, which could contribute significantly to the formation of aqueous secondary organic aerosol (aqSOA). Thus far, no suitable method for the quantification of methylglyoxal oligomers is available despite the great effort spent on structure elucidation. In the present study a simplified method was developed to quantify heat-decomposable methylglyoxal oligomers as a sum parameter. The method is based on the thermal decomposition of oligomers into methylglyoxal monomers. Formed methylglyoxal monomers were detected using PFBHA (o-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine hydrochloride) derivatisation and gas chromatography-mass spectrometry (GC/MS) analysis. The method development focused on the heating time (varied between 15 and 48 h), the pH during the heating process (pH = 1-7), and the heating temperature (50, 100 °C). The optimised values of these method parameters are presented. The developed method was applied to quantify heat-decomposable methylglyoxal oligomers formed during the OH-radical oxidation of 1,3,5-trimethylbenzene (TMB) in the Leipzig aerosol chamber (LEipziger AerosolKammer, LEAK). Oligomer formation was investigated as a function of seed particle acidity and relative humidity. Heat-decomposable methylglyoxal oligomers accounted for up to 8% of the produced organic particle mass, highlighting the importance of oligomers formed solely from methylglyoxal for SOA formation. Overall, the present study provides a new and suitable method for the quantification of heat-decomposable methylglyoxal oligomers in the aqueous particle phase.
NASA Astrophysics Data System (ADS)
Allec, N.; Abbaszadeh, S.; Scott, C. C.; Lewin, J. M.; Karim, K. S.
2012-12-01
In contrast-enhanced mammography (CEM), the dual-energy dual-exposure technique, which can leverage existing conventional mammography infrastructure, relies on acquiring the low- and high-energy images using two separate exposures. The finite time between image acquisition leads to motion artifacts in the combined image. Motion artifacts can lead to greater anatomical noise in the combined image due to increased mismatch of the background tissue in the images to be combined, however the impact has not yet been quantified. In this study we investigate a method to include motion artifacts in the dual-energy noise and performance analysis. The motion artifacts are included via an extended cascaded systems model. To validate the model, noise power spectra of a previous dual-energy clinical study are compared to that of the model. The ideal observer detectability is used to quantify the effect of motion artifacts on tumor detectability. It was found that the detectability can be significantly degraded when motion is present (e.g., detectability of 2.5 mm radius tumor decreased by approximately a factor of 2 for translation motion on the order of 1000 μm). The method presented may be used for a more comprehensive theoretical noise and performance analysis and fairer theoretical performance comparison between dual-exposure techniques, where motion artifacts are present, and single-exposure techniques, where low- and high-energy images are acquired simultaneously and motion artifacts are absent.
Allec, N; Abbaszadeh, S; Scott, C C; Lewin, J M; Karim, K S
2012-12-21
In contrast-enhanced mammography (CEM), the dual-energy dual-exposure technique, which can leverage existing conventional mammography infrastructure, relies on acquiring the low- and high-energy images using two separate exposures. The finite time between image acquisition leads to motion artifacts in the combined image. Motion artifacts can lead to greater anatomical noise in the combined image due to increased mismatch of the background tissue in the images to be combined, however the impact has not yet been quantified. In this study we investigate a method to include motion artifacts in the dual-energy noise and performance analysis. The motion artifacts are included via an extended cascaded systems model. To validate the model, noise power spectra of a previous dual-energy clinical study are compared to that of the model. The ideal observer detectability is used to quantify the effect of motion artifacts on tumor detectability. It was found that the detectability can be significantly degraded when motion is present (e.g., detectability of 2.5 mm radius tumor decreased by approximately a factor of 2 for translation motion on the order of 1000 μm). The method presented may be used for a more comprehensive theoretical noise and performance analysis and fairer theoretical performance comparison between dual-exposure techniques, where motion artifacts are present, and single-exposure techniques, where low- and high-energy images are acquired simultaneously and motion artifacts are absent.
Craniofacial structure alterations of foetuses from folic acid deficient pregnant mice.
Maldonado, Estela; López, Yamila; Herrera, Manuel; Martínez-Sanz, Elena; Martínez-Álvarez, Concepción; Pérez-Miguelsanz, Juliana
2018-03-28
Craniofacial development in mammals is a complex process that involves a coordinated series of molecular and morphogenetic events. Folic acid (FA) deficiency has historically been associated with congenital spinal cord malformations, but the effect that a maternal diet deficient in FA has on the development of other structures has been poorly explored. In the present study, the objective was to describe and quantify the alterations of craniofacial structures presented in mouse foetuses from dams fed a FA-deficient (FAD) diet compared with controls that were given a regular maternal diet. E17 mouse foetuses were removed from dams that were fed a control diet or a FAD diet for several weeks. Foetuses from dams on the FAD diet were selected for the study when they showed an altered tongue or mandible. Histological sections were used to quantify the dimensions of the head, tongue, mandibular bone and masseter muscle areas using ImageJ software. The muscles of the tongue, suprahyoid muscles, lingual septum, submandibular ducts, and lingual arteries were also analysed. The heads of malformed foetuses were smaller than the heads of the controls, and they showed different types of malformations: microglossia with micrognathia (some of which were combined with cleft palate) and aglossia with either micrognathia or agnathia. Lingual and suprahyoid muscles were affected in different forms and degrees. We also found alterations in the lingual arteries and in the ducts of the submandibular glands. In summary, pharyngeal arch-derived structures were affected, and the main malformations observed corroborate the vulnerability of cranial neural crest cells to FA deficiency. The present study reveals alterations in the development of craniofacial structures in FAD foetuses and provides a new focus for the role of FA during embryological development. Copyright © 2018 Elsevier GmbH. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, E.; Gonder, J.; Lopp, S.
It is widely understood that cold-temperature engine operation negatively impacts vehicle fuel use due to a combination of increased friction (high-viscosity engine oil) and temporary enrichment (accelerated catalyst heating). However, relatively little effort has been dedicated to thoroughly quantifying these impacts across a large number of driving cycles and ambient conditions. This work leverages high-quality dynamometer data collected at various ambient conditions to develop a modeling framework for quantifying engine cold-start fuel penalties over a wide array of real-world usage profiles. Additionally, mitigation strategies including energy retention and exhaust heat recovery are explored with benefits quantified for each approach.
Processing of Numerical and Proportional Quantifiers
ERIC Educational Resources Information Center
Shikhare, Sailee; Heim, Stefan; Klein, Elise; Huber, Stefan; Willmes, Klaus
2015-01-01
Quantifier expressions like "many" and "at least" are part of a rich repository of words in language representing magnitude information. The role of numerical processing in comprehending quantifiers was studied in a semantic truth value judgment task, asking adults to quickly verify sentences about visual displays using…
Rajamurugan, R; Selvaganabathy, N; Kumaravel, S; Ramamurthy, Ch; Sujatha, V; Suresh Kumar, M; Thirunavukkarasu, C
2011-12-01
Vernonia cinerea (L.) Less [Compositae (Asteraceae)] is used traditionally for several medical purposes such as inflammation, pain, fever, and cancer. The present study identified the bioactive constituents in the methanol extract of Vernonia cinerea leaf and evaluated its antioxidant activity and acute toxicity. The identification of phytochemicals was accomplished by GC-MS and the major antioxidant phenolic compounds in the extract were quantified by HPTLC analysis. To quantify the essential elements, atomic absorption spectrophotometric analysis was carried out. Total phenol and flavonoid content was measured by Folin-Ciocalteau reagent and 2% aluminium chloride, respectively. GC-MS analysis identified the presence of 27 phytoconstituents. The predominant phenolic compound in the extract as quantified by HPTLC was gallic acid (1.92 mg/g) followed by rutin (0.705 mg/g), quercetin (0.173 mg/g), caffeic acid (0.082 mg/g) and ferulic acid (0.033 mg/g). The following elements were quantified: Fe (0.050 ppm), Mn (0.022 ppm), Co (0.0180 ppm), Pb (0.029 ppm), Hg (3.885 ppm) and Se (4.5240 ppm). The antioxidant activity of the extract increased with increasing concentration, and the correlations (r²) for all in vitro assays were satisfactory. V. cinerea extract has significant (p < 0.05) antiradical activity. Hence, V. cinerea may have potential medicinal value and can be used in the formulation of pharmacological products for degenerative diseases.
New Approaches to Quantifying Transport Model Error in Atmospheric CO2 Simulations
NASA Technical Reports Server (NTRS)
Ott, L.; Pawson, S.; Zhu, Z.; Nielsen, J. E.; Collatz, G. J.; Gregg, W. W.
2012-01-01
In recent years, much progress has been made in observing CO2 distributions from space. However, the use of these observations to infer source/sink distributions in inversion studies continues to be complicated by difficulty in quantifying atmospheric transport model errors. We will present results from several different experiments designed to quantify different aspects of transport error using the Goddard Earth Observing System, Version 5 (GEOS-5) Atmospheric General Circulation Model (AGCM). In the first set of experiments, an ensemble of simulations is constructed using perturbations to parameters in the model's moist physics and turbulence parameterizations that control sub-grid scale transport of trace gases. Analysis of the ensemble spread and scales of temporal and spatial variability among the simulations allows insight into how parameterized, small-scale transport processes influence simulated CO2 distributions. In the second set of experiments, atmospheric tracers representing model error are constructed using observation minus analysis statistics from NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA). The goal of these simulations is to understand how errors in large scale dynamics are distributed, and how they propagate in space and time, affecting trace gas distributions. These simulations will also be compared to results from NASA's Carbon Monitoring System Flux Pilot Project that quantified the impact of uncertainty in satellite constrained CO2 flux estimates on atmospheric mixing ratios to assess the major factors governing uncertainty in global and regional trace gas distributions.
Area Under the Curve as a Novel Metric of Behavioral Economic Demand for Alcohol
Amlung, Michael; Yurasek, Ali; McCarty, Kayleigh N.; MacKillop, James; Murphy, James G.
2015-01-01
Behavioral economic purchase tasks can be readily used to assess demand for a number of addictive substances including alcohol, tobacco and illicit drugs. However, several methodological limitations associated with the techniques used to quantify demand may reduce the utility of demand measures. In the present study, we sought to introduce area under the curve (AUC), commonly used to quantify degree of delay discounting, as a novel index of demand. A sample of 207 heavy drinking college students completed a standard alcohol purchase task and provided information about typical weekly drinking patterns and alcohol-related problems. Level of alcohol demand was quantified using AUC, which reflects the entire amount of consumption across all drink prices, as well as the standard demand indices (e.g., intensity, breakpoint, Omax, Pmax, and elasticity). Results indicated that AUC was significantly correlated with each of the other demand indices (rs = .42–.92), with particularly strong associations with Omax (r = .92). In regression models, AUC and intensity were significant predictors of weekly drinking quantity and AUC uniquely predicted alcohol-related problems, even after controlling for drinking level. In a parallel set of analyses, Omax also predicted drinking quantity and alcohol problems, although Omax was not a unique predictor of the latter. These results offer initial support for using AUC as an index of alcohol demand. Additional research is necessary to further validate this approach and to examine its utility in quantifying demand for other addictive substances such as tobacco and illicit drugs. PMID:25895013
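A minimal sketch of the AUC metric proposed above, assuming prices and consumption are normalized to their maxima (as is common for discounting AUC) and integrated with the trapezoidal rule; the purchase-task data below are illustrative, not the study's.

```python
import numpy as np

# Illustrative alcohol purchase task data: drink prices ($) and reported
# consumption (drinks) at each price.
prices = np.array([0.0, 0.25, 0.50, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0, 9.0])
drinks = np.array([10.0, 10,  9,    8,   7,   6,   4,   2,   1,   0])

# Normalize both axes so the AUC falls between 0 and 1.
x = prices / prices.max()
y = drinks / drinks.max()

# Trapezoidal area under the normalized demand curve.
auc = np.sum((y[1:] + y[:-1]) / 2.0 * np.diff(x))
print(f"Demand AUC = {auc:.3f}")
```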
Templeton, Justin P; Struebing, Felix L; Lemmon, Andrew; Geisert, Eldon E
2014-11-01
The present article introduces a new and easy to use counting application for the Apple iPad. The application "ImagePAD" takes advantage of the advanced user interface features offered by the Apple iOS platform, simplifying the rather tedious task of quantifying features in anatomical studies. For example, the image under analysis can be easily panned and zoomed using iOS-supported multi-touch gestures without losing the spatial context of the counting task, which is extremely important for ensuring count accuracy. This application allows one to quantify up to 5 different types of objects in a single field and output the data in a tab-delimited format for subsequent analysis. We describe two examples of the use of the application: quantifying axons in the optic nerve of the C57BL/6J mouse and determining the percentage of cells labeled with NeuN or ChAT in the retinal ganglion cell layer. For the optic nerve, contiguous images at 60× magnification were taken and transferred onto an Apple iPad. Axons were counted by tapping on the touch-sensitive screen using ImagePAD. Nine optic nerves were sampled and the number of axons in the nerves ranged from 38,872 axons to 50,196 axons with an average of 44,846 axons per nerve (SD = 3980 axons). Copyright © 2014 Elsevier Ltd. All rights reserved.
Interconnections Seam Study
NREL Energy Analysis
Through the Interconnections Seam Study, NREL is quantifying the value of strengthening the connections (or seams) between the interconnections.
Junior, Benedito Roberto Alvarenga; Soares, Frederico Luis Felipe; Ardila, Jorge Armando; Durango, Luis Guillermo Cuadrado; Forim, Moacir Rossi; Carneiro, Renato Lajarim
2018-01-05
The aim of this work was to quantify B-complex vitamins in pharmaceutical samples by the surface-enhanced Raman spectroscopy (SERS) technique using a gold colloid substrate. Synthesis of gold nanoparticles was performed according to an adapted Turkevich method. Initial assays suggested the orientation of the molecules on the gold nanoparticle surface. A central composite design was performed to obtain the highest SERS signal for nicotinamide and riboflavin. The evaluated parameters in the experimental design were the volume of AuNPs, the concentration of vitamins and the sodium chloride concentration. The best condition for nicotinamide was NaCl 2.3×10⁻³ mol L⁻¹ and 700 μL of AuNPs colloid, and this same condition proved adequate to quantify thiamine. The experimental design for riboflavin showed the best condition at NaCl 1.15×10⁻² mol L⁻¹ and 2.8 mL of AuNPs colloid. It was possible to quantify thiamine and nicotinamide in the presence of other vitamins and excipients in two solid multivitamin formulations using the standard addition procedure. The standard addition curves presented R² higher than 0.96 for both nicotinamide and thiamine, at orders of magnitude of 10⁻⁷ and 10⁻⁸ mol L⁻¹, respectively. The nicotinamide content in a cosmetic gel sample was also quantified by direct analysis, presenting R² of 0.98. A Student's t-test showed no significant difference relative to the HPLC method. Despite the experimental design performed for riboflavin, its quantification in the commercial samples was not possible. Copyright © 2017 Elsevier B.V. All rights reserved.
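The standard addition procedure mentioned above can be reduced to a linear fit whose x-intercept magnitude estimates the analyte concentration in the original sample. The sketch below illustrates this with assumed, illustrative numbers; it is not the paper's data or processing pipeline.

```python
import numpy as np

# Illustrative standard-addition series: concentration of standard added to
# the sample (mol/L) and the corresponding SERS band intensity (a.u.).
added = np.array([0.0, 2e-8, 4e-8, 6e-8, 8e-8])
signal = np.array([310.0, 505.0, 712.0, 908.0, 1103.0])

slope, intercept = np.polyfit(added, signal, 1)

# The unknown concentration is the magnitude of the x-intercept of the fit.
c_sample = intercept / slope
r2 = np.corrcoef(added, signal)[0, 1] ** 2
print(f"estimated analyte concentration ~ {c_sample:.2e} mol/L (R^2 = {r2:.3f})")
```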
Supporting statement for community study of human response to aircraft noise
NASA Technical Reports Server (NTRS)
Dempsey, T. K.; Deloach, R.; Stephens, D. G.
1980-01-01
A study plan for quantifying the relationship between human annoyance and the noise level of individual aircraft events is presented. The validity of various noise descriptors or noise metrics for quantifying aircraft noise levels is assessed.
Wasserberg, Gideon; White, L; Bullard, A; King, J; Maxwell, R
2013-09-01
For organisms lacking parental care and where larval dispersal is limited, oviposition site selection decisions are critical fitness-enhancing choices. However, studies usually do not consider the interdependence of food availability and predation risk in these decisions. In this study, we evaluated the effect of food level on the oviposition behavior of Aedes albopictus (Skuse) in the presence or absence of a nonlethal predator (caged dragonfly nymph). We also attempted to quantify the perceived cost of predation to ovipositing mosquitoes. Mosquitoes were presented with oviposition cups containing four levels of larval food (fermented leaf infusion) with or without a caged libellulid nymph. By titrating larval food, we estimated the amount of food needed to attract the female mosquito to oviposit in the riskier habitat. As expected, oviposition rate increased with food level and decreased in the presence of a predator. However, the effect of food level did not differ between predator treatments. By calculating the difference in the amount of food for points of equal oviposition rate on the predator-present and predator-absent regression lines, we estimated the cost of predation risk to be 1950 colony-forming units per milliliter. Our study demonstrated the importance of considering the possible interdependence of predation risk and food abundance for oviposition-site-seeking insects. This study also quantified the perceived cost of predation and found it to be relatively low, a fact with positive implications for biological control.
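A minimal sketch of the calculation described above: fit oviposition rate against food level separately for predator-present and predator-absent treatments, then read off the horizontal offset between the two regression lines at an equal oviposition rate. The linear model form and all numbers are assumptions for illustration, not the study's data.

```python
import numpy as np

# Illustrative data: larval food level (CFU/mL) and mean eggs laid per cup.
food = np.array([500.0, 1500.0, 3000.0, 6000.0])
eggs_no_pred = np.array([12.0, 22.0, 35.0, 58.0])
eggs_pred = np.array([4.0, 13.0, 26.0, 49.0])

b1, a1 = np.polyfit(food, eggs_no_pred, 1)   # predator-absent line: y = a1 + b1*food
b2, a2 = np.polyfit(food, eggs_pred, 1)      # predator-present line: y = a2 + b2*food

# For an equal oviposition rate y, the extra food needed under predation risk
# is the horizontal distance between the two lines at that rate.
y = 30.0
cost_cfu = (y - a2) / b2 - (y - a1) / b1
print(f"perceived cost of predation at rate {y}: ~{cost_cfu:.0f} CFU/mL")
```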
Laboratory requirements for in-situ and remote sensing of suspended material
NASA Technical Reports Server (NTRS)
Kuo, C. Y.; Cheng, R. Y. K.
1978-01-01
Recommendations for laboratory and in-situ measurements required for remote sensing of suspended material are presented. This study investigates the properties of the suspended materials, factors influencing the upwelling radiance, and the various types of remote sensing techniques. Calibration and correlation procedures are given to obtain the accuracy necessary to quantify the suspended materials by remote sensing. In addition, the report presents a survey of the national need for sediment data, the agencies that deal with and require the data of suspended sediment, and a summary of some recent findings of sediment measurements.
Towards quantifying dynamic human-human physical interactions for robot assisted stroke therapy.
Mohan, Mayumi; Mendonca, Rochelle; Johnson, Michelle J
2017-07-01
Human-robot interaction is a prominent field of robotics today. Knowledge of human-human physical interaction can prove vital in creating dynamic physical interactions between humans and robots. Most of the current work studying this interaction has been from a haptic perspective. In this paper, we present metrics that can be used to identify, from kinematics, whether a physical interaction occurred between two people. We present a simple activity of daily living (ADL) task that involves such an interaction. We show that these metrics can successfully identify interactions.
NASA Astrophysics Data System (ADS)
Su, Y. H.; Chen, K. S.; Roberts, D. C.; Spearing, S. M.
2001-11-01
The large-deflection analysis of a pre-stressed annular plate with a central rigid boss subjected to axisymmetric loading is presented. The factors affecting the transition from plate behaviour to membrane behaviour (e.g. thickness, in-plane tension and material properties) are studied. The effects of boss size and pre-tension on the effective stiffness of the plate are investigated. The extent of the bending boundary layers at the edges of the plate is quantified. All results are presented in non-dimensional form. The design implications for microelectromechanical system components are assessed.
Detecting Nano-Scale Vibrations in Rotating Devices by Using Advanced Computational Methods
del Toro, Raúl M.; Haber, Rodolfo E.; Schmittdiel, Michael C.
2010-01-01
This paper presents a computational method for detecting vibrations related to eccentricity in ultra precision rotation devices used for nano-scale manufacturing. The vibration is indirectly measured via a frequency domain analysis of the signal from a piezoelectric sensor attached to the stationary component of the rotating device. The algorithm searches for particular harmonic sequences associated with the eccentricity of the device rotation axis. The detected sequence is quantified and serves as input to a regression model that estimates the eccentricity. A case study presents the application of the computational algorithm during precision manufacturing processes. PMID:22399918
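A minimal sketch of the kind of frequency-domain harmonic search described above: sum the spectral power of a sensor signal at the first few harmonics of the rotation frequency to obtain an eccentricity-related feature. The function name, parameters and synthetic test signal are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def eccentricity_feature(signal, fs, rotation_hz, n_harmonics=5, tol_hz=0.5):
    """Sum the peak spectral power near the first n harmonics of the rotation frequency."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    feature = 0.0
    for k in range(1, n_harmonics + 1):
        band = np.abs(freqs - k * rotation_hz) <= tol_hz
        feature += np.max(power[band], initial=0.0)   # 0 if no bin falls in the band
    return feature

# Synthetic test: 50 Hz rotation with harmonics buried in noise, sampled at 10 kHz.
fs, f0 = 10_000, 50.0
t = np.arange(0, 1, 1 / fs)
sig = sum(0.5 / k * np.sin(2 * np.pi * k * f0 * t) for k in (1, 2, 3))
sig = sig + 0.1 * np.random.randn(t.size)
print(eccentricity_feature(sig, fs, f0))
```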
Laboratory requirements for in-situ and remote sensing of suspended material
NASA Technical Reports Server (NTRS)
Kuo, C. Y.; Cheng, R. Y. K.
1976-01-01
Recommendations for laboratory and in-situ measurements required for remote sensing of suspended material are presented. This study investigates the properties of the suspended materials, factors influencing the upwelling radiance, and the various types of remote sensing techniques. Calibration and correlation procedures are given to obtain the accuracy necessary to quantify the suspended materials by remote sensing. In addition, the report presents a survey of the national need for sediment data, the agencies that deal with and require the data of suspended sediment, and a summary of some recent findings of sediment measurements.
A prioritized set of physiological measurements for future spaceflight experiments
NASA Technical Reports Server (NTRS)
1978-01-01
A set of desired experimental measurements to be obtained in future spaceflights in four areas of physiological investigation is identified. The basis for identifying the measurements was the physiological systems analysis performed on Skylab data and related ground-based studies. An approach for prioritizing the measurement list is identified and discussed with the use of examples. A prioritized measurement list is presented for each of the following areas: cardiopulmonary, fluid-renal and electrolyte, hematology and immunology, and musculoskeletal. Also included is a list of interacting stresses and other factors present in spaceflight experiments whose effects may need to be quantified.
Tagging Water Sources in Atmospheric Models
NASA Technical Reports Server (NTRS)
Bosilovich, M.
2003-01-01
Tagging of water sources in atmospheric models allows for quantitative diagnostics of how water is transported from its source region to its sink region. In this presentation, we review how this methodology is applied to global atmospheric models. We will present several applications of the methodology. In one example, the regional sources of water for the North American Monsoon system are evaluated by tagging the surface evaporation. In another example, the tagged water is used to quantify the global water cycling rate and residence time. We will also discuss the need for more research and the importance of these diagnostics in water cycle studies.
SU-E-J-158: Audiovisual Biofeedback Reduces Image Artefacts in 4DCT: A Digital Phantom Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pollock, S; Kipritidis, J; Lee, D
2015-06-15
Purpose: Irregular breathing motion has a deleterious impact on 4DCT image quality. The breathing guidance system audiovisual biofeedback (AVB) is designed to improve breathing regularity; however, its impact on 4DCT image quality has yet to be quantified. The purpose of this study was to quantify the impact of AVB on thoracic 4DCT image quality by utilizing the digital eXtended Cardiac Torso (XCAT) phantom driven by lung tumor motion patterns. Methods: 2D tumor motion was obtained from 4 lung cancer patients under two breathing conditions: (i) without breathing guidance (free breathing), and (ii) with guidance (AVB). There were two breathing sessions, yielding 8 tumor motion traces. This tumor motion was synchronized with the XCAT phantom to simulate 4DCT acquisitions under two acquisition modes: (1) cine mode, and (2) prospective respiratory-gated mode. Motion regularity was quantified by the root mean square error (RMSE) of displacement. The number of artefacts was visually assessed for each 4DCT and summed up for each breathing condition. Inter-session anatomic reproducibility was quantified by the mean absolute difference (MAD) between the Session 1 4DCT and Session 2 4DCT. Results: AVB improved tumor motion regularity by 30%. In cine mode, the number of artefacts was reduced from 61 in free breathing to 40 with AVB, in addition to AVB reducing the MAD by 34%. In gated mode, the number of artefacts was reduced from 63 in free breathing to 51 with AVB, in addition to AVB reducing the MAD by 23%. Conclusion: This was the first study to compare the impact of breathing guidance on 4DCT image quality with that of free breathing, with AVB reducing the number of artefacts present in 4DCT images in addition to improving inter-session anatomic reproducibility. Results thus far suggest that breathing guidance interventions could have implications for improving radiotherapy treatment planning and interfraction reproducibility.
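A minimal sketch of an RMSE-of-displacement regularity metric like the one used above, here measuring deviation of a displacement trace from its own mean position; the choice of reference and the example traces are assumptions for illustration, not the study's data.

```python
import numpy as np

def displacement_rmse(trace, reference=None):
    """Root-mean-square error of a displacement trace (mm).

    If no reference trace is supplied, deviation is measured from the
    trace's own mean position, one simple notion of breathing regularity.
    """
    trace = np.asarray(trace, dtype=float)
    ref = np.full_like(trace, trace.mean()) if reference is None else np.asarray(reference, float)
    return float(np.sqrt(np.mean((trace - ref) ** 2)))

free_breathing = [0.0, 4.2, 8.9, 3.1, -0.8, 5.5, 10.2, 2.0]   # mm, illustrative
with_avb       = [0.0, 4.0, 8.0, 4.1,  0.2, 4.1,  8.1, 3.9]
print(displacement_rmse(free_breathing), displacement_rmse(with_avb))
```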
Czolowski, Eliza D; Santoro, Renee L; Srebotnjak, Tanja; Shonkoff, Seth B C
2017-08-23
Higher risk of exposure to environmental health hazards near oil and gas wells has spurred interest in quantifying populations that live in proximity to oil and gas development. The available studies on this topic lack consistent methodology and ignore aspects of oil and gas development of value to public health-relevant assessment and decision-making. We aim to present a methodological framework for oil and gas development proximity studies grounded in an understanding of hydrocarbon geology and development techniques. We geospatially overlay locations of active oil and gas wells in the conterminous United States and Census data to estimate the population living in proximity to hydrocarbon development at the national and state levels. We compare our methods and findings with existing proximity studies. Nationally, we estimate that 17.6 million people live within 1,600m (∼1 mi) of at least one active oil and/or gas well. Three of the eight studies overestimate populations at risk from actively producing oil and gas wells by including wells without evidence of production or drilling completion and/or using inappropriate population allocation methods. The remaining five studies, by omitting conventional wells in regions dominated by historical conventional development, significantly underestimate populations at risk. The well inventory guidelines we present provide an improved methodology for hydrocarbon proximity studies by acknowledging the importance of both conventional and unconventional well counts as well as the relative exposure risks associated with different primary production categories (e.g., oil, wet gas, dry gas) and developmental stages of wells. https://doi.org/10.1289/EHP1535.
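A hedged sketch of the geospatial overlay described above using GeoPandas (version 0.10 or later assumed): buffer active wells by 1,600 m, dissolve the buffers into a single footprint, and sum the population of census units whose centroids fall inside. The file names, the population column, and the centroid-allocation rule are illustrative assumptions; the paper discusses several possible allocation choices.

```python
import geopandas as gpd

# Hypothetical inputs: point layer of active wells and a census polygon layer
# with a 'POP' attribute. File and column names are placeholders.
wells = gpd.read_file("active_wells.shp").to_crs(epsg=5070)    # equal-area CRS in meters
blocks = gpd.read_file("census_blocks.shp").to_crs(epsg=5070)

# Dissolve 1,600 m buffers around all wells into a single exposure footprint.
footprint = gpd.GeoDataFrame(geometry=[wells.buffer(1600).unary_union], crs=wells.crs)

# Simple centroid allocation: count a census unit's population if its centroid
# falls inside the footprint (one of several possible allocation rules).
centroids = blocks.copy()
centroids["geometry"] = blocks.centroid
inside = gpd.sjoin(centroids, footprint, predicate="within")
print(f"Population within 1,600 m of an active well: {inside['POP'].sum():,.0f}")
```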
Tortora, Chiara; Meazzini, Maria C; Garattini, Giovanna; Brusati, Roberto
2008-03-01
To evaluate the dental characteristics of patients subjected to a protocol that included early secondary gingivoalveoloplasty (ESGAP). Panoramic radiographs of 87 patients with unilateral cleft lip and palate (UCLP) and 29 with bilateral cleft lip and palate (BCLP) were evaluated. Missing and supernumerary teeth were quantified on the cleft and noncleft sides and in the maxilla and mandible. Crown and root malformations and tooth rotations were also quantified. A subsample in permanent dentition was extrapolated to analyze canine eruption patterns. A total of 48.8% of the UCLP patients presented with missing permanent lateral incisors in the cleft area and 6.1% contralaterally. A total of 4.9% presented with missing second maxillary premolars on the cleft side and 1.2% contralaterally, and 7.3% presented with supernumerary lateral incisors. In the BCLP group, 45% of the cleft sites presented with missing lateral incisors, 25% presented with second maxillary premolar agenesis, and 5% presented with supernumerary lateral incisors. Evaluation of the subsample in permanent dentition showed that 15.5% had canine retention and 4.4% of the canines had to be surgically exposed. A significant association was observed between canine inclination and retention, but not with absence of the lateral incisor. The frequency of dental anomalies in this sample was similar to that of other cleft populations. As surgical trauma has been suggested to damage forming teeth, the results of this study indicate that ESGAP has no detrimental influence on subsequent dental development.
NASA Astrophysics Data System (ADS)
Hohert, Geoffrey; Pahlevaninezhad, Hamid; Lee, Anthony; Lane, Pierre M.
2016-03-01
Endoscopic catheter-based imaging systems that employ a 2-dimensional rotary or 3-dimensional rotary-pullback scanning mechanism require constant angular velocity at the distal tip to ensure correct angular registration of the collected signal. Non-uniform rotational distortion (NURD), often present due to a variety of mechanical issues, can result in inconsistent position and velocity profiles at the tip, limiting the accuracy of any measurements. Since artifacts like NURD are difficult to identify and characterize during tissue imaging, phantoms with well-defined patterns have been used to quantify position and/or velocity error. In this work we present a fast, versatile, and cost-effective method for making fused deposition modeling 3D-printed phantoms for identifying and quantifying NURD errors along an arbitrary user-defined pullback path. Eight evenly-spaced features are present at the same orientation at all points on the path such that deviations from the expected geometry can be quantified for the imaging catheter. The features are printed vertically and then folded together around the path to avoid issues with printer head resolution. This method can be adapted for probes of various diameters and for complex imaging paths with multiple bends. We demonstrate imaging using the 3D-printed phantoms with a 1 mm diameter rotary-pullback OCT catheter and system as a means of objectively evaluating the mechanical performance of similarly constructed probes.
Imaging cochlear soft tissue displacement with coherent x-rays
NASA Astrophysics Data System (ADS)
Rau, Christoph; Richter, Claus-Peter
2015-10-01
At present, imaging of cochlear mechanics at mid-cochlear turns has not been accomplished. Although challenging, this appears possible with partially coherent hard x-rays. The present study shows results from stroboscopic x-ray imaging of a test object at audio frequencies. The vibration amplitudes were quantified. In a different set of experiments, an intact and calcified gerbil temporal bone was used to determine displacements of the reticular lamina, tectorial membrane, and Reissner's membrane with the Lucas-Kanade optical flow algorithm. The experiments validated high-frequency x-ray imaging and imaging in a calcified cochlea. The present work is key for future imaging of cochlear micromechanics at high spatial resolution.
NASA Astrophysics Data System (ADS)
Jakubovic, Raphael; Gupta, Shuarya; Guha, Daipayan; Mainprize, Todd; Yang, Victor X. D.
2017-02-01
Cranial neurosurgical procedures are especially delicate considering that the surgeon must localize the subsurface anatomy with limited exposure and without the ability to see beyond the surface of the surgical field. Surgical accuracy is imperative, as even minor surgical errors can cause major neurological deficits. Traditionally, surgical precision was highly dependent on surgical skill. However, the introduction of intraoperative surgical navigation has shifted the paradigm and become the current standard of care for cranial neurosurgery. Intra-operative image-guided navigation systems are currently used to allow the surgeon to visualize the three-dimensional subsurface anatomy using pre-acquired computed tomography (CT) or magnetic resonance (MR) images. The patient anatomy is fused to the pre-acquired images using various registration techniques, and surgical tools are typically localized using optical tracking methods. Although these techniques positively impact complication rates, surgical accuracy is limited by the accuracy of the navigation system, and as such quantification of surgical error is required. While many different measures of registration accuracy have been presented, true navigation accuracy can only be quantified post-operatively by comparing a ground-truth landmark to the intra-operative visualization. In this study we quantified the accuracy of cranial neurosurgical procedures using a novel optical surface imaging navigation system that visualizes the three-dimensional surface anatomy. A tracked probe was placed on the screws of cranial fixation plates during surgery, and the reported position of the centre of each screw was compared to its co-ordinates in the post-operative CT or MR images, thus quantifying cranial neurosurgical error.
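One simple way to express the post-operative error quantification described above is the Euclidean distance between each navigation-reported screw centre and its ground-truth position on post-operative imaging. The coordinates below are illustrative, not study data.

```python
import numpy as np

# Illustrative paired coordinates (mm): navigation-reported screw centres vs.
# the same screws localized on post-operative CT/MR.
reported = np.array([[12.1, 45.3, 88.0],
                     [30.5, 47.9, 90.2],
                     [55.0, 44.1, 87.5]])
ground_truth = np.array([[12.9, 44.7, 88.6],
                         [31.6, 48.4, 89.5],
                         [54.2, 45.0, 88.3]])

errors = np.linalg.norm(reported - ground_truth, axis=1)   # per-screw error (mm)
print(f"mean error = {errors.mean():.2f} mm, max = {errors.max():.2f} mm")
```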
Newsome, Seth D.; Yeakel, Justin D.; Wheatley, Patrick V.; Tinker, M. Tim
2012-01-01
Ecologists are increasingly using stable isotope analysis to inform questions about variation in resource and habitat use from the individual to community level. In this study we investigate data sets from 2 California sea otter (Enhydra lutris nereis) populations to illustrate the advantages and potential pitfalls of applying various statistical and quantitative approaches to isotopic data. We have subdivided these tools, or metrics, into 3 categories: IsoSpace metrics, stable isotope mixing models, and DietSpace metrics. IsoSpace metrics are used to quantify the spatial attributes of isotopic data that are typically presented in bivariate (e.g., δ13C versus δ15N) 2-dimensional space. We review IsoSpace metrics currently in use and present a technique by which uncertainty can be included to calculate the convex hull area of consumers or prey, or both. We then apply a Bayesian-based mixing model to quantify the proportion of potential dietary sources to the diet of each sea otter population and compare this to observational foraging data. Finally, we assess individual dietary specialization by comparing a previously published technique, variance components analysis, to 2 novel DietSpace metrics that are based on mixing model output. As the use of stable isotope analysis in ecology continues to grow, the field will need a set of quantitative tools for assessing isotopic variance at the individual to community level. Along with recent advances in Bayesian-based mixing models, we hope that the IsoSpace and DietSpace metrics described here will provide another set of interpretive tools for ecologists.
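A minimal sketch of one IsoSpace metric discussed above: the convex hull area of a population's bivariate isotope data, with a simple resampling loop to carry analytical uncertainty into the metric. The data, jitter magnitude, and resampling scheme are illustrative assumptions rather than the authors' exact procedure.

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)

# Illustrative bivariate isotope data (delta13C, delta15N) for one population.
iso = rng.normal(loc=[-14.0, 12.0], scale=[1.5, 1.0], size=(40, 2))

hull_area = ConvexHull(iso).volume   # for 2-D input, .volume is the enclosed area
print(f"convex hull area = {hull_area:.2f} per-mil^2")

# One simple way to carry measurement uncertainty into the metric: jitter each
# point by an assumed analytical error and recompute the hull area many times.
areas = [ConvexHull(iso + rng.normal(0, 0.2, iso.shape)).volume for _ in range(1000)]
print(f"area 95% interval ~ {np.percentile(areas, [2.5, 97.5])}")
```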
Origins Space Telescope: Tracing Dark Molecular Gas in the Milky Way
NASA Astrophysics Data System (ADS)
Narayanan, Desika; Li, Qi; Krumholz, Mark; Dave, Romeel; Origins Space Telescope Science and Technology Definition Team
2018-01-01
We present theoretical models for quantifying the fraction of CO-dark molecular gas in galaxies. To do this, we combine novel thermal, chemical, and radiative equilibrium calculations with high-resolution cosmological zoom galaxy formation models. We discuss how this dark molecular gas will be uncovered by the Origins Space Telescope, one of the four science and technology definition studies of NASA Headquarters for the 2020 Astronomy and Astrophysics Decadal survey.
Nonlinear viscoelastic characterization of structural adhesives
NASA Technical Reports Server (NTRS)
Rochefort, M. A.; Brinson, H. F.
1983-01-01
Measurements of the nonlinear viscoelastic behavior of two adhesives, FM-73 and FM-300, are presented and discussed. Analytical methods to quantify the measurements are given and fitted into the framework of an accelerated testing and analysis procedure. The single-integral model used is shown to function well and is analogous to a time-temperature stress-superposition procedure (TTSSP). Advantages and disadvantages of the creep power law method used in this study are given.
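The creep power law mentioned above can be illustrated with a simple compliance fit of the form D(t) = D0 + D1·t^n; the data and starting values below are assumptions for illustration, not the adhesive measurements from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def creep_power_law(t, d0, d1, n):
    """Power-law creep compliance D(t) = D0 + D1 * t**n."""
    return d0 + d1 * t ** n

# Illustrative creep-compliance data (1/GPa) over time (minutes).
t = np.array([1, 2, 5, 10, 20, 60, 120, 480], dtype=float)
d = np.array([0.42, 0.44, 0.47, 0.50, 0.53, 0.60, 0.65, 0.78])

params, _ = curve_fit(creep_power_law, t, d, p0=[0.4, 0.02, 0.3])
print(f"D0 = {params[0]:.3f}, D1 = {params[1]:.3f}, n = {params[2]:.3f}")
```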
Geothermal probabilistic cost study
NASA Technical Reports Server (NTRS)
Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.
1981-01-01
A tool is presented to quantify the risks of geothermal projects: the Geothermal Probabilistic Cost Model (GPCM). The GPCM was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk which can shift the risk among different agents were analyzed: the leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance.
Quantifying and Monetizing Renewable Energy Resiliency
Anderson, Kate H.; Laws, Nicholas D.; Marr, Spencer; ...
2018-03-23
Energy resiliency has been thrust to the forefront by recent severe weather events and natural disasters. Billions of dollars are lost each year due to power outages. This article highlights the unique value renewable energy hybrid systems (REHS), comprised of solar, energy storage, and generators, provide in increasing resiliency. We present a methodology to quantify the amount and value of resiliency provided by REHS, and ways to monetize this resiliency value through insurance premium discounts. A case study of buildings in New York City demonstrates how implementing REHS in place of traditional backup diesel generators can double the amount of outage survivability, with an added value of $781,200. For a Superstorm Sandy type event, results indicate that insurance premium reductions could support up to 4% of the capital cost of REHS, and the potential exists to prevent up to $2.5 billion in business interruption losses with increased REHS deployment.
Methods for quantifying T cell receptor binding affinities and thermodynamics
Piepenbrink, Kurt H.; Gloor, Brian E.; Armstrong, Kathryn M.; Baker, Brian M.
2013-01-01
αβ T cell receptors (TCRs) recognize peptide antigens bound and presented by class I or class II major histocompatibility complex (MHC) proteins. Recognition of a peptide/MHC complex is required for initiation and propagation of a cellular immune response, as well as the development and maintenance of the T cell repertoire. Here we discuss methods to quantify the affinities and thermodynamics of interactions between soluble ectodomains of TCRs and their peptide/MHC ligands, focusing on titration calorimetry, surface plasmon resonance, and fluorescence anisotropy. As TCRs typically bind ligand with weak-to-moderate affinities, we focus the discussion on means to enhance the accuracy and precision of low affinity measurements. In addition to further elucidating the biology of the T cell mediated immune response, more reliable low affinity measurements will aid with more probing studies with mutants or altered peptides that can help illuminate the physical underpinnings of how TCRs achieve their remarkable recognition properties. PMID:21609868
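As an illustration of the low-affinity measurements discussed above, the sketch below fits a steady-state 1:1 binding isotherm to equilibrium responses to estimate Kd. The concentrations and responses are invented for illustration, and the generic Langmuir isotherm used here is an assumption, not a prescription from the article.

```python
import numpy as np
from scipy.optimize import curve_fit

def one_to_one(conc, rmax, kd):
    """Steady-state 1:1 binding isotherm: R = Rmax * C / (Kd + C)."""
    return rmax * conc / (kd + conc)

# Illustrative TCR concentrations (uM) and equilibrium responses (RU).
conc = np.array([0.5, 1, 2, 5, 10, 20, 50, 100.0])
resp = np.array([4.8, 9.1, 16.5, 32.0, 48.0, 66.0, 88.0, 101.0])

(rmax, kd), _ = curve_fit(one_to_one, conc, resp, p0=[120.0, 20.0])
print(f"Kd ~ {kd:.1f} uM, Rmax ~ {rmax:.0f} RU")
```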
Maldini, Mariateresa; Montoro, Paola; Pizza, Cosimo
2011-08-25
Phytochemical investigation of the methanolic extract of Byrsonima crassifolia bark led to the isolation of 8 known phenolic compounds: 5-O-galloylquinic acid, 3-O-galloylquinic acid, 3,4-di-O-galloylquinic acid, 3,5-di-O-galloylquinic acid, 3,4,5-tri-O-galloylquinic acid and (+)-epicatechin-3-gallate, along with (+)-catechin and (+)-epicatechin. Because of their biological value, in the present study a high-performance liquid chromatography-tandem mass spectrometry (LC-MS/MS) method, working in multiple reaction monitoring (MRM) mode, has been developed to quantify these compounds. B. crassifolia bark proved to be a rich source of phenolic compounds, particularly galloyl derivatives. The proposed analytical method shows promise for quantifying these and other galloyl derivatives in raw material and final products. Copyright © 2011 Elsevier B.V. All rights reserved.
Quantifying and Monetizing Renewable Energy Resiliency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Kate H.; Laws, Nicholas D.; Marr, Spencer
Energy resiliency has been thrust to the forefront by recent severe weather events and natural disasters. Billions of dollars are lost each year due to power outages. This article highlights the unique value renewable energy hybrid systems (REHS), comprised of solar, energy storage, and generators, provide in increasing resiliency. We present a methodology to quantify the amount and value of resiliency provided by REHS, and ways to monetize this resiliency value through insurance premium discounts. A case study of buildings in New York City demonstrates how implementing REHS in place of traditional backup diesel generators can double the amount of outage survivability, with an added value of $781,200. For a Superstorm Sandy type event, results indicate that insurance premium reductions could support up to 4% of the capital cost of REHS, and the potential exists to prevent up to $2.5 billion in business interruption losses with increased REHS deployment.
Abdolahad, M; Mohajerzadeh, S; Janmaleki, M; Taghinejad, H; Taghinejad, M
2013-03-01
Vertically aligned carbon nanotube (VACNT) arrays have been demonstrated as probes for rapidly quantifying cancer cell deformability with high resolution. Through entrapment of various cancer cells on CNT arrays, the deflections of the nanotubes during cell deformation were used to derive the lateral cell shear force using a large-deflection-mode method. It is observed that VACNT beams act as sensitive and flexible agents, which transfer the shear force of cells trapped on them into an observable deflection. Metastatic cancer cells have more deformable structures, leading to a larger cell traction force (CTF) than primary cancer cells on CNT arrays. The elasticity of different cells could be compared by their CTF measurement on CNT arrays. This study presents a nanotube-based methodology for quantifying single-cell mechanical behavior, which could be useful for understanding the metastatic behavior of cells.
Peñarrubia, Luis; Alcaraz, Carles; Vaate, Abraham Bij de; Sanz, Nuria; Pla, Carles; Vidal, Oriol; Viñas, Jordi
2016-12-14
The zebra mussel (Dreissena polymorpha Pallas, 1771) and the quagga mussel (D. rostriformis Deshayes, 1838) are successful invasive bivalves with substantial ecological and economic impacts in freshwater systems once they become established. Since their eradication is extremely difficult, their detection at an early stage is crucial to prevent spread. In this study, we optimized and validated a qPCR detection method based on the histone H2B gene to quantify combined infestation levels of zebra and quagga mussels in environmental DNA samples. Our results show specific dreissenid DNA present in filtered water samples for which microscopic diagnostic identification for larvae failed. Monitoring a large number of locations for invasive dreissenid species based on a highly specific environmental DNA qPCR assay may prove to be an essential tool for management and control plans focused on prevention of establishment of dreissenid mussels in new locations.
Peñarrubia, Luis; Alcaraz, Carles; Vaate, Abraham bij de; Sanz, Nuria; Pla, Carles; Vidal, Oriol; Viñas, Jordi
2016-01-01
The zebra mussel (Dreissena polymorpha Pallas, 1771) and the quagga mussel (D. rostriformis Deshayes, 1838) are successful invasive bivalves with substantial ecological and economic impacts in freshwater systems once they become established. Since their eradication is extremely difficult, their detection at an early stage is crucial to prevent spread. In this study, we optimized and validated a qPCR detection method based on the histone H2B gene to quantify combined infestation levels of zebra and quagga mussels in environmental DNA samples. Our results show specific dreissenid DNA present in filtered water samples for which microscopic diagnostic identification for larvae failed. Monitoring a large number of locations for invasive dreissenid species based on a highly specific environmental DNA qPCR assay may prove to be an essential tool for management and control plans focused on prevention of establishment of dreissenid mussels in new locations. PMID:27966602
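For readers unfamiliar with how qPCR signals such as those in the dreissenid eDNA assay above are converted into quantities, the following Python sketch shows the usual standard-curve arithmetic (Cq regressed on log10 copies, then inverted for unknowns). The dilution-series values and function names are illustrative assumptions, not data or code from the study.

```python
import numpy as np

def fit_standard_curve(log10_copies, cq_values):
    """Fit a qPCR standard curve: Cq = slope * log10(copies) + intercept."""
    slope, intercept = np.polyfit(log10_copies, cq_values, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0   # amplification efficiency (1.0 = 100%)
    return slope, intercept, efficiency

def copies_from_cq(cq, slope, intercept):
    """Invert the standard curve to estimate target copies in an unknown sample."""
    return 10 ** ((cq - intercept) / slope)

# Illustrative dilution series (hypothetical values, not from the study)
log10_copies = np.array([6, 5, 4, 3, 2], dtype=float)
cq = np.array([18.1, 21.5, 24.9, 28.3, 31.8])
slope, intercept, eff = fit_standard_curve(log10_copies, cq)
print(f"efficiency ~ {eff:.2%}, copies at Cq=27: {copies_from_cq(27.0, slope, intercept):.0f}")
```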
Premixed autoignition in compressible turbulence
NASA Astrophysics Data System (ADS)
Konduri, Aditya; Kolla, Hemanth; Krisman, Alexander; Chen, Jacqueline
2016-11-01
Prediction of chemical ignition delay in an autoignition process is critical in combustion systems like compression ignition engines and gas turbines. Often, ignition delay times measured in simple homogeneous experiments or homogeneous calculations are not representative of actual autoignition processes in complex turbulent flows. This is due to the presence of turbulent mixing, which results in fluctuations in thermodynamic properties as well as chemical composition. In the present study the effect of fluctuations of thermodynamic variables on the ignition delay is quantified with direct numerical simulations of compressible isotropic turbulence. A premixed syngas-air mixture is used to remove the effects of inhomogeneity in the chemical composition. Preliminary results show a significant spatial variation in the ignition delay time. We analyze the topology of autoignition kernels and identify the influence of extreme events resulting from compressibility and intermittency. The dependence of ignition delay time on Reynolds and turbulent Mach numbers is also quantified. Supported by Basic Energy Sciences, Dept of Energy, United States.
Spatio-temporal Organization During Ventricular Fibrillation in the Human Heart.
Robson, Jinny; Aram, Parham; Nash, Martyn P; Bradley, Chris P; Hayward, Martin; Paterson, David J; Taggart, Peter; Clayton, Richard H; Kadirkamanathan, Visakan
2018-06-01
In this paper, we present a novel approach to quantify the spatio-temporal organization of electrical activation during human ventricular fibrillation (VF). We propose three different methods based on correlation analysis, graph theoretical measures and hierarchical clustering. Using the proposed approach, we quantified the level of spatio-temporal organization during three episodes of VF in ten patients, recorded using multi-electrode epicardial recordings with 30 s coronary perfusion, 150 s global myocardial ischaemia and 30 s reflow. Our findings show a steady decline in spatio-temporal organization from the onset of VF with coronary perfusion. We observed transient increases in spatio-temporal organization during global myocardial ischaemia. However, the decline in spatio-temporal organization continued during reflow. Our results were consistent across all patients and agreed with the numbers of phase singularities. Our findings show that these complex spatio-temporal patterns can be studied using complex network analysis.
Cross-scale interactions: Quantifying multi-scaled cause–effect relationships in macrosystems
Soranno, Patricia A.; Cheruvelil, Kendra S.; Bissell, Edward G.; Bremigan, Mary T.; Downing, John A.; Fergus, Carol E.; Filstrup, Christopher T.; Henry, Emily N.; Lottig, Noah R.; Stanley, Emily H.; Stow, Craig A.; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E.
2014-01-01
Ecologists are increasingly discovering that ecological processes are made up of components that are multi-scaled in space and time. Some of the most complex of these processes are cross-scale interactions (CSIs), which occur when components interact across scales. When undetected, such interactions may cause errors in extrapolation from one region to another. CSIs, particularly those that include a regional scaled component, have not been systematically investigated or even reported because of the challenges of acquiring data at sufficiently broad spatial extents. We present an approach for quantifying CSIs and apply it to a case study investigating one such interaction, between local and regional scaled land-use drivers of lake phosphorus. Ultimately, our approach for investigating CSIs can serve as a basis for efforts to understand a wide variety of multi-scaled problems such as climate change, land-use/land-cover change, and invasive species.
Species arboreal as a bioindicator of the environmental pollution: Analysis by SR-TXRF
NASA Astrophysics Data System (ADS)
de Vives, Ana Elisa Sirito; Moreira, Silvana; Brienza, Sandra Maria Boscolo; Medeiros, Jean Gabriel S.; Filho, Mario Tomazello; Zucchi, Orghêda Luiza Araújo Domingues; do Nascimento Filho, Virgilio Franco; Barroso, Regina Cely
2007-08-01
This paper studies the effect of environmental pollution on tree development in order to evaluate the use of trees as bioindicators in urban areas and the countryside. The sample collection was carried out in Piracicaba city, São Paulo State, which presents a high level of environmental contamination in water, soil and air due to industrial activities, vehicle combustion, sugar-cane leaf burning during harvesting, etc. The species Caesalpinia peltophoroides ("Sibipiruna") was selected because it is often used in urban arborization. Synchrotron radiation total reflection X-ray fluorescence (SR-TXRF) was employed to identify and quantify elements and metals of nutritional and toxicological importance in the wood samples. The analysis was performed at the Brazilian Synchrotron Light Source Laboratory, using a white beam for excitation and an Si(Li) detector for X-ray detection. The elements P, K, Ca, Ti, Fe, Sr, Ba and Pb were quantified in several samples.
Quantifying the relative risk of sex offenders: risk ratios for static-99R.
Hanson, R Karl; Babchishin, Kelly M; Helmus, Leslie; Thornton, David
2013-10-01
Given the widespread use of empirical actuarial risk tools in corrections and forensic mental health, it is important that evaluators and decision makers understand how scores relate to recidivism risk. In the current study, we found strong evidence for a relative risk interpretation of Static-99R scores using 8 samples from Canada, the United Kingdom, and Western Europe (N = 4,037 sex offenders). Each increase in Static-99R score was associated with a stable and consistent increase in relative risk (as measured by an odds ratio or hazard ratio of approximately 1.4). Hazard ratios from Cox regression were used to calculate risk ratios that can be reported for Static-99R. We recommend that evaluators consider risk ratios as a useful, nonarbitrary metric for quantifying and communicating risk information. To avoid misinterpretation, however, risk ratios should be presented with recidivism base rates.
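The abstract above reports a roughly constant odds/hazard ratio of about 1.4 per one-point increase in Static-99R score. A minimal sketch of the implied relative-risk table follows; the constant per-point ratio is taken from the abstract, but the choice of reference score (2) is an illustrative assumption rather than the authors' published convention.

```python
# Minimal sketch: risk ratios implied by a constant hazard ratio per score point.
# The per-point ratio (~1.4) comes from the abstract; the reference score is assumed.
PER_POINT_RATIO = 1.4
REFERENCE_SCORE = 2

def risk_ratio(score: int, reference: int = REFERENCE_SCORE) -> float:
    return PER_POINT_RATIO ** (score - reference)

for s in range(-3, 11):
    print(f"Static-99R score {s:>3}: risk ratio {risk_ratio(s):5.2f}")
```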
NASA Astrophysics Data System (ADS)
da Silva, Marcus Fernandes; de Area Leão Pereira, Éder Johnson; da Silva Filho, Aloisio Machado; de Castro, Arleys Pereira Nunes; Miranda, José Garcia Vivas; Zebende, Gilney Figueira
2016-07-01
In this paper we quantify the cross-correlation between the adjusted closing indices of the G7 countries, by their nominal Gross Domestic Product. For this purpose we consider the 2008 financial crisis. Thus, we intend to observe the impact of the 2008 crisis by applying the DCCA cross-correlation coefficient ρDCCA between these countries. As an immediate result we observe that there is a positive cross-correlation between the indices, and this coefficient changes with time between weak, medium, and strong values. If we compare the pre-crisis period (before 2008) with the post-crisis period (after 2008), it is noticed that ρDCCA changes its value. From these facts, we propose to study the contagion (interdependence) effect of this change by means of a new variable, ΔρDCCA. Thus, we present new findings for the 2008 crisis between the members of the G7.
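The ρDCCA coefficient used above follows a standard recipe: integrate both series, detrend them with a local linear fit inside overlapping boxes, and take the ratio of the detrended covariance to the two detrended variances. A compact Python sketch of that calculation is given below; the box size and the toy random-walk data are illustrative, not the G7 index data analysed in the paper.

```python
import numpy as np

def rho_dcca(x, y, n):
    """Detrended cross-correlation coefficient (rho_DCCA) at box size n."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    X = np.cumsum(x - x.mean())          # integrated (profile) series
    Y = np.cumsum(y - y.mean())
    N = len(X)
    f2_xy, f2_xx, f2_yy = [], [], []
    for start in range(N - n):           # overlapping boxes of n+1 samples
        idx = np.arange(start, start + n + 1)
        t = idx.astype(float)
        rx = X[idx] - np.polyval(np.polyfit(t, X[idx], 1), t)   # detrended residuals
        ry = Y[idx] - np.polyval(np.polyfit(t, Y[idx], 1), t)
        f2_xy.append(np.mean(rx * ry))
        f2_xx.append(np.mean(rx * rx))
        f2_yy.append(np.mean(ry * ry))
    return np.mean(f2_xy) / np.sqrt(np.mean(f2_xx) * np.mean(f2_yy))

# Toy usage with two correlated random walks (illustrative data only)
rng = np.random.default_rng(0)
shared = rng.standard_normal(2000)
a = np.cumsum(shared + 0.3 * rng.standard_normal(2000))
b = np.cumsum(shared + 0.3 * rng.standard_normal(2000))
print(rho_dcca(np.diff(a), np.diff(b), n=32))
```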
Orthogonal design to sift the optimal parameter of Neiguan acupuncture for cerebral infarction
Zhang, Yanan; Yang, Sha; Fan, Xiaonong; Wang, Shu; He, Nina; Li, Lingxin; Luo, Ding; Shi, Xuemin
2013-01-01
Individual differences and non-repeatability in acupuncture have not only restricted the development of acupuncture, but have also affected the specificity of acupoints. The present study used instruments to control needle depth, lifting and thrusting frequency, and the duration of acupuncture. Effects of the quantified acupuncture were observed at Neiguan (PC6) with different stimulation parameters. A frequency of 1, 2, or 3 Hz and a duration of 5, 60, or 180 seconds were used to observe cerebral blood flow and the ratio of infarct volume recovery. Results showed that stimulation at Neiguan with a frequency of 1 Hz and a long duration of 180 seconds, or with a frequency of 2 or 3 Hz and durations of 5 or 60 seconds, significantly increased cerebral blood flow and decreased the ratio of infarct volume. Interactions between frequency and duration play a critical role in quantified acupuncture therapy. PMID:25206575
Nuclear recoil measurements with the ARIS experiment
NASA Astrophysics Data System (ADS)
Fan, Alden; ARIS Collaboration
2017-01-01
As direct dark matter searches become increasingly sensitive, it is important to fully characterize the target of the search. The goal of the Argon Recoil Ionization and Scintillation (ARIS) experiment is to quantify information related to the scintillation and ionization energy scale, quenching factor, ion recombination probability, and scintillation time response of nuclear recoils, as expected from WIMPs, in liquid argon. A time projection chamber with an active mass of 0.5 kg of liquid argon and capable of full 3D position reconstruction was exposed to an inverse kinematic neutron beam at the Institut de Physique Nucleaire d'Orsay in France. A scan of nuclear recoil energies was performed through coincidence with a set of neutron detectors to quantify properties of nuclear recoils in liquid argon at various electric fields. The difference in ionization and scintillation response with differing recoil track angle to the electric field was also studied. The preliminary results of the experiment will be presented.
Monitoring human melanocytic cell responses to piperine using multispectral imaging
NASA Astrophysics Data System (ADS)
Samatham, Ravikant; Phillips, Kevin G.; Sonka, Julia; Yelma, Aznegashe; Reddy, Neha; Vanka, Meenakshi; Thuillier, Philippe; Soumyanath, Amala; Jacques, Steven
2011-03-01
Vitiligo is a depigmentary disease characterized by melanocyte loss attributed most commonly to autoimmune mechanisms. Vitiligo currently has a high incidence (1% worldwide) but a poor set of treatment options. Piperine, a compound found in black pepper, is a potential treatment for vitiligo, due to its ability to stimulate mouse epidermal melanocyte proliferation in vitro and in vivo. The present study investigates the use of multispectral imaging and an image processing technique based on local contrast to quantify the stimulatory effects of piperine on human melanocyte proliferation in reconstructed epidermis. We demonstrate the ability of the imaging method to quantify increased pigmentation in response to piperine treatment. The quantification of melanocyte stimulation by the proposed imaging technique illustrates the potential use of this technology to quickly and non-invasively assess therapeutic responses of vitiligo tissue culture models to treatment.
Nassar, Natasha; Roberts, Christine L; Cameron, Carolyn A; Peat, Brian
2006-01-01
Probabilistic information on outcomes of breech presentation is important for clinical decision-making. We aim to quantify adverse maternal and fetal outcomes of breech presentation at term. We conducted an audit of 1,070 women with a term, singleton breech presentation who were classified as eligible or ineligible for external cephalic version or diagnosed in labor at a tertiary obstetric hospital in Australia, 1997-2004. Maternal, delivery and perinatal outcomes were assessed and frequency of events quantified. Five hundred and sixty (52%) women were eligible and 170 (16%) were ineligible for external cephalic version, 211 (20%) women were diagnosed in labor and 134 (12%) were unclassifiable. Seventy-one percent of eligible women had an external cephalic version, with a 39% success rate. Adverse outcomes of breech presentation at term were rare: immediate delivery for prelabor rupture of membranes (1.3%), nuchal cord (9.3%), cord prolapse (0.4%), and fetal death (0.3%); and did not differ by clinical classification. Women who had an external cephalic version had a reduced risk of onset-of-labor within 24 h (RR 0.25; 95%CI 0.08, 0.82) compared with women eligible for but who did not have an external cephalic version. Women diagnosed with breech in labor had the highest rates of emergency cesarean section (64%), cord prolapse (1.4%) and poorest infant outcomes. Adverse maternal and fetal outcomes of breech presentation at term are rare and there was no increased risk of complications after external cephalic version. Findings provide important data to quantify the frequency of adverse outcomes that will help facilitate informed decision-making and ensure optimal management of breech presentation.
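For the risk estimates quoted above (e.g. RR 0.25; 95% CI 0.08, 0.82), the usual calculation from a 2x2 table uses a log-normal approximation for the confidence interval. The sketch below illustrates that arithmetic; the counts are hypothetical, because the abstract does not report the underlying table.

```python
import math

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed, z=1.96):
    """Relative risk and 95% CI from a 2x2 table (log-normal approximation)."""
    p1 = events_exposed / n_exposed
    p0 = events_unexposed / n_unexposed
    rr = p1 / p0
    se_log_rr = math.sqrt(1 / events_exposed - 1 / n_exposed
                          + 1 / events_unexposed - 1 / n_unexposed)
    lo = math.exp(math.log(rr) - z * se_log_rr)
    hi = math.exp(math.log(rr) + z * se_log_rr)
    return rr, (lo, hi)

# Hypothetical counts only -- not the study's data.
print(relative_risk(events_exposed=4, n_exposed=400, events_unexposed=8, n_unexposed=160))
```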
Model-based synthesis of aircraft noise to quantify human perception of sound quality and annoyance
NASA Astrophysics Data System (ADS)
Berckmans, D.; Janssens, K.; Van der Auweraer, H.; Sas, P.; Desmet, W.
2008-04-01
This paper presents a method to synthesize aircraft noise as perceived on the ground. The developed method gives designers the opportunity to make a quick and economic evaluation concerning sound quality of different design alternatives or improvements on existing aircraft. By presenting several synthesized sounds to a jury, it is possible to evaluate the quality of different aircraft sounds and to construct a sound that can serve as a target for future aircraft designs. The combination of using a sound synthesis method that can perform changes to a recorded aircraft sound together with executing jury tests allows to quantify the human perception of aircraft noise.
Talker-specificity and adaptation in quantifier interpretation
Yildirim, Ilker; Degen, Judith; Tanenhaus, Michael K.; Jaeger, T. Florian
2015-01-01
Linguistic meaning has long been recognized to be highly context-dependent. Quantifiers like many and some provide a particularly clear example of context-dependence. For example, the interpretation of quantifiers requires listeners to determine the relevant domain and scale. We focus on another type of context-dependence that quantifiers share with other lexical items: talker variability. Different talkers might use quantifiers with different interpretations in mind. We used a web-based crowdsourcing paradigm to study participants’ expectations about the use of many and some based on recent exposure. We first established that the mapping of some and many onto quantities (candies in a bowl) is variable both within and between participants. We then examined whether and how listeners’ expectations about quantifier use adapts with exposure to talkers who use quantifiers in different ways. The results demonstrate that listeners can adapt to talker-specific biases in both how often and with what intended meaning many and some are used. PMID:26858511
Handbook of human engineering design data for reduced gravity conditions
NASA Technical Reports Server (NTRS)
Marton, T.; Rudek, F. P.; Miller, R. A.; Norman, D. G.
1971-01-01
A Handbook is presented for the use of engineers, designers, and human factors specialists during the developmental and detailed design phases of manned spacecraft programs. Detailed and diverse quantified data on man's capabilities and tolerances for survival and productive effort in the extraterrestrial environment are provided. Quantified data and information on the space environment as well as the characteristics of the vehicular or residential environment required to support man in outer space are also given.
Near infrared spectral linearisation in quantifying soluble solids content of intact carambola.
Omar, Ahmad Fairuz; MatJafri, Mohd Zubir
2013-04-12
This study presents a novel application of near infrared (NIR) spectral linearisation for measuring the soluble solids content (SSC) of carambola fruits. NIR spectra were measured using reflectance and interactance methods. In this study, only the interactance measurement technique generated a reliable result, with a coefficient of determination (R2) of 0.724 and a root mean square error of prediction (RMSEP) of 0.461 °Brix. This technique produced a more accurate and stable prediction model than multiple linear regression techniques.
Near Infrared Spectral Linearisation in Quantifying Soluble Solids Content of Intact Carambola
Omar, Ahmad Fairuz; MatJafri, Mohd Zubir
2013-01-01
This study presents a novel application of near infrared (NIR) spectral linearisation for measuring the soluble solids content (SSC) of carambola fruits. NIR spectra were measured using reflectance and interactance methods. In this study, only the interactance measurement technique generated a reliable result, with a coefficient of determination (R2) of 0.724 and a root mean square error of prediction (RMSEP) of 0.461 °Brix. This technique produced a more accurate and stable prediction model than multiple linear regression techniques. PMID:23584118
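The two figures of merit quoted in these records, R2 and RMSEP, are computed from an external prediction set in the usual way. The following sketch shows that calculation with made-up SSC values; it is not the study's data or code.

```python
import numpy as np

def prediction_metrics(y_ref, y_pred):
    """Coefficient of determination (R^2) and RMSEP for an external prediction set."""
    y_ref, y_pred = np.asarray(y_ref, float), np.asarray(y_pred, float)
    rmsep = np.sqrt(np.mean((y_pred - y_ref) ** 2))
    ss_res = np.sum((y_ref - y_pred) ** 2)
    ss_tot = np.sum((y_ref - y_ref.mean()) ** 2)
    return 1.0 - ss_res / ss_tot, rmsep

# Illustrative SSC values (degrees Brix); not the study's data.
ref  = [6.1, 6.8, 7.4, 8.0, 8.9, 9.5]
pred = [6.4, 6.6, 7.8, 7.7, 9.3, 9.2]
r2, rmsep = prediction_metrics(ref, pred)
print(f"R2 = {r2:.3f}, RMSEP = {rmsep:.3f} deg Brix")
```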
NASA Technical Reports Server (NTRS)
1975-01-01
User benefits resulting from the application of space systems to previously described application areas were identified, and methods to assign priorities to application areas and to quantify the benefits were described. The following areas were selected for in-depth review: communications, materials processing in space, weather and climate, and institutional arrangements for space applications. Recommendations concerning studies that should be undertaken to develop a more precise understanding of the source and magnitude of the realizable economic benefits were also presented.
Data Assimilation and Predictability Studies on Typhoon Sinlaku (2008) Using the WRF-LETKF System
NASA Astrophysics Data System (ADS)
Miyoshi, T.; Kunii, M.
2011-12-01
Data assimilation and predictability studies on Tropical Cyclones with a particular focus on intensity forecasts are performed with the newly-developed Local Ensemble Transform Kalman Filter (LETKF) system with the WRF model. Taking advantage of intensive observations of the internationally collaborated T-PARC (THORPEX Pacific Asian Regional Campaign) project, we focus on Typhoon Sinlaku (2008) which intensified rapidly before making landfall to Taiwan. This study includes a number of data assimilation experiments, higher-resolution forecasts, and sensitivity analysis which quantifies impacts of observations on forecasts. This presentation includes latest achievements up to the time of the conference.
NASA Astrophysics Data System (ADS)
Loisel, J.; Harden, J. W.; Hugelius, G.
2017-12-01
What are the most important soil services valued by land stewards and planners? Which soil-data metrics can be used to quantify each soil service? What are the steps required to quantitatively index the baseline value of soil services and their vulnerability under different land-use and climate change scenarios? How do we simulate future soil service pathways (or trajectories) under changing management regimes using process-based ecosystem models? What is the potential cost (economic, social, and other) of soil degradation under these scenarios? How sensitive or resilient are soil services to prescribed management practices, and how does sensitivity vary over space and time? We are bringing together a group of scientists and conservation organizations to answer these questions by launching Soil Banker, an open and flexible tool to quantify soil services that can be used at any scale, and by any stakeholder. Our overarching goals are to develop metrics and indices to quantify peatland soil ecosystem services, monitor change of these services, and guide management. This paper describes our methodology applied to peatlands and presents two case studies (Indonesia and Patagonia) demonstrating how Peatland Soil Banker can be deployed as an accounting tool of peatland stocks, as a quantitative measure of peatland health, and as a projection of peatland degradation or enhancement under different land-use cases. Why peatlands? They store about 600 billion tons of carbon that account for ⅓ of the world's soil carbon. Peatlands have dynamic GHG exchanges of CO2, CH4, and NOx with the atmosphere, which play a role in regulating global climate; studies indicate that peatland degradation releases about 2-3 billion tons of CO2 to the atmosphere annually. These ecosystems also provide local and regional ecosystem services: they constitute important components of the N and P cycles, store about 10% of the world's freshwater and buffer large fluxes of freshwater on an annual basis; they also support much biodiversity, including iconic species such as the orangutan in Indonesia and the guanaco in Chile. While these ecosystem services have been recognized in many sectors and a voluntary standard for a peatland carbon market is emerging, peatland services have not been systematically quantified, or accounted for, at the global level.
Soil compaction: Evaluation of stress transmission and resulting soil structure
NASA Astrophysics Data System (ADS)
Naveed, Muhammad; Schjønning, Per; Keller, Thomas; Lamande, Mathieu
2016-04-01
Accurate estimation of stress transmission and the resulting deformation in soil profiles is a prerequisite for developing predictive models and decision support tools for preventing soil compaction. Numerous studies have examined the effects of soil compaction, whilst relatively few have focused on the cause, namely the mode of stress transmission in the soil. In the present study we coupled cause and effect by carrying out partially confined compression tests on (1) wet aggregates, (2) air-dry aggregates, and (3) intact soils to quantify stress transmission and the compaction-induced soil structure at the same time. Stress transmission was quantified using both X-ray CT and a Tactilus sensor mat, and soil-pore structure was quantified using X-ray CT. Our results imply that stress transmission through soil depends strongly on the magnitude of the applied load and on aggregate strength. As long as the applied load is lower than the aggregate strength, stress transmission is discrete, with stresses mainly transmitted through chains of aggregates. With increasing applied load, soil aggregates begin to deform, transforming the heterogeneous soil into a homogeneous one; as a result, the mode of stress transmission shifts from discrete towards continuum-like. The continuum-like stress transmission mode was better simulated with the Boussinesq (1885) model, based on the theory of elasticity, than the discrete mode. The soil-pore structure was greatly affected by increasing applied stresses: at an applied stress of 620 kPa, total porosity of the intact soils was reduced by 5-16% and macroporosity by 50-85%. Similarly, significant changes in the morphological indices of the macropore space were observed with increasing applied stresses.
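The Boussinesq (1885) solution mentioned above gives the vertical stress below a point load on an elastic half-space, sigma_z = 3*Q*z^3 / (2*pi*R^5) with R = sqrt(r^2 + z^2). A short sketch of that textbook formula follows; the load and depths are illustrative, and the comparison with the measured stress fields in the study is not reproduced here.

```python
import numpy as np

def boussinesq_sigma_z(load_N, r, z):
    """Vertical stress (Pa) under a point load on an elastic half-space (Boussinesq, 1885).

    sigma_z = 3 Q z^3 / (2 pi R^5), with R = sqrt(r^2 + z^2);
    r is the horizontal offset and z the depth below the loaded point.
    """
    R = np.sqrt(r**2 + z**2)
    return 3.0 * load_N * z**3 / (2.0 * np.pi * R**5)

# Illustrative: stress directly under (r = 0) a 1 kN point load at several depths.
for depth in (0.05, 0.10, 0.20, 0.40):   # metres
    print(f"z = {depth:.2f} m: sigma_z = {boussinesq_sigma_z(1000.0, 0.0, depth)/1000:.1f} kPa")
```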
Direct Numerical Simulation of a Cavity-Stabilized Ethylene/Air Premixed Flame
NASA Astrophysics Data System (ADS)
Chen, Jacqueline; Konduri, Aditya; Kolla, Hemanth; Rauch, Andreas; Chelliah, Harsha
2016-11-01
Cavity flame holders have been shown to be important for flame stabilization in scramjet combustors. In the present study the stabilization of a lean premixed ethylene/air flame in a rectangular cavity at thermo-chemical conditions relevant to scramjet combustors is simulated using a compressible reacting multi-block direct numerical simulation solver, S3D, incorporating a 22 species ethylene-air reduced chemical model. The fuel is premixed with air to an equivalence ratio of 0.4 and enters the computational domain at Mach numbers between 0.3 and 0.6. An auxiliary inert channel flow simulation is used to provide the turbulent velocity profile at the inlet for the reacting flow simulation. The detailed interaction between intense turbulence, nonequilibrium concentrations of radical species formed in the cavity and mixing with the premixed main stream under density variations due to heat release rate and compressibility effects is quantified. The mechanism for flame stabilization is quantified in terms of relevant non-dimensional parameters, and detailed analysis of the flame and turbulence structure will be presented. We acknowledge the sponsorship of the AFOSR-NSF Joint Effort on Turbulent Combustion Model Assumptions and the DOE Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences.
NASA Astrophysics Data System (ADS)
Osei Tutu, A.; Webb, S. J.; Steinberger, B. M.; Rogozhina, I.
2017-12-01
The debate about the origin of the highlands in southern Africa has generated varying hypotheses, since the usual processes for mountain building, such as orogeny, are not observed here at the present day. For example, some studies have suggested that a pre-Paleozoic subduction under the southern African plate might have caused the high topography, while others have proposed a large-scale buoyant flow rising from the mid-mantle over the African Large Low Shear Velocity Province (LLSVP) as the source. A different school of thought is centered on a probable plume-lithosphere interaction in the early Miocene to late Pliocene. Using joint analysis of geodynamic and geophysical models with geological records, we seek to quantify both shallow and deep mantle density heterogeneities and the viscosity structure to understand the tectonics of the southern African regional topography. We estimate uplift rates and changes in the lithosphere stress field for the past 200 Ma and compare them with geological records, considering first shallow and deep contributions separately and then their combined effect, using a thermo-mechanical model with a free surface.
Montiglio, Pierre-Olivier; Ferrari, Caterina; Réale, Denis
2013-01-01
Several personality traits are mainly expressed in a social context, and others, which are not restricted to a social context, can be affected by the social interactions with conspecifics. In this paper, we focus on the recently proposed hypothesis that social niche specialization (i.e. individuals in a population occupy different social roles) can explain the maintenance of individual differences in personality. We first present ecological and social niche specialization hypotheses. In particular, we show how niche specialization can be quantified and highlight the link between personality differences and social niche specialization. We then review some ecological factors (e.g. competition and environmental heterogeneity) and the social mechanisms (e.g. frequency-dependent, state-dependent and social awareness) that may be associated with the evolution of social niche specialization and personality differences. Finally, we present a conceptual model and methods to quantify the contribution of ecological factors and social mechanisms to the dynamics between personality and social roles. In doing so, we suggest a series of research objectives to help empirical advances in this research area. Throughout this paper, we highlight empirical studies of social niche specialization in mammals, where available. PMID:23569291
Control of maglev vehicles with aerodynamic and guideway disturbances
NASA Technical Reports Server (NTRS)
Flueckiger, Karl; Mark, Steve; Caswell, Ruth; Mccallum, Duncan
1994-01-01
A modeling, analysis, and control design methodology is presented for maglev vehicle ride quality performance improvement as measured by the Pepler Index. Ride quality enhancement is considered through active control of secondary suspension elements and active aerodynamic surfaces mounted on the train. To analyze and quantify the benefits of active control, the authors have developed a five degree-of-freedom lumped parameter model suitable for describing a large class of maglev vehicles, including both channel and box-beam guideway configurations. Elements of this modeling capability have been recently employed in studies sponsored by the U.S. Department of Transportation (DOT). A perturbation analysis about an operating point, defined by vehicle and average crosswind velocities, yields a suitable linearized state space model for multivariable control system analysis and synthesis. Neglecting passenger compartment noise, the ride quality as quantified by the Pepler Index is readily computed from the system states. A statistical analysis is performed by modeling the crosswind disturbances and guideway variations as filtered white noise, whereby the Pepler Index is established in closed form through the solution to a matrix Lyapunov equation. Data is presented which indicates the anticipated ride quality achieved through various closed-loop control arrangements.
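The abstract notes that, with crosswind and guideway disturbances modelled as filtered white noise, the ride-quality statistics follow in closed form from a matrix Lyapunov equation. The sketch below shows that step for a placeholder two-state system: the steady-state state covariance solves A P + P A^T + B W B^T = 0, and output RMS values follow from C P C^T. The matrices and the output weighting are stand-ins; the actual five-degree-of-freedom maglev model and the Pepler Index weighting are not reproduced.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def output_rms(A, B, C, W):
    """Steady-state RMS of outputs y = C x for xdot = A x + B w, with w white noise of
    intensity W. The state covariance P solves A P + P A^T + B W B^T = 0."""
    P = solve_continuous_lyapunov(A, -B @ W @ B.T)
    return np.sqrt(np.diag(C @ P @ C.T))

# Placeholder 2-state example (a damped oscillator driven by white noise); not the
# paper's 5-DOF maglev model or Pepler Index weighting.
A = np.array([[0.0, 1.0], [-4.0, -0.8]])
B = np.array([[0.0], [1.0]])
C = np.eye(2)                 # report RMS displacement and velocity
W = np.array([[0.5]])
print(output_rms(A, B, C, W))
```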
Scheijen, Jean L J M; Clevers, Egbert; Engelen, Lian; Dagnelie, Pieter C; Brouns, Fred; Stehouwer, Coen D A; Schalkwijk, Casper G
2016-01-01
The aim of this study was to validate an ultra-performance liquid chromatography tandem mass-spectrometry (UPLC-MS/MS) method for the determination of advanced glycation endproducts (AGEs) in food items and to analyze AGEs in a selection of food items commonly consumed in a Western diet. N(ε)-(carboxymethyl)lysine (CML), N(ε)-(1-carboxyethyl)lysine (CEL) and N(δ)-(5-hydro-5-methyl-4-imidazolon-2-yl)-ornithine (MG-H1) were quantified in the protein fractions of 190 food items using UPLC-MS/MS. Intra- and inter-day accuracy and precision were 2-29%. The calibration curves showed perfect linearity in water and food matrices. We found the highest AGE levels in high-heat processed nut or grain products, and canned meats. Fruits, vegetables, butter and coffee had the lowest AGE content. The described method proved to be suitable for the quantification of three major AGEs in food items. The presented dietary AGE database opens the possibility to further quantify actual dietary exposure to AGEs and to explore its physiological impact on human health. Copyright © 2015 Elsevier Ltd. All rights reserved.
Tartibi, M; Liu, Y X; Liu, G-Y; Komvopoulos, K
2015-11-01
The membrane-cytoskeleton system plays a major role in cell adhesion, growth, migration, and differentiation. F-actin filaments, cross-linkers, binding proteins that bundle F-actin filaments to form the actin cytoskeleton, and integrins that connect the actin cytoskeleton network to the cell plasma membrane and extracellular matrix are major cytoskeleton constituents. Thus, the cell cytoskeleton is a complex composite that can assume different shapes. Atomic force microscopy (AFM)-based techniques have been used to measure cytoskeleton material properties without much attention to cell shape. A recently developed surface chemical patterning method for long-term single-cell culture was used to seed individual cells on circular patterns. A continuum-based cell model, which uses as input the force-displacement response obtained with a modified AFM setup and relates the membrane-cytoskeleton elastic behavior to the cell geometry, while treating all other subcellular components suspended in the cytoplasmic liquid (gel) as an incompressible fluid, is presented and validated by experimental results. The developed analytical-experimental methodology establishes a framework for quantifying the membrane-cytoskeleton elasticity of live cells. This capability may have immense implications in cell biology, particularly in studies seeking to establish correlations between membrane-cytoskeleton elasticity and cell disease, mortality, differentiation, and migration, and provide insight into cell infiltration through nonwoven fibrous scaffolds. The present method can be further extended to analyze membrane-cytoskeleton viscoelasticity, examine the role of other subcellular components (e.g., nucleus envelope) in cell elasticity, and elucidate the effects of mechanical stimuli on cell differentiation and motility. This is the first study to decouple the membrane-cytoskeleton elasticity from cell stiffness and introduce an effective approach for measuring the elastic modulus. The novelty of this study is the development of new technology for quantifying the elastic stiffness of the membrane-cytoskeleton system of cells. This capability could have immense implications in cell biology, particularly in establishing correlations between various cell diseases, mortality, and differentiation with membrane-cytoskeleton elasticity, examining through-tissue cell migration, and understanding cell infiltration in porous scaffolds. The present method can be further extended to analyze membrane-cytoskeleton viscous behavior, identify the contribution of other subcellular components (e.g., nucleus envelope) to load sharing, and elucidate mechanotransduction effects due to repetitive compressive loading and unloading on cell differentiation and motility. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Relationship between blastocoel cell-free DNA and day-5 blastocyst morphology.
Rule, Kiersten; Chosed, Renee J; Arthur Chang, T; David Wininger, J; Roudebush, William E
2018-06-04
Cell-free DNA (cfDNA), which is present in the blastocoel cavity of embryos, is believed to result from physiological apoptosis during development. This study assessed cfDNA content and caspase-3 protease activity in day-5 IVF blastocysts to determine if there was a correlation with embryo morphology. Day-5 IVF blastocysts were scored according to the Gardner and Schoolcraft system (modified to generate a numerical value) and cfDNA was collected in 25 μL of media following laser-induced blastocoel collapse prior to cryopreservation. cfDNA was quantified via fluorospectrometry and apoptotic activity was assessed via a caspase-3 protease assay using a fluorescent peptide substrate. Data were compared by linear regression. A total of 32 embryos were evaluated. There was a significant (p < 0.01) and positive correlation (cfDNA = 104.753 + (11.281 × score); R2 = 0.200) between embryo score and cfDNA content. A significant (p < 0.05) and positive correlation (cfDNA = 115.9 + (0.05 × caspase-3); R2 = 0.128) was observed between caspase-3 activity and cfDNA levels. There was no significant relationship between caspase-3 activity and embryo morphology score. This study provides further evidence that cfDNA is present in blastocoel fluid, can be quantified, and positively correlates with embryonic morphology. There is also evidence that at least a portion of the cfDNA present is from the intracellular contents of embryonic cells that underwent apoptosis. Additional studies are warranted to determine other physiological sources of the cfDNA in blastocoel fluid and to determine the relationship between cfDNA content, embryo morphology, chromosomal ploidy status, and implantation potential.
Plasma pharmacokinetics of catechin metabolite 4'-O-Me-EGC in healthy humans.
Renouf, Mathieu; Redeuil, Karine; Longet, Karin; Marmet, Cynthia; Dionisi, Fabiola; Kussmann, Martin; Williamson, Gary; Nagy, Kornél
2011-10-01
Tea is an infusion of the leaves of the Camellia sinensis plant and is the most widely consumed beverage in the world after water. Green tea contains significant amounts of polyphenol catechins and represents a promising dietary component to maintain health and well-being. Epidemiological studies indicate that polyphenol intake may have potential health benefits, such as reducing the incidence of coronary heart disease, diabetes and cancer. While the bioavailability of green tea bioactives is fairly well understood, some gaps remain to be filled, especially the identification and quantification of conjugated metabolites in plasma, such as sulphated, glucuronidated or methylated compounds. In the present study, we aimed to quantify the appearance of green tea catechins in plasma with particular emphasis on their methylated forms. After feeding 400 mL of a 1.25% green tea infusion to 9 healthy subjects, we found significant amounts of EC, EGC and EGCg in plasma as expected. EGC was the most bioavailable catechin, and its methylated form (4'-O-Me-EGC) was also present in quantifiable amounts. Its kinetics followed those of its parent compound. However, the relative amount of the methylated form of EGC was lower than that of the parent compound, an important aspect which, in the literature, has been controversial so far. The quantitative results presented in our study were confirmed by co-chromatography and accurate mass analysis of the respective standards. We show that the relative abundance of 4'-O-Me-EGC is ~40% of that of the parent EGC. 4'-O-Me-EGC is an important metabolite derived from catechin metabolism. Its presence in significant amounts should not be overlooked when assessing the human bioavailability of green tea.
A force-based, parallel assay for the quantification of protein-DNA interactions.
Limmer, Katja; Pippig, Diana A; Aschenbrenner, Daniela; Gaub, Hermann E
2014-01-01
Analysis of transcription factor binding to DNA sequences is of utmost importance to understand the intricate regulatory mechanisms that underlie gene expression. Several techniques exist that quantify DNA-protein affinity, but they are either very time-consuming or, like many high-throughput techniques, suffer from possible misinterpretation due to complicated algorithms or approximations. We present a more direct method to quantify DNA-protein interaction in a force-based assay. In contrast to single-molecule force spectroscopy, our technique, the Molecular Force Assay (MFA), parallelizes force measurements so that it can test one or multiple proteins against several DNA sequences in a single experiment. The interaction strength is quantified by comparison to the well-defined rupture stability of different DNA duplexes. As a proof-of-principle, we measured the interaction of the zinc finger construct Zif268/NRE against six different DNA constructs. We could show the specificity of our approach and quantify the strength of the protein-DNA interaction.
Technical aspects of virtual liver resection planning.
Glombitza, G; Lamadé, W; Demiris, A M; Göpfert, M R; Mayer, A; Bahner, M L; Meinzer, H P; Richter, G; Lehnert, T; Herfarth, C
1998-01-01
Operability of a liver tumor depends on its three-dimensional relation to the intrahepatic vascular trees, which define autonomously functioning liver (sub-)segments. Precise operation planning is complicated by anatomic variability and by distortion of the vascular trees by the tumor or preceding liver resections. Because the deformation of the liver cannot be tracked during the operation, integration of the resection planning system into an intra-operative navigation system is not feasible. The main tasks of an operation planning system in this domain are therefore quantifiable patient selection, through exact prediction of post-operative liver function, and a quantifiable resection proposal. The system quantifies the organ structures and resection volumes by means of absolute and relative values. It defines resection planes depending on security margins and the vascular trees and presents the data in visualized form as a 3D movie. The new 3D operation planning system offers quantifiable liver resection proposals based on individualized liver anatomy. The results are visualized in digital movies as well as in quantitative reports.
A novel approach to quantify cybersecurity for electric power systems
NASA Astrophysics Data System (ADS)
Kaster, Paul R., Jr.
Electric power grid cybersecurity is a topic gaining increased attention in academia, industry, and government circles, yet a method of quantifying and evaluating a system's security is not yet commonly accepted. In order to be useful, a quantification scheme must be able to accurately reflect the degree to which a system is secure, simply determine the level of security in a system using real-world values, model a wide variety of attacker capabilities, be useful for planning and evaluation, allow a system owner to publish information without compromising the security of the system, and compare relative levels of security between systems. Published attempts at quantifying cybersecurity fail to meet one or more of these criteria. This document proposes a new method of quantifying cybersecurity that meets those objectives. This dissertation evaluates the current state of cybersecurity research, discusses the criteria mentioned previously, proposes a new quantification scheme, presents an innovative method of modeling cyber attacks, demonstrates that the proposed quantification methodology meets the evaluation criteria, and proposes a line of research for future efforts.
Quantifying the sensitivity of post-glacial sea level change to laterally varying viscosity
NASA Astrophysics Data System (ADS)
Crawford, Ophelia; Al-Attar, David; Tromp, Jeroen; Mitrovica, Jerry X.; Austermann, Jacqueline; Lau, Harriet C. P.
2018-05-01
We present a method for calculating the derivatives of measurements of glacial isostatic adjustment (GIA) with respect to the viscosity structure of the Earth and the ice sheet history. These derivatives, or kernels, quantify the linearised sensitivity of measurements to the underlying model parameters. The adjoint method is used to enable efficient calculation of theoretically exact sensitivity kernels within laterally heterogeneous earth models that can have a range of linear or non-linear viscoelastic rheologies. We first present a new approach to calculate GIA in the time domain, which, in contrast to the more usual formulation in the Laplace domain, is well suited to continuously varying earth models and to the use of the adjoint method. Benchmarking results show excellent agreement between our formulation and previous methods. We illustrate the potential applications of the kernels calculated in this way through a range of numerical calculations relative to a spherically symmetric background model. The complex spatial patterns of the sensitivities are not intuitive, and this is the first time that such effects are quantified in an efficient and accurate manner.
Xie, Wei-Qi; Gong, Yi-Xian; Yu, Kong-Xian
2018-06-01
An automated and accurate headspace gas chromatographic (HS-GC) technique was investigated for rapidly quantifying the water content in edible oils. In this method, multiple headspace extraction (MHE) procedures were used to analyse the integrated water content from the edible oil sample. A simple vapour-phase calibration technique with an external vapour standard was used to calibrate both the water content in the gas phase and the total weight of water in the edible oil sample, after which the water content in edible oils can be quantified. The data showed that the relative standard deviation of the present HS-GC method in the precision test was less than 1.13%, and the relative differences between the new method and a reference method (i.e. the oven-drying method) were no more than 1.62%. The present HS-GC method is automated, accurate and efficient, and can be a reliable tool for quantifying water content in edible oil related products and research. © 2017 Society of Chemical Industry.
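Multiple headspace extraction rests on the fact that successive extractions of a sealed vial decay geometrically, so the exhaustive-extraction peak area is a geometric-series sum. The sketch below illustrates that step and a subsequent external-standard conversion to water content; the peak areas, response factor and sample mass are hypothetical, not values from the paper.

```python
import numpy as np

def mhe_total_area(areas):
    """Total analyte peak area from multiple headspace extraction (MHE).

    Successive extractions decay geometrically, A_i = A_1 * q**(i-1), so the total
    (exhaustive-extraction) area is A_1 / (1 - q), with q estimated from a linear
    fit of ln(A_i) versus extraction number.
    """
    areas = np.asarray(areas, float)
    i = np.arange(1, len(areas) + 1)
    slope, _ = np.polyfit(i, np.log(areas), 1)
    q = np.exp(slope)
    return areas[0] / (1.0 - q)

# Hypothetical numbers: four successive extractions of one oil sample, a response
# factor f (g water per unit peak area) from an external vapour standard, and the
# sample mass. None of these values come from the paper.
areas = [1.00e5, 6.6e4, 4.4e4, 2.9e4]
f = 2.0e-9            # g of water per unit peak area (assumed calibration)
sample_mass = 0.50    # g of oil
total_water_g = f * mhe_total_area(areas)
print(f"water content ~ {100 * total_water_g / sample_mass:.4f} % (w/w)")
```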
Herold, Volker; Herz, Stefan; Winter, Patrick; Gutjahr, Fabian Tobias; Andelovic, Kristina; Bauer, Wolfgang Rudolf; Jakob, Peter Michael
2017-10-16
Local aortic pulse wave velocity (PWV) is a measure of vascular stiffness and has a predictive value for cardiovascular events. Ultra-high-field CMR scanners allow the quantification of local PWV in mice; however, these systems have so far been unable to monitor the distribution of local elasticities. In the present study we provide a new accelerated method to quantify local aortic PWV in mice with phase-contrast cardiovascular magnetic resonance imaging (PC-CMR) at 17.6 T. Based on a k-t BLAST (Broad-use Linear Acquisition Speed-up Technique) undersampling scheme, total measurement time could be reduced by a factor of 6. The fast data acquisition enables quantification of the local PWV at several locations along the aorta based on the evaluation of local temporal changes in blood flow and vessel cross-sectional area. To speed up post-processing and to eliminate operator bias, we introduce a new semi-automatic segmentation algorithm to quantify cross-sectional areas of the aortic vessel. The new methods were applied in 10 eight-month-old mice (4 C57BL/6J-mice and 6 ApoE (-/-) -mice) at 12 adjacent locations along the abdominal aorta. Accelerated data acquisition and semi-automatic post-processing delivered reliable measures for the local PWV, similar to those obtained with full data sampling and manual segmentation. No statistically significant differences between the mean values could be detected for the different measurement approaches. Mean PWV values were elevated for the ApoE (-/-) -group compared to the C57BL/6J-group (3.5 ± 0.7 m/s vs. 2.2 ± 0.4 m/s, p < 0.01). A more heterogeneous PWV distribution was observed in the ApoE (-/-) -animals than in the C57BL/6J-mice, reflecting the local character of lesion development in atherosclerosis. In the present work, we showed that k-t BLAST PC-CMR enables the measurement of the local PWV distribution in the mouse aorta. The semi-automatic segmentation method based on PC-CMR data allowed rapid determination of local PWV. The findings of this study demonstrate the ability of the proposed methods to non-invasively quantify the spatial variations in local PWV along the aorta of ApoE (-/-) -mice as a relevant model of atherosclerosis.
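A common way to obtain local PWV from temporal changes in blood flow and cross-sectional area, as described above, is the flow-area (QA) method, in which PWV is the slope dQ/dA over the reflection-free early-systolic upstroke. The sketch below assumes that formulation; the data and the choice of early-systolic frames are illustrative, and the paper's k-t BLAST reconstruction and segmentation steps are not reproduced.

```python
import numpy as np

def pwv_qa(flow_ml_s, area_mm2, early_systole_idx):
    """Local PWV via the flow-area (QA) method: PWV = dQ/dA over early systole,
    when wave reflections are assumed negligible. Returns PWV in m/s."""
    Q = np.asarray(flow_ml_s, float)[early_systole_idx]    # ml/s = 1e-6 m^3/s
    A = np.asarray(area_mm2, float)[early_systole_idx]     # mm^2 = 1e-6 m^2
    slope, _ = np.polyfit(A, Q, 1)                         # (ml/s) per mm^2 == m/s
    return slope

# Illustrative cine phase-contrast data over one cardiac cycle (not from the study);
# frames 1-5 are assumed to cover the early-systolic upstroke.
flow = [0.02, 0.05, 0.14, 0.23, 0.32, 0.41, 0.45, 0.30, 0.10, 0.03]   # ml/s
area = [0.80, 0.81, 0.84, 0.87, 0.90, 0.92, 0.93, 0.90, 0.84, 0.81]   # mm^2
print(f"local PWV ~ {pwv_qa(flow, area, slice(1, 6)):.2f} m/s")
```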
FracPaQ: a MATLAB™ Toolbox for the Quantification of Fracture Patterns
NASA Astrophysics Data System (ADS)
Healy, D.; Rizzo, R. E.; Cornwell, D. G.; Timms, N.; Farrell, N. J.; Watkins, H.; Gomez-Rivas, E.; Smith, M.
2016-12-01
The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, shapes and spatial distributions often exhibit some kind of order. In detail, there may be relationships among the different fracture attributes e.g. small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture patterns and fracture attributes. This presentation describes an open source toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin section micrographs, geological maps, outcrop or aerial photographs or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales. Our current focus for the application of the software is on quantifying the fracture patterns in and around fault zones. There is a large body of published work on the quantification of relatively simple joint patterns, but fault zones present a bigger, and arguably more important, challenge. The method presented is inherently scale independent, and a key task will be to analyse and integrate quantitative fracture pattern data from micro- to macro-scales. Planned future releases will incorporate multi-scale analyses based on a wavelet method to look for scale transitions, and combining fracture traces from multiple 2-D images to derive the statistically equivalent 3-D fracture pattern.
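FracPaQ itself is a MATLAB toolbox; purely as an illustration of two of the quantities it reports, the Python sketch below computes trace lengths, the areal fracture intensity P21, and a generic parallel-plate permeability estimate (k = b^3 / 12s for parallel fractures of aperture b and spacing s). The traces, aperture and spacing are made up, and the permeability formula is the textbook parallel-plate form rather than necessarily FracPaQ's exact implementation.

```python
import numpy as np

def trace_lengths(segments):
    """Lengths of 2-D fracture traces given as ((x1, y1), (x2, y2)) pairs."""
    seg = np.asarray(segments, float)
    return np.hypot(seg[:, 1, 0] - seg[:, 0, 0], seg[:, 1, 1] - seg[:, 0, 1])

def intensity_p21(segments, map_area):
    """Fracture intensity P21 = total trace length / mapped area (e.g. m/m^2)."""
    return trace_lengths(segments).sum() / map_area

def parallel_plate_k(aperture_m, spacing_m):
    """Equivalent permeability of parallel smooth fractures: k = b^3 / (12 s)."""
    return aperture_m**3 / (12.0 * spacing_m)

# Illustrative traces on a 10 m x 10 m map (coordinates in metres, made up for the example)
traces = [((1, 1), (4, 5)), ((2, 8), (7, 6)), ((5, 2), (9, 9))]
print(f"P21 = {intensity_p21(traces, 100.0):.3f} m^-1")
print(f"k   = {parallel_plate_k(1e-4, 0.5):.2e} m^2")
```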
EEG in the classroom: Synchronised neural recordings during video presentation
Poulsen, Andreas Trier; Kamronn, Simon; Dmochowski, Jacek; Parra, Lucas C.; Hansen, Lars Kai
2017-01-01
We performed simultaneous recordings of electroencephalography (EEG) from multiple students in a classroom, and measured the inter-subject correlation (ISC) of activity evoked by a common video stimulus. The neural reliability, as quantified by ISC, has been linked to engagement and attentional modulation in earlier studies that used high-grade equipment in laboratory settings. Here we reproduce many of the results from these studies using portable low-cost equipment, focusing on the robustness of using ISC for subjects experiencing naturalistic stimuli. The present data shows that stimulus-evoked neural responses, known to be modulated by attention, can be tracked for groups of students with synchronized EEG acquisition. This is a step towards real-time inference of engagement in the classroom. PMID:28266588
EEG in the classroom: Synchronised neural recordings during video presentation
NASA Astrophysics Data System (ADS)
Poulsen, Andreas Trier; Kamronn, Simon; Dmochowski, Jacek; Parra, Lucas C.; Hansen, Lars Kai
2017-03-01
We performed simultaneous recordings of electroencephalography (EEG) from multiple students in a classroom, and measured the inter-subject correlation (ISC) of activity evoked by a common video stimulus. The neural reliability, as quantified by ISC, has been linked to engagement and attentional modulation in earlier studies that used high-grade equipment in laboratory settings. Here we reproduce many of the results from these studies using portable low-cost equipment, focusing on the robustness of using ISC for subjects experiencing naturalistic stimuli. The present data shows that stimulus-evoked neural responses, known to be modulated by attention, can be tracked for groups of students with synchronized EEG acquisition. This is a step towards real-time inference of engagement in the classroom.
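The ISC in these studies is computed with correlated component analysis; a much simpler stand-in that conveys the idea is the mean pairwise Pearson correlation of one component across subjects, sketched below with toy data. The signal model and noise level are assumptions for illustration only.

```python
import numpy as np

def pairwise_isc(data):
    """Average pairwise Pearson correlation across subjects for one EEG component.

    data: array of shape (n_subjects, n_samples). This is a simplified stand-in for
    the correlated-component-analysis ISC used in the studies.
    """
    data = np.asarray(data, float)
    corr = np.corrcoef(data)                      # subjects x subjects
    upper = corr[np.triu_indices_from(corr, k=1)]
    return upper.mean()

# Toy data: a shared stimulus-evoked signal plus subject-specific noise.
rng = np.random.default_rng(1)
stimulus = np.sin(np.linspace(0, 20 * np.pi, 5000))
subjects = stimulus + 1.5 * rng.standard_normal((8, 5000))
print(f"mean pairwise ISC ~ {pairwise_isc(subjects):.2f}")
```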
Cologna, Nicholas de Mojana di; Gómez-Mendoza, Diana Paola; Zanoelo, Fabiana Fonseca; Giannesi, Giovana Cristina; Guimarães, Nelciele Cavalieri de Alencar; Moreira, Leonora Rios de Souza; Filho, Edivaldo Ximenes Ferreira; Ricart, Carlos André Ornelas
2018-02-01
Filamentous fungal secretomes comprise highly dynamic sets of proteins, including multiple carbohydrate active enzymes (CAZymes) which are able to hydrolyze plant biomass polysaccharides into products of biotechnological interest such as fermentable sugars. In recent years, proteomics has been used to identify and quantify enzymatic and non-enzymatic polypeptides present in secretomes of several fungi species. The resulting data have widened the scientific understanding of the way filamentous fungi perform biomass degradation and offered novel perspectives for biotechnological applications. The present review discusses proteomics approaches that have been applied to the study of fungal secretomes, focusing on two of the most studied filamentous fungi genera: Trichoderma and Aspergillus. Copyright © 2017 Elsevier Inc. All rights reserved.
Kim, Dahan; Curthoys, Nikki M.; Parent, Matthew T.; Hess, Samuel T.
2015-01-01
Multi-colour localization microscopy has enabled sub-diffraction studies of colocalization between multiple biological species and quantification of their correlation at length scales previously inaccessible with conventional fluorescence microscopy. However, bleed-through, or misidentification of probe species, creates false colocalization and artificially increases certain types of correlation between two imaged species, affecting the reliability of information provided by colocalization and quantified correlation. Despite the potential risk of these artefacts of bleed-through, neither the effect of bleed-through on correlation nor methods of its correction in correlation analyses has been systematically studied at typical rates of bleed-through reported to affect multi-colour imaging. Here, we present a reliable method of bleed-through correction applicable to image rendering and correlation analysis of multi-colour localization microscopy. Application of our bleed-through correction shows our method accurately corrects the artificial increase in both types of correlations studied (Pearson coefficient and pair correlation), at all rates of bleed-through tested, in all types of correlations examined. In particular, anti-correlation could not be quantified without our bleed-through correction, even at rates of bleed-through as low as 2%. Demonstrated with dichroic-based multi-colour FPALM here, our presented method of bleed-through correction can be applied to all types of localization microscopy (PALM, STORM, dSTORM, GSDIM, etc.), including both simultaneous and sequential multi-colour modalities, provided the rate of bleed-through can be reliably determined. PMID:26185614
Kim, Dahan; Curthoys, Nikki M; Parent, Matthew T; Hess, Samuel T
2013-09-01
Multi-colour localization microscopy has enabled sub-diffraction studies of colocalization between multiple biological species and quantification of their correlation at length scales previously inaccessible with conventional fluorescence microscopy. However, bleed-through, or misidentification of probe species, creates false colocalization and artificially increases certain types of correlation between two imaged species, affecting the reliability of information provided by colocalization and quantified correlation. Despite the potential risk of these artefacts of bleed-through, neither the effect of bleed-through on correlation nor methods of its correction in correlation analyses has been systematically studied at typical rates of bleed-through reported to affect multi-colour imaging. Here, we present a reliable method of bleed-through correction applicable to image rendering and correlation analysis of multi-colour localization microscopy. Application of our bleed-through correction shows our method accurately corrects the artificial increase in both types of correlations studied (Pearson coefficient and pair correlation), at all rates of bleed-through tested, in all types of correlations examined. In particular, anti-correlation could not be quantified without our bleed-through correction, even at rates of bleed-through as low as 2%. Demonstrated with dichroic-based multi-colour FPALM here, our presented method of bleed-through correction can be applied to all types of localization microscopy (PALM, STORM, dSTORM, GSDIM, etc.), including both simultaneous and sequential multi-colour modalities, provided the rate of bleed-through can be reliably determined.
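At the level of localization counts, bleed-through can be pictured as a mixing matrix built from the misidentification rates, which can be inverted to recover the true counts. The sketch below shows only that count-level picture; it is a simplified illustration, not the authors' localization-by-localization correction used for the Pearson and pair-correlation analyses.

```python
import numpy as np

def unmix_counts(obs_a, obs_b, a_to_b, b_to_a):
    """Estimate true per-species localization counts from observed counts when a known
    fraction of each species is misidentified as the other (simple 2x2 unmixing).

    obs = M @ true, with M built from the bleed-through rates, so true = M^-1 @ obs.
    """
    M = np.array([[1.0 - a_to_b, b_to_a],
                  [a_to_b, 1.0 - b_to_a]])
    true_a, true_b = np.linalg.solve(M, np.array([obs_a, obs_b], float))
    return true_a, true_b

# Hypothetical example: 2% of species A localizations are misidentified as B.
print(unmix_counts(obs_a=9800, obs_b=10400, a_to_b=0.02, b_to_a=0.0))
```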
Quantifying Volcanic Emissions of Trace Elements to the Atmosphere: Ideas Based on Past Studies
NASA Astrophysics Data System (ADS)
Rose, W. I.
2003-12-01
Extensive data exist from volcanological and geochemical studies about exotic elemental enrichments in volcanic emissions to the atmosphere, but quantitative data are quite rare. Advanced, highly sensitive techniques of analysis are needed to detect low concentrations of some minor elements, especially during major eruptions. I will present data from studies done during low levels of activity (from incrustations and silica tube sublimates at high-temperature fumaroles, from SEM studies of particle samples collected in volcanic plumes and volcanic clouds, from geochemical analysis of volcanic gas condensates, and from analysis of treated particle and gas filter packs) and a much smaller number that could reflect explosive activity (from fresh ashfall leachate geochemistry, and from thermodynamic codes modeling volatile emissions from magma). These data describe a highly variable pattern of elemental enrichments which are difficult to quantify, generalize and understand. Sampling in a routine way is difficult, and work in active craters has heightened our awareness of danger, which appropriately inhibits some sampling. There are numerous localized enrichments of minor elements that can be documented, and others can be expected or inferred. There is a lack of systematic tools to measure minor element abundances in volcanic emissions. The careful combination of several methodologies listed above for the same volcanic vents can provide redundant data on multiple elements, which could lead to overall quantification of minor element fluxes, but there are challenging issues about detection. For quiescent plumes we can design combinations of measurements to quantify minor element emission rates. Developing a comparable methodology that succeeds in measuring minor element fluxes for significant eruptions will require new strategies and/or ideas.
A new improved graphical and quantitative method for detecting bias in meta-analysis.
Furuya-Kanamori, Luis; Barendregt, Jan J; Doi, Suhail A R
2018-04-04
Detection of publication and related biases remains suboptimal and threatens the validity and interpretation of meta-analytical findings. When bias is present, it usually affects small and large studies differentially, manifesting as an association between precision and effect size and therefore as visual asymmetry of conventional funnel plots. This asymmetry can be quantified, and Egger's regression is by far the most widely used statistical measure for quantifying funnel plot asymmetry. However, concerns have been raised about both the visual appearance of funnel plots and the sensitivity of Egger's regression to detect such asymmetry, particularly when the number of studies is small. In this article, we propose a new graphical method, the Doi plot, to visualize asymmetry, and a new measure, the LFK index, to detect and quantify asymmetry of study effects in Doi plots. We demonstrate that the visual representation of asymmetry is better for the Doi plot than for the funnel plot. The LFK index also had better diagnostic accuracy in discriminating asymmetry due to simulated publication bias from chance or no asymmetry, with areas under the receiver operating characteristic curve of 0.74-0.88 in simulations of meta-analyses with five, 10, 15, and 20 studies. Egger's regression had lower areas under the curve (0.58-0.75) across the same simulations. The LFK index also had higher sensitivity (71.3-72.1%) than Egger's regression (18.5-43.0%). We conclude that the methods proposed in this article can markedly improve the ability of researchers to detect bias in meta-analysis.
Eggeling, Thomas; Regitz-Zagrosek, Vera; Zimmermann, Andrea; Burkart, Martin
2011-11-15
The efficacy of quantified Crataegus extract in chronic heart failure (CHF) has been assessed in numerous clinical studies. The present pooled analysis evaluates the impact of baseline severity and gender on objective and patient-reported endpoints and associations between both types of outcomes in patients with early CHF. Available data from 687 individual patients treated with quantified Crataegus extract or placebo in ten studies were pooled. Treatment effects on physiologic outcome parameters and on symptoms were analysed for their association with baseline severity and gender. Changes in symptom scores were investigated with respect to their relation to physiologic outcome parameters. Results were compared with observations in a 3-year cohort study. The physiologic outcome parameters maximal workload (MWL), left ventricular ejection fraction (LVEF), and pressure-heart rate product increase (PHRPI) at 50 W ergometric exercise improved more in active-treatment than in placebo patients. The magnitude of improvement was independent of baseline for LVEF but increased with baseline severity for MWL and PHRPI. Typical symptoms such as reduced exercise tolerance, exertional dyspnea, weakness, fatigue, and palpitations improved more with active treatment and in patients with more severe baseline symptoms. A weak association between improvements in MWL, PHRPI, and symptoms could be demonstrated. Gender differences in treatment effects could be explained by baseline differences. Results of the pooled analysis are in agreement with observations in the cohort study. Crataegus extract treatment effects on physiologic outcomes and typical symptoms were modulated by baseline severity. Taking baseline differences into account, benefits were comparable in male and female patients with impaired exercise tolerance in early chronic heart failure. Copyright © 2011 Elsevier GmbH. All rights reserved.
IMPAIRED VERBAL COMPREHENSION OF QUANTIFIERS IN CORTICOBASAL SYNDROME
Troiani, Vanessa; Clark, Robin; Grossman, Murray
2011-01-01
Objective: Patients with Corticobasal Syndrome (CBS) have atrophy in posterior parietal cortex. This region of atrophy has been previously linked with their quantifier comprehension difficulty, but previous studies used visual stimuli, making it difficult to account for potential visuospatial deficits in CBS patients. The current study evaluated comprehension of generalized quantifiers using strictly verbal materials. Method: CBS patients, a brain-damaged control group (consisting of Alzheimer's disease and frontotemporal dementia), and age-matched controls participated in this study. We comparatively assessed familiar temporal, spatial, and monetary domains of verbal knowledge. Judgment accuracy was only evaluated in statements for which patients demonstrated accurate factual knowledge about the target domain. Results: We found that patients with CBS are significantly impaired in their ability to evaluate quantifiers compared with healthy seniors and a brain-damaged control group, even in this strictly verbal task. This impairment was seen in the vast majority of individual CBS patients. Conclusions: These findings offer additional evidence of quantifier impairment in CBS patients and emphasize that this impairment cannot be attributed to potential spatial processing impairments in patients with parietal disease. PMID:21381823
NASA Astrophysics Data System (ADS)
Eugenio Pappalardo, Salvatore; Ferrarese, Francesco; Tarolli, Paolo; Varotto, Mauro
2016-04-01
Traditional agricultural terraced landscapes presently embody an important cultural value worth investigating in depth, both for their role in local heritage and the cultural economy and for their potential geo-hydrological hazard due to abandonment and degradation. Moreover, traditional terraced landscapes are usually based on non-intensive agro-systems and may enhance important ecosystem services such as agro-biodiversity conservation and cultural services. Because of their unplanned genesis, mapping, quantifying, and classifying agricultural terraces at the regional scale is often difficult, as they are usually set on geomorphologically and historically complex landscapes. Traditional mapping methods are generally based on scientific literature and local documentation, historical and cadastral sources, technical cartography, visual interpretation of aerial images, or field surveys. Consequently, limitations and uncertainty in mapping at the regional scale are mainly related to forest cover and the lack of thematic cartography. The Veneto Region (NE Italy) presents a wide heterogeneity of agricultural terraced landscapes, mainly distributed within the hilly and Prealps areas. Previous studies based on traditional mapping methods quantified 2,688 ha of terraced areas, with the highest values within the Prealps of Lessinia (1,013 ha, within the Province of Verona) and in the Brenta Valley (421 ha, within the Province of Vicenza); however, the terraced features of these case studies show relevant differences in terms of fragmentation and terrace intensity, highlighting dissimilar degrees of clustering: 1.7 ha per terraced area in the Province of Verona versus 1.2 ha in the Province of Vicenza. The aim of this paper is to implement automatic methodologies and compare them with traditional survey methodologies to map and assess agricultural terraces in two representative areas of the Veneto Region. Testing different remote sensing analyses, such as LiDAR topographic survey and visual interpretation of aerial orthophotos (RGB+NIR bands), we performed a territorial analysis in the Lessinia and Brenta Valley case studies. Preliminary results show that terraced feature extraction from automatic LiDAR survey is more efficient both in identifying geometries (walls and terraced surfaces) and in quantifying features under the forest canopy; however, the traditional mapping methodology confirms its strength by combining different methods and data, such as aerial photos, visual interpretation, maps, and field surveys. Hence, the two methods compared here provide a cross-validation and allow us to better understand the complexity of this kind of landscape.
Simple Statistical Model to Quantify Maximum Expected EMC in Spacecraft and Avionics Boxes
NASA Technical Reports Server (NTRS)
Trout, Dawn H.; Bremner, Paul
2014-01-01
This study shows cumulative distribution function (CDF) comparisons of composite fairing electromagnetic field data obtained by computational electromagnetic 3D full-wave modeling and laboratory testing. Test and model data correlation is shown. In addition, this presentation shows application of the power balance method and its extension to predict the variance and maximum expected mean of the E-field data. This is valuable for large-scale evaluations of transmission inside cavities.
NASA Technical Reports Server (NTRS)
1991-01-01
The topics presented are covered in viewgraph form. Programmatic objectives are: (1) to improve characterization of the orbital debris environment; and (2) to provide a passive sensor test bed for debris collision detection systems. Technical objectives are: (1) to study LEO debris altitude, size and temperature distribution down to 1 mm particles; (2) to quantify ground based radar and optical data ambiguities; and (3) to optimize debris detection strategies.
Quantification and Visualization of Variation in Anatomical Trees
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amenta, Nina; Datar, Manasi; Dirksen, Asger
This paper presents two approaches to quantifying and visualizing variation in datasets of trees. The first approach localizes subtrees in which significant population differences are found through hypothesis testing and sparse classifiers on subtree features. The second approach visualizes the global metric structure of datasets through low-distortion embedding into hyperbolic planes in the style of multidimensional scaling. A case study is made on a dataset of airway trees in relation to Chronic Obstructive Pulmonary Disease.
Dynamic and rheological properties of soft biological cell suspensions
Yazdani, Alireza; Li, Xuejin
2016-01-01
Quantifying dynamic and rheological properties of suspensions of soft biological particles such as vesicles, capsules, and red blood cells (RBCs) is fundamentally important in computational biology and biomedical engineering. In this review, recent studies on dynamic and rheological behavior of soft biological cell suspensions by computer simulations are presented, considering both unbounded and confined shear flow. Furthermore, the hemodynamic and hemorheological characteristics of RBCs in diseases such as malaria and sickle cell anemia are highlighted. PMID:27540271
Sánchez-Guijo, Alberto; Oji, Vinzenz; Hartmann, Michaela F.; Traupe, Heiko; Wudy, Stefan A.
2015-01-01
Steroids are primarily present in human fluids in their sulfated forms. Profiling of these compounds is important from both diagnostic and physiological points of view. Here, we present a novel method for the quantification of 11 intact steroid sulfates in human serum by LC-MS/MS. The compounds analyzed in our method, some of which are quantified for the first time in blood, include cholesterol sulfate, pregnenolone sulfate, 17-hydroxy-pregnenolone sulfate, 16-α-hydroxy-dehydroepiandrosterone sulfate, dehydroepiandrosterone sulfate, androstenediol sulfate, androsterone sulfate, epiandrosterone sulfate, testosterone sulfate, epitestosterone sulfate, and dihydrotestosterone sulfate. The assay was conceived to quantify sulfated steroids in a broad range of concentrations, requiring only 300 μl of serum. The method has been validated and its performance was studied at three quality controls, selected for each compound according to its physiological concentration. The assay showed good linearity (R2 > 0.99) and recovery for all the compounds, with limits of quantification ranging between 1 and 80 ng/ml. Averaged intra-day and between-day precisions (coefficient of variation) and accuracies (relative errors) were below 10%. The method has been successfully applied to study the sulfated steroidome in diseases such as steroid sulfatase deficiency, proving its diagnostic value. This is, to our best knowledge, the most comprehensive method available for the quantification of sulfated steroids in human blood. PMID:26239050
Peng, Xian; Yuan, Han; Chen, Wufan; Ding, Lei
2017-01-01
Continuous loop averaging deconvolution (CLAD) is one of the proven methods for recovering transient auditory evoked potentials (AEPs) in rapid stimulation paradigms, which requires an elaborate stimulus sequence design to attenuate the impact of noise in the data. The present study aimed to develop a new metric for gauging a CLAD sequence in terms of the noise gain factor (NGF), which has been proposed previously but is less effective in the presence of pink (1/f) noise. We derived the new metric by explicitly introducing the 1/f model into the proposed time-continuous sequence. We selected several representative CLAD sequences to test their noise properties on typical electroencephalogram (EEG) recordings, as well as on five real CLAD EEG recordings, to retrieve the middle latency responses. We also demonstrated the merit of the new metric in generating and quantifying optimized sequences using a classic genetic algorithm. The new metric shows evident improvements in measuring actual noise gains at different frequencies and better performance than the original NGF in various respects. The new metric is a generalized NGF measurement that can better quantify the performance of a CLAD sequence and provide a more efficient means of generating CLAD sequences in combination with optimization algorithms. The present study can facilitate the application of the CLAD paradigm with desired sequences in the clinic. PMID:28414803
Modelling the effect of pyrethroid use intensity on mite population density for walnuts.
Zhan, Yu; Fan, Siqi; Zhang, Minghua; Zalom, Frank
2015-01-01
Published studies relating pyrethroid use and subsequent mite outbreaks have largely been based on laboratory and field experiments, with some inferring outbreaks from increased miticide use. The present study derived a mathematical model to quantify the effect of pyrethroid use intensity on mite population density. The model was validated against and parameterized with actual field-level pyrethroid and miticide use data from 1995 to 2009 for California walnuts, where miticide use intensity served as a proxy for mite population density. The parameterized model was MI = 1.61 - 0.89·exp(-93.31·PI) (RMSE = 0.13; R(2) = 0.69; P < 0.01), where PI and MI are the average pyrethroid and miticide use intensities in small intervals, respectively. A three-range scheme was presented to quantify pesticide applications based on the rate of change of MI with respect to PI. Specifically for California walnuts, the PI range of 0-0.025 kg ha(-1) was identified as the rapidly increasing range, in which MI increased steeply as PI increased. Results confirmed that more miticide was used, presumably to prevent or control mite resurgence when pyrethroids were applied, a practice that is not only costly but might be expected to aggravate mite resistance to miticides and increase the risk associated with these chemicals to the environment and human health. © 2014 Society of Chemical Industry.
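As a quick check of the fitted relationship reported above, the Python snippet below evaluates MI = 1.61 - 0.89·exp(-93.31·PI) and its derivative, which makes the rapidly increasing range at low PI (roughly 0-0.025 kg/ha) visible. The coefficients are taken directly from the abstract; everything else is illustrative.

```python
import numpy as np

def miticide_intensity(pi):
    """Fitted model from the abstract: MI = 1.61 - 0.89 * exp(-93.31 * PI)."""
    return 1.61 - 0.89 * np.exp(-93.31 * pi)

def change_rate(pi):
    """dMI/dPI, the change rate used to delimit the three ranges of use intensity."""
    return 0.89 * 93.31 * np.exp(-93.31 * pi)

for pi in [0.0, 0.0125, 0.025, 0.05, 0.1]:  # pyrethroid use intensity, kg/ha
    print(f"PI={pi:.4f}  MI={miticide_intensity(pi):.2f}  dMI/dPI={change_rate(pi):.1f}")
```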
Quantifying plant colour and colour difference as perceived by humans using digital images.
Kendal, Dave; Hauser, Cindy E; Garrard, Georgia E; Jellinek, Sacha; Giljohann, Katherine M; Moore, Joslin L
2013-01-01
Human perception of plant leaf and flower colour can influence species management. Colour and colour contrast may influence the detectability of invasive or rare species during surveys. Quantitative, repeatable measures of plant colour are required for comparison across studies and generalisation across species. We present a standard method for measuring plant leaf and flower colour traits using images taken with digital cameras. We demonstrate the method by quantifying the colour of and colour difference between the flowers of eleven grassland species near Falls Creek, Australia, as part of an invasive species detection experiment. The reliability of the method was tested by measuring the leaf colour of five residential garden shrub species in Ballarat, Australia using five different types of digital camera. Flowers and leaves had overlapping but distinct colour distributions. Calculated colour differences corresponded well with qualitative comparisons. Estimates of proportional cover of yellow flowers identified using colour measurements correlated well with estimates obtained by measuring and counting individual flowers. Digital SLR and mirrorless cameras were superior to phone cameras and point-and-shoot cameras for producing reliable measurements, particularly under variable lighting conditions. The analysis of digital images taken with digital cameras is a practicable method for quantifying plant flower and leaf colour in the field or lab. Quantitative, repeatable measurements allow for comparisons between species and generalisations across species and studies. This allows plant colour to be related to human perception and preferences and, ultimately, species management.
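The abstract does not state which colour space or difference metric was used; a common way to obtain repeatable colour-difference measurements from digital photographs is to average pixel colours in CIELAB and take a delta-E distance. The sketch below does this with scikit-image; the region masks, variable names, and the choice of the CIE76 metric are assumptions, not the authors' exact procedure.

```python
import numpy as np
from skimage import color

def mean_lab(rgb_image, mask=None):
    """Mean CIELAB colour of the (optionally masked) pixels of an RGB image."""
    lab = color.rgb2lab(rgb_image)
    if mask is not None:
        lab = lab[mask]
    return lab.reshape(-1, 3).mean(axis=0)

def colour_difference(rgb_region_a, rgb_region_b):
    """CIE76 delta-E between the mean colours of two image regions
    (e.g., a flower crop and a background crop)."""
    return color.deltaE_cie76(mean_lab(rgb_region_a), mean_lab(rgb_region_b))
```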
Quantifying collagen orientation in breast tissue biopsies using SLIM (Conference Presentation)
NASA Astrophysics Data System (ADS)
Majeed, Hassaan; Okoro, Chukwuemeka; Balla, Andre; Toussaint, Kimani C.; Popescu, Gabriel
2017-02-01
Breast cancer is a major public health problem worldwide, being the most common type of cancer among women according to the World Health Organization (WHO). The WHO has further stressed the importance of an early determination of the disease course through prognostic markers. Recent studies have shown that the alignment of collagen fibers in tumor-adjacent stroma correlates with poorer health outcomes in patients. Such studies have typically been carried out using second-harmonic generation (SHG) microscopy. SHG images are very useful for quantifying collagen fiber orientation due to their specificity to non-centrosymmetric structures in tissue, leading to high contrast in collagen-rich areas. However, the imaging throughput of SHG microscopy is limited by its point-scanning geometry. In this work, we show that SLIM, a wide-field, high-throughput quantitative phase imaging (QPI) technique, can be used to obtain the same information on collagen fiber orientation as is obtainable through SHG microscopy. We imaged a tissue microarray containing both benign and malignant cores using both SHG microscopy and SLIM. The cellular (non-collagenous) structures in the SLIM images were then segmented out using an algorithm developed in-house. Using the previously published Fourier transform second-harmonic generation (FT-SHG) tool, the fiber orientations in SHG and segmented SLIM images were then quantified. The resulting histograms of fiber orientation angles showed that both SHG and SLIM generate similar measurements of collagen fiber orientation. The SLIM modality, however, can generate these results at much higher throughput due to its wide-field, whole-slide scanning capabilities.
Quantifying the Availability of Vertebrate Hosts to Ticks: A Camera-Trapping Approach
Hofmeester, Tim R.; Rowcliffe, J. Marcus; Jansen, Patrick A.
2017-01-01
The availability of vertebrate hosts is a major determinant of the occurrence of ticks and tick-borne zoonoses in natural and anthropogenic ecosystems and thus drives disease risk for wildlife, livestock, and humans. However, it remains challenging to quantify the availability of vertebrate hosts in field settings, particularly for medium-sized to large-bodied mammals. Here, we present a method that uses camera traps to quantify the availability of warm-bodied vertebrates to ticks. The approach is to deploy camera traps at questing height at a representative sample of random points across the study area, measure the average photographic capture rate for vertebrate species, and then correct these rates for the effective detection distance. The resulting “passage rate” is a standardized measure of the frequency at which vertebrates approach questing ticks, which we show is proportional to contact rate. A field test across twenty 1-ha forest plots in the Netherlands indicated that this method effectively captures differences in wildlife assemblage composition between sites. Also, the relative abundances of three life stages of the sheep tick Ixodes ricinus from drag sampling were correlated with passage rates of deer, which agrees with the known association with this group of host species, suggesting that passage rate effectively reflects the availability of medium- to large-sized hosts to ticks. This method will facilitate quantitative studies of the relationship between densities of questing ticks and the availability of different vertebrate species—wild as well as domesticated species—in natural and anthropogenic settings. PMID:28770219
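The correction for effective detection distance is described only qualitatively in the abstract; the sketch below shows one plausible way to turn raw camera-trap detections into a standardized passage rate. The exact form of the correction and all names are assumptions rather than the published estimator.

```python
def passage_rate(n_detections, camera_days, effective_detection_distance_m):
    """Standardized passage rate: detections per camera-day, divided by the
    effective detection distance so that species visible from farther away
    are not over-counted. A rough stand-in for the published correction."""
    capture_rate = n_detections / camera_days  # detections per camera-day
    return capture_rate / effective_detection_distance_m

# Hypothetical example: 42 deer passes over 300 camera-days, EDD of 8 m
print(passage_rate(42, 300, 8.0))
```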
NASA Astrophysics Data System (ADS)
Letcher, T.; Minder, J. R.
2015-12-01
High-resolution regional climate models are used to characterize and quantify the snow albedo feedback (SAF) over the complex terrain of the Colorado Headwaters region. Three pairs of 7-year control and pseudo-global-warming simulations (with horizontal grid spacings of 4, 12, and 36 km) are used to study how the SAF modifies the regional climate response to a large-scale thermodynamic perturbation. The SAF substantially enhances warming within the Headwaters domain, locally by as much as 5 °C in regions of snow loss. The SAF also increases the inter-annual variability of the springtime warming within the Headwaters domain under the perturbed climate. Linear feedback analysis is used to quantify the strength of the SAF. The SAF attains a maximum value of 4 W m-2 K-1 during April, when snow loss coincides with strong incoming solar radiation. On sub-seasonal timescales, simulations at 4 km and 12 km horizontal grid spacing show good agreement in the strength and timing of the SAF, whereas the 36 km simulation shows greater discrepancies that are tied to differences in snow accumulation and ablation caused by smoother terrain. An analysis of the regional energy budget shows that transport by atmospheric motion acts as a negative feedback to regional warming, damping the effects of the SAF. On the mesoscale, this transport causes non-local warming in locations with no snow. The methods presented here can be used generally to quantify the role of the SAF in other regional climate modeling experiments.
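The abstract reports SAF strength in W m-2 K-1 from a linear feedback analysis; below is a minimal sketch of such an estimate, assuming the feedback is approximated as the change in absorbed shortwave radiation due to the albedo change per degree of regional warming. The paper's exact decomposition may differ, and the example values are hypothetical.

```python
def saf_strength(delta_albedo, delta_t_surface, incoming_sw):
    """Linear snow-albedo feedback estimate in W m-2 K-1: change in absorbed
    shortwave per degree of regional warming. A negative albedo change
    (snow loss) under positive insolation yields a positive feedback."""
    return -incoming_sw * (delta_albedo / delta_t_surface)

# Example: albedo drops by 0.10 for 3 K of warming under 300 W m-2 insolation
print(saf_strength(delta_albedo=-0.10, delta_t_surface=3.0, incoming_sw=300.0))  # 10 W m-2 K-1
```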
Area under the curve as a novel metric of behavioral economic demand for alcohol.
Amlung, Michael; Yurasek, Ali; McCarty, Kayleigh N; MacKillop, James; Murphy, James G
2015-06-01
Behavioral economic purchase tasks can be readily used to assess demand for a number of addictive substances, including alcohol, tobacco, and illicit drugs. However, several methodological limitations associated with the techniques used to quantify demand may reduce the utility of demand measures. In the present study, we sought to introduce area under the curve (AUC), commonly used to quantify degree of delay discounting, as a novel index of demand. A sample of 207 heavy-drinking college students completed a standard alcohol purchase task and provided information about typical weekly drinking patterns and alcohol-related problems. Level of alcohol demand was quantified using AUC--which reflects the entire amount of consumption across all drink prices--as well as the standard demand indices (e.g., intensity, breakpoint, Omax, Pmax, and elasticity). Results indicated that AUC was significantly correlated with each of the other demand indices (rs = .42-.92), with a particularly strong association with Omax (r = .92). In regression models, AUC and intensity were significant predictors of weekly drinking quantity, and AUC uniquely predicted alcohol-related problems, even after controlling for drinking level. In a parallel set of analyses, Omax also predicted drinking quantity and alcohol problems, although Omax was not a unique predictor of the latter. These results offer initial support for using AUC as an index of alcohol demand. Additional research is necessary to further validate this approach and to examine its utility in quantifying demand for other addictive substances such as tobacco and illicit drugs. (c) 2015 APA, all rights reserved.
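By analogy with the AUC used in delay discounting, one plausible implementation normalizes both axes of the demand curve and integrates with the trapezoidal rule. The sketch below follows that convention; the example data and normalization choices are assumptions rather than the authors' exact procedure.

```python
import numpy as np

def demand_auc(prices, consumption):
    """Area under the demand curve with both axes normalized to [0, 1],
    analogous to the AUC used for delay discounting. Assumes prices are
    sorted ascending and consumption at the lowest price is non-zero."""
    prices = np.asarray(prices, dtype=float)
    consumption = np.asarray(consumption, dtype=float)
    x = prices / prices.max()
    y = consumption / consumption[0]  # normalize to consumption at the lowest price
    return np.trapz(y, x)

# Hypothetical alcohol purchase task data: drinks consumed at each price (USD)
prices = [0, 0.5, 1, 2, 4, 8, 16]
drinks = [10, 9, 8, 6, 4, 2, 0]
print(demand_auc(prices, drinks))
```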
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chong, Irene; Hawkins, Maria; Hansen, Vibeke
2011-11-15
Purpose: There have been no previously published data on the quantification of rectal motion using cone-beam computed tomography (CBCT) during standard conformal long-course chemoradiotherapy. The purpose of the present study was to quantify the interfractional changes in rectal movement and dimensions and in rectal and bladder volume using CBCT, and to quantify the bony anatomy displacements to calculate the margins required to account for systematic (Σ) and random (σ) setup errors. Methods and Materials: CBCT images were acquired from 16 patients on the first 3 days of treatment and weekly thereafter. The rectum and bladder were outlined on all CBCT images. The interfraction movement was measured using fixed bony landmarks as references to define the rectal location (upper, mid, and low). The maximal rectal diameter at the three rectal locations was also measured. The bony anatomy displacements were quantified, allowing the calculation of systematic (Σ) and random (σ) setup errors. Results: A total of 123 CBCT data sets were analyzed. Analysis of variance for standard deviation from the planning scans showed that rectal anterior and lateral wall movement differed significantly by rectal location. Anterior and lateral rectal wall movements were larger in the mid and upper rectum than in the low rectum. The posterior rectal wall movement did not change significantly with rectal location. The rectal diameter changed more in the mid and upper than in the low rectum. No consistent relationship was found between the rectal and bladder volume and time, nor was a significant relationship found between the rectal volume and bladder volume. Conclusions: In the present study, the anterior and lateral rectal movement and rectal diameter were found to change most in the upper rectum, followed by the mid rectum, with the smallest changes seen in the low rectum. Asymmetric margins are warranted to ensure phase 2 coverage.
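The abstract quantifies systematic (Σ) and random (σ) setup errors in order to calculate margins but does not say which recipe was applied; the most commonly used CTV-to-PTV recipe is the van Herk margin, sketched below for illustration. The 2.5Σ + 0.7σ formula is standard in the radiotherapy literature; the example values are hypothetical and not taken from this study.

```python
def van_herk_margin(sigma_systematic_mm, sigma_random_mm):
    """Commonly used CTV-to-PTV margin recipe (van Herk):
    margin = 2.5 * Sigma + 0.7 * sigma, per axis, in mm.
    The abstract does not state which recipe the authors used."""
    return 2.5 * sigma_systematic_mm + 0.7 * sigma_random_mm

print(van_herk_margin(2.0, 3.0))  # hypothetical Sigma=2 mm, sigma=3 mm -> 7.1 mm
```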
Gillet, Ludovic C.; Navarro, Pedro; Tate, Stephen; Röst, Hannes; Selevsek, Nathalie; Reiter, Lukas; Bonner, Ron; Aebersold, Ruedi
2012-01-01
Most proteomic studies use liquid chromatography coupled to tandem mass spectrometry to identify and quantify the peptides generated by the proteolysis of a biological sample. However, with the current methods it remains challenging to rapidly, consistently, reproducibly, accurately, and sensitively detect and quantify large fractions of proteomes across multiple samples. Here we present a new strategy that systematically queries sample sets for the presence and quantity of essentially any protein of interest. It consists of using the information available in fragment ion spectral libraries to mine the complete fragment ion maps generated using a data-independent acquisition method. For this study, the data were acquired on a fast, high resolution quadrupole-quadrupole time-of-flight (TOF) instrument by repeatedly cycling through 32 consecutive 25-Da precursor isolation windows (swaths). This SWATH MS acquisition setup generates, in a single sample injection, time-resolved fragment ion spectra for all the analytes detectable within the 400–1200 m/z precursor range and the user-defined retention time window. We show that suitable combinations of fragment ions extracted from these data sets are sufficiently specific to confidently identify query peptides over a dynamic range of 4 orders of magnitude, even if the precursors of the queried peptides are not detectable in the survey scans. We also show that queried peptides are quantified with a consistency and accuracy comparable with that of selected reaction monitoring, the gold standard proteomic quantification method. Moreover, targeted data extraction enables ad libitum quantification refinement and dynamic extension of protein probing by iterative re-mining of the once-and-forever acquired data sets. This combination of unbiased, broad range precursor ion fragmentation and targeted data extraction alleviates most constraints of present proteomic methods and should be equally applicable to the comprehensive analysis of other classes of analytes, beyond proteomics. PMID:22261725
NASA Astrophysics Data System (ADS)
Zhang, J.; Ives, A. R.; Turner, M. G.; Kucharik, C. J.
2017-12-01
Previous studies have identified global agricultural regions where "stagnation" of long-term crop yield increases has occurred. These studies have used a variety of simple statistical methods that often ignore important aspects of time series regression modeling. These methods can lead to differing and contradictory results, which creates uncertainty regarding food security given rapid global population growth. Here, we present a new statistical framework incorporating time series-based algorithms into standard regression models to quantify spatiotemporal yield trends of US maize, soybean, and winter wheat from 1970 to 2016. Our primary goal was to quantify spatial differences in yield trends for these three crops using USDA county-level data. This information was used to identify regions experiencing the largest changes in the rate of yield increase over time, and to determine whether abrupt shifts in the rate of yield increase have occurred. Although crop yields continue to increase in most maize-, soybean-, and winter wheat-growing areas, yield increases have stagnated in some key agricultural regions during the most recent 15 to 16 years: some maize-growing areas, except for the northern Great Plains, have shown a significant trend towards smaller annual yield increases; soybean has maintained consistent long-term yield gains in the northern Great Plains, the Midwest, and the southeast US, but has shifted to smaller annual increases in other regions; and winter wheat has maintained a moderate annual increase in eastern South Dakota and eastern US locations, but shows a decline in the magnitude of annual increases across the central Great Plains and western US regions. Our results suggest that there have been abrupt shifts in the rate of annual yield increase in a variety of US regions for all three crops. The framework presented here can be broadly applied to additional yield trend analyses for different crops and regions of the Earth.
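A crude way to see whether the rate of yield increase has shifted is to compare the linear trend before and after a candidate breakpoint year; the sketch below does this with ordinary least squares and is only a simplified stand-in for the time-series framework described in the abstract (all names and the breakpoint choice are assumptions).

```python
import numpy as np

def slopes_around_breakpoint(years, yields, break_year):
    """OLS trend (yield units per year) before and after a candidate breakpoint,
    e.g., to compare the most recent ~15 years against the earlier record."""
    years = np.asarray(years, dtype=float)
    yields = np.asarray(yields, dtype=float)
    pre = years <= break_year
    slope_pre = np.polyfit(years[pre], yields[pre], 1)[0]
    slope_post = np.polyfit(years[~pre], yields[~pre], 1)[0]
    return slope_pre, slope_post
```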
NASA Technical Reports Server (NTRS)
Kahn, Ralph A.
2015-01-01
The organizers of the National Academy of Sciences Arthur M. Sackler Colloquia Series on Improving Our Fundamental Understanding of the Role of Aerosol-Cloud Interactions in the Climate System would like to post Ralph Kahn's presentation entitled Remote Sensing of Aerosols from Satellites: Why has it been so difficult to quantify aerosol-cloud interactions for climate assessment, and how can we make progress? to their public website.
Branciard, Cyril; Gisin, Nicolas
2011-07-08
The simulation of quantum correlations with finite nonlocal resources, such as classical communication, gives a natural way to quantify their nonlocality. While multipartite nonlocal correlations appear to be useful resources, very little is known on how to simulate multipartite quantum correlations. We present a protocol that reproduces tripartite Greenberger-Horne-Zeilinger correlations with bounded communication: 3 bits in total turn out to be sufficient to simulate all equatorial Von Neumann measurements on the tripartite Greenberger-Horne-Zeilinger state.
Thomas, Kevin V; Amador, Arturo; Baz-Lomba, Jose Antonio; Reid, Malcolm
2017-10-03
Wastewater-based epidemiology is an established approach for quantifying community drug use and has recently been applied to estimate population exposure to contaminants such as pesticides and phthalate plasticizers. A major source of uncertainty in the population-weighted biomarker loads generated is estimating the number of people present in a sewer catchment at the time of sample collection. Here, the population quantified from mobile-device-based population activity patterns was used to provide dynamic population-normalized loads of illicit drugs and pharmaceuticals during a known period of high net fluctuation in the catchment population. Mobile-device-based population activity patterns have, for the first time, quantified the high degree of intraday, intraweek, and intramonth variability within a specific sewer catchment. Dynamic population normalization showed that per capita pharmaceutical use remained unchanged during a period when static normalization would have indicated an average reduction of up to 31%. Per capita illicit drug use increased significantly during the monitoring period, an observation that was only possible with dynamic population normalization. The study quantitatively confirms previous assessments that population estimates can account for uncertainties of up to 55% in statically normalized data. Mobile-device-based population activity patterns allow for dynamic normalization that yields much improved temporal and spatial trend analysis.
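The contrast between static and dynamic normalization reduces to which population estimate divides the daily biomarker mass load; a minimal sketch, with hypothetical numbers, is shown below.

```python
import numpy as np

def per_capita_load(daily_mass_load_mg, population):
    """Population-normalized biomarker load, in mg per day per 1000 inhabitants."""
    return 1000.0 * np.asarray(daily_mass_load_mg, float) / np.asarray(population, float)

daily_load = [520.0, 610.0, 480.0]   # mg/day of a biomarker (hypothetical)
static_pop = [50000, 50000, 50000]   # fixed census-based catchment population
dynamic_pop = [50000, 72000, 41000]  # mobile-device-derived daily population

print(per_capita_load(daily_load, static_pop))   # static normalization
print(per_capita_load(daily_load, dynamic_pop))  # dynamic normalization
```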
NASA Astrophysics Data System (ADS)
Kulkarni, Sandip; Ramaswamy, Bharath; Horton, Emily; Gangapuram, Sruthi; Nacev, Alek; Depireux, Didier; Shimoji, Mika; Shapiro, Benjamin
2015-11-01
This article presents a method to investigate how magnetic particle characteristics affect their motion inside tissues under the influence of an applied magnetic field. Particles are placed on top of freshly excised tissue samples, a calibrated magnetic field is applied by a magnet underneath each tissue sample, and we image and quantify particle penetration depth by quantitative metrics to assess how particle sizes, their surface coatings, and tissue resistance affect particle motion. Using this method, we tested available fluorescent particles from Chemicell of four sizes (100 nm, 300 nm, 500 nm, and 1 μm diameter) with four different coatings (starch, chitosan, lipid, and PEG/P) and quantified their motion through freshly excised rat liver, kidney, and brain tissues. In broad terms, we found that the applied magnetic field moved chitosan particles most effectively through all three tissue types (as compared to starch, lipid, and PEG/P coated particles). However, the relationship between particle properties and their resulting motion was found to be complex. Hence, it will likely require substantial further study to elucidate the nuances of transport mechanisms and to select and engineer optimal particle properties to enable the most effective transport through various tissue types under applied magnetic fields.
Integrated Information Increases with Fitness in the Evolution of Animats
Edlund, Jeffrey A.; Chaumont, Nicolas; Hintze, Arend; Koch, Christof; Tononi, Giulio; Adami, Christoph
2011-01-01
One of the hallmarks of biological organisms is their ability to integrate disparate information sources to optimize their behavior in complex environments. How this capability can be quantified and related to the functional complexity of an organism remains a challenging problem, in particular since organismal functional complexity is not well-defined. We present here several candidate measures that quantify information and integration, and study their dependence on fitness as an artificial agent (“animat”) evolves over thousands of generations to solve a navigation task in a simple, simulated environment. We compare the ability of these measures to predict high fitness with more conventional information-theoretic processing measures. As the animat adapts by increasing its “fit” to the world, information integration and processing increase commensurately along the evolutionary line of descent. We suggest that the correlation of fitness with information integration and with processing measures implies that high fitness requires both information processing as well as integration, but that information integration may be a better measure when the task requires memory. A correlation of measures of information integration (but also information processing) and fitness strongly suggests that these measures reflect the functional complexity of the animat, and that such measures can be used to quantify functional complexity even in the absence of fitness data. PMID:22028639
NASA Astrophysics Data System (ADS)
Garden, Christopher J.; Craw, Dave; Waters, Jonathan M.; Smith, Abigail
2011-12-01
Tracking and quantifying biological dispersal presents a major challenge in marine systems. Most existing methods for measuring dispersal are limited by poor resolution and/or high cost. Here we use geological data to quantify the frequency of long-distance dispersal in detached bull-kelp (Phaeophyceae: Durvillaea) in southern New Zealand. Geological resolution in this region is enhanced by the presence of a number of distinct and readily-identifiable geological terranes. We sampled 13,815 beach-cast bull-kelp plants across 130 km of coastline. Rocks were found attached to 2639 of the rafted plants, and were assigned to specific geological terranes (source regions) to quantify dispersal frequencies and distances. Although the majority of kelp-associated rock specimens were found to be locally-derived, a substantial number (4%) showed clear geological evidence of long-distance dispersal, several having travelled over 200 km from their original source regions. The proportion of local versus foreign clasts varied considerably between regions. While short-range dispersal clearly predominates, long-distance travel of detached bull-kelp plants is shown to be a common and ongoing process that has potential to connect isolated coastal populations. Geological analyses represent a cost-effective and powerful method for assigning large numbers of drifted macroalgae to their original source regions.
Using ultrasound to quantify tongue shape and movement characteristics.
Zharkova, Natalia
2013-01-01
Objective: Previous experimental studies have demonstrated abnormal lingual articulatory patterns characterizing cleft palate speech. Most articulatory information to date has been collected using electropalatography, which records the location and size of tongue-palate contact but not the tongue shape. The latter type of data can be provided by ultrasound. The present paper aims to describe ultrasound tongue imaging as a potential tool for quantitative analysis of tongue function in speakers with cleft palate. A description of the ultrasound technique as applied to analyzing tongue movements is given, followed by the requirements for quantitative analysis. Several measures are described, and example calculations are provided. Measures: Two measures aim to quantify overuse of the tongue dorsum in cleft palate articulations. Crucially for potential clinical applications, these measures do not require head-to-transducer stabilization because both are based on a single tongue curve. The other three measures compare sets of tongue curves, with the aim of quantifying the dynamics of tongue displacement, token-to-token variability in tongue position, and the extent of separation between tongue curves for different speech sounds. Conclusions: All measures can be used to compare tongue function in speakers with cleft palate before and after therapy, as well as to assess their performance against that of typical speakers and to help in selecting more effective treatments.
Nematodes enhance plant growth and nutrient uptake under C and N-rich conditions.
Gebremikael, Mesfin T; Steel, Hanne; Buchan, David; Bert, Wim; De Neve, Stefaan
2016-09-08
The role of soil fauna in crucial ecosystem services such as nutrient cycling remains poorly quantified, mainly because of the overly reductionistic approach adopted in most experimental studies. Given that increasing nitrogen inputs in various ecosystems influence the structure and functioning of soil microbes and the activity of fauna, we aimed to quantify the role of the entire soil nematode community in nutrient mineralization in an experimental set-up emulating nutrient-rich field conditions and accounting for crucial interactions amongst the soil microbial communities and plants. To this end, we reconstructed a complex soil foodweb in mesocosms that comprised largely undisturbed native microflora and the entire nematode community added into defaunated soil, planted with Lolium perenne as a model plant, and amended with fresh grass-clover residues. We determined N and P availability and plant uptake, plant biomass and abundance and structure of the microbial and nematode communities during a three-month incubation. The presence of nematodes significantly increased plant biomass production (+9%), net N (+25%) and net P (+23%) availability compared to their absence, demonstrating that nematodes link below- and above-ground processes, primarily through increasing nutrient availability. The experimental set-up presented allows to realistically quantify the crucial ecosystem services provided by the soil biota.
Hirn, Ulrich; Schennach, Robert
2015-01-01
The process of papermaking requires substantial amounts of energy and wood, which contributes to its environmental costs. In order to optimize papermaking for its many applications in materials science and engineering, a quantitative understanding of the bonding forces between individual pulp fibers is important. Here we show the first approach to quantify the bonding energies contributed by the individual bonding mechanisms. We calculated the contribution to the bonding energy of the following mechanisms relevant to paper formation: mechanical interlocking, interdiffusion, capillary bridges, hydrogen bonding, van der Waals forces, and Coulomb forces. Experimental results quantify the area in molecular contact necessary for bonding. Atomic force microscopy experiments establish the contribution of mechanical interlocking. Capillary bridges also contribute to the bond. A model based on the crystal structure of cellulose leads to values for the chemical bonds. In contrast to the general belief favoring hydrogen bonding, van der Waals bonds play the most important role according to our model. Comparison with experimentally derived bond energies supports the presented model. This study characterizes bond formation between pulp fibers, leading to insight that could potentially be used to optimize the papermaking process while reducing energy and wood consumption. PMID:26000898
Harvey, Judson W.; Jackson, J.M.; Mooney, R.H.; Choi, Jungyill
2000-01-01
The data presented in this report are products of an investigation that quantified interactions between ground water and surface water in Taylor Slough in Everglades National Park. Determining the extent of hydrologic interactions between wetland surface water and ground water in Taylor Slough is important because the balance of freshwater flow in the lower part of the Slough is uncertain. Although freshwater flows through Taylor Slough are quite small in comparison with Shark Slough (the larger of the two major sloughs in Everglades National Park), flows through Taylor Slough are especially important to the ecology of estuarine mangrove embayments of northeastern Florida Bay. Also, wetland and ground-water interactions must be quantified if their role in affecting water quality is to be determined. In order to define basic hydrologic characteristics of the wetland, the depth of wetland peat was mapped, and hydraulic conductivity and vertical hydraulic gradients in peat were determined. During specific time periods representing both wet and dry conditions in the area, the distribution of major ions, nutrients, and water stable isotopes throughout the slough was determined. The purpose of the chemical measurements was to identify an environmental tracer that could be used to quantify ground-water discharge.
Stefănescu, Lucrina; Robu, Brînduşa Mihaela; Ozunu, Alexandru
2013-11-01
The environmental impact assessment of mining sites is currently a topic of great interest in Romania. Historical pollution in the Rosia Montana mining area of Romania has caused extensive damage to environmental media. This paper has two goals: to investigate the environmental pollution induced by mining activities in the Rosia Montana area and to quantify the environmental impacts and associated risks by means of an integrated approach. Thus, a new method was developed and applied to quantify the impact of mining activities, taking account of the quality of environmental media in the mining area, which is used as a case study in the present paper. The associated risks are a function of the environmental impacts and the probability of their occurrence. The results show that the environmental impacts and quantified risks, based on quality indicators characterizing environmental quality, are high, and thus measures for pollution remediation and control need to be considered in the investigated area. The conclusion drawn is that an integrated approach to the assessment of environmental impact and associated risks is a valuable and more objective method, and an important tool that can be applied in the decision-making process by national authorities in the prioritization of emergency actions.
Mikos, Antonios G.; Jansen, John A.; Shroyer, Kenneth R.; Wang, Lihong V.; Sitharaman, Balaji
2012-01-01
Aims: In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (µg)] of single-walled carbon nanotubes (SWCNTs) in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Materials and Methods: Optical-resolution (OR) and acoustic-resolution (AR) PAM were employed to detect, map, and quantify the SWCNTs in a variety of tissue histological specimens and compared with other optical techniques (bright-field optical microscopy, Raman microscopy, and near-infrared (NIR) fluorescence microscopy). Results: Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. Conclusions: The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and it could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs. PMID:22496892
Cerqueira-Silva, Carlos Bernard M.; Jesus, Onildo N.; Santos, Elisa S. L.; Corrêa, Ronan X.; Souza, Anete P.
2014-01-01
Despite the ecological and economic importance of passion fruit (Passiflora spp.), molecular markers have only recently been utilized in genetic studies of this genus. In addition, both basic genetic researches related to population studies and pre-breeding programs of passion fruit remain scarce for most Passiflora species. Considering the number of Passiflora species and the increasing use of these species as a resource for ornamental, medicinal, and food purposes, the aims of this review are the following: (i) to present the current condition of the passion fruit crop; (ii) to quantify the applications and effects of using molecular markers in studies of Passiflora; (iii) to present the contributions of genetic engineering for passion fruit culture; and (iv) to discuss the progress and perspectives of this research. Thus, the present review aims to summarize and discuss the relationship between historical and current progress on the culture, breeding, and molecular genetics of passion fruit. PMID:25196515
Hydrology, water quality, and phosphorus loading of Little St Germain Lake, Vilas County, Wisconsin
Robertson, Dale M.; Rose, William J.
2000-01-01
The lake was monitored in detail again during 1991-94 by the U.S. Geological Survey (USGS) as part of a cooperative study with the Lake District. This study demonstrated water-quality variation among the basins of Little St. Germain Lake and extensive areas of winter anoxia (absence of oxygen). Further in-depth studies were then conducted during 1994-2000 to define the extent of winter anoxia, refine the hydrologic and phosphorus budgets of the lake, quantify the effects of annual drawdowns, and provide information needed to develop a comprehensive lake-management plan. This report presents the results of the studies since 1991.
Value of Earth Observation for Risk Mitigation
NASA Astrophysics Data System (ADS)
Pearlman, F.; Shapiro, C. D.; Grasso, M.; Pearlman, J.; Adkins, J. E.; Pindilli, E.; Geppi, D.
2017-12-01
Societal benefits flowing from Earth observation are intuitively obvious, as we use the information to assess natural hazards (such as storm tracks), water resources (such as flooding and droughts in coastal and riverine systems), ecosystem vitality, and other dynamics that affect the health and economic well-being of our population. The most powerful confirmation of these benefits would come from quantifying the impact and showing direct quantitative links in the value chain from data to decisions. However, our ability to identify and quantify those benefits is challenging. The impact of geospatial data on these types of decisions is not well characterized, and assigning a true value to the observations on a broad scale across disciplines still remains to be done in a systematic way. This presentation provides the outcomes of a workshop held in October 2017 as a side event of the GEO Plenary that addressed research on economic methodologies for the quantification of impacts. To achieve practical outputs during the meeting, the workshop focused on the use and value of Earth observations in risk mitigation, including ecosystem impacts, weather events, and other natural and manmade hazards. Case studies on approaches were discussed and will be part of this presentation. The presentation will also include the exchange of lessons learned and a discussion of gaps in the current understanding of the use and value of Earth observation information for risk mitigation.
Coelho, F F; Martins, R P; Figueira, J E C; Demetrio, G R
2014-11-01
In this study, we hypothesized that the life history traits of Leiothrix spiralis and L. vivipara would be linked to soil factors of the rupestrian grasslands and that rosette size would be influenced by soil moisture. Soil analyses were performed for five populations of L. spiralis and four populations of L. vivipara. Three replicates were employed in each of the 19 areas of occurrence of Leiothrix species, and we quantified the life history attributes. The microhabitats of these species show low favorability with regard to soil factors. During the dry season, their rosettes decreased in diameter due to the loss of their outermost leaves. The absence of seedlings indicated the low fecundity of both species. However, both species showed rapid population growth by pseudovivipary. Both L. spiralis and L. vivipara exhibit a kind of parental care, which was quantified by the presence of connections between parental rosettes and ramets. The findings of the present study show that the life history traits of these species are linked to soil factors.
Using nonlinear methods to quantify changes in infant limb movements and vocalizations.
Abney, Drew H; Warlaumont, Anne S; Haussman, Anna; Ross, Jessica M; Wallot, Sebastian
2014-01-01
The pairing of dynamical systems theory and complexity science brings novel concepts and methods to the study of infant motor development. Accordingly, this longitudinal case study presents a new approach to characterizing the dynamics of infant limb and vocalization behaviors. A single infant's vocalizations and limb movements were recorded from 51-days to 305-days of age. On each recording day, accelerometers were placed on all four of the infant's limbs and an audio recorder was worn on the child's chest. Using nonlinear time series analysis methods, such as recurrence quantification analysis and Allan factor, we quantified changes in the stability and multiscale properties of the infant's behaviors across age as well as how these dynamics relate across modalities and effectors. We observed that particular changes in these dynamics preceded or coincided with the onset of various developmental milestones. For example, the largest changes in vocalization dynamics preceded the onset of canonical babbling. The results show that nonlinear analyses can help to understand the functional co-development of different aspects of infant behavior.
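Of the two nonlinear measures named above, the Allan factor has a compact standard definition: the variance of successive event counts in windows of length T divided by twice the mean count. A minimal Python sketch of that definition follows; the authors' full pipeline, including the recurrence quantification analysis, is not reproduced here, and the implementation details are assumptions.

```python
import numpy as np

def allan_factor(event_times, window):
    """Allan factor AF(T) of a point process, using non-overlapping windows of
    length `window`. AF(T) is ~1 for a Poisson process; growth with T indicates
    clustering of events across timescales."""
    event_times = np.asarray(event_times, dtype=float)
    edges = np.arange(event_times.min(), event_times.max(), window)
    counts, _ = np.histogram(event_times, bins=edges)
    diffs = np.diff(counts)
    return np.mean(diffs ** 2) / (2.0 * np.mean(counts))
```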
NASA Astrophysics Data System (ADS)
Moin, Paymann; Ma, Kevin; Amezcua, Lilyana; Gertych, Arkadiusz; Liu, Brent
2009-02-01
Multiple sclerosis (MS) is a demyelinating disease of the central nervous system that affects approximately 2.5 million people worldwide. Magnetic resonance imaging (MRI) is an established tool for the assessment of disease activity, progression and response to treatment. The progression of the disease is variable and requires routine follow-up imaging studies. Currently, MRI quantification of multiple sclerosis requires a manual approach to lesion measurement and yields an estimate of lesion volume and interval change. In the setting of several prior studies and a long treatment history, trends related to treatment change quickly become difficult to extrapolate. Our efforts seek to develop an imaging-informatics-based MS lesion computer-aided detection (CAD) package to quantify and track MS lesions, including lesion load, volume, and location. Together with select clinical parameters, these data will be incorporated into an MS-specific e-Folder to provide decision support to evaluate and assess treatment options for MS in a manner tailored specifically to an individual, based on trends in MS presentation and progression.
Experimental study on nonmonotonicity of capillary desaturation curves in a 2-D pore-network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodriquez de Castro, Antonio; Shokri, Nima; Karadimitriou, Nikolaos
2015-10-28
Immiscible displacement in a porous medium is important in many applications such as soil remediation and enhanced oil recovery. When gravitational forces are negligible, two-phase immiscible displacement at the pore level is controlled by capillary and viscous forces, whose relative importance is quantified through the dimensionless capillary number Ca and the viscosity ratio M between the liquid phases. Depending on the values of Ca and M, capillary fingering, viscous fingering, or stable displacement may be observed, resulting in a variety of patterns affecting the phase entrapment. The Capillary Desaturation Curve (CDC), which represents the relationship between the residual oil saturation and Ca, is an important relation to describe the phase entrapment at a given Ca. In the present study, we investigate the CDC as influenced by the viscosity ratio. A comprehensive series of experiments using a high-resolution microscope and state-of-the-art micromodels were conducted. The CDCs were calculated and the effects of Ca and M on phase entrapments were quantified. The results show that CDCs are not necessarily monotonic for all M.
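The two dimensionless groups named in the abstract can be computed directly once fluid properties and flow velocity are known; the sketch below uses invented values, and the convention for the viscosity ratio (invading over defending fluid) is an assumption, since conventions vary between studies.

```python
# Dimensionless groups controlling immiscible displacement (illustrative values only).
mu_invading = 1.0e-3      # Pa·s, viscosity of the invading (displacing) fluid
mu_defending = 50.0e-3    # Pa·s, viscosity of the defending fluid (e.g., oil)
velocity = 1.0e-5         # m/s, characteristic pore or Darcy velocity
sigma = 25.0e-3           # N/m, interfacial tension between the two fluids

Ca = mu_invading * velocity / sigma   # capillary number: viscous vs. capillary forces
M = mu_invading / mu_defending        # viscosity ratio between the phases (assumed convention)

print(f"Ca = {Ca:.2e}, M = {M:.2e}")
# Low Ca with low M tends toward capillary/viscous fingering, while higher Ca with
# M >= 1 tends toward stable displacement, which is how CDC experiments span regimes.
```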
Science Overview: The LTTG Technology Review Meeting March 2006 Summary Report
NASA Technical Reports Server (NTRS)
Bruning, Claus; Ko, Malcolm; Lee, David; Miake-Lye, Richard
2006-01-01
This report presents an overview of the latest scientific consensus understanding of the effect of aviation emissions on the atmosphere, for both local air quality and climate change, in order to provide a contextual framework for raising future questions to help assess the environmental benefits of technology goals. Although studies of the two issues share a common framework (of quantifying the emissions, the change in concentrations in the atmosphere, and the environmental impacts), the communities of practitioners are distinctly different. The scientific community will continue to provide guidelines on trade-offs among different contributors to a specific environmental impact, such as global climate or local air quality. Ultimately, monetization of the costs and benefits of mitigation actions is the proper tool for quantifying and analyzing trade-offs between the two issues. Scientific assessments of the impacts and their uncertainties are critical inputs to these analyses. Until environmental effects of aviation emerge as a policy-driven issue, there is little incentive within the scientific community to focus on research efforts specific to trade-off studies between local and global impacts.
A Value Measure for Public-Sector Enterprise Risk Management: A TSA Case Study.
Fletcher, Kenneth C; Abbas, Ali E
2018-05-01
This article presents a public value measure that can be used to aid executives in the public sector to better assess policy decisions and maximize value to the American people. Using Transportation Security Administration (TSA) programs as an example, we first identify the basic components of public value. We then propose a public value account to quantify the outcomes of various risk scenarios, and we determine the certain equivalent of several important TSA programs. We illustrate how this proposed measure can quantify the effects of two main challenges that government organizations face when conducting enterprise risk management: (1) short-term versus long-term incentives and (2) avoiding potential negative consequences even if they occur with low probability. Finally, we illustrate how this measure enables the use of various tools from decision analysis to be applied in government settings, such as stochastic dominance arguments and certain equivalent calculations. Regarding the TSA case study, our analysis demonstrates the value of continued expansion of the TSA trusted traveler initiative and increasing the background vetting for passengers who are afforded expedited security screening. © 2017 Society for Risk Analysis.
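Certain equivalent calculations of the kind mentioned above are standard in decision analysis; the following is a minimal sketch under an assumed exponential utility with risk tolerance rho, applied to a hypothetical two-outcome program. None of the numbers are TSA data.

```python
import numpy as np

def certain_equivalent(outcomes, probabilities, risk_tolerance):
    """Certain equivalent of a lottery under exponential utility
    u(x) = 1 - exp(-x / rho), i.e. CE = -rho * ln E[exp(-x / rho)]."""
    outcomes = np.asarray(outcomes, dtype=float)
    probabilities = np.asarray(probabilities, dtype=float)
    expected_exp = np.sum(probabilities * np.exp(-outcomes / risk_tolerance))
    return -risk_tolerance * np.log(expected_exp)

# Hypothetical program: large public value with high probability, and a costly
# low-probability adverse event (units arbitrary, figures invented).
values = [100.0, -400.0]
probs = [0.98, 0.02]
print(certain_equivalent(values, probs, risk_tolerance=200.0))
```

Comparing certain equivalents across programs is one way the stochastic dominance and risk-aversion arguments mentioned in the abstract are made concrete.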
2012-01-01
Phytophthora cinnamomi Rands. is an important root rot pathogen widely distributed in the northern hemisphere, with a large host range. Among other diseases, it is known to be a principal factor in the decline of holm oak and cork oak, the most important tree species in the “dehesa” ecosystem of south-western Spain. Previously, the focus of studies on P. cinnamomi and holm oak has been on molecular tools for identification and functional responses of the host, together with other physiological and morphological host variables. However, a microscopic index to describe the degree of infection and colonization in the plant tissues has not yet been developed. A colonization or infection index would be a useful tool for studies that examine differences between individuals subjected to different treatments or between individuals belonging to different breeding accessions, together with their specific responses to the pathogen. This work presents a methodology based on the capture and digital treatment of microscopic images, using simple and accessible software, together with a range of variables that quantify the infection and colonization process. PMID:22974221
Lorenzi, M; Ayache, N; Pennec, X
2015-07-15
In this study we introduce the regional flux analysis, a novel approach to deformation-based morphometry based on the Helmholtz decomposition of deformations parameterized by stationary velocity fields. We use the scalar pressure map associated with the irrotational component of the deformation to discover the critical regions of volume change. These regions are used to consistently quantify the associated measure of volume change by the probabilistic integration of the flux of the longitudinal deformations across the boundaries. The presented framework unifies voxel-based and regional approaches, and robustly describes the volume changes at both the group-wise and subject-specific levels as a spatial process governed by consistently defined regions. Our experiments on the large cohorts of the ADNI dataset show that the regional flux analysis is a powerful and flexible instrument for the study of Alzheimer's disease in a wide range of scenarios: cross-sectional deformation-based morphometry, longitudinal discovery and quantification of group-wise volume changes, and statistically powered and robust quantification of hippocampal and ventricular atrophy. Copyright © 2015 Elsevier Inc. All rights reserved.
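The regional flux analysis itself is tied to the paper's stationary-velocity-field framework, but the underlying Helmholtz split of a field into an irrotational (gradient-of-potential) part can be illustrated generically on a 2-D vector field with an FFT-based Poisson solve. This is a sketch under periodic-boundary assumptions, not the authors' implementation.

```python
import numpy as np

def helmholtz_irrotational_2d(vx, vy, dx=1.0):
    """Extract the irrotational component of a 2-D vector field by solving
    laplacian(phi) = div(v) for the scalar potential in Fourier space
    (periodic boundaries and uniform grid spacing assumed)."""
    ny, nx = vx.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    KX, KY = np.meshgrid(kx, ky)          # shapes (ny, nx)
    k2 = KX**2 + KY**2
    k2[0, 0] = 1.0                        # avoid division by zero for the mean mode
    div_hat = 1j * KX * np.fft.fft2(vx) + 1j * KY * np.fft.fft2(vy)
    phi_hat = div_hat / (-k2)             # Fourier-space Poisson solve
    phi_hat[0, 0] = 0.0
    irr_x = np.real(np.fft.ifft2(1j * KX * phi_hat))
    irr_y = np.real(np.fft.ifft2(1j * KY * phi_hat))
    return irr_x, irr_y                   # gradient of phi = irrotational part

# Tiny demo: potential part grad(sin x * sin y) plus a divergence-free part.
y, x = np.meshgrid(np.linspace(0, 2*np.pi, 64, endpoint=False),
                   np.linspace(0, 2*np.pi, 64, endpoint=False), indexing='ij')
vx = np.cos(x) * np.sin(y) + np.sin(y)
vy = np.sin(x) * np.cos(y) - np.sin(x)
irr_x, irr_y = helmholtz_irrotational_2d(vx, vy, dx=2*np.pi/64)
```

The scalar potential here plays the role of the pressure-like map whose critical regions the regional flux analysis integrates flux over.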
Kuss, S.; Tanner, E. E. L.; Ordovas-Montanes, M.
2017-01-01
The colorimetric identification of pathogenic and non-pathogenic bacteria in cell culture is commonly performed using the redox mediator N,N,N′,N′-tetramethyl-para-phenylene-diamine (TMPD) in the so-called oxidase test, which indicates the presence of bacterial cytochrome c oxidases. The presented study demonstrates the ability of electrochemistry to employ TMPD to detect bacteria and quantify the activity of bacterial cytochrome c oxidases. Cyclic voltammetry studies and chronoamperometry measurements performed on the model organism Bacillus subtilis yield a turnover number calculated for single bacteria. Furthermore, trace amounts of cytochrome c oxidases were revealed in aerobically cultured Escherichia coli, which, to our knowledge, no other technique in molecular biology is currently able to quantify. The reported technique could be applied to a variety of pathogenic bacteria and has the potential to be employed in future biosensing technology. PMID:29568431
An open-source computational and data resource to analyze digital maps of immunopeptidomes
Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J.; ...
2015-07-08
We present a novel mass spectrometry-based high-throughput workflow and an open-source computational and data resource to reproducibly identify and quantify HLA-associated peptides. Collectively, the resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra, and the analysis of quantitative digital maps of HLA peptidomes generated from a range of biological sources by SWATH mass spectrometry (MS). This study represents the first community-based effort to develop a robust platform for the reproducible and quantitative measurement of the entire repertoire of peptides presented by HLA molecules, an essential step towards the design of efficient immunotherapies.
X-ray imaging inspection of fiberglass reinforced by epoxy composite
NASA Astrophysics Data System (ADS)
Rique, A. M.; Machado, A. C.; Oliveira, D. F.; Lopes, R. T.; Lima, I.
2015-04-01
The goal of this work was to study the voids present in bonded joints in order to minimize failures due to low adhesion of the joints in the industrial field. One of the main parameters to be characterized is the porosity of the glue, since these pores form for several reasons at the moment of adhesion in joints made of a composite of epoxy resin reinforced by fiberglass. For this purpose, high-energy X-ray microtomography was used, and the results show its effectiveness in recognizing and quantifying, directly in 3D, all the occlusion regions present at glass fiber-epoxy adhesive joints.
Analyzing complex networks evolution through Information Theory quantifiers
NASA Astrophysics Data System (ADS)
Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martín Gómez
2011-01-01
A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region used to study the El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
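A minimal sketch of the first quantifier, the square root of the Jensen-Shannon divergence between two discrete distributions (for instance, degree distributions at two stages of a network's evolution). The MPR Statistical Complexity involves an additional entropy-disequilibrium construction not shown here, and the example distributions are invented.

```python
import numpy as np

def js_distance(p, q):
    """Square root of the Jensen-Shannon divergence between two discrete
    distributions (base-2 logs, so the divergence is bounded by 1)."""
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    m = 0.5 * (p + q)

    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))

    return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

# Toy example: a skewed distribution compared against a uniform reference.
print(js_distance([0.1, 0.4, 0.3, 0.2], [0.25, 0.25, 0.25, 0.25]))
```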
Orbiter ECLSS support of Shuttle payloads
NASA Technical Reports Server (NTRS)
Jaax, J. R.; Morris, D. W.; Prince, R. N.
1974-01-01
The orbiter ECLSS (Environmental Control and Life Support System) provides the functions of atmosphere revitalization, crew life support, and active thermal control. This paper describes these functions as they relate to the support of Shuttle payloads, including automated spacecraft, Spacelab and Department of Defense missions. Functional and performance requirements for the orbiter ECLSS which affect payload support are presented for the atmosphere revitalization subsystem, the food, water and waste subsystem, and the active thermal control subsystem. Schematics for these subsystems are also described. Finally, based on the selected orbiter configuration, preliminary design and off-design thermodynamic data are presented to quantify the baseline orbiter capability; to quantify the payload chargeable penalties for increasing this support; and to identify the significant limits of orbiter ECLSS support available to Shuttle payloads.
NASA Technical Reports Server (NTRS)
Goldman, A.; Murcray, F. J.; Rinsland, C. P.; Blatherwick, R. D.; Murcray, F. H.; Murcray, D. G.
1991-01-01
Results of ongoing studies of high-resolution solar absorption spectra aimed at the identification and quantification of trace constituents of importance in the chemistry of the stratosphere and upper troposphere are presented. An analysis of balloon-borne and ground-based spectra obtained at 0.0025/cm covering the 700-2200/cm interval is presented. The 0.0025/cm spectra, along with corresponding laboratory spectra, improve the spectral line parameters, and thus the accuracy of quantifying trace constituents. Results for COF2, F22, SF6, and other species are presented. The retrieval methods used for total column density and altitude distribution for both ground-based and balloon-borne spectra are also discussed.
Urban gully erosion and the SDGs: a case study from the Koboko rural town of Uganda
NASA Astrophysics Data System (ADS)
Zolezzi, Guido; Bezzi, Marco
2017-04-01
Urban gully erosion in developing regions has been addressed by the scientific community only recently, having received much less attention in past decades. Nonetheless, recent examples show how relevant urban gully erosion in African towns of different sizes can be in terms of several Sustainable Development Goals, like goals 3 (good health and well-being), 6 (clean water and sanitation) and 11 (sustainable cities and communities). The present work illustrates an example of gully erosion in the rapidly growing rural town of Koboko in NW Uganda, close to the borders with the Democratic Republic of the Congo and South Sudan. The research aims are (i) to develop a simple, low-cost methodology to quantify gully properties in data-scarce and resource-limited contexts, (ii) to quantify the main properties of and processes related to the urban gullies in the Koboko case study and (iii) to quantify the potential risk associated with urban gully erosion at the country scale in relation to the rapid growth of urban centers in a sub-Saharan African country. The methodology integrates collection of existing hydrological and land use data, rapid topographic surveys and related data processing, basic hydrological and hydro-morphological modeling, and interviews with local inhabitants and stakeholders. Results indicate that Koboko may not represent an isolated hotspot of extensive urban gully development among rapidly growing small towns in Uganda and, consequently, in countries with similar sustainable and human development challenges. Koboko, established two decades ago as a temporary war refugee camp, has progressively become a permanent urban settlement. The urban center is located on top of an elongated hill and many of its recent neighbourhoods are expanding along the hillsides, where the local slope may reach considerable values, up to 10%. In the last ten years several gully systems with local depths of up to 8 to 10 meters have been rapidly evolving, especially following the construction of new roads and in the absence of a structured urban drainage plan. The deeper gullies are presently located in densely populated areas and present a variety of risks for people's livelihoods, including personal safety, risk of accidents for small vehicles (especially during night time), and sanitation risks related to untreated domestic wastewater and uncontrolled garbage disposal into the deepest parts of the gullies. The methodology is easily repeatable and has the potential to quantify the fundamental properties of gully systems in contexts with scarce hydrological, soil and geomorphological data and where the agencies responsible for urban planning and environmental protection are constrained by severe limitations in financial and human resources. For each gully system it allows quantification of total eroded volumes, length of the unstable gully reaches, time scale of development, drainage area and peak formative streamflow, and it also provides process-based insight into the causes of gully development. The related knowledge base can be used to develop guidelines for urban growth aimed at minimizing the risk of gully erosion and related societal impacts.
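The abstract does not state which formulas underlie the "basic hydrological modeling"; one common low-data option for estimating a peak formative discharge is the rational method, sketched below with invented parameter values purely for illustration.

```python
# Rational-method estimate of a peak formative discharge (illustrative only).
C = 0.6               # runoff coefficient for a partly built-up hillslope (assumed)
i_mm_per_hr = 40.0    # design rainfall intensity (assumed)
A_km2 = 1.2           # drainage area contributing to one gully system (assumed)

i_m_per_s = i_mm_per_hr / 1000.0 / 3600.0
A_m2 = A_km2 * 1.0e6
Q_peak = C * i_m_per_s * A_m2   # m^3/s
print(f"Peak formative flow ~ {Q_peak:.1f} m^3/s")
```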
I will be giving two invited presentations during a 2-day meeting at Marquette University. The presentations will focus on our bioassay work on wastewater, surface water, and drinking water, and on our bioassay work on CAFOs.
HINTS 2013 Conference Summaries of Presentations and Posters
Summaries of presentations and poster abstracts for the HINTS 2013 Conference, titled "A Decade of HINTS: Quantifying the Health Information Revolution through Data Innovation and Collaboration," held on October 2-3, 2013 at the Natcher Building on the NIH campus in Bethesda, MD.
Cyanobacteria Assessment Network (CyAN) - 2017 NASA Water Resources PI Presentation
Presentation on the Cyanobacteria Assessment Network (CyAN) and how it supports the environmental management and public use of U.S. lakes and estuaries by providing a capability for detecting and quantifying algal blooms and related water quality using satellite data records.
Czolowski, Eliza D.; Santoro, Renee L.; Srebotnjak, Tanja
2017-01-01
Background: Higher risk of exposure to environmental health hazards near oil and gas wells has spurred interest in quantifying populations that live in proximity to oil and gas development. The available studies on this topic lack consistent methodology and ignore aspects of oil and gas development of value to public health–relevant assessment and decision-making. Objectives: We aim to present a methodological framework for oil and gas development proximity studies grounded in an understanding of hydrocarbon geology and development techniques. Methods: We geospatially overlay locations of active oil and gas wells in the conterminous United States and Census data to estimate the population living in proximity to hydrocarbon development at the national and state levels. We compare our methods and findings with existing proximity studies. Results: Nationally, we estimate that 17.6 million people live within 1,600m (∼1 mi) of at least one active oil and/or gas well. Three of the eight studies overestimate populations at risk from actively producing oil and gas wells by including wells without evidence of production or drilling completion and/or using inappropriate population allocation methods. The remaining five studies, by omitting conventional wells in regions dominated by historical conventional development, significantly underestimate populations at risk. Conclusions: The well inventory guidelines we present provide an improved methodology for hydrocarbon proximity studies by acknowledging the importance of both conventional and unconventional well counts as well as the relative exposure risks associated with different primary production categories (e.g., oil, wet gas, dry gas) and developmental stages of wells. https://doi.org/10.1289/EHP1535 PMID:28858829
Christin, Zachary; Bagstad, Kenneth J.; Verdone, Michael
2016-01-01
Restoring degraded forests and agricultural lands has become a global conservation priority. A growing number of tools can quantify ecosystem service tradeoffs associated with forest restoration. This evolving “tools landscape” presents a dilemma: more tools are available, but selecting appropriate tools has become more challenging. We present a Restoration Ecosystem Service Tool Selector (RESTS) framework that describes key characteristics of 13 ecosystem service assessment tools. Analysts enter information about their decision context, services to be analyzed, and desired outputs. Tools are filtered and presented based on five evaluative criteria: scalability, cost, time requirements, handling of uncertainty, and applicability to benefit-cost analysis. RESTS uses a spreadsheet interface but a web-based interface is planned. Given the rapid evolution of ecosystem services science, RESTS provides an adaptable framework to guide forest restoration decision makers toward tools that can help quantify ecosystem services in support of restoration.
Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects.
Bulczak, David; Lambers, Martin; Kolb, Andreas
2017-12-22
In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reference distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials and quantitatively compare the range sensor data.
Pendulum Underwater - An Approach for Quantifying Viscosity
NASA Astrophysics Data System (ADS)
Leme, José Costa; Oliveira, Agostinho
2017-12-01
The purpose of the experiment presented in this paper is to quantify the viscosity of a liquid. Viscous effects are important in the flow of fluids in pipes, in the bloodstream, in the lubrication of engine parts, and in many other situations. In the present paper, the authors explore the oscillations of a physical pendulum in the form of a long and lightweight wire that carries a ball at its lower end, which is totally immersed in water, so as to determine the water viscosity. The system represents a viscously damped pendulum, and we tried different theoretical models to describe it. The experimental part of the present paper is based on a very simple and low-cost image capturing apparatus that can easily be replicated in a physics classroom. Data on the pendulum's amplitude as a function of time were acquired using digital video analysis with the open source software Tracker.
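One of the simpler theoretical models one can try for such a system is linear (Stokes-type) drag on the immersed ball, which predicts an exponentially decaying amplitude envelope; the sketch below fits that envelope and backs out a viscosity under that assumption. All numerical values (mass, radius, amplitude data) are invented, and real damping generally includes added-mass and form-drag effects not captured here.

```python
import numpy as np
from scipy.optimize import curve_fit

def damped_amplitude(t, A0, gamma):
    """Amplitude envelope of a lightly damped pendulum: A(t) = A0 * exp(-gamma * t)."""
    return A0 * np.exp(-gamma * t)

# Hypothetical amplitude data extracted from video tracking (time in s, amplitude in rad).
t = np.array([0, 10, 20, 30, 40, 50, 60], dtype=float)
A = np.array([0.20, 0.155, 0.121, 0.094, 0.073, 0.057, 0.044])

(A0, gamma), _ = curve_fit(damped_amplitude, t, A, p0=[0.2, 0.02])

# Under pure Stokes drag on the ball (drag coefficient b = 6*pi*eta*r) and a light
# wire, the envelope decay rate is gamma = b / (2*m), so eta = 2*m*gamma / (6*pi*r).
m = 0.05   # kg, ball mass (assumed)
r = 0.01   # m, ball radius (assumed)
eta = 2 * m * gamma / (6 * np.pi * r)
print(f"gamma = {gamma:.4f} 1/s, eta ~ {eta:.4f} Pa·s (model-dependent estimate)")
```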
Tracking the Invasion of Small Numbers of Cells in Paper-Based Assays with Quantitative PCR.
Truong, Andrew S; Lochbaum, Christian A; Boyce, Matthew W; Lockett, Matthew R
2015-11-17
Paper-based scaffolds are an attractive material for culturing mammalian cells in a three-dimensional environment. There are a number of previously published studies which utilize these scaffolds to generate models of aortic valves, cardiac ischemia and reperfusion, and solid tumors. These models have largely relied on fluorescence imaging and microscopy to quantify cells in the scaffolds. We present here a polymerase chain reaction (PCR)-based method, capable of quantifying multiple cell types in a single culture with the aid of DNA barcodes: unique sequences of DNA introduced to the genome of individual cells or cell types through lentiviral transduction. PCR-based methods are highly specific and are amenable to high-throughput and multiplexed analyses. To validate this method, we engineered two different breast cancer lines to constitutively express either a green or red fluorescent protein. These cell lines allowed us to directly compare the ability of fluorescence imaging (of the fluorescent proteins) and qPCR (of the unique DNA sequences of the fluorescent proteins) to quantify known numbers of cells in the paper-based scaffolds. We also used both methods to quantify the distribution of these breast cell lines in homotypic and heterotypic invasion assays. In the paper-based invasion assays, a single sheet of paper containing cells suspended in a hydrogel was sandwiched between sheets of paper containing only hydrogel. The stack was incubated, and the cells invaded the adjacent layers. The individual sheets of the invasion assay were then destacked and the number of cells in each layer quantified. Our results show both methods can accurately detect cell populations of greater than 500 cells. The qPCR method can repeatedly and accurately detect as few as 50 cells, allowing small populations of highly invasive cells to be detected and differentiated from other cell types.
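Absolute quantification from barcode qPCR is typically done against a standard curve of threshold cycle (Ct) versus log cell number; the following is a generic sketch with made-up calibration points, not the paper's analysis pipeline.

```python
import numpy as np

# Hypothetical standard curve: Ct values measured for known numbers of
# barcode-bearing cells (serial dilution).
cells_known = np.array([50, 500, 5_000, 50_000])
ct_known = np.array([33.1, 29.8, 26.4, 23.0])

# Ct is approximately linear in log10(cell number): Ct = slope * log10(N) + intercept.
slope, intercept = np.polyfit(np.log10(cells_known), ct_known, deg=1)
efficiency = 10 ** (-1.0 / slope) - 1.0   # amplification efficiency sanity check

def cells_from_ct(ct):
    """Invert the standard curve to estimate cell number from a measured Ct."""
    return 10 ** ((ct - intercept) / slope)

print(f"amplification efficiency ~ {efficiency:.2f}")
print(f"Ct 27.5 -> ~{cells_from_ct(27.5):.0f} cells in that scaffold layer")
```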
Bringing Clouds into Our Lab! - The Influence of Turbulence on the Early Stage Rain Droplets
NASA Astrophysics Data System (ADS)
Yavuz, Mehmet Altug; Kunnen, Rudie; Heijst, Gertjan; Clercx, Herman
2015-11-01
We are investigating a droplet-laden flow in an air-filled turbulence chamber, forced by speaker-driven air jets. The speakers are driven randomly, yet they allow us to control and define the statistics of the turbulence. We study the motion of droplets with tunable size (Stokes numbers ~ 0.13 - 9) in a turbulent flow, mimicking the early stages of raindrop formation. 3D Particle Tracking Velocimetry (PTV) together with Laser Induced Fluorescence (LIF) is chosen as the experimental method to track the droplets and collect data for statistical analysis. This makes it possible to study the spatial distribution of the droplets in turbulence using the so-called Radial Distribution Function (RDF), a statistical measure to quantify the clustering of particles. Additionally, the 3D-PTV technique allows us to measure velocity statistics of the droplets and the influence of the turbulence on droplet trajectories, both individually and collectively. In this contribution, we will present the clustering probability quantified by the RDF for different Stokes numbers. We will explain the physics underlying the influence of turbulence on droplet cluster behavior. This study is supported by FOM/NWO Netherlands.
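The RDF mentioned above can be estimated from measured particle positions by binning pairwise separations and normalizing by the expectation for a uniform suspension; the minimal periodic-box sketch below is generic and not the experimental processing chain.

```python
import numpy as np

def radial_distribution(positions, box_size, nbins=50, r_max=None):
    """Estimate g(r) for points in a cubic periodic box of side box_size."""
    n = len(positions)
    if r_max is None:
        r_max = box_size / 2.0
    # Minimum-image pairwise separations.
    diffs = positions[:, None, :] - positions[None, :, :]
    diffs -= box_size * np.round(diffs / box_size)
    dists = np.sqrt((diffs ** 2).sum(axis=-1))[np.triu_indices(n, k=1)]

    edges = np.linspace(0.0, r_max, nbins + 1)
    counts, _ = np.histogram(dists, bins=edges)
    shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    density = n / box_size ** 3
    expected = 0.5 * n * density * shell_vol   # pairs expected for a uniform distribution
    centers = 0.5 * (edges[1:] + edges[:-1])
    return centers, counts / expected

# Uniformly random particles give g(r) ~ 1; preferential concentration shows up
# as g(r) > 1 at small separations, which is how clustering vs. Stokes number is read.
rng = np.random.default_rng(1)
r, g = radial_distribution(rng.uniform(0.0, 1.0, size=(1000, 3)), box_size=1.0)
```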
Cocaine and thrombosis: a narrative systematic review of clinical and in-vivo studies
Wright, Nat MJ; Martin, Matthew; Goff, Tom; Morgan, John; Elworthy, Rebecca; Ghoneim, Shariffe
2007-01-01
Purpose: To systematically review the literature pertaining to the link between cocaine and either arterial or venous thrombosis. Procedures: Narrative systematic review of Medline, CINAHL, Embase, Psycinfo and Cochrane databases, supplemented by hand trawling of relevant journals and reference lists up to April 2007. In-vivo studies and those with clinical endpoints were included in the review. Results: A total of 2458 abstracts led to 186 full-text papers being retrieved; 15 met the criteria for inclusion in the review. The weight of evidence would support cocaine as a pro-thrombotic agent. There is evidence of it activating thrombotic pathways. The effect of cocaine upon clinical endpoints has not been quantified, though there is evidence of an association between cocaine and myocardial infarction, particularly amongst young adults. Cocaine may also be a causal agent in cerebrovascular accident, though studies lacked sufficient power to determine a statistically significant effect. There is a gap in the evidence pertaining to the issue of cocaine and venous thrombosis. Conclusion: Clinicians should consider questioning for cocaine use, particularly amongst young adults who present with cardiac symptoms. More epidemiological work is required to quantify the effect of cocaine upon both arterial and venous clotting mechanisms. PMID:17880705
Toward quantifying the effectiveness of water trading under uncertainty.
Luo, B; Huang, G H; Zou, Y; Yin, Y Y
2007-04-01
This paper presents a methodology for quantifying the effectiveness of water-trading under uncertainty, by developing an optimization model based on the interval-parameter two-stage stochastic program (TSP) technique. In the study, the effectiveness of a water-trading program is measured by the water volume that can be released through trading from a statistical point of view. The methodology can also deal with recourse water allocation problems generated by randomness in water availability and, at the same time, tackle uncertainties expressed as intervals in the trading system. The developed methodology was tested with a hypothetical water-trading program in an agricultural system in the Swift Current Creek watershed, Canada. Study results indicate that the methodology can effectively measure the effectiveness of a trading program through estimating the water volume being released through trading in a long-term view. A sensitivity analysis was also conducted to analyze the effects of different trading costs on the trading program. It shows that the trading efforts would become ineffective when the trading costs are too high. The case study also demonstrates that the trading program is more effective in a dry season when total water availability is in shortage.
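The interval-parameter TSP model in the paper is considerably richer, but the core two-stage recourse idea can be sketched as a small deterministic-equivalent linear program: commit a trading volume now, then pay a scenario-dependent recourse cost if random water availability falls short. All scenario values, probabilities, benefits and penalties below are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Scenarios of water availability (volume units) and their probabilities.
avail = np.array([80.0, 50.0, 20.0])
prob = np.array([0.3, 0.5, 0.2])

benefit = 5.0    # net benefit per unit of water committed to trading (first stage)
penalty = 12.0   # recourse cost per unit of shortfall in a scenario (second stage)

# Decision vector: [x, y1, y2, y3] with x = committed trade volume and
# y_s = shortfall covered by recourse in scenario s.
# Maximize benefit*x - sum_s prob_s*penalty*y_s  ==  minimize its negative.
c = np.concatenate(([-benefit], prob * penalty))

# Per-scenario constraint: x - y_s <= avail_s (commitments beyond availability
# must be covered by the recourse variable).
n_s = len(avail)
A_ub = np.zeros((n_s, 1 + n_s))
A_ub[:, 0] = 1.0
A_ub[np.arange(n_s), 1 + np.arange(n_s)] = -1.0
b_ub = avail

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n_s))
print(f"optimal committed trade volume: {res.x[0]:.1f}")
```

With these numbers the expected shortfall penalty outweighs the trading benefit beyond the 50-unit scenario, so the optimal commitment sits at 50, mirroring the abstract's point that high trading costs eventually make trading ineffective.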
A global probabilistic tsunami hazard assessment from earthquake sources
Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana
2017-01-01
Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.
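Hazard curves of this kind are usually expressed as annual exceedance rates; converting a rate into the probability of at least one exceedance over a planning horizon assumes Poissonian occurrence, as in the short sketch below (the rate and horizon are illustrative).

```python
import numpy as np

def exceedance_probability(annual_rate, years):
    """P(at least one exceedance in `years`) for Poissonian event occurrence."""
    return 1.0 - np.exp(-annual_rate * years)

# e.g. a run-up level exceeded at a rate of 1/500 per year, over a 50-year horizon
print(f"{exceedance_probability(1/500, 50):.3f}")   # ~0.095
```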
Ju, Brian L; Miller, Christopher P; Whang, Peter G; Grauer, Jonathan N
2011-01-01
In recent years, greater attention has been directed toward determining how potential financial conflicts of interest may affect the integrity of biomedical research. To address this issue, various disclosure policies have been adopted in an attempt to increase the transparency of this process. However, the consistency of such reporting among spine surgeons remains unknown. This study quantifies the variability in the self-reported disclosures of individual authors presenting at multiple spine conferences during the same year. The author disclosure information published for the 2008 North American Spine Society (NASS), Cervical Spine Research Society (CSRS), and Scoliosis Research Society (SRS), conferences were compiled into a database. We evaluated the disclosure policy for each society and compared the disclosure listings of authors who presented at more than one of these meetings. Disclosure records were available for 1,231 authors at NASS, 550 at CSRS, and 642 at SRS. Of these individuals, 278 (NASS), 129 (CSRS), and 181 (SRS) presented at one of the other conferences and 40 presented at all three conferences. North American Spine Society and CSRS required disclosure of all financial relationships, whereas SRS only requested disclosures pertinent to authors' presentations. Of the 153 authors who presented at the NASS and CSRS meetings, 51% exhibited discrepancies in their disclosure information. In contrast, only 9% of the 205 individuals whose data was listed at both the NASS and SRS conferences demonstrated irregularities. Similarly, 18% of the 56 authors who had provided information to both CSRS and SRS were inconsistent in their reporting. These findings emphasize the significant variability that currently exists in the reporting of financial conflicts of interest by authors who presented at three major spine conferences within the past year. We believe these discrepancies are likely because of confusion regarding what relationships should be acknowledged in certain situations and the clear lack of uniformity among the disclosure policies established by these various associations. This study evaluates financial conflicts of interests in clinical research. Copyright © 2011 Elsevier Inc. All rights reserved.
NREL, Johns Hopkins SAIS Develop Method to Quantify Life Cycle Land Use of
Life Cycle Land Use of Electricity from Natural Gas News Release: NREL, Johns Hopkins SAIS Develop Method to Quantify Life Cycle Land Use of Electricity from Natural Gas October 2, 2017 A case study of time provides quantifiable information on the life cycle land use of generating electricity from
ERIC Educational Resources Information Center
Creech, Sandra K.; And Others
This study sought to quantify economic impacts associated with Texas state expenditures on higher education by (1) quantifying the reduction in Texas' economic activity associated with reduced spending by the private sector due to taxes levied for higher education; and (2) quantifying the increase in Texas' economic activity associated with the…
Harris, Theodore D.; Graham, Jennifer L.
2015-01-01
The bbe-Moldaenke BenthoTorch (BT) is an in vivo fluorometer designed to quantify algal biomass and community composition in benthic environments. The BT quantifies total algal biomass via chlorophyll a (Chl-a) concentration and may differentiate among cyanobacteria, green algae, and diatoms based on pigment fluorescence. To evaluate how BT measurements of periphytic algal biomass (as Chl-a) compared with an ethanol extraction laboratory analysis, we collected BT- and laboratory-measured Chl-a data from 6 stream sites in the Indian Creek basin, Johnson County, Kansas, during August and September 2012. BT-measured Chl-a concentrations were positively related to laboratory-measured concentrations (R2 = 0.47); sites with abundant filamentous algae had weaker relations (R2 = 0.27). Additionally, on a single sample date, we used the BT to determine periphyton biomass and community composition upstream and downstream from 2 wastewater treatment facilities (WWTF) that discharge into Indian Creek. We found that algal biomass increased immediately downstream from the WWTF discharge and then slowly decreased as distance from the WWTF increased. Changes in periphyton community structure also occurred; however, there were discrepancies between BT- and laboratory-measured community composition data. Most notably, cyanobacteria were present at all sites based on BT measurements but were present at only one site based on laboratory-analyzed samples. Overall, we found that the BT compared reasonably well with laboratory methods for relative patterns in Chl-a but not as well for absolute Chl-a concentrations. Future studies need to test the BT over a wider range of Chl-a concentrations, in colored waters, and across various periphyton assemblages.
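The R² values quoted above come from a simple linear fit of BT readings against extracted Chl-a; that comparison can be reproduced on any paired dataset in a few lines. The numbers below are placeholders, not the Indian Creek data.

```python
import numpy as np

# Paired measurements (e.g., mg Chl-a per m^2): laboratory extraction vs. BenthoTorch.
lab = np.array([12.0, 30.0, 45.0, 60.0, 85.0, 110.0])
bt = np.array([10.0, 22.0, 50.0, 48.0, 70.0, 95.0])

slope, intercept = np.polyfit(lab, bt, deg=1)
predicted = slope * lab + intercept
ss_res = np.sum((bt - predicted) ** 2)
ss_tot = np.sum((bt - bt.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(f"BT = {slope:.2f} * lab + {intercept:.1f}, R^2 = {r_squared:.2f}")
```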
Hansen, Scott K.; Berkowitz, Brian; Vesselinov, Velimir V.; ...
2016-12-01
Path reversibility and radial symmetry are often assumed in push-pull tracer test analysis. In reality, heterogeneous flow fields mean that both assumptions are idealizations. In this paper, to understand their impact, we perform a parametric study which quantifies the scattering effects of ambient flow, local-scale dispersion, and velocity field heterogeneity on push-pull breakthrough curves and compares them to the effects of mobile-immobile mass transfer (MIMT) processes including sorption and diffusion into secondary porosity. We identify specific circumstances in which MIMT overwhelmingly determines the breakthrough curve, which may then be considered uninformative about drift and local-scale dispersion. Assuming path reversibility, we develop a continuous-time-random-walk-based interpretation framework which is flow-field-agnostic and well suited to quantifying MIMT. Adopting this perspective, we show that the radial flow assumption is often harmless: to the extent that solute paths are reversible, the breakthrough curve is uninformative about velocity field heterogeneity. Our interpretation method determines a mapping function (i.e., subordinator) from travel time in the absence of MIMT to travel time in its presence. A mathematical theory allowing this function to be directly “plugged into” an existing Laplace-domain transport model to incorporate MIMT is presented and demonstrated. Algorithms implementing the calibration are presented and applied to interpretation of data from a push-pull test performed in a heterogeneous environment. A successful four-parameter fit is obtained, of comparable fidelity to one obtained using a million-node 3-D numerical model. In conclusion, we demonstrate analytically and numerically how push-pull tests quantifying MIMT are sensitive to remobilization, but not immobilization, kinetics.
Future of Alpine Water Resources : Uncertainty from Trees and Glaciers
NASA Astrophysics Data System (ADS)
Ceperley, N. C.; Beria, H.; Michelon, A.; Schaefli, B.
2016-12-01
Alpine water resources are particularly susceptible to climate change, which presents a high risk to many of the ecologic and economic roles played by mountain environments. In Switzerland, water from glacier-fed catchments provides a large portion of hydroelectric power and water supply as well as a multitude of services, including the creation and maintenance of biological communities and the physical landscape. Loss of glaciers will also pose indirect consequences by changing the hydrologic, biologic, and physical environment, for example by opening up new surfaces for vegetation growth and forestation. Hydrologic models are a primary tool to predict these consequences. Quantifying evaporation is an ongoing challenge for modeling, and changes in the partition between transpiration, evaporation from bare ground, and sublimation from glaciers are a large source of uncertainty in the alpine water balance. We have just begun an intensive monitoring program of hydrological processes in the Vallon de Nant, Switzerland (area of 14 km², altitude ranging from 1200 to 3051 m). This site is both a karst system and a protected area, making it a particularly interesting site to study eco-hydrologic processes. Monitoring of stable isotopes (δ18O and δD) in water is combined with measurements of climate and hydrologic parameters to quantify flows through the components of the water balance and assess their certainty. Additionally, we are observing water use by trees at the upper limit of their habitat range. Our presentation will highlight the importance of in situ measurements to quantify the spatial and temporal variations in the water balance. We will discuss the innovative measurement techniques that we are deploying, the uncertainty from each component, and show the first results of our work.
Uz, Zühre; van Gulik, Thomas M; Aydemirli, Mehtap Derya; Guerci, Philippe; Ince, Yasin; Cuppen, Diede V; Ergin, Bulent; Aksu, Ugur; de Mol, Bas A; Ince, Can
2018-03-08
Leukocyte recruitment and adhesion to the endothelium are hallmarks of systemic inflammation that manifest in a wide range of diseases. At present, no method is available to directly measure leukocyte kinetics at the bedside. In this study, we validate a new method to identify and quantify microcirculatory leukocytes observed by handheld vital microscopy (HVM) using space-time diagram (STD) analysis. Video clips (N=59) containing one capillary-post capillary venule (C-PCV) unit, where leukocytes could be observed emanating from a capillary into a venule, in cardiac surgery patients (N=20) were included. STD analysis and manual counting were used to quantify the number of leukocytes (total, rolling and non-rolling). Pearson's correlation and Bland-Altman analysis were used to determine agreement between the STDs and manual counting. For reproducibility, intra- and inter-observer coefficients of variation (CVs) were assessed. Leukocyte (rolling and non-rolling) and red blood cell velocities were assessed. The STD and manual counting procedures for the quantification of rolling leukocytes showed good agreement (r=0.8197, P<0.0001), with a Bland-Altman analysis mean difference of -0.0 (-6.56; 6.56). The overall intra-observer CV for the STD method was 1.5%. The overall inter-observer CVs for the STD and the manual method were 5.6% and 9.4%, respectively. The non-rolling velocity was significantly higher than the rolling velocity (812±519 µm/s vs 201±149 µm/s, P=0.001). The STD results agreed with the manual counting procedure results, showed better reproducibility, and could assess leukocyte velocity. STD analysis using bedside HVM imaging presents a new methodology for quantifying leukocyte kinetics and function in the microcirculation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A
2014-01-01
Despite the ubiquitous existence of dams within riverscapes, much of our knowledge about dams and their environmental effects remains context-specific. Hydrology, more than any other environmental variable, has been studied in great detail with regard to dam regulation. While much progress has been made in generalizing the hydrologic effects of regulation by large dams, many aspects of hydrology show site-specific fidelity to dam operations, small dams (including diversions), and regional hydrologic regimes. A statistical modeling framework is presented to quantify and generalize hydrologic responses to varying degrees of dam regulation. Specifically, the objectives were to 1) compare the effects of local versus cumulative dam regulation, 2) determine the importance of different regional hydrologic regimes in influencing hydrologic responses to dams, and 3) evaluate how different regulation contexts lead to error in predicting hydrologic responses to dams. Overall, model performance was poor in quantifying the magnitude of hydrologic responses, but performance was sufficient in classifying hydrologic responses as negative or positive. Responses of some hydrologic indices to dam regulation were highly dependent upon hydrologic class membership and the purpose of the dam. The opposing coefficients between local and cumulative-dam predictors suggested that hydrologic responses to cumulative dam regulation are complex, and predicting the hydrology downstream of individual dams, as opposed to multiple dams, may be more easily accomplished using statistical approaches. Results also suggested that particular contexts, including multipurpose dams, high cumulative regulation by multiple dams, diversions, close proximity to dams, and certain hydrologic classes, are all sources of increased error when predicting hydrologic responses to dams. Statistical models, such as the ones presented herein, show promise in their ability to model dam regulation effects at large spatial scales so as to generalize the directionality of hydrologic responses.
NASA Astrophysics Data System (ADS)
Dvory, N. Z.; Ronen, A.; Livshitz, Y.; Adar, E.; Kuznetsov, M.; Yakirevich, A.
2017-12-01
Sustainable groundwater production from karstic aquifers is primarily dictated by its recharge rate. Therefore, in order to limit over-exploitation, it is essential to accurately quantify groundwater recharge. Infiltration during erratic floods in karstic basins may contribute a substantial amount to aquifer recharge. However, the complicated nature of karst systems, which are characterized in part by multiple springs, sinkholes, and losing/gaining streams, presents a large obstacle to accurately assessing the actual contribution of flood water to groundwater recharge. In this study, we aim to quantify the proportion of groundwater recharge during flood events in relation to the annual recharge for karst aquifers. The role of karst conduits in flash flood infiltration was examined during four flood and artificial runoff events in the Sorek creek near Jerusalem, Israel. The events were monitored in short time steps (four minutes). This high-resolution analysis is essential to accurately estimating surface flow volumes, which are of particular importance in arid and semi-arid climates where ephemeral flows may provide a substantial contribution to the groundwater reservoirs. For the present investigation, we distinguished between direct infiltration, percolation through karst conduits, and diffuse infiltration, which is most affected by evapotranspiration. A water balance was then calculated for the 2014/15 hydrologic year using the Hydrologic Engineering Center - Hydrologic Modelling System (HEC-HMS). Simulations show that an additional 8% to 24% of the annual recharge volume is added from runoff losses along the creek that infiltrate through the karst system into the aquifer. The results improve the understanding of recharge processes and support the use of the proposed methodology for quantifying groundwater recharge.
Sze, N N; Wong, S C; Lee, C Y
2014-12-01
In the past several decades, many countries have set quantified road safety targets to motivate transport authorities to develop systematic road safety strategies and measures and to facilitate continuous road safety improvement. Studies have been conducted to evaluate the association between the setting of quantified road safety targets and road fatality reduction, in both the short and long run, by comparing road fatalities before and after the implementation of a quantified road safety target. However, not much work has been done to evaluate whether the quantified road safety targets are actually achieved. In this study, we used a binary logistic regression model to examine the factors - including vehicle ownership, fatality rate, and national income, in addition to level of ambition and duration of target - that contribute to a target's success. We analyzed 55 quantified road safety targets set by 29 countries from 1981 to 2009, and the results indicate that targets that were still in progress and had lower levels of ambition were more likely to eventually be achieved. Moreover, possible interaction effects on the association between level of ambition and the likelihood of success were also revealed. Copyright © 2014 Elsevier Ltd. All rights reserved.
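A binary logistic model of the kind described can be fit with standard tools; the sketch below uses invented rows with the same kinds of predictors (level of ambition, duration, baseline fatality rate) and is not the paper's dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: level of ambition (target % fatality reduction per year),
# target duration (years), baseline fatality rate (per 100,000) -- invented data.
X = np.array([
    [2.0, 10, 12.0],
    [6.0,  5, 15.0],
    [3.0,  8,  9.0],
    [8.0, 10, 20.0],
    [1.5, 12,  7.5],
    [5.0,  6, 11.0],
    [2.5,  9, 14.0],
    [7.0,  4, 18.0],
])
y = np.array([1, 0, 1, 0, 1, 1, 1, 0])   # 1 = target achieved

model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_)                       # sign pattern, not magnitudes
print("P(success):", model.predict_proba([[3.0, 8, 10.0]])[0, 1])
```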
Vocational Guidance and Psychology in Spain: A Scientometric Study
ERIC Educational Resources Information Center
Flores-Buils, Raquel; Gil-Beltran, Jose Manuel; Caballer-Miedes, Antonio; Martinez-Martinez, Miguel Angel
2013-01-01
Introduction: Research activity can be investigated by quantifying certain variables pertaining to articles published in specialized journals. Once quantified, numerical data are obtained that summarize characteristics of the research activity. These data are obtained through scientometric indicators. This is an objective and…
A cost benefit evaluation of the LANDSAT follow-on program
NASA Technical Reports Server (NTRS)
1976-01-01
Results are presented of a benefit and cost study for the LANDSAT Follow-on system with a thematic mapper. The analysis shows that the present worth of the benefits exceeds the present worth of the costs by a factor between 6.5 and 13 using a 10 percent discount rate and an infinite horizon for both. This study focuses only on major, demonstrated applications, conservatively evaluated. No benefits have been included except where a definite need for the information has been shown, a mechanism for disseminating the information has been defined, a technical capability has been demonstrated, and a defendable method of evaluating the economic worth has been developed. This approach has meant that certain applications with definite promise and substantial likely benefits could not be evaluated or assigned any benefits. Mention is made of these areas, however, either in the appropriate subject chapter or in the final chapter on non-quantified benefits.
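With a constant annual benefit stream, a 10 percent discount rate, and an infinite horizon, present worth reduces to a perpetuity; the sketch below shows the arithmetic behind a benefit-cost ratio of the kind reported. The dollar figures are invented and are not the study's estimates.

```python
def present_worth_perpetuity(annual_amount, discount_rate):
    """Present worth of a constant annual amount continuing indefinitely."""
    return annual_amount / discount_rate

# Illustrative numbers only.
annual_benefits = 260e6   # $/yr
annual_costs = 30e6       # $/yr
r = 0.10                  # 10 percent discount rate

pw_benefits = present_worth_perpetuity(annual_benefits, r)
pw_costs = present_worth_perpetuity(annual_costs, r)
print(f"benefit/cost ratio = {pw_benefits / pw_costs:.1f}")
```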