Science.gov

Sample records for quantified results show

  1. Quantifying causal emergence shows that macro can beat micro.

    PubMed

    Hoel, Erik P; Albantakis, Larissa; Tononi, Giulio

    2013-12-03

    Causal interactions within complex systems can be analyzed at multiple spatial and temporal scales. For example, the brain can be analyzed at the level of neurons, neuronal groups, and areas, over tens, hundreds, or thousands of milliseconds. It is widely assumed that, once a micro level is fixed, macro levels are fixed too, a relation called supervenience. It is also assumed that, although macro descriptions may be convenient, only the micro level is causally complete, because it includes every detail, thus leaving no room for causation at the macro level. However, this assumption can only be evaluated under a proper measure of causation. Here, we use a measure [effective information (EI)] that depends on both the effectiveness of a system's mechanisms and the size of its state space: EI is higher the more the mechanisms constrain the system's possible past and future states. By measuring EI at micro and macro levels in simple systems whose micro mechanisms are fixed, we show that for certain causal architectures EI can peak at a macro level in space and/or time. This happens when coarse-grained macro mechanisms are more effective (more deterministic and/or less degenerate) than the underlying micro mechanisms, to an extent that overcomes the smaller state space. Thus, although the macro level supervenes upon the micro, it can supersede it causally, leading to genuine causal emergence--the gain in EI when moving from a micro to a macro level of analysis.
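
    To make the EI measure concrete, here is a minimal sketch, not the authors' code: it computes EI in bits as the mutual information between a uniform (maximum-entropy) intervention distribution over states and the resulting next-state distribution, applied to a toy transition matrix of our own construction in which a noisy, degenerate micro mechanism coarse-grains into a deterministic macro mechanism.

    ```python
    import numpy as np

    def effective_information(tpm):
        """EI in bits: the average KL divergence between each state's
        effect distribution (its row) and the mean effect distribution,
        which equals mutual information under a uniform intervention."""
        p_effect = tpm.mean(axis=0)
        kl_rows = [
            sum(p * np.log2(p / q) for p, q in zip(row, p_effect) if p > 0)
            for row in tpm
        ]
        return float(np.mean(kl_rows))

    # Toy micro mechanism: states 0-2 transition uniformly among
    # themselves (noisy and degenerate); state 3 maps to itself.
    micro = np.array([
        [1/3, 1/3, 1/3, 0.0],
        [1/3, 1/3, 1/3, 0.0],
        [1/3, 1/3, 1/3, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ])

    # Coarse-graining {0,1,2} -> A and {3} -> B yields a fully
    # deterministic macro mechanism over two states.
    macro = np.array([
        [1.0, 0.0],
        [0.0, 1.0],
    ])

    print(effective_information(micro))  # ~0.81 bits
    print(effective_information(macro))  # 1.00 bit: macro beats micro
    ```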

  2. Different methods to quantify Listeria monocytogenes biofilm cells showed different profiles in their viability.

    PubMed

    Winkelströter, Lizziane Kretli; De Martinis, Elaine C P

    2015-03-01

    Listeria monocytogenes is a foodborne pathogen able to adhere and to form biofilms on several materials commonly present in food processing plants. The aim of this study was to evaluate the resistance of Listeria monocytogenes attached to abiotic surfaces, after treatment with sanitizers, by culture method, microscopy and Quantitative Real Time Polymerase Chain Reaction (qPCR). Biofilms of L. monocytogenes were obtained on stainless steel coupons immersed in Brain Heart Infusion Broth, under agitation at 37 °C for 24 h. The methods selected for this study were based on plate count, microscopic count with the aid of viability dyes (CTC-DAPI), and qPCR. Results of the culture method showed that peroxyacetic acid was efficient in killing sessile L. monocytogenes populations, while sodium hypochlorite was only partially effective against attached L. monocytogenes (p < 0.05). When viability dyes (CTC/DAPI) combined with fluorescence microscopy and qPCR were used, lower counts were found after treatments (p < 0.05). Selective quantification of viable cells of L. monocytogenes by qPCR using EMA revealed that the pre-treatment with EMA was not appropriate, since it also inhibited amplification of DNA from live cells by ca. 2 log. Thus, the use of CTC counts was the best method to count viable cells in biofilms.

  3. 13. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF POOR CONSTRUCTION WORK. THOUGH NOT A SERIOUS STRUCTURAL DEFICIENCY, THE 'HONEYCOMB' TEXTURE OF THE CONCRETE SURFACE WAS THE RESULT OF INADEQUATE TAMPING AT THE TIME OF THE INITIAL 'POUR'. - Hume Lake Dam, Sequoia National Forest, Hume, Fresno County, CA

  4. Quantifying IOHDR brachytherapy underdosage resulting from an incomplete scatter environment

    SciTech Connect

    Raina, Sanjay; Avadhani, Jaiteerth S.; Oh, Moonseong; Malhotra, Harish K.; Jaggernauth, Wainwright; Kuettel, Michael R.; Podgorsak, Matthew B. . E-mail: matthew.podgorsak@roswellpark.org

    2005-04-01

    Purpose: Most brachytherapy planning systems are based on a dose calculation algorithm that assumes an infinite scatter environment surrounding the target volume and applicator. Dosimetric errors from this assumption are negligible. However, in intraoperative high-dose-rate brachytherapy (IOHDR) where treatment catheters are typically laid either directly on a tumor bed or within applicators that may have little or no scatter material above them, the lack of scatter from one side of the applicator can result in underdosage during treatment. This study was carried out to investigate the magnitude of this underdosage. Methods: IOHDR treatment geometries were simulated using a solid water phantom beneath an applicator with varying amounts of bolus material on the top and sides of the applicator to account for missing tissue. Treatment plans were developed for 3 different treatment surface areas (4 × 4, 7 × 7, 12 × 12 cm²), each with prescription points located at 3 distances (0.5 cm, 1.0 cm, and 1.5 cm) from the source dwell positions. Ionization measurements were made with a liquid-filled ionization chamber linear array with a dedicated electrometer and data acquisition system. Results: Measurements showed that the magnitude of the underdosage varies from about 8% to 13% of the prescription dose as the prescription depth is increased from 0.5 cm to 1.5 cm. This treatment error was found to be independent of the irradiated area and strongly dependent on the prescription distance. Furthermore, for a given prescription depth, measurements in planes parallel to an applicator at distances up to 4.0 cm from the applicator plane showed that the dose delivery error is equal in magnitude throughout the target volume. Conclusion: This study demonstrates the magnitude of underdosage in IOHDR treatments delivered in a geometry that may not result in a full scatter environment around the applicator. This implies that the target volume and, specifically, the prescription

  5. Quantifying Uncertainty in Model Predictions for the Pliocene (Plio-QUMP): Initial results

    USGS Publications Warehouse

    Pope, J.O.; Collins, M.; Haywood, A.M.; Dowsett, H.J.; Hunter, S.J.; Lunt, D.J.; Pickering, S.J.; Pound, M.J.

    2011-01-01

    Examination of the mid-Pliocene Warm Period (mPWP; ~3.3 to 3.0 Ma BP) provides an excellent opportunity to test the ability of climate models to reproduce warm climate states, thereby assessing our confidence in model predictions. To do this it is necessary to relate the uncertainty in model simulations of mPWP climate to uncertainties in projections of future climate change. The uncertainties introduced by the model can be estimated through the use of a Perturbed Physics Ensemble (PPE). Building on the UK Met Office Quantifying Uncertainty in Model Predictions (QUMP) Project, this paper presents the results from an initial investigation using the end members of a PPE in a fully coupled atmosphere-ocean model (HadCM3) running with appropriate mPWP boundary conditions. Prior work has shown that the unperturbed version of HadCM3 may underestimate mPWP sea surface temperatures at higher latitudes. Initial results indicate that neither the low sensitivity nor the high sensitivity simulations produce unequivocally improved mPWP climatology relative to the standard. Whilst the high sensitivity simulation was able to reconcile up to 6 °C of the data/model mismatch in sea surface temperatures in the high latitudes of the Northern Hemisphere (relative to the standard simulation), it did not produce a better prediction of global vegetation than the standard simulation. Overall the low sensitivity simulation was degraded compared to the standard and high sensitivity simulations in all aspects of the data/model comparison. The results have shown that a PPE has the potential to explore weaknesses in mPWP modelling simulations which have been identified by geological proxies, but that a 'best fit' simulation will more likely come from a full ensemble in which simulations that contain the strengths of the two end member simulations shown here are combined. © 2011 Elsevier B.V.

  6. Quantifying viruses and bacteria in wastewater—Results, interpretation methods, and quality control

    USGS Publications Warehouse

    Francy, Donna S.; Stelzer, Erin A.; Bushon, Rebecca N.; Brady, Amie M.G.; Mailot, Brian E.; Spencer, Susan K.; Borchardt, Mark A.; Elber, Ashley G.; Riddell, Kimberly R.; Gellner, Terry M.

    2011-01-01

    Membrane bioreactors (MBR), used for wastewater treatment in Ohio and elsewhere in the United States, have pore sizes small enough to theoretically reduce concentrations of protozoa and bacteria, but not viruses. Sampling for viruses in wastewater is seldom done and not required. Instead, the bacterial indicators Escherichia coli (E. coli) and fecal coliforms are the required microbial measures of effluents for wastewater-discharge permits. Information is needed on the effectiveness of MBRs in removing human enteric viruses from wastewaters, particularly as compared to conventional wastewater treatment before and after disinfection. A total of 73 regular and 28 quality-control (QC) samples were collected at three MBR and two conventional wastewater plants in Ohio during 23 regular and 3 QC sampling trips in 2008-10. Samples were collected at various stages in the treatment processes and analyzed for bacterial indicators E. coli, fecal coliforms, and enterococci by membrane filtration; somatic and F-specific coliphage by the single agar layer (SAL) method; adenovirus, enterovirus, norovirus GI and GII, rotavirus, and hepatitis A virus by molecular methods; and viruses by cell culture. While addressing the main objective of the study - comparing removal of viruses and bacterial indicators in MBR and conventional plants - it was realized that work was needed to identify data analysis and quantification methods for interpreting enteric virus and QC data. Therefore, methods for quantifying viruses, qualifying results, and applying QC data to interpretations are described in this report. During each regular sampling trip, samples were collected (1) before conventional or MBR treatment (post-preliminary), (2) after secondary or MBR treatment (post-secondary or post-MBR), (3) after tertiary treatment (one conventional plant only), and (4) after disinfection (post-disinfection). Glass-wool fiber filtration was used to concentrate enteric viruses from large volumes, and small

  7. Results From Mars Show Electrostatic Charging of the Mars Pathfinder Sojourner Rover

    NASA Technical Reports Server (NTRS)

    Kolecki, Joseph C.; Siebert, Mark W.

    1998-01-01

    Electrical charging of vehicles and, one day, astronauts moving across the Martian surface may have moderate to severe consequences if large potential differences develop. The observations from Sojourner point to just such a possibility. It is desirable to quantify these results. The various lander/rover missions being planned for the upcoming decade provide the means for doing so. They should, therefore, carry instruments that will not only measure vehicle charging but characterize all the natural and induced electrical phenomena occurring in the environment and assess their impact on future missions.

  8. Gun shows and gun violence: fatally flawed study yields misleading results.

    PubMed

    Wintemute, Garen J; Hemenway, David; Webster, Daniel; Pierce, Glenn; Braga, Anthony A

    2010-10-01

    A widely publicized but unpublished study of the relationship between gun shows and gun violence is being cited in debates about the regulation of gun shows and gun commerce. We believe the study is fatally flawed. A working paper entitled "The Effect of Gun Shows on Gun-Related Deaths: Evidence from California and Texas" outlined this study, which found no association between gun shows and gun-related deaths. We believe the study reflects a limited understanding of gun shows and gun markets and is not statistically powered to detect even an implausibly large effect of gun shows on gun violence. In addition, the research contains serious ascertainment and classification errors, produces results that are sensitive to minor specification changes in key variables and in some cases have no face validity, and is contradicted by 1 of its own authors' prior research. The study should not be used as evidence in formulating gun policy.

  9. Preliminary Results In Quantifying The Climatic Impact Forcing Factors Around 3 Ma Ago

    NASA Astrophysics Data System (ADS)

    Fluteau, F.; Ramstein, G.; Duringer, P.; Schuster, M.; Tiercelin, J. J.

    What exactly is the control of climate changes on the development of the Hominids? Is it possible to quantify such changes? And which are the forcing factors that create these changes? We use here a General Circulation Model to investigate the climate sensitivity to 3 different forcing factors: the uplift of the East African Rift, the extent (more than twenty times the present-day surface) of the Chad Lake and, ultimately, with a coupled ocean-atmosphere GCM, the effect of Indonesian throughflow changes. To achieve these goals, we need a multidisciplinary group to assess the evolution of the Rift and the extent of the Lake. We prescribe these different boundary conditions to the GCM and use a biome model to assess the vegetation changes. In this presentation we will only focus on the impacts of the Rift uplift and the Chad Lake on atmospheric circulation, the monsoon, and their environmental consequences in terms of vegetation changes.

  10. Astronomy Diagnostic Test Results Reflect Course Goals and Show Room for Improvement

    NASA Astrophysics Data System (ADS)

    Lopresto, Michael C.

    The results of administering the Astronomy Diagnostic Test (ADT) to introductory astronomy students at Henry Ford Community College over three years have shown gains comparable with national averages. Results have also accurately corresponded to course goals, showing greater gains in topics covered in more detail, and lower gains in topics covered in less detail. Also evident in the results were topics for which improvement of instruction is needed. These factors and the ease with which the ADT can be administered constitute evidence of the usefulness of the ADT as an assessment instrument for introductory astronomy.

  11. Quantifying the effects of root reinforcing on slope stability: results of the first tests with a new shearing device

    NASA Astrophysics Data System (ADS)

    Rickli, Christian; Graf, Frank

    2013-04-01

    The role of vegetation in preventing shallow soil mass movements such as shallow landslides and soil erosion is generally well recognized and, correspondingly, soil bioengineering on steep slopes has been widely used in practice. However, the precise effectiveness of vegetation regarding slope stability is still difficult to determine. A recently designed inclinable shearing device for large scale vegetated soil samples allows quantitative evaluation of the additional shear strength provided by roots of specific plant species. In the following we describe the results of a first series of shear strength experiments with this apparatus focusing on root reinforcement of White Alder (Alnus incana) and Silver Birch (Betula pendula) in large soil block samples (500 × 500 × 400 mm). The specimens, with partly saturated soil of a maximum grain size of 10 mm, were slowly sheared at an inclination of 35° with low normal stresses of 3.2 kPa, accounting for natural conditions on a typical slope prone to mass movements. Measurements during the experiments involved shear stress, shear displacement and normal displacement, all recorded with high accuracy. In addition, dry weights of sprout and roots were measured to quantify plant growth of the planted specimens. The results with the new apparatus indicate a considerable reinforcement of the soil due to plant roots, i.e. maximum shear stresses of the vegetated specimens were substantially higher than those of non-vegetated soil, and the additional strength was a function of species and growth. Soil samples with seedlings planted five months prior to the test yielded an important increase in maximum shear stress of 250% for White Alder and 240% for Silver Birch compared to non-vegetated soil. The results of a second test series with 12-month-old plants showed even clearer enhancements in maximum shear stress (390% for Alder and 230% for Birch). Overall the results of this first series of shear strength experiments with the new apparatus

  12. Showing Value in Newborn Screening: Challenges in Quantifying the Effectiveness and Cost-Effectiveness of Early Detection of Phenylketonuria and Cystic Fibrosis

    PubMed Central

    Grosse, Scott D.

    2015-01-01

    Decision makers sometimes request information on the cost savings, cost-effectiveness, or cost-benefit of public health programs. In practice, quantifying the health and economic benefits of population-level screening programs such as newborn screening (NBS) is challenging. It requires that one specify the frequencies of health outcomes and events, such as hospitalizations, for a cohort of children with a given condition under two different scenarios—with or without NBS. Such analyses also assume that everything else, including treatments, is the same between groups. Lack of comparable data for representative screened and unscreened cohorts that are exposed to the same treatments following diagnosis can result in either under- or over-statement of differences. Accordingly, the benefits of early detection may be understated or overstated. This paper illustrates these common problems through a review of past economic evaluations of screening for two historically significant conditions, phenylketonuria and cystic fibrosis. In both examples qualitative judgments about the value of prompt identification and early treatment to an affected child were more influential than specific numerical estimates of lives or costs saved. PMID:26702401

  13. Image analysis techniques: Used to quantify and improve the precision of coatings testing results

    SciTech Connect

    Duncan, D.J.; Whetten, A.R.

    1993-12-31

    Coating evaluations often specify tests to measure performance characteristics rather than coating physical properties. These evaluation results are often very subjective. A new tool, Digital Video Image Analysis (DVIA), is successfully being used for two automotive evaluations: cyclic (scab) corrosion and the gravelometer (chip) test. An experimental design was done to evaluate variability and interactions among the instrumental factors. This analysis method has proved to be an order of magnitude more sensitive and reproducible than the current evaluations. Coating characteristics that previously could not be expressed can now be described and measured. For example, DVIA chip evaluations can differentiate how much damage was done to the topcoat, the primer, and even the metal. DVIA, with or without magnification, has the capability to become the quantitative measuring tool for several other coating evaluations, such as T-bends, wedge bends, acid etch analysis, coating defects, observing cure, defect formation or elimination over time, etc.

  14. Quantifying entanglement

    NASA Astrophysics Data System (ADS)

    Thapliyal, Ashish Vachaspati

    Entanglement is an essential element of quantum mechanics. The aim of this work is to explore various properties of entanglement from the viewpoints of both physics and information science, thus providing a unique picture of entanglement from an interdisciplinary point of view. The focus of this work is on quantifying entanglement as a resource. We start with bipartite states, proposing a new measure of bipartite entanglement called entanglement of assistance, showing that bound entangled states of rank two cannot exist, exploring the number of members required in the ensemble achieving the entanglement of formation and the possibility of bound entangled states that are negative under partial transposition (NPT bound entangled states). For multipartite states we introduce the notions of reducibilities and equivalences under entanglement non-increasing operations and we study the relations between various reducibilities and equivalences such as exact and asymptotic LOCC, asymptotic LOCCq, cLOCC, LOc, etc. We use this new language to attempt to quantify entanglement for multiple parties. We introduce the idea of entanglement span and minimal entanglement generating set and entanglement coefficients associated with it which are the entanglement measures, thus proposing a multicomponent measure of entanglement for three or more parties. We show that the class of Schmidt decomposable states have only GHZM or Cat-like entanglement. Further we introduce the class of multiseparable states for quantification of their entanglement and prove that they are equivalent to the Schmidt decomposable states, and thus have only Cat-like entanglement. We further explore the conditions under which LOCO equivalences are possible for multipartite isentropic states. We define Cat-distillability, EPRB-distillability and distillability for multipartite mixed states and show that distillability implies EPRB-distillability. Further we show that all non-factorizable pure states are Cat
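
    For reference, the entanglement of assistance introduced above has a standard definition: it is the dual of the entanglement of formation, maximizing rather than minimizing the average pure-state entanglement over ensemble decompositions of the shared state. In the usual notation:

    ```latex
    E_A(\rho_{AB}) \;=\; \max_{\{p_i,\,|\psi_i\rangle\}} \;\sum_i p_i\, E(|\psi_i\rangle)
    \quad\text{with}\quad
    \rho_{AB} \;=\; \sum_i p_i\,|\psi_i\rangle\langle\psi_i| ,
    ```

    where E(|ψ⟩) is the entropy of entanglement, i.e. the von Neumann entropy of either reduced state of |ψ⟩.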

  15. Long-Term Trial Results Show No Mortality Benefit from Annual Prostate Cancer Screening

    Cancer.gov

    Thirteen-year follow-up data from the Prostate, Lung, Colorectal and Ovarian (PLCO) cancer screening trial show higher incidence but similar mortality among men screened annually with the prostate-specific antigen (PSA) test and digital rectal examination.

  16. Stem cells show promising results for lymphoedema treatment--a literature review.

    PubMed

    Toyserkani, Navid Mohamadpour; Christensen, Marlene Louise; Sheikh, Søren Paludan; Sørensen, Jens Ahm

    2015-04-01

    Lymphoedema is a debilitating condition, manifesting in excess lymphatic fluid and swelling of subcutaneous tissues. Lymphoedema is as yet an incurable condition and current treatment modalities are not satisfactory. The capacity of mesenchymal stem cells to promote angiogenesis, secrete growth factors, regulate the inflammatory process, and differentiate into multiple cell types makes them a potentially ideal therapy for lymphoedema. Adipose tissue is the richest and most accessible source of mesenchymal stem cells and they can be harvested, isolated, and used for therapy in a single stage procedure as an autologous treatment. The aim of this paper was to review all studies using mesenchymal stem cells for lymphoedema treatment with a special focus on the potential use of adipose-derived stem cells. A systematic search was performed and five preclinical and two clinical studies were found. Different stem cell sources and lymphoedema models were used in the described studies. Most studies showed a decrease in lymphoedema and increased lymphangiogenesis when treated with stem cells, and this treatment modality has so far shown great potential. The present studies are, however, subject to bias, and more preclinical studies and large-scale high quality clinical trials are needed to show if this emerging therapy can satisfy expectations.

  17. AMS 14C analysis of teeth from archaeological sites showing anomalous ESR dating results

    NASA Astrophysics Data System (ADS)

    Grün, Rainer; Abeyratne, Mohan; Head, John; Tuniz, Claudio; Hedges, Robert E. M.

    We have carried out AMS radiocarbon analysis on two groups of samples: the first one gave reasonable ESR age estimates and the second one yielded serious age underestimations. All samples were supposedly older than 35 ka, the oldest being around 160 ka. Two pretreatment techniques were used for radiocarbon dating: acid evolution and thermal release. Heating to 600, 750 and 900°C combined with total de-gassing at these temperatures was chosen to obtain age estimates on the organic fraction, secondary carbonates and original carbonate present in the hydroxyapatite mineral phase, respectively. All radiocarbon results present serious age underestimations. The secondary carbonate fraction gives almost modern results indicating an extremely rapid exchange of this component. Owing to this very rapid carbonate exchange it is not likely that the ESR signals used for dating are associated with the secondary carbonates. One tooth from Tabun with independent age estimates of >150 ka was further investigated by the Oxford AMS laboratory, yielding an age estimate of 1930±100 BP on the residual collagen from dentine and 18,000±160 BP on the carbonate component of the enamel bioapatite. We did not, however, find an explanation of why some samples give serious ESR underestimations whilst many others provide reasonable results.

  18. Animation shows promise in initiating timely cardiopulmonary resuscitation: results of a pilot study.

    PubMed

    Attin, Mina; Winslow, Katheryn; Smith, Tyler

    2014-04-01

    Delayed responses during cardiac arrest are common. Timely interventions during cardiac arrest have a direct impact on patient survival. Integration of technology in nursing education is crucial to enhance teaching effectiveness. The goal of this study was to investigate the effect of animation on nursing students' response time to cardiac arrest, including initiation of timely chest compression. Nursing students were randomized into experimental and control groups prior to practicing in a high-fidelity simulation laboratory. The experimental group was educated, by discussion and animation, about the importance of starting cardiopulmonary resuscitation upon recognizing an unresponsive patient. Afterward, a discussion session allowed students in the experimental group to gain more in-depth knowledge about the most recent changes in the cardiac resuscitation guidelines from the American Heart Association. A linear mixed model was run to investigate differences in time of response between the experimental and control groups while controlling for differences in those with additional degrees, prior code experience, and basic life support certification. The experimental group had a faster response time compared with the control group and initiated timely cardiopulmonary resuscitation upon recognition of deteriorating conditions (P < .0001). The results demonstrated the efficacy of combined teaching modalities for timely cardiopulmonary resuscitation. Providing opportunities for repetitious practice when a patient's condition is deteriorating is crucial for teaching safe practice.

  19. Aortic emboli show surprising size dependent predilection for cerebral arteries: Results from computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Carr, Ian; Schwartz, Robert; Shadden, Shawn

    2012-11-01

    Cardiac emboli can have devastating consequences if they enter the cerebral circulation, and are the most common cause of embolic stroke. Little is known about the relationships of embolic origin/density/size to cerebral events, as these relationships are difficult to observe. To better understand stroke risk from cardiac and aortic emboli, we developed a computational model to track emboli from the heart to the brain. Patient-specific models of the human aorta and arteries to the brain were derived from CT angiography from 10 MHIF patients. Blood flow was modeled by the Navier-Stokes equations using pulsatile inflow at the aortic valve, and physiologic Windkessel models at the outlets. Particulate was injected at the aortic valve and tracked using modified Maxey-Riley equations with a wall collision model. Results demonstrate aortic emboli that entered the cerebral circulation through the carotid or vertebral arteries were localized to specific locations of the proximal aorta. The percentage of released particles embolic to the brain markedly increased with particle size from 0 to ~1-1.5 mm in all patients. Larger particulate became less likely to traverse the cerebral vessels. These findings are consistent with sparse literature based on transesophageal echo measurements. This work was supported in part by the National Science Foundation, award number 1157041.
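
    For orientation, a common simplified form of the Maxey-Riley equation for a small rigid sphere, with the Basset history term and Faxén corrections omitted, is shown below. The abstract states only that a modified form with a wall collision model was used, so take this as a generic reference, not the authors' exact formulation:

    ```latex
    m_p \frac{d\mathbf{v}}{dt}
    \;=\; (m_p - m_f)\,\mathbf{g}
    \;+\; m_f \frac{D\mathbf{u}}{Dt}
    \;-\; \frac{1}{2}\, m_f \left( \frac{d\mathbf{v}}{dt} - \frac{D\mathbf{u}}{Dt} \right)
    \;-\; 6\pi \mu a \,(\mathbf{v} - \mathbf{u})
    ```

    Here v is the particle velocity, u the fluid velocity at the particle position, m_p the particle mass, m_f the mass of displaced fluid, a the particle radius, and μ the dynamic viscosity; the right-hand terms are gravity/buoyancy, the undisturbed-flow pressure gradient, added mass, and Stokes drag.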

  20. Initiative To Reduce Avoidable Hospitalizations Among Nursing Facility Residents Shows Promising Results.

    PubMed

    Ingber, Melvin J; Feng, Zhanlian; Khatutsky, Galina; Wang, Joyce M; Bercaw, Lawren E; Zheng, Nan Tracy; Vadnais, Alison; Coomer, Nicole M; Segelman, Micah

    2017-03-01

    Nursing facility residents are frequently admitted to the hospital, and these hospital stays are often potentially avoidable. Such hospitalizations are detrimental to patients and costly to Medicare and Medicaid. In 2012 the Centers for Medicare and Medicaid Services launched the Initiative to Reduce Avoidable Hospitalizations among Nursing Facility Residents, using evidence-based clinical and educational interventions among long-stay residents in 143 facilities in seven states. In state-specific analyses, we estimated net reductions in 2015 of 2.2-9.3 percentage points in the probability of an all-cause hospitalization and 1.4-7.2 percentage points in the probability of a potentially avoidable hospitalization for participating facility residents, relative to comparison-group members. In that year, average per resident Medicare expenditures were reduced by $60-$2,248 for all-cause hospitalizations and by $98-$577 for potentially avoidable hospitalizations. The effects for over half of the outcomes in these analyses were significant. Variability in implementation and engagement across the nursing facilities and organizations that customized and implemented the initiative helps explain the variability in the estimated effects. Initiative models that included registered nurses or nurse practitioners who provided consistent clinical care for residents demonstrated higher staff engagement and more positive outcomes, compared to models providing only education or intermittent clinical care. These results provide promising evidence of an effective approach for reducing avoidable hospitalizations among nursing facility residents.

  21. Quantifying microwear on experimental Mistassini quartzite scrapers: preliminary results of exploratory research using LSCM and scale-sensitive fractal analysis.

    PubMed

    Stemp, W James; Lerner, Harry J; Kristant, Elaine H

    2013-01-01

    Although previous use-wear studies involving quartz and quartzite have been undertaken by archaeologists, these are comparatively few in number. Moreover, there has been relatively little effort to quantify use-wear on stone tools made from quartzite. The purpose of this article is to determine the effectiveness of a measurement system, laser scanning confocal microscopy (LSCM), to document the surface roughness or texture of experimental Mistassini quartzite scrapers used on two different contact materials (fresh and dry deer hide). As in previous studies using LSCM on chert, flint, and obsidian, this exploratory study incorporates a mathematical algorithm that permits the discrimination of surface roughness based on comparisons at multiple scales. Specifically, we employ measures of relative area (RelA) coupled with the F-test to discriminate used from unused stone tool surfaces, as well as surfaces of quartzite scrapers used on dry and fresh deer hide. Our results further demonstrate the effect of raw material variation on use-wear formation and its documentation using LSCM and RelA.
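
    A minimal sketch of the relative-area idea, using our own simplified triangulation rather than the authors' software: RelA at one lateral scale is the triangulated surface area of a height map sampled at that scale, divided by the projected (nominal) area. It equals 1 for a flat surface and grows with roughness as the scale shrinks.

    ```python
    import numpy as np

    def relative_area(z, dx, step):
        """Relative area (RelA) of height map z (2D array, grid
        spacing dx) at the lateral scale set by sampling interval
        `step`: triangulated area / projected area."""
        zs = z[::step, ::step]
        d = dx * step
        surf = 0.0
        for i in range(zs.shape[0] - 1):
            for j in range(zs.shape[1] - 1):
                z00, z10 = zs[i, j], zs[i + 1, j]
                z01, z11 = zs[i, j + 1], zs[i + 1, j + 1]
                # two triangles per grid cell
                a = np.array([d, 0.0, z10 - z00])
                b = np.array([0.0, d, z01 - z00])
                surf += 0.5 * np.linalg.norm(np.cross(a, b))
                a = np.array([-d, 0.0, z01 - z11])
                b = np.array([0.0, -d, z10 - z11])
                surf += 0.5 * np.linalg.norm(np.cross(a, b))
        nominal = (zs.shape[0] - 1) * (zs.shape[1] - 1) * d * d
        return surf / nominal
    ```

    Computing RelA over a series of `step` values gives the scale series to which the F-test is then applied, comparing used against unused surfaces scale by scale.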

  22. Mitochondrial DNA transmitted from sperm in the blue mussel Mytilus galloprovincialis showing doubly uniparental inheritance of mitochondria, quantified by real-time PCR.

    PubMed

    Sano, Natsumi; Obata, Mayu; Komaru, Akira

    2010-07-01

    Doubly uniparental inheritance (DUI) of mitochondrial DNA transmission to progeny has been reported in the mussel, Mytilus. In DUI, males have both paternally (M type) and maternally (F type) transmitted mitochondrial DNA (mtDNA), but females have only the F type. To estimate how much M type mtDNA enters the egg with sperm in the DUI system, ratios of M type to F type mtDNA were measured before and after fertilization. M type mtDNA content in eggs increased markedly after fertilization. Similar patterns in M type content changes after fertilization were observed in crosses using the same males. To compare mtDNA quantities, we subsequently measured the ratios of mtDNA to the 28S ribosomal RNA gene (an endogenous control sequence) in sperm or unfertilized eggs using a real-time polymerase chain reaction (PCR) assay. F type content in unfertilized eggs was greater than the M type in sperm by about 1000-fold on average. M type content in spermatozoa was greater than in unfertilized eggs, but their distributions overlapped. These results may explain the post-fertilization changes in zygotic M type content. We previously demonstrated that paternal and maternal M type mtDNAs are transmitted to offspring, and hypothesized that the paternal M type contributed to M type transmission to the next generation more than the maternal type did. These quantitative data on M and F type mtDNA in sperm and eggs provide further support for that hypothesis.
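
    The relative abundances reported here are the kind of quantity the standard ΔCt treatment of real-time PCR data yields. Below is a minimal sketch assuming equal, ideal amplification efficiencies; the abstract does not state the authors' exact quantification model, and the Ct values are illustrative only.

    ```python
    def relative_to_reference(ct_target, ct_ref, efficiency=2.0):
        """Target abundance (e.g., F- or M-type mtDNA) relative to an
        endogenous control (e.g., the 28S rRNA gene), from qPCR
        threshold cycles, assuming equal amplification efficiency."""
        return efficiency ** (ct_ref - ct_target)

    # Illustrative numbers only (not from the study):
    f_egg = relative_to_reference(ct_target=18.0, ct_ref=22.0)    # 16x control
    m_sperm = relative_to_reference(ct_target=28.0, ct_ref=22.0)  # 1/64 control
    print(f_egg / m_sperm)  # ~1000-fold difference, as in the abstract
    ```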

  23. Seeking to quantify the ferromagnetic-to-antiferromagnetic interface coupling resulting in exchange bias with various thin-film conformations

    SciTech Connect

    Hsiao, C. H.; Wang, S.; Ouyang, H.; Desautels, R. D.; Lierop, J. van; Lin, K. W.

    2014-08-07

    Ni₃Fe/(Ni, Fe)O thin films with bilayer and nanocrystallite dispersion morphologies are prepared with a dual ion beam deposition technique permitting precise control of nanocrystallite growth, composition, and admixtures. A bilayer morphology provides a Ni₃Fe-to-NiO interface, while the dispersion films have different mixtures of Ni₃Fe, NiO, and FeO nanocrystallites. Using detailed analyses of high resolution transmission electron microscopy images with Multislice simulations, the nanocrystallites' structures and phases are determined, and the intermixing between the Ni₃Fe, NiO, and FeO interfaces is quantified. From field-cooled hysteresis loops, the exchange bias loop shift from spin interactions at the interfaces is determined. With similar interfacial molar ratios of FM-to-AF, we find the exchange bias field essentially unchanged. However, when the interfacial ratio of FM to AF was FM rich, the exchange bias field increases. Since the FM/AF interface 'contact' areas in the nanocrystallite dispersion films are larger than that of the bilayer film, and the nanocrystallite dispersions exhibit larger FM-to-AF interfacial contributions to the magnetism, we attribute the changes in the exchange bias to increases in the interfacial segments that suffer defects (such as vacancies and bond distortions), which also affect the coercive fields.

  24. Comparison of gas analyzers for quantifying eddy covariance fluxes - results from an irrigated alfalfa field in Davis, CA

    NASA Astrophysics Data System (ADS)

    Chan, S.; Biraud, S.; Polonik, P.; Billesbach, D.; Hanson, C. V.; Bogoev, I.; Conrad, B.; Alstad, K. P.; Burba, G. G.; Li, J.

    2015-12-01

    The eddy covariance technique requires simultaneous, rapid measurements of wind components and scalars (e.g., water vapor, carbon dioxide) to calculate the vertical exchange due to turbulent processes. The technique has been used extensively as a non-intrusive means to quantify land-atmosphere exchanges of mass and energy. A variety of sensor technologies and gas sampling designs have been tried. Gas concentrations are commonly measured using infrared or laser absorption spectroscopy. Open-path sensors directly sample the ambient environment but suffer when the sample volume is obstructed (e.g., rain, dust). Closed-path sensors utilize pumps to draw air into the analyzer through inlet tubes, which can attenuate the signal. Enclosed-path sensors are a newer hybrid of the open- and closed-path designs in which the sensor is mounted in the environment and the sample is drawn through a short inlet tube with a short residence time. Five gas analyzers were evaluated as part of this experiment: open-path LI-COR 7500A, enclosed-path LI-COR 7200, closed-path Picarro G2311-f, open-path Campbell Scientific IRGASON, and enclosed-path Campbell Scientific EC155. We compared the relative performance of the gas analyzers over an irrigated alfalfa field in Davis, CA. The field was host to a range of ancillary measurements including below-ground sensors and a weighing lysimeter. The crop was flood irrigated and harvested monthly. To compare sensors, we evaluated the half-hour mean and variance of gas concentrations (or mole densities). Power spectra for the gas analyzers and turbulent fluxes (from a common sonic anemometer) were also calculated and analyzed. Eddy covariance corrections will be discussed as they relate to sensor design (e.g., density corrections, signal attenuation).
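
    At its core, the flux against which these analyzers are compared is a block-averaged covariance. A minimal sketch of the uncorrected calculation for one half-hour window, before the density and attenuation corrections mentioned above:

    ```python
    import numpy as np

    def ec_flux(w, c):
        """Kinematic eddy flux for one averaging block: mean(w'c'),
        where primes are fluctuations about the block means.
        w: vertical wind (m s-1); c: scalar concentration."""
        return np.mean((w - np.mean(w)) * (c - np.mean(c)))

    # Example: 30 min at 10 Hz = 18,000 samples per block.
    # flux = ec_flux(w_block, co2_block)  # umol m-2 s-1 if c is umol m-3
    ```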

  25. Clean Colon Software Program (CCSP), Proposal of a standardized Method to quantify Colon Cleansing During Colonoscopy: Preliminary Results

    PubMed Central

    Rosa-Rizzotto, Erik; Dupuis, Adrian; Guido, Ennio; Caroli, Diego; Monica, Fabio; Canova, Daniele; Cervellin, Erica; Marin, Renato; Trovato, Cristina; Crosta, Cristiano; Cocchio, Silvia; Baldo, Vincenzo; De Lazzari, Franca

    2015-01-01

    Background and study aims: Neoplastic lesions can be missed during colonoscopy, especially when cleansing is inadequate. Bowel preparation scales have significant limitations and no objective and standardized method currently exists to establish colon cleanliness during colonoscopy. The aims of our study are to create a software algorithm that is able to analyze bowel cleansing during colonoscopies and to compare it to a validated bowel preparation scale. Patients and methods: A software application (the Clean Colon Software Program, CCSP) was developed. Fifty colonoscopies were carried out and video-recorded. Each video was divided into 3 segments: cecum-hepatic flexure (1st segment), hepatic flexure-descending colon (2nd segment) and rectosigmoid segment (3rd segment). Each segment was recorded twice, both before and after careful cleansing of the intestinal wall. A score from 0 (dirty) to 3 (clean) was then assigned by CCSP. All the videos were also viewed by four endoscopists and colon cleansing was established using the Boston Bowel Preparation Scale. The interclass correlation coefficient was then calculated between the endoscopists and the software. Results: The cleansing score of the prelavage colonoscopies was 1.56 ± 0.52 and the postlavage one was 2.08 ± 0.59 (P < 0.001), showing an approximate 33.3 % improvement in cleansing after lavage. The right colon segment prelavage (0.99 ± 0.69) was dirtier than the left colon segment prelavage (2.07 ± 0.71). The overall interobserver agreement between the average cleansing score for the 4 endoscopists and the software pre-cleansing was 0.87 (95 % CI, 0.84-0.90) and post-cleansing was 0.86 (95 % CI, 0.83-0.89). Conclusions: The software is able to discriminate clean from non-clean colon tracts with high significance and is comparable to endoscopist evaluation. PMID:26528508

  26. Quantifying the effect of crop surface albedo variability on GHG budgets in a life cycle assessment approach: methodology and results.

    NASA Astrophysics Data System (ADS)

    Ferlicoq, Morgan; Ceschia, Eric; Brut, Aurore; Tallec, Tiphaine

    2013-04-01

    We tested a new method to estimate the radiative forcing of several crops at the annual and rotation scales, using local measurement data from two ICOS experimental sites. We used jointly 1) the radiative forcing caused by greenhouse gas (GHG) net emissions, calculated by using a Life Cycle Analysis (LCA) approach and in situ measurements (Ceschia et al. 2010), and 2) the radiative forcing caused by rapid changes in surface albedo typical of those ecosystems and resulting from management and crop phenology. The carbon and GHG budgets (GHGB) of 2 crop sites with contrasted management located in South West France (Auradé and Lamasquère sites) were estimated over a complete rotation by combining a classical LCA approach with on-site flux measurements. At both sites, carbon inputs (organic fertilisation and seeds), carbon exports (harvest) and net ecosystem production (NEP), measured with the eddy covariance technique, were calculated. The variability of the different terms and their relative contributions to the net ecosystem carbon budget (NECB) were analysed for all site-years, and the effect of management on NECB was assessed. To account for GHG fluxes that were not directly measured on site, we estimated the emissions caused by field operations (EFO) for each site using emission factors from the literature. The EFO were added to the NECB to calculate the total GHGB for a range of cropping systems and management regimes. N2O emissions were calculated following the IPCC (2007) guidelines, and CH4 emissions were assumed to be negligible compared to other contributions to the net GHGB. Additionally, albedo was calculated continuously using the short wave incident and reflected radiation measurements in the field (0.3-3 µm) from CNR1 sensors. Mean annual differences in albedo and the deduced radiative forcing from a reference value were then compared for all site-years. Mean annual differences in radiative forcing were then converted into g C equivalent m⁻² in order
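
    The albedo term of the budget can be written compactly. A sketch in our notation, neglecting atmospheric transmission effects: SW↓ is the incident shortwave radiation measured by the CNR1 sensors and α_ref the chosen reference albedo.

    ```latex
    \Delta F_{\alpha}(t) \;=\; -\,SW{\downarrow}(t)\,\bigl(\alpha(t)-\alpha_{\mathrm{ref}}\bigr),
    \qquad
    \overline{\Delta F_{\alpha}} \;=\; \frac{1}{T}\int_{0}^{T} \Delta F_{\alpha}(t)\,dt .
    ```

    The mean annual forcing is then converted into a CO2-equivalent carbon flux (g C m⁻²) so that the albedo effect can be compared with the other GHG budget terms.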

  27. Conventional physical therapy and physical therapy based on reflex stimulation showed similar results in children with myelomeningocele.

    PubMed

    Aizawa, Carolina Y P; Morales, Mariana P; Lundberg, Carolina; Moura, Maria Clara D Soares de; Pinto, Fernando C G; Voos, Mariana C; Hasue, Renata H

    2017-03-01

    We aimed to investigate whether infants with myelomeningocele would improve their motor ability and functional independence after ten sessions of physical therapy and compare the outcomes of conventional physical therapy (CPT) to a physical therapy program based on reflex stimulation (RPT). Twelve children were allocated to CPT (n = 6, age 18.3 months) or RPT (n = 6, age 18.2 months). The RPT involved proprioceptive neuromuscular facilitation. Children were assessed with the Gross Motor Function Measure and the Pediatric Evaluation of Disability Inventory before and after treatment. Mann-Whitney tests compared the improvement on the two scales of CPT versus RPT and the Wilcoxon test compared CPT to RPT (before vs. after treatment). Possible correlations between the two scales were tested with Spearman correlation coefficients. Both groups showed improvement on self-care and mobility domains of both scales. There were no differences between the groups, before, or after intervention. The CPT and RPT showed similar results after ten weeks of treatment.

  28. The ankle ergometer: A new tool for quantifying changes in mechanical properties of human muscle as a result of spaceflight

    NASA Astrophysics Data System (ADS)

    Mainar, A.; Vanhoutte, C.; Pérot, C.; Voronine, L.; Goubel, F.

    A mechanical device for studying changes in the mechanical properties of human muscle as a result of spaceflight is presented. Its main capability is to allow, within a given experiment, investigation of both the contractile and the visco-elastic properties of a musculo-articular complex using, respectively, isometric contractions, isokinetic movements, quick-release tests and sinusoidal perturbations. The device is a motor-driven ergometer associated with an experimental protocol designed for pre- and post-flight experiments. As microgravity preferentially affects postural muscles, the apparatus was designed to test muscle groups crossing the ankle joint. Three subjects were tested during the Euromir '94 mission. Preliminary results obtained on the European astronaut are briefly reported. During the next two years the experiments will be performed during six missions.

  29. Native trees show conservative water use relative to invasive trees: results from a removal experiment in a Hawaiian wet forest

    PubMed Central

    Cavaleri, Molly A.; Ostertag, Rebecca; Cordell, Susan; Sack, Lawren

    2014-01-01

    While the supply of freshwater is expected to decline in many regions in the coming decades, invasive plant species, often ‘high water spenders’, are greatly expanding their ranges worldwide. In this study, we quantified the ecohydrological differences between native and invasive trees and also the effects of woody invasive removal on plot-level water use in a heavily invaded mono-dominant lowland wet tropical forest on the Island of Hawaii. We measured transpiration rates of co-occurring native and invasive tree species with and without woody invasive removal treatments. Twenty native Metrosideros polymorpha and 10 trees each of three invasive species, Cecropia obtusifolia, Macaranga mappa and Melastoma septemnervium, were instrumented with heat-dissipation sap-flux probes in four 100 m² plots (two invaded, two removal) for 10 months. In the invaded plots, where both natives and invasives were present, Metrosideros had the lowest sap-flow rates per unit sapwood, but the highest sap-flow rates per whole tree, owing to its larger mean diameter than the invasive trees. Stand-level water use within the removal plots was half that of the invaded plots, even though the removal of invasives caused a small but significant increase in compensatory water use by the remaining native trees. By investigating the effects of invasive species on ecohydrology and comparing native vs. invasive physiological traits, we not only gain understanding about the functioning of invasive species, but we also highlight potential water-conservation strategies for heavily invaded mono-dominant tropical forests worldwide. Native-dominated forests free of invasive species can be conservative in overall water use, providing a strong rationale for the control of invasive species and preservation of native-dominated stands. PMID:27293637

  30. Native trees show conservative water use relative to invasive trees: results from a removal experiment in a Hawaiian wet forest.

    PubMed

    Cavaleri, Molly A; Ostertag, Rebecca; Cordell, Susan; Sack, Lawren

    2014-01-01

    While the supply of freshwater is expected to decline in many regions in the coming decades, invasive plant species, often 'high water spenders', are greatly expanding their ranges worldwide. In this study, we quantified the ecohydrological differences between native and invasive trees and also the effects of woody invasive removal on plot-level water use in a heavily invaded mono-dominant lowland wet tropical forest on the Island of Hawaii. We measured transpiration rates of co-occurring native and invasive tree species with and without woody invasive removal treatments. Twenty native Metrosideros polymorpha and 10 trees each of three invasive species, Cecropia obtusifolia, Macaranga mappa and Melastoma septemnervium, were instrumented with heat-dissipation sap-flux probes in four 100 m² plots (two invaded, two removal) for 10 months. In the invaded plots, where both natives and invasives were present, Metrosideros had the lowest sap-flow rates per unit sapwood, but the highest sap-flow rates per whole tree, owing to its larger mean diameter than the invasive trees. Stand-level water use within the removal plots was half that of the invaded plots, even though the removal of invasives caused a small but significant increase in compensatory water use by the remaining native trees. By investigating the effects of invasive species on ecohydrology and comparing native vs. invasive physiological traits, we not only gain understanding about the functioning of invasive species, but we also highlight potential water-conservation strategies for heavily invaded mono-dominant tropical forests worldwide. Native-dominated forests free of invasive species can be conservative in overall water use, providing a strong rationale for the control of invasive species and preservation of native-dominated stands.

  31. Quantifying chain reptation in entangled polymer melts: topological and dynamical mapping of atomistic simulation results onto the tube model.

    PubMed

    Stephanou, Pavlos S; Baig, Chunggi; Tsolou, Georgia; Mavrantzas, Vlasis G; Kröger, Martin

    2010-03-28

    The topological state of entangled polymers has been analyzed recently in terms of primitive paths which allowed obtaining reliable predictions of the static (statistical) properties of the underlying entanglement network for a number of polymer melts. Through a systematic methodology that first maps atomistic molecular dynamics (MD) trajectories onto time trajectories of primitive chains and then documents primitive chain motion in terms of a curvilinear diffusion in a tubelike region around the coarse-grained chain contour, we are extending these static approaches here even further by computing the most fundamental function of the reptation theory, namely, the probability ψ(s,t) that a segment s of the primitive chain remains inside the initial tube after time t, accounting directly for contour length fluctuations and constraint release. The effective diameter of the tube is independently evaluated by observing tube constraints either on atomistic displacements or on the displacement of primitive chain segments orthogonal to the initial primitive path. Having computed the tube diameter, the tube itself around each primitive path is constructed by visiting each entanglement strand along the primitive path one after the other and approximating it by the space of a small cylinder having the same axis as the entanglement strand itself and a diameter equal to the estimated effective tube diameter. Reptation of the primitive chain longitudinally inside the effective constraining tube as well as local transverse fluctuations of the chain driven mainly from constraint release and regeneration mechanisms are evident in the simulation results; the latter causes parts of the chains to venture outside their average tube surface for certain periods of time. The computed ψ(s,t) curves account directly for both of these phenomena, as well as for contour length fluctuations, since all of them are automatically captured in the atomistic simulations. Linear viscoelastic
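
    For pure reptation, with neither contour length fluctuations nor constraint release, the Doi-Edwards theory gives a benchmark form for the tube survival probability against which such simulation curves can be compared:

    ```latex
    \psi(s,t) \;=\; \sum_{p\;\mathrm{odd}} \frac{4}{p\pi}\,
    \sin\!\left(\frac{p\pi s}{L}\right)
    \exp\!\left(-\frac{p^{2}\,t}{\tau_{d}}\right),
    ```

    where L is the primitive path contour length and τ_d the disengagement time; deviations of the measured ψ(s,t) from this form quantify the contributions of contour length fluctuations and constraint release.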

  32. Genomic and Enzymatic Results Show Bacillus cellulosilyticus Uses a Novel Set of LPXTA Carbohydrases to Hydrolyze Polysaccharides

    PubMed Central

    Mead, David; Drinkwater, Colleen; Brumm, Phillip J.

    2013-01-01

    Background: Alkaliphilic Bacillus species are intrinsically interesting due to the bioenergetic problems posed by growth at high pH and high salt. Three alkaline cellulases have been cloned, sequenced and expressed from Bacillus cellulosilyticus N-4 (Bcell), making it an excellent target for genomic sequencing and mining of biomass-degrading enzymes. Methodology/Principal Findings: The genome of Bcell is a single chromosome of 4.7 Mb with no plasmids present and three large phage insertions. The most unusual feature of the genome is the presence of 23 LPXTA membrane anchor proteins; 17 of these are annotated as involved in polysaccharide degradation. These two values are significantly higher than seen in any other Bacillus species. This high number of membrane anchor proteins is seen only in pathogenic Gram-positive organisms such as Listeria monocytogenes or Staphylococcus aureus. Bcell also possesses four sortase D subfamily 4 enzymes that incorporate LPXTA-bearing proteins into the cell wall; three of these are closely related to each other and unique to Bcell. Cell fractionation and enzymatic assay of Bcell cultures show that the majority of polysaccharide degradation is associated with the cell wall LPXTA-enzymes, an unusual feature in Gram-positive aerobes. Genomic analysis and growth studies both strongly argue against Bcell being a truly cellulolytic organism, in spite of its name. Preliminary results suggest that fungal mycelia may be the natural substrate for this organism. Conclusions/Significance: Bacillus cellulosilyticus N-4, in spite of its name, does not possess any of the genes necessary for crystalline cellulose degradation, demonstrating the risk of classifying microorganisms without the benefit of genomic analysis. Bcell is the first Gram-positive aerobic organism shown to use predominantly cell-bound, non-cellulosomal enzymes for polysaccharide degradation. The LPXTA-sortase system utilized by Bcell may have applications both in anchoring

  33. Quantifying resilience

    USGS Publications Warehouse

    Allen, Craig R.; Angeler, David G.

    2016-01-01

    Several frameworks to operationalize resilience have been proposed. A decade ago, a special feature focused on quantifying resilience was published in the journal Ecosystems (Carpenter, Westley & Turner 2005). The approach there was towards identifying surrogates of resilience, but few of the papers proposed quantifiable metrics. Consequently, many ecological resilience frameworks remain vague and difficult to quantify, a problem that this special feature aims to address. However, considerable progress has been made during the last decade (e.g. Pope, Allen & Angeler 2014). Although some argue that resilience is best kept as an unquantifiable, vague concept (Quinlan et al. 2016), to be useful for managers, there must be concrete guidance regarding how and what to manage and how to measure success (Garmestani, Allen & Benson 2013; Spears et al. 2015). Ideas such as ‘resilience thinking’ have utility in helping stakeholders conceptualize their systems, but provide little guidance on how to make resilience useful for ecosystem management, other than suggesting an ambiguous, Goldilocks approach of being just right (e.g. diverse, but not too diverse; connected, but not too connected). Here, we clarify some prominent resilience terms and concepts, introduce and synthesize the papers in this special feature on quantifying resilience and identify core unanswered questions related to resilience.

  34. Development and application of methods to quantify spatial and temporal hyperpolarized 3He MRI ventilation dynamics: preliminary results in chronic obstructive pulmonary disease

    NASA Astrophysics Data System (ADS)

    Kirby, Miranda; Wheatley, Andrew; McCormack, David G.; Parraga, Grace

    2010-03-01

    Hyperpolarized helium-3 (3He) magnetic resonance imaging (MRI) has emerged as a non-invasive research method for quantifying lung structural and functional changes, enabling direct visualization in vivo at high spatial and temporal resolution. Here we describe the development of methods for quantifying ventilation dynamics in response to salbutamol in Chronic Obstructive Pulmonary Disease (COPD). A whole-body 3.0 Tesla Excite 12.0 MRI system was used to obtain multi-slice coronal images acquired immediately after subjects inhaled hyperpolarized 3He gas. Ventilated volume (VV), ventilation defect volume (VDV) and thoracic cavity volume (TCV) were recorded following segmentation of 3He and 1H images respectively, and used to calculate percent ventilated volume (PVV) and ventilation defect percent (VDP). Manual segmentation and Otsu thresholding were significantly correlated for VV (r=.82, p=.001), VDV (r=.87, p=.0002), PVV (r=.85, p=.0005), and VDP (r=.85, p=.0005). The level of agreement between these segmentation methods was also evaluated using Bland-Altman analysis, and this showed that manual segmentation was consistently higher for VV (Mean=.22 L, SD=.05) and consistently lower for VDV (Mean=-.13, SD=.05) measurements than Otsu thresholding. To automate the quantification of newly ventilated pixels (NVp) post-bronchodilator, we used translation, rotation, and scaling transformations to register pre- and post-salbutamol images. There was a significant correlation between NVp and VDV (r=-.94, p=.005) and between percent newly ventilated pixels (PNVp) and VDP (r=-.89, p=.02), but not for VV or PVV. Evaluation of 3He MRI ventilation dynamics using Otsu thresholding and landmark-based image registration provides a way to regionally quantify functional changes in COPD subjects after treatment with beta-agonist bronchodilators, a common COPD and asthma therapy.
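
    A minimal sketch of the Otsu-thresholding branch of this pipeline, assuming the 3He image and a thoracic cavity mask (from the 1H image) are already available, and treating defect volume as TCV minus VV; the function and variable names are ours, not the study's:

    ```python
    import numpy as np
    from skimage.filters import threshold_otsu

    def ventilation_metrics(he3, thoracic_mask, voxel_ml):
        """PVV and VDP from a 3He image and a 1H-derived thoracic
        cavity mask, via Otsu thresholding of signal inside the mask."""
        t = threshold_otsu(he3[thoracic_mask])
        ventilated = (he3 > t) & thoracic_mask
        vv = ventilated.sum() * voxel_ml        # ventilated volume (mL)
        tcv = thoracic_mask.sum() * voxel_ml    # thoracic cavity volume (mL)
        pvv = 100.0 * vv / tcv                  # percent ventilated volume
        vdp = 100.0 * (tcv - vv) / tcv          # ventilation defect percent
        return pvv, vdp
    ```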

  35. Quantifying Surface Processes and Stratigraphic Characteristics Resulting from Large Magnitude High Frequency and Small Magnitude Low Frequency Relative Sea Level Cycles: An Experimental Study

    NASA Astrophysics Data System (ADS)

    Yu, L.; Li, Q.; Esposito, C. R.; Straub, K. M.

    2015-12-01

    Relative Sea-Level (RSL) change, which is a primary control on sequence stratigraphic architecture, has a close relationship with climate change. In order to explore the influence of RSL change on the stratigraphic record, we conducted three physical experiments which shared identical boundary conditions but differed in their RSL characteristics. Specifically, the three experiments differed with respect to two non-dimensional numbers that compare the magnitude and periodicity of RSL cycles to the spatial and temporal scales of autogenic processes, respectively. The magnitude of RSL change is quantified with H*, defined as the peak to trough difference in RSL during a cycle divided by a system's maximum autogenic channel depth. The periodicity of RSL change is quantified with T*, defined as the period of RSL cycles divided by the time required to deposit one channel depth of sediment, on average, everywhere in the basin. Experiments performed included: 1) a control experiment lacking RSL cycles, used to define a system's autogenics, 2) a high magnitude, high frequency RSL cycles experiment, and 3) a low magnitude, low frequency cycles experiment. We observe that the high magnitude, high frequency experiment resulted in the thickest channel bodies with the lowest width-to-depth ratios, while the low magnitude, low frequency experiment preserves a record of gradual shoreline transgression and regression, producing facies that are the most continuous in space. We plan to integrate our experimental results with Delft3D numerical models that sample similar non-dimensional characteristics of RSL cycles. Quantifying the influence of RSL change, normalized as a function of the spatial and temporal scales of autogenic processes, will strengthen our ability to predict stratigraphic architecture and invert stratigraphy for paleo-environmental conditions.
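
    For concreteness, the two non-dimensional numbers defined above reduce to simple ratios; this sketch uses invented variable names and example magnitudes, not values from the experiments.

        def h_star(rsl_peak_to_trough, max_autogenic_channel_depth):
            # peak-to-trough RSL change during a cycle / deepest autogenic channel
            return rsl_peak_to_trough / max_autogenic_channel_depth

        def t_star(rsl_cycle_period, time_to_deposit_one_channel_depth):
            # cycle period / mean time to aggrade one channel depth basin-wide
            return rsl_cycle_period / time_to_deposit_one_channel_depth

        print(h_star(0.05, 0.04), t_star(12.0, 50.0))   # illustrative inputs only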

  16. A collaborative accountable care model in three practices showed promising early results on costs and quality of care.

    PubMed

    Salmon, Richard B; Sanderson, Mark I; Walters, Barbara A; Kennedy, Karen; Flores, Robert C; Muney, Alan M

    2012-11-01

    Cigna's Collaborative Accountable Care initiative provides financial incentives to physician groups and integrated delivery systems to improve the quality and efficiency of care for patients in commercial open-access benefit plans. Registered nurses who serve as care coordinators employed by participating practices are a central feature of the initiative. They use patient-specific reports and practice performance reports provided by Cigna to improve care coordination, identify and close care gaps, and address other opportunities for quality improvement. We report interim quality and cost results for three geographically and structurally diverse provider practices in Arizona, New Hampshire, and Texas. Although not statistically significant, these early results revealed favorable trends in total medical costs and quality of care, suggesting that a shared-savings accountable care model and collaborative support from the payer can enable practices to take meaningful steps toward full accountability for care quality and efficiency.

  17. Volar locking distal radius plates show better short-term results than other treatment options: A prospective randomised controlled trial

    PubMed Central

    Drobetz, Herwig; Koval, Lidia; Weninger, Patrick; Luscombe, Ruth; Jeffries, Paula; Ehrendorfer, Stefan; Heal, Clare

    2016-01-01

    AIM To compare the outcomes of displaced distal radius fractures treated with volar locking plates and immediate postoperative mobilisation with the outcomes of these fractures treated with modalities that necessitate 6 wk of wrist immobilisation. METHODS A prospective, randomised controlled single-centre trial was conducted in which 56 patients with a displaced distal radius fracture were randomised to treatment either with a volar locking plate (n = 29) or with another treatment modality (n = 27; cast immobilisation with or without wires or external fixator). Outcomes were measured at 12 wk. Functional outcome scores measured were the Patient-Rated Wrist Evaluation (PRWE) Score; Disabilities of the Arm, Shoulder and Hand; and activities of daily living (ADLs). Clinical outcomes were wrist range of motion and grip strength. Radiographic parameters were volar inclination and ulnar variance. RESULTS Patients in the volar locking plate group had significantly better PRWE scores, ADL scores, grip strength and range of extension at 3 mo compared with the control group. All radiological parameters were significantly better in the volar locking plate group at 3 mo. CONCLUSION The present study suggests that volar locking plates produce significantly better functional and clinical outcomes at 3 mo compared with other treatment modalities. Anatomical reduction was significantly more likely to be preserved in the plating group. Level of evidence: II. PMID:27795951

  18. Selection Indices and Multivariate Analysis Show Similar Results in the Evaluation of Growth and Carcass Traits in Beef Cattle

    PubMed Central

    Brito Lopes, Fernando; da Silva, Marcelo Corrêa; Magnabosco, Cláudio Ulhôa; Goncalves Narciso, Marcelo; Sainz, Roberto Daniel

    2016-01-01

    This research evaluated a multivariate approach as an alternative tool for the purpose of selection regarding expected progeny differences (EPDs). Data were fitted using a multi-trait model and consisted of growth traits (birth weight and weights at 120, 210, 365 and 450 days of age) and carcass traits (longissimus muscle area (LMA), back-fat thickness (BF), and rump fat thickness (RF)), registered over 21 years in extensive breeding systems of Polled Nellore cattle in Brazil. Multivariate analyses were performed using standardized (zero mean and unit variance) EPDs. The k-means method revealed that the best fit of data occurred using three clusters (k = 3) (P < 0.001). Estimates of genetic correlation among growth and carcass traits and the estimates of heritability were moderate to high, suggesting that a correlated response approach is suitable for practical decision making. Estimates of correlation between selection indices and the multivariate index (LD1) were moderate to high, ranging from 0.48 to 0.97. This reveals that both types of indices give similar results and that the multivariate approach is reliable for the purpose of selection. The alternative tool seems very handy when economic weights are not available or in cases where more rapid identification of the best animals is desired. Interestingly, multivariate analysis allowed forecasting based on the relationships among breeding values (EPDs). Also, it enabled fine discrimination, rapid data summarization after genetic evaluation, and permitted accounting for maternal ability and the genetic direct potential of the animals. In addition, we recommend the use of longissimus muscle area and subcutaneous fat thickness as selection criteria, to allow estimation of breeding values before the first mating season in order to accelerate the response to individual selection. PMID:26789008
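
    A minimal sketch of the workflow as described, assuming scikit-learn: standardize the EPDs, cluster with k-means (k = 3), and derive a first linear discriminant (LD1) as the multivariate index. The EPD matrix here is random placeholder data.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.cluster import KMeans
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        epds = np.random.default_rng(0).random((500, 8))    # placeholder EPDs, 8 traits
        z = StandardScaler().fit_transform(epds)            # zero mean, unit variance
        clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)
        ld1 = LinearDiscriminantAnalysis(n_components=1).fit_transform(z, clusters)
        print(ld1[:5].ravel())                              # multivariate index values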

  20. QUANTIFYING FOREST ABOVEGROUND CARBON POOLS AND FLUXES USING MULTI-TEMPORAL LIDAR A report on field monitoring, remote sensing MMV, GIS integration, and modeling results for forestry field validation test to quantify aboveground tree biomass and carbon

    SciTech Connect

    Lee Spangler; Lee A. Vierling; Eva K. Stand; Andrew T. Hudak; Jan U.H. Eitel; Sebastian Martinuzzi

    2012-04-01

    Sound policy recommendations relating to the role of forest management in mitigating atmospheric carbon dioxide (CO2) depend upon establishing accurate methodologies for quantifying forest carbon pools for large tracts of land that can be dynamically updated over time. Light Detection and Ranging (LiDAR) remote sensing is a promising technology for achieving accurate estimates of aboveground biomass and thereby carbon pools; however, not much is known about the accuracy of estimating biomass change and carbon flux from repeat LiDAR acquisitions containing different data sampling characteristics. In this study, discrete return airborne LiDAR data was collected in 2003 and 2009 across approximately 20,000 hectares (ha) of an actively managed, mixed conifer forest landscape in northern Idaho, USA. Forest inventory plots, established via a random stratified sampling design, were established and sampled in 2003 and 2009. The Random Forest machine learning algorithm was used to establish statistical relationships between inventory data and forest structural metrics derived from the LiDAR acquisitions. Aboveground biomass maps were created for the study area based on statistical relationships developed at the plot level. Over this 6-year period, we found that the mean increase in biomass due to forest growth across the non-harvested portions of the study area was 4.8 metric tons per hectare (Mg/ha). In these non-harvested areas, we found a significant difference in biomass increase among forest successional stages, with a higher biomass increase in mature and old forest compared to stand initiation and young forest. Approximately 20% of the landscape had been disturbed by harvest activities during the six-year time period, representing a biomass loss of >70 Mg/ha in these areas. During the study period, these harvest activities outweighed growth at the landscape scale, resulting in an overall loss in aboveground carbon at this site. The 30-fold increase in sampling density
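
    A hedged sketch of the plot-level modeling step described above, assuming scikit-learn; the LiDAR metrics and biomass values are synthetic stand-ins, and the mapping step is indicated only in a comment.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        lidar_metrics = rng.random((200, 12))               # per-plot structural metrics
        biomass = 50 + 300 * lidar_metrics[:, 0] + rng.normal(0, 10, 200)  # Mg/ha

        rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
        rf.fit(lidar_metrics, biomass)
        print(f"OOB R^2: {rf.oob_score_:.2f}")
        # Applying rf.predict to gridded 2003 and 2009 metrics and differencing the
        # two maps yields the biomass change (and hence carbon flux) discussed above.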

  1. Quantifying Quantumness

    NASA Astrophysics Data System (ADS)

    Braun, Daniel; Giraud, Olivier; Braun, Peter A.

    2010-03-01

    We introduce and study a measure of "quantumness" of a quantum state based on its Hilbert-Schmidt distance from the set of classical states. "Classical states" were defined earlier as states for which a positive P-function exists, i.e. they are mixtures of coherent states [1]. We study invariance properties of the measure, upper bounds, and its relation to entanglement measures. We evaluate the quantumness of a number of physically interesting states and show that for any physical system in thermal equilibrium there is a finite critical temperature above which quantumness vanishes. We then use the measure for identifying the "most quantum" states. Such states are expected to be potentially most useful for quantum information theoretical applications. We find these states explicitly for low-dimensional spin systems, and show that they possess beautiful, highly symmetric Majorana representations. [1] Classicality of spin states, Olivier Giraud, Petr Braun, and Daniel Braun, Phys. Rev. A 78, 042112 (2008)
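
    A minimal restatement of the measure as described, with notation assumed here rather than taken from the paper:

        Q(\rho) = \min_{\sigma \in \mathcal{C}} \left\| \rho - \sigma \right\|_{\mathrm{HS}},
        \qquad \left\| A \right\|_{\mathrm{HS}} = \sqrt{\mathrm{Tr}\,(A^{\dagger} A)},

    where \mathcal{C} denotes the convex set of classical states, i.e. mixtures of coherent states with a positive P-function.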

  2. In axial spondyloarthritis, never smokers, ex-smokers and current smokers show a gradient of increasing disease severity - results from the Scotland Registry for Ankylosing Spondylitis (SIRAS).

    PubMed

    Jones, Gareth T; Ratz, Tiara; Dean, Linda E; Macfarlane, Gary J; Atzeni, Fabiola

    2016-11-29

    Objectives To examine the relationship between smoking, smoking cessation, and disease characteristics/quality of life (QoL) in spondyloarthritis. Methods The Scotland Registry for Ankylosing Spondylitis collects data from clinically diagnosed patients with spondyloarthritis. Clinical data, including the Bath Ankylosing Spondylitis indices of disease activity (BASDAI) and function (BASFI), were obtained from medical records. Postal questionnaires provided information on smoking status and QoL (Ankylosing Spondylitis QoL questionnaire; ASQoL). Linear and logistic regression quantified the effect of smoking, and of smoking cessation, on various disease-specific and QoL outcomes, adjusting for age, sex, deprivation, education and alcohol status. Results are presented as regression coefficients (β) or odds ratios (OR) with 95% confidence intervals. Results 946 participants provided data (male 73.5%, mean age 52 years). Current smoking was reported by 22%, and 38% were ex-smokers. Ever smokers experienced poorer BASDAI (β = 0.5; 0.2 to 0.9) and BASFI (β = 0.8; 0.4 to 1.2), and reported worse QoL (ASQoL, β = 1.5; 0.7 to 2.3). Compared to current smokers, ex-smokers reported lower disease activity (BASDAI, β = -0.5; -1.0 to -0.04) and significantly better QoL (ASQoL, β = -1.2; -2.3 to -0.2). They were also more likely to have a history of uveitis (OR = 2.4; 1.5 to 3.8). Conclusions Smokers with spondyloarthritis experience worse disease than never smokers. However, we provide new evidence that, among smokers, smoking cessation is associated with lower disease activity and better physical function and QoL. Clinicians should specifically promote smoking cessation as an adjunct to usual therapy in patients with spondyloarthritis.
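
    A hedged sketch of the adjusted regressions described above, using statsmodels formulas; the data frame is synthetic and all variable names are invented, so this shows only the shape of the analysis.

        import numpy as np, pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 946
        df = pd.DataFrame({
            "basdai": rng.normal(4, 2, n),                        # disease activity
            "smoking": rng.choice(["never", "ex", "current"], n),
            "age": rng.normal(52, 12, n),
            "sex": rng.choice(["m", "f"], n),
            "deprivation": rng.integers(1, 6, n),
            "education": rng.integers(0, 4, n),
            "alcohol": rng.choice(["yes", "no"], n),
        })
        m = smf.ols("basdai ~ C(smoking, Treatment('current')) + age + C(sex)"
                    " + deprivation + education + C(alcohol)", data=df).fit()
        print(m.params)           # betas; m.conf_int() gives the 95% CIs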

  3. Methods for Quantifying the Uncertainties of LSIT Test Parameters, Test Results, and Full-Scale Mixing Performance Using Models Developed from Scaled Test Data

    SciTech Connect

    Piepel, Gregory F.; Cooley, Scott K.; Kuhn, William L.; Rector, David R.; Heredia-Langner, Alejandro

    2015-05-01

    This report discusses the statistical methods for quantifying uncertainties in 1) test responses and other parameters in the Large Scale Integrated Testing (LSIT), and 2) estimates of coefficients and predictions of mixing performance from models that relate test responses to test parameters. Bechtel National, Inc. and the U.S. Department of Energy (DOE) have committed to testing at a larger scale in order to “address uncertainties and increase confidence in the projected, full-scale mixing performance and operations” in the Waste Treatment and Immobilization Plant (WTP).

  4. "The Show"

    ERIC Educational Resources Information Center

    Gehring, John

    2004-01-01

    For the past 16 years, the blue-collar city of Huntington, West Virginia, has rolled out the red carpet to welcome young wrestlers and their families as old friends. They have come to town chasing the same dream for a spot in what many of them call "The Show". For three days, under the lights of an arena packed with 5,000 fans, the…

  5. How to quantify ripple

    NASA Astrophysics Data System (ADS)

    Geib, H.; Kuehne, C.; Morgenbrod, E.

    In the present attempt to render numerically quantifiable the small-area errors in large telescope mirror manufacture known as 'ripple', two-dimensional regularity is omitted, which yields greater clarity and comparability of results. To measure the interference fringe, the central fringe is photometered in equidistant steps. Fourier analysis is applied to the averaged values, followed by a power-spectrum calculation. The test method is evaluated through the numerical examination of a ripple structure of known size and period length.
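
    A small sketch of the procedure as described: sample the central fringe at equidistant steps, Fourier-analyse the profile, and form a power spectrum. The fringe here is synthetic, with an invented ripple frequency.

        import numpy as np

        x = np.linspace(0, 1, 512)                        # normalized aperture coordinate
        fringe = 0.02 * np.sin(2 * np.pi * 35 * x)        # synthetic ripple, 35 cycles
        fringe += 0.005 * np.random.default_rng(0).normal(size=x.size)

        spectrum = np.abs(np.fft.rfft(fringe - fringe.mean()))**2
        freqs = np.fft.rfftfreq(x.size, d=x[1] - x[0])    # cycles per aperture
        print(freqs[spectrum.argmax()])                   # dominant ripple frequency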

  6. Quantifying concordance in cosmology

    NASA Astrophysics Data System (ADS)

    Seehars, Sebastian; Grandis, Sebastian; Amara, Adam; Refregier, Alexandre

    2016-05-01

    Quantifying the concordance between different cosmological experiments is important for testing the validity of theoretical models and systematics in the observations. In earlier work, we thus proposed the Surprise, a concordance measure derived from the relative entropy between posterior distributions. We revisit the properties of the Surprise and describe how it provides a general, versatile, and robust measure for the agreement between data sets. We also compare it to other measures of concordance that have been proposed for cosmology. As an application, we extend our earlier analysis and use the Surprise to quantify the agreement between WMAP 9, Planck 13, and Planck 15 constraints on the ΛCDM model. Using a principal component analysis in parameter space, we find that the large Surprise between WMAP 9 and Planck 13 (S = 17.6 bits, implying a deviation from consistency at 99.8% confidence) is due to a shift along a direction that is dominated by the amplitude of the power spectrum. The Planck 15 constraints deviate from the Planck 13 results (S = 56.3 bits), primarily due to a shift in the same direction. The Surprise between WMAP and Planck consequently disappears when moving to Planck 15 (S = -5.1 bits). This means that, unlike Planck 13, Planck 15 is not in tension with WMAP 9. These results illustrate the advantages of the relative entropy and the Surprise for quantifying the disagreement between cosmological experiments and more generally as an information metric for cosmology.
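
    The Surprise is built on the relative entropy between posteriors; for Gaussian posterior approximations this has a closed form. The sketch below computes that relative entropy in bits and is a generic implementation, not the authors' code.

        import numpy as np

        def kl_gaussian_bits(mu1, cov1, mu2, cov2):
            """D(P1 || P2) for multivariate normal posteriors, in bits."""
            k = len(mu1)
            inv2 = np.linalg.inv(cov2)
            dmu = np.asarray(mu2) - np.asarray(mu1)
            _, logdet1 = np.linalg.slogdet(cov1)
            _, logdet2 = np.linalg.slogdet(cov2)
            nats = 0.5 * (np.trace(inv2 @ cov1) + dmu @ inv2 @ dmu - k
                          + logdet2 - logdet1)
            return nats / np.log(2)

        mu = np.zeros(2)
        print(kl_gaussian_bits(mu, np.eye(2), mu + 0.5, 1.2 * np.eye(2)))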

  7. Analysis of conservative tracer measurement results using the Frechet distribution at planted horizontal subsurface flow constructed wetlands filled with coarse gravel and showing the effect of clogging processes.

    PubMed

    Dittrich, Ernő; Klincsik, Mihály

    2015-11-01

    A mathematical procedure, developed in the Maple environment, has been successful in decreasing the error of measurement results and in the precise calculation of the moments of corrected tracer functions. It was proved that with this procedure, the measured tracer results of horizontal subsurface flow constructed wetlands filled with coarse gravel (HSFCW-C) can be fitted more accurately than with the conventionally used distribution functions (Gaussian, Lognormal, Fick (Inverse Gaussian) and Gamma). This statement is true only for the planted HSFCW-Cs; the analysis of unplanted HSFCW-Cs needs more research. The results of the analysis show that the conventional solutions (completely stirred series tank reactor (CSTR) model and convection-dispersion transport (CDT) model) cannot describe these types of transport processes with sufficient accuracy. These outcomes can help in developing better descriptions of very difficult transport processes in HSFCW-Cs. Furthermore, a new mathematical procedure can be developed for the calculation of real hydraulic residence time (HRT) and dispersion coefficient values. The presented method can be generalized to other kinds of hydraulic environments.
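
    As a point of reference for the distribution fitting discussed above, SciPy exposes the Frechet distribution as invweibull; this sketch fits it to a synthetic residence-time sample and reports the moments used for HRT estimates. It is not the authors' Maple workflow.

        import numpy as np
        from scipy import stats

        residence_times = stats.invweibull(c=3, scale=20).rvs(500, random_state=42)

        c, loc, scale = stats.invweibull.fit(residence_times, floc=0)
        fitted = stats.invweibull(c, loc=loc, scale=scale)
        print(fitted.mean(), fitted.var())    # moments -> mean HRT and dispersion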

  8. T-cell lines from 2 patients with adenosine deaminase (ADA) deficiency showed the restoration of ADA activity resulted from the reversion of an inherited mutation.

    PubMed

    Ariga, T; Oda, N; Yamaguchi, K; Kawamura, N; Kikuta, H; Taniuchi, S; Kobayashi, Y; Terada, K; Ikeda, H; Hershfield, M S; Kobayashi, K; Sakiyama, Y

    2001-05-01

    Inherited deficiency of adenosine deaminase (ADA) results in one of the autosomal recessive forms of severe combined immunodeficiency. This report discusses 2 patients with ADA deficiency from different families, in whom a possible reverse mutation had occurred. The novel mutations were identified in the ADA gene from the patients, and both their parents were revealed to be carriers. Unexpectedly, established patient T-cell lines, not B-cell lines, showed half-normal levels of ADA enzyme activity. Reevaluation of the mutations in these T-cell lines indicated that one of the inherited ADA gene mutations was reverted in both patients. At least one of the patients seemed to possess the revertant cells in vivo; however, the mutant cells might have overcome the revertant after receiving ADA enzyme replacement therapy. These findings may have significant implications regarding the prospects for stem cell gene therapy for ADA deficiency.

  9. How often do German children and adolescents show signs of common mental health problems? Results from different methodological approaches – a cross-sectional study

    PubMed Central

    2014-01-01

    Background Child and adolescent mental health problems are ubiquitous and burdensome. Their impact on functional disability, the high rates of accompanying medical illnesses and the potential to last until adulthood make them a major public health issue. While methodological factors cause variability of the results from epidemiological studies, there is a lack of prevalence rates of mental health problems in children and adolescents according to ICD-10 criteria from nationally representative samples. International findings suggest only a small proportion of children with function impairing mental health problems receive treatment, but information about the health care situation of children and adolescents is scarce. The aim of this epidemiological study was a) to classify symptoms of common mental health problems according to ICD-10 criteria in order to compare the statistical and clinical case definition strategies using a single set of data and b) to report ICD-10 codes from health insurance claims data. Methods a) Based on a clinical expert rating, questionnaire items were mapped on ICD-10 criteria; data from the Mental Health Module (BELLA study) were analyzed for relevant ICD-10 and cut-off criteria; b) Claims data were analyzed for relevant ICD-10 codes. Results According to parent report, 7.5% (n = 208) met the ICD-10 criteria of a mild depressive episode and 11% (n = 305) showed symptoms of depression according to the cut-off score; anxiety was reported in 5.6% (n = 156) and 11.6% (n = 323), and conduct disorder in 15.2% (n = 373) and 14.6% (n = 357). Self-reported symptoms in 11 to 17 year olds resulted in 15% (n = 279) reporting signs of a mild depression according to ICD-10 criteria (vs. 16.7% (n = 307) based on cut-off) and 10.9% (n = 201) reported symptoms of anxiety (vs. 15.4% (n = 283)). Results from routine data identify 0.9% (n = 1,196) with a depression diagnosis, 3.1% (n = 6,729) with anxiety and 1.4% (n

  10. Quantifying edge significance on maintaining global connectivity

    PubMed Central

    Qian, Yuhua; Li, Yebin; Zhang, Min; Ma, Guoshuai; Lu, Furong

    2017-01-01

    Global connectivity is an important property of networks: the failure of a few key edges may lead to a breakdown of the whole system, so finding them provides a better understanding of system robustness. Based on topological information, we propose an approach named LE (link entropy) to quantify edge significance for maintaining global connectivity. We then compare LE with six acknowledged indices of edge significance: edge betweenness centrality, degree product, bridgeness, diffusion importance, topological overlap and k-path edge centrality. Experimental results show that the LE approach outperforms these indices in quantifying edge significance for maintaining global connectivity. PMID:28349923
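
    The LE measure itself is not specified in this record, so no sketch of it is attempted here; for orientation, this is one of the six baseline indices it was compared against (edge betweenness centrality), assuming networkx.

        import networkx as nx

        G = nx.karate_club_graph()                        # stand-in example network
        ebc = nx.edge_betweenness_centrality(G)           # edge -> centrality score
        key_edges = sorted(ebc, key=ebc.get, reverse=True)[:5]
        print(key_edges)     # candidate edges most critical to global connectivity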

  11. Transgene silencing of the Hutchinson-Gilford progeria syndrome mutation results in a reversible bone phenotype, whereas resveratrol treatment does not show overall beneficial effects.

    PubMed

    Strandgren, Charlotte; Nasser, Hasina Abdul; McKenna, Tomás; Koskela, Antti; Tuukkanen, Juha; Ohlsson, Claes; Rozell, Björn; Eriksson, Maria

    2015-08-01

    Hutchinson-Gilford progeria syndrome (HGPS) is a rare premature aging disorder that is most commonly caused by a de novo point mutation in exon 11 of the LMNA gene, c.1824C>T, which results in an increased production of a truncated form of lamin A known as progerin. In this study, we used a mouse model to study the possibility of recovering from HGPS bone disease upon silencing of the HGPS mutation, and the potential benefits from treatment with resveratrol. We show that complete silencing of the transgenic expression of progerin normalized bone morphology and mineralization already after 7 weeks. The improvements included lower frequencies of rib fractures and callus formation, an increased number of osteocytes in remodeled bone, and normalized dentinogenesis. The beneficial effects from resveratrol treatment were less significant and to a large extent similar to mice treated with sucrose alone. However, the reversal of the dental phenotype of overgrown and laterally displaced lower incisors in HGPS mice could be attributed to resveratrol. Our results indicate that the HGPS bone defects were reversible upon suppressed transgenic expression and suggest that treatments targeting aberrant progerin splicing give hope to patients who are affected by HGPS.

  12. A high-density wireless underground sensor network (WUSN) to quantify hydro-ecological interactions for a UK floodplain; project background and initial results

    NASA Astrophysics Data System (ADS)

    Verhoef, A.; Choudhary, B.; Morris, P. J.; McCann, J.

    2012-04-01

    Floodplain meadows support some of the most diverse vegetation in the UK, and also perform key ecosystem services, such as flood storage and sediment retention. However, the UK now has less than 1500 ha of this unique habitat remaining. In order to conserve and better exploit the services provided by this grassland, an improved understanding of its functioning is essential. Vegetation functioning and species composition are known to be tightly correlated to the hydrological regime, and related temperature and nutrient regime, but the mechanisms controlling these relationships are not well established. The FUSE* project aims to investigate the spatiotemporal variability in vegetation functioning (e.g. photosynthesis and transpiration) and plant community composition in a floodplain meadow near Oxford, UK (Yarnton Mead), and their relationship to key soil physical variables (soil temperature and moisture content), soil nutrient levels and the water- and energy-balance. A distributed high density Wireless Underground Sensor Network (WUSN) is in the process of being established on Yarnton Mead. The majority, or ideally all, of the sensing and transmitting components will be installed below-ground because Yarnton Mead is a SSSI (Site of Special Scientific Interest, due to its unique plant community) and because occasionally sheep or cattle are grazing on it, and that could damage the nodes. This prerequisite has implications for the maximum spacing between UG nodes and their communications technologies; in terms of signal strength, path losses and requirements for battery life. The success of underground wireless communication is highly dependent on the soil type and water content. This floodplain environment is particularly challenging in this context because the soil contains a large amount of clay near the surface and is therefore less favourable to EM wave propagation than sandy soils. Furthermore, due to high relative saturation levels (as a result of high

  13. Quantifying and Reducing the Uncertainties in Future Projections of Droughts and Heat Waves for North America that Result from the Diversity of Models in CMIP5

    NASA Astrophysics Data System (ADS)

    Herrera-Estrada, J. E.; Sheffield, J.

    2014-12-01

    There are many sources of uncertainty regarding the future projections of our climate, including the multiple possible Representative Concentration Pathways (RCPs), the variety of climate models used, and the initial and boundary conditions with which they are run. Moreover, it has been shown that the internal variability of the climate system can sometimes be of the same order of magnitude as the climate change signal or even larger for some variables. Nonetheless, in order to help inform stakeholders in water resources and agriculture in North America when developing adaptation strategies, particularly for extreme events such as droughts and heat waves, it is necessary to study the plausible range of changes that the region might experience during the 21st century. We aim to understand and reduce the uncertainties associated with this range of possible scenarios by focusing on the diversity of climate models involved in the Coupled Model Intercomparison Project Phase 5 (CMIP5). Data output from various CMIP5 models is compared against near surface climate and land-surface hydrological data from the North American Land Data Assimilation System (NLDAS)-2 to evaluate how well each climate model represents the land-surface processes associated with droughts and heat waves during the overlapping historical period (1979-2005). These processes include the representation of precipitation and radiation and their partitioning at the land surface, land-atmosphere interactions, and the propagation of signals of these extreme events through the land surface. The ability of the CMIP5 models to reproduce these important physical processes for regions of North America is used to inform a multi-model ensemble in which models that represent the processes relevant to droughts and heat waves better are given more importance. Furthermore, the future projections are clustered to identify possible dependencies in behavior across models. The results indicate a wide range in performance
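
    A hedged sketch of the kind of skill-weighted ensemble described above: models that better reproduce a reference (here a stand-in for NLDAS-2) receive more weight. The inverse-RMSE weighting is illustrative, not the study's scheme.

        import numpy as np

        rng = np.random.default_rng(0)
        models = rng.normal(1.0, 0.3, size=(10, 50))   # 10 models x 50 time steps
        reference = np.ones(50)                        # observed baseline

        rmse = np.sqrt(((models - reference) ** 2).mean(axis=1))
        weights = (1 / rmse) / (1 / rmse).sum()        # skill-based weights
        weighted_projection = weights @ models         # weighted ensemble series
        print(weights.round(3))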

  14. Quantifying Uncertainty in Expert Judgment: Initial Results

    DTIC Science & Technology

    2013-03-01


  15. Storytelling Slide Shows to Improve Diabetes and High Blood Pressure Knowledge and Self-Efficacy: Three-Year Results among Community Dwelling Older African Americans

    ERIC Educational Resources Information Center

    Bertera, Elizabeth M.

    2014-01-01

    This study combined the African American tradition of oral storytelling with the Hispanic medium of "Fotonovelas." A staggered pretest posttest control group design was used to evaluate four Storytelling Slide Shows on health that featured community members. A total of 212 participants were recruited for the intervention and 217 for the…

  16. Diesel Emissions Quantifier (DEQ)

    EPA Pesticide Factsheets

    The Diesel Emissions Quantifier (Quantifier) is an interactive tool to estimate emission reductions and cost effectiveness. Publications: EPA-420-F-13-008a (420f13008a), EPA-420-B-10-035 (420b10023), EPA-420-B-10-034 (420b10034)

  17. Quantifiable Lateral Flow Assay Test Strips

    NASA Technical Reports Server (NTRS)

    2003-01-01

    As easy to read as a home pregnancy test, three Quantifiable Lateral Flow Assay (QLFA) strips used to test water for E. coli show different results. The brightly glowing control line on the far right of each strip indicates that all three tests ran successfully. But the glowing test lines on the middle-left and bottom strips reveal that their samples were contaminated with E. coli bacteria at two different concentrations. The color intensity correlates with the concentration of contamination.

  18. Mathematical modelling in Matlab of the experimental results showing the electrochemical potential difference - temperature relationship of WC coatings immersed in a NaCl solution

    NASA Astrophysics Data System (ADS)

    Benea, M. L.; Benea, O. D.

    2016-02-01

    The method used for assessing the corrosion behaviour of the WC coatings deposited by plasma spraying on a martensitic stainless steel substrate consists in measuring the electrochemical potential of the coating, and that of the substrate, immersed in a NaCl solution as corrosive agent. Mathematical processing of the experimental results in Matlab allowed us to correlate the electrochemical potential of the coating with the solution temperature; the relationship is very well described by curves whose equations were obtained by fourth-order interpolation.
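
    A minimal sketch of the fitting step described, in NumPy rather than Matlab; the potential-temperature data are synthetic placeholders.

        import numpy as np

        temperature = np.linspace(20, 80, 13)                    # deg C
        rng = np.random.default_rng(0)
        potential = (-0.45 + 1e-3 * (temperature - 50)**2 / 50
                     + rng.normal(0, 0.005, temperature.size))   # V vs. reference

        coeffs = np.polyfit(temperature, potential, deg=4)       # fourth-order fit
        print(np.round(coeffs, 8))                               # fitted equation terms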

  19. "First Things First" Shows Promising Results

    ERIC Educational Resources Information Center

    Hendrie, Caroline

    2005-01-01

    In this article, the author discusses a school improvement model, First Things First, developed by James P. Connell, a former tenured professor of psychology at the University of Rochester in New York. The model has three pillars for the high school level: (1) small, themed learning communities that each keep a group of students together…

  20. Quantifying the adaptive cycle

    USGS Publications Warehouse

    Angeler, David G.; Allen, Craig R.; Garmestani, Ahjond S.; Gunderson, Lance H.; Hjerne, Olle; Winder, Monika

    2015-01-01

    The adaptive cycle was proposed as a conceptual model to portray patterns of change in complex systems. Despite the model having potential for elucidating change across systems, it has been used mainly as a metaphor, describing system dynamics qualitatively. We use a quantitative approach for testing premises (reorganisation, conservatism, adaptation) in the adaptive cycle, using Baltic Sea phytoplankton communities as an example of such complex system dynamics. Phytoplankton organizes in recurring spring and summer blooms, a well-established paradigm in planktology and succession theory, with characteristic temporal trajectories during blooms that may be consistent with adaptive cycle phases. We used long-term (1994–2011) data and multivariate analysis of community structure to assess key components of the adaptive cycle. Specifically, we tested predictions about: reorganisation: spring and summer blooms comprise distinct community states; conservatism: community trajectories during individual adaptive cycles are conservative; and adaptation: phytoplankton species during blooms change in the long term. All predictions were supported by our analyses. Results suggest that traditional ecological paradigms such as phytoplankton successional models have potential for moving the adaptive cycle from a metaphor to a framework that can improve our understanding of how complex systems organize and reorganize following collapse. Quantifying reorganization, conservatism and adaptation provides opportunities to cope with the intricacies and uncertainties associated with fast ecological change, driven by shifting system controls. Ultimately, combining traditional ecological paradigms with heuristics of complex system dynamics using quantitative approaches may help refine ecological theory and improve our understanding of the resilience of ecosystems.

  2. Results.

    ERIC Educational Resources Information Center

    Zemsky, Robert; Shaman, Susan; Shapiro, Daniel B.

    2001-01-01

    Describes the Collegiate Results Instrument (CRI), which measures a range of collegiate outcomes for alumni 6 years after graduation. The CRI was designed to target alumni from institutions across market segments and assess their values, abilities, work skills, occupations, and pursuit of lifelong learning. (EV)

  3. Quantifying Faculty Workloads.

    ERIC Educational Resources Information Center

    Archer, J. Andrew

    Teaching load depends on many variables; however, most colleges define it strictly in terms of contact or credit hours. The failure to give weight to variables such as the number of preparations, the number of students served, and committee and other noninstructional assignments is usually due to the lack of a formula that will quantify the effects of these…

  4. Catalysis: Quantifying charge transfer

    NASA Astrophysics Data System (ADS)

    James, Trevor E.; Campbell, Charles T.

    2016-02-01

    Improving the design of catalytic materials for clean energy production requires a better understanding of their electronic properties, which remains experimentally challenging. Researchers now quantify the number of electrons transferred from metal nanoparticles to an oxide support as a function of particle size.

  5. Quantifying Ubiquitin Signaling

    PubMed Central

    Ordureau, Alban; Münch, Christian; Harper, J. Wade

    2015-01-01

    Ubiquitin (UB)-driven signaling systems permeate biology, and are often integrated with other types of post-translational modifications (PTMs), most notably phosphorylation. Flux through such pathways is typically dictated by the fractional stoichiometry of distinct regulatory modifications and protein assemblies as well as the spatial organization of pathway components. Yet, we rarely understand the dynamics and stoichiometry of rate-limiting intermediates along a reaction trajectory. Here, we review how quantitative proteomic tools and enrichment strategies are being used to quantify UB-dependent signaling systems, and to integrate UB signaling with regulatory phosphorylation events. A key regulatory feature of ubiquitylation is that the identity of UB chain linkage types can control downstream processes. We also describe how proteomic and enzymological tools can be used to identify and quantify UB chain synthesis and linkage preferences. The emergence of sophisticated quantitative proteomic approaches will set a new standard for elucidating biochemical mechanisms of UB-driven signaling systems. PMID:26000850

  6. Quantifying PV power Output Variability

    SciTech Connect

    Hoff, Thomas E.; Perez, Richard

    2010-10-15

    This paper presents a novel approach to rigorously quantify power Output Variability from a fleet of photovoltaic (PV) systems, ranging from a single central station to a set of distributed PV systems. The approach demonstrates that the relative power Output Variability for a fleet of identical PV systems (same size, orientation, and spacing) can be quantified by identifying the number of PV systems and their Dispersion Factor. The Dispersion Factor is a new variable that captures the relationship between PV Fleet configuration, Cloud Transit Speed, and the Time Interval over which variability is evaluated. Results indicate that Relative Output Variability: (1) equals the inverse of the square root of the number of systems for fully dispersed PV systems; and (2) could be further minimized for optimally-spaced PV systems.
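
    The headline scaling quoted above is easy to check numerically: for N fully dispersed, identical systems with uncorrelated fluctuations, fleet-level relative variability falls as 1/sqrt(N). The fluctuations below are synthetic.

        import numpy as np

        rng = np.random.default_rng(0)
        n_systems, n_steps = 25, 10_000
        deltas = rng.normal(0.0, 1.0, size=(n_systems, n_steps))   # per-site output changes

        fleet_sigma = deltas.mean(axis=0).std()        # variability of the fleet average
        print(fleet_sigma, 1.0 / np.sqrt(n_systems))   # ~ sigma_single / sqrt(N)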

  7. Quantifying surface normal estimation

    NASA Astrophysics Data System (ADS)

    Reid, Robert B.; Oxley, Mark E.; Eismann, Michael T.; Goda, Matthew E.

    2006-05-01

    An inverse algorithm for surface normal estimation from thermal polarimetric imagery was developed and used to quantify the requirements on a priori information. Building on existing knowledge that calculates the degree of linear polarization (DOLP) and the angle of polarization (AOP) for a given surface normal in a forward model (from an object's characteristics to calculation of the DOLP and AOP), this research quantifies the impact of a priori information with the development of an inverse algorithm to estimate surface normals from thermal polarimetric emissions in long-wave infrared (LWIR). The inverse algorithm assumes a polarized infrared focal plane array capturing LWIR intensity images which are then converted to Stokes vectors. Next, the DOLP and AOP are calculated from the Stokes vectors. Last, the viewing angles, θv, to the surface normals are estimated assuming perfect material information about the imaged scene. A sensitivity analysis is presented to quantitatively describe the a priori information's impact on the amount of error in the estimation of surface normals, and a bound is determined given perfect information about an object. Simulations explored the impact of surface roughness (σ) and the real component (n) of a dielectric's complex index of refraction across a range of viewing angles (θv) for a given wavelength of observation.
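
    For reference, the two forward-model quantities named above follow directly from the Stokes parameters; this small sketch uses the standard definitions, with placeholder values.

        import numpy as np

        def dolp_aop(s0, s1, s2):
            dolp = np.hypot(s1, s2) / s0     # degree of linear polarization, 0..1
            aop = 0.5 * np.arctan2(s2, s1)   # angle of polarization, radians
            return dolp, aop

        print(dolp_aop(1.0, 0.10, 0.05))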

  8. On quantifying insect movements

    SciTech Connect

    Wiens, J.A.; Crist, T.O. ); Milne, B.T. )

    1993-08-01

    We elaborate on methods described by Turchin, Odendaal, and Rausher for quantifying insect movement pathways. We note the need to scale measurement resolution to the study insects and the questions being asked, and we discuss the use of surveying instrumentation for recording sequential positions of individuals on pathways. We itemize several measures that may be used to characterize movement pathways and illustrate these by comparisons among several Eleodes beetles occurring in shortgrass steppe. The fractal dimension of pathways may provide insights not available from absolute measures of pathway configuration. Finally, we describe a renormalization procedure that may be used to remove sequential interdependence among locations of moving individuals while preserving the basic attributes of the pathway.
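
    A hedged sketch of a divider-method estimate of pathway fractal dimension, one common way to operationalize the quantity mentioned above (the paper's exact procedure is not reproduced here): measured path length scales with ruler size as L(r) ~ r^(1-D).

        import numpy as np

        rng = np.random.default_rng(0)
        path = np.cumsum(rng.normal(size=(2000, 2)), axis=0)   # synthetic pathway

        def divider_length(points, ruler):
            """Path length measured with a divider of fixed span."""
            length, anchor = 0.0, points[0]
            for p in points[1:]:
                step = np.linalg.norm(p - anchor)
                if step >= ruler:
                    length += step
                    anchor = p
            return length

        rulers = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
        lengths = [divider_length(path, r) for r in rulers]
        slope = np.polyfit(np.log(rulers), np.log(lengths), 1)[0]
        print(1 - slope)   # D ~ 1 for straight paths, toward 2 for tortuous ones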

  9. Quantifying bicycle network connectivity.

    PubMed

    Lowry, Michael; Loh, Tracy Hadden

    2017-02-01

    The intent of this study was to compare bicycle network connectivity for different types of bicyclists and different neighborhoods. Connectivity was defined as the ability to reach important destinations, such as grocery stores, banks, and elementary schools, via pathways or roads with low vehicle volumes and low speed limits. The analysis was conducted for 28 neighborhoods in Seattle, Washington under existing conditions and for a proposed bicycle master plan, which when complete will provide over 700 new bicycle facilities, including protected bike lanes, neighborhood greenways, and multi-use trails. The results showed different levels of connectivity across neighborhoods and for different types of bicyclists. Certain projects were shown to improve connectivity differently for confident and non-confident bicyclists. The analysis showed a positive correlation between connectivity and observed utilitarian bicycle trips. To improve connectivity for the majority of bicyclists, planners and policy-makers should provide bicycle facilities that allow immediate, low-stress access to the street network, such as neighborhood greenways. The analysis also suggests that policies and programs that build confidence for bicycling could greatly increase connectivity.
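
    A toy sketch of the connectivity notion described above, assuming networkx: from an origin, count the destinations reachable using only low-stress edges. The graph and attributes are invented.

        import networkx as nx

        G = nx.Graph()
        G.add_edge("home", "a", low_stress=True)
        G.add_edge("a", "grocery", low_stress=True)
        G.add_edge("a", "bank", low_stress=False)     # high-stress link
        G.add_edge("home", "school", low_stress=True)

        low_stress = G.edge_subgraph(
            [(u, v) for u, v, d in G.edges(data=True) if d["low_stress"]])
        reachable = nx.node_connected_component(low_stress, "home")
        print(reachable - {"home"})   # destinations reachable at low stress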

  10. A new index quantifying the precipitation extremes

    NASA Astrophysics Data System (ADS)

    Busuioc, Aristita; Baciu, Madalina; Stoica, Cerasela

    2015-04-01

    Meteorological Administration in Romania. These types of records contain the rainfall intensity (mm/minute) over various intervals for which it remains constant. The maximum intensity for each continuous rain over the May-August interval has been calculated for each year. The corresponding time series over the 1951-2008 period have been analysed in terms of their long term trends and shifts in the mean; the results have been compared to those resulting from other rainfall indices based on daily and hourly data, computed over the same interval, such as: total rainfall amount, maximum daily amount, contribution of total hourly amounts exceeding 10mm/day, contribution of daily amounts exceeding the 90th percentile, the 90th, 99th and 99.9th percentiles of 1-hour data. The results show that the proposed index exhibits a coherent and stronger climate signal (significant increase) for all analysed stations compared to the other indices associated with precipitation extremes, which show either no significant change or a weaker signal. This finding shows that the proposed index is most appropriate to quantify the climate change signal of the precipitation extremes. We consider that this index is more naturally connected to the maximum intensity of a real rainfall event. The results presented in this study were funded by the Executive Agency for Higher Education, Research, Development and Innovation Funding (UEFISCDI) through the research project CLIMHYDEX, "Changes in climate extremes and associated impact in hydrological events in Romania", code PNII-ID-2011-2-0073 (http://climhydex.meteoromania.ro)
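
    In outline, the proposed index reduces to a maximum over constant-intensity segments; this toy sketch uses an invented event list, not the pluviograph records.

        # Each continuous rain event: (intensity mm/min, duration min) segments.
        events_may_aug = [[(0.2, 30), (1.4, 5), (0.6, 10)],
                          [(0.1, 60), (0.8, 12)]]

        index = max(intensity for event in events_may_aug for intensity, _ in event)
        print(index)   # the year's value of the maximum-intensity index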

  11. Quantifier Comprehension in Corticobasal Degeneration

    ERIC Educational Resources Information Center

    McMillan, Corey T.; Clark, Robin; Moore, Peachie; Grossman, Murray

    2006-01-01

    In this study, we investigated patients with focal neurodegenerative diseases to examine a formal linguistic distinction between classes of generalized quantifiers, like "some X" and "less than half of X." Our model of quantifier comprehension proposes that number knowledge is required to understand both first-order and higher-order quantifiers.…

  12. Quantifying the Wave Driving of the Stratosphere

    NASA Technical Reports Server (NTRS)

    Newman, Paul A.; Nash, Eric R.

    1999-01-01

    The zonal mean eddy heat flux is directly proportional to the wave activity that propagates from the troposphere into the stratosphere. This quantity is a simple eddy diagnostic which is easily calculated from conventional meteorological analyses. Because this "wave driving" of the stratosphere has a strong impact on stratospheric temperature, it is necessary to compare the impact of the flux with stratospheric radiative changes caused by greenhouse gas changes. Hence, we must understand the precision and accuracy of the heat flux derived from our global meteorological analyses. Herein, we quantify the stratospheric heat flux using five different meteorological analyses, and show that there are 30% differences between these analyses during the disturbed conditions of the northern hemisphere winter. Such large differences result from planetary-scale differences in the stationary temperature and meridional wind fields. In contrast, planetary-scale transient waves show excellent agreement among these five analyses, and this transient heat flux appears to have a long-term downward trend.
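
    The diagnostic itself is a one-liner on gridded analyses: remove the zonal mean from v and T and average their product around each latitude circle. The fields below are synthetic placeholders.

        import numpy as np

        rng = np.random.default_rng(0)
        v = rng.normal(size=(73, 144))   # meridional wind on a lat-lon grid
        T = rng.normal(size=(73, 144))   # temperature on the same grid

        v_eddy = v - v.mean(axis=1, keepdims=True)   # deviation from zonal mean
        T_eddy = T - T.mean(axis=1, keepdims=True)
        heat_flux = (v_eddy * T_eddy).mean(axis=1)   # [v'T'] at each latitude
        print(heat_flux.shape)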

  13. Quantifying T Lymphocyte Turnover

    PubMed Central

    De Boer, Rob J.; Perelson, Alan S.

    2013-01-01

    Peripheral T cell populations are maintained by production of naive T cells in the thymus, clonal expansion of activated cells, cellular self-renewal (or homeostatic proliferation), and density dependent cell life spans. A variety of experimental techniques have been employed to quantify the relative contributions of these processes. In modern studies lymphocytes are typically labeled with 5-bromo-2′-deoxyuridine (BrdU), deuterium, or the fluorescent dye carboxy-fluorescein diacetate succinimidyl ester (CFSE), their division history has been studied by monitoring telomere shortening and the dilution of T cell receptor excision circles (TRECs) or the dye CFSE, and clonal expansion has been documented by recording changes in the population densities of antigen specific cells. Proper interpretation of such data in terms of the underlying rates of T cell production, division, and death has proven to be notoriously difficult and involves mathematical modeling. We review the various models that have been developed for each of these techniques, discuss which models seem most appropriate for what type of data, reveal open problems that require better models, and pinpoint how the assumptions underlying a mathematical model may influence the interpretation of data. Elaborating various successful cases where modeling has delivered new insights in T cell population dynamics, this review provides quantitative estimates of several processes involved in the maintenance of naive and memory, CD4+ and CD8+ T cell pools in mice and men. PMID:23313150

  14. Quantifying Anderson's fault types

    USGS Publications Warehouse

    Simpson, R.W.

    1997-01-01

    Anderson [1905] explained three basic types of faulting (normal, strike-slip, and reverse) in terms of the shape of the causative stress tensor and its orientation relative to the Earth's surface. Quantitative parameters can be defined which contain information about both shape and orientation [Célérier, 1995], thereby offering a way to distinguish fault-type domains on plots of regional stress fields and to quantify, for example, the degree of normal-faulting tendencies within strike-slip domains. This paper offers a geometrically motivated generalization of Angelier's [1979, 1984, 1990] shape parameters φ and ψ to new quantities named Aφ and Aψ. In their simple forms, Aφ varies from 0 to 1 for normal, 1 to 2 for strike-slip, and 2 to 3 for reverse faulting, and Aψ ranges from 0° to 60°, 60° to 120°, and 120° to 180°, respectively. After scaling, Aφ and Aψ agree to within 2% (or 1°), a difference of little practical significance, although Aψ has smoother analytical properties. A formulation distinguishing horizontal axes as well as the vertical axis is also possible, yielding an Aφ ranging from -3 to +3 and Aψ from -180° to +180°. The geometrically motivated derivation in three-dimensional stress space presented here may aid intuition and offers a natural link with traditional ways of plotting yield and failure criteria. Examples are given, based on models of Bird [1996] and Bird and Kong [1994], of the use of Anderson fault parameters Aφ and Aψ for visualizing tectonic regimes defined by regional stress fields.
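
    The simple forms quoted above admit a compact closed form that is widely cited from this paper; the sketch below is reconstructed from the stated ranges rather than copied from the source, with phi the stress shape ratio (sigma2 - sigma3)/(sigma1 - sigma3), so verify against the original before use.

        def a_phi(phi, regime):
            """A-phi from shape ratio phi in [0, 1]; n = 0, 1, 2 selects
            normal, strike-slip, or reverse faulting (reconstruction)."""
            n = {"normal": 0, "strike-slip": 1, "reverse": 2}[regime]
            return (n + 0.5) + (-1) ** n * (phi - 0.5)

        print(a_phi(0.5, "normal"), a_phi(0.5, "strike-slip"), a_phi(0.5, "reverse"))
        # -> 0.5, 1.5, 2.5: mid-range values of the three faulting domains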

  15. Mountain torrents: Quantifying vulnerability and assessing uncertainties

    PubMed Central

    Totschnig, Reinhold; Fuchs, Sven

    2013-01-01

    Vulnerability assessment for elements at risk is an important component in the framework of risk assessment. The vulnerability of buildings affected by torrent processes can be quantified by vulnerability functions that express a mathematical relationship between the degree of loss of individual elements at risk and the intensity of the impacting process. Based on data from the Austrian Alps, we extended a vulnerability curve for residential buildings affected by fluvial sediment transport processes to other torrent processes and other building types. With respect to this goal to merge different data based on different processes and building types, several statistical tests were conducted. The calculation of vulnerability functions was based on a nonlinear regression approach applying cumulative distribution functions. The results suggest that there is no need to distinguish between different sediment-laden torrent processes when assessing vulnerability of residential buildings towards torrent processes. The final vulnerability functions were further validated with data from the Italian Alps and different vulnerability functions presented in the literature. This comparison showed the wider applicability of the derived vulnerability functions. The uncertainty inherent to regression functions was quantified by the calculation of confidence bands. The derived vulnerability functions may be applied within the framework of risk management for mountain hazards within the European Alps. The method is transferable to other mountain regions if the input data needed are available. PMID:27087696
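
    A hedged sketch of the curve-fitting approach described: fit a cumulative-distribution-type vulnerability function (degree of loss versus process intensity) by nonlinear least squares. The Weibull form and the data points are illustrative only.

        import numpy as np
        from scipy.optimize import curve_fit

        def weibull_cdf(intensity, shape, scale):
            return 1.0 - np.exp(-(intensity / scale) ** shape)

        intensity = np.array([0.2, 0.5, 0.8, 1.2, 1.8, 2.5])   # e.g. deposit height (m)
        loss = np.array([0.02, 0.08, 0.20, 0.45, 0.75, 0.95])  # degree of loss, 0..1

        params, cov = curve_fit(weibull_cdf, intensity, loss, p0=[1.5, 1.5])
        print(params, np.sqrt(np.diag(cov)))    # estimates and standard errors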

  16. Quantifying errors in trace species transport modeling

    PubMed Central

    Prather, Michael J.; Zhu, Xin; Strahan, Susan E.; Steenrod, Stephen D.; Rodriguez, Jose M.

    2008-01-01

    One expectation when computationally solving an Earth system model is that a correct answer exists, that with adequate physical approximations and numerical methods our solutions will converge to that single answer. With such hubris, we performed a controlled numerical test of the atmospheric transport of CO2 using 2 models known for accurate transport of trace species. Resulting differences were unexpectedly large, indicating that in some cases, scientific conclusions may err because of lack of knowledge of the numerical errors in tracer transport models. By doubling the resolution, thereby reducing numerical error, both models show some convergence to the same answer. Now, under realistic conditions, we identify a practical approach for finding the correct answer and thus quantifying the advection error. PMID:19066224

  17. Quantifying Aggressive Behavior in Zebrafish.

    PubMed

    Teles, Magda C; Oliveira, Rui F

    2016-01-01

    Aggression is a complex behavior that influences social relationships and can be seen as adaptive or maladaptive depending on the context and intensity of expression. A model organism suitable for genetic dissection of the underlying neural mechanisms of aggressive behavior is still needed. Zebrafish has already proven to be a powerful vertebrate model organism for the study of normal and pathological brain function. Despite the fact that zebrafish is a gregarious species that forms shoals, when allowed to interact in pairs, both males and females express aggressive behavior and establish dominance hierarchies. Here, we describe two protocols that can be used to quantify aggressive behavior in zebrafish, using two different paradigms: (1) staged fights between real opponents and (2) mirror-elicited fights. We also discuss the methodology for the behavior analysis, the expected results for both paradigms, and the advantages and disadvantages of each paradigm in face of the specific goals of the study.

  18. Quantifying protein by bicinchoninic acid.

    PubMed

    Simpson, Richard J

    2008-08-01

    INTRODUCTION: This protocol describes a method of quantifying protein that is a variation of the Lowry assay. It uses bicinchoninic acid (BCA) to enhance the detection of Cu(+) generated under alkaline conditions at sites of complexes between Cu(2+) and protein. The resulting chromophore absorbs at 562 nm. This technique is divided into three parts: Standard Procedure, Microprocedure, and 96-Well Microtiter Plate Procedure. For each procedure, test samples are assayed in parallel with protein standards that are used to generate a calibration curve, and the exact concentration of protein in the test samples is interpolated. The standard BCA assay uses large volumes of both reagents and samples and cannot easily be automated. If these issues are important, the Microprocedure is recommended. This in turn can be adapted for use with a microplate reader in the 96-Well Microtiter Plate Procedure. If the microplate reader is interfaced with a computer, more than 1000 samples can be read per hour.
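
    The calibration-curve step shared by all three procedures can be sketched as follows, assuming a linear standard curve of absorbance at 562 nm against BSA standards; the standard concentrations and absorbance values below are illustrative, not prescribed by the protocol.

```python
# Minimal sketch of the calibration-curve step shared by all three BCA
# procedures: fit absorbance at 562 nm against protein standards, then
# interpolate unknowns. Standard concentrations/absorbances are illustrative.
import numpy as np

std_conc = np.array([0, 125, 250, 500, 1000, 2000])        # BSA standards, ug/mL
std_a562 = np.array([0.05, 0.15, 0.26, 0.48, 0.90, 1.70])  # absorbance at 562 nm

slope, intercept = np.polyfit(std_conc, std_a562, 1)  # linear standard curve

def interpolate_conc(a562):
    """Invert the standard curve to estimate protein concentration (ug/mL)."""
    return (a562 - intercept) / slope

print(interpolate_conc(np.array([0.33, 0.72])))  # unknown test samples
```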

  19. Quantifying the quiet epidemic

    PubMed Central

    2014-01-01

    During the late 20th century numerical rating scales became central to the diagnosis of dementia and helped transform attitudes about its causes and prevalence. Concentrating largely on the development and use of the Blessed Dementia Scale, I argue that rating scales served professional ends during the 1960s and 1970s. They helped old age psychiatrists establish jurisdiction over conditions such as dementia and present their field as a vital component of the welfare state, where they argued that ‘reliable modes of diagnosis’ were vital to the allocation of resources. I show how these arguments appealed to politicians, funding bodies and patient groups, who agreed that dementia was a distinct disease and claimed research on its causes and prevention should be designated ‘top priority’. But I also show that worries about the replacement of clinical acumen with technical and depersonalized methods, which could conceivably be applied by anyone, led psychiatrists to stress that rating scales had their limits and could be used only by trained experts. PMID:25866448

  20. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    NASA Astrophysics Data System (ADS)

    Fang, T.; Verma, V.; Guo, H.; King, L. E.; Edgerton, E. S.; Weber, R. J.

    2014-07-01

    A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint on routine use of the DTT assay in large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (coefficient of variation of 12% for standards, 4% for ambient samples), and a reasonably low limit of detection (0.31 nmol min-1). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution and Epidemiology (SCAPE) to generate an extensive data set on DTT activity of ambient particles collected from contrasting environments (urban, road-side, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per air volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. However, the greater heterogeneity in the intrinsic DTT activity (per PM mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the instrument can also be used to determine oxidative potential with other acellular assays.
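
    A minimal sketch of the core DTT-activity calculation (the blank-corrected depletion rate of DTT in nmol min-1, taken as the slope of remaining DTT versus reaction time) is shown below; the time points and DTT amounts are illustrative, not SCAPE data.

```python
# Minimal sketch: DTT activity as the slope of remaining DTT vs. reaction
# time (nmol/min), blank-corrected. Numbers are illustrative, not SCAPE data.
import numpy as np

t_min = np.array([0, 10, 20, 30, 40])                   # reaction time (min)
dtt_nmol = np.array([100.0, 93.5, 87.2, 80.6, 74.3])    # DTT remaining, sample
blank_nmol = np.array([100.0, 99.2, 98.5, 97.7, 96.8])  # DTT remaining, blank

sample_rate = -np.polyfit(t_min, dtt_nmol, 1)[0]   # nmol/min consumed
blank_rate = -np.polyfit(t_min, blank_nmol, 1)[0]
dtt_activity = sample_rate - blank_rate            # blank-corrected activity
print(f"DTT activity: {dtt_activity:.2f} nmol/min")
```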

  1. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  2. Quantifying collective effervescence

    PubMed Central

    Konvalinka, Ivana; Bulbulia, Joseph; Roepstorff, Andreas

    2011-01-01

    Collective rituals are ubiquitous and resilient features of all known human cultures. They are also functionally opaque, costly, and sometimes dangerous. Social scientists have speculated that collective rituals generate benefits in excess of their costs by reinforcing social bonding and group solidarity, yet quantitative evidence for these conjectures is scarce. Our recent study measured the physiological effects of a highly arousing Spanish fire-walking ritual, revealing shared patterns in heart-rate dynamics between participants and related spectators. We briefly describe our results, and consider their implications. PMID:22446541

  3. Children's interpretations of general quantifiers, specific quantifiers, and generics

    PubMed Central

    Gelman, Susan A.; Leslie, Sarah-Jane; Was, Alexandra M.; Koch, Christina M.

    2014-01-01

    Recently, several scholars have hypothesized that generics are a default mode of generalization, and thus that young children may at first treat quantifiers as if they were generic in meaning. To address this issue, the present experiment provides the first in-depth, controlled examination of the interpretation of generics compared to both general quantifiers ("all Xs", "some Xs") and specific quantifiers ("all of these Xs", "some of these Xs"). We provided children (3 and 5 years) and adults with explicit frequency information regarding properties of novel categories, to chart when "some", "all", and generics are deemed appropriate. The data reveal three main findings. First, even 3-year-olds distinguish generics from quantifiers. Second, when children make errors, they tend to be in the direction of treating quantifiers like generics. Third, children were more accurate when interpreting specific versus general quantifiers. We interpret these data as providing evidence for the position that generics are a default mode of generalization, especially when reasoning about kinds. PMID:25893205

  4. Public medical shows.

    PubMed

    Walusinski, Olivier

    2014-01-01

    In the second half of the 19th century, Jean-Martin Charcot (1825-1893) became famous for the quality of his teaching and his innovative neurological discoveries, bringing many French and foreign students to Paris. A hunger for recognition, together with progressive and anticlerical ideals, led Charcot to invite writers, journalists, and politicians to his lessons, during which he presented the results of his work on hysteria. These events became public performances, for which physicians and patients were transformed into actors. Major newspapers ran accounts of these consultations, more like theatrical shows in some respects. The resultant enthusiasm prompted other physicians in Paris and throughout France to try and imitate them. We will compare the form and substance of Charcot's lessons with those given by Jules-Bernard Luys (1828-1897), Victor Dumontpallier (1826-1899), Ambroise-Auguste Liébault (1823-1904), Hippolyte Bernheim (1840-1919), Joseph Grasset (1849-1918), and Albert Pitres (1848-1928). We will also note their impact on contemporary cinema and theatre.

  5. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    NASA Astrophysics Data System (ADS)

    Fang, T.; Verma, V.; Guo, H.; King, L. E.; Edgerton, E. S.; Weber, R. J.

    2015-01-01

    A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint on routine use of the DTT assay in large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (coefficient of variation of 15% for positive control, 4% for ambient samples) and a reasonably low limit of detection (0.31 nmol min-1). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution & Epidemiology (SCAPE) to generate an extensive data set on DTT activity of ambient particles collected from contrasting environments (urban, roadside, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per-air-volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. The correlation may also suggest a mechanistic explanation (oxidative stress) for observed PM2.5 mass-health associations. The heterogeneity in the intrinsic DTT activity (per-PM-mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the instrument can also be used to determine oxidative potential with other acellular assays.

  6. Quantifying periodicity in omics data

    PubMed Central

    Amariei, Cornelia; Tomita, Masaru; Murray, Douglas B.

    2014-01-01

    Oscillations play a significant role in biological systems, with many examples in the fast, ultradian, circadian, circalunar, and yearly time domains. However, determining periodicity in such data can be problematic. There are a number of computational methods to identify the periodic components in large datasets, such as signal-to-noise based Fourier decomposition, Fisher's g-test and autocorrelation. However, the available methods assume a sinusoidal model and do not attempt to quantify the waveform shape and the presence of multiple periodicities, which provide vital clues in determining the underlying dynamics. Here, we developed a Fourier based measure that generates a de-noised waveform from multiple significant frequencies. This waveform is then correlated with the raw data from the respiratory oscillation found in yeast, to provide oscillation statistics including waveform metrics and multi-periods. The method is compared and contrasted to commonly used statistics. Moreover, we show the utility of the program in the analysis of noisy datasets and in other high-throughput applications, such as metabolomics and flow cytometry. PMID:25364747
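
    The general idea of such a Fourier-based measure can be sketched as follows: retain the significant frequency components, inverse-transform them into a de-noised waveform, and correlate that waveform with the raw signal. The amplitude threshold below is a crude illustrative stand-in for the paper's significance statistics, and the signal is synthetic.

```python
# Minimal sketch of the general idea: retain significant Fourier components,
# inverse-transform to a de-noised waveform, and correlate it with the raw
# signal. The amplitude threshold used here is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 1000)
raw = (np.sin(2 * np.pi * 0.5 * t) + 0.4 * np.sin(2 * np.pi * 1.5 * t)
       + 0.5 * rng.normal(size=t.size))  # two periodic components + noise

spec = np.fft.rfft(raw)
amp = np.abs(spec)
keep = amp > 4 * np.median(amp)  # crude significance threshold (assumption)
denoised = np.fft.irfft(spec * keep, n=raw.size)

r = np.corrcoef(raw, denoised)[0, 1]  # waveform-vs-raw correlation statistic
freqs = np.fft.rfftfreq(raw.size, d=t[1] - t[0])
print("retained periods (s):", 1 / freqs[keep & (freqs > 0)])
print(f"correlation with raw data: r = {r:.2f}")
```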

  7. Quantifying the seismicity on Taiwan

    NASA Astrophysics Data System (ADS)

    Wu, Yi-Hsuan; Chen, Chien-Chih; Turcotte, Donald L.; Rundle, John B.

    2013-07-01

    We quantify the seismicity on the island of Taiwan using the frequency-magnitude statistics of earthquakes since 1900. A break in Gutenberg-Richter scaling for large earthquakes has been observed in global seismicity; this break is also observed in our Taiwan study. The seismic data from the Central Weather Bureau Seismic Network are in good agreement with the Gutenberg-Richter relation taking b ≈ 1 when M < 7. For large earthquakes, M ≥ 7, the seismic data fit Gutenberg-Richter scaling with b ≈ 1.5. If the Gutenberg-Richter scaling for M < 7 earthquakes is extrapolated to larger earthquakes, we would expect a M > 8 earthquake in the study region about every 25 yr. However, our analysis shows a lower frequency of occurrence of large earthquakes, so that the expected frequency of M > 8 earthquakes is about 200 yr. The level of seismicity for smaller earthquakes on Taiwan is about 12 times greater than in Southern California, and the possibility of a M ≈ 9 earthquake north or south of Taiwan cannot be ruled out. In light of the Fukushima, Japan nuclear disaster, we also discuss the implications of our study for the three operating nuclear power plants on the coast of Taiwan.
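
    The recurrence arithmetic in this abstract follows directly from Gutenberg-Richter scaling, log10 N = a - b M: the rate of events above magnitude M falls by a factor of 10^b per magnitude unit. The short sketch below illustrates this; the assumed M >= 7 rate is illustrative rather than the actual Taiwan catalog value, so the numbers differ from the study's fitted estimates.

```python
# Worked sketch of the Gutenberg-Richter arithmetic: with log10 N = a - b*M,
# the annual rate of events above magnitude M scales as 10**(-b*M), so the
# expected recurrence of M >= 8 depends strongly on the b-value used above
# M = 7. The M >= 7 rate below is illustrative, not the Taiwan catalog value;
# the study's ~200 yr estimate comes from its own fitted parameters.
rate_m7 = 0.4  # assumed annual rate of M >= 7 events (illustrative)

for b in (1.0, 1.5):
    rate_m8 = rate_m7 * 10 ** (-b * (8 - 7))  # extrapolate one magnitude unit
    print(f"b = {b}: M >= 8 about every {1 / rate_m8:.0f} yr")
```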

  8. Television Quiz Show Simulation

    ERIC Educational Resources Information Center

    Hill, Jonnie Lynn

    2007-01-01

    This article explores the simulation of four television quiz shows for students in China studying English as a foreign language (EFL). It discusses the adaptation and implementation of television quiz shows and how the students reacted to them.

  9. The Great Cometary Show

    NASA Astrophysics Data System (ADS)

    2007-01-01

    The ESO Very Large Telescope Interferometer, which allows astronomers to scrutinise objects with a precision equivalent to that of a 130-m telescope, is proving itself an unequalled success every day. One of the latest instruments installed, AMBER, has led to a flurry of scientific results, an anthology of which is being published this week as special features in the research journal Astronomy & Astrophysics. [ESO PR Photo 06a/07: The AMBER Instrument] "With its unique capabilities, the VLT Interferometer (VLTI) has created itself a niche in which it provides answers to many astronomical questions, from the shape of stars, to discs around stars, to the surroundings of the supermassive black holes in active galaxies," says Jorge Melnick (ESO), the VLT Project Scientist. The VLTI has led to 55 scientific papers already and is in fact producing more than half of the interferometric results worldwide. "With the capability of AMBER to combine up to three of the 8.2-m VLT Unit Telescopes, we can really achieve what nobody else can do," added Fabien Malbet, from the LAOG (France) and the AMBER Project Scientist. Eleven articles will appear this week in Astronomy & Astrophysics' special AMBER section. Three of them describe the unique instrument, while the other eight reveal completely new results about the early and late stages in the life of stars. [ESO PR Photo 06b/07: The Inner Winds of Eta Carinae] The first results presented in this issue cover various fields of stellar and circumstellar physics. Two papers deal with very young solar-like stars, offering new information about the geometry of the surrounding discs and associated outflowing winds. Other articles are devoted to the study of hot active stars of particular interest: Alpha Arae, Kappa Canis Majoris, and CPD -57o2874. They provide new, precise information about their rotating gas envelopes. An important new result concerns the enigmatic object Eta Carinae. Using AMBER with

  10. Quantifying renewable groundwater stress with GRACE

    PubMed Central

    Richey, Alexandra S.; Thomas, Brian F.; Lo, Min‐Hui; Reager, John T.; Voss, Katalyn; Swenson, Sean; Rodell, Matthew

    2015-01-01

    Groundwater is an increasingly important water supply source globally. Understanding the amount of groundwater used versus the volume available is crucial to evaluate future water availability. We present a groundwater stress assessment to quantify the relationship between groundwater use and availability in the world's 37 largest aquifer systems. We quantify stress according to a ratio of groundwater use to availability, which we call the Renewable Groundwater Stress ratio. The impact of quantifying groundwater use based on nationally reported groundwater withdrawal statistics is compared to a novel approach to quantify use based on remote sensing observations from the Gravity Recovery and Climate Experiment (GRACE) satellite mission. Four characteristic stress regimes are defined: Overstressed, Variable Stress, Human‐dominated Stress, and Unstressed. The regimes are a function of the sign of use (positive or negative) and the sign of groundwater availability, defined as mean annual recharge. The ability to mitigate and adapt to stressed conditions, where use exceeds sustainable water availability, is a function of economic capacity and land use patterns. Therefore, we qualitatively explore the relationship between stress and anthropogenic biomes. We find that estimates of groundwater stress based on withdrawal statistics are unable to capture the range of characteristic stress regimes, especially in regions dominated by sparsely populated biome types with limited cropland. GRACE‐based estimates of use and stress can holistically quantify the impact of groundwater use on stress, resulting in both greater magnitudes of stress and more variability of stress between regions. PMID:26900185
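
    A minimal sketch of the Renewable Groundwater Stress ratio is given below; the aquifer numbers are invented, and the simple use-versus-recharge labels in the comments are an illustrative reading rather than the paper's exact sign-based regime definitions.

```python
# Minimal sketch of the Renewable Groundwater Stress ratio described in the
# abstract: RGS = groundwater use / mean annual recharge. The labels below
# are one plausible illustration; the study's exact regime assignments
# (based on the signs of use and availability) should be taken from the paper.
def renewable_groundwater_stress(use, recharge):
    """RGS ratio; use and recharge in the same units (e.g., km3/yr)."""
    return use / recharge

examples = {"Aquifer A": (5.2, 1.8), "Aquifer B": (0.6, 2.1)}  # illustrative
for name, (use, recharge) in examples.items():
    rgs = renewable_groundwater_stress(use, recharge)
    label = "use exceeds recharge" if rgs > 1 else "use within recharge"
    print(f"{name}: RGS = {rgs:.2f} ({label})")
```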

  11. Stretched View Showing 'Victoria'

    NASA Technical Reports Server (NTRS)

    2006-01-01

    [figure removed for brevity, see original site] Stretched View Showing 'Victoria'

    This pair of images from the panoramic camera on NASA's Mars Exploration Rover Opportunity served as initial confirmation that the two-year-old rover is within sight of 'Victoria Crater,' which it has been approaching for more than a year. Engineers on the rover team were unsure whether Opportunity would make it as far as Victoria, but scientists hoped for the chance to study such a large crater with their roving geologist. Victoria Crater is 800 meters (nearly half a mile) in diameter, about six times wider than 'Endurance Crater,' where Opportunity spent several months in 2004 examining rock layers affected by ancient water.

    When scientists using orbital data calculated that they should be able to detect Victoria's rim in rover images, they scrutinized frames taken in the direction of the crater by the panoramic camera. To positively characterize the subtle horizon profile of the crater and some of the features leading up to it, researchers created a vertically-stretched image (top) from a mosaic of regular frames from the panoramic camera (bottom), taken on Opportunity's 804th Martian day (April 29, 2006).

    The stretched image makes mild nearby dunes look like more threatening peaks, but that is only a result of the exaggerated vertical dimension. This vertical stretch technique was first applied to Viking Lander 2 panoramas by Philip Stooke, of the University of Western Ontario, Canada, to help locate the lander with respect to orbiter images. Vertically stretching the image allows features to be more readily identified by the Mars Exploration Rover science team.

    The bright white dot near the horizon to the right of center (barely visible without labeling or zoom-in) is thought to be a light-toned outcrop on the far wall of the crater, suggesting that the rover can see over the low rim of Victoria. In figure 1, the northeast and southeast rims are labeled

  12. A flow cytometric approach to quantify biofilms.

    PubMed

    Kerstens, Monique; Boulet, Gaëlle; Van Kerckhoven, Marian; Clais, Sofie; Lanckacker, Ellen; Delputte, Peter; Maes, Louis; Cos, Paul

    2015-07-01

    Since biofilms are important in many clinical, industrial, and environmental settings, reliable methods to quantify these sessile microbial populations are crucial. Most of the currently available techniques do not allow the enumeration of the viable cell fraction within the biofilm and are often time consuming. This paper proposes flow cytometry (FCM) using the single-stain viability dye TO-PRO(®)-3 iodide as a fast and precise alternative. Mature biofilms of Candida albicans and Escherichia coli were used to optimize biofilm removal and dissociation, as a single-cell suspension is needed for accurate FCM enumeration. To assess the feasibility of FCM quantification of biofilms, E. coli and C. albicans biofilms were analyzed using FCM and crystal violet staining at different time points. A combination of scraping and rinsing proved to be the most efficient technique for biofilm removal. Sonicating for 10 min eliminated the remaining aggregates, resulting in a single-cell suspension. Repeated FCM measurements of biofilm samples revealed a good intraday precision of approximately 5 %. FCM quantification and the crystal violet assay yielded similar biofilm growth curves for both microorganisms, confirming the applicability of our technique. These results show that FCM using TO-PRO(®)-3 iodide as a single-stain viability dye is a valid fast alternative for the quantification of viable cells in a biofilm.

  13. Quantifying foot deformation using finite helical angle.

    PubMed

    Pothrat, Claude; Goislard de Monsabert, Benjamin; Vigouroux, Laurent; Viehweger, Elke; Berton, Eric; Rao, Guillaume

    2015-10-15

    Foot intrinsic motion originates from the combination of numerous joint motions, giving this segment a high adaptive ability. Existing foot kinematic models mostly focus on analyzing small-scale, bone-to-bone foot motions, which require both complex experimental methodology and complex interpretative work to assess global foot functionality. This study proposes a method to assess total foot deformation by calculating a helical angle from the relative motions of the rearfoot and the forefoot. The method required a limited number of retro-reflective markers placed on the foot and was tested for five different movements (walking, forefoot impact running, heel impact running, 90° cutting, and 180° U-turn) and 12 participants. Intraclass correlation coefficients were calculated over time to quantify the helical angle pattern repeatability for each movement. Our results indicated that the method was suitable for distinguishing the different motions, as different helical angle amplitudes were observed according to the flexibility required by each movement. Moreover, the results showed that repeatability could be used to identify the mastering of each motion, as repeatability was high for well-mastered movements. Together with existing methods, this new protocol could be applied to fully assess foot function in sport or clinical contexts.
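
    One way to realize a finite helical angle numerically is to take the axis-angle magnitude of the rotation carrying the rearfoot frame to the forefoot frame at each sample, as sketched below with mocked-up segment orientations; the study's actual marker set, segment definitions, and filtering are not reproduced here.

```python
# Minimal sketch: a "finite helical angle" as the magnitude of the axis-angle
# rotation taking the rearfoot frame to the forefoot frame at each sample.
# Marker-derived segment orientations are mocked up here for illustration.
import numpy as np
from scipy.spatial.transform import Rotation as R

# Illustrative time series of segment orientations (Euler angles, degrees).
t = np.linspace(0, 1, 101)  # one stance phase, normalized time
zeros = np.zeros_like(t)
rearfoot = R.from_euler(
    "xyz", np.column_stack([5 * np.sin(2 * np.pi * t), zeros, zeros]),
    degrees=True)
forefoot = R.from_euler(
    "xyz", np.column_stack([12 * np.sin(2 * np.pi * t),
                            3 * np.sin(2 * np.pi * t), zeros]),
    degrees=True)

relative = rearfoot.inv() * forefoot  # forefoot expressed in rearfoot frame
helical_angle = np.degrees(np.linalg.norm(relative.as_rotvec(), axis=1))
print(f"peak helical angle: {helical_angle.max():.1f} deg")
```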

  14. Cobalamin Concentrations in Fetal Liver Show Gender Differences: A Result from Using a High-Pressure Liquid Chromatography-Inductively Coupled Plasma Mass Spectrometry as an Ultratrace Cobalt Speciation Method.

    PubMed

    Bosle, Janine; Goetz, Sven; Raab, Andrea; Krupp, Eva M; Scheckel, Kirk G; Lombi, Enzo; Meharg, Andrew A; Fowler, Paul A; Feldmann, Jörg

    2016-12-20

    Maternal diet and lifestyle choices may affect placental transfer of cobalamin (Cbl) to the fetus. Fetal liver concentration of Cbl reflects nutritional status with regard to vitamin B12, but at these low concentrations current Cbl measurement methods lack robustness. An analytical method based on enzymatic extraction with subsequent reversed-phase high-pressure liquid chromatography (RP-HPLC) separation and parallel ICPMS and electrospray ionization (ESI)-Orbitrap-MS detection, to determine Cbl species specifically in liver samples of only 10-50 mg, was developed using 14 pig livers. Subsequently, 55 human fetal livers were analyzed. HPLC-ICPMS analysis for cobalt (Co) and Cbl gave detection limits of 0.18 ng/g and 0.88 ng/g d.m. in liver samples, respectively, with a recovery of >95%. Total Co (Cot) concentration did not reflect the amount of Cbl or vitamin B12 in the liver; Cbl-bound Co contributed only 45 ± 15% of Cot. XRF mapping and μXANES analysis confirmed the occurrence of non-Cbl cobalt in pig liver hot spots, indicating particulate Co. No correlations of either total cobalt or Cbl with fetal weight or weeks of gestation were found for the human fetal livers. Although no gender difference could be identified for total Co concentration, female livers were significantly higher in Cbl concentration (24.1 ± 7.8 ng/g) than those from male fetuses (19.8 ± 7.1 ng/g) (p = 0.04). This HPLC-ICPMS method was able to quantify Cot and Cbl in fetal liver, and it was sensitive and precise enough to identify this gender difference.

  15. Showing What They Know

    ERIC Educational Resources Information Center

    Cech, Scott J.

    2008-01-01

    Having students show their skills in three dimensions, known as performance-based assessment, dates back at least to Socrates. Individual schools such as Barrington High School--located just outside of Providence--have been requiring students to actively demonstrate their knowledge for years. Rhode Island's high school graduating class became…

  16. The Ozone Show.

    ERIC Educational Resources Information Center

    Mathieu, Aaron

    2000-01-01

    Uses a talk show activity for a final assessment tool for students to debate about the ozone hole. Students are assessed on five areas: (1) cooperative learning; (2) the written component; (3) content; (4) self-evaluation; and (5) peer evaluation. (SAH)

  17. What Do Maps Show?

    ERIC Educational Resources Information Center

    Geological Survey (Dept. of Interior), Reston, VA.

    This curriculum packet, appropriate for grades 4-8, features a teaching poster which shows different types of maps (different views of Salt Lake City, Utah), as well as three reproducible maps and reproducible activity sheets which complement the maps. The poster provides teacher background, including step-by-step lesson plans for four geography…

  18. Show Me the Way

    ERIC Educational Resources Information Center

    Dicks, Matthew J.

    2005-01-01

    Because today's students have grown up steeped in video games and the Internet, most of them expect feedback, and usually gratification, very soon after they expend effort on a task. Teachers can get quick feedback to students by showing them videotapes of their learning performances. The author, a 3rd grade teacher, describes how the seemingly…

  19. Chemistry Game Shows

    NASA Astrophysics Data System (ADS)

    Campbell, Susan; Muzyka, Jennifer

    2002-04-01

    We present a technological improvement to the use of game shows to help students review for tests. Our approach uses HTML files interpreted with a browser on a computer attached to an LCD projector. The HTML files can be easily modified for use of the game in a variety of courses.

  20. Honored Teacher Shows Commitment.

    ERIC Educational Resources Information Center

    Ratte, Kathy

    1987-01-01

    Part of the acceptance speech of the 1985 National Council for the Social Studies Teacher of the Year, this article describes the censorship experience of this honored social studies teacher. The incident involved the showing of a videotape version of the feature film entitled "The Seduction of Joe Tynan." (JDH)

  1. Talk Show Science.

    ERIC Educational Resources Information Center

    Moore, Mitzi Ruth

    1992-01-01

    Proposes having students perform skits in which they play the roles of the science concepts they are trying to understand. Provides the dialog for a skit in which hot and cold gas molecules are interviewed on a talk show to study how these properties affect wind, rain, and other weather phenomena. (MDH)

  2. Stage a Water Show

    ERIC Educational Resources Information Center

    Frasier, Debra

    2008-01-01

    In the author's book titled "The Incredible Water Show," the characters from "Miss Alaineus: A Vocabulary Disaster" used an ocean of information to stage an inventive performance about the water cycle. In this article, the author relates how she turned the story into hands-on science teaching for real-life fifth-grade students. The author also…

  3. Use of the Concept of Equivalent Biologically Effective Dose (BED) to Quantify the Contribution of Hyperthermia to Local Tumor Control in Radiohyperthermia Cervical Cancer Trials, and Comparison With Radiochemotherapy Results

    SciTech Connect

    Plataniotis, George A.; Dale, Roger G.

    2009-04-01

    Purpose: To express the magnitude of contribution of hyperthermia to local tumor control in radiohyperthermia (RT/HT) cervical cancer trials, in terms of the radiation-equivalent biologically effective dose (BED) and to explore the potential of the combined modalities in the treatment of this neoplasm. Materials and Methods: Local control rates of both arms of each study (RT vs. RT+HT) reported from randomized controlled trials (RCT) on concurrent RT/HT for cervical cancer were reviewed. By comparing the two tumor control probabilities (TCPs) from each study, we calculated the HT-related log cell-kill and then expressed it in terms of the number of 2 Gy fraction equivalents, for a range of tumor volumes and radiosensitivities. We have compared the contribution of each modality and made some exploratory calculations on the TCPs that might be expected from a combined trimodality treatment (RT+CT+HT). Results: The HT-equivalent number of 2-Gy fractions ranges from 0.6 to 4.8 depending on radiosensitivity. Opportunities for clinically detectable improvement by the addition of HT are only available in tumors with an alpha value in the approximate range of 0.22-0.28 Gy-1. A combined treatment (RT+CT+HT) is not expected to improve prognosis in radioresistant tumors. Conclusion: The most significant improvements in TCP, which may result from the combination of RT/CT/HT for locally advanced cervical carcinomas, are likely to be limited only to those patients with tumors of relatively low-intermediate radiosensitivity.
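
    For reference, this kind of radiation-equivalent accounting rests on the standard linear-quadratic form of the BED. The sketch below states the textbook formula and one way to express an extra hyperthermia-related log cell-kill (written here as E_HT, a symbol introduced for illustration) as a number of equivalent 2 Gy fractions; this is a generic LQ sketch, not the authors' exact derivation.

```latex
% Textbook linear-quadratic BED for n fractions of dose d:
\[
  \mathrm{BED} = n d \left( 1 + \frac{d}{\alpha/\beta} \right)
\]
% One way to express an extra (hyperthermia-related) log cell-kill
% E_HT as N equivalent 2 Gy fractions: each 2 Gy fraction contributes
% 2\alpha + 4\beta to the log_e cell-kill, so
\[
  N_{2\,\mathrm{Gy}} = \frac{E_{\mathrm{HT}}}{2\alpha + 4\beta}
\]
```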

  4. Children’s developing intuitions about the truth conditions and implications of novel generics vs. quantified statements

    PubMed Central

    Brandone, Amanda C.; Gelman, Susan A; Hedglen, Jenna

    2014-01-01

    Generic statements express generalizations about categories and present a unique semantic profile that is distinct from quantified statements. This paper reports two studies examining the development of children’s intuitions about the semantics of generics and how they differ from statements quantified by all, most, and some. Results reveal that, like adults, preschoolers (1) recognize that generics have flexible truth conditions and are capable of representing a wide range of prevalence levels; and (2) interpret novel generics as having near-universal prevalence implications. Results further show that by age 4, children are beginning to differentiate the meaning of generics and quantified statements; however, even 7- to 11-year-olds are not adult-like in their intuitions about the meaning of most-quantified statements. Overall, these studies suggest that by preschool, children interpret generics in much the same way that adults do; however, mastery of the semantics of quantified statements follows a more protracted course. PMID:25297340

  5. Not a "reality" show.

    PubMed

    Wrong, Terence; Baumgart, Erica

    2013-01-01

    The authors of the preceding articles raise legitimate questions about patient and staff rights and the unintended consequences of allowing ABC News to film inside teaching hospitals. We explain why we regard their fears as baseless and not supported by what we heard from individuals portrayed in the filming, our decade-long experience making medical documentaries, and the full un-aired context of the scenes shown in the broadcast. The authors don't and can't know what conversations we had, what documents we reviewed, and what protections we put in place in each televised scene. Finally, we hope to correct several misleading examples cited by the authors as well as their offhand mischaracterization of our program as a "reality" show.

  6. Quantifying uncertainty in stable isotope mixing models

    SciTech Connect

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-19

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: SIAR [Parnell et al., 2010], a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated mixing fractions.
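
    The pure Monte Carlo strategy compared here can be sketched in a few lines: draw source fractions uniformly on the simplex and accept draws whose mixed isotopic signature matches the sample within a tolerance. The source signatures, sample values, and tolerance below are illustrative assumptions, not the blind-test data.

```python
# Minimal sketch of a pure Monte Carlo mixing approach in the spirit of the
# PMC method described here: draw source fractions uniformly on the simplex
# and keep draws that reproduce the sample's isotopic values within a
# tolerance. Source signatures, sample values, and tolerance are illustrative.
import numpy as np

rng = np.random.default_rng(42)

# (d15N, d18O) signatures for three nitrate sources (illustrative).
sources = np.array([[0.0, 22.0],    # atmospheric deposition
                    [5.0, 2.0],     # soil nitrogen
                    [12.0, 0.0]])   # manure/septic
sample = np.array([6.0, 5.0])       # measured (d15N, d18O) of the sample
tol = 0.5                           # per-mil acceptance tolerance

fracs = rng.dirichlet(np.ones(len(sources)), size=200_000)  # simplex samples
mixed = fracs @ sources
accepted = fracs[np.all(np.abs(mixed - sample) < tol, axis=1)]

print(f"accepted {len(accepted)} of 200000 draws")
print("mean mixing fractions:", accepted.mean(axis=0).round(3))
```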

  7. Quantifying uncertainty in stable isotope mixing models

    DOE PAGES

    Davis, Paul; Syme, James; Heikoop, Jeffrey; ...

    2015-05-19

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: SIAR [Parnell et al., 2010], a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated mixing fractions.

  8. Quantifying uncertainty in stable isotope mixing models

    NASA Astrophysics Data System (ADS)

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-01

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: Stable Isotope Analysis in R (SIAR), a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated mixing fractions.

  9. Children's school-breakfast reports and school-lunch reports (in 24-h dietary recalls): conventional and reporting-error-sensitive measures show inconsistent accuracy results for retention interval and breakfast location.

    PubMed

    Baxter, Suzanne D; Guinn, Caroline H; Smith, Albert F; Hitchcock, David B; Royer, Julie A; Puryear, Megan P; Collins, Kathleen L; Smith, Alyssa L

    2016-04-14

    Validation-study data were analysed to investigate retention interval (RI) and prompt effects on the accuracy of fourth-grade children's reports of school-breakfast and school-lunch (in 24-h recalls), and the accuracy of school-breakfast reports by breakfast location (classroom; cafeteria). Randomly selected fourth-grade children at ten schools in four districts were observed eating school-provided breakfast and lunch, and were interviewed under one of eight conditions created by crossing two RIs ('short'--prior-24-hour recall obtained in the afternoon and 'long'--previous-day recall obtained in the morning) with four prompts ('forward'--distant to recent, 'meal name'--breakfast, etc., 'open'--no instructions, and 'reverse'--recent to distant). Each condition had sixty children (half were girls). Of 480 children, 355 and 409 reported meals satisfying criteria for reports of school-breakfast and school-lunch, respectively. For breakfast and lunch separately, a conventional measure--report rate--and reporting-error-sensitive measures--correspondence rate and inflation ratio--were calculated for energy per meal-reporting child. Correspondence rate and inflation ratio--but not report rate--showed better accuracy for school-breakfast and school-lunch reports with the short RI than with the long RI; this pattern was not found for some prompts for each sex. Correspondence rate and inflation ratio showed better school-breakfast report accuracy for the classroom than for cafeteria location for each prompt, but report rate showed the opposite. For each RI, correspondence rate and inflation ratio showed better accuracy for lunch than for breakfast, but report rate showed the opposite. When choosing RI and prompts for recalls, researchers and practitioners should select a short RI to maximise accuracy. Recommendations for prompt selections are less clear. As report rates distort validation-study accuracy conclusions, reporting-error-sensitive measures are recommended.

  10. Quantifying reliability uncertainty : a proof of concept.

    SciTech Connect

    Diegert, Kathleen V.; Dvorack, Michael A.; Ringland, James T.; Mundt, Michael Joseph; Huzurbazar, Aparna; Lorio, John F.; Fatherley, Quinn; Anderson-Cook, Christine; Wilson, Alyson G.; Zurn, Rena M.

    2009-10-01

    This paper develops Classical and Bayesian methods for quantifying the uncertainty in reliability for a system of mixed series and parallel components for which both go/no-go and variables data are available. Classical methods focus on uncertainty due to sampling error. Bayesian methods can explore both sampling error and other knowledge-based uncertainties. To date, the reliability community has focused on qualitative statements about uncertainty because there was no consensus on how to quantify them. This paper provides a proof of concept that workable, meaningful quantification methods can be constructed. In addition, the application of the methods demonstrated that the results from the two fundamentally different approaches can be quite comparable. In both approaches, results are sensitive to the details of how one handles components for which no failures have been seen in relatively few tests.
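
    A minimal sketch of the Bayesian side for go/no-go data is given below: Beta posteriors on component reliabilities, propagated by Monte Carlo through a small series-parallel system. The priors, test counts, and system structure are assumptions for illustration, not the paper's application; note that component B, with zero observed failures, is exactly the sensitive case the abstract highlights.

```python
# Minimal sketch of the Bayesian side for go/no-go data: Beta posteriors on
# component reliabilities, propagated by Monte Carlo through a simple system
# of a component in series with a two-component parallel pair. The priors,
# test counts, and structure are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_draws = 100_000

# (tests, failures) for three components; Beta(1, 1) priors (assumption).
tests = {"A": (50, 1), "B": (30, 0), "C": (30, 2)}
post = {name: rng.beta(1 + n - f, 1 + f, n_draws)
        for name, (n, f) in tests.items()}

# System: A in series with (B parallel C).
r_parallel = 1 - (1 - post["B"]) * (1 - post["C"])
r_system = post["A"] * r_parallel

lo, med, hi = np.percentile(r_system, [5, 50, 95])
print(f"system reliability: median {med:.3f}, 90% interval [{lo:.3f}, {hi:.3f}]")
```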

  11. Quantifying Shape Changes and Tissue Deformation in Leaf Development.

    PubMed

    Rolland-Lagan, Anne-Gaëlle; Remmler, Lauren; Girard-Bock, Camille

    2014-06-01

    The analysis of biological shapes has applications in many areas of biology, and tools exist to quantify organ shape and detect shape differences between species or among variants. However, such measurements do not provide any information about the mechanisms of shape generation. Quantitative data on growth patterns may provide insights into morphogenetic processes, but since growth is a complex process occurring in four dimensions, growth patterns alone cannot intuitively be linked to shape outcomes. Here, we present computational tools to quantify tissue deformation and surface shape changes over the course of leaf development, applied to the first leaf of Arabidopsis (Arabidopsis thaliana). The results show that the overall leaf shape does not change notably during the developmental stages analyzed, yet there is a clear upward radial deformation of the leaf tissue in early time points. This deformation pattern may provide an explanation for how the Arabidopsis leaf maintains a relatively constant shape despite spatial heterogeneities in growth. These findings highlight the importance of quantifying tissue deformation when investigating the control of leaf shape. More generally, experimental mapping of deformation patterns may help us to better understand the link between growth and shape in organ development.

  12. Quantifying and measuring cyber resiliency

    NASA Astrophysics Data System (ADS)

    Cybenko, George

    2016-05-01

    Cyber resiliency has become an increasingly attractive research and operational concept in cyber security. While several metrics have been proposed for quantifying cyber resiliency, a considerable gap remains between those metrics and operationally measurable and meaningful concepts that can be empirically determined in a scientific manner. This paper describes a concrete notion of cyber resiliency that can be tailored to meet specific needs of organizations that seek to introduce resiliency into their assessment of their cyber security posture.

  13. 2-Year follow-up to STeP trial shows sustainability of structured self-monitoring of blood glucose utilization: results from the STeP practice logistics and usability survey (STeP PLUS).

    PubMed

    Friedman, Kevin; Noyes, Jeannette; Parkin, Christopher G

    2013-04-01

    We report findings from a follow-up survey of clinicians from the STeP study that assessed their attitudes toward and current use of the Accu-Chek(®) 360° View tool (Roche Diagnostics, Indianapolis, IN) approximately 2 years after the study was completed. The Accu-Chek 360° View tool enables patients to record/plot a seven-point self-monitoring of blood glucose (SMBG) profile (fasting, preprandial/2-h postprandial at each of the three meals, and bedtime) on 3 consecutive days, document meal sizes and energy levels, and comment on their SMBG experiences. Our findings showed that the majority of these physicians continue to use the tool with their patients, citing enhanced patient understanding and engagement, better discussions with patients regarding the impact of lifestyle behaviors, improved clinical outcomes, and better practice efficiencies as significant benefits of the tool.

  14. Tracking and Quantifying Objects and Non-Cohesive Substances

    ERIC Educational Resources Information Center

    van Marle, Kristy; Wynn, Karen

    2011-01-01

    The present study tested infants' ability to assess and compare quantities of a food substance. Contrary to previous findings, the results suggest that by 10 months of age infants can quantify non-cohesive substances, and that this ability is different in important ways from their ability to quantify discrete objects: (1) In contrast to even much…

  15. In favour of the definition "adolescents with idiopathic scoliosis": juvenile and adolescent idiopathic scoliosis braced after ten years of age, do not show different end results. SOSORT award winner 2014

    PubMed Central

    2014-01-01

    Background The most important factor discriminating juvenile (JIS) from adolescent idiopathic scoliosis (AIS) is the risk of deformity progression. Brace treatment can change natural history, even when the risk of progression is high. The aim of this study was to compare the end-of-growth results of JIS subjects, treated after 10 years of age, with the final results of AIS. Methods Design: prospective observational controlled cohort study nested in a prospective database. Setting: outpatient tertiary referral clinic specialized in conservative treatment of spinal deformities. Inclusion criteria: idiopathic scoliosis; European Risser 0–2; 25 degrees to 45 degrees Cobb; start of treatment at age 10 years or more; never treated before. Exclusion criteria: secondary scoliosis, neurological etiology, prior treatment for scoliosis (brace or surgery). Groups: 27 patients met the inclusion criteria for AJIS (juvenile idiopathic scoliosis treated in adolescence), demonstrated by an x-ray before 10 years of age and treatment starting after 10 years of age. The AIS group included 45 adolescents with a diagnostic x-ray made after the threshold of age 10 years. Results at the end of growth were analysed; a threshold of 5 Cobb degrees was used to define worsened, improved and stabilized curves. Statistics: Mean and SD were used for descriptive statistics of clinical and radiographic changes. Relative Risk of failure (RR), chi-square and t-tests were calculated to find differences between the two groups, with 95% Confidence Intervals (CI) calculated for radiographic changes. Results We did not find any significant Cobb angle differences between groups at baseline or at the end of treatment. The only difference was in the number of patients who progressed above 45 degrees, found in the JIS group. The RR of progression of AJIS versus AIS was 1.35 (95% CI 0.57-3.17), which was not statistically significant (p = 0.5338). Conclusion
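
    The relative-risk figure quoted in the Results follows the standard 2x2 formulas; a sketch with a log-normal 95% CI is given below. The event counts are illustrative stand-ins, not the study's raw data, although the group sizes match those reported.

```python
# Minimal sketch of a relative-risk calculation with a 95% CI (log-normal
# approximation), using the standard 2x2 formulas. The failure counts are
# illustrative, not taken from the study.
import math

def relative_risk(events_exposed, n_exposed, events_control, n_control):
    rr = (events_exposed / n_exposed) / (events_control / n_control)
    # Standard error of log(RR) for the log-normal approximation.
    se = math.sqrt(1 / events_exposed - 1 / n_exposed
                   + 1 / events_control - 1 / n_control)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, (lo, hi)

rr, ci = relative_risk(8, 27, 10, 45)  # illustrative failure counts
print(f"RR = {rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```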

  16. Clinical consequences of the Calypso trial showing superiority of PEG-liposomal doxorubicin and carboplatin over paclitaxel and carboplatin in recurrent ovarian cancer: results of an Austrian gynecologic oncologists' expert meeting.

    PubMed

    Petru, Edgar; Reinthaller, Alexander; Angleitner-Boubenizek, Lukas; Schauer, Christian; Zeimet, Alain; Dirschlmayer, Wolfgang; Medl, Michael; Stummvoll, Wolfgang; Sevelda, Paul; Marth, Christian

    2010-11-01

    The Calypso trial showed an improved progression-free survival with PEG-liposomal doxorubicin (PLD) and carboplatin (P) as compared with the standard regimen paclitaxel (PCLTX) and P in the second- or third-line treatment of platinum-sensitive epithelial ovarian cancer [1]. A panel of Austrian gynecologic oncologists discussed the clinical consequences of the data from the Calypso study for the routine practice. PLD + P had a significantly lower rate of alopecia and neuropathy than the taxane regimen, both toxicities which compromise the quality of life. Due to possible significant thrombocytopenia, the blood counts of patients undergoing PLD + P therapy should be monitored weekly. Patients receiving PLD/P are at higher risk of nausea and vomiting. Palmoplantar erythrodysesthesia (hand-foot syndrome) is a significant toxicity of PLD + P most prevalent after the third or fourth cycle. Prophylaxis consists of avoiding pressure on feet and hands and other parts of the body. Similarly, prophylaxis of mucositis seems important and includes avoiding consumption of hot, spicy and salty foods and drinks. Mouth dryness should be avoided. Premedication with antiemetics and dexamethasone dissolved in 5% glucose is done to prevent hypersensitivity to PLD. In conclusion, the therapeutic index is more favorable for PLD + P than for PCTX + P.

  17. Numerical and analytic results showing the suppression of secondary electron emission from velvet and foam, and a geometric view factor model to guide the development of a surface to suppress SEE

    NASA Astrophysics Data System (ADS)

    Swanson, Charles; Kaganovich, I. D.

    2016-09-01

    The technique of suppressing secondary electron emission (SEE) from a surface by texturing it has developed rapidly in recent years. We have both specific and general results in support of this technique. We performed numerical and analytic calculations to determine the effective secondary electron yield (SEY) from velvet, which is an array of long cylinders on the micro-scale, and found velvet suitable for suppressing the SEY from a normally incident primary distribution. We also performed numerical and analytic calculations for metallic foams, which are an isotropic lattice of fibers on the micro-scale, and found foams suitable for suppressing the SEY from an isotropic primary distribution. More generally, we created a geometric weighted view factor model for determining the SEY suppression of a given surface geometry, with optimization of SEY as a natural application. The optimal surface for suppressing SEY does not have finite area and has no smallest feature size, making it fractal in nature. The model gives simple criteria for a physical, non-fractal surface to suppress SEY. We found families of optimal surfaces to suppress SEY given a finite surface area. The research is supported by the Air Force Office of Scientific Research (AFOSR).

  18. Quantifying pulsed laser induced damage to graphene

    SciTech Connect

    Currie, Marc; Caldwell, Joshua D.; Bezares, Francisco J.; Robinson, Jeremy; Anderson, Travis; Chun, Hayden; Tadjer, Marko

    2011-11-21

    As an emerging optical material, graphene's ultrafast dynamics are often probed using pulsed lasers, yet the region in which optical damage takes place is largely uncharted. Here, femtosecond laser pulses induced localized damage in single-layer graphene on sapphire. Raman spatial mapping, SEM, and AFM microscopy quantified the damage. The resulting size of the damaged area has a linear correlation with the optical fluence. These results demonstrate local modification of sp2-carbon bonding structures with optical pulse fluences as low as 14 mJ/cm2, an order of magnitude lower than measured and theoretical ablation thresholds.

  19. SPACE: an algorithm to predict and quantify alternatively spliced isoforms using microarrays

    PubMed Central

    Anton, Miguel A; Gorostiaga, Dorleta; Guruceaga, Elizabeth; Segura, Victor; Carmona-Saez, Pedro; Pascual-Montano, Alberto; Pio, Ruben; Montuenga, Luis M; Rubio, Angel

    2008-01-01

    Exon and exon+junction microarrays are promising tools for studying alternative splicing. Current analytical tools applied to these arrays lack two relevant features: the ability to predict unknown spliced forms and the ability to quantify the concentration of known and unknown isoforms. SPACE is an algorithm that has been developed to (1) estimate the number of different transcripts expressed under several conditions, (2) predict the precursor mRNA splicing structure and (3) quantify the transcript concentrations including unknown forms. The results presented here show its robustness and accuracy for real and simulated data. PMID:18312629

  20. Quantifying Evaporation in a Permeable Pavement System

    EPA Science Inventory

    Studies quantifying evaporation from permeable pavement systems are limited to a few laboratory studies and one field application. This research quantifies evaporation for a larger-scale field application by measuring the water balance from lined permeable pavement sections. Th...

  1. QUANTIFIERS UNDONE: REVERSING PREDICTABLE SPEECH ERRORS IN COMPREHENSION

    PubMed Central

    Frazier, Lyn; Clifton, Charles

    2015-01-01

    Speakers predictably make errors during spontaneous speech. Listeners may identify such errors and repair the input, or their analysis of the input, accordingly. Two written questionnaire studies investigated error compensation mechanisms in sentences with doubled quantifiers such as Many students often turn in their assignments late. Results show a considerable number of undoubled interpretations for all items tested (though fewer for sentences containing doubled negation than for sentences containing many-often, every-always or few-seldom.) This evidence shows that the compositional form-meaning pairing supplied by the grammar is not the only systematic mapping between form and meaning. Implicit knowledge of the workings of the performance systems provides an additional mechanism for pairing sentence form and meaning. Alternate accounts of the data based on either a concord interpretation or an emphatic interpretation of the doubled quantifier don’t explain why listeners fail to apprehend the ‘extra meaning’ added by the potentially redundant material only in limited circumstances. PMID:26478637

  2. Quantifying Uncertainty in Epidemiological Models

    SciTech Connect

    Ramanathan, Arvind; Jha, Sumit Kumar

    2012-01-01

    Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs) to describe the dynamics of how a disease may spread within a population and enable the rational design of strategies for intervention that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in trusting the outcomes/predictions solely based on such models. Hence, there is a need to develop approaches such that mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions, when such retrospective analysis cannot be performed. In this paper, we outline an approach to develop uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.
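
    A minimal sketch of statistical model checking in this spirit: simulate an epidemiological model many times under parameter uncertainty and estimate the probability that a stated property holds. The SIR model, parameter ranges, and the property below are illustrative assumptions, not the paper's case study.

```python
# Minimal sketch of statistical model checking for an epidemiological model:
# simulate an SIR model repeatedly under parameter uncertainty and estimate
# the probability that a property holds (here, "peak infections stay below
# 20% of the population"). Parameter ranges and the property are illustrative.
import numpy as np

rng = np.random.default_rng(7)

def sir_peak_fraction(beta, gamma, n=10_000, i0=10, steps=300):
    """Discrete-time SIR; returns the peak infected fraction."""
    s, i, r = n - i0, i0, 0
    peak = i
    for _ in range(steps):
        new_inf = beta * s * i / n
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak / n

trials = 2000
holds = 0
for _ in range(trials):
    beta = rng.uniform(0.25, 0.45)   # uncertain contact rate (assumption)
    gamma = rng.uniform(0.1, 0.2)    # uncertain recovery rate (assumption)
    if sir_peak_fraction(beta, gamma) < 0.20:
        holds += 1

print(f"P(peak infected < 20%) ~= {holds / trials:.3f}")
```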

  3. Quantifying offshore wind resources from satellite wind maps: study area the North Sea

    NASA Astrophysics Data System (ADS)

    Hasager, C. B.; Barthelmie, R. J.; Christiansen, M. B.; Nielsen, M.; Pryor, S. C.

    2006-01-01

    Offshore wind resources are quantified from satellite synthetic aperture radar (SAR) and satellite scatterometer observations at local and regional scales, respectively, at the Horns Rev site in Denmark. The method for wind resource estimation from satellite observations interfaces with the wind atlas analysis and application program (WAsP). An estimate of the wind resource at the new project site at Horns Rev is given based on satellite SAR observations. The comparison of offshore satellite scatterometer winds, global model data and in situ data shows good agreement. Furthermore, the wake effect of the Horns Rev wind farm is quantified from satellite SAR images and compared with state-of-the-art wake model results with good agreement. This is a unique method for using satellite observations to quantify the spatial extent of the wake behind large offshore wind farms.

  4. Quantifying utricular stimulation during natural behavior

    PubMed Central

    Rivera, Angela R. V.; Davis, Julian; Grant, Wally; Blob, Richard W.; Peterson, Ellengene; Neiman, Alexander B.; Rowe, Michael

    2012-01-01

    The use of natural stimuli in neurophysiological studies has led to significant insights into the encoding strategies used by sensory neurons. To investigate these encoding strategies in vestibular receptors and neurons, we have developed a method for calculating the stimuli delivered to a vestibular organ, the utricle, during natural (unrestrained) behaviors, using the turtle as our experimental preparation. High-speed digital video sequences are used to calculate the dynamic gravito-inertial (GI) vector acting on the head during behavior. X-ray computed tomography (CT) scans are used to determine the orientation of the otoconial layer (OL) of the utricle within the head, and the calculated GI vectors are then rotated into the plane of the OL. Thus, the method allows us to quantify the spatio-temporal structure of stimuli to the OL during natural behaviors. In the future, these waveforms can be used as stimuli in neurophysiological experiments to understand how natural signals are encoded by vestibular receptors and neurons. We provide one example of the method which shows that turtle feeding behaviors can stimulate the utricle at frequencies higher than those typically used in vestibular studies. This method can be adapted to other species, to other vestibular end organs, and to other methods of quantifying head movements. PMID:22753360
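
    The core computation can be pictured in a few lines, hedged: the sign convention, the acceleration values and the otoconial-layer orientation below are invented for illustration, whereas the study derives them from high-speed video and CT.

    ```python
    # Sketch: form a gravito-inertial (GI) vector from gravity and measured
    # head acceleration (one common sign convention), then rotate it into the
    # plane of the otoconial layer (OL) with an orientation taken from CT.
    import numpy as np

    g = np.array([0.0, 0.0, -9.81])       # gravity in the world frame (m/s^2)
    a_head = np.array([1.2, 0.0, 0.3])    # head acceleration from video tracking
    gi_world = g - a_head                 # gravito-inertial vector

    pitch = np.deg2rad(20.0)              # hypothetical OL pitch angle from CT
    R = np.array([[np.cos(pitch), 0.0, np.sin(pitch)],
                  [0.0, 1.0, 0.0],
                  [-np.sin(pitch), 0.0, np.cos(pitch)]])
    gi_ol = R @ gi_world                  # GI vector in OL coordinates
    print("in-plane (shear) components:", gi_ol[:2], "normal component:", gi_ol[2])
    ```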

  5. Quantifying knee mechanics during balance training exercises.

    PubMed

    Benson, Lauren C; Almonroeder, Thomas G; O'Connor, Kristian M

    2017-01-01

    Patellofemoral pain (PFP) is common among runners and those recovering from anterior cruciate ligament reconstruction. Training programs designed to prevent or treat injuries often include balance training, although balance interventions have been reported to coincide with more knee injuries. Knowledge of the effect of balance exercises on knee mechanics may be useful when designing training programs. High knee abduction moment has been implicated in the development of PFP, and imbalance between vastus lateralis (VL) and vastus medialis oblique (VMO) may contribute to patellofemoral stress. The purpose was to quantify knee abduction moment and vasti muscle activity during balance exercises. Muscle activity of VMO and VL, three-dimensional lower-extremity kinematics, and ground reaction forces of healthy recreational athletes (12M, 13F) were recorded during five exercises. Peak knee abduction moment, ratio of VMO:VL activity, and delay in onset of VMO relative to VL were quantified for each exercise. The influence of sex and exercise on each variable was determined using a mixed-model ANOVA. All analyses indicated a significant main effect of exercise, p<0.05. Follow-up comparisons showed low peak knee abduction moment and high VMO:VL ratio for the task with anterior-posterior motion. Delay of VMO relative to VL was similar among balance board tasks.

  6. Quantifying Emergent Behavior of Autonomous Robots

    NASA Astrophysics Data System (ADS)

    Martius, Georg; Olbrich, Eckehard

    2015-10-01

    Quantifying behaviors of robots which were generated autonomously from task-independent objective functions is an important prerequisite for objective comparisons of algorithms and movements of animals. The temporal sequence of such a behavior can be considered as a time series and hence complexity measures developed for time series are natural candidates for its quantification. The predictive information and the excess entropy are such complexity measures. They measure the amount of information the past contains about the future and thus quantify the nonrandom structure in the temporal sequence. However, when using these measures for systems with continuous states one has to deal with the fact that their values will depend on the resolution with which the system's states are observed. For deterministic systems both measures will diverge with increasing resolution. We therefore propose a new decomposition of the excess entropy into resolution-dependent and resolution-independent parts and discuss how they depend on the dimensionality of the dynamics, correlations and the noise level. For practical estimation we propose to use estimates based on the correlation integral instead of direct estimation of the mutual information with the nearest-neighbor algorithm of Kraskov et al. (2004), because the latter allows less control of the scale dependencies. Using our algorithm we are able to show how autonomous learning generates behavior of increasing complexity with increasing learning duration.
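
    To make the resolution dependence concrete, the sketch below estimates a one-step predictive information for a discretized continuous signal with a plain plug-in histogram estimator. It is not the correlation-integral estimator proposed in the paper; it only illustrates how the measured information depends on the observation resolution eps.

    ```python
    # Plug-in estimate of I(x_t; x_{t+1}) after discretizing x at bin width eps.
    import numpy as np

    def predictive_info(x, eps):
        s = np.floor(x / eps).astype(int)              # symbols at resolution eps
        past, future = s[:-1], s[1:]
        edges = np.arange(s.min(), s.max() + 2) - 0.5  # one bin per integer symbol
        joint, _, _ = np.histogram2d(past, future, bins=[edges, edges])
        p = joint / joint.sum()
        px = p.sum(axis=1, keepdims=True)
        py = p.sum(axis=0, keepdims=True)
        nz = p > 0
        return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

    t = np.arange(10000)
    x = np.sin(0.05 * t) + 0.1 * np.random.default_rng(1).standard_normal(t.size)
    for eps in (0.5, 0.25, 0.125):
        print(eps, predictive_info(x, eps))  # grows as the resolution is refined
    ```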

  7. Quantifying of bactericide properties of medicinal plants

    PubMed Central

    Ács, András; Gölöncsér, Flóra; Barabás, Anikó

    2011-01-01

    Extended research has been carried out to clarify the ecological role of plant secondary metabolites (SMs). Although their primary ecological function is self-defense, bioactive compounds have long been used in alternative medicine or in biological control of pests. Several members of the family Labiatae are known to have strong antimicrobial capacity. For testing and quantifying antibacterial activity, standard microbial protocols are most often used, assessing inhibitory activity on a selected strain. In this study, the applicability of a microbial ecotox test was evaluated to quantify the aggregate bactericide capacity of Labiatae species, based on the bioluminescence inhibition of the bacterium Vibrio fischeri. Striking differences were found amongst herbs, with toxicities differing by as much as 10-fold. Glechoma hederacea L. proved to be the most toxic, with an EC50 of 0.4073 g dried plant/l. LC50 values generated by the standard bioassay seem to be a good indicator of the bactericide property of herbs. Traditional use of the selected herbs shows a good correlation with bioactivity expressed as bioluminescence inhibition, leading to the conclusion that the Vibrio fischeri bioassay can be a good indicator of the overall antibacterial capacity of herbs, at least on a screening level. PMID:21502819
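
    A hedged sketch of how an EC50 such as the one reported might be extracted from bioluminescence-inhibition data: fit a Hill (log-logistic) dose-response curve and read off the half-maximal concentration. The concentration-response values below are invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(c, ec50, n):
        return 1.0 / (1.0 + (c / ec50) ** n)   # fraction of luminescence remaining

    conc = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6])     # g dried plant / L
    resp = np.array([0.95, 0.85, 0.70, 0.48, 0.25, 0.10])

    (ec50, n), _ = curve_fit(hill, conc, resp, p0=[0.4, 1.0])
    print(f"EC50 = {ec50:.3f} g/L, Hill slope = {n:.2f}")
    ```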

  8. Quantifying cell behaviors during embryonic wound healing

    NASA Astrophysics Data System (ADS)

    Mashburn, David; Ma, Xiaoyan; Crews, Sarah; Lynch, Holley; McCleery, W. Tyler; Hutson, M. Shane

    2011-03-01

    During embryogenesis, internal forces induce motions in cells leading to widespread motion in tissues. We previously developed laser hole-drilling as a consistent, repeatable way to probe such epithelial mechanics. The initial recoil (less than 30s) gives information about physical properties (elasticity, force) of cells surrounding the wound, but the long-term healing process (tens of minutes) shows how cells adjust their behavior in response to stimuli. To study this biofeedback in many cells through time, we developed tools to quantify statistics of individual cells. By combining watershed segmentation with a powerful and efficient user interaction system, we overcome problems that arise in any automatic segmentation from poor image quality. We analyzed cell area, perimeter, aspect ratio, and orientation relative to wound for a wide variety of laser cuts in dorsal closure. We quantified statistics for different regions as well, i.e. cells near to and distant from the wound. Regional differences give a distribution of wound-induced changes, whose spatial localization provides clues into the physical/chemical signals that modulate the wound healing response. Supported by the Human Frontier Science Program (RGP0021/2007 C).

  9. Quantifying Scheduling Challenges for Exascale System Software

    SciTech Connect

    Mondragon, Oscar; Bridges, Patrick G.; Jones, Terry R

    2015-01-01

    The move towards high-performance computing (HPC) applications comprised of coupled codes and the need to dramatically reduce data movement is leading to a reexamination of time-sharing vs. space-sharing in HPC systems. In this paper, we discuss and begin to quantify the performance impact of a move away from strict space-sharing of nodes for HPC applications. Specifically, we examine the potential performance cost of time-sharing nodes between application components, we determine whether a simple coordinated scheduling mechanism can address these problems, and we research how suitable simple constraint-based optimization techniques are for solving scheduling challenges in this regime. Our results demonstrate that current general-purpose HPC system software scheduling and resource allocation systems are subject to significant performance deficiencies which we quantify for six representative applications. Based on these results, we discuss areas in which additional research is needed to meet the scheduling challenges of next-generation HPC systems.

  10. National Orange Show Photovoltaic Demonstration

    SciTech Connect

    Dan Jimenez; Sheri Raborn, CPA; Tom Baker

    2008-03-31

    National Orange Show Photovoltaic Demonstration created a 400 kW photovoltaic self-generation plant at the National Orange Show Events Center (NOS). The NOS owns a 120-acre state fairground where it operates an events center and produces an annual citrus fair known as the Orange Show. The NOS governing board wanted to employ cost-saving programs for annual energy expenses. It is hoped the photovoltaic program will result in overall savings for the NOS, help reduce the State's energy demands relating to electrical power consumption, improve quality of life within the affected grid area, and increase the energy efficiency of buildings at the venue. In addition, the potential to reduce operational expenses would have a tremendous effect on the ability of the NOS to service its community.

  11. Quantifying uncertainty from material inhomogeneity.

    SciTech Connect

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

    Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femto-second laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the

  12. A stochastic approach for quantifying immigrant integration: the Spanish test case

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Barra, Adriano; Contucci, Pierluigi; Sandell, Richard; Vernia, Cecilia

    2014-10-01

    We apply stochastic process theory to the analysis of immigrant integration. Using a unique and detailed data set from Spain, we study the relationship between local immigrant density and two social and two economic immigration quantifiers for the period 1999-2010. As opposed to the classic time-series approach, by letting immigrant density play the role of ‘time’ and the quantifier the role of ‘space,’ it becomes possible to analyse the behavior of the quantifiers by means of continuous time random walks. Two classes of results are then obtained. First, we show that social integration quantifiers evolve following diffusion law, while the evolution of economic quantifiers exhibits ballistic dynamics. Second, we make predictions of best- and worst-case scenarios taking into account large local fluctuations. Our stochastic process approach to integration lends itself to interesting forecasting scenarios which, in the hands of policy makers, have the potential to improve political responses to integration problems. For instance, estimating the standard first-passage time and maximum-span walk reveals local differences in integration performance for different immigration scenarios. Thus, by recognizing the importance of local fluctuations around national means, this research constitutes an important tool to assess the impact of immigration phenomena on municipal budgets and to set up solid multi-ethnic plans at the municipal level as immigration pressures build.
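
    The diffusive-versus-ballistic distinction comes down to a scaling exponent, which the sketch below estimates for synthetic walkers: fit alpha in MSD(t) ~ t^alpha on log-log axes, where alpha near 1 indicates diffusion and alpha near 2 ballistic motion. The data are synthetic; the paper's quantifiers and its density-as-time variable are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(1, 200)
    walks = np.cumsum(rng.standard_normal((500, t.size)), axis=1)  # diffusive walkers
    msd = (walks ** 2).mean(axis=0)                                # mean squared displacement

    alpha, _ = np.polyfit(np.log(t), np.log(msd), 1)
    print("estimated scaling exponent alpha =", round(alpha, 2))   # ~1 -> diffusion
    ```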

  13. Quantifying tissue hemodynamics by NIRS versus DOT: global versus focal changes in cerebral hemodynamics

    NASA Astrophysics Data System (ADS)

    Boas, David A.; Cheng, Xuefeng; Marota, John A.; Mandeville, Joseph B.

    1999-09-01

    Near infrared spectroscopy (NIRS) is used to quantify changes in oxy-hemoglobin (HbO) and deoxy-hemoglobin (Hb) concentrations in tissue. The analysis uses the modified Beer-Lambert law, which is generally valid for quantifying global concentration changes. We examine the errors that result from analyzing focal changes in HbO and Hb concentrations. We find that the measured focal changes in HbO and Hb are linearly proportional to the actual focal changes, but that the proportionality constants are different. Thus relative changes in HbO and Hb cannot, in general, be quantified. However, we show that under certain circumstances it is possible to quantify these relative changes. This builds the case for diffuse optical tomography (DOT), which in general should be able to quantify focal changes in HbO and Hb through the use of image reconstruction algorithms that deconvolve the photon diffusion point-spread function. We demonstrate the differences between NIRS and DOT using a rat model of somatosensory stimulation.
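
    For global changes, the modified Beer-Lambert analysis reduces to a small linear inversion, sketched below for two wavelengths. The extinction coefficients, pathlength and differential pathlength factor are placeholder values for illustration, not calibrated constants.

    ```python
    # Solve dOD(lambda) = (eps_HbO*dHbO + eps_Hb*dHb) * L * DPF at two wavelengths.
    import numpy as np

    # rows: two wavelengths; columns: [HbO, Hb] extinction coefficients (1/(mM*cm))
    E = np.array([[0.35, 2.10],
                  [2.40, 1.80]])        # placeholder values
    L = 6.0                             # source-detector pathlength (cm), assumed
    DPF = 6.0                           # differential pathlength factor, assumed

    d_od = np.array([0.012, 0.018])     # measured optical density changes
    d_hbo, d_hb = np.linalg.solve(E * L * DPF, d_od)
    print(f"dHbO = {d_hbo:.5f} mM, dHb = {d_hb:.5f} mM")
    ```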

  14. Quantifying facial paralysis using the Kinect v2.

    PubMed

    Gaber, Amira; Taher, Mona F; Wahed, Manal Abdel

    2015-01-01

    Assessment of facial paralysis (FP) and quantitative grading of facial asymmetry are essential in order to quantify the extent of the condition as well as to follow its improvement or progression. As such, there is a need for an accurate quantitative grading system that is easy to use, inexpensive and has minimal inter-observer variability. A comprehensive automated system to quantify and grade FP is the main objective of this work. An initial prototype has been presented by the authors. The present research aims to enhance the accuracy and robustness of one of this system's modules: the resting symmetry module. This is achieved by including several modifications to the computation method of the symmetry index (SI) for the eyebrows, eyes and mouth. These modifications are the gamma correction technique, the area of the eyes, and the slope of the mouth. The system was tested on normal subjects and showed promising results. With the modified method, the mean SI of the eyebrows decreased slightly from 98.42% to 98.04%, while the mean SI for the eyes and mouth increased from 96.93% to 99.63% and from 95.6% to 98.11%, respectively. The system is easy to use, inexpensive, automated and fast, has no inter-observer variability and is thus well suited for clinical use.

  15. A Generalizable Methodology for Quantifying User Satisfaction

    NASA Astrophysics Data System (ADS)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
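
    A minimal sketch of the survival-analysis starting point on synthetic session times with light censoring: the Kaplan-Meier estimator gives the probability that a session outlasts time t. The paper's covariates and model development process are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    times = rng.exponential(30.0, 200)        # session lengths (minutes), synthetic
    observed = rng.random(200) > 0.1          # False = censored session

    order = np.argsort(times)
    times, observed = times[order], observed[order]
    at_risk = np.arange(len(times), 0, -1)    # sessions still active at each event
    surv = np.cumprod(1.0 - observed / at_risk)   # Kaplan-Meier survival curve

    for q in (0.75, 0.5, 0.25):
        t_q = times[np.searchsorted(-surv, -q)]   # first time S(t) drops to q
        print(f"S(t) <= {q} at t = {t_q:.1f} min")
    ```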

  16. Quantifying, Visualizing, and Monitoring Lead Optimization.

    PubMed

    Maynard, Andrew T; Roberts, Christopher D

    2016-05-12

    Although lead optimization (LO) is by definition a process, process-centric analysis and visualization of this important phase of pharmaceutical R&D has been lacking. Here we describe a simple statistical framework to quantify and visualize the progression of LO projects so that the vital signs of LO convergence can be monitored. We refer to the resulting visualizations generated by our methodology as the "LO telemetry" of a project. These visualizations can be automated to provide objective, holistic, and instantaneous analysis and communication of LO progression. This enhances the ability of project teams to more effectively drive LO process, while enabling management to better coordinate and prioritize LO projects. We present the telemetry of five LO projects comprising different biological targets and different project outcomes, including clinical compound selection, termination due to preclinical safety/tox, and termination due to lack of tractability. We demonstrate that LO progression is accurately captured by the telemetry. We also present metrics to quantify LO efficiency and tractability.

  17. Measures of Complexity to quantify Bone Structure

    NASA Astrophysics Data System (ADS)

    Saparin, Peter; Gowin, Wolfgang; Kurths, Jürgen; Felsenberg, Dieter

    1998-03-01

    We propose a technique to assess the structure of bone in its spatial distribution by describing and quantifying the structural architecture as a whole. The concept of measures of complexity based on symbolic dynamics is applied to computed tomography (CT) images obtained from human lumbar vertebrae. CT images were transformed into images consisting of 5 different symbols, whereby both static and dynamic coding are included. Different aspects of the bone structure are quantified by several measures which have been introduced: an index of the global ensemble of elements composing the bone; complexity, homogeneity and dynamics within the bone architecture; and complexity and inhomogeneity of the trabecular net. This leads to new insights into the understanding of the bone's internal structure. The results give the first experimental and quantitative evidence for the theoretical prediction that the complexity of bone structure declines rapidly with increased disintegration of bone structures leading to the loss of bone mass, and they specify experimentally that bone structure is exponentially related to its density. In particular, osteoporotic vertebrae are less complexly organized than normal ones. In addition, this method is highly sensitive to changes in bone structure and provides improvements in the diagnosis of pathological structural loss.

  18. Stimfit: quantifying electrophysiological data with Python

    PubMed Central

    Guzman, Segundo J.; Schlögl, Alois; Schmidt-Hieber, Christoph

    2013-01-01

    Intracellular electrophysiological recordings provide crucial insights into elementary neuronal signals such as action potentials and synaptic currents. Analyzing and interpreting these signals is essential for a quantitative understanding of neuronal information processing, and requires both fast data visualization and ready access to complex analysis routines. To achieve this goal, we have developed Stimfit, a free software package for cellular neurophysiology with a Python scripting interface and a built-in Python shell. The program supports most standard file formats for cellular neurophysiology and other biomedical signals through the Biosig library. To quantify and interpret the activity of single neurons and communication between neurons, the program includes algorithms to characterize the kinetics of presynaptic action potentials and postsynaptic currents, estimate latencies between pre- and postsynaptic events, and detect spontaneously occurring events. We validate and benchmark these algorithms, give estimation errors, and provide sample use cases, showing that Stimfit represents an efficient, accessible and extensible way to accurately analyze and interpret neuronal signals. PMID:24600389

  19. Quantifying a cellular automata simulation of electric vehicles

    NASA Astrophysics Data System (ADS)

    Hill, Graeme; Bell, Margaret; Blythe, Phil

    2014-12-01

    Within this work the Nagel-Schreckenberg (NS) cellular automaton is used to simulate a basic cyclic road network. Results from SwitchEV, a real-world electric vehicle trial which has collected more than two years of detailed electric vehicle data, are used to quantify the results of the NS automaton, demonstrating similar power consumption behavior to that observed in the experimental results. In particular, the efficiency of the electric vehicles reduces as the vehicle density increases, due in part to the reduced efficiency of EVs at low speeds, but also due to the energy consumption inherent in changing speeds. Further work shows the results of introducing a spatially restricted speed limit. In general, it can be seen that induced congestion from spatially transient events propagates back through the road network and alters the energy and efficiency profile of the simulated vehicles, both before and after the speed restriction. Vehicles upstream from the restriction show a reduced energy usage and an increased efficiency, and vehicles downstream show an initial large increase in energy usage as they accelerate away from the speed restriction.
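
    For reference, the four basic NS update rules (accelerate, brake to the gap ahead, random slowdown, move) fit in a few lines; the sketch below implements them on a cyclic road. The EV energy accounting calibrated against the SwitchEV data is not reproduced, and the density and parameters are illustrative.

    ```python
    import numpy as np

    L, N, vmax, p_slow, steps = 100, 30, 5, 0.3, 200
    rng = np.random.default_rng(0)
    pos = np.sort(rng.choice(L, N, replace=False))   # car positions on the ring
    vel = np.zeros(N, dtype=int)

    for _ in range(steps):
        gaps = (np.roll(pos, -1) - pos - 1) % L      # empty cells to the car ahead
        vel = np.minimum(vel + 1, vmax)              # 1. accelerate
        vel = np.minimum(vel, gaps)                  # 2. brake to avoid collision
        slow = (rng.random(N) < p_slow) & (vel > 0)  # 3. random slowdown
        vel = np.where(slow, vel - 1, vel)
        pos = (pos + vel) % L                        # 4. move

    print("mean speed at density %.2f: %.2f cells/step" % (N / L, vel.mean()))
    ```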

  20. Heart rate measurement as a tool to quantify sedentary behavior.

    PubMed

    Åkerberg, Anna; Koshmak, Gregory; Johansson, Anders; Lindén, Maria

    2015-01-01

    Sedentary work is very common today. The aim of this pilot study was to attempt to differentiate between typical work situations and to investigate the possibility to break sedentary behavior, based on physiological measurement among office workers. Ten test persons used one heart rate based activity monitor (Linkura), one pulse oximeter device (Wrist) and one movement based activity wristband (Fitbit Flex), in different working situations. The results showed that both heart rate devices, Linkura and Wrist, were able to detect differences in heart rate between the different working situations (resting, sitting, standing, slow walk and medium fast walk). The movement based device, Fitbit Flex, was only able to separate differences in steps between slow walk and medium fast walk. It can be concluded that heart rate measurement is a promising tool for quantifying and separating different working situations, such as sitting, standing and walking.

  1. Fuzzy Entropy Method for Quantifying Supply Chain Networks Complexity

    NASA Astrophysics Data System (ADS)

    Zhang, Jihui; Xu, Junqin

    A supply chain is a special kind of complex network. Its complexity and uncertainty make it very difficult to control and manage. Supply chains are faced with a rising complexity of products, structures, and processes. Because of the strong link between a supply chain's complexity and its efficiency, supply chain complexity management has become a major challenge of today's business management. The aim of this paper is to quantify the complexity and organization level of an industrial network, working towards the development of a 'Supply Chain Network Analysis' (SCNA). By measuring flows of goods and interaction costs between different sectors of activity within the supply chain borders, a network of flows is built and successively investigated by network analysis. The results of this study show that our approach can provide an interesting conceptual perspective in which the modern supply network can be framed, and that network analysis can handle these issues in practice.
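
    The paper's fuzzy-entropy formulation is not reproduced here; as a simple stand-in that illustrates an entropy-based complexity index, the sketch below computes the Shannon entropy of a normalized matrix of inter-sector goods flows. The flow matrix is invented for illustration.

    ```python
    import numpy as np

    flows = np.array([[0.0, 120.0, 30.0],
                      [10.0, 0.0, 200.0],
                      [60.0, 40.0, 0.0]])  # goods shipped from sector i to sector j

    p = flows / flows.sum()                # flow shares
    nz = p > 0
    H = -np.sum(p[nz] * np.log2(p[nz]))    # Shannon entropy of the flow structure
    H_max = np.log2(nz.sum())              # uniform flows over the realized links
    print(f"flow entropy = {H:.3f} bits (max {H_max:.3f}, ratio {H / H_max:.2f})")
    ```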

  2. Quantifying consumption rates of dissolved oxygen along bed forms

    NASA Astrophysics Data System (ADS)

    Boano, Fulvio; De Falco, Natalie; Arnon, Shai

    2016-04-01

    Streambed interfaces represent hotspots for nutrient transformations because they host different microbial species, and the evaluation of these reaction rates is important to assess the fate of nutrients in riverine environments. In this work we analyze a series of flume experiments on oxygen demand in dune-shaped hyporheic sediments under losing and gaining flow conditions. We employ a new modeling code to quantify oxygen consumption rates from observed vertical profiles of oxygen concentration. The code accounts for transport by molecular diffusion and water advection, and automatically determines the reaction rates that provide the best fit between observed and modeled concentration values. The results show that reaction rates are not uniformly distributed across the streambed, in agreement with the expected behavior predicted by hyporheic exchange theory. Oxygen consumption was found to be highly influenced by the presence of gaining or losing flow conditions, which controlled the delivery of labile DOC to streambed microorganisms.
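
    A hedged sketch of the kind of inverse problem involved, assuming a steady one-dimensional profile with molecular diffusion D, downward (losing) velocity v and first-order oxygen consumption k; the code referred to in the abstract handles more general cases. The analytic decaying solution is fit to an observed profile to recover k, and all parameter values are illustrative.

    ```python
    # Steady 1D advection-diffusion-reaction: D*C'' - v*C' - k*C = 0, with the
    # decaying solution C(z) = C0*exp(lam*z), lam = (v - sqrt(v^2 + 4*D*k))/(2*D).
    import numpy as np
    from scipy.optimize import curve_fit

    D, v, C0 = 1.5e-9, 1.0e-6, 0.25        # m^2/s, m/s, mol/m^3 (illustrative)

    def profile(z, k):
        lam = (v - np.sqrt(v**2 + 4.0*D*k)) / (2.0*D)
        return C0 * np.exp(lam * z)

    z = np.linspace(0.0, 0.05, 15)         # depth below the interface (m)
    obs = profile(z, 2.0e-4) * (1 + 0.03 * np.random.default_rng(2).standard_normal(z.size))

    (k_fit,), _ = curve_fit(profile, z, obs, p0=[1e-4], bounds=(0.0, np.inf))
    print(f"fitted consumption rate k = {k_fit:.2e} 1/s")
    ```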

  3. Measuring political polarization: Twitter shows the two sides of Venezuela

    NASA Astrophysics Data System (ADS)

    Morales, A. J.; Borondo, J.; Losada, J. C.; Benito, R. M.

    2015-03-01

    We say that a population is perfectly polarized when divided into two groups of the same size and opposite opinions. In this paper, we propose a methodology to study and measure the emergence of polarization from social interactions. We begin by proposing a model to estimate opinions in which a minority of influential individuals propagate their opinions through a social network. The result of the model is an opinion probability density function. Next, we propose an index to quantify the extent to which the resulting distribution is polarized. Finally, we apply the proposed methodology to a Twitter conversation about the late Venezuelan president, Hugo Chávez, finding a good agreement between our results and offline data. Hence, we show that our methodology can detect different degrees of polarization, depending on the structure of the network.
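
    In the same spirit as the proposed index (though not its exact definition), a polarization score can combine the size balance of the two opinion camps with the separation of their centers, as sketched below on synthetic opinions in [-1, 1].

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    opinions = np.concatenate([rng.normal(-0.8, 0.1, 500),
                               rng.normal(0.8, 0.1, 500)])    # synthetic two-camp case

    neg, pos = opinions[opinions < 0], opinions[opinions >= 0]
    balance = 1.0 - abs(len(pos) - len(neg)) / len(opinions)  # 1 when camps are equal
    separation = abs(pos.mean() - neg.mean()) / 2.0           # in [0, 1] for this scale
    print(f"polarization = {balance * separation:.2f} (1 = perfectly polarized)")
    ```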

  4. Quantifying Global Uncertainties in a Simple Microwave Rainfall Algorithm

    NASA Technical Reports Server (NTRS)

    Kummerow, Christian; Berg, Wesley; Thomas-Stahle, Jody; Masunaga, Hirohiko

    2006-01-01

    While a large number of methods exist in the literature for retrieving rainfall from passive microwave brightness temperatures, little has been written about the quantitative assessment of the expected uncertainties in these rainfall products at various time and space scales. The latter is the result of two factors: sparse validation sites over most of the world's oceans, and algorithm sensitivities to rainfall regimes that cause inconsistencies against validation data collected at different locations. To make progress in this area, a simple probabilistic algorithm is developed. The algorithm uses an a priori database constructed from the Tropical Rainfall Measuring Mission (TRMM) radar data coupled with radiative transfer computations. Unlike efforts designed to improve rainfall products, this algorithm takes a step backward in order to focus on uncertainties. In addition to inversion uncertainties, the construction of the algorithm allows errors resulting from incorrect databases, incomplete databases, and time- and space-varying databases to be examined. These are quantified. Results show that the simple algorithm reduces errors introduced by imperfect knowledge of precipitation radar (PR) rain by a factor of 4 relative to an algorithm that is tuned to the PR rainfall. Database completeness does not introduce any additional uncertainty at the global scale, while climatologically distinct space/time domains add approximately 25% uncertainty that cannot be detected by a radiometer alone. Of this value, 20% is attributed to changes in cloud morphology and microphysics, while 5% is a result of changes in the rain/no-rain thresholds. All but 2%-3% of this variability can be accounted for by considering the implicit assumptions in the algorithm. Additional uncertainties introduced by the details of the algorithm formulation are not quantified in this study because of the need for independent measurements that are beyond the scope of this paper. A validation strategy

  5. An Ethogram to Quantify Operating Room Behavior

    PubMed Central

    Jones, Laura K.; Jennings, Bonnie Mowinski; Goelz, Ryan M.; Haythorn, Kent W.; Zivot, Joel B.; de Waal, Frans B. M.

    2017-01-01

    Background The operating room (OR) is a highly social and hierarchical setting where interprofessional team members must work interdependently under pressure. Due primarily to methodological challenges, the social and behavioral sciences have had trouble offering insight into OR dynamics. Purpose We adopted a method from the field of ethology for observing and quantifying the interpersonal interactions of OR team members. Methods We created and refined an ethogram, a catalog of all our subjects’ observable social behaviors. The ethogram was then assessed for its feasibility and interobserver reliability. Results It was feasible to use an ethogram to gather data in the OR. The high interobserver reliability (Cohen’s Kappa coefficients of 81 % and higher) indicates its utility for yielding largely objective, descriptive, quantitative data on OR behavior. Conclusions The method we propose has potential for social research conducted in healthcare settings as complex as the OR. PMID:26813263

  6. Quantifying the risk of extreme aviation accidents

    NASA Astrophysics Data System (ADS)

    Das, Kumer Pial; Dey, Asim Kumer

    2016-12-01

    Air travel is considered a safe means of transportation. But when aviation accidents do occur, they often result in fatalities. Fortunately, the most extreme accidents occur rarely. However, 2014 was the deadliest year in the past decade, with 111 plane crashes, and among them the worst four caused 298, 239, 162 and 116 deaths. In this study, we want to assess the risk of catastrophic aviation accidents by studying historical aviation accidents. Applying a generalized Pareto model, we predict the maximum fatalities from an aviation accident in the future. The fitted model is compared with some competing models. The uncertainty in the inferences is quantified using simulated aviation accident series, generated by bootstrap resampling and Monte Carlo simulations.
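
    A sketch of the peaks-over-threshold analysis the abstract describes, on synthetic data: fit a generalized Pareto distribution to exceedances over a threshold and convert the fit into a return level. The threshold and sample values are illustrative, not the study's accident series.

    ```python
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(1)
    fatalities = rng.pareto(2.0, 2000) * 20.0      # synthetic accident fatalities
    u = 50.0                                       # threshold for "extreme" accidents
    exc = fatalities[fatalities > u] - u

    c, _, scale = genpareto.fit(exc, floc=0.0)     # c: shape (tail index)
    p_u = (fatalities > u).mean()                  # exceedance probability
    m = 1000                                       # return period (in accidents)
    level = u + genpareto.ppf(1.0 - 1.0 / (m * p_u), c, loc=0.0, scale=scale)
    print(f"shape = {c:.2f}; 1-in-{m} accident fatality level = {level:.0f}")
    ```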

  7. Quantifying capital goods for waste incineration

    SciTech Connect

    Brogaard, L.K.; Riber, C.; Christensen, T.H.

    2013-06-15

    Highlights: • Materials and energy used for the construction of waste incinerators were quantified. • The data was collected from five incineration plants in Scandinavia. • Included were six main materials, electronic systems, cables and all transportation. • The capital goods contributed 2–3% compared to the direct emissions impact on GW. - Abstract: Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000–240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used, amounting to 19,000–26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000–5000 MWh. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7–14 kg CO₂ per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2–3% with respect to kg CO₂ per tonne of waste combusted.

  8. Quantifying diet for nutrigenomic studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The field of nutrigenomics shows tremendous promise for improved understanding of the effects of dietary intake on health. The knowledge that metabolic pathways may be altered in individuals with genetic variants in the presence of certain dietary exposures offers great potential for personalized nu...

  9. Talker-specificity and adaptation in quantifier interpretation

    PubMed Central

    Yildirim, Ilker; Degen, Judith; Tanenhaus, Michael K.; Jaeger, T. Florian

    2015-01-01

    Linguistic meaning has long been recognized to be highly context-dependent. Quantifiers like many and some provide a particularly clear example of context-dependence. For example, the interpretation of quantifiers requires listeners to determine the relevant domain and scale. We focus on another type of context-dependence that quantifiers share with other lexical items: talker variability. Different talkers might use quantifiers with different interpretations in mind. We used a web-based crowdsourcing paradigm to study participants' expectations about the use of many and some based on recent exposure. We first established that the mapping of some and many onto quantities (candies in a bowl) is variable both within and between participants. We then examined whether and how listeners' expectations about quantifier use adapt with exposure to talkers who use quantifiers in different ways. The results demonstrate that listeners can adapt to talker-specific biases in both how often and with what intended meaning many and some are used. PMID:26858511

  10. Quantifying Sensible Weather Forecast Variability

    DTIC Science & Technology

    2011-09-30

    came from the summer of 2010 and represented days when significant marine stratus transitions occurred over the course of a 24-hour period. These days...skies over most of the bay at the model start time. The satellite image shows a complex evolution of marine stratus from mostly clear the afternoon...and the primary focus of this study. To a first order approximation, the development and evolution of marine stratus depends largely on the

  11. Scalar Quantifiers: Logic, Acquisition, and Processing

    ERIC Educational Resources Information Center

    Geurts, Bart; Katsos, Napoleon; Cummins, Chris; Moons, Jonas; Noordman, Leo

    2010-01-01

    Superlative quantifiers ("at least 3", "at most 3") and comparative quantifiers ("more than 2", "fewer than 4") are traditionally taken to be interdefinable: the received view is that "at least n" and "at most n" are equivalent to "more than n-1" and "fewer than n+1",…

  12. Processing of Numerical and Proportional Quantifiers

    ERIC Educational Resources Information Center

    Shikhare, Sailee; Heim, Stefan; Klein, Elise; Huber, Stefan; Willmes, Klaus

    2015-01-01

    Quantifier expressions like "many" and "at least" are part of a rich repository of words in language representing magnitude information. The role of numerical processing in comprehending quantifiers was studied in a semantic truth value judgment task, asking adults to quickly verify sentences about visual displays using…

  13. Quantifying and Predicting Reactive Transport

    SciTech Connect

    Peter C. Burns, Department of Civil Engineering and Geological Sciences, University of Notre Dame

    2009-12-04

    This project was led by Dr. Jiamin Wan at Lawrence Berkeley National Laboratory. Peter Burns provided expertise in uranium mineralogy and in identification of uranium minerals in test materials. Dr. Wan conducted column tests regarding uranium transport at LBNL, and samples of the resulting columns were sent to Dr. Burns for analysis. Samples were analyzed for uranium mineralogy by X-ray powder diffraction and by scanning electron microscopy, and results were provided to Dr. Wan for inclusion in the modeling effort. Full details of the project can be found in Dr. Wan's final reports for the associated effort at LBNL.

  14. Quantifying chaos for ecological stoichiometry.

    PubMed

    Duarte, Jorge; Januário, Cristina; Martins, Nuno; Sardanyés, Josep

    2010-09-01

    The theory of ecological stoichiometry considers ecological interactions among species with different chemical compositions. Both experimental and theoretical investigations have shown the importance of species composition in the outcome of the population dynamics. A recent study of a theoretical three-species food chain model considering stoichiometry [B. Deng and I. Loladze, Chaos 17, 033108 (2007)] shows that coexistence between two consumers preying on the same prey is possible via chaos. In this work we study the topological and dynamical measures of the chaotic attractors found in such a model under ecologically relevant parameters. By using the theory of symbolic dynamics, we first compute the topological entropy associated with unimodal Poincaré return maps obtained by Deng and Loladze from a dimension reduction. With this measure we numerically prove chaotic competitive coexistence, which is characterized by positive topological entropy and positive Lyapunov exponents, achieved when the first predator reduces its maximum growth rate, as happens at increasing δ1. However, for higher values of δ1 the dynamics become stable again due to an asymmetric bubble-like bifurcation scenario. We also show that a decrease in the efficiency of the predator sensitive to prey's quality (increasing parameter ζ) stabilizes the dynamics. Finally, we estimate the fractal dimension of the chaotic attractors for the stoichiometric ecological model.

  15. Quantifying Coral Reef Ecosystem Services

    EPA Science Inventory

    Coral reefs have been declining during the last four decades as a result of both local and global anthropogenic stresses. Numerous research efforts to elucidate the nature, causes, magnitude, and potential remedies for the decline have led to the widely held belief that the recov...

  16. 15. Detail showing lower chord pinconnected to vertical member, showing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. Detail showing lower chord pin-connected to vertical member, showing floor beam riveted to extension of vertical member below pin-connection, and showing brackets supporting cantilevered sidewalk. View to southwest. - Selby Avenue Bridge, Spanning Short Line Railways track at Selby Avenue between Hamline & Snelling Avenues, Saint Paul, Ramsey County, MN

  17. Quantifying Diet for Nutrigenomic Studies

    PubMed Central

    Tucker, Katherine L.; Smith, Caren E.; Lai, Chao-Qiang; Ordovas, Jose M.

    2015-01-01

    The field of nutrigenomics shows tremendous promise for improved understanding of the effects of dietary intake on health. The knowledge that metabolic pathways may be altered in individuals with genetic variants in the presence of certain dietary exposures offers great potential for personalized nutrition advice. However, although considerable resources have gone into improving technology for measurement of the genome and biological systems, dietary intake assessment remains inadequate. Each of the methods currently used has limitations that may be exaggerated in the context of gene x nutrient interaction in large multiethnic studies. Because of the specificity of most gene x nutrient interactions, valid data are needed for nutrient intakes at the individual level. Most statistical adjustment efforts are designed to improve estimates of nutrient intake distributions in populations and are unlikely to solve this problem. An improved method of direct measurement of individual usual dietary intake that is unbiased across populations is urgently needed. PMID:23642200

  18. Dust as interstellar catalyst. I. Quantifying the chemical desorption process

    NASA Astrophysics Data System (ADS)

    Minissale, M.; Dulieu, F.; Cazaux, S.; Hocuk, S.

    2016-01-01

    Context. The presence of dust in the interstellar medium has profound consequences on the chemical composition of regions where stars are forming. Recent observations show that many species formed onto dust are populating the gas phase, especially in cold environments where UV- and cosmic-ray-induced photons do not account for such processes. Aims: The aim of this paper is to understand and quantify the process that releases solid species into the gas phase, the so-called chemical desorption process, so that an explicit formula can be derived that can be included in astrochemical models. Methods: We present a collection of experimental results of more than ten reactive systems. For each reaction, different substrates such as oxidized graphite and compact amorphous water ice were used. We derived a formula for reproducing the efficiencies of the chemical desorption process that considers the equipartition of the energy of newly formed products, followed by classical bounce on the surface. In part II of this study we extend these results to astrophysical conditions. Results: The equipartition of energy correctly describes the chemical desorption process on bare surfaces. On icy surfaces, the chemical desorption process is much less efficient, and a better description of the interaction with the surface is still needed. Conclusions: We show that the mechanism that directly transforms solid species into gas phase species is efficient for many reactions.

  19. Common ecology quantifies human insurgency.

    PubMed

    Bohorquez, Juan Camilo; Gourley, Sean; Dixon, Alexander R; Spagat, Michael; Johnson, Neil F

    2009-12-17

    Many collective human activities, including violence, have been shown to exhibit universal patterns. The size distributions of casualties both in whole wars from 1816 to 1980 and terrorist attacks have separately been shown to follow approximate power-law distributions. However, the possibility of universal patterns ranging across wars in the size distribution or timing of within-conflict events has barely been explored. Here we show that the sizes and timing of violent events within different insurgent conflicts exhibit remarkable similarities. We propose a unified model of human insurgency that reproduces these commonalities, and explains conflict-specific variations quantitatively in terms of underlying rules of engagement. Our model treats each insurgent population as an ecology of dynamically evolving, self-organized groups following common decision-making processes. Our model is consistent with several recent hypotheses about modern insurgency, is robust to many generalizations, and establishes a quantitative connection between human insurgency, global terrorism and ecology. Its similarity to financial market models provides a surprising link between violent and non-violent forms of human behaviour.
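
    The power-law fitting step has a standard closed-form estimator, sketched below on synthetic event sizes: the maximum-likelihood exponent for a fixed lower cutoff x_min, with its standard error. (The paper reports within-conflict event-size exponents clustering near 2.5.)

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    sizes = (rng.pareto(1.5, 5000) + 1.0) * 10.0   # synthetic event sizes, x_min = 10
    xmin = 10.0
    tail = sizes[sizes >= xmin]

    alpha = 1.0 + tail.size / np.sum(np.log(tail / xmin))  # ML estimate of the exponent
    se = (alpha - 1.0) / np.sqrt(tail.size)                # its standard error
    print(f"alpha = {alpha:.2f} +/- {se:.2f}")
    ```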

  20. Quantifying Barrier Island Recovery Following a Hurricane

    NASA Astrophysics Data System (ADS)

    Hammond, B.; Houser, C.

    2014-12-01

    Barrier islands are dynamic landscapes that are believed to minimize storm impact to mainland communities and also provide important ecological services in the coastal environment. The protection afforded by the island and the services it provides, however, depend on island resiliency in the face of accelerated sea level rise, which is in turn dependent on the rate of island recovery following storm events that may also change in both frequency and magnitude in the future. These changes in frequency may affect even large dunes and their resiliency, resulting in the island transitioning from a high to a low elevation. Previous research has shown that the condition of the foredune depends on the recovery of the nearshore and beach profile and the ability of vegetation to capture aeolian-transported sediment. An inability of the foredune to recover may result in mainland susceptibility to storm energy, inability for ecosystems to recover and thrive, and sediment budget instability. In this study, LiDAR data is used to quantify the rates of dune recovery at Fire Island, NY, the Outer Banks, NC, Santa Rosa Island, FL, and Matagorda Island, TX. Preliminary results indicate foredune recovery varies significantly both alongshore and in the cross-shore, suggesting that barrier island response and recovery to storm events cannot be considered from a strictly two-dimensional (cross-shore) perspective.

  1. Quantifying global international migration flows.

    PubMed

    Abel, Guy J; Sander, Nikola

    2014-03-28

    Widely available data on the number of people living outside of their country of birth do not adequately capture contemporary intensities and patterns of global migration flows. We present data on bilateral flows between 196 countries from 1990 through 2010 that provide a comprehensive view of international migration flows. Our data suggest a stable intensity of global 5-year migration flows at ~0.6% of world population since 1995. In addition, the results aid the interpretation of trends and patterns of migration flows to and from individual countries by placing them in a regional or global context. We estimate the largest movements to occur between South and West Asia, from Latin to North America, and within Africa.

  2. Diagnostic measure to quantify loss of clinical components in multi-lead electrocardiogram

    PubMed Central

    Sharma, L.N.; Dandapat, S.

    2016-01-01

    In this Letter, a novel principal component (PC)-based diagnostic measure (PCDM) is proposed to quantify loss of clinical components in multi-lead electrocardiogram (MECG) signals. The analysis of MECG shows that the clinical components are captured in a few PCs. The proposed diagnostic measure is defined as the sum of weighted percentage root mean square differences (PRD) between the PCs of the original and processed MECG signals. The values of the weights depend on the clinical importance of the PCs. The PCDM is tested on MECG enhancement and a novel MECG data reduction scheme. The proposed measure is compared with weighted diagnostic distortion, wavelet energy diagnostic distortion and PRD. The qualitative evaluation is performed using the Spearman rank-order correlation coefficient (SROCC) and the Pearson linear correlation coefficient. The simulation results demonstrate that the PCDM performs better at quantifying loss of clinical components in MECG, showing a SROCC value of 0.9686 with the subjective measure.
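
    A hedged sketch of the measure's structure: project the original and processed MECG onto the principal components of the original, compute a percentage root mean square difference (PRD) per component, and form a weighted sum. The weights below are illustrative; the Letter's weighting by clinical importance is not reproduced.

    ```python
    import numpy as np

    def pcdm(orig, proc, weights):
        X = orig - orig.mean(axis=0)                   # samples x leads, centered
        _, _, Vt = np.linalg.svd(X, full_matrices=False)
        pcs_o = X @ Vt.T                               # PCs of the original signal
        pcs_p = (proc - orig.mean(axis=0)) @ Vt.T      # processed signal, same basis
        prd = 100.0 * np.sqrt(((pcs_o - pcs_p) ** 2).sum(axis=0) /
                              (pcs_o ** 2).sum(axis=0))
        return float(np.dot(weights, prd))             # weighted sum over PCs

    rng = np.random.default_rng(0)
    ecg = rng.standard_normal((2000, 12)) @ rng.standard_normal((12, 12))  # stand-in MECG
    processed = ecg + 0.05 * rng.standard_normal(ecg.shape)   # e.g. after compression
    w = np.array([0.4, 0.2, 0.1] + [0.3 / 9] * 9)             # heavier weight on leading PCs
    print("PCDM =", round(pcdm(ecg, processed, w), 2))
    ```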

  3. Quantifying Antimicrobial Resistance at Veal Calf Farms

    PubMed Central

    Bosman, Angela B.; Wagenaar, Jaap; Stegeman, Arjan; Vernooij, Hans; Mevius, Dik

    2012-01-01

    This study was performed to determine a sampling strategy to quantify the prevalence of antimicrobial resistance on veal calf farms, based on the variation in antimicrobial resistance within and between calves on five farms. Faecal samples from 50 healthy calves (10 calves/farm) were collected. From each individual sample and one pooled faecal sample per farm, 90 selected Escherichia coli isolates were tested for their resistance against 25 mg/L amoxicillin, 25 mg/L tetracycline, 0.5 mg/L cefotaxime, 0.125 mg/L ciprofloxacin and 8/152 mg/L trimethoprim/sulfamethoxazole (tmp/s) by replica plating. From each faecal sample another 10 selected E. coli isolates were tested for their resistance by broth microdilution as a reference. Logistic regression analysis was performed to compare the odds of testing an isolate resistant between both test methods (replica plating vs. broth microdilution) and to evaluate the effect of pooling faecal samples. Bootstrap analysis was used to investigate the precision of the estimated prevalence of resistance to each antimicrobial obtained by several simulated sampling strategies. Replica plating showed similar odds of E. coli isolates tested resistant compared to broth microdilution, except for ciprofloxacin (OR 0.29, p≤0.05). Pooled samples showed in general lower odds of an isolate being resistant compared to individual samples, although these differences were not significant. Bootstrap analysis showed that within each antimicrobial the various compositions of a pooled sample provided consistent estimates for the mean proportion of resistant isolates. Sampling strategies should be based on the variation in resistance among isolates within faecal samples and between faecal samples, which may vary by antimicrobial. In our study, the optimal sampling strategy from the perspective of precision of the estimated levels of resistance and practicality consists of a pooled faecal sample from 20 individual animals, of which 90 isolates are

  4. Quantifying tetrodotoxin levels in the California newt using a non-destructive sampling method.

    PubMed

    Bucciarelli, Gary M; Li, Amy; Zimmer, Richard K; Kats, Lee B; Green, David B

    2014-03-01

    Toxic or noxious substances often serve as a means of chemical defense for numerous taxa. However, such compounds may also facilitate ecological or evolutionary processes. The neurotoxin, tetrodotoxin (TTX), which is found in newts of the genus Taricha, acts as a selection pressure upon predatory garter snakes, is a chemical cue to conspecific larvae, which elicits antipredator behavior, and may also affect macroinvertebrate foraging behavior. To understand selection patterns and how potential variation might affect ecological and evolutionary processes, it is necessary to quantify TTX levels within individuals and populations. To do so has often required that animals be destructively sampled or removed from breeding habitats and brought into the laboratory. Here we demonstrate a non-destructive method of sampling adult Taricha that obviates the need to capture and collect individuals. We also show that embryos from oviposited California newt (Taricha torosa) egg masses can be individually sampled and TTX quantified from embryos. We employed three different extraction techniques to isolate TTX. Using a custom fabricated high performance liquid chromatography (HPLC) system we quantified recovery of TTX. We found that a newly developed micro-extraction technique significantly improved recovery compared to previously used methods. Results also indicate our improvements to the HPLC method have high repeatability and increased sensitivity, with a detection limit of 48 pg (0.15 pmol) TTX. The quantified amounts of TTX in adult newts suggest fine geographic variation in toxin levels between sampling localities isolated by as little as 3 km.

  5. Asteroid Geophysics and Quantifying the Impact Hazard

    NASA Technical Reports Server (NTRS)

    Sears, D.; Wooden, D. H.; Korycanksy, D. G.

    2015-01-01

    Probably the major challenge in understanding, quantifying, and mitigating the effects of an impact on Earth is understanding the nature of the impactor. Of the roughly 25 meteorite craters on the Earth that have associated meteorites, all but one were produced by iron meteorites and only one was produced by a stony meteorite. Equally important, even meteorites of a given chemical class produce a wide variety of behavior in the atmosphere. This is because they show considerable diversity in their mechanical properties, which have a profound influence on the behavior of meteorites during atmospheric passage. Some stony meteorites are weak and do not reach the surface or reach the surface as thousands of relatively harmless pieces. Some stony meteorites roll into a maximum-drag configuration and are strong enough to remain intact so a large single object reaches the surface. Others have high concentrations of water that may facilitate disruption. However, while meteorite falls and meteorites provide invaluable information on the physical nature of the objects entering the atmosphere, there are many unknowns concerning size and scale that can only be determined from the pre-atmospheric properties of the asteroids. Their internal structure, their thermal properties, and their internal strength and composition will all play a role in determining the behavior of the object as it passes through the atmosphere, whether it produces an airblast and at what height, and the nature of the impact and the amount and distribution of ejecta.

  6. The missing metric: quantifying contributions of reviewers

    PubMed Central

    Cantor, Maurício; Gero, Shane

    2015-01-01

    The number of contributing reviewers often outnumbers the authors of publications. This has led to apathy towards reviewing and the conclusion that the peer-review system is broken. Given the trade-offs between submitting and reviewing manuscripts, reviewers and authors naturally want visibility for their efforts. While study after study has called for revolutionizing publication practices, the current paradigm does not recognize reviewers' time and expertise. We propose the R-index as a simple way to quantify scientists' contributions as reviewers. We modelled its performance using simulations based on real data to show that early–mid career scientists, who complete high-quality reviews of longer manuscripts within their field, can perform as well as leading scientists reviewing only for high-impact journals. By giving citeable academic recognition for reviewing, R-index will encourage more participation with better reviews, regardless of the career stage. Moreover, the R-index will allow editors to exploit scores to manage and improve their review team, and for journals to promote high average scores as signals of a practical and efficient service to authors. Peer-review is a pervasive necessity across disciplines and the simple utility of this missing metric will credit a valuable aspect of academic productivity without having to revolutionize the current peer-review system. PMID:26064609

  7. Quantifying Flaw Characteristics from IR NDE Data

    SciTech Connect

    Miller, W; Philips, N R; Burke, M W; Robbins, C L

    2003-02-14

    Work is presented which allows flaw characteristics to be quantified from the transient IR NDE signature. The goal of this effort was to accurately determine the type, size and depth of flaws revealed with IR NDE, using sonic IR as the example IR NDE technique. Typically an IR NDE experiment will result in a positive qualitative indication of a flaw such as a cold or hot spot in the image, but will not provide quantitative data thereby leaving the practitioner to make educated guesses as to the source of the signal. The technique presented here relies on comparing the transient IR signature to exact heat transfer analytical results for prototypical flaws, using the flaw characteristics as unknown fitting parameters. A nonlinear least squares algorithm is used to evaluate the fitting parameters, which then provide a direct measure of the flaw characteristics that can be mapped to the imaged surface for visual reference. The method uses temperature data for the heat transfer analysis, so radiometric calibration of the IR signal is required. The method provides quantitative data with a single thermal event (e.g. acoustic pulse or flash), as compared to phase-lock techniques that require many events. The work has been tested with numerical data but remains to be validated by experimental data, and that effort is underway.
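
    As a rough illustration of the fitting idea, a minimal sketch in Python, assuming a classic one-dimensional flash-thermography series solution as the "exact analytical result"; the diffusivity, noise level, and model form are illustrative assumptions, not the record's actual prototypical-flaw models or radiometric calibration:

        import numpy as np
        from scipy.optimize import curve_fit

        ALPHA = 1.0e-5  # assumed thermal diffusivity (m^2/s); material-specific

        def surface_temp(t, amplitude, depth, refl):
            # 1-D thermal-wave series: 1/sqrt(t) cooling of a semi-infinite body,
            # modified by reflections from an interface buried at `depth`
            # with reflection coefficient `refl`.
            series = np.ones_like(t)
            for n in range(1, 21):
                series = series + 2.0 * refl**n * np.exp(-(n * depth) ** 2 / (ALPHA * t))
            return amplitude / np.sqrt(t) * series

        rng = np.random.default_rng(0)
        t = np.linspace(0.01, 5.0, 200)                # time after the flash (s)
        data = surface_temp(t, 1.0, 1.5e-3, 0.5)       # synthetic flaw 1.5 mm deep
        data = data + rng.normal(0.0, 0.05, t.size)    # measurement noise

        # nonlinear least squares with flaw characteristics as fitting parameters
        popt, _ = curve_fit(surface_temp, t, data, p0=[1.0, 1.0e-3, 0.3])
        print("fitted flaw depth: %.2f mm" % (popt[1] * 1e3))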

  8. Quantifying drug-protein binding in vivo.

    SciTech Connect

    Buchholz, B; Bench, G; Keating III, G; Palmblad, M; Vogel, J; Grant, P G; Hillegonds, D

    2004-02-17

    Accelerator mass spectrometry (AMS) provides precise quantitation of isotope-labeled compounds that are bound to biological macromolecules such as DNA or proteins. The sensitivity is high enough to allow for sub-pharmacological ("micro-") dosing to determine macromolecular targets without inducing toxicities or altering the system under study, whether it is healthy or diseased. We demonstrated an application of AMS in quantifying the physiologic effects of one dosed chemical compound upon the binding level of another compound in vivo at sub-toxic doses [4]. We are using tissues left from this study to develop protocols for quantifying specific binding to isolated and identified proteins. We also developed a new technique to quantify nanogram to milligram amounts of isolated protein at precisions that are comparable to those for quantifying the bound compound by AMS.

  9. Quantifying Climatological Ranges and Anomalies for Pacific Coral Reef Ecosystems

    PubMed Central

    Gove, Jamison M.; Williams, Gareth J.; McManus, Margaret A.; Heron, Scott F.; Sandin, Stuart A.; Vetter, Oliver J.; Foley, David G.

    2013-01-01

    Coral reef ecosystems are exposed to a range of environmental forcings that vary on daily to decadal time scales and across spatial scales spanning from reefs to archipelagos. Environmental variability is a major determinant of reef ecosystem structure and function, including coral reef extent and growth rates, and the abundance, diversity, and morphology of reef organisms. Proper characterization of environmental forcings on coral reef ecosystems is critical if we are to understand the dynamics and implications of abiotic–biotic interactions on reef ecosystems. This study combines high-resolution bathymetric information with remotely sensed sea surface temperature, chlorophyll-a and irradiance data, and modeled wave data to quantify environmental forcings on coral reefs. We present a methodological approach to develop spatially constrained, island- and atoll-scale metrics that quantify climatological range limits and anomalous environmental forcings across U.S. Pacific coral reef ecosystems. Our results indicate considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls, with emergent spatial patterns specific to each environmental forcing. For example, wave energy was greatest at northern latitudes and generally decreased with decreasing latitude. In contrast, chlorophyll-a was greatest at reef ecosystems proximate to the equator and northern-most locations, showing little synchrony with latitude. In addition, we find that the reef ecosystems with the highest chlorophyll-a concentrations (Jarvis, Howland, Baker, Palmyra and Kingman) are each uninhabited and are characterized by high hard coral cover and large numbers of predatory fishes. Finally, we find that scaling environmental data to the spatial footprint of individual islands and atolls is more likely to capture local environmental forcings, as chlorophyll-a concentrations decreased at relatively short distances (>7 km) from 85% of our study locations. These metrics will

  10. Quantifying climatological ranges and anomalies for Pacific coral reef ecosystems.

    PubMed

    Gove, Jamison M; Williams, Gareth J; McManus, Margaret A; Heron, Scott F; Sandin, Stuart A; Vetter, Oliver J; Foley, David G

    2013-01-01

    Coral reef ecosystems are exposed to a range of environmental forcings that vary on daily to decadal time scales and across spatial scales spanning from reefs to archipelagos. Environmental variability is a major determinant of reef ecosystem structure and function, including coral reef extent and growth rates, and the abundance, diversity, and morphology of reef organisms. Proper characterization of environmental forcings on coral reef ecosystems is critical if we are to understand the dynamics and implications of abiotic-biotic interactions on reef ecosystems. This study combines high-resolution bathymetric information with remotely sensed sea surface temperature, chlorophyll-a and irradiance data, and modeled wave data to quantify environmental forcings on coral reefs. We present a methodological approach to develop spatially constrained, island- and atoll-scale metrics that quantify climatological range limits and anomalous environmental forcings across U.S. Pacific coral reef ecosystems. Our results indicate considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls, with emergent spatial patterns specific to each environmental forcing. For example, wave energy was greatest at northern latitudes and generally decreased with decreasing latitude. In contrast, chlorophyll-a was greatest at reef ecosystems proximate to the equator and northern-most locations, showing little synchrony with latitude. In addition, we find that the reef ecosystems with the highest chlorophyll-a concentrations (Jarvis, Howland, Baker, Palmyra and Kingman) are each uninhabited and are characterized by high hard coral cover and large numbers of predatory fishes. Finally, we find that scaling environmental data to the spatial footprint of individual islands and atolls is more likely to capture local environmental forcings, as chlorophyll-a concentrations decreased at relatively short distances (>7 km) from 85% of our study locations. These metrics will help

  11. Hey Teacher, Your Personality's Showing!

    ERIC Educational Resources Information Center

    Paulsen, James R.

    1977-01-01

    A study of 30 fourth, fifth, and sixth grade teachers and 300 of their students showed that a teacher's age, sex, and years of experience did not relate to students' mathematics achievement, but that more effective teachers showed greater "freedom from defensive behavior" than did less effective teachers. (DT)

  12. Planning a Successful Tech Show

    ERIC Educational Resources Information Center

    Nikirk, Martin

    2011-01-01

    Tech shows are a great way to introduce prospective students, parents, and local business and industry to a technology and engineering or career and technical education program. In addition to showcasing instructional programs, a tech show allows students to demonstrate their professionalism and skills, practice public presentations, and interact…

  13. Quantifying Urban Groundwater in Environmental Field Observatories

    NASA Astrophysics Data System (ADS)

    Welty, C.; Miller, A. J.; Belt, K.; Smith, J. A.; Band, L. E.; Groffman, P.; Scanlon, T.; Warner, J.; Ryan, R. J.; Yeskis, D.; McGuire, M. P.

    2006-12-01

    Despite the growing footprint of urban landscapes and their impacts on hydrologic and biogeochemical cycles, comprehensive field studies of urban water budgets are few. The cumulative effects of urban infrastructure (buildings, roads, culverts, storm drains, detention ponds, leaking water supply and wastewater pipe networks) on temporal and spatial patterns of groundwater stores, fluxes, and flowpaths are poorly understood. The goal of this project is to develop expertise and analytical tools for urban groundwater systems that will inform future environmental observatory planning and that can be shared with research teams working in urban environments elsewhere. The work plan for this project draws on a robust set of information resources in Maryland provided by ongoing monitoring efforts of the Baltimore Ecosystem Study (BES), USGS, and the U.S. Forest Service working together with university scientists and engineers from multiple institutions. A key concern is to bridge the gap between small-scale intensive field studies and larger-scale and longer-term hydrologic patterns using synoptic field surveys, remote sensing, numerical modeling, data mining and visualization tools. Using the urban water budget as a unifying theme, we are working toward estimating the various elements of the budget in order to quantify the influence of urban infrastructure on groundwater. Efforts include: (1) comparison of base flow behavior from stream gauges in a nested set of watersheds at four different spatial scales from 0.8 to 171 km2, with diverse patterns of impervious cover and urban infrastructure; (2) synoptic survey of well water levels to characterize the regional water table; (3) use of airborne thermal infrared imagery to identify locations of groundwater seepage into streams across a range of urban development patterns; (4) use of seepage transects and tracer tests to quantify the spatial pattern of groundwater fluxes to the drainage network in selected subwatersheds; (5

  14. Quantifying aggregation dynamics during Myxococcus xanthus development.

    PubMed

    Zhang, Haiyang; Angus, Stuart; Tran, Michael; Xie, Chunyan; Igoshin, Oleg A; Welch, Roy D

    2011-10-01

    Under starvation conditions, a swarm of Myxococcus xanthus cells will undergo development, a multicellular process culminating in the formation of many aggregates called fruiting bodies, each of which contains up to 100,000 spores. The mechanics of symmetry breaking and the self-organization of cells into fruiting bodies is an active area of research. Here we use microcinematography and automated image processing to quantify several transient features of developmental dynamics. An analysis of experimental data indicates that aggregation reaches its steady state in a highly nonmonotonic fashion. The number of aggregates rapidly peaks at a value 2- to 3-fold higher than the final value and then decreases before reaching a steady state. The time dependence of aggregate size is also nonmonotonic, but to a lesser extent: average aggregate size increases from the onset of aggregation to between 10 and 15 h and then gradually decreases thereafter. During this process, the distribution of aggregates transitions from a nearly random state early in development to a more ordered state later in development. A comparison of experimental results to a mathematical model based on the traffic jam hypothesis indicates that the model fails to reproduce these dynamic features of aggregation, even though it accurately describes its final outcome. The dynamic features of M. xanthus aggregation uncovered in this study impose severe constraints on its underlying mechanisms.

  15. Identifying and quantifying urban recharge: a review

    NASA Astrophysics Data System (ADS)

    Lerner, David N.

    2002-02-01

    The sources of and pathways for groundwater recharge in urban areas are more numerous and complex than in rural environments. Buildings, roads, and other surface infrastructure combine with man-made drainage networks to change the pathways for precipitation. Some direct recharge is lost, but additional recharge can occur from storm drainage systems. Large amounts of water are imported into most cities for supply, distributed through underground pipes, and collected again in sewers or septic tanks. The leaks from these pipe networks often provide substantial recharge. Sources of recharge in urban areas are identified through piezometry, chemical signatures, and water balances. All three approaches have problems. Recharge is quantified either by individual components (direct recharge, water-mains leakage, septic tanks, etc.) or holistically. Working with individual components requires large amounts of data, much of which is uncertain and is likely to lead to large uncertainties in the final result. Recommended holistic approaches include the use of groundwater modelling and solute balances, where various types of data are integrated. Urban recharge remains an under-researched topic, with few high-quality case studies reported in the literature.
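
    As a small illustration of the component-wise approach and of why its uncertainties compound, a sketch summing hypothetical recharge components with independent errors added in quadrature (all values are invented):

        import math

        # hypothetical annual urban recharge components (mm/yr, mean and 1-sigma)
        components = {
            "direct (precipitation)": (60, 20),
            "water-mains leakage":    (90, 40),
            "sewer/septic leakage":   (30, 25),
            "storm-drain soakaways":  (20, 10),
        }

        total = sum(mean for mean, _ in components.values())
        # independent errors combine in quadrature
        sigma = math.sqrt(sum(s**2 for _, s in components.values()))
        print(f"total recharge ~ {total} +/- {sigma:.0f} mm/yr")

    Even with modest uncertainty on each component, the combined uncertainty here is roughly a quarter of the total, which is the motivation the review gives for holistic approaches that integrate several data types.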

  16. Quantifying Ant Activity Using Vibration Measurements

    PubMed Central

    Oberst, Sebastian; Baro, Enrique Nava; Lai, Joseph C. S.; Evans, Theodore A.

    2014-01-01

    Ant behaviour is of great interest due to their sociality. Ant behaviour is typically observed visually; however, there are many circumstances where visual observation is not possible. It may be possible to assess ant behaviour using vibration signals produced by their physical movement. We demonstrate through a series of bioassays with different stimuli that the level of activity of meat ants (Iridomyrmex purpureus) can be quantified using vibrations, corresponding to observations with video. We found that ants exposed to physical shaking produced the highest average vibration amplitudes, followed by ants with stones to drag, then ants with neighbours, illuminated ants and ants in darkness. In addition, we devised a novel method based on wavelet decomposition to separate the vibration signal owing to the initial ant behaviour from the substrate response, which will allow signals recorded from different substrates to be compared directly. Our results indicate the potential to use vibration signals to classify some ant behaviours in situations where visual observation could be difficult. PMID:24658467
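
    As a rough sketch of the separation idea (the wavelet family, decomposition level, and the split between "substrate" and "ant" scales below are illustrative assumptions, not the authors' published scheme; the PyWavelets package is assumed):

        import numpy as np
        import pywt  # PyWavelets

        fs = 1000.0                                  # assumed sampling rate (Hz)
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(0)
        substrate = 0.5 * np.sin(2 * np.pi * 2 * t)  # slow substrate response
        pulses = (rng.random(t.size) > 0.999) * rng.standard_normal(t.size)
        signal = substrate + pulses                  # synthetic recording

        # multilevel discrete wavelet transform; coeffs = [cA6, cD6, ..., cD1]
        coeffs = pywt.wavedec(signal, "db4", level=6)
        # zero the approximation and coarsest details (substrate response),
        # keeping only fine-scale details attributed here to ant movement
        for i in range(0, 4):
            coeffs[i] = np.zeros_like(coeffs[i])
        ant_component = pywt.waverec(coeffs, "db4")[: t.size]
        print("RMS of ant-activity component:",
              np.sqrt(np.mean(ant_component**2)))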

  17. Quantifying ant activity using vibration measurements.

    PubMed

    Oberst, Sebastian; Baro, Enrique Nava; Lai, Joseph C S; Evans, Theodore A

    2014-01-01

    Ant behaviour is of great interest due to their sociality. Ant behaviour is typically observed visually; however, there are many circumstances where visual observation is not possible. It may be possible to assess ant behaviour using vibration signals produced by their physical movement. We demonstrate through a series of bioassays with different stimuli that the level of activity of meat ants (Iridomyrmex purpureus) can be quantified using vibrations, corresponding to observations with video. We found that ants exposed to physical shaking produced the highest average vibration amplitudes, followed by ants with stones to drag, then ants with neighbours, illuminated ants and ants in darkness. In addition, we devised a novel method based on wavelet decomposition to separate the vibration signal owing to the initial ant behaviour from the substrate response, which will allow signals recorded from different substrates to be compared directly. Our results indicate the potential to use vibration signals to classify some ant behaviours in situations where visual observation could be difficult.

  18. Quantifying acoustic damping using flame chemiluminescence

    NASA Astrophysics Data System (ADS)

    Boujo, E.; Denisov, A.; Schuermans, B.; Noiray, N.

    2016-12-01

    Thermoacoustic instabilities in gas turbines and aeroengine combustors fall within the category of complex systems. They can be described phenomenologically using nonlinear stochastic differential equations, which constitute the grounds for output-only model-based system identification. It has been shown recently that one can extract the governing parameters of the instabilities, namely the linear growth rate and the nonlinear component of the thermoacoustic feedback, using dynamic pressure time series only. This is highly relevant for practical systems, which cannot be actively controlled due to a lack of cost-effective actuators. The thermoacoustic stability is given by the linear growth rate, which results from the combination of the acoustic damping and the coherent feedback from the flame. In this paper, it is shown that it is possible to quantify the acoustic damping of the system, and thus to separate its contribution to the linear growth rate from that of the flame. This is achieved by simple post-processing of simultaneously acquired chemiluminescence and acoustic pressure data. It provides an additional approach to further unravel, from observed time series, the key mechanisms governing the system dynamics. This straightforward method is illustrated here using experimental data from a combustion chamber operated at several linearly stable and unstable operating conditions.
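
    In outline, the output-only identification framework referenced here models the acoustic amplitude A with a stochastically forced oscillator; in a common notation (an assumption, not necessarily the paper's):

        \[
        \dot{A} = \nu A - \frac{\kappa}{8}A^{3} + \text{stochastic forcing},
        \qquad \nu = \beta - \alpha,
        \]

    so an independent estimate of the acoustic damping \(\alpha\), obtained from the joint chemiluminescence-pressure post-processing, separates the coherent flame gain \(\beta\) from the identified linear growth rate \(\nu\).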

  19. Quantifying the limits of fingerprint variability.

    PubMed

    Fagert, Michael; Morris, Keith

    2015-09-01

    The comparison and identification of fingerprints are made difficult by fingerprint variability arising from distortion. This study seeks to quantify both the limits of fingerprint variability when subject to heavy distortion, and the variability observed in repeated inked planar impressions. A total of 30 fingers were studied: 10 right slant loops, 10 plain whorls, and 10 plain arches. Fingers were video recorded performing several distortion movements under heavy deposition pressure: left, right, up, and down translation of the finger, clockwise and counter-clockwise torque of the finger, and planar impressions. Fingerprint templates, containing 'true' minutiae locations, were created for each finger using 10 repeated inked planar impressions. A minimal amount of variability, 0.18 mm globally, was observed for minutiae in repeated inked planar impressions. When subject to heavy distortion, minutiae can be displaced by upwards of 3 mm and their orientation altered by as much as 30° relative to their template positions. Minutiae displacements of 1 mm and 10° changes in orientation are readily observed. The results of this study will allow fingerprint examiners to identify and understand the degree of variability that can be reasonably expected throughout the various regions of fingerprints.

  20. Quantifying Potential Groundwater Recharge In South Texas

    NASA Astrophysics Data System (ADS)

    Basant, S.; Zhou, Y.; Leite, P. A.; Wilcox, B. P.

    2015-12-01

    Groundwater in South Texas is heavily relied on for human consumption and irrigation of food crops. As in most of the southwestern US, woody encroachment has altered the grassland ecosystems here too. While brush removal has been widely implemented in Texas with the objective of increasing groundwater recharge, the linkage between vegetation and groundwater recharge in South Texas is still unclear. Studies have been conducted to understand plant-root-water dynamics at the scale of individual plants. However, little work has been done to quantify the changes in soil water and deep percolation at the landscape scale. Modeling water flow through soil profiles can provide an estimate of the total water flowing into deep percolation. These models are especially powerful when parameterized and calibrated with long-term soil water data. In this study we parameterize the HYDRUS soil water model using long-term soil water data collected in Jim Wells County in South Texas. Soil water was measured at 20 cm intervals to a depth of 200 cm. The parameterized model will be used to simulate soil water dynamics under a variety of precipitation regimes ranging from well above normal to severe drought conditions. The results from the model will be compared with the changes in soil moisture profiles observed in response to vegetation cover and treatments from a study in a similar setting. Comparative studies like this can be used to build new and strengthen existing hypotheses regarding deep percolation and the role of soil texture and vegetation in groundwater recharge.

  1. Quantifying Wrinkle Features of Thin Membrane Structures

    NASA Technical Reports Server (NTRS)

    Jacobson, Mindy B.; Iwasa, Takashi; Naton, M. C.

    2004-01-01

    For future micro-systems utilizing membrane based structures, quantified predictions of wrinkling behavior in terms of amplitude, angle and wavelength are needed to optimize the efficiency and integrity of such structures, as well as their associated control systems. For numerical analyses performed in the past, limitations on the accuracy of membrane distortion simulations have often been related to the assumptions made. This work demonstrates that critical assumptions include: effects of gravity, supposed initial or boundary conditions, and the type of element used to model the membrane. In this work, a 0.2 m x 0.2 m membrane is treated as a structural material with non-negligible bending stiffness. Finite element modeling is used to simulate wrinkling behavior due to a constant applied in-plane shear load. Membrane thickness, gravity effects, and initial imperfections with respect to flatness were varied in numerous nonlinear analysis cases. Significant findings include notable variations in wrinkle modes for thicknesses in the range of 50 microns to 1000 microns, which also depend on the presence of an applied gravity field. However, it is revealed that relationships between overall strain energy density and thickness for cases with differing initial conditions are independent of the assumed initial conditions. In addition, analysis results indicate that the relationship between wrinkle amplitude scale (W/t) and structural scale (L/t) is independent of the nonlinear relationship between thickness and stiffness.

  2. Quantifying capital goods for biological treatment of organic waste.

    PubMed

    Brogaard, Line K; Petersen, Per H; Nielsen, Peter D; Christensen, Thomas H

    2015-02-01

    Materials and energy used for construction of anaerobic digestion (AD) and windrow composting plants were quantified in detail. The two technologies were quantified in collaboration with consultants and producers of the parts used to construct the plants. The composting plants were quantified based on the different sizes for the three different types of waste (garden and park waste, food waste and sludge from wastewater treatment) in amounts of 10,000 or 50,000 tonnes per year. The AD plant was quantified for a capacity of 80,000 tonnes per year. Concrete and steel for the tanks were the main materials for the AD plant. For the composting plants, gravel and concrete slabs for the pavement were used in large amounts. To frame the quantification, environmental impact assessments (EIAs) showed that the steel used for tanks at the AD plant and the concrete slabs at the composting plants made the highest contribution to Global Warming. The total impact on Global Warming from the capital goods compared to the operation reported in the literature on the AD plant showed an insignificant contribution of 1-2%. For the composting plants, the capital goods accounted for 10-22% of the total impact on Global Warming from composting.

  3. Portable XRF Technology to Quantify Pb in Bone In Vivo

    PubMed Central

    Specht, Aaron James; Weisskopf, Marc; Nie, Linda Huiling

    2014-01-01

    Lead is a ubiquitous toxicant. Bone lead has been established as an important biomarker for cumulative lead exposures and has been correlated with adverse health effects on many systems in the body. K-shell X-ray fluorescence (KXRF) is the standard method for measuring bone lead, but this approach has many difficulties that have limited the widespread use of this exposure assessment method. With recent advancements in X-ray fluorescence (XRF) technology, we have developed a portable system that can quantify lead in bone in vivo within 3 minutes. Our study investigated improvements to the system, four calibration methods, and system validation for in vivo measurements. Our main results show that the detection limit of the system is 2.9 ppm with 2 mm soft tissue thickness, the best calibration method for in vivo measurement is background subtraction, and there is strong correlation between KXRF and portable LXRF bone lead results. Our results indicate that the technology is ready to be used in large human population studies to investigate adverse health effects of lead exposure. The portability of the system and fast measurement time should allow for this technology to greatly advance the research on lead exposure and public/environmental health. PMID:26317033

  4. Satellite Animation Shows California Storms

    NASA Video Gallery

    This animation of visible and infrared imagery from NOAA's GOES-West satellite shows a series of moisture-laden storms affecting California from Jan. 6 through Jan. 9, 2017. TRT: 00:36 Credit: NASA...

  5. Satellite Movie Shows Erika Dissipate

    NASA Video Gallery

    This animation of visible and infrared imagery from NOAA's GOES-West satellite from Aug. 27 to 29 shows Tropical Storm Erika move through the Eastern Caribbean Sea and dissipate near eastern Cuba. ...

  6. Quantifying tissue viscoelasticity using optical coherence elastography and the Rayleigh wave model

    NASA Astrophysics Data System (ADS)

    Han, Zhaolong; Singh, Manmohan; Aglyamov, Salavat R.; Liu, Chih-Hao; Nair, Achuth; Raghunathan, Raksha; Wu, Chen; Li, Jiasong; Larin, Kirill V.

    2016-09-01

    This study demonstrates the feasibility of using the Rayleigh wave model (RWM) in combination with the optical coherence elastography (OCE) technique to assess the viscoelasticity of soft tissues. Dispersion curves calculated from the spectral decomposition of OCE-measured air-pulse-induced elastic waves were used to quantify the viscoelasticity of samples using the RWM. Validation studies were first conducted on 10% gelatin phantoms with different concentrations of oil. The results showed that the oil increased the viscosity of the gelatin phantom samples. This method was then used to quantify the viscoelasticity of chicken liver. The Young's modulus of the chicken liver tissues was estimated as E=2.04±0.88 kPa with a shear viscosity η=1.20±0.13 Pa s. The analytical solution of the RWM correlated very well with the OCE-measured phase velocities (R2=0.96±0.04). The results show that the combination of the RWM and OCE is a promising method for noninvasively quantifying the biomechanical properties of soft tissues and may be a useful tool for detecting disease.
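
    In outline, and under the usual assumptions for this kind of model (nearly incompressible tissue with Poisson ratio near 0.5 and Kelvin-Voigt viscoelasticity; these conventions are assumptions, not quoted from the record), the quantities fitted are:

        \[
        c_{s}(\omega)=\sqrt{\frac{2\,(\mu^{2}+\omega^{2}\eta^{2})}
        {\rho\left(\mu+\sqrt{\mu^{2}+\omega^{2}\eta^{2}}\right)}},
        \qquad c_{R}\approx 0.955\,c_{s}, \qquad E\approx 3\mu,
        \]

    so fitting the measured dispersion curve \(c_R(\omega)\) yields the shear modulus \(\mu\) (hence Young's modulus \(E\)) and the shear viscosity \(\eta\).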

  7. Quantifying Uncertainties in Land-Surface Microwave Emissivity Retrievals

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko

    2013-01-01

    Uncertainties in the retrievals of microwave land-surface emissivities are quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including the Special Sensor Microwave Imager, the Tropical Rainfall Measuring Mission Microwave Imager, and the Advanced Microwave Scanning Radiometer for Earth Observing System, are studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land-surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally, these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1%-4% (3-12 K) over desert and 1%-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5%-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.

  8. Quantifying uncertainty in the phylogenetics of Australian numeral systems.

    PubMed

    Zhou, Kevin; Bowern, Claire

    2015-09-22

    Researchers have long been interested in the evolution of culture and the ways in which change in cultural systems can be reconstructed and tracked. Within the realm of language, these questions are increasingly investigated with Bayesian phylogenetic methods. However, such work in cultural phylogenetics could be improved by more explicit quantification of reconstruction and transition probabilities. We apply such methods to numerals in the languages of Australia. As a large phylogeny with almost universal 'low-limit' systems, Australian languages are ideal for investigating numeral change over time. We reconstruct the most likely extent of the system at the root and use that information to explore the ways numerals evolve. We show that these systems do not increment serially, but most commonly vary their upper limits between 3 and 5. While there is evidence for rapid system elaboration beyond the lower limits, languages lose numerals as well as gain them. We investigate the ways larger numerals build on smaller bases, and show that there is a general tendency to both gain and replace 4 by combining 2 + 2 (rather than inventing a new unanalysable word 'four'). We develop a series of methods for quantifying and visualizing the results.

  9. Quantifying dynamical spillover in co-evolving multiplex networks

    PubMed Central

    Vijayaraghavan, Vikram S.; Noël, Pierre-André; Maoz, Zeev; D’Souza, Raissa M.

    2015-01-01

    Multiplex networks (a system of multiple networks that have different types of links but share a common set of nodes) arise naturally in a wide spectrum of fields. Theoretical studies show that in such multiplex networks, correlated edge dynamics between the layers can have a profound effect on dynamical processes. However, how to extract the correlations from real-world systems is an outstanding challenge. Here we introduce the Multiplex Markov chain to quantify correlations in edge dynamics found in longitudinal data of multiplex networks. By comparing the results obtained from the multiplex perspective to a null model which assumes layers in a network are independent, we can identify real correlations as distinct from simultaneous changes that occur due to random chance. We use this approach on two different data sets: the network of trade and alliances between nation states, and the email and co-commit networks between developers of open source software. We establish the existence of “dynamical spillover” showing the correlated formation (or deletion) of edges of different types as the system evolves. The details of the dynamics over time provide insight into potential causal pathways. PMID:26459949
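
    A toy sketch of the comparison underlying such an analysis, assuming simple 0/1 edge snapshots over time (the joint-state encoding and the independent-layers null below follow the general idea of testing joint transitions against independence, not the paper's exact estimator):

        import numpy as np

        rng = np.random.default_rng(0)
        # longitudinal snapshots of two layers over the same node pairs:
        # shape (timesteps, n_edges), entries 1 = edge present (toy data)
        layer_a = rng.integers(0, 2, size=(50, 200))
        layer_b = rng.integers(0, 2, size=(50, 200))

        # joint state of each node pair: (a, b) in {0,1}^2 encoded as 0..3
        state = 2 * layer_a + layer_b
        counts = np.zeros((4, 4))
        for t in range(state.shape[0] - 1):
            for s_now, s_next in zip(state[t], state[t + 1]):
                counts[s_now, s_next] += 1
        multiplex_P = counts / counts.sum(axis=1, keepdims=True)

        def marginal_chain(layer):
            # per-layer 2x2 transition matrix, ignoring the other layer
            c = np.zeros((2, 2))
            for t in range(layer.shape[0] - 1):
                for s0, s1 in zip(layer[t], layer[t + 1]):
                    c[s0, s1] += 1
            return c / c.sum(axis=1, keepdims=True)

        Pa, Pb = marginal_chain(layer_a), marginal_chain(layer_b)
        null_P = np.kron(Pa, Pb)  # independent layers => product chain
        print("max deviation from independence:",
              np.abs(multiplex_P - null_P).max())

    Deviations of the observed joint transition matrix from the product null are what signal "dynamical spillover" between layers.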

  10. Quantifying Uncertainties in Land Surface Microwave Emissivity Retrievals

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko

    2012-01-01

    Uncertainties in the retrievals of microwave land surface emissivities were quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including SSM/I, TMI and AMSR-E, were studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally, these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1%-4% (3-12 K) over desert and 1%-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5%-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.

  11. Quantifying evolutionary dynamics from variant-frequency time series

    PubMed Central

    Khatri, Bhavin S.

    2016-01-01

    From Kimura’s neutral theory of protein evolution to Hubbell’s neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time-series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher’s angular transformation, which, despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time-series. PMID:27616332
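
    For context, the variance-stabilizing property that makes the angular transformation useful (the 2N drift convention below is one standard choice; haploid and diploid conventions differ):

        \[
        \varphi = 2\arcsin\sqrt{p}, \qquad
        \frac{d\varphi}{dp}=\frac{1}{\sqrt{p(1-p)}}, \qquad
        \operatorname{Var}(\delta\varphi)\approx
        \left(\frac{d\varphi}{dp}\right)^{2}\frac{p(1-p)}{2N}=\frac{1}{2N},
        \]

    where \(\operatorname{Var}(\delta p)\approx p(1-p)/(2N)\) is the per-generation Wright-Fisher drift variance. In the transformed coordinate, drift becomes an approximately constant-coefficient diffusion, which is what makes short-time transition densities tractable.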

  12. Quantifying evolutionary dynamics from variant-frequency time series

    NASA Astrophysics Data System (ADS)

    Khatri, Bhavin S.

    2016-09-01

    From Kimura’s neutral theory of protein evolution to Hubbell’s neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time-series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher’s angular transformation, which, despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time-series.

  13. Quantifying Sentiment and Influence in Blogspaces

    SciTech Connect

    Hui, Peter SY; Gregory, Michelle L.

    2010-07-25

    The weblog, or blog, has become a popular form of social media, through which authors can write posts, which can in turn generate feedback in the form of user comments. When considered in totality, a collection of blogs can thus be viewed as a sort of informal collection of mass sentiment and opinion. An obvious topic of interest might be to mine this collection to obtain some gauge of public sentiment over the wide variety of topics contained therein. However, the sheer size of the so-called blogosphere, combined with the fact that the subjects of posts can vary over a practically limitless number of topics, poses some serious challenges when any meaningful analysis is attempted. Namely, the fact that largely anyone with access to the Internet can author their own blog raises the serious issue of credibility: should some blogs be considered to be more influential than others, and consequently, when gauging sentiment with respect to a topic, should some blogs be weighted more heavily than others? In addition, as new posts and comments can be made on almost a constant basis, any blog analysis algorithm must be able to handle such updates efficiently. In this paper, we give a formalization of the blog model. We give formal methods of quantifying sentiment and influence with respect to a hierarchy of topics, with the specific aim of facilitating the computation of a per-topic, influence-weighted sentiment measure. Finally, as efficiency is a specific end goal, we give upper bounds on the time required to update these values with new posts, showing that our analysis and algorithms are scalable.

  14. Phyllodes tumor showing intraductal growth.

    PubMed

    Makidono, Akari; Tsunoda, Hiroko; Mori, Miki; Yagata, Hiroshi; Onoda, Yui; Kikuchi, Mari; Nozaki, Taiki; Saida, Yukihisa; Nakamura, Seigo; Suzuki, Koyu

    2013-07-01

    Phyllodes tumor of the breast is a rare fibroepithelial lesion and particularly uncommon in adolescent girls. It is thought to arise from the periductal rather than intralobular stroma. Usually, it is seen as a well-defined mass. Phyllodes tumor showing intraductal growth is extremely rare. Here we report a girl who has a phyllodes tumor with intraductal growth.

  15. Shakespeare and other English Renaissance authors as characterized by Information Theory complexity quantifiers

    NASA Astrophysics Data System (ADS)

    Rosso, Osvaldo A.; Craig, Hugh; Moscato, Pablo

    2009-03-01

    We introduce novel Information Theory quantifiers in a computational linguistic study that involves a large corpus of English Renaissance literature. The 185 texts studied (136 plays and 49 poems in total), with first editions ranging from 1580 to 1640, form a representative set of the period. Our data set includes 30 texts unquestionably attributed to Shakespeare; in addition we also included A Lover’s Complaint, a poem which generally appears in Shakespeare's collected editions but whose authorship is currently in dispute. Our statistical complexity quantifiers combine the power of the Jensen-Shannon divergence with the entropy variations as computed from a probability distribution function of the observed word-use frequencies. Our results show, among other things, that for a given entropy poems display higher complexity than plays, that Shakespeare’s work falls into two distinct clusters in entropy, and that his work is remarkable for its homogeneity and for its closeness to overall means.
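
    A toy sketch of the core quantities (word-frequency distributions, their Shannon entropy, and the Jensen-Shannon divergence between them); the study's corpus handling and complexity measure are more elaborate than this:

        from collections import Counter
        import numpy as np
        from scipy.spatial.distance import jensenshannon

        def word_freqs(text, vocab):
            # empirical word-use probability distribution over a shared vocabulary
            counts = Counter(text.lower().split())
            v = np.array([counts[w] for w in vocab], dtype=float)
            return v / v.sum()

        text_a = "to be or not to be that is the question"
        text_b = "shall i compare thee to a summers day thou art more lovely"
        vocab = sorted(set(text_a.split()) | set(text_b.split()))

        p, q = word_freqs(text_a, vocab), word_freqs(text_b, vocab)
        jsd = jensenshannon(p, q, base=2) ** 2   # squared distance = divergence
        entropy_p = -(p[p > 0] * np.log2(p[p > 0])).sum()
        print(f"JS divergence = {jsd:.3f} bits, H(p) = {entropy_p:.3f} bits")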

  16. Integrative modelling of tumour DNA methylation quantifies the contribution of metabolism

    PubMed Central

    Mehrmohamadi, Mahya; Mentch, Lucas K.; Clark, Andrew G.; Locasale, Jason W.

    2016-01-01

    Altered DNA methylation is common in cancer and often considered an early event in tumorigenesis. However, the sources of heterogeneity of DNA methylation among tumours remain poorly defined. Here we capitalize on the availability of multi-platform data on thousands of human tumours to build integrative models of DNA methylation. We quantify the contribution of clinical and molecular factors in explaining intertumoral variability in DNA methylation. We show that the levels of a set of metabolic genes involved in the methionine cycle is predictive of several features of DNA methylation in tumours, including the methylation of cancer genes. Finally, we demonstrate that patients whose DNA methylation can be predicted from the methionine cycle exhibited improved survival over cases where this regulation is disrupted. This study represents a comprehensive analysis of the determinants of methylation and demonstrates the surprisingly large interaction between metabolism and DNA methylation variation. Together, our results quantify links between tumour metabolism and epigenetics and outline clinical implications. PMID:27966532

  17. Quantifying consistent individual differences in habitat selection.

    PubMed

    Leclerc, Martin; Vander Wal, Eric; Zedrosser, Andreas; Swenson, Jon E; Kindberg, Jonas; Pelletier, Fanie

    2016-03-01

    Habitat selection is a fundamental behaviour that links individuals to the resources required for survival and reproduction. Although natural selection acts on an individual's phenotype, research on habitat selection often pools inter-individual patterns to provide inferences at the population scale. Here, we expanded a traditional approach of quantifying habitat selection at the individual level to explore the potential for consistent individual differences in habitat selection. We used random coefficients in resource selection functions (RSFs) and repeatability estimates to test for variability in habitat selection. We applied our method to a detailed dataset of GPS relocations of brown bears (Ursus arctos) taken over a period of 6 years, and assessed whether they displayed repeatable individual differences in habitat selection toward two habitat types: bogs and recent timber-harvest cut blocks. In our analyses, we controlled for the availability of habitat, i.e. the functional response in habitat selection. Repeatability estimates of habitat selection toward bogs and cut blocks were 0.304 and 0.420, respectively. Therefore, 30.4% and 42.0% of the population-scale habitat selection variability for bogs and cut blocks, respectively, was due to differences among individuals, suggesting that consistent individual variation in habitat selection exists in brown bears. Using simulations, we posit that repeatability values of habitat selection are not related to the value and significance of β estimates in RSFs. Although individual differences in habitat selection could be the result of several non-exclusive factors, our results illustrate the evolutionary potential of habitat selection.
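
    The repeatability quoted here is the standard intraclass fraction of variance attributable to individual identity:

        \[
        R=\frac{\sigma^{2}_{\mathrm{ind}}}{\sigma^{2}_{\mathrm{ind}}+\sigma^{2}_{\mathrm{res}}},
        \]

    where \(\sigma^{2}_{\mathrm{ind}}\) is the among-individual variance in selection coefficients and \(\sigma^{2}_{\mathrm{res}}\) the residual variance, so R = 0.304 for bogs means that 30.4% of the variability lies among individuals rather than within them.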

  18. Quantifying indirect evidence in network meta-analysis.

    PubMed

    Noma, Hisashi; Tanaka, Shiro; Matsui, Shigeyuki; Cipriani, Andrea; Furukawa, Toshi A

    2017-03-15

    Network meta-analysis enables comprehensive synthesis of evidence concerning multiple treatments and their simultaneous comparisons based on both direct and indirect evidence. A fundamental pre-requisite of network meta-analysis is the consistency of evidence that is obtained from different sources, particularly whether direct and indirect evidence are in accordance with each other or not, and how they may influence the overall estimates. We have developed an efficient method to quantify indirect evidence, as well as a testing procedure to evaluate their inconsistency using Lindsay's composite likelihood method. We also show that this estimator has complete information for the indirect evidence. Using this method, we can assess the degree of consistency between direct and indirect evidence and their contribution rates to the overall estimate. Sensitivity analyses can be also conducted with this method to assess the influences of potentially inconsistent treatment contrasts on the overall results. These methods can provide useful information for overall comparative results that might be biased from specific inconsistent treatment contrasts. We also provide some fundamental requirements for valid inference on these methods concerning consistency restrictions on multi-arm trials. In addition, the efficiency of the developed method is demonstrated based on simulation studies. Applications to a network meta-analysis of 12 new-generation antidepressants are presented. Copyright © 2016 John Wiley & Sons, Ltd.
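
    For orientation, the simplest building block of indirect evidence is the anchored (Bucher) comparison; the paper's composite-likelihood estimator generalizes this to whole networks:

        \[
        \hat{d}^{\,\mathrm{ind}}_{BC}=\hat{d}^{\,\mathrm{dir}}_{AC}-\hat{d}^{\,\mathrm{dir}}_{AB},
        \qquad
        \operatorname{Var}\!\bigl(\hat{d}^{\,\mathrm{ind}}_{BC}\bigr)
        =\operatorname{Var}\!\bigl(\hat{d}^{\,\mathrm{dir}}_{AC}\bigr)
        +\operatorname{Var}\!\bigl(\hat{d}^{\,\mathrm{dir}}_{AB}\bigr),
        \]

    and inconsistency between direct and indirect estimates of the same contrast can be screened with \(z=(\hat{d}^{\,\mathrm{dir}}-\hat{d}^{\,\mathrm{ind}})/\sqrt{V_{\mathrm{dir}}+V_{\mathrm{ind}}}\).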

  19. Quantifying the provenance of aeolian sediments using multiple composite fingerprints

    NASA Astrophysics Data System (ADS)

    Liu, Benli; Niu, Qinghe; Qu, Jianjun; Zu, Ruiping

    2016-09-01

    We introduce a new fingerprinting method that uses multiple composite fingerprints for studies of aeolian sediment provenance. We used this method to quantify the provenance of sediments on both sides of the Qinghai-Tibetan Railway (QTR) in the Cuona Lake section of the Tibetan Plateau (TP), in an environment characterized by aeolian and fluvial interactions. The method involves repeatedly solving a linear mixing model based on mass conservation; the model is not limited to a particular spatial scale or transport type, and it uses all the tracer groups that pass the range check, the Kruskal-Wallis H-test, and a strict analytical solution screening. The proportional estimates that result from using different composite fingerprints are highly variable; however, their average has greater accuracy and certainty than the estimate from any single composite fingerprint. The results show that sand from the lake beach, hilly surface, and gullies contributes, respectively, 48%, 31% and 21% to the western railway sediments and 43%, 33% and 24% to the eastern railway sediments. The difference between contributions from various sources on either side of the railway, which may increase in the future, was clearly related to variations in local transport characteristics, a conclusion that is supported by grain size analysis. The construction of the QTR changed the local cycling of materials, and the difference in provenance between the sediments that are separated by the railway reflects the changed sedimentary conditions on either side of the railway. The effectiveness of this method suggests that it will be useful in other studies of aeolian sediments.
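
    A minimal sketch of one step of such a mass-conservation mixing model (all tracer values are invented, and the repeated solving over many composite fingerprints, range checks and H-tests are not reproduced):

        import numpy as np
        from scipy.optimize import lsq_linear

        # rows: tracer properties; columns: sources (beach, hillslope, gully)
        A = np.array([
            [12.0,  5.0,  8.0],
            [ 0.8,  2.1,  1.4],
            [30.0, 45.0, 38.0],
        ])
        mixture = np.array([9.5, 1.3, 36.0])   # measured in railway sediment

        # append a sum-to-one constraint as a strongly weighted extra equation,
        # and bound each source proportion to [0, 1]
        W = 1e3
        A_aug = np.vstack([A, W * np.ones(3)])
        b_aug = np.append(mixture, W)

        res = lsq_linear(A_aug, b_aug, bounds=(0.0, 1.0))
        print("source proportions:", np.round(res.x, 3),
              "sum:", res.x.sum().round(3))

    Averaging such solutions across many composite fingerprints is what the method relies on to reduce the variability of any single solution.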

  20. Towards quantifying cochlear implant localization performance in complex acoustic environments.

    PubMed

    Kerber, S; Seeber, B U

    2011-08-01

    Cochlear implant (CI) users frequently report listening difficulties in reverberant and noisy spaces. While it is common to assess speech understanding with implants in background noise, binaural hearing performance has rarely been quantified in the presence of other sources, although the binaural system is a major contributor to the robustness of speech understanding in noisy situations with normal hearing. Here, a pointing task was used to measure horizontal localization ability of a bilateral CI user in quiet and in a continuous diffuse noise interferer at a signal-to-noise ratio of 0 dB. Results were compared to localization performance of six normal hearing listeners. The average localization error of the normal hearing listeners was within normal ranges reported previously and only increased by 1.8° when the interfering noise was introduced. In contrast, the bilateral CI user showed a localization error of 22° in quiet which rose to 31° in noise. This increase was partly due to target sounds being inaudible when presented from frontal locations between -20° and +20°. With the noise present, the implant user was only able to reliably hear target sounds presented from locations well off the median plane. The results give support to the informal complaints raised by CI users and can help to define targets for the design of, e.g., noise reduction algorithms for implant processors.

  1. Quantifying distinct associations on different temporal scales: comparison of DCCA and Pearson methods

    PubMed Central

    Piao, Lin; Fu, Zuntao

    2016-01-01

    Cross-correlation between pairs of variables has a multi-time-scale character and can be entirely different on different time scales (changing, for example, from positive correlation to negative), as in the associations between mean air temperature and relative humidity over regions to the east of the Taihang Mountains in China. Correctly unveiling these correlations on different time scales is therefore of great importance, since we generally do not know in advance whether the correlation varies with scale. Here, we compare two methods, Detrended Cross-Correlation Analysis (DCCA) and Pearson correlation, in quantifying scale-dependent correlations when applied directly to raw observed records and to artificially generated sequences with known cross-correlation features. Our analyses show that (1) DCCA-related methods can indeed quantify scale-dependent correlations, whereas the Pearson method cannot; (2) the correlation features obtained from DCCA-related methods are robust to contaminating noise, whereas the results from the Pearson method are sensitive to noise; and (3) the scale-dependent correlation results from DCCA-related methods are robust to the amplitude ratio between slow and fast components, while the Pearson method may be sensitive to that ratio. All these features indicate that DCCA-related methods have clear advantages in correctly quantifying scale-dependent correlations that result from different physical processes. PMID:27827426
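
    For reference, a compact implementation of the detrended cross-correlation coefficient (the standard rho_DCCA construction; the box sizes, test signals, and DFA-style linear detrending below are illustrative choices, not the authors' exact pipeline):

        import numpy as np

        def dcca_coefficient(x, y, scale):
            """Detrended cross-correlation coefficient at one time scale."""
            X = np.cumsum(x - np.mean(x))      # integrated profiles
            Y = np.cumsum(y - np.mean(y))
            n_boxes = len(X) // scale
            t = np.arange(scale)
            f_xy = f_xx = f_yy = 0.0
            for i in range(n_boxes):
                seg = slice(i * scale, (i + 1) * scale)
                # residuals after removing a local linear trend in each box
                rx = X[seg] - np.polyval(np.polyfit(t, X[seg], 1), t)
                ry = Y[seg] - np.polyval(np.polyfit(t, Y[seg], 1), t)
                f_xy += np.mean(rx * ry)
                f_xx += np.mean(rx * rx)
                f_yy += np.mean(ry * ry)
            return f_xy / np.sqrt(f_xx * f_yy)

        rng = np.random.default_rng(1)
        idx = np.arange(5000)
        slow = np.sin(2 * np.pi * idx / 1000.0)     # shared slow component (-)
        fast = rng.standard_normal(5000)            # shared fast component (+)
        x = fast + slow + 0.2 * rng.standard_normal(5000)
        y = fast - slow + 0.2 * rng.standard_normal(5000)
        # correlation should flip sign between small and large scales
        for s in (10, 250, 1000):
            print(s, round(dcca_coefficient(x, y, s), 3))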

  2. Quantifying distinct associations on different temporal scales: comparison of DCCA and Pearson methods

    NASA Astrophysics Data System (ADS)

    Piao, Lin; Fu, Zuntao

    2016-11-01

    Cross-correlation between pairs of variables has a multi-time-scale character and can be entirely different on different time scales (changing, for example, from positive correlation to negative), as in the associations between mean air temperature and relative humidity over regions to the east of the Taihang Mountains in China. Correctly unveiling these correlations on different time scales is therefore of great importance, since we generally do not know in advance whether the correlation varies with scale. Here, we compare two methods, Detrended Cross-Correlation Analysis (DCCA) and Pearson correlation, in quantifying scale-dependent correlations when applied directly to raw observed records and to artificially generated sequences with known cross-correlation features. Our analyses show that (1) DCCA-related methods can indeed quantify scale-dependent correlations, whereas the Pearson method cannot; (2) the correlation features obtained from DCCA-related methods are robust to contaminating noise, whereas the results from the Pearson method are sensitive to noise; and (3) the scale-dependent correlation results from DCCA-related methods are robust to the amplitude ratio between slow and fast components, while the Pearson method may be sensitive to that ratio. All these features indicate that DCCA-related methods have clear advantages in correctly quantifying scale-dependent correlations that result from different physical processes.

  3. Quantifying distinct associations on different temporal scales: comparison of DCCA and Pearson methods.

    PubMed

    Piao, Lin; Fu, Zuntao

    2016-11-09

    Cross-correlation between pairs of variables has a multi-time-scale character and can be entirely different on different time scales (changing, for example, from positive correlation to negative), as in the associations between mean air temperature and relative humidity over regions to the east of the Taihang Mountains in China. Correctly unveiling these correlations on different time scales is therefore of great importance, since we generally do not know in advance whether the correlation varies with scale. Here, we compare two methods, Detrended Cross-Correlation Analysis (DCCA) and Pearson correlation, in quantifying scale-dependent correlations when applied directly to raw observed records and to artificially generated sequences with known cross-correlation features. Our analyses show that (1) DCCA-related methods can indeed quantify scale-dependent correlations, whereas the Pearson method cannot; (2) the correlation features obtained from DCCA-related methods are robust to contaminating noise, whereas the results from the Pearson method are sensitive to noise; and (3) the scale-dependent correlation results from DCCA-related methods are robust to the amplitude ratio between slow and fast components, while the Pearson method may be sensitive to that ratio. All these features indicate that DCCA-related methods have clear advantages in correctly quantifying scale-dependent correlations that result from different physical processes.

  4. Magic Carpet Shows Its Colors

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The upper left image in this display is from the panoramic camera on the Mars Exploration Rover Spirit, showing the 'Magic Carpet' region near the rover at Gusev Crater, Mars, on Sol 7, the seventh martian day of its journey (Jan. 10, 2004). The lower image, also from the panoramic camera, is a monochrome (single filter) image of a rock in the 'Magic Carpet' area. Note that colored portions of the rock correlate with extracted spectra shown in the plot to the side. Four different types of materials are shown: the rock itself, the soil in front of the rock, some brighter soil on top of the rock, and some dust that has collected in small recesses on the rock face ('spots'). Each color on the spectra matches a line on the graph, showing how the panoramic camera's different colored filters are used to broadly assess the varying mineral compositions of martian rocks and soils.

  5. Uncertainty of natural tracer methods for quantifying river-aquifer interaction in a large river

    NASA Astrophysics Data System (ADS)

    Xie, Yueqing; Cook, Peter G.; Shanafield, Margaret; Simmons, Craig T.; Zheng, Chunmiao

    2016-04-01

    The quantification of river-aquifer interaction is critical to the conjunctive management of surface water and groundwater, particularly in arid and semiarid environments where potential evapotranspiration far exceeds precipitation. A variety of natural tracer methods are available to quantify river-aquifer interaction at different scales. These methods, however, have only been tested in rivers with relatively low flow rates (mostly less than 5 m3 s-1). In this study, several natural tracers, including heat, radon-222 and electrical conductivity, were measured both on vertical riverbed profiles and on longitudinal river samples to quantify river-aquifer exchange flux at both point and regional scales in the Heihe River (northwest China; flow rate 63 m3 s-1). Results show that the radon-222 profile method can estimate a narrower range of point-scale flux than the temperature profile method. In particular, three vertical radon-222 profiles failed to estimate the upper bounds of plausible flux ranges. Results also show that when quantifying regional-scale river-aquifer exchange flux, the river chemistry method constrained the flux (5.20-10.39 m2 d-1) better than the river temperature method (-100 to 100 m2 d-1). The river chemistry method also identified spatial variability of flux, whereas the river temperature method did not have sufficient resolution. Overall, for quantifying river-aquifer exchange flux in a large river, the temperature profile and radon-222 profile methods provide useful complementary information at the point scale, whereas the river chemistry method is recommended over the river temperature method at the regional scale.
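
    In outline, longitudinal tracer methods of this kind rest on a steady-state stream mass balance; a common form for radon (the notation follows the generic literature convention and is given here as an assumption, not this paper's exact model):

        \[
        Q\frac{dc}{dx}=I\,(c_{gw}-c)-k\,w\,c-\lambda\,d\,w\,c,
        \]

    where \(Q\) is stream discharge, \(I\) the groundwater inflow per unit length, \(c\) and \(c_{gw}\) the radon activities of river water and groundwater, \(k\) the gas-transfer velocity, \(w\) and \(d\) the channel width and depth, and \(\lambda\) the radon-222 decay constant. Solving for \(I\) along the measured downstream profile yields the regional-scale exchange flux.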

  6. Quantifying the Clinical Significance of Cannabis Withdrawal

    PubMed Central

    Allsop, David J.; Copeland, Jan; Norberg, Melissa M.; Fu, Shanlin; Molnar, Anna; Lewis, John; Budney, Alan J.

    2012-01-01

    Background and Aims Questions over the clinical significance of cannabis withdrawal have hindered its inclusion as a discrete cannabis induced psychiatric condition in the Diagnostic and Statistical Manual of Mental Disorders (DSM IV). This study aims to quantify functional impairment to normal daily activities from cannabis withdrawal, and looks at the factors predicting functional impairment. In addition the study tests the influence of functional impairment from cannabis withdrawal on cannabis use during and after an abstinence attempt. Methods and Results A volunteer sample of 49 non-treatment seeking cannabis users who met DSM-IV criteria for dependence provided daily withdrawal-related functional impairment scores during a one-week baseline phase and two weeks of monitored abstinence from cannabis with a one month follow up. Functional impairment from withdrawal symptoms was strongly associated with symptom severity (p = 0.0001). Participants with more severe cannabis dependence before the abstinence attempt reported greater functional impairment from cannabis withdrawal (p = 0.03). Relapse to cannabis use during the abstinence period was associated with greater functional impairment from a subset of withdrawal symptoms in high dependence users. Higher levels of functional impairment during the abstinence attempt predicted higher levels of cannabis use at one month follow up (p = 0.001). Conclusions Cannabis withdrawal is clinically significant because it is associated with functional impairment to normal daily activities, as well as relapse to cannabis use. Sample size in the relapse group was small and the use of a non-treatment seeking population requires findings to be replicated in clinical samples. Tailoring treatments to target withdrawal symptoms contributing to functional impairment during a quit attempt may improve treatment outcomes. PMID:23049760

  7. Graphene Oxides Show Angiogenic Properties.

    PubMed

    Mukherjee, Sudip; Sriram, Pavithra; Barui, Ayan Kumar; Nethi, Susheel Kumar; Veeriah, Vimal; Chatterjee, Suvro; Suresh, Kattimuttathu Ittara; Patra, Chitta Ranjan

    2015-08-05

    Angiogenesis, a process resulting in the formation of new capillaries from the pre-existing vasculature, plays a vital role in the development of therapeutic approaches for cancer, atherosclerosis, wound healing, and cardiovascular diseases. In this report, the synthesis, characterization, and angiogenic properties of graphene oxide (GO) and reduced graphene oxide (rGO) have been demonstrated, as observed through several in vitro and in vivo angiogenesis assays. The results here demonstrate that the intracellular formation of reactive oxygen species and reactive nitrogen species as well as activation of phospho-eNOS and phospho-Akt might be the plausible mechanisms for GO- and rGO-induced angiogenesis. The results altogether suggest the possibility of developing an alternative angiogenic therapeutic approach for the treatment of cardiovascular-related diseases in which angiogenesis plays a significant role.

  8. "Medicine show." Alice in Doctorland.

    PubMed

    1987-01-01

    This is an excerpt from the script of a 1939 play provided to the Institute of Social Medicine and Community Health by the Library of Congress Federal Theater Project Collection at George Mason University Library, Fairfax, Virginia, pages 2-1-8 through 2-1-14. The Federal Theatre Project (FTP) was part of the New Deal program for the arts, 1935-1939. Funded by the Works Progress Administration (WPA), its goal was to employ theater professionals from the relief rolls. A number of FTP plays deal with aspects of medicine and public health. Pageants, puppet shows and documentary plays celebrated progress in medical science while examining social controversies in medical services and the public health movement. "Medicine Show" sharply contrasts technological wonders with social backwardness. The play was rehearsed by the FTP but never opened because funding ended. A revised version ran on Broadway in 1940. The preceding comments are adapted from an excellent, well-illustrated review of five of these plays by Barbara Melosh: "The New Deal's Federal Theatre Project," Medical Heritage, Vol. 2, No. 1 (Jan/Feb 1986), pp. 36-47.

  9. Quantifying and Communicating Uncertainty in Preclinical Human Dose-Prediction

    PubMed Central

    Sundqvist, M; Lundahl, A; Någård, MB; Bredberg, U; Gennemark, P

    2015-01-01

    Human dose-prediction is fundamental for ranking lead-optimization compounds in drug discovery and to inform design of early clinical trials. This tutorial describes how uncertainty in such predictions can be quantified and efficiently communicated to facilitate decision-making. Using three drug-discovery case studies, we show how several uncertain pieces of input information can be integrated into one single uncomplicated plot with key predictions, including their uncertainties, for many compounds or for many scenarios, or both. PMID:26225248
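
    The tutorial's central idea, propagating several uncertain inputs into one dose prediction summary per compound, can be sketched with plain Monte Carlo sampling. The compounds, distributions, and the dose equation below are illustrative assumptions, not values or methods from the case studies.

```python
# Sketch: Monte Carlo propagation of uncertain clearance and target
# exposure into a human dose prediction, summarized as a median and a
# 90% interval per compound (the kind of quantity plotted for ranking).
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

compounds = {
    # name: (geo. mean CL [L/h], geo. SD CL, geo. mean Css [ng/mL], geo. SD Css)
    "cmpd-A": (12.0, 1.6, 50.0, 1.4),
    "cmpd-B": (30.0, 2.0, 20.0, 1.8),
}

for name, (cl_gm, cl_gsd, css_gm, css_gsd) in compounds.items():
    cl = rng.lognormal(np.log(cl_gm), np.log(cl_gsd), n)     # clearance samples
    css = rng.lognormal(np.log(css_gm), np.log(css_gsd), n)  # target conc. samples
    dose = cl * css * 24 / 1000           # mg/day to sustain Css at steady state
    lo, med, hi = np.percentile(dose, [5, 50, 95])
    print(f"{name}: {med:.0f} mg/day (90% interval {lo:.0f}-{hi:.0f})")
```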

  10. What Do Blood Tests Show?

    MedlinePlus

    Excerpt from the reference-range table (truncated in the source):

    Test                                    Normal Range Results*
    Red blood cell (varies with altitude)   Male: 5 to 6 million cells/mcL; Female: 4 to 5 million cells/mcL
    ...                                     ... cells/mcL
    Platelets                               140,000 to 450,000 cells/mcL
    Hemoglobin (varies with altitude)       Male: 14 to 17 gm/dL; Female: 12 to ...

  11. "Show me" bioethics and politics.

    PubMed

    Christopher, Myra J

    2007-10-01

    Missouri, the "Show Me State," has become the epicenter of several important national public policy debates, including abortion rights, the right to choose and refuse medical treatment, and, most recently, early stem cell research. In this environment, the Center for Practical Bioethics (formerly, Midwest Bioethics Center) emerged and grew. The Center's role in these "cultural wars" is not to advocate for a particular position but to provide well-researched and objective information, perspective, and advocacy for the ethical justification of policy positions; and to serve as a neutral convener and provider of a public forum for discussion. In this article, the Center's work on early stem cell research is a case study through which to argue that not only the Center, but also the field of bioethics has a critical role in the politics of public health policy.

  12. Phoenix Scoop Inverted Showing Rasp

    NASA Technical Reports Server (NTRS)

    2008-01-01

    This image taken by the Surface Stereo Imager on Sol 49, or the 49th Martian day of the mission (July 14, 2008), shows the silver-colored rasp protruding from NASA's Phoenix Mars Lander's Robotic Arm scoop. The scoop is inverted and the rasp is pointing up.

    Shown with its forks pointing toward the ground is the thermal and electrical conductivity probe, at the lower right. The Robotic Arm Camera is pointed toward the ground.

    The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is led by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  13. ShowMe3D

    SciTech Connect

    Sinclair, Michael B

    2012-01-05

    ShowMe3D is a data visualization graphical user interface specifically designed for use with hyperspectral images obtained from the Hyperspectral Confocal Microscope. The program allows the user to select and display any single image from a three-dimensional hyperspectral image stack. By moving a slider control, the user can easily move between images of the stack. The user can zoom into any region of the image, select any pixel or region from the displayed image, and display the fluorescence spectrum associated with that pixel or region. The user can define up to 3 spectral filters to apply to the hyperspectral image and view the image as it would appear from a filter-based confocal microscope. The user can also obtain statistics such as intensity average and variance from selected regions.
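
    The core operations described (slice selection, pixel and region spectra, synthetic spectral filters) can be sketched in a few lines of NumPy. The array layout and the Gaussian band-pass weights below are assumptions for illustration, not ShowMe3D internals.

```python
# Sketch of hyperspectral-stack operations on demo data: one band image,
# a pixel/region spectrum, a filter view, and region statistics.
import numpy as np

wavelengths = np.linspace(500, 800, 128)             # nm
stack = np.random.rand(128, 256, 256)                # (band, row, col) demo data

band_image = stack[40]                               # one image from the stack
pixel_spectrum = stack[:, 100, 150]                  # spectrum at a pixel
region_spectrum = stack[:, 90:110, 140:160].mean(axis=(1, 2))

# A synthetic band-pass "filter view": weighted sum over wavelength.
weights = np.exp(-0.5 * ((wavelengths - 620) / 15) ** 2)
filtered_view = np.tensordot(weights, stack, axes=1)  # (256, 256) image

stats = dict(mean=band_image[90:110, 140:160].mean(),
             var=band_image[90:110, 140:160].var())
print(stats)
```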

  14. Sensitivity of edge detection methods for quantifying cell migration assays.

    PubMed

    Treloar, Katrina K; Simpson, Matthew J

    2013-01-01

    Quantitative imaging methods to analyze cell migration assays are not standardized. Here we present a suite of two-dimensional barrier assays describing the collective spreading of an initially confined population of 3T3 fibroblast cells. To quantify the motility rate we apply two different automatic image detection methods to locate the position of the leading edge of the spreading population after 24, 48 and 72 hours. These results are compared with a manual edge detection method where we systematically vary the detection threshold. Our results indicate that the observed spreading rates are very sensitive to the choice of image analysis tools and we show that a standard measure of cell migration can vary by as much as 25% for the same experimental images depending on the details of the image analysis tools. Our results imply that it is very difficult, if not impossible, to meaningfully compare previously published measures of cell migration since previous results have been obtained using different image analysis techniques and the details of these techniques are not always reported. Using a mathematical model, we provide a physical interpretation of our edge detection results. The physical interpretation is important since edge detection algorithms alone do not specify any physical measure, or physical definition, of the leading edge of the spreading population. Our modeling indicates that variations in the image threshold parameter correspond to a consistent variation in the local cell density. This means that varying the threshold parameter is equivalent to varying the location of the leading edge in the range of approximately 1-5% of the maximum cell density.
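
    The reported threshold sensitivity can be reproduced in miniature. The sketch below detects a leading edge on a synthetic one-dimensional density profile (standing in for a column-averaged assay image) and shows how the detected position moves with the threshold; the profile and thresholds are made up for illustration.

```python
# Sketch: sensitivity of a threshold-based leading-edge detector.
import numpy as np

x = np.linspace(0, 2000, 500)                       # position, micrometres
density = 1.0 / (1.0 + np.exp((x - 1200) / 80))     # logistic front, max = 1

def leading_edge(profile, threshold):
    """Furthest position where the normalized density exceeds threshold."""
    above = np.where(profile >= threshold)[0]
    return x[above[-1]] if above.size else np.nan

for thr in (0.01, 0.05, 0.10, 0.25):
    print(f"threshold {thr:.2f}: edge at {leading_edge(density, thr):7.1f} um")
# Lower thresholds place the edge further out, mirroring the reported
# sensitivity of spreading-rate estimates to the detection threshold.
```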

  15. Quantifying the Thermal Fatigue of CPV Modules

    SciTech Connect

    Bosco, N.; Kurtz, S.

    2011-02-01

    A method is presented to quantify thermal fatigue in the CPV die-attach from meteorological data. A comparative study between cities demonstrates a significant difference in the accumulated damage. These differences are most sensitive to the number of larger thermal cycles (ΔT) experienced at a location. High-frequency data (<1/min) may be required to employ this method most accurately.
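
    A minimal sketch of the damage-accumulation idea follows, assuming a Coffin-Manson-type power law in the cycle amplitude; the exponent and the cycle histories are hypothetical, and the paper derives its cycles from meteorological data rather than the toy lists used here.

```python
# Sketch: accumulating die-attach thermal fatigue damage from a cycle
# history with a Coffin-Manson-type power law, damage ~ sum(dT^m).
import numpy as np

m = 4.0                                    # assumed Coffin-Manson exponent
city_cycles = {
    "city-1": np.array([20, 25, 30, 18, 40, 22]),   # daily dT swings, K
    "city-2": np.array([35, 45, 50, 38, 60, 42]),
}

damage = {c: np.sum(dt ** m) for c, dt in city_cycles.items()}
ratio = damage["city-2"] / damage["city-1"]
print(damage, f"relative damage: {ratio:.1f}x")
# Because damage scales as dT^m with m > 1, a few large swings dominate,
# which is why counts of the larger dT cycles drive between-city differences.
```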

  16. Subtleties of Hidden Quantifiers in Implication

    ERIC Educational Resources Information Center

    Shipman, Barbara A.

    2016-01-01

    Mathematical conjectures and theorems are most often of the form P(x) ⟹ Q(x), meaning ∀x, P(x) ⟹ Q(x). The hidden quantifier ∀x is crucial in understanding the implication as a statement with a truth value. Here P(x) and Q(x) alone are only predicates, without truth values, since they contain unquantified variables. But standard textbook…

  17. Quantifying Semantic Linguistic Maturity in Children

    ERIC Educational Resources Information Center

    Hansson, Kristina; Bååth, Rasmus; Löhndorf, Simone; Sahlén, Birgitta; Sikström, Sverker

    2016-01-01

    We propose a method to quantify "semantic linguistic maturity" (SELMA) based on a high dimensional semantic representation of words created from the co-occurrence of words in a large text corpus. The method was applied to oral narratives from 108 children aged 4;0-12;10. By comparing the SELMA measure with maturity ratings made by human…

  18. Quantifying the Reuse of Learning Objects

    ERIC Educational Resources Information Center

    Elliott, Kristine; Sweeney, Kevin

    2008-01-01

    This paper reports the findings of one case study from a larger project, which aims to quantify the claimed efficiencies of reusing learning objects to develop e-learning resources. The case study describes how an online inquiry project "Diabetes: A waste of energy" was developed by searching for, evaluating, modifying and then…

  19. Classifying and quantifying basins of attraction

    SciTech Connect

    Sprott, J. C.; Xiong, Anda

    2015-08-15

    A scheme is proposed to classify the basins for attractors of dynamical systems in arbitrary dimensions. There are four basic classes depending on their size and extent, and each class can be further quantified to facilitate comparisons. The calculation uses a Monte Carlo method and is applied to numerous common dissipative chaotic maps and flows in various dimensions.
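
    The Monte Carlo ingredient of the scheme can be sketched on a single familiar example. Below, initial conditions are sampled in a box around the Hénon map's attractor and classified by whether their orbits stay bounded; the sampling box, iteration counts, and escape radius are illustrative choices, and the paper itself covers many maps and flows plus a size/extent taxonomy.

```python
# Sketch: Monte Carlo estimate of the basin of attraction of the Henon
# map (a = 1.4, b = 0.3), as a fraction of a sampling box.
import numpy as np

rng = np.random.default_rng(0)

def attracted(x, y, n_iter=500, escape=1e6):
    """True if the orbit stays bounded (taken as reaching the attractor)."""
    for _ in range(n_iter):
        x, y = 1 - 1.4 * x * x + y, 0.3 * x
        if x * x + y * y > escape:
            return False
    return True

n = 5000
pts = rng.uniform(-2.0, 2.0, size=(n, 2))          # sampling box of area 16
inside = sum(attracted(px, py) for px, py in pts)
# Basin area estimate = (fraction of sampled points attracted) * box area
print(f"basin fraction {inside / n:.3f}, area ~ {16.0 * inside / n:.2f}")
```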

  20. Quantifying Item Dependency by Fisher's Z.

    ERIC Educational Resources Information Center

    Shen, Linjun

    Three aspects of the usual approach to assessing local item dependency, Yen's "Q" (H. Huynh, H. Michaels, and S. Ferrara, 1995), deserve further investigation. Pearson correlation coefficients are not normally distributed when the coefficients are large and thus cannot quantify the dependency well. Second, the accuracy of…
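
    For context, Fisher's z-transformation referenced in the title is z = arctanh(r); it makes sample correlations approximately normal with standard error 1/sqrt(n - 3), which is precisely what raw Pearson coefficients lack when they are large. A short sketch (with made-up correlations and sample size):

```python
# Sketch: Fisher's z-transformation and the confidence intervals it yields.
import numpy as np

r = np.array([0.30, 0.85, 0.95])      # item-pair correlations (hypothetical)
n = 200                               # examinees per pair (hypothetical)

z = np.arctanh(r)                     # Fisher z
se = 1.0 / np.sqrt(n - 3)
for ri, zi in zip(r, z):
    lo, hi = np.tanh(zi - 1.96 * se), np.tanh(zi + 1.96 * se)
    print(f"r = {ri:.2f} -> z = {zi:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
# Equal-width intervals in z map back to increasingly asymmetric,
# narrower intervals in r as r approaches 1.
```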

  1. Interpreting Quantifier Scope Ambiguity: Evidence of Heuristic First, Algorithmic Second Processing

    PubMed Central

    Dwivedi, Veena D.

    2013-01-01

    The present work suggests that sentence processing requires both heuristic and algorithmic processing streams, where the heuristic processing strategy precedes the algorithmic phase. This conclusion is based on three self-paced reading experiments in which the processing of two-sentence discourses was investigated, where context sentences exhibited quantifier scope ambiguity. Experiment 1 demonstrates that such sentences are processed in a shallow manner. Experiment 2 uses the same stimuli as Experiment 1 but adds questions to ensure deeper processing. Results indicate that reading times are consistent with a lexical-pragmatic interpretation of number associated with context sentences, but responses to questions are consistent with the algorithmic computation of quantifier scope. Experiment 3 shows the same pattern of results as Experiment 2, despite using stimuli with different lexical-pragmatic biases. These effects suggest that language processing can be superficial, and that deeper processing, which is sensitive to structure, only occurs if required. Implications for recent studies of quantifier scope ambiguity are discussed. PMID:24278439

  2. EMG and acceleration signal analysis for quantifying the effects of medication in Parkinson's disease.

    PubMed

    Rissanen, Saara M; Kankaanpaa, Markku; Tarvainen, Mika P; Nuutinen, Juho; Airaksinen, Olavi; Karjalainen, Pasi A

    2011-01-01

    Parkinson's disease (PD) is characterized by motor disabilities that can be alleviated reasonably with appropriate medication. However, there is a lack of objective methods for quantifying the efficacy of treatment in PD. We applied here an objective method for quantifying the effects of medication in PD using EMG and acceleration measurements and analysis. In the method, four signal features were calculated from the EMG and acceleration recordings of both sides of the body: the kurtosis and recurrence rate of EMG, and the amplitude and sample entropy of acceleration. A principal component approach was used for reducing the number of variables. EMG and acceleration data measured from nine PD patients were used for analysis. The patients were measured in four different medication conditions: with medication off, and two, three, and four hours after taking the medication. The results showed that in eight patients the EMG recordings became less spiky and the acceleration recordings more complex after taking the medication. A reverse phenomenon in the signal characteristics was observed in seven patients 3-4 hours after taking the medication. The results indicate that the presented method is potentially useful for objectively quantifying the effects of medication on neuromuscular function in PD.
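
    Two of the four features named above can be sketched directly: kurtosis of EMG (high for spiky bursts) and sample entropy of acceleration (high for more complex signals). The signals below are synthetic, and the study's windowing, normalization, and principal component steps are omitted.

```python
# Sketch: kurtosis of an EMG-like signal and a plain O(N^2) sample
# entropy of an acceleration-like signal.
import numpy as np
from scipy.stats import kurtosis

def sample_entropy(x, m=2, r_frac=0.2):
    """Plain sample entropy with tolerance r = r_frac * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    def count(mm):
        emb = np.lib.stride_tricks.sliding_window_view(x, mm)
        d = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=2)
        n = len(emb)
        return ((d <= r).sum() - n) / 2        # matched pairs, no self-matches
    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(2)
emg = rng.standard_t(df=3, size=1000)          # heavy-tailed ~ "spiky" EMG
acc = np.sin(np.linspace(0, 60, 1000)) + 0.3 * rng.standard_normal(1000)

print("EMG kurtosis:", kurtosis(emg))          # drops as spikes diminish
print("ACC sample entropy:", sample_entropy(acc))  # rises with complexity
```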

  3. A novel method to visualise and quantify circadian misalignment

    PubMed Central

    Fischer, Dorothee; Vetter, Céline; Roenneberg, Till

    2016-01-01

    The circadian clock governs virtually all processes in the human body, including sleep-wake behaviour. Circadian misalignment describes the offset between sleep-wake cycles and clock-regulated physiology. This strain is predominantly caused by external (societal) demands, including shift work, early school start times and fast travel across time zones. Sleeping at the ‘wrong’ internal time can jeopardise health and safety, and we therefore need a good quantification of this phenomenon. Here, we propose a novel method to quantify the mistiming of sleep-wake rhythms and demonstrate its versatility in day workers and shift workers. Based on a single time series, our Composite Phase Deviation method unveils distinct, subject- and schedule-specific geometries (‘islands and pancakes’) that illustrate how modern work times interfere with sleep. With increasing levels of circadian strain, the resulting shapes change systematically from small, connected forms to large and fragmented patterns. Our method shows good congruence with published measures of circadian misalignment (i.e., Inter-daily Stability and ‘Behavioural Entrainment’), but offers added value as to its requirements, e.g., being computable for sleep logs and questionnaires. Composite Phase Deviations will help to understand the mechanisms that link ‘living against the clock’ with health and disease on an individual basis. PMID:27929109

  4. Statistical physics approach to quantifying differences in myelinated nerve fibers

    NASA Astrophysics Data System (ADS)

    Comin, César H.; Santos, João R.; Corradini, Dario; Morrison, Will; Curme, Chester; Rosene, Douglas L.; Gabrielli, Andrea; da F. Costa, Luciano; Stanley, H. Eugene

    2014-03-01

    We present a new method to quantify differences in myelinated nerve fibers. These differences range from morphologic characteristics of individual fibers to differences in macroscopic properties of collections of fibers. Our method uses statistical physics tools to improve on traditional measures, such as fiber size and packing density. As a case study, we analyze cross-sectional electron micrographs from the fornix of young and old rhesus monkeys using a semi-automatic detection algorithm to identify and characterize myelinated axons. We then apply a feature selection approach to identify the features that best distinguish between the young and old age groups, achieving a maximum accuracy of 94% when assigning samples to their age groups. This analysis shows that the best discrimination is obtained using the combination of two features: the fraction of occupied axon area and the effective local density. The latter is a modified calculation of axon density, which reflects how closely axons are packed. Our feature analysis approach can be applied to characterize differences that result from biological processes such as aging, damage from trauma or disease or developmental differences, as well as differences between anatomical regions such as the fornix and the cingulum bundle or corpus callosum.

  5. Quantifying the Frictional Forces between Skin and Nonwoven Fabrics

    PubMed Central

    Jayawardana, Kavinda; Ovenden, Nicholas C.; Cottenden, Alan

    2017-01-01

    When a compliant sheet of material is dragged over a curved surface of a body, the frictional forces generated can be many times greater than they would be for a planar interface. This phenomenon is known to contribute to the abrasion damage to skin often suffered by wearers of incontinence pads and bed/chairbound people susceptible to pressure sores. Experiments that attempt to quantify these forces often use a simple capstan-type equation to obtain a characteristic coefficient of friction. In general, the capstan approach assumes the ratio of applied tensions depends only on the arc of contact and the coefficient of friction, and ignores other geometric and physical considerations; this approach makes it straightforward to obtain explicitly a coefficient of friction from the tensions measured. In this paper, two mathematical models are presented that compute the material displacements and surface forces generated by, firstly, a membrane under tension in moving contact with a rigid obstacle and, secondly, a shell-membrane under tension in contact with a deformable substrate. The results show that, while the use of a capstan equation remains fairly robust in some cases, effects such as the curvature and flaccidness of the underlying body, and the mass density of the fabric can lead to significant variations in stresses generated in the contact region. Thus, the coefficient of friction determined by a capstan model may not be an accurate reflection of the true frictional behavior of the contact region. PMID:28321192
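
    The capstan relation the experiments rely on is T_out = T_in * exp(mu * theta), so the characteristic coefficient of friction is recovered as mu = ln(T_out / T_in) / theta. The sketch below shows how sensitive the inferred mu is to the assumed arc of contact; the tension values are hypothetical.

```python
# Sketch: extracting a capstan coefficient of friction from measured
# tensions, mu = ln(T_out / T_in) / theta.
import numpy as np

T_in, T_out = 1.0, 2.2          # held and pulled tensions, N (hypothetical)
for theta_deg in (60, 90, 120):
    theta = np.radians(theta_deg)
    mu = np.log(T_out / T_in) / theta
    print(f"arc {theta_deg:3d} deg -> capstan mu = {mu:.3f}")
# The paper's point: because curvature, substrate deformability and
# fabric mass are ignored, this mu is only an effective parameter.
```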

  6. Quantifying solute transport processes: are chemically "conservative" tracers electrically conservative?

    USGS Publications Warehouse

    Singha, Kamini; Li, Li; Day-Lewis, Frederick D.; Regberg, Aaron B.

    2012-01-01

    The concept of a nonreactive or conservative tracer, commonly invoked in investigations of solute transport, requires additional study in the context of electrical geophysical monitoring. Tracers that are commonly considered conservative may undergo reactive processes, such as ion exchange, thus changing the aqueous composition of the system. As a result, the measured electrical conductivity may reflect not only solute transport but also reactive processes. We have evaluated the impacts of ion exchange reactions, rate-limited mass transfer, and surface conduction on quantifying tracer mass, mean arrival time, and temporal variance in laboratory-scale column experiments. Numerical examples showed that (1) ion exchange can lead to resistivity-estimated tracer mass, velocity, and dispersivity that may be inaccurate; (2) mass transfer leads to an overestimate in the mobile tracer mass and an underestimate in velocity when using electrical methods; and (3) surface conductance does not notably affect estimated moments when high-concentration tracers are used, although this phenomenon may be important at low concentrations or in sediments with high and/or spatially variable cation-exchange capacity. In all cases, colocated groundwater concentration measurements are of high importance for interpreting geophysical data with respect to the controlling transport processes of interest.
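
    The three quantities evaluated in the study (tracer mass, mean arrival time, temporal variance) are the first temporal moments of a breakthrough curve. A minimal sketch, using a synthetic curve with arbitrary units:

```python
# Sketch: temporal moments of a tracer breakthrough curve c(t).
import numpy as np

t = np.linspace(0, 100, 1001)                       # hours
c = np.exp(-0.5 * ((t - 40) / 8) ** 2)              # synthetic breakthrough

m0 = np.trapz(c, t)                                 # zeroth moment ~ mass
t_mean = np.trapz(t * c, t) / m0                    # mean arrival time
t_var = np.trapz((t - t_mean) ** 2 * c, t) / m0     # temporal variance

print(f"mass {m0:.1f}, mean arrival {t_mean:.1f} h, variance {t_var:.1f} h^2")
# Comparing these moments from fluid samples vs. resistivity-derived
# concentrations exposes the biases introduced by ion exchange and
# rate-limited mass transfer.
```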

  7. Cross-Linguistic Relations between Quantifiers and Numerals in Language Acquisition: Evidence from Japanese

    ERIC Educational Resources Information Center

    Barner, David; Libenson, Amanda; Cheung, Pierina; Takasaki, Mayu

    2009-01-01

    A study of 104 Japanese-speaking 2- to 5-year-olds tested the relation between numeral and quantifier acquisition. A first study assessed Japanese children's comprehension of quantifiers, numerals, and classifiers. Relative to English-speaking counterparts, Japanese children were delayed in numeral comprehension at 2 years of age but showed no…

  8. Casimir experiments showing saturation effects

    SciTech Connect

    Sernelius, Bo E.

    2009-10-15

    We address several different Casimir experiments where theory and experiment disagree. First out is the classical Casimir force measurement between two metal half spaces; here both in the form of the torsion pendulum experiment by Lamoreaux and in the form of the Casimir pressure measurement between a gold sphere and a gold plate as performed by Decca et al.; theory predicts a large negative thermal correction, absent in the high precision experiments. The third experiment is the measurement of the Casimir force between a metal plate and a laser irradiated semiconductor membrane as performed by Chen et al.; the change in force with laser intensity is larger than predicted by theory. The fourth experiment is the measurement of the Casimir force between an atom and a wall in the form of the measurement by Obrecht et al. of the change in oscillation frequency of a {sup 87}Rb Bose-Einstein condensate trapped to a fused silica wall; the change is smaller than predicted by theory. We show that saturation effects can explain the discrepancies between theory and experiment observed in all these cases.

  9. Quantifying climate changes of the Common Era for Finland

    NASA Astrophysics Data System (ADS)

    Luoto, Tomi P.; Nevalainen, Liisa

    2016-11-01

    In this study, we aim to quantify summer air temperatures from sediment records from Southern, Central and Northern Finland over the past 2000 years. We use lake sediment archives to estimate paleotemperatures, applying fossil Chironomidae assemblages and the transfer function approach. The enhanced Chironomidae-based temperature calibration set was first validated against instrumentally measured temperatures in a 70-year high-resolution sediment record. Since the inferred and observed temperatures were closely correlated, we conclude that the new calibration model is reliable for reconstructions beyond the monitoring records. The 700-year temperature reconstructions from three sites at multi-decadal resolution showed similar trends, although they differed in the timing of the cold Little Ice Age (LIA) and the initiation of recent warming. The 2000-year multi-centennial reconstructions from three different sites resembled each other, with clear signals of the Medieval Climate Anomaly (MCA) and LIA but differences in their timing. The influence of external forcing on the climate of the southern and central sites appeared complex at the decadal scale, whereas the North Atlantic Oscillation (NAO) was closely linked to the temperature development of the northern site. Solar activity appears synchronous with the temperature fluctuations at the multi-centennial scale at all sites. The present study provides new insights into centennial and decadal variability in air temperature dynamics in Northern Europe and into the external forcing behind these trends. These results are particularly useful for comparing regional responses and lags of temperature trends between different parts of Scandinavia.

  10. The use of automatic speech recognition showing the influence of nasality on speech intelligibility.

    PubMed

    Mayr, S; Burkhardt, K; Schuster, M; Rogler, K; Maier, A; Iro, H

    2010-11-01

    Altered nasality influences speech intelligibility. Automatic speech recognition (ASR) has proved suitable for quantifying speech intelligibility in patients with different degrees of nasal emissions. We investigated the influence of hyponasality on the results of speech recognition before and after nasal surgery using ASR. Speech recordings, nasal peak inspiratory flow and self-perception measurements were carried out in 20 German-speaking patients (8 women, 12 men; aged 38 ± 22 years) who underwent surgery for various nasal and sinus pathologies. The degree of speech intelligibility was quantified as the percentage of correctly recognized words of a standardized word chain by ASR (word recognition rate; WR). WR was measured 1 day before (t1), 1 day after with nasal packings (t2), and 3 months after (t3) surgery; nasal peak flow was measured at t1 and t3. WR was calculated with PEAKS, a program for the automatic evaluation of speech disorders. WR as a parameter of speech intelligibility was significantly decreased immediately after surgery (t1 vs. t2, p < 0.01) but increased 3 months after surgery (t2 vs. t3, p < 0.01). WR showed no association with age or gender. There was no significant difference between WR at t1 and t3, despite a post-operative increase in nasal peak inspiratory flow measurements. The results show that ASR is capable of quantifying the influence of hyponasality on speech; nasal obstruction leads to significantly reduced WR, and nasal peak flow cannot replace evaluation of nasality.

  11. Radiative transfer modeling for quantifying lunar mineral abundance

    NASA Astrophysics Data System (ADS)

    Li, S.; Li, L.

    2010-12-01

    This work is part of our efforts to quantify lunar surface minerals (agglutinate, clinopyroxene, orthopyroxene, plagioclase, olivine, ilmenite, and volcanic glass) from the lunar soil characterization consortium (LSCC) dataset with Hapke's radiative transfer model. We have implemented Hapke's radiative transfer model in inverse mode, in which, instead of the commonly used look-up table (LUT), Newton's method was used to solve the nonlinear equations for deriving mineral absorption coefficients and estimating mineral abundances. While the effects of temperature and surface roughness are incorporated into the implementation to improve model performance on lunar spacecraft data, these effects are not considered in the current work because lab-measured reflectance data are used. We first tested the inverse model with all samples of the LSCC dataset; the model showed poor performance, degraded primarily by samples with a high amount of SMFe. The model was then tested with relatively fresh samples (Is/FeO <= 50, 20 samples in total), and the results were compared with those from genetic algorithm - partial least squares (GA-PLS) models. This comparison indicates that radiative transfer modeling resulted in higher squared correlations and lower root-mean-square errors than GA-PLS for all minerals (Figure 1). It is concluded that the inverse RTM is preferred over GA-PLS for deriving mineral information from fresh lunar samples. To apply this approach to lunar spacecraft data for mineral abundance estimation, the model needs to be improved to handle more mature lunar soil samples. Figure 1. Comparison of relative RMSE and r-squares of GA-PLS and inversion RTM results.

  12. Entropy generation method to quantify thermal comfort

    NASA Technical Reports Server (NTRS)

    Boregowda, S. C.; Tiwari, S. N.; Chaturvedi, S. K.

    2001-01-01

    This paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and quantify thermal comfort. The approach involves developing an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and the entropy generation term. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. To verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite-element-based KSU human thermal computer model is utilized as a "computational environmental chamber" to conduct a series of simulations examining human thermal responses to different environmental conditions. The simulation outputs, which include human thermal responses, together with the input environmental conditions, are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The Objective Thermal Comfort values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values; the PMV values are generated by inserting the air temperatures and vapor pressures used in the simulations into the regression equation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. However, an experimental study

  13. Quantifiable diagnosis of muscular dystrophies and neurogenic atrophies through network analysis

    PubMed Central

    2013-01-01

    Background The diagnosis of neuromuscular diseases is strongly based on the histological characterization of muscle biopsies. However, this morphological analysis is mostly a subjective process and difficult to quantify. We have tested if network science can provide a novel framework to extract useful information from muscle biopsies, developing a novel method that analyzes muscle samples in an objective, automated, fast and precise manner. Methods Our database consisted of 102 muscle biopsy images from 70 individuals (including controls, patients with neurogenic atrophies and patients with muscular dystrophies). We used this to develop a new method, Neuromuscular DIseases Computerized Image Analysis (NDICIA), that uses network science analysis to capture the defining signature of muscle biopsy images. NDICIA characterizes muscle tissues by representing each image as a network, with fibers serving as nodes and fiber contacts as links. Results After a ‘training’ phase with control and pathological biopsies, NDICIA was able to quantify the degree of pathology of each sample. We validated our method by comparing NDICIA quantification of the severity of muscular dystrophies with a pathologist’s evaluation of the degree of pathology, resulting in a strong correlation (R = 0.900, P <0.00001). Importantly, our approach can be used to quantify new images without the need for prior ‘training’. Therefore, we show that network science analysis captures the useful information contained in muscle biopsies, helping the diagnosis of muscular dystrophies and neurogenic atrophies. Conclusions Our novel network analysis approach will serve as a valuable tool for assessing the etiology of muscular dystrophies or neurogenic atrophies, and has the potential to quantify treatment outcomes in preclinical and clinical trials. PMID:23514382

  14. Mimas Showing False Colors #1

    NASA Technical Reports Server (NTRS)

    2005-01-01

    False color images of Saturn's moon, Mimas, reveal variation in either the composition or texture across its surface.

    During its approach to Mimas on Aug. 2, 2005, the Cassini spacecraft narrow-angle camera obtained multi-spectral views of the moon from a range of 228,000 kilometers (142,500 miles).

    The image at the left is a narrow angle clear-filter image, which was separately processed to enhance the contrast in brightness and sharpness of visible features. The image at the right is a color composite of narrow-angle ultraviolet, green, infrared and clear filter images, which have been specially processed to accentuate subtle changes in the spectral properties of Mimas' surface materials. To create this view, three color images (ultraviolet, green and infrared) were combined into a single black and white picture that isolates and maps regional color differences. This 'color map' was then superimposed over the clear-filter image at the left.

    The combination of color map and brightness image shows how the color differences across the Mimas surface materials are tied to geological features. Shades of blue and violet in the image at the right are used to identify surface materials that are bluer in color and have a weaker infrared brightness than average Mimas materials, which are represented by green.

    Herschel crater, a 140-kilometer-wide (88-mile) impact feature with a prominent central peak, is visible in the upper right of each image. The unusual bluer materials are seen to broadly surround Herschel crater. However, the bluer material is not uniformly distributed in and around the crater. Instead, it appears to be concentrated on the outside of the crater and more to the west than to the north or south. The origin of the color differences is not yet understood. It may represent ejecta material that was excavated from inside Mimas when the Herschel impact occurred. The bluer color of these materials may be caused by subtle differences in

  15. Quantifiers more or less quantify online: ERP evidence for partial incremental interpretation

    PubMed Central

    Urbach, Thomas P.; Kutas, Marta

    2010-01-01

    Event-related brain potentials were recorded during RSVP reading to test the hypothesis that quantifier expressions are incrementally interpreted fully and immediately. In sentences tapping general knowledge (Farmers grow crops/worms as their primary source of income), Experiment 1 found larger N400s for atypical (worms) than typical objects (crops). Experiment 2 crossed object typicality with non-logical subject-noun phrase quantifiers (most, few). Off-line plausibility ratings exhibited the crossover interaction predicted by full quantifier interpretation: Most farmers grow crops and Few farmers grow worms were rated more plausible than Most farmers grow worms and Few farmers grow crops. Object N400s, although modulated in the expected direction, did not reverse. Experiment 3 replicated these findings with adverbial quantifiers (Farmers often/rarely grow crops/worms). Interpretation of quantifier expressions thus is neither fully immediate nor fully delayed. Furthermore, object atypicality was associated with a frontal slow positivity in few-type/rarely quantifier contexts, suggesting systematic processing differences among quantifier types. PMID:20640044

  16. Quantifying the Gurken Morphogen Gradient in Drosophila Oogenesis

    PubMed Central

    Goentoro, Lea A.; Reeves, Gregory T.; Kowal, Craig P.; Martinelli, Luigi; Schüpbach, Trudi; Shvartsman, Stanislav Y.

    2014-01-01

    Summary Quantitative information about the distribution of morphogens is crucial for understanding their effects on cell-fate determination, yet it is difficult to obtain through direct measurements. We have developed a parameter estimation approach for quantifying the spatial distribution of Gurken, a TGFα-like EGFR ligand that acts as a morphogen in Drosophila oogenesis. Modeling of the Gurken/EGFR system shows that the shape of the Gurken gradient is controlled by a single dimensionless parameter, the Thiele modulus, which reflects the relative importance of ligand diffusion and degradation. By combining the model with genetic alterations of EGFR levels, we have estimated the value of the Thiele modulus in the wild-type egg chamber. This provides a direct characterization of the shape of the Gurken gradient and demonstrates how parameter estimation techniques can be used to quantify morphogen gradients in development. PMID:16890165
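
    To make the single-knob idea concrete, the sketch below evaluates a standard steady-state reaction-diffusion gradient whose shape depends only on the Thiele modulus phi = L * sqrt(k_deg / D), assuming secretion at x = 0 and no flux at x = L. This is the textbook idealization, not the paper's full parameter-estimation machinery.

```python
# Sketch: normalized steady-state gradient c(x) = cosh(phi*(1-x))/cosh(phi)
# for several Thiele moduli.
import numpy as np

x = np.linspace(0.0, 1.0, 6)          # position, in units of domain size L
for phi in (0.5, 2.0, 8.0):
    c = np.cosh(phi * (1 - x)) / np.cosh(phi)   # normalized so c(0) = 1
    print(f"phi = {phi}:", np.round(c, 3))
# Small phi (slow degradation) gives a flat, long-ranged profile; large
# phi gives a steep gradient confined near the source - the single knob
# the authors estimate for the Gurken/EGFR system.
```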

  17. Quantifying geocode location error using GIS methods

    PubMed Central

    Strickland, Matthew J; Siffel, Csaba; Gardner, Bennett R; Berzen, Alissa K; Correa, Adolfo

    2007-01-01

    Background The Metropolitan Atlanta Congenital Defects Program (MACDP) collects maternal address information at the time of delivery for infants and fetuses with birth defects. These addresses have been geocoded by two independent agencies: (1) the Georgia Division of Public Health Office of Health Information and Policy (OHIP) and (2) a commercial vendor. Geographic information system (GIS) methods were used to quantify uncertainty in the two sets of geocodes using orthoimagery and tax parcel datasets. Methods We sampled 599 infants and fetuses with birth defects delivered during 1994–2002 with maternal residence in either Fulton or Gwinnett County. Tax parcel datasets were obtained from the tax assessor's offices of Fulton and Gwinnett County. High-resolution orthoimagery for these counties was acquired from the U.S. Geological Survey. For each of the 599 addresses we attempted to locate the tax parcel corresponding to the maternal address. If the tax parcel was identified the distance and the angle between the geocode and the residence were calculated. We used simulated data to characterize the impact of geocode location error. In each county 5,000 geocodes were generated and assigned their corresponding Census 2000 tract. Each geocode was then displaced at a random angle by a random distance drawn from the distribution of observed geocode location errors. The census tract of the displaced geocode was determined. We repeated this process 5,000 times and report the percentage of geocodes that resolved into incorrect census tracts. Results Median location error was less than 100 meters for both OHIP and commercial vendor geocodes; the distribution of angles appeared uniform. Median location error was approximately 35% larger in Gwinnett (a suburban county) relative to Fulton (a county with urban and suburban areas). Location error occasionally caused the simulated geocodes to be displaced into incorrect census tracts; the median percentage of geocodes resolving
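
    The displacement simulation at the heart of the Results can be sketched with toy geometry: displace each geocode by a uniform random angle and an error distance drawn from an assumed empirical distribution, then test whether the displaced point leaves its tract. The square tract and gamma-distributed errors below are stand-ins for real TIGER/Line tract polygons and the observed error distribution.

```python
# Sketch: Monte Carlo displacement of geocodes and point-in-polygon
# checks for tract misassignment (shapely provides the geometry test).
import numpy as np
from shapely.geometry import Point, Polygon

rng = np.random.default_rng(3)
tract = Polygon([(0, 0), (500, 0), (500, 500), (0, 500)])   # toy 500 m tract

n = 5000
geocodes = rng.uniform(50, 450, size=(n, 2))      # points inside the tract
angles = rng.uniform(0, 2 * np.pi, n)             # uniform displacement angle
dists = rng.gamma(2.0, 50.0, n)                   # assumed error distances, m

moved_out = 0
for (px, py), a, d in zip(geocodes, angles, dists):
    q = Point(px + d * np.cos(a), py + d * np.sin(a))
    moved_out += not tract.contains(q)
print(f"{100 * moved_out / n:.1f}% of displaced geocodes leave the tract")
```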

  18. Quantifying Stock Return Distributions in Financial Markets

    PubMed Central

    Botta, Federico; Moat, Helen Susannah; Stanley, H. Eugene; Preis, Tobias

    2015-01-01

    Being able to quantify the probability of large price changes in stock markets is of crucial importance in understanding financial crises that affect the lives of people worldwide. Large changes in stock market prices can arise abruptly, within a matter of minutes, or develop across much longer time scales. Here, we analyze a dataset comprising the stocks forming the Dow Jones Industrial Average at a second by second resolution in the period from January 2008 to July 2010 in order to quantify the distribution of changes in market prices at a range of time scales. We find that the tails of the distributions of logarithmic price changes, or returns, exhibit power law decays for time scales ranging from 300 seconds to 3600 seconds. For larger time scales, we find that the distributions tails exhibit exponential decay. Our findings may inform the development of models of market behavior across varying time scales. PMID:26327593
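
    One standard way to quantify such power-law tails is the Hill estimator on the largest absolute returns. The sketch below applies it to simulated Student-t returns (which genuinely have a power-law tail); the paper itself works with second-resolution Dow Jones constituent data, and the choice of the tail cut-off k is an assumption.

```python
# Sketch: Hill estimate of the tail exponent of absolute log returns.
import numpy as np

rng = np.random.default_rng(4)
log_returns = 0.001 * rng.standard_t(df=3, size=100_000)

x = np.sort(np.abs(log_returns))[::-1]     # order statistics, descending
k = 1000                                   # tail observations used (assumed)
alpha = k / np.sum(np.log(x[:k] / x[k]))   # Hill estimator of tail exponent
print(f"estimated tail exponent alpha ~ {alpha:.2f} (t3 noise -> ~3)")
```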

  19. Quantifying the Discrepancy in RANS Modeling of Reynolds Stress Eigenvectors System

    NASA Astrophysics Data System (ADS)

    Wu, Jinlong; Thompson, Roney; Wang, Jianxun; Sampaio, Luiz; Xiao, Heng

    2016-11-01

    Reynolds-Averaged Navier-Stokes (RANS) equations are the dominant tool for engineering design and analysis applications involving wall-bounded turbulent flows. However, the modeled Reynolds stress tensor is known to be a main source of uncertainty, compared to other sources such as geometry and boundary conditions. Recently, several works have aimed to quantify the uncertainty of RANS simulations by studying the discrepancy of the anisotropy and turbulence kinetic energy of the Reynolds stress tensor with respect to a reference database obtained from DNS. The eigenvector system of the Reynolds stress tensor, on the other hand, has received less attention. In this work, a general metric is proposed to visualize the discrepancy between two eigenvector systems. More detailed metrics based on the Euler angle and the direction cosine are also proposed to quantify the discrepancy of eigenvector systems. The results show that even a small discrepancy in the eigenvectors of the Reynolds stress can lead to a drastically different mean velocity field, demonstrating the importance of quantifying this kind of uncertainty/error. Furthermore, the Euler angle and the direction cosine are compared for the purposes of uncertainty quantification and machine learning, respectively.
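
    One way to quantify the misalignment of two eigenvector systems is the rotation angle of the direction-cosine matrix between them, sketched below on two arbitrary symmetric example tensors (not RANS/DNS data, and not necessarily the paper's exact metrics).

```python
# Sketch: misalignment angle between the eigenvector frames of two
# symmetric (Reynolds-stress-like) tensors.
import numpy as np

def eigvec_frame(tau):
    """Orthonormal eigenvectors of a symmetric tensor, sorted by eigenvalue."""
    w, v = np.linalg.eigh(tau)
    v = v[:, np.argsort(w)[::-1]]
    if np.linalg.det(v) < 0:           # enforce a right-handed frame
        v[:, -1] *= -1
    return v

tau_rans = np.array([[2.0, 0.3, 0.0], [0.3, 1.0, 0.1], [0.0, 0.1, 0.5]])
tau_dns  = np.array([[2.0, 0.5, 0.1], [0.5, 1.1, 0.0], [0.1, 0.0, 0.4]])

R = eigvec_frame(tau_rans).T @ eigvec_frame(tau_dns)   # direction cosines
angle = np.degrees(np.arccos(np.clip((np.trace(R) - 1) / 2, -1, 1)))
print(f"frame misalignment: {angle:.1f} degrees")
# Caveat: eigenvector signs are arbitrary, so robust metrics (as in the
# paper's Euler-angle and direction-cosine variants) must handle the
# resulting 180-degree ambiguities.
```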

  20. Development and application of a method for quantifying factors affecting chloramine decay in service reservoirs.

    PubMed

    Sathasivan, Arumugam; Krishna, K C Bal; Fisher, Ian

    2010-08-01

    Service reservoirs play an important role in maintaining water quality in distribution systems. Several factors affect reservoir water quality, including bulk water reactions, stratification, sediment accumulation and wall reactions. It is generally thought that biofilm and sediments can harbour microorganisms, especially in chloraminated reservoirs, but their impact on disinfectant loss has not been quantified. Hence, debate exists as to the extent of the problem. To quantify the impact, the reservoir acceleration factor (F(Ra)) is defined. This factor represents the acceleration of chloramine decay arising from all causes, including changes in retention time, assuming that the reservoir is completely mixed. Such an approach quantifies the impact of factors other than chemical reactions in the bulk water. Data from three full-scale chloraminated service reservoirs in distribution systems of Sydney, Australia, were analysed to demonstrate the generality of the method. Results showed that in two large service reservoirs (404 x 10(3) m(3) and 82 x 10(3) m(3)) there was minimal impact from biofilm/sediment. However, in a small reservoir (3 x 10(3) m(3)), the biofilm/sediment had a significant impact. In both small and large reservoirs, the effect of stratification was significant.
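
    A sketch of one plausible reading of the acceleration factor follows: the ratio of the first-order decay rate inferred across the (assumed completely mixed) reservoir to the bulk decay rate from lab tests, so that F(Ra) > 1 flags extra demand from biofilm/sediment. The exact definition in the paper may differ, and all numbers below are illustrative assumptions.

```python
# Sketch: inferring a reservoir-scale decay rate from inlet/outlet
# chloramine levels under a completely mixed, steady-state assumption,
# then comparing it to the bulk rate.
k_bulk = 0.010                 # bulk decay coefficient from lab tests, 1/h
c_in, c_out = 1.80, 1.20       # inlet/outlet total chlorine, mg/L
tau = 30.0                     # mean hydraulic retention time, h

# Completely mixed, steady state with first-order decay:
#   c_out = c_in / (1 + k_res * tau)  ->  solve for k_res
k_res = (c_in / c_out - 1) / tau
F_Ra = k_res / k_bulk
print(f"k_res = {k_res:.4f} 1/h, F_Ra = {F_Ra:.2f}")
```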

  1. Quantifying the CV: Adapting an Impact Assessment Model to Astronomy

    NASA Astrophysics Data System (ADS)

    Bohémier, K. A.

    2015-04-01

    We present the process and results of applying the Becker Model to the curriculum vitae of a Yale University astronomy professor. As background, in July 2013, the Becker Medical Library at Washington Univ. in St. Louis held a workshop for librarians on the Becker Model, a framework developed by research assessment librarians for quantifying medical researchers' individual and group outputs. Following the workshop, the model was analyzed for content to adapt it to the physical sciences.

  2. Quantifying the coherence of pure quantum states

    NASA Astrophysics Data System (ADS)

    Chen, Jianxin; Grogan, Shane; Johnston, Nathaniel; Li, Chi-Kwong; Plosker, Sarah

    2016-10-01

    In recent years, several measures have been proposed for characterizing the coherence of a given quantum state. We derive several results that illuminate how these measures behave when restricted to pure states. Notably, we present an explicit characterization of the closest incoherent state to a given pure state under the trace distance measure of coherence. We then use this result to show that the states maximizing the trace distance of coherence are exactly the maximally coherent states. We define the trace distance of entanglement and show that it coincides with the trace distance of coherence for pure states. Finally, we give an alternate proof to a recent result that the ℓ1 measure of coherence of a pure state is never smaller than its relative entropy of coherence.
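
    For pure states the measures discussed reduce to simple functions of the amplitudes: writing |psi> = sum_i c_i |i> in the incoherent basis, the l1 measure is (sum_i |c_i|)^2 - 1 and the relative entropy of coherence is the Shannon entropy of the populations |c_i|^2 (since S(rho) = 0 for a pure state). A short numerical check:

```python
# Sketch: l1 coherence and relative entropy of coherence for a pure state.
import numpy as np

psi = np.array([0.8, 0.4, 0.4 + 0.2j])
psi = psi / np.linalg.norm(psi)

rho = np.outer(psi, psi.conj())
c_l1 = np.abs(rho).sum() - np.abs(np.diag(rho)).sum()   # off-diagonal mass
p = np.abs(psi) ** 2
c_rel_ent = -np.sum(p * np.log2(p))                     # S(diag(rho)) - S(rho)

print(f"l1 coherence : {c_l1:.4f}")
print(f"rel. entropy : {c_rel_ent:.4f} bits")
# A maximally coherent qutrit (all |c_i| = 1/sqrt(3)) gives l1 = 2 and
# relative entropy = log2(3), the maxima consistent with the paper's result.
```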

  3. An In Vivo Method to Quantify Lymphangiogenesis in Zebrafish

    PubMed Central

    Hoffman, Scott J.; Psaltis, Peter J.; Clark, Karl J.; Spoon, Daniel B.; Chue, Colin D.; Ekker, Stephen C.; Simari, Robert D.

    2012-01-01

    Background Lymphangiogenesis is a highly regulated process involved in the pathogenesis of disease. Current in vivo models to assess lymphangiogenesis are largely unphysiologic. The zebrafish is a powerful model system for studying development, due to its rapid growth and transparency during early stages of life. Identification of a network of trunk lymphatic capillaries in zebrafish provides an opportunity to quantify lymphatic growth in vivo. Methods and Results Late-phase microangiography was used to detect trunk lymphatic capillaries in zebrafish 2- and 3-days post-fertilization. Using this approach, real-time changes in lymphatic capillary development were measured in response to modulators of lymphangiogenesis. Recombinant human vascular endothelial growth factor (VEGF)-C added directly to the zebrafish aqueous environment as well as human endothelial and mouse melanoma cell transplantation resulted in increased lymphatic capillary growth, while morpholino-based knockdown of vegfc and chemical inhibitors of lymphangiogenesis added to the aqueous environment resulted in decreased lymphatic capillary growth. Conclusion Lymphatic capillaries in embryonic and larval zebrafish can be quantified using late-phase microangiography. Human activators and small molecule inhibitors of lymphangiogenesis, as well as transplanted human endothelial and mouse melanoma cells, alter lymphatic capillary development in zebrafish. The ability to rapidly quantify changes in lymphatic growth under physiologic conditions will allow for broad screening of lymphangiogenesis modulators, as well as help define cellular roles and elucidate pathways of lymphatic development. PMID:23028871

  4. Quantifying stimulus discriminability: a comparison of information theory and ideal observer analysis.

    PubMed

    Thomson, Eric E; Kristan, William B

    2005-04-01

    Performance in sensory discrimination tasks is commonly quantified using either information theory or ideal observer analysis. These two quantitative frameworks are often assumed to be equivalent. For example, higher mutual information is said to correspond to improved performance of an ideal observer in a stimulus estimation task. To the contrary, drawing on and extending previous results, we show that five information-theoretic quantities (entropy, response-conditional entropy, specific information, equivocation, and mutual information) violate this assumption. More positively, we show how these information measures can be used to calculate upper and lower bounds on ideal observer performance, and vice versa. The results show that the mathematical resources of ideal observer analysis are preferable to information theory for evaluating performance in a stimulus discrimination task. We also discuss the applicability of information theory to questions that ideal observer analysis cannot address.
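
    The contrast between the two frameworks can be sketched by computing both quantities from the same joint stimulus-response distribution: mutual information from the standard formula, and ideal-observer performance as the accuracy of the maximum a posteriori (MAP) decoder. The joint table below is a made-up example.

```python
# Sketch: mutual information vs. ideal-observer (MAP) accuracy from one
# joint stimulus-response distribution.
import numpy as np

# p[s, r]: joint probability of stimulus s and response r
p = np.array([[0.30, 0.10],
              [0.05, 0.55]])

ps, pr = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
mi = np.sum(p * np.log2(p / (ps * pr)))            # mutual information, bits

# Ideal observer: for each response, guess the MAP stimulus.
acc = np.sum(p.max(axis=0))                        # probability correct

print(f"mutual information = {mi:.3f} bits, MAP accuracy = {acc:.2f}")
# Two joint distributions can order differently on these two quantities,
# which is the core of the paper's argument against treating them as
# equivalent.
```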

  5. Quantifying food intake in socially housed monkeys: social status effects on caloric consumption

    PubMed Central

    Wilson, Mark E.; Fisher, Jeff; Fischer, Andrew; Lee, Vanessa; Harris, Ruth B.; Bartness, Timothy J.

    2008-01-01

    Obesity results from a number of factors, including socio-environmental influences, and rodent models show that several different stressors increase the preference for calorically dense foods, leading to an obese phenotype. We present here a non-human primate model using socially housed adult female macaques living in long-term stable groups with access to diets of different caloric density. Consumption of a low-fat diet (LFD; 15% of calories from fat) and a high-fat diet (HFD; 45% of calories from fat) was quantified by means of a custom-built automated feeder that dispensed a pellet of food when activated by a radiofrequency chip implanted subcutaneously in the animal's wrist. Socially subordinate females showed indices of chronic psychological stress, having reduced glucocorticoid negative feedback and higher frequencies of anxiety-like behavior. Twenty-four-hour intakes of both the LFD and HFD were significantly greater in subordinates than in dominants, an effect that persisted whether standard monkey chow (13% of calories from fat) was present or absent. Furthermore, although dominants restricted their food intake to daylight, subordinates continued to feed at night. Total caloric intake was significantly correlated with body weight change. Collectively, these results show that food intake can be reliably quantified in non-human primates living in complex social environments and that socially subordinate females consume more calories, suggesting this ethologically relevant model may help explain how psychosocial stress changes food preferences and consumption, leading to obesity. PMID:18486158

  6. A novel real time imaging platform to quantify macrophage phagocytosis.

    PubMed

    Kapellos, Theodore S; Taylor, Lewis; Lee, Heyne; Cowley, Sally A; James, William S; Iqbal, Asif J; Greaves, David R

    2016-09-15

    Phagocytosis of pathogens, apoptotic cells and debris is a key feature of macrophage function in host defense and tissue homeostasis. Quantification of macrophage phagocytosis in vitro has traditionally been technically challenging. Here we report the optimization and validation of the IncuCyte ZOOM® real-time imaging platform for macrophage phagocytosis based on pHrodo® pathogen bioparticles, which only fluoresce when localized in the acidic environment of the phagolysosome. Image analysis and fluorescence quantification were performed with the automated IncuCyte™ Basic Software. Titration of the bioparticle number showed that the system is more sensitive than a spectrofluorometer, as it can detect phagocytosis when using 20× less E. coli bioparticles. We exemplified the power of this real-time imaging platform by studying phagocytosis of murine alveolar, bone marrow and peritoneal macrophages. We further demonstrate the ability of this platform to study modulation of the phagocytic process, as pharmacological inhibitors of phagocytosis suppressed bioparticle uptake in a concentration-dependent manner, whereas opsonins augmented phagocytosis. We also investigated the effects of macrophage polarization on E. coli phagocytosis. Bone marrow-derived macrophage (BMDM) priming with M2 stimuli, such as IL-4 and IL-10, resulted in higher engulfment of bioparticles in comparison with M1 polarization. Moreover, we demonstrated that tolerization of BMDMs with lipopolysaccharide (LPS) results in impaired E. coli bioparticle phagocytosis. This novel real-time assay will enable researchers to quantify macrophage phagocytosis with a higher degree of accuracy and sensitivity and will allow investigation of limited populations of primary phagocytes in vitro.

  7. Quantifying the sources of error in measurements of urine activity

    SciTech Connect

    Mozley, P.D.; Kim, H.J.; McElgin, W.

    1994-05-01

    Accurate scintigraphic measurements of radioactivity in the bladder and voided urine specimens can be limited by scatter, attenuation, and variations in the volume of urine that a given dose is distributed in. The purpose of this study was to quantify some of the errors that these problems can introduce. Transmission scans and 41 conjugate images of the bladder were sequentially acquired on a dual-headed camera over 24 hours in 6 subjects after the intravenous administration of 100-150 MBq (2.7-3.6 mCi) of a novel I-123 labeled benzamide. Renal excretion fractions were calculated by measuring the counts in conjugate images of 41 sequentially voided urine samples. A correction for scatter was estimated by comparing the count rates in images that were acquired with the photopeak centered on 159 keV and images that were made simultaneously with the photopeak centered on 126 keV. The decay- and attenuation-corrected geometric mean activities were compared to images of the net dose injected. Checks of the results were performed by measuring the total volume of each voided urine specimen and determining the activity in a 20 ml aliquot of it with a dose calibrator. Modeling verified the experimental results, which showed that 34% of the counts were attenuated when the bladder had been expanded to a volume of 300 ml. Corrections for attenuation that were based solely on the transmission scans were limited by the volume of non-radioactive urine in the bladder before the activity was administered. The attenuation of activity in images of the voided urine samples was dependent on the geometry of the specimen container. The images of urine in standard 300 ml laboratory specimen cups had 39 ± 5% fewer counts than images of the same samples laid out in 3 liter bedpans. Scatter through the carbon fiber table increased the number of counts in the images by an average of 14%.
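
    The general quantification scheme behind such measurements is conjugate-view counting with a transmission-based attenuation correction: the geometric mean of anterior and posterior counts is divided by the square root of the transmission factor. A sketch with hypothetical count values (the study's calibration against the net injected dose image is omitted):

```python
# Sketch: conjugate-view (geometric mean) quantification with a
# transmission-based attenuation correction.
import numpy as np

I_ant, I_post = 5200.0, 3100.0      # conjugate-view counts of the bladder
I_trans, I_open = 18000.0, 60000.0  # transmission counts: through patient / in air

T = I_trans / I_open                # transmission factor, exp(-mu * d)
counts_corr = np.sqrt(I_ant * I_post) / np.sqrt(T)
print(f"T = {T:.2f}, corrected geometric-mean counts = {counts_corr:.0f}")
# In the study, corrected counts are normalized to an image of the net
# injected dose; the abstract's point is that urine volume, container
# geometry and table scatter all perturb the terms entering this formula.
```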

  8. Quantifying Filopodia in Cultured Astrocytes by an Algorithm.

    PubMed

    Aumann, Georg; Friedländer, Felix; Thümmler, Matthias; Keil, Fabian; Brunkhorst, Robert; Korf, Horst-Werner; Derouiche, Amin

    2017-02-27

    Astrocytes in vivo extend thin processes termed peripheral astrocyte processes (PAPs), in particular around synapses, where they can mediate glia-neuronal communication. The relation of PAPs to synapses is not based on coincidence, but it is not clear which stimuli and mechanisms lead to their formation and are active during process extension/retraction in response to neuronal activity. Also, the molecular basis of the extremely fine PAP morphology (often 50 to 100 nm) is not understood. These open questions can best be investigated under in vitro conditions by studying glial filopodia. We have previously analyzed filopodial mechanisms (Lavialle et al. PNAS 108:12915) applying an automated method for filopodia morphometry, which is now described in greater detail. The Filopodia-Specific Shape Factor (FSSF) we developed integrates the number and length of filopodia. It quantifies filopodia independent of overall astrocytic shape or size, which can be intricate in itself. The algorithm supplied here permits automated image processing and measurements using ImageJ. Cells have to be sampled in higher numbers to obtain significant results. We validate the FSSF and characterize the systematic influence of thresholding and the camera pixel grid on measurements. We provide exemplary results of substance-induced filopodia dynamics (glutamate, mGluR agonists, EGF), and show that filopodia formation is highly sensitive to medium pH (CO2) and the duration of cell culture. Although the FSSF was developed to study astrocyte filopodia with a focus on the perisynaptic glial sheath, we expect that this parameter can also be applied to neuronal growth cones, non-neural cell types, or cell lines.

  9. Entropy Transfer between Residue Pairs and Allostery in Proteins: Quantifying Allosteric Communication in Ubiquitin

    PubMed Central

    2017-01-01

    It has recently been proposed by Gunasekaran et al. that allostery may be an intrinsic property of all proteins. Here, we develop a computational method that can determine and quantify allosteric activity in any given protein. Based on Schreiber's transfer entropy formulation, our approach leads to an information transfer landscape for the protein that shows the presence of entropy sinks and sources and explains how pairs of residues communicate with each other using entropy transfer. The model can identify the residues that drive the fluctuations of others. We apply the model to Ubiquitin, whose allosteric activity has not been emphasized until recently, and show that there are indeed systematic pathways of entropy and information transfer between residues that correlate well with the activities of the protein. We use 600-nanosecond molecular dynamics trajectories for Ubiquitin and its complex with human polymerase iota, evaluate entropy transfer between all pairs of residues of Ubiquitin, and quantify the binding susceptibility changes upon complex formation. We explain the complex formation propensities of Ubiquitin in terms of entropy transfer. Important residues taking part in allosteric communication in Ubiquitin predicted by our approach are in agreement with results of NMR relaxation dispersion experiments. Finally, we show that the time-delayed correlation of fluctuations of two interacting residues possesses an intrinsic causality that tells which residue controls the interaction and which one is controlled. Our work shows that time-delayed correlations, entropy transfer and causality are the required new concepts for explaining allosteric communication in proteins. PMID:28095404
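
    Schreiber's transfer entropy, the quantity underlying the method, can be sketched with a plug-in (histogram) estimator on two time series: TE(Y→X) = Σ p(x_{t+1}, x_t, y_t) log[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]. The sketch below is a generic binned estimator on synthetic signals, not the paper's per-residue-pair pipeline over MD trajectories; the bin count and history length of one step are assumptions.

```python
# Sketch: plug-in transfer entropy TE(Y -> X) for binned 1-step histories.
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """TE(Y -> X) in bits, for quantile-binned series x and y."""
    xb = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
    yb = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
    n = len(xb) - 1
    triples = Counter(zip(xb[1:], xb[:-1], yb[:-1]))   # (x_next, x_now, y_now)
    pairs_xx = Counter(zip(xb[1:], xb[:-1]))
    pairs_xy = Counter(zip(xb[:-1], yb[:-1]))
    singles = Counter(xb[:-1])
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        te += (c / n) * np.log2(c * singles[x0]
                                / (pairs_xx[(x1, x0)] * pairs_xy[(x0, y0)]))
    return te

rng = np.random.default_rng(5)
y = rng.standard_normal(5000)
x = np.roll(y, 1) + 0.5 * rng.standard_normal(5000)   # x driven by past of y
print(f"TE(y->x) = {transfer_entropy(x, y):.3f} bits, "
      f"TE(x->y) = {transfer_entropy(y, x):.3f} bits")
# The asymmetry of the two estimates identifies y as the driver, the same
# logic the paper uses to tell controlling from controlled residues.
```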

  10. Using multilevel models to quantify heterogeneity in resource selection

    USGS Publications Warehouse

    Wagner, T.; Diefenbach, D.R.; Christensen, S.A.; Norton, A.S.

    2011-01-01

    Models of resource selection are being used increasingly to predict or model the effects of management actions rather than simply quantifying habitat selection. Multilevel, or hierarchical, models are an increasingly popular method to analyze animal resource selection because they impose a relatively weak stochastic constraint to model heterogeneity in habitat use and also account for unequal sample sizes among individuals. However, few studies have used multilevel models to model coefficients as a function of predictors that may influence habitat use at different scales or to quantify differences in resource selection among groups. We used an example with white-tailed deer (Odocoileus virginianus) to illustrate how to model resource use as a function of distance to road that varies among deer by road density at the home range scale. We found that deer avoidance of roads decreased as road density increased. Also, we used multilevel models with sika deer (Cervus nippon) and white-tailed deer to examine whether resource selection differed between species. We failed to detect differences in resource use between these two species and showed how information-theoretic and graphical measures can be used to assess how resource use may have differed. Multilevel models can improve our understanding of how resource selection varies among individuals and provide an objective, quantifiable approach to assess differences or changes in resource selection. © The Wildlife Society, 2011.
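
    A hedged sketch of the multilevel structure described: a Gaussian linear mixed model (standing in for the logistic resource-selection model actually used) with a random distance-to-road slope per deer and a cross-level road-density term, fit on synthetic data with statsmodels. All values are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_deer, n_obs = 30, 50
road_density = rng.uniform(0.1, 2.0, n_deer)               # home-range covariate
slopes = 0.8 - 0.3 * road_density + rng.normal(0, 0.1, n_deer)  # avoidance weakens

df = pd.DataFrame({
    "deer": np.repeat(np.arange(n_deer), n_obs),
    "road_density": np.repeat(road_density, n_obs),
    "dist_road": rng.exponential(1.0, n_deer * n_obs),
})
df["use"] = 0.2 + np.repeat(slopes, n_obs) * df["dist_road"] \
            + rng.normal(0, 0.5, len(df))

m = smf.mixedlm("use ~ dist_road * road_density", df,
                groups=df["deer"], re_formula="~dist_road").fit()
print(m.params["dist_road:road_density"])  # cross-level effect of road density
```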

  11. Gains and Pitfalls of Quantifier Elimination as a Teaching Tool

    ERIC Educational Resources Information Center

    Oldenburg, Reinhard

    2015-01-01

    Quantifier Elimination is a procedure that allows simplification of logical formulas that contain quantifiers. Many mathematical concepts are defined in terms of quantifiers, and especially in calculus their use has been identified as an obstacle in the learning process. The automatic deduction provided by quantifier elimination thus allows…

  12. Quantifying the Impact of Scenic Environments on Health

    NASA Astrophysics Data System (ADS)

    Seresinhe, Chanuki Illushka; Preis, Tobias; Moat, Helen Susannah

    2015-11-01

    Few people would deny an intuitive sense of increased wellbeing when spending time in beautiful locations. Here, we ask: can we quantify the relationship between environmental aesthetics and human health? We draw on data from Scenic-Or-Not, a website that crowdsources ratings of “scenicness” for geotagged photographs across Great Britain, in combination with data on citizen-reported health from the Census for England and Wales. We find that inhabitants of more scenic environments report better health, across urban, suburban and rural areas, even when taking core socioeconomic indicators of deprivation into account, such as income, employment and access to services. Our results provide evidence in line with the striking hypothesis that the aesthetics of the environment may have quantifiable consequences for our wellbeing.
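
    A minimal sketch of the kind of adjusted association reported: ordinary least squares of area-level health on scenicness with deprivation controls, on synthetic data (all variable names and effect sizes below are illustrative, not the paper's).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500                                        # hypothetical census areas
scenic = rng.uniform(1, 10, n)                 # mean Scenic-Or-Not rating
controls = rng.normal(size=(n, 3))             # income, employment, access (standardized)
health = (0.5 + 0.02 * scenic
          + controls @ np.array([0.10, -0.05, 0.03])
          + rng.normal(0, 0.1, n))             # fraction reporting good health

X = np.column_stack([np.ones(n), scenic, controls])
beta, *_ = np.linalg.lstsq(X, health, rcond=None)
print("scenicness coefficient, net of deprivation controls:", beta[1])
```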

  13. Quantifying the Impact of Scenic Environments on Health

    PubMed Central

    Seresinhe, Chanuki Illushka; Preis, Tobias; Moat, Helen Susannah

    2015-01-01

    Few people would deny an intuitive sense of increased wellbeing when spending time in beautiful locations. Here, we ask: can we quantify the relationship between environmental aesthetics and human health? We draw on data from Scenic-Or-Not, a website that crowdsources ratings of “scenicness” for geotagged photographs across Great Britain, in combination with data on citizen-reported health from the Census for England and Wales. We find that inhabitants of more scenic environments report better health, across urban, suburban and rural areas, even when taking core socioeconomic indicators of deprivation into account, such as income, employment and access to services. Our results provide evidence in line with the striking hypothesis that the aesthetics of the environment may have quantifiable consequences for our wellbeing. PMID:26603464

  14. Analysis of subsurface temperature data to quantify groundwater recharge rates in a closed Altiplano basin, northern Chile

    NASA Astrophysics Data System (ADS)

    Kikuchi, C. P.; Ferré, T. P. A.

    2016-09-01

    Quantifying groundwater recharge is a fundamental part of groundwater resource assessment and management, and is requisite to determining the safe yield of an aquifer. Natural groundwater recharge in arid and semi-arid regions comprises several mechanisms: in-place, mountain-front, and mountain-block recharge. A field study was undertaken in a high-plain basin in the Altiplano region of northern Chile to quantify the magnitude of in-place and mountain-front recharge. Water fluxes corresponding to both recharge mechanisms were calculated using heat as a natural tracer. To quantify in-place recharge, time-series temperature data in cased boreholes were collected, and the annual fluctuation at multiple depths analyzed to infer the water flux through the unsaturated zone. To quantify mountain-front recharge, time-series temperature data were collected in perennial and ephemeral stream channels. Streambed thermographs were analyzed to determine the onset and duration of flow in ephemeral channels, and the vertical water fluxes into both perennial and ephemeral channels. The point flux estimates in streambeds and the unsaturated zone were upscaled to channel and basin-floor areas to provide comparative estimates of the range of volumetric recharge rates corresponding to each recharge mechanism. The results of this study show that mountain-front recharge is substantially more important than in-place recharge in this basin. The results further demonstrate the worth of time-series subsurface temperature data to characterize both in-place and mountain-front recharge processes.
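
    A hedged sketch of the heat-tracer logic for the borehole data: extract the annual temperature amplitude at two depths by harmonic regression and compare the observed damping depth with the conduction-only value. The example numbers and the diffusivity are assumptions, and the full analysis inverts a coupled conduction-advection model rather than this simple comparison.

```python
import numpy as np

def annual_amplitude(t_days, temp):
    """Amplitude of the annual harmonic via least-squares regression."""
    w = 2 * np.pi / 365.25
    X = np.column_stack([np.ones_like(t_days), np.cos(w * t_days), np.sin(w * t_days)])
    c, *_ = np.linalg.lstsq(X, temp, rcond=None)
    return np.hypot(c[1], c[2])

# Hypothetical thermographs at two depths in a cased borehole (z in metres)
t = np.arange(0.0, 730.0)
z1, z2 = 2.0, 6.0
T1 = 12 + 4.0 * np.cos(2 * np.pi * t / 365.25)
T2 = 12 + 1.5 * np.cos(2 * np.pi * t / 365.25 - 0.8)

A1, A2 = annual_amplitude(t, T1), annual_amplitude(t, T2)
zd_obs = (z2 - z1) / np.log(A1 / A2)                  # observed damping depth
kappa = 0.05                                          # thermal diffusivity, m2/day (assumed)
zd_cond = np.sqrt(2 * kappa / (2 * np.pi / 365.25))   # conduction-only damping depth
print(zd_obs, zd_cond)   # zd_obs > zd_cond suggests a downward (recharging) water flux
```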

  15. Quantifying the underlying landscape and paths of cancer.

    PubMed

    Li, Chunhe; Wang, Jin

    2014-11-06

    Cancer is a disease regulated by the underlying gene networks. The emergence of normal and cancer states, as well as the transformation between them, can be thought of as a result of the gene network interactions and associated changes. We developed a global potential landscape and path framework to quantify cancer and associated processes. We constructed a cancer gene regulatory network based on experimental evidence and uncovered the underlying landscape. The resulting tristable landscape characterizes important biological states: normal, cancer and apoptosis. The landscape topography, in terms of barrier heights between stable state attractors, quantifies the global stability of the cancer network system. We propose two mechanisms of cancerization: one is through changes of the landscape topography caused by changes in the regulation strengths of the gene networks; the other is through fluctuations that help the system to go over the critical barrier at fixed landscape topography. The kinetic paths from the least-action principle quantify the transition processes among the normal, cancer and apoptosis states. The kinetic rates provide the quantification of transition speeds among the normal, cancer and apoptosis attractors. By global sensitivity analysis of the gene network parameters on the landscape topography, we uncovered some key gene regulations determining the transitions between cancer and normal states. This can be used to guide the design of new anti-cancer tactics, through a cocktail strategy that targets multiple key regulation links simultaneously, for preventing cancer occurrence or transforming the early cancer state back to the normal state.
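
    A toy illustration of the landscape construction, assuming a one-dimensional bistable surrogate for the network: simulate Langevin dynamics, estimate the steady-state distribution, and read the quasi-potential U = -ln(P_ss) and a barrier height from it. The paper's landscape is multidimensional and network-based; this shows only the construction.

```python
import numpy as np

rng = np.random.default_rng(2)
force = lambda x: x - x**3        # bistable: attractors at x = -1 and x = +1
dt, D, n = 0.01, 0.4, 200_000
xs, x = np.empty(n), 0.0
for i in range(n):                # Euler-Maruyama simulation
    x += force(x) * dt + np.sqrt(2 * D * dt) * rng.normal()
    xs[i] = x

hist, edges = np.histogram(xs, bins=80, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
U = -np.log(hist + 1e-12)         # quasi-potential from the steady-state density
barrier = U[np.abs(centers) < 0.2].min() - U.min()
print(f"barrier height ~ {barrier:.2f} (in units of the noise strength D)")
```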

  16. Quantifying uncertainty in state and parameter estimation.

    PubMed

    Parlitz, Ulrich; Schumann-Bischoff, Jan; Luther, Stefan

    2014-05-01

    Observability of state variables and parameters of a dynamical system from an observed time series is analyzed and quantified by means of the Jacobian matrix of the delay coordinates map. For each state variable and each parameter to be estimated, a measure of uncertainty is introduced depending on the current state and parameter values, which allows us to identify regions in state and parameter space where the specific unknown quantity can(not) be estimated from a given time series. The method is demonstrated using the Ikeda map and the Hindmarsh-Rose model.
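
    A sketch of the core computation under stated assumptions: a finite-difference Jacobian of the delay-coordinates map for the Ikeda map, with the reciprocal of the smallest singular value used as a simple surrogate for the paper's uncertainty measure.

```python
import numpy as np

def ikeda(s, u=0.9):
    x, y = s
    t = 0.4 - 6.0 / (1.0 + x * x + y * y)
    return np.array([1.0 + u * (x * np.cos(t) - y * np.sin(t)),
                     u * (x * np.sin(t) + y * np.cos(t))])

def delay_map(s, n_obs=4):
    """(x0, y0) -> (s_0, ..., s_{n_obs-1}) with scalar observable s = x."""
    s = np.asarray(s, dtype=float)
    out = []
    for _ in range(n_obs):
        out.append(s[0])
        s = ikeda(s)
    return np.array(out)

def uncertainty(s, eps=1e-6):
    """Reciprocal of the smallest singular value of the delay-map Jacobian:
    large values flag states poorly constrained by the observed series."""
    J = np.column_stack([(delay_map(s + d) - delay_map(s)) / eps
                         for d in (np.array([eps, 0.0]), np.array([0.0, eps]))])
    return 1.0 / np.linalg.svd(J, compute_uv=False)[-1]

print(uncertainty(np.array([0.5, 0.5])))
```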

  17. Oxygen-enhanced MRI accurately identifies, quantifies, and maps tumor hypoxia in preclinical cancer models

    PubMed Central

    O’Connor, James PB; Boult, Jessica KR; Jamin, Yann; Babur, Muhammad; Finegan, Katherine G; Williams, Kaye J; Little, Ross A; Jackson, Alan; Parker, Geoff JM; Reynolds, Andrew R; Waterton, John C; Robinson, Simon P

    2015-01-01

    There is a clinical need for non-invasive biomarkers of tumor hypoxia for prognostic and predictive studies, radiotherapy planning and therapy monitoring. Oxygen enhanced MRI (OE-MRI) is an emerging imaging technique for quantifying the spatial distribution and extent of tumor oxygen delivery in vivo. In OE-MRI, the longitudinal relaxation rate of protons (ΔR1) changes in proportion to the concentration of molecular oxygen dissolved in plasma or interstitial tissue fluid. Therefore, well-oxygenated tissues show positive ΔR1. We hypothesized that the fraction of tumor tissue refractory to oxygen challenge (lack of positive ΔR1, termed “Oxy-R fraction”) would be a robust biomarker of hypoxia in models with varying vascular and hypoxic features. Here we demonstrate that OE-MRI signals are accurate, precise and sensitive to changes in tumor pO2 in highly vascular 786-0 renal cancer xenografts. Furthermore, we show that Oxy-R fraction can quantify the hypoxic fraction in multiple models with differing hypoxic and vascular phenotypes, when used in combination with measurements of tumor perfusion. Finally, Oxy-R fraction can detect dynamic changes in hypoxia induced by the vasomodulator agent hydralazine. In contrast, more conventional biomarkers of hypoxia (derived from blood oxygenation-level dependent MRI and dynamic contrast-enhanced MRI) did not relate to tumor hypoxia consistently. Our results show that the Oxy-R fraction accurately quantifies tumor hypoxia non-invasively and is immediately translatable to the clinic. PMID:26659574
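
    A minimal sketch of the biomarker's arithmetic on a synthetic voxel map; the zero threshold, the masking, and the pairing with perfusion measurements are simplifications of the published analysis.

```python
import numpy as np

def oxy_r_fraction(delta_r1, perfused_mask, threshold=0.0):
    """Fraction of perfused tumor voxels with no positive Delta-R1 on oxygen
    challenge (the refractory, i.e. hypoxic, fraction)."""
    return float(np.mean(delta_r1[perfused_mask] <= threshold))

rng = np.random.default_rng(3)
n = 10_000
refractory = rng.random(n) < 0.3                        # 30% hypoxic voxels
dr1 = np.where(refractory, rng.normal(-0.003, 0.001, n),
               rng.normal(0.010, 0.002, n))             # Delta-R1 in s^-1
print(oxy_r_fraction(dr1, np.ones(n, dtype=bool)))      # ~0.3
```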

  18. How to quantify coherence: Distinguishing speakable and unspeakable notions

    NASA Astrophysics Data System (ADS)

    Marvian, Iman; Spekkens, Robert W.

    2016-11-01

    Quantum coherence is a critical resource for many operational tasks. Understanding how to quantify and manipulate it also promises to have applications for a diverse set of problems in theoretical physics. For certain applications, however, one requires coherence between the eigenspaces of specific physical observables, such as energy, angular momentum, or photon number, and it makes a difference which eigenspaces appear in the superposition. For others, there is a preferred set of subspaces relative to which coherence is deemed a resource, but it is irrelevant which of the subspaces appear in the superposition. We term these two types of coherence unspeakable and speakable, respectively. We argue that a useful approach to quantifying and characterizing unspeakable coherence is provided by the resource theory of asymmetry when the symmetry group is a group of translations, and we translate a number of prior results on asymmetry into the language of coherence. We also highlight some of the applications of this approach, for instance, in the context of quantum metrology, quantum speed limits, quantum thermodynamics, and nuclear magnetic resonance (NMR). The question of how best to treat speakable coherence as a resource is also considered. We review a popular approach in terms of operations that preserve the set of incoherent states, propose an alternative approach in terms of operations that are covariant under dephasing, and we outline the challenge of providing a physical justification for either approach. Finally, we note some mathematical connections that hold among the different approaches to quantifying coherence.
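
    For the incoherent-operations ("speakable") framework the review discusses, one standard quantifier is the relative entropy of coherence; a minimal sketch:

```python
import numpy as np

def rel_entropy_coherence(rho):
    """C_r(rho) = S(diag(rho)) - S(rho), in bits, where S is the von Neumann
    entropy and diag() dephases in the fixed incoherent basis."""
    def S(r):
        p = np.linalg.eigvalsh(r)
        p = p[p > 1e-12]
        return -(p * np.log2(p)).sum()
    return S(np.diag(np.diag(rho))) - S(rho)

plus = np.full((2, 2), 0.5)          # |+><+|, a maximally coherent qubit state
print(rel_entropy_coherence(plus))   # -> 1.0 bit
```

    For unspeakable coherence, by contrast, this basis-diagonal dephasing would be replaced by averaging over the relevant translation group, which is the asymmetry-theoretic approach the authors advocate.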

  19. Precise thermal NDE for quantifying structural damage

    SciTech Connect

    Del Grande, N.K.; Durbin, P.F.

    1995-09-18

    The authors demonstrated a fast, wide-area, precise thermal NDE imaging system to quantify aircraft corrosion damage, such as percent metal loss, above a threshold of 5% with 3% overall uncertainties. The DBIR precise thermal imaging and detection method has been used successfully to characterize defect types, and their respective depths, in aircraft skins, and multi-layered composite materials used for wing patches, doublers and stiffeners. This precise thermal NDE inspection tool has long-term potential benefits to evaluate the structural integrity of airframes, pipelines and waste containers. They proved the feasibility of the DBIR thermal NDE imaging system to inspect concrete and asphalt-concrete bridge decks. As a logical extension to the successful feasibility study, they plan to inspect a concrete bridge deck from a moving vehicle to quantify the volumetric damage within the deck and the percent of the deck which has subsurface delaminations. Potential near-term benefits are in-service monitoring from a moving vehicle to inspect the structural integrity of the bridge deck. This would help prioritize the repair schedule for a reported 200,000 bridge decks in the US which need substantive repairs. Potential long-term benefits are affordable, and reliable, rehabilitation for bridge decks.

  20. Quantifying meta-correlations in financial markets

    NASA Astrophysics Data System (ADS)

    Kenett, Dror Y.; Preis, Tobias; Gur-Gershgoren, Gitit; Ben-Jacob, Eshel

    2012-08-01

    Financial markets are modular multi-level systems, in which the relationships between the individual components are not constant in time. Sudden changes in these relationships significantly affect the stability of the entire system, and vice versa. Our analysis is based on historical daily closing prices of the 30 components of the Dow Jones Industrial Average (DJIA) from March 15th, 1939 until December 31st, 2010. We quantify the correlation among these components by determining Pearson correlation coefficients, to investigate whether the mean correlation of the entire portfolio can be used as a precursor for changes in the index return. To this end, we quantify the meta-correlation - the correlation of mean correlation and index return. We find that changes in index returns are significantly correlated with changes in mean correlation. Furthermore, we study the relationship between the index return and correlation volatility - the standard deviation of correlations for a given time interval. This parameter provides further evidence of the effect of the index on market correlations and their fluctuations. Our empirical findings provide new information on, and quantification of, the index leverage effect, and have implications for risk management, portfolio optimization, and the stability of financial markets.
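
    A hedged sketch of the meta-correlation on synthetic returns: correlate the rolling mean pairwise correlation with an equal-weight index return (the paper uses the DJIA components and its own windowing conventions; the window length and factor model below are assumptions).

```python
import numpy as np

def meta_correlation(returns, window=22):
    """Correlation between the rolling mean pairwise correlation and the
    window-averaged, equal-weight index return. `returns`: (T, N) array."""
    T, N = returns.shape
    iu = np.triu_indices(N, k=1)
    mean_corr, idx_ret = [], []
    for t in range(window, T + 1):
        win = returns[t - window:t]
        mean_corr.append(np.corrcoef(win, rowvar=False)[iu].mean())
        idx_ret.append(win.mean())
    return np.corrcoef(mean_corr, idx_ret)[0, 1]

rng = np.random.default_rng(5)
market = rng.normal(0, 0.01, (1000, 1))                  # common factor
rets = 0.7 * market + rng.normal(0, 0.01, (1000, 30))    # 30 synthetic stocks
print(meta_correlation(rets))
```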

  1. Quantifying chemical reactions by using mixing analysis.

    PubMed

    Jurado, Anna; Vázquez-Suñé, Enric; Carrera, Jesús; Tubau, Isabel; Pujades, Estanislao

    2015-01-01

    This work is motivated by the need for a sound understanding of the chemical processes that affect organic pollutants in an urban aquifer. We propose an approach to quantify such processes using mixing calculations. The methodology consists of the following steps: (1) identification of the recharge sources (end-members) and selection of the species (conservative and non-conservative) to be used, (2) identification of the chemical processes and (3) evaluation of mixing ratios including the chemical processes. This methodology has been applied in the Besòs River Delta (NE Barcelona, Spain), where the River Besòs is the main aquifer recharge source. A total of 51 groundwater samples were collected from July 2007 to May 2010 during four field campaigns. Three river end-members were necessary to explain the temporal variability of the River Besòs: one from wet periods (W1) and two from dry periods (D1 and D2). The methodology proved useful not only to compute the mixing ratios but also to quantify processes such as calcite and magnesite dissolution, aerobic respiration and denitrification occurring at each observation point.
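
    A minimal sketch of step (3), assuming conservative species only: least-squares mixing ratios constrained to sum to one. Reactive species' deviations from the resulting mix would then quantify the chemical processes. The species, values, and the absence of a non-negativity constraint are simplifications.

```python
import numpy as np

def mixing_ratios(sample, end_members):
    """Least-squares mixing ratios from conservative species, with the
    ratios constrained to sum to one. Columns of `end_members` are sources."""
    n_em = end_members.shape[1]
    A = np.vstack([end_members, np.ones(n_em)])   # append sum-to-one row
    b = np.append(sample, 1.0)
    ratios, *_ = np.linalg.lstsq(A, b, rcond=None)
    return ratios

# Hypothetical: two conservative tracers (rows) and three river end-members (columns)
end_members = np.array([[120.0, 60.0, 90.0],      # e.g. Cl- (mg/L)
                        [950.0, 500.0, 700.0]])   # e.g. EC (uS/cm)
sample = np.array([87.0, 695.0])                  # corresponds to ~ (0.3, 0.4, 0.3)
print(mixing_ratios(sample, end_members))
```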

  2. Quantifying electrocardiogram RT-RR variability interactions.

    PubMed

    Porta, A; Baselli, G; Caiani, E; Malliani, A; Lombardi, F; Cerutti, S

    1998-01-01

    A dynamic linear parametric model is designed to quantify the dependence of ventricular repolarisation duration variability on heart period changes and other immeasurable factors. The model analyses the beat-to-beat series of the RR duration and of the interval between R- and T-wave apexes (RT period). Directly from these two signals, a parametric identification procedure and spectral decomposition techniques allow RT variability to be divided into RR-related and RR-unrelated parts and allow the RT-RR transfer function to be calculated. RT variability is driven by RR changes at low frequency (LF, around 0.1 Hz) and high frequency (HF, at the respiratory rate), whereas, at very low frequencies, the RR-unrelated contribution to the total RT variability is remarkable. During tilt at LF the RR-related RT percentage power increases (p < 0.02), the RR-unrelated RT percentage power remains unchanged, the gain of the RT-RR relationship largely increases (p < 0.001), and the phase is not significantly modified. Both the RR-related and the RR-unrelated RT percentage powers at LF are not affected by controlled respiration, and an increase in the RT-RR gain at HF is observed (p < 0.02). The proposed analysis may help to describe the regulation of the ventricular repolarisation process and to extract indexes quantifying the coupling between heart period and ventricular repolarisation interval changes.

  3. Quantifying and scaling airplane performance in turbulence

    NASA Astrophysics Data System (ADS)

    Richardson, Johnhenri R.

    This dissertation studies the effects of turbulent wind on airplane airspeed and normal load factor, determining how these effects scale with airplane size and developing envelopes to account for them. The results have applications in the design and control of aircraft, especially small-scale aircraft, for robustness with respect to turbulence. Using linearized airplane dynamics and the Dryden gust model, this dissertation presents analytical and numerical scaling laws for airplane performance in gusts, safety margins that guarantee, with specified probability, that steady flight can be maintained when stochastic wind gusts act upon an airplane, and envelopes to visualize these safety margins. Presented here for the first time are scaling laws for the phugoid natural frequency, phugoid damping ratio, airspeed variance in turbulence, and flight path angle variance in turbulence. The results show that small aircraft are more susceptible to high-frequency gusts, that the phugoid damping ratio does not depend directly on airplane size, that the airspeed and flight path angle variances can be parameterized by the ratio of the phugoid natural frequency to a characteristic turbulence frequency, and that the coefficient of variation of the airspeed decreases with increasing airplane size. Accompanying numerical examples validate the results using eleven different airplane models, focusing on NASA's hypothetical Boeing 757 analog, the Generic Transport Model, and its operational 5.5% scale model, the NASA T2. Also presented here for the first time are stationary flight, where the flight state is a stationary random process, and the stationary flight envelope, an adjusted steady flight envelope to visualize safety margins for stationary flight. The dissertation shows that driving the linearized airplane equations of motion with stationary, stochastic gusts results in stationary flight. It also shows how feedback control can enlarge the stationary flight envelope by alleviating
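
    A small illustration of why phugoid frequency scales with size, using the classical Lanchester approximation rather than the dissertation's own derivations; the airspeeds are hypothetical.

```python
import math

def phugoid_natural_frequency(airspeed_m_s, g=9.81):
    """Classical Lanchester approximation: omega_ph ~ sqrt(2) * g / V.
    Smaller aircraft fly slower, so their phugoid frequency is higher,
    pushing it toward the more energetic part of the gust spectrum."""
    return math.sqrt(2.0) * g / airspeed_m_s

for name, v in [("small-scale model (assumed 40 m/s)", 40.0),
                ("full-scale transport (assumed 230 m/s)", 230.0)]:
    print(name, round(phugoid_natural_frequency(v), 3), "rad/s")
```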

  4. Quantifying the Determinants of Evolutionary Dynamics Leading to Drug Resistance

    PubMed Central

    Chevereau, Guillaume; Dravecká, Marta; Batur, Tugce; Guvenek, Aysegul; Ayhan, Dilay Hazal; Toprak, Erdal; Bollenbach, Tobias

    2015-01-01

    The emergence of drug resistant pathogens is a serious public health problem. It is a long-standing goal to predict rates of resistance evolution and design optimal treatment strategies accordingly. To this end, it is crucial to reveal the underlying causes of drug-specific differences in the evolutionary dynamics leading to resistance. However, it remains largely unknown why the rates of resistance evolution via spontaneous mutations and the diversity of mutational paths vary substantially between drugs. Here we comprehensively quantify the distribution of fitness effects (DFE) of mutations, a key determinant of evolutionary dynamics, in the presence of eight antibiotics representing the main modes of action. Using precise high-throughput fitness measurements for genome-wide Escherichia coli gene deletion strains, we find that the width of the DFE varies dramatically between antibiotics and, contrary to conventional wisdom, for some drugs the DFE width is lower than in the absence of stress. We show that this previously underappreciated divergence in DFE width among antibiotics is largely caused by their distinct drug-specific dose-response characteristics. Unlike the DFE, the magnitude of the changes in tolerated drug concentration resulting from genome-wide mutations is similar for most drugs but exceptionally small for the antibiotic nitrofurantoin, i.e., mutations generally have considerably smaller resistance effects for nitrofurantoin than for other drugs. A population genetics model predicts that resistance evolution for drugs with this property is severely limited and confined to reproducible mutational paths. We tested this prediction in laboratory evolution experiments using the “morbidostat”, a device for evolving bacteria in well-controlled drug environments. Nitrofurantoin resistance indeed evolved extremely slowly via reproducible mutations—an almost paradoxical behavior since this drug causes DNA damage and increases the mutation rate. Overall

  5. Quantifying uncertainties in U.S. wildland fire emissions across space and time scales

    NASA Astrophysics Data System (ADS)

    Larkin, N. K.; Strand, T. T.; Raffuse, S. M.; Drury, S.

    2011-12-01

    Smoke from wildland fire is a growing concern as air quality regulations tighten and public acceptance declines. Wildland fire emissions inventories are not only important for understanding smoke impacts on air quality but also for quantifying sources of greenhouse gas emissions. Wildland fire emissions can be calculated using a number of models and methods. We present an overview of results from the Smoke and Emissions Model Intercomparison Project (SEMIP) describing uncertainties in calculations of U.S. wildland fire emissions across space and time scales, from single fires to annual national totals. Differences between emissions calculated from different models and systems, and between satellite algorithms and ground-based systems, are shown. The relative importance of uncertainties in fire size and available fuel data, consumption modeling techniques, and emission factors is compared and quantified and can be applied to various use cases that include air quality impact modeling and greenhouse gas accounting. The results of this work show where additional information and updated models can most improve wildland fire emission inventories.
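
    The bottom-up arithmetic whose inputs carry the uncertainties being compared, in the classic Seiler-and-Crutzen form (the numbers below are hypothetical):

```python
def fire_emissions_kg(area_ha, fuel_load_kg_ha, combustion_fraction, ef_g_kg):
    """Bottom-up emission estimate:
    emissions = burned area x fuel load x fraction consumed x emission factor.
    Fire size, fuels, consumption, and EF are exactly the terms whose
    uncertainties intercomparisons like SEMIP quantify."""
    biomass_burned_kg = area_ha * fuel_load_kg_ha * combustion_fraction
    return biomass_burned_kg * ef_g_kg / 1000.0          # kg of emitted species

# Hypothetical PM2.5 example: 1000 ha, 20 t/ha fuels, 40% consumed, EF 12 g/kg
print(fire_emissions_kg(1000, 20_000, 0.4, 12.0), "kg PM2.5")
```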

  6. Toward quantifying the deep Atlantic carbon storage increase during the last glaciation

    NASA Astrophysics Data System (ADS)

    Yu, J.; Menviel, L.; Jin, Z.

    2014-12-01

    Ice core records show that atmospheric CO2 concentrations during peak glacial time were ~30% lower than the levels during interglacial periods. The terrestrial biosphere carbon stock was likely reduced during glacials. Increased carbon storage in the deep ocean is thought to play an important role in lowering glacial atmospheric CO2. However, it has been challenging to quantify carbon storage changes in the deep ocean using existing proxy data. Here, we present deepwater carbonate ion reconstructions for a few locations in the deep Atlantic. These data allow us to estimate the minimum carbon storage increase in the deep Atlantic Ocean during the last glaciation. Our results show that, despite its relatively small volume, the deep Atlantic Ocean may contribute significantly to atmospheric CO2 variations at major climate transitions. Furthermore, our results suggest a strong coupling of ocean circulation and the carbon cycle in the deep Atlantic during the last glaciation.

  7. Quantifying voids effecting delamination in carbon/epoxy composites: static and fatigue fracture behavior

    NASA Astrophysics Data System (ADS)

    Hakim, I.; May, D.; Abo Ras, M.; Meyendorf, N.; Donaldson, S.

    2016-04-01

    In the present work, samples of carbon fiber/epoxy composites with different void levels were fabricated using a hand layup vacuum bagging process by varying the pressure. Thermal nondestructive methods (thermal conductivity measurement, pulse thermography, pulse phase thermography and lock-in thermography) and mechanical tests (mode I and mode II interlaminar fracture toughness) were conducted. Comparing the parameters resulting from the thermal nondestructive testing revealed that voids lead to reductions in thermal properties in all directions of the composites. The results of the mode I and mode II interlaminar fracture toughness tests showed that voids lead to reductions in interlaminar fracture toughness. The parameters resulting from thermal nondestructive testing were correlated with the results of mode I and mode II interlaminar fracture toughness, and the voids were thereby quantified.

  8. Methods for quantifying training in sprint kayak.

    PubMed

    Borges, Thiago Oliveira; Bullock, Nicola; Duff, Christine; Coutts, Aaron J

    2014-02-01

    The aims of this study were to determine the validity of the session rating of perceived exertion (session-RPE) method by comparing 3 different scales of perceived exertion with common measures of training load (TL). A secondary aim was to verify the relationship between TLs, fitness, and performance in Sprint Kayak athletes. After laboratory assessment of maximal oxygen uptake (V̇O2peak) and lactate threshold, the athletes performed on-water time trials over 200 and 1,000 m. Training load was quantified for external (distance and speed) and internal (session-RPE: 6-20, category ratio [CR]-10 and CR-100 scales; training impulse [TRIMP]; and individual TRIMP) measures. Ten (6 male, 4 female) well-trained junior Sprint Kayak athletes (age 17.1 ± 1.2 years; V̇O2peak 4.2 ± 0.7 L·min⁻¹) were monitored over a 7-week period. There were large-to-very large within-individual correlations between the session distance and the various heart rate (HR)- and RPE-based methods for quantifying TL (0.58-0.91). Correlations between the mean session speed and the various HR- and RPE-based methods for quantifying TL were small to large (0.12-0.50). The within-individual relationships between the various objective and subjective methods of internal TL were large to very large (0.62-0.94). Moderate-to-large inverse relationships were found between mean session-RPE TL and various aerobic fitness variables (-0.58 to -0.37). Large-to-very large relationships were found between mean session-RPE TL and on-water performance (0.57-0.75). In conclusion, session-RPE is a valid method for monitoring TL for junior Sprint Kayak athletes, regardless of the RPE scale used. The session-RPE TL relates to fitness and performance, supporting the use of session-RPE in Sprint Kayak training.
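
    The two load calculations at the center of the comparison, in their standard published forms (session-RPE after Foster; TRIMP after Banister); the inputs below are hypothetical.

```python
import math

def session_rpe_load(rpe, duration_min):
    """Foster's session-RPE training load (arbitrary units): RPE x minutes."""
    return rpe * duration_min

def banister_trimp(duration_min, hr_rest, hr_avg, hr_max, male=True):
    """Banister's TRIMP, one of the HR-based loads compared in the study,
    using the published sex-specific weighting constants."""
    dhr = (hr_avg - hr_rest) / (hr_max - hr_rest)
    return duration_min * dhr * 0.64 * math.exp((1.92 if male else 1.67) * dhr)

# Hypothetical 60-min session: CR-10 RPE of 7, mean HR 150 bpm
print(session_rpe_load(7, 60), round(banister_trimp(60, 50, 150, 195), 1))
```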

  9. Quantifying near-surface water exchange to assess hydrometeorological models

    NASA Astrophysics Data System (ADS)

    Parent, Annie-Claude; Anctil, François; Morais, Anne

    2013-04-01

    Modelling water exchange in the lower atmosphere-crop-soil system using hydrometeorological models allows estimation of actual evapotranspiration (ETa), a complex but critical quantity for numerous hydrological purposes, e.g. hydrological modelling and crop irrigation. This poster presents a summary of the hydrometeorological research activity conducted by our research group. The first purpose of this research is to quantify ETa and drainage of a rainfed potato crop located in South-Eastern Canada. Then, the outputs of the hydrometeorological models under study are compared with the observed turbulent fluxes. Afterwards, the sensitivity of the hydrometeorological models to different inputs is assessed for an environment under a changing climate. ETa was measured with micrometeorological instrumentation (CSAT3, Campbell SCI Inc.; Li7500, LiCor Inc.) and the eddy covariance technique. Near-surface soil heat flux and soil water content at different layers from 10 cm to 100 cm were also measured. Other parameters required by the hydrometeorological models were observed using standard meteorological instrumentation: shortwave and longwave solar radiation, wind speed, air temperature, atmospheric pressure and precipitation. The cumulative ETa during the growth season (123 days) was 331.5 mm, with a daily maximum of 6.5 mm at full coverage; precipitation was 350.6 mm, which is rather small compared with the historical mean (563.3 mm). This experiment allowed the calculation of crop coefficients that vary over the growth season for a rainfed potato crop. Land surface schemes such as CLASS (Canadian Land Surface Scheme) and c-ISBA (a Canadian version of the model Interaction Sol-Biosphère-Atmosphère) are 1-D physical hydrometeorological models that produce turbulent fluxes (including ETa) for a given crop. The schemes' performances were assessed for both the energy and water balance, based on the resulting turbulent fluxes and the given observations. CLASS showed
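
    A minimal sketch of the eddy-covariance ETa estimate, omitting the coordinate-rotation and density (WPL) corrections used in practice; the 20 Hz series below are synthetic.

```python
import numpy as np

def eddy_covariance_et(w, q, rho_air=1.2, period_s=1800.0):
    """Evapotranspiration from high-frequency vertical wind w (m/s) and
    specific humidity q (kg/kg): mass flux = rho_air * mean(w'q').
    1 kg m-2 of water = 1 mm, so flux x period gives mm per period."""
    w_p = w - w.mean()
    q_p = q - q.mean()
    flux = rho_air * np.mean(w_p * q_p)      # kg m-2 s-1
    return flux * period_s                   # mm per averaging period

rng = np.random.default_rng(4)
n = 20 * 60 * 30                             # 20 Hz over 30 minutes
w = rng.normal(0, 0.3, n)
q = 0.008 + 0.0004 * (w / 0.3) + rng.normal(0, 0.0002, n)   # moist updrafts
print(eddy_covariance_et(w, q), "mm per 30 min")
```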

  10. Obtaining Laws Through Quantifying Experiments: Justifications of Pre-service Physics Teachers in the Case of Electric Current, Voltage and Resistance

    NASA Astrophysics Data System (ADS)

    Mäntylä, Terhi; Hämäläinen, Ari

    2015-07-01

    The language of physics is mathematics, and physics ideas, laws and models describing phenomena are usually represented in mathematical form. Therefore, an understanding of how to navigate between phenomena and the models representing them in mathematical form is important for a physics teacher, so that the teacher can make physics understandable to students. Here, the focus is on "experimental mathematization," how laws are established through quantifying experiments. A sequence from qualitative experiments to mathematical formulations through quantifying experiments on electric current, voltage and resistance in pre-service physics teachers' laboratory reports is examined. The way students reason about and justify the mathematical formulation of the measurement results, and how they combine the treatment and presentation of empirical data with their justifications, are analyzed. The results show that pre-service physics teachers understand the basic idea of how quantifying experiments establish quantities and laws but are not able to argue for it in a justified manner.
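
    A worked example of the kind of quantifying experiment at issue: establishing U = R·I by fitting hypothetical voltage-current readings, the justification step being the choice of a line through the origin.

```python
import numpy as np

# Hypothetical measurement series for one resistor
current_A = np.array([0.10, 0.20, 0.30, 0.40, 0.50])
voltage_V = np.array([2.21, 4.38, 6.65, 8.78, 11.02])

# Fit U = R * I (a line through the origin, as the proportionality claim demands)
R, *_ = np.linalg.lstsq(current_A[:, None], voltage_V, rcond=None)
print(f"R = {R[0]:.2f} ohm")
```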

  11. Quantifying International Travel Flows Using Flickr

    PubMed Central

    Barchiesi, Daniele; Moat, Helen Susannah; Alis, Christian; Bishop, Steven; Preis, Tobias

    2015-01-01

    Online social media platforms are opening up new opportunities to analyse human behaviour on an unprecedented scale. In some cases, the fast, cheap measurements of human behaviour gained from these platforms may offer an alternative to gathering such measurements using traditional, time consuming and expensive surveys. Here, we use geotagged photographs uploaded to the photo-sharing website Flickr to quantify international travel flows, by extracting the location of users and inferring trajectories to track their movement across time. We find that Flickr based estimates of the number of visitors to the United Kingdom significantly correlate with the official estimates released by the UK Office for National Statistics, for 28 countries for which official estimates are calculated. Our findings underline the potential for indicators of key aspects of human behaviour, such as mobility, to be generated from data attached to the vast volumes of photographs posted online. PMID:26147500

  12. Quantifying the Anthropogenic Footprint in Eastern China

    NASA Astrophysics Data System (ADS)

    Meng, Chunlei; Dou, Youjun

    2016-04-01

    The urban heat island (UHI) is one of the main focuses of urban climate studies. The parameterization of anthropogenic heat (AH) is crucially important in UHI studies, but a universal method to parameterize the spatial pattern of AH is still lacking. This paper uses NOAA DMSP/OLS nighttime light data to parameterize the spatial pattern of AH. Two experiments were designed and performed to quantify the influence of AH on land surface temperature (LST) in eastern China and in 24 big cities. The annual mean heating caused by AH is up to 1 K in eastern China. This paper uses the relative LST differences, rather than the absolute LST differences, between the control run and the contrast run of the Common Land Model (CoLM) to find the drivers. The heating effect of the anthropogenic footprint has less influence on relatively warm and wet cities.
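
    A hedged sketch of the parameterization idea: scale an assumed maximum AH flux by normalized DMSP/OLS digital numbers. The linear form and the 60 W m-2 ceiling are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

def anthropogenic_heat(dn, ah_max=60.0):
    """Spatial AH pattern (W m-2) scaled by normalized DMSP/OLS night-light
    digital numbers (0-63). Linear scaling and ah_max are assumptions."""
    return ah_max * np.asarray(dn, dtype=float) / 63.0

print(anthropogenic_heat([0, 10, 55, 63]))   # dark rural -> bright urban core
```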

  13. Quantifying Power Grid Risk from Geomagnetic Storms

    NASA Astrophysics Data System (ADS)

    Homeier, N.; Wei, L. H.; Gannon, J. L.

    2012-12-01

    We are creating a statistical model of the geophysical environment that can be used to quantify the geomagnetic storm hazard to power grid infrastructure. Our model is developed using a database of surface electric fields for the continental United States during a set of historical geomagnetic storms. These electric fields are derived from the SUPERMAG compilation of worldwide magnetometer data and surface impedances from the United States Geological Survey. These electric field data can be combined with a power grid model to determine GICs per node and reactive MVARs at each minute during a storm. Using publicly available substation locations, we derive relative risk maps by location, combining magnetic latitude and ground conductivity. We also estimate the surface electric fields during the August 1972 geomagnetic storm that caused a telephone cable outage across the middle of the United States. This event produced the largest surface electric fields in the continental U.S. in at least the past 40 years.
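
    The per-node step described, in its standard Lehtinen-Pirjola form; the network coefficients below are hypothetical.

```python
def gic_node(ex_v_km, ey_v_km, a, b):
    """Quasi-DC geomagnetically induced current (A) at a substation node:
    GIC = a*Ex + b*Ey, where a and b encode the grid topology and
    resistances (Lehtinen-Pirjola formalism) and Ex, Ey are the storm-time
    geoelectric field components in V/km."""
    return a * ex_v_km + b * ey_v_km

# Hypothetical coefficients; 2 V/km northward and 0.5 V/km eastward field
print(gic_node(2.0, 0.5, a=40.0, b=-15.0), "A")
```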

  14. A colorimetric reaction to quantify fluid mixing

    NASA Astrophysics Data System (ADS)

    Oates, Peter M.; Harvey, Charles F.

    2006-11-01

    We found the colorimetric reaction of Tiron (1,2-dihydroxybenzene-3,5-disulfonic acid) and molybdate suitable for optical quantification of chemical reaction during fluid-fluid mixing in laboratory chambers. This reaction consists of two colorless reagents that mix to rapidly form colored, stable, soluble products. These products can be digitally imaged and quantified using light absorbance to study fluid-fluid mixing. Here we provide a model and equilibrium constants for the relevant complexation reactions. We also provide methods for relating light absorbance to product concentrations. Practical implementation issues of this reaction are discussed and an example of imaged absorbances for fluid-fluid mixing in heterogeneous porous media is given.
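
    The absorbance-to-concentration step rests on the Beer-Lambert law; a minimal sketch (the molar absorptivity and optics below are placeholders for the paper's calibrated values):

```python
import numpy as np

def product_concentration(intensity, intensity_ref, epsilon_l_mol_cm, path_cm=1.0):
    """Beer-Lambert inversion: A = -log10(I/I0) = eps * l * c  =>  c (mol/L).
    `intensity_ref` is the imaged intensity through reagent-free fluid."""
    absorbance = -np.log10(intensity / intensity_ref)
    return absorbance / (epsilon_l_mol_cm * path_cm)

print(product_concentration(0.42, 1.0, epsilon_l_mol_cm=5.7e3))  # ~6.6e-5 mol/L
```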

  15. Animal biometrics: quantifying and detecting phenotypic appearance.

    PubMed

    Kühl, Hjalmar S; Burghardt, Tilo

    2013-07-01

    Animal biometrics is an emerging field that develops quantified approaches for representing and detecting the phenotypic appearance of species, individuals, behaviors, and morphological traits. It operates at the intersection between pattern recognition, ecology, and information sciences, producing computerized systems for phenotypic measurement and interpretation. Animal biometrics can benefit a wide range of disciplines, including biogeography, population ecology, and behavioral research. Currently, real-world applications are gaining momentum, augmenting the quantity and quality of ecological data collection and processing. However, to advance animal biometrics will require integration of methodologies among the scientific disciplines involved. Such efforts will be worthwhile because the great potential of this approach rests with the formal abstraction of phenomics, to create tractable interfaces between different organizational levels of life.

  16. Quantifying fault recovery in multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Malek, Miroslaw; Harary, Frank

    1990-01-01

    Various aspects of reliable computing are formalized and quantified with emphasis on efficient fault recovery. The mathematical model which proves to be most appropriate is provided by the theory of graphs. New measures for fault recovery are developed, and the values of the elements of the fault recovery vector are observed to depend not only on the computation graph H and the architecture graph G, but also on the specific location of a fault. In the examples, a hypercube is chosen as a representative of parallel computer architecture, and a pipeline as a typical configuration for program execution. Dependability qualities of such a system are defined with or without a fault. These qualities are determined by the resiliency triple defined by three parameters: multiplicity, robustness, and configurability. Parameters for measuring the recovery effectiveness are also introduced in terms of distance, time, and the number of new, used, and moved nodes and edges.

  17. Quantifying forest visibility with spatial data.

    PubMed

    Wing, M G; Johnson, R

    2001-03-01

    We use spatial data representing transportation networks, elevation, stand height, and recreation use to construct and compare models of recreation use patterns and visibility in a forest. The recreation use pattern model depicts use frequencies along travel corridors. The visibility model quantifies visibility for all forest areas. We find that the models provide different but complementary types of information. Forest managers who are involved in scheduling harvest operations and want to address the visual concerns of forest visitors may benefit most from the visibility model. Managers who wish to know more about travel patterns or to reroute forest visitors affected by operations may benefit from the use pattern model. A combination of the two models has the highest potential for providing planning assistance in multiple-use forests. Both models may be able to enhance visual resource management (VRM) systems already in use by providing spatially explicit recreation use and visibility data.

  18. Quantifying International Travel Flows Using Flickr.

    PubMed

    Barchiesi, Daniele; Moat, Helen Susannah; Alis, Christian; Bishop, Steven; Preis, Tobias

    2015-01-01

    Online social media platforms are opening up new opportunities to analyse human behaviour on an unprecedented scale. In some cases, the fast, cheap measurements of human behaviour gained from these platforms may offer an alternative to gathering such measurements using traditional, time consuming and expensive surveys. Here, we use geotagged photographs uploaded to the photo-sharing website Flickr to quantify international travel flows, by extracting the location of users and inferring trajectories to track their movement across time. We find that Flickr based estimates of the number of visitors to the United Kingdom significantly correlate with the official estimates released by the UK Office for National Statistics, for 28 countries for which official estimates are calculated. Our findings underline the potential for indicators of key aspects of human behaviour, such as mobility, to be generated from data attached to the vast volumes of photographs posted online.

  19. Quantifying the Anthropogenic Footprint in Eastern China

    PubMed Central

    Meng, Chunlei; Dou, Youjun

    2016-01-01

    The urban heat island (UHI) is one of the main focuses of urban climate studies. The parameterization of anthropogenic heat (AH) is crucially important in UHI studies, but a universal method to parameterize the spatial pattern of AH is still lacking. This paper uses NOAA DMSP/OLS nighttime light data to parameterize the spatial pattern of AH. Two experiments were designed and performed to quantify the influence of AH on land surface temperature (LST) in eastern China and in 24 big cities. The annual mean heating caused by AH is up to 1 K in eastern China. This paper uses the relative LST differences, rather than the absolute LST differences, between the control run and the contrast run of the Common Land Model (CoLM) to find the drivers. The heating effect of the anthropogenic footprint has less influence on relatively warm and wet cities. PMID:27067132

  20. Quantifying the carcinogenicity of antineoplastic drugs.

    PubMed

    Kaldor, J M; Day, N E; Hemminki, K

    1988-04-01

    It has been well established that many of the drugs used in cancer therapy are themselves potentially carcinogenic. It is therefore important to quantify the carcinogenic risk associated with specific agents, and to investigate ways of predicting their risk from animal and in vitro studies. In this paper, an index of carcinogenic potency is defined, and applied to published data on acute non-lymphocytic leukemia following therapy with cytotoxic drugs used as single agents. Carcinogenic potency estimates for rats and mice are also obtained for 15 antineoplastic drugs, and the potency correlation between humans and rodents is examined for the five agents for which there are data in common. The broader implications for quantitative cancer risk prediction are discussed.

  1. How to quantify conduits in wood?

    PubMed

    Scholz, Alexander; Klepsch, Matthias; Karimi, Zohreh; Jansen, Steven

    2013-01-01

    Vessels and tracheids represent the most important xylem cells with respect to long-distance water transport in plants. Wood anatomical studies frequently provide several quantitative details of these cells, such as vessel diameter, vessel density, vessel element length, and tracheid length, while important information on the three-dimensional structure of the hydraulic network is not considered. This paper aims to provide an overview of various techniques; there is, however, no standard protocol to quantify conduits, owing to high anatomical variation and the wide range of techniques available. Despite recent progress in image analysis programs and automated methods for measuring cell dimensions, density, and spatial distribution, various characters remain time-consuming and tedious to quantify. Quantification of vessels and tracheids is not only important to better understand functional adaptations of tracheary elements to environmental parameters, but will also be essential for linking wood anatomy with other fields such as wood development, xylem physiology, palaeobotany, and dendrochronology.
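
    A hedged sketch of two of the routine measurements (vessel density and area-equivalent diameters) from an already-segmented cross-section, using scikit-image; the segmentation itself is the protocol-dependent step the review emphasizes, and is assumed done here.

```python
import numpy as np
from skimage import measure

def quantify_vessels(lumen_mask, mm2_per_px):
    """Vessel density (n/mm2) and area-equivalent diameters (um) from a
    binary cross-section image (True = conduit lumen)."""
    props = measure.regionprops(measure.label(lumen_mask))
    areas_mm2 = np.array([p.area for p in props]) * mm2_per_px
    diameters_um = 2000.0 * np.sqrt(areas_mm2 / np.pi)   # d = 2*sqrt(A/pi)
    density = len(props) / (lumen_mask.size * mm2_per_px)
    return density, diameters_um
```

    Even with a calibrated pixel size, vessel element and tracheid lengths still require maceration or 3D imaging, which is part of why no single standard protocol exists.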

  2. How to quantify conduits in wood?

    PubMed Central

    Scholz, Alexander; Klepsch, Matthias; Karimi, Zohreh; Jansen, Steven

    2013-01-01

    Vessels and tracheids represent the most important xylem cells with respect to long-distance water transport in plants. Wood anatomical studies frequently provide several quantitative details of these cells, such as vessel diameter, vessel density, vessel element length, and tracheid length, while important information on the three-dimensional structure of the hydraulic network is not considered. This paper aims to provide an overview of various techniques; there is, however, no standard protocol to quantify conduits, owing to high anatomical variation and the wide range of techniques available. Despite recent progress in image analysis programs and automated methods for measuring cell dimensions, density, and spatial distribution, various characters remain time-consuming and tedious to quantify. Quantification of vessels and tracheids is not only important to better understand functional adaptations of tracheary elements to environmental parameters, but will also be essential for linking wood anatomy with other fields such as wood development, xylem physiology, palaeobotany, and dendrochronology. PMID:23507674

  3. Quantifying creativity: can measures span the spectrum?

    PubMed

    Simonton, Dean Keith

    2012-03-01

    Because cognitive neuroscientists have become increasingly interested in the phenomenon of creativity, the issue arises of how creativity is to be optimally measured. Unlike intelligence, which can be assessed across the full range of intellectual ability, creativity measures tend to concentrate on different sections of the overall spectrum. After first defining creativity in terms of the three criteria of novelty, usefulness, and surprise, this article provides an overview of the available measures. Not only do these instruments vary according to whether they focus on the creative process, person, or product, but they also differ regarding whether they tap into "little-c" versus "Big-C" creativity; only productivity and eminence measures reach into genius-level manifestations of the phenomenon. The article closes by discussing whether various alternative assessment techniques can be integrated into a single measure that quantifies creativity across the full spectrum.

  4. Quantifying nonadditive selection caused by indirect ecological effects.

    PubMed

    TerHorst, Casey P; Lau, Jennifer A; Cooper, Idelle A; Keller, Kane R; La Rosa, Raffica J; Royer, Anne M; Schultheis, Elizabeth H; Suwa, Tomomi; Conner, Jeffrey K

    2015-09-01

    In natural biological communities, species interact with many other species. Multiple species interactions can lead to indirect ecological effects that have important fitness consequences and can cause nonadditive patterns of natural selection. Given that indirect ecological effects are common in nature, nonadditive selection may also be quite common. As a result, quantifying nonadditive selection resulting from indirect ecological effects may be critical for understanding adaptation in natural communities composed of many interacting species. We describe how to quantify the relative strength of nonadditive selection resulting from indirect ecological effects compared to the strength of pairwise selection. We develop a clear method for testing for nonadditive selection caused by indirect ecological effects and consider how it might affect adaptation in multispecies communities. We use two case studies to illustrate how our method can be applied to empirical data sets. Our results suggest that nonadditive selection caused by indirect ecological effects may be common in nature. Our hope is that trait-based approaches, combined with multifactorial experiments, will result in more estimates of nonadditive selection that reveal the relative importance of indirect ecological effects for evolution in a community context.

  5. Crisis of Japanese vascular flora shown by quantifying extinction risks for 1618 taxa.

    PubMed

    Kadoya, Taku; Takenaka, Akio; Ishihama, Fumiko; Fujita, Taku; Ogawa, Makoto; Katsuyama, Teruo; Kadono, Yasuro; Kawakubo, Nobumitsu; Serizawa, Shunsuke; Takahashi, Hideki; Takamiya, Masayuki; Fujii, Shinji; Matsuda, Hiroyuki; Muneda, Kazuo; Yokota, Masatsugu; Yonekura, Koji; Yahara, Tetsukazu

    2014-01-01

    Although many people have expressed alarm that we are witnessing a mass extinction, few projections have been quantified, owing to limited availability of time-series data on threatened organisms, especially plants. To quantify the risk of extinction, we need to monitor changes in population size over time for as many species as possible. Here, we present the world's first quantitative projection of plant species loss at a national level, with stochastic simulations based on the results of population censuses of 1618 threatened plant taxa in 3574 map cells of ca. 100 km2. More than 500 lay botanists helped monitor those taxa in 1994-1995 and in 2003-2004. We projected that between 370 and 561 vascular plant taxa will go extinct in Japan during the next century if past trends of population decline continue. This extinction rate is approximately two to three times the global rate. Using time-series data, we show that existing national protected areas (PAs) covering ca. 7% of Japan will not adequately prevent population declines: even core PAs can protect at best <60% of local populations from decline. Thus, the Aichi Biodiversity Target to expand PAs to 17% of land (and inland water) areas, as committed to by many national governments, is not enough: only 29.2% of currently threatened species will become non-threatened under the assumption that probability of protection success by PAs is 0.5, which our assessment shows is realistic. In countries where volunteers can be organized to monitor threatened taxa, censuses using our method should be able to quantify how fast we are losing species and to assess how effective current conservation measures such as PAs are in preventing species extinction.

  6. Crisis of Japanese Vascular Flora Shown By Quantifying Extinction Risks for 1618 Taxa

    PubMed Central

    Kadoya, Taku; Takenaka, Akio; Ishihama, Fumiko; Fujita, Taku; Ogawa, Makoto; Katsuyama, Teruo; Kadono, Yasuro; Kawakubo, Nobumitsu; Serizawa, Shunsuke; Takahashi, Hideki; Takamiya, Masayuki; Fujii, Shinji; Matsuda, Hiroyuki; Muneda, Kazuo; Yokota, Masatsugu; Yonekura, Koji; Yahara, Tetsukazu

    2014-01-01

    Although many people have expressed alarm that we are witnessing a mass extinction, few projections have been quantified, owing to limited availability of time-series data on threatened organisms, especially plants. To quantify the risk of extinction, we need to monitor changes in population size over time for as many species as possible. Here, we present the world's first quantitative projection of plant species loss at a national level, with stochastic simulations based on the results of population censuses of 1618 threatened plant taxa in 3574 map cells of ca. 100 km2. More than 500 lay botanists helped monitor those taxa in 1994–1995 and in 2003–2004. We projected that between 370 and 561 vascular plant taxa will go extinct in Japan during the next century if past trends of population decline continue. This extinction rate is approximately two to three times the global rate. Using time-series data, we show that existing national protected areas (PAs) covering ca. 7% of Japan will not adequately prevent population declines: even core PAs can protect at best <60% of local populations from decline. Thus, the Aichi Biodiversity Target to expand PAs to 17% of land (and inland water) areas, as committed to by many national governments, is not enough: only 29.2% of currently threatened species will become non-threatened under the assumption that probability of protection success by PAs is 0.5, which our assessment shows is realistic. In countries where volunteers can be organized to monitor threatened taxa, censuses using our method should be able to quantify how fast we are losing species and to assess how effective current conservation measures such as PAs are in preventing species extinction. PMID:24922311

  7. Quantifying uncertainty in observational rainfall datasets

    NASA Astrophysics Data System (ADS)

    Lennard, Chris; Dosio, Alessandro; Nikulin, Grigory; Pinto, Izidine; Seid, Hussen

    2015-04-01

    rainfall datasets available over Africa on monthly, daily and sub-daily time scales as appropriate to quantify spatial and temporal differences between the datasets. We find regional wet and dry biases between datasets (using the ensemble mean as a reference), with generally larger biases in reanalysis products. Rainfall intensity is poorly represented in some datasets, which demonstrates that some datasets should not be used for rainfall intensity analyses. Using 10 CORDEX models, we show that in east Africa the spread between observed datasets is often similar to the spread between models. We recommend that specific observational rainfall datasets be used for specific investigations and also that, where many datasets are applicable to an investigation, a probabilistic view be adopted for rainfall studies over Africa.

  8. Quantifying the complexities of Saccharomyces cerevisiae's ecosystem engineering via fermentation.

    PubMed

    Goddard, Matthew R

    2008-08-01

    The theory of niche construction suggests that organisms may engineer environments via their activities. Despite the potential of this phenomenon being realized by Darwin, the capability of niche construction to unite ecological and evolutionary biology in general has never been empirically quantified. Here I quantify the fitness effects of Saccharomyces cerevisiae's ecosystem engineering in a natural ferment in order to understand the interaction between ecological and evolutionary processes. I show that S. cerevisiae eventually dominates in fruit niches, where it is naturally initially rare, by modifying the environment through fermentation (the Crabtree effect) in ways which extend beyond just ethanol production. These data show that an additional cause of S. cerevisiae's competitive advantage over the other yeasts in the community is the production of heat via fermentation. Even though fermentation is less energetically efficient than respiration, it seems that this trait has been selected for because its net effect provides roughly a 7% fitness advantage over the other members of the community. These data provide an elegant example of niche construction because this trait clearly modifies the environment and therefore the selection pressures to which S. cerevisiae, and other organisms that access the fruit resource, including humans, are exposed.
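
    A toy illustration of what a sustained ~7% fitness advantage does to an initially rare lineage, using standard haploid selection dynamics (the initial frequency and generation count are assumptions, not the paper's data):

```python
# Two-strain competition: frequency update p' = p(1+s) / (1 + s*p)
s, p = 0.07, 0.001          # ~7% advantage; fermenter starts rare
for generation in range(200):
    p = p * (1 + s) / (1 + s * p)
print(round(p, 3))           # the initially rare fermenter approaches fixation
```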

  9. Quantifiers More or Less Quantify On-Line: ERP Evidence for Partial Incremental Interpretation

    ERIC Educational Resources Information Center

    Urbach, Thomas P.; Kutas, Marta

    2010-01-01

    Event-related brain potentials were recorded during RSVP reading to test the hypothesis that quantifier expressions are incrementally interpreted fully and immediately. In sentences tapping general knowledge ("Farmers grow crops/worms as their primary source of income"), Experiment 1 found larger N400s for atypical ("worms") than typical objects…

  10. 3D Wind: Quantifying wind speed and turbulence intensity

    NASA Astrophysics Data System (ADS)

    Barthelmie, R. J.; Pryor, S. C.; Wang, H.; Crippa, P.

    2013-12-01

    Integrating measurements and modeling of wind characteristics for wind resource assessment and wind farm control is increasingly challenging as the scale of wind farms increases. Even offshore or in relatively homogeneous landscapes, there are significant gradients of both wind speed and turbulence intensity on scales that typify large wind farms. Our project is, therefore, focused on (i) improving methods to integrate remote sensing and in situ measurements with model simulations to produce a 3-dimensional view of the flow field on wind farm scales and (ii) investigating important controls on the spatiotemporal variability of flow fields within the coastal zone. The instrument suite deployed during the field experiments includes: 3-D sonic and cup anemometers deployed on meteorological masts and buoys, anemometers deployed on tethersondes and an Unmanned Aerial Vehicle, multiple vertically pointing continuous-wave lidars and scanning Doppler lidars. We also integrate data from satellite-borne instrumentation - specifically synthetic aperture radar and scatterometers - and output from the Weather Research and Forecasting (WRF) model. Spatial wind fields and vertical profiles of wind speed from WRF and from the full in situ observational suite exhibited excellent agreement in a proof-of-principle experiment conducted in northern Indiana, particularly during convective conditions, but showed some discrepancies during the breakdown of the nocturnal stable layer. Our second experiment, in May 2013, focused on triangulating a volume above an area of coastal water extending from the port in Cleveland out to an offshore water intake crib (about 5 km) and back to the coast, and includes extremely high resolution WRF simulations designed to characterize the coastal zone. Vertically pointing continuous-wave lidars were operated at each apex of the triangle, while the scanning Doppler lidar scanned out across the water over a 90-degree azimuth range. Preliminary results pertaining to

  11. Quantifying touch feel perception: tribological aspects

    NASA Astrophysics Data System (ADS)

    Liu, X.; Yue, Z.; Cai, Z.; Chetwynd, D. G.; Smith, S. T.

    2008-08-01

    We report a new investigation into how surface topography and friction affect human touch-feel perception. In contrast with previous work based on micro-scale mapping of surface mechanical and tribological properties, this investigation focuses on the direct measurement of the friction generated when a fingertip is stroked on a test specimen. A special friction apparatus was built for the in situ testing, based on a linear flexure mechanism with both contact force and frictional force measured simultaneously. Ten specimens, already independently assessed in a 'perception clinic', with materials including natural wood, leather, engineered plastics and metal, were tested and the results compared with the perceived rankings. Because surface geometrical features are suspected to play a significant role in perception, a second set of samples, all of one material, was prepared and tested in order to minimize the influence of properties such as hardness and thermal conductivity. To minimize subjective effects, all specimens were also tested in a roller-on-block configuration based upon the same friction apparatus, with the roller materials being steel, brass and rubber. This paper reports the detailed design and instrumentation of the friction apparatus, the experimental set-up and the friction test results. Attempts have been made to correlate the measured properties and the perceived feelings for both roughness and friction. The results show that the measured roughness and friction coefficient both have a strong correlation with the rough-smooth and grippy-slippery feelings.

  12. Quantifying the Cosmic Web using the Shapefinder diagnostic

    NASA Astrophysics Data System (ADS)

    Sarkar, Prakash

    2016-10-01

    One of the most successful methods for quantifying the structures in the Cosmic Web is the Minkowski Functionals. In 3D there are four Minkowski Functionals: area, volume, integrated mean curvature and integrated Gaussian curvature. Defining the Minkowski Functionals requires a surface, so we have developed a method based on the Marching Cubes 33 algorithm to generate a surface from discrete data sets. We then calculate the Minkowski Functionals and Shapefinders from the triangulated polyhedral surface. Applying this methodology to different data sets, we obtain interesting results related to the geometry, morphology and topology of the large-scale structure.
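
    A minimal sketch of the surface-based computation, assuming a closed, consistently oriented triangle mesh of the kind Marching Cubes 33 produces (this is not the authors' code; only the two algebraic functionals are shown, since the curvature integrals additionally require edge dihedral angles):

```python
import numpy as np

def area_and_volume(vertices, faces):
    """Surface area and enclosed volume, two of the four 3D Minkowski
    functionals, from a closed triangulated surface. `vertices` is an
    (N, 3) float array; `faces` an (M, 3) index array with consistent
    outward orientation."""
    v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
    area = 0.5 * np.linalg.norm(np.cross(v1 - v0, v2 - v0), axis=1).sum()
    # Divergence theorem: each triangle contributes one signed tetrahedron.
    volume = abs(np.einsum('ij,ij->i', v0, np.cross(v1, v2)).sum()) / 6.0
    return area, volume

# The Shapefinder "thickness" T = 3V/S follows directly from these two;
# the remaining Shapefinders also need the integrated mean curvature.
```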

  13. Quantifying Subsidence in the 1999-2000 Arctic Winter Vortex

    NASA Technical Reports Server (NTRS)

    Greenblatt, Jeffery B.; Jost, Hans-juerg; Loewenstein, Max; Podolske, James R.; Bui, T. Paul; Elkins, James W.; Moore, Fred L.; Ray, Eric A.; Sen, Bhaswar; Margitan, James J.; Hipskind, R. Stephen (Technical Monitor)

    2000-01-01

    Quantifying the subsidence of the polar winter stratospheric vortex is essential to the analysis of ozone depletion, as chemical destruction often occurs against a large, altitude-dependent background ozone concentration. Using N2O measurements made during SOLVE on a variety of platforms (ER-2, in-situ balloon and remote balloon), the 1999-2000 Arctic winter subsidence is determined from N2O-potential temperature correlations along several N2O isopleths. The subsidence rates are compared to those determined in other winters, and comparison is also made with results from the SLIMCAT stratospheric chemical transport model.

  14. A mass-balance model to separate and quantify colloidal and solute redistributions in soil

    USGS Publications Warehouse

    Bern, C.R.; Chadwick, O.A.; Hartshorn, A.S.; Khomo, L.M.; Chorover, J.

    2011-01-01

    Studies of weathering and pedogenesis have long used calculations based upon low-solubility index elements to determine mass gains and losses in open systems. One of the questions currently unanswered in these settings is the degree to which mass is transferred in solution (solutes) versus suspension (colloids). Here we show that differential mobility of the low-solubility, high field strength (HFS) elements Ti and Zr can trace colloidal redistribution, and we present a model for distinguishing between mass transfer in suspension and solution. The model is tested on a well-differentiated granitic catena located in Kruger National Park, South Africa. Ti and Zr ratios from parent material, soil and colloidal material are substituted into a mixing equation to quantify colloidal movement. The results show zones of both colloid removal and augmentation along the catena. Colloidal losses of 110 kg m-2 (-5% relative to parent material) are calculated for one eluviated soil profile. A downslope illuviated profile has gained 169 kg m-2 (10%) colloidal material. Elemental losses by mobilization in true solution are ubiquitous across the catena, even in zones of colloidal accumulation, and range from 1418 kg m-2 (-46%) for an eluviated profile to 195 kg m-2 (-23%) at the bottom of the catena. Quantification of simultaneous mass transfers in solution and suspension provides greater specificity on processes within soils and across hillslopes. Additionally, because colloids include both HFS and other elements, the ability to quantify their redistribution has implications for standard calculations of soil mass balances using such index elements. © 2011.
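
    The abstract does not give the exact form of the mixing equation; a minimal two-endmember sketch on Ti/Zr ratios, with illustrative numbers only, might look like this:

```python
def colloid_fraction(r_soil, r_parent, r_colloid):
    """Two-endmember mixing on Ti/Zr ratios (each r_* is a Ti/Zr ratio):
    fraction of soil mass attributable to colloidal addition (positive)
    or, when negative, colloidal removal. The published model also uses
    conventional mass-balance (tau/strain) terms not shown here."""
    return (r_soil - r_parent) / (r_colloid - r_parent)

# Illustrative values only, not data from the study:
print(colloid_fraction(r_soil=18.0, r_parent=20.0, r_colloid=10.0))  # 0.2
```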

  15. Using nitrate to quantify quick flow in a karst aquifer

    USGS Publications Warehouse

    Mahler, B.J.; Garner, B.D.

    2009-01-01

    In karst aquifers, contaminated recharge can degrade spring water quality, but quantifying the rapid recharge (quick flow) component of spring flow is challenging because of its temporal variability. Here, we investigate the use of nitrate in a two-endmember mixing model to quantify quick flow in Barton Springs, Austin, Texas. Historical nitrate data from recharging creeks and Barton Springs were evaluated to determine a representative nitrate concentration for the aquifer water endmember (1.5 mg/L) and the quick flow endmember (0.17 mg/L for nonstormflow conditions and 0.25 mg/L for stormflow conditions). Under nonstormflow conditions for 1990 to 2005, model results indicated that quick flow contributed from 0% to 55% of spring flow. The nitrate-based two-endmember model was applied to the response of Barton Springs to a storm and results compared to those produced using the same model with δ18O and specific conductance (SC) as tracers. Additionally, the mixing model was modified to allow endmember quick flow values to vary over time. Of the three tracers, nitrate appears to be the most advantageous because it is conservative and because the difference between the concentrations in the two endmembers is large relative to their variance. The δ18O-based model was very sensitive to variability within the quick flow endmember, and SC was not conservative over the timescale of the storm response. We conclude that a nitrate-based two-endmember mixing model might provide a useful approach for quantifying the temporally variable quick flow component of spring flow in some karst systems. © 2008 National Ground Water Association.
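
    The two-endmember mixing model reduces to one line; a sketch using the nonstormflow endmember concentrations quoted above (the stormflow case would use 0.25 mg/L for the quick flow endmember):

```python
def quick_flow_fraction(c_spring, c_aquifer=1.5, c_quick=0.17):
    """Solve c_spring = f*c_quick + (1 - f)*c_aquifer for f, the
    quick-flow fraction of spring discharge. Defaults are the
    nonstormflow endmember nitrate concentrations (mg/L)."""
    return (c_aquifer - c_spring) / (c_aquifer - c_quick)

print(quick_flow_fraction(0.9))  # ~0.45, i.e. ~45% quick flow
```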

  16. Quantifying Volume of Groundwater in High Elevation Meadows

    NASA Astrophysics Data System (ADS)

    Ciruzzi, D.; Lowry, C.

    2013-12-01

    Assessing the current and future water needs of high elevation meadows depends on quantifying the volume of groundwater stored within the meadow sediment. As groundwater-dependent ecosystems, these meadows rely on their ability to capture and store water in order to support ecologic function and base flow to streams. Previous research on these meadows simplified storage by assuming a homogeneous reservoir of constant thickness. These previous storage models were able to close the water mass balance, but it is unclear whether their assumptions will hold under future anthropogenic impacts, such as increased air temperature resulting in drier and longer growing seasons. Applying a geophysical approach, ground-penetrating radar was used at Tuolumne Meadows, CA to qualitatively and quantitatively identify the controls on the volume of groundwater storage. From the geophysical results, a three-dimensional model of Tuolumne Meadows was created, which identified meadow thickness and bedrock geometry. This physical model was used in a suite of numerical models simulating high elevation meadows in order to quantify the volume of groundwater stored, with temporal and spatial variability. Modeling efforts tested both wet and dry water years in order to quantify the variability in the volume of groundwater storage for a range of aquifer properties. Each model was evaluated on seasonal depth to water in order to assess a particular scenario's ability to support ecological function and base flow. Depending on the simulated meadow's ability or inability to support its ecosystem, each representative meadow was categorized as successful or unsuccessful. Restoration techniques to increase active storage volume were suggested for unsuccessful meadows.

  17. A new way of quantifying diagnostic information from multilead electrocardiogram for cardiac disease classification

    PubMed Central

    Sharma, L.N.; Dandapat, S.

    2014-01-01

    A new measure for quantifying diagnostic information from a multilead electrocardiogram (MECG) is proposed. This diagnostic measure is based on principal component (PC) multivariate multiscale sample entropy (PMMSE). The PC analysis is used to reduce the dimension of the MECG data matrix. The multivariate multiscale sample entropy is evaluated over the PC matrix. The PMMSE values along each scale are used as a diagnostic feature vector. The performance of the proposed measure is evaluated using a least-squares support vector machine classifier for detection and classification of normal (healthy control) and different cardiovascular diseases such as cardiomyopathy, cardiac dysrhythmia, hypertrophy and myocardial infarction. The results show that the cardiac diseases are successfully detected and classified with an average accuracy of 90.34%. Comparison with some of the recently published methods shows improved performance of the proposed measure for cardiac disease classification. PMID:26609392
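
    A simplified sketch of the PMMSE pipeline, assuming an (n_leads, n_samples) data matrix; the paper's multivariate multiscale sample entropy is approximated here by averaging univariate sample entropies over the retained principal components:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: negative log of the conditional probability that
    windows matching for m points (within tolerance r*std) also match
    at m+1 points. O(n^2) memory, so intended for short records."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def match_pairs(mm):
        win = np.lib.stride_tricks.sliding_window_view(x, mm)
        dist = np.abs(win[:, None, :] - win[None, :, :]).max(axis=-1)
        return ((dist <= tol).sum() - len(win)) / 2  # unordered pairs

    b, a = match_pairs(m), match_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def pmmse_features(mecg, n_pcs=3, scales=range(1, 6)):
    """PCA-reduce the lead dimension, coarse-grain each PC series at
    each scale, and average the per-PC sample entropies per scale."""
    centered = mecg - mecg.mean(axis=1, keepdims=True)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    pcs = vt[:n_pcs]                      # unit-norm PC time series
    feats = []
    for tau in scales:
        n = (pcs.shape[1] // tau) * tau
        coarse = pcs[:, :n].reshape(n_pcs, -1, tau).mean(axis=2)
        feats.append(np.mean([sample_entropy(pc) for pc in coarse]))
    return np.array(feats)
```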

  18. Quantifying the complexity of the delayed logistic map.

    PubMed

    Masoller, Cristina; Rosso, Osvaldo A

    2011-01-28

    Statistical complexity measures are used to quantify the degree of complexity of the delayed logistic map, with linear and nonlinear feedback. We employ two methods for calculating the complexity measures, one with the 'histogram-based' probability distribution function and the other with ordinal patterns. We show that these methods provide complementary information about the complexity of the delay-induced dynamics: there are parameter regions where the histogram-based complexity is zero while the ordinal-pattern complexity is not, and vice versa. We also show that the time series generated by the nonlinear delayed logistic map can have zero missing or forbidden patterns, i.e. all possible ordinal patterns are realized in its orbits.
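
    A short sketch of the ordinal-pattern bookkeeping; the quadratic delayed-logistic form used to generate the series is an assumption for illustration, as is the pattern order:

```python
import numpy as np
from math import factorial, log

def ordinal_pattern_stats(x, order=4):
    """Map each length-`order` window to its ordinal pattern (the
    permutation that sorts it), then report the number of 'missing'
    patterns that never occur and the normalized permutation entropy."""
    win = np.lib.stride_tricks.sliding_window_view(np.asarray(x, float), order)
    patterns, counts = np.unique(np.argsort(win, axis=1), axis=0,
                                 return_counts=True)
    probs = counts / counts.sum()
    entropy = -(probs * np.log(probs)).sum() / log(factorial(order))
    missing = factorial(order) - len(patterns)
    return missing, entropy

# Delayed logistic map; this delayed quadratic form is an assumption,
# not necessarily the exact map studied in the paper.
r, x = 2.1, [0.6, 0.6]
for _ in range(10_000):
    x.append(r * x[-1] * (1.0 - x[-2]))
print(ordinal_pattern_stats(x[100:], order=4))
```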

  19. The Allostery Landscape: Quantifying Thermodynamic Couplings in Biomolecular Systems

    PubMed Central

    2016-01-01

    Allostery plays a fundamental role in most biological processes. However, little theory is available to describe it outside of two-state models. Here we use a statistical mechanical approach to show that the allosteric coupling between two collective variables is not a single number, but instead a two-dimensional thermodynamic coupling function that is directly related to the mutual information from information theory and the copula density function from probability theory. On this basis, we demonstrate how to quantify the contribution of specific energy terms to this thermodynamic coupling function, enabling an approximate decomposition that reveals the mechanism of allostery. We illustrate the thermodynamic coupling function and its use by showing how allosteric coupling in the alanine dipeptide molecule contributes to the overall shape of the Φ/Ψ free energy surface, and by identifying the interactions that are necessary for this coupling. PMID:27766843
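
    The histogram step underlying such an analysis can be sketched as a mutual-information estimate between two sampled collective variables; this yields a single summary number rather than the paper's full two-dimensional coupling function:

```python
import numpy as np

def mutual_information(x, y, bins=32):
    """Histogram estimate of I(X;Y) in nats between two collective
    variables sampled along a trajectory."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return (pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum()

# Example: two weakly coupled Gaussian variables. The analytic MI for
# this correlation is about 0.11 nats; the histogram estimate is close,
# with a small upward bias from finite binning.
rng = np.random.default_rng(0)
x = rng.normal(size=50_000)
y = 0.5 * x + rng.normal(size=50_000)
print(mutual_information(x, y))
```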

  20. Quantifying Speech Rhythm Abnormalities in the Dysarthrias

    PubMed Central

    Liss, Julie M.; White, Laurence; Mattys, Sven L.; Lansford, Kaitlin; Lotto, Andrew J.; Spitzer, Stephanie M.; Caviness, John N.

    2013-01-01

    Purpose In this study, the authors examined whether rhythm metrics capable of distinguishing languages with high and low temporal stress contrast also can distinguish among control and dysarthric speakers of American English with perceptually distinct rhythm patterns. Methods Acoustic measures of vocalic and consonantal segment durations were obtained for speech samples from 55 speakers across 5 groups (hypokinetic, hyperkinetic, flaccid-spastic, ataxic dysarthrias, and controls). Segment durations were used to calculate standard and new rhythm metrics. Discriminant function analyses (DFAs) were used to determine which sets of predictor variables (rhythm metrics) best discriminated between groups (control vs. dysarthrias; and among the 4 dysarthrias). A cross-validation method was used to test the robustness of each original DFA. Results The majority of classification functions were more than 80% successful in classifying speakers into their appropriate group. New metrics that combined successive vocalic and consonantal segments emerged as important predictor variables. DFAs pitting each dysarthria group against the combined others resulted in unique constellations of predictor variables that yielded high levels of classification accuracy. Conclusions This study confirms the ability of rhythm metrics to distinguish control speech from dysarthrias and to discriminate dysarthria subtypes. Rhythm metrics show promise for use as a rational and objective clinical tool. PMID:19717656
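
    The standard rhythm metrics referred to here are well documented in the prosody literature; a sketch computing %V, deltaV, deltaC and the vocalic nPVI from segment durations (not the authors' code):

```python
import numpy as np

def rhythm_metrics(voc, cons):
    """Standard rhythm metrics from lists of vocalic and consonantal
    segment durations in seconds: %V (vocalic proportion of speech
    time), the interval standard deviations deltaV and deltaC, and the
    normalized pairwise variability index (nPVI) of vocalic intervals."""
    voc, cons = np.asarray(voc, float), np.asarray(cons, float)
    pct_v = 100 * voc.sum() / (voc.sum() + cons.sum())
    npvi = 100 * np.mean([abs(a - b) / ((a + b) / 2)
                          for a, b in zip(voc[:-1], voc[1:])])
    return {"%V": pct_v, "deltaV": voc.std(), "deltaC": cons.std(),
            "nPVI": npvi}

# Illustrative durations (seconds), not data from the study:
print(rhythm_metrics([0.12, 0.08, 0.15, 0.10], [0.07, 0.09, 0.06, 0.11]))
```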

  1. Quantifying Nitrogen Loss From Flooded Hawaiian Taro Fields

    NASA Astrophysics Data System (ADS)

    Deenik, J. L.; Penton, C. R.; Bruland, G. L.; Popp, B. N.; Engstrom, P.; Mueller, J. A.; Tiedje, J.

    2010-12-01

    In 2004 a field fertilization experiment showed that approximately 80% of the fertilizer nitrogen (N) added to flooded Hawaiian taro (Colocasia esculenta) fields could not be accounted for using classic N balance calculations. To quantify N loss through denitrification and anaerobic ammonium oxidation (anammox) pathways in these taro systems, we utilized a slurry-based isotope pairing technique (IPT). Measured nitrification rates and porewater N profiles were also used to model ammonium and nitrate fluxes through the top 10 cm of soil. Quantitative PCR of nitrogen-cycling functional genes was used to correlate porewater N dynamics with potential microbial activity. Rates of denitrification calculated using porewater profiles were compared to those obtained using the slurry method. Potential denitrification rates of surficial sediments obtained with the slurry method were found to drastically overestimate the calculated in-situ rates. The largest discrepancies were present in fields sampled more than one month after initial fertilization, reflecting a microbial community poised to denitrify the initial N pulse. Potential surficial nitrification rates varied from 1.3% of the slurry-measured denitrification potential in a heavily fertilized site to 100% in an unfertilized site. Compared to the use of urea, fish bone meal fertilizer use resulted in decreased N loss through denitrification in the surface sediment, according to both porewater modeling and IPT measurements. In addition, sub-surface porewater profiles point to root-mediated coupled nitrification/denitrification as a potential N loss pathway that is not captured in surface-based incubations. Profile-based surface plus subsurface coupled nitrification/denitrification estimates were between 1.1 and 12.7 times the denitrification estimates from the surface only. These results suggest that the use of a ‘classic’ isotope pairing technique that employs 15NO3- in fertilized agricultural systems can lead to a drastic

  2. Quantifying succulence: a rapid, physiologically meaningful metric of plant water storage.

    PubMed

    Ogburn, R Matthew; Edwards, Erika J

    2012-09-01

    Quantification of succulence should ideally convey information about physiological function and yet also be straightforward to measure. While important aspects of succulence and its physiological consequences may be quantified using parameters derived from pressure-volume (P-V) curves, this technique applied to succulent tissues is difficult, time-consuming and generally not suitable for large comparative datasets. We performed P-V curves on leaves of 25 taxa from across Caryophyllales and compared the results with direct measures of saturated water content (SWC(meas)), the ratio of water mass at full saturation to tissue dry mass, for the same taxa. SWC(meas) was significantly related to relative capacitance, the most physiologically relevant parameter describing tissue succulence. We developed a linear model describing SWC(meas) as a function of relative capacitance and leaf volume, which is also supported when accounting for the phylogenetic relationships among taxa. These results indicate that SWC(meas) is a suitable proxy for tissue succulence, and that both cellular properties and variation in gross morphology contribute towards a plant's relative water storage capacity. Quantifying SWC(meas) across many taxa showing variation in tissue succulence will provide a new avenue for exploring the evolutionary dynamics of this important ecological adaptation.

  3. Cascading "Triclick" functionalization of poly(caprolactone) thin films quantified via a quartz crystal microbalance.

    PubMed

    Lin, Fei; Zheng, Jukuan; Yu, Jiayi; Zhou, Jinjun; Becker, Matthew L

    2013-08-12

    A series of mono- and multifunctionalized degradable polyesters bearing various "clickable" groups, including ketone, alkyne, azide, and methyl acrylate (MA) are reported. Using this approach, we demonstrate a cascade approach to immobilize and quantitate three separate bioactive groups onto poly(caprolactone) (PCL) thin films. The materials are based on tunable copolymer compositions of ε-caprolactone and 2-oxepane-1,5-dione. A quartz crystal microbalance (QCM) was used to quantify the rate and extent of surface conjugation between RGD peptide and polymer thin films using "click" chemistry methods. The results show that alkyne-functionalized polymers have the highest conversion efficiency, followed by MA and azide polymers, while polymer films possessing keto groups are less amenable to surface functionalization. The successful conjugation was further confirmed by static contact angle measurements, with a smaller contact angle correlating directly with lower levels of surface peptide conjugation. QCM results quantify the sequential immobilization of peptides on the PCL thin films and indicate that Michael addition must occur first, followed by azide-alkyne Huisgen cycloadditions.

  4. Quantifying the Robustness of the English Sibilant Fricative Contrast in Children

    PubMed Central

    Reidy, Patrick F.; Beckman, Mary E.; Edwards, Jan

    2015-01-01

    Purpose Four measures of children's developing robustness of phonological contrast were compared to see how they correlated with age, vocabulary size, and adult listeners' correctness ratings. Method Word-initial sibilant fricative productions from eighty-one 2- to 5-year-old children and 20 adults were phonetically transcribed and acoustically analyzed. Four measures of robustness of contrast were calculated for each speaker on the basis of the centroid frequency measured from each fricative token. Productions that were transcribed as correct from different children were then used as stimuli in a perception experiment in which adult listeners rated the goodness of each production. Results Results showed that the degree of category overlap, quantified as the percentage of a child's productions whose category could be correctly predicted from the output of a mixed-effects logistic regression model, was the measure that correlated best with listeners' goodness judgments. Conclusions Even when children's productions have been transcribed as correct, adult listeners are sensitive to within-category variation quantified by the child's degree of category overlap. Further research is needed to explore the relationship between the age of a child and adults' sensitivity to different types of within-category variation in children's speech. PMID:25766040

  5. Quantifying Variability of Avian Colours: Are Signalling Traits More Variable?

    PubMed Central

    Delhey, Kaspar; Peters, Anne

    2008-01-01

    Background Increased variability in sexually selected ornaments, a key assumption of evolutionary theory, is thought to be maintained through condition-dependence. Condition-dependent handicap models of sexual selection predict that (a) sexually selected traits show amplified variability compared to equivalent non-sexually selected traits, and since males are usually the sexually selected sex, that (b) males are more variable than females, and (c) sexually dimorphic traits more variable than monomorphic ones. So far these predictions have only been tested for metric traits. Surprisingly, they have not been examined for bright coloration, one of the most prominent sexual traits. This omission stems from computational difficulties: different types of colours are quantified on different scales precluding the use of coefficients of variation. Methodology/Principal Findings Based on physiological models of avian colour vision we develop an index to quantify the degree of discriminable colour variation as it can be perceived by conspecifics. A comparison of variability in ornamental and non-ornamental colours in six bird species confirmed (a) that those coloured patches that are sexually selected or act as indicators of quality show increased chromatic variability. However, we found no support for (b) that males generally show higher levels of variability than females, or (c) that sexual dichromatism per se is associated with increased variability. Conclusions/Significance We show that it is currently possible to realistically estimate variability of animal colours as perceived by them, something difficult to achieve with other traits. Increased variability of known sexually-selected/quality-indicating colours in the studied species, provides support to the predictions borne from sexual selection theory but the lack of increased overall variability in males or dimorphic colours in general indicates that sexual differences might not always be shaped by similar selective

  6. Quantifying Different Tactile Sensations Evoked by Cutaneous Electrical Stimulation Using Electroencephalography Features.

    PubMed

    Zhang, Dingguo; Xu, Fei; Xu, Heng; Shull, Peter B; Zhu, Xiangyang

    2016-03-01

    Psychophysical tests and standardized questionnaires are often used to analyze tactile sensation based on subjective judgment in conventional studies. In contrast with such subjective evaluation, a novel method based on electroencephalography (EEG) is proposed to explore the possibility of quantifying tactile sensation in an objective way. The proposed experiments adopt cutaneous electrical stimulation to generate two kinds of sensations (vibration and pressure) with three grades (low/medium/strong) on eight subjects. Event-related potentials (ERPs) and event-related synchronization/desynchronization (ERS/ERD) are extracted from the EEG and used as evaluation indexes to distinguish between vibration and pressure, and also to discriminate sensation grades. Results show that a five-phase P1–N1–P2–N2–P3 deflection is induced in the EEG. Using the amplitudes of the later ERP components (N2 and P3), vibration and pressure sensations can be discriminated in both individual and grand-averaged ERPs (p < 0.05). The grand-averaged ERPs can distinguish the three sensation grades, but there is no significant difference for individuals. In addition, ERS/ERD features of the mu rhythm (8–13 Hz) are adopted. Vibration and pressure sensations can be discriminated in grand-averaged ERS/ERD (p < 0.05), but only some individuals show significant differences. The grand-averaged results show that most sensation grades can be differentiated, and most pairwise comparisons show significant differences for individuals (p < 0.05). The work suggests that ERP- and ERS/ERD-based EEG features may have the potential to quantify tactile sensations for medical diagnosis or engineering applications.

  7. Quantifying and Mapping Global Data Poverty.

    PubMed

    Leidig, Mathias; Teeuw, Richard M

    2015-01-01

    Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. However, access to these technologies, as well as their associated software and training materials, is not evenly distributed: since the 1990s there has been concern about a "Digital Divide" between the data-rich and the data-poor. We present an innovative metric for evaluating international variations in access to digital data: the Data Poverty Index (DPI). The DPI is based on Internet speeds, numbers of computer owners and Internet users, mobile phone ownership and network coverage, as well as provision of higher education. The datasets used to produce the DPI are provided annually for almost all the countries of the world and can be freely downloaded. The index that we present in this 'proof of concept' study is the first to quantify and visualise the problem of global data poverty, using the most recent datasets, for 2013. The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. The DPI highlights countries where support is needed for improving access to the Internet and for the provision of training in geoinfomatics. We conclude that the DPI is of value as a potential metric for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction.

  8. Smart velocity ranging quantifiable optical microangiography

    NASA Astrophysics Data System (ADS)

    Zhi, Zhongwei; Wang, Ruikang K.

    2011-03-01

    We introduce a new type of Optical Microangiography (OMAG), called Quantifiable Optical Microangiography (QOMAG), which is capable of performing quantitative flow imaging with smart velocity ranging. To extract multi-range velocities, two three-dimensional data sets must be acquired over the same imaging area. One data set performs dense scanning in the B-scan direction, with Doppler analysis done on the basis of subsequent A-scans, while the other performs dense scanning in the C-scan direction, with Doppler analysis done on the basis of consecutive B-scans. Since the velocity ranging is determined by the time interval between consecutive measurements of the spectral fringes, a longer time interval gives higher sensitivity to slow flow. By simultaneously acquiring data sets with different time intervals, we can perform smart velocity-ranging quantification of blood flow characterized by different velocity values. The feasibility of QOMAG for variable blood flow imaging is demonstrated by in vivo studies of cerebral blood flow in a mouse model. QOMAG can provide multi-range, detailed blood flow maps within the intracranial dura mater and cortex of the mouse brain.

  9. Automated Counting of Particles To Quantify Cleanliness

    NASA Technical Reports Server (NTRS)

    Rhode, James

    2005-01-01

    A machine vision system, similar to systems used in microbiological laboratories to count cultured microbes, has been proposed for quantifying the cleanliness of nominally precisely cleaned hardware by counting residual contaminant particles. The system would include a microscope equipped with an electronic camera and circuitry to digitize the camera output, a personal computer programmed with machine-vision and interface software, and digital storage media. A filter pad, through which had been aspirated solvent from rinsing the hardware in question, would be placed on the microscope stage. A high-resolution image of the filter pad would be recorded. The computer would analyze the image and present a histogram of sizes of particles on the filter. On the basis of the histogram and a measure of the desired level of cleanliness, the hardware would be accepted or rejected. If the hardware were accepted, the image would be saved, along with other information, as a quality record. If the hardware were rejected, the histogram and ancillary information would be recorded for analysis of trends. The software would perceive particles that are too large or too numerous to meet a specified particle-distribution profile. Anomalous particles or fibrous material would be flagged for inspection.
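
    A minimal sketch of the proposed counting step using generic image-processing tools; SciPy's connected-component labeling stands in for whatever machine-vision package the system would actually use, and the accept/reject rule is illustrative:

```python
import numpy as np
from scipy import ndimage

def particle_histogram(image, threshold, size_bins):
    """Threshold a grayscale filter-pad image, label connected
    particles, and histogram their sizes in pixels. Assumes particles
    are brighter than the pad; invert the comparison otherwise."""
    binary = image > threshold
    labels, n = ndimage.label(binary)              # connected components
    sizes = ndimage.sum(binary, labels, range(1, n + 1))
    hist, _ = np.histogram(sizes, bins=size_bins)
    return n, hist

def meets_spec(hist, max_allowed):
    """Reject if any size class exceeds its allowed particle count."""
    return bool(np.all(hist <= np.asarray(max_allowed)))
```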

  10. Quantifying and Mapping Global Data Poverty

    PubMed Central

    2015-01-01

    Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. However, access to these technologies, as well as their associated software and training materials, is not evenly distributed: since the 1990s there has been concern about a "Digital Divide" between the data-rich and the data-poor. We present an innovative metric for evaluating international variations in access to digital data: the Data Poverty Index (DPI). The DPI is based on Internet speeds, numbers of computer owners and Internet users, mobile phone ownership and network coverage, as well as provision of higher education. The datasets used to produce the DPI are provided annually for almost all the countries of the world and can be freely downloaded. The index that we present in this ‘proof of concept’ study is the first to quantify and visualise the problem of global data poverty, using the most recent datasets, for 2013. The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. The DPI highlights countries where support is needed for improving access to the Internet and for the provision of training in geoinfomatics. We conclude that the DPI is of value as a potential metric for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction. PMID:26560884

  11. Quantifying the transmission potential of pandemic influenza

    NASA Astrophysics Data System (ADS)

    Chowell, Gerardo; Nishiura, Hiroshi

    2008-03-01

    This article reviews quantitative methods to estimate the basic reproduction number of pandemic influenza, a key threshold quantity to help determine the intensity of interventions required to control the disease. Although it is difficult to assess the transmission potential of a probable future pandemic, historical epidemiologic data are readily available from previous pandemics, and as a reference quantity for future pandemic planning, mathematical and statistical analyses of historical data are crucial. In particular, because many historical records tend to document only the temporal distribution of cases or deaths (i.e. the epidemic curve), our review focuses on methods to maximize the utility of time-evolution data and to clarify the detailed mechanisms of the spread of influenza. First, we highlight structured epidemic models and their parameter estimation methods, which can quantify detailed disease dynamics, including those we cannot observe directly. Duration-structured epidemic systems are subsequently presented, offering a firm understanding of the definition of the basic and effective reproduction numbers. When the initial growth phase of an epidemic is investigated, the distribution of the generation time is the key statistical information needed to appropriately estimate the transmission potential from the intrinsic growth rate. Applications of stochastic processes are also highlighted to estimate the transmission potential using similar data. Critically important characteristics of influenza data are subsequently summarized, followed by our conclusions suggesting potential future methodological improvements.
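
    As a concrete instance of growth-rate-based estimation, the Euler-Lotka relation with a gamma-distributed generation time gives a closed form; the parameter values below are illustrative, not taken from the article:

```python
import numpy as np

def r0_from_growth_rate(r, gen_mean, gen_shape):
    """Estimate the basic reproduction number from the intrinsic
    (exponential) growth rate r of the epidemic curve, assuming a
    gamma-distributed generation time with the given mean and shape.
    Uses the Euler-Lotka relation R0 = 1 / M(-r), where M is the
    moment-generating function of the generation-time distribution."""
    scale = gen_mean / gen_shape
    return (1.0 + r * scale) ** gen_shape

# Example: doubling time of 3 days gives r = ln(2)/3 per day; a
# generation time of ~2.9 days with shape 2 yields R0 of roughly 1.8.
r = np.log(2) / 3.0
print(r0_from_growth_rate(r, gen_mean=2.9, gen_shape=2.0))
```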

  12. Quantifying the structural integrity of nanorod arrays.

    PubMed

    Thöle, Florian; Xue, Longjian; Heß, Claudia; Hillebrand, Reinald; Gorb, Stanislav N; Steinhart, Martin

    2017-02-01

    Arrays of aligned nanorods oriented perpendicular to a support, which are accessible by top-down lithography or by means of shape-defining hard templates, have received increasing interest as sensor components, components for nanophotonics and nanoelectronics, substrates for tissue engineering, surfaces having specific adhesive or antiadhesive properties and as surfaces with customized wettability. Agglomeration of the nanorods deteriorates the performance of components based on nanorod arrays. A comprehensive body of literature deals with mechanical failure mechanisms of nanorods and design criteria for mechanically stable nanorod arrays. However, the structural integrity of nanorod arrays is commonly evaluated only visually and qualitatively. We use real-space analysis of microscopic images to quantify the fraction of condensed nanorods in nanorod arrays. We suggest the number of array elements apparent in the micrographs divided by the number of array elements a defect-free array would contain in the same area, referred to as integrity fraction, as a measure of structural array integrity. Reproducible procedures to determine the imaged number of array elements are introduced. Thus, quantitative comparisons of different nanorod arrays, or of one nanorod array at different stages of its use, are possible. Structural integrities of identical nanorod arrays differing only in the length of the nanorods are exemplarily analysed.
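
    The integrity fraction itself is a one-line computation once the apparent array elements have been counted; the square-lattice ideal count below is an assumption for illustration (a hexagonal array would use the corresponding lattice density):

```python
def integrity_fraction(n_apparent, image_area, pitch):
    """Integrity fraction as defined in the abstract: the number of
    array elements apparent in a micrograph divided by the number a
    defect-free array would contain in the same area. Assumes a square
    lattice, so the ideal count is image_area / pitch**2."""
    n_ideal = image_area / pitch ** 2
    return n_apparent / n_ideal

# Illustrative values only: 8600 elements seen in a 100 um^2 field
# with a 0.1 um pitch gives an integrity fraction of 0.86.
print(integrity_fraction(n_apparent=8600, image_area=100.0, pitch=0.1))
```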

  13. Quantifying bank storage of variably saturated aquifers.

    PubMed

    Li, Hailong; Boufadel, Michel C; Weaver, James W

    2008-01-01

    Numerical simulations were conducted to quantify bank storage in a variably saturated, homogeneous, and anisotropic aquifer abutting a stream during rising stream stage. Seepage faces and bank slopes ranging from 1/3 to 100/3 were simulated. The initial conditions were assumed steady-state flow with water draining toward the stream. Then, the stream level rose at a constant rate to the specified elevation of the water table given by the landward boundary condition and stayed there until the system reached a new steady state. This represents a highly simplified version of a real world hydrograph. For the specific examples considered, the following conclusions can be made. The volume of surface water entering the bank increased with the rate of stream level rise, became negligible when the rate of rise was slow, and approached a positive constant when the rate was large. Also, the volume decreased with the dimensionless parameter M (the product of the anisotropy ratio and the square of the domain's aspect ratio). When M was large (>10), bank storage was small because most pore space was initially saturated with ground water due to the presence of a significant seepage face. When M was small, the seepage face became insignificant and capillarity began to play a role. The weaker the capillary effect, the easier for surface water to enter the bank. The effect of the capillary forces on the volume of surface water entering the bank was significant and could not be neglected.

  14. Quantifying Climate Risks for Urban Environments

    NASA Astrophysics Data System (ADS)

    Hayhoe, K.; Stoner, A. K.; Dickson, L.

    2013-12-01

    High-density urban areas are both uniquely vulnerable and uniquely able to adapt to climate change. Enabling this potential requires identifying the vulnerabilities, however, and these depend strongly on location: the challenges climate change poses for a southern coastal city such as Miami, for example, have little in common with those facing a northern inland city such as Chicago. By combining local knowledge with climate science, risk assessment, engineering analysis, and adaptation planning, it is possible to develop relevant climate information that feeds directly into vulnerability assessment and long-term planning. Key steps include: developing climate projections tagged to long-term weather stations within the city itself that reflect local characteristics; mining local knowledge to identify existing vulnerabilities to, and costs of, weather and climate extremes; understanding how future projections can be integrated into the planning process; and identifying ways in which the city may adapt. Using examples from our work in the cities of Boston, Chicago, and Mobile we illustrate the practical application of this approach to quantify the impacts of climate change on these cities and identify robust adaptation options as diverse as reducing the urban heat island effect, protecting essential infrastructure, changing design standards and building codes, developing robust emergency management plans, and rebuilding storm sewer systems.

  15. Data Used in Quantified Reliability Models

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Kleinhammer, Roger K.; Kahn, C. J.

    2014-01-01

    Data is the crux of developing quantitative risk and reliability models; without the data there is no quantification. The means to find and identify reliability data or failure numbers to quantify fault tree models during conceptual and design phases is often the quagmire that precludes early decision makers' consideration of potential risk drivers that will influence design. The analyst tasked with addressing system or product reliability depends on the availability of data. But where does that data come from, and what does it really apply to? Commercial industries, government agencies, and other international sources might have data similar to what you are looking for. In general, internal and external technical reports and data based on similar and dissimilar equipment are often the first and only places checked. A common philosophy is "I have a number - that is good enough". But is it? Have you ever considered the difference in reported data from various federal datasets and technical reports when compared to similar sources from national and/or international datasets? Just how well does your data compare? Understanding how the reported data were derived, and interpreting the information and details associated with the data, is as important as the data itself.

  16. Smartphone quantifies Salmonella from paper microfluidics.

    PubMed

    Park, Tu San; Li, Wenyue; McCracken, Katherine E; Yoon, Jeong-Yeol

    2013-12-21

    Smartphone-based optical detection is a potentially easy-to-use, handheld, true point-of-care diagnostic tool for the early and rapid detection of pathogens. Paper microfluidics is a low-cost, field-deployable, and easy-to-use alternative to conventional microfluidic devices. Most paper-based microfluidic assays typically utilize dyes or enzyme-substrate binding, while bacterial detection on paper microfluidics is rare. We demonstrate a novel application of smartphone-based detection of Salmonella on paper microfluidics. Each paper microfluidic channel was pre-loaded with anti-Salmonella Typhimurium and anti-Escherichia coli conjugated submicroparticles. Dipping the paper microfluidic device into the Salmonella solutions caused the antibody-conjugated particles, still confined within the paper fibers, to immunoagglutinate. The extent of immunoagglutination was quantified by evaluating Mie scattering from the digital images taken at an optimized angle and distance with a smartphone. A smartphone application was designed and programmed to allow the user to position the smartphone at an optimized angle and distance from the paper microfluidic device, and a simple image processing algorithm was implemented to calculate and display the bacterial concentration on the smartphone. The detection limit was at the single-cell level and the total assay time was less than one minute.

  17. Fluorescence imaging to quantify crop residue cover

    NASA Technical Reports Server (NTRS)

    Daughtry, C. S. T.; Mcmurtrey, J. E., III; Chappelle, E. W.

    1994-01-01

    Crop residues, the portion of the crop left in the field after harvest, can be an important management factor in controlling soil erosion. Methods to quantify residue cover are needed that are rapid, accurate, and objective. Scenes with known amounts of crop residue were illuminated with long-wave ultraviolet (UV) radiation and fluorescence images were recorded with an intensified video camera fitted with a 453 to 488 nm band-pass filter. A light-colored soil and a dark-colored soil were used as backgrounds for the weathered soybean stems. Residue cover was determined by counting the proportion of the pixels in the image with fluorescence values greater than a threshold. Soil pixels had the lowest gray levels in the images. The values of the soybean residue pixels spanned nearly the full range of the 8-bit video data. Classification accuracies typically were within 3 (absolute units) of measured cover values. Video imaging can provide an intuitive understanding of the fraction of the soil covered by residue.

  18. SANTA: quantifying the functional content of molecular networks.

    PubMed

    Cornish, Alex J; Markowetz, Florian

    2014-09-01

    Linking networks of molecular interactions to cellular functions and phenotypes is a key goal in systems biology. Here, we adapt concepts of spatial statistics to assess the functional content of molecular networks. Based on the guilt-by-association principle, our approach (called SANTA) quantifies the strength of association between a gene set and a network, and functionally annotates molecular networks like other enrichment methods annotate lists of genes. As a general association measure, SANTA can (i) functionally annotate experimentally derived networks using a collection of curated gene sets and (ii) annotate experimentally derived gene sets using a collection of curated networks, as well as (iii) prioritize genes for follow-up analyses. We exemplify the efficacy of SANTA in several case studies using the S. cerevisiae genetic interaction network and genome-wide RNAi screens in cancer cell lines. Our theory, simulations, and applications show that SANTA provides a principled statistical way to quantify the association between molecular networks and cellular functions and phenotypes. SANTA is available from http://bioconductor.org/packages/release/bioc/html/SANTA.html.

  19. A revised metric for quantifying body shape in vertebrates.

    PubMed

    Collar, David C; Reynaga, Crystal M; Ward, Andrea B; Mehta, Rita S

    2013-08-01

    Vertebrates exhibit tremendous diversity in body shape, though quantifying this variation has been challenging. In the past, researchers have used simplified metrics that either describe overall shape but reveal little about its anatomical basis or that characterize only a subset of the morphological features that contribute to shape variation. Here, we present a revised metric of body shape, the vertebrate shape index (VSI), which combines the four primary morphological components that lead to shape diversity in vertebrates: head shape, length of the second major body axis (depth or width), and shape of the precaudal and caudal regions of the vertebral column. We illustrate the usefulness of VSI on a data set of 194 species, primarily representing five major vertebrate clades: Actinopterygii, Lissamphibia, Squamata, Aves, and Mammalia. We quantify VSI diversity within each of these clades and, in the course of doing so, show how measurements of the morphological components of VSI can be obtained from radiographs, articulated skeletons, and cleared and stained specimens. We also demonstrate that head shape, secondary body axis, and vertebral characteristics are important independent contributors to body shape diversity, though their importance varies across vertebrate groups. Finally, we present a functional application of VSI to test a hypothesized relationship between body shape and the degree of axial bending associated with locomotor modes in ray-finned fishes. Altogether, our study highlights the promise VSI holds for identifying the morphological variation underlying body shape diversity as well as the selective factors driving shape evolution.

  20. Systematic and general method for quantifying localization in microscopy images

    PubMed Central

    Sheng, Huanjie; Stauffer, Weston

    2016-01-01

    ABSTRACT Quantifying the localization of molecules with respect to other molecules, cell structures and intracellular regions is essential to understanding their regulation and actions. However, measuring localization from microscopy images is often difficult with existing metrics. Here, we evaluate a metric for quantifying localization termed the threshold overlap score (TOS), and show it is simple to calculate, easy to interpret, able to be used to systematically characterize localization patterns, and generally applicable. TOS is calculated by: (i) measuring the overlap of pixels that are above the intensity thresholds for two signals; (ii) determining whether the overlap is more, less, or the same as expected by chance, i.e. colocalization, anti-colocalization, or non-colocalization; and (iii) rescaling to allow comparison at different thresholds. The above is repeated at multiple threshold combinations to generate a TOS matrix to systematically characterize the relationship between localization and signal intensities. TOS matrices were used to identify and distinguish localization patterns of different proteins in various simulations, cell types and organisms with greater specificity and sensitivity than common metrics. For all the above reasons, TOS is an excellent first line metric, particularly for cells with mixed localization patterns. PMID:27979831
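
    A sketch of TOS at a single threshold combination; the three steps follow the abstract, but the exact rescaling denominators are an assumption here, not a quotation of the paper's formula:

```python
import numpy as np

def tos(img1, img2, t1, t2):
    """Threshold overlap score for one threshold pair: (i) measure the
    fraction of pixels above both thresholds, (ii) compare it with the
    overlap expected by chance (the product of the two above-threshold
    fractions), and (iii) rescale to [-1, 1], with positive scores
    scaled by the maximum possible excess overlap and negative scores
    by the expected overlap."""
    a = np.ravel(img1) > t1
    b = np.ravel(img2) > t2
    f1, f2 = a.mean(), b.mean()
    observed = (a & b).mean()
    expected = f1 * f2                      # chance overlap
    if observed >= expected:
        denom = min(f1, f2) - expected      # colocalization branch
    else:
        denom = expected                    # anti-colocalization branch
    return 0.0 if denom == 0 else (observed - expected) / denom

# A TOS matrix scans (t1, t2) over percentile grids of each channel.
```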

  1. A framework for quantifying net benefits of alternative prognostic models.

    PubMed

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-30

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks.

  2. A Method to Quantify Mouse Coat-Color Proportions

    PubMed Central

    Ounpraseuth, Songthip; Rafferty, Tonya M.; McDonald-Phillips, Rachel E.; Gammill, Whitney M.; Siegel, Eric R.; Wheeler, Kristin L.; Nilsson, Erik A.; Cooney, Craig A.

    2009-01-01

    Coat-color proportions and patterns in mice are used as assays for many processes such as transgene expression, chimerism, and epigenetics. In many studies, coat-color readouts are estimated from subjective scoring of individual mice. Here we show a method by which mouse coat color is quantified as the proportion of coat shown in one or more digital images. We use the yellow-agouti mouse model of epigenetic variegation to demonstrate this method. We apply this method to live mice using a conventional digital camera for data collection. We use a raster graphics editing program to convert agouti regions of the coat to a standard, uniform, brown color and the yellow regions of the coat to a standard, uniform, yellow color. We use a second program to quantify the proportions of these standard colors. This method provides quantification that relates directly to the visual appearance of the live animal. It also provides an objective analysis with a traceable record, and it should allow for precise comparisons of mouse coats and mouse cohorts within and between studies. PMID:19404391
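
    The final counting step can be sketched as follows, assuming the editing program has already mapped the coat to two uniform standard colors; the RGB triplets are placeholders, not the study's values:

```python
import numpy as np
from PIL import Image

# Placeholder standard colors for the recoded agouti and yellow regions.
BROWN = (139, 90, 43)
YELLOW = (255, 215, 0)

def coat_proportions(path):
    """Count pixels matching each standard color in an edited image and
    return the agouti/yellow proportions of the coat."""
    rgb = np.asarray(Image.open(path).convert("RGB"))
    brown = np.all(rgb == BROWN, axis=-1).sum()
    yellow = np.all(rgb == YELLOW, axis=-1).sum()
    total = brown + yellow
    return {"agouti": brown / total, "yellow": yellow / total}
```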

  3. Quantifying Floods of a Flood Regime in Space and Time

    NASA Astrophysics Data System (ADS)

    Whipple, A. A.; Fleenor, W. E.; Viers, J. H.

    2015-12-01

    Interaction between a flood hydrograph and floodplain topography results in spatially and temporally variable conditions important for ecosystem process and function. Individual floods whose frequency and dimensionality comprise a river's flood regime contribute to that variability and in aggregate are important drivers of floodplain ecosystems. Across the globe, water management actions, land use changes as well as hydroclimatic change associated with climate change have profoundly affected natural flood regimes and their expression within the floodplain landscape. Homogenization of riverscapes has degraded once highly diverse and productive ecosystems. Improved understanding of the range of flood conditions and spatial variability within floodplains, or hydrospatial conditions, is needed to improve water and land management and restoration activities to support the variable conditions under which species adapted. This research quantifies the flood regime of a floodplain site undergoing restoration through levee breaching along the lower Cosumnes River of California. One of the few lowland alluvial rivers of California with an unregulated hydrograph and regular floodplain connectivity, the Cosumnes River provides a useful test-bed for exploring river-floodplain interaction. Representative floods of the Cosumnes River are selected from previously-established flood types comprising the flood regime and applied within a 2D hydrodynamic model representing the floodplain restoration site. Model output is analyzed and synthesized to quantify and compare conditions in space and time, using metrics such as depth and velocity. This research establishes methods for quantifying a flood regime's floodplain inundation characteristics, illustrates the role of flow variability and landscape complexity in producing heterogeneous floodplain conditions, and suggests important implications for managing more ecologically functional floodplains.

  4. Quantifying the ventilatory control contribution to sleep apnoea using polysomnography.

    PubMed

    Terrill, Philip I; Edwards, Bradley A; Nemati, Shamim; Butler, James P; Owens, Robert L; Eckert, Danny J; White, David P; Malhotra, Atul; Wellman, Andrew; Sands, Scott A

    2015-02-01

    Elevated loop gain, consequent to hypersensitive ventilatory control, is a primary nonanatomical cause of obstructive sleep apnoea (OSA) but it is not possible to quantify this in the clinic. Here we provide a novel method to estimate loop gain in OSA patients using routine clinical polysomnography alone. We use the concept that spontaneous ventilatory fluctuations due to apnoeas/hypopnoeas (disturbance) result in opposing changes in ventilatory drive (response) as determined by loop gain (response/disturbance). Fitting a simple ventilatory control model (including chemical and arousal contributions to ventilatory drive) to the ventilatory pattern of OSA reveals the underlying loop gain. Following mathematical-model validation, we critically tested our method in patients with OSA by comparison with a standard (continuous positive airway pressure (CPAP) drop method), and by assessing its ability to detect the known reduction in loop gain with oxygen and acetazolamide. Our method quantified loop gain from baseline polysomnography (correlation versus CPAP-estimated loop gain: n=28; r=0.63, p<0.001), detected the known reduction in loop gain with oxygen (n=11; mean±sem change in loop gain (ΔLG) -0.23±0.08, p=0.02) and acetazolamide (n=11; ΔLG -0.20±0.06, p=0.005), and predicted the OSA response to loop gain-lowering therapy. We validated a means to quantify the ventilatory control contribution to OSA pathogenesis using clinical polysomnography, enabling identification of likely responders to therapies targeting ventilatory control.

  5. Quantifying and transferring contextual information in object detection.

    PubMed

    Zheng, Wei-Shi; Gong, Shaogang; Xiang, Tao

    2012-04-01

    Context is critical for reducing the uncertainty in object detection. However, context modeling is challenging because there are often many different types of contextual information coexisting with different degrees of relevance to the detection of target object(s) in different images. It is therefore crucial to devise a context model to automatically quantify and select the most effective contextual information for assisting in detecting the target object. Nevertheless, the diversity of contextual information means that learning a robust context model requires a larger training set than learning the target object appearance model, which may not be available in practice. In this work, a novel context modeling framework is proposed without the need for any prior scene segmentation or context annotation. We formulate a polar geometric context descriptor for representing multiple types of contextual information. In order to quantify context, we propose a new maximum margin context (MMC) model to evaluate and measure the usefulness of contextual information directly and explicitly through a discriminant context inference method. Furthermore, to address the problem of context learning with limited data, we exploit the idea of transfer learning based on the observation that although two categories of objects can have very different visual appearance, there can be similarity in their context and/or the way contextual information helps to distinguish target objects from nontarget objects. To that end, two novel context transfer learning models are proposed which utilize training samples from source object classes to improve the learning of the context model for a target object class based on a joint maximum margin learning framework. Experiments are carried out on PASCAL VOC2005 and VOC2007 data sets, a luggage detection data set extracted from the i-LIDS data set, and a vehicle detection data set extracted from outdoor surveillance footage. Our results validate the

  6. Quantifying VOC emissions for the strategic petroleum reserve.

    SciTech Connect

    Knowlton, Robert G.; Lord, David L.

    2013-06-01

    A very important aspect of the Department of Energy's (DOE's) Strategic Petroleum Reserve (SPR) program is regulatory compliance. One of the regulatory compliance issues deals with limiting the amount of volatile organic compounds (VOCs) that are emitted into the atmosphere from brine wastes when they are discharged to brine holding ponds. The US Environmental Protection Agency (USEPA) has set limits on the amount of VOCs that can be discharged to the atmosphere. Several attempts have been made to quantify the VOC emissions associated with the brine ponds, going back to the late 1970s. There are potential issues associated with each of these quantification efforts. Two efforts were made to quantify VOC emissions by analyzing the VOC content of brine samples obtained from wells. Efforts to measure air concentrations were mentioned in historical reports, but no data have been located to confirm these assertions. A modeling effort was also performed to quantify the VOC emissions. More recently, in 2011-2013, additional brine sampling was performed to update the VOC emissions estimate. An analysis of the statistical confidence in these results is presented here. Arguably, there are uncertainties associated with each of these efforts. The analysis herein indicates that the upper confidence limit on VOC emissions based on recent brine sampling is very close to the 0.42 ton/MMB limit used historically on the project. Refining this estimate would require considerable investment in additional sampling, analysis, and monitoring. An analysis of the VOC emissions at each site suggests that additional discharges could be made while staying within current regulatory limits.

  7. The quantified process approach: an emerging methodology to neuropsychological assessment.

    PubMed

    Poreh, A M

    2000-05-01

    An important development in the field of neuropsychological assessment is the quantification of the process by which individuals solve common neuropsychological tasks. The present article outlines the history leading to this development, the Quantified Process Approach, and suggests that this line of applied research bridges the gap between the clinical and statistical approaches to neuropsychological assessment. It is argued that the enterprise of quantifying the process approach proceeds via three major methodologies: (1) the "Satellite" Testing Paradigm: an approach by which new tasks are developed to complement existing tests so as to clarify a given test performance; (2) the Composition Paradigm: an approach by which data on a given test that have been largely overlooked are compiled and subsequently analyzed, resulting in new indices that are believed to reflect underlying constructs accounting for test performance; and (3) the Decomposition Paradigm: an approach which investigates the relationship between test items of a given measure according to underlying facets, resulting in the development of new subscores. The article illustrates each of the above paradigms, offers a critique of this new field according to prevailing professional standards for psychological measures, and provides suggestions for future research.

  8. Quantifying uncertainty in LCA-modelling of waste management systems

    SciTech Connect

    Clavreul, Julie; Guyonnet, Dominique; Christensen, Thomas H.

    2012-12-15

    Highlights: • Uncertainty in LCA-modelling of waste management is significant. • Model, scenario and parameter uncertainties contribute. • A sequential procedure for quantifying uncertainty is proposed. • Application of the procedure is illustrated by a case-study. - Abstract: Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented, but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining the selected methods: (Step 1) a sensitivity analysis evaluating the sensitivities of the results with respect to the input uncertainties, (Step 2) an uncertainty propagation providing appropriate tools for representing uncertainties and calculating the overall uncertainty of the model results, (Step 3) an uncertainty contribution analysis quantifying the contribution of each parameter uncertainty to the final uncertainty and (Step 4), as a new approach, a combined sensitivity analysis providing a visualisation of the shift in the ranking of different options due to variations of selected key parameters. This tiered approach optimises the resources available to LCA practitioners by only propagating the most influential uncertainties.
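
    A minimal sketch of Steps 2 and 3 (uncertainty propagation and contribution analysis) for a toy, additive waste-LCA model; the parameter names, values and distributions are assumptions, not the paper's case study.

```python
# Monte Carlo propagation and variance-contribution analysis for a toy
# additive LCA model (all inputs hypothetical).
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Toy model: net GWP = transport + incineration + energy credit.
transport = rng.normal(12.0, 2.0, N)        # kg CO2-eq per t waste
incineration = rng.normal(350.0, 30.0, N)
energy_credit = rng.normal(-240.0, 40.0, N)
gwp = transport + incineration + energy_credit

print(f"net GWP: {gwp.mean():.0f} +/- {gwp.std():.0f} kg CO2-eq/t")

# Step 3: each input's share of the output variance (inputs independent
# and the model additive, so the variances sum exactly).
for name, x in [("transport", transport), ("incineration", incineration),
                ("energy credit", energy_credit)]:
    print(f"{name:14s} contributes {x.var() / gwp.var():5.1%} of variance")
```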

  9. Quantifying prosthetic gait deviation using simple outcome measures

    PubMed Central

    Kark, Lauren; Odell, Ross; McIntosh, Andrew S; Simmons, Anne

    2016-01-01

    AIM: To develop a subset of simple outcome measures to quantify prosthetic gait deviation without needing three-dimensional gait analysis (3DGA). METHODS: Eight unilateral, transfemoral amputees and 12 unilateral, transtibial amputees were recruited. Twenty-eight able-bodied controls were recruited. All participants underwent 3DGA, the timed-up-and-go test and the six-minute walk test (6MWT). The lower-limb amputees also completed the Prosthesis Evaluation Questionnaire. Results from 3DGA were summarised using the gait deviation index (GDI), which was subsequently regressed, using stepwise regression, against the other measures. RESULTS: Step-length (SL), self-selected walking speed (SSWS) and the distance walked during the 6MWT (6MWD) were significantly correlated with GDI. The 6MWD was the strongest single predictor of the GDI, followed by SL and SSWS. The predictive ability of the regression equations was improved following inclusion of self-report data related to mobility and prosthetic utility. CONCLUSION: This study offers a practicable alternative for quantifying kinematic deviation without the need to conduct complete 3DGA. PMID:27335814
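
    The regression step can be sketched as an ordinary least-squares fit of the GDI on the three retained predictors; the data below are fabricated stand-ins for the study's measurements, and the coefficients are not the paper's.

```python
# OLS regression of a gait deviation index on simple outcome measures
# (all data synthetic, for illustration only).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 20
sixmwd = rng.normal(420, 60, n)        # 6-minute walk distance (m)
step_len = rng.normal(0.62, 0.08, n)   # step length (m)
ssws = rng.normal(1.1, 0.2, n)         # self-selected walking speed (m/s)
gdi = 60 + 0.05 * sixmwd + 10 * step_len + 3 * ssws + rng.normal(0, 2, n)

X = sm.add_constant(np.column_stack([sixmwd, step_len, ssws]))
fit = sm.OLS(gdi, X).fit()
print(fit.params)      # intercept and predictor coefficients
print(fit.rsquared)    # proportion of GDI variance explained
```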

  10. Quantifying individual performance in Cricket — A network analysis of batsmen and bowlers

    NASA Astrophysics Data System (ADS)

    Mukherjee, Satyam

    2014-01-01

    Quantifying individual performance in the game of Cricket is critical for team selection in International matches. The number of runs scored by batsmen and wickets taken by bowlers serves as a natural way of quantifying the performance of a cricketer. Traditionally the batsmen and bowlers are rated on their batting or bowling average respectively. However, in a game like Cricket, the manner in which one scores runs or claims a wicket is also important. Scoring runs against a strong bowling line-up or delivering a brilliant performance against a team with a strong batting line-up deserves more credit. A player's average does not capture this aspect of the game. In this paper we present a refined method to quantify the 'quality' of runs scored by a batsman or wickets taken by a bowler. We explore the application of Social Network Analysis (SNA) to rate the players on their contribution to team performance. We generate a directed and weighted network of batsmen-bowlers using the player-vs-player information available for Test cricket and ODI cricket. Additionally, we generate a network of batsmen and bowlers based on the dismissal records of batsmen in the history of cricket: Test (1877-2011) and ODI (1971-2011). Our results show that M. Muralitharan is the most successful bowler in the history of Cricket. Our approach could potentially be applied in domestic matches to judge a player's performance, which in turn paves the way for a balanced team selection for International matches.
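
    A minimal sketch of the network idea, assuming a PageRank-style rating on a weighted dismissal network (the paper's exact ranking algorithm may differ); the dismissal records below are invented examples.

```python
# Batsmen and bowlers as nodes; a weighted edge from a dismissed batsman
# to the bowler who got him out; PageRank as the 'quality' rating.
import networkx as nx

G = nx.DiGraph()
dismissals = [                          # (batsman, bowler, times dismissed)
    ("Tendulkar", "Muralitharan", 8),
    ("Lara", "Muralitharan", 5),
    ("Ponting", "Warne", 4),
    ("Tendulkar", "Warne", 3),
    ("Lara", "McGrath", 6),
]
for batsman, bowler, w in dismissals:
    G.add_edge(batsman, bowler, weight=w)

# A bowler who dismisses highly rated batsmen accumulates a high score.
rating = nx.pagerank(G, alpha=0.85, weight="weight")
for player, score in sorted(rating.items(), key=lambda kv: -kv[1]):
    print(f"{player:14s} {score:.3f}")
```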

  11. Quantifying Numerical Model Accuracy and Variability

    NASA Astrophysics Data System (ADS)

    Montoya, L. H.; Lynett, P. J.

    2015-12-01

    The 2011 Tohoku tsunami changed how tsunami hazard to coastal communities is evaluated. Numerical models are a key component of the methodologies used to estimate tsunami risk, and model predictions are essential for the development of Tsunami Hazard Assessments (THA). Better understanding model bias and uncertainties, and minimizing them where possible, leads to a more accurate and reliable THA. In this study we compare runup height, inundation lines and flow velocity field measurements between GeoClaw and the Method Of Splitting Tsunami (MOST) predictions in the Sendai plain. Runup elevation and average inundation distance were in general overpredicted by the models. However, both models agree relatively well with each other when predicting maximum sea surface elevation and maximum flow velocities. Furthermore, to explore the variability and uncertainties in numerical models, MOST is used to compare predictions from 4 different grid resolutions (30 m, 20 m, 15 m and 12 m). Our work shows that predictions of particular products (runup and inundation lines) do not require the use of high-resolution (finer than 30 m) Digital Elevation Maps (DEMs). When predicting runup heights and inundation lines, numerical convergence was achieved using the 30 m resolution grid. In contrast, poor convergence was found in the flow velocity predictions, particularly the maximum flow velocities at 1 m depth. Also, runup height measurements and elevations from the DEM were used to estimate model bias. The results provided in this presentation will help understand the uncertainties in model predictions and locate possible sources of errors within a model.

  12. Quantifiers are incrementally interpreted in context, more than less

    PubMed Central

    Urbach, Thomas P.; DeLong, Katherine A.; Kutas, Marta

    2015-01-01

    Language interpretation is often assumed to be incremental. However, our studies of quantifier expressions in isolated sentences found N400 event-related brain potential (ERP) evidence for partial but not full immediate quantifier interpretation (Urbach & Kutas, 2010). Here we tested similar quantifier expressions in pragmatically supporting discourse contexts (Alex was an unusual toddler. Most/Few kids prefer sweets/vegetables…) while participants made plausibility judgments (Experiment 1) or read for comprehension (Experiment 2). Control Experiments 3A (plausibility) and 3B (comprehension) removed the discourse contexts. Quantifiers always modulated typical and/or atypical word N400 amplitudes. However, only the real-time N400 effects in Experiment 2 mirrored the offline quantifier and typicality crossover interaction effects for plausibility ratings and cloze probabilities. We conclude that quantifier expressions can be interpreted fully and immediately, though pragmatic and task variables appear to impact the speed and/or depth of quantifier interpretation. PMID:26005285

  13. Quantifying Riverscape Connectivity with Graph Theory

    NASA Astrophysics Data System (ADS)

    Carbonneau, P.; Milledge, D.; Sinha, R.; Tandon, S. K.

    2013-12-01

    Fluvial catchments convey fluxes of water, sediment, nutrients and aquatic biota. At continental scales, crustal topography defines the overall path of channels, whilst at local scales depositional and/or erosional features generally determine the exact path of a channel. Furthermore, constructions such as dams, for either water abstraction or hydropower, often have a significant impact on channel networks. The concept of 'connectivity' is commonly invoked when conceptualising the structure of a river network. This concept is easy to grasp, but there have been uneven efforts across the environmental sciences to actually quantify connectivity. Currently there have only been a few studies reporting quantitative indices of connectivity in river sciences, notably in the study of avulsion processes. However, the majority of current work describing some form of environmental connectivity in a quantitative manner is in the field of landscape ecology. Driven by the need to quantify habitat fragmentation, landscape ecologists have turned to graph theory. Within this formal setting, landscape ecologists have successfully developed a range of indices which can model connectivity loss. Such formal connectivity metrics are currently needed for a range of applications in fluvial sciences. One of the most urgent needs relates to dam construction. In the developed world, hydropower development has generally slowed and in many countries dams are actually being removed. However, this is not the case in the developing world, where hydropower is seen as a key element of low-emissions power security. For example, several dam projects are envisaged in Himalayan catchments in the next 2 decades. This region is already under severe pressure from climate change and urbanisation, and a better understanding of the network fragmentation which can be expected in this system is urgently needed. In this paper, we apply and adapt connectivity metrics from landscape ecology. We then examine the
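
    A toy sketch of the kind of graph-theoretic fragmentation metric involved: the probability that two randomly chosen reaches remain connected, before and after a dam severs one link. The dendritic network below is synthetic, and the metric is a generic coincidence-style index rather than the paper's adapted ones.

```python
# Fragmentation of a synthetic dendritic river network when one link is
# removed (a stand-in for dam construction).
import networkx as nx

def coincidence(G):
    """Probability that two randomly chosen reaches lie in the same piece."""
    n = G.number_of_nodes()
    return sum(len(c) * (len(c) - 1)
               for c in nx.connected_components(G)) / (n * (n - 1))

river = nx.balanced_tree(2, 5)             # 63 reaches in a branching tree
print("before dam:", coincidence(river))   # 1.0: fully connected

river.remove_edge(0, 1)                    # a dam severs one link near the outlet
print("after dam: ", coincidence(river))   # ~0.49: connectivity roughly halved
```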

  14. Quantifying fluvial bedrock erosion using repeat terrestrial Lidar

    NASA Astrophysics Data System (ADS)

    Cook, Kristen

    2013-04-01

    The Da'an River Gorge in western Taiwan provides a unique opportunity to observe the formation and evolution of a natural bedrock gorge. The 1.2 km long and up to 20 m deep gorge has formed since 1999 in response to uplift of the riverbed during the Chi-Chi earthquake. The extremely rapid pace of erosion enables us to observe both downcutting and channel widening over short time periods. We have monitored the evolution of the gorge since 2009 using repeat RTK GPS surveys and terrestrial Lidar scans. GPS surveys of the channel profile are conducted frequently, with 24 surveys to date, while Lidar scans are conducted after major floods, or after 5-9 months without a flood, for a total of 8 scans to date. The Lidar data are most useful for recording erosion of channel walls, which is quite episodic and highly variable along the channel. By quantifying the distribution of wall erosion in space and time, we can improve our understanding of channel widening processes and of the development of the channel planform, particularly the growth of bends. During the summer of 2012, the Da'an catchment experienced two large storm events: a meiyu (plum rain) event on June 10-13 that brought 800 mm of rain and a typhoon on August 1-3 that brought 650 mm of rain. The resulting floods had significant geomorphic effects on the Da'an gorge, including up to tens of meters of erosion in some sections of the gorge walls. We quantify these changes using Lidar surveys conducted on June 7, July 3, and August 30. Channel wall collapses also occur in the absence of large floods, and we use scans from August 23, 2011 and June 7, 2012 to quantify erosion during a period that included a number of small floods, but no large ones. This allows us to compare the impact of 9 months of normal conditions to the impact of short-duration extreme events. The observed variability of erosion in space and time highlights the need for 3D techniques such as terrestrial Lidar to properly quantify erosion in this

  15. Quantifying unsteadiness and dynamics of pulsatory volcanic activity

    NASA Astrophysics Data System (ADS)

    Dominguez, L.; Pioli, L.; Bonadonna, C.; Connor, C. B.; Andronico, D.; Harris, A. J. L.; Ripepe, M.

    2016-06-01

    , can also be described based on the log-logistic parameter s, which is found to increase from regular mafic systems to highly variable silicic systems. These results suggest that the periodicity of explosions, quantified in terms of the distribution of repose times, can give fundamental information about the system dynamics and changes regularly across eruptive styles (i.e., Strombolian to Vulcanian), allowing for direct comparison and quantification of different types of pulsatory activity during these eruptions.
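
    Fitting a repose-time distribution can be sketched with SciPy's log-logistic (Fisk) family; the sample below is synthetic, and the shape parameter c is assumed to play the role of the abstract's parameter s.

```python
# Fit a log-logistic distribution to synthetic repose times between
# explosions (not eruption data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
repose = stats.fisk.rvs(c=2.5, scale=60.0, size=300, random_state=rng)  # seconds

# Fit with the location fixed at zero; c is the log-logistic shape.
c, loc, scale = stats.fisk.fit(repose, floc=0)
median = stats.fisk.median(c, loc=loc, scale=scale)
print(f"shape = {c:.2f}, median repose = {median:.1f} s")
```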

  16. Quantifying landscape resilience using vegetation indices

    NASA Astrophysics Data System (ADS)

    Eddy, I. M. S.; Gergel, S. E.

    2014-12-01

    Landscape resilience refers to the ability of systems to adapt to and recover from disturbance. In pastoral landscapes, degradation can be measured in terms of increased desertification and/or shrub encroachment. In many countries across Central Asia, the use and resilience of pastoral systems has changed markedly over the past 25 years, influenced by centralized Soviet governance, private property rights and, recently, communal resource governance. In Kyrgyzstan, recent governance reforms were a response to the increasing degradation of pastures attributed to livestock overgrazing. Our goal is to examine and map the landscape-level factors that influence overgrazing throughout successive governance periods. Here, we map and examine some of the spatial factors influencing landscape resilience in agro-pastoral systems of the Kyrgyz Republic, where pastures occupy >50% of the country's area. We ask three questions: 1) Which mechanisms of pasture degradation (desertification vs. shrub encroachment) are detectable using remote sensing vegetation indices? 2) Are these degraded pastures associated with landscape features that influence herder mobility and accessibility (e.g., terrain, distance to other pastures)? 3) Have these patterns changed through successive governance periods? Using a chronosequence of Landsat imagery (1999-2014), NDVI and other VIs were used to identify trends in pasture condition during the growing season. Least-cost path distances as well as graph-theoretic indices were derived from topographic factors to assess landscape connectivity (from villages to pastures and among pastures). Fieldwork was used to assess the feasibility and accuracy of this approach using the most recent imagery. Previous research concluded that low herder mobility hindered pasture use, thus we expect the distance from pasture to village to be an important predictor of pasture condition. This research will quantify the magnitude of pastoral degradation and test

  17. A new model for quantifying climate episodes

    NASA Astrophysics Data System (ADS)

    Biondi, Franco; Kozubowski, Tomasz J.; Panorska, Anna K.

    2005-07-01

    When long records of climate (precipitation, temperature, stream runoff, etc.) are available, either from instrumental observations or from proxy records, the objective evaluation and comparison of climatic episodes becomes necessary. Such episodes can be quantified in terms of duration (the number of time intervals, e.g. years, the process remains continuously above or below a reference level) and magnitude (the sum of all series values for a given duration). The joint distribution of duration and magnitude is represented here by a stochastic model called BEG, for bivariate distribution with exponential and geometric marginals. The model is based on the theory of random sums, and its mathematical derivation confirms and extends previous empirical findings. Probability statements that can be obtained from the model are illustrated by applying it to a 2300-year dendroclimatic reconstruction of water-year precipitation for the eastern Sierra Nevada-western Great Basin. Using the Dust Bowl drought period as an example, the chance of a longer or greater drought is 8%. Conditional probabilities are much higher, i.e. a drought of that magnitude has a 62% chance of lasting for 11 years or longer, and a drought that lasts 11 years has a 46% chance of having an equal or greater magnitude. In addition, because of the bivariate model, we can estimate a 6% chance of witnessing a drought that is both longer and greater. Additional examples of model application are also provided. This type of information provides a way to place any climatic episode in a temporal perspective, and such numerical statements help with reaching science-based management and policy decisions.
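
    The duration/magnitude bookkeeping behind such a model can be sketched by extracting below-reference episodes from a series and tabulating each episode's duration and magnitude; the series and the reference episode below are synthetic, not the dendroclimatic reconstruction.

```python
# Extract (duration, magnitude) pairs for below-reference episodes and
# estimate a joint exceedance probability empirically.
import numpy as np

rng = np.random.default_rng(5)
precip = rng.normal(0.0, 1.0, 2300)   # synthetic anomalies vs. reference level 0

episodes, d, m = [], 0, 0.0
for x in precip:
    if x < 0:                         # below reference: the episode continues
        d, m = d + 1, m + (-x)        # duration in steps, magnitude as summed deficit
    elif d:                           # episode just ended: record it
        episodes.append((d, m))
        d, m = 0, 0.0

dur, mag = np.array(episodes).T
d0, m0 = 5, 6.0                       # a hypothetical reference drought
print("P(longer AND greater):", np.mean((dur >= d0) & (mag >= m0)))
```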

  18. Quantifying human vitamin kinetics using AMS

    SciTech Connect

    Hillegonds, D; Dueker, S; Ognibene, T; Buchholz, B; Lin, Y; Vogel, J; Clifford, A

    2004-02-19

    Tracing vitamin kinetics at physiologic concentrations has been hampered by a lack of quantitative sensitivity for chemically equivalent tracers that could be used safely in healthy people. Instead, elderly or ill volunteers were sought for studies involving pharmacologic doses with radioisotopic labels. These studies fail to be relevant in two ways: vitamins are inherently micronutrients, whose biochemical paths are saturated and distorted by pharmacological doses; and while vitamins remain important for health in the elderly or ill, their greatest effects may be in preventing slow and cumulative diseases by proper consumption throughout youth and adulthood. Neither the target dose nor the target population is available for nutrient metabolic studies through decay counting of radioisotopes at high levels. Stable isotopic labels are quantified by isotope ratio mass spectrometry at levels that trace physiologic vitamin doses, but the natural background of stable isotopes severely limits the time span over which the tracer is distinguishable. Indeed, study periods seldom ranged over a single biological mean life of the labeled nutrients, failing to provide data on the important final elimination phase of the compound. Kinetic data for the absorption phase is similarly rare in micronutrient research because the phase is rapid, requiring many consecutive plasma samples for accurate representation. However, repeated blood samples of sufficient volume for precise stable or radio-isotope quantitations consume an indefensible amount of the volunteer's blood over a short period. Thus, vitamin pharmacokinetics in humans has often relied on compartmental modeling based upon assumptions and tested only for the short period of maximal blood circulation, a period that poorly reflects absorption or final elimination kinetics except for the most simple models.

  19. Quantifying compositional impacts of ambient aerosol on cloud droplet formation

    NASA Astrophysics Data System (ADS)

    Lance, Sara

    It has historically been assumed that most of the uncertainty associated with the aerosol indirect effect on climate can be attributed to the unpredictability of updrafts. In Chapter 1, we analyze the sensitivity of cloud droplet number density to realistic variations in aerosol chemical properties and to variable updraft velocities using a 1-dimensional cloud parcel model in three important environmental cases (continental, polluted and remote marine). The results suggest that aerosol chemical variability may be as important to the aerosol indirect effect as the effect of unresolved cloud dynamics, especially in polluted environments. We next used a continuous-flow streamwise thermal gradient Cloud Condensation Nuclei counter (CCNc) to study the water-uptake properties of the ambient aerosol, by exposing an aerosol sample to a controlled water vapor supersaturation and counting the resulting number of droplets. In Chapter 2, we modeled and experimentally characterized the heat transfer properties and droplet growth within the CCNc. Chapter 3 describes results from the MIRAGE field campaign, in which the CCNc and a Hygroscopicity Tandem Differential Mobility Analyzer (HTDMA) were deployed at a ground-based site in March 2006. Size-resolved CCN activation spectra and growth factor distributions of the ambient aerosol in Mexico City were obtained, and an analytical technique was developed to quantify a probability distribution of solute volume fractions for the CCN in addition to the aerosol mixing-state. The CCN were shown to be much less CCN-active than ammonium sulfate, with water uptake properties more consistent with low molecular weight organic compounds. The pollution outflow from Mexico City was shown to have CCN with an even lower fraction of soluble material. "Chemical Closure" was attained for the CCN, by comparing the inferred solute volume fraction with that from direct chemical measurements. A clear diurnal pattern was observed for the CCN solute

  20. Understanding and quantifying foliar temperature acclimation for Earth System Models

    NASA Astrophysics Data System (ADS)

    Smith, N. G.; Dukes, J.

    2015-12-01

    Photosynthesis and respiration on land are the two largest carbon fluxes between the atmosphere and Earth's surface. The parameterization of these processes represents a major uncertainty in the terrestrial component of the Earth System Models used to project future climate change. Research has shown that much of this uncertainty is due to the parameterization of the temperature responses of leaf photosynthesis and autotrophic respiration, which are typically based on short-term empirical responses. Here, we show that including longer-term responses to temperature, such as temperature acclimation, can help to reduce this uncertainty and improve model performance, leading to drastic changes in future land-atmosphere carbon feedbacks across multiple models. However, these acclimation formulations have many flaws, including an underrepresentation of many important global flora. In addition, these parameterizations were derived from multiple studies that employed differing methodology. We therefore used a consistent methodology to quantify the short- and long-term temperature responses of maximum Rubisco carboxylation (Vcmax), the maximum rate of ribulose-1,5-bisphosphate regeneration (Jmax), and dark respiration (Rd) in multiple species representing each of the plant functional types used in global-scale land surface models. Short-term temperature responses of each process were measured in individuals acclimated for 7 days at one of 5 temperatures (15-35°C). The comparison of short-term curves in plants acclimated to different temperatures was used to evaluate long-term responses. Our analyses indicated that the instantaneous response of each parameter was highly sensitive to the temperature at which the plants were acclimated. However, we found that this sensitivity was larger in species whose leaves typically experience a greater range of temperatures over the course of their lifespan. These data indicate that models using previous acclimation formulations are likely incorrectly

  1. Quantifying methane flux from lake sediments using multibeam sonar

    NASA Astrophysics Data System (ADS)

    Scandella, B.; Urban, P.; Delwiche, K.; Greinert, J.; Hemond, H.; Ruppel, C. D.; Juanes, R.

    2013-12-01

    Methane is a potent greenhouse gas, and the production and emission of methane from sediments in wetlands, lakes and rivers both contributes to and may be exacerbated by climate change. In some of these shallow-water settings, methane fluxes may be largely controlled by episodic venting that can be triggered by drops in hydrostatic pressure. Even with better constraints on the mechanisms for gas release, quantifying these fluxes has remained a challenge due to rapid spatiotemporal changes in the patterns of bubble emissions from the sediments. The research presented here uses a fixed-location Imagenex DeltaT 837B multibeam sonar to estimate methane-venting fluxes from organic-rich lake sediments over a large area (~400 m2) and over a multi-season deployment period with unprecedented spatial and temporal resolution. Simpler, single-beam sonar systems have been used in the past to estimate bubble fluxes in a variety of settings. Here we extend this methodology to a multibeam system by means of: (1) detailed calibration of the sonar signal against imposed bubble streams, and (2) validation against an in situ independent record of gas flux captured by overlying bubble traps. The calibrated sonar signals then yield estimates of the methane flux with high spatial resolution (~1 m) and temporal frequency (6 Hz) from a portion of the deepwater basin of Upper Mystic Lake, MA, USA, a temperate eutrophic kettle lake. These results in turn inform mathematical models of methane transport and release from the sediments, which reproduce with high fidelity the ebullitive response to hydrostatic pressure variations. In addition, the detailed information about spatial variability of methane flux derived from sonar records is used to estimate the uncertainty associated with upscaling flux measurements from bubble traps to the scale of the sonar observation area. Taken together, these multibeam sonar measurements and analysis provide a novel quantitative approach for the assessment of

  2. A new soil mechanics approach to quantify and predict land subsidence by peat compression

    NASA Astrophysics Data System (ADS)

    Koster, Kay; Erkens, Gilles; Zwanenburg, Cor

    2016-10-01

    Land subsidence threatens many coastal areas. Quantifying current and predicting future subsidence are essential to sustain the viability of these areas with respect to rising sea levels. Despite its scale and severity, methods to quantify subsidence are scarce. In peat-rich subsidence hot spots, subsidence is often caused by peat compression. We introduce the standard Cone Penetration Test (CPT) as a technique to quantify subsidence due to compression of peat. In a test in the Holland coastal plain, the Netherlands, we found a strong relationship between the thickness reduction of peat and cone resistance, due to an increase in peat stiffness after compression. We use these results to quantify subsidence of peat in subsiding areas of the Sacramento-San Joaquin Delta and Kalimantan, and found values consistent with previous observations. These results open the door for CPT as a new method to document past, and predict future, subsidence due to peat compression over large areas.

  3. Quantifying spore viability of the honey bee pathogen Nosema apis using flow cytometry.

    PubMed

    Peng, Yan; Lee-Pullen, Tracey F; Heel, Kathy; Millar, A Harvey; Baer, Boris

    2014-05-01

    Honey bees are hosts to more than 80 different parasites, some of them being highly virulent and responsible for substantial losses in managed honey bee populations. The study of honey bee pathogens and their interactions with the bees' immune system has therefore become a research area of major interest. Here we developed a fast, accurate and reliable method to quantify the viability of spores of the honey bee gut parasite Nosema apis. To verify this method, a dilution series with 0, 25, 50, 75, and 100% live N. apis was made and SYTO 16 and Propidium Iodide (n = 35) were used to distinguish dead from live spores. The viability of spores in each sample was determined by flow cytometry and compared with the current method based on fluorescence microscopy. Results show that N. apis viability counts using flow cytometry produced very similar results when compared with fluorescence microscopy. However, we found that fluorescence microscopy underestimates N. apis viability in samples with higher percentages of viable spores, the latter typically being what is found in biological samples. A series of experiments were conducted to confirm that flow cytometry allows the use of additional fluorescent dyes such as SYBR 14 and SYTOX Red (used in combination with SYTO 16 or Propidium Iodide) to distinguish dead from live spores. We also show that spore viability quantification with flow cytometry can be undertaken using substantially lower dye concentrations than fluorescence microscopy. In conclusion, our data show flow cytometry to be a fast, reliable method to quantify N. apis spore viabilities, which has a number of advantages compared with existing methods.

  4. Quantifying Temperature-Dependent T1 Changes in Cortical Bone Using Ultrashort Echo-Time MRI

    PubMed Central

    Han, Misung; Rieke, Viola; Scott, Serena J; Ozhinsky, Eugene; Salgaonkar, Vasant A; Jones, Peter D; Larson, Peder E Z; Diederich, Chris J; Krug, Roland

    2015-01-01

    Purpose To demonstrate the feasibility of using ultrashort echo-time (UTE) MRI to quantify T1 changes in cortical bone due to heating. Methods Variable flip-angle T1 mapping combined with 3D UTE imaging was used to measure T1 in cortical bone. A calibration experiment was performed to detect T1 changes with temperature in ex vivo cortical bone samples from a bovine femur. Ultrasound heating experiments were performed using an interstitial applicator in ex vivo bovine femur specimens, and heat-induced T1 changes were quantified. Results The calibration experiment demonstrated that T1 increases with temperature in cortical bone. We observed a linear relationship between temperature and T1 with a linear coefficient of 0.67–0.84 ms/°C over a range of 25–70°C. The ultrasound heating experiments showed increased T1 changes in the heated regions, and the relationship between the temperature changes and T1 changes was similar to that of the calibration. Conclusion We demonstrated a temperature dependence of T1 in ex vivo cortical bone using a variable flip-angle UTE T1 mapping method. PMID:26390357
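
    A back-of-envelope use of the reported calibration, taking a mid-range coefficient from the 0.67-0.84 ms/°C interval; the measured T1 change in the sketch is hypothetical.

```python
# Convert a measured T1 change in cortical bone to a temperature change
# using a mid-range calibration coefficient from the abstract.
coeff_ms_per_degc = 0.75           # within the reported 0.67-0.84 ms/degC
delta_t1_ms = 18.0                 # hypothetical measured T1 increase
print(f"estimated heating: {delta_t1_ms / coeff_ms_per_degc:.0f} degC")
```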

  5. A Frailty Index Based On Deficit Accumulation Quantifies Mortality Risk in Humans and in Mice

    PubMed Central

    Rockwood, K.; Blodgett, J. M.; Theou, O.; Sun, M. H.; Feridooni, H. A.; Mitnitski, A.; Rose, R. A.; Godin, J.; Gregson, E.; Howlett, S. E.

    2017-01-01

    Although many common diseases occur mostly in old age, the impact of ageing itself on disease risk and expression often goes unevaluated. To consider the impact of ageing requires some useful means of measuring variability in health in animals of the same age. In humans, this variability has been quantified by counting age-related health deficits in a frailty index. Here we show the results of extending that approach to mice. Across the life course, many important features of deficit accumulation are present in both species. These include gradual rates of deficit accumulation (slope = 0.029 in humans; 0.036 in mice), a submaximal limit (0.54 in humans; 0.44 in mice), and a strong relationship to mortality (1.05 [1.04–1.05] in humans; 1.15 [1.12–1.18] in mice). Quantifying deficit accumulation in individual mice provides a powerful new tool that can facilitate translation of research on ageing, including in relation to disease. PMID:28220898
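
    A minimal sketch of a deficit-accumulation frailty index, which is simply the fraction of assessed deficits an individual has; the items and values below are illustrative, not the study's instrument.

```python
# Deficit-accumulation frailty index: deficits present / deficits assessed.
deficits = {               # 1 = present, 0 = absent (items can also be graded 0-1)
    "impaired_gait": 1, "low_grip_strength": 1, "hearing_loss": 0,
    "polypharmacy": 1, "weight_loss": 0, "cognitive_complaint": 0,
    "diabetes": 1, "hypertension": 1, "poor_vision": 0, "fatigue": 1,
}
fi = sum(deficits.values()) / len(deficits)
print(f"frailty index = {fi:.2f}")  # compare with the ~0.44-0.54 submaximal limits
```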

  6. Quantifying the effect of environment stability on the transcription factor repertoire of marine microbes

    PubMed Central

    2011-01-01

    Background DNA-binding transcription factors (TFs) regulate cellular functions in prokaryotes, often in response to environmental stimuli. Thus, the environment exerts constant selective pressure on the TF gene content of microbial communities. Recently a study on marine Synechococcus strains detected differences in their genomic TF content related to environmental adaptation, but so far the effect of environmental parameters on the content of TFs in bacterial communities has not been systematically investigated. Results We quantified the effect of environment stability on the transcription factor repertoire of marine pelagic microbes from the Global Ocean Sampling (GOS) metagenome using interpolated physico-chemical parameters and multivariate statistics. Thirty-five percent of the difference in relative TF abundances between samples could be explained by environment stability. Six percent was attributable to spatial distance but none to a combination of both spatial distance and stability. Some individual TFs showed a stronger relationship to environment stability and space than the total TF pool. Conclusions Environmental stability appears to have a clearly detectable effect on TF gene content in bacterioplanktonic communities described by the GOS metagenome. Interpolated environmental parameters were shown to compare well to in situ measurements and were essential for quantifying the effect of the environment on the TF content. It is demonstrated that comprehensive and well-structured contextual data will strongly enhance our ability to interpret the functional potential of microbes from metagenomic data. PMID:22587903

  7. Chimpanzees (Pan troglodytes) and bonobos (Pan paniscus) quantify split solid objects.

    PubMed

    Cacchione, Trix; Hrubesch, Christine; Call, Josep

    2013-01-01

    Recent research suggests that gorillas' and orangutans' object representations survive cohesion violations (e.g., a split of a solid object into two halves), but that their processing of quantities may be affected by them. We assessed chimpanzees' (Pan troglodytes) and bonobos' (Pan paniscus) reactions to various fission events in the same series of action tasks modelled after infant studies previously run on gorillas and orangutans (Cacchione and Call in Cognition 116:193-203, 2010b). Results showed that all four non-human great ape species managed to quantify split objects but that their performance varied as a function of the non-cohesiveness produced in the splitting event. Spatial ambiguity and shape invariance had the greatest impact on apes' ability to represent and quantify objects. Further, we observed species differences with gorillas performing lower than other species. Finally, we detected a substantial age effect, with ape infants below 6 years of age being outperformed by both juvenile/adolescent and adult apes.

  8. Characterizing uncertainties for quantifying bathymetry change between time-separated multibeam echo-sounder surveys

    NASA Astrophysics Data System (ADS)

    Schmitt, Thierry; Mitchell, Neil C.; Ramsay, A. Tony S.

    2008-05-01

    Changes of bathymetry derived from multibeam sonars are useful for quantifying the effects of many sedimentary, tectonic and volcanic processes, but depth changes also require an assessment of their uncertainty. Here, we outline and illustrate a simple technique that aims both to quantify uncertainties and to help reveal the spatial character of errors. An area of immobile seafloor is mapped in each survey, providing a common 'benchmark'. Each survey dataset over the benchmark is filtered with a simple moving-average window and depth differences between the two surveys are collated to derive a difference histogram. The procedure is repeated using different length-scales of filtering. By plotting the variability of the differences versus the length-scale of the filter, the different effects of spatially uncorrelated and correlated noise can be deduced. The former causes variability to decrease systematically as predicted by the Central Limit Theorem, whereas the remaining variability not predicted by the Central Limit Theorem then represents the effect of spatially correlated noise. Calculations made separately for different beams can reveal whether problems are due to heave, roll, etc., which affect inner and outer beams differently. We show how the results can be applied to create a map of uncertainties, which can be used to remove insignificant data from the bathymetric change map. We illustrate the technique by characterizing changes in nearshore bed morphology over one annual cycle using data from a subtidal bay, bedrock headland and a banner sand bank in the Bristol Channel, UK.
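
    The benchmark analysis can be sketched by differencing two simulated surveys of the same immobile seafloor, smoothing with growing windows, and comparing the observed spread with the Central Limit Theorem prediction for the uncorrelated component; all noise levels below are invented.

```python
# Separate uncorrelated from correlated survey noise via window-size scaling.
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(6)
shape = (512, 512)

def survey():
    """One synthetic survey: uncorrelated sounding noise + smooth correlated noise."""
    white = rng.normal(0, 0.10, shape)                      # per-sounding noise
    ripple = uniform_filter(rng.normal(0, 0.5, shape), 25)  # heave/roll-like, correlated
    return white + ripple

diff = survey() - survey()          # immobile benchmark: differences are pure noise
for w in (1, 2, 4, 8, 16, 32):
    sd = uniform_filter(diff, w).std()
    # For white noise alone, a w x w mean filter shrinks the spread by 1/w.
    print(f"window {w:2d} px: sd = {sd:.4f}  (CLT, white only: {0.10 * np.sqrt(2) / w:.4f})")
```

The spread tracks the 1/w prediction at first, then flattens at the level set by the correlated component, which is exactly the diagnostic the abstract describes.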

  9. Quantifying Mountain Block Recharge by Means of Catchment-Scale Storage-Discharge Relationships

    NASA Astrophysics Data System (ADS)

    Ajami, H.; Troch, P. A.; Maddock, T.; Meixner, T.; Eastoe, C. J.

    2009-12-01

    Despite the hydrologic significance of mountainous catchments in providing freshwater resources, especially in semi-arid regions, little is known about key hydrological processes in these systems, such as mountain block recharge (MBR). We developed an empirical approach based on the storage sensitivity function introduced by Kirchner (2009) to derive storage-discharge relationships from stream flow analysis. We investigated the sensitivity of MBR estimates to uncertainty in the derivation of the catchment storage-discharge relations. We implemented this technique in a semi-arid mountainous catchment in southeastern Arizona, USA (the Marshall Gulch catchment in the Santa Catalina Mountains near Tucson) with two distinct rainy seasons, winter frontal storms and summer monsoon, separated by prolonged dry periods. Developing the storage-discharge relation from baseflow data in the dry period allowed us to quantify the change in fractured bedrock storage caused by MBR. The contribution of fractured bedrock to stream flow was confirmed using stable isotope data. Our results show that 1) incorporating scalable time steps to correct for stream flow measurement errors improves the model fit; 2) the quantile method is more suitable for stream flow data binning; 3) the choice of the regression model is more critical when the storage-discharge function is used to predict changes in bedrock storage beyond the maximum observed flow in the catchment; and 4) the use of daily versus hourly flow did not affect the storage-discharge relationship. This methodology allowed us to quantify MBR using stream flow recession analysis from within the mountain system.

  10. A Synthetic Phased Array Surface Acoustic Wave Sensor for Quantifying Bolt Tension

    PubMed Central

    Martinez, Jairo; Sisman, Alper; Onen, Onursal; Velasquez, Dean; Guldiken, Rasim

    2012-01-01

    In this paper, we report our findings on implementing a synthetic phased array surface acoustic wave sensor to quantify bolt tension. Maintaining proper bolt tension is important in many fields, such as ensuring the safe operation of civil infrastructure. Significant advantages of this relatively simple methodology are its ability to assess bolt tension without any contact with the bolt (enabling measurement at inaccessible locations), its ability to measure multiple bolts at a time, and the fact that it requires neither data collection during installation nor calibration. We performed detailed experiments on a custom-built flexible bench-top experimental setup consisting of a 1018 steel plate of 12.7 mm (½ in) thickness, a 6.4 mm (¼ in) grade 8 bolt and a stainless steel washer with 19 mm (¾ in) external diameter. Our results indicate that this method is not only capable of clearly distinguishing properly bolted joints from loosened joints but is also capable of quantifying how loose the bolt actually is. We also conducted a detailed signal-to-noise ratio (SNR) analysis and showed that the SNR value for the entire bolt tension range was sufficient for image reconstruction.

  11. Quantifying Land Use Impacts on Biodiversity: Combining Species-Area Models and Vulnerability Indicators.

    PubMed

    Chaudhary, Abhishek; Verones, Francesca; de Baan, Laura; Hellweg, Stefanie

    2015-08-18

    Habitat degradation and subsequent biodiversity damage often take place far from the place of consumption because of globalization and the increasing level of international trade. Informing consumers and policy makers about the biodiversity impacts "hidden" in the life cycle of imported products is an important step toward achieving sustainable consumption patterns. Spatially explicit methods are needed in life cycle assessment to accurately quantify biodiversity impacts of products and processes. We use the Countryside species-area relationship (SAR) to quantify regional species loss due to land occupation and transformation for five taxa and six land use types in 804 terrestrial ecoregions. Further, we calculate vulnerability scores for each ecoregion based on the fraction of each species' geographic range (endemic richness) hosted by the ecoregion and the IUCN assigned threat level of each species. Vulnerability scores are multiplied with SAR-predicted regional species loss to estimate potential global extinctions per unit of land use. As a case study, we assess the land use biodiversity impacts of 1 kg of bioethanol produced using six different feed stocks in different parts of the world. Results show that the regions with highest biodiversity impacts differed markedly when the vulnerability of species was included.
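
    The species-area machinery can be summarized with the classic power-law SAR and an affinity-weighted, countryside-style extension; this is a common formulation of the general form, given as a sketch rather than the paper's exact equation.

```latex
% Power-law SAR, and regional loss under a countryside-style weighting,
% where h_j is the taxon's affinity for land-use type j (sketch only).
S = c\,A^{z}
\qquad\qquad
S_{\mathrm{lost}} = S_{\mathrm{org}}\!\left[\,1 -
  \left(\frac{A_{\mathrm{nat}} + \sum_{j} h_{j} A_{j}}{A_{\mathrm{org}}}\right)^{\!z}\right]
```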

  12. An integrated method for quantifying root architecture of field-grown maize

    PubMed Central

    Wu, Jie; Guo, Yan

    2014-01-01

    Background and Aims A number of techniques have recently been developed for studying the root system architecture (RSA) of seedlings grown in various media. In contrast, methods for sampling and analysis of the RSA of field-grown plants, particularly for details of the lateral root components, are generally inadequate. Methods An integrated methodology was developed that includes a custom-made root-core sampling system for extracting intact root systems of individual maize plants, a combination of proprietary software and a novel program used for collecting individual RSA information, and software for visualizing the measured individual nodal root architecture. Key Results Example experiments show that large root cores can be sampled, and topological and geometrical structure of field-grown maize root systems can be quantified and reconstructed using this method. Second- and higher order laterals are found to contribute substantially to total root number and length. The length of laterals of distinct orders varies significantly. Abundant higher order laterals can arise from a single first-order lateral, and they concentrate in the proximal axile branching zone. Conclusions The new method allows more meaningful sampling than conventional methods because of its easily opened, wide corer and sampling machinery, and effective analysis of RSA using the software. This provides a novel technique for quantifying RSA of field-grown maize and also provides a unique evaluation of the contribution of lateral roots. The method also offers valuable potential for parameterization of root architectural models. PMID:24532646

  13. Comprehensive analysis of individual pulp fiber bonds quantifies the mechanisms of fiber bonding in paper

    NASA Astrophysics Data System (ADS)

    Hirn, Ulrich; Schennach, Robert

    2015-05-01

    The process of papermaking consumes substantial amounts of energy and wood, which carries considerable environmental costs. In order to optimize papermaking for its many applications in materials science and engineering, a quantitative understanding of the bonding forces between individual pulp fibers is important. Here we show the first approach to quantifying the bonding energies contributed by the individual bonding mechanisms. We calculated the impact on the bonding energy of the following mechanisms necessary for paper formation: mechanical interlocking, interdiffusion, capillary bridges, hydrogen bonding, van der Waals forces, and Coulomb forces. Experimental results quantify the area in molecular contact necessary for bonding. Atomic force microscopy experiments derive the impact of mechanical interlocking. Capillary bridges also contribute to the bond. A model based on the crystal structure of cellulose leads to values for the chemical bonds. In contrast to the general belief favoring hydrogen bonding, van der Waals bonds play the most important role according to our model. Comparison with experimentally derived bond energies supports the presented model. This study characterizes bond formation between pulp fibers, leading to insights that could potentially be used to optimize the papermaking process while reducing energy and wood consumption.

  14. Quantifying temporal bone morphology of great apes and humans: an approach using geometric morphometrics

    PubMed Central

    Lockwood, Charles A; Lynch, John M; Kimbel, William H

    2002-01-01

    The hominid temporal bone offers a complex array of morphology that is linked to several different functional systems. Its frequent preservation in the fossil record gives the temporal bone added significance in the study of human evolution, but its morphology has proven difficult to quantify. In this study we use techniques of 3D geometric morphometrics to quantify differences among humans and great apes and discuss the results in a phylogenetic context. Twenty-three landmarks on the ectocranial surface of the temporal bone provide a high level of anatomical detail. Generalized Procrustes analysis (GPA) is used to register (adjust for position, orientation and scale) landmark data from 405 adults representing Homo, Pan, Gorilla and Pongo. Principal components analysis of residuals from the GPA shows that the major source of variation is between humans and apes. Human characteristics such as a coronally orientated petrous axis, a deep mandibular fossa, a projecting mastoid process, and reduced lateral extension of the tympanic element strongly impact the analysis. In phenetic cluster analyses, gorillas and orangutans group together with respect to chimpanzees, and all apes group together with respect to humans. Thus, the analysis contradicts depictions of African apes as a single morphotype. Gorillas and orangutans lack the extensive preglenoid surface of chimpanzees, and their mastoid processes are less medially inflected. These and other characters shared by gorillas and orangutans are probably primitive for the African hominid clade. PMID:12489757
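
    The GPA-then-PCA pipeline can be sketched with an ordinary two-shape Procrustes alignment against a consensus configuration (full GPA iterates this over all specimens); the landmark data below are random stand-ins for the 23 temporal-bone points.

```python
# Procrustes alignment of landmark configurations followed by PCA on the
# residuals (synthetic data; a sketch of the pipeline, not full GPA).
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(7)
consensus = rng.normal(size=(23, 3))     # stand-in consensus shape
specimens = [consensus + rng.normal(0, 0.05, (23, 3)) for _ in range(40)]

# Align each specimen to the consensus: removes position, orientation, scale.
aligned = np.array([procrustes(consensus, s)[1].ravel() for s in specimens])
X = aligned - aligned.mean(axis=0)       # Procrustes residuals

U, S, Vt = np.linalg.svd(X, full_matrices=False)   # PCA via SVD
print(f"PC1 explains {S[0]**2 / (S**2).sum():.1%} of shape variance")
```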

  15. Quantifying Relative Diver Effects in Underwater Visual Censuses

    PubMed Central

    Dickens, Luke C.; Goatley, Christopher H. R.; Tanner, Jennifer K.; Bellwood, David R.

    2011-01-01

    Diver-based Underwater Visual Censuses (UVCs), particularly transect-based surveys, are key tools in the study of coral reef fish ecology. These techniques, however, have inherent problems that make it difficult to collect accurate numerical data. One of these problems is the diver effect (defined as the reaction of fish to a diver). Although widely recognised, its effects have yet to be quantified and the extent of taxonomic variation remains to be determined. We therefore examined relative diver effects on a reef fish assemblage on the Great Barrier Reef. Using common UVC methods, the recorded abundance of seven reef fish groups were significantly affected by the ongoing presence of SCUBA divers. Overall, the diver effect resulted in a 52% decrease in the mean number of individuals recorded, with declines of up to 70% in individual families. Although the diver effect appears to be a significant problem, UVCs remain a useful approach for quantifying spatial and temporal variation in relative fish abundances, especially if using methods that minimise the exposure of fishes to divers. Fixed distance transects using tapes or lines deployed by a second diver (or GPS-calibrated timed swims) would appear to maximise fish counts and minimise diver effects. PMID:21533039

  16. Methods for quantifying uncertainty in fast reactor analyses.

    SciTech Connect

    Fanning, T. H.; Fischer, P. F.

    2008-04-07

    Liquid-metal-cooled fast reactors in the form of sodium-cooled fast reactors have been successfully built and tested in the U.S. and throughout the world. However, no fast reactor has operated in the U.S. for nearly fourteen years. More importantly, the U.S. has not constructed a fast reactor in nearly 30 years. In addition to reestablishing the necessary industrial infrastructure, the development, testing, and licensing of a new, advanced fast reactor concept will likely require a significant base technology program that will rely more heavily on modeling and simulation than has been done in the past. The ability to quantify uncertainty in modeling and simulations will be an important part of any experimental program and can provide added confidence that established design limits and safety margins are appropriate. In addition, there is an increasing demand from the nuclear industry for best-estimate analysis methods to provide confidence bounds along with their results. The ability to quantify uncertainty will be an important component of modeling that is used to support design, testing, and experimental programs. Three avenues of UQ investigation are proposed. Two relatively new approaches are described which can be directly coupled to simulation codes currently being developed under the Advanced Simulation and Modeling program within the Reactor Campaign. A third approach, based on robust Monte Carlo methods, can be used in conjunction with existing reactor analysis codes as a means of verification and validation of the more detailed approaches.

  17. Quantifying serum antibody in bird fanciers' hypersensitivity pneumonitis

    PubMed Central

    McSharry, Charles; Dye, George M; Ismail, Tengku; Anderson, Kenneth; Spiers, Elizabeth M; Boyd, Gavin

    2006-01-01

    Background Detecting serum antibody against inhaled antigens is an important diagnostic adjunct for hypersensitivity pneumonitis (HP). We sought to validate a quantitative fluorimetric assay testing serum from bird fanciers. Methods Antibody activity was assessed in bird fanciers and control subjects using various avian antigens and serological methods, and the titer was compared with symptoms of HP. Results IgG antibody against pigeon serum antigens, quantified by fluorimetry, provided a good discriminator of disease. Levels below 10 mg/L were insignificant, and increasing titers were associated with disease. The assay was unaffected by total IgG, autoantibodies and antibody to dietary hen's egg antigens. Antigens from pigeon serum seem sufficient to recognize immune sensitivity to most common pet avian species. Decreasing antibody titers confirmed antigen avoidance. Conclusion Increasing antibody titer reflected the likelihood of HP, and decreasing titers confirmed antigen avoidance. Quantifying antibody was rapid, and the increased sensitivity should reduce the rate of false-negative reporting and obviate the need for invasive diagnostic procedures. Automated fluorimetry provides a method for the international standardization of HP serology, thereby improving quality control and its suitability as a diagnostic adjunct. PMID:16800875

  18. Quantifying the Magnitude of Anomalous Solar Absorption

    SciTech Connect

    Ackerman, Thomas P.; Flynn, Donna M.; Marchand, Roger T.

    2003-05-16

    The data set from ARESE II, sponsored by the Atmospheric Radiation Measurement Program, provides a unique opportunity to understand solar absorption in the atmosphere because of the combination of three sets of broadband solar radiometers mounted on the Twin Otter aircraft and the ground based instruments at the ARM Southern Great Plains facility. In this study, we analyze the measurements taken on two clear sky days and three cloudy days and model the solar radiative transfer in each case with two different models. On the two clear days, the calculated and measured column absorptions agree to better than 10 Wm-2, which is about 10% of the total column absorption. Because both the model fluxes and the individual radiometer measurements are accurate to no better than 10 Wm-2, we conclude that the models and measurements are essentially in agreement. For the three cloudy days, the model calculations agree very well with each other and on two of the three days agree with the measurements to 20 Wm-2 or less out of a total column absorption of more than 200 Wm-2, which is again agreement at better than 10%. On the third day, the model and measurements agree to either 8% or 14% depending on which value of surface albedo is used. Differences exceeding 10% represent a significant absorption difference between model and observations. In addition to the uncertainty in absorption due to surface albedo, we show that including aerosol with an optical depth similar to that found on clear days can reduce the difference between model and measurement by 5% or more. Thus, we conclude that the ARESE II results are incompatible with previous studies reporting extreme anomalous absorption and can be modeled with our current understanding of radiative transfer.

  19. Human platelet sulfotransferase shows seasonal rhythms.

    PubMed

    Marazziti, D; Palego, L; Mazzanti, C; Silvestri, S; Cassano, G B

    1995-04-01

    Our study aimed to investigate the possible presence of seasonal changes in platelet phenolsulfotransferase (ST) in a group of 20 healthy, drug-free subjects of both sexes between 24 and 37 years of age. Blood samples were taken four times a year, in the periods immediately following the equinoxes and the solstices. The results showed that both STs underwent seasonal changes: the lowest values were found in autumn and winter, and the highest in summer. A positive correlation between the two STs and the length of the photoperiod was observed in winter, whereas in spring we detected a negative correlation between the TL ST and photoperiod length. Future studies should clarify whether the platelet ST of patients with mood disorders shows a similar seasonality.

  20. Quantifying the exploratory behaviour of Amphibalanus amphitrite cyprids.

    PubMed

    Chaw, Kuan Chun; Birch, William R

    2009-10-01

    The behavioural response of cypris larvae from A. amphitrite (=Balanus amphitrite) exploring three model glass surfaces is quantified by close-range microscopy. Step length and step duration measurements reveal a response to both surface properties and flow. Without flow, 2-day-old cyprids took larger steps with shorter step duration on hydrophilic glass surfaces (bare and NH2-treated) vs hydrophobic glass (CH3-treated). These parameters suggest a more detailed, local inspection of hydrophobic surfaces and a more extensive exploration for hydrophilic surfaces. Cyprids under flow took longer steps and exhibited shorter probing times on hydrophobic glass. On hydrophilic glass, cyprids increased their step duration under flow. This active response is attributed to drag and lift forces challenging the cyprids' temporary anchoring to the substratum. Seven-day-old cyprids showed almost no discrimination between the model surfaces. Microscopic-scale observation of cyprid exploration is expected to provide new insights into interactions between cyprids and surfaces.

  1. Quantifying Age-dependent Extinction from Species Phylogenies

    PubMed Central

    Alexander, Helen K.; Lambert, Amaury; Stadler, Tanja

    2016-01-01

    Several ecological factors that could play into species extinction are expected to correlate with species age, i.e., time elapsed since the species arose by speciation. To date, however, statistical tools to incorporate species age into likelihood-based phylogenetic inference have been lacking. We present here a computational framework to quantify age-dependent extinction through maximum likelihood parameter estimation based on phylogenetic trees, assuming species lifetimes are gamma distributed. Testing on simulated trees shows that neglecting age dependence can lead to biased estimates of key macroevolutionary parameters. We then apply this method to two real data sets, namely a complete phylogeny of birds (class Aves) and a clade of self-compatible and -incompatible nightshades (Solanaceae), gaining initial insights into the extent to which age-dependent extinction may help explain macroevolutionary patterns. Our methods have been added to the R package TreePar. PMID:26405218

  2. Front tracking for characterizing and quantifying reactive mixing

    NASA Astrophysics Data System (ADS)

    Kelley, Douglas; Nevins, Thomas

    2016-11-01

    Mixing in industrial chemical reactors involves complicated interactions between advection, reaction, and diffusion that are difficult to simulate or measure in detail. However, in large-Damköhler-number systems which show sharp fronts between reacted and unreacted regions, reactor dynamics might be more simply and usefully characterized in terms of the reaction fronts themselves. In fact, prior work has already shown that the reaction rate and material diffusivity can be calculated directly if front speed and front thickness are known. We have developed methods to optically track reaction fronts, measuring their speed and thickness throughout space and time. We will present such measurements in both simulation and experiment, consider their statistics, and discuss future efforts to characterize and quantify mixing in chemical reactors.
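
    For illustration, if the observed fronts are of pulled FKPP type, the front speed v = 2*sqrt(D*r) and thickness l = sqrt(D/r) invert to give the reaction rate r and diffusivity D directly from the tracked front measurements. A minimal sketch under that assumption (the numerical values are hypothetical):

        # Measured front properties (hypothetical values for illustration)
        front_speed = 1.2e-3      # m/s, from tracked front positions over time
        front_thickness = 0.8e-3  # m, width of the reacted/unreacted interface

        # For a pulled FKPP-type front: v = 2*sqrt(D*r), l = sqrt(D/r),
        # which invert to D = v*l/2 and r = v/(2*l).
        diffusivity = front_speed * front_thickness / 2.0
        reaction_rate = front_speed / (2.0 * front_thickness)

        print(f"D ~ {diffusivity:.2e} m^2/s, r ~ {reaction_rate:.2e} 1/s")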

  3. Quantifying chaotic dynamics from integrate-and-fire processes

    SciTech Connect

    Pavlov, A. N.; Pavlova, O. N.; Mohammad, Y. K.; Kurths, J.

    2015-01-15

    Characterizing chaotic dynamics from integrate-and-fire (IF) interspike intervals (ISIs) is relatively easy at high firing rates. When the firing rate is low, correctly estimating the Lyapunov exponents (LEs) that describe the dynamical features of complex oscillations reflected in IF ISI sequences becomes more complicated. In this work we discuss the peculiarities and limitations of quantifying chaotic dynamics from IF point processes. We consider the main factors leading to underestimated LEs and demonstrate a way of improving the numerical determination of LEs from IF ISI sequences. We show that the two largest LEs can be estimated using around 400 mean periods of chaotic oscillations in the regime of phase-coherent chaos. Application to real data is discussed.
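
    As a generic illustration of estimating the largest LE from an ISI sequence, a Rosenstein-style divergence-curve estimator can be sketched as below; this is not the authors' algorithm, and the embedding parameters are assumptions that would need tuning for real IF data:

        import numpy as np

        def largest_lyapunov(isi, dim=4, tau=1, theiler=10, k_max=40, fit_pts=10):
            """Rosenstein-style largest-LE estimate from an ISI sequence,
            in units of 1/interval. O(n^2) memory; meant for short records."""
            x = np.asarray(isi, dtype=float)
            n = len(x) - (dim - 1) * tau
            # Delay-embed the ISI series into a dim-dimensional state space
            emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
            # Nearest neighbour of each point, excluding temporal neighbours
            dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
            for i in range(n):
                dist[i, max(0, i - theiler):i + theiler + 1] = np.inf
            nn = np.argmin(dist, axis=1)
            # Mean log-separation of neighbour pairs after k steps
            div = []
            for k in range(1, k_max):
                idx = np.arange(n - k)
                ok = nn[idx] + k < n
                sep = np.linalg.norm(emb[idx[ok] + k] - emb[nn[idx[ok]] + k], axis=1)
                div.append(np.mean(np.log(sep[sep > 0])))
            # Slope of the initial linear part of the divergence curve
            return np.polyfit(np.arange(1, fit_pts + 1), div[:fit_pts], 1)[0]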

  4. Quantifying disorder through conditional entropy: an application to fluid mixing.

    PubMed

    Brandani, Giovanni B; Schor, Marieke; Macphee, Cait E; Grubmüller, Helmut; Zachariae, Ulrich; Marenduzzo, Davide

    2013-01-01

    In this paper, we present a method to quantify the extent of disorder in a system by using conditional entropies. Our approach is especially useful when other global, or mean field, measures of disorder fail. The method is equally suited for both continuum and lattice models, and it can be made rigorous for the latter. We apply it to mixing and demixing in multicomponent fluid membranes, and show that it has advantages over previous measures based on Shannon entropies, such as a much diminished dependence on binning and the ability to capture local correlations. Further potential applications are very diverse, and could include the study of local and global order in fluid mixtures, liquid crystals, magnetic materials, and particularly biomolecular systems.
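
    On a lattice, the estimator reduces to H(X|Y) = H(X,Y) - H(Y) for the state X of a site conditioned on the state Y of a neighbour, computed from pair counts. A minimal sketch for a binary lattice with right-hand-neighbour conditioning (an illustration of the idea, not the authors' exact estimator):

        import numpy as np

        def conditional_entropy(lattice):
            """H(X|Y) = H(X,Y) - H(Y) for a site's state given its
            right-hand neighbour, from pair counts (periodic in x)."""
            x = lattice.ravel()
            y = np.roll(lattice, -1, axis=1).ravel()
            states = np.unique(lattice)
            joint = np.array([[np.mean((x == a) & (y == b)) for b in states]
                              for a in states])
            p_y = joint.sum(axis=0)
            h_joint = -np.sum(joint[joint > 0] * np.log2(joint[joint > 0]))
            h_y = -np.sum(p_y[p_y > 0] * np.log2(p_y[p_y > 0]))
            return h_joint - h_y

        # A well-mixed lattice is more disordered than a demixed one
        rng = np.random.default_rng(0)
        mixed = rng.integers(0, 2, size=(64, 64))
        demixed = np.zeros((64, 64), dtype=int)
        demixed[:, 32:] = 1
        print(conditional_entropy(mixed), conditional_entropy(demixed))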

  5. A Methodological Approach to Quantifying Plyometric Intensity.

    PubMed

    Jarvis, Mark M; Graham-Smith, Phil; Comfort, Paul

    2016-09-01

    Jarvis, MM, Graham-Smith, P, and Comfort, P. A methodological approach to quantifying plyometric intensity. J Strength Cond Res 30(9): 2522-2532, 2016. In contrast to other methods of training, the quantification of plyometric exercise intensity is poorly defined. The purpose of this study was to evaluate the suitability of a range of neuromuscular and mechanical variables to describe the intensity of plyometric exercises. Seven male recreationally active subjects performed a series of 7 plyometric exercises. Neuromuscular activity was measured using surface electromyography (SEMG) at vastus lateralis (VL) and biceps femoris (BF). Surface electromyography data were divided into concentric (CON) and eccentric (ECC) phases of movement. Mechanical output was measured by ground reaction forces and processed to provide peak impact ground reaction force (PF), peak eccentric power (PEP), and impulse (IMP). Statistical analysis was conducted to assess the reliability (intraclass correlation coefficient) and sensitivity (smallest detectable difference) of all variables. Mean values of SEMG demonstrate high reliability (r ≥ 0.82), excluding ECC VL during a 40-cm drop jump (r = 0.74). PF, PEP, and IMP demonstrated high reliability (r ≥ 0.85). Statistical power for force variables was excellent (power = 1.0), and good for SEMG (power ≥ 0.86) excluding CON BF (power = 0.57). There was no significant difference (p > 0.05) in CON SEMG between exercises. Eccentric phase SEMG only distinguished between exercises involving a landing and those that did not (percentage of maximal voluntary isometric contraction [%MVIC]: no landing, 65 ± 5; landing, 140 ± 8). Peak eccentric power, PF, and IMP all distinguished between exercises. In conclusion, CON neuromuscular activity does not appear to vary when intent is maximal, whereas ECC activity is dependent on the presence of a landing. Force characteristics provide a reliable and sensitive measure enabling precise description of intensity

  6. Quantifying the impacts of global disasters

    NASA Astrophysics Data System (ADS)

    Jones, L. M.; Ross, S.; Wilson, R. I.; Borrero, J. C.; Brosnan, D.; Bwarie, J. T.; Geist, E. L.; Hansen, R. A.; Johnson, L. A.; Kirby, S. H.; Long, K.; Lynett, P. J.; Miller, K. M.; Mortensen, C. E.; Perry, S. C.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Thio, H. K.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2012-12-01

    The US Geological Survey, National Oceanic and Atmospheric Administration, California Geological Survey, and other entities are developing a Tsunami Scenario, depicting a realistic outcome of a hypothetical but plausible large tsunami originating in the eastern Aleutian Arc, affecting the west coast of the United States, including Alaska and Hawaii. The scenario includes earth-science effects, damage and restoration of the built environment, and social and economic impacts. Like the earlier ShakeOut and ARkStorm disaster scenarios, the purpose of the Tsunami Scenario is to apply science to quantify the impacts of natural disasters in a way that can be used by decision makers in the affected sectors to reduce the potential for loss. Most natural disasters are local. A major hurricane can destroy a city or damage a long swath of coastline while mostly sparing inland areas. The largest earthquake on record caused strong shaking along 1500 km of Chile, but left the capital relatively unscathed. Previous scenarios have used the local nature of disasters to focus interaction with the user community. However, the capacity for global disasters is growing with the interdependency of the global economy. Earthquakes have disrupted global computer chip manufacturing and caused stock market downturns. Tsunamis, however, can be global in their extent and direct impact. Moreover, the vulnerability of seaports to tsunami damage can increase the global consequences. The Tsunami Scenario is trying to capture the widespread effects while maintaining the close interaction with users that has been one of the most successful features of the previous scenarios. The scenario tsunami occurs in the eastern Aleutians with a source similar to the 2011 Tohoku event. Geologic similarities support the argument that a Tohoku-like source is plausible in Alaska. It creates a major nearfield tsunami in the Aleutian arc and peninsula, a moderate tsunami in the US Pacific Northwest, large but not the

  7. The Use of Micro-CT with Image Segmentation to Quantify Leakage in Dental Restorations

    PubMed Central

    Carrera, Carola A.; Lan, Caixia; Escobar-Sanabria, David; Li, Yuping; Rudney, Joel; Aparicio, Conrado; Fok, Alex

    2015-01-01

    Objective To develop a method for quantifying leakage in composite resin restorations after curing, using non-destructive X-ray micro-computed tomography (micro-CT) and image segmentation. Methods Class-I cavity preparations were made in 20 human third molars, which were divided into 2 groups. Group I was restored with Z100 and Group II with Filtek LS. Micro-CT scans were taken for both groups before and after they were submerged in silver nitrate solution (AgNO3 50%) to reveal any interfacial gap and leakage at the tooth restoration interface. Image segmentation was carried out by first performing image correlation to align the before- and after-treatment images and then by image subtraction to isolate the silver nitrate penetrant for precise volume calculation. Two-tailed Student’s t-test was used to analyze the results, with the level of significance set at p<0.05. Results All samples from Group I showed silver nitrate penetration with a mean volume of 1.3 ± 0.7 mm3. In Group II, only 2 out of the 10 restorations displayed infiltration along the interface, giving a mean volume of 0.3 ± 0.3 mm3. The difference between the two groups was statistically significant (p < 0.05). The infiltration showed non-uniform patterns within the interface. Significance We have developed a method to quantify the volume of leakage using non-destructive micro-CT, silver nitrate infiltration and image segmentation. Our results confirmed that substantial leakage could occur in composite restorations that have imperfections in the adhesive layer or interfacial debonding through polymerization shrinkage. For the restorative systems investigated in this study, this occurred mostly at the interface between the adhesive system and the tooth structure. PMID:25649496
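
    The processing chain described (align the before and after stacks, subtract, threshold the penetrant, convert voxel counts to volume) can be sketched as follows; the phase-correlation registration and the fixed threshold are stand-ins for the paper's own image-correlation step:

        import numpy as np
        from scipy.ndimage import shift as nd_shift
        from skimage.registration import phase_cross_correlation

        def leakage_volume(before, after, voxel_mm3, threshold):
            """Penetrant volume (mm^3) from co-registered micro-CT stacks."""
            before = before.astype(float)
            after = after.astype(float)
            # Rigid translation registering 'before' onto 'after'
            offset, _, _ = phase_cross_correlation(after, before)
            before_aligned = nd_shift(before, offset)
            # Voxels that brightened after AgNO3 immersion are penetrant
            diff = after - before_aligned
            return np.count_nonzero(diff > threshold) * voxel_mm3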

  8. Interpreting cortical bone adaptation and load history by quantifying osteon morphotypes in circularly polarized light images.

    PubMed

    Skedros, John G; Mendenhall, Shaun D; Kiser, Casey J; Winet, Howard

    2009-03-01

    Birefringence variations in circularly polarized light (CPL) images of thin plane-parallel sections of cortical bone can be used to quantify regional differences in predominant collagen fiber orientation (CFO). Using CPL images of equine third metacarpals (MC3s), R.B. Martin, V.A. Gibson, S.M. Stover, J.C. Gibeling, and L.V. Griffin. (40) described six secondary osteon variants ('morphotypes') and suggested that differences in their regional prevalence affect fatigue resistance and toughness. They devised a numerical osteon morphotype score (MTS) for quantifying regional differences in osteon morphotypes. We have observed that a modification of this score could significantly improve its use for interpreting load history. We hypothesized that our modified osteon MTS would more accurately reveal differences in osteon MTSs between opposing "tension" and "compression" cortices of diaphyses of habitually bent bones. This was tested using CPL images in transverse sections of calcanei from sheep, deer, and horses, and radii from sheep and horses. Equine MC3s and sheep tibiae were examined as controls because they experience comparatively greater load complexity that, because of increased prevalence of torsion/shear, would not require regional mechanical enhancements provided by different osteon morphotypes. Predominant CFO, which can reliably reflect adaptation for a regionally prevalent strain mode, was quantified as mean gray levels from birefringence of entire images (excluding pore spaces) in anterior, posterior, medial, and lateral cortices. Results showed that, in contrast to the original scoring scheme of Martin et al., the modified scheme revealed significant anterior/posterior differences in osteon MTSs in nearly all "tension/compression" bones (p<0.0001), but not in equine MC3s (p=0.30) and sheep tibiae (p=0.35). Among habitually bent bones, sheep radii were the exception; relatively lower osteon populations and the birefringence of the primary bone contributed

  9. Quantifying moisture transport in cementitious materials using neutron radiography

    NASA Astrophysics Data System (ADS)

    Lucero, Catherine L.

    It has been found through this study that small pores, namely voids created by chemical shrinkage, gel pores, and capillary pores, ranging from 0.5 nm to 50 μm, fill quickly through capillary action, whereas large entrapped and entrained air voids ranging from 0.05 to 1.25 mm remain empty during the initial filling process. In mortar exposed to calcium chloride solution, a decrease in sorptivity was observed due to an increase in the viscosity and surface tension of the solution, as proposed by Spragg et al. (2011). However, this work also noted a decrease in the rate of absorption due to a reaction between the salt and the matrix, which results in the filling of the pores in the concrete. The results from neutron imaging can help in the interpretation of standard absorption tests. ASTM C1585 test results can be further analyzed in several ways that could give an accurate indication of the durability of the concrete. Results can be reported as depth of penetration versus the square root of time rather than mm3 of fluid per mm2 of exposed surface area. Since a known fraction of pores initially fill before the fluid reaches the edge of the sample, the actual depth of penetration can be calculated. This work is compared with an 'intrinsic sorptivity' that can be used to interpret mass measurements. Furthermore, the influence of shrinkage-reducing admixtures (SRAs) on drying was studied. Neutron radiographs showed that systems saturated in water remain "wetter" than systems saturated in 5% SRA solution. The SRA in the system reduces the moisture diffusion coefficient due to an increase in viscosity and a decrease in surface tension. Neutron radiography provided spatial information about the drying front that cannot be obtained using other methods.

  10. Quantifying Nanomolar Protein Concentrations Using Designed DNA Carriers and Solid-State Nanopores

    PubMed Central

    2016-01-01

    Designed “DNA carriers” have been proposed as a new method for nanopore based specific protein detection. In this system, target protein molecules bind to a long DNA strand at a defined position creating a second level transient current drop against the background DNA translocation. Here, we demonstrate the ability of this system to quantify protein concentrations in the nanomolar range. After incubation with target protein at different concentrations, the fraction of DNA translocations showing a secondary current spike allows for the quantification of the corresponding protein concentration. For our proof-of-principle experiments we use two standard binding systems, biotin–streptavidin and digoxigenin–antidigoxigenin, that allow for measurements of the concentration down to the low nanomolar range. The results demonstrate the potential for a novel quantitative and specific protein detection scheme using the DNA carrier method. PMID:27121643
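
    As a toy illustration of the final quantification step, assume 1:1 equilibrium binding between the target protein and the carrier's binding site, with an effective Kd taken from calibration measurements (the Langmuir form and the numbers below are assumptions, not the paper's calibration):

        def protein_concentration(fraction_bound, kd_nM):
            """Invert f = c / (c + Kd) to get c = Kd * f / (1 - f), where f is
            the fraction of carrier translocations with a secondary spike."""
            if not 0.0 <= fraction_bound < 1.0:
                raise ValueError("fraction_bound must be in [0, 1)")
            return kd_nM * fraction_bound / (1.0 - fraction_bound)

        # e.g. 30% of carrier events show the spike, assumed Kd ~ 5 nM
        print(protein_concentration(0.30, 5.0))  # ~2.1 nM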

  11. A methodology to quantify the differences between alternative methods of heart rate variability measurement.

    PubMed

    García-González, M A; Fernández-Chimeno, M; Guede-Fernández, F; Ferrer-Mileo, V; Argelagós-Palau, A; Álvarez-Gómez, L; Parrado, E; Moreno, J; Capdevila, L; Ramos-Castro, J

    2016-01-01

    This work proposes a systematic procedure to report the differences between heart rate variability time series obtained from alternative measurements reporting the spread and mean of the differences as well as the agreement between measuring procedures and quantifying how stationary, random and normal the differences between alternative measurements are. A description of the complete automatic procedure to obtain a differences time series (DTS) from two alternative methods, a proposal of a battery of statistical tests, and a set of statistical indicators to better describe the differences in RR interval estimation are also provided. Results show that the spread and agreement depend on the choice of alternative measurements and that the DTS cannot be considered generally as a white or as a normally distributed process. Nevertheless, in controlled measurements the DTS can be considered as a stationary process.
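
    A minimal sketch of the first-pass statistics on a differences time series (DTS) from two beat-aligned RR-interval series: Bland-Altman-style bias and limits of agreement, plus lag-1 autocorrelation as a crude whiteness check (the paper's battery of stationarity, randomness and normality tests is more extensive):

        import numpy as np

        def describe_dts(rr_a, rr_b):
            """Bias, spread, 95% limits of agreement and lag-1 autocorrelation
            of the differences time series between two RR series (ms)."""
            dts = np.asarray(rr_a, float) - np.asarray(rr_b, float)
            bias, sd = dts.mean(), dts.std(ddof=1)
            d = dts - bias
            lag1 = np.dot(d[:-1], d[1:]) / np.dot(d, d)  # ~0 for a white DTS
            return {"bias": bias, "sd": sd,
                    "loa": (bias - 1.96 * sd, bias + 1.96 * sd),
                    "lag1_autocorr": float(lag1)}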

  12. Validated methodology for quantifying infestation levels of dreissenid mussels in environmental DNA (eDNA) samples

    PubMed Central

    Peñarrubia, Luis; Alcaraz, Carles; Vaate, Abraham bij de; Sanz, Nuria; Pla, Carles; Vidal, Oriol; Viñas, Jordi

    2016-01-01

    The zebra mussel (Dreissena polymorpha Pallas, 1771) and the quagga mussel (D. rostriformis Deshayes, 1838) are successful invasive bivalves with substantial ecological and economic impacts in freshwater systems once they become established. Since their eradication is extremely difficult, their detection at an early stage is crucial to prevent spread. In this study, we optimized and validated a qPCR detection method based on the histone H2B gene to quantify combined infestation levels of zebra and quagga mussels in environmental DNA samples. Our results show specific dreissenid DNA present in filtered water samples for which microscopic diagnostic identification for larvae failed. Monitoring a large number of locations for invasive dreissenid species based on a highly specific environmental DNA qPCR assay may prove to be an essential tool for management and control plans focused on prevention of establishment of dreissenid mussels in new locations. PMID:27966602
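
    Absolute quantification with a qPCR assay of this kind typically runs through a standard curve of Ct against log10 target copies; a generic sketch (the dilution-series values are hypothetical, not the paper's calibration data):

        import numpy as np

        def fit_standard_curve(log10_copies, ct):
            """Fit Ct = slope*log10(copies) + intercept and report the
            amplification efficiency E = 10**(-1/slope) - 1."""
            slope, intercept = np.polyfit(log10_copies, ct, 1)
            return slope, intercept, 10.0 ** (-1.0 / slope) - 1.0

        def quantify(ct_unknown, slope, intercept):
            """Target copies in an unknown sample from its Ct value."""
            return 10.0 ** ((ct_unknown - intercept) / slope)

        # Hypothetical H2B dilution series: 10^2..10^6 copies per reaction
        slope, intercept, eff = fit_standard_curve(
            [2, 3, 4, 5, 6], [33.1, 29.8, 26.4, 23.1, 19.7])
        print(f"E ~ {eff:.2f}, copies ~ {quantify(28.0, slope, intercept):.0f}")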

  13. Quantifying fiber formation in meat analogs under high moisture extrusion using image processing

    NASA Astrophysics Data System (ADS)

    Ranasinghesagara, J.; Hsieh, F.; Yao, G.

    2005-11-01

    High moisture extrusion using twin-screw extruders shows great promise for producing meat analog products from vegetable proteins. The resulting products have well-defined fiber formation and resemble real meat in both visual appearance and taste. Developing reliable non-destructive techniques to quantify the textural properties of extrudates is important for quality control in the manufacturing process. In this study, we developed an image processing technique to automatically characterize sample fiber formation using digital imaging. The algorithm is based on a statistical analysis of the Hough transform. This objective method can be used as a standard for evaluating other non-invasive methods. We compared the fiber formation indices measured using this technique and a non-invasive fluorescence polarization method and obtained a high correlation.
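
    One way to turn Hough-transform statistics into a fiber formation index is to treat the per-angle accumulator energy as an orientation distribution and report its circular order parameter (1 for perfectly parallel fibers, 0 for isotropic texture). A sketch under those assumptions; this is not necessarily the authors' exact statistic:

        import numpy as np
        from skimage.feature import canny
        from skimage.transform import hough_line

        def fiber_alignment_index(image):
            """Alignment index in [0, 1] from the angular distribution of
            Hough-transform energy over an edge map of the extrudate image."""
            edges = canny(image)
            accum, angles, _ = hough_line(edges)
            weights = accum.sum(axis=0).astype(float)  # votes per angle
            weights /= weights.sum()
            # Orientation order parameter: angles doubled because fiber
            # orientations are defined modulo pi
            c = np.sum(weights * np.cos(2 * angles))
            s = np.sum(weights * np.sin(2 * angles))
            return float(np.hypot(c, s))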

  14. Life cycle assessment of urban wastewater systems: Quantifying the relative contribution of sewer systems.

    PubMed

    Risch, Eva; Gutierrez, Oriol; Roux, Philippe; Boutin, Catherine; Corominas, Lluís

    2015-06-15

    This study aims to propose a holistic life cycle assessment (LCA) of urban wastewater systems (UWS) based on a comprehensive inventory including detailed construction and operation of sewer systems and wastewater treatment plants (WWTPs). For the first time, the inventory of sewer infrastructure construction includes piping materials and aggregates, manholes, connections, civil works and road rehabilitation. The operation stage comprises energy consumption in pumping stations together with air emissions of methane and hydrogen sulphide, and water emissions from sewer leaks. Using a real case study, this LCA aims to quantify the contributions of sewer systems to the total environmental impacts of the UWS. The results show that the construction of sewer infrastructure has an environmental impact (on half of the 18 studied impact categories) larger than both the construction and operation of the WWTP. This study highlights the importance of including the construction and operation of sewer systems in the environmental assessment of centralised versus decentralised options for UWS.

  15. GFP-tagged E. coli shows bacterial distribution in mouse organs: pathogen tracking using fluorescence signal

    PubMed Central

    Park, Pil-Gu; Cho, Min-Hee; Rhie, Gi-eun; Jeong, Haeseul; Youn, Hyewon

    2012-01-01

    Purpose In vaccine efficacy evaluation, visualizing pathogens in the whole organism at each time point can reduce the number of animals consumed and provide in vivo information against a consistent background in the same organism. Materials and Methods Using the IVIS Spectrum whole live-animal imaging system, fluorescence intensity was optimized and shown to scale proportionately with the concentration of the GFP-expressing Escherichia coli MC1061 strain (E. coli-GFP) injected into BALB/c mice. Results The local distribution of disseminated E. coli-GFP was traced in each organ by fluorescence. Detached organs showed clearer fluorescent signals, and the intestine showed the strongest signal. Conclusion This in vivo imaging method using a GFP-tagged pathogen strain suggests that quantifying infected pathogens by fluorescence intensity in whole animals can provide information about their localization and distribution after infection. PMID:23596581

  16. Quantifying the effect size of changing environmental controls on carbon release from permafrost-affected soils

    NASA Astrophysics Data System (ADS)

    Schaedel, C.; Bader, M. K. F.; Schuur, E. A. G.; Bracho, R. G.; Capek, P.; De Baets, S. L.; Diakova, K.; Ernakovich, J. G.; Hartley, I. P.; Iversen, C. M.; Kane, E. S.; Knoblauch, C.; Lupascu, M.; Natali, S.; Norby, R. J.; O'Donnell, J. A.; Roy Chowdhury, T.; Santruckova, H.; Shaver, G. R.; Sloan, V. L.; Treat, C. C.; Waldrop, M. P.

    2014-12-01

    High-latitude surface air temperatures are rising twice as fast as the global mean, causing permafrost to thaw and thereby exposing large quantities of previously frozen organic carbon (C) to microbial decomposition. Increasing temperatures in high latitude ecosystems not only increase C emissions from previously frozen C in permafrost but also indirectly affect the C cycle through changes in regional and local hydrology. Warmer temperatures increase thawing of ice-rich permafrost, causing land surface subsidence where soils become waterlogged, anoxic conditions prevail and C is released in the form of CO2 and CH4. Although substrate quality, physical protection, and nutrient availability affect C decomposition, increasing temperatures and changes in surface and sub-surface hydrology are likely the dominant factors affecting the rate and form of C release from permafrost; however, their effect size on C release is poorly quantified. We have compiled a database of 24 incubation studies with soils from active layer and permafrost from across the entire permafrost zone to quantify the effect sizes of (a) increasing temperatures and (b) a shift from aerobic to anaerobic soil conditions on C release. Results from two different meta-analyses show that a 10°C increase in temperature increased C release by a factor of two in boreal forest, peatland and tundra ecosystems. Under aerobic incubation conditions, soils released on average three times more C than under anaerobic conditions, with large variation among the different ecosystems. While peatlands showed similar amounts of C release under aerobic and anaerobic soil conditions, tundra and boreal forest ecosystems released up to 8 times more C under oxic conditions. This pan-arctic synthesis shows that boreal forest and tundra soils will have a larger impact on climate change when newly thawed permafrost C decomposes in an aerobic environment compared to an anaerobic environment even when
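
    The reported temperature effect size corresponds to a Q10 of about 2; converting a pair of incubation fluxes measured at two temperatures into a Q10 is a one-liner:

        def q10(flux_low, flux_high, t_low_c, t_high_c):
            """Temperature sensitivity: Q10 = (R_high/R_low)**(10/(T_high-T_low))."""
            return (flux_high / flux_low) ** (10.0 / (t_high_c - t_low_c))

        # C release doubling over a 10 C warming corresponds to Q10 = 2
        print(q10(1.0, 2.0, 5.0, 15.0))  # -> 2.0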

  17. Path Similarity Analysis: A Method for Quantifying Macromolecular Pathways

    PubMed Central

    Seyler, Sean L.; Kumar, Avishek; Thorpe, M. F.; Beckstein, Oliver

    2015-01-01

    Diverse classes of proteins function through large-scale conformational changes and various sophisticated computational algorithms have been proposed to enhance sampling of these macromolecular transition paths. Because such paths are curves in a high-dimensional space, it has been difficult to quantitatively compare multiple paths, a necessary prerequisite to, for instance, assess the quality of different algorithms. We introduce a method named Path Similarity Analysis (PSA) that enables us to quantify the similarity between two arbitrary paths and extract the atomic-scale determinants responsible for their differences. PSA utilizes the full information available in 3N-dimensional configuration space trajectories by employing the Hausdorff or Fréchet metrics (adopted from computational geometry) to quantify the degree of similarity between piecewise-linear curves. It thus completely avoids relying on projections into low dimensional spaces, as used in traditional approaches. To elucidate the principles of PSA, we quantified the effect of path roughness induced by thermal fluctuations using a toy model system. Using, as an example, the closed-to-open transitions of the enzyme adenylate kinase (AdK) in its substrate-free form, we compared a range of protein transition path-generating algorithms. Molecular dynamics-based dynamic importance sampling (DIMS) MD and targeted MD (TMD) and the purely geometric FRODA (Framework Rigidity Optimized Dynamics Algorithm) were tested along with seven other methods publicly available on servers, including several based on the popular elastic network model (ENM). PSA with clustering revealed that paths produced by a given method are more similar to each other than to those from another method and, for instance, that the ENM-based methods produced relatively similar paths. PSA applied to ensembles of DIMS MD and FRODA trajectories of the conformational transition of diphtheria toxin, a particularly challenging example, showed that
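
    The core of the Hausdorff variant is a distance between two paths represented as point sets in configuration space; a minimal sketch (the full method also supports the Fréchet metric and clustering on top of such distances):

        import numpy as np
        from scipy.spatial.distance import cdist

        def hausdorff(path_p, path_q):
            """Symmetric Hausdorff distance between two transition paths,
            each an (n_frames, 3N) array of atomic configurations."""
            d = cdist(path_p, path_q)    # all frame-to-frame distances
            d_pq = d.min(axis=1).max()   # worst best-match, P -> Q
            d_qp = d.min(axis=0).max()   # worst best-match, Q -> P
            return max(d_pq, d_qp)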

  18. Radiative transfer modeling for quantifying lunar surface minerals, particle size, and submicroscopic metallic Fe

    NASA Astrophysics Data System (ADS)

    Li, Shuai; Li, Lin

    2011-09-01

    The main objective of this work is to quantify lunar surface minerals (agglutinate, clinopyroxene, orthopyroxene, plagioclase, olivine, ilmenite, and volcanic glass), particle sizes, and the abundance of submicroscopic metallic Fe (SMFe) from the Lunar Soil Characterization Consortium (LSCC) data set with Hapke's radiative transfer theory. The model is implemented for both forward and inverse modeling. In the inverse mode, instead of the commonly used look-up tables, Newton's method and least squares are used jointly to solve the nonlinear equations. Although the effects of temperature and surface roughness are incorporated into the implementation to improve the model performance for application to lunar spacecraft data, these effects cannot be extensively addressed in the current work because of the use of lab-measured reflectance data. Our forward radiative transfer model results show that the correlation coefficients between modeled and measured spectra are over 0.99. For the inverse model, the modeled particle sizes all fall within their measured range. The range of modeled SMFe for highland samples is 0.01%-0.5%, and for mare samples it is 0.03%-1%. The linear trend between SMFe and ferromagnetic resonance (Is) for all the LSCC samples is consistent with laboratory measurements. For quantifying lunar mineral abundances, the results show that the R-squared values for the training samples (Is/FeO ≤ 65) are over 0.65, with plagioclase having the highest correlation (0.94) and pyroxene the lowest (0.68). In future work, the model needs to be improved to handle more mature lunar soil samples.
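
    The inversion strategy (iterative nonlinear least squares with physical bounds rather than look-up tables) can be illustrated generically. The forward model below is a normalized linear-mixing stand-in, not Hapke's equations, and the function names are hypothetical:

        import numpy as np
        from scipy.optimize import least_squares

        def forward(x, endmember_spectra):
            """Stand-in forward model: normalized mixture of endmember
            spectra (rows); Hapke's equations would go here instead."""
            w = x / x.sum()
            return endmember_spectra.T @ w

        def invert(measured, endmember_spectra, x0):
            """Bounded nonlinear least squares in place of look-up tables.
            x0 must be strictly positive to sit inside the bounds."""
            resid = lambda x: forward(x, endmember_spectra) - measured
            sol = least_squares(resid, x0, bounds=(0.0, np.inf))
            return sol.x / sol.x.sum()  # normalized mineral abundances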

  19. An experimental study quantifying pulmonary ventilation on inhalation of aerosol under steady and episodic emission.

    PubMed

    Poon, Carmen K M; Lai, Alvin C K

    2011-09-15

    Estimating the inhalation dose accurately under realistic conditions can enhance the accuracy of risk assessment. Conventional methods to quantify the aerosol concentration to which susceptible victims in contaminated environments are exposed use real-time particle counters to measure concentrations in environments without occupancy. Breathing-induced airflow, however, interacts with and influences the concentration around the nostrils or mouth and alters the ultimate exposure. This subject has not yet been systematically studied, particularly under transient emission. In this work, an experimental facility comprising two manikins was designed and fabricated. One of them mimicked realistic breathing, acting as a susceptible victim. Both steady and episodic emissions were generated in an air-conditioned environmental chamber in which two different ventilation schemes were tested. The scaled dose of the victim under different expiratory velocities and pulmonary ventilation rates was measured. From the results of comprehensive tests, it can be concluded that breathing has a very significant influence on the ultimate dose compared with the no-breathing case. The majority of results show that breathing reduces the inhaled quantity and that the magnitude of the reduction increases with breathing rate. This is attributed to the exhalation process playing a more significant role in reducing the dose level than the enhancement during the inhalation period. The higher the breathing rate, the sharper the decline of the resultant concentration, and hence the lower the dose. Nevertheless, under low pulmonary ventilation, results show that breathing increases the dose marginally. Results also reveal that the ventilation scheme affects the exposure.

  20. Quantifying uncertainty in NIF implosion performance across target scales

    NASA Astrophysics Data System (ADS)

    Spears, Brian; Baker, K.; Brandon, S.; Buchoff, M.; Callahan, D.; Casey, D.; Field, J.; Gaffney, J.; Hammer, J.; Humbird, K.; Hurricane, O.; Kruse, M.; Munro, D.; Nora, R.; Peterson, L.; Springer, P.; Thomas, C.

    2016-10-01

    Ignition experiments at NIF are being performed at a variety of target scales. Smaller targets require less energy and can be fielded more frequently. Successful small target designs can be scaled up to take advantage of the full NIF laser energy and power. In this talk, we will consider a rigorous framework for scaling from smaller to larger targets. The framework uses both simulation and experimental results to build a statistical prediction of target performance as scale is increased. Our emphasis is on quantifying uncertainty in scaling predictions with the goal of identifying the dominant contributors to that uncertainty. We take as a particular example the Big Foot platform that produces a round, 0.8 scale implosion with the potential to scale to full NIF size (1.0 scale). This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  1. Quantifying the Relationship Between Financial News and the Stock Market

    NASA Astrophysics Data System (ADS)

    Alanyali, Merve; Moat, Helen Susannah; Preis, Tobias

    2013-12-01

    The complex behavior of financial markets emerges from decisions made by many traders. Here, we exploit a large corpus of daily print issues of the Financial Times from 2nd January 2007 until 31st December 2012 to quantify the relationship between decisions taken in financial markets and developments in financial news. We find a positive correlation between the daily number of mentions of a company in the Financial Times and the daily transaction volume of a company's stock both on the day before the news is released, and on the same day as the news is released. Our results provide quantitative support for the suggestion that movements in financial markets and movements in financial news are intrinsically interlinked.
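
    The two reported correlations reduce to Pearson correlations on lagged daily series; a sketch with assumed column names for one company:

        import pandas as pd

        def news_volume_correlations(df: pd.DataFrame):
            """df has daily columns 'mentions' (FT counts for one company)
            and 'volume' (that company's stock transaction volume)."""
            same_day = df["mentions"].corr(df["volume"])
            # volume on the day before the news: align mentions(t+1) with volume(t)
            day_before = df["mentions"].shift(-1).corr(df["volume"])
            return same_day, day_before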

  2. Quantifying Interparticle Forces and Heterogeneity in 3D Granular Materials

    NASA Astrophysics Data System (ADS)

    Hurley, R. C.; Hall, S. A.; Andrade, J. E.; Wright, J.

    2016-08-01

    Interparticle forces in granular materials are intimately linked to mechanical properties and are known to self-organize into heterogeneous structures, or force chains, under external load. Despite progress in understanding the statistics and spatial distribution of interparticle forces in recent decades, a systematic method for measuring forces in opaque, three-dimensional (3D), frictional, stiff granular media has yet to emerge. In this Letter, we present results from an experiment that combines 3D x-ray diffraction, x-ray tomography, and a numerical force inference technique to quantify interparticle forces and their heterogeneity in an assembly of quartz grains undergoing a one-dimensional compression cycle. Forces exhibit an exponential decay above the mean and partition into strong and weak networks. We find a surprising inverse relationship between macroscopic load and the heterogeneity of interparticle forces, despite the clear emergence of two force chains that span the system.

  3. Quantifying side-chain conformational variations in protein structure

    NASA Astrophysics Data System (ADS)

    Miao, Zhichao; Cao, Yang

    2016-11-01

    Protein side-chain conformation is closely related to biological function, and side-chain prediction is a key step in protein design, protein docking and structure optimization. However, side-chain polymorphism exists widely in proteins, in various forms, and has long been overlooked by side-chain prediction; such conformational variations have not been quantitatively studied, and their correlations with residue features remain vague. Here, we performed statistical analyses on large-scale data sets and found that side-chain conformational flexibility is closely related to solvent exposure, degrees of freedom and hydrophilicity. These analyses allowed us to quantify different types of side-chain variability in the PDB. The results underscore that protein side-chain conformation prediction is not a single-answer problem, leading us to reconsider the assessment approaches of side-chain prediction programs.

  4. Quantifying light exposure patterns in young adult students

    NASA Astrophysics Data System (ADS)

    Alvarez, Amanda A.; Wildsoet, Christine F.

    2013-08-01

    Exposure to bright light appears to be protective against myopia in both animals (chicks, monkeys) and children, but quantitative data on human light exposure are limited. In this study, we report on a technique for quantifying light exposure using wearable sensors. Twenty-seven young adult subjects wore a light sensor continuously for two weeks during one of three seasons, and also completed questionnaires about their visual activities. Light data were analyzed with respect to refractive error and season, and the objective sensor data were compared with subjects' estimates of time spent indoors and outdoors. Subjects' estimates of time spent indoors and outdoors were in poor agreement with durations reported by the sensor data. The results of questionnaire-based studies of light exposure should thus be interpreted with caution. The role of light in refractive error development should be investigated using multiple methods such as sensors to complement questionnaires.

  5. Quantifying the Impact of Unavailability in Cyber-Physical Environments

    SciTech Connect

    Aissa, Anis Ben; Abercrombie, Robert K; Sheldon, Federick T.; Mili, Ali

    2014-01-01

    The Supervisory Control and Data Acquisition (SCADA) system discussed in this work manages a distributed control network for the Tunisian Electric & Gas Utility. The network is dispersed over a large geographic area that monitors and controls the flow of electricity/gas from both remote and centralized locations. The availability of the SCADA system in this context is critical to ensuring the uninterrupted delivery of energy, including safety, security, continuity of operations and revenue. Such SCADA systems are the backbone of national critical cyber-physical infrastructures. Herein, we propose adapting the Mean Failure Cost (MFC) metric for quantifying the cost of unavailability. This new metric combines the classic availability formulation with MFC. The resulting metric, so-called Econometric Availability (EA), offers a computational basis to evaluate a system in terms of the gain/loss ($/hour of operation) that affects each stakeholder due to unavailability.

  6. Identifying and quantifying interactions in a laboratory swarm

    NASA Astrophysics Data System (ADS)

    Puckett, James; Kelley, Douglas; Ouellette, Nicholas

    2013-03-01

    Emergent collective behavior, such as in flocks of birds or swarms of bees, is exhibited throughout the animal kingdom. Many models have been developed to describe swarming and flocking behavior using systems of self-propelled particles obeying simple rules or interacting via various potentials. However, due to experimental difficulties and constraints, little empirical data exists for characterizing the exact form of the biological interactions. We study laboratory swarms of flying Chironomus riparius midges, using stereoimaging and particle tracking techniques to record three-dimensional trajectories for all the individuals in the swarm. We describe methods to identify and quantify interactions by examining these trajectories, and report results on interaction magnitude, frequency, and mutuality.

  7. Quantifying the Behavior of Stock Correlations Under Market Stress

    NASA Astrophysics Data System (ADS)

    Preis, Tobias; Kenett, Dror Y.; Stanley, H. Eugene; Helbing, Dirk; Ben-Jacob, Eshel

    2012-10-01

    Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios.
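
    The reported scaling can be probed by pairing the mean pairwise correlation within each time window with that window's normalized index return; a sketch (the window length, non-overlapping windows, and array layout are assumptions):

        import numpy as np

        def correlation_vs_stress(returns, index_returns, window=22):
            """returns: (n_days, n_stocks) daily returns; index_returns: (n_days,).
            Returns the slope of mean pairwise correlation against the
            normalized index return over non-overlapping windows."""
            corr_means, market = [], []
            for start in range(0, len(index_returns) - window + 1, window):
                r = returns[start:start + window]
                c = np.corrcoef(r, rowvar=False)
                corr_means.append(c[np.triu_indices_from(c, k=1)].mean())
                market.append(index_returns[start:start + window].sum())
            market = (np.array(market) - np.mean(market)) / np.std(market)
            return np.polyfit(market, corr_means, 1)[0]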

  8. Quantifying side-chain conformational variations in protein structure.

    PubMed

    Miao, Zhichao; Cao, Yang

    2016-11-15

    Protein side-chain conformation is closely related to biological function, and side-chain prediction is a key step in protein design, protein docking and structure optimization. However, side-chain polymorphism exists widely in proteins, in various forms, and has long been overlooked by side-chain prediction; such conformational variations have not been quantitatively studied, and their correlations with residue features remain vague. Here, we performed statistical analyses on large-scale data sets and found that side-chain conformational flexibility is closely related to solvent exposure, degrees of freedom and hydrophilicity. These analyses allowed us to quantify different types of side-chain variability in the PDB. The results underscore that protein side-chain conformation prediction is not a single-answer problem, leading us to reconsider the assessment approaches of side-chain prediction programs.

  9. Quantifying the Behavior of Stock Correlations Under Market Stress

    PubMed Central

    Preis, Tobias; Kenett, Dror Y.; Stanley, H. Eugene; Helbing, Dirk; Ben-Jacob, Eshel

    2012-01-01

    Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios. PMID:23082242

  10. Quantifying the Relationship Between Financial News and the Stock Market

    PubMed Central

    Alanyali, Merve; Moat, Helen Susannah; Preis, Tobias

    2013-01-01

    The complex behavior of financial markets emerges from decisions made by many traders. Here, we exploit a large corpus of daily print issues of the Financial Times from 2nd January 2007 until 31st December 2012 to quantify the relationship between decisions taken in financial markets and developments in financial news. We find a positive correlation between the daily number of mentions of a company in the Financial Times and the daily transaction volume of a company's stock both on the day before the news is released, and on the same day as the news is released. Our results provide quantitative support for the suggestion that movements in financial markets and movements in financial news are intrinsically interlinked. PMID:24356666

  11. Quantifying the behavior of stock correlations under market stress.

    PubMed

    Preis, Tobias; Kenett, Dror Y; Stanley, H Eugene; Helbing, Dirk; Ben-Jacob, Eshel

    2012-01-01

    Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios.

  12. Quantifying the relationship between financial news and the stock market.

    PubMed

    Alanyali, Merve; Moat, Helen Susannah; Preis, Tobias

    2013-12-20

    The complex behavior of financial markets emerges from decisions made by many traders. Here, we exploit a large corpus of daily print issues of the Financial Times from 2nd January 2007 until 31st December 2012 to quantify the relationship between decisions taken in financial markets and developments in financial news. We find a positive correlation between the daily number of mentions of a company in the Financial Times and the daily transaction volume of a company's stock both on the day before the news is released, and on the same day as the news is released. Our results provide quantitative support for the suggestion that movements in financial markets and movements in financial news are intrinsically interlinked.

  13. Quantifying complexity of the chaotic regime of a semiconductor laser subject to feedback via information theory measures

    NASA Astrophysics Data System (ADS)

    Soriano, Miguel C.; Zunino, Luciano; Rosso, Osvaldo A.; Mirasso, Claudio R.

    2010-04-01

    The time evolution of the output of a semiconductor laser subject to optical feedback can exhibit high-dimensional chaotic fluctuations. In this contribution, our aim is to quantify the complexity of the chaotic time-trace generated by a semiconductor laser subject to delayed optical feedback. To that end, we discuss the properties of two recently introduced complexity measures based on information theory, namely the permutation entropy (PE) and the statistical complexity measure (SCM). The PE and SCM are defined as a functional of a symbolic probability distribution, evaluated using the Bandt-Pompe recipe to assign a probability distribution function to the time series generated by the chaotic system. In order to evaluate the performance of these novel complexity quantifiers, we compare them to a more standard chaos quantifier, namely the Kolmogorov-Sinai entropy. Here, we present numerical results showing that the statistical complexity and the permutation entropy, evaluated at the different time-scales involved in the chaotic regime of the laser subject to optical feedback, give valuable information about the complexity of the laser dynamics.
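
    A compact implementation of the Bandt-Pompe permutation entropy used as one of the quantifiers here (normalized to [0, 1]; the ordinal pattern order and delay are free parameters):

        import math
        from itertools import permutations

        import numpy as np

        def permutation_entropy(x, order=4, delay=1):
            """Normalized Bandt-Pompe permutation entropy of a 1D series."""
            x = np.asarray(x, dtype=float)
            n = len(x) - (order - 1) * delay
            counts = {p: 0 for p in permutations(range(order))}
            for i in range(n):
                window = x[i:i + order * delay:delay]
                counts[tuple(np.argsort(window))] += 1  # ordinal pattern
            p = np.array([c for c in counts.values() if c > 0], float) / n
            return -np.sum(p * np.log2(p)) / math.log2(math.factorial(order))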

  14. UV-vis spectra as an alternative to the Lowry method for quantifying hair damage induced by surfactants.

    PubMed

    Pires-Oliveira, Rafael; Joekes, Inés

    2014-11-01

    It is well known that long-term use of shampoo causes damage to human hair. Although the Lowry method has been widely used to quantify hair damage, it is unsuitable in the presence of some surfactants, and no alternative method has been proposed in the literature. In this work, a different method is used to investigate and compare the hair damage induced by four types of surfactants (including three commercial-grade surfactants) and by water. Hair samples were immersed in aqueous surfactant solutions under conditions that resemble a shower (38 °C, constant shaking). These solutions become colored with time of contact with hair, and their UV-vis spectra were recorded. For comparison, the amounts of protein extracted from hair by sodium dodecyl sulfate (SDS) and by water were estimated by the Lowry method. Additionally, non-pigmented vs. pigmented hair and also sepia melanin were used to understand the origin of the washing-solution color and its spectra. The results presented herein show that hair degradation is mostly caused by the extraction of proteins, cuticle fragments and melanin granules from the hair fiber. It was found that the intensity of the solution color varies with the charge density of the surfactant. Furthermore, the intensity of the solution color can be correlated with the amount of protein quantified by the Lowry method as well as with the degree of hair damage. The UV-vis spectrum of hair washing solutions is thus a simple and straightforward way to quantify and compare hair damage induced by different commercial surfactants.

  15. The SEGUE K Giant Survey. III. Quantifying Galactic Halo Substructure

    NASA Astrophysics Data System (ADS)

    Janesh, William; Morrison, Heather L.; Ma, Zhibo; Rockosi, Constance; Starkenburg, Else; Xue, Xiang Xiang; Rix, Hans-Walter; Harding, Paul; Beers, Timothy C.; Johnson, Jennifer; Lee, Young Sun; Schneider, Donald P.

    2016-01-01

    We statistically quantify the amount of substructure in the Milky Way stellar halo using a sample of 4568 halo K giant stars at Galactocentric distances ranging over 5-125 kpc. These stars have been selected photometrically and confirmed spectroscopically as K giants from the Sloan Digital Sky Survey’s Sloan Extension for Galactic Understanding and Exploration project. Using a position-velocity clustering estimator (the 4distance) and a model of a smooth stellar halo, we quantify the amount of substructure in the halo, divided by distance and metallicity. Overall, we find that the halo as a whole is highly structured. We also confirm earlier work using blue horizontal branch (BHB) stars which showed that there is an increasing amount of substructure with increasing Galactocentric radius, and additionally find that the amount of substructure in the halo increases with increasing metallicity. Comparing to resampled BHB stars, we find that K giants and BHBs have similar amounts of substructure over equivalent ranges of Galactocentric radius. Using a friends-of-friends algorithm to identify members of individual groups, we find that a large fraction (~33%) of grouped stars are associated with Sgr, and identify stars belonging to other halo star streams: the Orphan Stream, the Cetus Polar Stream, and others, including previously unknown substructures. A large fraction of sample K giants (more than 50%) are not grouped into any substructure. We find also that the Sgr stream strongly dominates groups in the outer halo for all except the most metal-poor stars, and suggest that this is the source of the increase of substructure with Galactocentric radius and metallicity.

  16. Quantifying realized inbreeding in wild and captive animal populations.

    PubMed

    Knief, U; Hemmrich-Stanisak, G; Wittig, M; Franke, A; Griffith, S C; Kempenaers, B; Forstmeier, W

    2015-04-01

    Most molecular measures of inbreeding do not measure inbreeding at the scale that is most relevant for understanding inbreeding depression, namely the proportion of the genome that is identical-by-descent (IBD). The inbreeding coefficient FPed obtained from pedigrees is a valuable estimator of IBD, but pedigrees are not always available and cannot capture inbreeding loops that reach back in time further than the pedigree. We here propose a molecular approach to quantify the realized proportion of the genome that is IBD (propIBD), and we apply this method to a wild and a captive population of zebra finches (Taeniopygia guttata). In each of 948 wild and 1057 captive individuals we analyzed available single-nucleotide polymorphism (SNP) data (260 SNPs) spread over four different genomic regions in each population. This allowed us to determine whether any of these four regions was completely homozygous within an individual, which indicates IBD with high confidence. In the highly nomadic wild population, we did not find a single case of IBD, implying that inbreeding must be extremely rare (propIBD = 0-0.00094, 95% CI). In the captive population, a five-generation pedigree strongly underestimated the average amount of realized inbreeding (FPed = 0.013). Our molecular approach offers a flexible way of quantifying inbreeding at the individual or population level, and we show analytically that it can capture inbreeding loops that reach back up to a few hundred generations.

  17. THE SEGUE K GIANT SURVEY. III. QUANTIFYING GALACTIC HALO SUBSTRUCTURE

    SciTech Connect

    Janesh, William; Morrison, Heather L.; Ma, Zhibo; Harding, Paul; Rockosi, Constance; Xue, Xiang Xiang; Rix, Hans-Walter; Beers, Timothy C.; Johnson, Jennifer; Lee, Young Sun; Schneider, Donald P.

    2016-01-10

    We statistically quantify the amount of substructure in the Milky Way stellar halo using a sample of 4568 halo K giant stars at Galactocentric distances ranging over 5–125 kpc. These stars have been selected photometrically and confirmed spectroscopically as K giants from the Sloan Digital Sky Survey’s Sloan Extension for Galactic Understanding and Exploration project. Using a position–velocity clustering estimator (the 4distance) and a model of a smooth stellar halo, we quantify the amount of substructure in the halo, divided by distance and metallicity. Overall, we find that the halo as a whole is highly structured. We also confirm earlier work using blue horizontal branch (BHB) stars which showed that there is an increasing amount of substructure with increasing Galactocentric radius, and additionally find that the amount of substructure in the halo increases with increasing metallicity. Comparing to resampled BHB stars, we find that K giants and BHBs have similar amounts of substructure over equivalent ranges of Galactocentric radius. Using a friends-of-friends algorithm to identify members of individual groups, we find that a large fraction (∼33%) of grouped stars are associated with Sgr, and identify stars belonging to other halo star streams: the Orphan Stream, the Cetus Polar Stream, and others, including previously unknown substructures. A large fraction of sample K giants (more than 50%) are not grouped into any substructure. We find also that the Sgr stream strongly dominates groups in the outer halo for all except the most metal-poor stars, and suggest that this is the source of the increase of substructure with Galactocentric radius and metallicity.

  18. Design and Analysis of a Micromechanical Three-Component Force Sensor for Characterizing and Quantifying Surface Roughness

    NASA Astrophysics Data System (ADS)

    Liang, Q.; Wu, W.; Zhang, D.; Wei, B.; Sun, W.; Wang, Y.; Ge, Y.

    2015-10-01

    Roughness, which can represent the trade-off between the manufacturing cost and performance of mechanical components, is a critical predictor of cracks, corrosion and fatigue damage. In order to measure polished or super-finished surfaces, a novel touch probe based on a three-component force sensor for characterizing and quantifying surface roughness is proposed, fabricated using silicon micromachining technology. The sensor design is based on a cross-beam structure, which ensures that the system possesses high sensitivity and low coupling. The results show that the proposed sensor possesses high sensitivity, a low coupling error, and a temperature-compensation function. The proposed system can be used to investigate micromechanical structures with nanometer accuracy.

  19. Quantifying Qualitative Data Using Cognitive Maps

    ERIC Educational Resources Information Center

    Scherp, Hans-Ake

    2013-01-01

    The aim of the article is to show how substantial qualitative material consisting of graphic cognitive maps can be analysed by using digital CmapTools, Excel and SPSS. Evidence is provided of how qualitative and quantitative methods can be combined in educational research by transforming qualitative data into quantitative data to facilitate…

  20. Quantifying the Nonlinear, Anisotropic Material Response of Spinal Ligaments

    NASA Astrophysics Data System (ADS)

    Robertson, Daniel J.

    Spinal ligaments may be a significant source of chronic back pain, yet they are often disregarded by the clinical community due to a lack of information with regard to their material response and innervation characteristics. The purpose of this dissertation was to characterize the material response of spinal ligaments and to review their innervation characteristics. Review of the relevant literature revealed that all of the major spinal ligaments are innervated. They cause painful sensations when irritated and provide reflexive control of the deep spinal musculature. As such, including the neurologic implications of iatrogenic ligament damage in the evaluation of surgical procedures aimed at relieving back pain will likely result in more effective long-term solutions. The material response of spinal ligaments has not previously been fully quantified due to limitations associated with standard soft tissue testing techniques. The present work presents and validates a novel testing methodology capable of overcoming these limitations. In particular, the anisotropic, inhomogeneous material constitutive properties of the human supraspinous ligament are quantified, and methods for determining the response of the other spinal ligaments are presented. In addition, a method for determining the anisotropic, inhomogeneous pre-strain distribution of the spinal ligaments is presented. The multi-axial pre-strain distributions of the human anterior longitudinal ligament, ligamentum flavum and supraspinous ligament were determined using this methodology. Results from this work clearly demonstrate that spinal ligaments are not uniaxial structures, and that finite element models which account for pre-strain and incorporate the ligaments' complex material properties may provide increased fidelity to the in vivo condition.

  1. Quantifying Particle Numbers and Mass Flux in Drifting Snow

    NASA Astrophysics Data System (ADS)

    Crivelli, Philip; Paterna, Enrico; Horender, Stefan; Lehning, Michael

    2016-12-01

    We compare two of the most common methods of quantifying mass flux, particle numbers and particle-size distribution for drifting snow events: the snow-particle counter (SPC), a laser-diode-based particle detector, and particle tracking velocimetry based on digital shadowgraphic imaging. The two methods were correlated for mass flux and particle number flux. For the SPC measurements, the device was calibrated by the manufacturer beforehand. The shadowgraphic imaging method measures particle size and velocity directly from consecutive images, and before each new test the image pixel length is newly calibrated. A calibration study with artificially scattered sand particles and glass beads provides suitable settings for the shadowgraphic imaging and a first correlation of the two methods in a controlled environment. In addition, using snow collected in trays during snowfall, several experiments were performed to observe drifting snow events in a cold wind tunnel. The results demonstrate a high correlation between the mass flux obtained for the calibration studies (r ≥ 0.93) and good correlation for the drifting snow experiments (r ≥ 0.81). The impact of measurement settings is discussed in order to reliably quantify particle numbers and mass flux in drifting snow. The study was designed and performed to optimize the settings of the digital shadowgraphic imaging system for both the acquisition and the processing of particles in a drifting snow event. Our results suggest that these optimal settings can be transferred to different imaging set-ups to investigate sediment transport processes.

  2. Quantifying the Bioporosity of Recent Po Delta Sediments

    NASA Astrophysics Data System (ADS)

    Locat, J.; Levesque, M.; Lee, H.; Leroueil, S.

    2004-12-01

    As part of project EuroSTRATAFORM, a series of shallow box-core samples were collected near the Po Delta in Italy. They reveal the presence of intense biogenic structures that are believed to have a strong influence on the physical properties of the sediments, as they introduce another class of porosity: bioporosity (volume of biopores/total volume of the sample). These structures therefore had to be evaluated and quantified. Two of these cores were selected for a detailed analysis, done primarily using 3D CATSCAN imagery (tomographic intensity) and direct physico-chemical measurements. The tomographic intensity is a complex value controlled by many factors such as the grain size, mineralogy, consolidation, water content and porosity. Two methods were used to quantify the bioporosity: an absolute and a relative measurement, both based on the use of the tomographic intensity. The relative method takes into account the variability of the sediment densities along the core, whereas the absolute method fixes the tomographic intensity based on the mean density of sediment for the whole core. Because of the evolution of the geometry of the biogenic structures, it became clear that the relative method was much better. Results have shown that the bioporosity could reach values as high as 40% and could account for more than half of the total porosity. These results suggest that bioporosity can significantly bias water content measurements of the matrix, thus influencing estimates of physical properties such as the plastic and liquid limits and the liquidity index.
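
    A rough sketch of the two thresholding strategies described (relative versus absolute), assuming biopores appear as low-intensity voxels in the CT volume; the specific threshold rule and values are our illustrative stand-ins, not the study's procedure:

      import numpy as np

      def bioporosity(ct, relative=True, offset=-150.0):
          """Fraction of voxels flagged as biopores (low tomographic
          intensity). The relative variant thresholds against each
          slice's mean intensity; the absolute variant uses the
          whole-core mean."""
          if relative:
              ref = ct.mean(axis=(1, 2), keepdims=True)  # per-slice mean
          else:
              ref = ct.mean()                            # whole-core mean
          return float((ct < ref + offset).mean())

      rng = np.random.default_rng(4)
      core = rng.normal(1200.0, 80.0, size=(300, 64, 64))  # mock CT volume
      print(bioporosity(core, relative=True), bioporosity(core, relative=False))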

  3. Tactical Wheeled Vehicle Survivability: Results of Experiments to Quantify Aboveground Impulse

    DTIC Science & Technology

    2010-03-01

    for standard oven-dry water contents. The microwave water content measurement was taken to ensure that the water content of the material was within...the target specification before proceeding to the next lift. The oven-dry water contents are reported as the final water contents. An elevation

  4. Quantifying viruses and bacteria in wastewater - results, quality control, and interpretation methods

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Membrane bioreactors (MBR), used for wastewater treatment in Ohio and elsewhere in the United States, have pore sizes large enough to theoretically reduce concentrations of protozoa and bacteria, but not viruses. Sampling for viruses in wastewater is seldom done and not required. Instead, the bac...

  5. In Vivo Angiography Quantifies Oxygen-Induced Retinopathy Vascular Recovery

    PubMed Central

    Mezu-Ndubuisi, Olachi J.

    2016-01-01

    ABSTRACT Purpose Retinopathy of prematurity (ROP) is a potentially blinding vasoproliferative disease. There is no standardized way to quantify plus disease (tortuous and dilated retinal vessels) or characterize abnormal recovery during ROP monitoring. This study objectively quantifies vascular features in live mice during development using noninvasive retinal imaging. Methods Using fluorescein angiography (FA), retinal vascular features were quantified in live mice with oxygen-induced retinopathy (OIR). A total of 105 wild-type mice were exposed to 77% oxygen from postnatal day 7 (P7) to P12 (OIR mice). Also, 105 age-matched pups were raised in room air (RA mice). In vivo FA was performed at early (P16 to P20), mid (P23 to P27), late (P30 to P34), and mature (P47) phases of retinal vascular development. Retinal vascular area, retinal vein width, and retinal artery tortuosity were quantified. Results Retinal artery tortuosity was higher in OIR than RA mice at early (p < 0.0001), mid (p < 0.0001), late (p < 0.0001), and mature (p < 0.0001) phases. Retinal vascular area in OIR mice increased from early to mid-phase (p < 0.0001), but remained unchanged from mid to late (p = 0.23), and from late to mature phase (p = 0.98). Retinal vein width was larger in OIR mice compared to RA mice during the early phase only. Arteries in OIR mice were more tortuous from early to mid-phase (p < 0.0001), but tortuosity remained stable from mid through mature phase. RA mice had an increase in retinal vascular area from early to late phase, but maintained uniform retinal vein width and retinal artery tortuosity in all phases. Conclusions In vivo FA distinguished arterial and venous features, similar to plus disease, and revealed aberrant recovery of OIR mice (arterial tortuosity, reduced capillary density, and absent neovascular buds) that persisted into adulthood. Retinal artery tortuosity may be a reliable, objective marker of severity of ROP. Infants with abnormal retinal vascular

  6. Using Accelerometer and Gyroscopic Measures to Quantify Postural Stability

    PubMed Central

    Alberts, Jay L.; Hirsch, Joshua R.; Koop, Mandy Miller; Schindler, David D.; Kana, Daniel E.; Linder, Susan M.; Campbell, Scott; Thota, Anil K.

    2015-01-01

    Context Force platforms and 3-dimensional motion-capture systems provide an accurate method of quantifying postural stability. Substantial cost, space, time to administer, and need for trained personnel limit widespread use of biomechanical techniques in the assessment of postural stability in clinical or field environments. Objective To determine whether accelerometer and gyroscope data sampled from a consumer electronics device (iPad2) provide sufficient resolution of center-of-gravity (COG) movements to accurately quantify postural stability in healthy young people. Design Controlled laboratory study. Setting Research laboratory in an academic medical center. Patients or Other Participants A total of 49 healthy individuals (age = 19.5 ± 3.1 years, height = 167.7 ± 13.2 cm, mass = 68.5 ± 17.5 kg). Intervention(s) Participants completed the NeuroCom Sensory Organization Test (SOT) with an iPad2 affixed at the sacral level. Main Outcome Measure(s) Primary outcomes were equilibrium scores from both systems and the time series of the angular displacement of the anteroposterior COG sway during each trial. A Bland-Altman assessment for agreement was used to compare equilibrium scores produced by the NeuroCom and iPad2 devices. Limits of agreement were defined as the mean bias (NeuroCom − iPad) ± 2 standard deviations. Mean absolute percentage error and median difference between the NeuroCom and iPad2 measurements were used to evaluate how closely the real-time COG sway measured by the 2 systems tracked each other. Results The limits of agreement between the 2 devices ranged from −0.5° to 0.5° in SOT condition 1 and from −2.9° to 1.3° in SOT condition 5. The largest absolute value of the measurement error within the 95% confidence intervals for all conditions was 2.9°. The mean absolute percentage error analysis indicated that the iPad2 tracked NeuroCom COG with an average error ranging from 5.87% to 10.42% of the NeuroCom measurement across SOT conditions. Conclusions The i
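
    A small sketch of the Bland-Altman limits-of-agreement calculation exactly as defined in the abstract (mean bias ± 2 standard deviations); the score arrays below are hypothetical, not study data:

      import numpy as np

      def bland_altman_limits(a, b):
          """Mean bias between two measurement systems and the limits
          of agreement, defined as bias +/- 2 standard deviations."""
          diff = np.asarray(a) - np.asarray(b)
          bias = diff.mean()
          sd = diff.std(ddof=1)
          return bias, (bias - 2 * sd, bias + 2 * sd)

      # Hypothetical equilibrium scores from the two devices.
      neurocom = np.array([94.1, 92.3, 88.7, 90.5, 93.0])
      ipad = np.array([93.8, 92.9, 88.1, 90.9, 92.5])
      bias, (lo, hi) = bland_altman_limits(neurocom, ipad)
      print(f"bias={bias:.2f}, limits of agreement=({lo:.2f}, {hi:.2f})")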

  7. Quantifying the Ease of Scientific Discovery.

    PubMed

    Arbesman, Samuel

    2011-02-01

    It has long been known that scientific output proceeds on an exponential increase, or more properly, a logistic growth curve. The interplay between effort and discovery is clear, and the nature of the functional form has been thought to be due to many changes in the scientific process over time. Here I show a quantitative method for examining the ease of scientific progress, another necessary component in understanding scientific discovery. Using examples from three different scientific disciplines - mammalian species, chemical elements, and minor planets - I find the ease of discovery to conform to an exponential decay. In addition, I show how the pace of scientific discovery can be best understood as the outcome of both scientific output and ease of discovery. A quantitative study of the ease of scientific discovery in the aggregate, such as done here, has the potential to provide a great deal of insight into both the nature of future discoveries and the technical processes behind discoveries in science.
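
    A hedged sketch of fitting an exponential decay to an ease-of-discovery series with scipy, in the spirit of the analysis described; the data are synthetic stand-ins, not values from the paper:

      import numpy as np
      from scipy.optimize import curve_fit

      def exp_decay(t, a, k):
          return a * np.exp(-k * t)

      # Hypothetical series: an "ease of discovery" proxy sampled by decade.
      t = np.arange(0, 10)
      ease = 100.0 * np.exp(-0.35 * t) + np.random.default_rng(1).normal(0, 2, t.size)

      (a, k), _ = curve_fit(exp_decay, t, ease, p0=(100.0, 0.3))
      print(f"fitted decay rate k={k:.3f} per decade, "
            f"half-life={np.log(2) / k:.2f} decades")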

  8. Quantifying nursing workflow in medication administration.

    PubMed

    Keohane, Carol A; Bane, Anne D; Featherstone, Erica; Hayes, Judy; Woolf, Seth; Hurley, Ann; Bates, David W; Gandhi, Tejal K; Poon, Eric G

    2008-01-01

    New medication administration systems are showing promise in improving patient safety at the point of care, but adoption of these systems requires significant changes in nursing workflow. To prepare for these changes, the authors report on a time-motion study that measured the proportion of time that nurses spend on various patient care activities, focusing on medication administration-related activities. Implications of their findings are discussed.

  9. Use of short half-life cosmogenic isotopes to quantify sediment mixing and transport in karst conduits

    NASA Astrophysics Data System (ADS)

    Paylor, R.

    2011-12-01

    Particulate inorganic carbon (PIC) transport and flux in karst aquifers is poorly understood. Methods to quantify PIC flux are needed in order to account for total inorganic carbon removal (chemical plus mechanical) from karst settings. Quantifying PIC flux will allow more accurate calculations of landscape denudation and global carbon sink processes. The study concentrates on the critical processes of the suspended sediment component of mass flux - surface soil/stored sediment mixing, transport rates and distance, and sediment storage times. The primary objective of the study is to describe transport and mixing with the resolution of single storm-flow events. To quantify the transport processes, short half-life cosmogenic isotopes are utilized. The isotopes 7Be (t1/2 = 53d) and 210Pb (t1/2 = 22y) are the primary isotopes measured, and other potential isotopes such as 137Cs and 241Am are investigated. The study location is at Mammoth Cave National Park within the Logsdon River watershed. The Logsdon River conduit is continuously traversable underground for two kilometers. Background levels and input concentrations of isotopes are determined from soil samples taken at random locations in the catchment area, and suspended sediment collected from the primary sinking stream during a storm event. Suspended sediment was also collected from the downstream end of the conduit during the storm event. After the storm flow receded, fine sediment samples were taken from the cave stream at regular intervals to determine transport distances and mixing ratios along the conduit. Samples were analyzed with a Canberra Industries gamma ray spectrometer, counted for 24 hours to increase detection of low radionuclide activities. The measured activity levels of radionuclides in the samples were adjusted for decay from time of sampling using standard decay curves. The results of the study show that surface sediment mixing, transport and storage in karst conduits is a dynamic but
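
    Back-correcting a measured activity to the time of sampling follows the standard decay law A0 = A_meas · exp(λt) with λ = ln 2 / t_half; a minimal sketch using the half-lives quoted above (the function name and example values are ours):

      import numpy as np

      def decay_correct(activity_measured, days_since_sampling, half_life_days):
          """Back-correct a measured activity to the time of sampling,
          assuming simple exponential decay."""
          lam = np.log(2) / half_life_days
          return activity_measured * np.exp(lam * days_since_sampling)

      # Half-lives from the text: 7Be ~53 d, 210Pb ~22 y.
      print(decay_correct(12.0, days_since_sampling=30, half_life_days=53))
      print(decay_correct(5.0, days_since_sampling=30, half_life_days=22 * 365.25))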

  10. Land cover change and remote sensing: Examples of quantifying spatiotemporal dynamics in tropical forests

    SciTech Connect

    Krummel, J.R.; Su, Haiping; Fox, J.; Yarnasan, S.; Ekasingh, M.

    1995-06-01

    Research on human impacts or natural processes that operate over broad geographic areas must explicitly address issues of scale and spatial heterogeneity. While the tropical forests of Southeast Asia and Mexico have been occupied and used to meet human needs for thousands of years, traditional forest management systems are currently being transformed by rapid and far-reaching demographic, political, economic, and environmental changes. The dynamics of population growth, migration into the remaining frontiers, and responses to national and international market forces result in a demand for land to produce food and fiber. These results illustrate some of the mechanisms that drive current land use changes, especially in the tropical forest frontiers. By linking the outcome of individual land use decisions and measures of landscape fragmentation and change, the aggregated results show the hierarchy of temporal and spatial events that in summation result in global changes to the most complex and sensitive biome -- tropical forests. By quantifying the spatial and temporal patterns of tropical forest change, researchers can assist policy makers by showing how landscape systems in these tropical forests are controlled by physical, biological, social, and economic parameters.

  11. Children's Knowledge of the Quantifier "Dou" in Mandarin Chinese

    ERIC Educational Resources Information Center

    Zhou, Peng; Crain, Stephen

    2011-01-01

    The quantifier "dou" (roughly corresponding to English "all") in Mandarin Chinese has been the topic of much discussion in the theoretical literature. This study investigated children's knowledge of this quantifier using a new methodological technique, which we dubbed the Question-Statement Task. Three questions were addressed: (i) whether young…

  12. Shortcuts to Quantifier Interpretation in Children and Adults

    ERIC Educational Resources Information Center

    Brooks, Patricia J.; Sekerina, Irina

    2006-01-01

    Errors involving universal quantification are common in contexts depicting sets of individuals in partial, one-to-one correspondence. In this article, we explore whether quantifier-spreading errors are more common with distributive quantifiers each and every than with all. In Experiments 1 and 2, 96 children (5- to 9-year-olds) viewed pairs of…

  13. Visual Attention and Quantifier-Spreading in Heritage Russian Bilinguals

    ERIC Educational Resources Information Center

    Sekerina, Irina A.; Sauermann, Antje

    2015-01-01

    It is well established in language acquisition research that monolingual children and adult second language learners misinterpret sentences with the universal quantifier "every" and make quantifier-spreading errors that are attributed to a preference for a match in number between two sets of objects. The present Visual World eye-tracking…

  14. Quantifying terpenes in rumen fluid, serum, and plasma from sheep

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Determining the fate of terpenes consumed by browsing ruminants requires methods to quantify their presence in blood and rumen fluid. Our objective was to modify an existing procedure for plasma terpenes to quantify 25 structurally diverse mono- and sesquiterpenes in serum, plasma, and rumen fluid fr...

  15. Tetrahydrobiopterin shows chaperone activity for tyrosine hydroxylase.

    PubMed

    Thöny, Beat; Calvo, Ana C; Scherer, Tanja; Svebak, Randi M; Haavik, Jan; Blau, Nenad; Martinez, Aurora

    2008-07-01

    Tyrosine hydroxylase (TH) is the rate-limiting enzyme in the synthesis of catecholamine neurotransmitters. Primary inherited defects in TH have been associated with l-DOPA responsive and non-responsive dystonia and infantile parkinsonism. In this study, we show that both the cofactor (6R)-l-erythro-5,6,7,8-tetrahydrobiopterin (BH(4)) and the feedback inhibitor and catecholamine product dopamine increase the kinetic stability of human TH isoform 1 in vitro. Activity measurements and synthesis of the enzyme by in vitro transcription-translation revealed a complex regulation by the cofactor including both enzyme inactivation and conformational stabilization. Oral BH(4) supplementation to mice increased TH activity and protein levels in brain extracts, while the Th-mRNA level was not affected. Altogether, our results indicate that the molecular mechanisms for the stabilization are a primary folding-aid effect of BH(4) and a secondary effect by increased synthesis and binding of catecholamine ligands. Our results also establish that orally administered BH(4) crosses the blood-brain barrier, and therapeutic regimes based on BH(4) supplementation should thus consider the effect on TH. Furthermore, BH(4) supplementation arises as a putative therapeutic agent in the treatment of brain disorders associated with TH misfolding, such as the human TH isoform 1 mutation L205P.

  16. Quantifiably secure power grid operation, management, and evolution :

    SciTech Connect

    Gray, Genetha Anne.; Watson, Jean-Paul; Silva Monroy, Cesar Augusto; Gramacy, Robert B.

    2013-09-01

    This report summarizes findings and results of the Quantifiably Secure Power Grid Operation, Management, and Evolution LDRD. The focus of the LDRD was to develop decision-support technologies to enable rational and quantifiable risk management for two key grid operational timescales: scheduling (day-ahead) and planning (month-to-year-ahead). Risk or resiliency metrics are foundational in this effort. The 2003 Northeast Blackout investigative report stressed the criticality of enforceable metrics for system resiliency -- the grid's ability to satisfy demands subject to perturbation. However, we neither have well-defined risk metrics for addressing the pervasive uncertainties in a renewable energy era, nor decision-support tools for their enforcement, which severely impacts efforts to rationally improve grid security. For day-ahead unit commitment, decision-support tools must account for topological security constraints, loss-of-load (economic) costs, and supply and demand variability, especially given high renewables penetration. For long-term planning, transmission and generation expansion must ensure realized demand is satisfied for various projected technological, climate, and growth scenarios. The decision-support tools investigated in this project paid particular attention to tail-oriented risk metrics for explicitly addressing high-consequence events. Historically, decision-support tools for the grid consider expected cost minimization, largely ignoring risk and instead penalizing loss-of-load through artificial parameters. The technical focus of this work was the development of scalable solvers for enforcing risk metrics. Advanced stochastic programming solvers were developed to address generation and transmission expansion and unit commitment, minimizing cost subject to pre-specified risk thresholds. Particular attention was paid to renewables where security critically depends on production and demand prediction accuracy. To address this

  17. Quantifying Selective Pressures Driving Bacterial Evolution Using Lineage Analysis

    NASA Astrophysics Data System (ADS)

    Lambert, Guillaume; Kussell, Edo

    2015-01-01

    Organisms use a variety of strategies to adapt to their environments and maximize long-term growth potential, but quantitative characterization of the benefits conferred by the use of such strategies, as well as their impact on the whole population's rate of growth, remains challenging. Here, we use a path-integral framework that describes how selection acts on lineages—i.e., the life histories of individuals and their ancestors—to demonstrate that lineage-based measurements can be used to quantify the selective pressures acting on a population. We apply this analysis to Escherichia coli bacteria exposed to cyclical treatments of carbenicillin, an antibiotic that interferes with cell-wall synthesis and affects cells in an age-dependent manner. While the extensive characterization of the life history of thousands of cells is necessary to accurately extract the age-dependent selective pressures caused by carbenicillin, the same measurement can be recapitulated using lineage-based statistics of a single surviving cell. Population-wide evolutionary pressures can be extracted from the properties of the surviving lineages within a population, providing an alternative and efficient procedure to quantify the evolutionary forces acting on a population. Importantly, this approach is not limited to age-dependent selection, and the framework can be generalized to detect signatures of other trait-specific selection using lineage-based measurements. Our results establish a powerful way to study the evolutionary dynamics of life under selection and may be broadly useful in elucidating selective pressures driving the emergence of antibiotic resistance and the evolution of survival strategies in biological systems.

  18. Hyperspectral remote sensing tools for quantifying plant litter and invasive species in arid ecosystems

    USGS Publications Warehouse

    Nagler, Pamela L.; Sridhar, B.B. Maruthi; Olsson, Aaryn Dyami; Glenn, Edward P.; van Leeuwen, Willem J.D.; Thenkabail, Prasad S.; Huete, Alfredo; Lyon, John G.

    2012-01-01

    Green vegetation can be distinguished using visible and infrared multi-band and hyperspectral remote sensing methods. The problem has been in identifying and distinguishing the non-photosynthetically active landscape components, such as litter and soils, from green vegetation. Additionally, distinguishing different species of green vegetation is challenging using the relatively few bands available on most satellite sensors. This chapter focuses on hyperspectral remote sensing characteristics that aim to distinguish between green vegetation, soil, and litter (or senescent vegetation). Quantifying litter by remote sensing methods is important in constructing carbon budgets of natural and agricultural ecosystems. Distinguishing between plant types is important in tracking the spread of invasive species. Green leaves of different species usually have similar spectra, making it difficult to distinguish between species. However, in this chapter we show that phenological differences between species can be used to detect some invasive species by their distinct patterns of greening and dormancy over an annual cycle based on hyperspectral data. Both applications require methods to quantify the non-green cellulosic fractions of plant tissues by remote sensing even in the presence of soil and green plant cover. We explore these methods and offer three case studies. The first concerns distinguishing surface litter from soil using the Cellulose Absorption Index (CAI), as applied to no-till farming practices where plant litter is left on the soil after harvest. The second involves using different band combinations to distinguish invasive saltcedar from agricultural and native riparian plants on the Lower Colorado River. The third illustrates the use of the CAI and NDVI in time-series analyses to distinguish between invasive buffelgrass and native plants in a desert environment in Arizona. Together the results show how hyperspectral imagery can be applied to
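
    A small sketch of the two indices named above. NDVI is standard; for CAI we assume the commonly used definition from reflectances centred near 2.0, 2.1, and 2.2 μm, which may differ in detail from the chapter's implementation. Reflectance values are hypothetical:

      def ndvi(nir, red):
          """Normalized Difference Vegetation Index."""
          return (nir - red) / (nir + red)

      def cai(r2000, r2100, r2200):
          """Cellulose Absorption Index from reflectances near 2.0, 2.1,
          and 2.2 micrometres (band centres assumed)."""
          return 0.5 * (r2000 + r2200) - r2100

      # Hypothetical reflectances for a litter-covered pixel.
      print(ndvi(nir=0.32, red=0.28))                  # low NDVI: little green cover
      print(cai(r2000=0.24, r2100=0.18, r2200=0.23))   # positive CAI: cellulose present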

  19. New primers for detecting and quantifying denitrifying anaerobic methane oxidation archaea in different ecological niches.

    PubMed

    Ding, Jing; Ding, Zhao-Wei; Fu, Liang; Lu, Yong-Ze; Cheng, Shuk H; Zeng, Raymond J

    2015-11-01

    The significance of ANME-2d as a methane sink in the environment has been overlooked, and no study had evaluated the distribution of ANME-2d in the environment; new primers thus needed to be designed for further research. In this paper, a pair of primers (DP397F and DP569R) was designed to quantify ANME-2d. The specificity and amplification efficiency of this primer pair were acceptable. PCR amplification with another pair of primers (DP142F and DP779R) generated a single, bright targeted band from the enrichment sample, but yielded faint, multiple bands from the environmental samples. Nested PCR was conducted using the primers DP142F/DP779R in the first round and DP142F/DP569R in the second round, which generated a bright targeted band. Further phylogenetic analysis showed that these targeted bands were ANME-2d-related sequences. Real-time PCR showed that the copies of the 16S ribosomal RNA gene of ANME-2d in these samples ranged from 3.72 × 10(4) to 2.30 × 10(5) copies μg(-1) DNA, indicating that the percentage of ANME-2d was greatest in a polluted river sample and least in a rice paddy sample. These results demonstrate that the newly developed real-time PCR primers could sufficiently quantify ANME-2d and that nested PCR with an appropriate combination of the new primers could successfully detect ANME-2d in environmental samples; the latter finding suggests that ANME-2d may be widespread in the environment.
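
    Absolute quantification in real-time PCR is typically read off a linear standard curve, Ct = slope · log10(copies) + intercept; a small illustrative sketch (the curve parameters and Ct values are hypothetical, not taken from this study):

      def copies_from_ct(ct, slope, intercept):
          """Convert a qPCR threshold cycle (Ct) to copy number via a
          linear standard curve Ct = slope * log10(copies) + intercept."""
          return 10 ** ((ct - intercept) / slope)

      # A slope near -3.32 corresponds to ~100% amplification efficiency.
      slope, intercept = -3.32, 38.0
      for ct in (24.5, 27.8, 31.1):
          print(f"Ct={ct}: {copies_from_ct(ct, slope, intercept):.2e} copies")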

  20. Quantifying and Generalizing Hydrologic Responses to Dam Regulation using a Statistical Modeling Approach

    SciTech Connect

    McManamay, Ryan A

    2014-01-01

    Despite the ubiquitous existence of dams within riverscapes, much of our knowledge about dams and their environmental effects remains context-specific. Hydrology, more than any other environmental variable, has been studied in great detail with regard to dam regulation. While much progress has been made in generalizing the hydrologic effects of regulation by large dams, many aspects of hydrology show site-specific fidelity to dam operations, small dams (including diversions), and regional hydrologic regimes. A statistical modeling framework is presented to quantify and generalize hydrologic responses to varying degrees of dam regulation. Specifically, the objectives were to 1) compare the effects of local versus cumulative dam regulation, 2) determine the importance of different regional hydrologic regimes in influencing hydrologic responses to dams, and 3) evaluate how different regulation contexts lead to error in predicting hydrologic responses to dams. Overall, model performance was poor in quantifying the magnitude of hydrologic responses, but performance was sufficient in classifying hydrologic responses as negative or positive. Responses of some hydrologic indices to dam regulation were highly dependent upon hydrologic class membership and the purpose of the dam. The opposing coefficients between local and cumulative-dam predictors suggested that hydrologic responses to cumulative dam regulation are complex, and predicting the hydrology downstream of individual dams, as opposed to multiple dams, may be more easily accomplished using statistical approaches. Results also suggested that particular contexts, including multipurpose dams, high cumulative regulation by multiple dams, diversions, close proximity to dams, and certain hydrologic classes are all sources of increased error when predicting hydrologic responses to dams. Statistical models, such as the ones presented herein, show promise in their ability to model the effects of dam regulation at

  1. Quantifying tissue mechanical properties using photoplethysmography

    SciTech Connect

    Akl, Tony; Wilson, Mark A.; Ericson, Milton Nance; Cote, Gerard L.

    2014-01-01

    Photoplethysmography (PPG) is a non-invasive optical method that can be used to detect blood volume changes in the microvascular bed of tissue. The PPG signal comprises two components: a pulsatile waveform (AC) attributed to changes in the interrogated blood volume with each heartbeat, and a slowly varying baseline (DC) combining low-frequency fluctuations mainly due to respiration and sympathetic nervous system activity. In this report, we investigate the AC pulsatile waveform of the PPG pulse for ultimate use in extracting information regarding the biomechanical properties of tissue and vasculature. By analyzing the rise time of the pulse in the diastole period, we show that PPG is capable of measuring changes in the Young's modulus of tissue-mimicking phantoms with a resolution of 4 kPa in the range of 12 to 61 kPa. In addition, the shape of the pulse can potentially be used to diagnose vascular complications by differentiating upstream from downstream complications. A Windkessel model was used to model changes in the biomechanical properties of the circulation and to test the proposed concept. The modeling data confirmed the response seen in vitro and showed the same trends in the PPG rise and fall times with changes in compliance and vascular resistance.
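
    The report does not specify which Windkessel variant was used, so the sketch below assumes the simplest two-element form, C dP/dt = Q(t) − P/R, with illustrative parameter values; varying R (vascular resistance) or C (compliance) shifts the simulated rise and fall times, the kind of trend discussed above:

      import numpy as np
      from scipy.integrate import solve_ivp

      def windkessel_2e(t, p, R, C, q):
          """Two-element Windkessel: C dP/dt = Q(t) - P/R."""
          return (q(t) - p[0] / R) / C

      def inflow(t, period=0.8, systole=0.3, q_peak=400.0):
          """Hypothetical half-sinusoid inflow (mL/s) during systole."""
          tau = t % period
          return q_peak * np.sin(np.pi * tau / systole) if tau < systole else 0.0

      R, C = 1.0, 1.5   # mmHg*s/mL and mL/mmHg (illustrative values)
      sol = solve_ivp(windkessel_2e, (0, 8), [80.0], args=(R, C, inflow),
                      max_step=0.005)
      p_late = sol.y[0][-400:]  # after initial transients settle
      print(f"settled pressure range: {p_late.min():.1f} to {p_late.max():.1f} mmHg")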

  2. Quantifying MCMC Exploration of Phylogenetic Tree Space

    PubMed Central

    Whidden, Chris; Matsen, Frederick A.

    2015-01-01

    In order to gain an understanding of the effectiveness of phylogenetic Markov chain Monte Carlo (MCMC), it is important to understand how quickly the empirical distribution of the MCMC converges to the posterior distribution. In this article, we investigate this problem on phylogenetic tree topologies with a metric that is especially well suited to the task: the subtree prune-and-regraft (SPR) metric. This metric directly corresponds to the minimum number of MCMC rearrangements required to move between trees in common phylogenetic MCMC implementations. We develop a novel graph-based approach to analyze tree posteriors and find that the SPR metric is much more informative than simpler metrics that are unrelated to MCMC moves. In doing so, we show conclusively that topological peaks do occur in Bayesian phylogenetic posteriors from real data sets as sampled with standard MCMC approaches, investigate the efficiency of Metropolis-coupled MCMC (MCMCMC) in traversing the valleys between peaks, and show that conditional clade distribution (CCD) can have systematic problems when there are multiple peaks. PMID:25631175

  3. Quantifying MCMC exploration of phylogenetic tree space.

    PubMed

    Whidden, Chris; Matsen, Frederick A

    2015-05-01

    In order to gain an understanding of the effectiveness of phylogenetic Markov chain Monte Carlo (MCMC), it is important to understand how quickly the empirical distribution of the MCMC converges to the posterior distribution. In this article, we investigate this problem on phylogenetic tree topologies with a metric that is especially well suited to the task: the subtree prune-and-regraft (SPR) metric. This metric directly corresponds to the minimum number of MCMC rearrangements required to move between trees in common phylogenetic MCMC implementations. We develop a novel graph-based approach to analyze tree posteriors and find that the SPR metric is much more informative than simpler metrics that are unrelated to MCMC moves. In doing so, we show conclusively that topological peaks do occur in Bayesian phylogenetic posteriors from real data sets as sampled with standard MCMC approaches, investigate the efficiency of Metropolis-coupled MCMC (MCMCMC) in traversing the valleys between peaks, and show that conditional clade distribution (CCD) can have systematic problems when there are multiple peaks.

  4. Quantifying Effects Of Water Stress On Sunflowers

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This poster presentation describes the data collection and analysis procedures and results for 2009 from a research grant funded by the National Sunflower Association. The primary objective was to evaluate the use of crop canopy temperature measured with infrared temperature sensors, as a more time ...

  5. Quantifying App Store Dynamics: Longitudinal Tracking of Mental Health Apps

    PubMed Central

    Nicholas, Jennifer; Christensen, Helen

    2016-01-01

    Background For many mental health conditions, mobile health apps offer the ability to deliver information, support, and intervention outside the clinical setting. However, there are difficulties with the use of a commercial app store to distribute health care resources, including turnover of apps, irrelevance of apps, and discordance with evidence-based practice. Objective The primary aim of this study was to quantify the longevity and rate of turnover of mental health apps within the official Android and iOS app stores. The secondary aim was to quantify the proportion of apps that were clinically relevant and assess whether the longevity of these apps differed from clinically nonrelevant apps. The tertiary aim was to establish the proportion of clinically relevant apps that included claims of clinical effectiveness. We performed additional subgroup analyses using additional data from the app stores, including search result ranking, user ratings, and number of downloads. Methods We searched iTunes (iOS) and the Google Play (Android) app stores each day over a 9-month period for apps related to depression, bipolar disorder, and suicide. We performed additional app-specific searches if an app no longer appeared within the main search. Results On the Android platform, 50% of the search results changed after 130 days (depression), 195 days (bipolar disorder), and 115 days (suicide). Search results were more stable on the iOS platform, with 50% of the search results remaining at the end of the study period. Approximately 75% of Android and 90% of iOS apps were still available to download at the end of the study. We identified only 35.3% (347/982) of apps as being clinically relevant for depression, of which 9 (2.6%) claimed clinical effectiveness. Only 3 included a full citation to a published study. Conclusions The mental health app environment is volatile, with a clinically relevant app for depression becoming unavailable to download every 2.9 days. This poses

  6. Quantifying and predicting Drosophila larvae crawling phenotypes

    PubMed Central

    Günther, Maximilian N.; Nettesheim, Guilherme; Shubeita, George T.

    2016-01-01

    The fruit fly Drosophila melanogaster is a widely used model for cell biology, development, disease, and neuroscience. The fly’s power as a genetic model for disease and neuroscience can be augmented by a quantitative description of its behavior. Here we show that we can accurately account for the complex and unique crawling patterns exhibited by individual Drosophila larvae using a small set of four parameters obtained from the trajectories of a few crawling larvae. The values of these parameters change for larvae from different genetic mutants, as we demonstrate for fly models of Alzheimer’s disease and the Fragile X syndrome, allowing applications such as genetic or drug screens. Using the quantitative model of larval crawling developed here we use the mutant-specific parameters to robustly simulate larval crawling, which allows estimating the feasibility of laborious experimental assays and aids in their design. PMID:27323901

  7. Quantifying and predicting Drosophila larvae crawling phenotypes.

    PubMed

    Günther, Maximilian N; Nettesheim, Guilherme; Shubeita, George T

    2016-06-21

    The fruit fly Drosophila melanogaster is a widely used model for cell biology, development, disease, and neuroscience. The fly's power as a genetic model for disease and neuroscience can be augmented by a quantitative description of its behavior. Here we show that we can accurately account for the complex and unique crawling patterns exhibited by individual Drosophila larvae using a small set of four parameters obtained from the trajectories of a few crawling larvae. The values of these parameters change for larvae from different genetic mutants, as we demonstrate for fly models of Alzheimer's disease and the Fragile X syndrome, allowing applications such as genetic or drug screens. Using the quantitative model of larval crawling developed here we use the mutant-specific parameters to robustly simulate larval crawling, which allows estimating the feasibility of laborious experimental assays and aids in their design.

  8. Quantifying Irregularity in Pulsating Red Giants

    NASA Astrophysics Data System (ADS)

    Percy, J. R.; Esteves, S.; Lin, A.; Menezes, C.; Wu, S.

    2009-12-01

    Hundreds of red giant variable stars are classified as “type L,” which the General Catalogue of Variable Stars (GCVS) defines as “slow irregular variables of late spectral type...which show no evidence of periodicity, or any periodicity present is very poorly defined....” Self-correlation (Percy and Muhammed 2004) is a simple form of time-series analysis which determines the cycle-to-cycle behavior of a star, averaged over all the available data. It is well suited for analyzing stars which are not strictly periodic. Even for non-periodic stars, it provides a “profile” of the variability, including the average “characteristic time” of variability. We have applied this method to twenty-three L-type variables which have been measured extensively by AAVSO visual observers. We find a continuous spectrum of behavior, from irregular to semiregular.
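
    A minimal sketch of a self-correlation profile in the spirit of Percy and Muhammed (2004): for every pair of observations, bin the pairs by time separation and average the absolute magnitude differences; minima of the profile near multiples of the characteristic time reveal it. The light curve below is synthetic:

      import numpy as np

      def self_correlation(times, mags, n_bins=50):
          """Mean absolute magnitude difference of all observation
          pairs, binned by time separation."""
          i, j = np.triu_indices(len(times), k=1)
          dt = np.abs(times[j] - times[i])
          dmag = np.abs(mags[j] - mags[i])
          bins = np.linspace(0, dt.max(), n_bins + 1)
          idx = np.digitize(dt, bins) - 1
          profile = np.array([dmag[idx == b].mean() if np.any(idx == b) else np.nan
                              for b in range(n_bins)])
          return 0.5 * (bins[:-1] + bins[1:]), profile

      # Synthetic semiregular light curve: ~100-day cycles plus noise.
      rng = np.random.default_rng(2)
      t = np.sort(rng.uniform(0, 1000, 400))
      m = 7.0 + 0.4 * np.sin(2 * np.pi * t / 100.0) + rng.normal(0, 0.05, t.size)
      lags, prof = self_correlation(t, m)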

  9. Quantifying and predicting Drosophila larvae crawling phenotypes

    NASA Astrophysics Data System (ADS)

    Günther, Maximilian N.; Nettesheim, Guilherme; Shubeita, George T.

    2016-06-01

    The fruit fly Drosophila melanogaster is a widely used model for cell biology, development, disease, and neuroscience. The fly’s power as a genetic model for disease and neuroscience can be augmented by a quantitative description of its behavior. Here we show that we can accurately account for the complex and unique crawling patterns exhibited by individual Drosophila larvae using a small set of four parameters obtained from the trajectories of a few crawling larvae. The values of these parameters change for larvae from different genetic mutants, as we demonstrate for fly models of Alzheimer’s disease and the Fragile X syndrome, allowing applications such as genetic or drug screens. Using the quantitative model of larval crawling developed here we use the mutant-specific parameters to robustly simulate larval crawling, which allows estimating the feasibility of laborious experimental assays and aids in their design.

  10. Quantifying protein diffusion and capture on filaments.

    PubMed

    Reithmann, Emanuel; Reese, Louis; Frey, Erwin

    2015-02-17

    The functional relevance of regulating proteins is often limited to specific binding sites such as the ends of microtubules or actin-filaments. A localization of proteins on these functional sites is of great importance. We present a quantitative theory for a diffusion and capture process, where proteins diffuse on a filament and stop diffusing when reaching the filament's end. It is found that end-association after one-dimensional diffusion is the main source for tip-localization of such proteins. As a consequence, diffusion and capture is highly efficient in enhancing the reaction velocity of enzymatic reactions, where proteins and filament ends are to each other as enzyme and substrate. We show that the reaction velocity can effectively be described within a Michaelis-Menten framework. Together, one-dimensional diffusion and capture beats the (three-dimensional) Smoluchowski diffusion limit for the rate of protein association to filament ends.
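
    The enzymatic analogy above amounts to evaluating a Michaelis-Menten velocity, v = Vmax·S/(Km + S); a toy evaluation with hypothetical parameters:

      def michaelis_menten(s, v_max, k_m):
          """Michaelis-Menten velocity as a function of substrate
          density s (here, the density of free filament ends)."""
          return v_max * s / (k_m + s)

      # Hypothetical parameters for an end-binding reaction enhanced
      # by one-dimensional diffusion and capture along the filament.
      for s in (0.1, 1.0, 10.0):
          print(s, michaelis_menten(s, v_max=5.0, k_m=2.0))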

  11. Quantifying human response capabilities towards tsunami threats at community level

    NASA Astrophysics Data System (ADS)

    Post, J.; Mück, M.; Zosseder, K.; Wegscheider, S.; Taubenböck, H.; Strunz, G.; Muhari, A.; Anwar, H. Z.; Birkmann, J.; Gebert, N.

    2009-04-01

    besides others play a role. An attempt to quantify this variable under high uncertainty is also presented. Quantifying ET is based on GIS modelling using a Cost Weighted Distance approach. The basic principle is to define the best evacuation path from a given point to the next safe area (shelter location): the fastest path from that point to the shelter location has to be found. Thereby the impact of land cover, slope, population density, and population age and gender distribution are taken into account, as literature studies prove these factors to be highly important. Knowing the fastest path and the distance to the next safe area, together with a spatially distributed pattern of evacuation speed, delivers the time needed from each location to a safe area. By considering the obtained time value for RsT, the coverage area of an evacuation target point (safe area) can be assigned. Incorporating knowledge of the people capacity of an evacuation target point, the respective coverage area is refined. Hence areas with weak, moderate and good human response capabilities can be detected. This allows calculation of the potential number of people affected (dead or injured) and the number of people dislocated. First results for Kuta (Bali) for a worst-case tsunami event deliver approx. 25 000 people affected when RT = 0 minutes (direct evacuation when receiving a tsunami warning) and up to 120 000 when RT > ETA (no evacuation action until the tsunami hits the land). Additionally, the fastest evacuation routes to the evacuation target points can be assigned. Areas with weak response capabilities can be designated as priority areas, e.g. to install additional evacuation target points or to increase tsunami knowledge and awareness to promote a faster reaction time. In particular, analyzing the underlying socio-economic properties that cause deficiencies in responding to a tsunami threat can yield valuable information and directly inform the planning of adaptation measures. Keywords: Community level, Risk and vulnerability assessment
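
    A minimal cost-weighted-distance sketch: Dijkstra's algorithm over a travel-time raster yields the fastest time from every cell to the nearest shelter. The raster and shelter locations are hypothetical; a real analysis would derive per-cell times from land cover, slope, and demographics as described above:

      import heapq
      import numpy as np

      def cost_distance(time_per_cell, shelters):
          """Fastest travel time from every grid cell to the nearest
          shelter, via Dijkstra over 4-connected neighbours."""
          rows, cols = time_per_cell.shape
          dist = np.full((rows, cols), np.inf)
          heap = [(0.0, r, c) for r, c in shelters]
          for _, r, c in heap:
              dist[r, c] = 0.0
          heapq.heapify(heap)
          while heap:
              d, r, c = heapq.heappop(heap)
              if d > dist[r, c]:
                  continue
              for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  nr, nc = r + dr, c + dc
                  if 0 <= nr < rows and 0 <= nc < cols:
                      nd = d + time_per_cell[nr, nc]
                      if nd < dist[nr, nc]:
                          dist[nr, nc] = nd
                          heapq.heappush(heap, (nd, nr, nc))
          return dist

      # Hypothetical 100x100 raster of per-cell traversal times (seconds).
      rng = np.random.default_rng(3)
      raster = rng.uniform(5, 30, size=(100, 100))
      evac_time = cost_distance(raster, shelters=[(10, 10), (80, 75)])
      # Cells whose evac_time exceeds the expected tsunami arrival time
      # mark areas of weak response capability.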

  12. Quantifying Visual Similarity in Clinical Iconic Graphics

    PubMed Central

    Payne, Philip R.O.; Starren, Justin B.

    2005-01-01

    Objective: The use of icons and other graphical components in user interfaces has become nearly ubiquitous. The interpretation of such icons is based on the assumption that different users perceive the shapes similarly. At the most basic level, different users must agree on which shapes are similar and which are different. If this similarity can be measured, it may be usable as the basis to design better icons. Design: The purpose of this study was to evaluate a novel method for categorizing the visual similarity of graphical primitives, called Presentation Discovery, in the domain of mammography. Six domain experts were given 50 common textual mammography findings and asked to draw how they would represent those findings graphically. Nondomain experts sorted the resulting graphics into groups based on their visual characteristics. The resulting groups were then analyzed using traditional statistics and hypothesis discovery tools. Strength of agreement was evaluated using computational simulations of sorting behavior. Measurements: Sorter agreement was measured at both the individual graphical and concept-group levels using a novel simulation-based method. “Consensus clusters” of graphics were derived using a hierarchical clustering algorithm. Results: The multiple sorters were able to reliably group graphics into similar groups that strongly correlated with underlying domain concepts. Visual inspection of the resulting consensus clusters indicated that graphical primitives that could be informative in the design of icons were present. Conclusion: The method described provides a rigorous alternative to intuitive design processes frequently employed in the design of icons and other graphical interface components. PMID:15684136

  13. Children with Autism Show Reduced Somatosensory Response: An MEG Study

    PubMed Central

    Marco, Elysa J.; Khatibi, Kasra; Hill, Susanna S.; Siegel, Bryna; Arroyo, Monica S.; Dowling, Anne F.; Neuhaus, John M.; Sherr, Elliott H.; Hinkley, Leighton N. B.; Nagarajan, Srikantan S.

    2012-01-01

    Lay Abstract Autism spectrum disorders are reported to affect nearly one out of every one hundred children, with over 90% of these children showing behavioral disturbances related to the processing of basic sensory information. Behavioral sensitivity to light touch, such as profound discomfort with clothing tags and physical contact, is a ubiquitous finding in children on the autism spectrum. In this study, we investigate the strength and timing of brain activity in response to simple, light taps to the fingertip. Our results suggest that children with autism show a diminished early response in the primary somatosensory cortex (S1). This finding is most evident in the left hemisphere. In exploratory analysis, we also show that tactile sensory behavior, as measured by the Sensory Profile, may be a better predictor of the intensity and timing of brain activity related to touch than a clinical autism diagnosis. We report that children with atypical tactile behavior have significantly lower amplitude somatosensory cortical responses in both hemispheres. Thus sensory behavioral phenotype appears to be a more powerful strategy for investigating neural activity in this cohort. This study provides evidence for atypical brain activity during sensory processing in autistic children and suggests that our sensory behavior based methodology may be an important approach to investigating brain activity in people with autism and neurodevelopmental disorders. Scientific Abstract The neural underpinnings of sensory processing differences in autism remain poorly understood. This prospective magnetoencephalography (MEG) study investigates whether children with autism show atypical cortical activity in the primary somatosensory cortex (S1) in comparison to matched controls. Tactile stimuli were clearly detectable, painless taps applied to the distal phalanx of the second (D2) and third (D3) fingers of the right and left hands. Three tactile paradigms were administered: an oddball

  14. Digital PCR for Quantifying Norovirus in Oysters Implicated in Outbreaks, France

    PubMed Central

    Polo, David; Schaeffer, Julien; Fournet, Nelly; Le Saux, Jean-Claude; Parnaudeau, Sylvain; McLeod, Catherine

    2016-01-01

    Using samples from oysters clearly implicated in human disease, we quantified norovirus levels by using digital PCR. Concentrations varied from 43 to 1,170 RNA copies/oyster. The analysis of frozen samples from the production area showed the presence of norovirus 2 weeks before consumption. PMID:27869597

  15. Quantifying the Components of Impervious Surfaces

    USGS Publications Warehouse

    Tilley, Janet S.; Slonecker, E. Terrence

    2006-01-01

    This study's objectives were to (1) determine the relative contribution of individual impervious surface components by collecting digital information from high-resolution imagery, 1-meter or better; and (2) determine which of the more advanced techniques, such as spectral unmixing or the application of coefficients to land use or land cover data, was the most suitable method for State and local governments as well as Federal agencies to efficiently measure imperviousness in any given watershed or area of interest. The components of impervious surfaces, combined from all the watersheds and time periods in objective one, were the following: buildings 29.2 percent, roads 28.3 percent, and parking lots 24.6 percent, with the remaining three -- driveways, sidewalks, and other (any features not contained within the first five) -- totaling 14 percent. Results from objective two showed that spectral unmixing techniques will ultimately be the most efficient method of determining imperviousness, but they are not yet accurate enough: it is critical to achieve accuracy better than 10 percent of the truth, which the method did not consistently accomplish in this study. Of the three coefficient-application techniques tested, applying coefficients to land use data was not practical, while if the last two methods, coefficients applied to land cover data, were merged, their end results could be within 5 percent or better of the truth. Until the spectral unmixing technique has been further refined, land cover coefficients should be used; they offer quick results, though not current ones, as they were developed from the 1992 National Land Characteristics Data.

  16. Quantifying entanglement of overlapping indistinguishable particles

    NASA Astrophysics Data System (ADS)

    Gittings, Joseph R.

    This thesis develops the quantitative study of quantum entanglement in systems of identical particles. Understanding this topic is essential for the construction of quantum information processing devices involving identical particles. A brief overview of necessary concepts and methods, such as the density matrix, the entanglement in pure and mixed states of distinguishable particles, and some common applications of entanglement is given in the introduction. Some competing methods of calculating the entanglement in bipartite pure states of indistinguishable particles are examined. It is shown that only the 'site entropy' measure introduced by Zanardi satisfies all the criteria for a correct entanglement measure. A teleportation protocol which utilizes all the entanglement carried (in both the spin and space degrees of freedom) in a doubly-occupied molecular bonding orbital is presented. The output from an interferometer in a thought experiment described by Omar et al. is studied as an example to see whether entanglement can be separated into space-only, spin-only, and space-spin components. A similar exercise is performed for a doubly-occupied molecular bonding orbital. The relationship between these results and the application of superselection rules (SSRs) to the quantification of useful entanglement is discussed. A numerical method for estimating the entanglement of formation of a mixed state of arbitrary dimension by a conjugate gradient algorithm is described. The results of applying an implementation of the algorithm to both random and isotropic states of 2 qutrits (i.e. two three-dimensional systems) are described. Existing work on calculating entanglement between two sites in various spin systems is outlined. New methods for calculating the entanglement between two sites in various types of degenerate quantum gas - a Fermi gas, a Bose condensate, and a BCS superconductor - are described. The results of numerical studies of the entanglement in a normal metal
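
    For a bipartite pure state, the site-entropy measure reduces to the von Neumann entropy of a reduced density matrix, which can be computed directly from the Schmidt (singular) values of the coefficient matrix; a minimal sketch with an illustrative state:

      import numpy as np

      def entanglement_entropy(psi, dim_a, dim_b):
          """Von Neumann entropy (in ebits) of subsystem A for a
          bipartite pure state, from the Schmidt coefficients."""
          c = psi.reshape(dim_a, dim_b)
          s = np.linalg.svd(c, compute_uv=False)
          p = s**2
          p = p[p > 1e-12]          # drop numerically zero weights
          return -np.sum(p * np.log2(p))

      # A Bell-like state of two qubits: (|00> + |11>)/sqrt(2) -> 1 ebit.
      psi = np.zeros(4)
      psi[0] = psi[3] = 1 / np.sqrt(2)
      print(entanglement_entropy(psi, 2, 2))  # ~1.0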

  17. NASA GIBS Use in Live Planetarium Shows

    NASA Astrophysics Data System (ADS)

    Emmart, C. B.

    2015-12-01

    The American Museum of Natural History's Hayden Planetarium was rebuilt in year 2000 as an immersive theater for scientific data visualization to show the universe in context to our planet. Specific astrophysical movie productions provide the main daily programming, but interactive control software developed at AMNH allows immersive presentation within a data aggregation of astronomical catalogs called the Digital Universe 3D Atlas. Since 2006, WMS globe-browsing capabilities have been built into a software development collaboration with Sweden's Linkoping University (LiU). The resulting Uniview software, now a product of the company SCISS, is operated by about fifty planetariums around the world, with the ability to network amongst the sites for global presentations. Public presentation of NASA GIBS has allowed authoritative narratives to be presented within the range of data available, in context with other sources such as Science on a Sphere, NASA Earth Observatory and Google Earth KML resources. Specifically, the NOAA-supported World Views Network conducted a series of presentations across the US that focused on local ecological issues that could then be expanded in the course of presentation to national and global scales of examination. NASA support for GIBS resources in an easy-access, multi-scale streaming format like WMS has enabled uniquely facile presentations of global monitoring like never before. Global networking of theaters for distributed presentations broadens the potential impact of this medium. Archiving and refinement of these presentations has already begun to inform new types of documentary productions that examine pertinent global interdependency topics.

  18. Ancient bacteria show evidence of DNA repair

    PubMed Central

    Johnson, Sarah Stewart; Hebsgaard, Martin B.; Christensen, Torben R.; Mastepanov, Mikhail; Nielsen, Rasmus; Munch, Kasper; Brand, Tina; Gilbert, M. Thomas P.; Zuber, Maria T.; Bunce, Michael; Rønn, Regin; Gilichinsky, David; Froese, Duane; Willerslev, Eske

    2007-01-01

    Recent claims of cultivable ancient bacteria within sealed environments highlight our limited understanding of the mechanisms behind long-term cell survival. It remains unclear how dormancy, a favored explanation for extended cellular persistence, can cope with spontaneous genomic decay over geological timescales. There has been no direct evidence in ancient microbes for the most likely mechanism, active DNA repair, or for the metabolic activity necessary to sustain it. In this paper, we couple PCR and enzymatic treatment of DNA with direct respiration measurements to investigate long-term survival of bacteria sealed in frozen conditions for up to one million years. Our results show evidence of bacterial survival in samples up to half a million years in age, making this the oldest independently authenticated DNA to date obtained from viable cells. Additionally, we find strong evidence that this long-term survival is closely tied to cellular metabolic activity and DNA repair, which over time proves superior to dormancy as a mechanism for sustaining bacterial viability. PMID:17728401

  19. Quantifying Photonic High-Dimensional Entanglement

    NASA Astrophysics Data System (ADS)

    Martin, Anthony; Guerreiro, Thiago; Tiranov, Alexey; Designolle, Sébastien; Fröwis, Florian; Brunner, Nicolas; Huber, Marcus; Gisin, Nicolas

    2017-03-01

    High-dimensional entanglement offers promising perspectives in quantum information science. In practice, however, the main challenge is to devise efficient methods to characterize high-dimensional entanglement, based on the available experimental data which is usually rather limited. Here we report the characterization and certification of high-dimensional entanglement in photon pairs, encoded in temporal modes. Building upon recently developed theoretical methods, we certify an entanglement of formation of 2.09(7) ebits in a time-bin implementation, and 4.1(1) ebits in an energy-time implementation. These results are based on very limited sets of local measurements, which illustrates the practical relevance of these methods.

  20. Quantifying oil filtration effects on bearing life

    NASA Technical Reports Server (NTRS)

    Needelman, William M.; Zaretsky, Erwin V.

    1991-01-01

    Rolling-element bearing life is influenced by the number, size, and material properties of particles entering the Hertzian contact of the rolling element and raceway. In general, rolling-element bearing life increases with increasing level of oil filtration. Based upon test results, two equations are presented which allow for the adjustment of bearing L(sub 10) or catalog life based upon oil filter rating. It is recommended that where no oil filtration is used catalog life be reduced by 50 percent.

  1. Quantifying data worth toward reducing predictive uncertainty

    USGS Publications Warehouse

    Dausman, A.M.; Doherty, J.; Langevin, C.D.; Sukop, M.C.

    2010-01-01

    The present study demonstrates a methodology for optimization of environmental data acquisition. Based on the premise that the worth of data increases in proportion to its ability to reduce the uncertainty of key model predictions, the methodology can be used to compare the worth of different data types, gathered at different locations within study areas of arbitrary complexity. The method is applied to a hypothetical nonlinear, variable density numerical model of salt and heat transport. The relative utilities of temperature and concentration measurements at different locations within the model domain are assessed in terms of their ability to reduce the uncertainty associated with predictions of movement of the salt water interface in response to a decrease in fresh water recharge. In order to test the sensitivity of the method to nonlinear model behavior, analyses were repeated for multiple realizations of system properties. Rankings of observation worth were similar for all realizations, indicating robust performance of the methodology when employed in conjunction with a highly nonlinear model. The analysis showed that while concentration and temperature measurements can both aid in the prediction of interface movement, concentration measurements, especially when taken in proximity to the interface at locations where the interface is expected to move, are of greater worth than temperature measurements. Nevertheless, it was also demonstrated that pairs of temperature measurements, taken in strategic locations with respect to the interface, can also lead to more precise predictions of interface movement. Journal compilation © 2010 National Ground Water Association.

  2. Quantifying the origin of metallic glass formation

    NASA Astrophysics Data System (ADS)

    Johnson, W. L.; Na, J. H.; Demetriou, M. D.

    2016-01-01

    The waiting time to form a crystal in a unit volume of homogeneous undercooled liquid exhibits a pronounced minimum τX* at a 'nose temperature' T* located between the glass transition temperature, Tg, and the crystal melting temperature, TL. Turnbull argued that τX* should increase rapidly with the dimensionless ratio trg = Tg/TL. Angell introduced a dimensionless 'fragility parameter', m, to characterize the fall of atomic mobility with temperature above Tg. Both trg and m are widely thought to play a significant role in determining τX*. Here we survey and assess reported data for TL, Tg, trg, m and τX* for a broad range of metallic glasses with widely varying τX*. By analysing this database, we derive a simple empirical expression for τX*(trg, m) that depends exponentially on trg and m and involves two fitting parameters. A statistical analysis shows that knowledge of trg and m alone is therefore sufficient to predict τX* within estimated experimental errors. Surprisingly, the liquid/crystal interfacial free energy does not appear in this expression for τX*.
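
    The abstract specifies the form of the correlation, exponential in trg and m with two fitted constants, without reporting the constants. A minimal curve-fit sketch under that assumed form (the data triples below are invented stand-ins for the surveyed database):

        import numpy as np
        from scipy.optimize import curve_fit

        def log10_tau_x(X, a, b, c):
            """Assumed form: log10(tau_X*) = a + b*trg + c*m, i.e. tau_X*
            depends exponentially on trg and m."""
            trg, m = X
            return a + b * trg + c * m

        # synthetic (trg, m, tau_X*) triples standing in for the surveyed database
        trg = np.array([0.50, 0.55, 0.60, 0.62, 0.66])
        m   = np.array([60.0, 50.0, 45.0, 40.0, 35.0])
        tau = np.array([1e-4, 1e-2, 1.0, 30.0, 3e3])   # seconds

        popt, pcov = curve_fit(log10_tau_x, (trg, m), np.log10(tau))
        a, b, c = popt
        print(f"log10 tau_X* ~ {a:.2f} + {b:.2f}*trg + {c:.2f}*m")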

  3. Quantifying Transmission Investment in Malaria Parasites.

    PubMed

    Greischar, Megan A; Mideo, Nicole; Read, Andrew F; Bjørnstad, Ottar N

    2016-02-01

    Many microparasites infect new hosts with specialized life stages, requiring a subset of the parasite population to forgo proliferation and develop into transmission forms. Transmission stage production influences infectivity, host exploitation, and the impact of medical interventions like drug treatment. Predicting how parasites will respond to public health efforts on both epidemiological and evolutionary timescales requires understanding transmission strategies. These strategies can rarely be observed directly and must typically be inferred from infection dynamics. Using malaria as a case study, we test previously described methods for inferring transmission stage investment against simulated data generated with a model of within-host infection dynamics, where the true transmission investment is known. We show that existing methods are inadequate and potentially very misleading. The key difficulty lies in separating transmission stages produced by different generations of parasites. We develop a new approach that performs much better on simulated data. Applying this approach to real data from mice infected with a single Plasmodium chabaudi strain, we estimate that transmission investment varies from zero to 20%, with evidence for variable investment over time in some hosts, but not others. These patterns suggest that, even in experimental infections where host genetics and other environmental factors are controlled, parasites may exhibit remarkably different patterns of transmission investment.

  4. Quantifying Transmission Investment in Malaria Parasites

    PubMed Central

    Greischar, Megan A.; Mideo, Nicole; Read, Andrew F.; Bjørnstad, Ottar N.

    2016-01-01

    Many microparasites infect new hosts with specialized life stages, requiring a subset of the parasite population to forgo proliferation and develop into transmission forms. Transmission stage production influences infectivity, host exploitation, and the impact of medical interventions like drug treatment. Predicting how parasites will respond to public health efforts on both epidemiological and evolutionary timescales requires understanding transmission strategies. These strategies can rarely be observed directly and must typically be inferred from infection dynamics. Using malaria as a case study, we test previously described methods for inferring transmission stage investment against simulated data generated with a model of within-host infection dynamics, where the true transmission investment is known. We show that existing methods are inadequate and potentially very misleading. The key difficulty lies in separating transmission stages produced by different generations of parasites. We develop a new approach that performs much better on simulated data. Applying this approach to real data from mice infected with a single Plasmodium chabaudi strain, we estimate that transmission investment varies from zero to 20%, with evidence for variable investment over time in some hosts, but not others. These patterns suggest that, even in experimental infections where host genetics and other environmental factors are controlled, parasites may exhibit remarkably different patterns of transmission investment. PMID:26890485

  5. A graph-theoretic method to quantify the airline route authority

    NASA Technical Reports Server (NTRS)

    Chan, Y.

    1979-01-01

    The paper introduces a graph-theoretic method to quantify the legal statements in a route certificate, which specify an airline's routing restrictions. All the authorized nonstop and multistop routes, including the shortest-time routes, can be obtained, and the method suggests profitable route-structure alternatives to airline analysts. This method of quantifying the C.A.B. route authority was programmed in a software package, Route Improvement Synthesis and Evaluation, and demonstrated in a case study with a commercial airline. The study showed the utility of this technique in suggesting route alternatives and the possibility of improvements in the U.S. route system.
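
    The underlying idea, encoding certified segments as a directed graph and enumerating authorized routings, maps directly onto standard graph tooling. A sketch with networkx (cities and block times are invented; this does not reproduce the original RISE package):

        import networkx as nx

        # Directed graph of authorized segments; edge weights are block times in hours.
        G = nx.DiGraph()
        G.add_weighted_edges_from([
            ("DEN", "ORD", 2.1), ("ORD", "JFK", 2.3),
            ("DEN", "DFW", 1.8), ("DFW", "JFK", 3.2),
        ])

        # All authorized routings DEN -> JFK (a nonstop would be a direct edge, if certified).
        for path in nx.all_simple_paths(G, "DEN", "JFK"):
            print(path)

        # Shortest-time authorized routing.
        print(nx.shortest_path(G, "DEN", "JFK", weight="weight"))  # ['DEN', 'ORD', 'JFK']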

  6. Quantifying Explosive Actions in International Women's Soccer.

    PubMed

    Meylan, César M; Trewin, Joshua; McKean, Kelly

    2016-08-24

    The aims of the current study were to examine the external validity of inertial-based parameters (inertial movement analysis; IMA) for detecting multi-planar explosive actions during maximal sprinting and change of direction (COD), and to further determine their reliability, set appropriate magnitude bands for match analysis and assess their variability during international women's soccer matches. Twenty U20 female soccer players, wearing GPS units with a built-in accelerometer, completed three trials of a 40-m sprint and a 20-m sprint with a change of direction to the right or left at 10-m. Further, thirteen women's national team players (157 files; 4-27 matches per player) were analyzed to ascertain match-to-match variability. Video synchronization indicated that the IMA signal coincided with explosive movements (acceleration/deceleration/COD). Peak GPS velocity during the 40-m sprint showed similar reliability (CV = 2.1%) to timing gates, but increased pre- and post-COD (CV = 4.5-13%). IMA variability was greater at the start of sprints (CV = 16-21%) compared to pre- and post-COD (CV = 13-16%). The IMA threshold for match analysis was set at 2.5 m s⁻² by subtracting one standard deviation from the mean IMA during sprint trials. IMA match variability (CV = 14%) differed from high-speed GPS metrics (35-60%). Practitioners are advised that timing lights should remain the gold standard for monitoring sprint and acceleration capabilities of athletes. However, IMA offers a reliable method for monitoring explosive actions between matches and assessing changes due to factors such as a congested schedule, tactics, heat or altitude.
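
    Two of the reported quantities are simple to recompute from raw trial data: the coefficient of variation used throughout for reliability, and the match-analysis threshold set one standard deviation below the mean sprint IMA. A small sketch (trial values invented):

        import numpy as np

        def coefficient_of_variation(x):
            """CV (%) as commonly used for between-trial reliability."""
            x = np.asarray(x, dtype=float)
            return 100.0 * x.std(ddof=1) / x.mean()

        # peak IMA values (m/s^2) from repeated sprint trials -- illustrative numbers
        ima_sprints = np.array([3.1, 2.8, 3.3, 2.9, 3.0])

        print(coefficient_of_variation(ima_sprints))        # trial-to-trial CV (%)
        threshold = ima_sprints.mean() - ima_sprints.std(ddof=1)
        print(threshold)  # 'mean minus one SD' rule that yielded ~2.5 m/s^2 in the study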

  7. Quantifying uncertainty in future ocean carbon uptake

    NASA Astrophysics Data System (ADS)

    Dunne, John P.

    2016-10-01

    Attributing uncertainty in ocean carbon uptake between societal trajectory (scenarios), Earth System Model construction (structure), and inherent natural variation in climate (internal) is critical to make progress in identifying, understanding, and reducing those uncertainties. In the present issue of Global Biogeochemical Cycles, Lovenduski et al. (2016) disentangle these drivers of uncertainty in ocean carbon uptake over time and space and assess the resulting implications for the emergence timescales of structural and scenario uncertainty over internal variability. Such efforts are critical for establishing realizable and efficient monitoring goals and prioritizing areas of continued model development. Under recently proposed climate stabilization targets, such efforts to partition uncertainty also become increasingly critical to societal decision-making in the context of carbon stabilization.

  8. Quantifying the Consistency of Scientific Databases

    PubMed Central

    Šubelj, Lovro; Bajec, Marko; Mileva Boshkoska, Biljana; Kastrin, Andrej; Levnajić, Zoran

    2015-01-01

    Science is a social process with far-reaching impact on our modern society. In recent years, for the first time we are able to scientifically study the science itself. This is enabled by massive amounts of data on scientific publications that is increasingly becoming available. The data is contained in several databases such as Web of Science or PubMed, maintained by various public and private entities. Unfortunately, these databases are not always consistent, which considerably hinders this study. Relying on the powerful framework of complex networks, we conduct a systematic analysis of the consistency among six major scientific databases. We found that identifying a single "best" database is far from easy. Nevertheless, our results indicate appreciable differences in mutual consistency of different databases, which we interpret as recipes for future bibliometric studies. PMID:25984946

  9. Quantifying uncertainties in wind energy assessment

    NASA Astrophysics Data System (ADS)

    Patlakas, Platon; Galanis, George; Kallos, George

    2015-04-01

    The constant rise of wind energy production and its subsequent penetration into global energy markets during recent decades have resulted in the selection of new sites that present various types of problems. Such problems arise from the variability and uncertainty of wind speed. Studying the lower and upper tails of the wind speed distribution can support the quantification of these uncertainties. Such approaches, focused on extreme wind conditions or on periods below the energy production threshold, are necessary for better management of operations. To this end, different methodologies are presented for the credible evaluation of potentially infrequent/extreme values of these environmental conditions. The approaches used take into consideration the structural design of wind turbines according to their lifespan, turbine failures, the time needed for repairs, and the energy production distribution. In this work, a multi-parametric approach for studying extreme wind speed values is discussed, based on tools of Extreme Value Theory. In particular, the study focuses on extreme wind speed return periods and the persistence of no energy production, based on a 10-year weather-model hindcast dataset. More specifically, two methods (Annual Maxima and Peaks Over Threshold) were used to estimate extreme wind speeds and their recurrence intervals. Additionally, two methodologies (intensity given duration and duration given intensity, both based on the Annual Maxima method) were applied to calculate the duration of extreme events, combined with their intensity and frequency. The results show that the proposed approaches converge, at least on the main findings, in each case. It is also noteworthy that, despite the moderate wind speed climate of the area, several consecutive days of no energy production are observed.
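
    As an illustration of the Annual Maxima method, the sketch below fits a generalized extreme value distribution to annual maximum wind speeds with scipy and inverts the fitted CDF for T-year return levels (the speeds are invented stand-ins for the hindcast):

        import numpy as np
        from scipy.stats import genextreme

        # ten annual maximum wind speeds (m/s) -- illustrative stand-in for the hindcast
        annual_max = np.array([24.1, 27.3, 22.8, 30.2, 25.6, 28.9, 23.4, 26.7, 29.5, 25.0])

        shape, loc, scale = genextreme.fit(annual_max)

        # T-year return level: the speed exceeded on average once every T years,
        # i.e. the (1 - 1/T) quantile of the annual-maximum distribution.
        for T in (10, 50, 100):
            print(T, genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale))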

  10. Quantifying the multiple, environmental benefits of reintroducing the Eurasian Beaver

    NASA Astrophysics Data System (ADS)

    Brazier, Richard; Puttock, Alan; Graham, Hugh; Anderson, Karen; Cunliffe, Andrew; Elliott, Mark

    2016-04-01

    Beavers are ecological engineers with an ability to modify the structure and flow of fluvial systems and create complex wetland environments with dams, ponds and canals. Consequently, beaver activity has potential for river restoration, management and the provision of multiple environmental ecosystem services including biodiversity, flood risk mitigation, water quality and sustainable drinking water provision. With the current debate surrounding the reintroduction of beavers into the United Kingdom, it is critical to monitor the impact of beavers upon the environment. We have developed and implemented a monitoring strategy to quantify the impact of reintroducing the Eurasian Beaver on multiple environmental ecosystem services and river systems at a range of scales. First, the experimental design and preliminary results will be presented from the Mid-Devon Beaver Trial, where a family of beavers has been introduced to a 3 ha enclosure situated upon a first order tributary of the River Tamar. The site was instrumented to monitor the flow rate and quality of water entering and leaving the site. Additionally, the impacts of beavers upon riparian vegetation structure and water/carbon storage were investigated. Preliminary results indicate that beaver activity, particularly the building of ponds and dams, increases water storage within the landscape and moderates the river response to rainfall. Baseflow is enhanced during dry periods and storm flow is attenuated, potentially reducing the risk of flooding downstream. Initial analysis of water quality indicates that water entering the site (running off intensively managed grasslands upslope) has higher suspended sediment loads and nitrate levels than water leaving the site after moving through the series of beaver ponds. These results suggest beaver activity may also act as a means by which the negative impact of diffuse water pollution from agriculture can be mitigated, thus providing cleaner water in rivers downstream.

  11. Quantifying drug induced dyskinesia in Parkinson's disease patients using standardized videos.

    PubMed

    Rao, Anusha S; Bodenheimer, Robert E; Davis, Thomas L; Li, Rui; Voight, Cissy; Dawant, Benoit M

    2008-01-01

    This paper presents a video based method to quantify drug induced dyskinesias in Parkinson's disease (PD) patients. Dyskinetic movement in standard clinical videos of patients is analyzed by tracking landmark points on the video frames using non-rigid image registration. The novel application of Point Distribution Models (PDM) allows geometric variations and covariations of the landmark points to be captured from each video sequence. The PDM parameters represent quantifiable information that can be used to rate dyskinesia effectively, analogously to a neurologist's strategy of assessing the movement of multiple body parts simultaneously to effectively rate dyskinesia. A heuristic decision function is then developed using the PDM parameters to quantify the severity of the dyskinesia. The severity score using our decision function showed a high correlation to the dyskinesia rating of a neurologist on the corresponding patient videos.
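
    A Point Distribution Model is essentially principal component analysis over aligned landmark coordinates: each frame's tracked landmarks are flattened into a vector, and the leading eigenvectors of their covariance give the modes of geometric variation whose per-frame weights feed a severity score. A minimal sketch (registration and alignment omitted; the frames are random placeholders):

        import numpy as np

        def point_distribution_model(frames, n_modes=3):
            """frames: (n_frames, n_landmarks, 2) tracked landmark positions.
            Returns mean shape, top variation modes, and per-frame PDM parameters."""
            X = frames.reshape(len(frames), -1)      # flatten (x, y) pairs per frame
            mean = X.mean(axis=0)
            U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
            modes = Vt[:n_modes]                     # principal modes of variation
            params = (X - mean) @ modes.T            # quantifiable motion parameters
            return mean, modes, params

        rng = np.random.default_rng(1)
        frames = rng.normal(size=(120, 10, 2))       # 120 frames, 10 landmarks
        mean, modes, params = point_distribution_model(frames)
        print(params.shape)                          # (120, 3): inputs to a severity score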

  12. Qualifying and quantifying minimal hepatic encephalopathy.

    PubMed

    Morgan, Marsha Y; Amodio, Piero; Cook, Nicola A; Jackson, Clive D; Kircheis, Gerald; Lauridsen, Mette M; Montagnese, Sara; Schiff, Sami; Weissenborn, Karin

    2016-12-01

    Minimal hepatic encephalopathy is the term applied to the neuropsychiatric status of patients with cirrhosis who are unimpaired on clinical examination but show alterations in neuropsychological tests exploring psychomotor speed/executive function and/or in neurophysiological variables. There is no gold standard for the diagnosis of this syndrome. As these patients have, by definition, no recognizable clinical features of brain dysfunction, the primary prerequisite for the diagnosis is careful exclusion of clinical symptoms and signs. A large number of psychometric tests/test systems have been evaluated in this patient group. Of these the best known and validated is the Portal Systemic Hepatic Encephalopathy Score (PHES) derived from a test battery of five paper and pencil tests; normative reference data are available in several countries. The electroencephalogram (EEG) has been used to diagnose hepatic encephalopathy since the 1950s but, once popular, the technology is not as accessible now as it once was. The performance characteristics of the EEG are critically dependent on the type of analysis undertaken; spectral analysis has better performance characteristics than visual analysis; evolving analytical techniques may provide better diagnostic information while the advent of portable wireless headsets may facilitate more widespread use. A large number of other diagnostic tools have been validated for the diagnosis of minimal hepatic encephalopathy including Critical Flicker Frequency, the Inhibitory Control Test, the Stroop test, the Scan package and the Continuous Reaction Time; each has its strengths and weaknesses, its protagonists and its detractors. Recent AASLD/EASL Practice Guidelines suggest that the diagnosis of minimal hepatic encephalopathy should be based on the PHES test together with one of the validated alternative techniques or the EEG. Minimal hepatic encephalopathy has a detrimental effect on the well-being of patients and their care

  13. Quantifying the Fate of Stablised Criegee Intermediates under Atmospheric Conditions

    NASA Astrophysics Data System (ADS)

    Newland, Mike; Rickard, Andrew; Alam, Mohammed; Vereecken, Luc; Muñoz, Amalia; Ródenas, Milagros; Bloss, William

    2014-05-01

    The products of alkene ozonolysis have been shown in field experiments to convert SO2 to H2SO4. One fate of H2SO4 formed in the atmosphere is the formation of sulphate aerosol. This has been reported to contribute −0.4 W m⁻² to anthropogenic radiative forcing via the direct aerosol effect and can also contribute to the indirect aerosol effect, currently one of the greatest uncertainties in climate modelling. The observed SO2 oxidation has been proposed to arise from reactions of the carbonyl oxide, or Criegee Intermediate (CI), formed during alkene ozonolysis reactions, with SO2. Direct laboratory experiments have confirmed that stabilised CIs (SCIs) react more quickly with SO2 (k > 10⁻¹¹ cm³ s⁻¹) than was previously thought. The major sink for SCI in the troposphere is reaction with water vapour. The importance of the SO2 + SCI reaction in H2SO4 formation has been shown in modelling work to be critically dependent on the ratio of the rate constants for the reaction of the SCI with SO2 and with H2O. Such modelling work has suggested that the SCI + SO2 reaction is only likely to be important in regions with high alkene emissions, e.g. forests. Here we present results from a series of ozonolysis experiments performed at the EUPHORE atmospheric simulation chamber, Valencia. These experiments measure the loss of SO2, in the presence of an alkene (ethene, cis-but-2-ene and 2,3-dimethyl-2-butene), as a function of water vapour. From these experiments we quantify the relative rates of reaction of the three smallest SCI with water and SO2 and their decomposition rates. In addition the results appear to suggest that the conversion of SO2 to H2SO4 during alkene ozonolysis may be inconsistent with the SCI + SO2 mechanism alone, particularly at high relative humidities. The results suggest that SCI are likely to provide at least an equivalent sink for SO2 to that of OH in the troposphere, in agreement with field observations. This work highlights the importance of alkene

  14. Quantifying peak discharges for historical floods

    USGS Publications Warehouse

    Cook, J.L.

    1987-01-01

    It is usually advantageous to use information regarding historical floods, if available, to define the flood-frequency relation for a stream. Peak stages can sometimes be determined for outstanding floods that occurred many years ago before systematic gaging of streams began. In the United States, this information is usually not available for more than 100-200 years, but in countries with long cultural histories, such as China, historical flood data are available at some sites as far back as 2,000 years or more. It is important in flood studies to be able to assign a maximum discharge rate and an associated error range to the historical flood. This paper describes the significant characteristics and uncertainties of four commonly used methods for estimating the peak discharge of a flood. These methods are: (1) rating curve (stage-discharge relation) extension; (2) slope conveyance; (3) slope area; and (4) step backwater. Logarithmic extensions of rating curves are based on theoretical plotting techniques that result in straight-line extensions provided that channel shape and roughness do not change significantly. The slope-conveyance and slope-area methods are based on the Manning equation, which requires specific data on channel size, shape and roughness, as well as the water-surface slope for one or more cross-sections in a relatively straight reach of channel. The slope-conveyance method is used primarily for shaping and extending rating curves, whereas the slope-area method is used for specific floods. The step-backwater method, also based on the Manning equation, requires more cross-section data than the slope-area method, but has a water-surface profile convergence characteristic that negates the need for known or estimated water-surface slope. Uncertainties in calculating peak discharge for historical floods may be quite large. Various investigations have shown that errors in calculating peak discharges by the slope-area method under ideal conditions for
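
    The slope-conveyance and slope-area calculations both rest on the Manning equation, Q = (1/n) A R^(2/3) S^(1/2) in SI units, with A the flow area, R the hydraulic radius, S the water-surface slope and n the roughness coefficient. A direct sketch for a single cross-section (the cross-section values are illustrative):

        def manning_discharge(area_m2, wetted_perimeter_m, slope, n):
            """Peak discharge (m^3/s) from the Manning equation for one cross-section.
            slope is the water-surface slope (dimensionless), n the roughness coefficient."""
            hydraulic_radius = area_m2 / wetted_perimeter_m
            return (1.0 / n) * area_m2 * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

        # illustrative historical-flood cross-section: 180 m^2 flow area, 95 m wetted
        # perimeter, 0.0008 water-surface slope, n = 0.035 (natural channel)
        print(manning_discharge(180.0, 95.0, 0.0008, 0.035))  # ~223 m^3/s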

  15. Quantifying the properties of nano-composites.

    NASA Astrophysics Data System (ADS)

    Daw, Murray; Zhang, Bo; He, Jian; Tritt, Terry

    2008-03-01

    With the proliferation of nano-composites produced for possible thermoelectric application, we ask the question: To what extent is a given nano-composite like other composites? Or, in other words, when do we know that we have something new? To address this we apply the classical theory of composites to specific nano-composites grown and characterized at Clemson. The theory is very simple and assumes explicitly very simple properties of the materials, the most important being Fourier's Law/Ohm's Law. Given this assumption, the theory of composites can be applied to the nano-composites based on what is known of the microstructure. This 'classical' result then forms the basis by which the properties can be compared to determine if non-classical effects are being observed. One simple theory is the application of rigorous bounds, such as the Hashin-Shtrikman bounds, which are based only on very simple microstructural descriptors. Another simple theory is the application of FEM, which can be constructed directly from SEM images of the samples using the NIST code 'OOF'. The FEM produces specific predictions for the composite properties. We find that the Hashin-Shtrikman bounds are very useful for analyzing the thermal conductivities of composites, but are too loose to be useful for low-temperature electrical conductivity of composites composed of metals and insulators, where the FEM technique can be applied successfully.
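
    For an isotropic two-phase composite in 3D, the Hashin-Shtrikman bounds need only the phase conductivities and volume fractions. A direct sketch of the standard bounds (not the OOF-based FEM analysis):

        def hashin_shtrikman_bounds(k1, k2, f2):
            """HS bounds on the effective (thermal or electrical) conductivity of an
            isotropic two-phase composite in 3D; k1 <= k2, f2 = volume fraction of phase 2."""
            f1 = 1.0 - f2
            lower = k1 + f2 / (1.0 / (k2 - k1) + f1 / (3.0 * k1))
            upper = k2 + f1 / (1.0 / (k1 - k2) + f2 / (3.0 * k2))
            return lower, upper

        # e.g. 30 vol% of a k = 10 W/m.K phase dispersed in a k = 1 W/m.K matrix
        print(hashin_shtrikman_bounds(1.0, 10.0, 0.3))
        # The bounds tighten as the contrast k2/k1 shrinks, which is why they work
        # for thermal conductivity but become uselessly loose for metal/insulator
        # electrical conductivity, where the contrast spans many orders of magnitude.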

  16. A method to quantify organic functional groups and inorganic compounds in ambient aerosols using attenuated total reflectance FTIR spectroscopy and multivariate chemometric techniques

    NASA Astrophysics Data System (ADS)

    Coury, Charity; Dillner, Ann M.

    An attenuated total reflectance-Fourier transform infrared (ATR-FTIR) spectroscopic technique and a multivariate calibration method were developed to quantify ambient aerosol organic functional groups and inorganic compounds. These methods were applied to size-resolved particulate matter samples collected in winter and summer of 2004 at three sites: a downtown Phoenix, Arizona location, a rural site near Phoenix, and an urban fringe site between the urban and rural sites. Ten organic compound classes, including four classes that contain a carbonyl functional group, and three inorganic species were identified in the ambient samples. A partial least squares calibration was developed and applied to the ambient spectra, and 13 functional groups related to organic compounds (aliphatic and aromatic CH, methylene, methyl, alkene, aldehydes/ketones, carboxylic acids, esters/lactones, acid anhydrides, carbohydrate hydroxyl and ethers, amino acids, and amines) as well as ammonium sulfate and ammonium nitrate were quantified. Comparison of the sum of the mass measured by the ATR-FTIR technique and gravimetric mass indicates that this method can quantify nearly all of the aerosol mass on sub-micrometer size-segregated samples. Analysis of sample results shows that differences in organic functional group and inorganic compound concentrations at the three sampling sites can be measured with these methods. Future work will analyze the quantified data from these three sites in detail.

  17. Quantifying surface roughness over debris covered ice

    NASA Astrophysics Data System (ADS)

    Quincey, Duncan; Rounce, David; Ross, Andrew

    2016-04-01

    Aerodynamic roughness length (z0) remains a major uncertainty when determining turbulent heat fluxes over glacier surfaces, and can vary by an order of magnitude even within a small area and through the melt season. Defining z0 over debris-covered ice is particularly complex, because the surface may comprise clasts of greatly varying size, and the broader-scale surface relief can be similarly heterogeneous. Several recent studies have used Structure from Motion data to model debris-covered surfaces at the centimetric scale and to calculate z0 from measurements of surface microtopography. However, few have validated these measurements with independent vertical wind profile measurements, or considered how the measurements vary over a range of different surface types or scales of analysis. Here, we present the results of a field investigation conducted on the debris-covered Khumbu Glacier during the post-monsoon season of 2015. We focus on two sites. The first is characterised by gravels and cobbles supported by a fine sandy matrix. The second comprises cobbles and boulders separated by voids. Vertical profiles of wind speed measured over both sites enable us to derive measurements of aerodynamic roughness that are similar in magnitude, with z0 at the second site exceeding that at the first by < 1 cm. During our observation period, snow covered the second site for three days, but the impact on z0 is small, implying that roughness is predominantly determined by the largest rock obstacles rather than by the general form of the surface. To complement these aerodynamic measurements we also conducted a Structure from Motion survey across each patch and calculated z0 using microtopographic methods published in a range of recent studies. We compare the outputs of each of these algorithms with each other and with the aerodynamic measurements, assess how they perform over a range of scales, and evaluate the validity of using microtopographic methods where aerodynamic measurements
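
    The aerodynamic determination of z0 rests on the neutral logarithmic wind profile, u(z) = (u*/κ) ln(z/z0): regressing measured speeds on ln(z) gives the friction velocity u* from the slope and z0 as the height at which the extrapolated speed vanishes. A minimal sketch (mast heights and speeds are invented):

        import numpy as np

        KAPPA = 0.4  # von Karman constant

        def fit_z0(heights_m, wind_speeds):
            """Fit u(z) = (u*/kappa) * ln(z/z0) and return (u*, z0).
            Linear regression of u on ln(z): slope = u*/kappa, intercept fixes z0."""
            ln_z = np.log(heights_m)
            slope, intercept = np.polyfit(ln_z, wind_speeds, 1)
            u_star = slope * KAPPA
            z0 = np.exp(-intercept / slope)   # height where the extrapolated u = 0
            return u_star, z0

        # illustrative mast profile over debris (heights in m, speeds in m/s)
        z = np.array([0.25, 0.5, 1.0, 2.0])
        u = np.array([2.1, 2.7, 3.3, 3.9])
        print(fit_z0(z, u))   # z0 of order centimetres, as expected over rough debris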

  18. Quantifying the cleanliness of glass capillaries.

    PubMed

    Bowman, C L

    1998-01-01

    I used capillary rise methods to investigate the lumenal surface properties of quartz (fused silica, Amersil T-08), borosilicate (Corning 7800), and high-lead glass (Corning 0010) capillaries commonly used to make patch pipets. I calculated the capillary rise and contact angle for water and methanol from weight measurements. The capillary rise was compared with the theoretical maximum value calculated by assuming each fluid perfectly wetted the lumenal surface of the glass (i.e., zero contact angle, which reflects the absence of surface contamination). For borosilicate, high-lead, and quartz capillaries, the rise for water was substantially less than the theoretical maximum rise. Exposure of the borosilicate, lead, and quartz capillaries to several cleaning methods resulted in substantially better--but not perfect--agreement between the theoretical maximum rise and calculated capillary rise. By contrast, the capillary rise for methanol was almost identical in untreated and cleaned capillaries, but less than its theoretical maximum rise. The residual discrepancy between the observed and theoretical rise for water could not be eliminated by any of a variety of cleaning procedures, although some cleaning methods were superior to others. The water solubility of the surface contaminants, deduced from the effectiveness of repeated rinsing, was different for each of the three types of capillaries examined: Corning 7800 > quartz > Corning 0010. A surface film was also detected in quartz tubing with an internal filament. I conclude that these borosilicate, quartz, and high-lead glass capillaries have a film on the lumenal surface, which can be removed using appropriate cleaning methods. The surface contaminants may be unique to each type of capillary and may also be hydrophobic. Two simple methods are presented to quantitate the cleanliness of glass capillary tubing commonly used to make pipets for studies of biological membranes. It is not known if the surface film is of
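
    The comparison at the core of the method is Jurin's law, h = 2γ cos(θ)/(ρ g r): the theoretical maximum rise assumes a zero contact angle, and the shortfall of the measured rise yields the contact angle, a proxy for contamination. A small sketch (fluid properties are textbook values; the capillary radius is illustrative):

        import math

        def capillary_rise(gamma, contact_angle_deg, rho, radius_m, g=9.81):
            """Jurin's law: equilibrium rise height (m) in a capillary of given radius."""
            return 2 * gamma * math.cos(math.radians(contact_angle_deg)) / (rho * g * radius_m)

        def contact_angle_from_rise(h_measured, gamma, rho, radius_m, g=9.81):
            """Invert Jurin's law: contact angle implied by a measured rise height."""
            return math.degrees(math.acos(h_measured * rho * g * radius_m / (2 * gamma)))

        # water in a 0.5 mm inner-radius capillary (gamma = 0.0728 N/m, rho = 998 kg/m^3)
        h_max = capillary_rise(0.0728, 0.0, 998.0, 0.5e-3)   # perfectly clean glass
        print(h_max)                                          # ~0.0297 m
        print(contact_angle_from_rise(0.8 * h_max, 0.0728, 998.0, 0.5e-3))  # ~36.9 deg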

  19. Quantifying the biodiversity value of tropical primary, secondary, and plantation forests.

    PubMed

    Barlow, J; Gardner, T A; Araujo, I S; Avila-Pires, T C; Bonaldo, A B; Costa, J E; Esposito, M C; Ferreira, L V; Hawes, J; Hernandez, M I M; Hoogmoed, M S; Leite, R N; Lo-Man-Hung, N F; Malcolm, J R; Martins, M B; Mestre, L A M; Miranda-Santos, R; Nunes-Gutjahr, A L; Overal, W L; Parry, L; Peters, S L; Ribeiro-Junior, M A; da Silva, M N F; da Silva Motta, C; Peres, C A

    2007-11-20

    Biodiversity loss from deforestation may be partly offset by the expansion of secondary forests and plantation forestry in the tropics. However, our current knowledge of the value of these habitats for biodiversity conservation is limited to very few taxa, and many studies are severely confounded by methodological shortcomings. We examined the conservation value of tropical primary, secondary, and plantation forests for 15 taxonomic groups using a robust and replicated sample design that minimized edge effects. Different taxa varied markedly in their response to patterns of land use in terms of species richness and the percentage of species restricted to primary forest (varying from 5% to 57%), yet almost all between-forest comparisons showed marked differences in community structure and composition. Cross-taxon congruence in response patterns was very weak when evaluated using abundance or species richness data, but much stronger when using metrics based upon community similarity. Our results show that, whereas the biodiversity indicator group concept may hold some validity for several taxa that are frequently sampled (such as birds and fruit-feeding butterflies), it fails for those exhibiting highly idiosyncratic responses to tropical land-use change (including highly vagile species groups such as bats and orchid bees), highlighting the problems associated with quantifying the biodiversity value of anthropogenic habitats. Finally, although we show that areas of native regeneration and exotic tree plantations can provide complementary conservation services, we also provide clear empirical evidence demonstrating the irreplaceable value of primary forests.

  20. Quantifying the biodiversity value of tropical primary, secondary, and plantation forests

    PubMed Central

    Barlow, J.; Gardner, T. A.; Araujo, I. S.; Ávila-Pires, T. C.; Bonaldo, A. B.; Costa, J. E.; Esposito, M. C.; Ferreira, L. V.; Hawes, J.; Hernandez, M. I. M.; Hoogmoed, M. S.; Leite, R. N.; Lo-Man-Hung, N. F.; Malcolm, J. R.; Martins, M. B.; Mestre, L. A. M.; Miranda-Santos, R.; Nunes-Gutjahr, A. L.; Overal, W. L.; Parry, L.; Peters, S. L.; Ribeiro-Junior, M. A.; da Silva, M. N. F.; da Silva Motta, C.; Peres, C. A.

    2007-01-01

    Biodiversity loss from deforestation may be partly offset by the expansion of secondary forests and plantation forestry in the tropics. However, our current knowledge of the value of these habitats for biodiversity conservation is limited to very few taxa, and many studies are severely confounded by methodological shortcomings. We examined the conservation value of tropical primary, secondary, and plantation forests for 15 taxonomic groups using a robust and replicated sample design that minimized edge effects. Different taxa varied markedly in their response to patterns of land use in terms of species richness and the percentage of species restricted to primary forest (varying from 5% to 57%), yet almost all between-forest comparisons showed marked differences in community structure and composition. Cross-taxon congruence in response patterns was very weak when evaluated using abundance or species richness data, but much stronger when using metrics based upon community similarity. Our results show that, whereas the biodiversity indicator group concept may hold some validity for several taxa that are frequently sampled (such as birds and fruit-feeding butterflies), it fails for those exhibiting highly idiosyncratic responses to tropical land-use change (including highly vagile species groups such as bats and orchid bees), highlighting the problems associated with quantifying the biodiversity value of anthropogenic habitats. Finally, although we show that areas of native regeneration and exotic tree plantations can provide complementary conservation services, we also provide clear empirical evidence demonstrating the irreplaceable value of primary forests. PMID:18003934

  1. Quantifying the benefits of vehicle pooling with shareability networks

    PubMed Central

    Santi, Paolo; Resta, Giovanni; Szell, Michael; Sobolevsky, Stanislav; Strogatz, Steven H.; Ratti, Carlo

    2014-01-01

    Taxi services are a vital part of urban transportation, and a considerable contributor to traffic congestion and air pollution causing substantial adverse effects on human health. Sharing taxi trips is a possible way of reducing the negative impact of taxi services on cities, but this comes at the expense of passenger discomfort quantifiable in terms of a longer travel time. Due to computational challenges, taxi sharing has traditionally been approached on small scales, such as within airport perimeters, or with dynamical ad hoc heuristics. However, a mathematical framework for the systematic understanding of the tradeoff between collective benefits of sharing and individual passenger discomfort is lacking. Here we introduce the notion of shareability network, which allows us to model the collective benefits of sharing as a function of passenger inconvenience, and to efficiently compute optimal sharing strategies on massive datasets. We apply this framework to a dataset of millions of taxi trips taken in New York City, showing that with increasing but still relatively low passenger discomfort, cumulative trip length can be cut by 40% or more. This benefit comes with reductions in service cost, emissions, and with split fares, hinting toward a wide passenger acceptance of such a shared service. Simulation of a realistic online system demonstrates the feasibility of a shareable taxi service in New York City. Shareability as a function of trip density saturates fast, suggesting effectiveness of the taxi sharing system also in cities with much sparser taxi fleets or when willingness to share is low. PMID:25197046
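
    In the shareability-network formulation, trips are nodes, an edge joins two trips that can be combined within the tolerated delay, and the best pairing is a maximum matching. A toy sketch with networkx (the compatibility edges are invented; the paper's full model also weights edges by the travel time saved):

        import networkx as nx

        # nodes = taxi trips; an edge means the two trips can share a cab within the
        # tolerated delay (toy compatibility list; the real test uses trip origins,
        # destinations, times, and a road network)
        shareability = nx.Graph()
        shareability.add_nodes_from(range(6))
        shareability.add_edges_from([(0, 1), (0, 2), (2, 3), (3, 4), (4, 5)])

        pairs = nx.max_weight_matching(shareability, maxcardinality=True)
        print(pairs)                      # e.g. {(0, 1), (2, 3), (4, 5)}: 6 trips -> 3 cabs
        shared = 2 * len(pairs)
        print(f"{shared}/{shareability.number_of_nodes()} trips shared")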

  2. Quantifying global dust devil occurrence from meteorological analyses

    PubMed Central

    Jemmett-Smith, Bradley C; Marsham, John H; Knippertz, Peter; Gilkeson, Carl A

    2015-01-01

    Dust devils and nonrotating dusty plumes are effective uplift mechanisms for fine particles, but their contribution to the global dust budget is uncertain. By applying known bulk thermodynamic criteria to European Centre for Medium-Range Weather Forecasts (ECMWF) operational analyses, we provide the first global hourly climatology of potential dust devil and dusty plume (PDDP) occurrence. In agreement with observations, activity is highest from late morning into the afternoon. Combining PDDP frequencies with dust source maps and typical emission values gives the best estimate of global contributions of 3.4% (uncertainty 0.9–31%), 1 order of magnitude lower than the only estimate previously published. Total global hours of dust uplift by dry convection are ∼0.002% of the dust-lifting winds resolved by ECMWF, consistent with dry convection making a small contribution to global uplift. Reducing uncertainty requires better knowledge of factors controlling PDDP occurrence, source regions, and dust fluxes induced by dry convection. Key Points: global potential dust devil occurrence quantified from meteorological analyses; climatology shows realistic diurnal cycle and geographical distribution; best estimate of global contribution of 3.4% is 10 times smaller than the previous estimate. PMID:26681815

  3. A Methodology for Quantifying Heart Function in the Embryonic Zebrafish

    NASA Astrophysics Data System (ADS)

    Johnson, Brennan; Garrity, Deborah; Dasi, Lakshmi

    2012-11-01

    Several studies have linked epigenetic factors such as blood flow dynamics and cardiac function to proper heart development. To better understand this process, it is essential to develop robust quantitative methods to investigate the blood dynamics and wall kinematics in vivo. Here, we develop a methodology that can be used throughout the early stages of development which requires no specialized equipment other than a bright field microscope and high-speed camera. We use the embryonic zebrafish as our model due to its superb optical access and widespread acceptance as a powerful model for human heart development. Using these methods, we quantify blood flow rates, stroke volume, cardiac output, ejection fraction, and other important parameters related to heart function. We also investigate the pumping mechanics from heart tube to looped configuration. We show that although the mechanism changes fundamentally, it does so in a continuous fashion that can incorporate combined pumping mechanisms at intermediate stages. This work provides a basis for quantitatively comparing normal and abnormal heart development, and may help us gain a better understanding of congenital heart defects. Funded by NSF.

  4. Quantifying Parkinson's disease progression by simulating gait patterns

    NASA Astrophysics Data System (ADS)

    Cárdenas, Luisa; Martínez, Fabio; Atehortúa, Angélica; Romero, Eduardo

    2015-12-01

    Modern rehabilitation protocols for most neurodegenerative diseases, in particular Parkinson's disease, rely on a clinical analysis of gait patterns. Currently, such analysis is highly dependent on both the examiner's expertise and the type of evaluation. Development of evaluation methods with objective measures is therefore crucial. Physical models arise as a powerful alternative to quantify movement patterns and to emulate the progression and performance of specific treatments. This work introduces a novel quantification of Parkinson's disease progression using a physical model that accurately represents the main gait biomarker, the body Center of Gravity (CoG). The model tracks the whole gait cycle by a coupled double inverted pendulum that emulates the leg swing during the single-support phase and by a damper-spring system (SDP) that recreates both legs in contact with the ground during the double-support phase. The patterns generated by the proposed model are compared with actual ones learned from 24 subjects in stages 2, 3, and 4. The evaluation performed demonstrates a better performance of the proposed model when compared with a baseline model (SP) composed of a coupled double pendulum and a mass-spring system. The Fréchet distance measured differences between model estimates and real trajectories, yielding distances for stages 2, 3, and 4 of 0.137, 0.155, and 0.38 for the baseline and 0.07, 0.09, and 0.29 for the proposed method.
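
    The evaluation metric here, the Fréchet distance between modelled and measured CoG trajectories, has a compact discrete form computable by dynamic programming. A sketch (the trajectories below are synthetic):

        import numpy as np

        def discrete_frechet(P, Q):
            """Discrete Frechet distance between two trajectories of 2D points."""
            n, m = len(P), len(Q)
            d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=2)  # pairwise distances
            F = np.full((n, m), np.inf)
            F[0, 0] = d[0, 0]
            for i in range(n):
                for j in range(m):
                    if i == 0 and j == 0:
                        continue
                    prev = min(F[i - 1, j] if i else np.inf,
                               F[i, j - 1] if j else np.inf,
                               F[i - 1, j - 1] if i and j else np.inf)
                    F[i, j] = max(prev, d[i, j])
            return F[-1, -1]

        t = np.linspace(0, 2 * np.pi, 50)
        measured = np.c_[t, np.sin(t)]              # stand-in for a real CoG trace
        modelled = np.c_[t, np.sin(t) + 0.08]       # model offset vertically by 0.08
        print(discrete_frechet(measured, modelled)) # ~0.08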

  5. Quantifying neurotransmission reliability through metrics-based information analysis.

    PubMed

    Brasselet, Romain; Johansson, Roland S; Arleo, Angelo

    2011-04-01

    We set forth an information-theoretical measure to quantify neurotransmission reliability while taking into full account the metrical properties of the spike train space. This parametric information analysis relies on similarity measures induced by the metrical relations between neural responses as spikes flow in. Thus, in order to assess the entropy, the conditional entropy, and the overall information transfer, this method does not require any a priori decoding algorithm to partition the space into equivalence classes. It therefore allows the optimal parameters of a class of distances to be determined with respect to information transmission. To validate the proposed information-theoretical approach, we study precise temporal decoding of human somatosensory signals recorded using microneurography experiments. For this analysis, we employ a similarity measure based on the Victor-Purpura spike train metrics. We show that with appropriate parameters of this distance, the relative spike times of the mechanoreceptors' responses convey enough information to perform optimal discrimination--defined as maximum metrical information and zero conditional entropy--of 81 distinct stimuli within 40 ms of the first afferent spike. The proposed information-theoretical measure proves to be a suitable generalization of Shannon mutual information in order to consider the metrics of temporal codes explicitly. It allows neurotransmission reliability to be assessed in the presence of large spike train spaces (e.g., neural population codes) with high temporal precision.
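
    The Victor-Purpura metric mentioned above scores the cheapest sequence of edits turning one spike train into another: inserting or deleting a spike costs 1, and shifting a spike by Δt costs q|Δt|, so the parameter q sets the temporal precision being probed. A standard dynamic-programming sketch:

        import numpy as np

        def victor_purpura(s1, s2, q):
            """Victor-Purpura spike-train distance; q (1/s) sets temporal precision
            (q = 0 reduces to a pure spike-count distance)."""
            n, m = len(s1), len(s2)
            D = np.zeros((n + 1, m + 1))
            D[:, 0] = np.arange(n + 1)    # deleting spikes costs 1 each
            D[0, :] = np.arange(m + 1)    # inserting spikes costs 1 each
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    shift = q * abs(s1[i - 1] - s2[j - 1])
                    D[i, j] = min(D[i - 1, j] + 1, D[i, j - 1] + 1, D[i - 1, j - 1] + shift)
            return D[n, m]

        a = [0.010, 0.055, 0.120]            # spike times in seconds
        b = [0.012, 0.130]
        print(victor_purpura(a, b, q=0.0))   # 1.0: trains differ by one spike
        print(victor_purpura(a, b, q=200.0)) # shifts now cost 0.2 per ms moved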

  6. Quantifying capture efficiency of gas collection wells with gas tracers.

    PubMed

    Yazdani, Ramin; Imhoff, Paul; Han, Byunghyun; Mei, Changen; Augenstein, Don

    2015-09-01

    A new in situ method for directly measuring the gas collection efficiency in the region around a gas extraction well was developed. Thirteen tests were conducted by injecting a small volume of gas tracer sequentially at different locations in the landfill cell, and the gas tracer mass collected from each test was used to assess the collection efficiency at each injection point. For 11 tests the gas collection was excellent, always exceeding 70% with seven tests showing a collection efficiency exceeding 90%. For one test the gas collection efficiency was 8±6%. Here, the poor efficiency was associated with a water-laden refuse or remnant daily cover soil located between the point of tracer injection and the extraction well. The utility of in situ gas tracer tests for quantifying landfill gas capture at particular locations within a landfill cell was demonstrated. While there are certainly limitations to this technology, this method may be a valuable tool to help answer questions related to landfill gas collection efficiency and gas flow within landfills. Quantitative data from tracer tests may help assess the utility and cost-effectiveness of alternative cover systems, well designs and landfill gas collection management practices.

  7. Global climate change: the quantifiable sustainability challenge.

    PubMed

    Princiotta, Frank T; Loughlin, Daniel H

    2014-09-01

    Population growth and the pressures spawned by increasing demands for energy and resource-intensive goods, foods, and services are driving unsustainable growth in greenhouse gas (GHG) emissions. Recent GHG emission trends are consistent with worst-case scenarios of the previous decade. Dramatic and near-term emission reductions likely will be needed to ameliorate the potential deleterious impacts of climate change. To achieve such reductions, fundamental changes are required in the way that energy is generated and used. New technologies must be developed and deployed at a rapid rate. Advances in carbon capture and storage, renewable, nuclear, and transportation technologies are particularly important; however, global research and development efforts related to these technologies currently appear to fall short relative to needs. Even with a proactive and international mitigation effort, humanity will need to adapt to climate change, but the adaptation needs and damages will be far greater if mitigation activities are not pursued in earnest. In this review, research is highlighted that indicates increasing global and regional temperatures and ties climate changes to increasing GHG emissions. GHG mitigation targets necessary for limiting future global temperature increases are discussed, including how factors such as population growth and the growing energy intensity of the developing world will make these reduction targets more challenging. Potential technological pathways for meeting emission reduction targets are examined, barriers are discussed, and global and U.S. modeling results are presented that suggest that the necessary pathways will require radically transformed electric and mobile sectors. While geoengineering options have been proposed to allow more time for serious emission reductions, these measures are at the conceptual stage with many unanswered cost, environmental, and political issues. Implications: This paper lays out the case that mitigating the

  8. Quantifying mixing using magnetic resonance imaging.

    PubMed

    Tozzi, Emilio J; McCarthy, Kathryn L; Bacca, Lori A; Hartt, William H; McCarthy, Michael J

    2012-01-25

    Mixing is a unit operation that combines two or more components into a homogeneous mixture. This work involves mixing two viscous liquid streams using an in-line static mixer. The mixer is a split-and-recombine design that employs shear and extensional flow to increase the interfacial contact between the components. A prototype split-and-recombine (SAR) mixer was constructed by aligning a series of thin laser-cut poly(methyl methacrylate) (PMMA) plates held in place in a PVC pipe. Mixing in this device is illustrated in the photograph in Fig. 1. Red dye was added to a portion of the test fluid and used as the minor component being mixed into the major (undyed) component. At the inlet of the mixer, the injected layer of tracer fluid is split into two layers as it flows through the mixing section. On each subsequent mixing section, the number of horizontal layers is duplicated. Ultimately, the single stream of dye is uniformly dispersed throughout the cross section of the device. Using a non-Newtonian test fluid of 0.2% Carbopol and a doped tracer fluid of similar composition, mixing in the unit is visualized using magnetic resonance imaging (MRI). MRI is a very powerful experimental probe of molecular chemical and physical environment as well as sample structure on the length scales from microns to centimeters. This sensitivity has resulted in broad application of these techniques to characterize physical, chemical and/or biological properties of materials ranging from humans to foods to porous media (1, 2). The equipment and conditions used here are suitable for imaging liquids containing substantial amounts of NMR mobile (1)H such as ordinary water and organic liquids including oils. Traditionally, MRI has utilized superconducting magnets, which are not suitable for industrial environments and are not portable within a laboratory (Fig. 2). Recent advances in magnet technology have permitted the construction of large volume industrially compatible magnets suitable for

  9. The Physics of Equestrian Show Jumping

    ERIC Educational Resources Information Center

    Stinner, Art

    2014-01-01

    This article discusses the kinematics and dynamics of equestrian show jumping. For some time I have attended a series of show jumping events at Spruce Meadows, an international equestrian center near Calgary, Alberta, often referred to as the "Wimbledon of equestrian jumping." I have always had a desire to write an article such as this…

  10. Serving Up Activities for TV Cooking Shows.

    ERIC Educational Resources Information Center

    Katchen, Johanna E.

    This paper documents a presentation given on the use of English-language television cooking shows in English-as-a-Second-Language (ESL) and English-as-a-Foreign-Language (EFL) classrooms in Taiwan. Such shows can be ideal for classroom use, since they have a predictable structure consisting of short segments, are of interest to most students,…

  11. 47 CFR 90.505 - Showing required.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... MOBILE RADIO SERVICES Developmental Operation § 90.505 Showing required. (a) Except as provided in paragraph (b) of this section, each application for developmental operation shall be accompanied by a showing that: (1) The applicant has an organized plan of development leading to a specific objective;...

  12. The Language of Show Biz: A Dictionary.

    ERIC Educational Resources Information Center

    Sergel, Sherman Louis, Ed.

    This dictionary of the language of show biz provides the layman with definitions and essays on terms and expressions often used in show business. The overall pattern of selection was intended to be more rather than less inclusive, though radio, television, and film terms were deliberately omitted. Lengthy explanations are sometimes used to express…

  13. Quantifying the Primary Controls on Silica Storage and Mobilization in Grass Dominated Ecosystems

    NASA Astrophysics Data System (ADS)

    Melzer, S. E.; Kelly, E. F.; Yonker, C. M.; Knapp, A. K.; Chadwick, O. A.; Smith, M. D.; Fynn, R. W.; Kirkman, K. P.

    2006-12-01

    The contribution of biogenic silica (BSi) to ecosystem Si pools as well as the influence of BSi on weathering rates in terrestrial systems must be understood in order to further quantify the global biogeochemistry of Si. Recent research suggests that BSi production and storage may be most important in grass-dominated ecosystems relative to other terrestrial biomes. In these ecosystems, silica accumulates to high levels due to (1) the relatively high concentrations found in the dominant vegetation and (2) the lack of significant ground- and stream-water export pathways in these more water-limited biomes. Further, although BSi can significantly affect mineral weathering by transforming silica to a soluble form that can be lost from ecosystems more easily, plant uptake and the stabilizing effect of silica storage in phytoliths renders BSi less subject to export than silica dissolved in soil solution. In this study, we identify and quantify the primary ecological and pedological drivers of the terrestrial Si cycle in grass dominated ecosystems. We sampled sites within the temperate grasslands of North America and in sub-tropical savannas and grasslands of South Africa to fill critical knowledge gaps and to further improve our assessment of the range and variability of BSi in grasslands globally. Although these sites share many similarities, they also have unique ecological, pedological and geological attributes that make them valuable for assessing potential controls on BSi. Our objectives were as follows: 1) to quantify the production, storage, and output of BSi within and among grass dominated ecosystems, and 2) to identify key controls on the size of the BSi pools by sampling sites that differ in (A) precipitation amount, (B) parent material, (C) age, and (D) fire regime. Our results show that, in the younger temperate grasslands, BSi derived from plants significantly increased (α = 0.05) as a function of mean annual precipitation. In contrast, in the older, sub

  14. Feasibility of Quantifying Arterial Cerebral Blood Volume Using Multiphase Alternate Ascending/Descending Directional Navigation (ALADDIN)

    PubMed Central

    Kim, Ki Hwan; Choi, Seung Hong; Park, Sung-Hong

    2016-01-01

    Arterial cerebral blood volume (aCBV) is associated with many physiologic and pathologic conditions. Recently, multiphase balanced steady state free precession (bSSFP) readout was introduced to measure labeled blood signals in the arterial compartment, based on the fact that signal difference between labeled and unlabeled blood decreases with the number of RF pulses that is affected by blood velocity. In this study, we evaluated the feasibility of a new 2D inter-slice bSSFP-based arterial spin labeling (ASL) technique termed, alternate ascending/descending directional navigation (ALADDIN), to quantify aCBV using multiphase acquisition in six healthy subjects. A new kinetic model considering bSSFP RF perturbations was proposed to describe the multiphase data and thus to quantify aCBV. Since the inter-slice time delay (TD) and gap affected the distribution of labeled blood spins in the arterial and tissue compartments, we performed the experiments with two TDs (0 and 500 ms) and two gaps (300% and 450% of slice thickness) to evaluate their roles in quantifying aCBV. Comparison studies using our technique and an existing method termed arterial volume using arterial spin tagging (AVAST) were also separately performed in five subjects. At 300% gap or 500-ms TD, significant tissue perfusion signals were demonstrated, while tissue perfusion signals were minimized and arterial signals were maximized at 450% gap and 0-ms TD. ALADDIN has an advantage of visualizing bi-directional flow effects (ascending/descending) in a single experiment. Labeling efficiency (α) of inter-slice blood flow effects could be measured in the superior sagittal sinus (SSS) (20.8±3.7%) and was used for aCBV quantification. As a result of fitting to the proposed model, aCBV values in gray matter (1.4–2.3 mL/100 mL) were in good agreement with those from literature. Our technique showed high correlation with AVAST, especially when arterial signals were accentuated (i.e., when TD = 0 ms) (r = 0

  15. Quantifying the direct use value of Condor seamount

    NASA Astrophysics Data System (ADS)

    Ressurreição, Adriana; Giacomello, Eva

    2013-12-01

    Seamounts often satisfy numerous uses and interests. Multiple uses can generate multiple benefits but also conflicts and impacts, calling, therefore, for integrated and sustainable management. To assist in developing comprehensive management strategies, policymakers recognise the need to include measures of socioeconomic analysis alongside ecological data so that practical compromises can be made. This study assessed the direct output impact (DOI) of the relevant marine activities operating at Condor seamount (Azores, central northeast Atlantic) as proxies of the direct use values provided by the resource system. Results demonstrated that Condor seamount supported a wide range of uses yielding distinct economic outputs. Demersal fisheries, scientific research and shark diving were the top-three activities generating the highest revenues, while tuna fisheries, whale watching and scuba-diving had marginal economic significance. Results also indicated that the economic importance of non-extractive uses of Condor is considerable, highlighting the importance of these uses as alternative income-generating opportunities for local communities. It is hoped that quantifying the direct use values provided by Condor seamount will contribute to the decision making process towards its long-term conservation and sustainable use.

  16. Quantifying repetitive speech in autism spectrum disorders and language impairment.

    PubMed

    van Santen, Jan P H; Sproat, Richard W; Hill, Alison Presmanes

    2013-10-01

    We report on an automatic technique for quantifying two types of repetitive speech: repetitions of what the child says him/herself (self-repeats) and of what is uttered by an interlocutor (echolalia). We apply this technique to a sample of 111 children between the ages of four and eight: 42 typically developing children (TD), 19 children with specific language impairment (SLI), 25 children with autism spectrum disorders (ASD) plus language impairment (ALI), and 25 children with ASD with normal, non-impaired language (ALN). The results indicate robust differences in echolalia between the TD and ASD groups as a whole (ALN + ALI), and between TD and ALN children. There were no significant differences between ALI and SLI children for echolalia or self-repetitions. The results confirm previous findings that children with ASD repeat the language of others more than other populations of children. On the other hand, self-repetition does not appear to be significantly more frequent in ASD, nor does it matter whether the child's echolalia occurred within one (immediate) or two turns (near-immediate) of the adult's original utterance. Furthermore, non-significant differences between ALN and SLI, between TD and SLI, and between ALI and TD suggest that echolalia may not be specific to ALN or to ASD in general. One important innovation of this work is an objective, fully automatic technique for assessing the amount of repetition in a transcript of a child's utterances.
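
    As a rough illustration of the classification task (not the authors' technique), the sketch below flags a child utterance as echolalia when it largely overlaps an adult utterance one or two turns back, and as a self-repeat when it overlaps the child's own earlier utterance; the overlap measure and threshold are invented:

        def overlap(u1, u2):
            """Fraction of words in u2 that also occur in u1 (crude repetition score)."""
            w1, w2 = set(u1.lower().split()), set(u2.lower().split())
            return len(w1 & w2) / max(len(w2), 1)

        def classify_turns(turns, threshold=0.8):
            """turns: list of (speaker, utterance). Flags repetition within one
            (immediate) or two (near-immediate) turns; the most recent match wins."""
            labels = []
            for i, (spk, utt) in enumerate(turns):
                label = "novel"
                if spk == "CHI":
                    for j in range(max(0, i - 2), i):
                        prev_spk, prev_utt = turns[j]
                        if overlap(prev_utt, utt) >= threshold:
                            label = "echolalia" if prev_spk == "ADU" else "self-repeat"
                labels.append(label)
            return labels

        turns = [("ADU", "do you want the ball"), ("CHI", "want the ball"),
                 ("CHI", "want the ball")]
        print(classify_turns(turns))   # ['novel', 'echolalia', 'self-repeat']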

  17. Quantifying colloid retention in partially saturated porous media

    NASA Astrophysics Data System (ADS)

    Zevi, Yuniati; Dathe, Annette; Gao, Bin; Richards, Brian K.; Steenhuis, Tammo S.

    2006-12-01

    The transport of colloid-contaminant complexes and colloid-sized pathogens through soil to groundwater is of concern. Visualization and quantification of pore-scale colloid behavior will enable better description and simulation of retention mechanisms at individual surfaces, in contrast to breakthrough curves, which provide only an integrated signal. We tested two procedures for quantifying colloid movement and retention as observed in pore-scale image sequences. After initial testing with static images, three series of images of synthetic microbead suspensions passing through unsaturated sand were examined. The region procedure (implemented in ImageJ) and the Boolean procedure (implemented in KS400) yielded nearly identical results for the initial test images and for total colloid-covered areas in the three image series. Because of electronic noise resulting in pixel-level brightness fluctuations, the Boolean procedure tended to underestimate attached colloid counts and conversely overestimate mobile colloid counts. The region procedure had a smaller overestimation error for attached colloids. Reliable quantification of colloid retention at the pore scale can be used to improve current understanding of the transport mechanisms of colloids in unsaturated porous media. For example, attachment counts at individual air/water meniscus/solid interfaces were well described by Langmuir isotherms.
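
    A region-style counting procedure of the kind compared here can be sketched as follows, assuming thresholded grayscale frames and connected-component labeling; classifying a colloid as attached when its region overlaps colloid pixels at the same location in the previous frame is an illustrative simplification of the published ImageJ/KS400 procedures.

        import numpy as np
        from scipy import ndimage

        def count_attached_mobile(frame1: np.ndarray, frame2: np.ndarray,
                                  threshold: float = 128.0):
            """Label connected colloid regions in frame2 and classify each as
            attached (colloid pixels occupied the same location in frame1)
            or mobile (no overlap, i.e. the colloid moved between frames)."""
            mask1 = frame1 > threshold
            mask2 = frame2 > threshold
            labels, n_regions = ndimage.label(mask2)
            attached = mobile = 0
            for region in range(1, n_regions + 1):
                if np.any(mask1[labels == region]):
                    attached += 1
                else:
                    mobile += 1
            return attached, mobile

        # Synthetic stand-ins for two consecutive pore-scale images
        rng = np.random.default_rng(0)
        f1 = rng.uniform(0, 255, (64, 64))
        f2 = rng.uniform(0, 255, (64, 64))
        print(count_attached_mobile(f1, f2, threshold=250))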

  18. Method for quantifying optical properties of the human lens

    DOEpatents

    Loree, T.R.; Bigio, I.J.; Zuclich, J.A.; Shimada, Tsutomu; Strobl, K.

    1999-04-13

    A method is disclosed for quantifying optical properties of the human lens. The present invention includes the application of fiberoptic, OMA-based instrumentation as an in vivo diagnostic tool for the human ocular lens. Rapid, noninvasive and comprehensive assessment of the optical characteristics of a lens using very modest levels of exciting light is described. Typically, the backscatter and fluorescence spectra (from about 300 to 900 nm) elicited by each of several exciting wavelengths (from about 300 to 600 nm) are collected within a few seconds. The resulting optical signature of individual lenses is then used to assess the overall optical quality of the lens by comparing the results with a database of similar measurements obtained from a reference set of normal human lenses of various ages. Several metrics have been identified which gauge the optical quality of a given lens relative to the norm for the subject's chronological age. These metrics may also serve to document accelerated optical aging and/or as early indicators of cataract or other disease processes. 8 figs.

  19. Method for quantifying optical properties of the human lens

    DOEpatents

    Loree, deceased, Thomas R.; Bigio, Irving J.; Zuclich, Joseph A.; Shimada, Tsutomu; Strobl, Karlheinz

    1999-01-01

    Method for quantifying optical properties of the human lens. The present invention includes the application of fiberoptic, OMA-based instrumentation as an in vivo diagnostic tool for the human ocular lens. Rapid, noninvasive and comprehensive assessment of the optical characteristics of a lens using very modest levels of exciting light is described. Typically, the backscatter and fluorescence spectra (from about 300 to 900 nm) elicited by each of several exciting wavelengths (from about 300 to 600 nm) are collected within a few seconds. The resulting optical signature of individual lenses is then used to assess the overall optical quality of the lens by comparing the results with a database of similar measurements obtained from a reference set of normal human lenses of various ages. Several metrics have been identified which gauge the optical quality of a given lens relative to the norm for the subject's chronological age. These metrics may also serve to document accelerated optical aging and/or as early indicators of cataract or other disease processes.

  20. Quantifying uncertainty in LCA-modelling of waste management systems.

    PubMed

    Clavreul, Julie; Guyonnet, Dominique; Christensen, Thomas H

    2012-12-01

    Uncertainty analysis in LCA studies has seen major progress in recent years. In the context of waste management, various methods have been implemented, but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining the selected methods: (Step 1) a sensitivity analysis evaluating the sensitivities of the results with respect to the input uncertainties, (Step 2) an uncertainty propagation providing appropriate tools for representing uncertainties and calculating the overall uncertainty of the model results, (Step 3) an uncertainty contribution analysis quantifying the contribution of each parameter uncertainty to the final uncertainty and (Step 4), as a new approach, a combined sensitivity analysis providing a visualisation of the shift in the ranking of different options due to variations in selected key parameters. This tiered approach optimises the resources available to LCA practitioners by propagating only the most influential uncertainties.
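
    Steps 2 and 3 of such a framework can be illustrated with a hedged Monte Carlo sketch; the toy waste-LCA model, parameter distributions, and one-at-a-time variance shares below are placeholders, not the paper's case study or exact methods.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 10_000

        # Step 2: propagate parameter uncertainties through a toy waste-LCA
        # model (net GWP = transport + incineration - energy recovery credit).
        params = {
            "transport_km":   rng.normal(50, 10, N),     # km per tonne
            "ef_transport":   rng.normal(0.1, 0.02, N),  # kg CO2-eq per km
            "ef_incinerator": rng.normal(400, 60, N),    # kg CO2-eq per tonne
            "energy_credit":  rng.normal(150, 40, N),    # kg CO2-eq per tonne
        }

        def model(p):
            return (p["transport_km"] * p["ef_transport"]
                    + p["ef_incinerator"] - p["energy_credit"])

        gwp = model(params)
        print(f"net GWP: {gwp.mean():.0f} +/- {gwp.std():.0f} kg CO2-eq/tonne")

        # Step 3: contribution of each parameter uncertainty to the overall
        # uncertainty, estimated by freezing all other parameters at their means.
        for name in params:
            frozen = {k: (v if k == name else np.full(N, v.mean()))
                      for k, v in params.items()}
            print(f"{name}: {model(frozen).var() / gwp.var():.0%} of variance")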

  1. Quantifying Russian wheat aphid pest intensity across the Great Plains.

    PubMed

    Merrill, Scott C; Peairs, Frank B

    2012-12-01

    Wheat, the most important cereal crop in the Northern Hemisphere, is at risk of an approximately 10% reduction in worldwide production because of animal pests. The potential economic impact of cereal crop pests has resulted in substantial research efforts into the understanding of pest agroecosystems and the development of pest management strategy. Management strategy is frequently informed by models that describe the population dynamics of important crop pests, and because of the economic impact of these pests, many models have been developed. Yet limited effort has been made to compare and contrast models for their strategic applicability and quality. One of the most damaging pests of wheat in North America is the Russian wheat aphid, Diuraphis noxia (Kurdjumov). Eighteen D. noxia population dynamic models were developed from the literature to describe pest intensity. The strongest models quantified the negative effects of fall and spring precipitation on aphid intensity, and the positive effects associated with alternate food source availability. Population dynamic models were transformed into spatially explicit models and combined to form a spatially explicit, model-averaged result. Our findings were used to delineate pest intensity on winter wheat across much of the Great Plains and will help improve D. noxia management strategy.
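
    One common way to combine candidate population models into a model-averaged result is Akaike weighting, sketched below under the assumption of AIC-scored models; the scores and per-site predictions are hypothetical, and the authors' exact averaging procedure may differ.

        import numpy as np

        def aic_weights(aic):
            """Akaike weights: the relative support for each candidate model."""
            delta = np.asarray(aic, dtype=float) - np.min(aic)
            w = np.exp(-0.5 * delta)
            return w / w.sum()

        # Hypothetical AIC scores for three candidate intensity models and
        # their predicted pest intensities at two locations.
        aic = [1012.4, 1013.1, 1019.8]
        preds = np.array([[0.8, 0.3],
                          [0.7, 0.4],
                          [0.9, 0.2]])
        w = aic_weights(aic)
        print("weights:", np.round(w, 3))
        print("model-averaged intensity:", preds.T @ w)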

  2. Quantifying dose to the reconstructed breast: Can we adequately treat?

    SciTech Connect

    Chung, Eugene; Marsh, Robin B.; Griffith, Kent A.; Moran, Jean M.; Pierce, Lori J.

    2013-04-01

    We evaluated how immediate reconstruction (IR) impacts postmastectomy radiotherapy (PMRT) dose distributions to the reconstructed breast (RB), internal mammary nodes (IMN), heart, and lungs using quantifiable dosimetric end points. 3D conformal plans were developed for 20 IR patients: 10 with autologous reconstruction (AR) and 10 with expander-implant (EI) reconstruction. For each reconstruction type, 5 right- and 5 left-sided reconstructions were selected. Two plans were created for each patient, one with RB coverage alone and one with RB + IMN coverage. Left-sided EI plans without IMN coverage had a higher heart Dmean than left-sided AR plans (2.97 and 0.84 Gy, p = 0.03). Otherwise, results did not vary by reconstruction type, and all remaining metrics were evaluated using a combined AR and EI dataset. RB coverage was adequate regardless of laterality or IMN coverage (Dmean 50.61 Gy, D95 45.76 Gy). When included, IMN Dmean and D95 were 49.57 and 40.96 Gy, respectively. Mean heart doses increased with left-sided treatment plans and IMN inclusion. Right-sided treatment plans and IMN inclusion increased mean lung V20. Using standard field arrangements and 3D planning, we observed excellent coverage of the RB and IMN, regardless of laterality or reconstruction type. Our results demonstrate that adequate doses can be delivered to the RB with or without IMN coverage.
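
    The dosimetric end points used here can be computed directly from voxel doses; a minimal sketch, assuming a flattened dose array in Gy (the distribution below is synthetic, not patient data):

        import numpy as np

        def dvh_metrics(dose: np.ndarray):
            """Dose-volume metrics from a flattened array of voxel doses (Gy):
            Dmean, D95 (dose received by at least 95% of the volume), and
            V20 (fraction of the volume receiving at least 20 Gy)."""
            d_mean = dose.mean()
            d95 = np.percentile(dose, 5)   # 95% of voxels receive >= this dose
            v20 = np.mean(dose >= 20.0)
            return d_mean, d95, v20

        rng = np.random.default_rng(0)
        doses = rng.normal(50.6, 2.5, 100_000)  # synthetic RB dose distribution
        d_mean, d95, v20 = dvh_metrics(doses)
        print(f"Dmean={d_mean:.2f} Gy, D95={d95:.2f} Gy, V20={v20:.1%}")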

  3. Quantifying signal dispersion in a hybrid ice core melting system.

    PubMed

    Breton, Daniel J; Koffman, Bess G; Kurbatov, Andrei V; Kreutz, Karl J; Hamilton, Gordon S

    2012-11-06

    We describe a microcontroller-based ice core melting and data logging system allowing simultaneous depth coregistration of a continuous flow analysis (CFA) system (for microparticle and conductivity measurement) and a discrete sample analysis system (for geochemistry and microparticles), both supplied from the same melted ice core section. This hybrid melting system employs an ice parcel tracking algorithm that calculates real-time sample transport through all portions of the meltwater handling system, enabling accurate (1 mm) depth coregistration of all measurements. Signal dispersion is rigorously quantified using residence time theory, tracer injection tests, and antiparallel melting of replicate cores. Our dispersion-limited resolution is 1.0 cm in ice and ~2 cm in firn. We experimentally observe the peak lead phenomenon, in which signal dispersion causes the measured CFA peak associated with a given event to be assigned a depth ~1 cm shallower than the true event depth. Dispersion effects on resolution and signal depth assignment are discussed in detail. Our results have implications for comparisons of chemistry and physical properties data recorded using multiple instruments and for deconvolution methods of enhancing CFA depth resolution.
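
    A plug-flow version of such parcel tracking can be sketched as follows, assuming known tubing volumes and flow rates; the system dimensions are hypothetical, and the published algorithm tracks parcels through each portion of the system in real time rather than with a single lumped delay.

        def transit_time(volumes_ul, flows_ul_per_s):
            """Plug-flow transit time (s) through a chain of tubing and
            debubbler volumes (uL) at the given volumetric flow rates (uL/s)."""
            return sum(v / q for v, q in zip(volumes_ul, flows_ul_per_s))

        def assign_depth(t_measured_s, melt_rate_cm_per_s, delay_s):
            """Map an analyzer timestamp back to the depth of the ice parcel
            by removing the transit delay through the meltwater system."""
            return melt_rate_cm_per_s * (t_measured_s - delay_s)

        # Hypothetical system: two tubing segments plus a debubbler
        delay = transit_time([500.0, 250.0, 1200.0], [15.0, 15.0, 15.0])
        print(f"delay = {delay:.0f} s; "
              f"depth at t = 600 s: {assign_depth(600, 0.02, delay):.1f} cm")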

  4. Quantifying the leakage of quantum protocols for classical two-party cryptography

    NASA Astrophysics Data System (ADS)

    Salvail, Louis; Schaffner, Christian; Sotáková, Miroslava

    2014-12-01

    We study quantum protocols among two distrustful parties. By adopting a rather strict definition of correctness — guaranteeing that honest players obtain their correct outcomes only — we can show that every strictly correct quantum protocol implementing a non-trivial classical primitive necessarily leaks information to a dishonest player. This extends known impossibility results to all non-trivial primitives. We provide a framework for quantifying this leakage and argue that leakage is a good measure for the privacy provided to the players by a given protocol. Our framework also covers the case where the two players are helped by a trusted third party. We show that despite the help of a trusted third party, the players cannot amplify the cryptographic power of any primitive. All our results hold even against quantum honest-but-curious adversaries who honestly follow the protocol but purify their actions and apply a different measurement at the end of the protocol. As concrete examples, we establish lower bounds on the leakage of standard universal two-party primitives such as oblivious transfer.

  6. Quantified Energy Dissipation Rates in the Terrestrial Bow Shock: 1. Analysis Techniques and Methodology

    NASA Technical Reports Server (NTRS)

    Wilson, L. B., III; Sibeck, D. G.; Breneman, A.W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.

    2014-01-01

    We present a detailed outline and discussion of the analysis techniques used to compare the relevance of different energy dissipation mechanisms at collisionless shock waves. We show that the low-frequency, quasi-static fields contribute less to ohmic energy dissipation, −j · E (the negative of the current density dotted into the measured electric field), than their high-frequency counterparts. In fact, we found that high-frequency, large-amplitude (greater than 100 millivolts per meter and/or greater than 1 nanotesla) waves are ubiquitous in the transition region of collisionless shocks. We quantitatively show that their fields, through wave-particle interactions, cause enough energy dissipation to regulate the global structure of collisionless shocks. The purpose of this paper, part one of two, is to outline and describe in detail the background, analysis techniques, and theoretical motivation for our new results presented in the companion paper. The companion paper presents the results of our quantitative energy dissipation rate estimates and discusses the implications. Together, the two manuscripts present the first study quantifying the contribution that high-frequency waves provide, through wave-particle interactions, to the total energy dissipation budget of collisionless shock waves.
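
    The quasi-static ohmic dissipation estimate reduces to a dot product over vector time series; a minimal sketch with synthetic field data, following the sign convention quoted above:

        import numpy as np

        def ohmic_dissipation(j: np.ndarray, e: np.ndarray) -> np.ndarray:
            """Dissipation rate density (-j . E), W/m^3, from time series of
            current density j (A/m^2) and electric field E (V/m), shape (N, 3)."""
            return -np.einsum("ij,ij->i", j, e)

        # Synthetic shock-transition time series (not spacecraft data)
        rng = np.random.default_rng(1)
        j = rng.normal(0.0, 1e-6, (1000, 3))           # A/m^2
        e = rng.normal(0.0, 0.1, (1000, 3)) - 5e4 * j  # V/m, partly load-like
        print(f"mean dissipation: {ohmic_dissipation(j, e).mean():.2e} W/m^3")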

  7. Quantifying Attachment and Antibiotic Resistance of Escherichia coli from Conventional and Organic Swine Manure.

    PubMed

    Zwonitzer, Martha R; Soupir, Michelle L; Jarboe, Laura R; Smith, Douglas R

    2016-03-01

    Broad-spectrum antibiotics are often administered to swine, contributing to the occurrence of antibiotic-resistant bacteria in their manure. During land application, the bacteria in swine manure preferentially attach to particles in the soil, affecting their transport in overland flow. However, a quantitative understanding of these attachment mechanisms is lacking, and their relationship to antibiotic resistance is unknown. The objective of this study is to examine the relationships between antibiotic resistance and attachment to very fine silica sand in Escherichia coli collected from swine manure. A total of 556 isolates were collected from six farms, two organic and four conventional (antibiotics fed prophylactically). Antibiotic resistance was quantified using 13 antibiotics at three minimum inhibitory concentration breakpoints: resistant, intermediate, and susceptible. Of the 556 isolates used in the antibiotic resistance assays, 491 were subjected to an attachment assay. Results show that isolates from conventional systems were significantly more resistant to amoxicillin, ampicillin, chlortetracycline, erythromycin, kanamycin, neomycin, streptomycin, tetracycline, and tylosin (P < 0.001). Results also indicate that E. coli isolated from conventional systems attached to very fine silica sand at significantly higher levels than those from organic systems (P < 0.001). Statistical analysis showed that a significant relationship did not exist between antibiotic resistance levels and attachment in E. coli from conventional systems but did for organic systems (P < 0.001). Better quantification of these relationships is critical to understanding the behavior of E. coli in the environment and preventing exposure of human populations to antibiotic-resistant bacteria.

  8. Quantifying the Carbon Intensity of Biomass Energy

    NASA Astrophysics Data System (ADS)

    Hodson, E. L.; Wise, M.; Clarke, L.; McJeon, H.; Mignone, B.

    2012-12-01

    Direct emissions occur when biomass production used for energy displaces land used for food crops, forest products, pasture, or other arable land in the same region. Indirect emissions occur when increased food crop production, compensating for displaced food crop production in the biomass production region, displaces land in regions outside of the region of biomass production. Initial results from this study suggest that indirect land use emissions, mainly from converting unmanaged forest land, are likely to be as important as direct land use emissions in determining the carbon intensity of biomass energy. Finally, we value the emissions of a marginal unit of biomass production for a given carbon price path and a range of assumed social discount rates. We also compare the cost of bioenergy emissions as valued by a hypothetical private actor to the relevant cost of emissions from conventional fossil fuels, such as coal or natural gas.
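
    The valuation step can be illustrated as a discounted sum of annual emissions times the carbon price; the emission stream, price path, and discount rates below are placeholders, not the study's figures.

        def emissions_value(emissions_tco2, prices_usd, rate):
            """Present value of a stream of annual emissions (tCO2/yr) priced
            along a carbon price path (USD/tCO2) at a social discount rate."""
            return sum(e * p / (1.0 + rate) ** t
                       for t, (e, p) in enumerate(zip(emissions_tco2, prices_usd)))

        emissions = [2.0, 0.5, 0.5, 0.5, 0.5]        # up-front conversion, then small
        prices = [25 * 1.05 ** t for t in range(5)]  # price rising 5%/yr
        for r in (0.025, 0.05):
            print(f"rate {r:.1%}: ${emissions_value(emissions, prices, r):,.2f}")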

  9. Quantifying the Restorable Water Volume of California's Sierra Nevada Meadows

    NASA Astrophysics Data System (ADS)

    Emmons, J. D.; Yarnell, S. M.; Fryjoff-Hung, A.; Viers, J.

    2013-12-01

    The Sierra Nevada is estimated to provide over 66% of California's water supply, which is largely derived from snowmelt. Global climate warming is expected to result in a decrease in snowpack and an increase in melting rate, making the attenuation of snowmelt an important ecosystem service for ensuring water availability. Montane meadows are dispersed throughout the mountain range and can act as natural reservoirs, while also providing wildlife habitat, water filtration, and water storage. Despite the important role of meadows in the Sierra Nevada, a large proportion are degraded by stream incision, which increases volume outflows and reduces overbank flooding, thus reducing infiltration and potential water storage. Restoration of meadow stream channels would therefore improve hydrological functioning, including increased water storage. The potential water holding capacity of restored meadows has yet to be quantified, so this research addresses this knowledge gap by estimating the restorable water volume lost to stream incision. More than 17,000 meadows were analyzed by categorizing their erosion potential using channel slope and soil texture, ultimately resulting in six general erodibility types. Field measurements of over 100 meadows, stratified by latitude, elevation, and geologic substrate, were then taken and analyzed for each erodibility type to determine the average depth of incision. Restorable water volume was then quantified as a function of the water holding capacity of the soil, meadow area and incised depth. Total restorable water volume was found to be 120 x 10^6 m3, or approximately 97,000 acre-feet. Using 95% confidence intervals for incised depth, the upper and lower bounds of the total restorable water volume were found to be 107 - 140 x 10^6 m3. Though this estimate of restorable water volume is small in regards to the storage capacity of typical California reservoirs, restoration of Sierra Nevada meadows remains an important
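
    The volume calculation described here is, in essence, area × incised depth × soil water-holding fraction summed over erodibility types; the per-type figures below are invented to illustrate the arithmetic (they deliberately land near the study's order of magnitude but are not its data).

        # (total area m^2, mean incised depth m, soil water-holding fraction)
        meadow_types = [
            (4.0e8, 0.5, 0.35),
            (2.5e8, 0.8, 0.30),
            (1.0e8, 1.2, 0.25),
        ]

        restorable_m3 = sum(area * depth * whc
                            for area, depth, whc in meadow_types)
        acre_feet = restorable_m3 / 1233.48  # 1 acre-foot = 1233.48 m^3
        print(f"{restorable_m3:.2e} m^3 ~ {acre_feet:,.0f} acre-feet")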

  10. Quantifying the Climate-Scale Accuracy of Satellite Cloud Retrievals

    NASA Astrophysics Data System (ADS)

    Roberts, Y.; Wielicki, B. A.; Sun-Mack, S.; Minnis, P.; Liang, L.; Di Girolamo, L.

    2014-12-01

    Instrument calibration and cloud retrieval algorithms have been developed to minimize retrieval errors on small scales. However, measurement uncertainties and assumptions within retrieval algorithms at the pixel level may alias into decadal-scale trends of cloud properties. We first, therefore, quantify how instrument calibration changes could alias into cloud property trends. For a perfect observing system the climate trend accuracy is limited only by the natural variability of the climate variable. Alternatively, for an actual observing system, the climate trend accuracy is additionally limited by the measurement uncertainty. Drifts in calibration over time may therefore be disguised as a true climate trend. We impose absolute calibration changes to MODIS spectral reflectance used as input to the CERES Cloud Property Retrieval System (CPRS) and run the modified MODIS reflectance through the CPRS to determine the sensitivity of cloud properties to calibration changes. We then use these changes to determine the impact of instrument calibration changes on trend uncertainty in reflected solar cloud properties. Secondly, we quantify how much cloud retrieval algorithm assumptions alias into cloud optical retrieval trends by starting with the largest of these biases: the plane-parallel assumption in cloud optical thickness (τC) retrievals. First, we collect liquid water cloud fields obtained from Multi-angle Imaging Spectroradiometer (MISR) measurements to construct realistic probability distribution functions (PDFs) of 3D cloud anisotropy (a measure of the degree to which clouds depart from plane-parallel) for different ISCCP cloud types. Next, we will conduct a theoretical study with dynamically simulated cloud fields and a 3D radiative transfer model to determine the relationship between 3D cloud anisotropy and 3D τC bias for each cloud type. Combining these results provides distributions of 3D τC bias by cloud type. Finally, we will estimate the change in

  11. A NEW METHOD TO QUANTIFY CORE TEMPERATURE INSTABILITY IN RODENTS.

    EPA Science Inventory

    Methods to quantify instability of autonomic systems such as temperature regulation should be important in toxicant and drug safety studies. Stability of core temperature (Tc) in laboratory rodents is susceptible to a variety of stimuli. Calculating the temperature differential o...

  12. Wireless accelerometer iPod application for quantifying gait characteristics.

    PubMed

    LeMoyne, Robert; Mastroianni, Timothy; Grundfest, Warren

    2011-01-01

    The capability to quantify gait characteristics through a wireless accelerometer iPod application in an effectively autonomous environment may alleviate the progressive strain on highly specialized medical resources. The iPod offers the attributes needed for robust gait quantification: a three-dimensional accelerometer, data storage, flexible software, and the capacity for wireless transmission of gait data by email. Building on these integral components, a wireless accelerometer iPod application for quantifying gait characteristics was tested and evaluated in an essentially autonomous environment. The quantified gait acceleration waveforms were transmitted wirelessly by email for postprocessing, with the gait experiment conducted at a site remote from where the postprocessing took place. The application demonstrated sufficient accuracy and consistency for quantifying gait characteristics.
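
    Gait characteristics such as step count and cadence can be extracted from an acceleration waveform by simple peak detection; a sketch with a synthetic signal (the thresholds and signal model are illustrative assumptions, not the study's processing pipeline):

        import numpy as np
        from scipy.signal import find_peaks

        def cadence(a_vertical: np.ndarray, fs: float):
            """Step count and cadence (steps/min) from a vertical acceleration
            waveform (m/s^2) sampled at fs (Hz), via simple peak detection."""
            peaks, _ = find_peaks(a_vertical, height=1.0, distance=int(0.3 * fs))
            minutes = len(a_vertical) / fs / 60.0
            return len(peaks), len(peaks) / minutes

        fs = 60.0                                  # Hz
        t = np.arange(0.0, 10.0, 1.0 / fs)         # 10 s recording
        steps = 1.5 * np.maximum(np.sin(2 * np.pi * 2.0 * t), 0.0)  # ~2 steps/s
        noise = 0.1 * np.random.default_rng(0).normal(size=t.size)
        print(cadence(steps + noise, fs))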

  13. Quantifying Phycocyanin Concentration in Cyanobacterial Algal Blooms from Remote Sensing Reflectance-A Quasi Analytical Approach

    NASA Astrophysics Data System (ADS)

    Mishra, S.; Mishra, D. R.; Tucker, C.

    2011-12-01

    Cyanobacterial harmful algal blooms (CHABs) are notorious for depleting dissolved oxygen levels, producing various toxins, threatening aquatic life, and altering food-web dynamics and overall ecosystem functioning in inland lakes, estuaries, and coastal waters. Many of these toxins can damage cells and tissues and even cause mortality in living organisms. Frequent monitoring of water quality at a synoptic scale has become possible by virtue of remote sensing techniques. In this research, we present a novel technique to monitor CHABs using remote sensing reflectance products. We have modified a multi-band quasi-analytical algorithm (QAA) that determines phytoplankton absorption coefficients from above-surface remote sensing reflectance measurements using an inversion method. In situ hyperspectral remote sensing reflectance data were collected from several highly turbid and productive aquaculture ponds. A novel technique was developed to further decompose the phytoplankton absorption coefficient at 620 nm and obtain the phycocyanin absorption coefficient at the same wavelength. An empirical relationship was then established between phycocyanin absorption coefficients at 620 nm and measured phycocyanin concentrations. Model calibration showed a strong relationship between phycocyanin absorption coefficients and phycocyanin pigment concentration (r^2 = 0.94). Validation of the model on a separate dataset produced a root mean squared error of 167 mg m^-3 (phycocyanin range: 26-1012 mg m^-3). Results demonstrate that the new approach is suitable for quantifying phycocyanin concentration in cyanobacteria-dominated turbid productive waters. The band architecture of the model matches the band configuration of the Medium Resolution Imaging Spectrometer (MERIS), ensuring that MERIS reflectance products can be used to quantify phycocyanin in cyanobacterial harmful algal blooms in optically complex waters.
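
    The final empirical step, converting the decomposed phycocyanin absorption coefficient at 620 nm into a pigment concentration, can be sketched with a specific-absorption formulation; the coefficient below is a placeholder, not the calibrated relationship reported above.

        def phycocyanin_mg_m3(a_pc_620: float, a_star_pc: float = 0.0095) -> float:
            """Phycocyanin concentration (mg m^-3) from its absorption
            coefficient at 620 nm (m^-1), via a specific absorption
            coefficient a* (m^2 mg^-1); the a* value here is a placeholder."""
            return a_pc_620 / a_star_pc

        # a_pc(620) as decomposed from the QAA-derived phytoplankton absorption
        print(f"PC ~ {phycocyanin_mg_m3(3.2):.0f} mg m^-3")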

  14. Quantifying forearm muscle activity during wrist and finger movements by means of multi-channel electromyography.

    PubMed

    Gazzoni, Marco; Celadon, Nicolò; Mastrapasqua, Davide; Paleari, Marco; Margaria, Valentina; Ariano, Paolo

    2014-01-01

    The study of hand and finger movement is an important topic with applications in prosthetics, rehabilitation, and ergonomics. Surface electromyography (sEMG) is the gold standard for the analysis of muscle activation. Previous studies investigated the optimal electrode number and positioning on the forearm to obtain information representative of muscle activation and robust to movements. However, the sEMG spatial distribution on the forearm during hand and finger movements, and its changes with different hand positions, has never been quantified. The aim of this work is to quantify (1) the spatial localization of surface EMG activity of distinct forearm muscles during dynamic free movements of the wrist and single fingers and (2) the effect of hand position on sEMG activity distribution. The subjects performed cyclic dynamic tasks involving the wrist and the fingers. The wrist tasks and the hand opening/closing task were performed with the hand in prone and neutral positions. A sensorized glove was used for kinematics recording. sEMG signals were acquired from the forearm muscles using a grid of 112 electrodes integrated into a stretchable textile sleeve. The areas of sEMG activity were identified by a segmentation technique after a data dimensionality reduction step based on Non-Negative Matrix Factorization (NMF) applied to the EMG envelopes. The results show that (1) it is possible to identify distinct areas of sEMG activity on the forearm for different fingers and (2) hand position influences sEMG activity level and spatial distribution. This work gives new quantitative information about sEMG activity distribution on the forearm in healthy subjects and provides a basis for future work on identifying optimal electrode configurations for sEMG-based control of prostheses, exoskeletons, or orthoses. An example of using this information to optimize the detection system for estimating joint kinematics from sEMG is reported.
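
    The envelope-factorization step can be sketched with an off-the-shelf NMF, assuming an electrodes-by-time envelope matrix; the synthetic data, component count, and half-of-maximum segmentation rule are illustrative stand-ins for the published pipeline.

        import numpy as np
        from sklearn.decomposition import NMF

        # Synthetic envelope matrix: 112 electrodes x 2000 time samples
        rng = np.random.default_rng(0)
        envelopes = np.abs(rng.normal(size=(112, 2000)))

        # Factor into spatial weight maps W (electrodes x components) and
        # activation time courses H (components x time).
        nmf = NMF(n_components=5, init="nndsvd", max_iter=500)
        W = nmf.fit_transform(envelopes)
        H = nmf.components_

        # Crude segmentation: electrodes whose weight exceeds half of the
        # component maximum form that component's active area.
        areas = [np.flatnonzero(W[:, k] > 0.5 * W[:, k].max()) for k in range(5)]
        print([a.size for a in areas])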

  15. Signal enhancement ratio (SER) quantified from breast DCE-MRI and breast cancer risk

    NASA Astrophysics Data System (ADS)

    Wu, Shandong; Kurland, Brenda F.; Berg, Wendie A.; Zuley, Margarita L.; Jankowitz, Rachel C.; Sumkin, Jules; Gur, David

    2015-03-01

    Breast magnetic resonance imaging (MRI) is recommended as an adjunct to mammography for women considered at elevated risk of developing breast cancer. As a key component of breast MRI, dynamic contrast-enhanced MRI (DCE-MRI) uses a contrast agent to provide high-intensity contrast between breast tissues, making it sensitive to tissue composition and vascularity. Breast DCE-MRI characterizes certain physiologic properties of breast tissue that are potentially related to breast cancer risk. Studies have shown that increased background parenchymal enhancement (BPE), the contrast enhancement occurring in normal, cancer-unaffected breast tissue in post-contrast sequences, predicts increased breast cancer risk. The signal enhancement ratio (SER), computed from pre-contrast and post-contrast sequences in DCE-MRI, measures the change in signal intensity due to contrast uptake over time and is a measure of contrast enhancement kinetics. SER quantified in breast tumors has shown potential as a biomarker for characterizing tumor response to treatment. In this work we investigated the relationship between quantitative measures of SER and breast cancer risk. A pilot retrospective case-control study was performed using a cohort of 102 women: 51 who had been diagnosed with unilateral breast cancer and 51 controls (matched by age and MRI date) with a unilateral biopsy-proven benign lesion. SER was quantified using fully automated computerized algorithms, and three SER-derived quantitative volume measures were compared between the cancer cases and controls using logistic regression analysis. Our preliminary results showed that SER is associated with breast cancer risk, after adjustment for Breast Imaging Reporting and Data System (BI-RADS)-based mammographic breast density measures. This pilot study indicates that SER has potential for use as a risk factor for breast cancer risk assessment in women at elevated risk of developing breast cancer.
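
    SER is commonly defined from three DCE-MRI time points as (S1 − S0)/(S2 − S0); a voxelwise sketch, assuming this standard formulation (the intensities below are synthetic, and the study's full volumetric pipeline is not reproduced):

        import numpy as np

        def signal_enhancement_ratio(s0, s1, s2, eps=1e-6):
            """Voxelwise SER = (S1 - S0) / (S2 - S0), where S0 is pre-contrast,
            S1 early post-contrast, and S2 late post-contrast signal intensity.
            The denominator is clamped here for safety; real pipelines mask
            non-enhancing voxels instead."""
            return (s1 - s0) / np.maximum(s2 - s0, eps)

        s0 = np.array([100.0, 120.0, 90.0])   # synthetic voxel intensities
        s1 = np.array([180.0, 200.0, 95.0])
        s2 = np.array([160.0, 210.0, 100.0])
        print(signal_enhancement_ratio(s0, s1, s2))  # SER > 1 suggests washout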

  16. Learned control over spinal nociception reduces supraspinal nociception as quantified by late somatosensory evoked potentials.

    PubMed

    Ruscheweyh, Ruth; Bäumler, Maximilian; Feller, Moritz; Krafft, Stefanie; Sommer, Jens; Straube, Andreas

    2015-12-01

    We have recently shown that subjects can learn to use cognitive-emotional strategies to suppress their spinal nociceptive flexor reflex (RIII reflex) under visual RIII feedback and proposed that this reflects learned activation of descending pain inhibition. Here, we investigated whether learned RIII suppression also affects supraspinal nociception and whether previous relaxation training increases success. Subjects were trained over 3 sessions to reduce their RIII size by self-selected cognitive-emotional strategies. Two groups received true RIII feedback (with or without previous relaxation training) and a sham group received false feedback (15 subjects per group). RIII reflexes, late somatosensory evoked potentials (SEPs), and F-waves were recorded and pain intensity ratings collected. Both true feedback groups achieved significant (P < 0.01) but similar RIII suppression (to 79% ± 21% and 70% ± 17% of control). Somatosensory evoked potential amplitude (100-150 milliseconds after stimulation) was reduced in parallel with the RIII size (r = 0.57, P < 0.01). In the sham group, neither RIII size nor SEP amplitude was significantly reduced during feedback training. Pain intensity was significantly reduced in all 3 groups and also correlated with RIII reduction (r = 0.44, P < 0.01). F-wave parameters were not affected during RIII suppression. The present results show that learned RIII suppression also affects supraspinal nociception as quantified by SEPs, although effects on pain ratings were less clear. Lower motor neuron excitability as quantified by F-waves was not affected. Previous relaxation training did not significantly improve RIII feedback training success.

  17. Quantifying fluvial topography using UAS imagery and SfM photogrammetry

    NASA Astrophysics Data System (ADS)

    Woodget, Amy; Carbonneau, Patrice; Visser, Fleur; Maddock, Ian; Habit, Evelyn

    2014-05-01

    The measurement and monitoring of fluvial topography at high spatial and temporal resolutions is in increasing demand for a range of river science and management applications, including change detection, hydraulic models, habitat assessments, river restorations and sediment budgets. Existing approaches are yet to provide a single technique for rapidly quantifying fluvial topography in both exposed and submerged areas with high spatial resolution, reach-scale continuous coverage, high accuracy and reasonable cost. In this paper, we explore the potential of imagery acquired from a small unmanned aerial system (UAS) and processed using Structure-from-Motion (SfM) photogrammetry for filling this gap. We use a rotary-winged hexacopter known as the Draganflyer X6, a consumer-grade digital camera (Panasonic Lumix DMC-LX3) and the commercially available PhotoScan Pro SfM software (Agisoft LLC). We test the approach on three contrasting river systems: a shallow margin of the San Pedro River in the Valdivia region of south-central Chile, the lowland River Arrow in Warwickshire, UK, and the upland Coledale Beck in Cumbria, UK. Digital elevation models (DEMs) and orthophotos of hyperspatial resolution (0.01-0.02 m) are produced. Mean elevation errors are found to vary somewhat between sites, dependent on vegetation coverage and the spatial arrangement of ground control points (GCPs) used to georeference the data. Mean errors are in the range 4-44 mm for exposed areas and 17-89 mm for submerged areas. Errors in submerged areas can be reduced to 4-56 mm with the application of a simple refraction correction procedure. Multiple surveys of the River Arrow site show consistently high-quality results, indicating the repeatability of the approach. This work therefore demonstrates the potential of a UAS-SfM approach for quantifying fluvial topography.
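
    A simple refraction correction of the sort mentioned above amounts to scaling apparent depth by the refractive index of water (n ≈ 1.34); a sketch on toy rasters, assuming a known water-surface elevation (not the authors' exact implementation):

        import numpy as np

        def correct_refraction(dem_apparent, water_surface, n_water=1.34):
            """Scale apparent depth by the refractive index of water for
            submerged cells: true bed elevation = water surface minus
            (apparent depth x n). Both inputs are elevation rasters (m)."""
            apparent_depth = water_surface - dem_apparent
            wet = apparent_depth > 0
            dem = dem_apparent.copy()
            dem[wet] = water_surface[wet] - apparent_depth[wet] * n_water
            return dem

        ws = np.full((2, 2), 10.0)                 # water surface elevation (m)
        dem = np.array([[9.8, 9.6], [10.2, 9.9]])  # apparent SfM elevations (m)
        print(correct_refraction(dem, ws))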