Science.gov

Sample records for quantified results show

  1. Quantifying causal emergence shows that macro can beat micro.

    PubMed

    Hoel, Erik P; Albantakis, Larissa; Tononi, Giulio

    2013-12-01

    Causal interactions within complex systems can be analyzed at multiple spatial and temporal scales. For example, the brain can be analyzed at the level of neurons, neuronal groups, and areas, over tens, hundreds, or thousands of milliseconds. It is widely assumed that, once a micro level is fixed, macro levels are fixed too, a relation called supervenience. It is also assumed that, although macro descriptions may be convenient, only the micro level is causally complete, because it includes every detail, thus leaving no room for causation at the macro level. However, this assumption can only be evaluated under a proper measure of causation. Here, we use a measure [effective information (EI)] that depends on both the effectiveness of a system's mechanisms and the size of its state space: EI is higher the more the mechanisms constrain the system's possible past and future states. By measuring EI at micro and macro levels in simple systems whose micro mechanisms are fixed, we show that for certain causal architectures EI can peak at a macro level in space and/or time. This happens when coarse-grained macro mechanisms are more effective (more deterministic and/or less degenerate) than the underlying micro mechanisms, to an extent that overcomes the smaller state space. Thus, although the macro level supervenes upon the micro, it can supersede it causally, leading to genuine causal emergence--the gain in EI when moving from a micro to a macro level of analysis.
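
    The EI calculation described above can be made concrete for a system specified by a transition probability matrix (TPM). A minimal Python sketch follows; the 4-state micro system and its 2-state coarse-graining are toy constructions chosen to reproduce the paper's qualitative point, not examples taken from it:

      import numpy as np

      def effective_information(tpm):
          # EI under a maximum-entropy (uniform) intervention distribution:
          # the average KL divergence between each row (effects of one state)
          # and the mean effect distribution, in bits.
          n = tpm.shape[0]
          p_effect = tpm.mean(axis=0)
          with np.errstate(divide="ignore", invalid="ignore"):
              terms = tpm * np.log2(tpm / p_effect)
          return np.nansum(terms) / n

      # Toy micro system: states 0-2 transition noisily among themselves,
      # state 3 is a deterministic fixed point (noisy and degenerate).
      micro = np.array([[1/3, 1/3, 1/3, 0.0],
                        [1/3, 1/3, 1/3, 0.0],
                        [1/3, 1/3, 1/3, 0.0],
                        [0.0, 0.0, 0.0, 1.0]])

      # Coarse-grain {0,1,2} -> A and {3} -> B: average rows, sum columns.
      groups = [[0, 1, 2], [3]]
      macro = np.array([[micro[np.ix_(g, h)].sum() / len(g) for h in groups]
                        for g in groups])

      print(effective_information(micro))  # ~0.81 bits
      print(effective_information(macro))  # 1.00 bits: the macro beats the micro

    Here the deterministic 2-state macro description gains about 0.19 bits of EI over its noisy micro substrate, a toy instance of the causal emergence the paper quantifies.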

  2. Storing CO2 underground shows promising results

    NASA Astrophysics Data System (ADS)

    Zweigel, Peter; Gale, John

    Long-term underground storage of CO2 is an important element in concepts to reduce atmospheric CO2 emissions as the use of fossil fuels continues. The first results of a multinational research project evaluating the injection of CO2 into a saline aquifer in the North Sea are validating this method of CO2 reduction, and are serving to further define the research needed to develop the technology for large-scale applicability. Reducing the emission of substances that have potentially harmful effects on global climate—for example, CO2—has become a central issue of environmental policy at least since the 1997 Kyoto conference on climate change.

  3. Different methods to quantify Listeria monocytogenes biofilm cells showed different profiles in their viability

    PubMed Central

    Winkelströter, Lizziane Kretli; Martinis, Elaine C.P. De

    2015-01-01

    Listeria monocytogenes is a foodborne pathogen able to adhere and form biofilms on several materials commonly present in food processing plants. The aim of this study was to evaluate the resistance of Listeria monocytogenes attached to an abiotic surface, after treatment with sanitizers, by a culture method, microscopy and Quantitative Real Time Polymerase Chain Reaction (qPCR). Biofilms of L. monocytogenes were obtained on stainless steel coupons immersed in Brain Heart Infusion Broth, under agitation at 37 °C for 24 h. The methods selected for this study were based on plate count, microscopic count with the aid of viability dyes (CTC-DAPI), and qPCR. Results of the culture method showed that peroxyacetic acid was effective in killing sessile L. monocytogenes populations, while sodium hypochlorite was only partially effective against attached L. monocytogenes (p < 0.05). When viability dyes (CTC/DAPI) combined with fluorescence microscopy and qPCR were used, lower counts were found after treatments (p < 0.05). Selective quantification of viable cells of L. monocytogenes by qPCR using EMA revealed that the pre-treatment with EMA was not appropriate, since it also inhibited amplification of DNA from live cells by ca. 2 log. Thus, the use of CTC counts was the best method to count viable cells in biofilms. PMID:26221112
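
    The divergence between the counting methods reported above is conventionally summarized as a log10 reduction. A minimal sketch with invented counts (not the study's data) shows why culture, CTC-DAPI, and qPCR can yield different viability profiles for the same treated biofilm:

      import numpy as np

      def log_reduction(before, after):
          # log10 drop in counts; e.g. 1e7 -> 1e3 cells/cm^2 is a 4-log kill.
          return np.log10(before) - np.log10(after)

      # Hypothetical counts (cells/cm^2) for one sanitized coupon.
      counts = {"plate count": (1e7, 2e2),  # culturable cells only
                "CTC-DAPI":    (1e7, 8e4),  # respiring cells seen by microscopy
                "qPCR":        (1e7, 5e6)}  # DNA persists after cell death
      for method, (b, a) in counts.items():
          print(f"{method}: {log_reduction(b, a):.1f}-log reduction")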

  4. 14. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF INADEQUATE TAMPING. THE SIZE OF THE GRANITE AGGREGATE USED IN THE DAM'S CONCRETE IS CLEARLY SHOWN. - Hume Lake Dam, Sequoia National Forest, Hume, Fresno County, CA

  5. Quantifying IOHDR brachytherapy underdosage resulting from an incomplete scatter environment

    SciTech Connect

    Raina, Sanjay; Avadhani, Jaiteerth S.; Oh, Moonseong; Malhotra, Harish K.; Jaggernauth, Wainwright; Kuettel, Michael R.; Podgorsak, Matthew B. . E-mail: matthew.podgorsak@roswellpark.org

    2005-04-01

    Purpose: Most brachytherapy planning systems are based on a dose calculation algorithm that assumes an infinite scatter environment surrounding the target volume and applicator. Dosimetric errors from this assumption are negligible. However, in intraoperative high-dose-rate brachytherapy (IOHDR) where treatment catheters are typically laid either directly on a tumor bed or within applicators that may have little or no scatter material above them, the lack of scatter from one side of the applicator can result in underdosage during treatment. This study was carried out to investigate the magnitude of this underdosage. Methods: IOHDR treatment geometries were simulated using a solid water phantom beneath an applicator with varying amounts of bolus material on the top and sides of the applicator to account for missing tissue. Treatment plans were developed for 3 different treatment surface areas (4 × 4, 7 × 7, 12 × 12 cm²), each with prescription points located at 3 distances (0.5 cm, 1.0 cm, and 1.5 cm) from the source dwell positions. Ionization measurements were made with a liquid-filled ionization chamber linear array with a dedicated electrometer and data acquisition system. Results: Measurements showed that the magnitude of the underdosage varies from about 8% to 13% of the prescription dose as the prescription depth is increased from 0.5 cm to 1.5 cm. This treatment error was found to be independent of the irradiated area and strongly dependent on the prescription distance. Furthermore, for a given prescription depth, measurements in planes parallel to an applicator at distances up to 4.0 cm from the applicator plane showed that the dose delivery error is equal in magnitude throughout the target volume. Conclusion: This study demonstrates the magnitude of underdosage in IOHDR treatments delivered in a geometry that may not result in a full scatter environment around the applicator. This implies that the target volume and, specifically, the prescription

  6. 13. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF POOR CONSTRUCTION WORK. THOUGH NOT A SERIOUS STRUCTURAL DEFICIENCY, THE 'HONEYCOMB' TEXTURE OF THE CONCRETE SURFACE WAS THE RESULT OF INADEQUATE TAMPING AT THE TIME OF THE INITIAL 'POUR'. - Hume Lake Dam, Sequoia National Forest, Hume, Fresno County, CA

  7. Emerging Trends in Contextual Learning Show Positive Results for Students.

    ERIC Educational Resources Information Center

    WorkAmerica, 2001

    2001-01-01

    This issue focuses on contextual learning (CL), in which students master rigorous academic content in real-world or work-based learning experiences. "Emerging Trends in CL Show Positive Results for Students" discusses CL as an important strategy for improving student achievement. It describes: how CL raises the bar for all students, challenging…

  8. Breast vibro-acoustography: initial results show promise

    PubMed Central

    2012-01-01

    Introduction Vibro-acoustography (VA) is a recently developed imaging modality that is sensitive to the dynamic characteristics of tissue. It detects low-frequency harmonic vibrations in tissue that are induced by the radiation force of ultrasound. Here, we have investigated applications of VA for in vivo breast imaging. Methods A recently developed combined mammography-VA system for in vivo breast imaging was tested on female volunteers, aged 25 years or older, with suspected breast lesions on their clinical examination. After mammography, a set of VA scans was acquired by the experimental device. In a masked assessment, VA images were evaluated independently by 3 reviewers who identified mass lesions and calcifications. The diagnostic accuracy of this imaging method was determined by comparing the reviewers' responses with clinical data. Results We collected images from 57 participants: 7 were used for training and 48 for evaluation of diagnostic accuracy (images from 2 participants were excluded because of unexpected imaging artifacts). In total, 16 malignant and 32 benign lesions were examined. Specificity for diagnostic accuracy was 94% or higher for all 3 reviewers, but sensitivity varied (69% to 100%). All reviewers were able to detect 97% of masses, but sensitivity for detection of calcification was lower (≤ 72% for all reviewers). Conclusions VA can be used to detect various breast abnormalities, including calcifications and benign and malignant masses, with relatively high specificity. VA technology may lead to a new clinical tool for breast imaging applications. PMID:23021305
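
    The diagnostic accuracy figures above reduce, per reviewer, to a 2x2 confusion table against the clinical reference standard. A minimal sketch of that arithmetic with made-up counts (the study's per-reviewer tables are not reproduced here):

      def sensitivity_specificity(tp, fn, tn, fp):
          # sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)
          return tp / (tp + fn), tn / (tn + fp)

      # Hypothetical reviewer facing 16 malignant and 32 benign lesions,
      # matching the study's case mix but not its actual calls.
      sens, spec = sensitivity_specificity(tp=12, fn=4, tn=30, fp=2)
      print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # 75%, 94%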

  9. Quantifying Fine Root Carbon Inputs To Soil: Results From Combining Radiocarbon And Traditional Methodologies.

    NASA Astrophysics Data System (ADS)

    Gaudinski, J. B.; Trumbore, S. E.; Dawson, T.; Torn, M.; Pregitzer, K.; Joslin, J. D.

    2002-12-01

    Estimates of high belowground net primary productivity (50% or more) in forest ecosystems are often based on assumptions that almost all fine roots (< 2 mm in diameter) live and die within one year. Recent radiocarbon (14C) measurements of fine root cellulose in three eastern temperate forests of the United States show that at least a portion of fine roots are living for more than 8 years (Gaudinski et al. 2001) and that fine root lifespans likely vary as a function of both diameter and position on the root branch system. New data from investigations under way in several different temperate forests further support the idea of large variations in root lifespans, with radiocarbon-derived ages ranging from approximately one year to several years. In forests where both mini-rhizotron and 14C lifespan estimates have been made, the two techniques agree well when the 14C sampling is made on the same types of roots viewed by mini-rhizotron cameras (i.e. first and second order roots; the most distal and newest roots on the root branching system), and the 14C signature of new root growth is known. We have quantified the signature of new tree roots by taking advantage of locally-elevated 14C at Oak Ridge, Tennessee, which shows that carbon making up new roots was photosynthesized approximately 1.5 years prior to new root growth. Position on the root branching system shows a correlation with age, with ages up to 7 years for 4th order roots of red maple. The method by which roots are sampled also affects the 14C-estimated age, with the total fine root population, sampled via soil cores, showing longer lifespans relative to roots sampled by position on the root branch system (when similar diameter classes are compared). Overall, the implication of our studies is that assumptions of turnover times of 1 year result in underestimates of the true lifespan of a large portion of fine root biomass in temperate forests. This suggests that future calculations of belowground net primary
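
    The radiocarbon ages rest on the post-bomb decline of atmospheric 14C: root cellulose Δ14C is matched to the year the atmosphere last held that value, and the offset from the sampling year estimates the age of the root's carbon. A minimal sketch with a synthetic, illustrative atmospheric curve (not the measured record):

      import numpy as np

      # Synthetic stand-in for the declining post-bomb atmospheric record (per mil).
      years = np.arange(1990, 2003)
      atm_d14c = np.linspace(155.0, 80.0, years.size)  # monotonic decline, invented

      def root_carbon_age(root_d14c, sample_year):
          # np.interp needs increasing x, so interpolate on the reversed arrays.
          fixation_year = np.interp(root_d14c, atm_d14c[::-1], years[::-1])
          return sample_year - fixation_year

      print(root_carbon_age(root_d14c=110.0, sample_year=2002))  # ~4.8 years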

  10. Quantifying Uncertainty in Model Predictions for the Pliocene (Plio-QUMP): Initial results

    USGS Publications Warehouse

    Pope, J.O.; Collins, M.; Haywood, A.M.; Dowsett, H.J.; Hunter, S.J.; Lunt, D.J.; Pickering, S.J.; Pound, M.J.

    2011-01-01

    Examination of the mid-Pliocene Warm Period (mPWP; ~3.3 to 3.0 Ma BP) provides an excellent opportunity to test the ability of climate models to reproduce warm climate states, thereby assessing our confidence in model predictions. To do this it is necessary to relate the uncertainty in model simulations of mPWP climate to uncertainties in projections of future climate change. The uncertainties introduced by the model can be estimated through the use of a Perturbed Physics Ensemble (PPE). Building on the UK Met Office Quantifying Uncertainty in Model Predictions (QUMP) Project, this paper presents the results from an initial investigation using the end members of a PPE in a fully coupled atmosphere-ocean model (HadCM3) running with appropriate mPWP boundary conditions. Prior work has shown that the unperturbed version of HadCM3 may underestimate mPWP sea surface temperatures at higher latitudes. Initial results indicate that neither the low sensitivity nor the high sensitivity simulations produce unequivocally improved mPWP climatology relative to the standard. Whilst the high sensitivity simulation was able to reconcile up to 6 °C of the data/model mismatch in sea surface temperatures in the high latitudes of the Northern Hemisphere (relative to the standard simulation), it did not produce a better prediction of global vegetation than the standard simulation. Overall the low sensitivity simulation was degraded compared to the standard and high sensitivity simulations in all aspects of the data/model comparison. The results have shown that a PPE has the potential to explore weaknesses in mPWP modelling simulations which have been identified by geological proxies, but that a 'best fit' simulation will more likely come from a full ensemble in which simulations that contain the strengths of the two end member simulations shown here are combined. © 2011 Elsevier B.V.

  11. Quantifying viruses and bacteria in wastewater—Results, interpretation methods, and quality control

    USGS Publications Warehouse

    Francy, Donna S.; Stelzer, Erin A.; Bushon, Rebecca N.; Brady, Amie M.G.; Mailot, Brian E.; Spencer, Susan K.; Borchardt, Mark A.; Elber, Ashley G.; Riddell, Kimberly R.; Gellner, Terry M.

    2011-01-01

    Membrane bioreactors (MBR), used for wastewater treatment in Ohio and elsewhere in the United States, have pore sizes small enough to theoretically reduce concentrations of protozoa and bacteria, but not viruses. Sampling for viruses in wastewater is seldom done and not required. Instead, the bacterial indicators Escherichia coli (E. coli) and fecal coliforms are the required microbial measures of effluents for wastewater-discharge permits. Information is needed on the effectiveness of MBRs in removing human enteric viruses from wastewaters, particularly as compared to conventional wastewater treatment before and after disinfection. A total of 73 regular and 28 quality-control (QC) samples were collected at three MBR and two conventional wastewater plants in Ohio during 23 regular and 3 QC sampling trips in 2008-10. Samples were collected at various stages in the treatment processes and analyzed for bacterial indicators E. coli, fecal coliforms, and enterococci by membrane filtration; somatic and F-specific coliphage by the single agar layer (SAL) method; adenovirus, enterovirus, norovirus GI and GII, rotavirus, and hepatitis A virus by molecular methods; and viruses by cell culture. While addressing the main objective of the study (comparing removal of viruses and bacterial indicators in MBR and conventional plants), it was realized that work was needed to identify data analysis and quantification methods for interpreting enteric virus and QC data. Therefore, methods for quantifying viruses, qualifying results, and applying QC data to interpretations are described in this report. During each regular sampling trip, samples were collected (1) before conventional or MBR treatment (post-preliminary), (2) after secondary or MBR treatment (post-secondary or post-MBR), (3) after tertiary treatment (one conventional plant only), and (4) after disinfection (post-disinfection). Glass-wool fiber filtration was used to concentrate enteric viruses from large volumes, and small
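
    One routine use of the QC data discussed in the report is correcting measured concentrations for method recovery estimated from matrix spikes. A minimal, hypothetical sketch of that arithmetic (the report's actual quantification and qualification rules may differ):

      def recovery_adjusted(measured, spike_measured, spike_expected):
          # Percent recovery from a matrix spike, then a recovery-corrected result.
          recovery = spike_measured / spike_expected
          return measured / recovery, recovery

      conc, rec = recovery_adjusted(measured=12.0,        # gene copies/L in sample
                                    spike_measured=45.0,  # copies/L recovered
                                    spike_expected=100.0) # copies/L spiked
      print(f"recovery {rec:.0%}; adjusted concentration {conc:.0f} copies/L")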

  12. Preliminary Results In Quantifying The Climatic Impact Forcing Factors Around 3 Ma Ago

    NASA Astrophysics Data System (ADS)

    Fluteau, F.; Ramstein, G.; Duringer, P.; Schuster, M.; Tiercelin, J. J.

    What exactly is the control of climate change on the development of the Hominids? Is it possible to quantify such changes? And which forcing factors create these changes? We use here a General Circulation Model to investigate the climate sensitivity to 3 different forcing factors: the uplift of the East African Rift, the extent (more than twenty times the present-day surface) of the Chad Lake, and ultimately we shall, with a coupled ocean-atmosphere GCM, test the effect of Indonesian throughflow changes. To achieve these goals, we need a multidisciplinary group to assess the evolution of the Rift and the extent of the Lake. We prescribe these different boundary conditions to the GCM and use a biome model to assess the vegetation changes. In this presentation we will only focus on the Rift uplift and the Chad Lake impacts on atmospheric circulation, monsoon and their environmental consequences in terms of vegetation changes.

  13. Astronomy Diagnostic Test Results Reflect Course Goals and Show Room for Improvement

    ERIC Educational Resources Information Center

    LoPresto, Michael C.

    2007-01-01

    The results of administering the Astronomy Diagnostic Test (ADT) to introductory astronomy students at Henry Ford Community College over three years have shown gains comparable with national averages. Results have also accurately corresponded to course goals, showing greater gains in topics covered in more detail, and lower gains in topics covered…

  14. Gun shows and gun violence: fatally flawed study yields misleading results.

    PubMed

    Wintemute, Garen J; Hemenway, David; Webster, Daniel; Pierce, Glenn; Braga, Anthony A

    2010-10-01

    A widely publicized but unpublished study of the relationship between gun shows and gun violence is being cited in debates about the regulation of gun shows and gun commerce. We believe the study is fatally flawed. A working paper entitled "The Effect of Gun Shows on Gun-Related Deaths: Evidence from California and Texas" outlined this study, which found no association between gun shows and gun-related deaths. We believe the study reflects a limited understanding of gun shows and gun markets and is not statistically powered to detect even an implausibly large effect of gun shows on gun violence. In addition, the research contains serious ascertainment and classification errors, produces results that are sensitive to minor specification changes in key variables and in some cases have no face validity, and is contradicted by 1 of its own authors' prior research. The study should not be used as evidence in formulating gun policy.

  15. Showing Value in Newborn Screening: Challenges in Quantifying the Effectiveness and Cost-Effectiveness of Early Detection of Phenylketonuria and Cystic Fibrosis

    PubMed Central

    Grosse, Scott D.

    2015-01-01

    Decision makers sometimes request information on the cost savings, cost-effectiveness, or cost-benefit of public health programs. In practice, quantifying the health and economic benefits of population-level screening programs such as newborn screening (NBS) is challenging. It requires that one specify the frequencies of health outcomes and events, such as hospitalizations, for a cohort of children with a given condition under two different scenarios—with or without NBS. Such analyses also assume that everything else, including treatments, is the same between groups. Lack of comparable data for representative screened and unscreened cohorts that are exposed to the same treatments following diagnosis can result in either under- or over-statement of differences. Accordingly, the benefits of early detection may be understated or overstated. This paper illustrates these common problems through a review of past economic evaluations of screening for two historically significant conditions, phenylketonuria and cystic fibrosis. In both examples qualitative judgments about the value of prompt identification and early treatment to an affected child were more influential than specific numerical estimates of lives or costs saved. PMID:26702401

  16. Image analysis techniques: Used to quantify and improve the precision of coatings testing results

    SciTech Connect

    Duncan, D.J.; Whetten, A.R.

    1993-12-31

    Coating evaluations often specify tests to measure performance characteristics rather than coating physical properties. These evaluation results are often very subjective. A new tool, Digital Video Image Analysis (DVIA), is successfully being used for two automotive evaluations: cyclic (scab) corrosion and the gravelometer (chip) test. An experimental design was done to evaluate variability and interactions among the instrumental factors. This analysis method has proved to be an order of magnitude more sensitive and reproducible than the current evaluations. Coating characteristics that previously had no way to be expressed can now be described and measured. For example, DVIA chip evaluations can differentiate how much damage was done to the topcoat, the primer, and even the metal. DVIA, with or without magnification, has the capability to become the quantitative measuring tool for several other coating evaluations, such as T-bends, wedge bends, acid etch analysis, coating defects, observing cure, and defect formation or elimination over time.

  17. Long-Term Trial Results Show No Mortality Benefit from Annual Prostate Cancer Screening

    Cancer.gov

    Thirteen-year follow-up data from the Prostate, Lung, Colorectal and Ovarian (PLCO) cancer screening trial show higher incidence but similar mortality among men screened annually with the prostate-specific antigen (PSA) test and digital rectal examination.

  18. Comparison of some results of program SHOW with other solar hot water computer programs

    NASA Astrophysics Data System (ADS)

    Young, M. F.; Baughn, J. W.

    Subroutines and the driver program for the simulation code SHOW (solar hot water) for solar thermosyphon systems are discussed, and simulations are compared with predictions by the F-CHART and TRNSYS codes. SHOW has the driver program MAIN, which defines the system control logic for choosing the appropriate system subroutine for analysis. Ten subroutines are described, which account for the solar system physical parameters, the weather data, the manufacturer-supplied system specifications, mass flow rates, pumped systems, total transformed radiation, load use profiles, stratification in storage, an electric water heater, and economic analyses. The three programs are employed to analyze a thermosyphon installation in Sacramento with two storage tanks. TRNSYS and SHOW were in agreement and lower than F-CHART for annual predictions, although significantly more computer time was necessary to make TRNSYS converge.
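
    The abstract describes MAIN as control logic that selects the appropriate system subroutine. A minimal Python sketch of that dispatch structure (subroutine names and parameters are invented for illustration and are not SHOW's actual interfaces):

      # Hypothetical stand-ins for SHOW's system subroutines.
      def thermosyphon(params):
          return f"thermosyphon model, {params['tanks']} storage tank(s)"

      def pumped(params):
          return f"pumped-system model, {params['tanks']} storage tank(s)"

      SYSTEM_MODELS = {"thermosyphon": thermosyphon, "pumped": pumped}

      def main(system_type, params):
          # The driver picks the subroutine for the configured system, as MAIN does.
          return SYSTEM_MODELS[system_type](params)

      print(main("thermosyphon", {"tanks": 2}))  # two-tank case, as in Sacramento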

  19. Data for behavioral results and brain regions showing a time effect during pair-association retrieval.

    PubMed

    Jimura, Koji; Hirose, Satoshi; Wada, Hiroyuki; Yoshizawa, Yasunori; Imai, Yoshio; Akahane, Masaaki; Machida, Toru; Shirouzu, Ichiro; Koike, Yasuharu; Konishi, Seiki

    2016-09-01

    The current data article provides behavioral and neuroimaging data for the research article "Relatedness-dependent rapid development of brain activity in anterior temporal cortex during pair-association retrieval" (Jimura et al., 2016) [1]. Behavioral performance is provided in a table; Fig. 2 of the article is based on this table. Brain regions showing a time effect are provided in a table; a statistical activation map for the time effect is shown in Fig. 3C of the article. PMID:27508239

  20. Stem cells show promising results for lymphoedema treatment--a literature review.

    PubMed

    Toyserkani, Navid Mohamadpour; Christensen, Marlene Louise; Sheikh, Søren Paludan; Sørensen, Jens Ahm

    2015-04-01

    Lymphoedema is a debilitating condition, manifesting in excess lymphatic fluid and swelling of subcutaneous tissues. Lymphoedema is as yet an incurable condition and current treatment modalities are not satisfactory. The capacity of mesenchymal stem cells to promote angiogenesis, secrete growth factors, regulate the inflammatory process, and differentiate into multiple cell types makes them a potentially ideal therapy for lymphoedema. Adipose tissue is the richest and most accessible source of mesenchymal stem cells and they can be harvested, isolated, and used for therapy in a single-stage procedure as an autologous treatment. The aim of this paper was to review all studies using mesenchymal stem cells for lymphoedema treatment, with a special focus on the potential use of adipose-derived stem cells. A systematic search was performed and five preclinical and two clinical studies were found. Different stem cell sources and lymphoedema models were used in the described studies. Most studies showed a decrease in lymphoedema and an increased lymphangiogenesis when treated with stem cells, and this treatment modality has so far shown great potential. The present studies are, however, subject to bias, and more preclinical studies and large-scale high-quality clinical trials are needed to show if this emerging therapy can satisfy expectations.

  21. Aortic emboli show surprising size dependent predilection for cerebral arteries: Results from computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Carr, Ian; Schwartz, Robert; Shadden, Shawn

    2012-11-01

    Cardiac emboli can have devastating consequences if they enter the cerebral circulation, and are the most common cause of embolic stroke. Little is known about the relationships of embolic origin/density/size to cerebral events, as these relationships are difficult to observe. To better understand stroke risk from cardiac and aortic emboli, we developed a computational model to track emboli from the heart to the brain. Patient-specific models of the human aorta and arteries to the brain were derived from CT angiography from 10 MHIF patients. Blood flow was modeled by the Navier-Stokes equations using pulsatile inflow at the aortic valve, and physiologic Windkessel models at the outlets. Particulate was injected at the aortic valve and tracked using modified Maxey-Riley equations with a wall collision model. Results demonstrate that aortic emboli that entered the cerebral circulation through the carotid or vertebral arteries were localized to specific locations of the proximal aorta. The percentage of released particles embolic to the brain markedly increased with particle size from 0 to ~1-1.5 mm in all patients. Larger particulate became less likely to traverse the cerebral vessels. These findings are consistent with sparse literature based on transesophageal echo measurements. This work was supported in part by the National Science Foundation, award number 1157041.
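
    A drastically simplified, drag-only caricature of the particle tracking described above: the study integrates modified Maxey-Riley equations in a patient-specific 3-D flow, while this sketch advects one particle through a fixed analytic 2-D field with Stokes drag (all numbers are illustrative):

      import numpy as np

      def stokes_tau(d, rho_p=1.06e3, mu=3.5e-3):
          # Particle response time for Stokes drag (SI units, blood-like viscosity).
          return rho_p * d**2 / (18.0 * mu)

      def flow(x):
          # Toy steady velocity field standing in for the computed aortic flow.
          return np.array([0.5, 0.1 * np.sin(4.0 * x[0])])

      def track(x0, v0, d, dt=1e-4, steps=20000):
          x, v = np.array(x0, float), np.array(v0, float)
          tau = stokes_tau(d)
          for _ in range(steps):
              v += (flow(x) - v) / tau * dt  # drag relaxes v toward fluid velocity
              x += v * dt
          return x

      print(track(x0=[0.0, 0.0], v0=[0.0, 0.0], d=1e-3))  # 1 mm embolus, 2 s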

  22. Animation shows promise in initiating timely cardiopulmonary resuscitation: results of a pilot study.

    PubMed

    Attin, Mina; Winslow, Katheryn; Smith, Tyler

    2014-04-01

    Delayed responses during cardiac arrest are common. Timely interventions during cardiac arrest have a direct impact on patient survival. Integration of technology in nursing education is crucial to enhance teaching effectiveness. The goal of this study was to investigate the effect of animation on nursing students' response time to cardiac arrest, including initiation of timely chest compression. Nursing students were randomized into experimental and control groups prior to practicing in a high-fidelity simulation laboratory. The experimental group was educated, by discussion and animation, about the importance of starting cardiopulmonary resuscitation upon recognizing an unresponsive patient. Afterward, a discussion session allowed students in the experimental group to gain more in-depth knowledge about the most recent changes in the cardiac resuscitation guidelines from the American Heart Association. A linear mixed model was run to investigate differences in time of response between the experimental and control groups while controlling for differences in those with additional degrees, prior code experience, and basic life support certification. The experimental group had a faster response time compared with the control group and initiated timely cardiopulmonary resuscitation upon recognition of deteriorating conditions (P < .0001). The results demonstrated the efficacy of combined teaching modalities for timely cardiopulmonary resuscitation. Providing opportunities for repetitious practice when a patient's condition is deteriorating is crucial for teaching safe practice.
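
    The analysis described is a linear mixed model of response time with fixed effects for group and the listed covariates and a grouping structure for clustered observations. A minimal statsmodels sketch on synthetic data; the variable names and the random-intercept choice are assumptions, not the study's specification:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Synthetic stand-in data: response time (s) per student, nested in cohorts.
      rng = np.random.default_rng(3)
      n = 120
      df = pd.DataFrame({
          "rt": rng.normal(25.0, 5.0, n),
          "group": rng.choice(["animation", "control"], n),
          "bls": rng.choice([0, 1], n),          # basic life support certified
          "prior_code": rng.choice([0, 1], n),   # prior code experience
          "cohort": rng.choice(["A", "B", "C", "D"], n),
      })

      # Fixed effects for group and covariates; random intercept per cohort.
      result = smf.mixedlm("rt ~ group + bls + prior_code", df,
                           groups=df["cohort"]).fit()
      print(result.summary())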

  23. QUantifying the Aerosol Direct and Indirect Effect over Eastern Mediterranean from Satellites (QUADIEEMS): Overview and preliminary results

    NASA Astrophysics Data System (ADS)

    Georgoulias, Aristeidis K.; Zanis, Prodromos; Pöschl, Ulrich; Kourtidis, Konstantinos A.; Alexandri, Georgia; Ntogras, Christos; Marinou, Eleni; Amiridis, Vassilis

    2013-04-01

    An overview and preliminary results from the research implemented within the framework of the QUADIEEMS project are presented. For the scopes of the project, satellite data from five sensors (MODIS aboard EOS TERRA, MODIS aboard EOS AQUA, TOMS aboard Earth Probe, OMI aboard EOS AURA and CALIOP aboard CALIPSO) are used in conjunction with meteorological data from ECMWF ERA-Interim reanalysis and data from a global chemical-aerosol-transport model as well as simulation results from a regional climate model (RegCM4) coupled with a simplified aerosol scheme. QUADIEEMS focuses on the Eastern Mediterranean [30°N-45°N, 17.5°E-37.5°E], a region situated at the crossroad of different aerosol types and thus ideal for the investigation of the direct and indirect effects of various aerosol types at a high spatial resolution. The project consists of five components. First, raw data from various databases are acquired, analyzed and spatially homogenized, with the outcome being a high-resolution (0.1x0.1 degree) and a moderate-resolution (1.0x1.0 degree) gridded dataset of aerosol and cloud optical properties. The marine, dust and anthropogenic fractions of aerosols over the region are quantified making use of the homogenized dataset. Regional climate model simulations with RegCM4/aerosol are also implemented for the greater European region for the period 2000-2010 at a resolution of 50 km. RegCM4's ability to simulate AOD550 over Europe is evaluated. The aerosol-cloud relationships, for sub-regions of Eastern Mediterranean characterized by the presence of predominant aerosol types, are examined. The aerosol-cloud relationships are also examined taking into account the relative position of aerosol and cloud layers as defined by CALIPSO observations. Within the final component of the project, results and data that emerged from all the previous components are used in satellite-based parameterizations in order to quantify the direct and indirect (first) radiative effect of the different

  24. Quantifying geological processes on Mars-Results of the high resolution stereo camera (HRSC) on Mars Express

    NASA Astrophysics Data System (ADS)

    Jaumann, R.; Tirsch, D.; Hauber, E.; Ansan, V.; Di Achille, G.; Erkeling, G.; Fueten, F.; Head, J.; Kleinhans, M. G.; Mangold, N.; Michael, G. G.; Neukum, G.; Pacifici, A.; Platz, T.; Pondrelli, M.; Raack, J.; Reiss, D.; Williams, D. A.; Adeli, S.; Baratoux, D.; de Villiers, G.; Foing, B.; Gupta, S.; Gwinner, K.; Hiesinger, H.; Hoffmann, H.; Deit, L. Le; Marinangeli, L.; Matz, K.-D.; Mertens, V.; Muller, J. P.; Pasckert, J. H.; Roatsch, T.; Rossi, A. P.; Scholten, F.; Sowe, M.; Voigt, J.; Warner, N.

    2015-07-01

    This review summarizes the use of High Resolution Stereo Camera (HRSC) data as an instrumental tool and its application in the analysis of geological processes and landforms on Mars during the last 10 years of operation. High-resolution digital elevation models on a local to regional scale are the unique strength of the HRSC instrument. The analysis of these data products enabled quantifying geological processes such as effusion rates of lava flows, tectonic deformation, discharge of water in channels, formation timescales of deltas, and geometry of sedimentary deposits, as well as estimating the age of geological units by crater size-frequency distribution measurements. Both the quantification of geological processes and the age determination allow constraining the evolution of Martian geologic activity in space and time. A second major contribution of HRSC is the discovery of episodicity in the intensity of geological processes on Mars. This has been revealed by comparative age dating of volcanic, fluvial, glacial, and lacustrine deposits. Volcanic processes on Mars have been active over more than 4 Gyr, with peak phases in all three geologic epochs, generally ceasing towards the Amazonian. Fluvial and lacustrine activity phases span the time from the Noachian until the Amazonian, but detailed studies show that they have been interrupted by multiple and long-lasting phases of quiescence. Glacial activity also shows discrete phases of enhanced intensity that may correlate with periods of increased spin-axis obliquity. The episodicity of geological processes like volcanism, erosion, and glaciation on Mars reflects close correlation between surface processes and endogenic activity as well as orbit variations and changing climate conditions.

  25. Quantifying microwear on experimental Mistassini quartzite scrapers: preliminary results of exploratory research using LSCM and scale-sensitive fractal analysis.

    PubMed

    Stemp, W James; Lerner, Harry J; Kristant, Elaine H

    2013-01-01

    Although previous use-wear studies involving quartz and quartzite have been undertaken by archaeologists, these are comparatively few in number. Moreover, there has been relatively little effort to quantify use-wear on stone tools made from quartzite. The purpose of this article is to determine the effectiveness of a measurement system, laser scanning confocal microscopy (LSCM), to document the surface roughness or texture of experimental Mistassini quartzite scrapers used on two different contact materials (fresh and dry deer hide). As in previous studies using LSCM on chert, flint, and obsidian, this exploratory study incorporates a mathematical algorithm that permits the discrimination of surface roughness based on comparisons at multiple scales. Specifically, we employ measures of relative area (RelA) coupled with the F-test to discriminate used from unused stone tool surfaces, as well as surfaces of quartzite scrapers used on dry and fresh deer hide. Our results further demonstrate the effect of raw material variation on use-wear formation and its documentation using LSCM and RelA. PMID:22688593
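
    Relative area compares a surface's measured (3-D) area to its projected area, recomputed as the measurement mesh is coarsened. A minimal sketch of that idea on synthetic height maps, with a variance-ratio F-test between groups; the gradient-form area estimate and the random-noise "surfaces" are simplifications, not the scale-sensitive fractal analysis actually used:

      import numpy as np
      from scipy import stats

      def relative_area(z, spacing):
          # Measured area via sqrt(1 + zx^2 + zy^2) dA, over projected area.
          zy, zx = np.gradient(z, spacing)
          measured = np.sqrt(1.0 + zx**2 + zy**2).sum() * spacing**2
          return measured / (z.size * spacing**2)

      def rela_profile(z, spacing, scales=(1, 2, 4, 8)):
          # Coarsen the grid by subsampling to evaluate RelA at several scales.
          return [relative_area(z[::k, ::k], spacing * k) for k in scales]

      rng = np.random.default_rng(0)
      used   = [rela_profile(rng.normal(0, 0.3, (256, 256)), 0.5) for _ in range(5)]
      unused = [rela_profile(rng.normal(0, 0.6, (256, 256)), 0.5) for _ in range(5)]

      # Two-sided variance-ratio F-test on finest-scale RelA, used vs. unused.
      a, b = np.array(used)[:, 0], np.array(unused)[:, 0]
      F = np.var(a, ddof=1) / np.var(b, ddof=1)
      p = 2 * min(stats.f.cdf(F, a.size - 1, b.size - 1),
                  stats.f.sf(F, a.size - 1, b.size - 1))
      print(F, p)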

  26. Mitochondrial DNA transmitted from sperm in the blue mussel Mytilus galloprovincialis showing doubly uniparental inheritance of mitochondria, quantified by real-time PCR.

    PubMed

    Sano, Natsumi; Obata, Mayu; Komaru, Akira

    2010-07-01

    Doubly uniparental inheritance (DUI) of mitochondrial DNA transmission to progeny has been reported in the mussel, Mytilus. In DUI, males have both paternally (M type) and maternally (F type) transmitted mitochondrial DNA (mtDNA), but females have only the F type. To estimate how much M type mtDNA enters the egg with sperm in the DUI system, ratios of M type to F type mtDNA were measured before and after fertilization. M type mtDNA content in eggs increased markedly after fertilization. Similar patterns in M type content changes after fertilization were observed in crosses using the same males. To compare mtDNA quantities, we subsequently measured the ratios of mtDNA to the 28S ribosomal RNA gene (an endogenous control sequence) in sperm or unfertilized eggs using a real-time polymerase chain reaction (PCR) assay. F type content in unfertilized eggs was greater than the M type in sperm by about 1000-fold on average. M type content in spermatozoa was greater than in unfertilized eggs, but their distributions overlapped. These results may explain the post-fertilization changes in zygotic M type content. We previously demonstrated that paternal and maternal M type mtDNAs are transmitted to offspring, and hypothesized that the paternal M type contributed to M type transmission to the next generation more than the maternal type did. These quantitative data on M and F type mtDNA in sperm and eggs provide further support for that hypothesis.
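
    Ratios such as mtDNA-to-28S from real-time PCR are conventionally derived from threshold cycles with an efficiency correction. A minimal sketch with invented Ct values (the paper's exact quantification scheme is not reproduced here):

      def relative_copies(ct_target, ct_reference, efficiency=2.0):
          # Each cycle multiplies template by `efficiency`, so equal fluorescence
          # thresholds imply ratio = E ** (Ct_reference - Ct_target).
          return efficiency ** (ct_reference - ct_target)

      # Hypothetical threshold cycles for one unfertilized egg extract.
      f_vs_28s = relative_copies(ct_target=21.0, ct_reference=24.0)  # 8x reference
      m_vs_28s = relative_copies(ct_target=31.0, ct_reference=24.0)  # 1/128 of it
      print(f_vs_28s / m_vs_28s)  # F-to-M ratio ~1024, i.e. roughly 1000-fold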

  27. Quantifying contextuality.

    PubMed

    Grudka, A; Horodecki, K; Horodecki, M; Horodecki, P; Horodecki, R; Joshi, P; Kłobus, W; Wójcik, A

    2014-03-28

    Contextuality is central both to the foundations of quantum theory and to novel information processing tasks. Despite some recent proposals, it still faces a fundamental problem: how to quantify its presence? In this work, we provide a universal framework for quantifying contextuality. We conduct two complementary approaches: (i) the bottom-up approach, where we introduce a communication game, which grasps the phenomenon of contextuality in a quantitative manner; (ii) the top-down approach, where we just postulate two measures, relative entropy of contextuality and contextuality cost, analogous to existing measures of nonlocality (a special case of contextuality). We then match the two approaches by showing that the measure emerging from the communication scenario turns out to be equal to the relative entropy of contextuality. Our framework allows for the quantitative, resource-type comparison of completely different games. We give analytical formulas for the proposed measures for some contextual systems, showing in particular that the Peres-Mermin game is by an order of magnitude more contextual than that of Klyachko et al. Furthermore, we explore properties of these measures such as monotonicity or additivity. PMID:24724629

  28. Seeking to quantify the ferromagnetic-to-antiferromagnetic interface coupling resulting in exchange bias with various thin-film conformations

    SciTech Connect

    Hsiao, C. H.; Wang, S.; Ouyang, H.; Desautels, R. D.; Lierop, J. van; Lin, K. W.

    2014-08-07

    Ni₃Fe/(Ni,Fe)O thin films with bilayer and nanocrystallite dispersion morphologies are prepared with a dual ion beam deposition technique permitting precise control of nanocrystallite growth, composition, and admixtures. A bilayer morphology provides a Ni₃Fe-to-NiO interface, while the dispersion films have different mixtures of Ni₃Fe, NiO, and FeO nanocrystallites. Using detailed analyses of high resolution transmission electron microscopy images with Multislice simulations, the nanocrystallites' structures and phases are determined, and the intermixing between the Ni₃Fe, NiO, and FeO interfaces is quantified. From field-cooled hysteresis loops, the exchange bias loop shift from spin interactions at the interfaces is determined. With similar interfacial molar ratios of FM-to-AF, we find the exchange bias field essentially unchanged. However, when the interfacial ratio of FM to AF was FM rich, the exchange bias field increases. Since the FM/AF interface 'contact' areas in the nanocrystallite dispersion films are larger than those of the bilayer film, and the nanocrystallite dispersions exhibit larger FM-to-AF interfacial contributions to the magnetism, we attribute the changes in the exchange bias to increases in the interfacial segments that suffer defects (such as vacancies and bond distortions), which also affect the coercive fields.
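
    Exchange bias and coercivity are read off a field-cooled hysteresis loop from its two zero-magnetization crossing fields. A minimal sketch of that arithmetic with illustrative numbers (not the measured loops):

      def loop_fields(h_left, h_right):
          # Coercive field: half the loop width; exchange bias: loop centre offset.
          h_c = (h_right - h_left) / 2.0
          h_eb = (h_right + h_left) / 2.0
          return h_c, h_eb

      # Hypothetical zero-crossing fields (Oe) of a field-cooled loop.
      h_c, h_eb = loop_fields(h_left=-220.0, h_right=120.0)
      print(f"Hc = {h_c:.0f} Oe, Heb = {h_eb:.0f} Oe")  # Hc = 170 Oe, Heb = -50 Oe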

  29. Comparison of gas analyzers for quantifying eddy covariance fluxes - results from an irrigated alfalfa field in Davis, CA

    NASA Astrophysics Data System (ADS)

    Chan, S.; Biraud, S.; Polonik, P.; Billesbach, D.; Hanson, C. V.; Bogoev, I.; Conrad, B.; Alstad, K. P.; Burba, G. G.; Li, J.

    2015-12-01

    The eddy covariance technique requires simultaneous, rapid measurements of wind components and scalars (e.g., water vapor, carbon dioxide) to calculate the vertical exchange due to turbulent processes. The technique has been used extensively as a non-intrusive means to quantify land-atmosphere exchanges of mass and energy. A variety of sensor technologies and gas sampling designs have been tried. Gas concentrations are commonly measured using infrared or laser absorption spectroscopy. Open-path sensors directly sample the ambient environment but suffer when the sample volume is obstructed (e.g., rain, dust). Closed-path sensors utilize pumps to draw air into the analyzer through inlet tubes which can attenuate the signal. Enclosed-path sensors are a newer hybrid of the open- and closed-path designs where the sensor is mounted in the environment and the sample is drawn through a short inlet tube with short residence time. Five gas analyzers were evaluated as part of this experiment: open-path LI-COR 7500A, enclosed-path LI-COR 7200, closed-path Picarro G2311-f, open-path Campbell Scientific IRGASON, and enclosed-path Campbell Scientific EC155. We compared the relative performance of the gas analyzers over an irrigated alfalfa field in Davis, CA. The field was host to a range of ancillary measurements, including below-ground sensors and a weighing lysimeter. The crop was flood irrigated and harvested monthly. To compare sensors, we evaluated the half-hour mean and variance of gas concentrations (or mole densities). Power spectra for the gas analyzers and turbulent fluxes (from a common sonic anemometer) were also calculated and analyzed. Eddy covariance corrections will be discussed as they relate to sensor design (e.g., density corrections, signal attenuation).
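
    At its core, the eddy covariance flux is the covariance of fluctuations in vertical wind and the scalar over an averaging interval. A minimal sketch with synthetic data and simple block averaging, omitting all of the corrections the abstract alludes to (density, attenuation, coordinate rotation):

      import numpy as np

      def ec_flux(w, c):
          # F = mean(w'c'), primes being departures from the interval mean.
          return np.mean((w - w.mean()) * (c - c.mean()))

      # Synthetic 30-min record at 10 Hz: vertical wind (m/s), CO2 (mg/m^3).
      rng = np.random.default_rng(1)
      w = rng.normal(0.0, 0.3, 18000)
      c = 700.0 - 50.0 * w + rng.normal(0.0, 5.0, 18000)  # anticorrelated
      print(ec_flux(w, c))  # negative: net CO2 uptake by the alfalfa canopy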

  30. Quantifying stomatal and non-stomatal limitations to carbon assimilation resulting from leaf aging and drought in mature deciduous tree species.

    PubMed

    Wilson, Kell B.; Baldocchi, Dennis D.; Hanson, Paul J.

    2000-06-01

    Gas exchange techniques were used to investigate light-saturated carbon assimilation and its stomatal and non-stomatal limitations over two seasons in mature trees of five species in a closed deciduous forest. Stomatal and non-stomatal contributions to decreases in assimilation resulting from leaf age and drought were quantified relative to the maximum rates obtained early in the season at optimal soil water contents. Although carbon assimilation, stomatal conductance and photosynthetic capacity (V(cmax)) decreased with leaf age, decreases in V(cmax) accounted for about 75% of the leaf-age related reduction in light-saturated assimilation rates, with a secondary role for stomatal conductance (around 25%). However, when considered independently from leaf age, the drought response was dominated by stomatal limitations, accounting for about 75% of the total limitation. Some of the analytical difficulties associated with computing limitation partitioning are discussed, including path dependence, patchy stomatal closure and diffusion in the mesophyll. Although these considerations may introduce errors in our estimates, our analysis establishes some reasonable boundaries on relative limitations and shows differences between drought and non-drought years. Estimating seasonal limitations under natural conditions, as shown in this study, provides a useful basis for comparing limitation processes between years and species.

  31. Quantifying the effect of crop surface albedo variability on GHG budgets in a life cycle assessment approach: methodology and results.

    NASA Astrophysics Data System (ADS)

    Ferlicoq, Morgan; Ceschia, Eric; Brut, Aurore; Tallec, Tiphaine

    2013-04-01

    We tested a new method to estimate the radiative forcing of several crops at the annual and rotation scales, using local measurement data from two ICOS experimental sites. We used jointly 1) the radiative forcing caused by greenhouse gas (GHG) net emissions, calculated by using a Life Cycle Analysis (LCA) approach and in situ measurements (Ceschia et al. 2010), and 2) the radiative forcing caused by rapid changes in surface albedo typical of those ecosystems and resulting from management and crop phenology. The carbon and GHG budgets (GHGB) of 2 crop sites with contrasted management located in South West France (Auradé and Lamasquère sites) were estimated over a complete rotation by combining a classical LCA approach with on-site flux measurements. At both sites, carbon inputs (organic fertilisation and seeds), carbon exports (harvest) and net ecosystem production (NEP), measured with the eddy covariance technique, were calculated. The variability of the different terms and their relative contributions to the net ecosystem carbon budget (NECB) were analysed for all site-years, and the effect of management on NECB was assessed. To account for GHG fluxes that were not directly measured on site, we estimated the emissions caused by field operations (EFO) for each site using emission factors from the literature. The EFO were added to the NECB to calculate the total GHGB for a range of cropping systems and management regimes. N2O emissions were calculated following the IPCC (2007) guidelines, and CH4 emissions were assumed to be negligible compared to other contributions to the net GHGB. Additionally, albedo was calculated continuously using the short-wave incident and reflected radiation measurements in the field (0.3-3 µm) from CNR1 sensors. Mean annual differences in albedo and deduced radiative forcing from a reference value were then compared for all site-years. Mean annual differences in radiative forcing were then converted into g C equivalent m-2 in order
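
    Surface albedo from paired radiometers is the reflected-to-incident shortwave ratio, and the albedo term of the forcing follows from the difference against a reference albedo. A minimal sketch with invented daily means; the study's conversion of forcing into g C equivalents is not reproduced:

      def surface_albedo(sw_in, sw_out):
          # Reflected / incident shortwave (the CNR1's 0.3-3 um band).
          return sw_out / sw_in

      def albedo_forcing(sw_in, albedo, albedo_ref):
          # Local shortwave forcing (W/m^2); negative if brighter than reference.
          return sw_in * (albedo_ref - albedo)

      alpha = surface_albedo(sw_in=180.0, sw_out=41.4)      # daily means, W/m^2
      print(alpha)                                          # 0.23
      print(albedo_forcing(180.0, alpha, albedo_ref=0.20))  # -5.4 W/m^2, cooling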

  32. Quantifying resilience

    USGS Publications Warehouse

    Allen, Craig R.; Angeler, David G.

    2016-01-01

    Several frameworks to operationalize resilience have been proposed. A decade ago, a special feature focused on quantifying resilience was published in the journal Ecosystems (Carpenter, Westley & Turner 2005). The approach there was towards identifying surrogates of resilience, but few of the papers proposed quantifiable metrics. Consequently, many ecological resilience frameworks remain vague and difficult to quantify, a problem that this special feature aims to address. However, considerable progress has been made during the last decade (e.g. Pope, Allen & Angeler 2014). Although some argue that resilience is best kept as an unquantifiable, vague concept (Quinlan et al. 2016), to be useful for managers, there must be concrete guidance regarding how and what to manage and how to measure success (Garmestani, Allen & Benson 2013; Spears et al. 2015). Ideas such as ‘resilience thinking’ have utility in helping stakeholders conceptualize their systems, but provide little guidance on how to make resilience useful for ecosystem management, other than suggesting an ambiguous, Goldilocks approach of being just right (e.g. diverse, but not too diverse; connected, but not too connected). Here, we clarify some prominent resilience terms and concepts, introduce and synthesize the papers in this special feature on quantifying resilience and identify core unanswered questions related to resilience.

  33. Native trees show conservative water use relative to invasive trees: results from a removal experiment in a Hawaiian wet forest

    PubMed Central

    Cavaleri, Molly A.; Ostertag, Rebecca; Cordell, Susan; Sack, Lawren

    2014-01-01

    While the supply of freshwater is expected to decline in many regions in the coming decades, invasive plant species, often ‘high water spenders’, are greatly expanding their ranges worldwide. In this study, we quantified the ecohydrological differences between native and invasive trees and also the effects of woody invasive removal on plot-level water use in a heavily invaded mono-dominant lowland wet tropical forest on the Island of Hawaii. We measured transpiration rates of co-occurring native and invasive tree species with and without woody invasive removal treatments. Twenty native Metrosideros polymorpha and 10 trees each of three invasive species, Cecropia obtusifolia, Macaranga mappa and Melastoma septemnervium, were instrumented with heat-dissipation sap-flux probes in four 100 m2 plots (two invaded, two removal) for 10 months. In the invaded plots, where both natives and invasives were present, Metrosideros had the lowest sap-flow rates per unit sapwood, but the highest sap-flow rates per whole tree, owing to its larger mean diameter than the invasive trees. Stand-level water use within the removal plots was half that of the invaded plots, even though the removal of invasives caused a small but significant increase in compensatory water use by the remaining native trees. By investigating the effects of invasive species on ecohydrology and comparing native vs. invasive physiological traits, we not only gain understanding about the functioning of invasive species, but we also highlight potential water-conservation strategies for heavily invaded mono-dominant tropical forests worldwide. Native-dominated forests free of invasive species can be conservative in overall water use, providing a strong rationale for the control of invasive species and preservation of native-dominated stands. PMID:27293637

  34. Genomic and Enzymatic Results Show Bacillus cellulosilyticus Uses a Novel Set of LPXTA Carbohydrases to Hydrolyze Polysaccharides

    PubMed Central

    Mead, David; Drinkwater, Colleen; Brumm, Phillip J.

    2013-01-01

    Background Alkaliphilic Bacillus species are intrinsically interesting due to the bioenergetic problems posed by growth at high pH and high salt. Three alkaline cellulases have been cloned, sequenced and expressed from Bacillus cellulosilyticus N-4 (Bcell) making it an excellent target for genomic sequencing and mining of biomass-degrading enzymes. Methodology/Principal Findings The genome of Bcell is a single chromosome of 4.7 Mb with no plasmids present and three large phage insertions. The most unusual feature of the genome is the presence of 23 LPXTA membrane anchor proteins; 17 of these are annotated as involved in polysaccharide degradation. These two values are significantly higher than seen in any other Bacillus species. This high number of membrane anchor proteins is seen only in pathogenic Gram-positive organisms such as Listeria monocytogenes or Staphylococcus aureus. Bcell also possesses four sortase D subfamily 4 enzymes that incorporate LPXTA-bearing proteins into the cell wall; three of these are closely related to each other and unique to Bcell. Cell fractionation and enzymatic assay of Bcell cultures show that the majority of polysaccharide degradation is associated with the cell wall LPXTA-enzymes, an unusual feature in Gram-positive aerobes. Genomic analysis and growth studies both strongly argue against Bcell being a truly cellulolytic organism, in spite of its name. Preliminary results suggest that fungal mycelia may be the natural substrate for this organism. Conclusions/Significance Bacillus cellulosilyticus N-4, in spite of its name, does not possess any of the genes necessary for crystalline cellulose degradation, demonstrating the risk of classifying microorganisms without the benefit of genomic analysis. Bcell is the first Gram-positive aerobic organism shown to use predominantly cell-bound, non-cellulosomal enzymes for polysaccharide degradation. The LPXTA-sortase system utilized by Bcell may have applications both in anchoring
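
    Sortase substrates like the LPXTA proteins counted above carry their sorting motif near the C-terminus, so a first-pass screen is a simple pattern scan. A minimal sketch (a toy filter, not the annotation pipeline used for the genome; the sequences are invented):

      import re

      # X in LPXTA is any residue; require the motif near the C-terminus, where
      # the cell-wall sorting signal of sortase substrates sits.
      MOTIF = re.compile(r"LP.TA")

      def has_anchor(protein_seq, window=40):
          return bool(MOTIF.search(protein_seq[-window:]))

      # Hypothetical protein sequences (not Bcell gene products).
      seqs = {"endoglucanase_1": "MKK" + "A" * 500 + "LPNTAGSSKRRR",
              "cytosolic_enzyme": "MST" + "G" * 300}
      print({name: has_anchor(s) for name, s in seqs.items()})  # True / False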

  35. Development and application of methods to quantify spatial and temporal hyperpolarized 3He MRI ventilation dynamics: preliminary results in chronic obstructive pulmonary disease

    NASA Astrophysics Data System (ADS)

    Kirby, Miranda; Wheatley, Andrew; McCormack, David G.; Parraga, Grace

    2010-03-01

    Hyperpolarized helium-3 (3He) magnetic resonance imaging (MRI) has emerged as a non-invasive research method for quantifying lung structural and functional changes, enabling direct visualization in vivo at high spatial and temporal resolution. Here we describe the development of methods for quantifying ventilation dynamics in response to salbutamol in Chronic Obstructive Pulmonary Disease (COPD). A whole-body 3.0 Tesla Excite 12.0 MRI system was used to obtain multi-slice coronal images acquired immediately after subjects inhaled hyperpolarized 3He gas. Ventilated volume (VV), ventilation defect volume (VDV) and thoracic cavity volume (TCV) were recorded following segmentation of 3He and 1H images respectively, and used to calculate percent ventilated volume (PVV) and ventilation defect percent (VDP). Manual segmentation and Otsu thresholding were significantly correlated for VV (r=.82, p=.001), VDV (r=.87, p=.0002), PVV (r=.85, p=.0005), and VDP (r=.85, p=.0005). The level of agreement between these segmentation methods was also evaluated using Bland-Altman analysis, and this showed that manual segmentation was consistently higher for VV (Mean=.22 L, SD=.05) and consistently lower for VDV (Mean=-.13, SD=.05) measurements than Otsu thresholding. To automate the quantification of newly ventilated pixels (NVp) post-bronchodilator, we used translation, rotation, and scaling transformations to register pre- and post-salbutamol images. There was a significant correlation between NVp and VDV (r=-.94, p=.005) and between percent newly ventilated pixels (PNVp) and VDP (r=-.89, p=.02), but not for VV or PVV. Evaluation of 3He MRI ventilation dynamics using Otsu thresholding and landmark-based image registration provides a way to regionally quantify functional changes in COPD subjects after treatment with beta-agonist bronchodilators, a common COPD and asthma therapy.
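
    Once the 3He and 1H images are segmented, the ventilation metrics are voxel-count ratios. A minimal sketch using scikit-image's Otsu threshold on synthetic arrays; the study's registration, manual review, and per-slice handling are omitted:

      import numpy as np
      from skimage.filters import threshold_otsu

      rng = np.random.default_rng(2)
      he3 = rng.random((64, 64, 14))            # stand-in for the 3He signal
      thoracic_mask = np.ones_like(he3, bool)   # stand-in for the 1H cavity mask

      ventilated = (he3 > threshold_otsu(he3[thoracic_mask])) & thoracic_mask
      tcv = thoracic_mask.sum()                 # thoracic cavity volume (voxels)
      vv = ventilated.sum()                     # ventilated volume (voxels)
      pvv = 100.0 * vv / tcv                    # percent ventilated volume
      vdp = 100.0 - pvv                         # ventilation defect percent
      print(pvv, vdp)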

  18. Quantifying Surface Processes and Stratigraphic Characteristics Resulting from Large Magnitude High Frequency and Small Magnitude Low Frequency Relative Sea Level Cycles: An Experimental Study

    NASA Astrophysics Data System (ADS)

    Yu, L.; Li, Q.; Esposito, C. R.; Straub, K. M.

    2015-12-01

    Relative Sea-Level (RSL) change, which is a primary control on sequence stratigraphic architecture, has a close relationship with climate change. In order to explore the influence of RSL change on the stratigraphic record, we conducted three physical experiments that shared identical boundary conditions but differed in their RSL characteristics. Specifically, the three experiments differed with respect to two non-dimensional numbers that compare the magnitude and periodicity of RSL cycles to the spatial and temporal scales of autogenic processes, respectively. The magnitude of RSL change is quantified with H*, defined as the peak-to-trough difference in RSL during a cycle divided by a system's maximum autogenic channel depth. The periodicity of RSL change is quantified with T*, defined as the period of RSL cycles divided by the time required to deposit one channel depth of sediment, on average, everywhere in the basin. The experiments performed were: 1) a control experiment lacking RSL cycles, used to define the system's autogenics, 2) a high magnitude, high frequency RSL cycles experiment, and 3) a low magnitude, low frequency cycles experiment. We observe that the high magnitude, high frequency experiment resulted in the thickest channel bodies with the lowest width-to-depth ratios, while the low magnitude, long period experiment preserved a record of gradual shoreline transgression and regression, producing facies that are the most continuous in space. We plan to integrate our experimental results with Delft3D numerical models that sample similar non-dimensional characteristics of RSL cycles. Quantifying the influence of RSL change, normalized as a function of the spatial and temporal scales of autogenic processes, will strengthen our ability to predict stratigraphic architecture and invert stratigraphy for paleo-environmental conditions.
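
    The two dimensionless numbers can be written compactly. The symbols below are ours (the abstract gives the definitions only in words), and the expression for T_c is an interpretation of "the time required to deposit one channel depth of sediment, on average, everywhere in the basin":

        H^{*} = \frac{\Delta Z_{\mathrm{RSL}}}{h_c}, \qquad
        T^{*} = \frac{T_{\mathrm{RSL}}}{T_c}, \qquad
        T_c = \frac{h_c}{\bar{r}},

    where \Delta Z_{\mathrm{RSL}} is the peak-to-trough RSL change over one cycle, h_c is the maximum autogenic channel depth, T_{\mathrm{RSL}} is the RSL cycle period, and \bar{r} is the basin-averaged deposition rate.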

  19. A collaborative accountable care model in three practices showed promising early results on costs and quality of care.

    PubMed

    Salmon, Richard B; Sanderson, Mark I; Walters, Barbara A; Kennedy, Karen; Flores, Robert C; Muney, Alan M

    2012-11-01

    Cigna's Collaborative Accountable Care initiative provides financial incentives to physician groups and integrated delivery systems to improve the quality and efficiency of care for patients in commercial open-access benefit plans. Registered nurses who serve as care coordinators employed by participating practices are a central feature of the initiative. They use patient-specific reports and practice performance reports provided by Cigna to improve care coordination, identify and close care gaps, and address other opportunities for quality improvement. We report interim quality and cost results for three geographically and structurally diverse provider practices in Arizona, New Hampshire, and Texas. Although not statistically significant, these early results revealed favorable trends in total medical costs and quality of care, suggesting that a shared-savings accountable care model and collaborative support from the payer can enable practices to take meaningful steps toward full accountability for care quality and efficiency.

  20. QUANTIFYING FOREST ABOVEGROUND CARBON POOLS AND FLUXES USING MULTI-TEMPORAL LIDAR A report on field monitoring, remote sensing MMV, GIS integration, and modeling results for forestry field validation test to quantify aboveground tree biomass and carbon

    SciTech Connect

    Lee Spangler; Lee A. Vierling; Eva K. Stand; Andrew T. Hudak; Jan U.H. Eitel; Sebastian Martinuzzi

    2012-04-01

    Sound policy recommendations relating to the role of forest management in mitigating atmospheric carbon dioxide (CO2) depend upon establishing accurate methodologies for quantifying forest carbon pools for large tracts of land that can be dynamically updated over time. Light Detection and Ranging (LiDAR) remote sensing is a promising technology for achieving accurate estimates of aboveground biomass, and thereby carbon pools; however, not much is known about the accuracy of estimating biomass change and carbon flux from repeat LiDAR acquisitions containing different data sampling characteristics. In this study, discrete-return airborne LiDAR data were collected in 2003 and 2009 across ~20,000 hectares (ha) of an actively managed, mixed conifer forest landscape in northern Idaho, USA. Forest inventory plots were established via a stratified random sampling design and sampled in 2003 and 2009. The Random Forest machine learning algorithm was used to establish statistical relationships between inventory data and forest structural metrics derived from the LiDAR acquisitions. Aboveground biomass maps were created for the study area based on statistical relationships developed at the plot level. Over this 6-year period, we found that the mean increase in biomass due to forest growth across the non-harvested portions of the study area was 4.8 metric tons/hectare (Mg/ha). In these non-harvested areas, we found a significant difference in biomass increase among forest successional stages, with a higher biomass increase in mature and old forest compared to stand initiation and young forest. Approximately 20% of the landscape had been disturbed by harvest activities during the 6-year time period, representing a biomass loss of >70 Mg/ha in these areas. During the study period, these harvest activities outweighed growth at the landscape scale, resulting in an overall loss in aboveground carbon at this site. The 30-fold increase in sampling density
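
    The plot-to-landscape workflow described above is straightforward to sketch. The following is a hedged illustration, not the study's actual pipeline: a Random Forest regressor is trained on plot-level LiDAR metrics against field-measured biomass, then applied to gridded metrics from each acquisition so the difference image estimates biomass change. All array names and numbers are synthetic placeholders.

        # Sketch: Random Forest biomass model from LiDAR metrics, then change
        # mapping between two acquisitions. Data below are synthetic.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(1)
        plot_metrics = rng.random((200, 6))      # e.g. height percentiles, canopy cover
        plot_agb = 150 * plot_metrics[:, 0] + rng.normal(0, 5, 200)  # Mg/ha

        rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
        rf.fit(plot_metrics, plot_agb)
        print("out-of-bag R^2:", rf.oob_score_)  # internal accuracy check

        # Apply to per-cell metrics from each year; positive differences are
        # growth, large negative differences flag harvested cells.
        grid_2003 = rng.random((10_000, 6))
        grid_2009 = grid_2003 + rng.normal(0.02, 0.01, grid_2003.shape)
        delta_agb = rf.predict(grid_2009) - rf.predict(grid_2003)  # Mg/ha over 6 yr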

  1. Presentation Showing Results of a Hydrogeochemical Investigation of the Standard Mine Vicinity, Upper Elk Creek Basin, Colorado

    USGS Publications Warehouse

    Manning, Andrew H.; Verplanck, Philip L.; Mast, M. Alisa; Wanty, Richard B.

    2008-01-01

    PREFACE This Open-File Report consists of a presentation given in Crested Butte, Colorado on December 13, 2007 to the Standard Mine Advisory Group. The presentation was paired with another presentation given by the Colorado Division of Reclamation, Mining, and Safety on the physical features and geology of the Standard Mine. The presentation in this Open-File Report summarizes the results and conclusions of a hydrogeochemical investigation of the Standard Mine performed by the U.S. Geological Survey (Manning and others, in press). The purpose of the investigation was to aid the U.S. Environmental Protection Agency in evaluating remediation options for the Standard Mine site. Additional details and supporting data related to the information in this presentation can be found in Manning and others (in press).

  2. Low-frequency ac electroporation shows strong frequency dependence and yields comparable transfection results to dc electroporation.

    PubMed

    Zhan, Yihong; Cao, Zhenning; Bao, Ning; Li, Jianbo; Wang, Jun; Geng, Tao; Lin, Hao; Lu, Chang

    2012-06-28

    Conventional electroporation has been conducted by employing short direct current (dc) pulses for delivery of macromolecules such as DNA into cells. The use of alternating current (ac) fields for electroporation has mostly been explored in the frequency range of 10kHz-1MHz. Based on the Schwan equation, it was thought that at low ac frequencies (10Hz-10kHz) the transmembrane potential does not vary with the frequency. In this report, we utilized a flow-through electroporation technique that employed a continuous 10Hz-10kHz ac field (based on either sine waves or square waves) for electroporation of cells with defined duration and intensity. Our results reveal that electropermeabilization becomes weaker with increased frequency in this range. In contrast, transfection efficiency with DNA reaches its maximum at medium frequencies (100-1000Hz) in the range. We postulate that the relationship between the transfection efficiency and the ac frequency is determined by combined effects from electrophoretic movement of DNA in the ac field, dependence of the DNA/membrane interaction on the ac frequency, and variation of transfection under different electropermeabilization intensities. The fact that ac electroporation in this frequency range yields high efficiency for transfection (up to ~71% for Chinese hamster ovary cells) and permeabilization suggests its potential for gene delivery.
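
    The expectation the authors test against can be made explicit with the first-order (Schwan) expression for the transmembrane potential induced on a spherical cell; this is the standard textbook form, reconstructed here rather than quoted from the paper:

        \Delta\Phi_m = \frac{1.5\, E\, a \cos\theta}{\sqrt{1 + (2\pi f \tau_m)^2}},

    where E is the applied field strength, a the cell radius, \theta the polar angle from the field direction, f the frequency, and \tau_m the membrane charging time constant (typically of order 0.1-1 microsecond). For f between 10Hz and 10kHz, (2\pi f \tau_m)^2 is much less than 1 and \Delta\Phi_m is essentially frequency independent, which is why the frequency dependence of permeabilization reported here points to effects beyond this first-order picture.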

  3. Volar locking distal radius plates show better short-term results than other treatment options: A prospective randomised controlled trial

    PubMed Central

    Drobetz, Herwig; Koval, Lidia; Weninger, Patrick; Luscombe, Ruth; Jeffries, Paula; Ehrendorfer, Stefan; Heal, Clare

    2016-01-01

    AIM To compare the outcomes of displaced distal radius fractures treated with volar locking plates and immediate postoperative mobilisation with the outcomes of these fractures treated with modalities that necessitate 6 wk of wrist immobilisation. METHODS A prospective, randomised controlled single-centre trial was conducted in which 56 patients with a displaced distal radius fracture were randomised to treatment either with a volar locking plate (n = 29) or with another treatment modality (n = 27; cast immobilisation with or without wires or an external fixator). Outcomes were measured at 12 wk. Functional outcomes measured were the Patient-Rated Wrist Evaluation (PRWE) score; the Disabilities of the Arm, Shoulder and Hand score; and activities of daily living (ADLs). Clinical outcomes were wrist range of motion and grip strength. Radiographic parameters were volar inclination and ulnar variance. RESULTS Patients in the volar locking plate group had significantly better PRWE scores, ADL scores, grip strength and range of extension at 3 mo compared with the control group. All radiological parameters were significantly better in the volar locking plate group at 3 mo. CONCLUSION The present study suggests that volar locking plates produced significantly better functional and clinical outcomes at 3 mo compared with other treatment modalities. Anatomical reduction was significantly more likely to be preserved in the plating group. Level of evidence: II. PMID:27795951

  4. Quantifying errors without random sampling

    PubMed Central

    Phillips, Carl V; LaPole, Luwanna M

    2003-01-01

    Background All quantifications of mortality, morbidity, and other health measures involve numerous sources of error. The routine quantification of random sampling error makes it easy to forget that other sources of error can and should be quantified. When a quantification does not involve sampling, error is almost never quantified and results are often reported in ways that dramatically overstate their precision. Discussion We argue that the precision implicit in typical reporting is problematic and sketch methods for quantifying the various sources of error, building up from simple examples that can be solved analytically to more complex cases. There are straightforward ways to partially quantify the uncertainty surrounding a parameter that is not characterized by random sampling, such as limiting reported significant figures. We present simple methods for doing such quantifications, and for incorporating them into calculations. More complicated methods become necessary when multiple sources of uncertainty must be combined. We demonstrate that Monte Carlo simulation, using available software, can estimate the uncertainty resulting from complicated calculations with many sources of uncertainty. We apply the method to the current estimate of the annual incidence of foodborne illness in the United States. Summary Quantifying uncertainty from systematic errors is practical. Reporting this uncertainty would more honestly represent study results, help show the probability that estimated values fall within some critical range, and facilitate better targeting of further research. PMID:12892568
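
    The Monte Carlo recipe the authors apply is easy to illustrate. The sketch below uses an entirely hypothetical calculation of the same shape as their foodborne-illness example (annual cases as a product of uncertain factors); the distributions and numbers are stand-ins for whatever expert-elicited ranges a real analysis would document.

        # Sketch: propagate non-sampling uncertainty through a calculation by
        # Monte Carlo. All parameter ranges below are hypothetical.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        population = 2.9e8                               # treated as known
        incidence = rng.triangular(0.08, 0.10, 0.13, n)  # cases/person/yr (elicited range)
        underreport = rng.uniform(1.5, 3.0, n)           # correction factor (elicited range)

        cases = population * incidence * underreport
        lo, mid, hi = np.percentile(cases, [2.5, 50, 97.5])
        print(f"median {mid:.2e}; 95% uncertainty interval {lo:.2e} to {hi:.2e}")

    Reporting the interval rather than a single figure is exactly the change in practice the authors argue for.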

  5. Selection Indices and Multivariate Analysis Show Similar Results in the Evaluation of Growth and Carcass Traits in Beef Cattle

    PubMed Central

    Brito Lopes, Fernando; da Silva, Marcelo Corrêa; Magnabosco, Cláudio Ulhôa; Goncalves Narciso, Marcelo; Sainz, Roberto Daniel

    2016-01-01

    This research evaluated a multivariate approach as an alternative tool for the purpose of selection regarding expected progeny differences (EPDs). Data were fitted using a multi-trait model and consisted of growth traits (birth weight and weights at 120, 210, 365 and 450 days of age) and carcass traits (longissimus muscle area (LMA), back-fat thickness (BF), and rump fat thickness (RF)), registered over 21 years in extensive breeding systems of Polled Nellore cattle in Brazil. Multivariate analyses were performed using standardized (zero mean and unit variance) EPDs. The k-means method revealed that the best fit to the data occurred with three clusters (k = 3) (P < 0.001). Estimates of genetic correlation among growth and carcass traits and estimates of heritability were moderate to high, suggesting that a correlated-response approach is suitable for practical decision making. Estimates of correlation between selection indices and the multivariate index (LD1) were moderate to high, ranging from 0.48 to 0.97. This reveals that both types of indices give similar results and that the multivariate approach is reliable for the purpose of selection. The multivariate approach is particularly useful when economic weights are not available or when more rapid identification of the best animals is desired. Interestingly, multivariate analysis allowed forecasting information based on the relationships among breeding values (EPDs). It also enabled fine discrimination and rapid data summarization after genetic evaluation, and permitted accounting for maternal ability and the direct genetic potential of the animals. In addition, we recommend the use of longissimus muscle area and subcutaneous fat thickness as selection criteria, to allow estimation of breeding values before the first mating season in order to accelerate the response to individual selection. PMID:26789008
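
    The workflow of the abstract (standardize EPDs, cluster with k-means, derive a multivariate index) can be sketched briefly. This is a hedged illustration with a synthetic animals-by-traits matrix, not the paper's analysis; LD1 is obtained here with linear discriminant analysis on the cluster labels.

        # Sketch: standardized EPDs -> k-means (k=3) -> LD1 as a multivariate
        # selection index. `epd` is a hypothetical (animals x traits) matrix.
        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.cluster import KMeans
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(7)
        epd = rng.normal(size=(500, 8))           # e.g. BW, W120, ..., LMA, BF, RF

        z = StandardScaler().fit_transform(epd)   # zero mean, unit variance
        clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)

        lda = LinearDiscriminantAnalysis(n_components=2).fit(z, clusters)
        ld1 = lda.transform(z)[:, 0]              # first discriminant = LD1 index
        ranking = np.argsort(ld1)[::-1]           # rank candidates for selection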

  6. Philippines: decentralized approach shows results.

    PubMed

    1983-01-01

    In the Philippines several steps have been taken to meet the challenge of increasing population growth. Commencing with Republic Act 6365, known as the Population Act (1971), program directives focus on achieving and maintaining population levels most conducive to the national welfare. In 1978 a Special Committee was constituted by the President to review the population program. Pursuant to the Committee's findings certain changes were adopted. The thrust is now towards long-term planning to ensure a more significant and perceptible demographic impact of development programs and policies. Increasing attention is paid to regional development and spatial distribution in the country. The 1978-82 Development Plan states more clearly the interaction between population and development. The National Economic and Development Authority, the central policy and planning agency of the government, takes charge of formulating and coordinating the broader aspects of population policy and integrating population with socioeconomic plans and policies. At present the National Economic and Development Authority (NEDA) is implementing a project known as the Population/Development Planning and Research (PDPR) project with financial support from the UN Fund for Population Activities (UNFPA). This project promotes and facilitates the integration of the population dimension in the planning process. It does this by maintaining linkages and instituting collaborative mechanisms with the different NEDA regional offices and sectoral ministries. It also trains government planners in ways of integrating population concerns into the development plan. PDPR promotes the use of population and development research for planning purposes and policy formation. The Philippine Development Plan, 1978-82, recognized that an improvement in the level of one sector reinforces the performance of the other sectors. Since the establishment of the National Population Program 12 years ago, population and family planning have been successfully integrated with various development sectors, notably labor, health, and education. Through the policies of integration, multiagency participation, and partnership of the public and private sectors, the Commission on Population uses existing development programs of government and private organizations as vehicles for family planning information and services and shares the responsibility of implementing all facets of the population program with various participating agencies in the government and private sector.

  7. QUANTIFYING SPICULES

    SciTech Connect

    Pereira, Tiago M. D.; De Pontieu, Bart; Carlsson, Mats

    2012-11-01

    Understanding the dynamic solar chromosphere is fundamental in solar physics. Spicules are an important feature of the chromosphere, connecting the photosphere to the corona, potentially mediating the transfer of energy and mass. The aim of this work is to study the properties of spicules over different regions of the Sun. Our goal is to investigate if there is more than one type of spicule, and how spicules behave in the quiet Sun, coronal holes, and active regions. We make use of high cadence and high spatial resolution Ca II H observations taken by Hinode/Solar Optical Telescope. Making use of a semi-automated detection algorithm, we self-consistently track and measure the properties of 519 spicules over different regions. We find clear evidence of two types of spicules. Type I spicules show a rise and fall and have typical lifetimes of 150-400 s and maximum ascending velocities of 15-40 km/s, while type II spicules have shorter lifetimes of 50-150 s, faster velocities of 30-110 km/s, and are not seen to fall down, but rather fade at around their maximum length. Type II spicules are the most common, seen in the quiet Sun and coronal holes. Type I spicules are seen mostly in active regions. There are regional differences between quiet-Sun and coronal hole spicules, likely attributable to the different field configurations. The properties of type II spicules are consistent with published results of rapid blueshifted events (RBEs), supporting the hypothesis that RBEs are their disk counterparts. For type I spicules we find the relations between their properties to be consistent with a magnetoacoustic shock wave driver, and with dynamic fibrils as their disk counterpart. The driver of type II spicules remains unclear from limb observations.

  8. Uncertainty quantified trait predictions

    NASA Astrophysics Data System (ADS)

    Fazayeli, Farideh; Kattge, Jens; Banerjee, Arindam; Schrodt, Franziska; Reich, Peter

    2015-04-01

    Functional traits of organisms are key to understanding and predicting biodiversity and ecological change, which motivates continuous collection of traits and their integration into global databases. Such composite trait matrices are inherently sparse, severely limiting their usefulness for further analyses. On the other hand, traits are characterized by the phylogenetic trait signal, trait-trait correlations and environmental constraints, all of which provide information that could be used to statistically fill gaps. We propose the application of probabilistic models which, for the first time, utilize all three characteristics to fill gaps in trait databases and predict trait values at larger spatial scales. For this purpose we introduce BHPMF, a hierarchical Bayesian extension of Probabilistic Matrix Factorization (PMF). PMF is a machine learning technique which exploits the correlation structure of sparse matrices to impute missing entries. BHPMF additionally utilizes the taxonomic hierarchy for trait prediction. Implemented in the context of a Gibbs sampler MCMC approach, BHPMF provides uncertainty estimates for each trait prediction. We present comprehensive experimental results on the problem of plant trait prediction using the largest database of plant traits, where BHPMF shows strong empirical performance in uncertainty quantified trait prediction, outperforming the state of the art based on point estimates. Further, we show that BHPMF is more accurate when it is confident, whereas the error is high when the uncertainty is high.
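
    To ground the PMF part of the abstract, here is a deliberately minimal gap-filling sketch: plain matrix factorization fitted by masked gradient descent on a synthetic species-by-traits matrix. The paper's BHPMF additionally imposes the taxonomic hierarchy and obtains uncertainty via Gibbs sampling; none of that is reproduced here.

        # Sketch: vanilla PMF imputation of a sparse trait matrix (synthetic).
        import numpy as np

        rng = np.random.default_rng(3)
        X = rng.normal(size=(300, 12))             # species x traits
        observed = rng.random(X.shape) < 0.3       # ~30% of entries observed

        k, lam, lr = 5, 0.1, 0.01
        U = 0.1 * rng.normal(size=(300, k))        # species latent factors
        V = 0.1 * rng.normal(size=(12, k))         # trait latent factors
        for _ in range(2000):
            R = (U @ V.T - X) * observed           # residuals on observed cells only
            U -= lr * (R @ V + lam * U)            # gradient step, L2-regularized
            V -= lr * (R.T @ U + lam * V)
        X_filled = np.where(observed, X, U @ V.T)  # impute the gaps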

  9. Methods for Quantifying the Uncertainties of LSIT Test Parameters, Test Results, and Full-Scale Mixing Performance Using Models Developed from Scaled Test Data

    SciTech Connect

    Piepel, Gregory F.; Cooley, Scott K.; Kuhn, William L.; Rector, David R.; Heredia-Langner, Alejandro

    2015-05-01

    This report discusses the statistical methods for quantifying uncertainties in 1) test responses and other parameters in the Large Scale Integrated Testing (LSIT), and 2) estimates of coefficients and predictions of mixing performance from models that relate test responses to test parameters. Bechtel National, Inc. and the U.S. Department of Energy (DOE) have committed to testing at a larger scale in order to “address uncertainties and increase confidence in the projected, full-scale mixing performance and operations” in the Waste Treatment and Immobilization Plant (WTP).

  10. Quantifying solvated electrons' delocalization.

    PubMed

    Janesko, Benjamin G; Scalmani, Giovanni; Frisch, Michael J

    2015-07-28

    Delocalized, solvated electrons are a topic of much recent interest. We apply the electron delocalization range EDR(r;u) (J. Chem. Phys., 2014, 141, 144104) to quantify the extent to which a solvated electron at point r in a calculated wavefunction delocalizes over distance u. Calculations on electrons in one-dimensional model cavities illustrate fundamental properties of the EDR. Mean-field calculations on hydrated electrons (H2O)n(-) show that the density-matrix-based EDR reproduces existing molecular-orbital-based measures of delocalization. Correlated calculations on hydrated electrons and electrons in lithium-ammonia clusters illustrate how electron correlation tends to move surface- and cavity-bound electrons onto the cluster or cavity surface. Applications to multiple solvated electrons in lithium-ammonia clusters provide a novel perspective on the interplay of delocalization and strong correlation central to lithium-ammonia solutions' concentration-dependent insulator-to-metal transition. The results motivate continued application of the EDR to simulations of delocalized electrons.

  11. "The Show"

    ERIC Educational Resources Information Center

    Gehring, John

    2004-01-01

    For the past 16 years, the blue-collar city of Huntington, West Virginia, has rolled out the red carpet to welcome young wrestlers and their families as old friends. They have come to town chasing the same dream for a spot in what many of them call "The Show". For three days, under the lights of an arena packed with 5,000 fans, the state's best…

  12. Two heteronuclear dipolar results at the price of one: Quantifying Na/P contacts in phosphosilicate glasses and biomimetic hydroxy-apatite

    NASA Astrophysics Data System (ADS)

    Stevensson, Baltzar; Mathew, Renny; Yu, Yang; Edén, Mattias

    2015-02-01

    The analysis of S{I} recoupling experiments applied to amorphous solids yields a heteronuclear second moment M2(S-I) that represents the effective through-space dipolar interaction between the detected S spins and the neighboring I-spin species. We show that both M2(S-I) and M2(I-S) values are readily accessible from a sole S{I} or I{S} experiment, which may involve either S or I detection, and is naturally selected as the most favorable option under the given experimental conditions. For the common case where I has half-integer spin, an I{S} REDOR implementation is preferred to the S{I} REAPDOR counterpart. We verify the procedure by 23Na{31P} REDOR and 31P{23Na} REAPDOR NMR applied to Na2O-CaO-SiO2-P2O5 glasses and biomimetic hydroxyapatite, where the M2(P-Na) values directly determined by REAPDOR agree very well with those derived from the corresponding M2(Na-P) results measured by REDOR. Moreover, we show that dipolar second moments are readily extracted from the REAPDOR NMR protocol by a straightforward numerical fitting of the initial dephasing data, in direct analogy with the well-established procedure to determine M2(S-I) values from REDOR NMR experiments applied to amorphous materials; this avoids the problems with time-consuming numerically exact simulations, whose accuracy is limited for describing the dynamics of a priori unknown multi-spin systems in disordered structures.
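
    The "two results for the price of one" follows from the symmetry of the dipolar lattice sum. Under the standard van Vleck form of the heteronuclear second moment (our reconstruction; the abstract does not spell out the prefactors),

        M_2(S\text{-}I) = \frac{4}{15}\left(\frac{\mu_0}{4\pi}\right)^{2} \gamma_S^{2}\gamma_I^{2}\hbar^{2}\, I(I+1)\, \frac{1}{N_S}\sum_{s \in S}\sum_{i \in I} r_{si}^{-6},

    and because the double sum is the same whichever species is detected,

        \frac{M_2(I\text{-}S)}{M_2(S\text{-}I)} = \frac{S(S+1)}{I(I+1)} \cdot \frac{N_S}{N_I}.

    Measuring either second moment therefore fixes the other once the spin quantum numbers and the S:I stoichiometry are known.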

  13. Analysis of conservative tracer measurement results using the Frechet distribution at planted horizontal subsurface flow constructed wetlands filled with coarse gravel and showing the effect of clogging processes.

    PubMed

    Dittrich, Ernő; Klincsik, Mihály

    2015-11-01

    A mathematical process, developed in the Maple environment, has been successful in decreasing the error of measurement results and in the precise calculation of the moments of corrected tracer functions. It was proved that with this process the measured tracer results of horizontal subsurface flow constructed wetlands filled with coarse gravel (HSFCW-C) can be fitted more accurately than with the conventionally used distribution functions (Gaussian, Lognormal, Fick (Inverse Gaussian) and Gamma). This statement is true only for the planted HSFCW-Cs; the analysis of unplanted HSFCW-Cs needs more research. The analysis shows that the conventional solutions (the completely stirred tank reactors in series (CSTR) model and the convection-dispersion transport (CDT) model) cannot describe these types of transport processes with sufficient accuracy. These outcomes can help in developing better process descriptions of very difficult transport processes in HSFCW-Cs. Furthermore, a new mathematical process can be developed for the calculation of real hydraulic residence time (HRT) and dispersion coefficient values. The presented method can be generalized to other kinds of hydraulic environments.
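
    For orientation, here is a small sketch of the core fitting step under stated assumptions: a Frechet distribution (scipy's invweibull) is fitted to a synthetic residence-time sample and the corrected moments are read off. The paper's Maple workflow, its error-correction step, and its real tracer data are not reproduced.

        # Sketch: fit a Frechet distribution to tracer residence times and
        # extract moments. Data are synthetic; scipy's `invweibull` is the
        # Frechet (inverse Weibull) distribution.
        import numpy as np
        from scipy.stats import invweibull

        residence_h = invweibull.rvs(c=3.0, loc=2.0, scale=20.0,
                                     size=400, random_state=5)  # hours, synthetic

        c, loc, scale = invweibull.fit(residence_h)
        mean_hrt = invweibull.mean(c, loc=loc, scale=scale)  # mean HRT (1st moment)
        var_hrt = invweibull.var(c, loc=loc, scale=scale)    # 2nd central moment
        print(f"shape={c:.2f}, mean HRT={mean_hrt:.1f} h, variance={var_hrt:.1f} h^2")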

  14. How often do German children and adolescents show signs of common mental health problems? Results from different methodological approaches – a cross-sectional study

    PubMed Central

    2014-01-01

    Background Child and adolescent mental health problems are ubiquitous and burdensome. Their impact on functional disability, the high rates of accompanying medical illnesses and the potential to last until adulthood make them a major public health issue. While methodological factors cause variability of the results from epidemiological studies, there is a lack of prevalence rates of mental health problems in children and adolescents according to ICD-10 criteria from nationally representative samples. International findings suggest only a small proportion of children with function impairing mental health problems receive treatment, but information about the health care situation of children and adolescents is scarce. The aim of this epidemiological study was a) to classify symptoms of common mental health problems according to ICD-10 criteria in order to compare the statistical and clinical case definition strategies using a single set of data and b) to report ICD-10 codes from health insurance claims data. Methods a) Based on a clinical expert rating, questionnaire items were mapped on ICD-10 criteria; data from the Mental Health Module (BELLA study) were analyzed for relevant ICD-10 and cut-off criteria; b) Claims data were analyzed for relevant ICD-10 codes. Results According to parent report 7.5% (n = 208) met the ICD-10 criteria of a mild depressive episode and 11% (n = 305) showed symptoms of depression according to cut-off score; Anxiety is reported in 5.6% (n = 156) and 11.6% (n = 323), conduct disorder in 15.2% (n = 373) and 14.6% (n = 357). Self-reported symptoms in 11 to 17 year olds resulted in 15% (n = 279) reporting signs of a mild depression according to ICD-10 criteria (vs. 16.7% (n = 307) based on cut-off) and 10.9% (n = 201) reported symptoms of anxiety (vs. 15.4% (n = 283)). Results from routine data identify 0.9% (n = 1,196) with a depression diagnosis, 3.1% (n = 6,729) with anxiety and 1.4% (n

  15. Transgene silencing of the Hutchinson-Gilford progeria syndrome mutation results in a reversible bone phenotype, whereas resveratrol treatment does not show overall beneficial effects.

    PubMed

    Strandgren, Charlotte; Nasser, Hasina Abdul; McKenna, Tomás; Koskela, Antti; Tuukkanen, Juha; Ohlsson, Claes; Rozell, Björn; Eriksson, Maria

    2015-08-01

    Hutchinson-Gilford progeria syndrome (HGPS) is a rare premature aging disorder that is most commonly caused by a de novo point mutation in exon 11 of the LMNA gene, c.1824C>T, which results in an increased production of a truncated form of lamin A known as progerin. In this study, we used a mouse model to study the possibility of recovering from HGPS bone disease upon silencing of the HGPS mutation, and the potential benefits of treatment with resveratrol. We show that complete silencing of the transgenic expression of progerin normalized bone morphology and mineralization as early as 7 weeks after silencing. The improvements included lower frequencies of rib fractures and callus formation, an increased number of osteocytes in remodeled bone, and normalized dentinogenesis. The beneficial effects of resveratrol treatment were less significant and to a large extent similar to those in mice treated with sucrose alone. However, the reversal of the dental phenotype of overgrown and laterally displaced lower incisors in HGPS mice could be attributed to resveratrol. Our results indicate that the HGPS bone defects were reversible upon suppressed transgenic expression and suggest that treatments targeting aberrant progerin splicing give hope to patients who are affected by HGPS.

  16. Magnetic Sphincter Augmentation for Gastroesophageal Reflux at 5 Years: Final Results of a Pilot Study Show Long-Term Acid Reduction and Symptom Improvement

    PubMed Central

    Saino, Greta; Bonavina, Luigi; Lipham, John C.; Dunn, Daniel

    2015-01-01

    Abstract Background: As previously reported, the magnetic sphincter augmentation device (MSAD) preserves gastric anatomy and results in less severe side effects than traditional antireflux surgery. The final 5-year results of a pilot study are reported here. Patients and Methods: A prospective, multicenter study evaluated safety and efficacy of the MSAD for 5 years. Prior to MSAD placement, patients had abnormal esophageal acid and symptoms poorly controlled by proton pump inhibitors (PPIs). Patients served as their own control, which allowed comparison between baseline and postoperative measurements to determine individual treatment effect. At 5 years, gastroesophageal reflux disease (GERD)-Health Related Quality of Life (HRQL) questionnaire score, esophageal pH, PPI use, and complications were evaluated. Results: Between February 2007 and October 2008, 44 patients (26 males) had an MSAD implanted by laparoscopy, and 33 patients were followed up at 5 years. Mean total percentage of time with pH <4 was 11.9% at baseline and 4.6% at 5 years (P < .001), with 85% of patients achieving pH normalization or at least a 50% reduction. Mean total GERD-HRQL score improved significantly from 25.7 to 2.9 (P < .001) when comparing baseline and 5 years, and 93.9% of patients had at least a 50% reduction in total score compared with baseline. Complete discontinuation of PPIs was achieved by 87.8% of patients. No complications occurred in the long term, including no device erosions or migrations at any point. Conclusions: Based on long-term reduction in esophageal acid, symptom improvement, and no late complications, this study shows the relative safety and efficacy of magnetic sphincter augmentation for GERD. PMID:26437027

  17. A high-density wireless underground sensor network (WUSN) to quantify hydro-ecological interactions for a UK floodplain; project background and initial results

    NASA Astrophysics Data System (ADS)

    Verhoef, A.; Choudhary, B.; Morris, P. J.; McCann, J.

    2012-04-01

    Floodplain meadows support some of the most diverse vegetation in the UK, and also perform key ecosystem services, such as flood storage and sediment retention. However, the UK now has less than 1500 ha of this unique habitat remaining. In order to conserve and better exploit the services provided by this grassland, an improved understanding of its functioning is essential. Vegetation functioning and species composition are known to be tightly correlated with the hydrological regime, and the related temperature and nutrient regimes, but the mechanisms controlling these relationships are not well established. The FUSE* project aims to investigate the spatiotemporal variability in vegetation functioning (e.g. photosynthesis and transpiration) and plant community composition in a floodplain meadow near Oxford, UK (Yarnton Mead), and their relationship to key soil physical variables (soil temperature and moisture content), soil nutrient levels and the water and energy balance. A distributed, high-density Wireless Underground Sensor Network (WUSN) is in the process of being established on Yarnton Mead. The majority, or ideally all, of the sensing and transmitting components will be installed below ground, because Yarnton Mead is an SSSI (Site of Special Scientific Interest, due to its unique plant community) and because sheep or cattle occasionally graze on it and could damage the nodes. This prerequisite has implications for the maximum spacing between underground nodes and their communications technologies, in terms of signal strength, path losses and requirements for battery life. The success of underground wireless communication is highly dependent on the soil type and water content. This floodplain environment is particularly challenging in this context because the soil contains a large amount of clay near the surface and is therefore less favourable to EM wave propagation than sandy soils. Furthermore, due to high relative saturation levels (as a result of high

  18. Rapamycin and Chloroquine: The In Vitro and In Vivo Effects of Autophagy-Modifying Drugs Show Promising Results in Valosin Containing Protein Multisystem Proteinopathy

    PubMed Central

    Nalbandian, Angèle; Llewellyn, Katrina J.; Nguyen, Christopher; Yazdi, Puya G.; Kimonis, Virginia E.

    2015-01-01

    Mutations in the valosin containing protein (VCP) gene cause hereditary inclusion body myopathy (hIBM) associated with Paget disease of bone (PDB) and frontotemporal dementia (FTD), more recently termed multisystem proteinopathy (MSP). Affected individuals exhibit scapular winging and die from progressive muscle weakness, and cardiac and respiratory failure, typically in their 40s to 50s. Histologically, patients show the presence of rimmed vacuoles and TAR DNA-binding protein 43 (TDP-43)-positive large ubiquitinated inclusion bodies in the muscles. We have generated a VCPR155H/+ mouse model which recapitulates the disease phenotype and the impaired autophagy typically observed in patients with VCP disease. Autophagy-modifying agents, such as rapamycin and chloroquine, have previously been shown at pharmacological doses to alter the autophagic flux. Herein, we report results of administering rapamycin, a specific inhibitor of the mechanistic target of rapamycin (mTOR) signaling pathway, and chloroquine, a lysosomal inhibitor that blocks autophagy by accumulating in lysosomes, to 20-month-old VCPR155H/+ mice. Rapamycin-treated mice demonstrated significant improvement in muscle performance and quadriceps histological analysis, and rescue of ubiquitin and TDP-43 pathology and of defective autophagy, as indicated by decreased protein expression levels of LC3-I/II, p62/SQSTM1 and optineurin and inhibition of the mTORC1 substrates. Conversely, chloroquine-treated VCPR155H/+ mice revealed progressive muscle weakness, cytoplasmic accumulation of TDP-43, ubiquitin-positive inclusion bodies and increased LC3-I/II, p62/SQSTM1, and optineurin expression levels. Our in vitro studies of patient myoblasts treated with rapamycin demonstrated an overall improvement in the autophagy markers. Targeting the mTOR pathway ameliorates an increasing list of disorders, and these findings suggest that VCP disease and related neurodegenerative multisystem proteinopathies can

  19. Modeling upward brine migration through faults as a result of CO2 storage in the Northeast German Basin shows negligible salinization in shallow aquifers

    NASA Astrophysics Data System (ADS)

    Kuehn, M.; Tillner, E.; Kempka, T.; Nakaten, B.

    2012-12-01

    The geological storage of CO2 in deep saline formations may cause salinization of shallower freshwater resources by upward flow of displaced brine from the storage formation into potable groundwater. In this regard, permeable faults or fractures can serve as potential leakage pathways for upward brine migration. The present study uses a regional-scale 3D model based on real structural data of a prospective CO2 storage site in Northeastern Germany to determine the impact of compartmentalization and fault permeability on upward brine migration as a result of pressure elevation by CO2 injection. To evaluate the degree of salinization in the shallower aquifers, different fault leakage scenarios were carried out using a newly developed workflow in which the model grid from the software package Petrel, applied for pre-processing, is transferred to the reservoir simulator TOUGH2-MP/ECO2N. A discrete fault description is achieved by using virtual elements. A static 3D geological model of the CO2 storage site with an areal size of 40 km x 40 km and a thickness of 766 m was implemented. Subsequently, large-scale numerical multi-phase multi-component (CO2, NaCl, H2O) flow simulations were carried out on a high performance computing system. The prospective storage site, located in the Northeast German Basin, is part of an anticline structure characterized by a saline multi-layer aquifer system. The NE and SW boundaries of the study area are confined by the Fuerstenwalde Gubener and the Lausitzer Abbruch fault zones, represented by four discrete faults in the model. Two formations of the Middle Bunter were chosen to assess brine migration through faults triggered by an annual injection rate of 1.7 Mt CO2 into the lowermost formation over a time span of 20 years. In addition to varying fault permeabilities, different boundary conditions were applied to evaluate the effects of reservoir compartmentalization. Simulation results show that the highest pressurization within the storage

  20. Storytelling Slide Shows to Improve Diabetes and High Blood Pressure Knowledge and Self-Efficacy: Three-Year Results among Community Dwelling Older African Americans

    ERIC Educational Resources Information Center

    Bertera, Elizabeth M.

    2014-01-01

    This study combined the African American tradition of oral storytelling with the Hispanic medium of "Fotonovelas." A staggered pretest posttest control group design was used to evaluate four Storytelling Slide Shows on health that featured community members. A total of 212 participants were recruited for the intervention and 217 for the…

  1. Quantifiable Lateral Flow Assay Test Strips

    NASA Technical Reports Server (NTRS)

    2003-01-01

    As easy to read as a home pregnancy test, three Quantifiable Lateral Flow Assay (QLFA) strips used to test water for E. coli show different results. The brightly glowing control line on the far right of each strip indicates that all three tests ran successfully. But the glowing test lines on the middle and bottom strips reveal that their samples were contaminated with E. coli bacteria at two different concentrations. The color intensity correlates with the concentration of contamination.

  2. Mathematical modelling in Matlab of the experimental results shows the electrochemical potential difference - temperature of the WC coatings immersed in a NaCl solution

    NASA Astrophysics Data System (ADS)

    Benea, M. L.; Benea, O. D.

    2016-02-01

    The method used for assessing the corrosion behaviour of the WC coatings deposited by plasma spraying on a martensitic stainless steel substrate consists in measuring the electrochemical potential of the coating, and that of the substrate, immersed in a NaCl solution as the corrosive agent. Mathematical processing of the experimental results in Matlab allowed us to correlate the electrochemical potential of the coating with the solution temperature; the relationship is well described by curves whose equations were obtained by fourth-order polynomial interpolation.
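
    The abstract's fit was done in Matlab; an equivalent sketch in Python (kept in Python for consistency with the other examples here) fits a fourth-order polynomial to hypothetical potential-temperature pairs and prints the resulting curve equation.

        # Sketch: fourth-order polynomial fit of electrochemical potential vs
        # solution temperature. All measurement values below are hypothetical.
        import numpy as np

        temperature_c = np.array([20, 25, 30, 35, 40, 45, 50, 55, 60], float)
        potential_mv = np.array([-310, -322, -331, -345, -350,
                                 -362, -365, -378, -380], float)

        coeffs = np.polyfit(temperature_c, potential_mv, deg=4)  # order-4 fit
        curve = np.poly1d(coeffs)
        print(curve)          # the fitted equation, analogous to the paper's curves
        print(curve(42.0))    # interpolated potential at 42 degC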

  3. Transitioning from preclinical to clinical chemopreventive assessments of lyophilized black raspberries: interim results show berries modulate markers of oxidative stress in Barrett's esophagus patients.

    PubMed

    Kresty, Laura A; Frankel, Wendy L; Hammond, Cynthia D; Baird, Maureen E; Mele, Jennifer M; Stoner, Gary D; Fromkes, John J

    2006-01-01

    Increased fruit and vegetable consumption is associated with decreased risk of a number of cancers of epithelial origin, including esophageal cancer. Dietary administration of lyophilized black raspberries (LBRs) has significantly inhibited chemically induced oral, esophageal, and colon carcinogenesis in animal models. Likewise, berry extracts added to cell cultures significantly inhibited cancer-associated processes. Positive results in preclinical studies have supported further investigation of berries and berry extracts in high-risk human cohorts, including patients with existing premalignancy or patients at risk for cancer recurrence. We are currently conducting a 6-mo chemopreventive pilot study administering 32 or 45 g (female and male, respectively) of LBRs to patients with Barrett's esophagus (BE), a premalignant esophageal condition in which the normal stratified squamous epithelium changes to a metaplastic columnar-lined epithelium. BE's importance lies in the fact that it confers a 30- to 40-fold increased risk for the development of esophageal adenocarcinoma, a rapidly increasing and extremely deadly malignancy. This is a report on interim findings from 10 patients. To date, the results support that daily consumption of LBRs promotes reductions in the urinary excretion of two markers of oxidative stress, 8-epi-prostaglandin F2alpha (8-Iso-PGF2) and, to a lesser, more variable extent, 8-hydroxy-2'-deoxyguanosine (8-OHdG), among patients with BE.

  4. Working memory mechanism in proportional quantifier verification.

    PubMed

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-12-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g., "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as "There are seven blue and eight yellow dots". The second study reveals that both types of sentences are correlated with memory storage; however, only proportional sentences are associated with cognitive control. This result suggests that the cognitive mechanism underlying the verification of proportional quantifiers is crucially related to the integration process, in which an individual has to compare in memory the cardinalities of two sets. In the third study we find that the numerical distance between the two cardinalities that must be compared significantly influences verification time and accuracy. The results of our studies are discussed in the broader context of processing complex sentences. PMID:24374596

  5. Wireless quantified reflex device

    NASA Astrophysics Data System (ADS)

    Lemoyne, Robert Charles

    The deep tendon reflex is a fundamental aspect of a neurological examination. The two major parameters of the tendon reflex are response and latency, which are presently evaluated qualitatively during a neurological examination. The reflex loop is capable of providing insight into the status and therapy response of both upper and lower motor neuron syndromes. Attempts have been made to ascertain reflex response and latency; however, these systems are relatively complex and resource intensive, with issues of consistent and reliable accuracy. The solution presented is a wireless quantified reflex device using tandem three-dimensional wireless accelerometers to obtain response, based on acceleration waveform amplitude, and latency, derived from the temporal disparity between the acceleration waveforms. Three specific aims have been established for the proposed wireless quantified reflex device: 1. Demonstrate that the wireless quantified reflex device is reliably capable of ascertaining quantified reflex response and latency using a quantified input. 2. Evaluate the precision of the device using an artificial reflex system. 3. Conduct a longitudinal study of subjects with healthy patellar tendon reflexes, using the wireless quantified reflex evaluation device to obtain quantified reflex response and latency. Aim 1 has led to the steady evolution of the wireless quantified reflex device from a single two-dimensional wireless accelerometer capable of measuring reflex response to a tandem three-dimensional wireless accelerometer system capable of reliably measuring reflex response and latency. The hypothesis for aim 1 is that a reflex quantification device can be established for reliably measuring reflex response and latency for the patellar tendon reflex, comprised of an integrated system of wireless three-dimensional MEMS accelerometers. Aim 2 further emphasized the reliability of the wireless quantified reflex device by evaluating an artificial reflex system. The hypothesis for aim 2 is that
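
    The two measurements the device makes can be illustrated compactly. Below is a hedged sketch, not the dissertation's algorithm: with two synchronized accelerometer records (input at the hammer strike, output at the swinging limb), response is taken as the peak output amplitude and latency as the lag that maximizes their cross-correlation. Signals, sampling rate, and waveform shapes are synthetic.

        # Sketch: reflex response and latency from two accelerometer waveforms.
        import numpy as np

        fs = 1000.0                                    # Hz, assumed sampling rate
        t = np.arange(0, 1.0, 1.0 / fs)
        hammer = np.exp(-((t - 0.20) / 0.01) ** 2)     # strike at 200 ms (synthetic)
        leg = 0.8 * np.exp(-((t - 0.28) / 0.03) ** 2)  # response ~80 ms later

        xc = np.correlate(leg - leg.mean(), hammer - hammer.mean(), mode="full")
        lags = np.arange(-len(t) + 1, len(t))
        latency_ms = 1000.0 * lags[np.argmax(xc)] / fs  # ~80 ms for these signals
        response_amp = leg.max()                        # reflex response amplitude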

  6. Quantifying the Adaptive Cycle.

    PubMed

    Angeler, David G; Allen, Craig R; Garmestani, Ahjond S; Gunderson, Lance H; Hjerne, Olle; Winder, Monika

    2015-01-01

    The adaptive cycle was proposed as a conceptual model to portray patterns of change in complex systems. Despite the model having potential for elucidating change across systems, it has been used mainly as a metaphor, describing system dynamics qualitatively. We use a quantitative approach for testing premises (reorganisation, conservatism, adaptation) in the adaptive cycle, using Baltic Sea phytoplankton communities as an example of such complex system dynamics. Phytoplankton organizes in recurring spring and summer blooms, a well-established paradigm in planktology and succession theory, with characteristic temporal trajectories during blooms that may be consistent with adaptive cycle phases. We used long-term (1994-2011) data and multivariate analysis of community structure to assess key components of the adaptive cycle. Specifically, we tested predictions about: reorganisation: spring and summer blooms comprise distinct community states; conservatism: community trajectories during individual adaptive cycles are conservative; and adaptation: phytoplankton species during blooms change in the long term. All predictions were supported by our analyses. Results suggest that traditional ecological paradigms such as phytoplankton successional models have potential for moving the adaptive cycle from a metaphor to a framework that can improve our understanding of how complex systems organize and reorganize following collapse. Quantifying reorganization, conservatism and adaptation provides opportunities to cope with the intricacies and uncertainties associated with fast ecological change, driven by shifting system controls. Ultimately, combining traditional ecological paradigms with heuristics of complex system dynamics using quantitative approaches may help refine ecological theory and improve our understanding of the resilience of ecosystems. PMID:26716453

  9. Results.

    ERIC Educational Resources Information Center

    Zemsky, Robert; Shaman, Susan; Shapiro, Daniel B.

    2001-01-01

    Describes the Collegiate Results Instrument (CRI), which measures a range of collegiate outcomes for alumni 6 years after graduation. The CRI was designed to target alumni from institutions across market segments and assess their values, abilities, work skills, occupations, and pursuit of lifelong learning. (EV)

  10. Catalysis: Quantifying charge transfer

    NASA Astrophysics Data System (ADS)

    James, Trevor E.; Campbell, Charles T.

    2016-02-01

    Improving the design of catalytic materials for clean energy production requires a better understanding of their electronic properties, which remains experimentally challenging. Researchers now quantify the number of electrons transferred from metal nanoparticles to an oxide support as a function of particle size.

  11. Quantifying Faculty Workloads.

    ERIC Educational Resources Information Center

    Archer, J. Andrew

    Teaching load depends on many variables; however, most colleges define it strictly in terms of contact or credit hours. The failure to give weight to variables such as the number of preparations, the number of students served, and committee and other noninstructional assignments is usually due to the lack of a formula that will quantify the effects of these…

  12. Quantifying Ubiquitin Signaling

    PubMed Central

    Ordureau, Alban; Münch, Christian; Harper, J. Wade

    2015-01-01

    Ubiquitin (UB)-driven signaling systems permeate biology, and are often integrated with other types of post-translational modifications (PTMs), most notably phosphorylation. Flux through such pathways is typically dictated by the fractional stoichiometry of distinct regulatory modifications and protein assemblies as well as the spatial organization of pathway components. Yet, we rarely understand the dynamics and stoichiometry of rate-limiting intermediates along a reaction trajectory. Here, we review how quantitative proteomic tools and enrichment strategies are being used to quantify UB-dependent signaling systems, and to integrate UB signaling with regulatory phosphorylation events. A key regulatory feature of ubiquitylation is that the identity of UB chain linkage types can control downstream processes. We also describe how proteomic and enzymological tools can be used to identify and quantify UB chain synthesis and linkage preferences. The emergence of sophisticated quantitative proteomic approaches will set a new standard for elucidating biochemical mechanisms of UB-driven signaling systems. PMID:26000850

  13. Processing of Numerical and Proportional Quantifiers.

    PubMed

    Shikhare, Sailee; Heim, Stefan; Klein, Elise; Huber, Stefan; Willmes, Klaus

    2015-09-01

    Quantifier expressions like "many" and "at least" are part of a rich repository of words in language representing magnitude information. The role of numerical processing in comprehending quantifiers was studied in a semantic truth value judgment task, asking adults to quickly verify sentences about visual displays using numerical (at least seven, at least thirteen, at most seven, at most thirteen) or proportional (many, few) quantifiers. The visual displays were composed of systematically varied proportions of yellow and blue circles. The results demonstrated that numerical estimation and numerical reference information are fundamental in encoding the meaning of quantifiers in terms of response times and acceptability judgments. However, a difference emerges in the comparison strategies when a fixed external reference numerosity (seven or thirteen) is used for numerical quantifiers, whereas an internal numerical criterion is invoked for proportional quantifiers. Moreover, for both quantifier types, quantifier semantics and its polarity (positive vs. negative) biased the response direction (accept/reject). Overall, our results indicate that quantifier comprehension involves core numerical and lexical semantic properties, demonstrating integrated processing of language and numbers. PMID:25631283

  14. Quantifying light pollution

    NASA Astrophysics Data System (ADS)

    Cinzano, P.; Falchi, F.

    2014-05-01

    In this paper we review newly available indicators useful to quantify and monitor light pollution, defined as the alteration of the natural quantity of light in the night environment due to the introduction of man-made light. With the introduction of recent radiative transfer methods for the computation of light pollution propagation, several new indicators have become available. These indicators represent a primary step in light pollution quantification, beyond the bare evaluation of the night sky brightness, which is an observational effect integrated along the line of sight and thus lacking the three-dimensional information.

  15. Quantifying magma segregation in dykes

    NASA Astrophysics Data System (ADS)

    Yamato, P.; Duretz, T.; May, D. A.; Tartèse, R.

    2015-10-01

    The dynamics of magma flow is highly affected by the presence of a crystalline load. During magma ascent, it has been demonstrated that crystal-melt segregation constitutes a viable mechanism for magmatic differentiation. Moreover, crystal-melt segregation during magma transport has important implications not only for magma rheology, but also for differentiation of the continental crust. However, the influence of the crystal volume percentage (φ), and of crystal geometry, size and density, on crystal-melt segregation is still not well constrained. To address these issues, we performed a parametric study using 2D direct numerical simulations, which model the ascent of a crystal-bearing magma in a vertical dyke. Using these models, we have characterised the amount of segregation as a function of different physical properties, including φ, the density contrast between crystals and the melt phase (Δρ), the size of the crystals (Ac) and their aspect ratio (R). Results show that small values of R do not affect the segregation. In this case, the amount of segregation depends upon four parameters. Segregation is highest when Δρ and Ac are large, and lowest for a large pressure gradient (Pd) and/or large values of dyke width (Wd). These four parameters can be combined into a single one, the S number, which can be used to quantify the amount of segregation occurring during magma ascent. Based on systematic numerical modelling and dimensional analysis, we provide a first-order scaling law which allows quantification of the segregation for an arbitrary S number and φ, encompassing a wide range of typical parameters encountered in terrestrial magmatic systems. Although developed in a simplified system, this study has strong implications for our understanding of crystal segregation processes during magma transport. Our first-order scaling law makes it possible to determine immediately the amount of crystal-melt segregation occurring in any given magmatic…
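
    The abstract names the four controlling parameters and states that they collapse into a single dimensionless S number, but does not give its functional form. The grouping below is therefore a hypothetical placeholder, chosen only so that S grows with Δρ and Ac and shrinks with Pd and Wd, matching the reported trends:

      def s_number(delta_rho, A_c, P_d, W_d, g=9.81):
          """Hypothetical dimensionless grouping (not the published formula):
          delta_rho  crystal-melt density contrast (kg/m^3)
          A_c        crystal size, taken here as a cross-sectional area (m^2)
          P_d        driving pressure gradient (Pa/m)
          W_d        dyke width (m)"""
          return (delta_rho * g / P_d) * (A_c / W_d**2)

      # Dense, large crystals in a narrow dyke with a weak pressure gradient
      # give a larger S, i.e. more crystal-melt segregation during ascent.
      print(s_number(delta_rho=300.0, A_c=1e-4, P_d=2000.0, W_d=2.0))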

  16. On quantifying insect movements

    SciTech Connect

    Wiens, J.A.; Crist, T.O.; Milne, B.T.

    1993-08-01

    We elaborate on methods described by Turchin, Odendaal, and Rausher for quantifying insect movement pathways. We note the need to scale measurement resolution to the insects under study and the questions being asked, and we discuss the use of surveying instrumentation for recording sequential positions of individuals along pathways. We itemize several measures that may be used to characterize movement pathways and illustrate these by comparisons among several Eleodes beetles occurring in shortgrass steppe. The fractal dimension of pathways may provide insights not available from absolute measures of pathway configuration. Finally, we describe a renormalization procedure that may be used to remove sequential interdependence among locations of moving individuals while preserving the basic attributes of the pathway.
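
    The fractal dimension mentioned above is commonly estimated with the divider (compass) method: measure total pathway length at several step sizes and fit the power law L(step) ~ step^(1-D). A minimal sketch, assuming a pathway recorded as (x, y) coordinates (not the authors' implementation):

      import numpy as np

      def divider_length(path, step):
          """Pathway length measured with a divider of fixed step size."""
          pts = [path[0]]
          i = 0
          while i < len(path) - 1:
              d = np.linalg.norm(path - pts[-1], axis=1)
              nxt = np.where(d[i + 1:] >= step)[0]   # next fix >= step away
              if nxt.size == 0:
                  break
              i = i + 1 + nxt[0]
              pts.append(path[i])
          return step * (len(pts) - 1)

      def fractal_dimension(path, steps):
          """Estimate D from the log-log slope of L(step)."""
          lengths = [divider_length(path, s) for s in steps]
          slope, _ = np.polyfit(np.log(steps), np.log(lengths), 1)
          return 1.0 - slope

      rng = np.random.default_rng(0)
      walk = np.cumsum(rng.normal(size=(2000, 2)), axis=0)   # synthetic pathway
      print(fractal_dimension(walk, steps=[1, 2, 4, 8, 16]))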

  17. A new index quantifying the precipitation extremes

    NASA Astrophysics Data System (ADS)

    Busuioc, Aristita; Baciu, Madalina; Stoica, Cerasela

    2015-04-01

    …Meteorological Administration in Romania. These types of records contain the rainfall intensity (mm/minute) over various intervals for which it remains constant. The maximum intensity for each continuous rain over the May-August interval has been calculated for each year. The corresponding time series over the 1951-2008 period have been analysed in terms of their long-term trends and shifts in the mean; the results have been compared to those resulting from other rainfall indices based on daily and hourly data computed over the same interval, such as: total rainfall amount, maximum daily amount, contribution of total hourly amounts exceeding 10 mm/day, contribution of daily amounts exceeding the 90th percentile, and the 90th, 99th and 99.9th percentiles of 1-hour data. The results show that the proposed index exhibits a coherent and stronger climate signal (significant increase) for all analysed stations compared to the other indices associated with precipitation extremes, which show either no significant change or a weaker signal. This finding shows that the proposed index is the most appropriate for quantifying the climate change signal of precipitation extremes. We consider that this index is more naturally connected to the maximum intensity of a real rainfall event. The results presented in this study were funded by the Executive Agency for Higher Education, Research, Development and Innovation Funding (UEFISCDI) through the research project CLIMHYDEX, "Changes in climate extremes and associated impact in hydrological events in Romania", code PNII-ID-2011-2-0073 (http://climhydex.meteoromania.ro)
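
    The proposed index reduces to the annual maximum, over all continuous rains in May-August, of each rain's maximum constant-intensity segment; a simple least-squares slope then summarizes the long-term trend. A minimal sketch under a hypothetical data layout (the trend and shift tests used in the study are more elaborate):

      import numpy as np

      # Hypothetical layout: events[year] lists each continuous rain as the
      # sequence of constant-intensity segments (mm/minute) recorded for it.
      def annual_max_intensity(events):
          return {yr: max(max(rain) for rain in rains)
                  for yr, rains in events.items()}

      def linear_trend(index_by_year):
          years = np.array(sorted(index_by_year))
          vals = np.array([index_by_year[y] for y in years])
          slope, _ = np.polyfit(years, vals, 1)
          return slope                      # mm/minute per year

      events = {1951: [[0.2, 0.8, 0.4], [1.1]], 1952: [[0.5, 1.6]]}  # toy data
      idx = annual_max_intensity(events)
      print(idx, linear_trend(idx))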

  18. Quantifying social group evolution

    NASA Astrophysics Data System (ADS)

    Palla, Gergely; Barabási, Albert-László; Vicsek, Tamás

    2007-04-01

    The rich set of interactions between individuals in society results in complex community structure, capturing highly connected circles of friends, families or professional cliques in a social network. Thanks to frequent changes in the activity and communication patterns of individuals, the associated social and communication network is subject to constant evolution. Our knowledge of the mechanisms governing the underlying community dynamics is limited, but is essential for a deeper understanding of the development and self-optimization of society as a whole. We have developed an algorithm based on clique percolation that allows us to investigate the time dependence of overlapping communities on a large scale, and thus uncover basic relationships characterizing community evolution. Our focus is on networks capturing the collaboration between scientists and the calls between mobile phone users. We find that large groups persist for longer if they are capable of dynamically altering their membership, suggesting that an ability to change the group composition results in better adaptability. The behaviour of small groups displays the opposite tendency: the condition for stability is that their composition remains unchanged. We also show that knowledge of the time commitment of members to a given community can be used for estimating the community's lifetime. These findings offer insight into the fundamental differences between the dynamics of small groups and large institutions.
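
    The clique-percolation idea behind the algorithm is available in standard graph libraries. A minimal sketch using networkx's k_clique_communities, with a greedy overlap matching standing in for the authors' community-tracking step between snapshots:

      import networkx as nx
      from networkx.algorithms.community import k_clique_communities

      def communities(graph, k=3):
          """k-clique-percolation communities of one network snapshot."""
          return [frozenset(c) for c in k_clique_communities(graph, k)]

      def match(prev, curr):
          """Greedily pair each old community with its best-overlapping
          successor in the next snapshot (empty set if none survives)."""
          return {p: max(curr, key=lambda c: len(p & c), default=frozenset())
                  for p in prev}

      g1 = nx.karate_club_graph()
      g2 = g1.copy()
      g2.remove_node(0)                     # simulate a membership change
      print(match(communities(g1), communities(g2)))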

  19. Quantifier Comprehension in Corticobasal Degeneration

    ERIC Educational Resources Information Center

    McMillan, Corey T.; Clark, Robin; Moore, Peachie; Grossman, Murray

    2006-01-01

    In this study, we investigated patients with focal neurodegenerative diseases to examine a formal linguistic distinction between classes of generalized quantifiers, like "some X" and "less than half of X." Our model of quantifier comprehension proposes that number knowledge is required to understand both first-order and higher-order quantifiers.…

  20. How to quantify structural anomalies in fluids?

    PubMed

    Fomin, Yu D; Ryzhov, V N; Klumov, B A; Tsiok, E N

    2014-07-21

    Some fluids are known to behave anomalously. The so-called structural anomaly, which means that the fluid becomes less structured under isothermal compression, is among the most frequently discussed ones. Several methods for quantifying the degree of structural order are described in the literature and are used for calculating the region of structural anomaly. It is generally thought that all of the structural order determinations yield qualitatively identical results. However, no explicit comparison has been made. This paper presents such a comparison for the first time. The results of some definitions are shown to contradict the intuitive notion of a fluid. On the basis of this comparison, we show that the region of structural anomaly can be most reliably determined from the behavior of the excess entropy. PMID:25053327
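
    A widely used estimator of the leading two-body term of the excess entropy is the pair-correlation integral s2 = -2*pi*rho * ∫ [g ln g - g + 1] r^2 dr (in k_B units); a structural anomaly appears where s2 increases on isothermal compression. A minimal sketch of this standard formula (the paper itself may use a different estimator):

      import numpy as np

      def s2_excess_entropy(r, g, rho):
          """Two-body excess entropy per particle from g(r), in k_B units."""
          g = np.clip(g, 1e-12, None)                 # avoid log(0) in the core
          integrand = (g * np.log(g) - g + 1.0) * r**2
          dr = np.diff(r)
          integral = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * dr)
          return -2.0 * np.pi * rho * integral

      # Toy g(r): excluded core plus one smooth coordination peak near r = 1.1
      r = np.linspace(0.01, 5.0, 2000)
      g = np.where(r < 0.95, 0.0,
                   1.0 + 0.8 * np.exp(-((r - 1.1) / 0.15) ** 2))
      print(s2_excess_entropy(r, g, rho=0.8))          # negative, as expected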

  1. Quantifying Anderson's fault types

    USGS Publications Warehouse

    Simpson, R.W.

    1997-01-01

    Anderson [1905] explained three basic types of faulting (normal, strike-slip, and reverse) in terms of the shape of the causative stress tensor and its orientation relative to the Earth's surface. Quantitative parameters can be defined which contain information about both shape and orientation [Célérier, 1995], thereby offering a way to distinguish fault-type domains on plots of regional stress fields and to quantify, for example, the degree of normal-faulting tendencies within strike-slip domains. This paper offers a geometrically motivated generalization of Angelier's [1979, 1984, 1990] shape parameters Φ and Ψ to new quantities named AΦ and AΨ. In their simple forms, AΦ varies from 0 to 1 for normal, 1 to 2 for strike-slip, and 2 to 3 for reverse faulting, and AΨ ranges from 0° to 60°, 60° to 120°, and 120° to 180°, respectively. After scaling, AΦ and AΨ agree to within 2% (or 1°), a difference of little practical significance, although AΦ has smoother analytical properties. A formulation distinguishing horizontal axes as well as the vertical axis is also possible, yielding an AΦ ranging from -3 to +3 and AΨ from -180° to +180°. The geometrically motivated derivation in three-dimensional stress space presented here may aid intuition and offers a natural link with traditional ways of plotting yield and failure criteria. Examples are given, based on models of Bird [1996] and Bird and Kong [1994], of the use of Anderson fault parameters AΦ and AΨ for visualizing tectonic regimes defined by regional stress fields. Copyright 1997 by the American Geophysical Union.
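
    In the simple form described above, AΦ follows from Angelier's shape ratio Φ = (σ2 - σ3)/(σ1 - σ3) and the identity of the vertical principal stress. A minimal sketch, assuming the commonly cited form AΦ = (n + 0.5) + (-1)^n (Φ - 0.5) with n = 0, 1, 2 for normal, strike-slip, and reverse regimes:

      def a_phi(sigma1, sigma2, sigma3, vertical):
          """Fault-type parameter in [0, 3]; `vertical` names which principal
          stress is vertical: sigma1 (normal faulting), sigma2 (strike-slip),
          sigma3 (reverse)."""
          phi = (sigma2 - sigma3) / (sigma1 - sigma3)   # Angelier's shape ratio
          n = {"sigma1": 0, "sigma2": 1, "sigma3": 2}[vertical]
          return (n + 0.5) + (-1) ** n * (phi - 0.5)

      print(a_phi(100, 60, 20, "sigma1"))   # 0.5: mid-range normal faulting
      print(a_phi(100, 60, 20, "sigma2"))   # 1.5: mid-range strike-slip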

  2. Quantifying lateral tissue heterogeneities in hadron therapy

    SciTech Connect

    Pflugfelder, D.; Wilkens, J. J.; Szymanowski, H.; Oelfke, U.

    2007-04-15

    In radiotherapy with scanned particle beams, tissue heterogeneities lateral to the beam direction are problematic in two ways: they pose a challenge to dose calculation algorithms, and they lead to a high sensitivity to setup errors. In order to quantify and avoid these problems, a heterogeneity number H_i is introduced as a method to quantify the lateral tissue heterogeneities of a single beam spot i. To evaluate this new concept, two kinds of potential errors were investigated for single beam spots: First, the dose calculation error was obtained by comparing the dose distribution computed by a simple pencil beam algorithm to more accurate Monte Carlo simulations. The resulting error is clearly correlated with H_i. Second, the analysis of the sensitivity to setup errors of single beam spots also showed a dependence on H_i. From these data it is concluded that H_i can be used as a criterion to assess the risk of a compromised delivered dose due to lateral tissue heterogeneities. Furthermore, a method is presented for incorporating this information into the inverse planning process for intensity-modulated proton therapy. By suppressing beam spots with a high value of H_i, the unfavorable impact of lateral tissue heterogeneities can be reduced, leading to treatment plans which are more robust to dose calculation errors of the pencil beam algorithm. Additional possibilities to use the information of H_i are outlined in the discussion.

  3. Mountain torrents: Quantifying vulnerability and assessing uncertainties

    PubMed Central

    Totschnig, Reinhold; Fuchs, Sven

    2013-01-01

    Vulnerability assessment for elements at risk is an important component in the framework of risk assessment. The vulnerability of buildings affected by torrent processes can be quantified by vulnerability functions that express a mathematical relationship between the degree of loss of individual elements at risk and the intensity of the impacting process. Based on data from the Austrian Alps, we extended a vulnerability curve for residential buildings affected by fluvial sediment transport processes to other torrent processes and other building types. With respect to this goal to merge different data based on different processes and building types, several statistical tests were conducted. The calculation of vulnerability functions was based on a nonlinear regression approach applying cumulative distribution functions. The results suggest that there is no need to distinguish between different sediment-laden torrent processes when assessing vulnerability of residential buildings towards torrent processes. The final vulnerability functions were further validated with data from the Italian Alps and different vulnerability functions presented in the literature. This comparison showed the wider applicability of the derived vulnerability functions. The uncertainty inherent to regression functions was quantified by the calculation of confidence bands. The derived vulnerability functions may be applied within the framework of risk management for mountain hazards within the European Alps. The method is transferable to other mountain regions if the input data needed are available. PMID:27087696
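
    The abstract specifies a nonlinear regression with cumulative distribution functions but not which family; a Weibull CDF is one common choice for vulnerability curves. A minimal sketch with hypothetical intensity-loss pairs, where the parameter covariance gives the confidence bands mentioned above:

      import numpy as np
      from scipy.optimize import curve_fit

      def weibull_cdf(intensity, lam, k):
          """Degree of loss in [0, 1] as a function of process intensity."""
          return 1.0 - np.exp(-(intensity / lam) ** k)

      # Hypothetical (intensity, degree-of-loss) observations
      intensity = np.array([0.2, 0.5, 0.8, 1.0, 1.5, 2.0, 2.5])
      loss = np.array([0.02, 0.08, 0.20, 0.30, 0.55, 0.75, 0.85])

      (lam, k), cov = curve_fit(weibull_cdf, intensity, loss, p0=[1.5, 2.0])
      stderr = np.sqrt(np.diag(cov))   # basis for confidence bands
      print(lam, k, stderr)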

  4. Quantifying actin wave modulation on periodic topography

    NASA Astrophysics Data System (ADS)

    Guven, Can; Driscoll, Meghan; Sun, Xiaoyu; Parker, Joshua; Fourkas, John; Carlsson, Anders; Losert, Wolfgang

    2014-03-01

    Actin is the essential builder of the cell cytoskeleton, whose dynamics are responsible for generating the forces necessary for the formation of protrusions. By exposing amoeboid cells to periodic topographical cues, we show that actin can be directionally guided by inducing preferential polymerization waves. To quantify the dynamics of these actin waves and their interaction with the substrate, we modify a technique from computer vision called "optical flow." We obtain vectors that represent the apparent actin flow and cluster these vectors to obtain patches of newly polymerized actin, which represent actin waves. Using this technique, we compare experimental results, including the speed distribution of waves and the distance from the wave centroid to the closest ridge, with actin polymerization simulations. We hypothesize that modulation of the activity of nucleation promotion factors on ridges (elevated regions of the surface) is a potential mechanism for the wave-substrate coupling. Funded by NIH grant R01GM085574.
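
    A minimal sketch of the general pipeline: dense optical flow between consecutive fluorescence frames, then connected-component clustering of fast-moving pixels into candidate wave patches. OpenCV's Farneback flow stands in for whatever specific optical-flow variant the authors modified, and all parameter values are illustrative:

      import cv2
      import numpy as np
      from scipy import ndimage

      def actin_wave_patches(frame_prev, frame_next, speed_thresh=1.0):
          """Label candidate actin-wave patches between two 8-bit frames."""
          flow = cv2.calcOpticalFlowFarneback(frame_prev, frame_next, None,
                                              pyr_scale=0.5, levels=3,
                                              winsize=15, iterations=3,
                                              poly_n=5, poly_sigma=1.2, flags=0)
          speed = np.linalg.norm(flow, axis=2)           # apparent flow magnitude
          labels, n = ndimage.label(speed > speed_thresh)
          centroids = ndimage.center_of_mass(speed, labels, range(1, n + 1))
          return labels, centroids  # centroid-to-ridge distances follow from these

      # Usage with two grayscale frames f0, f1 (e.g. loaded via cv2.imread):
      # labels, centroids = actin_wave_patches(f0, f1)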

  5. Quantifying Aggressive Behavior in Zebrafish.

    PubMed

    Teles, Magda C; Oliveira, Rui F

    2016-01-01

    Aggression is a complex behavior that influences social relationships and can be seen as adaptive or maladaptive depending on the context and intensity of expression. A model organism suitable for genetic dissection of the underlying neural mechanisms of aggressive behavior is still needed. Zebrafish has already proven to be a powerful vertebrate model organism for the study of normal and pathological brain function. Despite the fact that zebrafish is a gregarious species that forms shoals, when allowed to interact in pairs, both males and females express aggressive behavior and establish dominance hierarchies. Here, we describe two protocols that can be used to quantify aggressive behavior in zebrafish, using two different paradigms: (1) staged fights between real opponents and (2) mirror-elicited fights. We also discuss the methodology for the behavior analysis, the expected results for both paradigms, and the advantages and disadvantages of each paradigm in face of the specific goals of the study. PMID:27464816

  6. Quantifying the quiet epidemic

    PubMed Central

    2014-01-01

    During the late 20th century numerical rating scales became central to the diagnosis of dementia and helped transform attitudes about its causes and prevalence. Concentrating largely on the development and use of the Blessed Dementia Scale, I argue that rating scales served professional ends during the 1960s and 1970s. They helped old age psychiatrists establish jurisdiction over conditions such as dementia and present their field as a vital component of the welfare state, where they argued that ‘reliable modes of diagnosis’ were vital to the allocation of resources. I show how these arguments appealed to politicians, funding bodies and patient groups, who agreed that dementia was a distinct disease and claimed research on its causes and prevention should be designated ‘top priority’. But I also show that worries about the replacement of clinical acumen with technical and depersonalized methods, which could conceivably be applied by anyone, led psychiatrists to stress that rating scales had their limits and could be used only by trained experts. PMID:25866448

  7. Quantifying nonisothermal subsurface soil water evaporation

    NASA Astrophysics Data System (ADS)

    Deol, Pukhraj; Heitman, Josh; Amoozegar, Aziz; Ren, Tusheng; Horton, Robert

    2012-11-01

    Accurate quantification of energy and mass transfer during soil water evaporation is critical for improving understanding of the hydrologic cycle and for many environmental, agricultural, and engineering applications. Drying of soil under radiation boundary conditions results in formation of a dry surface layer (DSL), which is accompanied by a shift in the position of the latent heat sink from the surface to the subsurface. Detailed investigation of evaporative dynamics within this active near-surface zone has mostly been limited to modeling, with few measurements available to test models. Soil column studies were conducted to quantify nonisothermal subsurface evaporation profiles using a sensible heat balance (SHB) approach. Eleven-needle heat pulse probes were used to measure soil temperature and thermal property distributions at the millimeter scale in the near-surface soil. Depth-integrated SHB evaporation rates were compared with mass balance evaporation estimates under controlled laboratory conditions. The results show that the SHB method effectively measured total subsurface evaporation rates with only 0.01-0.03 mm h-1 difference from mass balance estimates. The SHB approach also quantified millimeter-scale nonisothermal subsurface evaporation profiles over a drying event, which has not been previously possible. Thickness of the DSL was also examined using measured soil thermal conductivity distributions near the drying surface. Estimates of the DSL thickness were consistent with observed evaporation profile distributions from SHB. Estimated thickness of the DSL was further used to compute diffusive vapor flux. The diffusive vapor flux also closely matched both mass balance evaporation rates and subsurface evaporation rates estimated from SHB.
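
    The sensible heat balance treats latent heat as the residual of a layer's energy budget: conduction in, minus conduction out, minus the change in sensible heat storage. A minimal sketch with toy numbers (the actual method derives the fluxes and storage from heat-pulse measurements of temperature and thermal properties):

      def shb_evaporation(G_top, G_bot, dT_dt, C_v, dz, L_v=2.45e6):
          """Subsurface evaporation rate (mm per hour) for one soil layer:
          G_top, G_bot  conductive heat fluxes into/out of the layer (W m-2)
          dT_dt         layer warming rate (K s-1)
          C_v           volumetric heat capacity (J m-3 K-1)
          dz            layer thickness (m)
          L_v           latent heat of vaporization (J kg-1)"""
          latent = G_top - G_bot - C_v * dT_dt * dz  # W m-2 consumed by evaporation
          evap = latent / L_v                        # kg m-2 s-1, i.e. mm/s
          return evap * 3600.0

      # Toy numbers for a 6 mm layer within the drying zone:
      print(shb_evaporation(G_top=80.0, G_bot=55.0, dT_dt=2e-4,
                            C_v=2.0e6, dz=0.006))    # ~0.03 mm per hour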

  8. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  9. Children's interpretations of general quantifiers, specific quantifiers, and generics

    PubMed Central

    Gelman, Susan A.; Leslie, Sarah-Jane; Was, Alexandra M.; Koch, Christina M.

    2014-01-01

    Recently, several scholars have hypothesized that generics are a default mode of generalization, and thus that young children may at first treat quantifiers as if they were generic in meaning. To address this issue, the present experiment provides the first in-depth, controlled examination of the interpretation of generics compared to both general quantifiers ("all Xs", "some Xs") and specific quantifiers ("all of these Xs", "some of these Xs"). We provided children (3 and 5 years) and adults with explicit frequency information regarding properties of novel categories, to chart when "some", "all", and generics are deemed appropriate. The data reveal three main findings. First, even 3-year-olds distinguish generics from quantifiers. Second, when children make errors, they tend to be in the direction of treating quantifiers like generics. Third, children were more accurate when interpreting specific versus general quantifiers. We interpret these data as providing evidence for the position that generics are a default mode of generalization, especially when reasoning about kinds. PMID:25893205

  10. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    NASA Astrophysics Data System (ADS)

    Fang, T.; Verma, V.; Guo, H.; King, L. E.; Edgerton, E. S.; Weber, R. J.

    2014-07-01

    A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint in the routine use of the DTT assay for integrating it with large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (coefficient of variation of 12% for standards, 4% for ambient samples), and a reasonably low limit of detection (0.31 nmol min-1). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution and Epidemiology (SCAPE) to generate an extensive data set on DTT activity of ambient particles collected from contrasting environments (urban, road-side, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per-air-volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. However, the greater heterogeneity in the intrinsic DTT activity (per-PM-mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the instrument can also be used to determine oxidative potential with other acellular assays.
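
    At its core, DTT activity reduces to the blank-corrected slope of remaining DTT versus incubation time (in nmol min-1); the instrument automates the sampling, DTNB reaction, and absorbance reads that produce these numbers. A minimal sketch with hypothetical values:

      import numpy as np

      def dtt_activity(t_min, dtt_nmol, blank_rate=0.0):
          """DTT consumption rate (nmol/min), assuming roughly linear
          depletion over the assay window; `blank_rate` is the consumption
          measured without PM extract."""
          slope, _ = np.polyfit(t_min, dtt_nmol, 1)   # negative when depleting
          return -slope - blank_rate

      t = np.array([0, 10, 20, 30])                     # minutes
      remaining = np.array([100.0, 92.5, 85.4, 77.9])   # nmol DTT left (toy)
      print(dtt_activity(t, remaining, blank_rate=0.05))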

  11. Quantifying periodicity in omics data.

    PubMed

    Amariei, Cornelia; Tomita, Masaru; Murray, Douglas B

    2014-01-01

    Oscillations play a significant role in biological systems, with many examples in the fast, ultradian, circadian, circalunar, and yearly time domains. However, determining periodicity in such data can be problematic. There are a number of computational methods to identify the periodic components in large datasets, such as signal-to-noise-based Fourier decomposition, Fisher's g-test, and autocorrelation. However, the available methods assume a sinusoidal model and do not attempt to quantify the waveform shape or the presence of multiple periodicities, which provide vital clues in determining the underlying dynamics. Here, we developed a Fourier-based measure that generates a de-noised waveform from multiple significant frequencies. This waveform is then correlated with the raw data from the respiratory oscillation found in yeast to provide oscillation statistics, including waveform metrics and multiple periods. The method is compared and contrasted with commonly used statistics. Moreover, we show the utility of the program in the analysis of noisy datasets and other high-throughput analyses, such as metabolomics and flow cytometry.
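
    A minimal sketch of the Fourier de-noising idea: keep only the strongest nonzero frequencies, invert the spectrum, and correlate the reconstructed waveform with the raw trace. The paper's significance test for selecting frequencies is replaced here by a simple top-k rule:

      import numpy as np

      def denoised_waveform(x, keep=3):
          """De-noised waveform from the `keep` largest-amplitude frequencies,
          plus its Pearson correlation with the (mean-centered) raw signal."""
          centered = x - x.mean()
          spec = np.fft.rfft(centered)
          amp = np.abs(spec)
          amp[0] = 0.0                                  # ignore the DC term
          top = np.argsort(amp)[-keep:]                 # strongest frequencies
          filtered = np.zeros_like(spec)
          filtered[top] = spec[top]
          wave = np.fft.irfft(filtered, n=len(x))
          r = np.corrcoef(wave, centered)[0, 1]
          return wave, r

      t = np.linspace(0, 10, 1000)
      signal = np.sin(2 * np.pi * 0.5 * t) + 0.3 * np.sin(2 * np.pi * 1.5 * t)
      noisy = signal + np.random.default_rng(1).normal(0, 0.5, t.size)
      wave, r = denoised_waveform(noisy, keep=2)
      print(round(r, 3))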

  12. Public medical shows.

    PubMed

    Walusinski, Olivier

    2014-01-01

    In the second half of the 19th century, Jean-Martin Charcot (1825-1893) became famous for the quality of his teaching and his innovative neurological discoveries, bringing many French and foreign students to Paris. A hunger for recognition, together with progressive and anticlerical ideals, led Charcot to invite writers, journalists, and politicians to his lessons, during which he presented the results of his work on hysteria. These events became public performances, for which physicians and patients were transformed into actors. Major newspapers ran accounts of these consultations, more like theatrical shows in some respects. The resultant enthusiasm prompted other physicians in Paris and throughout France to try and imitate them. We will compare the form and substance of Charcot's lessons with those given by Jules-Bernard Luys (1828-1897), Victor Dumontpallier (1826-1899), Ambroise-Auguste Liébault (1823-1904), Hippolyte Bernheim (1840-1919), Joseph Grasset (1849-1918), and Albert Pitres (1848-1928). We will also note their impact on contemporary cinema and theatre. PMID:25273491

  13. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    NASA Astrophysics Data System (ADS)

    Fang, T.; Verma, V.; Guo, H.; King, L. E.; Edgerton, E. S.; Weber, R. J.

    2015-01-01

    A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint in the routine use of the DTT assay for integrating it with large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (coefficient of variation of 15% for positive control, 4% for ambient samples) and reasonably low limit of detection (0.31 nmol min-1). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution & Epidemiology (SCAPE) to generate an extensive data set on DTT activity of ambient particles collected from contrasting environments (urban, roadside, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per-air-volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. The correlation may also suggest a mechanistic explanation (oxidative stress) for observed PM2.5 mass-health associations. The heterogeneity in the intrinsic DTT activity (per-PM-mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the instrument can also be used to determine oxidative potential with other acellular assays.

  14. Quantifying renewable groundwater stress with GRACE

    PubMed Central

    Richey, Alexandra S.; Thomas, Brian F.; Lo, Min‐Hui; Reager, John T.; Voss, Katalyn; Swenson, Sean; Rodell, Matthew

    2015-01-01

    Groundwater is an increasingly important water supply source globally. Understanding the amount of groundwater used versus the volume available is crucial to evaluate future water availability. We present a groundwater stress assessment to quantify the relationship between groundwater use and availability in the world's 37 largest aquifer systems. We quantify stress according to a ratio of groundwater use to availability, which we call the Renewable Groundwater Stress ratio. The impact of quantifying groundwater use based on nationally reported groundwater withdrawal statistics is compared to a novel approach to quantify use based on remote sensing observations from the Gravity Recovery and Climate Experiment (GRACE) satellite mission. Four characteristic stress regimes are defined: Overstressed, Variable Stress, Human-dominated Stress, and Unstressed. The regimes are a function of the sign of use (positive or negative) and the sign of groundwater availability, defined as mean annual recharge. The ability to mitigate and adapt to stressed conditions, where use exceeds sustainable water availability, is a function of economic capacity and land use patterns. Therefore, we qualitatively explore the relationship between stress and anthropogenic biomes. We find that estimates of groundwater stress based on withdrawal statistics are unable to capture the range of characteristic stress regimes, especially in regions dominated by sparsely populated biome types with limited cropland. GRACE-based estimates of use and stress can holistically quantify the impact of groundwater use on stress, resulting in both greater magnitudes of stress and more variability of stress between regions. PMID:26900185
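
    The stress metric itself is simple arithmetic; a minimal sketch (the paper's assignment of the four regimes from the signs of use and recharge is described above and not reproduced here):

      def rgs_ratio(use, recharge):
          """Renewable Groundwater Stress ratio = groundwater use divided by
          availability (mean annual recharge); the signs of the two inputs
          determine the characteristic stress regime."""
          return use / recharge

      # Example: use of 20 km^3/yr against 5 km^3/yr of recharge gives 4.0,
      # i.e. use far exceeds sustainable availability (overstressed conditions).
      print(rgs_ratio(20.0, 5.0))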

  15. Television Quiz Show Simulation

    ERIC Educational Resources Information Center

    Hill, Jonnie Lynn

    2007-01-01

    This article explores the simulation of four television quiz shows for students in China studying English as a foreign language (EFL). It discusses the adaptation and implementation of television quiz shows and how the students reacted to them.

  16. The Great Cometary Show

    NASA Astrophysics Data System (ADS)

    2007-01-01

    The ESO Very Large Telescope Interferometer, which allows astronomers to scrutinise objects with a precision equivalent to that of a 130-m telescope, is proving itself an unequalled success every day. One of the latest instruments installed, AMBER, has led to a flurry of scientific results, an anthology of which is being published this week as special features in the research journal Astronomy & Astrophysics. ESO PR Photo 06a/07: The AMBER Instrument. "With its unique capabilities, the VLT Interferometer (VLTI) has created itself a niche in which it provides answers to many astronomical questions, from the shape of stars, to discs around stars, to the surroundings of the supermassive black holes in active galaxies," says Jorge Melnick (ESO), the VLT Project Scientist. The VLTI has already led to 55 scientific papers and is in fact producing more than half of the interferometric results worldwide. "With the capability of AMBER to combine up to three of the 8.2-m VLT Unit Telescopes, we can really achieve what nobody else can do," added Fabien Malbet, from the LAOG (France) and the AMBER Project Scientist. Eleven articles will appear this week in Astronomy & Astrophysics' special AMBER section. Three of them describe the unique instrument, while the other eight reveal completely new results about the early and late stages in the life of stars. ESO PR Photo 06b/07: The Inner Winds of Eta Carinae. The first results presented in this issue cover various fields of stellar and circumstellar physics. Two papers deal with very young solar-like stars, offering new information about the geometry of the surrounding discs and associated outflowing winds. Other articles are devoted to the study of hot active stars of particular interest: Alpha Arae, Kappa Canis Majoris, and CPD -57°2874. They provide new, precise information about their rotating gas envelopes. An important new result concerns the enigmatic object Eta Carinae. Using AMBER with…

  17. Covariation and Quantifier Polarity: What Determines Causal Attribution in Vignettes?

    ERIC Educational Resources Information Center

    Majid, Asifa; Sanford, Anthony J.; Pickering, Martin J.

    2006-01-01

    Tests of causal attribution often use verbal vignettes, with covariation information provided through statements quantified with natural language expressions. The effect of covariation information has typically been taken to show that set size information affects attribution. However, recent research shows that quantifiers provide information…

  18. Detecting, visualising, and quantifying mucins.

    PubMed

    Harrop, Ceri A; Thornton, David J; McGuckin, Michael A

    2012-01-01

    The extreme size, extensive glycosylation, and gel-forming nature of mucins make them a challenge to work with, and methodologies for the detection of mucins must take these features into consideration to ensure that one obtains both accurate and meaningful results. Understanding and appreciating the nature of mucins affords the researcher a valuable toolkit which can be used to full advantage in detecting, quantifying, and visualising mucins. The employment of a combinatorial approach to mucin detection, using antibody, chemical, and lectin detection methods, allows important information to be gleaned regarding the size, extent of glycosylation, specific mucin species, and distribution of mucins within a given sample. In this chapter, the researcher is guided through considerations into the structure of mucins and how this both affects the detection of mucins and can be used to full advantage. Techniques including ELISA, dot/slot blotting, and Western blotting, use of lectins and antibodies in mucin detection on membranes, as well as immunohistochemistry and immunofluorescence on both tissues and cells grown on Transwell™ inserts, are described. Notes accompanying each section advise the researcher on best practice and describe any associated limitations of a particular technique, from which the researcher can further develop a particular protocol.

  19. A flow cytometric approach to quantify biofilms.

    PubMed

    Kerstens, Monique; Boulet, Gaëlle; Van Kerckhoven, Marian; Clais, Sofie; Lanckacker, Ellen; Delputte, Peter; Maes, Louis; Cos, Paul

    2015-07-01

    Since biofilms are important in many clinical, industrial, and environmental settings, reliable methods to quantify these sessile microbial populations are crucial. Most of the currently available techniques do not allow the enumeration of the viable cell fraction within the biofilm and are often time consuming. This paper proposes flow cytometry (FCM) using the single-stain viability dye TO-PRO®-3 iodide as a fast and precise alternative. Mature biofilms of Candida albicans and Escherichia coli were used to optimize biofilm removal and dissociation, as a single-cell suspension is needed for accurate FCM enumeration. To assess the feasibility of FCM quantification of biofilms, E. coli and C. albicans biofilms were analyzed using FCM and crystal violet staining at different time points. A combination of scraping and rinsing proved to be the most efficient technique for biofilm removal. Sonicating for 10 min eliminated the remaining aggregates, resulting in a single-cell suspension. Repeated FCM measurements of biofilm samples revealed a good intraday precision of approximately 5%. FCM quantification and the crystal violet assay yielded similar biofilm growth curves for both microorganisms, confirming the applicability of our technique. These results show that FCM using TO-PRO®-3 iodide as a single-stain viability dye is a valid fast alternative for the quantification of viable cells in a biofilm.

  20. Quantifying foot deformation using finite helical angle.

    PubMed

    Pothrat, Claude; Goislard de Monsabert, Benjamin; Vigouroux, Laurent; Viehweger, Elke; Berton, Eric; Rao, Guillaume

    2015-10-15

    Foot intrinsic motion originates from the combination of numerous joint motions, giving this segment a high adaptive ability. Existing foot kinematic models mostly focus on analyzing small-scale bone-to-bone motions, which requires both complex experimental methodology and complex interpretative work to assess global foot functionality. This study proposes a method to assess total foot deformation by calculating a helical angle from the relative motions of the rearfoot and the forefoot. The method requires a limited number of retro-reflective markers placed on the foot and was tested for five different movements (walking, forefoot-impact running, heel-impact running, 90° cutting, and 180° U-turn) and 12 participants. Intraclass correlation coefficients over time were calculated to quantify the repeatability of the helical angle pattern for each movement. Our results indicated that the method was suitable for distinguishing the different motions, as different helical angle amplitudes were observed according to the flexibility required in each movement. Moreover, the results showed that repeatability could be used to identify how well each motion was mastered, as repeatability was high for well-mastered movements. Together with existing methods, this new protocol could be applied to fully assess foot function in sport or clinical contexts.
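
    The finite helical angle can be read off the relative rotation between the two segment frames. A minimal sketch, assuming each segment's orientation is available as a 3x3 rotation matrix derived from its marker cluster:

      import numpy as np
      from scipy.spatial.transform import Rotation

      def finite_helical_angle(R_rearfoot, R_forefoot):
          """Helical (screw) rotation angle, in degrees, of the forefoot
          relative to the rearfoot."""
          R_rel = R_forefoot @ R_rearfoot.T             # relative rotation
          angle = np.arccos((np.trace(R_rel) - 1.0) / 2.0)
          return np.degrees(angle)

      # Toy frames: rearfoot fixed, forefoot twisted 12 degrees about x
      R_rear = np.eye(3)
      R_fore = Rotation.from_euler("x", 12, degrees=True).as_matrix()
      print(finite_helical_angle(R_rear, R_fore))       # ~12.0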

  1. Stretched View Showing 'Victoria'

    NASA Technical Reports Server (NTRS)

    2006-01-01

    [Figure removed for brevity; see original site.]

    This pair of images from the panoramic camera on NASA's Mars Exploration Rover Opportunity served as initial confirmation that the two-year-old rover is within sight of 'Victoria Crater,' which it has been approaching for more than a year. Engineers on the rover team were unsure whether Opportunity would make it as far as Victoria, but scientists hoped for the chance to study such a large crater with their roving geologist. Victoria Crater is 800 meters (nearly half a mile) in diameter, about six times wider than 'Endurance Crater,' where Opportunity spent several months in 2004 examining rock layers affected by ancient water.

    When scientists using orbital data calculated that they should be able to detect Victoria's rim in rover images, they scrutinized frames taken in the direction of the crater by the panoramic camera. To positively characterize the subtle horizon profile of the crater and some of the features leading up to it, researchers created a vertically-stretched image (top) from a mosaic of regular frames from the panoramic camera (bottom), taken on Opportunity's 804th Martian day (April 29, 2006).

    The stretched image makes mild nearby dunes look like more threatening peaks, but that is only a result of the exaggerated vertical dimension. This vertical stretch technique was first applied to Viking Lander 2 panoramas by Philip Stooke, of the University of Western Ontario, Canada, to help locate the lander with respect to orbiter images. Vertically stretching the image allows features to be more readily identified by the Mars Exploration Rover science team.

    The bright white dot near the horizon to the right of center (barely visible without labeling or zoom-in) is thought to be a light-toned outcrop on the far wall of the crater, suggesting that the rover can see over the low rim of Victoria. In figure 1, the northeast and southeast rims are labeled

  2. A Holographic Road Show.

    ERIC Educational Resources Information Center

    Kirkpatrick, Larry D.; Rugheimer, Mac

    1979-01-01

    Describes the viewing sessions and the holograms of a holographic road show. The traveling exhibits, believed to stimulate interest in physics, include a wide variety of holograms and demonstrate several physical principles. (GA)

  3. The Ozone Show.

    ERIC Educational Resources Information Center

    Mathieu, Aaron

    2000-01-01

    Uses a talk show activity for a final assessment tool for students to debate about the ozone hole. Students are assessed on five areas: (1) cooperative learning; (2) the written component; (3) content; (4) self-evaluation; and (5) peer evaluation. (SAH)

  4. Show What You Know

    ERIC Educational Resources Information Center

    Eccleston, Jeff

    2007-01-01

    Big things come in small packages. This saying came to the mind of the author after he created a simple math review activity for his fourth grade students. Though simple, it has proven to be extremely advantageous in reinforcing math concepts. He uses this activity, which he calls "Show What You Know," often. This activity provides the perfect…

  5. Showing What They Know

    ERIC Educational Resources Information Center

    Cech, Scott J.

    2008-01-01

    Having students show their skills in three dimensions, known as performance-based assessment, dates back at least to Socrates. Individual schools such as Barrington High School--located just outside of Providence--have been requiring students to actively demonstrate their knowledge for years. Rhode Island's high school graduating class became…

  6. Stage a Water Show

    ERIC Educational Resources Information Center

    Frasier, Debra

    2008-01-01

    In the author's book titled "The Incredible Water Show," the characters from "Miss Alaineus: A Vocabulary Disaster" used an ocean of information to stage an inventive performance about the water cycle. In this article, the author relates how she turned the story into hands-on science teaching for real-life fifth-grade students. The author also…

  7. What Do Maps Show?

    ERIC Educational Resources Information Center

    Geological Survey (Dept. of Interior), Reston, VA.

    This curriculum packet, appropriate for grades 4-8, features a teaching poster which shows different types of maps (different views of Salt Lake City, Utah), as well as three reproducible maps and reproducible activity sheets which complement the maps. The poster provides teacher background, including step-by-step lesson plans for four geography…

  8. Obesity in show cats.

    PubMed

    Corbee, R J

    2014-12-01

    Obesity is an important disease with a high prevalence in cats. Because obesity is related to several other diseases, it is important to identify the population at risk. Several risk factors for obesity have been described in the literature. A higher incidence of obesity in certain cat breeds has been suggested. The aim of this study was to determine whether obesity occurs more often in certain breeds. The second aim was to relate the increased prevalence of obesity in certain breeds to the official standards of that breed. To this end, 268 cats of 22 different breeds were investigated by determining their body condition score (BCS) on a nine-point scale, by inspection and palpation, at two different cat shows. Overall, 45.5% of the show cats had a BCS > 5, and 4.5% had a BCS > 7. There were significant differences between breeds, which could be related to the breed standards. Most overweight and obese cats were in the neutered group. Firm discussions with breeders and cat show judges are warranted to reach different interpretations of the standards, in order to prevent overweight conditions in certain breeds from becoming the standard of beauty. Neutering predisposes cats to obesity and requires early nutritional intervention to prevent obese conditions. PMID:24612018

  9. Show Me the Way

    ERIC Educational Resources Information Center

    Dicks, Matthew J.

    2005-01-01

    Because today's students have grown up steeped in video games and the Internet, most of them expect feedback, and usually gratification, very soon after they expend effort on a task. Teachers can get quick feedback to students by showing them videotapes of their learning performances. The author, a 3rd grade teacher describes how the seemingly…

  10. The Art Show

    ERIC Educational Resources Information Center

    Scolarici, Alicia

    2004-01-01

    This article describes what was once thought to be impossible--a formal art show extravaganza at an elementary school with 1,000 students, a Department of Defense Dependent School (DODDS) located overseas, on RAF Lakenheath, England. The dream of this event involved the transformation of the school cafeteria into an elegant art show…

  11. Honored Teacher Shows Commitment.

    ERIC Educational Resources Information Center

    Ratte, Kathy

    1987-01-01

    Part of the acceptance speech of the 1985 National Council for the Social Studies Teacher of the Year, this article describes the censorship experience of this honored social studies teacher. The incident involved the showing of a videotape version of the feature film entitled "The Seduction of Joe Tynan." (JDH)

  12. Competition during the processing of quantifier scope ambiguities: evidence from eye movements during reading.

    PubMed

    Paterson, Kevin B; Filik, Ruth; Liversedge, Simon P

    2008-03-01

    We investigated the processing of sentences containing a quantifier scope ambiguity, such as Kelly showed a photo to each critic, which is ambiguous between the indefinite phrase (a photo) having one or many referents. Ambiguity resolution requires the computation of relative quantifier scope, with either a photo or each critic taking wide scope, thereby determining the number of referents. Using eye tracking, we established that multiple factors, including the grammatical function and surface linear order of quantified phrases, along with their lexical characteristics, interact during the processing of relative quantifier scope, with conflict between factors incurring a processing cost. We discuss the results in terms of theoretical accounts attributing sentence-processing difficulty to either reanalysis (e.g., Fodor, 1982) or competition between rival analyses (e.g., Kurtzman & MacDonald, 1993).

  13. Quantifying cognitive decrements caused by cranial radiotherapy.

    PubMed

    Christie, Lori-Ann; Acharya, Munjal M; Limoli, Charles L

    2011-01-01

    With the exception of survival, cognitive impairment stemming from the clinical management of cancer is a major factor dictating therapeutic outcome. For many patients afflicted with CNS and non-CNS malignancies, radiotherapy and chemotherapy offer the best options for disease control. These treatments however come at a cost, and nearly all cancer survivors (~11 million in the US alone as of 2006) incur some risk for developing cognitive dysfunction, with the most severe cases found in patients subjected to cranial radiotherapy (~200,000/yr) for the control of primary and metastatic brain tumors. Particularly problematic are pediatric cases, whose long-term survival plagued with marked cognitive decrements results in significant socioeconomic burdens. To date, there are still no satisfactory solutions to this significant clinical problem. We have addressed this serious health concern using transplanted stem cells to combat radiation-induced cognitive decline in athymic rats subjected to cranial irradiation. Details of the stereotaxic irradiation and the in vitro culturing and transplantation of human neural stem cells (hNSCs) can be found in our companion paper (Acharya et al., JoVE reference). Following irradiation and transplantation surgery, rats are then assessed for changes in cognition, grafted cell survival and expression of differentiation-specific markers 1 and 4-months after irradiation. To critically evaluate the success or failure of any potential intervention designed to ameliorate radiation-induced cognitive sequelae, a rigorous series of quantitative cognitive tasks must be performed. To accomplish this, we subject our animals to a suite of cognitive testing paradigms including novel place recognition, water maze, elevated plus maze and fear conditioning, in order to quantify hippocampal and non-hippocampal learning and memory. We have demonstrated the utility of these tests for quantifying specific types of cognitive decrements in irradiated animals

  14. Children’s developing intuitions about the truth conditions and implications of novel generics vs. quantified statements

    PubMed Central

    Brandone, Amanda C.; Gelman, Susan A; Hedglen, Jenna

    2014-01-01

    Generic statements express generalizations about categories and present a unique semantic profile that is distinct from quantified statements. This paper reports two studies examining the development of children’s intuitions about the semantics of generics and how they differ from statements quantified by all, most, and some. Results reveal that, like adults, preschoolers (1) recognize that generics have flexible truth conditions and are capable of representing a wide range of prevalence levels; and (2) interpret novel generics as having near-universal prevalence implications. Results further show that by age 4, children are beginning to differentiate the meaning of generics and quantified statements; however, even 7- to 11-year-olds are not adult-like in their intuitions about the meaning of most-quantified statements. Overall, these studies suggest that by preschool, children interpret generics in much the same way that adults do; however, mastery of the semantics of quantified statements follows a more protracted course. PMID:25297340

  15. Use of the Concept of Equivalent Biologically Effective Dose (BED) to Quantify the Contribution of Hyperthermia to Local Tumor Control in Radiohyperthermia Cervical Cancer Trials, and Comparison With Radiochemotherapy Results

    SciTech Connect

    Plataniotis, George A.; Dale, Roger G.

    2009-04-01

    Purpose: To express the magnitude of the contribution of hyperthermia to local tumor control in radiohyperthermia (RT/HT) cervical cancer trials, in terms of the radiation-equivalent biologically effective dose (BED), and to explore the potential of the combined modalities in the treatment of this neoplasm. Materials and Methods: Local control rates of both arms of each study (RT vs. RT+HT) reported from randomized controlled trials (RCTs) on concurrent RT/HT for cervical cancer were reviewed. By comparing the two tumor control probabilities (TCPs) from each study, we calculated the HT-related log cell kill and then expressed it in terms of the number of 2-Gy fraction equivalents, for a range of tumor volumes and radiosensitivities. We have compared the contribution of each modality and made some exploratory calculations on the TCPs that might be expected from a combined trimodality treatment (RT+CT+HT). Results: The HT-equivalent number of 2-Gy fractions ranges from 0.6 to 4.8, depending on radiosensitivity. Opportunities for clinically detectable improvement by the addition of HT are only available in tumors with an alpha value in the approximate range of 0.22-0.28 Gy-1. A combined treatment (RT+CT+HT) is not expected to improve prognosis in radioresistant tumors. Conclusion: The most significant improvements in TCP, which may result from the combination of RT/CT/HT for locally advanced cervical carcinomas, are likely to be limited only to those patients with tumors of relatively low-intermediate radiosensitivity.
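
    A minimal sketch of the Poisson-TCP/BED arithmetic implied above: the extra log cell kill is inferred from the two control rates, then divided by the log kill of one 2-Gy fraction under the linear-quadratic model. Parameter values are illustrative, not the paper's exact computation:

      import numpy as np

      def ht_equivalent_fractions(tcp_rt, tcp_rtht, alpha, alpha_beta=10.0, d=2.0):
          """Hyperthermia contribution expressed as an equivalent number of
          2-Gy fractions, assuming TCP = exp(-N_surviving) and LQ killing."""
          extra_log_kill = np.log(-np.log(tcp_rt)) - np.log(-np.log(tcp_rtht))
          log_kill_per_fraction = alpha * d * (1.0 + d / alpha_beta)
          return extra_log_kill / log_kill_per_fraction

      # Example: local control improving from 50% to 65% with hyperthermia,
      # for a tumor with alpha = 0.25 Gy-1, is worth under one 2-Gy fraction.
      print(ht_equivalent_fractions(0.50, 0.65, alpha=0.25))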

  16. Quantifying uncertainty in stable isotope mixing models

    DOE PAGES

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-19

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: SIAR [Parnell et al., 2010], a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated mixing fractions.
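
    A minimal pure-Monte-Carlo (PMC-style) sketch for a two-tracer, three-source nitrate problem: sample candidate mixtures from a Dirichlet prior and keep those that reproduce the sample isotopes within a tolerance. Source signatures and the tolerance are hypothetical, and rejection against a fixed tolerance stands in for a full likelihood treatment:

      import numpy as np

      rng = np.random.default_rng(42)

      # Illustrative source means for (d15N, d18O), in permil
      sources = np.array([[ 0.0, -5.0],     # e.g. fertilizer
                          [10.0,  5.0],     # e.g. manure/septic
                          [ 3.0, 40.0]])    # e.g. atmospheric deposition
      sample = np.array([5.0, 8.0])
      tol = 1.0                              # acceptance tolerance (permil)

      fracs = rng.dirichlet(np.ones(3), size=200_000)    # candidate mixtures
      mixed = fracs @ sources                            # predicted isotopes
      ok = np.all(np.abs(mixed - sample) < tol, axis=1)
      accepted = fracs[ok]
      print(accepted.mean(axis=0), accepted.std(axis=0)) # fractions +/- spread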

  17. Quantifying uncertainty in stable isotope mixing models

    NASA Astrophysics Data System (ADS)

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-01

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: Stable Isotope Analysis in R (SIAR), a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated mixing fractions.

  18. Quantifying uncertainty in stable isotope mixing models

    SciTech Connect

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-19

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: SIAR [Parnell et al., 2010], a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated mixing fractions.

  19. Taking in a Show.

    PubMed

    Boden, Timothy W

    2016-01-01

    Many medical practices have cut back on education and staff development expenses, especially those costs associated with conventions and conferences. But there are hard-to-value returns on your investment in these live events--beyond the obvious benefits of acquired knowledge and skills. Major vendors still exhibit their services and wares at many events, and the exhibit hall is a treasure-house of information and resources for the savvy physician or administrator. Make and stick to a purposeful plan to exploit the trade show. You can compare products, gain new insights and ideas, and even negotiate better deals with representatives anxious to realize returns on their exhibition investments. PMID:27249887

  1. Obesity in show dogs.

    PubMed

    Corbee, R J

    2013-10-01

    Obesity is an important disease with a growing incidence. Because obesity is related to several other diseases, and decreases life span, it is important to identify the population at risk. Several risk factors for obesity have been described in the literature. A higher incidence of obesity in certain breeds is often suggested. The aim of this study was to determine whether obesity occurs more often in certain breeds. The second aim was to relate the increased prevalence of obesity in certain breeds to the official standards of that breed. To this end, we investigated 1379 dogs of 128 different breeds by determining their body condition score (BCS). Overall, 18.6% of the show dogs had a BCS >5, and 1.1% of the show dogs had a BCS>7. There were significant differences between breeds, which could be correlated to the breed standards. It warrants firm discussions with breeders and judges in order to come to different interpretations of the standards to prevent overweight conditions from being the standard of beauty. PMID:22882163

  2. Quantifying the mutational meltdown in diploid populations.

    PubMed

    Coron, Camille; Méléard, Sylvie; Porcher, Emmanuelle; Robert, Alexandre

    2013-05-01

    Mutational meltdown, in which demographic and genetic processes mutually reinforce one another to accelerate the extinction of small populations, has been poorly quantified despite its potential importance in conservation biology. Here we present a model-based framework to study and quantify the mutational meltdown in a finite diploid population that is evolving continuously in time and subject to resource competition. We model slightly deleterious mutations affecting the population demographic parameters and study how the rate of mutation fixation increases as the genetic load increases, a process that we investigate at two timescales: an ecological scale and a mutational scale. Unlike most previous studies, we treat population size as a random process in continuous time. We show that as deleterious mutations accumulate, the decrease in mean population size accelerates with time relative to a null model with a constant mean fixation time. We quantify this mutational meltdown via the change in the mean fixation time after each new mutation fixation, and we show that the meltdown appears less severe than predicted by earlier theoretical work. We also emphasize that mean population size alone can be a misleading index of the risk of population extinction, which could be better evaluated with additional information on demographic parameters.

  3. Not a "reality" show.

    PubMed

    Wrong, Terence; Baumgart, Erica

    2013-01-01

    The authors of the preceding articles raise legitimate questions about patient and staff rights and the unintended consequences of allowing ABC News to film inside teaching hospitals. We explain why we regard their fears as baseless and not supported by what we heard from individuals portrayed in the filming, our decade-long experience making medical documentaries, and the full un-aired context of the scenes shown in the broadcast. The authors don't and can't know what conversations we had, what documents we reviewed, and what protections we put in place in each televised scene. Finally, we hope to correct several misleading examples cited by the authors as well as their offhand mischaracterization of our program as a "reality" show. PMID:23631336

  5. Quantifying reliability uncertainty : a proof of concept.

    SciTech Connect

    Diegert, Kathleen V.; Dvorack, Michael A.; Ringland, James T.; Mundt, Michael Joseph; Huzurbazar, Aparna; Lorio, John F.; Fatherley, Quinn; Anderson-Cook, Christine; Wilson, Alyson G.; Zurn, Rena M.

    2009-10-01

    This paper develops Classical and Bayesian methods for quantifying the uncertainty in reliability for a system of mixed series and parallel components for which both go/no-go and variables data are available. Classical methods focus on uncertainty due to sampling error. Bayesian methods can explore both sampling error and other knowledge-based uncertainties. To date, the reliability community has focused on qualitative statements about uncertainty because there was no consensus on how to quantify them. This paper provides a proof of concept that workable, meaningful quantification methods can be constructed. In addition, the application of the methods demonstrated that the results from the two fundamentally different approaches can be quite comparable. In both approaches, results are sensitive to the details of how one handles components for which no failures have been seen in relatively few tests.
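
    As a minimal sketch of the Bayesian half of this idea, consider a toy system of two parallel components in series with a third, with invented go/no-go data and independent Beta(1, 1) priors; none of this reflects the paper's actual system or data.

        import numpy as np

        rng = np.random.default_rng(0)

        # Invented go/no-go data: (successes, trials) per component.
        tests = {"A": (48, 50), "B": (29, 30), "C": (20, 20)}  # C: no failures

        def posterior(s, n, size, a=1.0, b=1.0):
            """Beta posterior draws for a pass/fail reliability."""
            return rng.beta(a + s, b + (n - s), size)

        n = 100_000
        rA, rB, rC = (posterior(*tests[k], n) for k in "ABC")

        # System structure: (A parallel B) in series with C.
        r_sys = (1.0 - (1.0 - rA) * (1.0 - rB)) * rC

        print("posterior mean:", round(float(r_sys.mean()), 4))
        print("90% credible interval:", np.percentile(r_sys, [5, 95]).round(4))

    Component C, which has seen no failures, is exactly the case the abstract flags: its posterior, and hence the system interval, is driven largely by the choice of prior.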

  6. Incremental comprehension of spoken quantifier sentences: Evidence from brain potentials.

    PubMed

    Freunberger, Dominik; Nieuwland, Mante S

    2016-09-01

    Do people incrementally incorporate the meaning of quantifier expressions to understand an unfolding sentence? Most previous studies concluded that quantifiers do not immediately influence how a sentence is understood, based on the observation that online N400-effects differed from offline plausibility judgments. Those studies, however, used serial visual presentation (SVP), which involves unnatural reading. In the current ERP-experiment, we presented spoken positive and negative quantifier sentences ("Practically all/practically no postmen prefer delivering mail, when the weather is good/bad during the day"). Different from results obtained in a previously reported SVP-study (Nieuwland, 2016), sentence truth-value N400 effects occurred in positive and negative quantifier sentences alike, reflecting fully incremental quantifier comprehension. This suggests that the prosodic information available during spoken language comprehension supports the generation of online predictions for upcoming words and that, at least for quantifier sentences, comprehension of spoken language may proceed more incrementally than comprehension during SVP reading. PMID:27346365

  7. Quantifying Poststroke Apathy With Actimeters.

    PubMed

    Goldfine, Andrew M; Dehbandi, Behdad; Kennedy, Juliana M; Sabot, Briana; Semper, Cory; Putrino, David

    2016-01-01

    The authors tested the hypothesis that wrist-worn actimeters can quantify the severity of poststroke apathy. The authors studied 57 patients admitted to an acute rehabilitation unit for ischemic or hemorrhagic stroke. After accounting for the motor deficit of the affected arm and for age, each increment of the Apathy Inventory score correlated with 5.6 fewer minutes of moving per hour. The overall statistical model had an R(2) of only 0.34, suggesting unexplained factors for total movement time. Wrist-worn actimeters may serve as an objective, quantifiable measure of poststroke apathy in patients with an intact upper extremity but cannot be used alone to diagnose apathy. PMID:26900735

  8. Quantifying and measuring cyber resiliency

    NASA Astrophysics Data System (ADS)

    Cybenko, George

    2016-05-01

    Cyber resiliency has become an increasingly attractive research and operational concept in cyber security. While several metrics have been proposed for quantifying cyber resiliency, a considerable gap remains between those metrics and operationally measurable and meaningful concepts that can be empirically determined in a scientific manner. This paper describes a concrete notion of cyber resiliency that can be tailored to meet specific needs of organizations that seek to introduce resiliency into their assessment of their cyber security posture.

  9. Probabilistic Approaches to Better Quantifying the Results of Epidemiologic Studies

    PubMed Central

    Gustafson, Paul; McCandless, Lawrence C.

    2010-01-01

    Typical statistical analysis of epidemiologic data captures uncertainty due to random sampling variation, but ignores more systematic sources of variation such as selection bias, measurement error, and unobserved confounding. Such sources are often only mentioned via qualitative caveats, perhaps under the heading of ‘study limitations.’ Recently, however, there has been considerable interest and advancement in probabilistic methodologies for more integrated statistical analysis. Such techniques hold the promise of replacing a confidence interval reflecting only random sampling variation with an interval reflecting all, or at least more, sources of uncertainty. We survey and appraise the recent literature in this area, giving some prominence to the use of Bayesian statistical methodology. PMID:20617044
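
    A minimal sketch of one such probabilistic (Monte Carlo) bias analysis, here for nondifferential exposure misclassification in a case-control study; the 2x2 counts and the sensitivity/specificity priors are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(7)

        a, b = 50, 100   # cases: exposed, unexposed (invented)
        c, d = 30, 120   # controls: exposed, unexposed (invented)
        se_obs = np.sqrt(1/a + 1/b + 1/c + 1/d)  # conventional SE of log(OR)

        draws = []
        for _ in range(50_000):
            sens = rng.uniform(0.75, 0.95)   # assumed sensitivity prior
            spec = rng.uniform(0.90, 0.99)   # assumed specificity prior
            k = sens + spec - 1.0
            A = (a - (1 - spec) * (a + b)) / k   # corrected exposed cases
            C = (c - (1 - spec) * (c + d)) / k   # corrected exposed controls
            B, D = (a + b) - A, (c + d) - C
            if min(A, B, C, D) <= 0:
                continue                          # incompatible draw
            log_or = np.log(A * D / (B * C))
            draws.append(rng.normal(log_or, se_obs))  # re-add random error

        print("OR (2.5%, 50%, 97.5%):",
              np.exp(np.percentile(draws, [2.5, 50, 97.5])).round(2))

    The resulting interval reflects both sampling variation and the assumed misclassification, which is precisely the kind of integrated uncertainty statement the survey discusses.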

  10. Children's school-breakfast reports and school-lunch reports (in 24-h dietary recalls): conventional and reporting-error-sensitive measures show inconsistent accuracy results for retention interval and breakfast location.

    PubMed

    Baxter, Suzanne D; Guinn, Caroline H; Smith, Albert F; Hitchcock, David B; Royer, Julie A; Puryear, Megan P; Collins, Kathleen L; Smith, Alyssa L

    2016-04-14

    Validation-study data were analysed to investigate retention interval (RI) and prompt effects on the accuracy of fourth-grade children's reports of school-breakfast and school-lunch (in 24-h recalls), and the accuracy of school-breakfast reports by breakfast location (classroom; cafeteria). Randomly selected fourth-grade children at ten schools in four districts were observed eating school-provided breakfast and lunch, and were interviewed under one of eight conditions created by crossing two RIs ('short'--prior-24-hour recall obtained in the afternoon and 'long'--previous-day recall obtained in the morning) with four prompts ('forward'--distant to recent, 'meal name'--breakfast, etc., 'open'--no instructions, and 'reverse'--recent to distant). Each condition had sixty children (half were girls). Of 480 children, 355 and 409 reported meals satisfying criteria for reports of school-breakfast and school-lunch, respectively. For breakfast and lunch separately, a conventional measure--report rate--and reporting-error-sensitive measures--correspondence rate and inflation ratio--were calculated for energy per meal-reporting child. Correspondence rate and inflation ratio--but not report rate--showed better accuracy for school-breakfast and school-lunch reports with the short RI than with the long RI; this pattern was not found for some prompts for each sex. Correspondence rate and inflation ratio showed better school-breakfast report accuracy for the classroom than for cafeteria location for each prompt, but report rate showed the opposite. For each RI, correspondence rate and inflation ratio showed better accuracy for lunch than for breakfast, but report rate showed the opposite. When choosing RI and prompts for recalls, researchers and practitioners should select a short RI to maximise accuracy. Recommendations for prompt selections are less clear. As report rates distort validation-study accuracy conclusions, reporting-error-sensitive measures are recommended. PMID

  11. Quantifying potential recharge in mantled sinkholes using ERT.

    PubMed

    Schwartz, Benjamin F; Schreiber, Madeline E

    2009-01-01

    Potential recharge through thick soils in mantled sinkholes was quantified using differential electrical resistivity tomography (ERT). Conversion of time series two-dimensional (2D) ERT profiles into 2D volumetric water content profiles using a numerically optimized form of Archie's law allowed us to monitor temporal changes in water content in soil profiles up to 9 m in depth. Combining Penman-Monteith daily potential evapotranspiration (PET) and daily precipitation data with potential recharge calculations for three sinkhole transects indicates that potential recharge occurred only during brief intervals over the study period and ranged from 19% to 31% of cumulative precipitation. Spatial analysis of ERT-derived water content showed that infiltration occurred both on sinkhole flanks and in sinkhole bottoms. Results also demonstrate that mantled sinkholes can act as regions of both rapid and slow recharge. Rapid recharge is likely the result of flow through macropores (such as root casts and thin gravel layers), while slow recharge is the result of unsaturated flow through fine-grained sediments. In addition to developing a new method for quantifying potential recharge at the field scale in unsaturated conditions, we show that mantled sinkholes are an important component of storage in a karst system.
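
    The conversion step at the heart of this approach can be sketched directly from Archie's law, rho = a * rho_w * phi^(-m) * Sw^(-n), solved for saturation and multiplied by porosity to give volumetric water content; the parameter values below are generic defaults, whereas the study optimized them numerically for its soils.

        import numpy as np

        def water_content(rho, rho_w, phi, a=1.0, m=2.0, n=2.0):
            """Volumetric water content from bulk resistivity (Archie's law)."""
            sw = (a * rho_w / (phi**m * rho)) ** (1.0 / n)
            return phi * np.clip(sw, 0.0, 1.0)

        # Illustrative inverted ERT resistivities (ohm-m) down one profile:
        rho = np.array([400.0, 250.0, 120.0, 80.0])
        print(water_content(rho, rho_w=20.0, phi=0.35).round(3))

    Differencing such profiles between repeat ERT surveys gives the change in stored water, which is what the potential-recharge estimates above are built from.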

  12. A lidar technique to quantify surface deposition from atmospheric releases of bulk liquids

    NASA Astrophysics Data System (ADS)

    Post, Madison J.; Glaes, Thomas; Matta, Joseph; Sommerville, Douglas; Einfeld, Wayne

    We show that a scanning, pulsed lidar can be used to quantify the time history and areal concentration of mass deposited on the ground from an elevated release of bulk liquid. Aircraft measurements, witness-card depositions and evaporative modelling crudely support results from analysed lidar data.

  13. What Do Blood Tests Show?

    MedlinePlus

    Shows the ranges for blood glucose levels after 8 to 12 hours of fasting (not eating), including the normal range and the abnormal ranges that are a sign of prediabetes or diabetes. [Truncated table: Plasma Glucose Results (mg/dL) versus Diagnosis]

  14. Quantifying crystal-melt segregation in dykes

    NASA Astrophysics Data System (ADS)

    Yamato, Philippe; Duretz, Thibault; May, Dave A.; Tartèse, Romain

    2015-04-01

    The dynamics of magma flow is highly affected by the presence of a crystalline load. During magma ascent, it has been demonstrated that crystal-melt segregation constitutes a viable mechanism for magmatic differentiation. However, the influences of crystal volume fraction, geometry, size and density on crystal-melt segregation are still not well constrained. In order to address these issues, we performed a parametric study using 2D direct numerical simulations, which model the ascent of crystal-bearing magma in a vertical dyke. Using these models, we have characterised the amount of segregation as a function of different quantities including: the crystal fraction (φ), the density contrast between crystals and melt (Δρ), the size of the crystals (Ac) and their aspect ratio (R). Results show that crystal aspect ratio does not affect the segregation if R is small enough (long axis smaller than ~1/6 of the dyke width, Wd). Inertia within the system was also found not to influence crystal-melt segregation. The degree of segregation was however found to be highly dependent upon other parameters. Segregation is highest when Δρ and Ac are large, and lowest for large pressure gradient (Pd) and/or large values of Wd. These four parameters can be combined into a single one, the S-number, which can be used to quantify the segregation. Based on systematic numerical modelling and dimensional analysis, we provide a first order scaling law which allows quantification of the segregation for an arbitrary S-number and φ, encompassing a wide range of typical parameters encountered in terrestrial magmatic systems.

  15. Quantifying Connectivity in the Coastal Ocean

    NASA Astrophysics Data System (ADS)

    Mitarai, S.; Siegel, D.; Watson, J.; Dong, C.; McWilliams, J.

    2008-12-01

    The quantification of coastal connectivity is important for a wide range of real-world applications ranging from marine pollution to nearshore fisheries management. For these purposes, coastal connectivity is best defined as the probability that water parcels from one nearshore location are advected to another site over a given time interval. Here, we demonstrate how to quantify coastal connectivity using Lagrangian probability-density function (PDF) methods, a classic modeling approach for many turbulent applications, and numerical solutions of coastal circulation for the Southern California Bight. Mean dispersal patterns from a single release site (or Lagrangian PDFs) show a strong dependency on the particle-release location and seasonal variability, reflecting circulation patterns in the Southern California Bight. Strong interannual variations, responding to El Nino and La Nina transitions, are also observed. Mean connectivity patterns, deduced from Lagrangian PDFs, are spatially heterogeneous for advection times of around 30 days or less, resulting from distinctive circulation patterns, and become more homogeneous for longer advection times. A given realization of connectivity is stochastic because of eddy-driven transport and synoptic wind forcing changes. In general, mainland sites are good sources while both the Northern and Southern Channel Islands are poor source sites, although they receive substantial fluxes of water parcels from the mainland. The predicted connectivity gives useful information to ecological and other applications for the Southern California Bight (e.g., designing marine protected areas, understanding gene structures, and predicting the impact of a pollution event) and provides a path for assessing connectivity for other regions of the coastal ocean.
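
    Given simulated particle trajectories, the connectivity definition used here (the probability that a parcel released at site i is at site j after the advection time) reduces to a normalized count over release and arrival sites. A toy sketch, with invented endpoints:

        import numpy as np

        def connectivity_matrix(release, arrival, n_sites):
            """C[i, j] = P(parcel from site i arrives at site j)."""
            C = np.zeros((n_sites, n_sites))
            for i, j in zip(release, arrival):
                C[i, j] += 1.0
            rows = C.sum(axis=1, keepdims=True)
            return np.divide(C, rows, out=np.zeros_like(C), where=rows > 0)

        release = np.array([0, 0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 2])
        arrival = np.array([0, 1, 1, 2, 1, 1, 1, 0, 2, 2, 0, 0])
        print(connectivity_matrix(release, arrival, 3))

    Each row is a discretized Lagrangian PDF over destinations; averaging such matrices over many release events separates the mean connectivity pattern from the stochastic, eddy-driven single realizations described above.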

  16. Attentional focusing with quantifiers in production and comprehension.

    PubMed

    Sanford, A J; Moxey, L M; Paterson, K B

    1996-03-01

    There is a very large number of quantifiers in English, so many that it seems impossible that the only information that they convey is about amounts. Building on the earlier work of Moxey and Sanford (1987), we report three experiments showing that positive and negative quantifiers focus on different subsets of the logical possibilities that quantifiers allow semantically. Experiments 1 and 2 feature a continuation task with quantifiers that span a full range of denotations (from near 0% to near 100%) and show that the effect is not restricted to quantifiers denoting small amounts. This enables a distinction to be made between generalization and complement set focus proper. The focus effects extend to comprehension, as shown by a self-paced reading study (Experiment 3). It is noted that the focus effects obtained are compatible with findings from earlier work by Just and Carpenter (1971), which used a verification paradigm, and in fact these effects constitute a direct test of inferences Just and Carpenter made about mechanisms of encoding negative quantifiers. A related but different explanation is put forward to explain the present data. The experiments show a quantifier function beyond the simple denotation of amount.

  17. Quantifying torso deformity in scoliosis

    NASA Astrophysics Data System (ADS)

    Ajemba, Peter O.; Kumar, Anish; Durdle, Nelson G.; Raso, V. James

    2006-03-01

    Scoliosis affects the alignment of the spine and the shape of the torso. Most scoliosis patients and their families are more concerned about the effect of scoliosis on the torso than its effect on the spine. There is a need to develop robust techniques for quantifying torso deformity based on full torso scans. In this paper, deformation indices obtained from orthogonal maps of full torso scans are used to quantify torso deformity in scoliosis. 'Orthogonal maps' are obtained by applying orthogonal transforms to 3D surface maps. (An 'orthogonal transform' maps a cylindrical coordinate system to a Cartesian coordinate system.) The technique was tested on 361 deformed computer models of the human torso and on 22 scans of volunteers (8 normal and 14 scoliosis). Deformation indices from the orthogonal maps correctly classified up to 95% of the volunteers with a specificity of 1.00 and a sensitivity of 0.91. In addition to classifying scoliosis, the system gives a visual representation of the entire torso in one view and is viable for use in a clinical environment for managing scoliosis.
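
    A minimal sketch of the 'orthogonal map' idea: bin torso surface points by angle and height about the torso axis and store the mean radius, turning the cylindrical surface into a Cartesian image on which asymmetry indices can be computed. The synthetic point cloud and the mirror-difference index below are placeholders for the paper's deformation indices.

        import numpy as np

        def orthogonal_map(points, n_theta=128, n_z=128):
            """Unroll (x, y, z) surface samples into an R(z, theta) image."""
            x, y, z = points.T
            theta = np.arctan2(y, x)
            r = np.hypot(x, y)
            ti = np.minimum((theta + np.pi) / (2 * np.pi) * n_theta,
                            n_theta - 1).astype(int)
            zi = np.minimum((z - z.min()) / (np.ptp(z) + 1e-9) * n_z,
                            n_z - 1).astype(int)
            img = np.zeros((n_z, n_theta))
            cnt = np.zeros((n_z, n_theta))
            np.add.at(img, (zi, ti), r)
            np.add.at(cnt, (zi, ti), 1.0)
            return np.divide(img, cnt, out=np.zeros_like(img), where=cnt > 0)

        # Synthetic torso-like blob standing in for a real surface scan.
        pts = np.random.default_rng(2).normal(size=(5000, 3)) * [0.15, 0.12, 0.4]
        m = orthogonal_map(pts)
        print(f"mirror-asymmetry index: {np.abs(m - m[:, ::-1]).mean():.4f}")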

  18. Quantifying the Bull's-Eye Effect: Thick Slices

    NASA Astrophysics Data System (ADS)

    Praton, E. A.; Bilikova, J.; Melott, A. L.; Thomas, B. C.

    2004-12-01

    We present the results of an investigation into the method proposed by Melott et al. (1998) for quantifying the bull's-eye effect in maps of large scale structure. The bull's-eye effect is a distortion in redshift-space produced by peculiar velocities. Structures lying across the line of sight are enhanced, but structures lying along the line of sight are not, producing an impression of walls ringing the observer (Praton, Melott, & McKee 1997), much like those seen in recent large scale surveys. Simulations show that the strength of the pattern varies with initial cosmological conditions; thus, our interest in developing a reliable way to quantify the effect, as a possible way to independently determine parameters such as Ω. Thomas et al. (2004) showed that the proposed method can successfully distinguish between high and low Ω simulations, independent of bias, for thin slices in the Cartesian limit. We carry that investigation further, looking at thick slices in the Cartesian limit. Since the bull's-eye pattern grows stronger as slice thickness increases (Praton, Melott, & Peterson 1997), we expect the method to become more reliable. Instead, we find the opposite. We discuss the reasons, as well as results from one or two possible alternatives.

  19. SPACE: an algorithm to predict and quantify alternatively spliced isoforms using microarrays.

    PubMed

    Anton, Miguel A; Gorostiaga, Dorleta; Guruceaga, Elizabeth; Segura, Victor; Carmona-Saez, Pedro; Pascual-Montano, Alberto; Pio, Ruben; Montuenga, Luis M; Rubio, Angel

    2008-01-01

    Exon and exon+junction microarrays are promising tools for studying alternative splicing. Current analytical tools applied to these arrays lack two relevant features: the ability to predict unknown spliced forms and the ability to quantify the concentration of known and unknown isoforms. SPACE is an algorithm that has been developed to (1) estimate the number of different transcripts expressed under several conditions, (2) predict the precursor mRNA splicing structure and (3) quantify the transcript concentrations including unknown forms. The results presented here show its robustness and accuracy for real and simulated data. PMID:18312629

  1. Quantifying entanglement with scattering experiments

    NASA Astrophysics Data System (ADS)

    Marty, O.; Epping, M.; Kampermann, H.; Bruß, D.; Plenio, M. B.; Cramer, M.

    2014-03-01

    We show how the entanglement contained in states of spins arranged on a lattice may be lower bounded with observables arising in scattering experiments. We focus on the partial differential cross section obtained in neutron scattering from magnetic materials but our results are sufficiently general such that they may also be applied to, e.g., optical Bragg scattering from ultracold atoms in optical lattices or from ion chains. We discuss resonating valence bond states and ground and thermal states of experimentally relevant models—such as the Heisenberg, Majumdar-Ghosh, and XY models—in different geometries and with different spin numbers. As a by-product, we find that for the one-dimensional XY model in a transverse field such measurements reveal factorization and the quantum phase transition at zero temperature.

  2. Quantifying Evaporation in a Permeable Pavement System

    EPA Science Inventory

    Studies quantifying evaporation from permeable pavement systems are limited to a few laboratory studies and one field application. This research quantifies evaporation for a larger-scale field application by measuring the water balance from lined permeable pavement sections. Th...

  3. New Drug Shows Mixed Results Against Early Alzheimer's

    MedlinePlus


  4. Quantifying the vitamin D economy.

    PubMed

    Heaney, Robert P; Armas, Laura A G

    2015-01-01

    Vitamin D enters the body through multiple routes and in a variety of chemical forms. Utilization varies with input, demand, and genetics. Vitamin D and its metabolites are carried in the blood on a Gc protein that has three principal alleles with differing binding affinities and ethnic prevalences. Three major metabolites are produced, which act via two routes, endocrine and autocrine/paracrine, and in two compartments, extracellular and intracellular. Metabolic consumption is influenced by physiological controls, noxious stimuli, and tissue demand. When administered as a supplement, varying dosing schedules produce major differences in serum metabolite profiles. To understand vitamin D's role in human physiology, it is necessary both to identify the foregoing entities, mechanisms, and pathways and, specifically, to quantify them. This review was performed to delineate the principal entities and transitions involved in the vitamin D economy, summarize the status of present knowledge of the applicable rates and masses, draw inferences about functions that are implicit in these quantifications, and point out implications for the determination of adequacy. PMID:26024057

  6. Quantifying Uncertainty in Epidemiological Models

    SciTech Connect

    Ramanathan, Arvind; Jha, Sumit Kumar

    2012-01-01

    Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs) to describe the dynamics of how a disease may spread within a population and enable the rational design of strategies for intervention that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in trusting the outcomes/predictions solely based on such models. Hence, there is a need to develop approaches such that mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions, when such retrospective analysis cannot be performed. In this paper, we outline an approach to develop uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.
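
    The model-checking loop itself is simple to sketch: simulate the stochastic epidemiological model many times, test a formal property phi on each trajectory, and report the estimated satisfaction probability with a confidence bound. The binomial-chain SIR model and the property below are illustrative stand-ins, not the paper's models.

        import numpy as np

        rng = np.random.default_rng(1)

        def sir_peak(beta, gamma, s0=990, i0=10, days=160):
            """Peak infections in one stochastic (binomial-chain) SIR run."""
            n, s, i, peak = s0 + i0, s0, i0, i0
            for _ in range(days):
                new_inf = rng.binomial(s, 1.0 - np.exp(-beta * i / n))
                new_rec = rng.binomial(i, 1.0 - np.exp(-gamma))
                s, i = s - new_inf, i + new_inf - new_rec
                peak = max(peak, i)
            return peak

        # Property phi: "the peak of infections stays below 300".
        runs = 2000
        hits = sum(sir_peak(0.30, 0.15) < 300 for _ in range(runs))
        p = hits / runs
        print(f"P(phi) ~ {p:.3f} +/- {1.96 * (p * (1 - p) / runs) ** 0.5:.3f}")

    Statistical model checkers essentially automate this loop, with the property expressed in a temporal logic and sequential sampling replacing the fixed run count.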

  7. Quantifying macromolecular conformational transition pathways

    NASA Astrophysics Data System (ADS)

    Seyler, Sean; Kumar, Avishek; Thorpe, Michael; Beckstein, Oliver

    2015-03-01

    Diverse classes of proteins function through large-scale conformational changes that are challenging for computer simulations. A range of fast path-sampling techniques have been used to generate transitions, but it has been difficult to compare paths from (and assess the relative strengths of) different methods. We introduce a comprehensive method (pathway similarity analysis, PSA) for quantitatively characterizing and comparing macromolecular pathways. The Hausdorff and Fréchet metrics (known from computational geometry) are used to quantify the degree of similarity between polygonal curves in configuration space. A strength of PSA is its use of the full information available from the 3 N-dimensional configuration space trajectory without requiring additional specific knowledge about the system. We compare a sample of eleven different methods for the closed-to-open transitions of the apo enzyme adenylate kinase (AdK) and also apply PSA to an ensemble of 400 AdK trajectories produced by dynamic importance sampling MD and the Geometrical Pathways algorithm. We discuss the method's potential to enhance our understanding of transition path sampling methods, validate them, and help guide future research toward deeper physical insights into conformational transitions.
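
    The geometric core of such an analysis, a metric between polygonal curves in configuration space, is easy to sketch with SciPy's directed Hausdorff distance; the two toy 2D paths stand in for 3N-dimensional transition trajectories.

        import numpy as np
        from scipy.spatial.distance import directed_hausdorff

        def hausdorff(path_a, path_b):
            """Symmetric Hausdorff distance between (frames, dim) paths."""
            return max(directed_hausdorff(path_a, path_b)[0],
                       directed_hausdorff(path_b, path_a)[0])

        t = np.linspace(0.0, 1.0, 100)[:, None]
        path1 = np.hstack([t, t**2])   # curved route between end states
        path2 = np.hstack([t, t])      # straight route
        print(f"Hausdorff distance: {hausdorff(path1, path2):.3f}")

    The Fréchet metric differs in that it respects the ordering of points along each path, which matters when two pathways visit similar regions in a different sequence.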

  8. Quantifying of bactericide properties of medicinal plants

    PubMed Central

    Ács, András; Gölöncsér, Flóra; Barabás, Anikó

    2011-01-01

    Extended research has been carried out to clarify the ecological role of plant secondary metabolites (SMs). Although their primary ecological function is self-defense, bioactive compounds have long been used in alternative medicine or in biological control of pests. Several members of the family Labiatae are known to have strong antimicrobial capacity. For testing and quantifying antibacterial activity, standard microbial protocols are most often used, assessing inhibitory activity on a selected strain. In this study, the applicability of a microbial ecotoxicological test was evaluated to quantify the aggregate bactericide capacity of Labiatae species, based on the bioluminescence inhibition of the bacterium Vibrio fischeri. Striking differences were found amongst herbs, with toxicities differing by as much as 10-fold. Glechoma hederacea L. proved to be the most toxic, with an EC50 of 0.4073 g dried plant/l. LC50 values generated by the standard bioassay seem to be a good indicator of the bactericide property of herbs. Traditional use of the selected herbs shows a good correlation with bioactivity expressed as bioluminescence inhibition, leading to the conclusion that the Vibrio fischeri bioassay can be a good indicator of the overall antibacterial capacity of herbs, at least on a screening level. PMID:21502819
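
    Quantities like the EC50 reported above come from fitting a dose-response curve to inhibition data; a minimal sketch with an invented concentration series follows (the Hill-type curve and the numbers are illustrative, not the study's data).

        import numpy as np
        from scipy.optimize import curve_fit

        def hill(c, ec50, slope):
            """Two-parameter log-logistic inhibition curve (0-100%)."""
            return 100.0 / (1.0 + (ec50 / c) ** slope)

        conc = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6])      # g dried plant/l
        inhib = np.array([5.0, 12.0, 30.0, 52.0, 74.0, 90.0]) # % inhibition

        (ec50, slope), _ = curve_fit(hill, conc, inhib, p0=[0.4, 1.0])
        print(f"EC50 = {ec50:.3f} g/l, slope = {slope:.2f}")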

  9. Quantifying cell behaviors during embryonic wound healing

    NASA Astrophysics Data System (ADS)

    Mashburn, David; Ma, Xiaoyan; Crews, Sarah; Lynch, Holley; McCleery, W. Tyler; Hutson, M. Shane

    2011-03-01

    During embryogenesis, internal forces induce motions in cells leading to widespread motion in tissues. We previously developed laser hole-drilling as a consistent, repeatable way to probe such epithelial mechanics. The initial recoil (less than 30s) gives information about physical properties (elasticity, force) of cells surrounding the wound, but the long-term healing process (tens of minutes) shows how cells adjust their behavior in response to stimuli. To study this biofeedback in many cells through time, we developed tools to quantify statistics of individual cells. By combining watershed segmentation with a powerful and efficient user interaction system, we overcome problems that arise in any automatic segmentation from poor image quality. We analyzed cell area, perimeter, aspect ratio, and orientation relative to wound for a wide variety of laser cuts in dorsal closure. We quantified statistics for different regions as well, i.e. cells near to and distant from the wound. Regional differences give a distribution of wound-induced changes, whose spatial localization provides clues into the physical/chemical signals that modulate the wound healing response. Supported by the Human Frontier Science Program (RGP0021/2007 C).

  10. Detecting and Quantifying Topography in Neural Maps

    PubMed Central

    Yarrow, Stuart; Razak, Khaleel A.; Seitz, Aaron R.; Seriès, Peggy

    2014-01-01

    Topographic maps are an often-encountered feature in the brains of many species, yet there are no standard, objective procedures for quantifying topography. Topographic maps are typically identified and described subjectively, but in cases where the scale of the map is close to the resolution limit of the measurement technique, identifying the presence of a topographic map can be a challenging subjective task. In such cases, an objective topography detection test would be advantageous. To address these issues, we assessed seven measures (Pearson distance correlation, Spearman distance correlation, Zrehen's measure, topographic product, topological correlation, path length and wiring length) by quantifying topography in three classes of cortical map model: linear, orientation-like, and clusters. We found that all but one of these measures were effective at detecting statistically significant topography even in weakly-ordered maps, based on simulated noisy measurements of neuronal selectivity and sparse sampling of the maps. We demonstrate the practical applicability of these measures by using them to examine the arrangement of spatial cue selectivity in pallid bat A1. This analysis shows that significantly topographic arrangements of interaural intensity difference and azimuth selectivity exist at the scale of individual binaural clusters. PMID:24505279
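
    The simplest of these measures, the Pearson distance correlation, correlates pairwise map distances with pairwise feature distances; a permutation test over unit labels then gives the topography-detection p-value. A sketch on a weakly ordered synthetic linear map:

        import numpy as np
        from scipy.spatial.distance import pdist
        from scipy.stats import pearsonr

        rng = np.random.default_rng(3)

        def topography_score(positions, features, n_perm=2000):
            """Distance correlation r and permutation p-value."""
            d_pos, d_feat = pdist(positions), pdist(features)
            r = pearsonr(d_pos, d_feat)[0]
            null = np.empty(n_perm)
            for k in range(n_perm):
                perm = rng.permutation(len(positions))
                null[k] = pearsonr(pdist(positions[perm]), d_feat)[0]
            return r, np.mean(null >= r)

        pos = np.linspace(0.0, 1.0, 40)[:, None]
        feat = pos + rng.normal(0.0, 0.3, pos.shape)  # noisy linear map
        r, p = topography_score(pos, feat)
        print(f"r = {r:.2f}, permutation p = {p:.4f}")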

  11. Quantifying utricular stimulation during natural behavior

    PubMed Central

    Rivera, Angela R. V.; Davis, Julian; Grant, Wally; Blob, Richard W.; Peterson, Ellengene; Neiman, Alexander B.; Rowe, Michael

    2012-01-01

    The use of natural stimuli in neurophysiological studies has led to significant insights into the encoding strategies used by sensory neurons. To investigate these encoding strategies in vestibular receptors and neurons, we have developed a method for calculating the stimuli delivered to a vestibular organ, the utricle, during natural (unrestrained) behaviors, using the turtle as our experimental preparation. High-speed digital video sequences are used to calculate the dynamic gravito-inertial (GI) vector acting on the head during behavior. X-ray computed tomography (CT) scans are used to determine the orientation of the otoconial layer (OL) of the utricle within the head, and the calculated GI vectors are then rotated into the plane of the OL. Thus, the method allows us to quantify the spatio-temporal structure of stimuli to the OL during natural behaviors. In the future, these waveforms can be used as stimuli in neurophysiological experiments to understand how natural signals are encoded by vestibular receptors and neurons. We provide one example of the method which shows that turtle feeding behaviors can stimulate the utricle at frequencies higher than those typically used in vestibular studies. This method can be adapted to other species, to other vestibular end organs, and to other methods of quantifying head movements. PMID:22753360

  12. Quantifying Scheduling Challenges for Exascale System Software

    SciTech Connect

    Mondragon, Oscar; Bridges, Patrick G.; Jones, Terry R

    2015-01-01

    The move towards high-performance computing (HPC) applications comprised of coupled codes and the need to dramatically reduce data movement is leading to a reexamination of time-sharing vs. space-sharing in HPC systems. In this paper, we discuss and begin to quantify the performance impact of a move away from strict space-sharing of nodes for HPC applications. Specifically, we examine the potential performance cost of time-sharing nodes between application components, we determine whether a simple coordinated scheduling mechanism can address these problems, and we research how suitable simple constraint-based optimization techniques are for solving scheduling challenges in this regime. Our results demonstrate that current general-purpose HPC system software scheduling and resource allocation systems are subject to significant performance deficiencies which we quantify for six representative applications. Based on these results, we discuss areas in which additional research is needed to meet the scheduling challenges of next-generation HPC systems.

  13. Quantifying uncertainty from material inhomogeneity.

    SciTech Connect

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

    Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femto-second laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the

  14. A stochastic approach for quantifying immigrant integration: the Spanish test case

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Barra, Adriano; Contucci, Pierluigi; Sandell, Richard; Vernia, Cecilia

    2014-10-01

    We apply stochastic process theory to the analysis of immigrant integration. Using a unique and detailed data set from Spain, we study the relationship between local immigrant density and two social and two economic immigration quantifiers for the period 1999-2010. As opposed to the classic time-series approach, by letting immigrant density play the role of ‘time’ and the quantifier the role of ‘space,’ it becomes possible to analyse the behavior of the quantifiers by means of continuous time random walks. Two classes of results are then obtained. First, we show that social integration quantifiers evolve following diffusion law, while the evolution of economic quantifiers exhibits ballistic dynamics. Second, we make predictions of best- and worst-case scenarios taking into account large local fluctuations. Our stochastic process approach to integration lends itself to interesting forecasting scenarios which, in the hands of policy makers, have the potential to improve political responses to integration problems. For instance, estimating the standard first-passage time and maximum-span walk reveals local differences in integration performance for different immigration scenarios. Thus, by recognizing the importance of local fluctuations around national means, this research constitutes an important tool to assess the impact of immigration phenomena on municipal budgets and to set up solid multi-ethnic plans at the municipal level as immigration pressures build.
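
    The diffusive-versus-ballistic distinction reported here amounts to the exponent b in Q proportional to density^b: b near 1/2 indicates diffusion-like growth of a quantifier with immigrant density, b near 1 ballistic growth. A sketch of the classification step on synthetic data:

        import numpy as np

        def scaling_exponent(density, quantifier):
            """Slope of log(quantifier) against log(density)."""
            b, _ = np.polyfit(np.log(density), np.log(quantifier), 1)
            return b

        rho = np.linspace(0.01, 0.25, 30)
        social = 0.9 * rho**0.5     # synthetic diffusive quantifier
        economic = 1.4 * rho**1.0   # synthetic ballistic quantifier
        print(scaling_exponent(rho, social), scaling_exponent(rho, economic))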

  15. Career and Technical Education: Show Us the Buck, We'll Show You the Bang!

    ERIC Educational Resources Information Center

    Whetstone, Ryan

    2011-01-01

    Adult and CTE programs in California have been cut by about 60 percent over the past three years. A number of school districts have summarily eliminated these programs to preserve funding for other educational endeavors. The author says part of the problem has been the community's inability to communicate quantifiable results. One of the hottest…

  16. National Orange Show Photovoltaic Demonstration

    SciTech Connect

    Dan Jimenez; Sheri Raborn, CPA; Tom Baker

    2008-03-31

    National Orange Show Photovoltaic Demonstration created a 400 kW photovoltaic self-generation plant at the National Orange Show Events Center (NOS). The NOS owns a 120-acre state fairground where it operates an events center and produces an annual citrus fair known as the Orange Show. The NOS governing board wanted to employ cost-saving programs for annual energy expenses. It is hoped the photovoltaic program will result in overall savings for the NOS, help reduce the State's energy demands as they relate to electrical power consumption, and improve quality of life within the affected grid area as well as increase the energy efficiency of buildings at our venue. In addition, the potential to reduce operational expenses would have a tremendous effect on the ability of the NOS to service its community.

  17. Quantifying facial paralysis using the Kinect v2.

    PubMed

    Gaber, Amira; Taher, Mona F; Wahed, Manal Abdel

    2015-01-01

    Assessment of facial paralysis (FP) and quantitative grading of facial asymmetry are essential in order to quantify the extent of the condition as well as to follow its improvement or progression. As such, there is a need for an accurate quantitative grading system that is easy to use, inexpensive and has minimal inter-observer variability. A comprehensive automated system to quantify and grade FP is the main objective of this work. An initial prototype has been presented by the authors. The present research aims to enhance the accuracy and robustness of one of this system's modules: the resting symmetry module. This is achieved by including several modifications to the computation method of the symmetry index (SI) for the eyebrows, eyes and mouth. These modifications are the gamma correction technique, the area of the eyes, and the slope of the mouth. The system was tested on normal subjects and showed promising results. With the modified method, the mean SI of the eyebrows decreased slightly from 98.42% to 98.04%, while the mean SI for the eyes and mouth increased from 96.93% to 99.63% and from 95.6% to 98.11%, respectively. The system is easy to use, inexpensive, automated and fast, has no inter-observer variability and is thus well suited for clinical use.
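
    The abstract does not spell out the SI formula; one common definition compares paired left/right measures, as in this minimal sketch (the landmark values are invented and the formula is not necessarily the one used in the paper):

        def symmetry_index(left, right):
            """Percent symmetry of paired left/right facial measures;
            100 = perfectly symmetric. One common definition only."""
            return 100.0 * (1.0 - abs(left - right) / max(left, right))

        print(symmetry_index(410.0, 395.0))  # e.g., eye areas in pixels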

  18. Quantifying the synchronizability of externally driven oscillators.

    PubMed

    Stefański, Andrzej

    2008-03-01

    This paper is focused on the problem of complete synchronization in arrays of externally driven identical or slightly different oscillators. These oscillators are coupled by common driving which makes an occurrence of generalized synchronization between a driving signal and response oscillators possible. Therefore, the phenomenon of generalized synchronization is also analyzed here. The research is concentrated on the cases of an irregular (chaotic or stochastic) driving signal acting on continuous-time (Duffing systems) and discrete-time (Henon maps) response oscillators. As a tool for quantifying the robustness of the synchronized state, response (conditional) Lyapunov exponents are applied. The most significant result presented in this paper is a novel method of estimation of the largest response Lyapunov exponent. This approach is based on the complete synchronization of two twin response subsystems via additional master-slave coupling between them. Examples of the method application and its comparison with the classical algorithm for calculation of Lyapunov exponents are widely demonstrated. Finally, the idea of effective response Lyapunov exponents, which allows us to quantify the synchronizability in case of slightly different response oscillators, is introduced. PMID:18377057
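
    The twin-subsystem idea can be sketched for a driven 1D map: run a master response x and a slave copy y coupled to it with strength eps, find the critical eps at which they synchronize, and recover the response Lyapunov exponent from lambda = -ln(1 - eps_c). The driven logistic map below is an illustrative choice, not a system from the paper.

        import numpy as np

        def drive(n, x=0.31):
            """Chaotic drive from a fully developed logistic map."""
            out = np.empty(n)
            for i in range(n):
                x = 4.0 * x * (1.0 - x)
                out[i] = x
            return out

        def g(x, d):
            """Response map with drive-modulated parameter."""
            return (3.6 + 0.4 * d) * x * (1.0 - x)

        def synchronizes(eps, d, x=0.2, y=0.7, tol=1e-8):
            """Do twin responses (slave coupled to master) converge?"""
            for di in d:
                x, y = g(x, di), (1.0 - eps) * g(y, di) + eps * g(x, di)
            return abs(x - y) < tol

        d = drive(20_000)
        lo, hi = 0.0, 1.0
        for _ in range(40):             # bisection for critical coupling
            mid = 0.5 * (lo + hi)
            lo, hi = (lo, mid) if synchronizes(mid, d) else (mid, hi)
        print(f"response Lyapunov exponent ~ {-np.log(1.0 - hi):.3f}")

    For this coupling the slave error contracts per step by roughly (1 - eps)|g'|, so synchronization sets in where ln(1 - eps) + lambda = 0, which is the relation inverted in the last line.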

  19. A Generalizable Methodology for Quantifying User Satisfaction

    NASA Astrophysics Data System (ADS)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
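
    In this setting the survival function S(t) is the probability that a session lasts longer than t, and sessions still running when measurement stops are right-censored. A minimal Kaplan-Meier sketch with invented session times:

        import numpy as np

        def kaplan_meier(durations, ended):
            """Kaplan-Meier survival curve; ended[i] = 0 if censored."""
            durations = np.asarray(durations, float)
            ended = np.asarray(ended, int)
            s, curve = 1.0, []
            for t in np.unique(durations[ended == 1]):
                at_risk = np.sum(durations >= t)
                events = np.sum((durations == t) & (ended == 1))
                s *= 1.0 - events / at_risk
                curve.append((t, s))
            return curve

        dur = [5, 8, 8, 12, 15, 15, 20, 30, 30, 45]   # minutes (invented)
        obs = [1, 1, 0, 1, 1, 1, 0, 1, 1, 0]          # 0 = cut off early
        for t, s in kaplan_meier(dur, obs):
            print(f"S({t:>4.0f} min) = {s:.2f}")

    Performance factors (delay, loss, loudness, and so on) would then enter as covariates of a survival regression, which is how their relative impact on user satisfaction can be ranked.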

  20. Quantifying, Visualizing, and Monitoring Lead Optimization.

    PubMed

    Maynard, Andrew T; Roberts, Christopher D

    2016-05-12

    Although lead optimization (LO) is by definition a process, process-centric analysis and visualization of this important phase of pharmaceutical R&D has been lacking. Here we describe a simple statistical framework to quantify and visualize the progression of LO projects so that the vital signs of LO convergence can be monitored. We refer to the resulting visualizations generated by our methodology as the "LO telemetry" of a project. These visualizations can be automated to provide objective, holistic, and instantaneous analysis and communication of LO progression. This enhances the ability of project teams to more effectively drive LO process, while enabling management to better coordinate and prioritize LO projects. We present the telemetry of five LO projects comprising different biological targets and different project outcomes, including clinical compound selection, termination due to preclinical safety/tox, and termination due to lack of tractability. We demonstrate that LO progression is accurately captured by the telemetry. We also present metrics to quantify LO efficiency and tractability. PMID:26262898

  1. Computed tomography to quantify tooth abrasion

    NASA Astrophysics Data System (ADS)

    Kofmehl, Lukas; Schulz, Georg; Deyhle, Hans; Filippi, Andreas; Hotz, Gerhard; Berndt-Dagassan, Dorothea; Kramis, Simon; Beckmann, Felix; Müller, Bert

    2010-09-01

    Cone-beam computed tomography, also termed digital volume tomography, has become a standard technique in dentistry, allowing for fast 3D jaw imaging including denture at moderate spatial resolution. More detailed X-ray images of restricted volumes for post-mortem studies in dental anthropology are obtained by means of micro computed tomography. The present study evaluates the impact of pipe-smoking wear on tooth morphology by comparing the abraded tooth with its contra-lateral counterpart. A set of 60 teeth, loose or anchored in the jaw, from 12 dentitions have been analyzed. After the two contra-lateral teeth were scanned, one dataset was mirrored before the two datasets were registered using affine and rigid registration algorithms. Rigid registration provides three translational and three rotational parameters to maximize the overlap of two rigid bodies. For the affine registration, three scaling factors are incorporated. Within the present investigation, affine and rigid registrations yield comparable values. The restriction to the six parameters of the rigid registration is not a limitation. The differences in size and shape between the tooth and its contra-lateral counterpart generally exhibit only a few percent in the non-abraded volume, validating that the contra-lateral tooth is a reasonable approximation to quantify, for example, the volume loss as the result of long-term clay pipe smoking. Therefore, this approach allows quantifying the impact of the pipe abrasion on the internal tooth morphology including root canal, dentin, and enamel volumes.

  2. Message passing for quantified Boolean formulas

    NASA Astrophysics Data System (ADS)

    Zhang, Pan; Ramezanpour, Abolfazl; Zdeborová, Lenka; Zecchina, Riccardo

    2012-05-01

    We introduce two types of message passing algorithms for quantified Boolean formulas (QBF). The first type is a message-passing-based heuristic that can prove unsatisfiability of the QBF by assigning the universal variables in such a way that the remaining formula is unsatisfiable. In the second type, we use message passing to guide branching heuristics of a Davis-Putnam-Logemann-Loveland (DPLL) complete solver. Numerical experiments show that on random QBFs our branching heuristics give robust exponential efficiency gains with respect to state-of-the-art solvers. We also manage to solve some previously unsolved benchmarks from the QBFLIB library. Apart from this, our study sheds light on using message passing in small systems and as subroutines in complete solvers.

  3. Quantifying a cellular automata simulation of electric vehicles

    NASA Astrophysics Data System (ADS)

    Hill, Graeme; Bell, Margaret; Blythe, Phil

    2014-12-01

    Within this work the Nagel-Schreckenberg (NS) cellular automaton is used to simulate a basic cyclic road network. Results from SwitchEV, a real-world electric vehicle trial which has collected more than two years of detailed electric vehicle data, are used to quantify the results of the NS automaton, demonstrating similar power consumption behavior to that observed in the experimental results. In particular the efficiency of the electric vehicles reduces as the vehicle density increases, due in part to the reduced efficiency of EVs at low speeds, but also due to the energy consumption inherent in changing speeds. Further work shows the results of introducing spatially restricted speed restrictions. In general it can be seen that induced congestion from spatially transient events propagates back through the road network and alters the energy and efficiency profile of the simulated vehicles, both before and after the speed restriction. Vehicles upstream from the restriction show a reduced energy usage and an increased efficiency, and vehicles downstream show an initial large increase in energy usage as they accelerate away from the speed restriction.
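
    For reference, the NS update rule used in such studies has four steps per time step: accelerate, brake to the gap ahead, randomize, move. The sketch below adds a deliberately crude EV energy proxy (kinetic-energy changes with partial regeneration); all constants are illustrative rather than calibrated to SwitchEV.

        import numpy as np

        rng = np.random.default_rng(5)

        def ns_step(pos, vel, road_len, v_max=5, p_slow=0.3):
            """One NS update on a cyclic single-lane road (cars stay ordered)."""
            gaps = (np.roll(pos, -1) - pos - 1) % road_len
            vel = np.minimum(vel + 1, v_max)                   # 1. accelerate
            vel = np.minimum(vel, gaps)                        # 2. brake to gap
            slow = rng.random(len(vel)) < p_slow
            vel = np.where(slow, np.maximum(vel - 1, 0), vel)  # 3. randomize
            return (pos + vel) % road_len, vel                 # 4. move

        road_len, n_cars, steps = 200, 50, 500
        pos = np.sort(rng.choice(road_len, n_cars, replace=False))
        vel = np.zeros(n_cars, dtype=int)
        energy = 0.0
        for _ in range(steps):
            new_pos, new_vel = ns_step(pos, vel, road_len)
            dke = 0.5 * (new_vel.astype(float)**2 - vel.astype(float)**2)
            energy += np.where(dke > 0, dke, 0.4 * dke).sum()  # 40% regen
            pos, vel = new_pos, new_vel
        print(f"mean speed {vel.mean():.2f} cells/step, energy proxy {energy:.0f}")

    Sweeping n_cars reproduces the qualitative density effect described above: more congestion means more speed changes, hence more energy per distance travelled.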

  4. Quantifying Sediment Transport Determined From Grain-Size Distributions

    NASA Astrophysics Data System (ADS)

    de Almeida, R. A.; Möller, O. O.; Lentini, C. A.; Campos, E. J.

    2005-05-01

    A technique derived from McLaren & Bowles (1985) has been applied to investigate sediment dynamics in the Patos Lagoon estuary (Brazil). Qualitative sediment transport in the access channel of the estuary was inferred from changes in statistical properties describing grain-size distributions. Assuming the influence of a single transport function, the spatial gradient of particle mean size, sorting and skewness was used to determine the transport direction along the channel. A long-term average net sediment deposition rate in the area was estimated using digitized historical nautical charts. This deposition rate was used to quantify the sediment transport inside the estuary, through a simple application of Green's Theorem. Results show a net seaward transport in the deep channel of approximately 50 m3 day-1, accompanied by a net inward transport in the shallower channel margin of similar intensity. The estimated net sediment transport was validated against a numerical model output, with good agreement in terms of direction and intensity.
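
    The trend logic derived from McLaren & Bowles (1985) can be sketched as a test on successive stations along the transect; with grain size in phi units, two commonly cited cases consistent with transport in the sampling direction are (B) finer, better sorted, more negatively skewed, and (C) coarser, better sorted, more positively skewed. The station statistics below are invented, and the case definitions are a simplified reading of the method.

        import numpy as np

        def transport_trend(mean_phi, sorting, skewness):
            """Flag station pairs consistent with down-transect transport."""
            d_mean, d_sort, d_skew = (np.diff(a) for a in
                                      (mean_phi, sorting, skewness))
            case_b = (d_mean > 0) & (d_sort < 0) & (d_skew < 0)
            case_c = (d_mean < 0) & (d_sort < 0) & (d_skew > 0)
            return case_b | case_c

        mean_phi = np.array([2.1, 2.4, 2.6, 2.9])     # finer downstream
        sorting = np.array([0.90, 0.80, 0.70, 0.65])  # better sorted
        skew = np.array([0.1, 0.0, -0.1, -0.2])       # more negative
        print(transport_trend(mean_phi, sorting, skew))  # case B throughout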

  5. Quantifying consumption rates of dissolved oxygen along bed forms

    NASA Astrophysics Data System (ADS)

    Boano, Fulvio; De Falco, Natalie; Arnon, Shai

    2016-04-01

    Streambed interfaces represent hotspots for nutrient transformations because they host different microbial species, and evaluating the associated reaction rates is important for assessing the fate of nutrients in riverine environments. In this work we analyze a series of flume experiments on oxygen demand in dune-shaped hyporheic sediments under losing and gaining flow conditions. We employ a new modeling code to quantify oxygen consumption rates from observed vertical profiles of oxygen concentration. The code accounts for transport by molecular diffusion and water advection, and automatically determines the reaction rates that provide the best fit between observed and modeled concentration values. The results show that reaction rates are not uniformly distributed across the streambed, in agreement with the expected behavior predicted by hyporheic exchange theory. Oxygen consumption was found to be highly influenced by the presence of gaining or losing flow conditions, which controlled the delivery of labile dissolved organic carbon (DOC) to streambed microorganisms.
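
    A stripped-down version of such a fitting code is sketched below: a steady one-dimensional advection-diffusion equation with a uniform zero-order oxygen sink is solved by finite differences, and the sink strength is recovered by least squares against an observed profile. The paper's code fits depth-varying rates and handles the full experimental geometry; all numbers here are invented for illustration.

      import numpy as np
      from scipy.optimize import least_squares

      # Steady 1-D advection-diffusion with a uniform zero-order O2 sink:
      #   D*C'' - v*C' - R = 0,  C(0) = c_top, C(L) = c_bot.
      # R is recovered by least squares against an "observed" profile.

      D, v, L = 1.0e-9, 2.0e-7, 0.05          # m^2/s, m/s (downwelling), m
      z = np.linspace(0.0, L, 51)

      def solve_profile(R, c_top=0.25, c_bot=0.0):
          """Finite-difference solve for C(z), mol/m^3, given sink R (mol/m^3/s)."""
          n, h = len(z), z[1] - z[0]
          A = np.zeros((n, n))
          b = np.full(n, R)
          A[0, 0] = A[-1, -1] = 1.0            # Dirichlet boundary conditions
          b[0], b[-1] = c_top, c_bot
          for i in range(1, n - 1):
              A[i, i - 1] = D / h**2 + v / (2 * h)
              A[i, i] = -2 * D / h**2
              A[i, i + 1] = D / h**2 - v / (2 * h)
          return np.linalg.solve(A, b)

      R_true = 3.0e-7
      rng = np.random.default_rng(0)
      observed = solve_profile(R_true) + rng.normal(0.0, 0.002, z.size)
      fit = least_squares(lambda p: solve_profile(p[0]) - observed,
                          x0=[1e-8], bounds=(0.0, np.inf))
      print(f"fitted R = {fit.x[0]:.2e} mol m^-3 s^-1 (true {R_true:.2e})")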

  6. Quantifying Drosophila food intake: comparative analysis of current methodology

    PubMed Central

    Deshpande, Sonali A.; Carvalho, Gil B.; Amador, Ariadna; Phillips, Angela M.; Hoxha, Sany; Lizotte, Keith J.; Ja, William W.

    2014-01-01

    Food intake is a fundamental parameter in animal studies. Despite the prevalent use of Drosophila in laboratory research, precise measurements of food intake remain challenging in this model organism. Here, we compare several common Drosophila feeding assays: the Capillary Feeder (CAFE), food-labeling with a radioactive tracer or a colorimetric dye, and observations of proboscis extension (PE). We show that the CAFE and radioisotope-labeling provide the most consistent results, have the highest sensitivity, and can resolve differences in feeding that dye-labeling and PE fail to distinguish. We conclude that performing the radiolabeling and CAFE assays in parallel is currently the best approach for quantifying Drosophila food intake. Understanding the strengths and limitations of food intake methodology will greatly advance Drosophila studies of nutrition, behavior, and disease. PMID:24681694

  7. Fuzzy Entropy Method for Quantifying Supply Chain Networks Complexity

    NASA Astrophysics Data System (ADS)

    Zhang, Jihui; Xu, Junqin

    Supply chains are a special kind of complex network. Their complexity and uncertainty make them very difficult to control and manage, and they face a rising complexity of products, structures, and processes. Because of the strong link between a supply chain’s complexity and its efficiency, complexity management has become a major challenge of today’s business management. The aim of this paper is to quantify the complexity and organization level of an industrial network, working towards the development of a ‘Supply Chain Network Analysis’ (SCNA). By measuring flows of goods and interaction costs between different sectors of activity within the supply chain borders, a network of flows is built and then investigated by network analysis. The results of this study show that our approach can provide an interesting conceptual perspective in which the modern supply network can be framed, and that network analysis can handle these issues in practice.
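
    The underlying idea of entropy as a network complexity measure can be illustrated with a crisp (non-fuzzy) Shannon version applied to an inter-sector flow matrix. The paper's fuzzy-entropy formulation differs in detail, and the two toy networks below are invented; the sketch only shows that more evenly spread flows score as more complex.

      import numpy as np

      # Shannon entropy of the normalized flow matrix as a crude complexity
      # proxy; a fuzzy entropy would replace the crisp probabilities below.

      def flow_entropy(F):
          p = np.asarray(F, float).ravel()
          p = p[p > 0] / p.sum()               # normalize the positive flows
          return float(-(p * np.log2(p)).sum())

      chain = [[0, 10, 0], [0, 0, 10], [0, 0, 0]]   # chain-like supply network
      mesh = [[0, 5, 5], [5, 0, 5], [5, 5, 0]]      # evenly meshed network
      print(f"chain: {flow_entropy(chain):.2f} bits, mesh: {flow_entropy(mesh):.2f} bits")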

  8. Quantifying cortical activity during general anesthesia using wavelet analysis.

    PubMed

    Zikov, Tatjana; Bibian, Stéphane; Dumont, Guy A; Huzmezan, Mihai; Ries, Craig R

    2006-04-01

    This paper reports on a novel method for quantifying the cortical activity of a patient during general anesthesia as a surrogate measure of the patient's level of consciousness. The proposed technique is based on the analysis of a single-channel (frontal) electroencephalogram (EEG) signal using the stationary wavelet transform (SWT). The wavelet coefficients calculated from the EEG are pooled into a statistical representation, which is then compared to two well-defined states: the awake state with normal EEG activity, and the isoelectric state with maximal cortical depression. The resulting index, referred to as the wavelet-based anesthetic value for central nervous system monitoring (WAV(CNS)), quantifies the depth of consciousness between these two extremes. To validate the proposed technique, we present a clinical study that explores the advantages of the WAV(CNS) in comparison with the BIS monitor (Aspect Medical Systems, MA), currently a reference in consciousness monitoring. Results show that the WAV(CNS) and BIS are well correlated (r = 0.969) during steady-state periods despite fundamental algorithmic differences. However, in terms of dynamic behavior, the WAV(CNS) offers faster tracking of transitory changes at induction and emergence, with an average lead of 15-30 s. Furthermore, unlike the BIS, the WAV(CNS) regains its preinduction baseline value when patients are responding to verbal command after emergence from anesthesia. We conclude that the proposed analysis technique is an attractive alternative to BIS monitoring. In addition, we show that the WAV(CNS) dynamics can be modeled as a linear time-invariant transfer function. This index is, therefore, well suited for use as a feedback sensor in advisory systems and closed-loop control schemes, and for the identification of pharmacodynamic models of anesthetic drugs.
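
    The general flavor of such an index can be sketched as follows: pool SWT detail coefficients into a normalized histogram and score a segment by its relative distance to two reference histograms. The actual WAV(CNS) construction is not spelled out in the abstract, so the wavelet, band choice, bin edges and L1 distance below are illustrative assumptions only, as are the synthetic reference signals.

      import numpy as np
      import pywt

      def swt_histogram(eeg, wavelet="db4", level=3, bins=np.linspace(-1, 1, 33)):
          """Normalized histogram of SWT detail coefficients (deepest level)."""
          n = len(eeg) - len(eeg) % (2 ** level)   # SWT needs a multiple of 2^level
          coeffs = pywt.swt(np.asarray(eeg[:n], float), wavelet, level=level)
          detail = coeffs[0][1]                    # detail coefficients, deepest level
          detail = detail / (np.max(np.abs(detail)) + 1e-12)
          hist, _ = np.histogram(detail, bins=bins, density=True)
          return hist / hist.sum()

      def consciousness_index(eeg, awake_ref, iso_ref):
          h = swt_histogram(eeg)
          d_awake = np.abs(h - awake_ref).sum()    # L1 distance to the two anchors
          d_iso = np.abs(h - iso_ref).sum()
          return 100.0 * d_iso / (d_awake + d_iso)  # 100 awake-like, 0 isoelectric-like

      rng = np.random.default_rng(1)
      awake_ref = swt_histogram(rng.standard_normal(1024))   # stand-in references
      iso_ref = swt_histogram(np.zeros(1024))
      print(consciousness_index(0.5 * rng.standard_normal(1024), awake_ref, iso_ref))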

  9. Measuring political polarization: Twitter shows the two sides of Venezuela

    NASA Astrophysics Data System (ADS)

    Morales, A. J.; Borondo, J.; Losada, J. C.; Benito, R. M.

    2015-03-01

    We say that a population is perfectly polarized when it is divided into two groups of the same size holding opposite opinions. In this paper, we propose a methodology to study and measure the emergence of polarization from social interactions. We begin by proposing a model to estimate opinions in which a minority of influential individuals propagate their opinions through a social network. The result of the model is an opinion probability density function. Next, we propose an index to quantify the extent to which the resulting distribution is polarized. Finally, we apply the proposed methodology to a Twitter conversation about the late Venezuelan president, Hugo Chávez, finding good agreement between our results and offline data. Hence, we show that our methodology can detect different degrees of polarization, depending on the structure of the network.
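
    A hedged reconstruction of such an index (not necessarily the paper's exact formula) combines the population balance of the two opinion poles with the distance between their centers of gravity, so that two equal-sized groups at opposite extremes score near 1 and a single consensus group scores 0:

      import numpy as np

      def polarization_index(opinions):
          x = np.asarray(opinions, float)          # opinions scaled to [-1, 1]
          pos, neg = x[x > 0], x[x < 0]
          if len(pos) == 0 or len(neg) == 0:
              return 0.0                           # a single pole: no polarization
          balance = 1.0 - abs(len(pos) - len(neg)) / len(x)
          distance = (pos.mean() - neg.mean()) / 2.0   # max possible distance is 2
          return balance * distance

      print(polarization_index(np.concatenate([np.full(500, 0.9),
                                               np.full(500, -0.9)])))  # ~0.9
      print(polarization_index(np.random.default_rng(0).uniform(-1, 1, 1000)))  # lower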

  10. Quantifying capital goods for waste incineration

    SciTech Connect

    Brogaard, L.K.; Riber, C.; Christensen, T.H.

    2013-06-15

    Highlights: • Materials and energy used for the construction of waste incinerators were quantified. • The data was collected from five incineration plants in Scandinavia. • Included were six main materials, electronic systems, cables and all transportation. • The capital goods contributed 2–3% compared to the direct emissions impact on GW. - Abstract: Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000–240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used, amounting to 19,000–26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000–5000 MW h. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7–14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2–3% with respect to kg CO2 per tonne of waste combusted.

  11. Plan Showing Cross Bracing Under Upper Stringers, Typical Section Showing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Plan Showing Cross Bracing Under Upper Stringers, Typical Section Showing End Framing, Plan Showing Cross Bracing Under Lower Stringers, End Elevation - Covered Bridge, Spanning Contoocook River, Hopkinton, Merrimack County, NH

  12. Processing of Numerical and Proportional Quantifiers

    ERIC Educational Resources Information Center

    Shikhare, Sailee; Heim, Stefan; Klein, Elise; Huber, Stefan; Willmes, Klaus

    2015-01-01

    Quantifier expressions like "many" and "at least" are part of a rich repository of words in language representing magnitude information. The role of numerical processing in comprehending quantifiers was studied in a semantic truth value judgment task, asking adults to quickly verify sentences about visual displays using…

  13. Quantifying and Predicting Reactive Transport

    SciTech Connect

    Peter C. Burns, Department of Civil Engineering and Geological Sciences, University of Notre Dame

    2009-12-04

    This project was led by Dr. Jiamin Wan at Lawrence Berkeley National Laboratory. Peter Burns provided expertise in uranium mineralogy and in identification of uranium minerals in test materials. Dr. Wan conducted column tests regarding uranium transport at LBNL, and samples of the resulting columns were sent to Dr. Burns for analysis. Samples were analyzed for uranium mineralogy by X-ray powder diffraction and by scanning electron microscopy, and results were provided to Dr. Wan for inclusion in the modeling effort. Full details of the project can be found in Dr. Wan's final reports for the associated effort at LBNL.

  14. Quantifying temporal ventriloquism in audiovisual synchrony perception.

    PubMed

    Kuling, Irene A; Kohlrausch, Armin; Juola, James F

    2013-10-01

    The integration of visual and auditory inputs in the human brain works properly only if the components are perceived in close temporal proximity. In the present study, we quantified cross-modal interactions in the human brain for audiovisual stimuli with temporal asynchronies, using a paradigm from rhythm perception. In this method, participants had to align the temporal position of a target in a rhythmic sequence of four markers. In the first experiment, target and markers consisted of a visual flash or an auditory noise burst, and all four combinations of target and marker modalities were tested. In the same-modality conditions, no temporal biases and a high precision of the adjusted temporal position of the target were observed. In the different-modality conditions, we found a systematic temporal bias of 25-30 ms. In the second part of the first experiment, and in a second experiment, we tested conditions in which audiovisual markers with different stimulus onset asynchronies (SOAs) between the two components and a visual target were used to quantify temporal ventriloquism. The adjusted target positions varied by up to about 50 ms and depended in a systematic way on the SOA and its proximity to the point of subjective synchrony. These data allowed us to test different quantitative models. The most satisfying model, based on work by Maij, Brenner, and Smeets (Journal of Neurophysiology 102, 490-495, 2009), linked temporal ventriloquism and the percept of synchrony and was capable of adequately describing the results from the present study, as well as those of some earlier experiments. PMID:23868564

  15. Quantifying chaos for ecological stoichiometry.

    PubMed

    Duarte, Jorge; Januário, Cristina; Martins, Nuno; Sardanyés, Josep

    2010-09-01

    The theory of ecological stoichiometry considers ecological interactions among species with different chemical compositions. Both experimental and theoretical investigations have shown the importance of species composition in the outcome of the population dynamics. A recent study of a theoretical three-species food chain model considering stoichiometry [B. Deng and I. Loladze, Chaos 17, 033108 (2007)] shows that coexistence between two consumers predating on the same prey is possible via chaos. In this work we study the topological and dynamical measures of the chaotic attractors found in such a model under ecologically relevant parameters. By using the theory of symbolic dynamics, we first compute the topological entropy associated with unimodal Poincaré return maps obtained by Deng and Loladze from a dimension reduction. With this measure we numerically prove chaotic competitive coexistence, which is characterized by positive topological entropy and positive Lyapunov exponents, achieved when the first predator reduces its maximum growth rate, as happens with increasing δ1. However, for higher values of δ1 the dynamics again become stable due to an asymmetric bubble-like bifurcation scenario. We also show that a decrease in the efficiency of the predator sensitive to prey's quality (increasing parameter ζ) stabilizes the dynamics. Finally, we estimate the fractal dimension of the chaotic attractors for the stoichiometric ecological model.
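
    The positive-Lyapunov-exponent diagnostic used here to certify chaos can be illustrated on a much simpler system than the three-species food chain. For a one-dimensional map, lambda = lim (1/n) sum ln|f'(x_k)|; the sketch below (logistic map, a stand-in chosen for brevity, not the paper's model) shows the sign change between a periodic and a chaotic parameter value.

      import math

      # lambda = lim (1/n) sum ln|f'(x_k)| for the logistic map f(x) = r*x*(1-x).

      def lyapunov_logistic(r, n=100_000, x0=0.2, burn=1_000):
          x = x0
          for _ in range(burn):                    # discard the transient
              x = r * x * (1 - x)
          acc = 0.0
          for _ in range(n):
              acc += math.log(abs(r * (1 - 2 * x)))   # |f'(x)| = |r(1 - 2x)|
              x = r * x * (1 - x)
          return acc / n

      print(f"r=3.50: lambda = {lyapunov_logistic(3.5):+.3f} (periodic, negative)")
      print(f"r=4.00: lambda = {lyapunov_logistic(4.0):+.3f} (chaotic, about ln 2)")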

  16. Quantifying Coral Reef Ecosystem Services

    EPA Science Inventory

    Coral reefs have been declining during the last four decades as a result of both local and global anthropogenic stresses. Numerous research efforts to elucidate the nature, causes, magnitude, and potential remedies for the decline have led to the widely held belief that the recov...

  17. Common ecology quantifies human insurgency.

    PubMed

    Bohorquez, Juan Camilo; Gourley, Sean; Dixon, Alexander R; Spagat, Michael; Johnson, Neil F

    2009-12-17

    Many collective human activities, including violence, have been shown to exhibit universal patterns. The size distributions of casualties both in whole wars from 1816 to 1980 and terrorist attacks have separately been shown to follow approximate power-law distributions. However, the possibility of universal patterns ranging across wars in the size distribution or timing of within-conflict events has barely been explored. Here we show that the sizes and timing of violent events within different insurgent conflicts exhibit remarkable similarities. We propose a unified model of human insurgency that reproduces these commonalities, and explains conflict-specific variations quantitatively in terms of underlying rules of engagement. Our model treats each insurgent population as an ecology of dynamically evolving, self-organized groups following common decision-making processes. Our model is consistent with several recent hypotheses about modern insurgency, is robust to many generalizations, and establishes a quantitative connection between human insurgency, global terrorism and ecology. Its similarity to financial market models provides a surprising link between violent and non-violent forms of human behaviour. PMID:20016600
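
    The standard estimator behind such power-law claims is the continuous maximum-likelihood exponent of Clauset, Shalizi & Newman (2009), alpha = 1 + n / sum ln(x_i / x_min). A minimal self-check on synthetic data is sketched below; x_min is assumed known here, whereas in practice it is chosen by minimizing the Kolmogorov-Smirnov distance.

      import numpy as np

      def powerlaw_alpha(x, x_min):
          """Continuous MLE: alpha = 1 + n / sum(ln(x_i / x_min))."""
          tail = np.asarray([v for v in x if v >= x_min], float)
          return 1.0 + len(tail) / np.sum(np.log(tail / x_min))

      # Synthetic check: inverse-transform samples from p(x) ~ x^(-2.5), x >= 1.
      rng = np.random.default_rng(42)
      samples = (1.0 - rng.uniform(size=50_000)) ** (-1.0 / 1.5)
      print(f"estimated alpha = {powerlaw_alpha(samples, 1.0):.3f} (true 2.5)")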

  18. Quantifying the geometry of micropipets.

    PubMed

    Bowman, C L; Ruknudin, A M

    1999-01-01

    Accurate knowledge of the internal diameter (id) of micropipet tips is important, because the ability to study many different aspects of biological membranes is a very sensitive function of tip size. The authors examined two methods used to characterize pipet tips: the digital manometric method (DMM) and the bubble number method (BNM). For DMM, the authors compared the performance of Laplace's equation (model I) with that of a modified form of the equation (model II), which accounts for adhesion between the test fluid and the glass. Pressure measurements were made with a digital manometer, and ids at the tip were measured using scanning electron microscopy (SEM). The micropipet tips showed a slight asymmetry in id, with an approximately 5% difference between maximum and minimum id. On average, model I overestimates the largest id by 2%. Model II overestimates the smaller id by 2%. For micropipet tips ranging from 1.00 to 5.00 microm, the corresponding uncertainties range from 20 to 100 nm. Making the normally hydrophilic glass surface hydrophobic strongly reduced threshold pressures when tested in water, but not in 100% methanol. Compared to BNM, DMM was insensitive to changes in atmospheric pressure; BNM can be corrected for changes in atmospheric pressure. Convergence angle(s) can be determined from measurements of the pressure and the axial distance of the meniscus from the tip. The accuracy and precision of digital manometry approach those of SEM. DMM should be particularly useful in selecting micropipets for patch clamp studies of small vesicles (< 10 microm), and may enable systematic selection of micropipets for many other experiments.
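
    The physics behind model I is the bubble-point form of Laplace's law: at the threshold pressure the meniscus radius equals the tip radius. Model II adds an adhesion correction whose exact form is not reproduced here. A worked instance with illustrative values:

      \[
      \Delta P = \frac{4\gamma}{d}
      \quad\Longrightarrow\quad
      d = \frac{4\gamma}{\Delta P},
      \]
      so for methanol ($\gamma \approx 22.6\ \mathrm{mN/m}$) and a measured
      threshold pressure of $\Delta P = 90\ \mathrm{kPa}$,
      \[
      d = \frac{4 \times 0.0226\ \mathrm{N/m}}{9.0 \times 10^{4}\ \mathrm{Pa}}
        \approx 1.0\ \mu\mathrm{m}.
      \]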

  19. Quantifying Barrier Island Recovery Following a Hurricane

    NASA Astrophysics Data System (ADS)

    Hammond, B.; Houser, C.

    2014-12-01

    Barrier islands are dynamic landscapes that are believed to minimize storm impact to mainland communities and also provide important ecological services in the coastal environment. The protection afforded by the island and the services it provides, however, depend on island resiliency in the face of accelerated sea level rise, which is in turn dependent on the rate of island recovery following storm events that may also change in both frequency and magnitude in the future. These changes in frequency may affect even large dunes and their resiliency, resulting in the island transitioning from a high to a low elevation. Previous research has shown that the condition of the foredune depends on the recovery of the nearshore and beach profile and the ability of vegetation to capture aeolian-transported sediment. An inability of the foredune to recover may result in mainland susceptibility to storm energy, inability for ecosystems to recover and thrive, and sediment budget instability. In this study, LiDAR data is used to quantify the rates of dune recovery at Fire Island, NY, the Outer Banks, NC, Santa Rosa Island, FL, and Matagorda Island, TX. Preliminary results indicate foredune recovery varies significantly both alongshore and in the cross-shore, suggesting that barrier island response and recovery to storm events cannot be considered from a strictly two-dimensional (cross-shore) perspective.

  20. Diagnostic measure to quantify loss of clinical components in multi-lead electrocardiogram.

    PubMed

    Tripathy, R K; Sharma, L N; Dandapat, S

    2016-03-01

    In this Letter, a novel principal component (PC)-based diagnostic measure (PCDM) is proposed to quantify the loss of clinical components in multi-lead electrocardiogram (MECG) signals. Analysis of MECG shows that the clinical components are captured in a few PCs. The proposed diagnostic measure is defined as the sum of the weighted percentage root mean square differences (PRD) between the PCs of the original and processed MECG signals, with weights depending on the clinical importance of the PCs. The PCDM is tested on MECG enhancement and a novel MECG data reduction scheme. The proposed measure is compared with weighted diagnostic distortion, wavelet energy diagnostic distortion and PRD. The qualitative evaluation is performed using the Spearman rank-order correlation coefficient (SROCC) and the Pearson linear correlation coefficient. The simulation results demonstrate that the PCDM performs better at quantifying the loss of clinical components in MECG, showing a SROCC value of 0.9686 against the subjective measure. PMID:27222735
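
    The measure as described admits a compact sketch: compute the principal components of the original MECG in lead space, project both signals onto that basis, take the PRD of each PC pair, and form a weighted sum. The weights below are placeholders, not the paper's values, and the toy signals are random stand-ins for real multi-lead ECG.

      import numpy as np

      def pcdm(original, processed, weights):
          """original/processed: (leads, samples) arrays; weights: one per PC."""
          X = original - original.mean(axis=1, keepdims=True)
          Y = processed - processed.mean(axis=1, keepdims=True)
          eigvals, eigvecs = np.linalg.eigh(X @ X.T)     # lead-space covariance
          basis = eigvecs[:, np.argsort(eigvals)[::-1]]  # most energetic PC first
          pc_x, pc_y = basis.T @ X, basis.T @ Y          # PC time series
          prd = 100 * np.sqrt(((pc_x - pc_y) ** 2).sum(axis=1)
                              / (pc_x ** 2).sum(axis=1))
          w = np.asarray(weights, float) / np.sum(weights)
          return float(np.sum(w * prd))

      # 8 leads, 2000 samples of toy data; "processed" = original + small noise.
      rng = np.random.default_rng(7)
      ecg = rng.standard_normal((8, 2000))
      noisy = ecg + 0.05 * rng.standard_normal(ecg.shape)
      print(f"PCDM = {pcdm(ecg, noisy, weights=[8, 4, 2, 1, 1, 1, 1, 1]):.2f} %")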

  1. Quantifying error distributions in crowding.

    PubMed

    Hanus, Deborah; Vul, Edward

    2013-03-22

    When multiple objects are in close proximity, observers have difficulty identifying them individually. Two classes of theories aim to account for this crowding phenomenon: spatial pooling and spatial substitution. Variations of these accounts predict different patterns of errors in crowded displays. Here we aim to characterize the kinds of errors that people make during crowding by comparing a number of error models across three experiments in which we manipulate flanker spacing, display eccentricity, and precueing duration. We find that both spatial intrusions and individual letter confusions play a considerable role in errors. Moreover, we find no evidence that a naïve pooling model that predicts errors based on a nonadditive combination of target and flankers explains errors better than an independent intrusion model (indeed, in our data, an independent intrusion model is slightly, but significantly, better). Finally, we find that manipulating trial difficulty in any way (spacing, eccentricity, or precueing) produces homogenous changes in error distributions. Together, these results provide quantitative baselines for predictive models of crowding errors, suggest that pooling and spatial substitution models are difficult to tease apart, and imply that manipulations of crowding all influence a common mechanism that impacts subject performance.

  2. Quantifying antimicrobial resistance at veal calf farms.

    PubMed

    Bosman, Angela B; Wagenaar, Jaap A; Stegeman, Arjan; Vernooij, Hans; Mevius, Dik

    2012-01-01

    This study was performed to determine a sampling strategy to quantify the prevalence of antimicrobial resistance on veal calf farms, based on the variation in antimicrobial resistance within and between calves on five farms. Faecal samples from 50 healthy calves (10 calves/farm) were collected. From each individual sample and one pooled faecal sample per farm, 90 selected Escherichia coli isolates were tested for their resistance against 25 mg/L amoxicillin, 25 mg/L tetracycline, 0.5 mg/L cefotaxime, 0.125 mg/L ciprofloxacin and 8/152 mg/L trimethoprim/sulfamethoxazole (tmp/s) by replica plating. From each faecal sample another 10 selected E. coli isolates were tested for their resistance by broth microdilution as a reference. Logistic regression analysis was performed to compare the odds of testing an isolate resistant between both test methods (replica plating vs. broth microdilution) and to evaluate the effect of pooling faecal samples. Bootstrap analysis was used to investigate the precision of the estimated prevalence of resistance to each antimicrobial obtained by several simulated sampling strategies. Replica plating showed similar odds of E. coli isolates tested resistant compared to broth microdilution, except for ciprofloxacin (OR 0.29, p ≤ 0.05). Pooled samples showed in general lower odds of an isolate being resistant compared to individual samples, although these differences were not significant. Bootstrap analysis showed that within each antimicrobial the various compositions of a pooled sample provided consistent estimates for the mean proportion of resistant isolates. Sampling strategies should be based on the variation in resistance among isolates within faecal samples and between faecal samples, which may vary by antimicrobial. In our study, the optimal sampling strategy from the perspective of precision of the estimated levels of resistance and practicality consists of a pooled faecal sample from 20 individual animals, of which 90 isolates are

  3. Quantifying athlete self-talk.

    PubMed

    Hardy, James; Hall, Craig R; Hardy, Lew

    2005-09-01

    Two studies were conducted. The aims of Study 1 were (a) to generate quantitative data on the content of athletes' self-talk and (b) to examine differences in the use of self-talk in general as well as the functions of self-talk in practice and competition settings. Differences in self-talk between the sexes, sport types and skill levels were also assessed. Athletes (n = 295, mean age = 21.9 years) from a variety of sports and competitive levels completed the Self-Talk Use Questionnaire (STUQ), which was developed specifically for the study. In Study 1, single-factor between-group multivariate analyses of variance revealed significant differences across sex and sport type for the content of self-talk. Mixed-model multivariate analyses of variance revealed overall greater use of self-talk, as well as increased use of the functions of self-talk, in competition compared with practice. Moreover, individual sport athletes reported greater use of self-talk, as well as the functions of self-talk, than their team sport counterparts. In Study 2, recreational volleyball players (n = 164, mean age = 21.5 years) completed a situationally modified STUQ. The results were very similar to those of Study 1. That the content of athlete self-talk was generally positive, covert and abbreviated lends support to the application of Vygotsky's (1986) verbal self-regulation theory to the study of self-talk in sport. Researchers are encouraged to examine the effectiveness of self-talk in future studies.

  4. Quantifying asymmetry: ratios and alternatives.

    PubMed

    Franks, Erin M; Cabo, Luis L

    2014-08-01

    Traditionally, the study of metric skeletal asymmetry has relied largely on univariate analyses, utilizing ratio transformations when the goal is comparing asymmetries in skeletal elements or populations of dissimilar dimensions. Under this approach, raw asymmetries are divided by a size marker, such as a bilateral average, in an attempt to produce size-free asymmetry indices. Henceforth, this will be referred to as "controlling for size" (see Smith: Curr Anthropol 46 (2005) 249-273). Ratios obtained in this manner often require further transformations to interpret the meaning and sources of asymmetry. This model frequently ignores the fundamental assumption of ratios: the relationship between the variables entered in the ratio must be isometric. Violations of this assumption can obscure existing asymmetries and render spurious results. In this study, we examined the performance of the classic indices in detecting and portraying the asymmetry patterns in four human appendicular bones and explored potential methodological alternatives. Examination of the ratio model revealed that it does not fulfill its intended goals in the bones examined, as the numerator and denominator are independent in all cases. The ratios also introduced strong biases in the comparisons between different elements and variables, generating spurious asymmetry patterns. Multivariate analyses strongly suggest that any transformation to control for overall size or variable range must be conducted before, rather than after, calculating the asymmetries. A combination of exploratory multivariate techniques, such as Principal Components Analysis, and confirmatory linear methods, such as regression and analysis of covariance, appear as a promising and powerful alternative to the use of ratios. PMID:24842694

  5. Comparing methods of quantifying tibial acceleration slope.

    PubMed

    Duquette, Adriana M; Andrews, David M

    2010-05-01

    Considerable variability in tibial acceleration slope (AS) values, and different interpretations of injury risk based on these values, have been reported. Acceleration slope variability may be due in part to variations in the quantification methods used. Therefore, the purpose of this study was to quantify differences in tibial AS values determined using end points at various percentage ranges between impact and peak tibial acceleration, as a function of either amplitude or time. Tibial accelerations were recorded from 20 participants (21.8 +/- 2.9 years, 1.7 m +/- 0.1 m, 75.1 kg +/- 17.0 kg) during 24 unshod heel impacts using a human pendulum apparatus. Nine ranges were tested from 5-95% (widest range) to 45-55% (narrowest range) at 5% increments. AS(Amplitude) values increased consistently from the widest to narrowest ranges, whereas the AS(Time) values remained essentially the same. The magnitudes of AS(Amplitude) values were significantly higher and more sensitive to changes in percentage range than AS(Time) values derived from the same impact data. This study shows that tibial AS magnitudes are highly dependent on the method used to calculate them. Researchers are encouraged to carefully consider the method they use to calculate AS so that equivalent comparisons and assessments of injury risk across studies can be made.
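
    The two definitions being compared reduce to a choice of end points, as the sketch below makes explicit: between impact and peak, pick end points at given percentages of either the amplitude or the elapsed time, then take the straight-line slope. The synthetic signal and the 5-95% range are illustrative only; sampling rate and impact detection are simplified away.

      import numpy as np

      def accel_slopes(t, a, p1=5, p2=95):
          """t in s, a in g; assumes the record starts at impact, rises to peak."""
          i_peak = int(np.argmax(a))
          t_seg, a_seg = t[: i_peak + 1], a[: i_peak + 1]
          # amplitude-based end points: first crossings of p1% and p2% of peak
          lo, hi = np.array([p1, p2]) / 100 * a_seg[-1]
          i1 = int(np.argmax(a_seg >= lo))
          i2 = int(np.argmax(a_seg >= hi))
          as_amp = (a_seg[i2] - a_seg[i1]) / (t_seg[i2] - t_seg[i1])
          # time-based end points: p1% and p2% of the impact-to-peak duration
          j1 = int(round(p1 / 100 * i_peak))
          j2 = int(round(p2 / 100 * i_peak))
          as_time = (a_seg[j2] - a_seg[j1]) / (t_seg[j2] - t_seg[j1])
          return as_amp, as_time

      t = np.linspace(0, 0.02, 201)            # 20 ms rise to peak
      a = 10 * (t / t[-1]) ** 2                # toy nonlinear rise to 10 g
      print("AS(5-95%%): amplitude=%.0f g/s, time=%.0f g/s" % accel_slopes(t, a))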

  6. The missing metric: quantifying contributions of reviewers

    PubMed Central

    Cantor, Maurício; Gero, Shane

    2015-01-01

    The number of contributing reviewers often outnumbers the authors of publications. This has led to apathy towards reviewing and the conclusion that the peer-review system is broken. Given the trade-offs between submitting and reviewing manuscripts, reviewers and authors naturally want visibility for their efforts. While study after study has called for revolutionizing publication practices, the current paradigm does not recognize reviewers' time and expertise. We propose the R-index as a simple way to quantify scientists' contributions as reviewers. We modelled its performance using simulations based on real data to show that early–mid career scientists, who complete high-quality reviews of longer manuscripts within their field, can perform as well as leading scientists reviewing only for high-impact journals. By giving citeable academic recognition for reviewing, R-index will encourage more participation with better reviews, regardless of the career stage. Moreover, the R-index will allow editors to exploit scores to manage and improve their review team, and for journals to promote high average scores as signals of a practical and efficient service to authors. Peer-review is a pervasive necessity across disciplines and the simple utility of this missing metric will credit a valuable aspect of academic productivity without having to revolutionize the current peer-review system. PMID:26064609

  7. Quantifying capital goods for waste incineration.

    PubMed

    Brogaard, L K; Riber, C; Christensen, T H

    2013-06-01

    Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000-240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used amounting to 19,000-26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000-5000 MW h. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7-14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2-3% with respect to kg CO2 per tonne of waste combusted.

  8. The missing metric: quantifying contributions of reviewers.

    PubMed

    Cantor, Maurício; Gero, Shane

    2015-02-01

    The number of contributing reviewers often outnumbers the authors of publications. This has led to apathy towards reviewing and the conclusion that the peer-review system is broken. Given the trade-offs between submitting and reviewing manuscripts, reviewers and authors naturally want visibility for their efforts. While study after study has called for revolutionizing publication practices, the current paradigm does not recognize reviewers' time and expertise. We propose the R-index as a simple way to quantify scientists' contributions as reviewers. We modelled its performance using simulations based on real data to show that early-mid career scientists, who complete high-quality reviews of longer manuscripts within their field, can perform as well as leading scientists reviewing only for high-impact journals. By giving citeable academic recognition for reviewing, R-index will encourage more participation with better reviews, regardless of the career stage. Moreover, the R-index will allow editors to exploit scores to manage and improve their review team, and for journals to promote high average scores as signals of a practical and efficient service to authors. Peer-review is a pervasive necessity across disciplines and the simple utility of this missing metric will credit a valuable aspect of academic productivity without having to revolutionize the current peer-review system.

  9. Quantifying capital goods for waste incineration.

    PubMed

    Brogaard, L K; Riber, C; Christensen, T H

    2013-06-01

    Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000-240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used amounting to 19,000-26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000-5000 MW h. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7-14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2-3% with respect to kg CO2 per tonne of waste combusted. PMID:23561797

  10. Polymer microlenses for quantifying cell sheet mechanics.

    PubMed

    Miquelard-Garnier, Guillaume; Zimberlin, Jessica A; Sikora, Christian B; Wadsworth, Patricia; Crosby, Alfred

    2010-01-01

    Mechanical interactions between individual cells and their substrate have been studied extensively over the past decade; however, understanding how these interactions change as cells interact with neighboring cells in the development of a cell sheet, or early stage tissue, is less developed. We use a recently developed experimental technique for quantifying the mechanics of confluent cell sheets. Living cells are cultured on a thin film of polystyrene [PS], which is attached to a patterned substrate of crosslinked poly(dimethyl siloxane) [PDMS] microwells. As cells attach to the substrate and begin to form a sheet, they apply sufficient contractile force to buckle the PS film over individual microwells to form a microlens array. The curvature for each microlens is measured by confocal microscopy and can be related to the strain and stress applied by the cell sheet using simple mechanical analysis for the buckling of thin films. We demonstrate that this technique can provide insight into the important materials properties and length scales that govern cell sheet responses, especially the role of stiffness of the substrate. We show that intercellular forces can lead to significantly different behaviors than the ones observed for individual cells, where focal adhesion is the relevant parameter.

  11. Asteroid Geophysics and Quantifying the Impact Hazard

    NASA Technical Reports Server (NTRS)

    Sears, D.; Wooden, D. H.; Korycanksy, D. G.

    2015-01-01

    Probably the major challenge in understanding, quantifying, and mitigating the effects of an impact on Earth is understanding the nature of the impactor. Of the roughly 25 meteorite craters on Earth that have associated meteorites, all but one were produced by iron meteorites; only one was produced by a stony meteorite. Equally important, even meteorites of a given chemical class behave in widely varying ways in the atmosphere, because they show considerable diversity in the mechanical properties that have a profound influence on their behavior during atmospheric passage. Some stony meteorites are weak and do not reach the surface, or reach it as thousands of relatively harmless pieces. Some roll into a maximum-drag configuration and are strong enough to remain intact, so a large single object reaches the surface. Others have high concentrations of water that may facilitate disruption. However, while meteorite falls and meteorites provide invaluable information on the physical nature of the objects entering the atmosphere, there are many unknowns concerning size and scale that can only be determined from the pre-atmospheric properties of the asteroids. Their internal structure, thermal properties, internal strength and composition will all play a role in determining the behavior of an object as it passes through the atmosphere: whether it produces an airblast and at what height, and the nature of the impact and the amount and distribution of ejecta.

  12. Quantifying Flaw Characteristics from IR NDE Data

    SciTech Connect

    Miller, W; Philips, N R; Burke, M W; Robbins, C L

    2003-02-14

    Work is presented which allows flaw characteristics to be quantified from the transient IR NDE signature. The goal of this effort was to accurately determine the type, size and depth of flaws revealed with IR NDE, using sonic IR as the example IR NDE technique. Typically an IR NDE experiment will result in a positive qualitative indication of a flaw such as a cold or hot spot in the image, but will not provide quantitative data thereby leaving the practitioner to make educated guesses as to the source of the signal. The technique presented here relies on comparing the transient IR signature to exact heat transfer analytical results for prototypical flaws, using the flaw characteristics as unknown fitting parameters. A nonlinear least squares algorithm is used to evaluate the fitting parameters, which then provide a direct measure of the flaw characteristics that can be mapped to the imaged surface for visual reference. The method uses temperature data for the heat transfer analysis, so radiometric calibration of the IR signal is required. The method provides quantitative data with a single thermal event (e.g. acoustic pulse or flash), as compared to phase-lock techniques that require many events. The work has been tested with numerical data but remains to be validated by experimental data, and that effort is underway.
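
    The fitting step can be sketched with a textbook stand-in for the prototypical-flaw model: the one-dimensional flash-heating solution over a subsurface reflector, with the flaw depth and interface reflectivity as the unknown parameters recovered by nonlinear least squares. This conveys the spirit of the approach only; the report's analytical models and its radiometric calibration step are not reproduced, and all numbers are invented.

      import numpy as np
      from scipy.optimize import least_squares

      # 1-D flash-heating surface temperature over a reflector at depth L:
      #   T(t) = q / sqrt(pi*t) * (1 + 2 * sum_n R^n * exp(-(n*L)^2 / (alpha*t)))

      alpha = 1.0e-5                          # thermal diffusivity, m^2/s (assumed known)
      t = np.linspace(0.05, 5.0, 120)         # s after the flash

      def surface_T(params, n_terms=10):
          q, L, R = params                    # flash strength, depth, reflectivity
          series = sum((R ** n) * np.exp(-((n * L) ** 2) / (alpha * t))
                       for n in range(1, n_terms + 1))
          return q / np.sqrt(np.pi * t) * (1 + 2 * series)

      rng = np.random.default_rng(3)
      truth = (1.0, 2.0e-3, 0.8)              # 2 mm deep flaw
      data = surface_T(truth) + 0.01 * rng.standard_normal(t.size)

      fit = least_squares(lambda p: surface_T(p) - data,
                          x0=(0.5, 1.0e-3, 0.5),
                          bounds=([0, 1e-4, 0], [10, 1e-2, 1]))
      print(f"fitted depth = {fit.x[1] * 1e3:.2f} mm (true 2.00 mm)")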

  13. 8. Detail showing concrete abutment, showing substructure of bridge, specifically ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. Detail showing concrete abutment, showing substructure of bridge, specifically west side of arch and substructure. - Presumpscot Falls Bridge, Spanning Presumptscot River at Allen Avenue extension, 0.75 mile west of U.S. Interstate 95, Falmouth, Cumberland County, ME

  14. 28. MAP SHOWING LOCATION OF ARVFS FACILITY AS BUILT. SHOWS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    28. MAP SHOWING LOCATION OF ARVFS FACILITY AS BUILT. SHOWS LINCOLN BOULEVARD, BIG LOST RIVER, AND NAVAL REACTORS FACILITY. F.C. TORKELSON DRAWING NUMBER 842-ARVFS-101-2. DATED OCTOBER 12, 1965. INEL INDEX CODE NUMBER: 075 0101 851 151969. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  15. Quantifying climatological ranges and anomalies for Pacific coral reef ecosystems.

    PubMed

    Gove, Jamison M; Williams, Gareth J; McManus, Margaret A; Heron, Scott F; Sandin, Stuart A; Vetter, Oliver J; Foley, David G

    2013-01-01

    Coral reef ecosystems are exposed to a range of environmental forcings that vary on daily to decadal time scales and across spatial scales spanning from reefs to archipelagos. Environmental variability is a major determinant of reef ecosystem structure and function, including coral reef extent and growth rates, and the abundance, diversity, and morphology of reef organisms. Proper characterization of environmental forcings on coral reef ecosystems is critical if we are to understand the dynamics and implications of abiotic-biotic interactions on reef ecosystems. This study combines high-resolution bathymetric information with remotely sensed sea surface temperature, chlorophyll-a and irradiance data, and modeled wave data to quantify environmental forcings on coral reefs. We present a methodological approach to develop spatially constrained, island- and atoll-scale metrics that quantify climatological range limits and anomalous environmental forcings across U.S. Pacific coral reef ecosystems. Our results indicate considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls, with emergent spatial patterns specific to each environmental forcing. For example, wave energy was greatest at northern latitudes and generally decreased with latitude. In contrast, chlorophyll-a was greatest at reef ecosystems proximate to the equator and northern-most locations, showing little synchrony with latitude. In addition, we find that the reef ecosystems with the highest chlorophyll-a concentrations (Jarvis, Howland, Baker, Palmyra and Kingman) are each uninhabited and are characterized by high hard coral cover and large numbers of predatory fishes. Finally, we find that scaling environmental data to the spatial footprint of individual islands and atolls is more likely to capture local environmental forcings, as chlorophyll-a concentrations decreased at relatively short distances (>7 km) from 85% of our study locations. These metrics will help

  16. Quantifying climatological ranges and anomalies for Pacific coral reef ecosystems.

    PubMed

    Gove, Jamison M; Williams, Gareth J; McManus, Margaret A; Heron, Scott F; Sandin, Stuart A; Vetter, Oliver J; Foley, David G

    2013-01-01

    Coral reef ecosystems are exposed to a range of environmental forcings that vary on daily to decadal time scales and across spatial scales spanning from reefs to archipelagos. Environmental variability is a major determinant of reef ecosystem structure and function, including coral reef extent and growth rates, and the abundance, diversity, and morphology of reef organisms. Proper characterization of environmental forcings on coral reef ecosystems is critical if we are to understand the dynamics and implications of abiotic-biotic interactions on reef ecosystems. This study combines high-resolution bathymetric information with remotely sensed sea surface temperature, chlorophyll-a and irradiance data, and modeled wave data to quantify environmental forcings on coral reefs. We present a methodological approach to develop spatially constrained, island- and atoll-scale metrics that quantify climatological range limits and anomalous environmental forcings across U.S. Pacific coral reef ecosystems. Our results indicate considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls, with emergent spatial patterns specific to each environmental forcing. For example, wave energy was greatest at northern latitudes and generally decreased with latitude. In contrast, chlorophyll-a was greatest at reef ecosystems proximate to the equator and northern-most locations, showing little synchrony with latitude. In addition, we find that the reef ecosystems with the highest chlorophyll-a concentrations (Jarvis, Howland, Baker, Palmyra and Kingman) are each uninhabited and are characterized by high hard coral cover and large numbers of predatory fishes. Finally, we find that scaling environmental data to the spatial footprint of individual islands and atolls is more likely to capture local environmental forcings, as chlorophyll-a concentrations decreased at relatively short distances (>7 km) from 85% of our study locations. These metrics will help

  17. Quantifying Climatological Ranges and Anomalies for Pacific Coral Reef Ecosystems

    PubMed Central

    Gove, Jamison M.; Williams, Gareth J.; McManus, Margaret A.; Heron, Scott F.; Sandin, Stuart A.; Vetter, Oliver J.; Foley, David G.

    2013-01-01

    Coral reef ecosystems are exposed to a range of environmental forcings that vary on daily to decadal time scales and across spatial scales spanning from reefs to archipelagos. Environmental variability is a major determinant of reef ecosystem structure and function, including coral reef extent and growth rates, and the abundance, diversity, and morphology of reef organisms. Proper characterization of environmental forcings on coral reef ecosystems is critical if we are to understand the dynamics and implications of abiotic–biotic interactions on reef ecosystems. This study combines high-resolution bathymetric information with remotely sensed sea surface temperature, chlorophyll-a and irradiance data, and modeled wave data to quantify environmental forcings on coral reefs. We present a methodological approach to develop spatially constrained, island- and atoll-scale metrics that quantify climatological range limits and anomalous environmental forcings across U.S. Pacific coral reef ecosystems. Our results indicate considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls, with emergent spatial patterns specific to each environmental forcing. For example, wave energy was greatest at northern latitudes and generally decreased with latitude. In contrast, chlorophyll-a was greatest at reef ecosystems proximate to the equator and northern-most locations, showing little synchrony with latitude. In addition, we find that the reef ecosystems with the highest chlorophyll-a concentrations (Jarvis, Howland, Baker, Palmyra and Kingman) are each uninhabited and are characterized by high hard coral cover and large numbers of predatory fishes. Finally, we find that scaling environmental data to the spatial footprint of individual islands and atolls is more likely to capture local environmental forcings, as chlorophyll-a concentrations decreased at relatively short distances (>7 km) from 85% of our study locations. These metrics will

  18. Pea Plants Show Risk Sensitivity.

    PubMed

    Dener, Efrat; Kacelnik, Alex; Shemesh, Hagai

    2016-07-11

    Sensitivity to variability in resources has been documented in humans, primates, birds, and social insects, but the fit between empirical results and the predictions of risk sensitivity theory (RST), which aims to explain this sensitivity in adaptive terms, is weak [1]. RST predicts that agents should switch between risk proneness and risk aversion depending on state and circumstances, especially according to the richness of the least variable option [2]. Unrealistic assumptions about agents' information processing mechanisms and poor knowledge of the extent to which variability imposes specific selection in nature are strong candidates to explain the gap between theory and data. RST's rationale also applies to plants, where it has not hitherto been tested. Given the differences between animals' and plants' information processing mechanisms, such tests should help unravel the conflicts between theory and data. Measuring root growth allocation by split-root pea plants, we show that they favor variability when mean nutrient levels are low and the opposite when they are high, supporting the most widespread RST prediction. However, the combination of non-linear effects of nitrogen availability at local and systemic levels may explain some of these effects as a consequence of mechanisms not necessarily evolved to cope with variance [3, 4]. This resembles animal examples in which properties of perception and learning cause risk sensitivity even though they are not risk adaptations [5]. PMID:27374342

  19. Quantifying drug-protein binding in vivo.

    SciTech Connect

    Buchholz, B; Bench, G; Keating III, G; Palmblad, M; Vogel, J; Grant, P G; Hillegonds, D

    2004-02-17

    Accelerator mass spectrometry (AMS) provides precise quantitation of isotope-labeled compounds that are bound to biological macromolecules such as DNA or proteins. The sensitivity is high enough to allow for sub-pharmacological ("micro-") dosing to determine macromolecular targets without inducing toxicities or altering the system under study, whether it is healthy or diseased. We demonstrated an application of AMS in quantifying the physiologic effects of one dosed chemical compound upon the binding level of another compound in vivo at sub-toxic doses [4]. We are using tissues left from this study to develop protocols for quantifying specific binding to isolated and identified proteins. We also developed a new technique to quantify nanogram to milligram amounts of isolated protein at precisions that are comparable to those for quantifying the bound compound by AMS.

  20. Quantifying Urban Groundwater in Environmental Field Observatories

    NASA Astrophysics Data System (ADS)

    Welty, C.; Miller, A. J.; Belt, K.; Smith, J. A.; Band, L. E.; Groffman, P.; Scanlon, T.; Warner, J.; Ryan, R. J.; Yeskis, D.; McGuire, M. P.

    2006-12-01

    Despite the growing footprint of urban landscapes and their impacts on hydrologic and biogeochemical cycles, comprehensive field studies of urban water budgets are few. The cumulative effects of urban infrastructure (buildings, roads, culverts, storm drains, detention ponds, leaking water supply and wastewater pipe networks) on temporal and spatial patterns of groundwater stores, fluxes, and flowpaths are poorly understood. The goal of this project is to develop expertise and analytical tools for urban groundwater systems that will inform future environmental observatory planning and that can be shared with research teams working in urban environments elsewhere. The work plan for this project draws on a robust set of information resources in Maryland provided by ongoing monitoring efforts of the Baltimore Ecosystem Study (BES), USGS, and the U.S. Forest Service working together with university scientists and engineers from multiple institutions. A key concern is to bridge the gap between small-scale intensive field studies and larger-scale and longer-term hydrologic patterns using synoptic field surveys, remote sensing, numerical modeling, data mining and visualization tools. Using the urban water budget as a unifying theme, we are working toward estimating the various elements of the budget in order to quantify the influence of urban infrastructure on groundwater. Efforts include: (1) comparison of base flow behavior from stream gauges in a nested set of watersheds at four different spatial scales from 0.8 to 171 km2, with diverse patterns of impervious cover and urban infrastructure; (2) synoptic survey of well water levels to characterize the regional water table; (3) use of airborne thermal infrared imagery to identify locations of groundwater seepage into streams across a range of urban development patterns; (4) use of seepage transects and tracer tests to quantify the spatial pattern of groundwater fluxes to the drainage network in selected subwatersheds; (5

  1. Portable XRF Technology to Quantify Pb in Bone In Vivo.

    PubMed

    Specht, Aaron James; Weisskopf, Marc; Nie, Linda Huiling

    2014-01-01

    Lead is a ubiquitous toxicant. Bone lead has been established as an important biomarker for cumulative lead exposures and has been correlated with adverse health effects on many systems in the body. K-shell X-ray fluorescence (KXRF) is the standard method for measuring bone lead, but this approach has many difficulties that have limited the widespread use of this exposure assessment method. With recent advancements in X-ray fluorescence (XRF) technology, we have developed a portable system that can quantify lead in bone in vivo within 3 minutes. Our study investigated improvements to the system, four calibration methods, and system validation for in vivo measurements. Our main results show that the detection limit of the system is 2.9 ppm with 2 mm soft tissue thickness, the best calibration method for in vivo measurement is background subtraction, and there is strong correlation between KXRF and portable LXRF bone lead results. Our results indicate that the technology is ready to be used in large human population studies to investigate adverse health effects of lead exposure. The portability of the system and fast measurement time should allow for this technology to greatly advance the research on lead exposure and public/environmental health. PMID:26317033

  2. Portable XRF Technology to Quantify Pb in Bone In Vivo

    PubMed Central

    Specht, Aaron James; Weisskopf, Marc; Nie, Linda Huiling

    2014-01-01

    Lead is a ubiquitous toxicant. Bone lead has been established as an important biomarker for cumulative lead exposures and has been correlated with adverse health effects on many systems in the body. K-shell X-ray fluorescence (KXRF) is the standard method for measuring bone lead, but this approach has many difficulties that have limited the widespread use of this exposure assessment method. With recent advancements in X-ray fluorescence (XRF) technology, we have developed a portable system that can quantify lead in bone in vivo within 3 minutes. Our study investigated improvements to the system, four calibration methods, and system validation for in vivo measurements. Our main results show that the detection limit of the system is 2.9 ppm with 2 mm soft tissue thickness, the best calibration method for in vivo measurement is background subtraction, and there is strong correlation between KXRF and portable LXRF bone lead results. Our results indicate that the technology is ready to be used in large human population studies to investigate adverse health effects of lead exposure. The portability of the system and fast measurement time should allow for this technology to greatly advance the research on lead exposure and public/environmental health. PMID:26317033

  3. Portable XRF Technology to Quantify Pb in Bone In Vivo.

    PubMed

    Specht, Aaron James; Weisskopf, Marc; Nie, Linda Huiling

    2014-01-01

    Lead is a ubiquitous toxicant. Bone lead has been established as an important biomarker for cumulative lead exposures and has been correlated with adverse health effects on many systems in the body. K-shell X-ray fluorescence (KXRF) is the standard method for measuring bone lead, but this approach has many difficulties that have limited the widespread use of this exposure assessment method. With recent advancements in X-ray fluorescence (XRF) technology, we have developed a portable system that can quantify lead in bone in vivo within 3 minutes. Our study investigated improvements to the system, four calibration methods, and system validation for in vivo measurements. Our main results show that the detection limit of the system is 2.9 ppm with 2 mm soft tissue thickness, the best calibration method for in vivo measurement is background subtraction, and there is strong correlation between KXRF and portable LXRF bone lead results. Our results indicate that the technology is ready to be used in large human population studies to investigate adverse health effects of lead exposure. The portability of the system and fast measurement time should allow for this technology to greatly advance the research on lead exposure and public/environmental health.

  4. Quantifying Potential Groundwater Recharge In South Texas

    NASA Astrophysics Data System (ADS)

    Basant, S.; Zhou, Y.; Leite, P. A.; Wilcox, B. P.

    2015-12-01

    Groundwater in South Texas is heavily relied on for human consumption and for irrigation of food crops. As in most of the southwestern US, woody encroachment has altered the grassland ecosystems here too. While brush removal has been widely implemented in Texas with the objective of increasing groundwater recharge, the linkage between vegetation and groundwater recharge in South Texas is still unclear. Studies have been conducted to understand plant-root-water dynamics at the scale of individual plants, but little work has been done to quantify changes in soil water and deep percolation at the landscape scale. Modeling water flow through soil profiles can provide an estimate of the total water reaching deep percolation, and such models are especially powerful when parameterized and calibrated with long-term soil water data. In this study we parameterize the HYDRUS soil water model using long-term soil water data collected in Jim Wells County in South Texas. Soil water was measured at 20 cm intervals down to a depth of 200 cm. The parameterized model will be used to simulate soil water dynamics under a variety of precipitation regimes ranging from well above normal to severe drought conditions. The results from the model will be compared with the changes in soil moisture profiles observed in response to vegetation cover and treatments from a study in a similar setting. Comparative studies like this can be used to build new and strengthen existing hypotheses regarding deep percolation and the role of soil texture and vegetation in groundwater recharge.

  5. Quantifying ant activity using vibration measurements.

    PubMed

    Oberst, Sebastian; Baro, Enrique Nava; Lai, Joseph C S; Evans, Theodore A

    2014-01-01

    Ant behaviour is of great interest due to their sociality. Ant behaviour is typically observed visually, however there are many circumstances where visual observation is not possible. It may be possible to assess ant behaviour using vibration signals produced by their physical movement. We demonstrate through a series of bioassays with different stimuli that the level of activity of meat ants (Iridomyrmex purpureus) can be quantified using vibrations, corresponding to observations with video. We found that ants exposed to physical shaking produced the highest average vibration amplitudes followed by ants with stones to drag, then ants with neighbours, illuminated ants and ants in darkness. In addition, we devised a novel method based on wavelet decomposition to separate the vibration signal owing to the initial ant behaviour from the substrate response, which will allow signals recorded from different substrates to be compared directly. Our results indicate the potential to use vibration signals to classify some ant behaviours in situations where visual observation could be difficult.

  6. Quantifying Wrinkle Features of Thin Membrane Structures

    NASA Technical Reports Server (NTRS)

    Jacobson, Mindy B.; Iwasa, Takashi; Naton, M. C.

    2004-01-01

    For future micro-systems utilizing membrane based structures, quantified predictions of wrinkling behavior in terms of amplitude, angle and wavelength are needed to optimize the efficiency and integrity of such structures, as well as their associated control systems. For numerical analyses performed in the past, limitations on the accuracy of membrane distortion simulations have often been related to the assumptions made. This work demonstrates that critical assumptions include: effects of gravity, supposed initial or boundary conditions, and the type of element used to model the membrane. In this work, a 0.2 m x 0.2 m membrane is treated as a structural material with non-negligible bending stiffness. Finite element modeling is used to simulate wrinkling behavior due to a constant applied in-plane shear load. Membrane thickness, gravity effects, and initial imperfections with respect to flatness were varied in numerous nonlinear analysis cases. Significant findings include notable variations in wrinkle modes for thickness in the range of 50 microns to 1000 microns, which also depend on the presence of an applied gravity field. However, it is revealed that relationships between overall strain energy density and thickness for cases with differing initial conditions are independent of assumed initial conditions. In addition, analysis results indicate that the relationship between wrinkle amplitude scale (W/t) and structural scale (L/t) is independent of the nonlinear relationship between thickness and stiffness.

  7. Quantifying truncation errors in effective field theory

    NASA Astrophysics Data System (ADS)

    Furnstahl, R. J.; Klco, N.; Phillips, D. R.; Wesolowski, S.

    2015-08-01

    Bayesian procedures designed to quantify truncation errors in perturbative calculations of quantum chromodynamics observables are adapted to expansions in effective field theory (EFT). In the Bayesian approach, such truncation errors are derived from degree-of-belief (DOB) intervals for EFT predictions. Computation of these intervals requires specification of prior probability distributions ("priors") for the expansion coefficients. By encoding expectations about the naturalness of these coefficients, this framework provides a statistical interpretation of the standard EFT procedure where truncation errors are estimated using the order-by-order convergence of the expansion. It also permits exploration of the ways in which such error bars are, and are not, sensitive to assumptions about EFT-coefficient naturalness. We first demonstrate the calculation of Bayesian probability distributions for the EFT truncation error in some representative examples and then focus on the application of chiral EFT to neutron-proton scattering. Epelbaum, Krebs, and Meißner recently articulated explicit rules for estimating truncation errors in such EFT calculations of few-nucleon-system properties. We find that their basic procedure emerges generically from one class of naturalness priors considered and that all such priors result in consistent quantitative predictions for 68% DOB intervals. We then explore several methods by which the convergence properties of the EFT for a set of observables may be used to check the statistical consistency of the EFT expansion parameter.
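
    A minimal numerical rendering of the first-omitted-term idea discussed above, assuming a uniform naturalness prior c ~ U(-cbar, cbar) with cbar estimated from the observed coefficients; the coefficients and expansion parameter below are hypothetical, and this is a sketch rather than the authors' full Bayesian machinery.

      import numpy as np

      def dob_half_width(coeffs, Q, p=0.68):
          # coeffs: observed dimensionless coefficients c_0..c_k of the EFT expansion
          # Q:      expansion parameter (0 < Q < 1)
          # Under a uniform prior on c_{k+1}, |error| <= p * cbar * Q**(k+1)
          # holds with probability p (first omitted term assumed dominant).
          cbar = np.max(np.abs(coeffs))      # crude estimate of the natural size
          k = len(coeffs) - 1
          return p * cbar * Q ** (k + 1)

      print(dob_half_width([1.0, -0.6, 1.3], Q=0.33))  # 68% DOB error bar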

  8. Quantifying Access Disparities in Response Plans

    PubMed Central

    Indrakanti, Saratchandra; Mikler, Armin R.; O’Neill, Martin; Tiwari, Chetan

    2016-01-01

    Effective response planning and preparedness are critical to the health and well-being of communities in the face of biological emergencies. Response plans involving mass prophylaxis may seem feasible when considering the choice of points of dispensing (PODs) within a region, overall population density, and estimated traffic demands. However, the plan may fail to serve particular vulnerable subpopulations, resulting in access disparities during emergency response. For a response plan to be effective, sufficient mitigation resources must be made accessible to target populations within short, federally-mandated time frames. A major challenge in response plan design is to establish a balance between the allocation of available resources and the provision of equal access to PODs for all individuals in a given geographic region. Limitations on the availability, granularity, and currency of data to identify vulnerable populations further complicate the planning process. To address these challenges and limitations, data-driven methods to quantify vulnerabilities in the context of response plans have been developed and are explored in this article. PMID:26771551

  9. Quantifying the dynamics of financial correlations

    NASA Astrophysics Data System (ADS)

    Drożdż, S.; Kwapień, J.; Grümmer, F.; Ruf, F.; Speth, J.

    2001-10-01

    A novel application of the correlation matrix formalism to study the dynamics of financial evolution is presented. This formalism makes it possible to quantify memory effects as well as some potential repeatable intraday structures in financial time series. The present study is based on high-frequency Deutsche Aktienindex (DAX) data over the time period between November 1997 and December 1999 and demonstrates the power of the method. In this way, two significant new aspects of the DAX evolution are identified: (i) the memory effects turn out to be sizably shorter than what the standard autocorrelation function analysis seems to indicate and (ii) there exist short-term repeatable structures in fluctuations that are governed by a distinct dynamics. The former of these results may provide an argument in favour of market efficiency while the latter may indicate the origin of the difficulty in reaching a Gaussian limit, expected from the central limit theorem, in the distribution of returns on longer time horizons.
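
    The core object of this formalism, a correlation matrix of returns tracked through time, is straightforward to sketch; the snippet below follows the largest eigenvalue of a rolling correlation matrix on synthetic returns (a stand-in for the DAX data) as a simple measure of collective dynamics.

      import numpy as np

      rng = np.random.default_rng(0)
      T, N, window = 2000, 30, 100
      returns = rng.standard_normal((T, N))      # stand-in for intraday returns

      lambda_max = []
      for t in range(window, T):
          C = np.corrcoef(returns[t - window:t].T)    # N x N correlation matrix
          lambda_max.append(np.linalg.eigvalsh(C)[-1])

      # For uncorrelated returns the largest eigenvalue stays near the random
      # matrix expectation; collective market moves push it well above that.
      print(np.mean(lambda_max))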

  10. Quantifying the limits of fingerprint variability.

    PubMed

    Fagert, Michael; Morris, Keith

    2015-09-01

    The comparison and identification of fingerprints are made difficult by fingerprint variability arising from distortion. This study seeks to quantify both the limits of fingerprint variability when subject to heavy distortion, and the variability observed in repeated inked planar impressions. A total of 30 fingers were studied: 10 right slant loops, 10 plain whorls, and 10 plain arches. Fingers were video recorded performing several distortion movements under heavy deposition pressure: left, right, up, and down translation of the finger, clockwise and counter-clockwise torque of the finger, and planar impressions. Fingerprint templates, containing 'true' minutiae locations, were created for each finger using 10 repeated inked planar impressions. A minimal amount of variability, 0.18 mm globally, was observed for minutiae in repeated inked planar impressions. When subject to heavy distortion, minutiae can be displaced by upwards of 3 mm and their orientation altered by as much as 30° in relation to their template positions. Minutiae displacements of 1 mm and 10° changes in orientation are readily observed. The results of this study will allow fingerprint examiners to identify and understand the degree of variability that can be reasonably expected throughout the various regions of fingerprints.

  11. Quantifying capital goods for biological treatment of organic waste.

    PubMed

    Brogaard, Line K; Petersen, Per H; Nielsen, Peter D; Christensen, Thomas H

    2015-02-01

    Materials and energy used for construction of anaerobic digestion (AD) and windrow composting plants were quantified in detail. The two technologies were quantified in collaboration with consultants and producers of the parts used to construct the plants. The composting plants were quantified based on the different sizes for the three different types of waste (garden and park waste, food waste and sludge from wastewater treatment) in amounts of 10,000 or 50,000 tonnes per year. The AD plant was quantified for a capacity of 80,000 tonnes per year. Concrete and steel for the tanks were the main materials for the AD plant. For the composting plants, gravel and concrete slabs for the pavement were used in large amounts. To frame the quantification, environmental impact assessments (EIAs) showed that the steel used for tanks at the AD plant and the concrete slabs at the composting plants made the highest contribution to Global Warming. The total impact on Global Warming from the capital goods compared to the operation reported in the literature on the AD plant showed an insignificant contribution of 1-2%. For the composting plants, the capital goods accounted for 10-22% of the total impact on Global Warming from composting. PMID:25595291

  13. Oxygen-Enhanced MRI Accurately Identifies, Quantifies, and Maps Tumor Hypoxia in Preclinical Cancer Models.

    PubMed

    O'Connor, James P B; Boult, Jessica K R; Jamin, Yann; Babur, Muhammad; Finegan, Katherine G; Williams, Kaye J; Little, Ross A; Jackson, Alan; Parker, Geoff J M; Reynolds, Andrew R; Waterton, John C; Robinson, Simon P

    2016-02-15

    There is a clinical need for noninvasive biomarkers of tumor hypoxia for prognostic and predictive studies, radiotherapy planning, and therapy monitoring. Oxygen-enhanced MRI (OE-MRI) is an emerging imaging technique for quantifying the spatial distribution and extent of tumor oxygen delivery in vivo. In OE-MRI, the longitudinal relaxation rate of protons (ΔR1) changes in proportion to the concentration of molecular oxygen dissolved in plasma or interstitial tissue fluid. Therefore, well-oxygenated tissues show positive ΔR1. We hypothesized that the fraction of tumor tissue refractory to oxygen challenge (lack of positive ΔR1, termed "Oxy-R fraction") would be a robust biomarker of hypoxia in models with varying vascular and hypoxic features. Here, we demonstrate that OE-MRI signals are accurate, precise, and sensitive to changes in tumor pO2 in highly vascular 786-0 renal cancer xenografts. Furthermore, we show that Oxy-R fraction can quantify the hypoxic fraction in multiple models with differing hypoxic and vascular phenotypes, when used in combination with measurements of tumor perfusion. Finally, Oxy-R fraction can detect dynamic changes in hypoxia induced by the vasomodulator agent hydralazine. In contrast, more conventional biomarkers of hypoxia (derived from blood oxygenation-level dependent MRI and dynamic contrast-enhanced MRI) did not relate to tumor hypoxia consistently. Our results show that the Oxy-R fraction accurately quantifies tumor hypoxia noninvasively and is immediately translatable to the clinic.

  14. Quantifying the Risk of Blood Exposure in Optometric Clinical Education.

    ERIC Educational Resources Information Center

    Hoppe, Elizabeth

    1997-01-01

    A study attempted to quantify risk of blood exposure in optometric clinical education by surveying optometric interns in their fourth year at the Southern California College of Optometry concerning their history of exposure or use of a needle. Results indicate blood exposure or needle use ranged from 0.95 to 18.71 per 10,000 patient encounters.…

  15. Quantifying dynamical spillover in co-evolving multiplex networks

    NASA Astrophysics Data System (ADS)

    Vijayaraghavan, Vikram S.; Noël, Pierre-André; Maoz, Zeev; D'Souza, Raissa M.

    2015-10-01

    Multiplex networks (a system of multiple networks that have different types of links but share a common set of nodes) arise naturally in a wide spectrum of fields. Theoretical studies show that in such multiplex networks, correlated edge dynamics between the layers can have a profound effect on dynamical processes. However, how to extract the correlations from real-world systems is an outstanding challenge. Here we introduce the Multiplex Markov chain to quantify correlations in edge dynamics found in longitudinal data of multiplex networks. By comparing the results obtained from the multiplex perspective to a null model which assumes layers in a network are independent, we can identify real correlations as distinct from simultaneous changes that occur due to random chance. We use this approach on two different data sets: the network of trade and alliances between nation states, and the email and co-commit networks between developers of open source software. We establish the existence of “dynamical spillover” showing the correlated formation (or deletion) of edges of different types as the system evolves. The details of the dynamics over time provide insight into potential causal pathways.
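
    A toy version of the Multiplex Markov chain, assuming two layers observed over successive snapshots: estimate the joint transition matrix of an edge's two-layer state and compare it against the product of the single-layer chains, which serves as the independent-layers null. All data below are random stand-ins.

      import numpy as np

      rng = np.random.default_rng(1)
      snapshots, edges = 50, 200
      A = rng.integers(0, 2, (snapshots, edges))   # layer 1: edge present/absent
      B = rng.integers(0, 2, (snapshots, edges))   # layer 2

      state = 2 * A + B                            # joint edge state in {0,1,2,3}
      joint = np.zeros((4, 4))
      for t in range(snapshots - 1):
          for s0, s1 in zip(state[t], state[t + 1]):
              joint[s0, s1] += 1
      joint /= joint.sum(axis=1, keepdims=True)    # multiplex transition matrix

      def layer_chain(X):
          # Single-layer 2x2 transition matrix estimated the same way
          P = np.zeros((2, 2))
          for t in range(snapshots - 1):
              for x0, x1 in zip(X[t], X[t + 1]):
                  P[x0, x1] += 1
          return P / P.sum(axis=1, keepdims=True)

      null = np.kron(layer_chain(A), layer_chain(B))   # independent-layers null
      print(np.abs(joint - null).max())                # deviation hints at spillover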

  16. Quantifying evolutionary dynamics from variant-frequency time series.

    PubMed

    Khatri, Bhavin S

    2016-01-01

    From Kimura's neutral theory of protein evolution to Hubbell's neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time-series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher's angular transformation, which despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time-series. PMID:27616332
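
    The variance-stabilizing property of Fisher's angular transformation, which underlies the short-time solution described above, can be checked numerically in a few lines: for z = arcsin(sqrt(x)) under binomial sampling, Var(z) is approximately 1/(4n) regardless of the underlying frequency x. A quick sketch:

      import numpy as np

      rng = np.random.default_rng(2)
      n = 500                                   # samples per time point
      for x in (0.1, 0.3, 0.5, 0.9):
          freqs = rng.binomial(n, x, 100_000) / n
          z = np.arcsin(np.sqrt(freqs))
          # Var(z) should be close to 1/(4n) = 5e-4 for every x
          print(x, z.var(), 1 / (4 * n))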

  17. Quantifying uncertainty in the phylogenetics of Australian numeral systems.

    PubMed

    Zhou, Kevin; Bowern, Claire

    2015-09-22

    Researchers have long been interested in the evolution of culture and the ways in which change in cultural systems can be reconstructed and tracked. Within the realm of language, these questions are increasingly investigated with Bayesian phylogenetic methods. However, such work in cultural phylogenetics could be improved by more explicit quantification of reconstruction and transition probabilities. We apply such methods to numerals in the languages of Australia. As a large phylogeny with almost universal 'low-limit' systems, Australian languages are ideal for investigating numeral change over time. We reconstruct the most likely extent of the system at the root and use that information to explore the ways numerals evolve. We show that these systems do not increment serially, but most commonly vary their upper limits between 3 and 5. While there is evidence for rapid system elaboration beyond the lower limits, languages lose numerals as well as gain them. We investigate the ways larger numerals build on smaller bases, and show that there is a general tendency to both gain and replace 4 by combining 2 + 2 (rather than inventing a new unanalysable word 'four'). We develop a series of methods for quantifying and visualizing the results.

  18. Quantifying evolutionary dynamics from variant-frequency time series

    NASA Astrophysics Data System (ADS)

    Khatri, Bhavin S.

    2016-09-01

    From Kimura’s neutral theory of protein evolution to Hubbell’s neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time-series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher’s angular transformation, which despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time-series.

  19. Quantifying Uncertainties in Land-Surface Microwave Emissivity Retrievals

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko

    2013-01-01

    Uncertainties in the retrievals of microwave land-surface emissivities are quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including the Special Sensor Microwave Imager, the Tropical Rainfall Measuring Mission Microwave Imager, and the Advanced Microwave Scanning Radiometer for Earth Observing System, are studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land-surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally, these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1%-4% (3-12 K) over desert and 1%-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5%-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.

  20. Quantifying evolutionary dynamics from variant-frequency time series

    PubMed Central

    Khatri, Bhavin S.

    2016-01-01

    From Kimura’s neutral theory of protein evolution to Hubbell’s neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time-series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher’s angular transformation, which despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time-series. PMID:27616332

  1. Quantifying Uncertainties in Land Surface Microwave Emissivity Retrievals

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko

    2012-01-01

    Uncertainties in the retrievals of microwave land surface emissivities were quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including SSM/I, TMI and AMSR-E, were studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally, these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1-4% (3-12 K) over desert and 1-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.

  2. Quantifying uncertainty in the phylogenetics of Australian numeral systems

    PubMed Central

    Zhou, Kevin; Bowern, Claire

    2015-01-01

    Researchers have long been interested in the evolution of culture and the ways in which change in cultural systems can be reconstructed and tracked. Within the realm of language, these questions are increasingly investigated with Bayesian phylogenetic methods. However, such work in cultural phylogenetics could be improved by more explicit quantification of reconstruction and transition probabilities. We apply such methods to numerals in the languages of Australia. As a large phylogeny with almost universal ‘low-limit’ systems, Australian languages are ideal for investigating numeral change over time. We reconstruct the most likely extent of the system at the root and use that information to explore the ways numerals evolve. We show that these systems do not increment serially, but most commonly vary their upper limits between 3 and 5. While there is evidence for rapid system elaboration beyond the lower limits, languages lose numerals as well as gain them. We investigate the ways larger numerals build on smaller bases, and show that there is a general tendency to both gain and replace 4 by combining 2 + 2 (rather than inventing a new unanalysable word ‘four’). We develop a series of methods for quantifying and visualizing the results. PMID:26378214

  3. Quantifying dynamical spillover in co-evolving multiplex networks

    PubMed Central

    Vijayaraghavan, Vikram S.; Noël, Pierre-André; Maoz, Zeev; D’Souza, Raissa M.

    2015-01-01

    Multiplex networks (a system of multiple networks that have different types of links but share a common set of nodes) arise naturally in a wide spectrum of fields. Theoretical studies show that in such multiplex networks, correlated edge dynamics between the layers can have a profound effect on dynamical processes. However, how to extract the correlations from real-world systems is an outstanding challenge. Here we introduce the Multiplex Markov chain to quantify correlations in edge dynamics found in longitudinal data of multiplex networks. By comparing the results obtained from the multiplex perspective to a null model which assumes layers in a network are independent, we can identify real correlations as distinct from simultaneous changes that occur due to random chance. We use this approach on two different data sets: the network of trade and alliances between nation states, and the email and co-commit networks between developers of open source software. We establish the existence of “dynamical spillover” showing the correlated formation (or deletion) of edges of different types as the system evolves. The details of the dynamics over time provide insight into potential causal pathways. PMID:26459949

  4. Quantifying variability within water samples: the need for adequate subsampling.

    PubMed

    Donohue, Ian; Irvine, Kenneth

    2008-01-01

    Accurate and precise determination of the concentration of nutrients and other substances in waterbodies is an essential requirement for supporting effective management and legislation. Owing primarily to logistic and financial constraints, however, national and regional agencies responsible for monitoring surface waters tend to quantify chemical indicators of water quality using a single sample from each waterbody, thus largely ignoring spatial variability. We show here that total sample variability, which comprises both analytical variability and within-sample heterogeneity, of a number of important chemical indicators of water quality (chlorophyll a, total phosphorus, total nitrogen, soluble molybdate-reactive phosphorus and dissolved inorganic nitrogen) varies significantly both over time and among determinands, and can be extremely high. Within-sample heterogeneity, whose mean contribution to total sample variability ranged between 62% and 100%, was significantly higher in samples taken from rivers compared with those from lakes, and was shown to be reduced by filtration. Our results show clearly that neither a single sample, nor even two sub-samples from that sample, is adequate for the reliable, and statistically robust, detection of changes in the quality of surface waters. We recommend strongly that, in situations where it is practicable to take only a single sample from a waterbody, a minimum of three sub-samples be analysed from that sample for robust quantification of both the concentrations of determinands and total sample variability. PMID:17706740
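
    A toy simulation of the paper's central point, assuming illustrative values for within-sample heterogeneity and analytical error: averaging k sub-samples shrinks the total sampling error roughly as 1/sqrt(k), which is why three sub-samples are markedly more reliable than one.

      import numpy as np

      rng = np.random.default_rng(3)
      true_conc, sigma_w, sigma_a = 50.0, 8.0, 2.0   # concentration, heterogeneity,
                                                     # analytical error (made-up units)
      def estimate(k, reps=100_000):
          # Each sub-sample carries heterogeneity noise plus analytical noise
          sub = true_conc + sigma_w * rng.standard_normal((reps, k))
          obs = sub + sigma_a * rng.standard_normal((reps, k))
          return obs.mean(axis=1)                    # mean of k sub-samples

      for k in (1, 2, 3):
          print(k, estimate(k).std())                # error falls roughly as 1/sqrt(k)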

  5. Satellite Movie Shows Erika Dissipate

    NASA Video Gallery

    This animation of visible and infrared imagery from NOAA's GOES-West satellite from Aug. 27 to 29 shows Tropical Storm Erika move through the Eastern Caribbean Sea and dissipate near eastern Cuba. ...

  6. Quantifying Sentiment and Influence in Blogspaces

    SciTech Connect

    Hui, Peter SY; Gregory, Michelle L.

    2010-07-25

    The weblog, or blog, has become a popular form of social media, through which authors can write posts, which can in turn generate feedback in the form of user comments. When considered in totality, a collection of blogs can thus be viewed as a sort of informal collection of mass sentiment and opinion. An obvious topic of interest might be to mine this collection to obtain some gauge of public sentiment over the wide variety of topics contained therein. However, the sheer size of the so-called blogosphere, combined with the fact that the subjects of posts can vary over a practically limitless number of topics, poses some serious challenges when any meaningful analysis is attempted. Namely, the fact that virtually anyone with access to the Internet can author a blog raises the serious issue of credibility: should some blogs be considered to be more influential than others, and consequently, when gauging sentiment with respect to a topic, should some blogs be weighted more heavily than others? In addition, as new posts and comments can be made on almost a constant basis, any blog analysis algorithm must be able to handle such updates efficiently. In this paper, we give a formalization of the blog model. We give formal methods of quantifying sentiment and influence with respect to a hierarchy of topics, with the specific aim of facilitating the computation of a per-topic, influence-weighted sentiment measure. Finally, as efficiency is a specific end goal, we give upper bounds on the time required to update these values with new posts, showing that our analysis and algorithms are scalable.
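
    One way to realize a per-topic, influence-weighted sentiment measure of the kind described is a pair of running sums per topic, which also makes updates on new posts cheap; the topics, scores and weights below are invented placeholders, not the paper's formalization.

      from collections import defaultdict

      # (topic, sentiment in [-1, 1], influence weight) for each post
      posts = [
          ("politics", -0.4, 3.0),
          ("politics",  0.2, 1.0),
          ("sports",    0.8, 2.0),
      ]

      num, den = defaultdict(float), defaultdict(float)
      for topic, sentiment, influence in posts:
          num[topic] += influence * sentiment    # O(1) update per new post
          den[topic] += influence

      for topic in num:
          print(topic, num[topic] / den[topic])  # influence-weighted sentiment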

  7. Quantifying consistent individual differences in habitat selection.

    PubMed

    Leclerc, Martin; Vander Wal, Eric; Zedrosser, Andreas; Swenson, Jon E; Kindberg, Jonas; Pelletier, Fanie

    2016-03-01

    Habitat selection is a fundamental behaviour that links individuals to the resources required for survival and reproduction. Although natural selection acts on an individual's phenotype, research on habitat selection often pools inter-individual patterns to provide inferences on the population scale. Here, we expanded a traditional approach of quantifying habitat selection at the individual level to explore the potential for consistent individual differences of habitat selection. We used random coefficients in resource selection functions (RSFs) and repeatability estimates to test for variability in habitat selection. We applied our method to a detailed dataset of GPS relocations of brown bears (Ursus arctos) taken over a period of 6 years, and assessed whether they displayed repeatable individual differences in habitat selection toward two habitat types: bogs and recent timber-harvest cut blocks. In our analyses, we controlled for the availability of habitat, i.e. the functional response in habitat selection. Repeatability estimates of habitat selection toward bogs and cut blocks were 0.304 and 0.420, respectively. Therefore, 30.4% and 42.0% of the population-scale habitat selection variability for bogs and cut blocks, respectively, was due to differences among individuals, suggesting that consistent individual variation in habitat selection exists in brown bears. Using simulations, we posit that repeatability values of habitat selection are not related to the value and significance of β estimates in RSFs. Although individual differences in habitat selection could be the result of several non-exclusive factors, our results illustrate the evolutionary potential of habitat selection.
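
    The repeatability estimates quoted above are intraclass correlations; a minimal one-way ANOVA version on synthetic individual-level selection coefficients (a stand-in for the bears' RSF coefficients) is sketched below.

      import numpy as np

      rng = np.random.default_rng(4)
      k, n = 20, 6                                # individuals, estimates each
      ind = rng.normal(0.0, 0.7, k)               # among-individual differences
      beta = ind[:, None] + rng.normal(0.0, 1.0, (k, n))   # selection coefficients

      grand = beta.mean()
      msa = n * ((beta.mean(axis=1) - grand) ** 2).sum() / (k - 1)
      msw = ((beta - beta.mean(axis=1, keepdims=True)) ** 2).sum() / (k * (n - 1))
      r = (msa - msw) / (msa + (n - 1) * msw)     # intraclass correlation
      print(r)   # fraction of variance due to among-individual differences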

  8. Quantifying the curvilinear metabolic scaling in mammals.

    PubMed

    Packard, Gary C

    2015-10-01

    A perplexing problem confronting students of metabolic allometry concerns the convex curvature that seemingly occurs in log-log plots of basal metabolic rate (BMR) vs. body mass in mammals. This putative curvilinearity has typically been interpreted in the context of a simple power function, Y = a*X^b, on the arithmetic scale, with the allometric exponent, b, supposedly increasing steadily as a dependent function of body size. The relationship can be quantified in arithmetic domain by exponentiating a quadratic equation fitted to logarithmic transformations of the original data, but the resulting model is not in the form of a power function and it is unlikely to describe accurately the pattern in the original distribution. I therefore re-examined a dataset for 636 species of mammal and discovered that the relationship between BMR and body mass is well-described by a power function with an explicit, non-zero intercept and lognormal, heteroscedastic error. The model has an invariant allometric exponent of 0.75, so the appearance in prior investigations of a steadily increasing exponent probably was an aberration resulting from undue reliance on logarithmic transformations to estimate statistical models in arithmetic domain. Theoretical constructs relating BMR to body mass in mammals may need to be modified to accommodate a positive intercept in the statistical model, but they do not need to be revised, or rejected, at the present time on grounds that the allometric exponent varies with body size. New data from planned experiments will be needed to confirm any hypothesis based on data currently available.
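
    A sketch of fitting the model advocated above, BMR = c + a * mass**0.75, directly in arithmetic domain on synthetic data with lognormal error; scipy's curve_fit is used here as a stand-in for the author's estimation procedure, and all numbers are illustrative.

      import numpy as np
      from scipy.optimize import curve_fit

      def power_law(mass, a, c):
          return c + a * mass ** 0.75            # invariant allometric exponent

      rng = np.random.default_rng(5)
      mass = rng.uniform(0.01, 1000.0, 200)      # body mass (made-up units)
      # Multiplicative lognormal noise gives the heteroscedastic error structure
      bmr = (0.02 + 0.07 * mass ** 0.75) * rng.lognormal(0.0, 0.1, mass.size)

      params, _ = curve_fit(power_law, mass, bmr, p0=(0.05, 0.0))
      print(params)                              # recovered (a, c)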

  10. Towards quantifying cochlear implant localization performance in complex acoustic environments.

    PubMed

    Kerber, S; Seeber, B U

    2011-08-01

    Cochlear implant (CI) users frequently report listening difficulties in reverberant and noisy spaces. While it is common to assess speech understanding with implants in background noise, binaural hearing performance has rarely been quantified in the presence of other sources, although the binaural system is a major contributor to the robustness of speech understanding in noisy situations with normal hearing. Here, a pointing task was used to measure horizontal localization ability of a bilateral CI user in quiet and in a continuous diffuse noise interferer at a signal-to-noise ratio of 0 dB. Results were compared to localization performance of six normal hearing listeners. The average localization error of the normal hearing listeners was within normal ranges reported previously and only increased by 1.8° when the interfering noise was introduced. In contrast, the bilateral CI user showed a localization error of 22° in quiet which rose to 31° in noise. This increase was partly due to target sounds being inaudible when presented from frontal locations between -20° and +20°. With the noise present, the implant user was only able to reliably hear target sounds presented from locations well off the median plane. The results give support to the informal complaints raised by CI users and can help to define targets for the design of, e.g., noise reduction algorithms for implant processors.

  11. Quantifying Numerical Dissipation due to Filtering in Implicit LES

    NASA Astrophysics Data System (ADS)

    Cadieux, Francois; Domaradzki, Julian Andrzej

    2015-11-01

    Numerical dissipation plays an important role in LES and has given rise to the widespread use of implicit LES in the academic community. Recent results demonstrate that even with higher order codes, the use of stabilizing filters can act as a source of numerical dissipation strong enough to compare to an explicit subgrid-scale model (Cadieux et al., JFE 136-6). The amount of numerical dissipation added by such filtering operation in the simulation of a laminar separation bubble is quantified using a new method developed by Schranner et al., Computers & Fluids 114. It is then compared to a case where the filter is turned off, as well as the subgrid-scale dissipation that would be added by the σ model. The sensitivity of the method to the choice of subdomain location and size is explored. The effect of different derivative approximations and integration methods is also scrutinized. The method is shown to be robust and accurate for large subdomains. Results show that without filtering, numerical dissipation in the high order code is negligible, and that the filtering operation at the resolution considered adds substantial numerical dissipation in the same regions and at a similar rate as the σ subgrid-scale model would. NSF grant CBET-1233160.

  12. Quantifying safety benefit of winter road maintenance: accident frequency modeling.

    PubMed

    Usman, Taimur; Fu, Liping; Miranda-Moreno, Luis F

    2010-11-01

    This research presents a modeling approach to investigate the association of accident frequency during a snow storm event with road surface conditions, visibility and other influencing factors, controlling for traffic exposure. The results hold promise for evaluating different maintenance strategies using safety as a performance measure. As part of this approach, this research introduces a road surface condition index as a surrogate for the commonly used friction measure to capture different road surface conditions. Data from various sources, such as weather, road condition observations, traffic counts and accidents, are integrated and used to test three event-based models: the Negative Binomial model, the generalized NB model and the zero-inflated NB model. These models are compared for their capability to explain differences in accident frequencies between individual snow storms. It was found that the generalized NB model best fits the data, and is most capable of capturing heterogeneity other than excess zeros. Among the main results, it was found that the road surface condition index was a statistically significant factor influencing accident occurrence. This research is the first to show the empirical relationship between safety and road surface conditions at a disaggregate level (event-based), making it feasible to quantify the safety benefits of alternative maintenance goals and methods.
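
    A hedged sketch of an event-based negative binomial accident-frequency model with traffic exposure entering as an offset; the column names, coefficients and data are hypothetical, and statsmodels' NB GLM stands in for the paper's generalized NB specification.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(6)
      n = 300
      df = pd.DataFrame({
          "rsc_index": rng.uniform(0.0, 1.0, n),    # road surface condition index
          "visibility": rng.uniform(0.1, 10.0, n),  # km
          "exposure": rng.uniform(1e3, 1e5, n),     # vehicle-km during the storm
      })
      mu = df.exposure * 1e-4 * np.exp(-2.0 * df.rsc_index - 0.05 * df.visibility)
      df["accidents"] = rng.poisson(mu)             # stand-in accident counts

      X = sm.add_constant(df[["rsc_index", "visibility"]])
      fit = sm.GLM(df["accidents"], X,
                   family=sm.families.NegativeBinomial(),
                   offset=np.log(df["exposure"])).fit()
      print(fit.params)   # negative coefficient on rsc_index = safer surfaces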

  13. Quantifying the provenance of aeolian sediments using multiple composite fingerprints

    NASA Astrophysics Data System (ADS)

    Liu, Benli; Niu, Qinghe; Qu, Jianjun; Zu, Ruiping

    2016-09-01

    We introduce a new fingerprinting method that uses multiple composite fingerprints for studies of aeolian sediment provenance. We used this method to quantify the provenance of sediments on both sides of the Qinghai-Tibetan Railway (QTR) in the Cuona Lake section of the Tibetan Plateau (TP), in an environment characterized by aeolian and fluvial interactions. The method involves repeatedly solving a linear mixing model based on mass conservation; the model is not limited to spatial scale or transport types and uses all the tracer groups that passed the range check, Kruskal-Wallis H-test, and a strict analytical solution screening. The proportional estimates that result from using different composite fingerprints are highly variable; however, the average of these fingerprints has a greater accuracy and certainty than any single fingerprint. The results show that sand from the lake beach, hilly surface, and gullies contribute, respectively, 48%, 31% and 21% to the western railway sediments and 43%, 33% and 24% to the eastern railway sediments. The difference between contributions from various sources on either side of the railway, which may increase in the future, was clearly related to variations in local transport characteristics, a conclusion that is supported by grain size analysis. The construction of the QTR changed the local cycling of materials, and the difference in provenance between the sediments that are separated by the railway reflects the changed sedimentary conditions on either side of the railway. The effectiveness of this method suggests that it will be useful in other studies of aeolian sediments.
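
    The mass-conservation mixing model at the heart of this method reduces to constrained least squares; below is a minimal sketch using non-negative least squares with a heavily weighted extra row enforcing that the source proportions sum to one. Tracer values are invented for illustration.

      import numpy as np
      from scipy.optimize import nnls

      # Rows are tracers, columns are sources (beach, hillslope, gully)
      sources = np.array([
          [12.0,  8.0,  5.0],
          [ 1.2,  3.4,  2.1],
          [40.0, 22.0, 30.0],
      ])
      mixture = np.array([9.5, 2.0, 33.0])       # tracer signature of the sediment

      w = 1e3                                    # weight on the sum-to-one constraint
      A = np.vstack([sources, w * np.ones(3)])
      b = np.append(mixture, w)
      proportions, _ = nnls(A, b)
      print(proportions)                         # non-negative, sums to ~1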

  14. Interpolating Quantifier-Free Presburger Arithmetic

    NASA Astrophysics Data System (ADS)

    Kroening, Daniel; Leroux, Jérôme; Rümmer, Philipp

    Craig interpolation has become a key ingredient in many symbolic model checkers, serving as an approximative replacement for expensive quantifier elimination. In this paper, we focus on an interpolating decision procedure for the full quantifier-free fragment of Presburger Arithmetic, i.e., linear arithmetic over the integers, a theory which is a good fit for the analysis of software systems. In contrast to earlier procedures based on quantifier elimination and the Omega test, our approach uses integer linear programming techniques: relaxation of interpolation problems to the rationals, and a complete branch-and-bound rule tailored to efficient interpolation. Equations are handled via a dedicated polynomial-time sub-procedure. We have fully implemented our procedure on top of the SMT-solver OpenSMT and present an extensive experimental evaluation.

  15. Arches showing UV flaring activity

    NASA Technical Reports Server (NTRS)

    Fontenla, J. M.

    1988-01-01

    The UVSP data obtained in the previous maximum activity cycle show the frequent appearance of flaring events in the UV. In many cases these flaring events are characterized by at least two footpoints which show compact, impulsive, non-simultaneous brightenings, and a fainter but clearly observed arch develops between the footpoints. These arches and footpoints are observed in lines corresponding to different temperatures, such as Lyman alpha, N V, and C IV, and when observed above the limb they display large Doppler shifts at some stages. The size of the arches can be larger than 20 arcsec.

  16. Uncertainty of natural tracer methods for quantifying river-aquifer interaction in a large river

    NASA Astrophysics Data System (ADS)

    Xie, Yueqing; Cook, Peter G.; Shanafield, Margaret; Simmons, Craig T.; Zheng, Chunmiao

    2016-04-01

    The quantification of river-aquifer interaction is critical to the conjunctive management of surface water and groundwater, in particular in arid and semiarid environments where potential evapotranspiration far exceeds precipitation. A variety of natural tracer methods are available to quantify river-aquifer interaction at different scales. These methods, however, have only been tested in rivers with relatively low flow rates (mostly less than 5 m3 s-1). In this study, several natural tracers including heat, radon-222 and electrical conductivity were measured both on vertical riverbed profiles and on longitudinal river samples to quantify river-aquifer exchange flux at both point and regional scales in the Heihe River (northwest China; flow rate 63 m3 s-1). Results show that the radon-222 profile method can estimate a narrower range of point-scale flux than the temperature profile method. In particular, three vertical radon-222 profiles failed to estimate the upper bounds of plausible flux ranges. Results also show that when quantifying regional-scale river-aquifer exchange flux, the river chemistry method constrained the flux (5.20-10.39 m2 d-1) better than the river temperature method (-100 to 100 m2 d-1). The river chemistry method also identified spatial variability of flux, whereas the river temperature method did not have sufficient resolution. Overall, for quantifying river-aquifer exchange flux in a large river, the temperature profile and radon-222 profile methods provide useful complementary information at the point scale, whereas the river chemistry method is recommended over the river temperature method at the regional scale.

  17. Create a Polarized Light Show.

    ERIC Educational Resources Information Center

    Conrad, William H.

    1992-01-01

    Presents a lesson that introduces students to polarized light using a problem-solving approach. After illustrating the concept using a slinky and poster board with a vertical slot, students solve the problem of creating a polarized light show using Polya's problem-solving methods. (MDH)

  18. Pembrolizumab Shows Promise for NSCLC.

    PubMed

    2015-06-01

    Data from the KEYNOTE-001 trial show that pembrolizumab improves clinical outcomes for patients with advanced non-small cell lung cancer, and is well tolerated. PD-L1 expression in at least 50% of tumor cells correlated with improved efficacy.

  19. The interpretation of classically quantified sentences: a set-theoretic approach.

    PubMed

    Politzer, Guy; Henst, Jean-Baptiste; Delle Luche, Claire; Noveck, Ira A

    2006-07-01

    We present a set-theoretic model of the mental representation of classically quantified sentences (All P are Q, Some P are Q, Some P are not Q, and No P are Q). We take inclusion, exclusion, and their negations to be primitive concepts. We show that although these sentences are known to have a diagrammatic expression (in the form of the Gergonne circles) that constitutes a semantic representation, these concepts can also be expressed syntactically in the form of algebraic formulas. We hypothesized that the quantified sentences have an abstract underlying representation common to the formulas and their associated sets of diagrams (models). We derived 9 predictions (3 semantic, 2 pragmatic, and 4 mixed) regarding people's assessment of how well each of the 5 diagrams expresses the meaning of each of the quantified sentences. We report the results from 3 experiments using Gergonne's (1817) circles or an adaptation of Leibniz (1903/1988) lines as external representations and show them to support the predictions.
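
    The inclusion/exclusion primitives translate directly into executable set operations; a tiny illustration (note that the natural-language "Some P are Q" also carries the pragmatic presupposition that P is nonempty):

      P = {"spaniel", "poodle"}
      Q = {"spaniel", "poodle", "siamese"}

      all_p_are_q      = P <= Q                  # inclusion
      no_p_are_q       = P.isdisjoint(Q)         # exclusion
      some_p_are_q     = not P.isdisjoint(Q)     # negation of exclusion
      some_p_are_not_q = not (P <= Q)            # negation of inclusion

      print(all_p_are_q, some_p_are_q, some_p_are_not_q, no_p_are_q)
      # -> True True False False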

  20. Quantifying the reheating temperature of the universe

    NASA Astrophysics Data System (ADS)

    Mazumdar, Anupam; Zaldívar, Bryan

    2014-09-01

    The aim of this paper is to determine an exact definition of the reheat temperature for a generic perturbative decay of the inflaton. In order to estimate the reheat temperature, there are two important conditions one needs to satisfy: (a) the decay products of the inflaton must dominate the energy density of the universe, i.e. the universe becomes completely radiation dominated, and (b) the decay products of the inflaton have attained local thermodynamical equilibrium. For some choices of parameters, the latter is a more stringent condition, such that the decay products may thermalise much after the beginning of radiation-domination. Consequently, we find that the reheat temperature can be much lower than the standard-lore estimation. In this paper we describe under what conditions our universe could have efficient or inefficient thermalisation, and quantify the reheat temperature for both scenarios. This result has an immediate impact on many applications which rely on the thermal history of the universe, in particular gravitino abundance. Three scenarios are distinguished:
    Instant thermalisation: the inflaton decay products thermalise immediately upon decay.
    Efficient thermalisation: the inflaton decay products thermalise right at the instant when the radiation epoch starts dominating the universe.
    Delayed thermalisation: the inflaton decay products thermalise deep inside the radiation-dominated epoch, after the transition from inflaton to radiation domination has occurred.
    This paper is organised as follows. In Section 2 we set the stage and write down the relevant equations for our analysis. The standard lore about the reheating epoch is briefly reviewed in Section 3. Section 4 presents our analysis, in which we study the conditions under which the plasma attains thermalisation. In Section 5 we discuss the concept of reheat temperature so as to properly capture the issues of thermalisation. Finally, we conclude in Section 6.
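
    For reference, the standard-lore estimate that the paper refines follows from equating the inflaton decay rate with the Hubble rate at the onset of radiation domination; with rho_rad = (pi^2/30) g_* T^4 and H^2 = rho / (3 M_P^2),

      \Gamma_\phi \simeq H(T_{\rm rh}), \qquad
      \frac{\pi^2}{30}\, g_* T_{\rm rh}^4 = 3\, \Gamma_\phi^2 M_{\rm P}^2
      \;\Longrightarrow\;
      T_{\rm rh} \simeq \left(\frac{90}{\pi^2 g_*}\right)^{1/4} \sqrt{\Gamma_\phi M_{\rm P}},

    where g_* counts the relativistic degrees of freedom and M_P is the reduced Planck mass. This is the instant-thermalisation benchmark; the paper's point is that delayed thermalisation can push the true reheat temperature below it.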

  1. Quantified Histopathology of the Keratoconic Cornea

    PubMed Central

    Mathew, Jessica H.; Goosey, John D.; Bergmanson, Jan P. G.

    2011-01-01

    Purpose: The present study systematically investigated and quantified histopathological changes in a series of keratoconic (Kc) corneas utilizing a physiologically formulated fixative to not further distort the already distorted diseased corneas.
    Methods: Twelve surgically removed Kc corneal buttons were immediately preserved and processed for light and transmission electron microscopy using an established corneal protocol. Measurements were taken from the central cone and peripheral regions of the host button. The sample size examined ranged in length from 390–2608 um centrally and 439–2242 um peripherally.
    Results: The average corneal thickness was 437 um centrally and 559 um peripherally. Epithelial thickness varied centrally from 14–92 um and peripherally from 30–91 um. A marked thickening of the epithelial basement membrane was noted in 58% of corneas. Centrally, the anterior limiting lamina (ALL) was thinned or lost over 60% of the area examined, while the peripheral cornea was also affected, but to a lesser extent. Histopathologically, the posterior cornea remained undisturbed by the disease. Anteriorly in the stroma, an increased number of cells and tissue debris were encountered, and some of these cells were clearly not keratocytes.
    Conclusions: It is concluded that Kc pathology, at least initially, has a distinct anterior focus involving the epithelium, ALL and anterior stroma. The epithelium had lost its cellular uniformity and was compromised by the loss of or damage to the ALL. The activity of the hitherto unreported recruited stromal cells may be to break down and remove the ALL and anterior stromal lamellae, leading to the overall thinning that accompanies this disease. PMID:21623252

  2. COMPLEXITY & APPROXIMABILITY OF QUANTIFIED & STOCHASTIC CONSTRAINT SATISFACTION PROBLEMS

    SciTech Connect

    H. B. HUNT; M. V. MARATHE; R. E. STEARNS

    2001-06-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S and T be arbitrary finite sets of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SATc(S), respectively). Here, we study simultaneously the complexity of decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. We present simple yet general techniques to characterize simultaneously the complexity or efficient approximability of a number of versions/variants of the problems SAT(S), Q-SAT(S), S-SAT(S), MAX-Q-SAT(S), etc., for many different such D, C, S, T. These versions/variants include decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. Our unified approach is based on the following two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Some of the results extend earlier results in [Pa85, LMP99, CF+93, CF+94]. Our techniques and results reported here also provide significant steps towards obtaining dichotomy theorems for a number of the problems above, including the problems MAX-Q-SAT(S) and MAX-S-SAT(S). The discovery of such dichotomy theorems, for unquantified formulas, has received significant recent attention in the literature [CF+93, CF+94, Cr95, KSW97]. Keywords: NP-hardness; Approximation Algorithms; PSPACE-hardness; Quantified and Stochastic Constraint Satisfaction Problems.

  3. Quantifying forest mortality with the remote sensing of snow

    NASA Astrophysics Data System (ADS)

    Baker, Emily Hewitt

    Greenhouse gas emissions have altered global climate significantly, increasing the frequency of drought, fire, and pest-related mortality in forests across the western United States, with increasing area affected each year. Associated changes in forests are of great concern for the public, land managers, and the broader scientific community. These increased stresses have resulted in a widespread, spatially heterogeneous decline of forest canopies, which in turn exerts strong controls on the accumulation and melt of the snowpack, and changes forest-atmosphere exchanges of carbon, water, and energy. Most satellite-based retrievals of summer-season forest data are insufficient to quantify canopy, as opposed to the combination of canopy and undergrowth, since the signals of the two types of vegetation greenness have proven persistently difficult to distinguish. To overcome this issue, this research develops a method to quantify forest canopy cover using winter-season fractional snow covered area (FSCA) data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) snow covered area and grain size (MODSCAG) algorithm. In areas where the ground surface and undergrowth are completely snow-covered, a pixel comprises only forest canopy and snow. Following a snowfall event, FSCA initially rises, as snow is intercepted in the canopy, and then falls, as snow unloads. A select set of local minima in a winter FSCA time series form a threshold where the canopy is snow-free but the forest understory is snow-covered. This serves as a spatially explicit measurement of forest canopy, and viewable gap fraction (VGF), on a yearly basis. Using this method, we determine that MODIS-observed VGF is significantly correlated with an independent product of yearly crown mortality derived from spectral analysis of Landsat imagery at 25 high-mortality sites in northern Colorado (r = 0.96 +/- 0.03, p = 0.03). Additionally, we determine the lag timing between green-stage tree mortality and

  4. Magic Carpet Shows Its Colors

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The upper left image in this display is from the panoramic camera on the Mars Exploration Rover Spirit, showing the 'Magic Carpet' region near the rover at Gusev Crater, Mars, on Sol 7, the seventh martian day of its journey (Jan. 10, 2004). The lower image, also from the panoramic camera, is a monochrome (single filter) image of a rock in the 'Magic Carpet' area. Note that colored portions of the rock correlate with extracted spectra shown in the plot to the side. Four different types of materials are shown: the rock itself, the soil in front of the rock, some brighter soil on top of the rock, and some dust that has collected in small recesses on the rock face ('spots'). Each color on the spectra matches a line on the graph, showing how the panoramic camera's different colored filters are used to broadly assess the varying mineral compositions of martian rocks and soils.

  5. Quantifying and Communicating Uncertainty in Preclinical Human Dose-Prediction

    PubMed Central

    Sundqvist, M; Lundahl, A; Någård, MB; Bredberg, U; Gennemark, P

    2015-01-01

    Human dose-prediction is fundamental for ranking lead-optimization compounds in drug discovery and for informing the design of early clinical trials. This tutorial describes how uncertainty in such predictions can be quantified and efficiently communicated to facilitate decision-making. Using three drug-discovery case studies, we show how several uncertain pieces of input information can be integrated into one single uncomplicated plot with key predictions, including their uncertainties, for many compounds or for many scenarios, or both. PMID:26225248

  6. Classifying and quantifying basins of attraction

    SciTech Connect

    Sprott, J. C.; Xiong, Anda

    2015-08-15

    A scheme is proposed to classify the basins for attractors of dynamical systems in arbitrary dimensions. There are four basic classes depending on their size and extent, and each class can be further quantified to facilitate comparisons. The calculation uses a Monte Carlo method and is applied to numerous common dissipative chaotic maps and flows in various dimensions.
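
    As a rough illustration of the Monte Carlo sampling idea (not the paper's four-class scheme), the following sketch estimates what fraction of a sampling square lies in the basin of the Hénon attractor; the map parameters are the standard ones, everything else is an illustrative assumption:

        import random

        def henon_bounded(x, y, a=1.4, b=0.3, steps=200, escape=1e6):
            """True if the orbit stays bounded, taken here as a proxy
            for membership in the attractor's basin."""
            for _ in range(steps):
                x, y = 1.0 - a * x * x + y, b * x
                if abs(x) > escape:
                    return False
            return True

        random.seed(0)
        n, hits = 20_000, 0
        for _ in range(n):
            x0 = random.uniform(-2.0, 2.0)        # sampling square of side 4
            y0 = random.uniform(-2.0, 2.0)
            hits += henon_bounded(x0, y0)
        print(f"basin fraction of the square: {hits / n:.3f}")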

  7. Quantifying the Thermal Fatigue of CPV Modules

    SciTech Connect

    Bosco, N.; Kurtz, S.

    2011-02-01

    A method is presented to quantify thermal fatigue in the CPV die-attach from meteorological data. A comparative study between cities demonstrates a significant difference in the accumulated damage. These differences are most sensitive to the number of larger (ΔT) thermal cycles experienced at a location. High-frequency data (<1/min) may be required to most accurately employ this method.
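
    The flavor of such a damage calculation can be sketched as follows (a hedged illustration: the power-law exponent, the day-by-day cycle counting, and the numbers are assumptions, not the paper's calibrated model):

        import numpy as np

        def accumulated_damage(daily_max, daily_min, exponent=2.0):
            """Sum (dT)^m over daily cycles; larger swings dominate."""
            dT = np.asarray(daily_max) - np.asarray(daily_min)
            return float(np.sum(dT ** exponent))

        # Two hypothetical cities with the same mean swing (25 K) but
        # different spread; the wider distribution accumulates more damage
        city_a = accumulated_damage([40, 40, 40], [15, 15, 15])
        city_b = accumulated_damage([50, 40, 30], [15, 15, 15])
        print(city_a, city_b)                     # 1875.0 vs 2075.0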

  8. Quantifying Semantic Linguistic Maturity in Children

    ERIC Educational Resources Information Center

    Hansson, Kristina; Bååth, Rasmus; Löhndorf, Simone; Sahlén, Birgitta; Sikström, Sverker

    2016-01-01

    We propose a method to quantify "semantic linguistic maturity" (SELMA) based on a high dimensional semantic representation of words created from the co-occurrence of words in a large text corpus. The method was applied to oral narratives from 108 children aged 4;0-12;10. By comparing the SELMA measure with maturity ratings made by human…

  9. Quantifying the Reuse of Learning Objects

    ERIC Educational Resources Information Center

    Elliott, Kristine; Sweeney, Kevin

    2008-01-01

    This paper reports the findings of one case study from a larger project, which aims to quantify the claimed efficiencies of reusing learning objects to develop e-learning resources. The case study describes how an online inquiry project "Diabetes: A waste of energy" was developed by searching for, evaluating, modifying and then integrating as many…

  10. Subtleties of Hidden Quantifiers in Implication

    ERIC Educational Resources Information Center

    Shipman, Barbara A.

    2016-01-01

    Mathematical conjectures and theorems are most often of the form P(x) ⇒ Q(x), meaning ∀x, P(x) ⇒ Q(x). The hidden quantifier ∀x is crucial in understanding the implication as a statement with a truth value. Here P(x) and Q(x) alone are only predicates, without truth values, since they contain unquantified variables. But standard textbook…

  11. Quantifying biased response of axon to chemical gradient steepness in a microfluidic device.

    PubMed

    Xiao, Rong-Rong; Wang, Lei; Zhang, Lin; Liu, Yu-Ning; Yu, Xiao-Lei; Huang, Wei-Hua

    2014-12-01

    Axons are very sensitive to molecular gradients and can discriminate extremely small differences in gradient steepness. Microfluidic devices capable of generating chemical gradients and adjusting their steepness could be used to quantify the sensitivity of axonal response. Here, we present a versatile and robust microfluidic device that can generate substrate-bound molecular gradients with evenly varying steepness on a single chip to precisely quantify axonal response. In this device, two solutions are perfused into a central channel via two inlets while partially flowing into two peripheral channels through interconnecting grooves, which gradually decrease the fluid velocity along the central channel. Molecular gradients with evenly and gradually decreased steepness can therefore be generated with a high resolution that is less than 0.05%/mm. In addition, the overall distribution range and resolution of the gradient steepness can be highly and flexibly controlled by adjusting various parameters of the device. Using this device, we quantified the hippocampal axonal response to substrate-bound laminin and ephrin-A5 gradients with varying steepnesses. Our results provided more detailed information on how and to what extent different steepnesses guide hippocampal neuron development during the initial outgrowth. Furthermore, our results show that axons can sensitively respond to very shallow laminin and ephrin-A5 gradients, which could effectively initiate biased differentiation of hippocampal neurons in the steepness range investigated in this study. PMID:25381866

  12. Interpreting Quantifier Scope Ambiguity: Evidence of Heuristic First, Algorithmic Second Processing

    PubMed Central

    Dwivedi, Veena D.

    2013-01-01

    The present work suggests that sentence processing requires both heuristic and algorithmic processing streams, where the heuristic processing strategy precedes the algorithmic phase. This conclusion is based on three self-paced reading experiments in which the processing of two-sentence discourses was investigated, where context sentences exhibited quantifier scope ambiguity. Experiment 1 demonstrates that such sentences are processed in a shallow manner. Experiment 2 uses the same stimuli as Experiment 1 but adds questions to ensure deeper processing. Results indicate that reading times are consistent with a lexical-pragmatic interpretation of number associated with context sentences, but responses to questions are consistent with the algorithmic computation of quantifier scope. Experiment 3 shows the same pattern of results as Experiment 2, despite using stimuli with different lexical-pragmatic biases. These effects suggest that language processing can be superficial, and that deeper processing, which is sensitive to structure, only occurs if required. Implications for recent studies of quantifier scope ambiguity are discussed. PMID:24278439

  13. Graphene Oxides Show Angiogenic Properties.

    PubMed

    Mukherjee, Sudip; Sriram, Pavithra; Barui, Ayan Kumar; Nethi, Susheel Kumar; Veeriah, Vimal; Chatterjee, Suvro; Suresh, Kattimuttathu Ittara; Patra, Chitta Ranjan

    2015-08-01

    Angiogenesis, a process resulting in the formation of new capillaries from the pre-existing vasculature, plays a vital role in the development of therapeutic approaches for cancer, atherosclerosis, wound healing, and cardiovascular diseases. In this report, the synthesis, characterization, and angiogenic properties of graphene oxide (GO) and reduced graphene oxide (rGO) are demonstrated, as observed through several in vitro and in vivo angiogenesis assays. The results demonstrate that the intracellular formation of reactive oxygen species and reactive nitrogen species, as well as activation of phospho-eNOS and phospho-Akt, might be plausible mechanisms for GO- and rGO-induced angiogenesis. The results altogether suggest the possibility of developing an alternative angiogenic therapeutic approach for the treatment of cardiovascular-related diseases in which angiogenesis plays a significant role.

  14. Quantifying solute transport processes: are chemically "conservative" tracers electrically conservative?

    USGS Publications Warehouse

    Singha, Kamini; Li, Li; Day-Lewis, Frederick D.; Regberg, Aaron B.

    2012-01-01

    The concept of a nonreactive or conservative tracer, commonly invoked in investigations of solute transport, requires additional study in the context of electrical geophysical monitoring. Tracers that are commonly considered conservative may undergo reactive processes, such as ion exchange, thus changing the aqueous composition of the system. As a result, the measured electrical conductivity may reflect not only solute transport but also reactive processes. We have evaluated the impacts of ion exchange reactions, rate-limited mass transfer, and surface conduction on quantifying tracer mass, mean arrival time, and temporal variance in laboratory-scale column experiments. Numerical examples showed that (1) ion exchange can render resistivity-estimated tracer mass, velocity, and dispersivity inaccurate; (2) mass transfer leads to an overestimate of the mobile tracer mass and an underestimate of velocity when using electrical methods; and (3) surface conductance does not notably affect estimated moments when high-concentration tracers are used, although this phenomenon may be important at low concentrations or in sediments with high and/or spatially variable cation-exchange capacity. In all cases, colocated groundwater concentration measurements are of high importance for interpreting geophysical data with respect to the controlling transport processes of interest.
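
    For reference, the temporal moments mentioned above are standard quantities; a minimal sketch of their computation from a breakthrough curve (the curve here is synthetic):

        import numpy as np

        t = np.linspace(0.0, 100.0, 1001)             # time, hours
        c = np.exp(-0.5 * ((t - 40.0) / 8.0) ** 2)    # synthetic pulse

        m0 = np.trapz(c, t)                           # zeroth moment ~ mass
        mean_arrival = np.trapz(t * c, t) / m0        # normalized 1st moment
        variance = np.trapz((t - mean_arrival) ** 2 * c, t) / m0

        print(f"mass ~ {m0:.1f}, mean arrival = {mean_arrival:.1f} h, "
              f"temporal variance = {variance:.1f} h^2")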

  15. Statistical physics approach to quantifying differences in myelinated nerve fibers

    NASA Astrophysics Data System (ADS)

    Comin, César H.; Santos, João R.; Corradini, Dario; Morrison, Will; Curme, Chester; Rosene, Douglas L.; Gabrielli, Andrea; da F. Costa, Luciano; Stanley, H. Eugene

    2014-03-01

    We present a new method to quantify differences in myelinated nerve fibers. These differences range from morphologic characteristics of individual fibers to differences in macroscopic properties of collections of fibers. Our method uses statistical physics tools to improve on traditional measures, such as fiber size and packing density. As a case study, we analyze cross-sectional electron micrographs from the fornix of young and old rhesus monkeys using a semi-automatic detection algorithm to identify and characterize myelinated axons. We then apply a feature selection approach to identify the features that best distinguish between the young and old age groups, achieving a maximum accuracy of 94% when assigning samples to their age groups. This analysis shows that the best discrimination is obtained using the combination of two features: the fraction of occupied axon area and the effective local density. The latter is a modified calculation of axon density, which reflects how closely axons are packed. Our feature analysis approach can be applied to characterize differences that result from biological processes such as aging, damage from trauma or disease or developmental differences, as well as differences between anatomical regions such as the fornix and the cingulum bundle or corpus callosum.
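
    The classification step can be sketched as follows (synthetic stand-ins for the two selected features; the paper's detection and feature-selection pipeline is not reproduced here):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n = 60  # hypothetical samples per age group
        # columns: occupied axon area fraction, effective local density
        young = rng.normal(loc=[0.55, 1.00], scale=0.05, size=(n, 2))
        old = rng.normal(loc=[0.45, 0.85], scale=0.05, size=(n, 2))
        X = np.vstack([young, old])
        y = np.array([0] * n + [1] * n)           # 0 = young, 1 = old

        scores = cross_val_score(LogisticRegression(), X, y, cv=5)
        print(f"cross-validated accuracy: {scores.mean():.2f}")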

  16. The red planet shows off

    NASA Astrophysics Data System (ADS)

    Beish, J. D.; Parker, D. C.; Hernandez, C. E.

    1989-01-01

    Results from observations of Mars between November 1987 and September 1988 are reviewed. The observations were part of a program to provide continuous global coverage of Mars in the period surrounding its opposition on September 28, 1988. Observations of Martian clouds, dust storms, the planet's south pole, and the Martian surface are discussed.

  17. ShowMe3D

    SciTech Connect

    Sinclair, Michael B

    2012-01-05

    ShowMe3D is a data visualization graphical user interface specifically designed for use with hyperspectral image obtained from the Hyperspectral Confocal Microscope. The program allows the user to select and display any single image from a three dimensional hyperspectral image stack. By moving a slider control, the user can easily move between images of the stack. The user can zoom into any region of the image. The user can select any pixel or region from the displayed image and display the fluorescence spectrum associated with that pixel or region. The user can define up to 3 spectral filters to apply to the hyperspectral image and view the image as it would appear from a filter-based confocal microscope. The user can also obtain statistics such as intensity average and variance from selected regions.

  18. "Show me" bioethics and politics.

    PubMed

    Christopher, Myra J

    2007-10-01

    Missouri, the "Show Me State," has become the epicenter of several important national public policy debates, including abortion rights, the right to choose and refuse medical treatment, and, most recently, early stem cell research. In this environment, the Center for Practical Bioethics (formerly, Midwest Bioethics Center) emerged and grew. The Center's role in these "cultural wars" is not to advocate for a particular position but to provide well researched and objective information, perspective, and advocacy for the ethical justification of policy positions; and to serve as a neutral convener and provider of a public forum for discussion. In this article, the Center's work on early stem cell research is a case study through which to argue that not only the Center, but also the field of bioethics has a critical role in the politics of public health policy.

  19. ShowMe3D

    2012-01-05

    ShowMe3D is a data visualization graphical user interface specifically designed for use with hyperspectral image obtained from the Hyperspectral Confocal Microscope. The program allows the user to select and display any single image from a three dimensional hyperspectral image stack. By moving a slider control, the user can easily move between images of the stack. The user can zoom into any region of the image. The user can select any pixel or region from the displayed image and display the fluorescence spectrum associated with that pixel or region. The user can define up to 3 spectral filters to apply to the hyperspectral image and view the image as it would appear from a filter-based confocal microscope. The user can also obtain statistics such as intensity average and variance from selected regions.

  20. Entropy generation method to quantify thermal comfort

    NASA Technical Reports Server (NTRS)

    Boregowda, S. C.; Tiwari, S. N.; Chaturvedi, S. K.

    2001-01-01

    The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and quantify thermal comfort. The approach involves development of an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and the entropy generation term. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite-element-based KSU human thermal computer model is utilized as a "computational environmental chamber" to conduct a series of simulations examining the human thermal responses to different environmental conditions. The outputs from the simulation, which include human thermal responses, and the input data consisting of environmental conditions are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The Objective Thermal Comfort values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values; the PMV values are generated by inserting the air temperatures and vapor pressures used in the computer simulation into the regression equation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. However, an experimental study

  1. Casimir experiments showing saturation effects

    SciTech Connect

    Sernelius, Bo E.

    2009-10-15

    We address several different Casimir experiments where theory and experiment disagree. The first two are classical Casimir force measurements between two metal half-spaces: the torsion pendulum experiment by Lamoreaux, and the Casimir pressure measurement between a gold sphere and a gold plate as performed by Decca et al.; theory predicts a large negative thermal correction, absent in these high-precision experiments. The third experiment is the measurement of the Casimir force between a metal plate and a laser-irradiated semiconductor membrane, as performed by Chen et al.; the change in force with laser intensity is larger than predicted by theory. The fourth experiment is the measurement of the Casimir force between an atom and a wall, in the form of the measurement by Obrecht et al. of the change in oscillation frequency of a ⁸⁷Rb Bose-Einstein condensate trapped near a fused silica wall; the change is smaller than predicted by theory. We show that saturation effects can explain the discrepancies between theory and experiment observed in all these cases.

  2. Quantifiable diagnosis of muscular dystrophies and neurogenic atrophies through network analysis

    PubMed Central

    2013-01-01

    Background The diagnosis of neuromuscular diseases is strongly based on the histological characterization of muscle biopsies. However, this morphological analysis is mostly a subjective process and difficult to quantify. We have tested whether network science can provide a novel framework to extract useful information from muscle biopsies, developing a novel method that analyzes muscle samples in an objective, automated, fast and precise manner. Methods Our database consisted of 102 muscle biopsy images from 70 individuals (including controls, patients with neurogenic atrophies and patients with muscular dystrophies). We used this to develop a new method, Neuromuscular DIseases Computerized Image Analysis (NDICIA), that uses network science analysis to capture the defining signature of muscle biopsy images. NDICIA characterizes muscle tissues by representing each image as a network, with fibers serving as nodes and fiber contacts as links. Results After a ‘training’ phase with control and pathological biopsies, NDICIA was able to quantify the degree of pathology of each sample. We validated our method by comparing NDICIA quantification of the severity of muscular dystrophies with a pathologist’s evaluation of the degree of pathology, resulting in a strong correlation (R = 0.900, P < 0.00001). Importantly, our approach can be used to quantify new images without the need for prior ‘training’. Therefore, we show that network science analysis captures the useful information contained in muscle biopsies, helping the diagnosis of muscular dystrophies and neurogenic atrophies. Conclusions Our novel network analysis approach will serve as a valuable tool for assessing the etiology of muscular dystrophies or neurogenic atrophies, and has the potential to quantify treatment outcomes in preclinical and clinical trials. PMID:23514382

  3. Quantifiers more or less quantify online: ERP evidence for partial incremental interpretation

    PubMed Central

    Urbach, Thomas P.; Kutas, Marta

    2010-01-01

    Event-related brain potentials were recorded during RSVP reading to test the hypothesis that quantifier expressions are incrementally interpreted fully and immediately. In sentences tapping general knowledge (Farmers grow crops/worms as their primary source of income), Experiment 1 found larger N400s for atypical (worms) than typical objects (crops). Experiment 2 crossed object typicality with non-logical subject-noun phrase quantifiers (most, few). Off-line plausibility ratings exhibited the crossover interaction predicted by full quantifier interpretation: Most farmers grow crops and Few farmers grow worms were rated more plausible than Most farmers grow worms and Few farmers grow crops. Object N400s, although modulated in the expected direction, did not reverse. Experiment 3 replicated these findings with adverbial quantifiers (Farmers often/rarely grow crops/worms). Interpretation of quantifier expressions thus is neither fully immediate nor fully delayed. Furthermore, object atypicality was associated with a frontal slow positivity in few-type/rarely quantifier contexts, suggesting systematic processing differences among quantifier types. PMID:20640044

  4. Quantifying the relevance of cyclones for precipitation extremes

    NASA Astrophysics Data System (ADS)

    Pfahl, S.; Wernli, H.

    2012-04-01

    Precipitation extremes and associated floods may have a huge impact on society. It is thus important to better understand the mechanisms causing these events, also with regard to their variations in a changing climate. Here the importance of a particular category of weather systems, namely cyclones, for the occurrence of regional-scale precipitation extremes is quantified globally, based on the ERA-Interim reanalysis dataset for the period 1989-2009. Such an event-based climatological approach complements previous case studies, which established the physical relationship between cyclones and heavy precipitation. Cyclones are identified from ERA-Interim sea level pressure fields as features with a finite size, determined by the outermost closed pressure contour comprising one or several pressure minima. At each grid point, the 99th percentile of six-hourly accumulated precipitation is calculated, and all dates with six-hourly precipitation larger than this percentile are identified as extreme events. A comparison with the satellite observation-based CMORPH dataset for the years 2003 to 2009 shows that ERA-Interim properly captures the timing of the extreme events in the major parts of the extratropics. A cyclone is assumed to induce a precipitation extreme if both occur simultaneously at the same grid point. The percentage of extreme precipitation events coinciding with a cyclone is then quantified at every grid point. This percentage strongly exceeds the climatological cyclone frequency in many regions. It reaches maxima of more than 80%, e.g., in the main North Atlantic, North Pacific and Southern Ocean storm track areas. Other regional hot spots of cyclone-induced precipitation extremes are found in areas with very low climatological cyclone frequencies, in particular around the Mediterranean Sea, in eastern China, over the Philippines and the southeastern United States. Our results suggest that in these hot spot regions changes of precipitation extremes with

  5. Quantifying measurement uncertainty in full-scale compost piles using organic micro-pollutant concentrations.

    PubMed

    Sadef, Yumna; Poulsen, Tjalfe G; Bester, Kai

    2014-05-01

    Reductions in measurement uncertainty for organic micro-pollutant concentrations in full scale compost piles using comprehensive sampling and allowing equilibration time before sampling were quantified. Results showed that both application of a comprehensive sampling procedure (involving sample crushing) and allowing one week of equilibration time before sampling reduces measurement uncertainty by about 50%. Results further showed that for measurements carried out on samples collected using a comprehensive procedure, measurement uncertainty was associated exclusively with the analytic methods applied. Application of statistical analyses confirmed that these results were significant at the 95% confidence level. Overall implications of these results are (1) that it is possible to eliminate uncertainty associated with material inhomogeneity and (2) that in order to reduce uncertainty, sampling procedure is very important early in the composting process but less so later in the process.

  6. Quantifying the coherence of pure quantum states

    NASA Astrophysics Data System (ADS)

    Chen, Jianxin; Grogan, Shane; Johnston, Nathaniel; Li, Chi-Kwong; Plosker, Sarah

    2016-10-01

    In recent years, several measures have been proposed for characterizing the coherence of a given quantum state. We derive several results that illuminate how these measures behave when restricted to pure states. Notably, we present an explicit characterization of the closest incoherent state to a given pure state under the trace distance measure of coherence. We then use this result to show that the states maximizing the trace distance of coherence are exactly the maximally coherent states. We define the trace distance of entanglement and show that it coincides with the trace distance of coherence for pure states. Finally, we give an alternate proof to a recent result that the ℓ1 measure of coherence of a pure state is never smaller than its relative entropy of coherence.
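
    For reference, the two pure-state quantities compared in the final result have standard closed forms; for a pure state written in the fixed incoherent basis, they are (in our notation, consistent with the abstract but not quoted from it):

        % For a pure state |\psi\rangle = \sum_i c_i |i\rangle:
        \begin{align}
          C_{\ell_1}(\psi) &= \sum_{i \neq j} |c_i c_j|
                            = \Big(\sum_i |c_i|\Big)^{2} - 1, \\
          C_{r}(\psi)      &= S\big(\Delta(|\psi\rangle\langle\psi|)\big)
                            = -\sum_i |c_i|^{2} \log_2 |c_i|^{2},
        \end{align}
        % where \Delta deletes off-diagonal entries; the cited result is
        % C_{\ell_1}(\psi) \ge C_{r}(\psi) for every pure state.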

  7. Quantifying Stock Return Distributions in Financial Markets.

    PubMed

    Botta, Federico; Moat, Helen Susannah; Stanley, H Eugene; Preis, Tobias

    2015-01-01

    Being able to quantify the probability of large price changes in stock markets is of crucial importance in understanding financial crises that affect the lives of people worldwide. Large changes in stock market prices can arise abruptly, within a matter of minutes, or develop across much longer time scales. Here, we analyze a dataset comprising the stocks forming the Dow Jones Industrial Average at a second by second resolution in the period from January 2008 to July 2010 in order to quantify the distribution of changes in market prices at a range of time scales. We find that the tails of the distributions of logarithmic price changes, or returns, exhibit power law decays for time scales ranging from 300 seconds to 3600 seconds. For larger time scales, we find that the distributions' tails exhibit exponential decay. Our findings may inform the development of models of market behavior across varying time scales. PMID:26327593
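
    The tail analysis described above can be sketched generically as follows (synthetic heavy-tailed prices and a simple Hill estimator; this is not the authors' code, and the estimator choice is an assumption):

        import numpy as np

        rng = np.random.default_rng(1)
        # synthetic heavy-tailed price path (Student-t increments, df=3)
        steps = 1e-3 * rng.standard_t(df=3, size=50_000)
        prices = 100 * np.exp(np.cumsum(steps))

        def hill_alpha(returns, k=500):
            """Hill estimate of the tail exponent from the k largest
            absolute returns."""
            tail = np.sort(np.abs(returns))[-k:]
            return 1.0 / np.mean(np.log(tail / tail[0]))

        log_ret = np.diff(np.log(prices))         # returns at base time scale
        print(f"tail exponent alpha ~ {hill_alpha(log_ret):.2f}")  # near 3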

  8. Quantifying Stock Return Distributions in Financial Markets

    PubMed Central

    Botta, Federico; Moat, Helen Susannah; Stanley, H. Eugene; Preis, Tobias

    2015-01-01

    Being able to quantify the probability of large price changes in stock markets is of crucial importance in understanding financial crises that affect the lives of people worldwide. Large changes in stock market prices can arise abruptly, within a matter of minutes, or develop across much longer time scales. Here, we analyze a dataset comprising the stocks forming the Dow Jones Industrial Average at a second by second resolution in the period from January 2008 to July 2010 in order to quantify the distribution of changes in market prices at a range of time scales. We find that the tails of the distributions of logarithmic price changes, or returns, exhibit power law decays for time scales ranging from 300 seconds to 3600 seconds. For larger time scales, we find that the distributions' tails exhibit exponential decay. Our findings may inform the development of models of market behavior across varying time scales. PMID:26327593

  9. Quantifying Shape Changes and Tissue Deformation in Leaf Development

    PubMed Central

    Rolland-Lagan, Anne-Gaëlle; Remmler, Lauren; Girard-Bock, Camille

    2014-01-01

    The analysis of biological shapes has applications in many areas of biology, and tools exist to quantify organ shape and detect shape differences between species or among variants. However, such measurements do not provide any information about the mechanisms of shape generation. Quantitative data on growth patterns may provide insights into morphogenetic processes, but since growth is a complex process occurring in four dimensions, growth patterns alone cannot intuitively be linked to shape outcomes. Here, we present computational tools to quantify tissue deformation and surface shape changes over the course of leaf development, applied to the first leaf of Arabidopsis (Arabidopsis thaliana). The results show that the overall leaf shape does not change notably during the developmental stages analyzed, yet there is a clear upward radial deformation of the leaf tissue in early time points. This deformation pattern may provide an explanation for how the Arabidopsis leaf maintains a relatively constant shape despite spatial heterogeneities in growth. These findings highlight the importance of quantifying tissue deformation when investigating the control of leaf shape. More generally, experimental mapping of deformation patterns may help us to better understand the link between growth and shape in organ development. PMID:24710066

  10. Mimas Showing False Colors #1

    NASA Technical Reports Server (NTRS)

    2005-01-01

    False color images of Saturn's moon, Mimas, reveal variation in either the composition or texture across its surface.

    During its approach to Mimas on Aug. 2, 2005, the Cassini spacecraft narrow-angle camera obtained multi-spectral views of the moon from a range of 228,000 kilometers (142,500 miles).

    The image at the left is a narrow angle clear-filter image, which was separately processed to enhance the contrast in brightness and sharpness of visible features. The image at the right is a color composite of narrow-angle ultraviolet, green, infrared and clear filter images, which have been specially processed to accentuate subtle changes in the spectral properties of Mimas' surface materials. To create this view, three color images (ultraviolet, green and infrared) were combined into a single black and white picture that isolates and maps regional color differences. This 'color map' was then superimposed over the clear-filter image at the left.

    The combination of color map and brightness image shows how the color differences across the Mimas surface materials are tied to geological features. Shades of blue and violet in the image at the right are used to identify surface materials that are bluer in color and have a weaker infrared brightness than average Mimas materials, which are represented by green.

    Herschel crater, a 140-kilometer-wide (88-mile) impact feature with a prominent central peak, is visible in the upper right of each image. The unusual bluer materials are seen to broadly surround Herschel crater. However, the bluer material is not uniformly distributed in and around the crater. Instead, it appears to be concentrated on the outside of the crater and more to the west than to the north or south. The origin of the color differences is not yet understood. It may represent ejecta material that was excavated from inside Mimas when the Herschel impact occurred. The bluer color of these materials may be caused by subtle differences in

  11. Quantifying Biogenic Bias in Screening Libraries

    PubMed Central

    Hert, Jérôme; Irwin, John J.; Laggner, Christian; Keiser, Michael J.; Shoichet, Brian K.

    2009-01-01

    In lead discovery, libraries of 10⁶ molecules are screened for biological activity. Given the over 10⁶⁰ drug-like molecules thought possible, such screens might never succeed. That they do, even occasionally, implies a biased selection of library molecules. Here a method is developed to quantify the bias in screening libraries towards biogenic molecules. With this approach, we consider what is missing from screening libraries and how they can be optimized. PMID:19483698

  12. Quantifying the CV: Adapting an Impact Assessment Model to Astronomy

    NASA Astrophysics Data System (ADS)

    Bohémier, K. A.

    2015-04-01

    We present the process and results of applying the Becker Model to the curriculum vitae of a Yale University astronomy professor. As background, in July 2013, the Becker Medical Library at Washington Univ. in St. Louis held a workshop for librarians on the Becker Model, a framework developed by research assessment librarians for quantifying medical researchers' individual and group outputs. Following the workshop, the model was analyzed for content to adapt it to the physical sciences.

  13. The logic in language: How all quantifiers are alike, but each quantifier is different.

    PubMed

    Feiman, Roman; Snedeker, Jesse

    2016-06-01

    Quantifier words like each, every, all and three are among the most abstract words in language. Unlike nouns, verbs and adjectives, the meanings of quantifiers are not related to a referent out in the world. Rather, quantifiers specify what relationships hold between the sets of entities, events and properties denoted by other words. When two quantifiers are in the same clause, they create a systematic ambiguity. "Every kid climbed a tree" could mean that there was only one tree, climbed by all, or many different trees, one per climbing kid. In the present study, participants chose a picture to indicate their preferred reading of different ambiguous sentences - those containing every, as well as the other three quantifiers. In Experiment 1, we found large systematic differences in preference, depending on the quantifier word. In Experiment 2, we then manipulated the choice of a particular reading of one sentence, and tested how this affected participants' reading preference on a subsequent target sentence. We found a priming effect for all quantifiers, but only when the prime and target sentences contained the same quantifier. For example, all-a sentences prime other all-a sentences, while each-a primes each-a, but sentences with each do not prime sentences with all or vice versa. In Experiment 3, we ask whether the lack of priming across quantifiers could be due to the two sentences sharing one fewer word. We find that changing the verb between the prime and target sentence does not reduce the priming effect. In Experiment 4, we discover one case where there is priming across quantifiers - when one number (e.g. three) is in the prime, and a different one (e.g. four) is in the target. We discuss how these findings relate to linguistic theories of quantifier meaning and what they tell us about the division of labor between conceptual content and combinatorial semantics, as well as the mental representations of quantification and of the abstract logical structure of

  14. COMPLEXITY AND APPROXIMABILITY OF QUANTIFIED AND STOCHASTIC CONSTRAINT SATISFACTION PROBLEMS

    SciTech Connect

    Hunt, H. B.; Marathe, M. V.; Stearns, R. E.

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S and T be arbitrary finite sets of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables by SAT(S), and applied to variables and symbols in C by SATc(S). Here, we study simultaneously the complexity of decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. We present simple yet general techniques to characterize simultaneously the complexity or efficient approximability of a number of versions/variants of the problems SAT(S), Q-SAT(S), S-SAT(S), MAX-Q-SAT(S), etc., for many different such D, C, S, and T. Our unified approach is based on the following two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Some of the results extend the earlier results in [Pa85,LMP99,CF+93,CF+94]. Our techniques and results reported here also provide significant steps towards obtaining dichotomy theorems for a number of the problems above, including the problems MAX-Q-SAT(S) and MAX-S-SAT(S). The discovery of such dichotomy theorems, for unquantified formulas, has received significant recent attention in the literature [CF+93, CF+94, Cr95, KSW97].

  15. Quantifying food intake in socially housed monkeys: social status effects on caloric consumption.

    PubMed

    Wilson, Mark E; Fisher, Jeff; Fischer, Andrew; Lee, Vanessa; Harris, Ruth B; Bartness, Timothy J

    2008-07-01

    Obesity results from a number of factors, including socio-environmental influences, and rodent models show that several different stressors increase the preference for calorically dense foods, leading to an obese phenotype. We present here a non-human primate model using socially housed adult female macaques living in long-term stable groups given access to diets of different caloric density. Consumption of a low fat diet (LFD; 15% of calories from fat) and a high fat diet (HFD; 45% of calories from fat) was quantified by means of a custom-built, automated feeder that dispensed a pellet of food when activated by a radiofrequency chip implanted subcutaneously in the animal's wrist. Socially subordinate females showed indices of chronic psychological stress, having reduced glucocorticoid negative feedback and higher frequencies of anxiety-like behavior. Twenty-four hour intakes of both the LFD and HFD were significantly greater in subordinates than in dominants, an effect that persisted whether standard monkey chow (13% of calories from fat) was present or absent. Furthermore, although dominants restricted their food intake to daylight, subordinates continued to feed at night. Total caloric intake was significantly correlated with body weight change. Collectively, these results show that food intake can be reliably quantified in non-human primates living in complex social environments and indicate that socially subordinate females consume more calories, suggesting that this ethologically relevant model may help us understand how psychosocial stress changes food preferences and consumption, leading to obesity.

  16. A novel real time imaging platform to quantify macrophage phagocytosis.

    PubMed

    Kapellos, Theodore S; Taylor, Lewis; Lee, Heyne; Cowley, Sally A; James, William S; Iqbal, Asif J; Greaves, David R

    2016-09-15

    Phagocytosis of pathogens, apoptotic cells and debris is a key feature of macrophage function in host defense and tissue homeostasis. Quantification of macrophage phagocytosis in vitro has traditionally been technically challenging. Here we report the optimization and validation of the IncuCyte ZOOM® real time imaging platform for macrophage phagocytosis based on pHrodo® pathogen bioparticles, which only fluoresce when localized in the acidic environment of the phagolysosome. Image analysis and fluorescence quantification were performed with the automated IncuCyte™ Basic Software. Titration of the bioparticle number showed that the system is more sensitive than a spectrofluorometer, as it can detect phagocytosis when using 20× fewer E. coli bioparticles. We exemplified the power of this real time imaging platform by studying phagocytosis of murine alveolar, bone marrow and peritoneal macrophages. We further demonstrate the ability of this platform to study modulation of the phagocytic process, as pharmacological inhibitors of phagocytosis suppressed bioparticle uptake in a concentration-dependent manner, whereas opsonins augmented phagocytosis. We also investigated the effects of macrophage polarization on E. coli phagocytosis. Bone marrow-derived macrophage (BMDM) priming with M2 stimuli, such as IL-4 and IL-10, resulted in higher engulfment of bioparticles in comparison with M1 polarization. Moreover, we demonstrated that tolerization of BMDMs with lipopolysaccharide (LPS) results in impaired E. coli bioparticle phagocytosis. This novel real time assay will enable researchers to quantify macrophage phagocytosis with a higher degree of accuracy and sensitivity and will allow investigation of limited populations of primary phagocytes in vitro.

  17. A novel real time imaging platform to quantify macrophage phagocytosis.

    PubMed

    Kapellos, Theodore S; Taylor, Lewis; Lee, Heyne; Cowley, Sally A; James, William S; Iqbal, Asif J; Greaves, David R

    2016-09-15

    Phagocytosis of pathogens, apoptotic cells and debris is a key feature of macrophage function in host defense and tissue homeostasis. Quantification of macrophage phagocytosis in vitro has traditionally been technically challenging. Here we report the optimization and validation of the IncuCyte ZOOM® real time imaging platform for macrophage phagocytosis based on pHrodo® pathogen bioparticles, which only fluoresce when localized in the acidic environment of the phagolysosome. Image analysis and fluorescence quantification were performed with the automated IncuCyte™ Basic Software. Titration of the bioparticle number showed that the system is more sensitive than a spectrofluorometer, as it can detect phagocytosis when using 20× fewer E. coli bioparticles. We exemplified the power of this real time imaging platform by studying phagocytosis of murine alveolar, bone marrow and peritoneal macrophages. We further demonstrate the ability of this platform to study modulation of the phagocytic process, as pharmacological inhibitors of phagocytosis suppressed bioparticle uptake in a concentration-dependent manner, whereas opsonins augmented phagocytosis. We also investigated the effects of macrophage polarization on E. coli phagocytosis. Bone marrow-derived macrophage (BMDM) priming with M2 stimuli, such as IL-4 and IL-10, resulted in higher engulfment of bioparticles in comparison with M1 polarization. Moreover, we demonstrated that tolerization of BMDMs with lipopolysaccharide (LPS) results in impaired E. coli bioparticle phagocytosis. This novel real time assay will enable researchers to quantify macrophage phagocytosis with a higher degree of accuracy and sensitivity and will allow investigation of limited populations of primary phagocytes in vitro. PMID:27475716

  18. Quantifying thermal modifications on laser welded skin tissue

    NASA Astrophysics Data System (ADS)

    Tabakoglu, Hasim Ö.; Gülsoy, Murat

    2011-02-01

    Laser tissue welding is a potential medical treatment method, especially for closing cuts made during surgery. The photothermal effects of lasers on tissue should be quantified in order to determine optimal dosimetry parameters. Polarized light and phase contrast techniques reveal information about the extent of thermal change occurring in tissue during laser welding. Changes in collagen structure can be detected in skin samples stained with hematoxylin and eosin. In this study, three different near-infrared laser wavelengths (809 nm, 980 nm and 1070 nm) were compared for skin welding efficiency. Cuts 1 cm long were treated with spot-by-spot laser application on Wistar rats' dorsal skin, in vivo. In all laser applications, 0.5 W of optical power was delivered to the tissue for 5 s continuously, resulting in an energy density of 79.61 J/cm2 (power density 15.92 W/cm2) for each spot. The 1st, 4th, 7th, 14th, and 21st days of the recovery period were designated as control days, and the skin samples needed for histology were removed on these particular days. The stained samples were examined under a light microscope. Images were taken with a CCD camera and examined with imaging software. The 809 nm laser was found to be capable of creating strong full-thickness closure, but thermal damage was evident. The thermal damage from 980 nm laser welding was found to be more tolerable. The results showed that 1070 nm laser welding produced noticeably stronger bonds with minimal scar formation.

  19. Using multilevel models to quantify heterogeneity in resource selection

    USGS Publications Warehouse

    Wagner, T.; Diefenbach, D.R.; Christensen, S.A.; Norton, A.S.

    2011-01-01

    Models of resource selection are being used increasingly to predict or model the effects of management actions rather than simply quantifying habitat selection. Multilevel, or hierarchical, models are an increasingly popular method to analyze animal resource selection because they impose a relatively weak stochastic constraint to model heterogeneity in habitat use and also account for unequal sample sizes among individuals. However, few studies have used multilevel models to model coefficients as a function of predictors that may influence habitat use at different scales or quantify differences in resource selection among groups. We used an example with white-tailed deer (Odocoileus virginianus) to illustrate how to model resource use as a function of distance to road that varies among deer by road density at the home range scale. We found that deer avoidance of roads decreased as road density increased. Also, we used multilevel models with sika deer (Cervus nippon) and white-tailed deer to examine whether resource selection differed between species. We failed to detect differences in resource use between these two species and showed how information-theoretic and graphical measures can be used to assess how resource use may have differed. Multilevel models can improve our understanding of how resource selection varies among individuals and provides an objective, quantifiable approach to assess differences or changes in resource selection. ?? The Wildlife Society, 2011.
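
    A minimal sketch of such a random-slope model (simulated data; a linear illustration rather than the authors' full resource-selection model) using statsmodels:

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        rows = []
        for deer in range(20):
            slope = 0.5 + rng.normal(0, 0.2)      # deer-specific avoidance
            dist = rng.uniform(0, 2, size=50)     # km to nearest road
            use = slope * dist + rng.normal(0, 0.3, size=50)
            rows += [(f"d{deer}", d, u) for d, u in zip(dist, use)]
        df = pd.DataFrame(rows, columns=["deer_id", "dist_road", "use"])

        # random intercept and random distance-to-road slope per deer
        model = sm.MixedLM.from_formula("use ~ dist_road", groups="deer_id",
                                        re_formula="~dist_road", data=df)
        print(model.fit().summary())

    In a full analysis, the deer-level slopes would then be modeled as a function of home-range road density, as the abstract describes.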

  20. Quantifying the Impact of Scenic Environments on Health.

    PubMed

    Seresinhe, Chanuki Illushka; Preis, Tobias; Moat, Helen Susannah

    2015-01-01

    Few people would deny an intuitive sense of increased wellbeing when spending time in beautiful locations. Here, we ask: can we quantify the relationship between environmental aesthetics and human health? We draw on data from Scenic-Or-Not, a website that crowdsources ratings of "scenicness" for geotagged photographs across Great Britain, in combination with data on citizen-reported health from the Census for England and Wales. We find that inhabitants of more scenic environments report better health, across urban, suburban and rural areas, even when taking core socioeconomic indicators of deprivation into account, such as income, employment and access to services. Our results provide evidence in line with the striking hypothesis that the aesthetics of the environment may have quantifiable consequences for our wellbeing. PMID:26603464

  1. Quantifying the Impact of Scenic Environments on Health

    NASA Astrophysics Data System (ADS)

    Seresinhe, Chanuki Illushka; Preis, Tobias; Moat, Helen Susannah

    2015-11-01

    Few people would deny an intuitive sense of increased wellbeing when spending time in beautiful locations. Here, we ask: can we quantify the relationship between environmental aesthetics and human health? We draw on data from Scenic-Or-Not, a website that crowdsources ratings of “scenicness” for geotagged photographs across Great Britain, in combination with data on citizen-reported health from the Census for England and Wales. We find that inhabitants of more scenic environments report better health, across urban, suburban and rural areas, even when taking core socioeconomic indicators of deprivation into account, such as income, employment and access to services. Our results provide evidence in line with the striking hypothesis that the aesthetics of the environment may have quantifiable consequences for our wellbeing.

  2. Quantifying the Impact of Scenic Environments on Health

    PubMed Central

    Seresinhe, Chanuki Illushka; Preis, Tobias; Moat, Helen Susannah

    2015-01-01

    Few people would deny an intuitive sense of increased wellbeing when spending time in beautiful locations. Here, we ask: can we quantify the relationship between environmental aesthetics and human health? We draw on data from Scenic-Or-Not, a website that crowdsources ratings of “scenicness” for geotagged photographs across Great Britain, in combination with data on citizen-reported health from the Census for England and Wales. We find that inhabitants of more scenic environments report better health, across urban, suburban and rural areas, even when taking core socioeconomic indicators of deprivation into account, such as income, employment and access to services. Our results provide evidence in line with the striking hypothesis that the aesthetics of the environment may have quantifiable consequences for our wellbeing. PMID:26603464

  3. Analysis of subsurface temperature data to quantify groundwater recharge rates in a closed Altiplano basin, northern Chile

    NASA Astrophysics Data System (ADS)

    Kikuchi, C. P.; Ferré, T. P. A.

    2016-09-01

    Quantifying groundwater recharge is a fundamental part of groundwater resource assessment and management, and is requisite to determining the safe yield of an aquifer. Natural groundwater recharge in arid and semi-arid regions comprises several mechanisms: in-place, mountain-front, and mountain-block recharge. A field study was undertaken in a high-plain basin in the Altiplano region of northern Chile to quantify the magnitude of in-place and mountain-front recharge. Water fluxes corresponding to both recharge mechanisms were calculated using heat as a natural tracer. To quantify in-place recharge, time-series temperature data in cased boreholes were collected, and the annual fluctuation at multiple depths analyzed to infer the water flux through the unsaturated zone. To quantify mountain-front recharge, time-series temperature data were collected in perennial and ephemeral stream channels. Streambed thermographs were analyzed to determine the onset and duration of flow in ephemeral channels, and the vertical water fluxes into both perennial and ephemeral channels. The point flux estimates in streambeds and the unsaturated zone were upscaled to channel and basin-floor areas to provide comparative estimates of the range of volumetric recharge rates corresponding to each recharge mechanism. The results of this study show that mountain-front recharge is substantially more important than in-place recharge in this basin. The results further demonstrate the worth of time-series subsurface temperature data to characterize both in-place and mountain-front recharge processes.
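
    The first step of such a heat-tracer analysis can be sketched as follows: fit an annual sinusoid to temperature records at two depths and form the amplitude ratio (synthetic data; converting the ratio to a water flux requires a conduction-advection solution, e.g. a Stallman-type model, which is not reproduced here):

        import numpy as np
        from scipy.optimize import curve_fit

        P = 365.0                                 # period, days

        def annual(t, mean, amp, phase):
            return mean + amp * np.cos(2 * np.pi * t / P - phase)

        rng = np.random.default_rng(3)
        t = np.arange(0.0, 730.0)                 # two years, daily
        shallow = annual(t, 15.0, 6.0, 0.3) + rng.normal(0, 0.2, t.size)
        deep = annual(t, 15.0, 2.5, 1.1) + rng.normal(0, 0.2, t.size)

        (_, amp_s, _), _ = curve_fit(annual, t, shallow, p0=[15, 5, 0])
        (_, amp_d, _), _ = curve_fit(annual, t, deep, p0=[15, 5, 0])
        print(f"amplitude ratio Ad/As = {amp_d / amp_s:.2f}")   # ~0.42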

  4. Gains and Pitfalls of Quantifier Elimination as a Teaching Tool

    ERIC Educational Resources Information Center

    Oldenburg, Reinhard

    2015-01-01

    Quantifier Elimination is a procedure that allows simplification of logical formulas that contain quantifiers. Many mathematical concepts are defined in terms of quantifiers and especially in calculus their use has been identified as an obstacle in the learning process. The automatic deduction provided by quantifier elimination thus allows…

  5. Quantifying the underlying landscape and paths of cancer

    PubMed Central

    Li, Chunhe; Wang, Jin

    2014-01-01

    Cancer is a disease regulated by the underlying gene networks. The emergence of normal and cancer states as well as the transformation between them can be thought of as a result of the gene network interactions and associated changes. We developed a global potential landscape and path framework to quantify cancer and associated processes. We constructed a cancer gene regulatory network based on the experimental evidences and uncovered the underlying landscape. The resulting tristable landscape characterizes important biological states: normal, cancer and apoptosis. The landscape topography in terms of barrier heights between stable state attractors quantifies the global stability of the cancer network system. We propose two mechanisms of cancerization: one is by the changes of landscape topography through the changes in regulation strengths of the gene networks. The other is by the fluctuations that help the system to go over the critical barrier at fixed landscape topography. The kinetic paths from the least action principle quantify the transition processes among the normal state, cancer state and apoptosis state. The kinetic rates provide the quantification of transition speeds among normal, cancer and apoptosis attractors. By the global sensitivity analysis of the gene network parameters on the landscape topography, we uncovered some key gene regulations determining the transitions between cancer and normal states. This can be used to guide the design of new anti-cancer tactics, through a cocktail strategy of targeting multiple key regulation links simultaneously, for preventing cancer occurrence or transforming the early cancer state back to the normal state. PMID:25232051

  6. Quantifying Error in the CMORPH Satellite Precipitation Estimates

    NASA Astrophysics Data System (ADS)

    Xu, B.; Yoo, S.; Xie, P.

    2010-12-01

    As part of the collaboration between the China Meteorological Administration (CMA) National Meteorological Information Centre (NMIC) and the NOAA Climate Prediction Center (CPC), a new system is being developed to construct an hourly precipitation analysis on a 0.25° lat/lon grid over China by merging information derived from gauge observations and CMORPH satellite precipitation estimates. Foundational to the development of the gauge-satellite merging algorithm is the definition of the systematic and random error inherent in the CMORPH satellite precipitation estimates. In this study, we quantify the CMORPH error structures through comparisons against a gauge-based analysis of hourly precipitation derived from station reports from a dense network over China. First, the systematic error (bias) of the CMORPH satellite estimates is examined with co-located hourly gauge precipitation analysis over 0.25° lat/lon grid boxes with at least one reporting station. The CMORPH bias exhibits regional variation, with over-estimates over eastern China, and seasonal variation, with over-/under-estimates during the warm/cold seasons. The bias is also range-dependent: in general, CMORPH tends to over-estimate weak rainfall and under-estimate strong rainfall. The bias, when expressed as the ratio between the gauge observations and the CMORPH satellite estimates, increases with rainfall intensity but tends to saturate at a certain level for high rainfall. Based on the above results, a prototype algorithm is developed to remove the CMORPH bias by matching the PDF of the original CMORPH estimates against that of the gauge analysis, using data pairs co-located over grid boxes with at least one reporting gauge over a 30-day period ending at the target date. The spatial domain for collecting the co-located data pairs is expanded so that at least 5000 pairs of data are available to ensure statistical stability. The bias-corrected CMORPH is then compared against the gauge data to quantify the
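
    The PDF-matching step can be illustrated generically with empirical quantile mapping (this sketch is not the operational algorithm; the distributions and window handling are invented):

        import numpy as np

        def pdf_match(satellite, gauge, new_values):
            """Map each new satellite value to the gauge value at the
            same cumulative probability (empirical quantile mapping)."""
            sat_sorted = np.sort(satellite)
            p = np.searchsorted(sat_sorted, new_values) / len(sat_sorted)
            return np.quantile(np.sort(gauge), np.clip(p, 0.0, 1.0))

        rng = np.random.default_rng(5)
        gauge = rng.gamma(shape=0.5, scale=4.0, size=5000)        # "truth"
        satellite = 1.3 * gauge + rng.normal(0, 0.5, gauge.size)  # biased
        print(np.round(pdf_match(satellite, gauge, satellite[:5]), 2))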

  7. An index for quantifying flocking behavior.

    PubMed

    Quera, Vicenç; Herrando, Salvador; Beltran, Francesc S; Salas, Laura; Miñano, Meritxell

    2007-12-01

    One of the classic research topics in adaptive behavior is the collective displacement of groups of organisms such as flocks of birds, schools of fish, herds of mammals, and crowds of people. However, most agent-based simulations of group behavior do not provide a quantitative index for determining the point at which the flock emerges. An index of the aggregation of moving individuals in a flock was developed, and an example was provided of how it can be used to quantify the degree to which a group of moving individuals actually forms a flock. PMID:18229552
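
    One simple aggregation statistic of this general kind (shown below as an illustration; not necessarily the authors' index) is the Clark-Evans ratio: the group's mean nearest-neighbour distance normalized by the value expected for uniformly random positions, with values well below 1 indicating a flock:

        import numpy as np

        def clark_evans(positions, area):
            """Observed mean nearest-neighbour distance divided by the
            expectation for a uniform random pattern of equal density."""
            pos = np.asarray(positions, dtype=float)
            d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
            np.fill_diagonal(d, np.inf)
            observed = d.min(axis=1).mean()
            expected = 0.5 / np.sqrt(len(pos) / area)
            return observed / expected

        rng = np.random.default_rng(6)
        scattered = rng.uniform(0, 100, size=(50, 2))   # no flock
        flocked = rng.normal(50, 3, size=(50, 2))       # tight cluster
        print(clark_evans(scattered, 100 * 100))        # ~1
        print(clark_evans(flocked, 100 * 100))          # << 1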

  8. Crowdsourcing for quantifying transcripts: An exploratory study.

    PubMed

    Azzam, Tarek; Harman, Elena

    2016-02-01

    This exploratory study attempts to demonstrate the potential utility of crowdsourcing as a supplemental technique for quantifying transcribed interviews. Crowdsourcing is the harnessing of the abilities of many people to complete a specific task or a set of tasks. In this study multiple samples of crowdsourced individuals were asked to rate and select supporting quotes from two different transcripts. The findings indicate that the different crowdsourced samples produced nearly identical ratings of the transcripts, and were able to consistently select the same supporting text from the transcripts. These findings suggest that crowdsourcing, with further development, can potentially be used as a mixed method tool to offer a supplemental perspective on transcribed interviews.

  9. Inducing and Quantifying Clostridium difficile Spore Formation.

    PubMed

    Shen, Aimee; Fimlaid, Kelly A; Pishdadian, Keyan

    2016-01-01

    The Gram-positive nosocomial pathogen Clostridium difficile induces sporulation during growth in the gastrointestinal tract. Sporulation is necessary for this obligate anaerobe to form metabolically dormant spores that can resist antibiotic treatment, survive exit from the mammalian host, and transmit C. difficile infections. In this chapter, we describe a method for inducing C. difficile sporulation in vitro. This method can be used to study sporulation and maximize spore purification yields for a number of C. difficile strain backgrounds. We also describe procedures for visualizing spore formation using phase-contrast microscopy and for quantifying the efficiency of sporulation using heat resistance as a measure of functional spore formation. PMID:27507338

  10. Quantifying and scaling airplane performance in turbulence

    NASA Astrophysics Data System (ADS)

    Richardson, Johnhenri R.

    This dissertation studies the effects of turbulent wind on airplane airspeed and normal load factor, determining how these effects scale with airplane size and developing envelopes to account for them. The results have applications in design and control of aircraft, especially small-scale aircraft, for robustness with respect to turbulence. Using linearized airplane dynamics and the Dryden gust model, this dissertation presents analytical and numerical scaling laws for airplane performance in gusts, safety margins that guarantee, with specified probability, that steady flight can be maintained when stochastic wind gusts act upon an airplane, and envelopes to visualize these safety margins. Presented here for the first time are scaling laws for the phugoid natural frequency, phugoid damping ratio, airspeed variance in turbulence, and flight path angle variance in turbulence. The results show that small aircraft are more susceptible to high frequency gusts, that the phugoid damping ratio does not depend directly on airplane size, that the airspeed and flight path angle variances can be parameterized by the ratio of the phugoid natural frequency to a characteristic turbulence frequency, and that the coefficient of variation of the airspeed decreases with increasing airplane size. Accompanying numerical examples validate the results using eleven different airplane models, focusing on NASA's hypothetical Boeing 757 analog, the Generic Transport Model, and its operational 5.5% scale model, the NASA T2. Also presented here for the first time are stationary flight, where the flight state is a stationary random process, and the stationary flight envelope, an adjusted steady flight envelope to visualize safety margins for stationary flight. The dissertation shows that driving the linearized airplane equations of motion with stationary, stochastic gusts results in stationary flight. It also shows how feedback control can enlarge the stationary flight envelope by alleviating
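
    The size scaling of the phugoid mode can be illustrated with the classical Lanchester approximation, in which the phugoid natural frequency depends only on trim airspeed. This is a textbook approximation, not the dissertation's full linearized model, and the airspeeds below are rough illustrative values:

    import math

    def phugoid_natural_frequency(airspeed_mps, g=9.81):
        """Lanchester approximation: omega_ph ~ sqrt(2) * g / V.

        Slower (typically smaller) aircraft have a higher phugoid frequency,
        consistent with small aircraft responding to higher-frequency gusts.
        """
        return math.sqrt(2.0) * g / airspeed_mps

    for name, v in [("small scale model (~40 m/s)", 40.0),
                    ("full-scale transport (~230 m/s)", 230.0)]:
        print(f"{name}: omega_ph = {phugoid_natural_frequency(v):.3f} rad/s")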

  11. Quantifying the surface chemistry of 3D matrices in situ

    NASA Astrophysics Data System (ADS)

    Tzeranis, Dimitrios S.; So, Peter T. C.; Yannas, Ioannis V.

    2014-03-01

    Despite the major role of the matrix (the insoluble environment around cells) in physiology and pathology, there are few methods that can quantify the surface chemistry of a 3D matrix such as a biomaterial or tissue ECM. This study describes a novel optics-based methodology that can quantify the surface chemistry (density of adhesion ligands for particular cell adhesion receptors) of a matrix in situ. The methodology utilizes fluorescent analogs (markers) of the receptor of interest and a series of binding assays, in which the amount of marker bound to the matrix is quantified via spectral multi-photon imaging. The study provides preliminary results for the quantification of the ligands for the two major collagen-binding integrins (α1β1, α2β1) in porous collagen scaffolds that have been shown to induce maximum regeneration in transected peripheral nerves. The developed methodology opens the way for quantitative descriptions of the insoluble microenvironment of cells in physiology and pathology, and for integrating the matrix into quantitative models of cell signaling.
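
    Binding assays of this kind are commonly summarized by fitting a saturation (Langmuir) curve to bound-marker signal versus marker concentration, with Bmax serving as a proxy for ligand density. A generic sketch with made-up data, not the authors' analysis pipeline:

    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(conc, bmax, kd):
        """Specific binding: B = Bmax * C / (Kd + C)."""
        return bmax * conc / (kd + conc)

    # Hypothetical titration: marker concentration (nM) vs bound signal (a.u.)
    conc = np.array([1, 2, 5, 10, 20, 50, 100.0])
    signal = np.array([12, 22, 41, 60, 78, 95, 103.0])

    popt, pcov = curve_fit(langmuir, conc, signal, p0=[100.0, 10.0])
    bmax, kd = popt
    print(f"Bmax = {bmax:.1f} a.u. (ligand-density proxy), Kd = {kd:.1f} nM")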

  12. Quantifying the micrometeorological controls on fog deposition

    NASA Astrophysics Data System (ADS)

    Farlin, J. P.; Paw U, K. T.; Underwood, J.

    2014-12-01

    Fog deposition has been shown to be a significant water input into many arid ecosystems. However, deposition of fog onto foliage depends on many factors. Previously, characterizing fog droplet size distributions was labor intensive, but we can now characterize changes in fog droplet size distributions across the 2-50 μm range in 2 μm intervals in real time. Evaluating how droplet size and ambient micrometeorological conditions affect deposition rates will allow tremendous new insight into fog formation and deposition processes. Previous work has characterized how fog deposition varies with wind speed in natural systems, but extensively testing how droplet size, wind speed, and angle of interception co-vary would be impossible in a natural setting. We utilized a wind tunnel with artificial fog-generating nebulizers to simulate fog events across a range of micrometeorological conditions. Using a weighing lysimeter, we were able to quantify the differential rates of deposition on different theoretical leaf types as droplet size and micrometeorological conditions vary. We hope to inform fog collector designs with this information to ensure we are accurately quantifying the fluxes of fog-derived water into these systems.

  13. Quantifying chemical reactions by using mixing analysis.

    PubMed

    Jurado, Anna; Vázquez-Suñé, Enric; Carrera, Jesús; Tubau, Isabel; Pujades, Estanislao

    2015-01-01

    This work is motivated by the need for a sound understanding of the chemical processes that affect organic pollutants in an urban aquifer. We propose an approach to quantify such processes using mixing calculations. The methodology consists of the following steps: (1) identification of the recharge sources (end-members) and selection of the species (conservative and non-conservative) to be used, (2) identification of the chemical processes, and (3) evaluation of mixing ratios including the chemical processes. This methodology has been applied in the Besòs River Delta (NE Barcelona, Spain), where the River Besòs is the main aquifer recharge source. A total of 51 groundwater samples were collected from July 2007 to May 2010 during four field campaigns. Three river end-members were necessary to explain the temporal variability of the River Besòs: one river end-member from wet periods (W1) and two from dry periods (D1 and D2). This methodology has proved useful not only for computing the mixing ratios but also for quantifying processes such as calcite and magnesite dissolution, aerobic respiration and denitrification occurring at each observation point.
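
    Step (3), evaluating the mixing ratios, can be posed as a non-negative least-squares problem in which each sample is expressed as a blend of the end-members; a schematic sketch with invented concentrations, not the study's data:

    import numpy as np
    from scipy.optimize import nnls

    # Columns = end-members (e.g., W1, D1, D2); rows = conservative species,
    # plus one row of ones to enforce that the mixing ratios sum to 1.
    endmembers = np.array([
        [1.20, 2.10, 2.90],   # species 1 concentration in each end-member
        [0.40, 0.80, 1.10],   # species 2
        [1.00, 1.00, 1.00],   # mass-balance (sum of fractions) row
    ])
    sample = np.array([2.00, 0.75, 1.00])   # observed sample, same ordering

    fractions, residual = nnls(endmembers, sample)   # non-negative mixing ratios
    print("mixing ratios:", fractions.round(2), "residual:", round(residual, 3))

    Deviations of non-conservative species from the concentrations predicted by these ratios are then attributable to reactions such as calcite dissolution or denitrification.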

  14. Precise thermal NDE for quantifying structural damage

    SciTech Connect

    Del Grande, N.K.; Durbin, P.F.

    1995-09-18

    The authors demonstrated a fast, wide-area, precise thermal NDE imaging system to quantify aircraft corrosion damage, such as percent metal loss, above a threshold of 5% with 3% overall uncertainty. The DBIR precise thermal imaging and detection method has been used successfully to characterize defect types, and their respective depths, in aircraft skins and in multi-layered composite materials used for wing patches, doublers and stiffeners. This precise thermal NDE inspection tool has long-term potential benefits for evaluating the structural integrity of airframes, pipelines and waste containers. The authors proved the feasibility of the DBIR thermal NDE imaging system for inspecting concrete and asphalt-concrete bridge decks. As a logical extension of the successful feasibility study, they plan to inspect a concrete bridge deck from a moving vehicle to quantify the volumetric damage within the deck and the percentage of the deck with subsurface delaminations. Potential near-term benefits are in-service monitoring from a moving vehicle to inspect the structural integrity of the bridge deck. This would help prioritize the repair schedule for a reported 200,000 bridge decks in the US which need substantive repairs. Potential long-term benefits are affordable and reliable rehabilitation of bridge decks.

  15. Quantifying meta-correlations in financial markets

    NASA Astrophysics Data System (ADS)

    Kenett, Dror Y.; Preis, Tobias; Gur-Gershgoren, Gitit; Ben-Jacob, Eshel

    2012-08-01

    Financial markets are modular multi-level systems, in which the relationships between the individual components are not constant in time. Sudden changes in these relationships significantly affect the stability of the entire system, and vice versa. Our analysis is based on historical daily closing prices of the 30 components of the Dow Jones Industrial Average (DJIA) from March 15th, 1939 until December 31st, 2010. We quantify the correlation among these components by determining Pearson correlation coefficients, to investigate whether the mean correlation of the entire portfolio can be used as a precursor for changes in the index return. To this end, we quantify the meta-correlation - the correlation of mean correlation and index return. We find that changes in index returns are significantly correlated with changes in mean correlation. Furthermore, we study the relationship between the index return and correlation volatility - the standard deviation of correlations for a given time interval. This parameter provides further evidence of the effect of the index on market correlations and their fluctuations. Our empirical findings provide new information on and quantification of the index leverage effect, and have implications for risk management, portfolio optimization, and the increased stability of financial markets.
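
    The meta-correlation itself is straightforward to compute from windows of daily returns; a minimal sketch in which the 22-day window, variable names, and random stand-in data are illustrative choices, not taken from the paper:

    import numpy as np
    import pandas as pd

    def meta_correlation(returns: pd.DataFrame, index_returns: pd.Series, window=22):
        """Correlation between rolling mean pairwise correlation and index return."""
        mean_corrs = []
        for end in range(window, len(returns) + 1):
            c = returns.iloc[end - window:end].corr().to_numpy()
            upper = c[np.triu_indices_from(c, k=1)]   # pairwise correlations
            mean_corrs.append(upper.mean())
        mean_corrs = pd.Series(mean_corrs, index=returns.index[window - 1:])
        return mean_corrs.corr(index_returns.loc[mean_corrs.index])

    # Hypothetical usage with random data standing in for 30 DJIA components
    rng = np.random.default_rng(0)
    idx = pd.date_range("2010-01-01", periods=250, freq="B")
    rets = pd.DataFrame(rng.standard_normal((250, 30)), index=idx)
    print(round(meta_correlation(rets, rets.mean(axis=1)), 3))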

  16. Winter wren populations show adaptation to local climate.

    PubMed

    Morrison, Catriona A; Robinson, Robert A; Pearce-Higgins, James W

    2016-06-01

    Most studies of evolutionary responses to climate change have focused on phenological responses to warming, and provide only weak evidence for evolutionary adaptation. This could be because phenological changes are more weakly linked to fitness than more direct mechanisms of climate change impacts, such as selective mortality during extreme weather events, which has immediate fitness consequences for the individuals involved. Studies examining these other mechanisms may be more likely to show evidence for evolutionary adaptation. To test this, we quantify regional population responses of a small resident passerine (winter wren Troglodytes troglodytes) to a measure of winter severity (number of frost days). Annual population growth rate was consistently negatively correlated with this measure, but the point at which different populations achieved stability (λ = 1) varied across regions and was closely correlated with the historic average number of frost days, providing strong evidence for local adaptation. Despite this, regional variation in abundance remained negatively related to the regional mean number of winter frost days, potentially as a result of a time-lag in the rate of evolutionary response to climate change. As expected from Bergmann's rule, individual wrens were heavier in colder regions, suggesting that local adaptation may be mediated through body size. However, there was no evidence for selective mortality of small individuals in cold years, with annual variation in mean body size uncorrelated with the number of winter frost days, so the extent to which local adaptation occurs through changes in body size, or through another mechanism, remains uncertain. PMID:27429782
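
    The regional stability point can be estimated by regressing log population growth rate on the number of frost days and solving for where growth is zero; a schematic sketch with synthetic numbers, not the study's data:

    import numpy as np

    # Hypothetical one-region series: annual frost days vs log population growth rate
    frost_days = np.array([10, 15, 20, 25, 30, 35, 40.0])
    log_lambda = np.array([0.20, 0.12, 0.05, -0.01, -0.08, -0.16, -0.22])

    slope, intercept = np.polyfit(frost_days, log_lambda, 1)
    frost_at_stability = -intercept / slope   # frost days where log(lambda) = 0
    print(f"population stable (lambda = 1) at ~{frost_at_stability:.1f} frost days")

    Comparing this fitted stability point across regions with each region's historic mean number of frost days is the kind of test that underlies the local-adaptation result.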

  19. Quantifying voids effecting delamination in carbon/epoxy composites: static and fatigue fracture behavior

    NASA Astrophysics Data System (ADS)

    Hakim, I.; May, D.; Abo Ras, M.; Meyendorf, N.; Donaldson, S.

    2016-04-01

    In the present work, samples of carbon fiber/epoxy composites with different void levels were fabricated using a hand layup vacuum bagging process by varying the pressure. Thermal nondestructive methods (thermal conductivity measurement, pulse thermography, pulse phase thermography and lock-in thermography) and mechanical tests (mode I and mode II interlaminar fracture toughness) were conducted. Comparing the parameters resulting from the thermal nondestructive testing revealed that voids lead to reductions in thermal properties in all directions of the composites. The results of the mode I and mode II interlaminar fracture toughness tests showed that voids lead to reductions in interlaminar fracture toughness. The parameters resulting from thermal nondestructive testing were correlated with the results of mode I and mode II interlaminar fracture toughness, and the voids were thereby quantified.

  20. Quantifying near-surface water exchange to assess hydrometeorological models

    NASA Astrophysics Data System (ADS)

    Parent, Annie-Claude; Anctil, François; Morais, Anne

    2013-04-01

    Modelling water exchange in the lower atmosphere-crop-soil system with hydrometeorological models allows estimation of actual evapotranspiration (ETa), a complex but critical quantity for numerous hydrological purposes, e.g. hydrological modelling and crop irrigation. This poster presents a summary of the hydrometeorological research activity conducted by our research group. The first purpose of this research is to quantify ETa and drainage of a rainfed potato crop located in South-Eastern Canada. Then, the outputs of the hydrometeorological models under study are compared with the observed turbulent fluxes. Afterwards, the sensitivity of the hydrometeorological models to different inputs is assessed for an environment under a changing climate. ETa was measured with micrometeorological instrumentation (CSAT3, Campbell SCI Inc.; Li7500, LiCor Inc.) and the eddy covariance technique. Near-surface soil heat flux and soil water content at different layers from 10 cm to 100 cm were also measured. Other parameters required by the hydrometeorological models were observed using standard meteorological instrumentation: shortwave and longwave solar radiation, wind speed, air temperature, atmospheric pressure and precipitation. The cumulative ETa during the growing season (123 days) was 331.5 mm, with a daily maximum of 6.5 mm at full coverage; precipitation was 350.6 mm, which is rather small compared with the historical mean (563.3 mm). This experiment allowed the calculation of crop coefficients that vary over the growing season for a rainfed potato crop. Land surface schemes such as CLASS (Canadian Land Surface Scheme) and c-ISBA (a Canadian version of the model Interaction Sol-Biosphère-Atmosphère) are 1-D physical hydrometeorological models that produce turbulent fluxes (including ETa) for a given crop. The schemes' performances were assessed for both the energy and the water balance, based on the resulting turbulent fluxes and the given observations. CLASS showed

  1. Ribosomal protein L7 as a suitable reference gene for quantifying gene expression in gastropod Bellamya aeruginosa.

    PubMed

    Liu, Qing; Lei, Kun; Ma, Qingqing; Qiao, Fei; Li, Zi-Cheng; An, Li-Hui

    2016-04-01

    Expression levels of eight candidate reference genes were quantified in tissues of the gastropod Bellamya aeruginosa exposed for 10 d to various stressors, including fasting, 17β-estradiol, 17α-methyltestosterone, and Cd(2+). The results showed that 18S rRNA was the most highly expressed of the candidate reference genes, while H2A was the least expressed. There were no significant changes (p > 0.05) in the expression of the eight genes in tissues among the different treatments. Using RefFinder to evaluate the expression stabilities of the eight candidate reference genes, ribosomal protein L7 (RPL7) was shown to be the most stable reference gene, and no effects were observed among the different stressor treatments. These results indicate that RPL7 is the most suitable reference gene for quantifying gene expression in B. aeruginosa under environmental stress, which was verified in B. aeruginosa exposed to high doses of E2 for 24 and 72 h. PMID:26991845

  2. Effect of soil structure on the growth of bacteria in soil quantified using CARD-FISH

    NASA Astrophysics Data System (ADS)

    Juyal, Archana; Eickhorst, Thilo; Falconer, Ruth; Otten, Wilfred

    2014-05-01

    It has been reported that compaction of soil due to the use of heavy machinery results in reduced crop yield. Compaction affects physical properties of soil such as bulk density, soil strength and porosity. This causes an alteration in the soil structure which limits the mobility of nutrients, water and air infiltration, and root penetration in soil. Several studies have explored the effect of soil compaction on plant growth and development. However, there is scant information on the effect of soil compaction on the microbial community and its activities in soil. Understanding the effect of soil compaction on the microbial community is essential, as microbial activities are very sensitive to abrupt environmental changes in soil. Therefore, the aim of this work was to investigate the effect of soil structure on the growth of bacteria in soil. The bulk density of soil was used as a physical parameter to quantify the effect of soil compaction. To detect and quantify bacteria in soil, the method of catalyzed reporter deposition-fluorescence in situ hybridization (CARD-FISH) was used. This technique yields high-intensity fluorescent signals, which makes it easy to quantify bacteria against the high levels of autofluorescence emitted by soil particles and organic matter. In this study, the bacterial strains Pseudomonas fluorescens SBW25 and Bacillus subtilis DSM10 were used. Soil aggregates of size 1-2 mm were packed at five different bulk densities in polyethylene rings (4.25 cm3). The soil rings were sampled on four different days. Results showed that the total number of bacterial counts was reduced significantly (P

  3. Quantifying the Anthropogenic Footprint in Eastern China.

    PubMed

    Meng, Chunlei; Dou, Youjun

    2016-01-01

    The urban heat island (UHI) is one of the main focuses of urban climate studies. Parameterization of the anthropogenic heat (AH) is crucially important in UHI studies, but a universal method to parameterize the spatial pattern of AH is still lacking. This paper uses NOAA DMSP/OLS nighttime light data to parameterize the spatial pattern of AH. Two experiments were designed and performed to quantify the influence of AH on land surface temperature (LST) in eastern China and 24 big cities. The annual mean heating caused by AH is up to 1 K in eastern China. This paper uses the relative LST differences, rather than the absolute LST differences, between the control and contrast runs of the Common Land Model (CoLM) to find the drivers. The heating effect of the anthropogenic footprint has less influence on relatively warm and wet cities. PMID:27067132
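
    A common way to spatialize a city-total AH flux with nighttime lights is to distribute it across grid cells in proportion to each cell's light intensity; the schematic sketch below follows that idea (the paper's exact scaling is not given in this abstract, and all numbers are hypothetical):

    import numpy as np

    def distribute_ah(total_ah_watts, nightlight_dn):
        """Allocate a regional AH total across grid cells in proportion to
        DMSP/OLS nighttime-light digital numbers (DN)."""
        weights = nightlight_dn / nightlight_dn.sum()
        return total_ah_watts * weights

    # Hypothetical 3x3 patch of DN values (DMSP/OLS DN ranges 0-63)
    dn = np.array([[ 5, 20, 12],
                   [30, 63, 41],
                   [ 8, 25, 10]], dtype=float)
    ah_map = distribute_ah(1.0e9, dn)   # 1 GW of city-total AH, illustrative
    print(ah_map.round(0))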

  4. Animal biometrics: quantifying and detecting phenotypic appearance.

    PubMed

    Kühl, Hjalmar S; Burghardt, Tilo

    2013-07-01

    Animal biometrics is an emerging field that develops quantified approaches for representing and detecting the phenotypic appearance of species, individuals, behaviors, and morphological traits. It operates at the intersection between pattern recognition, ecology, and information sciences, producing computerized systems for phenotypic measurement and interpretation. Animal biometrics can benefit a wide range of disciplines, including biogeography, population ecology, and behavioral research. Currently, real-world applications are gaining momentum, augmenting the quantity and quality of ecological data collection and processing. However, to advance animal biometrics will require integration of methodologies among the scientific disciplines involved. Such efforts will be worthwhile because the great potential of this approach rests with the formal abstraction of phenomics, to create tractable interfaces between different organizational levels of life.

  6. How to quantify conduits in wood?

    PubMed

    Scholz, Alexander; Klepsch, Matthias; Karimi, Zohreh; Jansen, Steven

    2013-01-01

    Vessels and tracheids represent the most important xylem cells with respect to long-distance water transport in plants. Wood anatomical studies frequently provide several quantitative details of these cells, such as vessel diameter, vessel density, vessel element length, and tracheid length, while important information on the three-dimensional structure of the hydraulic network is not considered. This paper aims to provide an overview of various techniques, although there is no standard protocol to quantify conduits due to high anatomical variation and the wide range of techniques available. Despite recent progress in image analysis programs and automated methods for measuring cell dimensions, density, and spatial distribution, various characters remain time-consuming and tedious to measure. Quantification of vessels and tracheids is not only important for better understanding functional adaptations of tracheary elements to environmental parameters, but will also be essential for linking wood anatomy with other fields such as wood development, xylem physiology, palaeobotany, and dendrochronology. PMID:23507674

  10. Quantifying creativity: can measures span the spectrum?

    PubMed

    Simonton, Dean Keith

    2012-03-01

    Because cognitive neuroscientists have become increasingly interested in the phenomenon of creativity, the issue arises of how creativity is to be optimally measured. Unlike intelligence, which can be assessed across the full range of intellectual ability, creativity measures tend to concentrate on different sections of the overall spectrum. After first defining creativity in terms of the three criteria of novelty, usefulness, and surprise, this article provides an overview of the available measures. Not only do these instruments vary according to whether they focus on the creative process, person, or product, but they differ regarding whether they tap into "little-c" versus "Big-C" creativity; only productivity and eminence measures reach into genius-level manifestations of the phenomenon. The article closes by discussing whether various alternative assessment techniques can be integrated into a single measure that quantifies creativity across the full spectrum.

  11. Techniques for quantifying instream flows for recreation

    SciTech Connect

    DiGennaro, B.

    1995-12-31

    Historically, instream flow research has focused on the protection and enhancement of fish resources. However, resource agencies and the public are increasingly focusing on broader instream flow issues, including minimum flows and scheduled releases specifically for recreational purposes. Although the concept of instream flows for recreation is not new, recent efforts have been made to better define specific techniques for quantifying instream flow needs for recreation. Because few factors have the potential to affect hydropower generation and operational flexibility as significantly as instream flows, these techniques have been of particular interest and value to the hydropower industry. This paper presents a conceptual model for river recreation as a basis for understanding the underlying principles of recreation flow studies, introduces some of the more commonly used methods, and lists the advantages and disadvantages of each technique. Particular attention is given to a specific survey-based method commonly referred to as a Controlled Flow Assessment.

  12. Quantifying International Travel Flows Using Flickr

    PubMed Central

    Barchiesi, Daniele; Moat, Helen Susannah; Alis, Christian; Bishop, Steven; Preis, Tobias

    2015-01-01

    Online social media platforms are opening up new opportunities to analyse human behaviour on an unprecedented scale. In some cases, the fast, cheap measurements of human behaviour gained from these platforms may offer an alternative to gathering such measurements using traditional, time-consuming and expensive surveys. Here, we use geotagged photographs uploaded to the photo-sharing website Flickr to quantify international travel flows, by extracting the location of users and inferring trajectories to track their movement across time. We find that Flickr-based estimates of the number of visitors to the United Kingdom significantly correlate with the official estimates released by the UK Office for National Statistics, for 28 countries for which official estimates are calculated. Our findings underline the potential for indicators of key aspects of human behaviour, such as mobility, to be generated from data attached to the vast volumes of photographs posted online. PMID:26147500

  13. Quantifying structural states of soft mudrocks

    NASA Astrophysics Data System (ADS)

    Li, B.; Wong, R. C. K.

    2016-05-01

    In this paper, a cm model is proposed to quantify structural states of soft mudrocks, which are dependent on clay fractions and porosities. Physical properties of natural and reconstituted soft mudrock samples are used to derive the two parameters in the cm model. With the cm model, a simplified homogenization approach is proposed to estimate geomechanical properties and fabric orientation distributions of soft mudrocks based on mixture theory. Soft mudrocks are treated as a mixture of nonclay minerals and clay-water composites. Nonclay minerals have a high stiffness and serve as a structural framework of mudrocks when they have a high volume fraction. Clay-water composites occupy the void space among nonclay minerals and serve as an in-fill matrix. With the increase of the volume fraction of clay-water composites, there is a transition in the structural state from framework-supported to matrix-supported. The decreases in shear strength and pore size, as well as the increases in compressibility and fabric anisotropy, are quantitatively related to this transition. The new homogenization approach based on the proposed cm model yields better performance than common effective-medium modeling approaches because the interactions among nonclay minerals and clay-water composites are considered. With wireline logging data, the cm model is applied to quantify the structural states of the Colorado shale formations at different depths in the Cold Lake area, Alberta, Canada. Key geomechanical parameters are estimated based on the proposed homogenization approach, and critical intervals with low-strength shale formations are identified.

  14. Crisis of Japanese vascular flora shown by quantifying extinction risks for 1618 taxa.

    PubMed

    Kadoya, Taku; Takenaka, Akio; Ishihama, Fumiko; Fujita, Taku; Ogawa, Makoto; Katsuyama, Teruo; Kadono, Yasuro; Kawakubo, Nobumitsu; Serizawa, Shunsuke; Takahashi, Hideki; Takamiya, Masayuki; Fujii, Shinji; Matsuda, Hiroyuki; Muneda, Kazuo; Yokota, Masatsugu; Yonekura, Koji; Yahara, Tetsukazu

    2014-01-01

    Although many people have expressed alarm that we are witnessing a mass extinction, few projections have been quantified, owing to limited availability of time-series data on threatened organisms, especially plants. To quantify the risk of extinction, we need to monitor changes in population size over time for as many species as possible. Here, we present the world's first quantitative projection of plant species loss at a national level, with stochastic simulations based on the results of population censuses of 1618 threatened plant taxa in 3574 map cells of ca. 100 km2. More than 500 lay botanists helped monitor those taxa in 1994-1995 and in 2003-2004. We projected that between 370 and 561 vascular plant taxa will go extinct in Japan during the next century if past trends of population decline continue. This extinction rate is approximately two to three times the global rate. Using time-series data, we show that existing national protected areas (PAs) covering ca. 7% of Japan will not adequately prevent population declines: even core PAs can protect at best <60% of local populations from decline. Thus, the Aichi Biodiversity Target to expand PAs to 17% of land (and inland water) areas, as committed to by many national governments, is not enough: only 29.2% of currently threatened species will become non-threatened under the assumption that probability of protection success by PAs is 0.5, which our assessment shows is realistic. In countries where volunteers can be organized to monitor threatened taxa, censuses using our method should be able to quantify how fast we are losing species and to assess how effective current conservation measures such as PAs are in preventing species extinction. PMID:24922311

  15. Quantifying uncertainty in observational rainfall datasets

    NASA Astrophysics Data System (ADS)

    Lennard, Chris; Dosio, Alessandro; Nikulin, Grigory; Pinto, Izidine; Seid, Hussen

    2015-04-01

    We assess rainfall datasets available over Africa on monthly, daily and sub-daily time scales, as appropriate, to quantify spatial and temporal differences between the datasets. We find regional wet and dry biases between datasets (using the ensemble mean as a reference), with generally larger biases in reanalysis products. Rainfall intensity is poorly represented in some datasets, which demonstrates that some datasets should not be used for rainfall intensity analyses. Using 10 CORDEX models, we show that in east Africa the spread between observed datasets is often similar to the spread between models. We recommend that specific observational rainfall datasets be used for specific investigations, and also that, where many datasets are applicable to an investigation, a probabilistic view be adopted for rainfall studies over Africa.

  16. Quantifiers More or Less Quantify On-Line: ERP Evidence for Partial Incremental Interpretation

    ERIC Educational Resources Information Center

    Urbach, Thomas P.; Kutas, Marta

    2010-01-01

    Event-related brain potentials were recorded during RSVP reading to test the hypothesis that quantifier expressions are incrementally interpreted fully and immediately. In sentences tapping general knowledge ("Farmers grow crops/worms as their primary source of income"), Experiment 1 found larger N400s for atypical ("worms") than typical objects…

  17. Quantifying Subsidence in the 1999-2000 Arctic Winter Vortex

    NASA Technical Reports Server (NTRS)

    Greenblatt, Jeffery B.; Jost, Hans-juerg; Loewenstein, Max; Podolske, James R.; Bui, T. Paul; Elkins, James W.; Moore, Fred L.; Ray, Eric A.; Sen, Bhaswar; Margitan, James J.; Hipskind, R. Stephen (Technical Monitor)

    2000-01-01

    Quantifying the subsidence of the polar winter stratospheric vortex is essential to the analysis of ozone depletion, as chemical destruction often occurs against a large, altitude-dependent background ozone concentration. Using N2O measurements made during SOLVE on a variety of platforms (ER-2, in-situ balloon and remote balloon), the 1999-2000 Arctic winter subsidence is determined from N2O-potential temperature correlations along several N2O isopleths. The subsidence rates are compared to those determined in other winters, and comparison is also made with results from the SLIMCAT stratospheric chemical transport model.

  18. Quantifying the Cosmic Web using the Shapefinder diagnostic

    NASA Astrophysics Data System (ADS)

    Sarkar, Prakash

    2016-10-01

    One of the most successful methods for quantifying the structures in the Cosmic Web is Minkowski Functionals. In 3D, there are four Minkowski Functionals: area, volume, integrated mean curvature and integrated Gaussian curvature. Defining the Minkowski Functionals requires a surface. We have developed a method based on the marching cubes 33 algorithm to generate a surface from a discrete data set. Next, we calculate the Minkowski Functionals and Shapefinders from the triangulated polyhedral surface. Applying this methodology to different data sets, we obtain interesting results related to the geometry, morphology and topology of the large-scale structure.
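
    Two of the four functionals can be read directly off a marching-cubes triangulation; a minimal sketch using scikit-image, whose marching-cubes implementation follows the Lewiner approach (itself built on the marching cubes 33 topology tables). A sphere serves as a test surface, and the curvature integrals are omitted for brevity:

    import numpy as np
    from skimage import measure

    # Hypothetical scalar field on a unit box: distance from the box center
    n = 64
    x, y, z = np.mgrid[0:1:complex(n), 0:1:complex(n), 0:1:complex(n)]
    field = np.sqrt((x - 0.5)**2 + (y - 0.5)**2 + (z - 0.5)**2)

    # Triangulate the isosurface at radius 0.3
    verts, faces, normals, values = measure.marching_cubes(
        field, level=0.3, spacing=(1/(n-1),) * 3)

    # Two of the four Minkowski functionals from the mesh / voxel grid
    area = measure.mesh_surface_area(verts, faces)
    volume = (field < 0.3).mean()   # enclosed volume as a fraction of the unit box
    print(f"area = {area:.3f} (analytic 1.131), volume = {volume:.3f} (analytic 0.113)")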

  19. Quantifying the statistical complexity of low-frequency fluctuations in semiconductor lasers with optical feedback

    SciTech Connect

    Tiana-Alsina, J.; Torrent, M. C.; Masoller, C.; Garcia-Ojalvo, J.

    2010-07-15

    Low-frequency fluctuations (LFFs) represent a dynamical instability that occurs in semiconductor lasers when they are operated near the lasing threshold and subject to moderate optical feedback. LFFs consist of sudden power dropouts followed by gradual, stepwise recoveries. We analyze experimental time series of intensity dropouts and quantify the complexity of the underlying dynamics employing two tools from information theory, namely, Shannon's entropy and the Martin, Plastino, and Rosso statistical complexity measure. These measures are computed using a method based on ordinal patterns, by which the relative length and ordering of consecutive interdropout intervals (i.e., the time intervals between consecutive intensity dropouts) are analyzed, disregarding the precise timing of the dropouts and the absolute durations of the interdropout intervals. We show that this methodology is suitable for quantifying subtle characteristics of the LFFs, and in particular the transition to fully developed chaos that takes place when the laser's pump current is increased. Our method shows that the statistical complexity of the laser does not increase continuously with the pump current, but levels off before reaching the coherence collapse regime. This behavior coincides with that of the first- and second-order correlations of the interdropout intervals, suggesting that these correlations, and not the chaotic behavior, are what determine the level of complexity of the laser's dynamics. These results hold for two different dynamical regimes, namely, sustained LFFs and coexistence between LFFs and steady-state emission.
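
    The ordinal-pattern machinery underlying both measures can be sketched via normalized permutation entropy; the full Martin-Plastino-Rosso statistical complexity additionally multiplies this entropy by a disequilibrium term, omitted here for brevity (illustrative code and stand-in data, not the authors' analysis):

    import numpy as np
    from itertools import permutations
    from math import factorial, log

    def permutation_entropy(series, order=3):
        """Normalized Shannon entropy of ordinal patterns of length `order`."""
        counts = {p: 0 for p in permutations(range(order))}
        for i in range(len(series) - order + 1):
            counts[tuple(np.argsort(series[i:i + order]))] += 1
        probs = np.array([c for c in counts.values() if c > 0], dtype=float)
        probs /= probs.sum()
        return -(probs * np.log(probs)).sum() / log(factorial(order))

    # Hypothetical interdropout-interval series (the paper uses experimental ones)
    rng = np.random.default_rng(0)
    intervals = rng.exponential(1.0, 5000)
    print(f"H = {permutation_entropy(intervals):.3f}  (near 1 for uncorrelated data)")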

  20. Quantifying nonverbal communicative behavior in face-to-face human dialogues

    NASA Astrophysics Data System (ADS)

    Skhiri, Mustapha; Cerrato, Loredana

    2002-11-01

    This study is based on the assumption that understanding how humans use nonverbal behavior in dialogues can be very useful in the design of more natural-looking animated talking heads. The goal of the study is twofold: (1) to explore how people use specific facial expressions and head movements to serve important dialogue functions, and (2) to show evidence that it is possible to measure and quantify the extent of these movements with the Qualisys MacReflex motion tracking system. Naturally elicited dialogues between humans have been analyzed, with attention focused on those nonverbal behaviors that serve the very relevant functions of regulating the conversational flow (i.e., turn taking) and producing information about the state of communication (i.e., feedback). The results show that eyebrow raising, head nods, and head shakes are typical signals involved in the exchange of speaking turns, as well as in the production and elicitation of feedback. These movements can be easily measured and quantified, and this measure can be implemented in animated talking heads.

  1. A mass-balance model to separate and quantify colloidal and solute redistributions in soil

    USGS Publications Warehouse

    Bern, C.R.; Chadwick, O.A.; Hartshorn, A.S.; Khomo, L.M.; Chorover, J.

    2011-01-01

    Studies of weathering and pedogenesis have long used calculations based upon low-solubility index elements to determine mass gains and losses in open systems. One of the questions currently unanswered in these settings is the degree to which mass is transferred in solution (solutes) versus suspension (colloids). Here we show that differential mobility of the low-solubility, high field strength (HFS) elements Ti and Zr can trace colloidal redistribution, and we present a model for distinguishing between mass transfer in suspension and solution. The model is tested on a well-differentiated granitic catena located in Kruger National Park, South Africa. Ti and Zr ratios from parent material, soil and colloidal material are substituted into a mixing equation to quantify colloidal movement. The results show zones of both colloid removal and augmentation along the catena. Colloidal losses of 110 kg m-2 (-5% relative to parent material) are calculated for one eluviated soil profile. A downslope illuviated profile has gained 169 kg m-2 (10%) of colloidal material. Elemental losses by mobilization in true solution are ubiquitous across the catena, even in zones of colloidal accumulation, and range from 1418 kg m-2 (-46%) for an eluviated profile to 195 kg m-2 (-23%) at the bottom of the catena. Quantification of simultaneous mass transfers in solution and suspension provides greater specificity on processes within soils and across hillslopes. Additionally, because colloids include both HFS and other elements, the ability to quantify their redistribution has implications for standard calculations of soil mass balances using such index elements.
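
    The core of such a model is a two-endmember balance on the Ti/Zr ratio; a schematic sketch with invented ratios (the published model also propagates full elemental inventories, which this one-liner does not attempt):

    def colloid_mass_fraction(ratio_sample, ratio_parent, ratio_colloid):
        """Fraction of the immobile-element inventory attributable to colloids,
        from a two-endmember mixing of Ti/Zr ratios."""
        return (ratio_sample - ratio_parent) / (ratio_colloid - ratio_parent)

    # Hypothetical Ti/Zr ratios: parent rock, colloidal material, bulk soil sample
    f = colloid_mass_fraction(ratio_sample=24.0, ratio_parent=20.0, ratio_colloid=45.0)
    print(f"colloid-derived fraction = {f:.2f}")   # 0.16 with these numbers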

  2. Quantifying the relative impact of climate and human activities on streamflow

    NASA Astrophysics Data System (ADS)

    Ahn, Kuk-Hyun; Merwade, Venkatesh

    2014-07-01

    The objective of this study is to quantify the roles of climate and human impacts on streamflow conditions by using historical streamflow records, in conjunction with trend analysis and hydrologic modeling. Four U.S. states (Indiana, New York, Arizona and Georgia) are used to represent various levels of human activity, based on population change, and diverse climate conditions. The Mann-Kendall trend analysis is first used to examine the magnitude of changes in precipitation, streamflow and potential evapotranspiration for the four states. Four hydrologic modeling methods, including linear regression, hydrologic simulation, annual balance, and Budyko analysis, are then used to quantify the amounts of climate and human impact on streamflow. All four methods show that, at most gauging stations in all four states, the human impact on streamflow is higher than the climate impact. Among the four methods used, the linear regression approach produced the best hydrologic output in terms of a higher Nash-Sutcliffe coefficient. The methodology used in this study is also able to correctly highlight areas with higher human impact, such as the modified channelized reaches in the northwestern part of Indiana. The results from this study show that population alone cannot capture all the changes caused by human activities in a region. However, this approach provides a starting point towards understanding the role of individual human activities in streamflow changes.
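
    Of the four methods, the Budyko analysis lends itself to the most compact illustration: expected runoff for each period is computed from the Budyko curve, and the residual change is attributed to human activity. A schematic sketch with invented annual means (the original Budyko (1974) curve is used here; the paper's exact formulation may differ):

    import math

    def budyko_runoff(p, pet):
        """Expected runoff Q = P - E, with E from the original Budyko curve:
        E/P = sqrt(phi * tanh(1/phi) * (1 - exp(-phi))), phi = PET/P."""
        phi = pet / p
        e = p * math.sqrt(phi * math.tanh(1.0 / phi) * (1.0 - math.exp(-phi)))
        return p - e

    # Hypothetical annual means (mm) for pre- and post-change periods
    q_pre_obs, q_post_obs = 320.0, 270.0
    p_pre, pet_pre = 1000.0, 900.0
    p_post, pet_post = 980.0, 950.0

    dq_total = q_post_obs - q_pre_obs
    dq_climate = budyko_runoff(p_post, pet_post) - budyko_runoff(p_pre, pet_pre)
    dq_human = dq_total - dq_climate    # residual attributed to human activities
    print(f"climate: {dq_climate:.1f} mm, human: {dq_human:.1f} mm")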

  3. Quantifying human health risks from virginiamycin used in chickens.

    PubMed

    Cox, Louis A; Popken, Douglas A

    2004-02-01

    The streptogramin antimicrobial combination Quinupristin-Dalfopristin (QD) has been used in the United States since late 1999 to treat patients with vancomycin-resistant Enterococcus faecium (VREF) infections. Another streptogramin, virginiamycin (VM), is used as a growth promoter and therapeutic agent in farm animals in the United States and other countries. Many chickens test positive for QD-resistant E. faecium, raising concern that VM use in chickens might compromise QD effectiveness against VREF infections by promoting development of QD-resistant strains that can be transferred to human patients. Despite the potential importance of this threat to human health, quantifying the risk via traditional farm-to-fork modeling has proved extremely difficult. Enough key data (mainly on microbial loads at each stage) are lacking so that such modeling amounts to little more than choosing a set of assumptions to determine the answer. Yet, regulators cannot keep waiting for more data. Patients prescribed QD are typically severely ill, immunocompromised people for whom other treatment options have not readily been available. Thus, there is a pressing need for sound risk assessment methods to inform risk management decisions for VM/QD using currently available data. This article takes a new approach to the QD-VM risk modeling challenge. Recognizing that the usual farm-to-fork ("forward chaining") approach commonly used in antimicrobial risk assessment for food animals is unlikely to produce reliable results soon enough to be useful, we instead draw on ideas from traditional fault tree analysis ("backward chaining") to reverse the farm-to-fork process and start with readily available human data on VREF case loads and QD resistance rates. Combining these data with recent genogroup frequency data for humans, chickens, and other sources (Willems et al., 2000, 2001) allows us to quantify potential human health risks from VM in chickens in both the United States and Australia, two

  4. Quantifying Uncertainties in Rainfall Maps from Cellular Communication Networks

    NASA Astrophysics Data System (ADS)

    Uijlenhoet, R.; Rios Gaona, M. F.; Overeem, A.; Leijnse, H.

    2014-12-01

    The core idea behind rainfall retrievals from commercial microwave link networks is to measure the decrease in power due to attenuation of the electromagnetic signal by raindrops along the link path. Accurate rainfall measurements are of vital importance in hydrological applications, for instance, flash-flood early-warning systems, agriculture, and climate modeling. Hence, such an alternative technique fulfills the need for measurements with higher resolution in time and space, especially in places where standard rain gauge networks are scarce or poorly maintained. Rainfall estimation via commercial microwave link networks, at country-wide scales, has recently been demonstrated. Despite their potential applicability for rainfall estimation at higher spatiotemporal resolutions, the uncertainties present in link-based rainfall maps are not yet fully understood. Here we attempt to quantify the inherent sources of uncertainty present in interpolated maps computed from commercial microwave link rainfall retrievals. In order to disentangle these sources of uncertainty, we identified four main sources of error: 1) microwave link measurements, 2) availability of microwave link measurements, 3) spatial distribution of the network, and 4) interpolation methodology. We computed more than 1000 rainfall fields for The Netherlands from real and simulated microwave link data. These rainfall fields were compared to quality-controlled gauge-adjusted radar rainfall maps considered as ground truth. Thus we were able to quantify the contribution of errors in microwave link measurements to the overall uncertainty. The actual performance of the commercial microwave link network is affected by the intermittent availability of the links, not only in time but also in space. We simulated a fully operational network in time and space, and thus quantified the role of the availability of microwave link measurements in the overall uncertainty. This research showed that the largest source of
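
    The retrieval principle rests on the power law between rain-induced specific attenuation k (dB/km) and rain rate R (mm/h), k = a R^b; a minimal sketch in which the coefficients and numbers are illustrative round values (operational a, b depend on link frequency and polarization and come from ITU tables or calibration):

    def rain_rate_from_attenuation(path_loss_db, path_length_km, baseline_db,
                                   a=0.33, b=1.1):
        """Invert k = a * R**b for the path-averaged rain rate.

        baseline_db: dry-weather signal loss, subtracted to isolate rain attenuation.
        a, b: illustrative values of roughly the right magnitude for tens-of-GHz links.
        """
        k = max(path_loss_db - baseline_db, 0.0) / path_length_km   # dB/km of rain
        return (k / a) ** (1.0 / b)

    # Example: 9 dB excess loss over a 3 km link
    print(f"R = {rain_rate_from_attenuation(9.0, 3.0, 0.0):.1f} mm/h")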

  5. A new way of quantifying diagnostic information from multilead electrocardiogram for cardiac disease classification

    PubMed Central

    Sharma, L.N.; Dandapat, S.

    2014-01-01

    A new measure for quantifying diagnostic information from a multilead electrocardiogram (MECG) is proposed. This diagnostic measure is based on principal component (PC) multivariate multiscale sample entropy (PMMSE). PC analysis is used to reduce the dimension of the MECG data matrix. The multivariate multiscale sample entropy is evaluated over the PC matrix. The PMMSE values along each scale are used as a diagnostic feature vector. The performance of the proposed measure is evaluated using a least-squares support vector machine classifier for the detection and classification of normal (healthy control) subjects and different cardiovascular diseases such as cardiomyopathy, cardiac dysrhythmia, hypertrophy and myocardial infarction. The results show that the cardiac diseases are successfully detected and classified with an average accuracy of 90.34%. Comparison with some of the recently published methods shows the improved performance of the proposed measure for cardiac disease classification. PMID:26609392
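
    The entropy engine behind PMMSE can be illustrated in univariate form: sample entropy computed on coarse-grained versions of a signal. The paper's measure is multivariate and applied to principal components of the MECG; this compact, slightly simplified single-channel sketch shows only the core entropy-and-coarse-graining machinery:

    import numpy as np

    def sample_entropy(x, m=2, r_factor=0.2):
        """SampEn(m, r): -log of the conditional probability that template
        sequences matching for m points also match for m+1 points
        (Chebyshev distance, tolerance r = r_factor * std)."""
        x = np.asarray(x, dtype=float)
        r = r_factor * x.std()
        def matched_pairs(mm):
            t = np.lib.stride_tricks.sliding_window_view(x, mm)
            d = np.max(np.abs(t[:, None] - t[None, :]), axis=-1)
            return ((d <= r).sum() - len(t)) / 2   # exclude self-matches
        b, a = matched_pairs(m), matched_pairs(m + 1)
        return -np.log(a / b)

    def coarse_grain(x, scale):
        """Non-overlapping averaging used in multiscale entropy."""
        n = (len(x) // scale) * scale
        return np.asarray(x[:n]).reshape(-1, scale).mean(axis=1)

    rng = np.random.default_rng(0)
    sig = rng.standard_normal(1000)   # stand-in for one PC time series
    print([round(sample_entropy(coarse_grain(sig, s)), 2) for s in (1, 2, 3)])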

  6. Using nitrate to quantify quick flow in a karst aquifer

    USGS Publications Warehouse

    Mahler, B.J.; Garner, B.D.

    2009-01-01

    In karst aquifers, contaminated recharge can degrade spring water quality, but quantifying the rapid recharge (quick flow) component of spring flow is challenging because of its temporal variability. Here, we investigate the use of nitrate in a two-endmember mixing model to quantify quick flow in Barton Springs, Austin, Texas. Historical nitrate data from recharging creeks and Barton Springs were evaluated to determine a representative nitrate concentration for the aquifer water endmember (1.5 mg/L) and the quick flow endmember (0.17 mg/L for nonstormflow conditions and 0.25 mg/L for stormflow conditions). Under nonstormflow conditions for 1990 to 2005, model results indicated that quick flow contributed from 0% to 55% of spring flow. The nitrate-based two-endmember model was applied to the response of Barton Springs to a storm and the results compared to those produced using the same model with δ18O and specific conductance (SC) as tracers. Additionally, the mixing model was modified to allow endmember quick flow values to vary over time. Of the three tracers, nitrate appears to be the most advantageous because it is conservative and because the difference between the concentrations in the two endmembers is large relative to their variance. The δ18O-based model was very sensitive to variability within the quick flow endmember, and SC was not conservative over the timescale of the storm response. We conclude that a nitrate-based two-endmember mixing model might provide a useful approach for quantifying the temporally variable quick flow component of spring flow in some karst systems.
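
    With the endmember concentrations reported above, the two-endmember model is a one-line mass balance; a minimal sketch (the function name is ours, the default concentrations are the nonstormflow values from the abstract):

    def quick_flow_fraction(c_spring, c_aquifer=1.5, c_quick=0.17):
        """Two-endmember nitrate mixing: fraction of spring flow that is quick flow.

        Defaults (mg/L) follow the Barton Springs nonstormflow endmembers.
        """
        return (c_aquifer - c_spring) / (c_aquifer - c_quick)

    # Example: observed spring nitrate of 1.1 mg/L -> ~30% quick flow
    print(f"{100 * quick_flow_fraction(1.1):.0f}% quick flow")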

  7. Quantifying Volume of Groundwater in High Elevation Meadows

    NASA Astrophysics Data System (ADS)

    Ciruzzi, D.; Lowry, C.

    2013-12-01

    Assessing the current and future water needs of high-elevation meadows depends on quantifying the volume of groundwater stored within the meadow sediment. As groundwater-dependent ecosystems, these meadows rely on their ability to capture and store water in order to support ecological function and base flow to streams. Previous research on these meadows simplified storage by assuming a homogeneous reservoir of constant thickness. These previous storage models were able to close the water mass balance, but it is unclear whether these assumptions will hold under future anthropogenic impacts, such as increased air temperature resulting in drier and longer growing seasons. Taking a geophysical approach, we used ground-penetrating radar at Tuolumne Meadows, CA, to qualitatively and quantitatively identify the controls on the volume of groundwater storage. From the geophysical results, a three-dimensional model of Tuolumne Meadows was created, which identified meadow thickness and bedrock geometry. This physical model was used in a suite of numerical models simulating high-elevation meadows in order to quantify the volume of groundwater stored, with temporal and spatial variability. Modeling efforts tested both wet and dry water years in order to quantify the variability in the volume of groundwater storage for a range of aquifer properties. Each model was evaluated based on the seasonal depth to water, in order to assess a particular scenario's ability to support ecological function and base flow. Depending on the simulated meadow's ability or inability to support its ecosystem, each representative meadow was categorized as successful or unsuccessful. Restoration techniques to increase active storage volume were suggested for unsuccessful meadows.

  8. Quantifying the BICEP2-Planck tension over gravitational waves.

    PubMed

    Smith, Kendrick M; Dvorkin, Cora; Boyle, Latham; Turok, Neil; Halpern, Mark; Hinshaw, Gary; Gold, Ben

    2014-07-18

    The recent BICEP2 measurement of B-mode polarization in the cosmic microwave background (r = 0.2, +0.07/-0.05), a possible indication of primordial gravity waves, appears to be in tension with the upper limit from WMAP (r < 0.13 at 95% C.L.) and Planck (r < 0.11 at 95% C.L.). We carefully quantify the level of tension and show that it is very significant (around 0.1% unlikely) when the observed deficit of large-scale temperature power is taken into account. We show that measurements of TE and EE power spectra in the near future will discriminate between the hypotheses that this tension is either a statistical fluke or a sign of new physics. We also discuss extensions of the standard cosmological model that relieve the tension and some novel ways to constrain them. PMID:25083631

  9. Quantifying the BICEP2-Planck tension over gravitational waves.

    PubMed

    Smith, Kendrick M; Dvorkin, Cora; Boyle, Latham; Turok, Neil; Halpern, Mark; Hinshaw, Gary; Gold, Ben

    2014-07-18

    The recent BICEP2 measurement of B-mode polarization in the cosmic microwave background (r = 0.2, +0.07/-0.05), a possible indication of primordial gravity waves, appears to be in tension with the upper limit from WMAP (r < 0.13 at 95% C.L.) and Planck (r < 0.11 at 95% C.L.). We carefully quantify the level of tension and show that it is very significant (around 0.1% unlikely) when the observed deficit of large-scale temperature power is taken into account. We show that measurements of TE and EE power spectra in the near future will discriminate between the hypotheses that this tension is either a statistical fluke or a sign of new physics. We also discuss extensions of the standard cosmological model that relieve the tension and some novel ways to constrain them.

  10. Quantifying the complexity of the delayed logistic map.

    PubMed

    Masoller, Cristina; Rosso, Osvaldo A

    2011-01-28

    Statistical complexity measures are used to quantify the degree of complexity of the delayed logistic map, with linear and nonlinear feedback. We employ two methods for calculating the complexity measures, one using the 'histogram-based' probability distribution function and the other using ordinal patterns. We show that these methods provide complementary information about the complexity of the delay-induced dynamics: there are parameter regions where the histogram-based complexity is zero while the ordinal pattern complexity is not, and vice versa. We also show that the time series generated from the nonlinear delayed logistic map can present zero missing or forbidden patterns, i.e., all possible ordinal patterns are realized in its orbits.
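
    As a sketch of the ordinal-pattern side of this analysis, the code below iterates the classic delayed logistic map x[n+1] = r*x[n]*(1 - x[n-1]) and computes the normalized permutation entropy along with the count of missing (forbidden) ordinal patterns. The map form, parameter values and pattern order are assumptions, and the full statistical complexity measure (which also requires a disequilibrium term) is not computed here:

      import itertools
      import math

      def delayed_logistic(r=2.1, tau=1, n=20000, burn=1000):
          # x[n+1] = r * x[n] * (1 - x[n - tau]); tau=1 is the classic delayed map.
          x = [0.5] * (tau + 1)
          for _ in range(n + burn):
              x.append(r * x[-1] * (1.0 - x[-1 - tau]))
          return x[burn:]

      def ordinal_pattern_stats(series, D=4):
          # Count the D! ordinal patterns over sliding windows of length D.
          counts = {p: 0 for p in itertools.permutations(range(D))}
          for i in range(len(series) - D + 1):
              window = series[i:i + D]
              counts[tuple(sorted(range(D), key=lambda k: window[k]))] += 1
          total = sum(counts.values())
          probs = [c / total for c in counts.values() if c > 0]
          entropy = -sum(p * math.log(p) for p in probs) / math.log(math.factorial(D))
          missing = sum(1 for c in counts.values() if c == 0)
          return entropy, missing

      h, forbidden = ordinal_pattern_stats(delayed_logistic())
      print(f"normalized permutation entropy = {h:.3f}, missing patterns = {forbidden}")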

  11. Quantifying Nitrogen Loss From Flooded Hawaiian Taro Fields

    NASA Astrophysics Data System (ADS)

    Deenik, J. L.; Penton, C. R.; Bruland, G. L.; Popp, B. N.; Engstrom, P.; Mueller, J. A.; Tiedje, J.

    2010-12-01

    In 2004 a field fertilization experiment showed that approximately 80% of the fertilizer nitrogen (N) added to flooded Hawaiian taro (Colocasia esculenta) fields could not be accounted for using classic N balance calculations. To quantify N loss through denitrification and anaerobic ammonium oxidation (anammox) pathways in these taro systems we utilized a slurry-based isotope pairing technique (IPT). Measured nitrification rates and porewater N profiles were also used to model ammonium and nitrate fluxes through the top 10 cm of soil. Quantitative PCR of nitrogen cycling functional genes was used to correlate porewater N dynamics with potential microbial activity. Rates of denitrification calculated using porewater profiles were compared to those obtained using the slurry method. Potential denitrification rates of surficial sediments obtained with the slurry method were found to drastically overestimate the calculated in-situ rates. The largest discrepancies were present in fields greater than one month after initial fertilization, reflecting a microbial community poised to denitrify the initial N pulse. Potential surficial nitrification rates varied between 1.3% of the slurry-measured denitrification potential in a heavily-fertilized site to 100% in an unfertilized site. Compared to the use of urea, fish bone meal fertilizer use resulted in decreased N loss through denitrification in the surface sediment, according to both porewater modeling and IPT measurements. In addition, sub-surface porewater profiles point to root-mediated coupled nitrification/denitrification as a potential N loss pathway that is not captured in surface-based incubations. Profile-based surface plus subsurface coupled nitrification/denitrification estimates were between 1.1 and 12.7 times denitrification estimates from the surface only. These results suggest that the use of a ‘classic’ isotope pairing technique that employs 15NO3- in fertilized agricultural systems can lead to a drastic overestimation of in situ denitrification rates.

  12. Quantifying the role of mitigation hills in reducing tsunami runup

    NASA Astrophysics Data System (ADS)

    Marras, S.; Suckale, J.; Lunghino, B.; Giraldo, F.; Hood, K. M.

    2015-12-01

    Coastal communities around the world are being encouraged to plant or restore vegetation along their shores for the purpose of mitigating tsunami damage. A common setup for these projects is to develop 'mitigation hills' - an ensemble of vegetated hills along the coast - instead of one continuous stretch of vegetation. The rationale behind a staggered-hill setup is to give tree roots more space to grow and deepen. From a fluid-dynamical point of view, however, staggered mitigation hills may have significant drawbacks such as diverting the flow into the low-lying areas of the park, which could entail strong currents in the narrow channels between the hills and lead to erosion of the hills from the sides. The goal of this study is to use numerical simulations to quantify how mitigation hills affect tsunami runup and to constrain hill designs that actually mitigate tsunami damage. Our computations of tsunami runup are based on the non-linear shallow water equations solved through a fully implicit, high-order, discontinuous Galerkin method. The adaptive computational grid is fitted to the hill topography to capture geometric effects accurately. A new dynamic subgrid-scale eddy viscosity originally designed for large eddy simulation of compressible flows is used for stabilization and to capture the obstacle-generated turbulence. We have carefully benchmarked our model in 1D and 2D against classical test cases. The included figure shows an example run of tsunami runup through coastal mitigation hills. In the interest of providing generalizable results, we perform a detailed scaling analysis of our model runs. We find that the protective value of mitigation hills depends sensitively on the non-linearity of the incoming wave and the relative height of the wave to the hills. Our simulations also suggest that the assumed initial condition is consequential, and we hence consider a range of incoming waves, from a simple soliton to a more realistic N-wave.

  13. Quantifying alosine prey in the diets of marine piscivores in the Gulf of Maine.

    PubMed

    McDermott, S P; Bransome, N C; Sutton, S E; Smith, B E; Link, J S; Miller, T J

    2015-06-01

    The objectives of this work were to quantify the spatial and temporal distribution of the occurrence of anadromous fishes (alewife Alosa pseudoharengus, blueback herring Alosa aestivalis and American shad Alosa sapidissima) in the stomachs of demersal fishes in coastal waters of the north-west Atlantic Ocean. Results show that anadromous fishes were detectable and quantifiable in the diets of common marine piscivores for every season sampled. Even though anadromous fishes were not the most abundant prey, they accounted for c. 5-10% of the diet by mass for several marine piscivores. Statistical comparisons of these data with fish diet data from a broad-scale survey of the north-west Atlantic Ocean indicate that the frequency of this trophic interaction was significantly higher within spatially and temporally focused sampling areas of this study than in the broad-scale survey. Odds ratios of anadromous predation were as much as 460 times higher in the targeted sampling as compared with the broad-scale sampling. Analyses indicate that anadromous prey consumption was more concentrated in the near-coastal waters compared with consumption of a similar, but more widely distributed species, the Atlantic herring Clupea harengus. In the context of ecosystem-based fisheries management, the results suggest that even low-frequency feeding events may be locally important, and should be incorporated into ecosystem models.
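
    The odds-ratio comparison reduces to a 2x2 contingency calculation (a minimal sketch; the stomach counts below are invented to reproduce an odds ratio near the reported upper end):

      def odds_ratio(a, b, c, d):
          """(a/b) / (c/d): odds of anadromous prey presence in targeted-sampling
          stomachs (a with prey, b without) relative to broad-scale-survey
          stomachs (c with prey, d without)."""
          return (a / b) / (c / d)

      # Hypothetical counts: 46/146 targeted stomachs vs. 1/1000 broad-scale stomachs.
      print(round(odds_ratio(46, 100, 1, 999)))  # ~460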

  14. Quantifying the Robustness of the English Sibilant Fricative Contrast in Children

    PubMed Central

    Reidy, Patrick F.; Beckman, Mary E.; Edwards, Jan

    2015-01-01

    Purpose Four measures of children's developing robustness of phonological contrast were compared to see how they correlated with age, vocabulary size, and adult listeners' correctness ratings. Method Word-initial sibilant fricative productions from eighty-one 2- to 5-year-old children and 20 adults were phonetically transcribed and acoustically analyzed. Four measures of robustness of contrast were calculated for each speaker on the basis of the centroid frequency measured from each fricative token. Productions that were transcribed as correct from different children were then used as stimuli in a perception experiment in which adult listeners rated the goodness of each production. Results Results showed that the degree of category overlap, quantified as the percentage of a child's productions whose category could be correctly predicted from the output of a mixed-effects logistic regression model, was the measure that correlated best with listeners' goodness judgments. Conclusions Even when children's productions have been transcribed as correct, adult listeners are sensitive to within-category variation quantified by the child's degree of category overlap. Further research is needed to explore the relationship between the age of a child and adults' sensitivity to different types of within-category variation in children's speech. PMID:25766040
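
    A sketch of the category-overlap measure follows. The study fit mixed-effects logistic regressions; here a plain logistic regression on invented centroid data stands in, with classification accuracy playing the role of the percentage of productions whose category is correctly predicted:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      # Invented centroid frequencies (Hz): /s/ tokens sit higher than "sh" tokens.
      s_tokens = rng.normal(7500, 600, 40)
      sh_tokens = rng.normal(4500, 600, 40)
      X = np.concatenate([s_tokens, sh_tokens]).reshape(-1, 1)
      y = np.array([1] * 40 + [0] * 40)  # 1 = /s/, 0 = "sh"

      model = LogisticRegression().fit(X, y)
      accuracy = model.score(X, y)  # share of tokens whose category is predicted
      print(f"robustness of contrast: {accuracy:.1%} correctly classified")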

  15. Quantifying Variability of Avian Colours: Are Signalling Traits More Variable?

    PubMed Central

    Delhey, Kaspar; Peters, Anne

    2008-01-01

    Background Increased variability in sexually selected ornaments, a key assumption of evolutionary theory, is thought to be maintained through condition-dependence. Condition-dependent handicap models of sexual selection predict that (a) sexually selected traits show amplified variability compared to equivalent non-sexually selected traits, and since males are usually the sexually selected sex, that (b) males are more variable than females, and (c) sexually dimorphic traits are more variable than monomorphic ones. So far these predictions have only been tested for metric traits. Surprisingly, they have not been examined for bright coloration, one of the most prominent sexual traits. This omission stems from computational difficulties: different types of colours are quantified on different scales precluding the use of coefficients of variation. Methodology/Principal Findings Based on physiological models of avian colour vision we develop an index to quantify the degree of discriminable colour variation as it can be perceived by conspecifics. A comparison of variability in ornamental and non-ornamental colours in six bird species confirmed (a) that those coloured patches that are sexually selected or act as indicators of quality show increased chromatic variability. However, we found no support for (b) that males generally show higher levels of variability than females, or (c) that sexual dichromatism per se is associated with increased variability. Conclusions/Significance We show that it is currently possible to realistically estimate variability of animal colours as perceived by them, something difficult to achieve with other traits. Increased variability of known sexually-selected/quality-indicating colours in the studied species provides support to the predictions borne from sexual selection theory but the lack of increased overall variability in males or dimorphic colours in general indicates that sexual differences might not always be shaped by similar selective pressures.

  16. Automated Counting of Particles To Quantify Cleanliness

    NASA Technical Reports Server (NTRS)

    Rhode, James

    2005-01-01

    A machine vision system, similar to systems used in microbiological laboratories to count cultured microbes, has been proposed for quantifying the cleanliness of nominally precisely cleaned hardware by counting residual contaminant particles. The system would include a microscope equipped with an electronic camera and circuitry to digitize the camera output, a personal computer programmed with machine-vision and interface software, and digital storage media. A filter pad, through which had been aspirated solvent from rinsing the hardware in question, would be placed on the microscope stage. A high-resolution image of the filter pad would be recorded. The computer would analyze the image and present a histogram of sizes of particles on the filter. On the basis of the histogram and a measure of the desired level of cleanliness, the hardware would be accepted or rejected. If the hardware were accepted, the image would be saved, along with other information, as a quality record. If the hardware were rejected, the histogram and ancillary information would be recorded for analysis of trends. The software would perceive particles that are too large or too numerous to meet a specified particle-distribution profile. Anomalous particles or fibrous material would be flagged for inspection.
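
    A sketch of the proposed counting pipeline (the file name, thresholding choice and acceptance limits are placeholders, not part of the proposal):

      import cv2
      import numpy as np

      # Dark particles on a bright filter pad: invert-threshold, then label blobs.
      image = cv2.imread("filter_pad.png", cv2.IMREAD_GRAYSCALE)  # placeholder file
      _, binary = cv2.threshold(image, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
      n_labels, _, stats, _ = cv2.connectedComponentsWithStats(binary)

      areas = stats[1:, cv2.CC_STAT_AREA]  # label 0 is the background
      hist, edges = np.histogram(areas, bins=[1, 5, 25, 100, 500, 10000])
      for lo, hi, count in zip(edges[:-1], edges[1:], hist):
          print(f"{lo:>5}-{hi:<5} px^2: {count} particles")

      # Accept or reject against a specified particle-distribution profile.
      accept = hist[-1] == 0 and areas.size < 200  # illustrative limits only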

  17. Quantifying the evolutionary dynamics of language.

    PubMed

    Lieberman, Erez; Michel, Jean-Baptiste; Jackson, Joe; Tang, Tina; Nowak, Martin A

    2007-10-11

    Human language is based on grammatical rules. Cultural evolution allows these rules to change over time. Rules compete with each other: as new rules rise to prominence, old ones die away. To quantify the dynamics of language evolution, we studied the regularization of English verbs over the past 1,200 years. Although an elaborate system of productive conjugations existed in English's proto-Germanic ancestor, Modern English uses the dental suffix, '-ed', to signify past tense. Here we describe the emergence of this linguistic rule amidst the evolutionary decay of its exceptions, known to us as irregular verbs. We have generated a data set of verbs whose conjugations have been evolving for more than a millennium, tracking inflectional changes to 177 Old-English irregular verbs. Of these irregular verbs, 145 remained irregular in Middle English and 98 are still irregular today. We study how the rate of regularization depends on the frequency of word usage. The half-life of an irregular verb scales as the square root of its usage frequency: a verb that is 100 times less frequent regularizes 10 times as fast. Our study provides a quantitative analysis of the regularization process by which ancestral forms gradually yield to an emerging linguistic rule.
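
    The reported scaling can be written down directly (a sketch; the proportionality constant k is a free parameter, not a value from the study):

      import math

      def half_life(usage_frequency, k=300.0):
          """Half-life of an irregular verb, proportional to the square root of
          its usage frequency (arbitrary units; k is illustrative)."""
          return k * math.sqrt(usage_frequency)

      # A verb 100 times less frequent regularizes 10 times as fast:
      print(half_life(100.0) / half_life(1.0))  # -> 10.0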

  18. Quantifying the Magnetic Advantage in Magnetotaxis

    PubMed Central

    Smith, M. J.; Sheehan, P. E.; Perry, L. L.; O'Connor, K.; Csonka, L. N.; Applegate, B. M.; Whitman, L. J.

    2006-01-01

    Magnetotactic bacteria are characterized by the production of magnetosomes, nanoscale particles of lipid bilayer encapsulated magnetite, that act to orient the bacteria in magnetic fields. These magnetosomes allow magneto-aerotaxis, which is the motion of the bacteria along a magnetic field and toward preferred concentrations of oxygen. Magneto-aerotaxis has been shown to direct the motion of these bacteria downward toward sediments and microaerobic environments favorable for growth. Herein, we compare the magneto-aerotaxis of wild-type, magnetic Magnetospirillum magneticum AMB-1 with a nonmagnetic mutant we have engineered. Using an applied magnetic field and an advancing oxygen gradient, we have quantified the magnetic advantage in magneto-aerotaxis as a more rapid migration to preferred oxygen levels. Magnetic, wild-type cells swimming in an applied magnetic field more quickly migrate away from the advancing oxygen than either wild-type cells in a zero field or the nonmagnetic cells in any field. We find that the responses of the magnetic and mutant strains are well described by a relatively simple analytical model, an analysis of which indicates that the key benefit of magnetotaxis is an enhancement of a bacterium's ability to detect oxygen, not an increase in its average speed moving away from high oxygen concentrations. PMID:16714352

  19. Quantifying and Mapping Global Data Poverty.

    PubMed

    Leidig, Mathias; Teeuw, Richard M

    2015-01-01

    Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. However, access to these technologies, as well as their associated software and training materials, is not evenly distributed: since the 1990s there has been concern about a "Digital Divide" between the data-rich and the data-poor. We present an innovative metric for evaluating international variations in access to digital data: the Data Poverty Index (DPI). The DPI is based on Internet speeds, numbers of computer owners and Internet users, mobile phone ownership and network coverage, as well as provision of higher education. The datasets used to produce the DPI are provided annually for almost all the countries of the world and can be freely downloaded. The index that we present in this 'proof of concept' study is the first to quantify and visualise the problem of global data poverty, using the most recent datasets, for 2013. The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. The DPI highlights countries where support is needed for improving access to the Internet and for the provision of training in geoinformatics. We conclude that the DPI is of value as a potential metric for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction. PMID:26560884
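
    A composite index of this kind might be assembled as follows (an illustrative sketch only; the DPI's actual indicator normalization and weighting are not specified in the abstract):

      import numpy as np

      def composite_index(indicators):
          """Min-max normalize each indicator across countries, then average
          with equal weights; returns one score per country."""
          matrix = np.array(list(indicators.values()), dtype=float)  # rows = indicators
          lo = matrix.min(axis=1, keepdims=True)
          hi = matrix.max(axis=1, keepdims=True)
          return ((matrix - lo) / (hi - lo)).mean(axis=0)

      # Invented values for three countries; higher means better access here.
      scores = composite_index({
          "internet_speed_mbps": [55.0, 5.0, 1.0],
          "internet_users_pct": [95.0, 40.0, 10.0],
          "mobile_coverage_pct": [99.0, 85.0, 50.0],
      })
      print(scores)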

  20. Quantifying the evolutionary dynamics of language

    PubMed Central

    Lieberman, Erez; Michel, Jean-Baptiste; Jackson, Joe; Tang, Tina; Nowak, Martin A.

    2008-01-01

    Human language is based on grammatical rules. Cultural evolution allows these rules to change over time. Rules compete with each other: as new rules rise to prominence, old ones die away. To quantify the dynamics of language evolution, we studied the regularization of English verbs over the last 1200 years. Although an elaborate system of productive conjugations existed in English’s proto-Germanic ancestor, modern English uses the dental suffix, -ed, to signify past tense. Here, we describe the emergence of this linguistic rule amidst the evolutionary decay of its exceptions, known to us as irregular verbs. We have generated a dataset of verbs whose conjugations have been evolving for over a millennium, tracking inflectional changes to 177 Old English irregulars. Of these irregulars, 145 remained irregular in Middle English and 98 are still irregular today. We study how the rate of regularization depends on the frequency of word usage. The half-life of an irregular verb scales as the square root of its usage frequency: a verb that is 100 times less frequent regularizes 10 times as fast. Our study provides a quantitative analysis of the regularization process by which ancestral forms gradually yield to an emerging linguistic rule. PMID:17928859

  1. Concurrent schedules: Quantifying the aversiveness of noise

    PubMed Central

    McAdie, Tina M.; Foster, T. Mary; Temple, William

    1996-01-01

    Four hens worked under independent multiple concurrent variable-interval schedules with an overlaid aversive stimulus (sound of hens in a poultry shed at 100 dBA) activated by the first peck on a key. The sound remained on until a response was made on the other key. The key that activated the sound in each component was varied over a series of conditions. When the sound was activated by the left (or right) key in one component, it was activated by the right (or left) key in the other component. Bias was examined under a range of different variable-interval schedules, and the applicability of the generalized matching law was examined. It was found that the hens' behavior was biased away from the sound independently of the schedule in effect and that this bias could be quantified using a modified version of the generalized matching law. Behavior during the changeover delays was not affected by the presence of the noise or by changes in reinforcement rate, even though the total response measures were. Insensitivity shown during the delay suggests that behavior after the changeover delay may be more appropriate as a measure of preference (or aversiveness) of stimuli than are overall behavior measures. PMID:16812802
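
    The quantification rests on the generalized matching law, log(B1/B2) = a*log(R1/R2) + log c, where log c captures bias (a sketch with invented data; a negative log c here would indicate bias away from the noise-producing key):

      import numpy as np

      log_R = np.log([0.25, 0.5, 1.0, 2.0, 4.0])      # obtained reinforcement ratios
      log_B = np.log([0.10, 0.22, 0.41, 0.85, 1.70])  # response ratios

      a, log_c = np.polyfit(log_R, log_B, 1)  # slope = sensitivity, intercept = bias
      print(f"sensitivity a = {a:.2f}, bias log c = {log_c:.2f}")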

  2. Choosing appropriate techniques for quantifying groundwater recharge

    USGS Publications Warehouse

    Scanlon, B.R.; Healy, R.W.; Cook, P.G.

    2002-01-01

    Various techniques are available to quantify recharge; however, choosing appropriate techniques is often difficult. Important considerations in choosing a technique include space/time scales, range, and reliability of recharge estimates based on different techniques; other factors may limit the application of particular techniques. The goal of the recharge study is important because it may dictate the required space/time scales of the recharge estimates. Typical study goals include water-resource evaluation, which requires information on recharge over large spatial scales and on decadal time scales; and evaluation of aquifer vulnerability to contamination, which requires detailed information on spatial variability and preferential flow. The range of recharge rates that can be estimated using different approaches should be matched to expected recharge rates at a site. The reliability of recharge estimates using different techniques is variable. Techniques based on surface-water and unsaturated-zone data provide estimates of potential recharge, whereas those based on groundwater data generally provide estimates of actual recharge. Uncertainties in each approach to estimating recharge underscore the need for application of multiple techniques to increase reliability of recharge estimates.
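
    As one concrete example of a groundwater-based technique of the kind surveyed (a sketch; the water-table fluctuation method and the input values are illustrative choices, not prescribed by the text):

      def recharge_wtf(specific_yield, water_table_rise_m):
          """Water-table fluctuation method: R = Sy * dH, the recharge depth
          implied by a water-table rise attributed to a recharge event."""
          return specific_yield * water_table_rise_m

      print(recharge_wtf(0.15, 0.40))  # 0.06 m of recharge for a 0.40 m rise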

  3. Quantifying truncation errors in effective field theory

    NASA Astrophysics Data System (ADS)

    Furnstahl, R. J.; Klco, N.; Phillips, D. R.; Wesolowski, S.

    2015-10-01

    Bayesian procedures designed to quantify truncation errors in perturbative calculations of QCD observables are adapted to expansions in effective field theory (EFT). In the Bayesian approach, such truncation errors are derived from degree-of-belief (DOB) intervals for EFT predictions. Computation of these intervals requires specification of prior probability distributions (``priors'') for the expansion coefficients. By encoding expectations about the naturalness of these coefficients, this framework provides a statistical interpretation of the standard EFT procedure where truncation errors are estimated using the order-by-order convergence of the expansion. It also permits exploration of the ways in which such error bars are, and are not, sensitive to assumptions about EFT-coefficient naturalness. We demonstrate the calculation of Bayesian DOB intervals for the EFT truncation error in some representative cases and explore several methods by which the convergence properties of the EFT for a set of observables may be used to check the statistical consistency of the EFT expansion parameter. Supported in part by the NSF and the DOE.
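
    A Monte Carlo sketch of a DOB interval for the first omitted term, delta ~ c_{k+1} * Q**(k+1), assuming a Gaussian naturalness prior whose scale is estimated from the observed coefficients (a simplified stand-in, not the paper's exact prior set):

      import numpy as np

      rng = np.random.default_rng(1)
      observed_c = np.array([1.0, -0.6, 1.3, 0.4])  # invented low-order coefficients
      Q, k = 0.3, 3                                 # expansion parameter, last order kept

      cbar = np.sqrt(np.mean(observed_c**2))        # crude naturalness scale
      delta = rng.normal(0.0, cbar, 100_000) * Q**(k + 1)
      lo, hi = np.percentile(delta, [16, 84])       # 68% degree-of-belief interval
      print(f"68% DOB interval for the truncation error: [{lo:.4f}, {hi:.4f}]")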

  4. Quantifying and Mapping Global Data Poverty

    PubMed Central

    2015-01-01

    Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. However, access to these technologies, as well as their associated software and training materials, is not evenly distributed: since the 1990s there has been concern about a "Digital Divide" between the data-rich and the data-poor. We present an innovative metric for evaluating international variations in access to digital data: the Data Poverty Index (DPI). The DPI is based on Internet speeds, numbers of computer owners and Internet users, mobile phone ownership and network coverage, as well as provision of higher education. The datasets used to produce the DPI are provided annually for almost all the countries of the world and can be freely downloaded. The index that we present in this ‘proof of concept’ study is the first to quantify and visualise the problem of global data poverty, using the most recent datasets, for 2013. The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. The DPI highlights countries where support is needed for improving access to the Internet and for the provision of training in geoinformatics. We conclude that the DPI is of value as a potential metric for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction. PMID:26560884

  5. Quantifying the transmission potential of pandemic influenza

    NASA Astrophysics Data System (ADS)

    Chowell, Gerardo; Nishiura, Hiroshi

    2008-03-01

    This article reviews quantitative methods to estimate the basic reproduction number of pandemic influenza, a key threshold quantity to help determine the intensity of interventions required to control the disease. Although it is difficult to assess the transmission potential of a probable future pandemic, historical epidemiologic data is readily available from previous pandemics, and as a reference quantity for future pandemic planning, mathematical and statistical analyses of historical data are crucial. In particular, because many historical records tend to document only the temporal distribution of cases or deaths (i.e. epidemic curve), our review focuses on methods to maximize the utility of time-evolution data and to clarify the detailed mechanisms of the spread of influenza. First, we highlight structured epidemic models and their parameter estimation method which can quantify the detailed disease dynamics including those we cannot observe directly. Duration-structured epidemic systems are subsequently presented, offering firm understanding of the definition of the basic and effective reproduction numbers. When the initial growth phase of an epidemic is investigated, the distribution of the generation time is key statistical information to appropriately estimate the transmission potential using the intrinsic growth rate. Applications of stochastic processes are also highlighted to estimate the transmission potential using similar data. Critically important characteristics of influenza data are subsequently summarized, followed by our conclusions to suggest potential future methodological improvements.
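
    For example, the intrinsic growth rate r and the generation-time distribution combine through the Euler-Lotka relation 1/R0 = E[exp(-r*T)] (a sketch; the growth rate and gamma-distributed generation time are illustrative values):

      import numpy as np
      from scipy import stats

      r = 0.15                                   # per-day exponential growth rate
      gen_time = stats.gamma(a=4.0, scale=0.75)  # generation interval, mean 3 days

      t = np.linspace(0.0, 30.0, 3001)
      discounted = np.exp(-r * t) * gen_time.pdf(t)
      R0 = 1.0 / np.trapz(discounted, t)
      print(f"R0 = {R0:.2f}")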

  6. Data Used in Quantified Reliability Models

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Kleinhammer, Roger K.; Kahn, C. J.

    2014-01-01

    Data is the crux to developing quantitative risk and reliability models; without data there is no quantification. The means to find and identify reliability data or failure numbers to quantify fault tree models during conceptual and design phases is often the quagmire that precludes early decision makers' consideration of potential risk drivers that will influence design. The analyst tasked with addressing system or product reliability depends on the availability of data. But where does that data come from, and what does it really apply to? Commercial industries, government agencies, and other international sources might have available data similar to what you are looking for. In general, internal and external technical reports and data based on similar and dissimilar equipment are often the first and only places checked. A common philosophy is "I have a number - that is good enough". But, is it? Have you ever considered the difference in reported data from various federal datasets and technical reports when compared to similar sources from national and/or international datasets? Just how well does your data compare? Understanding how the reported data was derived, and interpreting the information and details associated with the data, is as important as the data itself.

  7. Quantifying and Mapping Global Data Poverty.

    PubMed

    Leidig, Mathias; Teeuw, Richard M

    2015-01-01

    Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. However, access to these technologies, as well as their associated software and training materials, is not evenly distributed: since the 1990s there has been concern about a "Digital Divide" between the data-rich and the data-poor. We present an innovative metric for evaluating international variations in access to digital data: the Data Poverty Index (DPI). The DPI is based on Internet speeds, numbers of computer owners and Internet users, mobile phone ownership and network coverage, as well as provision of higher education. The datasets used to produce the DPI are provided annually for almost all the countries of the world and can be freely downloaded. The index that we present in this 'proof of concept' study is the first to quantify and visualise the problem of global data poverty, using the most recent datasets, for 2013. The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. The DPI highlights countries where support is needed for improving access to the Internet and for the provision of training in geoinformatics. We conclude that the DPI is of value as a potential metric for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction.

  8. A revised metric for quantifying body shape in vertebrates.

    PubMed

    Collar, David C; Reynaga, Crystal M; Ward, Andrea B; Mehta, Rita S

    2013-08-01

    Vertebrates exhibit tremendous diversity in body shape, though quantifying this variation has been challenging. In the past, researchers have used simplified metrics that either describe overall shape but reveal little about its anatomical basis or that characterize only a subset of the morphological features that contribute to shape variation. Here, we present a revised metric of body shape, the vertebrate shape index (VSI), which combines the four primary morphological components that lead to shape diversity in vertebrates: head shape, length of the second major body axis (depth or width), and shape of the precaudal and caudal regions of the vertebral column. We illustrate the usefulness of VSI on a data set of 194 species, primarily representing five major vertebrate clades: Actinopterygii, Lissamphibia, Squamata, Aves, and Mammalia. We quantify VSI diversity within each of these clades and, in the course of doing so, show how measurements of the morphological components of VSI can be obtained from radiographs, articulated skeletons, and cleared and stained specimens. We also demonstrate that head shape, secondary body axis, and vertebral characteristics are important independent contributors to body shape diversity, though their importance varies across vertebrate groups. Finally, we present a functional application of VSI to test a hypothesized relationship between body shape and the degree of axial bending associated with locomotor modes in ray-finned fishes. Altogether, our study highlights the promise VSI holds for identifying the morphological variation underlying body shape diversity as well as the selective factors driving shape evolution.

  9. A revised metric for quantifying body shape in vertebrates.

    PubMed

    Collar, David C; Reynaga, Crystal M; Ward, Andrea B; Mehta, Rita S

    2013-08-01

    Vertebrates exhibit tremendous diversity in body shape, though quantifying this variation has been challenging. In the past, researchers have used simplified metrics that either describe overall shape but reveal little about its anatomical basis or that characterize only a subset of the morphological features that contribute to shape variation. Here, we present a revised metric of body shape, the vertebrate shape index (VSI), which combines the four primary morphological components that lead to shape diversity in vertebrates: head shape, length of the second major body axis (depth or width), and shape of the precaudal and caudal regions of the vertebral column. We illustrate the usefulness of VSI on a data set of 194 species, primarily representing five major vertebrate clades: Actinopterygii, Lissamphibia, Squamata, Aves, and Mammalia. We quantify VSI diversity within each of these clades and, in the course of doing so, show how measurements of the morphological components of VSI can be obtained from radiographs, articulated skeletons, and cleared and stained specimens. We also demonstrate that head shape, secondary body axis, and vertebral characteristics are important independent contributors to body shape diversity, though their importance varies across vertebrate groups. Finally, we present a functional application of VSI to test a hypothesized relationship between body shape and the degree of axial bending associated with locomotor modes in ray-finned fishes. Altogether, our study highlights the promise VSI holds for identifying the morphological variation underlying body shape diversity as well as the selective factors driving shape evolution. PMID:23746908

  10. Using multiscale norms to quantify mixing and transport

    NASA Astrophysics Data System (ADS)

    Thiffeault, Jean-Luc

    2012-02-01

    Mixing is relevant to many areas of science and engineering, including the pharmaceutical and food industries, oceanography, atmospheric sciences and civil engineering. In all these situations one goal is to quantify and often then to improve the degree of homogenization of a substance being stirred, referred to as a passive scalar or tracer. A classical measure of mixing is the variance of the concentration of the scalar, which is the L2 norm of a mean-zero concentration field. Recently, other norms have been used to quantify mixing, in particular the mix-norm as well as negative Sobolev norms. These norms have the advantage that unlike variance they decay even in the absence of diffusion, and their decay corresponds to the flow being mixing in the sense of ergodic theory. General Sobolev norms weigh scalar gradients differently, and are known as multiscale norms for mixing. We review the applications of such norms to mixing and transport, and show how they can be used to optimize the stirring and mixing of a decaying passive scalar. We then review recent work on the less-studied case of a continuously replenished scalar field—the source-sink problem. In that case the flows that optimally reduce the norms are associated with transport rather than mixing: they push sources onto sinks, and vice versa.
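
    A sketch of one such norm on a periodic domain, the H^{-1} norm computed spectrally (grid size and test fields are arbitrary). Weighting each Fourier mode by 1/|k|^2 makes the norm fall as the scalar is transferred to fine scales, which is why it decays under stirring even without diffusion:

      import numpy as np

      def h_minus_one_norm(theta):
          """||theta||_{H^-1} for a mean-zero field on [0, 2*pi)^2."""
          n = theta.shape[0]
          theta_hat = np.fft.fft2(theta) / theta.size
          k = 2 * np.pi * np.fft.fftfreq(n, d=2 * np.pi / n)  # integer wavenumbers
          kx, ky = np.meshgrid(k, k, indexing="ij")
          k2 = kx**2 + ky**2
          k2[0, 0] = np.inf  # drop the (zero) mean mode
          return np.sqrt(np.sum(np.abs(theta_hat)**2 / k2))

      x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
      X, Y = np.meshgrid(x, x, indexing="ij")
      # The finer-scaled field has the same variance but ~8x smaller mix-norm.
      print(h_minus_one_norm(np.sin(X)), h_minus_one_norm(np.sin(8 * X)))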

  11. Quantifying VOC emissions for the strategic petroleum reserve.

    SciTech Connect

    Knowlton, Robert G.; Lord, David L.

    2013-06-01

    A very important aspect of the Department of Energy's (DOE's) Strategic Petroleum Reserve (SPR) program is regulatory compliance. One of the regulatory compliance issues deals with limiting the amount of volatile organic compounds (VOCs) that are emitted into the atmosphere from brine wastes when they are discharged to brine holding ponds. The US Environmental Protection Agency (USEPA) has set limits on the amount of VOCs that can be discharged to the atmosphere. Several attempts have been made to quantify the VOC emissions associated with the brine ponds going back to the late 1970s. There are potential issues associated with each of these quantification efforts. Two efforts were made to quantify VOC emissions by analyzing VOC content of brine samples obtained from wells. Efforts to measure air concentrations were mentioned in historical reports but no data have been located to confirm these assertions. A modeling effort was also performed to quantify the VOC emissions. More recently in 2011-2013, additional brine sampling has been performed to update the VOC emissions estimate. An analysis of the statistical confidence in these results is presented here. Arguably, there are uncertainties associated with each of these efforts. The analysis herein indicates that the upper confidence limit in VOC emissions based on recent brine sampling is very close to the 0.42 ton/MMB limit used historically on the project. Refining this estimate would require considerable investment in additional sampling, analysis, and monitoring. An analysis of the VOC emissions at each site suggests that additional discharges could be made and stay within current regulatory limits.
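
    The statistical confidence analysis amounts to an upper confidence limit on the mean of the brine-sample emissions (a sketch; the sample values are invented, and only the 0.42 ton/MMB figure comes from the text above):

      import numpy as np
      from scipy import stats

      samples = np.array([0.31, 0.38, 0.35, 0.42, 0.29, 0.36, 0.40, 0.33])  # ton/MMB
      mean, sem = samples.mean(), stats.sem(samples)
      ucl95 = mean + stats.t.ppf(0.95, df=samples.size - 1) * sem  # one-sided 95% UCL
      print(f"95% UCL = {ucl95:.3f} ton/MMB (historical limit 0.42)")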

  12. Quantifying Floods of a Flood Regime in Space and Time

    NASA Astrophysics Data System (ADS)

    Whipple, A. A.; Fleenor, W. E.; Viers, J. H.

    2015-12-01

    Interaction between a flood hydrograph and floodplain topography results in spatially and temporally variable conditions important for ecosystem process and function. Individual floods whose frequency and dimensionality comprise a river's flood regime contribute to that variability and in aggregate are important drivers of floodplain ecosystems. Across the globe, water management actions, land use changes as well as hydroclimatic change associated with climate change have profoundly affected natural flood regimes and their expression within the floodplain landscape. Homogenization of riverscapes has degraded once highly diverse and productive ecosystems. Improved understanding of the range of flood conditions and spatial variability within floodplains, or hydrospatial conditions, is needed to improve water and land management and restoration activities to support the variable conditions under which species adapted. This research quantifies the flood regime of a floodplain site undergoing restoration through levee breaching along the lower Cosumnes River of California. One of the few lowland alluvial rivers of California with an unregulated hydrograph and regular floodplain connectivity, the Cosumnes River provides a useful test-bed for exploring river-floodplain interaction. Representative floods of the Cosumnes River are selected from previously-established flood types comprising the flood regime and applied within a 2D hydrodynamic model representing the floodplain restoration site. Model output is analyzed and synthesized to quantify and compare conditions in space and time, using metrics such as depth and velocity. This research establishes methods for quantifying a flood regime's floodplain inundation characteristics, illustrates the role of flow variability and landscape complexity in producing heterogeneous floodplain conditions, and suggests important implications for managing more ecologically functional floodplains.

  13. Quantifying weld solidification cracking susceptibility using the varestraint test

    SciTech Connect

    Lin, W.; Lippold, J.C.; Nelson, T.W.

    1994-12-31

    Since the introduction of the original Varestraint concept in the 1960s, the longitudinal- and transverse-type Varestraint tests have become the most widely utilized techniques for quantifying weld solidification cracking susceptibility. Conventionally, cracking susceptibility is assessed by the threshold strain to cause cracking and by the degree of cracking as quantified by total crack length or maximum crack length. Although material-specific quantifications such as the brittle temperature range (BTR) have been proposed for the transverse-type test, similar quantifications have not been developed for the longitudinal-type test. Various alloys including 304, 310, 316L, A-286, AL6XN, 20Cb-3, RA253, and RA333 stainless steels, 625, 690, and 718 nickel-base alloys, 2090, 2219, 5083, and 6061 aluminum alloys were investigated using both longitudinal- and transverse-type Varestraint tests. Tests were performed using a newly developed, computer-controlled Varestraint unit equipped with a 3-axis movable torch, spring-loaded fixture and a servo-hydraulic loading system. It was found that extensive cracking was observed in the fusion zone emanating radially from the solid-liquid interface toward the fusion boundary in the longitudinal-type test, while weld centerline cracking was prevalent in the transverse-type test. The theoretical basis for the formation of the crack susceptible region (CSR) is that liquation-related cracking only occurs in a certain temperature range known as the BTR. The detailed procedure in the development of the CSR in the fusion zone is described and discussed. This approach allows a weldability data base to be created and the comparison of results from different laboratories using different test techniques.

  14. Quantifying prosthetic gait deviation using simple outcome measures

    PubMed Central

    Kark, Lauren; Odell, Ross; McIntosh, Andrew S; Simmons, Anne

    2016-01-01

    AIM: To develop a subset of simple outcome measures to quantify prosthetic gait deviation without needing three-dimensional gait analysis (3DGA). METHODS: Eight unilateral, transfemoral amputees and 12 unilateral, transtibial amputees were recruited. Twenty-eight able-bodied controls were recruited. All participants underwent 3DGA, the timed-up-and-go test and the six-minute walk test (6MWT). The lower-limb amputees also completed the Prosthesis Evaluation Questionnaire. Results from 3DGA were summarised using the gait deviation index (GDI), which was subsequently regressed, using stepwise regression, against the other measures. RESULTS: Step-length (SL), self-selected walking speed (SSWS) and the distance walked during the 6MWT (6MWD) were significantly correlated with GDI. The 6MWD was the strongest single predictor of the GDI, followed by SL and SSWS. The predictive ability of the regression equations was improved following inclusion of self-report data related to mobility and prosthetic utility. CONCLUSION: This study offers a practicable alternative to quantifying kinematic deviation without the need to conduct complete 3DGA. PMID:27335814
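
    A sketch of the regression step (the study used stepwise regression; ordinary least squares on invented data with the three retained predictors stands in here):

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(2)
      n = 20
      X = np.column_stack([
          rng.normal(450, 60, n),     # 6MWD (m)
          rng.normal(0.65, 0.08, n),  # step length (m)
          rng.normal(1.1, 0.15, n),   # self-selected walking speed (m/s)
      ])
      # Synthetic GDI built from the predictors plus noise, for illustration only.
      gdi = 40 + 0.08 * X[:, 0] + 15 * X[:, 1] + 5 * X[:, 2] + rng.normal(0, 3, n)

      model = LinearRegression().fit(X, gdi)
      print(f"R^2 = {model.score(X, gdi):.2f}")  # variance in GDI explained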

  15. Quantifying dynamic characteristics of human walking for comprehensive gait cycle.

    PubMed

    Mummolo, Carlotta; Mangialardi, Luigi; Kim, Joo H

    2013-09-01

    Normal human walking typically consists of phases during which the body is statically unbalanced while maintaining dynamic stability. Quantifying the dynamic characteristics of human walking can provide better understanding of gait principles. We introduce a novel quantitative index, the dynamic gait measure (DGM), for comprehensive gait cycle. The DGM quantifies the effects of inertia and the static balance instability in terms of zero-moment point and ground projection of center of mass and incorporates the time-varying foot support region (FSR) and the threshold between static and dynamic walking. Also, a framework of determining the DGM from experimental data is introduced, in which the gait cycle segmentation is further refined. A multisegmental foot model is integrated into a biped system to reconstruct the walking motion from experiments, which demonstrates the time-varying FSR for different subphases. The proof-of-concept results of the DGM from a gait experiment are demonstrated. The DGM results are analyzed along with other established features and indices of normal human walking. The DGM provides a measure of static balance instability of biped walking during each (sub)phase as well as the entire gait cycle. The DGM of normal human walking has the potential to provide some scientific insights in understanding biped walking principles, which can also be useful for their engineering and clinical applications.

  16. Quantifying uncertainty in LCA-modelling of waste management systems

    SciTech Connect

    Clavreul, Julie; Guyonnet, Dominique; Christensen, Thomas H.

    2012-12-15

    Highlights: Uncertainty in LCA-modelling of waste management is significant. Model, scenario and parameter uncertainties contribute. A sequential procedure for quantifying uncertainty is proposed. Application of the procedure is illustrated by a case-study. - Abstract: Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining the selected methods: (Step 1) a sensitivity analysis evaluating the sensitivities of the results with respect to the input uncertainties, (Step 2) an uncertainty propagation providing appropriate tools for representing uncertainties and calculating the overall uncertainty of the model results, (Step 3) an uncertainty contribution analysis quantifying the contribution of each parameter uncertainty to the final uncertainty and (Step 4) as a new approach, a combined sensitivity analysis providing a visualisation of the shift in the ranking of different options due to variations of selected key parameters. This tiered approach optimises the resources available to LCA practitioners by only propagating the most influential uncertainties.

  17. Beyond immunity: quantifying the effects of host anti-parasite behavior on parasite transmission.

    PubMed

    Daly, Elizabeth W; Johnson, Pieter T J

    2011-04-01

    A host's first line of defense in response to the threat of parasitic infection is behavior, yet the efficacy of anti-parasite behaviors in reducing infection are rarely quantified relative to immunological defense mechanisms. Larval amphibians developing in aquatic habitats are at risk of infection from a diverse assemblage of pathogens, some of which cause substantial morbidity and mortality, suggesting that behavioral avoidance and resistance could be significant defensive strategies. To quantify the importance of anti-parasite behaviors in reducing infection, we exposed larval Pacific chorus frogs (Pseudacris regilla) to pathogenic trematodes (Ribeiroia and Echinostoma) in one of two experimental conditions: behaviorally active (unmanipulated) or behaviorally impaired (anesthetized). By quantifying both the number of successful and unsuccessful parasites, we show that host behavior reduces infection prevalence and intensity for both parasites. Anesthetized hosts were 20-39% more likely to become infected and, when infected, supported 2.8-fold more parasitic cysts. Echinostoma had a 60% lower infection success relative to the more deadly Ribeiroia and was also more vulnerable to behaviorally mediated reductions in transmission. For Ribeiroia, increases in host mass enhanced infection success, consistent with epidemiological theory, but this relationship was eroded among active hosts. Our results underscore the importance of host behavior in mitigating disease risk and suggest that, in some systems, anti-parasite behaviors can be as or more effective than immune-mediated defenses in reducing infection. Considering the severe pathologies induced by these and other pathogens of amphibians, we emphasize the value of a broader understanding of anti-parasite behaviors and how co-occurring stressors affect them.

  18. Quantifying individual performance in Cricket — A network analysis of batsmen and bowlers

    NASA Astrophysics Data System (ADS)

    Mukherjee, Satyam

    2014-01-01

    Quantifying individual performance in the game of Cricket is critical for team selection in International matches. The number of runs scored by batsmen and wickets taken by bowlers serves as a natural way of quantifying the performance of a cricketer. Traditionally the batsmen and bowlers are rated on their batting or bowling average respectively. However, in a game like Cricket, the manner in which one scores runs or claims wickets is always important. Scoring runs against a strong bowling line-up or delivering a brilliant performance against a team with a strong batting line-up deserves more credit. A player’s average is not able to capture this aspect of the game. In this paper we present a refined method to quantify the ‘quality’ of runs scored by a batsman or wickets taken by a bowler. We explore the application of Social Network Analysis (SNA) to rate players on their contribution to team performance. We generate a directed and weighted network of batsmen-bowlers using the player-vs-player information available for Test cricket and ODI cricket. Additionally we generate a network of batsmen and bowlers based on the dismissal record of batsmen in the history of cricket: Test (1877-2011) and ODI (1971-2011). Our results show that M. Muralitharan is the most successful bowler in the history of Cricket. Our approach could potentially be applied in domestic matches to judge a player’s performance which in turn paves the way for a balanced team selection for International matches.
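
    A sketch of rating players from such a network (the abstract does not name its exact SNA metric, so PageRank stands in here, and the dismissal counts are invented):

      import networkx as nx

      G = nx.DiGraph()
      # Edge batsman -> bowler, weighted by dismissals: credit flows to the
      # bowler, and dismissing a highly rated batsman is worth more.
      G.add_weighted_edges_from([
          ("Tendulkar", "Muralitharan", 8),
          ("Ponting", "Muralitharan", 5),
          ("Lara", "Warne", 6),
          ("Tendulkar", "Warne", 4),
          ("Ponting", "Harbhajan", 10),
      ])
      scores = nx.pagerank(G, weight="weight")
      for player, score in sorted(scores.items(), key=lambda kv: -kv[1]):
          print(f"{player:15s} {score:.3f}")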

  19. Quantifying facial expression recognition across viewing conditions.

    PubMed

    Goren, Deborah; Wilson, Hugh R

    2006-04-01

    Facial expressions are key to social interactions and to assessment of potential danger in various situations. Therefore, our brains must be able to recognize facial expressions when they are transformed in biologically plausible ways. We used synthetic happy, sad, angry and fearful faces to determine the amount of geometric change required to recognize these emotions during brief presentations. Five-alternative forced choice conditions involving central viewing, peripheral viewing and inversion were used to study recognition among the four emotions. Two-alternative forced choice was used to study affect discrimination when spatial frequency information in the stimulus was modified. The results show an emotion and task-dependent pattern of detection. Facial expressions presented with low peak frequencies are much harder to discriminate from neutral than faces defined by either mid or high peak frequencies. Peripheral presentation of faces also makes recognition much more difficult, except for happy faces. Differences between fearful detection and recognition tasks are probably due to common confusions with sadness when recognizing fear from among other emotions. These findings further support the idea that these emotions are processed separately from each other. PMID:16364393

  20. Visualizing and Quantifying Blob Characteristics on NSTX

    NASA Astrophysics Data System (ADS)

    Davis, William; Zweben, Stewart; Myra, James; D'Ippolito, Daniel; Ko, Matthew

    2012-10-01

    Understanding the radial motion of blob-filaments in the tokamak edge plasma is important since this motion can affect the width of the heat and particle scrape-off layer (SOL) [1]. High resolution (64x80), high speed (400,000 frames/sec) edge turbulence movies taken of the NSTX outer midplane separatrix region have recently been analyzed for blob motion. Regions of high light emission from gas puff imaging within a 25x30 cm cross-section were used to track blob-filaments in the plasma edge and into the SOL. Software tools have been developed for visualizing blob movement and automatically generating statistics of blob speed, shape, amplitude, size, and orientation; thousands of blobs have been analyzed for dozens of shots. The blob tracking algorithm and resulting database entries are explained in detail. Visualization tools also show how poloidal and radial motion change as blobs move through the SOL, e.g. suggesting the influence of sheared flow. Relationships between blob size and velocity are shown for various types of plasmas and compared with simplified theories of blob motion. This work was supported by DOE Contract DE-AC02-09-CH11466. [1] J.R. Myra et al, Phys. Plasmas 18, 012305 (2011)

  1. Quantify the accuracy of coal seam gas content

    SciTech Connect

    Mavor, M.J.; Pratt, T.J.; Nelson, C.R.

    1995-10-01

    Gas content determination is a critical procedure performed to evaluate the expected gas production rate and producible reserve potential of coal seam reservoirs. The results from a Gas Research Institute (GRI) research project indicate that gas content estimates obtained with many commonly used methods can be low by 50%. These low estimates result in underestimation of gas-in-place reserves, under-prediction of potential gas production rates during primary and enhanced recovery and under-valuation of the economic worth of investors' assets. The results of the GRI research project quantify the accuracy and comparability of the most commonly used coal seam gas content evaluation procedures. The best methods for accurately estimating the gas-in-place are also identified.

  2. Quantifiers are incrementally interpreted in context, more than less

    PubMed Central

    Urbach, Thomas P.; DeLong, Katherine A.; Kutas, Marta

    2015-01-01

    Language interpretation is often assumed to be incremental. However, our studies of quantifier expressions in isolated sentences found N400 event-related brain potential (ERP) evidence for partial but not full immediate quantifier interpretation (Urbach & Kutas, 2010). Here we tested similar quantifier expressions in pragmatically supporting discourse contexts (Alex was an unusual toddler. Most/Few kids prefer sweets/vegetables…) while participants made plausibility judgments (Experiment 1) or read for comprehension (Experiment 2). Control Experiments 3A (plausibility) and 3B (comprehension) removed the discourse contexts. Quantifiers always modulated typical and/or atypical word N400 amplitudes. However, only in Experiment 2 did the real-time N400 effects mirror the offline quantifier and typicality crossover interaction effects for plausibility ratings and cloze probabilities. We conclude that quantifier expressions can be interpreted fully and immediately, though pragmatic and task variables appear to impact the speed and/or depth of quantifier interpretation. PMID:26005285

  3. Quantifying compositional impacts of ambient aerosol on cloud droplet formation

    NASA Astrophysics Data System (ADS)

    Lance, Sara

    It has been historically assumed that most of the uncertainty associated with the aerosol indirect effect on climate can be attributed to the unpredictability of updrafts. In Chapter 1, we analyze the sensitivity of cloud droplet number density to realistic variations in aerosol chemical properties and to variable updraft velocities using a 1-dimensional cloud parcel model in three important environmental cases (continental, polluted and remote marine). The results suggest that aerosol chemical variability may be as important to the aerosol indirect effect as the effect of unresolved cloud dynamics, especially in polluted environments. We next used a continuous flow streamwise thermal gradient Cloud Condensation Nuclei counter (CCNc) to study the water-uptake properties of the ambient aerosol, by exposing an aerosol sample to a controlled water vapor supersaturation and counting the resulting number of droplets. In Chapter 2, we modeled and experimentally characterized the heat transfer properties and droplet growth within the CCNc. Chapter 3 describes results from the MIRAGE field campaign, in which the CCNc and a Hygroscopicity Tandem Differential Mobility Analyzer (HTDMA) were deployed at a ground-based site during March, 2006. Size-resolved CCN activation spectra and growth factor distributions of the ambient aerosol in Mexico City were obtained, and an analytical technique was developed to quantify a probability distribution of solute volume fractions for the CCN in addition to the aerosol mixing-state. The CCN were shown to be much less CCN active than ammonium sulfate, with water uptake properties more consistent with low molecular weight organic compounds. The pollution outflow from Mexico City was shown to have CCN with an even lower fraction of soluble material. "Chemical Closure" was attained for the CCN, by comparing the inferred solute volume fraction with that from direct chemical measurements. A clear diurnal pattern was observed for the CCN solute volume fraction.

  4. Constraining Habitable Environments on Mars by Quantifying Available Geochemical Energy

    NASA Astrophysics Data System (ADS)

    Tierney, L. L.; Jakosky, B. M.

    2009-12-01

    The search for life on Mars includes the availability of liquid water, access to biogenic elements and an energy source. In the past, when water was more abundant on Mars, a source of energy may have been the limiting factor for potential life. Energy, either from photosynthesis or chemosynthesis, is required in order to drive metabolism. Potential martian organisms most likely took advantage of chemosynthetic reactions at and below the surface. Terrestrial chemolithoautotrophs, for example, thrive off of the chemical disequilibrium that exists in many environments and use inorganic redox (reduction-oxidation) reactions to drive metabolism and create cellular biomass. The chemical disequilibrium of six different martian environments was modeled in this study and analyzed, incorporating a range of water and rock compositions, water:rock mass ratios, atmospheric fugacities, pH, and temperatures. All of these models can be applied to specific sites on Mars, including environments similar to Meridiani Planum and Gusev Crater. Both a mass transfer geochemical model of groundwater-basalt interaction and a mixing model of groundwater-hydrothermal fluid interaction were used to estimate hypothetical martian fluid compositions that result from mixing over the entire reaction path. By determining the overall Gibbs free energy yields for redox reactions in the H-O-C-S-Fe-Mn system, the amount of geochemical energy that was available for potential chemolithoautotrophic microorganisms was quantified, and the amount of biomass that could have been sustained was estimated. The quantity of biomass that can be formed and supported within a system depends on energy availability; thus sites that have higher levels and fluxes of energy have greater potential to support life. Results show that iron- and sulfur-oxidation reactions would have been the most favorable redox reactions in aqueous systems where groundwater and rock interacted at or near the surface. These types of reactions could
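
    The core computation described here is the Gibbs free energy of reaction, dG = dG0 + RT ln Q, evaluated for candidate redox reactions; energy is available to chemolithoautotrophs when dG is negative. A minimal Python sketch of that evaluation follows; the dG0 value and activities are placeholders for illustration, not the study's modeled fluid compositions.

        import numpy as np

        R = 8.314e-3  # gas constant, kJ mol^-1 K^-1

        def reaction_energy(dG0, product_activities, reactant_activities, T=298.15):
            """Gibbs free energy of reaction, dG = dG0 + RT ln Q, in kJ/mol."""
            Q = np.prod(product_activities) / np.prod(reactant_activities)
            return dG0 + R * T * np.log(Q)

        # Placeholder numbers for a generic iron-oxidation reaction:
        dG = reaction_energy(dG0=-100.0,
                             product_activities=[1e-6],
                             reactant_activities=[1e-4, 0.2])
        print(f"dG = {dG:.1f} kJ/mol (energy available to microbes when negative)")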

  5. The Physics of Equestrian Show Jumping

    NASA Astrophysics Data System (ADS)

    Stinner, Art

    2014-04-01

    This article discusses the kinematics and dynamics of equestrian show jumping. For some time I have attended a series of show jumping events at Spruce Meadows, an international equestrian center near Calgary, Alberta, often referred to as the "Wimbledon of equestrian jumping." I have always had a desire to write an article such as this one, but when I searched the Internet for information and looked at YouTube presentations, I could only find simplistic references to Newton's laws and the conservation of mechanical energy principle. Nowhere could I find detailed calculations. On the other hand, there were several biomechanical articles with empirical reports of the results of kinematic and dynamic investigations of show jumping using high-speed digital cameras and force plates. These articles summarize their results in tables that give information about the motion of a horse jumping over high fences (1.40 m) and the magnitudes of the forces encountered when landing. However, they do not describe the physics of these results.

  6. Quantifying scale relationships in snow distributions

    NASA Astrophysics Data System (ADS)

    Deems, Jeffrey S.

    2007-12-01

    Spatial distributions of snow in mountain environments represent the time integration of accumulation and ablation processes, and are strongly and dynamically linked to mountain hydrologic, ecologic, and climatic systems. Accurate measurement and modeling of the spatial distribution and variability of the seasonal mountain snowpack at different scales are imperative for water supply and hydropower decision-making, for investigations of land-atmosphere interaction or biogeochemical cycling, and for accurate simulation of earth system processes and feedbacks. Assessment and prediction of snow distributions in complex terrain are heavily dependent on scale effects, as the pattern and magnitude of variability in snow distributions depend on the scale of observation. Measurement and model scales are usually different from process scales, and thereby introduce a scale bias to the estimate or prediction. To quantify this bias, or to properly design measurement schemes and model applications, the process scale must be known or estimated. Airborne Light Detection And Ranging (lidar) products provide high-resolution, broad-extent altimetry data for terrain and snowpack mapping, and allow an application of variogram fractal analysis techniques to characterize snow depth scaling properties over lag distances from 1 to 1000 meters. Snow depth patterns as measured by lidar at three Colorado mountain sites exhibit fractal (power law) scaling patterns over two distinct scale ranges, separated by a distinct break at the 15-40 m lag distance, depending on the site. Each fractal range represents a range of separation distances over which snow depth processes remain consistent. The scale break between fractal regions is a characteristic scale at which snow depth process relationships change fundamentally. Similar scale break distances in vegetation and topography datasets suggest that the snow depth scale break represents a change in wind redistribution processes from wind
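
    In one dimension, the scaling analysis described above reduces to computing an empirical variogram of snow depth and reading a fractal dimension off the log-log slope, with a scale break appearing where the slope changes. The sketch below illustrates this on a synthetic transect; the Brownian surrogate series and 1 m spacing are assumptions for illustration (for a 1-D profile, D = 2 - slope/2; for a 2-D field, D = 3 - slope/2).

        import numpy as np

        def empirical_variogram(z, dx, max_lag):
            """Semivariance of a 1-D transect z sampled at spacing dx."""
            lags, gamma = [], []
            for h in range(1, int(max_lag / dx)):
                d = z[h:] - z[:-h]
                lags.append(h * dx)
                gamma.append(0.5 * np.mean(d ** 2))
            return np.array(lags), np.array(gamma)

        # Synthetic snow-depth transect (Brownian surrogate), 1 m spacing
        rng = np.random.default_rng(7)
        z = np.cumsum(rng.normal(0, 0.05, 2000))

        lags, gamma = empirical_variogram(z, dx=1.0, max_lag=1000)
        slope, _ = np.polyfit(np.log(lags), np.log(gamma), 1)
        print(f"log-log slope {slope:.2f} -> fractal dimension D = {2 - slope / 2:.2f}")
        # A scale break would show up as two distinct slopes; fitting separate
        # power laws to short and long lags locates the break distance.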

  7. Quantifying Riverscape Connectivity with Graph Theory

    NASA Astrophysics Data System (ADS)

    Carbonneau, P.; Milledge, D.; Sinha, R.; Tandon, S. K.

    2013-12-01

    Fluvial catchments convey fluxes of water, sediment, nutrients and aquatic biota. At continental scales, crustal topography defines the overall path of channels whilst at local scales depositional and/or erosional features generally determine the exact path of a channel. Furthermore, constructions such as dams, for either water abstraction or hydropower, often have a significant impact on channel networks. The concept of 'connectivity' is commonly invoked when conceptualising the structure of a river network. This concept is easy to grasp but there have been uneven efforts across the environmental sciences to actually quantify connectivity. Currently there have only been a few studies reporting quantitative indices of connectivity in river sciences, notably in the study of avulsion processes. However, the majority of current work describing some form of environmental connectivity in a quantitative manner is in the field of landscape ecology. Driven by the need to quantify habitat fragmentation, landscape ecologists have returned to graph theory. Within this formal setting, landscape ecologists have successfully developed a range of indices which can model connectivity loss. Such formal connectivity metrics are currently needed for a range of applications in fluvial sciences. One of the most urgent needs relates to dam construction. In the developed world, hydropower development has generally slowed and in many countries dams are actually being removed. However, this is not the case in the developing world, where hydropower is seen as a key element of low-emissions power security. For example, several dam projects are envisaged in Himalayan catchments in the next 2 decades. This region is already under severe pressure from climate change and urbanisation, and a better understanding of the network fragmentation which can be expected in this system is urgently needed. In this paper, we apply and adapt connectivity metrics from landscape ecology. We then examine the
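
    As a sketch of the kind of graph-theoretic index involved, the fragment below scores network fragmentation as the fraction of node pairs that remain mutually reachable, before and after a dam severs an edge. The toy river network, node names, and the specific index are illustrative assumptions, not the metrics adapted in the paper.

        import networkx as nx

        # Toy river network: nodes are confluences/reaches, edges are channels
        G = nx.Graph()
        G.add_edges_from([("headwater_A", "conf_1"), ("headwater_B", "conf_1"),
                          ("conf_1", "conf_2"), ("headwater_C", "conf_2"),
                          ("conf_2", "outlet")])

        def connectivity_index(g):
            """Fraction of ordered node pairs that are mutually reachable."""
            n = g.number_of_nodes()
            reachable = sum(len(c) * (len(c) - 1) for c in nx.connected_components(g))
            return reachable / (n * (n - 1))

        print("before dam:", connectivity_index(G))   # 1.0, fully connected
        G.remove_edge("conf_1", "conf_2")              # dam severs the main stem
        print("after dam:", connectivity_index(G))     # 0.4, strongly fragmented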

  8. Quantifying Flow Resistance of Mountain Streams Using the HHT Approach

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Fu, X.

    2014-12-01

    This study quantifies the flow resistance of mountain streams with gravel beds and remarkable bed forms. The motivation is to follow previous ideas (Robert, A. 1990) that the bed surface can be divided into micro-scale and macro-scale roughness, respectively. We processed the field data of longitudinal bed profiles of the Longxi River, Sichuan Province, China, using the Hilbert-Huang Transformation Method (HHT). Each longitudinal profile was decomposed into a set of curves with different frequencies of spatial fluctuation, and the spectrogram was accordingly obtained. We supposed that certain high- and low-frequency curves correspond to the micro- and macro-roughness of the stream bed, respectively. From the spectrogram we specified the characteristic height and length that represent the macro bed forms accounting for bed form roughness. We then estimated the bed form roughness as being proportional to the ratio of the height to the length multiplied by the height (Yang et al., 2005). We also took the parameter Sp, defined as the sinuosity of the highest-frequency curve, as the measure of the micro-scale roughness. We then took into account the effect of bed material sizes by using the product of d50/R and Sp, where d50 is the sediment median size and R is the hydraulic radius. The macro- and micro-scale roughness parameters were merged together nonlinearly to evaluate the flow resistance caused by the interplaying friction and form drag forces. Validation results show that the coefficient of determination can reach as high as 0.84 in the case of the Longxi River. Future studies will focus on verification against more field data as well as the combination of skin friction and form drag. Key words: flow resistance; roughness; HHT; spectrogram; form drag Robert, A. (1990), Boundary roughness in coarse-grained channels, Prog. Phys. Geogr., 14(1), 42-69. Yang, S.-Q., S.-K. Tan, and S.-Y. Lim. (2005), Flow resistance and bed form geometry in a wide alluvial
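
    A rough sketch of the decomposition step is given below, assuming the third-party PyEMD package for empirical mode decomposition: the highest-frequency mode stands in for micro-roughness (from which the sinuosity Sp is computed), and the remaining modes for macro bed forms. The synthetic bed profile and all parameter choices are placeholders, not the Longxi River data.

        import numpy as np
        from PyEMD import EMD   # assumed dependency: the PyEMD (EMD-signal) package

        # Synthetic longitudinal bed profile: macro bed forms + micro grain roughness
        x = np.linspace(0, 200, 2000)                        # streamwise distance (m)
        rng = np.random.default_rng(4)
        bed = 0.5 * np.sin(2 * np.pi * x / 25) + 0.05 * rng.normal(size=x.size)

        imfs = EMD().emd(bed, x)

        # Highest-frequency IMF ~ micro roughness; its sinuosity gives Sp
        micro = imfs[0]
        Sp = np.hypot(np.diff(x), np.diff(micro)).sum() / (x[-1] - x[0])

        # Lower-frequency IMFs (plus residue) ~ macro bed forms
        macro = imfs[1:].sum(axis=0)
        height = macro.max() - macro.min()
        print(f"micro-roughness sinuosity Sp = {Sp:.4f}, macro form height = {height:.2f} m")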

  9. Quantifying collective attention from tweet stream.

    PubMed

    Sasahara, Kazutoshi; Hirata, Yoshito; Toyoda, Masashi; Kitsuregawa, Masaru; Aihara, Kazuyuki

    2013-01-01

    Online social media are increasingly facilitating our social interactions, thereby making available a massive "digital fossil" of human behavior. Discovering and quantifying distinct patterns using these data is important for studying social behavior, although the rapid time-variant nature and large volumes of these data make this task difficult and challenging. In this study, we focused on the emergence of "collective attention" on Twitter, a popular social networking service. We propose a simple method for detecting and measuring the collective attention evoked by various types of events. This method exploits the fact that tweeting activity exhibits a burst-like increase and an irregular oscillation when a particular real-world event occurs; otherwise, it follows regular circadian rhythms. The difference between regular and irregular states in the tweet stream was measured using the Jensen-Shannon divergence, which corresponds to the intensity of collective attention. We then associated irregular incidents with their corresponding events that attracted the attention and elicited responses from large numbers of people, based on the popularity and the enhancement of key terms in posted messages or "tweets." Next, we demonstrate the effectiveness of this method using a large dataset that contained approximately 490 million Japanese tweets by over 400,000 users, in which we identified 60 cases of collective attention, including one related to the Tohoku-oki earthquake. "Retweet" networks were also investigated to understand collective attention in terms of social interactions. This simple method provides a retrospective summary of collective attention, thereby contributing to the fundamental understanding of social behavior in the digital era. PMID:23637913
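
    The central quantity is the Jensen-Shannon divergence between a baseline (circadian) tweet-rate distribution and the observed one. A minimal sketch using SciPy follows; the synthetic timestamps and 24-hour binning are illustrative assumptions (SciPy returns the JS distance, so it is squared to get the divergence).

        import numpy as np
        from scipy.spatial.distance import jensenshannon

        def hourly_distribution(timestamps_h, bins=24):
            """Normalized histogram of tweet counts by hour of day."""
            counts, _ = np.histogram(np.asarray(timestamps_h) % 24,
                                     bins=bins, range=(0, 24))
            return counts / counts.sum()

        # Synthetic data: a regular day vs. a day with a burst around hour 14
        rng = np.random.default_rng(0)
        regular = rng.normal(12, 5, 5000) % 24
        burst = np.concatenate([regular, rng.normal(14, 0.5, 3000) % 24])

        p, q = hourly_distribution(regular), hourly_distribution(burst)
        jsd = jensenshannon(p, q) ** 2
        print(f"collective-attention intensity (JSD): {jsd:.4f}")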

  10. Quantifying landscape resilience using vegetation indices

    NASA Astrophysics Data System (ADS)

    Eddy, I. M. S.; Gergel, S. E.

    2014-12-01

    Landscape resilience refers to the ability of systems to adapt to and recover from disturbance. In pastoral landscapes, degradation can be measured in terms of increased desertification and/or shrub encroachment. In many countries across Central Asia, the use and resilience of pastoral systems has changed markedly over the past 25 years, influenced by centralized Soviet governance, private property rights and, recently, communal resource governance. In Kyrgyzstan, recent governance reforms were a response to the increasing degradation of pastures attributed to livestock overgrazing. Our goal is to examine and map the landscape-level factors that influence overgrazing throughout successive governance periods. Here, we map and examine some of the spatial factors influencing landscape resilience in agro-pastoral systems in the Kyrgyz Republic, where pastures occupy >50% of the country's area. We ask three questions: 1) Which mechanisms of pasture degradation (desertification vs. shrub encroachment) are detectable using remote sensing vegetation indices? 2) Are these degraded pastures associated with landscape features that influence herder mobility and accessibility (e.g., terrain, distance to other pastures)? 3) Have these patterns changed through successive governance periods? Using a chronosequence of Landsat imagery (1999-2014), NDVI and other VIs were used to identify trends in pasture condition during the growing season. Least-cost path distances as well as graph theoretic indices were derived from topographic factors to assess landscape connectivity (from villages to pastures and among pastures). Fieldwork was used to assess the feasibility and accuracy of this approach using the most recent imagery. Previous research concluded that low herder mobility hindered pasture use; thus we expect the distance from pasture to village to be an important predictor of pasture condition. This research will quantify the magnitude of pastoral degradation and test
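
    As a sketch of the vegetation-index trend analysis, the fragment below computes NDVI from per-year reflectances for a single pasture pixel and fits a linear trend over the 1999-2014 window; the reflectance series is synthetic, standing in for Landsat NIR and red surface-reflectance bands.

        import numpy as np

        def ndvi(nir, red):
            """Normalized Difference Vegetation Index, in [-1, 1]."""
            return (nir - red) / (nir + red + 1e-9)

        # Synthetic growing-season reflectances for one pixel, 1999-2014
        rng = np.random.default_rng(3)
        years = np.arange(1999, 2015)
        nir = 0.40 - 0.004 * (years - 1999) + rng.normal(0, 0.01, years.size)
        red = 0.12 + 0.002 * (years - 1999) + rng.normal(0, 0.01, years.size)

        slope, _ = np.polyfit(years, ndvi(nir, red), 1)
        trend = "browning (possible degradation)" if slope < 0 else "greening"
        print(f"NDVI trend: {slope:+.4f} per year ({trend})")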

  11. A new model for quantifying climate episodes

    NASA Astrophysics Data System (ADS)

    Biondi, Franco; Kozubowski, Tomasz J.; Panorska, Anna K.

    2005-07-01

    When long records of climate (precipitation, temperature, stream runoff, etc.) are available, either from instrumental observations or from proxy records, the objective evaluation and comparison of climatic episodes becomes necessary. Such episodes can be quantified in terms of duration (the number of time intervals, e.g. years, the process remains continuously above or below a reference level) and magnitude (the sum of all series values for a given duration). The joint distribution of duration and magnitude is represented here by a stochastic model called BEG, for bivariate distribution with exponential and geometric marginals. The model is based on the theory of random sums, and its mathematical derivation confirms and extends previous empirical findings. Probability statements that can be obtained from the model are illustrated by applying it to a 2300-year dendroclimatic reconstruction of water-year precipitation for the eastern Sierra Nevada-western Great Basin. Using the Dust Bowl drought period as an example, the chance of a longer or greater drought is 8%. Conditional probabilities are much higher, i.e. a drought of that magnitude has a 62% chance of lasting for 11 years or longer, and a drought that lasts 11 years has a 46% chance of having an equal or greater magnitude. In addition, because of the bivariate model, we can estimate a 6% chance of witnessing a drought that is both longer and greater. Additional examples of model application are also provided. This type of information provides a way to place any climatic episode in a temporal perspective, and such numerical statements help with reaching science-based management and policy decisions.
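
    Because the model is built on random sums (duration geometric, yearly deficits exponential), probability statements like those quoted above can be reproduced by direct simulation. The sketch below does this with placeholder parameters; the values of p, lam, and the reference drought are illustrative, not the fitted Sierra Nevada values.

        import numpy as np

        rng = np.random.default_rng(11)

        p, lam = 0.25, 0.8          # placeholder geometric / exponential parameters
        n = 50_000

        duration = rng.geometric(p, size=n)                       # D ~ Geometric(p)
        magnitude = np.array([rng.exponential(1 / lam, d).sum()   # M = sum of D deficits
                              for d in duration])

        d_ref, m_ref = 11, 15.0     # a reference drought's duration and magnitude
        print("P(D >= d_ref and M >= m_ref):",
              ((duration >= d_ref) & (magnitude >= m_ref)).mean())
        print("P(D >= d_ref | M >= m_ref):",
              (duration[magnitude >= m_ref] >= d_ref).mean())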

  12. Quantifying human vitamin kinetics using AMS

    SciTech Connect

    Hillegonds, D; Dueker, S; Ognibene, T; Buchholz, B; Lin, Y; Vogel, J; Clifford, A

    2004-02-19

    Tracing vitamin kinetics at physiologic concentrations has been hampered by a lack of quantitative sensitivity for chemically equivalent tracers that could be used safely in healthy people. Instead, elderly or ill volunteers were sought for studies involving pharmacologic doses with radioisotopic labels. These studies fail to be relevant in two ways: vitamins are inherently micronutrients, whose biochemical paths are saturated and distorted by pharmacological doses; and while vitamins remain important for health in the elderly or ill, their greatest effects may be in preventing slow and cumulative diseases by proper consumption throughout youth and adulthood. Neither the target dose nor the target population is available for nutrient metabolic studies through decay counting of radioisotopes at high levels. Stable isotopic labels are quantified by isotope ratio mass spectrometry at levels that trace physiologic vitamin doses, but the natural background of stable isotopes severely limits the time span over which the tracer is distinguishable. Indeed, study periods seldom ranged over a single biological mean life of the labeled nutrients, failing to provide data on the important final elimination phase of the compound. Kinetic data for the absorption phase is similarly rare in micronutrient research because the phase is rapid, requiring many consecutive plasma samples for accurate representation. However, repeated blood samples of sufficient volume for precise stable or radio-isotope quantitations consume an indefensible amount of the volunteer's blood over a short period. Thus, vitamin pharmacokinetics in humans has often relied on compartmental modeling based upon assumptions and tested only for the short period of maximal blood circulation, a period that poorly reflects absorption or final elimination kinetics except for the most simple models.

  13. Quantifying Collective Attention from Tweet Stream

    PubMed Central

    Sasahara, Kazutoshi; Hirata, Yoshito; Toyoda, Masashi; Kitsuregawa, Masaru; Aihara, Kazuyuki

    2013-01-01

    Online social media are increasingly facilitating our social interactions, thereby making available a massive “digital fossil” of human behavior. Discovering and quantifying distinct patterns using these data is important for studying social behavior, although the rapid time-variant nature and large volumes of these data make this task difficult and challenging. In this study, we focused on the emergence of “collective attention” on Twitter, a popular social networking service. We propose a simple method for detecting and measuring the collective attention evoked by various types of events. This method exploits the fact that tweeting activity exhibits a burst-like increase and an irregular oscillation when a particular real-world event occurs; otherwise, it follows regular circadian rhythms. The difference between regular and irregular states in the tweet stream was measured using the Jensen-Shannon divergence, which corresponds to the intensity of collective attention. We then associated irregular incidents with their corresponding events that attracted the attention and elicited responses from large numbers of people, based on the popularity and the enhancement of key terms in posted messages or “tweets.” Next, we demonstrate the effectiveness of this method using a large dataset that contained approximately 490 million Japanese tweets by over 400,000 users, in which we identified 60 cases of collective attention, including one related to the Tohoku-oki earthquake. “Retweet” networks were also investigated to understand collective attention in terms of social interactions. This simple method provides a retrospective summary of collective attention, thereby contributing to the fundamental understanding of social behavior in the digital era. PMID:23637913

  14. Quantifying Missing Heritability at Known GWAS Loci

    PubMed Central

    Gusev, Alexander; Bhatia, Gaurav; Zaitlen, Noah; Vilhjalmsson, Bjarni J.; Diogo, Dorothée; Stahl, Eli A.; Gregersen, Peter K.; Worthington, Jane; Klareskog, Lars; Raychaudhuri, Soumya; Plenge, Robert M.; Pasaniuc, Bogdan; Price, Alkes L.

    2013-01-01

    Recent work has shown that much of the missing heritability of complex traits can be resolved by estimates of heritability explained by all genotyped SNPs. However, it is currently unknown how much heritability is missing due to poor tagging or additional causal variants at known GWAS loci. Here, we use variance components to quantify the heritability explained by all SNPs at known GWAS loci in nine diseases from WTCCC1 and WTCCC2. After accounting for expectation, we observed all SNPs at known GWAS loci to explain more heritability than GWAS-associated SNPs on average. For some diseases, this increase was individually significant: for Multiple Sclerosis (MS) and for Crohn's Disease (CD); all analyses of autoimmune diseases excluded the well-studied MHC region. Additionally, we found that GWAS loci from other related traits also explained significant heritability. The union of all autoimmune disease loci explained more MS heritability than known MS SNPs and more CD heritability than known CD SNPs, with an analogous increase for all autoimmune diseases analyzed. We also observed significant increases in an analysis of Rheumatoid Arthritis (RA) samples typed on ImmunoChip, with more heritability from all SNPs at GWAS loci and more heritability from all autoimmune disease loci compared to known RA SNPs (including those identified in this cohort). Our methods adjust for LD between SNPs, which can bias standard estimates of heritability from SNPs even if all causal variants are typed. By comparing adjusted estimates, we hypothesize that the genome-wide distribution of causal variants is enriched for low-frequency alleles, but that causal variants at known GWAS loci are skewed towards common alleles. These findings have important ramifications for fine-mapping study design and our understanding of complex disease architecture. PMID:24385918

  15. Quantifying fluvial bedrock erosion using repeat terrestrial Lidar

    NASA Astrophysics Data System (ADS)

    Cook, Kristen

    2013-04-01

    The Da'an River Gorge in western Taiwan provides a unique opportunity to observe the formation and evolution of a natural bedrock gorge. The 1.2 km long and up to 20 m deep gorge has formed since 1999 in response to uplift of the riverbed during the Chi-Chi earthquake. The extremely rapid pace of erosion enables us to observe both downcutting and channel widening over short time periods. We have monitored the evolution of the gorge since 2009 using repeat RTK GPS surveys and terrestrial Lidar scans. GPS surveys of the channel profile are conducted frequently, with 24 surveys to date, while Lidar scans are conducted after major floods, or after 5-9 months without a flood, for a total of 8 scans to date. The Lidar data are most useful for recording erosion of channel walls, which is quite episodic and highly variable along the channel. By quantifying the distribution of wall erosion in space and time, we can improve our understanding of channel widening processes and of the development of the channel planform, particularly the growth of bends. During the summer of 2012, the Da'an catchment experienced two large storm events, a meiyu (plum rain) event on June 10-13 that brought 800 mm of rain and a typhoon on August 1-3 that brought 650 mm of rain. The resulting floods had significant geomorphic effects on the Da'an gorge, including up to 10s of meters of erosion in some sections of the gorge walls. We quantify these changes using Lidar surveys conducted on June 7, July 3, and August 30. Channel wall collapses also occur in the absence of large floods, and we use scans from August 23, 2011 and June 7, 2012 to quantify erosion during a period that included a number of small floods, but no large ones. This allows us to compare the impact of 9 months of normal conditions to the impact of short-duration extreme events. The observed variability of erosion in space and time highlights the need for 3D techniques such as terrestrial Lidar to properly quantify erosion in this

  16. Quantifying unsteadiness and dynamics of pulsatory volcanic activity

    NASA Astrophysics Data System (ADS)

    Dominguez, L.; Pioli, L.; Bonadonna, C.; Connor, C. B.; Andronico, D.; Harris, A. J. L.; Ripepe, M.

    2016-06-01

    , can also be described based on the log-logistic parameter s, which is found to increase from regular mafic systems to highly variable silicic systems. These results suggest that the periodicity of explosions, quantified in terms of the distribution of repose times, can give fundamental information about the system dynamics and changes regularly across eruptive styles (i.e., Strombolian to Vulcanian), allowing for direct comparison and quantification of different types of pulsatory activity during these eruptions.

  17. Quantifying spore viability of the honey bee pathogen Nosema apis using flow cytometry.

    PubMed

    Peng, Yan; Lee-Pullen, Tracey F; Heel, Kathy; Millar, A Harvey; Baer, Boris

    2014-05-01

    Honey bees are hosts to more than 80 different parasites, some of them being highly virulent and responsible for substantial losses in managed honey bee populations. The study of honey bee pathogens and their interactions with the bees' immune system has therefore become a research area of major interest. Here we developed a fast, accurate and reliable method to quantify the viability of spores of the honey bee gut parasite Nosema apis. To verify this method, a dilution series with 0, 25, 50, 75, and 100% live N. apis was made and SYTO 16 and Propidium Iodide (n = 35) were used to distinguish dead from live spores. The viability of spores in each sample was determined by flow cytometry and compared with the current method based on fluorescence microscopy. Results show that N. apis viability counts using flow cytometry produced very similar results when compared with fluorescence microscopy. However, we found that fluorescence microscopy underestimates N. apis viability in samples with higher percentages of viable spores, the latter typically being what is found in biological samples. A series of experiments were conducted to confirm that flow cytometry allows the use of additional fluorescent dyes such as SYBR 14 and SYTOX Red (used in combination with SYTO 16 or Propidium Iodide) to distinguish dead from live spores. We also show that spore viability quantification with flow cytometry can be undertaken using substantially lower dye concentrations than fluorescence microscopy. In conclusion, our data show flow cytometry to be a fast, reliable method to quantify N. apis spore viabilities, which has a number of advantages compared with existing methods.

  18. Visualizing and quantifying the suppressive effects of glucocorticoids on the tadpole immune system in vivo.

    PubMed

    Schreiber, Alexander M

    2011-12-01

    A challenging topic in undergraduate physiology courses is the complex interaction between the vertebrate endocrine system and the immune system. There are relatively few established and accessible laboratory exercises available to instructors to help their students gain a working understanding of these interactions. The present laboratory module was developed to show students how glucocorticoid receptor activity can be pharmacologically modulated in Xenopus laevis tadpoles and the resulting effects on thymus gland size visualized and quantified in vivo. After treating young tadpoles with a cortisol receptor agonist (dexamethasone) for 1 wk, students can easily visualize the suppressive effects of glucocorticoids on the intact thymus gland, which shrinks dramatically in size in response to this steroid hormone analog. However, the suppressive effect of dexamethasone is nullified in the presence of the glucocorticoid receptor antagonist RU-486, which powerfully illustrates the specific effects of glucocorticoid receptor inhibition on the immune system. Image analysis and statistics software are used to quantify the effects of glucocorticoid modulation on thymus size.

  19. Quantifying uncertainties in travel time tomography using the null space shuttle

    NASA Astrophysics Data System (ADS)

    de Wit, R. W.; Trampert, J.; van der Hilst, R. D.

    2010-12-01

    Due to the underdetermined nature of large tomographic inverse problems, a sizable null space exists. It is therefore important to investigate the uncertainty of tomographic models produced by inverse problems with multiple solutions. The null space shuttle (Deal and Nolet, 1996) has been designed to exploit components of the null space, along with a priori information or a physical model, in order to improve or enhance the original minimum-norm solution. We generalize the null space shuttle technique to quantify uncertainties in a classical study of travel time tomography (Li et al., 2008) and examine a range of models that are consistent with the travel time data. The family of resulting tomographic models is used to quantify the uncertainties in the original tomographic image, a global P-wave speed perturbation mantle model. We further show what is and what is not resolved in the aforementioned tomographic image. With the null space shuttle we are able to alter or remove structures (e.g. slabs, plumes) in the original tomographic image. We suggest that this technique should be routinely applied before physical interpretations of tomographic images are made.
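
    The mechanics of the shuttle are easy to state in linear algebra: any combination of null-space basis vectors of the forward operator G can be added to a solution without changing the data fit. A toy numpy sketch (random G standing in for a real tomographic operator) follows.

        import numpy as np

        rng = np.random.default_rng(5)

        # Toy underdetermined problem: 40 travel-time data, 100 model cells
        G = rng.normal(size=(40, 100))
        d = G @ rng.normal(size=100)

        m0, *_ = np.linalg.lstsq(G, d, rcond=None)   # minimum-norm solution

        # Null-space basis: rows of Vt beyond rank(G), transposed to columns
        U, s, Vt = np.linalg.svd(G)
        V_null = Vt[len(s):].T

        # Shuttle: perturb along the null space; the data misfit is unchanged
        m_alt = m0 + V_null @ rng.normal(scale=0.5, size=V_null.shape[1])
        print("misfit change:",
              np.linalg.norm(G @ m_alt - d) - np.linalg.norm(G @ m0 - d))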

  20. Quantifying Land Use Impacts on Biodiversity: Combining Species-Area Models and Vulnerability Indicators.

    PubMed

    Chaudhary, Abhishek; Verones, Francesca; de Baan, Laura; Hellweg, Stefanie

    2015-08-18

    Habitat degradation and subsequent biodiversity damage often take place far from the place of consumption because of globalization and the increasing level of international trade. Informing consumers and policy makers about the biodiversity impacts "hidden" in the life cycle of imported products is an important step toward achieving sustainable consumption patterns. Spatially explicit methods are needed in life cycle assessment to accurately quantify biodiversity impacts of products and processes. We use the Countryside species-area relationship (SAR) to quantify regional species loss due to land occupation and transformation for five taxa and six land use types in 804 terrestrial ecoregions. Further, we calculate vulnerability scores for each ecoregion based on the fraction of each species' geographic range (endemic richness) hosted by the ecoregion and the IUCN assigned threat level of each species. Vulnerability scores are multiplied with SAR-predicted regional species loss to estimate potential global extinctions per unit of land use. As a case study, we assess the land use biodiversity impacts of 1 kg of bioethanol produced using six different feed stocks in different parts of the world. Results show that the regions with highest biodiversity impacts differed markedly when the vulnerability of species was included.
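
    The underlying arithmetic combines an affinity-weighted species-area relationship with an ecoregion vulnerability weight. The sketch below is a minimal reading of that calculation; the species counts, areas, affinities, z-value, and vulnerability score are all placeholders, not values from the 804-ecoregion assessment.

        import numpy as np

        def countryside_sar_loss(S_org, A_org, use_areas, affinities, z=0.25):
            """Regional species loss from the countryside SAR.

            use_areas: area of each land-use type; affinities: taxon affinity
            h_j in [0, 1] for each use; the rest of A_org is natural habitat.
            """
            A_eff = (A_org - sum(use_areas)) + np.dot(affinities, use_areas)
            return S_org * (1.0 - (A_eff / A_org) ** z)

        # Placeholder ecoregion: 120 species, 10,000 km2, three land uses
        loss = countryside_sar_loss(S_org=120, A_org=10_000,
                                    use_areas=[2_000, 1_500, 500],
                                    affinities=[0.3, 0.5, 0.05])
        vulnerability = 0.8   # from endemic richness and IUCN threat levels
        print(f"regional loss {loss:.1f} species; "
              f"vulnerability-weighted impact {loss * vulnerability:.1f}")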

  1. Quantifying local cerebral blood flow by N-isopropyl-p-(123I)iodoamphetamine (IMP) tomography

    SciTech Connect

    Kuhl, D.E.; Barrio, J.R.; Huang, S.C.; Selin, C.; Ackermann, R.F.; Lear, J.L.; Wu, J.L.; Lin, T.H.; Phelps, M.E.

    1982-03-01

    A model was validated wherein local cerebral blood flow (LCBF) in humans was quantified by single-photon emission computed tomography (SPECT) with intravenously injected N-isopropyl-p-(123I)iodoamphetamine (IMP), combined with a modification of the classic method of arterial input sampling. After intravenous injection of IMP in rat, autoradiograms of the brain showed activity distributions in the pattern of LCBF. IMP was nearly completely removed on first pass through monkey brain after intracarotid injection (CBF = 33 ml/100 g/min) and washed out with a half-time of approximately 1 hr. When the modified method of arterial input and tissue-sample counting was applied to dog brain, there was good correspondence between LCBF based on IMP and that based on microsphere injection over a wide flow range. In applying the method to human subjects using SPECT, whole-brain CBF measured 47.2 +/- 5.4 ml/100 g/min (mean +/- s.d., N = 5), stable gray-white distinction persisted for over 1 hr, and the half-time for brain washout was approximately 1 hr. Perfusion deficits in patients were clearly demonstrated and quantified, comparing well with results now available from positron ECT.

  2. Chimpanzees (Pan troglodytes) and bonobos (Pan paniscus) quantify split solid objects.

    PubMed

    Cacchione, Trix; Hrubesch, Christine; Call, Josep

    2013-01-01

    Recent research suggests that gorillas' and orangutans' object representations survive cohesion violations (e.g., a split of a solid object into two halves), but that their processing of quantities may be affected by them. We assessed chimpanzees' (Pan troglodytes) and bonobos' (Pan paniscus) reactions to various fission events in the same series of action tasks modelled after infant studies previously run on gorillas and orangutans (Cacchione and Call in Cognition 116:193-203, 2010b). Results showed that all four non-human great ape species managed to quantify split objects but that their performance varied as a function of the non-cohesiveness produced in the splitting event. Spatial ambiguity and shape invariance had the greatest impact on apes' ability to represent and quantify objects. Further, we observed species differences with gorillas performing lower than other species. Finally, we detected a substantial age effect, with ape infants below 6 years of age being outperformed by both juvenile/adolescent and adult apes.

  3. Quantifying temporal bone morphology of great apes and humans: an approach using geometric morphometrics

    PubMed Central

    Lockwood, Charles A; Lynch, John M; Kimbel, William H

    2002-01-01

    The hominid temporal bone offers a complex array of morphology that is linked to several different functional systems. Its frequent preservation in the fossil record gives the temporal bone added significance in the study of human evolution, but its morphology has proven difficult to quantify. In this study we use techniques of 3D geometric morphometrics to quantify differences among humans and great apes and discuss the results in a phylogenetic context. Twenty-three landmarks on the ectocranial surface of the temporal bone provide a high level of anatomical detail. Generalized Procrustes analysis (GPA) is used to register (adjust for position, orientation and scale) landmark data from 405 adults representing Homo, Pan, Gorilla and Pongo. Principal components analysis of residuals from the GPA shows that the major source of variation is between humans and apes. Human characteristics such as a coronally orientated petrous axis, a deep mandibular fossa, a projecting mastoid process, and reduced lateral extension of the tympanic element strongly impact the analysis. In phenetic cluster analyses, gorillas and orangutans group together with respect to chimpanzees, and all apes group together with respect to humans. Thus, the analysis contradicts depictions of African apes as a single morphotype. Gorillas and orangutans lack the extensive preglenoid surface of chimpanzees, and their mastoid processes are less medially inflected. These and other characters shared by gorillas and orangutans are probably primitive for the African hominid clade. PMID:12489757
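
    For readers unfamiliar with the registration step, the sketch below runs a minimal generalized Procrustes alignment followed by a PCA of the residuals, on synthetic landmark data (10 specimens rather than 405, and random coordinates rather than temporal bone landmarks), using SciPy's pairwise Procrustes routine inside a simple iteration.

        import numpy as np
        from scipy.spatial import procrustes

        def gpa(shapes, iters=5):
            """Minimal generalized Procrustes alignment of (n, k, dim) landmarks."""
            aligned = np.array(shapes, dtype=float)
            mean = aligned[0]
            for _ in range(iters):
                for i in range(len(aligned)):
                    _, aligned[i], _ = procrustes(mean, aligned[i])
                mean = aligned.mean(axis=0)
            return aligned, mean

        rng = np.random.default_rng(2)
        shapes = rng.normal(size=(10, 23, 3))   # synthetic: 23 landmarks in 3-D

        aligned, consensus = gpa(shapes)
        residuals = (aligned - consensus).reshape(len(aligned), -1)

        # PCA of Procrustes residuals = principal components of shape variation
        evals = np.linalg.eigvalsh(np.cov(residuals, rowvar=False))[::-1]
        print("variance explained by PC1:", evals[0] / evals.sum())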

  4. Quantifying the Short-Term Costs of Conservation Interventions for Fishers at Lake Alaotra, Madagascar

    PubMed Central

    Wallace, Andrea P. C.; Milner-Gulland, E. J.; Jones, Julia P. G.; Bunnefeld, Nils; Young, Richard; Nicholson, Emily

    2015-01-01

    Artisanal fisheries are a key source of food and income for millions of people, but if poorly managed, fishing can have declining returns as well as impacts on biodiversity. Management interventions such as spatial and temporal closures can improve fishery sustainability and reduce environmental degradation, but may carry substantial short-term costs for fishers. The Lake Alaotra wetland in Madagascar supports a commercially important artisanal fishery and provides habitat for a Critically Endangered primate and other endemic wildlife of conservation importance. Drawing on detailed data from more than 1,600 fisher catches, we used linear mixed effects models to explore and quantify relationships between catch weight, effort, and spatial and temporal restrictions, to identify drivers of fisher behaviour and to quantify the potential effect of fishing restrictions on catch. We found that restricted area interventions and fishery closures would generate direct short-term costs through reduced catch and income, and that these costs vary between groups of fishers using different gear. Our results show that conservation interventions can have uneven impacts on local people with different fishing strategies. This information can be used to formulate management strategies that minimise the adverse impacts of interventions, increase local support and compliance, and therefore maximise conservation effectiveness. PMID:26107284
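
    A linear mixed effects model of this kind is straightforward to express with statsmodels; the sketch below uses a random intercept per fisher, with entirely made-up catch records and column names standing in for the study's data.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Made-up catch records; columns are illustrative, not the study's fields
        df = pd.DataFrame({
            "catch_kg":   [4.2, 3.1, 5.0, 2.2, 1.9, 2.8, 6.1, 3.3, 4.8, 2.5, 3.9, 2.0],
            "effort_hrs": [5.0, 4.0, 6.0, 3.5, 3.0, 4.0, 7.0, 4.5, 6.0, 3.0, 5.0, 3.5],
            "restricted": [0, 0, 1, 1, 0, 1, 0, 1, 0, 1, 0, 1],
            "gear":       ["net", "net", "trap", "trap", "net", "trap",
                           "net", "trap", "net", "trap", "trap", "net"],
            "fisher_id":  [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
        })

        # Random intercept per fisher handles repeated catches by the same person;
        # the 'restricted' coefficient estimates the short-term cost of closures
        model = smf.mixedlm("catch_kg ~ effort_hrs + restricted + gear",
                            df, groups=df["fisher_id"])
        print(model.fit().summary())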

  5. Comprehensive analysis of individual pulp fiber bonds quantifies the mechanisms of fiber bonding in paper

    NASA Astrophysics Data System (ADS)

    Hirn, Ulrich; Schennach, Robert

    2015-05-01

    The process of papermaking requires substantial amounts of energy and wood, which contributes to its considerable environmental cost. In order to optimize papermaking for its many applications in materials science and engineering, a quantitative understanding of the bonding forces between the individual pulp fibers is important. Here we show the first approach to quantify the bonding energies contributed by the individual bonding mechanisms. We calculated the impact on the bonding energy of the following mechanisms necessary for paper formation: mechanical interlocking, interdiffusion, capillary bridges, hydrogen bonding, van der Waals forces, and Coulomb forces. Experimental results quantify the area in molecular contact necessary for bonding. Atomic force microscopy experiments derive the impact of mechanical interlocking. Capillary bridges also contribute to the bond. A model based on the crystal structure of cellulose leads to values for the chemical bonds. In contrast to the general belief, which favors hydrogen bonding, van der Waals bonds play the most important role according to our model. Comparison with experimentally derived bond energies supports the presented model. This study characterizes bond formation between pulp fibers, leading to insight that could potentially be used to optimize the papermaking process while reducing energy and wood consumption.

  6. Quantifying dynamic sensitivity of optimization algorithm parameters to improve hydrological model calibration

    NASA Astrophysics Data System (ADS)

    Qi, Wei; Zhang, Chi; Fu, Guangtao; Zhou, Huicheng

    2016-02-01

    It is widely recognized that optimization algorithm parameters have significant impacts on algorithm performance, but quantifying this influence is very complex and difficult due to high computational demands and the dynamic nature of search parameters. The overall aim of this paper is to develop a global sensitivity analysis based framework to dynamically quantify the individual and interactive influence of algorithm parameters on algorithm performance. A variance decomposition sensitivity analysis method, Analysis of Variance (ANOVA), is used for sensitivity quantification because it is capable of handling small samples and is more computationally efficient compared with other approaches. The Shuffled Complex Evolution algorithm developed at the University of Arizona (SCE-UA) is selected as the optimization algorithm for investigation, and two criteria, i.e., convergence speed and success rate, are used to measure the performance of SCE-UA. Results show the proposed framework can effectively reveal the dynamic sensitivity of algorithm parameters in the search processes, including individual influences of parameters and their interactive impacts. Interactions between algorithm parameters have significant impacts on SCE-UA performance, which has not been reported in previous research. The proposed framework provides a means to understand the dynamics of algorithm parameter influence, and highlights the significance of considering interactive parameter influence to improve algorithm performance in the search processes.
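
    The variance-decomposition step can be illustrated with a two-way ANOVA over a small factorial sample of two algorithm parameters. In the sketch below the performance scores come from a synthetic response surface rather than real SCE-UA runs, and the parameter names (number of complexes, points per complex) are chosen for illustration.

        import numpy as np
        import pandas as pd
        from statsmodels.formula.api import ols
        from statsmodels.stats.anova import anova_lm

        rng = np.random.default_rng(8)
        rows = []
        for ngs in [2, 5, 10]:              # number of complexes
            for npg in [5, 10, 20]:         # points per complex
                for _ in range(5):          # replicates
                    score = (0.5 + 0.03 * ngs + 0.01 * npg
                             + 0.002 * ngs * npg + rng.normal(0, 0.05))
                    rows.append((ngs, npg, score))
        df = pd.DataFrame(rows, columns=["ngs", "npg", "score"])

        # Two-way ANOVA splits score variance into main and interaction effects
        table = anova_lm(ols("score ~ C(ngs) * C(npg)", data=df).fit())
        print(table["sum_sq"] / table["sum_sq"].sum())   # variance fraction per term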

  7. Quantifying the threat of extinction from Muller's ratchet in the diploid Amazon molly (Poecilia formosa)

    PubMed Central

    2008-01-01

    Background The Amazon molly (Poecilia formosa) is a small unisexual fish that has been suspected of being threatened by extinction from the stochastic accumulation of slightly deleterious mutations that is caused by Muller's ratchet in non-recombining populations. However, no detailed quantification of the extent of this threat is available. Results Here we quantify genomic decay in this fish by using a simple model of Muller's ratchet with the most realistic parameter combinations available employing the evolution@home global computing system. We also describe simple extensions of the standard model of Muller's ratchet that allow us to deal with selfing diploids, triploids and mitotic recombination. We show that Muller's ratchet creates a threat of extinction for the Amazon molly for many biologically realistic parameter combinations. In most cases, extinction is expected to occur within a time frame that is less than previous estimates of the age of the species, leading to a genomic decay paradox. Conclusion How then does the Amazon molly survive? Several biological processes could individually or in combination solve this genomic decay paradox, including paternal leakage of undamaged DNA from sexual sister species, compensatory mutations and many others. More research is needed to quantify the contribution of these potential solutions towards the survival of the Amazon molly and other (ancient) asexual species. PMID:18366680
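
    The standard model referred to above is simple enough to simulate directly: an asexual Wright-Fisher population in which the least-loaded mutation class is lost, click by click. The sketch below uses placeholder parameters (population size, genomic deleterious mutation rate, selection coefficient), not the parameter combinations explored on evolution@home.

        import numpy as np

        rng = np.random.default_rng(42)
        N, U, s, generations = 1000, 0.5, 0.01, 5000   # placeholder parameters

        k = np.zeros(N, dtype=int)        # deleterious mutations per individual
        min_load = []
        for _ in range(generations):
            w = (1.0 - s) ** k                            # multiplicative fitness
            parents = rng.choice(N, size=N, p=w / w.sum())
            k = k[parents] + rng.poisson(U, size=N)       # inherit + mutate
            min_load.append(k.min())

        # Each permanent rise of the minimum load is one irreversible click
        print("ratchet clicks:", min_load[-1] - min_load[0])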

  8. An integrated method for quantifying root architecture of field-grown maize

    PubMed Central

    Wu, Jie; Guo, Yan

    2014-01-01

    Background and Aims A number of techniques have recently been developed for studying the root system architecture (RSA) of seedlings grown in various media. In contrast, methods for sampling and analysis of the RSA of field-grown plants, particularly for details of the lateral root components, are generally inadequate. Methods An integrated methodology was developed that includes a custom-made root-core sampling system for extracting intact root systems of individual maize plants, a combination of proprietary software and a novel program used for collecting individual RSA information, and software for visualizing the measured individual nodal root architecture. Key Results Example experiments show that large root cores can be sampled, and topological and geometrical structure of field-grown maize root systems can be quantified and reconstructed using this method. Second- and higher order laterals are found to contribute substantially to total root number and length. The length of laterals of distinct orders varies significantly. Abundant higher order laterals can arise from a single first-order lateral, and they concentrate in the proximal axile branching zone. Conclusions The new method allows more meaningful sampling than conventional methods because of its easily opened, wide corer and sampling machinery, and effective analysis of RSA using the software. This provides a novel technique for quantifying RSA of field-grown maize and also provides a unique evaluation of the contribution of lateral roots. The method also offers valuable potential for parameterization of root architectural models. PMID:24532646

  9. Quantifying the effect of environment stability on the transcription factor repertoire of marine microbes

    PubMed Central

    2011-01-01

    Background DNA-binding transcription factors (TFs) regulate cellular functions in prokaryotes, often in response to environmental stimuli. Thus, the environment exerts constant selective pressure on the TF gene content of microbial communities. Recently a study on marine Synechococcus strains detected differences in their genomic TF content related to environmental adaptation, but so far the effect of environmental parameters on the content of TFs in bacterial communities has not been systematically investigated. Results We quantified the effect of environment stability on the transcription factor repertoire of marine pelagic microbes from the Global Ocean Sampling (GOS) metagenome using interpolated physico-chemical parameters and multivariate statistics. Thirty-five percent of the difference in relative TF abundances between samples could be explained by environment stability. Six percent was attributable to spatial distance but none to a combination of both spatial distance and stability. Some individual TFs showed a stronger relationship to environment stability and space than the total TF pool. Conclusions Environmental stability appears to have a clearly detectable effect on TF gene content in bacterioplanktonic communities described by the GOS metagenome. Interpolated environmental parameters were shown to compare well to in situ measurements and were essential for quantifying the effect of the environment on the TF content. It is demonstrated that comprehensive and well-structured contextual data will strongly enhance our ability to interpret the functional potential of microbes from metagenomic data. PMID:22587903

  10. Quantifiable outcomes from corporate and higher education learning collaborations

    NASA Astrophysics Data System (ADS)

    Devine, Thomas G.

    The study investigated the existence of measurable learning outcomes that emerged out of the shared strengths of collaborating sponsors. It identified quantifiable learning outcomes that confirm corporate, academic and learner participation in learning collaborations. Each of the three hypotheses and the synergy indicator quantitatively and qualitatively confirmed learning outcomes benefiting participants. The academic indicator quantitatively confirmed that learning outcomes attract learners to the institution. The corporate indicator confirmed that learning outcomes include knowledge exchange and enhanced workforce talents for careers in the energy-utility industry. The learner indicator confirmed that learning outcomes provide professional development opportunities for employment. The synergy indicator confirmed that best learning practices in collaborations emanate from the sponsors' shared strengths, that partnerships can be elevated to strategic alliances that go beyond responding to sponsors' desires, and that organizational processes which elevate sponsors' interactions to shared strength create learner-centered cultures. The study's series of qualitative questions confirmed prior success factors, verified the hypothesis results, and provided insight not available from quantitative data. The direct beneficiaries of the study are the energy-utility learning-collaboration participants and the corporations, academic institutions, and learners of the collaboration. The indirect beneficiaries are the stakeholders of future learning collaborations, through improved knowledge of the existence or absence of quantifiable learning outcomes.

  11. Quantifying complexity in translational research: an integrated approach

    PubMed Central

    Munoz, David A.; Nembhard, Harriet Black; Kraschnewski, Jennifer L.

    2014-01-01

    Purpose This article quantifies complexity in translational research. The impact of major operational steps and technical requirements (TR) is calculated with respect to their ability to accelerate moving new discoveries into clinical practice. Design/Methodology/Approach A three-phase integrated Quality Function Deployment (QFD) and Analytic Hierarchy Process (AHP) method was used to quantify complexity in translational research. A case study in obesity was used to demonstrate usability. Findings Generally, the evidence generated was valuable for understanding various components in translational research. In particular, we found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. Research limitations/implications As the method is mainly based on subjective opinion, some argue that the results may be biased. However, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. Practical implications The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Originality/value Current conceptual models in translational research provide little or no clue to assess complexity. The proposed method aimed to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research. PMID:25417380
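
    The consistency ratio mentioned above has a standard computation: the principal eigenvalue of the pairwise-comparison matrix gives a consistency index, which is normalized by Saaty's random index. A minimal sketch with a made-up 4x4 judgment matrix follows.

        import numpy as np

        # Made-up reciprocal pairwise-comparison matrix (Saaty 1-9 scale)
        A = np.array([[1,   3,   5,   2],
                      [1/3, 1,   2,   1/2],
                      [1/5, 1/2, 1,   1/3],
                      [1/2, 2,   3,   1]], dtype=float)

        evals, evecs = np.linalg.eig(A)
        lam_max = evals.real.max()
        weights = np.abs(evecs[:, evals.real.argmax()].real)
        weights /= weights.sum()                  # priority weights

        n = A.shape[0]
        CI = (lam_max - n) / (n - 1)              # consistency index
        RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]       # Saaty's random index
        CR = CI / RI
        print("weights:", weights.round(3))
        print(f"CR = {CR:.3f} ({'acceptable' if CR < 0.1 else 'revise judgments'})")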

  12. Risk-Quantified Decision-Making at Rocky Flats

    SciTech Connect

    Myers, Jeffrey C.

    2008-01-15

    Surface soils in the 903 Pad Lip Area of the Rocky Flats Environmental Technology Site (RFETS) were contaminated with 239/240Pu by site operations. To meet remediation goals, areas where 239/240Pu activity exceeded the threshold level of 50 pCi/g needed to be accurately delineated from those below 50 pCi/g. In addition, the confidence for remedial decisions needed to be quantified and displayed visually. Remedial objectives needed to achieve a 90 percent certainty that unremediated soils had less than a 10 percent chance of 239/240Pu activity exceeding 50 pCi/g. Removing areas where the chance of exceedance is greater than 10 percent creates a 90 percent confidence in the remedial effort results. To achieve the stipulated goals, the geostatistical approach of probability kriging (Myers 1997) was implemented. Lessons learnt: Geostatistical techniques provided a risk-quantified approach to remedial decision-making and provided visualizations of the excavation area. Error analysis demonstrated compliance and confirmed that more than sufficient soils were removed. Error analysis also illustrated that any soils above the threshold that were not removed would be of nominal activity. These quantitative approaches were useful from a regulatory, engineering, and stakeholder-satisfaction perspective.
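
    Probability kriging proper co-kriges indicator and rank transforms, but its simpler cousin, indicator kriging, conveys the idea: krige the 0/1 exceedance indicator to map the probability of exceeding 50 pCi/g, then flag cells whose probability exceeds 10 percent. The sketch below assumes the third-party PyKrige package and uses synthetic sample locations and activities, not the RFETS data.

        import numpy as np
        from pykrige.ok import OrdinaryKriging   # assumed dependency: PyKrige

        rng = np.random.default_rng(6)

        # Synthetic soil samples: coordinates (m) and Pu activity (pCi/g)
        x, y = rng.uniform(0, 100, 60), rng.uniform(0, 100, 60)
        activity = 40 + 0.4 * x + rng.normal(0, 10, 60)

        indicator = (activity > 50.0).astype(float)   # exceedance indicator

        ok = OrdinaryKriging(x, y, indicator, variogram_model="spherical")
        gridx = gridy = np.arange(0.0, 100.0, 5.0)
        prob, _ = ok.execute("grid", gridx, gridy)

        # Remediate cells with >10% exceedance chance -> 90% confidence target
        print("fraction of cells flagged:", float((prob > 0.10).mean()))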

  13. Quantifying relative diver effects in underwater visual censuses.

    PubMed

    Dickens, Luke C; Goatley, Christopher H R; Tanner, Jennifer K; Bellwood, David R

    2011-01-01

    Diver-based Underwater Visual Censuses (UVCs), particularly transect-based surveys, are key tools in the study of coral reef fish ecology. These techniques, however, have inherent problems that make it difficult to collect accurate numerical data. One of these problems is the diver effect (defined as the reaction of fish to a diver). Although widely recognised, its effects have yet to be quantified and the extent of taxonomic variation remains to be determined. We therefore examined relative diver effects on a reef fish assemblage on the Great Barrier Reef. Using common UVC methods, the recorded abundances of seven reef fish groups were significantly affected by the ongoing presence of SCUBA divers. Overall, the diver effect resulted in a 52% decrease in the mean number of individuals recorded, with declines of up to 70% in individual families. Although the diver effect appears to be a significant problem, UVCs remain a useful approach for quantifying spatial and temporal variation in relative fish abundances, especially if using methods that minimise the exposure of fishes to divers. Fixed distance transects using tapes or lines deployed by a second diver (or GPS-calibrated timed swims) would appear to maximise fish counts and minimise diver effects.

  14. Methods for quantifying uncertainty in fast reactor analyses.

    SciTech Connect

    Fanning, T. H.; Fischer, P. F.

    2008-04-07

    Liquid-metal-cooled fast reactors in the form of sodium-cooled fast reactors have been successfully built and tested in the U.S. and throughout the world. However, no fast reactor has operated in the U.S. for nearly fourteen years. More importantly, the U.S. has not constructed a fast reactor in nearly 30 years. In addition to reestablishing the necessary industrial infrastructure, the development, testing, and licensing of a new, advanced fast reactor concept will likely require a significant base technology program that will rely more heavily on modeling and simulation than has been done in the past. The ability to quantify uncertainty in modeling and simulations will be an important part of any experimental program and can provide added confidence that established design limits and safety margins are appropriate. In addition, there is an increasing demand from the nuclear industry for best-estimate analysis methods to provide confidence bounds along with their results. The ability to quantify uncertainty will be an important component of modeling that is used to support design, testing, and experimental programs. Three avenues of UQ investigation are proposed. Two relatively new approaches are described which can be directly coupled to simulation codes currently being developed under the Advanced Simulation and Modeling program within the Reactor Campaign. A third approach, based on robust Monte Carlo methods, can be used in conjunction with existing reactor analysis codes as a means of verification and validation of the more detailed approaches.

  15. Quantifying Relative Diver Effects in Underwater Visual Censuses

    PubMed Central

    Dickens, Luke C.; Goatley, Christopher H. R.; Tanner, Jennifer K.; Bellwood, David R.

    2011-01-01

    Diver-based Underwater Visual Censuses (UVCs), particularly transect-based surveys, are key tools in the study of coral reef fish ecology. These techniques, however, have inherent problems that make it difficult to collect accurate numerical data. One of these problems is the diver effect (defined as the reaction of fish to a diver). Although widely recognised, its effects have yet to be quantified and the extent of taxonomic variation remains to be determined. We therefore examined relative diver effects on a reef fish assemblage on the Great Barrier Reef. Using common UVC methods, the recorded abundances of seven reef fish groups were significantly affected by the ongoing presence of SCUBA divers. Overall, the diver effect resulted in a 52% decrease in the mean number of individuals recorded, with declines of up to 70% in individual families. Although the diver effect appears to be a significant problem, UVCs remain a useful approach for quantifying spatial and temporal variation in relative fish abundances, especially if using methods that minimise the exposure of fishes to divers. Fixed distance transects using tapes or lines deployed by a second diver (or GPS-calibrated timed swims) would appear to maximise fish counts and minimise diver effects. PMID:21533039

  16. Live cell interferometry quantifies dynamics of biomass partitioning during cytokinesis.

    PubMed

    Zangle, Thomas A; Teitell, Michael A; Reed, Jason

    2014-01-01

    The equal partitioning of cell mass between daughters is the usual and expected outcome of cytokinesis for self-renewing cells. However, most studies of partitioning during cell division have focused on daughter cell shape symmetry or segregation of chromosomes. Here, we use live cell interferometry (LCI) to quantify the partitioning of daughter cell mass during and following cytokinesis. We use adherent and non-adherent mouse fibroblast and mouse and human lymphocyte cell lines as models and show that, on average, mass asymmetries present at the time of cleavage furrow formation persist through cytokinesis. The addition of multiple cytoskeleton-disrupting agents leads to increased asymmetry in mass partitioning which suggests the absence of active mass partitioning mechanisms after cleavage furrow positioning. PMID:25531652

  17. Quantifying spin Hall angles from spin pumping: experiments and theory.

    PubMed

    Mosendz, O; Pearson, J E; Fradin, F Y; Bauer, G E W; Bader, S D; Hoffmann, A

    2010-01-29

    Spin Hall effects intermix spin and charge currents even in nonmagnetic materials and, therefore, ultimately may allow the use of spin transport without the need for ferromagnets. We show how spin Hall effects can be quantified by integrating Ni80Fe20|normal metal (N) bilayers into a coplanar waveguide. A dc spin current in N can be generated in a controllable way by spin pumping under ferromagnetic resonance. The transverse dc voltage detected along the Ni80Fe20|N bilayer has contributions from both the anisotropic magnetoresistance and the spin Hall effect, which can be distinguished by their symmetries. We developed a theory that accounts for both. In this way, we determine the spin Hall angle quantitatively for Pt, Au, and Mo. This approach can readily be adapted to any conducting material with even very small spin Hall angles.

  18. Quantifying Age-dependent Extinction from Species Phylogenies.

    PubMed

    Alexander, Helen K; Lambert, Amaury; Stadler, Tanja

    2016-01-01

    Several ecological factors that could play into species extinction are expected to correlate with species age, i.e., time elapsed since the species arose by speciation. To date, however, statistical tools to incorporate species age into likelihood-based phylogenetic inference have been lacking. We present here a computational framework to quantify age-dependent extinction through maximum likelihood parameter estimation based on phylogenetic trees, assuming species lifetimes are gamma distributed. Testing on simulated trees shows that neglecting age dependence can lead to biased estimates of key macroevolutionary parameters. We then apply this method to two real data sets, namely a complete phylogeny of birds (class Aves) and a clade of self-compatible and -incompatible nightshades (Solanaceae), gaining initial insights into the extent to which age-dependent extinction may help explain macroevolutionary patterns. Our methods have been added to the R package TreePar. PMID:26405218
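
    A minimal sketch of the distributional assumption only, not of the phylogenetic likelihood itself (which the paper implements in the R package TreePar): fit gamma-distributed lifetimes by maximum likelihood and compare the fitted shape against the exponential special case. All values are illustrative.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Simulate species lifetimes under an assumed gamma law.
    true_shape, true_scale = 2.5, 4.0  # illustrative values, in Myr
    lifetimes = rng.gamma(true_shape, true_scale, size=500)

    # Maximum likelihood fit; loc is pinned to 0 since lifetimes are positive.
    shape_hat, _, scale_hat = stats.gamma.fit(lifetimes, floc=0)
    print(f"shape = {shape_hat:.2f}, scale = {scale_hat:.2f}")

    # shape == 1 recovers the exponential (age-independent) model, so a
    # fitted shape far from 1 signals age-dependent extinction.
    ```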

  19. Quantifying chaotic dynamics from integrate-and-fire processes

    SciTech Connect

    Pavlov, A. N.; Pavlova, O. N.; Mohammad, Y. K.; Kurths, J.

    2015-01-15

    Characterizing chaotic dynamics from integrate-and-fire (IF) interspike intervals (ISIs) is relatively straightforward at high firing rates. When the firing rate is low, correctly estimating the Lyapunov exponents (LEs) that describe the dynamical features of the complex oscillations reflected in IF ISI sequences becomes more complicated. In this work we discuss the peculiarities and limitations of quantifying chaotic dynamics from IF point processes. We consider the main factors leading to underestimated LEs and demonstrate a way of improving the numerical determination of LEs from IF ISI sequences. We show that estimates of the two largest LEs can be obtained using around 400 mean periods of chaotic oscillations in the regime of phase-coherent chaos. Application to real data is discussed.
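
    The sketch below shows one standard way to estimate the largest LE from a scalar sequence such as ISIs: a Rosenstein-style average of the log-divergence of embedded nearest neighbours. It is not the authors' exact procedure; the embedding and horizon parameters are illustrative, and the O(n^2) distance matrix limits it to short series.

    ```python
    import numpy as np

    def largest_lyapunov(x, dim=2, tau=1, theiler=10, horizon=5):
        """Rosenstein-style largest-LE estimate from a scalar series x."""
        n = len(x) - (dim - 1) * tau
        # Time-delay embedding into dim-dimensional state vectors.
        emb = np.array([x[i:i + n] for i in range(0, dim * tau, tau)]).T
        # Nearest neighbour of each point, excluding temporal neighbours.
        d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
        for i in range(n):
            d[i, max(0, i - theiler):min(n, i + theiler + 1)] = np.inf
        nn = d.argmin(axis=1)
        # Mean log separation of neighbour pairs k steps ahead.
        div = []
        for k in range(1, horizon + 1):
            idx = np.arange(n)
            ok = (idx + k < n) & (nn + k < n)
            sep = np.linalg.norm(emb[idx[ok] + k] - emb[nn[ok] + k], axis=1)
            div.append(np.log(sep[sep > 0]).mean())
        # The slope of the divergence curve estimates the largest LE.
        return np.polyfit(np.arange(1, horizon + 1), div, 1)[0]

    # Check on the fully chaotic logistic map (largest LE = ln 2, about 0.69).
    x = np.empty(2000); x[0] = 0.4
    for i in range(1999):
        x[i + 1] = 4.0 * x[i] * (1.0 - x[i])
    print(largest_lyapunov(x))
    ```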

  20. The Use of Micro-CT with Image Segmentation to Quantify Leakage in Dental Restorations

    PubMed Central

    Carrera, Carola A.; Lan, Caixia; Escobar-Sanabria, David; Li, Yuping; Rudney, Joel; Aparicio, Conrado; Fok, Alex

    2015-01-01

    Objective To develop a method for quantifying leakage in composite resin restorations after curing, using non-destructive X-ray micro-computed tomography (micro-CT) and image segmentation. Methods Class-I cavity preparations were made in 20 human third molars, which were divided into 2 groups. Group I was restored with Z100 and Group II with Filtek LS. Micro-CT scans were taken for both groups before and after they were submerged in silver nitrate solution (AgNO3 50%) to reveal any interfacial gap and leakage at the tooth restoration interface. Image segmentation was carried out by first performing image correlation to align the before- and after-treatment images and then by image subtraction to isolate the silver nitrate penetrant for precise volume calculation. Two-tailed Student’s t-test was used to analyze the results, with the level of significance set at p<0.05. Results All samples from Group I showed silver nitrate penetration with a mean volume of 1.3 ± 0.7 mm3. In Group II, only 2 out of the 10 restorations displayed infiltration along the interface, giving a mean volume of 0.3 ± 0.3 mm3. The difference between the two groups was statistically significant (p < 0.05). The infiltration showed non-uniform patterns within the interface. Significance We have developed a method to quantify the volume of leakage using non-destructive micro-CT, silver nitrate infiltration and image segmentation. Our results confirmed that substantial leakage could occur in composite restorations that have imperfections in the adhesive layer or interfacial debonding through polymerization shrinkage. For the restorative systems investigated in this study, this occurred mostly at the interface between the adhesive system and the tooth structure. PMID:25649496
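
    The volume-calculation step of the described pipeline reduces to subtracting the registered before/after scans, thresholding the brightened voxels, and multiplying by the voxel volume. A minimal sketch, assuming the scans are already aligned and using an illustrative grey-level threshold and voxel size:

    ```python
    import numpy as np

    def leakage_volume(before, after, voxel_mm=0.015, threshold=120):
        """Penetrant volume (mm^3) from two registered micro-CT volumes."""
        diff = after.astype(np.int32) - before.astype(np.int32)
        penetrant = diff > threshold          # voxels brightened by AgNO3
        return penetrant.sum() * voxel_mm**3

    # Synthetic check: a 10-voxel interfacial gap filled by penetrant.
    before = np.zeros((50, 50, 50), dtype=np.uint16)
    after = before.copy()
    after[25, 25, 20:30] = 500
    print(leakage_volume(before, after))      # 10 * 0.015**3 mm^3
    ```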

  1. Quantifying Permafrost Characteristics with DCR-ERT

    NASA Astrophysics Data System (ADS)

    Schnabel, W.; Trochim, E.; Munk, J.; Kanevskiy, M. Z.; Shur, Y.; Fortier, R.

    2012-12-01

    Geophysical methods are efficient tools for quantifying permafrost characteristics for Arctic road design and engineering. In the Alaskan Arctic, the construction and maintenance of roads requires accounting for permafrost: ground that remains at or below 0 °C for two or more years. Features such as ice content and temperature are critical for understanding current and future ground conditions in the planning, design and evaluation of engineering applications. This study focused on the proposed Foothills West Transportation Access project corridor, the purpose of which is to construct a new all-season road connecting the Dalton Highway to Umiat. Four major areas were chosen that represented a range of conditions, including gravel bars, alluvial plains, tussock tundra (both unburned and burned), high- and low-centered ice-wedge polygons and an active thermokarst feature. Direct-current resistivity using galvanic contact (DCR-ERT) was applied over transects. In conjunction, complementary site data including boreholes, active layer depths, vegetation descriptions and site photographs were obtained. The boreholes provided information on soil morphology, ice texture and gravimetric moisture content. Horizontal and vertical resolutions in the DCR-ERT were varied to determine the presence or absence of ground ice; subsurface heterogeneity; and the depth to groundwater (if present). The four main DCR-ERT configurations used were: 84 electrodes with 2 m spacing; 42 electrodes with 0.5 m spacing; 42 electrodes with 2 m spacing; and 84 electrodes with 1 m spacing. In terms of identifying ground ice characteristics, the higher-horizontal-resolution DCR-ERT transects with either 42 or 84 electrodes and 0.5 or 1 m spacing were best able to differentiate wedge ice. This evaluation is based on a combination of borehole stratigraphy and surface characteristics. Simulated apparent resistivity values for permafrost areas varied from a low of 4582 Ω m to a high of 10034 Ω m. Previous

  2. Quantifying the impacts of global disasters

    NASA Astrophysics Data System (ADS)

    Jones, L. M.; Ross, S.; Wilson, R. I.; Borrero, J. C.; Brosnan, D.; Bwarie, J. T.; Geist, E. L.; Hansen, R. A.; Johnson, L. A.; Kirby, S. H.; Long, K.; Lynett, P. J.; Miller, K. M.; Mortensen, C. E.; Perry, S. C.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Thio, H. K.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2012-12-01

    The US Geological Survey, National Oceanic and Atmospheric Administration, California Geological Survey, and other entities are developing a Tsunami Scenario, depicting a realistic outcome of a hypothetical but plausible large tsunami originating in the eastern Aleutian Arc, affecting the west coast of the United States, including Alaska and Hawaii. The scenario includes earth-science effects, damage and restoration of the built environment, and social and economic impacts. Like the earlier ShakeOut and ARkStorm disaster scenarios, the purpose of the Tsunami Scenario is to apply science to quantify the impacts of natural disasters in a way that can be used by decision makers in the affected sectors to reduce the potential for loss. Most natural disasters are local. A major hurricane can destroy a city or damage a long swath of coastline while mostly sparing inland areas. The largest earthquake on record caused strong shaking along 1500 km of Chile, but left the capital relatively unscathed. Previous scenarios have used the local nature of disasters to focus interaction with the user community. However, the capacity for global disasters is growing with the interdependency of the global economy. Earthquakes have disrupted global computer chip manufacturing and caused stock market downturns. Tsunamis, however, can be global in their extent and direct impact. Moreover, the vulnerability of seaports to tsunami damage can increase the global consequences. The Tsunami Scenario is trying to capture the widespread effects while maintaining the close interaction with users that has been one of the most successful features of the previous scenarios. The scenario tsunami occurs in the eastern Aleutians with a source similar to the 2011 Tohoku event. Geologic similarities support the argument that a Tohoku-like source is plausible in Alaska. It creates a major nearfield tsunami in the Aleutian arc and peninsula, a moderate tsunami in the US Pacific Northwest, large but not the

  3. Quantifying Total Electron Content Forecasts during Ionospheric Storms

    NASA Astrophysics Data System (ADS)

    Meng, X.; Mannucci, A. J.; Verkhoglyadova, O. P.; Tsurutani, B.

    2015-12-01

    We make total electron content (TEC) predictions with the Global Ionosphere-Thermosphere Model (GITM), in order to explore the feasibility of ionospheric forecasts with the current generation of physics-based models. For a number of representative ionospheric storms, we perform GITM simulations in a forecast mode. The simulations are driven by solar wind conditions at 1 AU from either in-situ observations or predictions by one of the physics-based heliospheric models ENLIL, CORHEL, and SWMF. A TEC metric has been developed to quantify forecasted storm-time TEC disturbances. To evaluate the forecasts, we compare the simulation results with Global Positioning System satellite observations. We introduce methods to deduce the primary factors responsible for the forecasted TEC response. Statistical results are obtained and analyzed for different types of storms.

  4. Quantifying uncertainty in discharge measurements: A new approach

    USGS Publications Warehouse

    Kiang, J.E.; Cohn, T.A.; Mason, R.R.

    2009-01-01

    The accuracy of discharge measurements using velocity meters and the velocity-area method is typically assessed based on empirical studies that may not correspond to conditions encountered in practice. In this paper, a statistical approach for assessing uncertainty based on interpolated variance estimation (IVE) is introduced. The IVE method quantifies all sources of random uncertainty in the measured data. This paper presents results employing data from sites where substantial over-sampling allowed for the comparison of IVE-estimated uncertainty and observed variability among repeated measurements. These results suggest that the IVE approach can provide approximate estimates of measurement uncertainty. The use of IVE to estimate the uncertainty of a discharge measurement would provide the hydrographer an immediate determination of uncertainty and help determine whether there is a need for additional sampling in problematic river cross sections. © 2009 ASCE.

  5. Synoptic relationships quantified between surface Chlorophyll-a and diagnostic pigments specific to phytoplankton functional types

    NASA Astrophysics Data System (ADS)

    Hirata, T.; Hardman-Mountford, N. J.; Brewin, R. J. W.; Aiken, J.; Barlow, R.; Suzuki, K.; Isada, T.; Howell, E.; Hashioka, T.; Noguchi-Aita, M.; Yamanaka, Y.

    2010-09-01

    Error-quantified, synoptic-scale relationships between chlorophyll-a (Chla) and phytoplankton pigment groups at the sea surface are presented. A total of nine pigment groups were considered to represent nine phytoplankton functional types (PFTs): microplankton, nanoplankton, picoplankton, diatoms, dinoflagellates, green algae, picoeukaryotes, prokaryotes and Prochlorococcus sp. The observed relationships between Chla and the pigment groups were well defined at the global scale, showing that Chla can be used as an index not only of phytoplankton abundance but also of community structure; large (micro) phytoplankton monotonically increase as Chla increases, whereas the small (pico) phytoplankton community generally decreases. Within these relationships, we also found non-monotonic variations with Chla for certain picoplankton (picoeukaryotes, prokaryotes and Prochlorococcus sp.) and for green algae and nano-sized phytoplankton. The relationships were quantified with a least-squares fitting approach in order to estimate the PFTs from Chla alone. The estimated uncertainty of the quantified relationships depends on both the phytoplankton type and the Chla concentration. The maximum uncertainty over all groups (34.7% Chla) was found for diatoms at approximately Chla = 1.07 mg m-3. However, the mean uncertainty of the relationships over all groups was 5.8 [% Chla] over the entire Chla range observed (0.02 < Chla < 6.84 mg m-3). The relationships were applied to SeaWiFS satellite Chla data from 1998 to 2009 to show the global climatological fields of the surface distribution of PFTs. Results show that microplankton are present in the mid and high latitudes, constituting ~9.0 [% Chla] of the phytoplankton community at the global surface, of which diatoms explain ~6.0 [% Chla]. Nanoplankton are ubiquitous throughout much of the global surface ocean except the subtropical gyres, acting as a background population and constituting ~44.2 [% Chla]. Picoplankton are mostly limited in subtropical

  6. Molecular Marker Approach on Characterizing and Quantifying Charcoal in Environmental Media

    NASA Astrophysics Data System (ADS)

    Kuo, L.; Herbert, B. E.; Louchouarn, P.

    2006-12-01

    Black carbon (BC) is widely distributed in natural environments including soils, sediments, freshwater, seawater and the atmosphere. It is produced mostly from the incomplete combustion of fossil fuels and vegetation. In recent years, increasing attention has been given to BC due to its potential influence on many biogeochemical processes. In the environment, BC exists as a continuum ranging from partly charred plant materials and charcoal residues to highly condensed soot and graphite particles. The heterogeneous nature of black carbon means that BC is always operationally defined, highlighting the need for standard methods that support data comparisons. Unlike soot and graphite, which can be quantified with well-established methods, charcoal is difficult to quantify directly in geologic media due to its chemical and physical heterogeneity. Most of the available charcoal quantification methods detect unknown fractions of the BC continuum. To specifically identify and quantify charcoal in soils and sediments, we adopted and validated an innovative molecular marker approach that quantifies levoglucosan, a pyrogenic derivative of cellulose, as a proxy for charcoal. Levoglucosan is source-specific and stable, and can be detected at low concentrations using a gas chromatograph-mass spectrometer (GC-MS). In the present study, two different plant species, honey mesquite and cordgrass, were selected as the raw materials to synthesize charcoals. The lab-synthesized charcoals were made under controlled conditions to eliminate the high heterogeneity often found in natural charcoals. The effects of two major combustion factors, temperature and duration, on the yield of levoglucosan were characterized in the lab-synthesized charcoals. Our results showed that significant levoglucosan production in the two types of charcoal was restricted to relatively low combustion temperatures (150-350 °C). The combustion duration did not cause significant differences in the yield of

  7. Quantifying moisture transport in cementitious materials using neutron radiography

    NASA Astrophysics Data System (ADS)

    Lucero, Catherine L.

    It has been found through this study that small pores, namely voids created by chemical shrinkage, gel pores, and capillary pores, ranging from 0.5 nm to 50 µm, fill quickly through capillary action. However, large entrapped and entrained air voids, ranging from 0.05 to 1.25 mm, remain empty during the initial filling process. In mortar exposed to calcium chloride solution, a decrease in sorptivity was observed due to an increase in the viscosity and surface tension of the solution, as proposed by Spragg et al. (2011). This work, however, also noted a decrease in the rate of absorption due to a reaction between the salt and the matrix, which results in the filling of the pores in the concrete. The results from neutron imaging can help in the interpretation of standard absorption tests. ASTM C1585 test results can be further analyzed in several ways that could give an accurate indication of the durability of the concrete. Results can be reported as depth of penetration versus the square root of time rather than mm3 of fluid per mm2 of exposed surface area. Since a known fraction of the pores fills initially before the front reaches the edge of the sample, the actual depth of penetration can be calculated, as sketched below. This work is compared with an 'intrinsic sorptivity' that can be used to interpret mass measurements. Furthermore, the influence of shrinkage-reducing admixtures (SRAs) on drying was studied. Neutron radiographs showed that systems saturated in water remain "wetter" than systems saturated in 5% SRA solution. The SRA in the system reduces the moisture diffusion coefficient due to an increase in viscosity and a decrease in surface tension. Neutron radiography provided spatial information on the drying front that cannot be achieved using other methods.
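
    A minimal sketch of that depth conversion, under the stated assumption that a known, fixed fraction of the porosity fills behind a sharp wetting front; the porosity and filled-fraction values here are hypothetical:

    ```python
    import numpy as np

    def penetration_depth(absorption_mm, porosity, filled_fraction):
        """Convert ASTM C1585 absorption i (mm^3 of fluid per mm^2 of
        exposed surface, i.e. mm) to an approximate front depth."""
        return absorption_mm / (porosity * filled_fraction)

    t = np.array([60.0, 300.0, 1200.0, 3600.0])  # s
    i = 0.1 * np.sqrt(t / 3600.0)                # illustrative absorption, mm
    print(penetration_depth(i, porosity=0.12, filled_fraction=0.8))  # mm
    ```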

  8. Life cycle assessment of urban wastewater systems: Quantifying the relative contribution of sewer systems.

    PubMed

    Risch, Eva; Gutierrez, Oriol; Roux, Philippe; Boutin, Catherine; Corominas, Lluís

    2015-06-15

    This study aims to propose a holistic life cycle assessment (LCA) of urban wastewater systems (UWS) based on a comprehensive inventory including the detailed construction and operation of sewer systems and wastewater treatment plants (WWTPs). For the first time, the inventory of sewer infrastructure construction includes piping materials and aggregates, manholes, connections, civil works and road rehabilitation. The operation stage comprises energy consumption in pumping stations together with air emissions of methane and hydrogen sulphide, and water emissions from sewer leaks. Using a real case study, this LCA aims to quantify the contributions of sewer systems to the total environmental impacts of the UWS. The results show that the construction of sewer infrastructures has an environmental impact (on half of the 18 studied impact categories) larger than both the construction and operation of the WWTP. This study highlights the importance of including the construction and operation of sewer systems in the environmental assessment of centralised versus decentralised options for UWS.

  9. Quantifying Nanomolar Protein Concentrations Using Designed DNA Carriers and Solid-State Nanopores.

    PubMed

    Kong, Jinglin; Bell, Nicholas A W; Keyser, Ulrich F

    2016-06-01

    Designed "DNA carriers" have been proposed as a new method for nanopore based specific protein detection. In this system, target protein molecules bind to a long DNA strand at a defined position creating a second level transient current drop against the background DNA translocation. Here, we demonstrate the ability of this system to quantify protein concentrations in the nanomolar range. After incubation with target protein at different concentrations, the fraction of DNA translocations showing a secondary current spike allows for the quantification of the corresponding protein concentration. For our proof-of-principle experiments we use two standard binding systems, biotin-streptavidin and digoxigenin-antidigoxigenin, that allow for measurements of the concentration down to the low nanomolar range. The results demonstrate the potential for a novel quantitative and specific protein detection scheme using the DNA carrier method.

  10. Life cycle assessment of urban wastewater systems: Quantifying the relative contribution of sewer systems.

    PubMed

    Risch, Eva; Gutierrez, Oriol; Roux, Philippe; Boutin, Catherine; Corominas, Lluís

    2015-06-15

    This study aims to propose a holistic life cycle assessment (LCA) of urban wastewater systems (UWS) based on a comprehensive inventory including the detailed construction and operation of sewer systems and wastewater treatment plants (WWTPs). For the first time, the inventory of sewer infrastructure construction includes piping materials and aggregates, manholes, connections, civil works and road rehabilitation. The operation stage comprises energy consumption in pumping stations together with air emissions of methane and hydrogen sulphide, and water emissions from sewer leaks. Using a real case study, this LCA aims to quantify the contributions of sewer systems to the total environmental impacts of the UWS. The results show that the construction of sewer infrastructures has an environmental impact (on half of the 18 studied impact categories) larger than both the construction and operation of the WWTP. This study highlights the importance of including the construction and operation of sewer systems in the environmental assessment of centralised versus decentralised options for UWS. PMID:25839834

  11. QUANTIFYING ATYPICALITY IN AFFECTIVE FACIAL EXPRESSIONS OF CHILDREN WITH AUTISM SPECTRUM DISORDERS.

    PubMed

    Metallinou, Angeliki; Grossman, Ruth B; Narayanan, Shrikanth

    2013-01-01

    We focus on the analysis, quantification and visualization of atypicality in the affective facial expressions of children with High Functioning Autism (HFA). We examine facial Motion Capture data from typically developing (TD) children and children with HFA, using various statistical methods, including Functional Data Analysis, in order to quantify atypical expression characteristics and uncover patterns of expression evolution in the two populations. Our results show that children with HFA display higher asynchrony of motion between facial regions, rougher facial and head motion, and a larger range of facial region motion. Overall, subjects with HFA consistently display a wider variability in the expressive facial gestures that they employ. Our analysis demonstrates the utility of computational approaches for understanding behavioral data and brings new insights into the autism domain regarding the atypicality that is often associated with the facial expressions of subjects with HFA.

  12. A freely available semi-automated method for quantifying retinal ganglion cells in entire retinal flatmounts.

    PubMed

    Geeraerts, E; Dekeyster, E; Gaublomme, D; Salinas-Navarro, M; De Groef, L; Moons, L

    2016-06-01

    Glaucomatous optic neuropathies are characterized by the progressive loss of retinal ganglion cells (RGCs), the neurons that connect the eye to the brain. Quantification of these RGCs is a cornerstone of experimental optic neuropathy research and is commonly performed by manually counting parts of the retina. However, this is a time-consuming process subject to inter- and intra-observer variability. Here we present a freely available ImageJ script to semi-automatically quantify RGCs in entire retinal flatmounts after immunostaining for the RGC-specific transcription factor Brn3a. The blob-like signal of Brn3a-immunopositive RGCs is enhanced via the eigenvalues of the Hessian matrix, and the resulting local maxima are counted as RGCs. After the user has outlined the retinal flatmount area, the total RGC number and retinal area are reported and an isodensity map, showing the RGC density distribution across the retina, is created. The semi-automated quantification shows a very strong correlation (Pearson's r ≥ 0.99) with manual counts for both widefield and confocal images, thereby validating the data generated via the developed script. Moreover, application of this method in established glaucomatous optic neuropathy models such as N-methyl-D-aspartate-induced excitotoxicity, optic nerve crush and laser-induced ocular hypertension revealed RGC loss consistent with the literature. Compared to manual counting, the described automated quantification method is faster and shows user-independent consistency. Furthermore, as the script detects the RGC number in entire retinal flatmounts, the method allows detection of regional differences in RGC density. As such, it can help advance research investigating the degenerative mechanisms of glaucomatous optic neuropathies and the effectiveness of new neuroprotective treatments. Because the script is flexible and easy to optimize due to a low number of critical parameters, it can potentially be applied in combination with other tissues or
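
    A rough Python analogue of the script's counting strategy (blob enhancement via Hessian eigenvalues, then local-maxima counting); the scale, threshold, and spacing parameters are illustrative, and the real script additionally handles the user-drawn retina outline:

    ```python
    import numpy as np
    from skimage import feature, filters

    def count_blobs(img, sigma=2.0, min_distance=5):
        # Bright blobs give strongly negative Hessian eigenvalues, so
        # negate the smallest eigenvalue to get a "blobness" map.
        H = feature.hessian_matrix(img, sigma=sigma, order="rc")
        blobness = -feature.hessian_matrix_eigvals(H)[-1]
        blobness[blobness < 0] = 0
        # Count local maxima above an automatic (Otsu) threshold.
        peaks = feature.peak_local_max(
            blobness, min_distance=min_distance,
            threshold_abs=filters.threshold_otsu(blobness))
        return len(peaks)

    # Synthetic test image: 30 Gaussian "nuclei" on background noise.
    rng = np.random.default_rng(5)
    img = rng.normal(0.0, 0.05, (200, 200))
    yy, xx = np.mgrid[:200, :200]
    for r, c in rng.integers(10, 190, (30, 2)):
        img += np.exp(-((yy - r) ** 2 + (xx - c) ** 2) / (2 * 2.0 ** 2))
    print(count_blobs(img))  # expect roughly the number of seeded spots
    ```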

  13. Path Similarity Analysis: A Method for Quantifying Macromolecular Pathways

    PubMed Central

    Seyler, Sean L.; Kumar, Avishek; Thorpe, M. F.; Beckstein, Oliver

    2015-01-01

    Diverse classes of proteins function through large-scale conformational changes, and various sophisticated computational algorithms have been proposed to enhance sampling of these macromolecular transition paths. Because such paths are curves in a high-dimensional space, it has been difficult to quantitatively compare multiple paths, a necessary prerequisite for assessing, for instance, the quality of different algorithms. We introduce a method named Path Similarity Analysis (PSA) that enables us to quantify the similarity between two arbitrary paths and extract the atomic-scale determinants responsible for their differences. PSA utilizes the full information available in 3N-dimensional configuration space trajectories by employing the Hausdorff or Fréchet metrics (adopted from computational geometry) to quantify the degree of similarity between piecewise-linear curves. It thus completely avoids relying on projections into low-dimensional spaces, as used in traditional approaches. To elucidate the principles of PSA, we quantified the effect of path roughness induced by thermal fluctuations using a toy model system. Using, as an example, the closed-to-open transitions of the enzyme adenylate kinase (AdK) in its substrate-free form, we compared a range of protein transition path-generating algorithms. Molecular dynamics-based dynamic importance sampling (DIMS MD) and targeted MD (TMD), as well as the purely geometric FRODA (Framework Rigidity Optimized Dynamics Algorithm), were tested along with seven other methods publicly available on servers, including several based on the popular elastic network model (ENM). PSA with clustering revealed that paths produced by a given method are more similar to each other than to those from another method and, for instance, that the ENM-based methods produced relatively similar paths. PSA applied to ensembles of DIMS MD and FRODA trajectories of the conformational transition of diphtheria toxin, a particularly challenging example, showed that
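
    The core distance computation in PSA is readily sketched with SciPy's directed Hausdorff routine; the toy paths below stand in for (n_frames, 3N) trajectory arrays that have already been aligned to a common reference:

    ```python
    import numpy as np
    from scipy.spatial.distance import directed_hausdorff

    def hausdorff_path_distance(P, Q):
        """Symmetric Hausdorff distance between two paths, each given
        as an (n_frames, n_coordinates) array of configurations."""
        return max(directed_hausdorff(P, Q)[0], directed_hausdorff(Q, P)[0])

    # Toy example: two noisy paths in a 6-dimensional configuration space.
    rng = np.random.default_rng(2)
    t = np.linspace(0.0, 1.0, 100)[:, None]
    path_a = np.hstack([t, t ** 2, t, t, t, t]) + 0.01 * rng.normal(size=(100, 6))
    path_b = np.hstack([t] * 6) + 0.01 * rng.normal(size=(100, 6))
    print(f"Hausdorff distance = {hausdorff_path_distance(path_a, path_b):.3f}")
    ```

    Pairwise distances over an ensemble of such paths give the matrix that PSA feeds to clustering.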

  14. An experimental study quantifying pulmonary ventilation on inhalation of aerosol under steady and episodic emission.

    PubMed

    Poon, Carmen K M; Lai, Alvin C K

    2011-09-15

    Estimating inhalation dose accurately under realistic conditions can enhance the accuracy of risk assessment. Conventional methods quantify the aerosol concentration to which susceptible victims in contaminated environments are exposed by using real-time particle counters to measure concentrations in unoccupied environments. Breathing-induced airflow, however, interacts with and influences the concentration around the nostrils or mouth and alters the ultimate exposure. This subject has not yet been systematically studied, particularly under transient emission. In this work, an experimental facility comprising two manikins was designed and fabricated. One of them mimicked realistic breathing, acting as a susceptible victim. Both steady and episodic emissions were generated in an air-conditioned environmental chamber in which two different ventilation schemes were tested. The scaled dose of the victim under different expiratory velocities and pulmonary ventilation rates was measured. From the results of comprehensive tests, it can be concluded that breathing has a very significant influence on the ultimate dose compared with no breathing. The majority of results show that breathing reduces the inhaled quantity and that the magnitude of the reduction increases with breathing rate. This is attributed to the exhalation process playing a more significant role in reducing the dose level than the enhancing effect during the inhalation period. The higher the breathing rate, the sharper the decline in the resultant concentration, leading to a lower dose. Nevertheless, under low pulmonary ventilation, results show that breathing increases the dose marginally. Results also reveal that the ventilation scheme affects the exposure as well.

  15. Quantifying the Relationship Between Financial News and the Stock Market

    PubMed Central

    Alanyali, Merve; Moat, Helen Susannah; Preis, Tobias

    2013-01-01

    The complex behavior of financial markets emerges from decisions made by many traders. Here, we exploit a large corpus of daily print issues of the Financial Times from 2nd January 2007 until 31st December 2012 to quantify the relationship between decisions taken in financial markets and developments in financial news. We find a positive correlation between the daily number of mentions of a company in the Financial Times and the daily transaction volume of a company's stock both on the day before the news is released, and on the same day as the news is released. Our results provide quantitative support for the suggestion that movements in financial markets and movements in financial news are intrinsically interlinked. PMID:24356666

  16. Quantifying the Behavior of Stock Correlations Under Market Stress

    PubMed Central

    Preis, Tobias; Kenett, Dror Y.; Stanley, H. Eugene; Helbing, Dirk; Ben-Jacob, Eshel

    2012-01-01

    Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios. PMID:23082242
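
    A simplified sketch of the central measurement: the mean pairwise correlation of component stocks in a window, paired with the index return over the same window. The synthetic prices below merely stand in for the DJIA data.

    ```python
    import numpy as np
    import pandas as pd

    def mean_correlation_vs_stress(prices, index_prices, window=60):
        returns = np.log(prices).diff().dropna()
        idx_ret = np.log(index_prices).diff().dropna()
        rows = []
        for end in range(window, len(returns) + 1, window):
            block = returns.iloc[end - window:end]
            corr = block.corr().to_numpy()
            mean_corr = corr[np.triu_indices_from(corr, k=1)].mean()
            # Windowed index return, normalized by its overall volatility.
            stress = idx_ret.iloc[end - window:end].sum() / idx_ret.std()
            rows.append((stress, mean_corr))
        return pd.DataFrame(rows, columns=["stress", "mean_correlation"])

    # Synthetic demo: 30 stocks sharing a common market factor.
    rng = np.random.default_rng(6)
    market = rng.normal(0.0, 0.01, 800)
    stock_ret = rng.normal(0.0, 0.01, (800, 30)) + market[:, None]
    prices = np.exp(pd.DataFrame(stock_ret).cumsum())
    index_prices = np.exp(pd.Series(market).cumsum())
    print(mean_correlation_vs_stress(prices, index_prices).head())
    ```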

  17. Quantifying Interparticle Forces and Heterogeneity in 3D Granular Materials

    NASA Astrophysics Data System (ADS)

    Hurley, R. C.; Hall, S. A.; Andrade, J. E.; Wright, J.

    2016-08-01

    Interparticle forces in granular materials are intimately linked to mechanical properties and are known to self-organize into heterogeneous structures, or force chains, under external load. Despite progress in understanding the statistics and spatial distribution of interparticle forces in recent decades, a systematic method for measuring forces in opaque, three-dimensional (3D), frictional, stiff granular media has yet to emerge. In this Letter, we present results from an experiment that combines 3D x-ray diffraction, x-ray tomography, and a numerical force inference technique to quantify interparticle forces and their heterogeneity in an assembly of quartz grains undergoing a one-dimensional compression cycle. Forces exhibit an exponential decay above the mean and partition into strong and weak networks. We find a surprising inverse relationship between macroscopic load and the heterogeneity of interparticle forces, despite the clear emergence of two force chains that span the system.

  18. Quantifying Interparticle Forces and Heterogeneity in 3D Granular Materials.

    PubMed

    Hurley, R C; Hall, S A; Andrade, J E; Wright, J

    2016-08-26

    Interparticle forces in granular materials are intimately linked to mechanical properties and are known to self-organize into heterogeneous structures, or force chains, under external load. Despite progress in understanding the statistics and spatial distribution of interparticle forces in recent decades, a systematic method for measuring forces in opaque, three-dimensional (3D), frictional, stiff granular media has yet to emerge. In this Letter, we present results from an experiment that combines 3D x-ray diffraction, x-ray tomography, and a numerical force inference technique to quantify interparticle forces and their heterogeneity in an assembly of quartz grains undergoing a one-dimensional compression cycle. Forces exhibit an exponential decay above the mean and partition into strong and weak networks. We find a surprising inverse relationship between macroscopic load and the heterogeneity of interparticle forces, despite the clear emergence of two force chains that span the system.

  19. Quantifying the relationship between financial news and the stock market.

    PubMed

    Alanyali, Merve; Moat, Helen Susannah; Preis, Tobias

    2013-01-01

    The complex behavior of financial markets emerges from decisions made by many traders. Here, we exploit a large corpus of daily print issues of the Financial Times from 2nd January 2007 until 31st December 2012 to quantify the relationship between decisions taken in financial markets and developments in financial news. We find a positive correlation between the daily number of mentions of a company in the Financial Times and the daily transaction volume of a company's stock both on the day before the news is released, and on the same day as the news is released. Our results provide quantitative support for the suggestion that movements in financial markets and movements in financial news are intrinsically interlinked.

  20. Quantifying the Impact of Unavailability in Cyber-Physical Environments

    SciTech Connect

    Aissa, Anis Ben; Abercrombie, Robert K; Sheldon, Federick T.; Mili, Ali

    2014-01-01

    The Supervisory Control and Data Acquisition (SCADA) system discussed in this work manages a distributed control network for the Tunisian Electric & Gas Utility. The network is dispersed over a large geographic area and monitors and controls the flow of electricity/gas from both remote and centralized locations. The availability of the SCADA system in this context is critical to ensuring the uninterrupted delivery of energy, including safety, security, continuity of operations and revenue. Such SCADA systems are the backbone of national critical cyber-physical infrastructures. Herein, we propose adapting the Mean Failure Cost (MFC) metric to quantify the cost of unavailability. This new metric combines the classic availability formulation with the MFC. The resulting metric, termed Econometric Availability (EA), offers a computational basis for evaluating a system in terms of the gain/loss ($/hour of operation) that unavailability causes each stakeholder.

  1. Quantifying light exposure patterns in young adult students

    NASA Astrophysics Data System (ADS)

    Alvarez, Amanda A.; Wildsoet, Christine F.

    2013-08-01

    Exposure to bright light appears to be protective against myopia in both animals (chicks, monkeys) and children, but quantitative data on human light exposure are limited. In this study, we report on a technique for quantifying light exposure using wearable sensors. Twenty-seven young adult subjects wore a light sensor continuously for two weeks during one of three seasons, and also completed questionnaires about their visual activities. Light data were analyzed with respect to refractive error and season, and the objective sensor data were compared with subjects' estimates of time spent indoors and outdoors. Subjects' estimates of time spent indoors and outdoors were in poor agreement with durations reported by the sensor data. The results of questionnaire-based studies of light exposure should thus be interpreted with caution. The role of light in refractive error development should be investigated using multiple methods such as sensors to complement questionnaires.

  2. Quantifying adhesion energy of mechanical coatings at atomistic scale

    NASA Astrophysics Data System (ADS)

    Yin, Deqiang; Peng, Xianghe; Qin, Yi; Feng, Jiling; Wang, Zhongchang

    2011-12-01

    Coatings of transition metal compounds find widespread technological applications where adhesion is known to influence or control functionality. Here, using first-principles calculations, we propose a new way to assess adhesion in coatings and apply it to analyze the TiN coating. We find that the calculated adhesion energies of both the (1 1 1) and (0 0 1) orientations are small in the absence of residual stress, yet increase linearly once stress is imposed, suggesting that residual stress is key to adhesion. The strengthened adhesion is attributed to the stress-induced shrinkage of neighbouring bonds, which results in stronger interactions between bonds in TiN coatings. A further finite element (FEM) simulation based on the calculated adhesion energy reproduces well the initial cracking process observed in nano-indentation experiments, thereby validating the application of this approach to quantifying the adhesion energy of surface coating systems.

  3. Muscle ultrasound quantifies segmental neuromuscular outcome in pediatric myelomeningocele.

    PubMed

    Verbeek, Renate J; Hoving, Eelco W; Maurits, Natalia M; Brouwer, Oebele F; van der Hoeven, Johannes H; Sival, Deborah A

    2014-01-01

    In pediatric spina bifida aperta (SBA), non-invasive assessment of neuromuscular integrity by muscle ultrasound density (MUD) could provide important information about the clinical condition. We therefore aimed to determine the association between pediatric SBA MUD and segmental neurologic function. We included 23 children (age range: 1-18 y) with SBA with L4-5 lesions, and we associated SBA MUD with control values and segmental neuromuscular function. Results revealed that MUD outcomes in the lower extremities: (i) are independent of age, (ii) exceed control values, (iii) differ intra-individually (i.e., between the left and right sides in the same individual) in association with segmental neuromuscular function. We concluded that SBA leg MUD can quantify the segmental neuromuscular condition throughout childhood.

  4. Quantifying the impact of molecular defects on polymer network elasticity.

    PubMed

    Zhong, Mingjiang; Wang, Rui; Kawamoto, Ken; Olsen, Bradley D; Johnson, Jeremiah A

    2016-09-16

    Elasticity, one of the most important properties of a soft material, is difficult to quantify in polymer networks because of the presence of topological molecular defects in these materials. Furthermore, the impact of these defects on bulk elasticity is unknown. We used rheology, disassembly spectrometry, and simulations to measure the shear elastic modulus and count the numbers of topological "loop" defects of various order in a series of polymer hydrogels, and then used these data to evaluate the classical phantom and affine network theories of elasticity. The results led to a real elastic network theory (RENT) that describes how loop defects affect bulk elasticity. Given knowledge of the loop fractions, RENT provides predictions of the shear elastic modulus that are consistent with experimental observations. PMID:27634530

  5. Quantifying the Behavior of Stock Correlations Under Market Stress

    NASA Astrophysics Data System (ADS)

    Preis, Tobias; Kenett, Dror Y.; Stanley, H. Eugene; Helbing, Dirk; Ben-Jacob, Eshel

    2012-10-01

    Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios.

  6. Quantifying the behavior of stock correlations under market stress.

    PubMed

    Preis, Tobias; Kenett, Dror Y; Stanley, H Eugene; Helbing, Dirk; Ben-Jacob, Eshel

    2012-01-01

    Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios.

  7. Identifying and quantifying interactions in a laboratory swarm

    NASA Astrophysics Data System (ADS)

    Puckett, James; Kelley, Douglas; Ouellette, Nicholas

    2013-03-01

    Emergent collective behavior, such as in flocks of birds or swarms of bees, is exhibited throughout the animal kingdom. Many models have been developed to describe swarming and flocking behavior using systems of self-propelled particles obeying simple rules or interacting via various potentials. However, due to experimental difficulties and constraints, little empirical data exists for characterizing the exact form of the biological interactions. We study laboratory swarms of flying Chironomus riparius midges, using stereoimaging and particle tracking techniques to record three-dimensional trajectories for all the individuals in the swarm. We describe methods to identify and quantify interactions by examining these trajectories, and report results on interaction magnitude, frequency, and mutuality.

  8. Quantifying Interparticle Forces and Heterogeneity in 3D Granular Materials.

    PubMed

    Hurley, R C; Hall, S A; Andrade, J E; Wright, J

    2016-08-26

    Interparticle forces in granular materials are intimately linked to mechanical properties and are known to self-organize into heterogeneous structures, or force chains, under external load. Despite progress in understanding the statistics and spatial distribution of interparticle forces in recent decades, a systematic method for measuring forces in opaque, three-dimensional (3D), frictional, stiff granular media has yet to emerge. In this Letter, we present results from an experiment that combines 3D x-ray diffraction, x-ray tomography, and a numerical force inference technique to quantify interparticle forces and their heterogeneity in an assembly of quartz grains undergoing a one-dimensional compression cycle. Forces exhibit an exponential decay above the mean and partition into strong and weak networks. We find a surprising inverse relationship between macroscopic load and the heterogeneity of interparticle forces, despite the clear emergence of two force chains that span the system. PMID:27610890

   9. UV-vis spectra as an alternative to the Lowry method for quantifying hair damage induced by surfactants.

    PubMed

    Pires-Oliveira, Rafael; Joekes, Inés

    2014-11-01

    It is well known that long-term use of shampoo causes damage to human hair. Although the Lowry method has been widely used to quantify hair damage, it is unsuitable for determining this in the presence of some surfactants, and no other method has been proposed in the literature. In this work, a different method is used to investigate and compare the hair damage induced by four types of surfactants (including three commercial-grade surfactants) and water. Hair samples were immersed in aqueous solutions of the surfactants under conditions that resemble a shower (38 °C, constant shaking). These solutions become colored with time of contact with hair, and their UV-vis spectra were recorded. For comparison, the amounts of protein extracted from hair by sodium dodecyl sulfate (SDS) and by water were estimated by the Lowry method. Additionally, non-pigmented vs. pigmented hair and also sepia melanin were used to understand the washing solution color and the corresponding spectra. The results presented herein show that hair degradation is mostly caused by the extraction of proteins, cuticle fragments and melanin granules from the hair fiber. It was found that the intensity of the solution color varies with the charge density of the surfactant. Furthermore, the intensity of the solution color can be correlated to the amount of protein quantified by the Lowry method as well as to the degree of hair damage. The UV-vis spectrum of hair washing solutions is a simple and straightforward means to quantify and compare hair damage induced by different commercial surfactants.

  10. Quantifying complexity of the chaotic regime of a semiconductor laser subject to feedback via information theory measures

    NASA Astrophysics Data System (ADS)

    Soriano, Miguel C.; Zunino, Luciano; Rosso, Osvaldo A.; Mirasso, Claudio R.

    2010-04-01

    The time evolution of the output of a semiconductor laser subject to optical feedback can exhibit high-dimensional chaotic fluctuations. In this contribution, our aim is to quantify the complexity of the chaotic time-trace generated by a semiconductor laser subject to delayed optical feedback. To that end, we discuss the properties of two recently introduced complexity measures based on information theory, namely the permutation entropy (PE) and the statistical complexity measure (SCM). The PE and SCM are defined as a functional of a symbolic probability distribution, evaluated using the Bandt-Pompe recipe to assign a probability distribution function to the time series generated by the chaotic system. In order to evaluate the performance of these novel complexity quantifiers, we compare them to a more standard chaos quantifier, namely the Kolmogorov-Sinai entropy. Here, we present numerical results showing that the statistical complexity and the permutation entropy, evaluated at the different time-scales involved in the chaotic regime of the laser subject to optical feedback, give valuable information about the complexity of the laser dynamics.
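
    Of the two quantifiers, the permutation entropy is simple enough to sketch directly from the Bandt-Pompe recipe (the statistical complexity additionally needs a disequilibrium term). The embedding order and delay below are illustrative.

    ```python
    import numpy as np
    from math import factorial

    def permutation_entropy(x, order=4, delay=1, normalize=True):
        """Bandt-Pompe permutation entropy of a scalar time series."""
        x = np.asarray(x)
        n = len(x) - (order - 1) * delay
        # Ordinal pattern of each embedding vector, encoded via argsort.
        patterns = np.array([np.argsort(x[i:i + order * delay:delay])
                             for i in range(n)])
        _, counts = np.unique(patterns, axis=0, return_counts=True)
        p = counts / counts.sum()
        h = -np.sum(p * np.log2(p))
        return h / np.log2(factorial(order)) if normalize else h

    # Deterministic chaos yields intermediate values; white noise tends to 1.
    x = np.empty(5000); x[0] = 0.4
    for i in range(4999):
        x[i + 1] = 4.0 * x[i] * (1.0 - x[i])
    print(permutation_entropy(x, order=4))
    ```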

  11. Design and Analysis of a Micromechanical Three-Component Force Sensor for Characterizing and Quantifying Surface Roughness

    NASA Astrophysics Data System (ADS)

    Liang, Q.; Wu, W.; Zhang, D.; Wei, B.; Sun, W.; Wang, Y.; Ge, Y.

    2015-10-01

    Roughness, which can represent the trade-off between manufacturing cost and performance of mechanical components, is a critical predictor of cracks, corrosion and fatigue damage. In order to measure polished or super-finished surfaces, a novel touch probe based on three-component force sensor for characterizing and quantifying surface roughness is proposed by using silicon micromachining technology. The sensor design is based on a cross-beam structure, which ensures that the system possesses high sensitivity and low coupling. The results show that the proposed sensor possesses high sensitivity, low coupling error, and temperature compensation function. The proposed system can be used to investigate micromechanical structures with nanometer accuracy.

  12. Quantifying Qualitative Data Using Cognitive Maps

    ERIC Educational Resources Information Center

    Scherp, Hans-Ake

    2013-01-01

    The aim of the article is to show how substantial qualitative material consisting of graphic cognitive maps can be analysed by using digital CmapTools, Excel and SPSS. Evidence is provided of how qualitative and quantitative methods can be combined in educational research by transforming qualitative data into quantitative data to facilitate…

  13. Quantifying induced effects of subsurface renewable energy storage

    NASA Astrophysics Data System (ADS)

    Bauer, Sebastian; Beyer, Christof; Pfeiffer, Tilmann; Boockmeyer, Anke; Popp, Steffi; Delfs, Jens-Olaf; Wang, Bo; Li, Dedong; Dethlefsen, Frank; Dahmke, Andreas

    2015-04-01

    New methods and technologies for energy storage are required for the transition to renewable energy sources. Subsurface energy storage systems such as salt caverns or porous formations offer the possibility of hosting large amounts of energy or substance. When employing these systems, an adequate system and process understanding is required in order to assess the feasibility of the individual storage option at the respective site and to predict the complex and interacting effects induced. This understanding is the basis for assessing the potential as well as the risks connected with a sustainable usage of these storage options, especially when considering possible mutual influences. For achieving this aim, in this work synthetic scenarios for the use of the geological underground as an energy storage system are developed and parameterized. The scenarios are designed to represent typical conditions in North Germany. The types of subsurface use investigated here include gas storage and heat storage in porous formations. The scenarios are numerically simulated and interpreted with regard to risk analysis and effect forecasting. For this, the numerical simulators Eclipse and OpenGeoSys are used. The latter is enhanced to include the required coupled hydraulic, thermal, geomechanical and geochemical processes. Using the simulated and interpreted scenarios, the induced effects are quantified individually and monitoring concepts for observing these effects are derived. This presentation will detail the general investigation concept used and analyze the parameter availability for this type of model applications. Then the process implementation and numerical methods required and applied for simulating the induced effects of subsurface storage are detailed and explained. Application examples show the developed methods and quantify induced effects and storage sizes for the typical settings parameterized. This work is part of the ANGUS+ project, funded by the German Ministry

  14. The SEGUE K Giant Survey. III. Quantifying Galactic Halo Substructure

    NASA Astrophysics Data System (ADS)

    Janesh, William; Morrison, Heather L.; Ma, Zhibo; Rockosi, Constance; Starkenburg, Else; Xue, Xiang Xiang; Rix, Hans-Walter; Harding, Paul; Beers, Timothy C.; Johnson, Jennifer; Lee, Young Sun; Schneider, Donald P.

    2016-01-01

    We statistically quantify the amount of substructure in the Milky Way stellar halo using a sample of 4568 halo K giant stars at Galactocentric distances ranging over 5-125 kpc. These stars have been selected photometrically and confirmed spectroscopically as K giants from the Sloan Digital Sky Survey’s Sloan Extension for Galactic Understanding and Exploration project. Using a position-velocity clustering estimator (the 4distance) and a model of a smooth stellar halo, we quantify the amount of substructure in the halo, divided by distance and metallicity. Overall, we find that the halo as a whole is highly structured. We also confirm earlier work using blue horizontal branch (BHB) stars which showed that there is an increasing amount of substructure with increasing Galactocentric radius, and additionally find that the amount of substructure in the halo increases with increasing metallicity. Comparing to resampled BHB stars, we find that K giants and BHBs have similar amounts of substructure over equivalent ranges of Galactocentric radius. Using a friends-of-friends algorithm to identify members of individual groups, we find that a large fraction (~33%) of grouped stars are associated with Sgr, and identify stars belonging to other halo star streams: the Orphan Stream, the Cetus Polar Stream, and others, including previously unknown substructures. A large fraction of sample K giants (more than 50%) are not grouped into any substructure. We find also that the Sgr stream strongly dominates groups in the outer halo for all except the most metal-poor stars, and suggest that this is the source of the increase of substructure with Galactocentric radius and metallicity.
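
    The grouping step can be sketched with a generic friends-of-friends pass: link any two stars closer than a chosen length in the clustering space (here plain 3D positions; the 4distance also folds in velocities), then take connected components. The linking length and mock data are illustrative.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.sparse import coo_matrix
    from scipy.sparse.csgraph import connected_components

    def friends_of_friends(points, linking_length):
        """Label points so that any pair closer than linking_length
        shares a group (transitively)."""
        n = len(points)
        pairs = np.array(list(cKDTree(points).query_pairs(r=linking_length)))
        if pairs.size == 0:
            return np.arange(n)  # every point isolated
        graph = coo_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])),
                           shape=(n, n))
        return connected_components(graph, directed=False)[1]

    # Mock halo: a smooth component plus one compact "stream" clump.
    rng = np.random.default_rng(3)
    stars = np.vstack([rng.normal(0.0, 1.0, (100, 3)),
                       rng.normal(8.0, 0.3, (40, 3))])
    labels = friends_of_friends(stars, linking_length=0.7)
    print(f"{labels.max() + 1} groups found")
    ```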

  15. Using Insar to Quantify Seasonal Fluctuations in Landslide Velocity, Eel River, Northern California

    NASA Astrophysics Data System (ADS)

    Handwerger, A. L.; Roering, J. J.; Schmidt, D. A.

    2011-12-01

    Large, slow-moving, deep-seated landslides are hydrologically driven and respond to precipitation over seasonal time scales. Precipitation causes changes in pore pressure, which alters effective stress and landslide velocity. Here, we use InSAR to quantify changes in landslide velocity for 32 landslides between February 2007 and January 2011 in the Eel River catchment, northern California. We investigate relationships between lithology, landslide properties (including aspect ratio, planform area, and depth) and landslide dynamics. The time series behavior of each landslide was calculated by performing an inversion of small-baseline interferograms. We produced 165 differential interferograms with a minimum satellite return interval of 46 days using ALOS PALSAR data from tracks 223 and 224 with the ROI_PAC processing package. Climatic data and geologic maps were provided by NOAA and the California State Geological Survey, respectively. For each landslide we analyzed the planform area, depth, slope, and drainage area using DEMs derived from LiDAR and SRTM data. To quantify the resolution of our time series methodology, we performed a sensitivity analysis using a synthetic data set to determine the minimum detectable temporal signal given the temporal distribution of interferograms. This analysis shows that the temporal sampling of the data set is sufficient to resolve a seasonal signal with a wavelength of ~1 year, which is consistent with the expected seasonal response time of these landslides. Preliminary results show that, holding lithology and climate constant, landslides move continuously through the year, accelerating well into the wet season and decelerating during the dry season with a lag time of weeks to months. The 32 identified landslides move at line-of-sight rates ranging from 0.1 m yr-1 to 0.45 m yr-1, and have dimensions ranging from 0.5 to 5 km long and 0.27 to 3 km wide. Each landslide has distinct kinematic zones (e.g. source, transport, toe) that

  16. Quantifying Unnecessary Normal Tissue Complication Risks due to Suboptimal Planning: A Secondary Study of RTOG 0126

    SciTech Connect

    Moore, Kevin L.; Schmidt, Rachel; Moiseenko, Vitali; Olsen, Lindsey A.; Tan, Jun; Xiao, Ying; Galvin, James; Pugh, Stephanie; Seider, Michael J.; Dicker, Adam P.; Bosch, Walter; Michalski, Jeff; Mutic, Sasa

    2015-06-01

    Purpose: The purpose of this study was to quantify the frequency and clinical severity of quality deficiencies in intensity modulated radiation therapy (IMRT) planning in the Radiation Therapy Oncology Group 0126 protocol. Methods and Materials: A total of 219 IMRT patients from the high-dose arm (79.2 Gy) of RTOG 0126 were analyzed. To quantify plan quality, we used established knowledge-based methods for patient-specific dose-volume histogram (DVH) prediction of organs at risk and a Lyman-Kutcher-Burman (LKB) model for grade ≥2 rectal complications to convert DVHs into normal tissue complication probabilities (NTCPs). The LKB model was validated by fitting dose-response parameters relative to observed toxicities. The 10% of plans (22 of 219) with the lowest excess risk (difference between clinical and model-predicted NTCP) were used to create a model of the presumed best practices in the protocol (pDVH_{0126,top10%}). Applying the resultant model to the entire sample enabled comparisons between DVHs that patients could have received and DVHs they actually received. Excess risk quantified the clinical impact of suboptimal planning. Accuracy of pDVH predictions was validated by replanning 30 of 219 patients (13.7%), including equal numbers of presumed “high-quality,” “low-quality,” and randomly sampled plans. NTCP-predicted toxicities were compared to adverse events on protocol. Results: Existing models showed that bladder-sparing variations were less prevalent than rectum quality variations and that increased rectal sparing was not correlated with target metrics (dose received by 98% and 2% of the PTV, respectively). Observed toxicities were consistent with current LKB parameters. Converting DVH and pDVH_{0126,top10%} to rectal NTCPs, we observed 94 of 219 patients (42.9%) with ≥5% excess risk, 20 of 219 patients (9.1%) with ≥10% excess risk, and 2 of 219 patients (0.9%) with ≥15% excess risk. Replanning demonstrated the
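
    Converting a DVH into an NTCP with the Lyman-Kutcher-Burman model takes only a few lines. The sketch below uses commonly quoted rectal parameter values (n, m, TD50) for illustration rather than the parameters fitted in this study, and a made-up differential DVH.

    ```python
    import numpy as np
    from math import erf, sqrt

    def lkb_ntcp(doses, volumes, n=0.09, m=0.13, td50=76.9):
        """Lyman-Kutcher-Burman NTCP from a differential DVH.

        doses: bin doses (Gy); volumes: fractional volume per bin (sums
        to 1). n, m, td50 are illustrative rectal values, not the fit
        reported in this study."""
        geud = (np.sum(volumes * doses ** (1.0 / n))) ** n   # generalized EUD
        t = (geud - td50) / (m * td50)
        return 0.5 * (1.0 + erf(t / sqrt(2.0)))              # normal CDF

    # Toy differential DVH: most rectal volume at low dose, hot tail near 79 Gy.
    doses = np.linspace(5.0, 79.0, 38)
    volumes = np.exp(-doses / 30.0)
    volumes /= volumes.sum()
    print(f"NTCP = {lkb_ntcp(doses, volumes):.3f}")
    ```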

  17. Quantifying polypeptide conformational space: sensitivity to conformation and ensemble definition.

    PubMed

    Sullivan, David C; Lim, Carmay

    2006-08-24

    Quantifying the density of conformations over phase space (the conformational distribution) is needed to model important macromolecular processes such as protein folding. In this work, we quantify the conformational distribution for a simple polypeptide (N-mer polyalanine) using the cumulative distribution function (CDF), which gives the probability that two randomly selected conformations are separated by less than a given "conformational" distance; its inverse gives conformation counts as a function of conformational radius. An important finding is that the conformation counts obtained by the CDF inverse depend critically on the assignment of a conformation's distance span and the ensemble (e.g., unfolded state model): varying the ensemble and conformation definition (1 → 2 Å) varies the CDF-based conformation counts for Ala50 from 10^11 to 10^69. In particular, relatively short molecular dynamics (MD) relaxation of Ala50's random-walk ensemble reduces the number of conformers from 10^55 to 10^14 (using a 1 Å root-mean-square-deviation radius conformation definition), pointing to potential disconnections in comparing the results from simplified models of unfolded proteins with those from all-atom MD simulations. Explicit waters are found to roughen the landscape considerably. Under some common conformation definitions, the results herein provide (i) an upper limit to the number of accessible conformations that compose unfolded states of proteins, (ii) the optimal clustering radius/conformation radius for counting conformations for a given energy and solvent model, (iii) a means of comparing various studies, and (iv) an assessment of the applicability of random search in protein folding.
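
    A minimal sketch of the CDF-based counting idea: estimate the probability p(r) that two randomly chosen conformations lie within a conformational radius r, then read off roughly 1/p(r) distinct conformations. Random vectors stand in for real structures here, so no best-fit RMSD superposition is performed.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    # Stand-in "conformations": rows are flattened coordinate vectors. A
    # real analysis would use best-fit RMSD between structures instead.
    X = rng.normal(size=(500, 30))

    # Distances between all distinct conformation pairs.
    i, j = np.triu_indices(len(X), k=1)
    dists = np.linalg.norm(X[i] - X[j], axis=1)

    def cdf(r):
        """Probability that two random conformations lie within radius r."""
        return float(np.mean(dists < r))

    # Inverse-CDF counting: if a fraction p of random pairs falls within one
    # conformational radius, roughly 1/p distinct conformations fit the space.
    for r in (4.0, 6.0, 8.0):
        p = cdf(r)
        if p > 0.0:
            print(f"radius {r}: CDF = {p:.4f}, ~{1.0 / p:.1f} conformations")
        else:
            print(f"radius {r}: no pairs this close in the sample")
    ```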

  18. Quantifying the Nonlinear, Anisotropic Material Response of Spinal Ligaments

    NASA Astrophysics Data System (ADS)

    Robertson, Daniel J.

    Spinal ligaments may be a significant source of chronic back pain, yet they are often disregarded by the clinical community due to a lack of information regarding their material response and innervation characteristics. The purpose of this dissertation was to characterize the material response of spinal ligaments and to review their innervation characteristics. Review of relevant literature revealed that all of the major spinal ligaments are innervated. They cause painful sensations when irritated and provide reflexive control of the deep spinal musculature. As such, including the neurologic implications of iatrogenic ligament damage in the evaluation of surgical procedures aimed at relieving back pain will likely result in more effective long-term solutions. The material response of spinal ligaments has not previously been fully quantified due to limitations associated with standard soft tissue testing techniques. The present work presents and validates a novel testing methodology capable of overcoming these limitations. In particular, the anisotropic, inhomogeneous material constitutive properties of the human supraspinous ligament are quantified and methods for determining the response of the other spinal ligaments are presented. In addition, a method for determining the anisotropic, inhomogeneous pre-strain distribution of the spinal ligaments is presented. The multi-axial pre-strain distributions of the human anterior longitudinal ligament, ligamentum flavum and supraspinous ligament were determined using this methodology. Results from this work clearly demonstrate that spinal ligaments are not uniaxial structures, and that finite element models which account for pre-strain and incorporate the ligaments' complex material properties may provide increased fidelity to the in vivo condition.

  19. Quantifying Particle Numbers and Mass Flux in Drifting Snow

    NASA Astrophysics Data System (ADS)

    Crivelli, Philip; Paterna, Enrico; Horender, Stefan; Lehning, Michael

    2016-06-01

    We compare two of the most common methods of quantifying mass flux, particle numbers, and particle-size distribution for drifting snow events: the snow-particle counter (SPC), a laser-diode-based particle detector, and particle tracking velocimetry based on digital shadowgraphic imaging. The two methods were correlated for mass flux and particle number flux. For the SPC measurements, the device was calibrated by the manufacturer beforehand. The shadowgraphic imaging method measures particle size and velocity directly from consecutive images, and before each new test the image pixel length is newly calibrated. A calibration study with artificially scattered sand particles and glass beads provided suitable settings for the shadowgraphic imaging and a first correlation of the two methods in a controlled environment. In addition, using snow collected in trays during snowfall, several experiments were performed to observe drifting snow events in a cold wind tunnel. The results demonstrate a high correlation between the mass flux obtained for the calibration studies (r ≥ 0.93) and good correlation for the drifting snow experiments (r ≥ 0.81). The impact of measurement settings is discussed in order to reliably quantify particle numbers and mass flux in drifting snow. The study was designed and performed to optimize the settings of the digital shadowgraphic imaging system for both the acquisition and the processing of particles in a drifting snow event. Our results suggest that these optimal settings can be transferred to different imaging set-ups to investigate sediment transport processes.

  20. An organotypic spinal cord slice culture model to quantify neurodegeneration.

    PubMed

    Ravikumar, Madhumitha; Jain, Seema; Miller, Robert H; Capadona, Jeffrey R; Selkirk, Stephen M

    2012-11-15

    Activated microglia cells have been implicated in the neurodegenerative process of Alzheimer's disease, Parkinson's disease, Huntington's disease, amyotrophic lateral sclerosis, and multiple sclerosis; however, the precise roles of microglia in disease progression are unclear. Despite these diseases having been described for more than a century, current FDA-approved therapeutics are symptomatic in nature, with little evidence supporting a neuroprotective effect. Furthermore, identifying novel therapeutics remains challenging due to undetermined etiology, a variable disease course, and the paucity of validated targets. Here, we describe the use of a novel ex vivo spinal cord culture system that offers the ability to screen potential neuroprotective agents while maintaining the complexity of the in vivo environment. To this end, we treated spinal cord slice cultures with lipopolysaccharide and quantified neuron viability in culture using measurements of axon length and FluoroJadeC intensity. To simulate a microglia-mediated response to cellular debris, antigens, or implanted materials/devices, we supplemented the culture media with increasing densities of microspheres, facilitating microglia-mediated phagocytosis of the particles, which demonstrated a direct correlation between the phagocytic activities of microglia and neuronal health. To validate our model's capacity to accurately depict neuroprotection, cultures were treated with resveratrol, which demonstrated enhanced neuronal health. Our results successfully demonstrate the use of this model to reproducibly quantify the extent of neurodegeneration through the measurement of axon length and FluoroJadeC intensity, and we suggest this model will allow for accurate, high-throughput screening, which could result in expedited success in translational efficacy of therapeutic agents to clinical trials.

  1. Quantifying the role of forest soil and bedrock in the acid neutralization of surface water in steep hillslopes.

    PubMed

    Asano, Yuko; Uchida, Taro

    2005-02-01

    The role of soil and bedrock in acid neutralizing processes has been difficult to quantify because of hydrological and biogeochemical uncertainties. To quantify those roles, hydrochemical observations were conducted at two hydrologically well-defined, steep granitic hillslopes in the Tanakami Mountains of Japan. These paired hillslopes are similar except for their soils; Fudoji is leached of base cations (base saturation <6%), while Rachidani is covered with fresh soil (base saturation >30%), because the erosion rate is 100-1000 times greater. The results showed that (1) soil solution pH at the soil-bedrock interface at Fudoji (4.3) was significantly lower than that of Rachidani (5.5), (2) the hillslope discharge pH in both hillslopes was similar (6.7-6.8), and (3) at Fudoji, 60% of the base cations leaching from the hillslope were derived from bedrock, whereas only 20% were derived from bedrock in Rachidani. Further, previously published results showed that the stream pH could not be predicted from the acid deposition rate and soil base saturation status. These results demonstrate that bedrock plays an especially important role when the overlying soil has been leached of base cations. These results indicate that while the status of soil acidification is a first-order control on vulnerability to surface water acidification, in some cases such as at Fudoji, subsurface interaction with the bedrock determines the sensitivity of surface water to acidic deposition.

  2. Quantifying commuter exposures to volatile organic compounds

    NASA Astrophysics Data System (ADS)

    Kayne, Ashleigh

    Motor vehicles can be a predominant source of air pollution in cities. Traffic-related air pollution is often unavoidable for people who live in populous areas. Commuters may have high exposures to traffic-related air pollution as they are close to vehicle tailpipes. Volatile organic compounds (VOCs) are one class of air pollutants of concern because exposure to VOCs carries risk for adverse health effects. Specific VOCs of interest for this work include benzene, toluene, ethylbenzene, and xylenes (BTEX), which are often found in gasoline and combustion products. Although methods exist to measure time-integrated personal exposures to BTEX, there are few practical methods to measure a commuter's time-resolved BTEX exposure, which could identify peak exposures that would be concealed by a time-integrated measurement. This study evaluated the ability of a photoionization detector (PID) to measure commuters' exposure to BTEX using Tenax TA samples as a reference and quantified the difference in BTEX exposure between cyclists and drivers with windows open and closed. To determine the suitability of the two measurement methods (PID and Tenax TA) for use in this study, the precision, linearity, and limits of detection (LODs) for both methods were determined in the laboratory with standard BTEX calibration gases. Volunteers commuted from their homes to their workplaces by cycling or driving while wearing a personal exposure backpack containing a collocated PID and Tenax TA sampler. Volunteers completed a survey and indicated if the windows in their vehicle were open or closed. Comparing pairs of exposure data from the Tenax TA and PID sampling methods determined the suitability of the PID to measure the BTEX exposures of commuters. The difference between BTEX exposures of cyclists and drivers with windows open and closed in Fort Collins was determined. Both the PID and Tenax TA measurement methods were precise and linear when evaluated in the

  4. Quantifying nursing workflow in medication administration.

    PubMed

    Keohane, Carol A; Bane, Anne D; Featherstone, Erica; Hayes, Judy; Woolf, Seth; Hurley, Ann; Bates, David W; Gandhi, Tejal K; Poon, Eric G

    2008-01-01

    New medication administration systems are showing promise in improving patient safety at the point of care, but adoption of these systems requires significant changes in nursing workflow. To prepare for these changes, the authors report on a time-motion study that measured the proportion of time that nurses spend on various patient care activities, focusing on medication administration-related activities. Implications of their findings are discussed.

  5. In Vivo Angiography Quantifies Oxygen-Induced Retinopathy Vascular Recovery

    PubMed Central

    Mezu-Ndubuisi, Olachi J.

    2016-01-01

    ABSTRACT Purpose Retinopathy of prematurity (ROP) is a potentially blinding vasoproliferative disease. There is no standardized way to quantify plus disease (tortuous and dilated retinal vessels) or characterize abnormal recovery during ROP monitoring. This study objectively characterizes vascular features in live mice during development using noninvasive retinal imaging. Methods Using fluorescein angiography (FA), retinal vascular features were quantified in live mice with oxygen-induced retinopathy (OIR). A total of 105 wild-type mice were exposed to 77% oxygen from postnatal day 7 (P7) until P12 (OIR mice). Also, 105 age-matched pups were raised in room air (RA mice). In vivo FA was performed at early (P16 to P20), mid (P23 to P27), late (P30 to P34), and mature (P47) phases of retinal vascular development. Retinal vascular area, retinal vein width, and retinal artery tortuosity were quantified. Results Retinal artery tortuosity was higher in OIR than RA mice at early (p < 0.0001), mid (p < 0.0001), late (p < 0.0001), and mature (p < 0.0001) phases. Retinal vascular area in OIR mice increased from early to mid-phase (p < 0.0001), but remained unchanged from mid to late (p = 0.23), and from late to mature phase (p = 0.98). Retinal vein width was larger in OIR mice compared to RA mice during the early phase only. Arteries in OIR mice became more tortuous from early to mid-phase (p < 0.0001), but tortuosity remained stable from mid through mature phase. RA mice had an increase in retinal vascular area from early to late phase, but maintained uniform retinal vein width and retinal artery tortuosity in all phases. Conclusions In vivo FA distinguished arterial and venous features, similar to plus disease, and revealed aberrant recovery in OIR mice (arterial tortuosity, reduced capillary density, and absent neovascular buds) that persisted into adulthood. Retinal artery tortuosity may be a reliable, objective marker of severity of ROP. Infants with abnormal retinal vascular

  6. Using Accelerometer and Gyroscopic Measures to Quantify Postural Stability

    PubMed Central

    Alberts, Jay L.; Hirsch, Joshua R.; Koop, Mandy Miller; Schindler, David D.; Kana, Daniel E.; Linder, Susan M.; Campbell, Scott; Thota, Anil K.

    2015-01-01

    Context Force platforms and 3-dimensional motion-capture systems provide an accurate method of quantifying postural stability. Substantial cost, space, time to administer, and need for trained personnel limit widespread use of biomechanical techniques in the assessment of postural stability in clinical or field environments. Objective To determine whether accelerometer and gyroscope data sampled from a consumer electronics device (iPad2) provide sufficient resolution of center-of-gravity (COG) movements to accurately quantify postural stability in healthy young people. Design Controlled laboratory study. Setting Research laboratory in an academic medical center. Patients or Other Participants A total of 49 healthy individuals (age = 19.5 ± 3.1 years, height = 167.7 ± 13.2 cm, mass = 68.5 ± 17.5 kg). Intervention(s) Participants completed the NeuroCom Sensory Organization Test (SOT) with an iPad2 affixed at the sacral level. Main Outcome Measure(s) Primary outcomes were equilibrium scores from both systems and the time series of the angular displacement of the anteroposterior COG sway during each trial. A Bland-Altman assessment for agreement was used to compare equilibrium scores produced by the NeuroCom and iPad2 devices. Limits of agreement were defined as the mean bias (NeuroCom − iPad) ± 2 standard deviations. Mean absolute percentage error and median difference between the NeuroCom and iPad2 measurements were used to evaluate how closely the real-time COG sway measured by the 2 systems tracked each other. Results The limits of agreement between the 2 devices ranged from −0.5° to 0.5° in SOT condition 1 and from −2.9° to 1.3° in SOT condition 5. The largest absolute value of the measurement error within the 95% confidence intervals for all conditions was 2.9°. The mean absolute percentage error analysis indicated that the iPad2 tracked NeuroCom COG with an average error ranging from 5.87% to 10.42% of the NeuroCom measurement across SOT conditions. Conclusions The i
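
    The Bland-Altman computation used here, mean bias with limits of agreement at bias ± 2 standard deviations, is straightforward to reproduce. The equilibrium-score values below are hypothetical.

    ```python
    import numpy as np

    def bland_altman(reference, test):
        """Mean bias and limits of agreement (bias +/- 2 SD), matching
        the definition used in the study."""
        diffs = np.asarray(reference) - np.asarray(test)
        bias = diffs.mean()
        sd = diffs.std(ddof=1)
        return bias, (bias - 2.0 * sd, bias + 2.0 * sd)

    # Hypothetical anteroposterior sway measures (degrees) from both systems.
    neurocom = np.array([1.2, 0.8, 1.5, 2.1, 0.9, 1.7])
    ipad = np.array([1.1, 1.0, 1.4, 2.4, 0.7, 1.9])
    bias, (low, high) = bland_altman(neurocom, ipad)
    print(f"bias = {bias:+.2f} deg, limits of agreement = [{low:.2f}, {high:.2f}] deg")
    ```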

  7. Land cover change and remote sensing: Examples of quantifying spatiotemporal dynamics in tropical forests

    SciTech Connect

    Krummel, J.R.; Su, Haiping; Fox, J.; Yarnasan, S.; Ekasingh, M.

    1995-06-01

    Research on human impacts or natural processes that operate over broad geographic areas must explicitly address issues of scale and spatial heterogeneity. While the tropical forests of Southeast Asia and Mexico have been occupied and used to meet human needs for thousands of years, traditional forest management systems are currently being transformed by rapid and far-reaching demographic, political, economic, and environmental changes. The dynamics of population growth, migration into the remaining frontiers, and responses to national and international market forces result in a demand for land to produce food and fiber. These results illustrate some of the mechanisms that drive current land use changes, especially in the tropical forest frontiers. By linking the outcome of individual land use decisions and measures of landscape fragmentation and change, the aggregated results show the hierarchy of temporal and spatial events that in summation result in global changes to the most complex and sensitive biome -- tropical forests. By quantifying the spatial and temporal patterns of tropical forest change, researchers can assist policy makers by showing how landscape systems in these tropical forests are controlled by physical, biological, social, and economic parameters.

  8. Trematode hemoglobins show exceptionally high oxygen affinity.

    PubMed

    Kiger, L; Rashid, A K; Griffon, N; Haque, M; Moens, L; Gibson, Q H; Poyart, C; Marden, M C

    1998-08-01

    Ligand binding studies were made with hemoglobin (Hb) isolated from trematode species Gastrothylax crumenifer (Gc), Paramphistomum epiclitum (Pe), Explanatum explanatum (Ee), parasitic worms of water buffalo Bubalus bubalis, and Isoparorchis hypselobagri (Ih) parasitic in the catfish Wallago attu. The kinetics of oxygen and carbon monoxide binding show very fast association rates. Whereas oxygen can be displaced on a millisecond time scale from human Hb at 25 degrees C, the dissociation of oxygen from trematode Hb may require a few seconds to over 20 s (for Hb Pe). Carbon monoxide dissociation is faster, however, than for other monomeric hemoglobins or myoglobins. Trematode hemoglobins also show a reduced rate of autoxidation; the oxy form is not readily oxidized by potassium ferricyanide, indicating that only the deoxy form reacts rapidly with this oxidizing agent. Unlike most vertebrate Hbs, the trematodes have a tyrosine residue at position E7 instead of the usual distal histidine. As for Hb Ascaris, which also displays a high oxygen affinity, the trematodes have a tyrosine in position B10; two H-bonds to the oxygen molecule are thought to be responsible for the very high oxygen affinity. The trematode hemoglobins display a combination of high association rates and very low dissociation rates, resulting in some of the highest oxygen affinities ever observed.

  9. Visual Attention and Quantifier-Spreading in Heritage Russian Bilinguals

    ERIC Educational Resources Information Center

    Sekerina, Irina A.; Sauermann, Antje

    2015-01-01

    It is well established in language acquisition research that monolingual children and adult second language learners misinterpret sentences with the universal quantifier "every" and make quantifier-spreading errors that are attributed to a preference for a match in number between two sets of objects. The present Visual World eye-tracking…

  10. Quantifying biodiversity and asymptotics for a sequence of random strings.

    PubMed

    Koyano, Hitoshi; Kishino, Hirohisa

    2010-06-01

    We present a methodology for quantifying biodiversity at the sequence level by developing the probability theory on a set of strings. Further, we apply our methodology to the problem of quantifying the population diversity of microorganisms in several extreme environments and digestive organs and reveal the relation between microbial diversity and various environmental parameters.

  12. Shortcuts to Quantifier Interpretation in Children and Adults

    ERIC Educational Resources Information Center

    Brooks, Patricia J.; Sekerina, Irina

    2006-01-01

    Errors involving universal quantification are common in contexts depicting sets of individuals in partial, one-to-one correspondence. In this article, we explore whether quantifier-spreading errors are more common with the distributive quantifiers "each" and "every" than with "all". In Experiments 1 and 2, 96 children (5- to 9-year-olds) viewed pairs of…

  13. Quantifying viruses and bacteria in wastewater - results, quality control, and interpretation methods

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Membrane bioreactors (MBR), used for wastewater treatment in Ohio and elsewhere in the United States, have pore sizes large enough to theoretically reduce concentrations of protozoa and bacteria, but not viruses. Sampling for viruses in wastewater is seldom done and not required. Instead, the bac...

  14. Hyperspectral remote sensing tools for quantifying plant litter and invasive species in arid ecosystems

    USGS Publications Warehouse

    Nagler, Pamela L.; Sridhar, B.B. Maruthi; Olsson, Aaryn Dyami; Glenn, Edward P.; van Leeuwen, Willem J.D.; Thenkabail, Prasad S.; Huete, Alfredo; Lyon, John G.

    2012-01-01

    Green vegetation can be distinguished using visible and infrared multi-band and hyperspectral remote sensing methods. The problem has been in identifying and distinguishing landscape components that are not photosynthetically active, such as litter and soils, from green vegetation. Additionally, distinguishing different species of green vegetation is challenging using the relatively few bands available on most satellite sensors. This chapter focuses on hyperspectral remote sensing characteristics that aim to distinguish between green vegetation, soil, and litter (or senescent vegetation). Quantifying litter by remote sensing methods is important in constructing carbon budgets of natural and agricultural ecosystems. Distinguishing between plant types is important in tracking the spread of invasive species. Green leaves of different species usually have similar spectra, making it difficult to distinguish between species. However, in this chapter we show that phenological differences between species can be used to detect some invasive species by their distinct patterns of greening and dormancy over an annual cycle based on hyperspectral data. Both applications require methods to quantify the non-green cellulosic fractions of plant tissues by remote sensing, even in the presence of soil and green plant cover. We explore these methods and offer three case studies. The first concerns distinguishing surface litter from soil using the Cellulose Absorption Index (CAI), as applied to no-till farming practices where plant litter is left on the soil after harvest. The second involves using different band combinations to distinguish invasive saltcedar from agricultural and native riparian plants on the Lower Colorado River. The third illustrates the use of the CAI and NDVI in time-series analyses to distinguish between invasive buffelgrass and native plants in a desert environment in Arizona. Together the results show how hyperspectral imagery can be applied to
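
    Both indices named in the case studies are simple band arithmetic. The sketch below uses the standard NDVI definition and a common form of the Cellulose Absorption Index based on reflectance near 2.0, 2.1, and 2.2 μm; the reflectance values are illustrative, not taken from the chapter.

    ```python
    import numpy as np

    def ndvi(red, nir):
        """Normalized Difference Vegetation Index from red/NIR reflectance."""
        return (nir - red) / (nir + red)

    def cai(r2000, r2100, r2200):
        """Cellulose Absorption Index from reflectance near 2.0/2.1/2.2 um.

        Positive CAI marks the cellulose-lignin absorption of dry litter;
        bare soil tends toward zero or negative values."""
        return 0.5 * (r2000 + r2200) - r2100

    # Illustrative reflectances for green canopy, dry litter, and bare soil.
    samples = {
        "green vegetation": dict(red=0.05, nir=0.45, r2000=0.18, r2100=0.16, r2200=0.15),
        "plant litter":     dict(red=0.25, nir=0.30, r2000=0.32, r2100=0.26, r2200=0.30),
        "bare soil":        dict(red=0.22, nir=0.28, r2000=0.30, r2100=0.31, r2200=0.29),
    }
    for name, s in samples.items():
        print(f"{name:16s} NDVI={ndvi(s['red'], s['nir']):+.2f} "
              f"CAI={cai(s['r2000'], s['r2100'], s['r2200']):+.3f}")
    ```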

  15. Quantifying and Generalizing Hydrologic Responses to Dam Regulation using a Statistical Modeling Approach

    SciTech Connect

    McManamay, Ryan A

    2014-01-01

    Despite the ubiquitous existence of dams within riverscapes, much of our knowledge about dams and their environmental effects remains context-specific. Hydrology, more than any other environmental variable, has been studied in great detail with regard to dam regulation. While much progress has been made in generalizing the hydrologic effects of regulation by large dams, many aspects of hydrology show site-specific fidelity to dam operations, small dams (including diversions), and regional hydrologic regimes. A statistical modeling framework is presented to quantify and generalize hydrologic responses to varying degrees of dam regulation. Specifically, the objectives were to 1) compare the effects of local versus cumulative dam regulation, 2) determine the importance of different regional hydrologic regimes in influencing hydrologic responses to dams, and 3) evaluate how different regulation contexts lead to error in predicting hydrologic responses to dams. Overall, model performance was poor in quantifying the magnitude of hydrologic responses, but performance was sufficient in classifying hydrologic responses as negative or positive. Responses of some hydrologic indices to dam regulation were highly dependent upon hydrologic class membership and the purpose of the dam. The opposing coefficients between local and cumulative-dam predictors suggested that hydrologic responses to cumulative dam regulation are complex, and predicting the hydrology downstream of individual dams, as opposed to multiple dams, may be more easily accomplished using statistical approaches. Results also suggested that particular contexts, including multipurpose dams, high cumulative regulation by multiple dams, diversions, close proximity to dams, and certain hydrologic classes are all sources of increased error when predicting hydrologic responses to dams. Statistical models, such as the ones presented herein, show promise in their ability to model the effects of dam regulation at

  16. New primers for detecting and quantifying denitrifying anaerobic methane oxidation archaea in different ecological niches.

    PubMed

    Ding, Jing; Ding, Zhao-Wei; Fu, Liang; Lu, Yong-Ze; Cheng, Shuk H; Zeng, Raymond J

    2015-11-01

    The significance of ANME-2d as a methane sink in the environment has been overlooked, and no study has evaluated the distribution of ANME-2d in the environment. New primers thus needed to be designed for further research. In this paper, a pair of primers (DP397F and DP569R) was designed to quantify ANME-2d. The specificity and amplification efficiency of this primer pair were acceptable. PCR amplification with another pair of primers (DP142F and DP779R) generated a single, bright targeted band from the enrichment sample, but yielded faint, multiple bands from the environmental samples. Nested PCR was conducted using the primers DP142F/DP779R in the first round and DP142F/DP569R in the second round, which generated a bright targeted band. Further phylogenetic analysis showed that these targeted bands were ANME-2d-related sequences. Real-time PCR showed that the copies of the 16S ribosomal RNA gene of ANME-2d in these samples ranged from 3.72 × 10^4 to 2.30 × 10^5 copies μg^-1 DNA, indicating that the percentage of ANME-2d was greatest in a polluted river sample and least in a rice paddy sample. These results demonstrate that the newly developed real-time PCR primers could sufficiently quantify ANME-2d and that nested PCR with an appropriate combination of the new primers could successfully detect ANME-2d in environmental samples; the latter finding suggests that ANME-2d may spread in environments.
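
    Real-time PCR quantification of the kind reported here typically reads copy numbers off a standard curve of Ct against log10(copies). A minimal sketch with hypothetical calibration values, not data from this study:

    ```python
    import numpy as np

    # Hypothetical standard curve: Ct values measured for known copy numbers
    # of the target 16S rRNA gene (every value here is made up).
    std_copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
    std_ct = np.array([31.1, 27.8, 24.4, 21.0, 17.7])

    # Fit Ct = slope * log10(copies) + intercept.
    slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0   # ~1.0 means 100% efficient

    def copies_from_ct(ct):
        """Invert the standard curve to estimate template copies."""
        return 10.0 ** ((ct - intercept) / slope)

    print(f"slope = {slope:.2f}, amplification efficiency = {efficiency:.1%}")
    print(f"sample at Ct 25.3 -> {copies_from_ct(25.3):.2e} copies")
    ```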

  17. Quantifiably secure power grid operation, management, and evolution :

    SciTech Connect

    Gray, Genetha Anne.; Watson, Jean-Paul; Silva Monroy, Cesar Augusto; Gramacy, Robert B.

    2013-09-01

    This report summarizes findings and results of the Quantifiably Secure Power Grid Operation, Management, and Evolution LDRD. The focus of the LDRD was to develop decision-support technologies to enable rational and quantifiable risk management for two key grid operational timescales: scheduling (day-ahead) and planning (month-to-year-ahead). Risk or resiliency metrics are foundational in this effort. The 2003 Northeast Blackout investigative report stressed the criticality of enforceable metrics for system resiliency: the grid's ability to satisfy demands subject to perturbation. However, we neither have well-defined risk metrics for addressing the pervasive uncertainties in a renewable energy era, nor decision-support tools for their enforcement, which severely impacts efforts to rationally improve grid security. For day-ahead unit commitment, decision-support tools must account for topological security constraints, loss-of-load (economic) costs, and supply and demand variability, especially given high renewables penetration. For long-term planning, transmission and generation expansion must ensure realized demand is satisfied for various projected technological, climate, and growth scenarios. The decision-support tools investigated in this project paid particular attention to tail-oriented risk metrics for explicitly addressing high-consequence events. Historically, decision-support tools for the grid consider expected cost minimization, largely ignoring risk and instead penalizing loss-of-load through artificial parameters. The technical focus of this work was the development of scalable solvers for enforcing risk metrics. Advanced stochastic programming solvers were developed to address generation and transmission expansion and unit commitment, minimizing cost subject to pre-specified risk thresholds. Particular attention was paid to renewables, where security critically depends on production and demand prediction accuracy. To address this

  18. A new device to quantify tactile sensation in neuropathy

    PubMed Central

    Selim, M.M.; Brink, T.S.; Hodges, J.S.; Wendelschafer-Crabb, G.; Foster, S.X.Y.-L.; Nolano, M.; Provitera, V.; Simone, D.A.

    2011-01-01

    Objective: To devise a rapid, sensitive method to quantify tactile threshold of finger pads for early detection and staging of peripheral neuropathy and for use in clinical trials. Methods: Subjects were 166 healthy controls and 103 patients with, or at risk for, peripheral neuropathy. Subjects were screened by questionnaire. The test device, the Bumps, is a checkerboard-like smooth surface with 12 squares; each square encloses 5 colored circles. The subject explores the circles of each square with the index finger pad to locate the one circle containing a small bump. Bumps in different squares have different heights. Detection threshold is defined as the smallest bump height detected. In some subjects, a 3-mm skin biopsy from the tested finger pad was taken to compare density of Meissner corpuscles (MCs) to bump detection thresholds. Results: The mean (±SEM) bump detection threshold for control subjects was 3.3 ± 0.10 μm. Threshold and test time were age related, older subjects having slightly higher thresholds and using more time. Mean detection threshold of patients with neuropathy (6.2 ± 0.35 μm) differed from controls (p < 0.001). A proposed threshold for identifying impaired sensation had a sensitivity of 71% and specificity of 74%. Detection threshold was higher when MC density was decreased. Conclusions: These preliminary studies suggest that the Bumps test is a rapid, sensitive, inexpensive method to quantify tactile sensation of finger pads. It has potential for early diagnosis of tactile deficiency in subjects suspected of having neuropathy, for staging degree of tactile deficit, and for monitoring change over time. PMID:21555731

  19. Quantifying Selective Pressures Driving Bacterial Evolution Using Lineage Analysis

    NASA Astrophysics Data System (ADS)

    Lambert, Guillaume; Kussell, Edo

    2015-01-01

    Organisms use a variety of strategies to adapt to their environments and maximize long-term growth potential, but quantitative characterization of the benefits conferred by the use of such strategies, as well as their impact on the whole population's rate of growth, remains challenging. Here, we use a path-integral framework that describes how selection acts on lineages—i.e., the life histories of individuals and their ancestors—to demonstrate that lineage-based measurements can be used to quantify the selective pressures acting on a population. We apply this analysis to Escherichia coli bacteria exposed to cyclical treatments of carbenicillin, an antibiotic that interferes with cell-wall synthesis and affects cells in an age-dependent manner. While the extensive characterization of the life history of thousands of cells is necessary to accurately extract the age-dependent selective pressures caused by carbenicillin, the same measurement can be recapitulated using lineage-based statistics of a single surviving cell. Population-wide evolutionary pressures can be extracted from the properties of the surviving lineages within a population, providing an alternative and efficient procedure to quantify the evolutionary forces acting on a population. Importantly, this approach is not limited to age-dependent selection, and the framework can be generalized to detect signatures of other trait-specific selection using lineage-based measurements. Our results establish a powerful way to study the evolutionary dynamics of life under selection and may be broadly useful in elucidating selective pressures driving the emergence of antibiotic resistance and the evolution of survival strategies in biological systems.

  1. Quantifying tissue mechanical properties using photoplethysmography

    SciTech Connect

    Akl, Tony; Wilson, Mark A.; Ericson, Milton Nance; Cote, Gerard L.

    2014-01-01

    Photoplethysmography (PPG) is a non-invasive optical method that can be used to detect blood volume changes in the microvascular bed of tissue. The PPG signal comprises two components: a pulsatile waveform (AC) attributed to changes in the interrogated blood volume with each heartbeat, and a slowly varying baseline (DC) combining low-frequency fluctuations mainly due to respiration and sympathetic nervous system activity. In this report, we investigate the AC pulsatile waveform of the PPG pulse for ultimate use in extracting information regarding the biomechanical properties of tissue and vasculature. By analyzing the rise time of the pulse in the diastole period, we show that PPG is capable of measuring changes in the Young's modulus of tissue-mimicking phantoms with a resolution of 4 kPa in the range of 12 to 61 kPa. In addition, the shape of the pulse can potentially be used to diagnose vascular complications by differentiating upstream from downstream complications. A Windkessel model was used to model changes in the biomechanical properties of the circulation and to test the proposed concept. The modeling data confirmed the response seen in vitro and showed the same trends in the PPG rise and fall times with changes in compliance and vascular resistance.
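
    A two-element Windkessel, one common minimal form of the model class mentioned above, captures how compliance shapes the pressure pulse. The sketch below integrates dP/dt = Q(t)/C − P/(RC) with illustrative parameter values; it is not the specific model configuration used in the report.

    ```python
    import numpy as np

    def windkessel(resistance, compliance, t, inflow, p0=80.0):
        """Forward-Euler integration of the two-element Windkessel:
        dP/dt = Q(t)/C - P/(R*C). Units: mmHg, mL/s, s (illustrative)."""
        p = np.empty_like(t)
        p[0] = p0
        for k in range(1, len(t)):
            dt = t[k] - t[k - 1]
            dp = inflow[k - 1] / compliance - p[k - 1] / (resistance * compliance)
            p[k] = p[k - 1] + dt * dp
        return p

    # One-second cardiac cycle: a systolic inflow pulse, then diastolic runoff.
    t = np.linspace(0.0, 1.0, 1000)
    q = np.where(t < 0.3, 400.0 * np.sin(np.pi * t / 0.3), 0.0)   # mL/s

    # Lower compliance (stiffer tissue) gives a larger pulse that decays
    # faster, qualitatively matching the reported sensitivity of pulse
    # shape to compliance and vascular resistance.
    for c in (1.5, 1.0, 0.5):                                     # mL/mmHg
        p = windkessel(resistance=1.0, compliance=c, t=t, inflow=q)
        print(f"C = {c:.1f} mL/mmHg: pulse amplitude {p.max() - p.min():.0f} mmHg")
    ```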

  2. Quantifying MCMC Exploration of Phylogenetic Tree Space

    PubMed Central

    Whidden, Chris; Matsen, Frederick A.

    2015-01-01

    In order to gain an understanding of the effectiveness of phylogenetic Markov chain Monte Carlo (MCMC), it is important to understand how quickly the empirical distribution of the MCMC converges to the posterior distribution. In this article, we investigate this problem on phylogenetic tree topologies with a metric that is especially well suited to the task: the subtree prune-and-regraft (SPR) metric. This metric directly corresponds to the minimum number of MCMC rearrangements required to move between trees in common phylogenetic MCMC implementations. We develop a novel graph-based approach to analyze tree posteriors and find that the SPR metric is much more informative than simpler metrics that are unrelated to MCMC moves. In doing so, we show conclusively that topological peaks do occur in Bayesian phylogenetic posteriors from real data sets as sampled with standard MCMC approaches, investigate the efficiency of Metropolis-coupled MCMC (MCMCMC) in traversing the valleys between peaks, and show that conditional clade distribution (CCD) can have systematic problems when there are multiple peaks. PMID:25631175

  3. Quantifying Security Threats and Their Impact

    SciTech Connect

    Aissa, Anis Ben; Abercrombie, Robert K; Sheldon, Frederick T; Mili, Ali

    2009-01-01

    In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper we illustrate this infrastructure by means of an example involving an e-commerce application.

  4. Quantifying Effects Of Water Stress On Sunflowers

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This poster presentation describes the data collection and analysis procedures and results for 2009 from a research grant funded by the National Sunflower Association. The primary objective was to evaluate the use of crop canopy temperature measured with infrared temperature sensors, as a more time ...

  5. Quantifying and predicting Drosophila larvae crawling phenotypes.

    PubMed

    Günther, Maximilian N; Nettesheim, Guilherme; Shubeita, George T

    2016-01-01

    The fruit fly Drosophila melanogaster is a widely used model for cell biology, development, disease, and neuroscience. The fly's power as a genetic model for disease and neuroscience can be augmented by a quantitative description of its behavior. Here we show that we can accurately account for the complex and unique crawling patterns exhibited by individual Drosophila larvae using a small set of four parameters obtained from the trajectories of a few crawling larvae. The values of these parameters change for larvae from different genetic mutants, as we demonstrate for fly models of Alzheimer's disease and the Fragile X syndrome, allowing applications such as genetic or drug screens. Using the quantitative model of larval crawling developed here we use the mutant-specific parameters to robustly simulate larval crawling, which allows estimating the feasibility of laborious experimental assays and aids in their design. PMID:27323901

  6. Quantifying Non-Markovianity with Temporal Steering.

    PubMed

    Chen, Shin-Liang; Lambert, Neill; Li, Che-Ming; Miranowicz, Adam; Chen, Yueh-Nan; Nori, Franco

    2016-01-15

    Einstein-Podolsky-Rosen (EPR) steering is a type of quantum correlation which allows one to remotely prepare, or steer, the state of a distant quantum system. While EPR steering can be thought of as a purely spatial correlation, there does exist a temporal analogue, in the form of single-system temporal steering. However, a precise quantification of such temporal steering has been lacking. Here, we show that it can be measured, via semidefinite programming, with a temporal steerable weight, in direct analogy to the recently proposed EPR steerable weight. We find a useful property of the temporal steerable weight in that it is a nonincreasing function under completely positive trace-preserving maps and can be used to define a sufficient and practical measure of strong non-Markovianity. PMID:26824533

  8. Quantifying Irregularity in Pulsating Red Giants

    NASA Astrophysics Data System (ADS)

    Percy, J. R.; Esteves, S.; Lin, A.; Menezes, C.; Wu, S.

    2009-12-01

    Hundreds of red giant variable stars are classified as “type L,” which the General Catalogue of Variable Stars (GCVS) defines as “slow irregular variables of late spectral type...which show no evidence of periodicity, or any periodicity present is very poorly defined....” Self-correlation (Percy and Muhammed 2004) is a simple form of time-series analysis which determines the cycle-to-cycle behavior of a star, averaged over all the available data. It is well suited for analyzing stars which are not strictly periodic. Even for non-periodic stars, it provides a “profile” of the variability, including the average “characteristic time” of variability. We have applied this method to twenty-three L-type variables which have been measured extensively by AAVSO visual observers. We find a continuous spectrum of behavior, from irregular to semiregular.
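
    Self-correlation is easy to reproduce: average the absolute magnitude difference of all observation pairs as a function of their time separation. The sketch below applies it to a synthetic semiregular light curve; for a truly irregular star the profile would stay nearly flat.

    ```python
    import numpy as np

    def self_correlation(times, mags, n_bins=50):
        """Mean |delta mag| over all observation pairs, binned by time
        separation. (Semi)regular stars show dips near multiples of the
        characteristic time scale; irregular stars stay roughly flat."""
        i, j = np.triu_indices(len(times), k=1)
        dt = np.abs(times[j] - times[i])
        dm = np.abs(mags[j] - mags[i])
        edges = np.linspace(0.0, dt.max(), n_bins + 1)
        idx = np.digitize(dt, edges) - 1
        profile = np.array([dm[idx == b].mean() if np.any(idx == b) else np.nan
                            for b in range(n_bins)])
        centers = 0.5 * (edges[:-1] + edges[1:])
        return centers, profile

    # Toy semiregular light curve: noisy 100-day cycle, irregular sampling.
    rng = np.random.default_rng(3)
    t = np.sort(rng.uniform(0.0, 1000.0, 400))
    m = 7.0 + 0.4 * np.sin(2.0 * np.pi * t / 100.0) + rng.normal(0.0, 0.05, t.size)
    centers, profile = self_correlation(t, m)
    far = centers > 40.0            # skip the small-separation decline
    best = centers[far][np.nanargmin(profile[far])]
    print(f"profile minimum near dt = {best:.0f} days")
    ```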

  9. Quantifying protein diffusion and capture on filaments.

    PubMed

    Reithmann, Emanuel; Reese, Louis; Frey, Erwin

    2015-02-17

    The functional relevance of regulating proteins is often limited to specific binding sites such as the ends of microtubules or actin filaments. A localization of proteins on these functional sites is of great importance. We present a quantitative theory for a diffusion and capture process, where proteins diffuse on a filament and stop diffusing when reaching the filament's end. It is found that end-association after one-dimensional diffusion is the main source of tip-localization of such proteins. As a consequence, diffusion and capture is highly efficient in enhancing the reaction velocity of enzymatic reactions in which protein and filament end play the roles of enzyme and substrate. We show that the reaction velocity can effectively be described within a Michaelis-Menten framework. Together, one-dimensional diffusion and capture beats the (three-dimensional) Smoluchowski diffusion limit for the rate of protein association to filament ends.

  10. Quantifying App Store Dynamics: Longitudinal Tracking of Mental Health Apps

    PubMed Central

    Nicholas, Jennifer; Christensen, Helen

    2016-01-01

    Background For many mental health conditions, mobile health apps offer the ability to deliver information, support, and intervention outside the clinical setting. However, there are difficulties with the use of a commercial app store to distribute health care resources, including turnover of apps, irrelevance of apps, and discordance with evidence-based practice. Objective The primary aim of this study was to quantify the longevity and rate of turnover of mental health apps within the official Android and iOS app stores. The secondary aim was to quantify the proportion of apps that were clinically relevant and assess whether the longevity of these apps differed from clinically nonrelevant apps. The tertiary aim was to establish the proportion of clinically relevant apps that included claims of clinical effectiveness. We performed additional subgroup analyses using additional data from the app stores, including search result ranking, user ratings, and number of downloads. Methods We searched iTunes (iOS) and the Google Play (Android) app stores each day over a 9-month period for apps related to depression, bipolar disorder, and suicide. We performed additional app-specific searches if an app no longer appeared within the main search results. Results On the Android platform, 50% of the search results changed after 130 days (depression), 195 days (bipolar disorder), and 115 days (suicide). Search results were more stable on the iOS platform, with 50% of the search results remaining at the end of the study period. Approximately 75% of Android and 90% of iOS apps were still available to download at the end of the study. We identified only 35.3% (347/982) of apps as being clinically relevant for depression, of which 9 (2.6%) claimed clinical effectiveness. Only 3 included a full citation to a published study. Conclusions The mental health app environment is volatile, with a clinically relevant app for depression becoming unavailable to download every 2.9 days. This poses

  11. Groundwater-dependent vegetation: Quantifying the groundwater subsidy

    NASA Astrophysics Data System (ADS)

    Lowry, Christopher S.; Loheide, Steven P.

    2010-06-01

    The typical stratigraphy of riparian ecosystems consists of fine-grained overbank deposits overlying coarser-grained materials. Plants within these regions rely on soil moisture in the fine-grained sediments as well as supplemental groundwater for root water uptake. The additional water available as a result of shallow water table conditions is defined here as groundwater subsidy and is found to be a significant contribution to root water uptake. Work presented here quantifies the effect of groundwater subsidy on root water uptake as a result of variations in the soil thickness of the upper fine-grained sediments, rate of water table decline, and maximum water table depth. Variations in soil thickness and water table decline regimes produce a complex response with respect to both the rate of groundwater subsidy and the cumulative groundwater subsidy. These simulated regimes are analogs to environmental scenarios in riparian ecosystems that result from stream incision, soil erosion, and climate change. These results have implications for identifying ecosystems most susceptible to future change as well as those most amenable to restoration.

  12. Complexity and approximability of quantified and stochastic constraint satisfaction problems

    SciTech Connect

    Hunt, H. B.; Stearns, R. L.; Marathe, M. V.

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S be an arbitrary finite set of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SAT_c(S)). Here, we study simultaneously the complexity of and the existence of efficient approximation algorithms for a number of variants of the problems SAT(S) and SAT_c(S), and for many different D, C, and S. These problem variants include decision and optimization problems, for formulas, quantified formulas, and stochastically quantified formulas. We denote these problems by Q-SAT(S), MAX-Q-SAT(S), S-SAT(S), MAX-S-SAT(S), MAX-NSF-Q-SAT(S), and MAX-NSF-S-SAT(S). The main contribution is the development of a unified predictive theory for characterizing the complexity of these problems. Our unified approach is based on the following two basic concepts: (i) strongly local replacements/reductions and (ii) relational/algebraic representability. Let k ≥ 2. Let S be a finite set of finite-arity relations on Σ_k with the following condition on S: all finite-arity relations on Σ_k can be represented as finite existentially quantified conjunctions of relations in S applied to variables (to variables and constant symbols in C). Then we prove the following new results: (1) The problems SAT(S) and SAT_c(S) are both NQL-complete and ≤_{log n}^{bw}-complete for NP. (2) The problems Q-SAT(S) and Q-SAT_c(S) are PSPACE-complete. Letting k = 2, the problems S-SAT(S) and S-SAT_c(S) are PSPACE-complete. (3) There exists ε > 0 for which approximating the problem MAX-Q-SAT(S) within ε times optimum is PSPACE-hard. Letting k = 2, there exists ε > 0 for which approximating the problem MAX-S-SAT(S) within ε times optimum is PSPACE-hard. (4

  13. Quantifying human response capabilities towards tsunami threats at community level

    NASA Astrophysics Data System (ADS)

    Post, J.; Mück, M.; Zosseder, K.; Wegscheider, S.; Taubenböck, H.; Strunz, G.; Muhari, A.; Anwar, H. Z.; Birkmann, J.; Gebert, N.

    2009-04-01

    besides others play a role. An attempt to quantify this variable under high uncertainty is also presented. Quantifying ET is based on GIS modelling using a Cost Weighted Distance approach. The basic principle is to define the best evacuation path from a given point to the next safe area (shelter location). Here the fastest path from that point to the shelter location has to be found. The impacts of land cover, slope, population density, and population age and gender distribution are taken into account, as literature studies show these factors to be highly important. Knowing the fastest path and the distance to the next safe area, together with a spatially distributed pattern of evacuation speed, delivers the time needed from each location to a safe area. By considering the obtained time value for RsT, the coverage area of an evacuation target point (safe area) can be assigned. Incorporating knowledge of the people capacity of an evacuation target point, the respective coverage area is refined. Hence areas with weak, moderate, and good human response capabilities can be detected. This allows calculation of the potential number of people affected (dead or injured) and the number of people displaced. First results for Kuta (Bali) for a worst-case tsunami event indicate approximately 25 000 people affected when RT = 0 minutes (immediate evacuation upon receiving a tsunami warning), rising to 120 000 when RT > ETA (no evacuation action until the tsunami hits land). Additionally, the fastest evacuation routes to the evacuation target points can be assigned. Areas with weak response capabilities can be designated as priority areas in which to install, e.g., additional evacuation target points, or to increase tsunami knowledge and awareness to promote a faster reaction time. In particular, analyzing the underlying socio-economic properties that cause deficiencies in responding to a tsunami threat can yield valuable information and directly guide the planning of adaptation measures. Keywords: Community level, Risk and vulnerability assessment
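
    The Cost Weighted Distance idea maps naturally onto Dijkstra's algorithm on a raster of local travel speeds. The sketch below uses an assumed speed grid and shelter location; the study's weighting of land cover, slope, and demography is folded into the speed values.

    ```python
    import heapq
    import numpy as np

    def evacuation_time(speed, shelters, cell_size=10.0):
        """Fastest travel time (s) from every cell to the nearest shelter:
        Dijkstra on a 4-connected grid, seeded at all shelters at t = 0.
        `speed` holds local evacuation speeds (m/s) standing in for the
        land-cover, slope, and demography weighting described above."""
        rows, cols = speed.shape
        time = np.full(speed.shape, np.inf)
        heap = []
        for r, c in shelters:
            time[r, c] = 0.0
            heapq.heappush(heap, (0.0, r, c))
        while heap:
            t, r, c = heapq.heappop(heap)
            if t > time[r, c]:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    step = cell_size / (0.5 * (speed[r, c] + speed[nr, nc]))
                    if t + step < time[nr, nc]:
                        time[nr, nc] = t + step
                        heapq.heappush(heap, (t + step, nr, nc))
        return time

    # Toy 100 x 100 coastal strip with one shelter; speeds are assumed.
    rng = np.random.default_rng(4)
    speed = rng.uniform(0.8, 1.6, (100, 100))   # m/s
    et = evacuation_time(speed, shelters=[(50, 90)])
    print(f"maximum time to shelter: {et.max() / 60.0:.1f} minutes")
    ```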

  14. Quantifying the impact of metamorphic reactions on strain localization in the mantle

    NASA Astrophysics Data System (ADS)

    Huet, Benjamin; Yamato, Philippe

    2014-05-01

    Metamorphic reactions are most often considered as a passive record of changes in the pressure, temperature and fluid conditions that rocks experience. In that way, they provide key constraints on the tectonic evolution of the crust and the mantle. However, natural examples show that metamorphism can also modify the strength of rocks and affect strain localization in ductile shear zones. Hence, metamorphic reactions play an active role in tectonics by inducing softening and/or hardening, depending on the reactions involved. Quantifying the mechanical effect of such metamorphic reactions is, therefore, a crucial task for determining both the strength distribution in the lithosphere and its evolution. However, estimating the effective strength of such polyphase rocks remains an open issue. Experimentally determined flow laws already exist for monophase aggregates and polyphase rocks of rheologically important materials. They provide good constraints on lithology-controlled variations in lithospheric strength. Unfortunately, since the whole range of mineralogical and chemical rock compositions cannot be experimentally tested, the variations of strength due to metamorphic reactions cannot be systematically and fully characterized. In order to tackle this issue, we present here the results of a study coupling thermodynamic and mechanical modeling that allows us to predict the mechanical impact of metamorphic reactions on the strength of the mantle. Thermodynamic modeling (using Theriak-Domino) is used to calculate the mineralogical composition of a typical peridotite as a function of pressure, temperature and water content. The calculated modes and the flow-law parameters for monophase aggregates are then used as input to the Minimized Power Geometric model to predict the strength of the polyphase aggregate. Our results are then used to quantify the strength evolution of the mantle as a function of pressure, temperature and water content in two
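
    As an illustration of the flow-law ingredient of such a calculation, the sketch below evaluates a standard power-law creep relation, edot = A·σ^n·exp(−Q/RT), for each phase and forms a crude mode-weighted average strength. This is a simple uniform-strain-rate average, not the Minimized Power Geometric model used in the study, and all parameter values are placeholders.

        # Illustrative only: stress sustained by a monophase aggregate at a
        # given strain rate, sigma = (edot/A)**(1/n) * exp(Q/(n*R*T)), then a
        # crude phase-averaged strength from thermodynamically computed modes.
        import math

        R = 8.314  # gas constant, J/(mol K)

        def flow_stress(edot, A, n, Q, T):
            """Stress (Pa) from a power-law creep law at strain rate edot (1/s)."""
            return (edot / A) ** (1.0 / n) * math.exp(Q / (n * R * T))

        # hypothetical olivine/pyroxene-like parameters and modes
        phases = [
            {"mode": 0.7, "A": 1e-15, "n": 3.5, "Q": 530e3},
            {"mode": 0.3, "A": 1e-14, "n": 3.0, "Q": 450e3},
        ]
        edot, T = 1e-14, 1100.0 + 273.15
        # arithmetic (uniform-strain-rate) average as an upper-bound estimate
        sigma = sum(p["mode"] * flow_stress(edot, p["A"], p["n"], p["Q"], T)
                    for p in phases)
        print(f"estimated aggregate strength: {sigma / 1e6:.1f} MPa")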

  15. A field method to quantify exchange with less-mobile porosity in streambeds using electrical hysteresis

    NASA Astrophysics Data System (ADS)

    Briggs, M. A.; Day-Lewis, F. D.; Zarnetske, J. P.; Harvey, J. W.; Lane, J. W., Jr.

    2015-12-01

    Heterogeneous streambed materials may be expected to develop two general porosity domains: a more-mobile porosity dominated by advective exchange, and a less-mobile porosity dominated by diffusive exchange. Less-mobile porosity containing unique redox conditions or contaminant mass may be invisible to traditional porewater sampling methods, even using "low-flow" techniques, because these methods sample water preferentially from the mobile porosity domain. Further, most tracer breakthrough curve analyses have provided only indirect information (tailing) regarding the prevalence and connectivity of less-mobile porosity, typically over experimental flowpath scales of 1-10 meters. To address the limitations of conventional methods, we use electrical geophysical methods to aid in the inference of less-mobile porosity parameters. Unlike traditional fluid sampling, electrical methods can directly sense less-mobile solute and can target specific points along subsurface flowpaths. We demonstrate how the geophysical methodology developed for dual-domain groundwater transport can be scaled to the streambed through synthetic, laboratory column, and field experiments; further, we show how previously used numerical modeling techniques can be replaced by a simpler analytical approach. The new analytical method is based on electrical theory and involves characteristics of electrical hysteresis patterns (e.g. hinge point values) that are used to quantify (1) the size of paired mobile and less-mobile porosities, and (2) the exchange rate coefficient, through simple curve fitting. Results from the analytical approach compare favorably with results from calibration of numerical models and also with independent measurements of mobile and less-mobile porosity. Lastly, we demonstrate a method of focused solute streambed injection to quantify less-mobile porosity and explain redox zonation in contrasting stream environments.
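
    The hysteresis the method exploits can be reproduced with a toy dual-domain model: first-order exchange between mobile and less-mobile porosity plus a linear conductivity mixing rule. The sketch below is an assumption-laden illustration (the porosities, exchange rate, pulse shape, and mixing rule are all invented for the example), not the authors' field analysis.

        # Toy dual-domain exchange, d(c_lm)/dt = alpha * (c_m - c_lm), with
        # bulk EC taken as a porosity-weighted linear mix of the two pore
        # waters; plotting sigma_bulk against the mobile-zone fluid EC during
        # a tracer pulse traces the hysteresis loop the method exploits.
        theta_m, theta_lm = 0.30, 0.10   # mobile / less-mobile porosities (assumed)
        alpha, dt = 0.02, 1.0            # exchange rate (1/min), time step (min)

        def tracer_pulse(t):             # mobile-zone fluid EC (S/m), square pulse
            return 0.5 if 60 <= t < 240 else 0.05

        c_lm, loop = 0.05, []
        for step in range(600):
            c_m = tracer_pulse(step * dt)
            c_lm += alpha * (c_m - c_lm) * dt             # explicit Euler exchange
            sigma_bulk = theta_m * c_m + theta_lm * c_lm  # linear mixing (assumed)
            loop.append((c_m, sigma_bulk))
        # Same mobile-zone EC before and after the pulse, different bulk EC:
        print(loop[30], loop[300])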

  16. Quantifying the Components of Impervious Surfaces

    USGS Publications Warehouse

    Tilley, Janet S.; Slonecker, E. Terrence

    2006-01-01

    This study's objectives were to (1) determine the relative contribution of the individual components of impervious surfaces by collecting digital information from high-resolution (1-meter or better) imagery; and (2) determine which of the more advanced techniques, such as spectral unmixing or the application of coefficients to land-use or land-cover data, is the most suitable method for State and local governments as well as Federal agencies to efficiently measure imperviousness in any given watershed or area of interest. The components of impervious surfaces, combined across all watersheds and time periods in objective one, were as follows: buildings 29.2 percent, roads 28.3 percent, and parking lots 24.6 percent, with the remaining three components (driveways, sidewalks, and other, where "other" is any feature not contained in the first five) totaling 14 percent. Results from objective two indicate that spectral unmixing will ultimately be the most efficient method of determining imperviousness, but it is not yet accurate enough: accuracy better than 10 percent of the truth is critical, and the method did not consistently achieve this in this study. Of the three coefficient-application techniques tested, applying coefficients to land-use data was not practical, while merging the two methods that apply coefficients to land-cover data could bring the end results to within 5 percent of the truth or better. Until the spectral unmixing technique has been further refined, land-cover coefficients should be used; they offer quick results, though not current ones, as they were developed for the 1992 National Land Characteristics Data.
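
    The coefficient approach is straightforward to state in code: total imperviousness is the area-weighted sum of per-class impervious fractions. In the sketch below, every coefficient and area is an illustrative placeholder rather than a value calibrated in the study.

        # Minimal sketch of applying impervious-surface coefficients to
        # mapped land-cover areas (all numbers are assumed placeholders).
        coeffs = {  # impervious fraction per land-cover class (assumed)
            "high_density_residential": 0.60,
            "low_density_residential": 0.25,
            "commercial_industrial": 0.85,
            "forest": 0.01,
        }
        areas_ha = {  # mapped class areas in a hypothetical watershed (hectares)
            "high_density_residential": 120.0,
            "low_density_residential": 340.0,
            "commercial_industrial": 80.0,
            "forest": 460.0,
        }
        impervious_ha = sum(areas_ha[c] * coeffs[c] for c in areas_ha)
        total_ha = sum(areas_ha.values())
        print(f"impervious area: {impervious_ha:.1f} ha "
              f"({100 * impervious_ha / total_ha:.1f}% of watershed)")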

  17. The Interpretation of Classically Quantified Sentences: A Set-Theoretic Approach

    ERIC Educational Resources Information Center

    Politzer, Guy; Van der Henst, Jean-Baptiste; Delle Luche, Claire; Noveck, Ira A.

    2006-01-01

    We present a set-theoretic model of the mental representation of classically quantified sentences (All P are Q, Some P are Q, Some P are not Q, and No P are Q). We take inclusion, exclusion, and their negations to be primitive concepts. We show that although these sentences are known to have a diagrammatic expression (in the form of the Gergonne…

  18. Quantifying the Electrocatalytic Turnover of Vitamin B12-Mediated Dehalogenation on Single Soft Nanoparticles.

    PubMed

    Cheng, Wei; Compton, Richard G

    2016-02-12

    We report the electrocatalytic dehalogenation of trichloroethylene (TCE) by single soft nanoparticles in the form of Vitamin B12-containing droplets. We quantify the turnover number of the catalytic reaction at the single soft nanoparticle level. The kinetic data show that the binding of TCE with the electro-reduced vitamin in the Co(I) oxidation state is chemically reversible. PMID:26806226

  19. Quantifying oil filtration effects on bearing life

    NASA Technical Reports Server (NTRS)

    Needelman, William M.; Zaretsky, Erwin V.

    1991-01-01

    Rolling-element bearing life is influenced by the number, size, and material properties of particles entering the Hertzian contact of the rolling element and raceway. In general, rolling-element bearing life increases with increasing level of oil filtration. Based upon test results, two equations are presented which allow for the adjustment of bearing L10 or catalog life based upon oil filter rating. It is recommended that where no oil filtration is used catalog life be reduced by 50 percent.

  20. Quantifying data worth toward reducing predictive uncertainty

    USGS Publications Warehouse

    Dausman, A.M.; Doherty, J.; Langevin, C.D.; Sukop, M.C.

    2010-01-01

    The present study demonstrates a methodology for optimization of environmental data acquisition. Based on the premise that the worth of data increases in proportion to its ability to reduce the uncertainty of key model predictions, the methodology can be used to compare the worth of different data types, gathered at different locations within study areas of arbitrary complexity. The method is applied to a hypothetical nonlinear, variable density numerical model of salt and heat transport. The relative utilities of temperature and concentration measurements at different locations within the model domain are assessed in terms of their ability to reduce the uncertainty associated with predictions of movement of the salt water interface in response to a decrease in fresh water recharge. In order to test the sensitivity of the method to nonlinear model behavior, analyses were repeated for multiple realizations of system properties. Rankings of observation worth were similar for all realizations, indicating robust performance of the methodology when employed in conjunction with a highly nonlinear model. The analysis showed that while concentration and temperature measurements can both aid in the prediction of interface movement, concentration measurements, especially when taken in proximity to the interface at locations where the interface is expected to move, are of greater worth than temperature measurements. Nevertheless, it was also demonstrated that pairs of temperature measurements, taken in strategic locations with respect to the interface, can also lead to more precise predictions of interface movement. Journal compilation © 2010 National Ground Water Association.
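
    In the linear (first-order) setting, the data-worth idea above reduces to a Schur-complement update of the parameter covariance. The sketch below computes the reduction in predictive variance offered by each candidate observation; the matrices are random stand-ins, and this generic Bayesian linear algebra is offered in the spirit of, not as a reproduction of, the study's workflow.

        # Hedged sketch of linear (FOSM-style) data-worth analysis.
        import numpy as np

        rng = np.random.default_rng(0)
        n_par = 20
        Cp = np.eye(n_par)               # prior parameter covariance (assumed)
        y = rng.normal(size=n_par)       # sensitivity of the prediction to parameters
        X = rng.normal(size=(5, n_par))  # Jacobian of 5 candidate observations
        Ce = 0.1 * np.eye(5)             # observation noise covariance

        def pred_var(Cp, y):
            return float(y @ Cp @ y)

        def posterior_cov(Cp, X, Ce):
            # Schur complement: Cp' = Cp - Cp X^T (X Cp X^T + Ce)^-1 X Cp
            S = X @ Cp @ X.T + Ce
            return Cp - Cp @ X.T @ np.linalg.solve(S, X @ Cp)

        prior = pred_var(Cp, y)
        for i in range(X.shape[0]):      # worth of each observation alone
            post = pred_var(posterior_cov(Cp, X[i:i+1], Ce[i:i+1, i:i+1]), y)
            print(f"obs {i}: variance reduction {100 * (1 - post / prior):.1f}%")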

  1. Quantifying the origin of metallic glass formation

    NASA Astrophysics Data System (ADS)

    Johnson, W. L.; Na, J. H.; Demetriou, M. D.

    2016-01-01

    The waiting time to form a crystal in a unit volume of homogeneous undercooled liquid exhibits a pronounced minimum τX* at a 'nose temperature' T* located between the glass transition temperature Tg, and the crystal melting temperature, TL. Turnbull argued that τX* should increase rapidly with the dimensionless ratio trg=Tg/TL. Angell introduced a dimensionless 'fragility parameter', m, to characterize the fall of atomic mobility with temperature above Tg. Both trg and m are widely thought to play a significant role in determining τX*. Here we survey and assess reported data for TL, Tg, trg, m and τX* for a broad range of metallic glasses with widely varying τX*. By analysing this database, we derive a simple empirical expression for τX*(trg, m) that depends exponentially on trg and m, and two fitting parameters. A statistical analysis shows that knowledge of trg and m alone is therefore sufficient to predict τX* within estimated experimental errors. Surprisingly, the liquid/crystal interfacial free energy does not appear in this expression for τX*.

  2. Quantifying the origin of metallic glass formation

    PubMed Central

    Johnson, W. L.; Na, J. H.; Demetriou, M. D.

    2016-01-01

    The waiting time to form a crystal in a unit volume of homogeneous undercooled liquid exhibits a pronounced minimum τX* at a 'nose temperature' T* located between the glass transition temperature Tg, and the crystal melting temperature, TL. Turnbull argued that τX* should increase rapidly with the dimensionless ratio trg=Tg/TL. Angell introduced a dimensionless 'fragility parameter', m, to characterize the fall of atomic mobility with temperature above Tg. Both trg and m are widely thought to play a significant role in determining τX*. Here we survey and assess reported data for TL, Tg, trg, m and τX* for a broad range of metallic glasses with widely varying τX*. By analysing this database, we derive a simple empirical expression for τX*(trg, m) that depends exponentially on trg and m, and two fitting parameters. A statistical analysis shows that knowledge of trg and m alone is therefore sufficient to predict τX* within estimated experimental errors. Surprisingly, the liquid/crystal interfacial free energy does not appear in this expression for τX*. PMID:26786966

  3. Quantifying in vivo MR spectra with circles

    NASA Astrophysics Data System (ADS)

    Gabr, Refaat E.; Ouwerkerk, Ronald; Bottomley, Paul A.

    2006-03-01

    Accurate and robust quantification of in vivo magnetic resonance spectroscopy (MRS) data is essential to its application in research and medicine. The performance of existing analysis methods is problematic for in vivo studies, where low signal-to-noise ratio, overlapping peaks and intense artefacts are endemic. Here, a new frequency-domain technique for MRS data analysis is introduced wherein the circular trajectories that result when spectral peaks are projected onto the complex plane are fitted with active circle models. The use of active contour strategies naturally allows incorporation of prior knowledge as constraint energy terms. The problem of phasing spectra is eliminated, and baseline artefacts are dealt with using active contours (snakes). The stability and accuracy of the new technique, CFIT, are compared with a standard time-domain fitting tool, using simulated 31P data with varying amounts of noise and 98 real human chest and heart 31P MRS data sets. The real data were also analyzed by our standard frequency-domain absorption-mode technique. On the real data, CFIT demonstrated the fewest fitting failures of all methods and an accuracy similar to the latter method, with both these techniques outperforming the time-domain approach. Contrasting results from simulations argue that performance relative to Cramér-Rao bounds may not be a suitable indicator of fitting performance with typical in vivo data such as these. We conclude that CFIT is a stable, accurate alternative to the best existing methods of fitting in vivo data.
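
    The geometric idea can be demonstrated compactly: a complex Lorentzian line traces a circle in the complex plane, so a peak can be quantified by fitting a circle to its projected trajectory. The sketch below uses an algebraic (Kasa-style) least-squares circle fit on a synthetic noisy Lorentzian; it illustrates the principle only and is not the CFIT algorithm.

        # A complex Lorentzian amp/(lam + i*(omega - omega0)) traces a circle
        # of diameter amp/lam through the origin; recover it by a linear
        # least-squares (Kasa) circle fit to the noisy trajectory.
        import numpy as np

        rng = np.random.default_rng(1)
        omega = np.linspace(-20, 20, 201)
        lam, omega0, amp = 2.0, 3.0, 5.0
        spec = amp / (lam + 1j * (omega - omega0))
        spec += 0.05 * (rng.normal(size=omega.size) + 1j * rng.normal(size=omega.size))

        x, y = spec.real, spec.imag
        # Kasa fit: solve [2x 2y 1] [a b c]^T = x^2 + y^2 for center (a, b)
        A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
        sol, *_ = np.linalg.lstsq(A, x**2 + y**2, rcond=None)
        cx, cy = sol[0], sol[1]
        radius = np.sqrt(sol[2] + cx**2 + cy**2)
        # circle diameter encodes peak amplitude / linewidth (amp / lam)
        print(f"center=({cx:.2f}, {cy:.2f}), diameter={2 * radius:.2f}, "
              f"expected diameter={amp / lam:.2f}")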

  4. Structural property of soybean lunasin and development of a method to quantify lunasin in plasma using an optimized immunoassay protocol.

    PubMed

    Dia, Vermont P; Frankland-Searby, Sarah; del Hierro, Francisco Laso; Garcia, Guadalupe; de Mejia, Elvira Gonzalez

    2013-05-01

    Lunasin is a 43-amino acid naturally occurring chemopreventive peptide with demonstrated anti-cancer and anti-inflammatory properties. The objectives of this study were to determine the effect of temperature on the secondary structure of lunasin, to develop a method of isolating lunasin from human plasma using an ion-exchange microspin column, and to quantify the amount of lunasin using an optimized enzyme-linked immunosorbent assay. Lunasin was purified using a combination of ion-exchange chromatography, ultrafiltration and gel filtration chromatography. Circular dichroism showed that increasing the temperature from 25 to 100 °C changed the secondary structure of lunasin and its capability to interact with rabbit polyclonal antibody. Enzyme-linked immunosorbent assay showed that the lunasin rabbit polyclonal antibody has a titer of 250 and a specific activity of 0.05 mL/μg. A linear response was detected between 16 and 48 ng lunasin per mL (y = 0.03x - 0.38, R² = 0.96). The use of a diethylaminoethyl (DEAE) microspin column to isolate spiked lunasin from human plasma showed that most of the lunasin (37.8-46.5%) bound to the column and was eluted with Tris-HCl buffer, pH 7.5, with a yield of up to 76.6%. In conclusion, lunasin can be isolated from human plasma by a simple DEAE microspin column technique and can be quantified using a validated and optimized immunoassay procedure. This method can be used directly to quantify lunasin in plasma in different human and animal studies aiming to determine its bioavailability. PMID:23265496
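
    As a worked example of using the calibration reported above (y = 0.03x - 0.38 over 16-48 ng/mL), the snippet below inverts the line to estimate a concentration from an assay response; the response value and dilution factor are hypothetical.

        # Invert the reported ELISA calibration line to quantify lunasin.
        SLOPE, INTERCEPT = 0.03, -0.38
        LOW, HIGH = 16.0, 48.0          # ng/mL, linear range of the assay

        def lunasin_ng_per_ml(response, dilution=1.0):
            conc = (response - INTERCEPT) / SLOPE
            if not (LOW <= conc <= HIGH):
                raise ValueError(f"{conc:.1f} ng/mL outside linear range; re-dilute")
            return conc * dilution

        # e.g. a response of 0.52 at a 1:10 dilution -> 300 ng/mL in plasma
        print(lunasin_ng_per_ml(0.52, dilution=10))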

  5. "No-Shows": A Vexing Problem.

    ERIC Educational Resources Information Center

    Holmes, John; And Others

    1980-01-01

    What can we learn about the applicants who do not show up on campus to register? This study suggests both a method for learning more about "no-shows" and the reasons why they change their mind. (Author)

  6. Alzheimer's Gene May Show Effects in Childhood

    MedlinePlus

    Alzheimer's Gene May Show Effects in Childhood. 2016 (HealthDay News) -- A gene related to Alzheimer's disease may start to show effects on brain ... Full story: https://medlineplus.gov/news/fullstory_159854.html

  7. A graph-theoretic method to quantify the airline route authority

    NASA Technical Reports Server (NTRS)

    Chan, Y.

    1979-01-01

    The paper introduces a graph-theoretic method to quantify the legal statements in a route certificate, which specifies an airline's routing restrictions. All authorized nonstop and multistop routes, including the shortest-time routes, can be obtained, and the method suggests profitable route-structure alternatives to airline analysts. This method of quantifying C.A.B. route authority was implemented in a software package, Route Improvement Synthesis and Evaluation, and demonstrated in a case study with a commercial airline. The study showed the utility of this technique in suggesting route alternatives and the possibility of improvements in the U.S. route system.
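
    A minimal sketch of the underlying graph model: each authorized nonstop segment becomes a weighted directed edge, and shortest-time multistop routings fall out of Dijkstra's algorithm. The cities, block times, and authorized segments below are invented for illustration, not taken from any route certificate.

        # Shortest-time authorized routing via Dijkstra on a segment graph.
        import heapq

        segments = {  # authorized nonstop segments: city -> [(next_city, minutes)]
            "DEN": [("ORD", 135), ("DFW", 110)],
            "ORD": [("JFK", 120)],
            "DFW": [("JFK", 190)],
        }

        def shortest_time_route(origin, dest):
            pq, settled = [(0, origin, [origin])], {}
            while pq:
                t, city, path = heapq.heappop(pq)
                if city == dest:
                    return t, path
                if settled.get(city, float("inf")) <= t:
                    continue
                settled[city] = t
                for nxt, minutes in segments.get(city, []):
                    heapq.heappush(pq, (t + minutes, nxt, path + [nxt]))
            return None

        print(shortest_time_route("DEN", "JFK"))  # (255, ['DEN', 'ORD', 'JFK'])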

  8. Quantifying uncertainty in material damage from vibrational data

    SciTech Connect

    Butler, T.; Huhtala, A.; Juntunen, M.

    2015-02-15

    The response of a vibrating beam to a force depends on many physical parameters including those determined by material properties. Damage caused by fatigue or cracks results in local reductions in stiffness parameters and may drastically alter the response of the beam. Data obtained from the vibrating beam are often subject to uncertainties and/or errors typically modeled using probability densities. The goal of this paper is to estimate and quantify the uncertainty in damage modeled as a local reduction in stiffness using uncertain data. We present various frameworks and methods for solving this parameter determination problem. We also describe a mathematical analysis to determine and compute useful output data for each method. We apply the various methods in a specified sequence that allows us to interface the various inputs and outputs of these methods in order to enhance the inferences drawn from the numerical results obtained from each method. Numerical results are presented using both simulated and experimentally obtained data from physically damaged beams.

  9. Quantifying the Consistency of Scientific Databases

    PubMed Central

    Šubelj, Lovro; Bajec, Marko; Mileva Boshkoska, Biljana; Kastrin, Andrej; Levnajić, Zoran

    2015-01-01

    Science is a social process with far-reaching impact on our modern society. In recent years, for the first time we are able to scientifically study the science itself. This is enabled by massive amounts of data on scientific publications that is increasingly becoming available. The data is contained in several databases such as Web of Science or PubMed, maintained by various public and private entities. Unfortunately, these databases are not always consistent, which considerably hinders this study. Relying on the powerful framework of complex networks, we conduct a systematic analysis of the consistency among six major scientific databases. We found that identifying a single "best" database is far from easy. Nevertheless, our results indicate appreciable differences in mutual consistency of different databases, which we interpret as recipes for future bibliometric studies. PMID:25984946

  10. Quantifying uncertainties in wind energy assessment

    NASA Astrophysics Data System (ADS)

    Patlakas, Platon; Galanis, George; Kallos, George

    2015-04-01

    The constant rise of wind energy production and its subsequent penetration of global energy markets during the last decades has resulted in the selection of new sites that present various types of problems. Such problems arise from the variability and uncertainty of wind speed. Studying the lower and upper tails of the wind speed distribution can support the quantification of these uncertainties. Such approaches, focused on extreme wind conditions or on periods below the energy production threshold, are necessary for better management of operations. Towards this direction, different methodologies are presented for the credible evaluation of potential non-frequent/extreme values of these environmental conditions. The approaches used take into consideration the structural design of the wind turbines according to their lifespan, turbine failures, the time needed for repairs, as well as the energy production distribution. In this work, a multi-parametric approach for studying extreme wind speed values is discussed, based on tools of Extreme Value Theory. In particular, the study focuses on extreme wind speed return periods and the persistence of no energy production, based on a 10-year hindcast dataset from a weather modeling system. More specifically, two methods (Annual Maxima and Peaks Over Threshold) were used for the estimation of extreme wind speeds and their recurrence intervals. Additionally, two different methodologies (intensity given duration and duration given intensity, both based on the Annual Maxima method) were applied to calculate the duration of extreme events, combined with their intensity as well as the event frequency. The obtained results show that the proposed approaches converge, at least on the main findings, for each case. It is also remarkable that, despite the moderate wind speed climate of the area, several consecutive days of no energy production are observed.
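
    A minimal sketch of the Annual Maxima step, assuming synthetic data: fit a Generalized Extreme Value distribution to annual maximum wind speeds and read off T-year return levels.

        # GEV fit to annual maxima and return-level estimation.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        # stand-in for 10 years of annual maximum wind speed (m/s)
        annual_maxima = rng.gumbel(loc=22.0, scale=3.0, size=10)

        shape, loc, scale = stats.genextreme.fit(annual_maxima)
        for T in (10, 25, 50):
            # return level = quantile exceeded on average once every T years
            level = stats.genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
            print(f"{T}-year return level: {level:.1f} m/s")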

  11. Quantifying the multiple, environmental benefits of reintroducing the Eurasian Beaver

    NASA Astrophysics Data System (ADS)

    Brazier, Richard; Puttock, Alan; Graham, Hugh; Anderson, Karen; Cunliffe, Andrew; Elliott, Mark

    2016-04-01

    Beavers are ecological engineers with an ability to modify the structure and flow of fluvial systems and create complex wetland environments with dams, ponds and canals. Consequently, beaver activity has potential for river restoration, management and the provision of multiple environmental ecosystem services including biodiversity, flood risk mitigation, water quality and sustainable drinking water provision. With the current debate surrounding the reintroduction of beavers into the United Kingdom, it is critical to monitor the impact of beavers upon the environment. We have developed and implemented a monitoring strategy to quantify the impact of reintroducing the Eurasian Beaver on multiple environmental ecosystem services and river systems at a range of scales. First, the experimental design and preliminary results are presented from the Mid-Devon Beaver Trial, where a family of beavers has been introduced to a 3 ha enclosure situated upon a first order tributary of the River Tamar. The site was instrumented to monitor the flow rate and quality of water entering and leaving it. Additionally, the impacts of beavers upon riparian vegetation structure and water and carbon storage were investigated. Preliminary results indicate that beaver activity, particularly the building of ponds and dams, increases water storage within the landscape and moderates the river response to rainfall. Baseflow is enhanced during dry periods and storm flow is attenuated, potentially reducing the risk of flooding downstream. Initial analysis of water quality indicates that water entering the site (running off intensively managed grasslands upslope) has higher suspended sediment loads and nitrate levels than water leaving the site after moving through the series of beaver ponds. These results suggest beaver activity may also act as a means by which the negative impact of diffuse water pollution from agriculture can be mitigated, thus providing cleaner water in rivers downstream.

  12. NASA GIBS Use in Live Planetarium Shows

    NASA Astrophysics Data System (ADS)

    Emmart, C. B.

    2015-12-01

    The American Museum of Natural History's Hayden Planetarium was rebuilt in 2000 as an immersive theater for scientific data visualization, showing the universe in context with our planet. Specific astrophysical movie productions provide the main daily programming, but interactive control software developed at AMNH allows immersive presentation within a data aggregation of astronomical catalogs called the Digital Universe 3D Atlas. Since 2006, WMS globe-browsing capabilities have been built through a software development collaboration with Sweden's Linkoping University (LiU). The resulting Uniview software, now a product of the company SCISS, is operated by about fifty planetariums around the world, with the ability to network among the sites for global presentations. Public presentation of NASA GIBS has allowed authoritative narratives to be presented within the range of available data, in context with other sources such as Science on a Sphere, NASA Earth Observatory and Google Earth KML resources. Specifically, the NOAA-supported World Views Network conducted a series of presentations across the US that focused on local ecological issues, which could then be expanded in the course of presentation to national and global scales of examination. NASA support for GIBS resources in an easy-access, multi-scale streaming format like WMS has enabled particularly facile presentations of global monitoring like never before. Global networking of theaters for distributed presentations broadens the potential impact of this medium. Archiving and refinement of these presentations has already begun to inform new types of documentary productions that examine pertinent global-interdependency topics.
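
    For reference, a GIBS layer can be requested through a standard WMS GetMap query. The endpoint and layer name below are believed correct for GIBS but should be verified against the current GIBS documentation; the bounding box and date are arbitrary.

        # Build a WMS GetMap URL for a NASA GIBS imagery layer.
        from urllib.parse import urlencode

        GIBS_WMS = "https://gibs.earthdata.nasa.gov/wms/epsg4326/best/wms.cgi"
        params = {
            "SERVICE": "WMS", "REQUEST": "GetMap", "VERSION": "1.3.0",
            "LAYERS": "MODIS_Terra_CorrectedReflectance_TrueColor",
            "CRS": "EPSG:4326", "BBOX": "-90,-180,90,180",
            "WIDTH": 1024, "HEIGHT": 512, "FORMAT": "image/jpeg",
            "TIME": "2015-12-01",
        }
        print(f"{GIBS_WMS}?{urlencode(params)}")  # URL a WMS client would request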

  13. Quantifying peak discharges for historical floods

    USGS Publications Warehouse

    Cook, J.L.

    1987-01-01

    It is usually advantageous to use information regarding historical floods, if available, to define the flood-frequency relation for a stream. Peak stages can sometimes be determined for outstanding floods that occurred many years ago, before systematic gaging of streams began. In the United States, this information is usually not available for more than 100-200 years, but in countries with long cultural histories, such as China, historical flood data are available at some sites as far back as 2,000 years or more. It is important in flood studies to be able to assign a maximum discharge rate and an associated error range to the historical flood. This paper describes the significant characteristics and uncertainties of four commonly used methods for estimating the peak discharge of a flood: (1) rating curve (stage-discharge relation) extension; (2) slope conveyance; (3) slope area; and (4) step backwater. Logarithmic extensions of rating curves are based on theoretical plotting techniques that result in straight-line extensions, provided that channel shape and roughness do not change significantly. The slope-conveyance and slope-area methods are based on the Manning equation, which requires specific data on channel size, shape and roughness, as well as the water-surface slope, for one or more cross-sections in a relatively straight reach of channel. The slope-conveyance method is used primarily for shaping and extending rating curves, whereas the slope-area method is used for specific floods (a worked example follows below). The step-backwater method, also based on the Manning equation, requires more cross-section data than the slope-area method, but has a water-surface profile convergence characteristic that negates the need for a known or estimated water-surface slope. Uncertainties in calculating peak discharge for historical floods may be quite large. Various investigations have shown that errors in calculating peak discharges by the slope-area method under ideal conditions for
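
    The worked example below applies the SI form of the Manning equation, Q = (1/n)·A·R^(2/3)·S^(1/2), to an invented high-water cross-section; real slope-area computations subdivide the reach into several surveyed cross-sections.

        # Manning-equation discharge estimate for one cross-section (SI units).
        def manning_discharge(n, area_m2, wetted_perimeter_m, slope):
            R = area_m2 / wetted_perimeter_m          # hydraulic radius (m)
            return (1.0 / n) * area_m2 * R ** (2.0 / 3.0) * slope ** 0.5

        # hypothetical high-water cross-section: 40 m wide, ~3 m average depth
        A, P = 120.0, 46.0            # flow area (m^2), wetted perimeter (m)
        n, S = 0.035, 0.002           # roughness coefficient, water-surface slope
        print(f"peak discharge ~ {manning_discharge(n, A, P, S):.0f} m^3/s")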

  14. Quantifying the measurement of differential diagnosis.

    PubMed

    Johnson, L A

    1995-01-01

    Differential diagnosis is central to the work and training of all health care professionals. To develop solid differential diagnosis skills, students require practice diagnosing numerous and varied patients. In the absence of real patients, patient simulations are commonly used to provide this range of diagnostic experiences. This study examined the benefits of interactive patient simulations on the diagnostic approaches of beginning dental students (novices) and practicing dentists (experts). The study tested the hypothesis that novices tend to use trial and error, while experts tend to use pattern recognition during differential diagnosis. A second goal of the study was to explore objective and subjective measures of a differential diagnosis approach. Seventy-five subjects comprised two treatment groups: a novice group and an expert group. Each group completed ten patient simulations, and a case study test measured the diagnostic approach. A three-factor MANOVA (p > 0.05) was followed by univariate ANOVAs. The results indicated differences between the diagnostic approaches of experts and novices, and that the subjective Ratings variable and the objective Maximum Decisions and Average Variation variables were the best measures of a differential diagnostic approach. PMID:8591429

  15. Use of tracers to quantify subsurface flow through a mining pit.

    PubMed

    Schladow, S Geoffrey; Clark, Jordan F

    2008-12-01

    Three independent tracer experiments were conducted to quantify the through-flow of water from Herman Pit, an abandoned mercury (Hg) mine pit adjacent to Clear Lake, California, USA. The tracers used were Rhodamine-WT, sulfur hexafluoride, and a mixture of sulfur hexafluoride and neon-22. The tracers were injected into Herman Pit, a generally well-mixed water body of approximately 81,000 m², and the concentrations were monitored in the mine pit, observation wells, and the lake for 2-3 months following each injection. The results for all three experiments showed that the tracer arrived at certain observation wells within days of injection. Comparing all the well data showed a highly heterogeneous response, with a small number of wells showing this near-instantaneous response and others taking months before the tracer was detectable. Tracer was also found in the lake on four occasions over a one-month period, too few to infer any pattern but sufficient to confirm the connection of the two water bodies. Using a simple mass balance model it was possible to determine the effective loss rate through advection for each of the tracers and with this to estimate the through-flow rate. The through-flow rate for all three experiments was approximately 630 L/s, at least 1-2 orders of magnitude larger than previous estimates, all of which had been based on geochemical inferences or other indirect measures of the pit through-flow.
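
    A sketch of the mass-balance inference, under stated assumptions (a hypothetical pit volume, since the abstract reports area rather than volume, and synthetic concentrations chosen to echo the reported ~630 L/s): in a well-mixed pit with through-flow Q, tracer concentration decays as C(t) = C0·exp(−Q·t/V), so Q follows from the slope of log concentration versus time.

        # Estimate through-flow from the exponential decay of tracer concentration.
        import numpy as np

        V = 8.1e5                      # pit volume in m^3 (hypothetical)
        t_days = np.array([1, 7, 14, 28, 56], dtype=float)
        C = np.array([0.94, 0.62, 0.39, 0.15, 0.023])   # relative tracer conc.

        slope, _ = np.polyfit(t_days * 86400.0, np.log(C), 1)  # decay rate, 1/s
        Q = -slope * V                 # advective loss rate (m^3/s)
        print(f"estimated through-flow: {Q * 1000:.0f} L/s")   # ~630 L/s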

  16. Quantifying the cleanliness of glass capillaries.

    PubMed

    Bowman, C L

    1998-01-01

    I used capillary rise methods to investigate the lumenal surface properties of quartz (fused silica, Amersil T-08), borosilicate (Corning 7800), and high-lead glass (Corning 0010) capillaries commonly used to make patch pipets. I calculated the capillary rise and contact angle for water and methanol from weight measurements. The capillary rise was compared with the theoretical maximum value calculated by assuming each fluid perfectly wetted the lumenal surface of the glass (i.e., zero contact angle, which reflects the absence of surface contamination). For borosilicate, high-lead, and quartz capillaries, the rise for water was substantially less than the theoretical maximum rise. Exposure of the borosilicate, lead, and quartz capillaries to several cleaning methods resulted in substantially better--but not perfect--agreement between the theoretical maximum rise and the calculated capillary rise. By contrast, the capillary rise for methanol was almost identical in untreated and cleaned capillaries, but less than its theoretical maximum rise. The residual discrepancy between the observed and theoretical rise for water could not be eliminated by any of the cleaning procedures tried, although some cleaning methods were superior to others. The water solubility of the surface contaminants, deduced from the effectiveness of repeated rinsing, was different for each of the three types of capillaries examined: Corning 7800 > quartz > Corning 0010. A surface film was also detected in quartz tubing with an internal filament. I conclude that these borosilicate, quartz, and high-lead glass capillaries have a film on the lumenal surface, which can be removed using appropriate cleaning methods. The surface contaminants may be unique to each type of capillary and may also be hydrophobic. Two simple methods are presented to quantitate the cleanliness of glass capillary tubing commonly used to make pipets for studies of biological membranes. It is not known if the surface film is of
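
    A worked example of the capillary-rise comparison via Jurin's law, h = 2γ·cosθ/(ρ·g·r), with illustrative numbers: the theoretical maximum rise assumes zero contact angle, so a measured rise below it implies cosθ < 1 and hence surface contamination.

        # Infer an apparent contact angle from measured vs. theoretical rise.
        import math

        gamma = 0.0728      # surface tension of water at ~20 C (N/m)
        rho, g = 998.0, 9.81
        r = 0.5e-3          # capillary inner radius (m), hypothetical

        h_max = 2 * gamma / (rho * g * r)   # zero-contact-angle (clean) rise
        h_obs = 0.021                       # measured rise (m), invented
        theta = math.degrees(math.acos(h_obs / h_max))
        print(f"max rise {h_max * 1000:.1f} mm, observed {h_obs * 1000:.1f} mm, "
              f"apparent contact angle {theta:.0f} deg")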

  17. Quantifying Parkinson's disease progression by simulating gait patterns

    NASA Astrophysics Data System (ADS)

    Cárdenas, Luisa; Martínez, Fabio; Atehortúa, Angélica; Romero, Eduardo

    2015-12-01

    Modern rehabilitation protocols for most neurodegenerative diseases, in particular Parkinson's disease, rely on clinical analysis of gait patterns. Currently, such analysis is highly dependent on both the examiner's expertise and the type of evaluation. The development of evaluation methods with objective measures is therefore crucial. Physical models arise as a powerful alternative to quantify movement patterns and to emulate the progression and performance of specific treatments. This work introduces a novel quantification of Parkinson's disease progression using a physical model that accurately represents the main gait biomarker, the body center of gravity (CoG). The model tracks the whole gait cycle: a coupled double inverted pendulum emulates the leg swing during the single-support phase, and a damper-spring system (SDP) recreates both legs in contact with the ground during the double-support phase. The patterns generated by the proposed model are compared with actual ones learned from 24 subjects in stages 2, 3, and 4. The evaluation demonstrates better performance of the proposed model than a baseline model (SP) composed of a coupled double pendulum and a mass-spring system. The Fréchet distance measured differences between model estimates and real trajectories, showing, for stages 2, 3, and 4, distances of 0.137, 0.155, and 0.38 for the baseline and 0.07, 0.09, and 0.29 for the proposed method.
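
    For completeness, the discrete Fréchet distance used for such comparisons admits a short dynamic-programming implementation (classic formulation with toy trajectories; not the authors' code).

        # Discrete Frechet distance between two polygonal trajectories.
        import math
        from functools import lru_cache

        def frechet(P, Q):
            @lru_cache(maxsize=None)
            def c(i, j):
                d = math.dist(P[i], Q[j])
                if i == 0 and j == 0:
                    return d
                if i == 0:
                    return max(c(0, j - 1), d)
                if j == 0:
                    return max(c(i - 1, 0), d)
                return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)
            return c(len(P) - 1, len(Q) - 1)

        model = [(0.0, 0.0), (0.5, 0.1), (1.0, 0.0)]       # toy CoG trajectory
        measured = [(0.0, 0.05), (0.5, 0.12), (1.0, 0.02)]
        print(f"discrete Frechet distance: {frechet(model, measured):.3f}")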

  18. Quantifying the benefits of vehicle pooling with shareability networks

    PubMed Central

    Santi, Paolo; Resta, Giovanni; Szell, Michael; Sobolevsky, Stanislav; Strogatz, Steven H.; Ratti, Carlo

    2014-01-01

    Taxi services are a vital part of urban transportation, and a considerable contributor to traffic congestion and air pollution causing substantial adverse effects on human health. Sharing taxi trips is a possible way of reducing the negative impact of taxi services on cities, but this comes at the expense of passenger discomfort quantifiable in terms of a longer travel time. Due to computational challenges, taxi sharing has traditionally been approached on small scales, such as within airport perimeters, or with dynamical ad hoc heuristics. However, a mathematical framework for the systematic understanding of the tradeoff between collective benefits of sharing and individual passenger discomfort is lacking. Here we introduce the notion of shareability network, which allows us to model the collective benefits of sharing as a function of passenger inconvenience, and to efficiently compute optimal sharing strategies on massive datasets. We apply this framework to a dataset of millions of taxi trips taken in New York City, showing that with increasing but still relatively low passenger discomfort, cumulative trip length can be cut by 40% or more. This benefit comes with reductions in service cost, emissions, and with split fares, hinting toward a wide passenger acceptance of such a shared service. Simulation of a realistic online system demonstrates the feasibility of a shareable taxi service in New York City. Shareability as a function of trip density saturates fast, suggesting effectiveness of the taxi sharing system also in cities with much sparser taxi fleets or when willingness to share is low. PMID:25197046

  19. A Methodology for Quantifying Heart Function in the Embryonic Zebrafish

    NASA Astrophysics Data System (ADS)

    Johnson, Brennan; Garrity, Deborah; Dasi, Lakshmi

    2012-11-01

    Several studies have linked epigenetic factors such as blood flow dynamics and cardiac function to proper heart development. To better understand this process, it is essential to develop robust quantitative methods to investigate the blood dynamics and wall kinematics in vivo. Here, we develop a methodology that can be used throughout the early stages of development which requires no specialized equipment other than a bright field microscope and high-speed camera. We use the embryonic zebrafish as our model due to its superb optical access and widespread acceptance as a powerful model for human heart development. Using these methods, we quantify blood flow rates, stroke volume, cardiac output, ejection fraction, and other important parameters related to heart function. We also investigate the pumping mechanics from heart tube to looped configuration. We show that although the mechanism changes fundamentally, it does so in a continuous fashion that can incorporate combined pumping mechanisms at intermediate stages. This work provides a basis for quantitatively comparing normal and abnormal heart development, and may help us gain a better understanding of congenital heart defects. Funded by NSF.

  20. Quantifying the benefits of vehicle pooling with shareability networks.

    PubMed

    Santi, Paolo; Resta, Giovanni; Szell, Michael; Sobolevsky, Stanislav; Strogatz, Steven H; Ratti, Carlo

    2014-09-16

    Taxi services are a vital part of urban transportation, and a considerable contributor to traffic congestion and air pollution causing substantial adverse effects on human health. Sharing taxi trips is a possible way of reducing the negative impact of taxi services on cities, but this comes at the expense of passenger discomfort quantifiable in terms of a longer travel time. Due to computational challenges, taxi sharing has traditionally been approached on small scales, such as within airport perimeters, or with dynamical ad hoc heuristics. However, a mathematical framework for the systematic understanding of the tradeoff between collective benefits of sharing and individual passenger discomfort is lacking. Here we introduce the notion of shareability network, which allows us to model the collective benefits of sharing as a function of passenger inconvenience, and to efficiently compute optimal sharing strategies on massive datasets. We apply this framework to a dataset of millions of taxi trips taken in New York City, showing that with increasing but still relatively low passenger discomfort, cumulative trip length can be cut by 40% or more. This benefit comes with reductions in service cost, emissions, and with split fares, hinting toward a wide passenger acceptance of such a shared service. Simulation of a realistic online system demonstrates the feasibility of a shareable taxi service in New York City. Shareability as a function of trip density saturates fast, suggesting effectiveness of the taxi sharing system also in cities with much sparser taxi fleets or when willingness to share is low.
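
    A minimal sketch of the shareability-network construction described above: trips become nodes, an edge links two trips whose combined routing saves distance (a crude stand-in for the paper's delay constraint), and a maximum-weight matching selects disjoint pairs of trips to share. The toy trips and straight-line travel model are invented for illustration.

        # Shareability network + maximum-weight matching on toy trips.
        import itertools
        import networkx as nx

        # (pickup, dropoff) points on a toy 2D city; straight-line travel assumed
        trips = [((0, 0), (5, 5)), ((0, 1), (5, 6)),
                 ((9, 0), (0, 9)), ((1, 0), (6, 5))]

        def dist(a, b):
            return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

        def saving(t1, t2):
            """Distance saved serving both trips as p1->p2->d1->d2 (one ordering)."""
            (p1, d1), (p2, d2) = t1, t2
            solo = dist(p1, d1) + dist(p2, d2)
            shared = dist(p1, p2) + dist(p2, d1) + dist(d1, d2)
            return solo - shared

        G = nx.Graph()
        for i, j in itertools.combinations(range(len(trips)), 2):
            s = max(saving(trips[i], trips[j]), saving(trips[j], trips[i]))
            if s > 0:                      # shareable with net benefit
                G.add_edge(i, j, weight=s)
        print(nx.max_weight_matching(G))   # disjoint pairs of trips to combine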