
Quantifying disability: data, methods and results.  

PubMed Central

Conventional methods for collecting, analysing and disseminating data and information on disability in populations have relied on cross-sectional censuses and surveys which measure prevalence in a given period. While this may be relevant for defining the extent and demographic pattern of disabilities in a population, and thus indicating the need for rehabilitative services, prevention requires detailed information on the underlying diseases and injuries that cause disabilities. The Global Burden of Disease methodology described in this paper provides a mechanism for quantifying the health consequences of the years of life lived with disabilities by first estimating the age-sex-specific incidence rates of underlying conditions, and then mapping these to a single disability index which collectively reflects the probability of progressing to a disability, the duration of life lived with the disability, and the approximate severity of the disability in terms of activity restriction. Detailed estimates of the number of disability-adjusted life years (DALYs) lived are provided in this paper, for eight geographical regions. The results should be useful to those concerned with planning health services for the disabled and, more particularly, with determining policies to prevent the underlying conditions which give rise to serious disabling sequelae. PMID:8062403

Murray, C. J.; Lopez, A. D.
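The years-lived-with-disability quantity described in the abstract above reduces to a product of incidence, the probability of progressing to disability, the duration of the disability, and a severity weight. A minimal sketch of that arithmetic (all numbers are invented for illustration and are not taken from the Global Burden of Disease study):

```python
# Illustrative sketch, not the GBD implementation: years lived with
# disability (YLD) for a single condition, following the logic in the
# abstract: incident cases, probability of progressing to disability,
# average duration, and a severity (disability) weight.

def yld(incident_cases, p_disability, duration_years, disability_weight):
    """Years lived with disability contributed by one condition."""
    return incident_cases * p_disability * duration_years * disability_weight

# Hypothetical condition: 10,000 incident cases, 30% progress to a
# disability lasting 5 years on average, with severity weight 0.4.
print(round(yld(10_000, 0.30, 5.0, 0.4)))  # -> 6000
```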



Quantifying EMI resulting from finite-impedance reference planes  

Microsoft Academic Search

Parasitic inductance in printed circuit board (PCB) geometries can detrimentally impact the electromagnetic interference (EMI) performance and signal integrity of high-speed digital designs. This paper identifies and quantifies the parameters that affect the inductance of some typical PCB geometries. Closed-form expressions are provided for estimating the inductances of simple trace and ground plane configurations.

David M. Hockanson; J. L. Drewniak; Todd H. Hubing; Thomas P. Van Doren; Fei Sha; Cheung-Wei Lam
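The abstract above refers to closed-form expressions for trace and ground-plane inductance. As a hedged illustration of the kind of estimate involved, the sketch below uses the textbook parallel-plate approximation for a wide trace over a solid return plane; it is not a formula taken from the paper itself:

```python
import math

MU0 = 4 * math.pi * 1e-7  # permeability of free space, H/m

def microstrip_loop_inductance(height_m, width_m, length_m):
    """Loop inductance of a wide trace over a solid return plane.

    Parallel-plate approximation L = mu0 * h * l / w, valid when the
    trace width w is much larger than the height h above the plane
    (fringing fields neglected). An assumption for illustration, not
    one of the paper's closed-form expressions.
    """
    return MU0 * height_m * length_m / width_m

# 5 cm trace, 2 mm wide, 0.2 mm above the plane:
L = microstrip_loop_inductance(0.2e-3, 2e-3, 5e-2)
print(f"{L * 1e9:.2f} nH")  # -> 6.28 nH
```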







Energy technology and energy planning  

E-print Network

[Snippet garbled in extraction; recoverable headings mention wind energy, wind turbines, wind energy systems, wind resources and wind systems, energy planning in developing countries, and environmental impact of atmospheric processes.]


Quantifying IOHDR brachytherapy underdosage resulting from an incomplete scatter environment  

SciTech Connect

Purpose: Most brachytherapy planning systems are based on a dose calculation algorithm that assumes an infinite scatter environment surrounding the target volume and applicator. Dosimetric errors from this assumption are negligible. However, in intraoperative high-dose-rate brachytherapy (IOHDR) where treatment catheters are typically laid either directly on a tumor bed or within applicators that may have little or no scatter material above them, the lack of scatter from one side of the applicator can result in underdosage during treatment. This study was carried out to investigate the magnitude of this underdosage. Methods: IOHDR treatment geometries were simulated using a solid water phantom beneath an applicator with varying amounts of bolus material on the top and sides of the applicator to account for missing tissue. Treatment plans were developed for 3 different treatment surface areas (4 x 4, 7 x 7, 12 x 12 cm{sup 2}), each with prescription points located at 3 distances (0.5 cm, 1.0 cm, and 1.5 cm) from the source dwell positions. Ionization measurements were made with a liquid-filled ionization chamber linear array with a dedicated electrometer and data acquisition system. Results: Measurements showed that the magnitude of the underdosage varies from about 8% to 13% of the prescription dose as the prescription depth is increased from 0.5 cm to 1.5 cm. This treatment error was found to be independent of the irradiated area and strongly dependent on the prescription distance. Furthermore, for a given prescription depth, measurements in planes parallel to an applicator at distances up to 4.0 cm from the applicator plane showed that the dose delivery error is equal in magnitude throughout the target volume. Conclusion: This study demonstrates the magnitude of underdosage in IOHDR treatments delivered in a geometry that may not result in a full scatter environment around the applicator. 
This implies that the target volume and, specifically, the prescription depth (tumor bed) may get a dose significantly less than prescribed. It might be clinically relevant to correct for this inaccuracy.

Raina, Sanjay; Avadhani, Jaiteerth S.; Oh, Moonseong; Malhotra, Harish K.; Jaggernauth, Wainwright; Kuettel, Michael R.; Podgorsak, Matthew B. (all: Department of Radiation Medicine, Roswell Park Cancer Institute, Buffalo, NY, United States)
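The underdosage reported above grows roughly linearly with prescription depth, from about 8% at 0.5 cm to about 13% at 1.5 cm. A small sketch interpolating between the reported endpoints (a reading aid for the numbers in the abstract, not a clinical model):

```python
def underdosage_percent(depth_cm):
    """Linear interpolation between the endpoints reported in the study:
    ~8% underdosage at 0.5 cm and ~13% at 1.5 cm prescription depth.
    Intermediate values are an assumption, not measured data."""
    d0, u0 = 0.5, 8.0    # depth (cm), underdosage (%) at shallow end
    d1, u1 = 1.5, 13.0   # depth (cm), underdosage (%) at deep end
    return u0 + (u1 - u0) * (depth_cm - d0) / (d1 - d0)

print(underdosage_percent(1.0))  # -> 10.5
```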



Discovering and Quantifying Mean Streets: A Summary of Results Technical Report  

E-print Network

Mean streets are subsets of a spatial network whose attribute values are significantly higher than expected. Discovering and quantifying mean streets is an important problem with many applications, such as detecting high-crime streets; mining spatial networks is computationally very expensive due to the difficulty of characterizing

Shekhar, Shashi


Optimizing Search by Showing Results in Context, Susan Dumais and Edward Cutrell  

E-print Network

[Snippet garbled in extraction; recoverable fragments mention interfaces integrating semantic category information with search results, interfaces based on familiar ranked lists, coverage available with standard search engines, and related work on classification.]

Chen, Hao


Astronomy Diagnostic Test Results Reflect Course Goals and Show Room for Improvement  

ERIC Educational Resources Information Center

The results of administering the Astronomy Diagnostic Test (ADT) to introductory astronomy students at Henry Ford Community College over three years have shown gains comparable with national averages. Results have also accurately corresponded to course goals, showing greater gains in topics covered in more detail, and lower gains in topics covered in less detail.

LoPresto, Michael C.



Quantifying viruses and bacteria in wastewater: Results, interpretation methods, and quality control  

USGS Publications Warehouse

Membrane bioreactors (MBR), used for wastewater treatment in Ohio and elsewhere in the United States, have pore sizes small enough to theoretically reduce concentrations of protozoa and bacteria, but not viruses. Sampling for viruses in wastewater is seldom done and not required. Instead, the bacterial indicators Escherichia coli (E. coli) and fecal coliforms are the required microbial measures of effluents for wastewater-discharge permits. Information is needed on the effectiveness of MBRs in removing human enteric viruses from wastewaters, particularly as compared to conventional wastewater treatment before and after disinfection. A total of 73 regular and 28 quality-control (QC) samples were collected at three MBR and two conventional wastewater plants in Ohio during 23 regular and 3 QC sampling trips in 2008-10. Samples were collected at various stages in the treatment processes and analyzed for bacterial indicators E. coli, fecal coliforms, and enterococci by membrane filtration; somatic and F-specific coliphage by the single agar layer (SAL) method; adenovirus, enterovirus, norovirus GI and GII, rotavirus, and hepatitis A virus by molecular methods; and viruses by cell culture. While addressing the main objective of the study-comparing removal of viruses and bacterial indicators in MBR and conventional plants-it was realized that work was needed to identify data analysis and quantification methods for interpreting enteric virus and QC data. Therefore, methods for quantifying viruses, qualifying results, and applying QC data to interpretations are described in this report. During each regular sampling trip, samples were collected (1) before conventional or MBR treatment (post-preliminary), (2) after secondary or MBR treatment (post-secondary or post-MBR), (3) after tertiary treatment (one conventional plant only), and (4) after disinfection (post-disinfection). 
Glass-wool fiber filtration was used to concentrate enteric viruses from large volumes, and small volume grab samples were collected for direct-plating analyses for bacterial indicators and coliphage. After filtration, the viruses were eluted from the filter and further concentrated. The final concentrated sample volume (FCSV) was used for enteric virus analysis by use of two methods-cell culture and a molecular method, polymerase chain reaction (PCR). Quantitative PCR (qPCR) for DNA viruses and quantitative reverse-transcriptase PCR (qRT-PCR) for RNA viruses were used in this study. To support data interpretations, the assay limit of detection (ALOD) was set for each virus assay and used to determine sample reporting limits (SRLs). For qPCR and qRT-PCR the ALOD was an estimated value because it was not established according to established method detection limit procedures. The SRLs were different for each sample because effective sample volumes (the volume of the original sample that was actually used in each analysis) were different for each sample. Effective sample volumes were much less than the original sample volumes because of reductions from processing steps and (or) from when dilutions were made to minimize the effects from PCR-inhibiting substances. Codes were used to further qualify the virus data and indicate the level of uncertainty associated with each measurement. Quality-control samples were used to support data interpretations. Field and laboratory blanks for bacteria, coliphage, and enteric viruses were all below detection, indicating that it was unlikely that samples were contaminated from equipment or processing procedures. The absolute value log differences (AVLDs) between concurrent replicate pairs were calculated to identify the variability associated with each measurement. 
For bacterial indicators and coliphage, the AVLD results indicated that concentrations <10 colony-forming units or plaque-forming units per 100 mL can differ between replicates by as much as 1 log, whereas higher concentrations can differ by as much as 0.3 log. The AVLD results for viruses indicated that differences between replicates can be as great as 1.2 log.

Francy, Donna S.; Stelzer, Erin A.; Bushon, Rebecca N.; Brady, Amie M.G.; Mailot, Brian E.; Spencer, Susan K.; Borchardt, Mark A.; Elber, Ashley G.; Riddell, Kimberly R.; Gellner, Terry M.
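The absolute value log difference (AVLD) between replicate pairs described above is simply the absolute difference of the base-10 logarithms of the two concentrations. A minimal sketch (the example concentrations are invented, chosen only to land on the thresholds quoted in the report):

```python
import math

def avld(c1, c2):
    """Absolute value of the log10 difference between two replicate
    concentrations (e.g. CFU or PFU per 100 mL)."""
    return abs(math.log10(c1) - math.log10(c2))

# Low-concentration replicates can differ by as much as ~1 log:
print(round(avld(3, 30), 2))      # -> 1.0
# Higher-concentration replicates differed by at most ~0.3 log:
print(round(avld(500, 1000), 2))  # -> 0.3
```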



RESULTS & CONCLUSION The analysis (above) shows that there are multiple reaches of Trout Brook and Smith Brook  

E-print Network

The analysis shows that there are multiple reaches of Trout Brook and Smith Brook in eastern Cortland County. The centerlines of Trout Brook and Smith Brook were traced, along with some tributaries. Essentially this was a proof-of-concept project.

Barclay, David J.


Quantifying the effects of root reinforcing on slope stability: results of the first tests with a new shearing device  

NASA Astrophysics Data System (ADS)

The role of vegetation in preventing shallow soil mass movements such as shallow landslides and soil erosion is generally well recognized and, correspondingly, soil bioengineering on steep slopes has been widely used in practice. However, the precise effectiveness of vegetation regarding slope stability is still difficult to determine. A recently designed inclinable shearing device for large-scale vegetated soil samples allows quantitative evaluation of the additional shear strength provided by roots of specific plant species. In the following we describe the results of a first series of shear strength experiments with this apparatus focusing on root reinforcement of White Alder (Alnus incana) and Silver Birch (Betula pendula) in large soil block samples (500 x 500 x 400 mm). The specimens, with partly saturated soil of a maximum grain size of 10 mm, were slowly sheared at an inclination of 35° with low normal stresses of 3.2 kPa, accounting for natural conditions on a typical slope prone to mass movements. Measurements during the experiments involved shear stress, shear displacement and normal displacement, all recorded with high accuracy. In addition, dry weights of sprout and roots were measured to quantify plant growth of the planted specimens. The results with the new apparatus indicate a considerable reinforcement of the soil due to plant roots, i.e. maximum shear stresses of the vegetated specimens were substantially higher compared to non-vegetated soil, and the additional strength was a function of species and growth. Soil samples with seedlings planted five months prior to the test yielded a marked increase in maximum shear stress of 250% for White Alder and 240% for Silver Birch compared to non-vegetated soil. The results of a second test series with 12-month-old plants showed even clearer enhancements in maximum shear stress (390% for Alder and 230% for Birch). 
Overall, the results of this first series of shear strength experiments with the new apparatus using planted and unplanted soil specimens confirm the importance of plants in soil stabilisation. Furthermore, they demonstrate the suitability of the apparatus to quantify the additional strength of specific vegetation as a function of species and growth under clearly defined conditions in the laboratory.

Rickli, Christian; Graf, Frank



Application of bone scans for prostate cancer staging: Which guideline shows better result?  

PubMed Central

Introduction: We evaluated the accuracy of current guidelines by analyzing bone scan results and clinical parameters of patients with prostate cancer to determine the optimal guideline for predicting bone metastasis. Methods: We retrospectively analyzed patients who were diagnosed with prostate cancer and who underwent a bone scan. Bone metastasis was confirmed by bone scan results with clinical and radiological follow-up. Serum prostate-specific antigen, Gleason score, percent of positive biopsy cores, clinical staging and bone scan results were analyzed. We analyzed the diagnostic performance in predicting bone metastasis of the guidelines of the European Association of Urology (EAU), the American Urological Association (AUA), and the National Comprehensive Cancer Network (NCCN), as well as Briganti's classification and regression tree (CART). We also compared the percent of positive biopsy cores between patients with and without bone metastases. Results: A total of 167 of 806 patients had bone metastases. Receiver operating characteristic curve analysis revealed that the AUA and EAU guidelines were better for detecting bone metastases than were Briganti's CART and the NCCN guideline. No significant difference was observed between the AUA and EAU guidelines. Patients with bone metastases had a higher percent of positive cores than did patients without metastases (cut-off value >55.6). Conclusion: The EAU and AUA guidelines showed better results than did Briganti's CART and the NCCN guideline for predicting bone metastasis in the enrolled patients. A bone scan is strongly recommended for patients who have a higher percent of positive cores and who meet the EAU and AUA guidelines. PMID:25210554

Chong, Ari; Hwang, Insang; Ha, Jung-min; Yu, Seong Hyeon; Hwang, Eu Chang; Yu, Ho Song; Kim, Sun Ouck; Jung, Seung-Il; Kang, Taek Won; Kwon, Dong Deuk; Park, Kwangsung



Computational Methods continued In previous work, we showed that while the LDA results systematically under-  

E-print Network

[Snippet garbled in extraction; recoverable fragments mention a vacuum region, interface properties with Li modeled using a periodic array, the GGA functional, fractional atomic coordinates that were very similar, and Brillouin-zone sampling at least as dense as 10^-3 bohr^-3 per k-point.]

Holzwarth, Natalie


An analysis of semiclassical radiation from single particle quantum currents shows surprising results  

E-print Network

Classical electromagnetic radiation from quantum currents and densities is calculated. For the free Schrödinger equation with no external force, it is found that the classical radiation is zero to all orders of the multipole expansion. This is true of mixed or pure states for the charged particle. It is a non-trivial and surprising result. A similar result is found for the Klein-Gordon currents when the wave function consists of only positive energy solutions. For the Dirac equation it is found that radiation is suppressed at lower frequencies but is not zero at all frequencies. Implications of these results for the interpretation of quantum mechanics are discussed.

Mark P. Davidson



We show results from joint TES-OMI retrievals for ...  

E-print Network

[Snippet garbled in extraction; recoverable fragments mention the Trinidad Head sonde station (41°N, 124.1°W), other corroborative data not currently utilized, interest near the planetary boundary layer, an O3 volume-mixing-ratio profile, and jackknife checks of results against the Trinidad Head sonde in conjunction with surface ozone sites.]


Trial results show high remission rate in leukemia following immune cell therapy

Children and young adults (age 1 to age 30) with chemotherapy-resistant B-cell acute lymphoblastic leukemia (ALL) experienced high remission rates following treatment with an experimental immunotherapy. Results demonstrated that the immunotherapy treatment had anti-leukemia effects in patients and that the treatment was feasible and safe.


NIH trial shows promising results in treating a lymphoma in young people

Patients with a type of cancer known as primary mediastinal B-cell lymphoma who received infusions of chemotherapy, but who did not have radiation therapy to an area of the thorax known as the mediastinum, had excellent outcomes, according to clinical trial results.


Lung cancer trial results show mortality benefit with low-dose CT

The NCI has released initial results from a large-scale test of screening methods to reduce deaths from lung cancer by detecting cancers at relatively early stages. The National Lung Screening Trial, a randomized national trial involving more than 53,000 current and former heavy smokers ages 55 to 74, compared the effects of two screening procedures for lung cancer -- low-dose helical computed tomography (CT) and standard chest X-ray -- on lung cancer mortality and found 20 percent fewer lung cancer deaths among trial participants screened with low-dose helical CT.


Updated clinical results show experimental agent ibrutinib as highly active in CLL patients

Updated results from a Phase Ib/II clinical trial led by the Ohio State University Comprehensive Cancer Center Arthur G. James Cancer Hospital and Richard J. Solove Research Institute indicate that a novel therapeutic agent for chronic lymphocytic leukemia (CLL) is highly active and well tolerated in patients who have relapsed and are resistant to other therapy. The agent, ibrutinib (PCI-32765), is the first drug designed to target Bruton's tyrosine kinase (BTK), a protein essential for CLL-cell survival and proliferation. CLL is the most common form of leukemia, with about 15,000 new cases annually in the U.S. About 4,400 Americans die of the disease each year.


Results From Mars Show Electrostatic Charging of the Mars Pathfinder Sojourner Rover  

NASA Technical Reports Server (NTRS)

Indirect evidence (dust accumulation) has been obtained indicating that the Mars Pathfinder rover, Sojourner, experienced electrostatic charging on Mars. Lander camera images of the Sojourner rover provide distinctive evidence of dust accumulation on rover wheels during traverses, turns, and crabbing maneuvers. The sol 22 (22nd Martian "day" after Pathfinder landed) end-of-day image clearly shows fine red dust concentrated around the wheel edges with additional accumulation in the wheel hubs. A sol 41 image of the rover near the rock "Wedge" (see the next image) shows a more uniform coating of dust on the wheel drive surfaces with accumulation in the hubs similar to that in the previous image. In the sol 41 image, note particularly the loss of black-white contrast on the Wheel Abrasion Experiment strips (center wheel). This loss of contrast was also seen when dust accumulated on test wheels in the laboratory. We believe that this accumulation occurred because the Martian surface dust consists of clay-sized particles, similar to those detected by Viking, which have become electrically charged. By adhering to the wheels, the charged dust carries a net nonzero charge to the rover, raising its electrical potential relative to its surroundings. Similar charging behavior was routinely observed in an experimental facility at the NASA Lewis Research Center, where a Sojourner wheel was driven in a simulated Martian surface environment. There, as the wheel moved and accumulated dust (see the following image), electrical potentials in excess of 100 V (relative to the chamber ground) were detected by a capacitively coupled electrostatic probe located 4 mm from the wheel surface. The measured wheel capacitance was approximately 80 picofarads (pF), and the calculated charge, 8 x 10^-9 coulombs (C). Voltage differences of 100 V and greater are believed sufficient to produce Paschen electrical discharge in the Martian atmosphere. 
With an accumulated net charge of 8 x 10^-9 C, and an average arc time of 1 msec, arcs can also occur with estimated arc currents approaching 10 milliamperes (mA). Discharges of this magnitude could interfere with the operation of sensitive electrical or electronic elements and logic circuits. Sojourner rover wheel tested in laboratory before launch to Mars. Before launch, we believed that the dust would become triboelectrically charged as it was moved about and compacted by the rover wheels. In all cases observed in the laboratory, the test wheel charged positively, and the wheel tracks charged negatively. Dust samples removed from the laboratory wheel averaged a few to tens of micrometers in size (clay size). Coarser grains were left behind in the wheel track. On Mars, grain size estimates of 2 to 10 µm were derived for the Martian surface materials from the Viking Gas Exchange Experiment. These size estimates approximately match the laboratory samples. Our tentative conclusion for the Sojourner observations is that fine clay-sized particles acquired an electrostatic charge during rover traverses and adhered to the rover wheels, carrying electrical charge to the rover. Since the Sojourner rover carried no instruments to measure this mission's onboard electrical charge, confirmatory measurements from future rover missions on Mars are desirable so that the physical and electrical properties of the Martian surface dust can be characterized. Sojourner was protected by discharge points, and Faraday cages were placed around sensitive electronics. But larger systems than Sojourner are being contemplated for missions to the Martian surface in the foreseeable future. The design of such systems will require a detailed knowledge of how they will interact with their environment. Validated environmental interaction models and guidelines for the Martian surface must be developed so that design engineers can test new ideas prior to cutting hardware. 
These models and guidelines cannot be validated without actual flight data. Electrical charging of vehicles and, one day, astronauts moving across the Martian surface will therefore need to be measured and characterized.

Kolecki, Joseph C.; Siebert, Mark W.
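The charge figure quoted in the abstract above follows directly from Q = C·V with the measured wheel capacitance (~80 pF) and the observed potential (~100 V); a quick arithmetic check:

```python
# Verifying the charge quoted in the abstract: Q = C * V.
capacitance_f = 80e-12   # measured wheel capacitance, 80 pF
voltage_v = 100.0        # observed potential relative to chamber ground

charge_c = capacitance_f * voltage_v
print(charge_c)  # -> 8e-09, matching the 8 x 10^-9 C quoted in the text
```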



Animation shows promise in initiating timely cardiopulmonary resuscitation: results of a pilot study.  


Delayed responses during cardiac arrest are common. Timely interventions during cardiac arrest have a direct impact on patient survival. Integration of technology in nursing education is crucial to enhance teaching effectiveness. The goal of this study was to investigate the effect of animation on nursing students' response time to cardiac arrest, including initiation of timely chest compression. Nursing students were randomized into experimental and control groups prior to practicing in a high-fidelity simulation laboratory. The experimental group was educated, by discussion and animation, about the importance of starting cardiopulmonary resuscitation upon recognizing an unresponsive patient. Afterward, a discussion session allowed students in the experimental group to gain more in-depth knowledge about the most recent changes in the cardiac resuscitation guidelines from the American Heart Association. A linear mixed model was run to investigate differences in time of response between the experimental and control groups while controlling for differences in those with additional degrees, prior code experience, and basic life support certification. The experimental group had a faster response time compared with the control group and initiated timely cardiopulmonary resuscitation upon recognition of deteriorating conditions (P < .0001). The results demonstrated the efficacy of combined teaching modalities for timely cardiopulmonary resuscitation. Providing opportunities for repetitious practice when a patient's condition is deteriorating is crucial for teaching safe practice. PMID:24473120

Attin, Mina; Winslow, Katheryn; Smith, Tyler



QUantifying the Aerosol Direct and Indirect Effect over Eastern Mediterranean from Satellites (QUADIEEMS): Overview and preliminary results  

NASA Astrophysics Data System (ADS)

An overview and preliminary results of the research implemented within the framework of the QUADIEEMS project are presented. For the purposes of the project, satellite data from five sensors (MODIS aboard EOS TERRA, MODIS aboard EOS AQUA, TOMS aboard Earth Probe, OMI aboard EOS AURA and CALIOP aboard CALIPSO) are used in conjunction with meteorological data from the ECMWF ERA-Interim reanalysis, data from a global chemical-aerosol-transport model, and simulation results from a regional climate model (RegCM4) coupled with a simplified aerosol scheme. QUADIEEMS focuses on the Eastern Mediterranean [30°N-45°N, 17.5°E-37.5°E], a region situated at the crossroads of different aerosol types and thus ideal for the investigation of the direct and indirect effects of various aerosol types at high spatial resolution. The project consists of five components. First, raw data from various databases are acquired, analyzed and spatially homogenized, with the outcome being a high-resolution (0.1x0.1 degree) and a moderate-resolution (1.0x1.0 degree) gridded dataset of aerosol and cloud optical properties. The marine, dust and anthropogenic fractions of aerosols over the region are quantified making use of the homogenized dataset. Regional climate model simulations with RegCM4/aerosol are also implemented for the greater European region for the period 2000-2010 at a resolution of 50 km. RegCM4's ability to simulate AOD550 over Europe is evaluated. The aerosol-cloud relationships, for sub-regions of the Eastern Mediterranean characterized by the presence of predominant aerosol types, are examined. The aerosol-cloud relationships are also examined taking into account the relative position of aerosol and cloud layers as defined by CALIPSO observations. 
Within the final component of the project, results and data that emerged from all the previous components are used in satellite-based parameterizations in order to quantify the direct and indirect (first) radiative effect of the different aerosol types at a resolution of 0.1x0.1 degrees. The procedure is repeated using a 1.0x1.0 degree resolution, in order to examine the footprint of the aerosol direct and indirect effects. The project ends with the evaluation of REGCM4's ability to simulate the aerosol direct radiative effect over the region. QUADIEEMS is co-financed by the European Social Fund (ESF) and national resources under the operational programme Education and Lifelong Learning (EdLL) within the framework of the Action "Supporting Postdoctoral Researchers".

Georgoulias, Aristeidis K.; Zanis, Prodromos; Pöschl, Ulrich; Kourtidis, Konstantinos A.; Alexandri, Georgia; Ntogras, Christos; Marinou, Eleni; Amiridis, Vassilis



Quantifying Contextuality  

NASA Astrophysics Data System (ADS)

Contextuality is central both to the foundations of quantum theory and to novel information processing tasks. Despite some recent proposals, it still faces a fundamental problem: how to quantify its presence? In this work, we provide a universal framework for quantifying contextuality. We conduct two complementary approaches: (i) the bottom-up approach, where we introduce a communication game which grasps the phenomenon of contextuality in a quantitative manner; (ii) the top-down approach, where we just postulate two measures, relative entropy of contextuality and contextuality cost, analogous to existing measures of nonlocality (a special case of contextuality). We then match the two approaches by showing that the measure emerging from the communication scenario turns out to be equal to the relative entropy of contextuality. Our framework allows for the quantitative, resource-type comparison of completely different games. We give analytical formulas for the proposed measures for some contextual systems, showing in particular that the Peres-Mermin game is an order of magnitude more contextual than that of Klyachko et al. Furthermore, we explore properties of these measures, such as monotonicity and additivity.

Grudka, A.; Horodecki, K.; Horodecki, M.; Horodecki, P.; Horodecki, R.; Joshi, P.; Kłobus, W.; Wójcik, A.
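The relative entropy of contextuality postulated above builds on the Kullback-Leibler divergence between probability distributions. A toy illustration of the underlying divergence only (the two distributions are invented and do not come from the paper's games):

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) in bits.

    The paper's relative entropy of contextuality minimizes a quantity
    of this form over non-contextual models; here we only illustrate
    the divergence itself on two toy distributions.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]   # hypothetical observed distribution
q = [0.9, 0.1]   # hypothetical model distribution
print(round(relative_entropy(p, q), 3))  # -> 0.737
```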






Quantifying Reemission Of Mercury From Terrestrial And Aquatic Systems Using Stable Isotopes: Results From The Experimental Lakes Area METAALICUS Study  

NASA Astrophysics Data System (ADS)

This study represents the first attempt to directly quantify the re-emission of deposited Hg. This is crucial for understanding whether the Hg emitted from natural surfaces is of geological origin or stems from re-emission of recently deposited Hg. Three stable Hg isotopes are being added experimentally to a headwater lake, its wetlands, and its watershed in a whole-ecosystem manipulation study at the Experimental Lakes Area in Canada. Our overall objective is to determine the link between atmospheric deposition and Hg in fish, but numerous aspects of the biogeochemical cycling of Hg are being addressed during METAALICUS (Mercury Experiment to Assess Atmospheric Loading in Canada and the U.S.), including Hg re-emission. Pilot studies in 1999-2000 applied enriched 200Hg to isolated upland and wetland plots, and to lake enclosures. Fluxes were measured with dynamic chambers for several months. The 200Hg spike was quickly detected in ground-level air (e.g. 5 ng/m3), suggesting rapid initial volatilization of the new Hg. Initial 200Hg fluxes exceeded those of ambient Hg, but emissions of 200Hg decreased to non-detectable levels within 3 months; about 5% of the applied 200Hg spike was emitted from uplands and about 10% from wetlands. The 200Hg spike (representing new deposition) was generally more readily volatilized than was ambient (old) Hg at both sites. Mercury evasion to the atmosphere from a lake enclosure was also measured and compared with the flux estimated from measured dissolved gaseous mercury (DGM). The introduction of the tracer spike was followed by increased concentrations of DGM and higher fluxes to the atmosphere. In some cases, the observed and calculated fluxes were similar; however, the observed flux often exceeded the calculated flux significantly, suggesting that DGM concentration in the water column alone is a poor predictor of gaseous mercury evasion.
A substantially larger fraction of the newly deposited Hg was re-emitted from the lake than from wetlands or from upland soils. The whole-ecosystem manipulation is now underway at ELA Lake 658. Addition of 200Hg (to uplands), 202Hg (lake), and 199Hg (wetlands) commenced in 2001 and was completed in June 2003. These data are now being analyzed, and appear to support the behavior seen in the pilot studies; final results will be presented.

Lindberg, S. E.; Southworth, G.; Peterson, M.; Hintelmann, H.; Graydon, J.; St. Louis, V.; Amyot, M.; Krabbenhoft, D.



News Note: Long-term Results from Study of Tamoxifen and Raloxifene Shows Lower Toxicities of Raloxifene

Initial results in 2006 of the NCI-sponsored Study of Tamoxifen and Raloxifene (STAR) showed that a common osteoporosis drug, raloxifene, prevented breast cancer to the same degree as the drug tamoxifen, which had been in use for many years for breast cancer prevention as well as treatment, but with fewer serious side effects. The longer-term results show that raloxifene retained 76 percent of the effectiveness of tamoxifen in preventing invasive disease and grew closer to tamoxifen in preventing noninvasive disease, while remaining far less toxic; in particular, there was significantly less endometrial cancer with raloxifene use.


We describe initial results which show "live" ultrasound echography data visualized within a pregnant human subject  

E-print Network

Keywords: see-through head-mounted display, ultrasound echography, 3D medical imaging. Abstract (fragment): We describe initial results which show "live" ultrasound echography data visualized within a pregnant human subject, using a see-through HMD. Even though we concentrate here on medical ultrasound imaging, applications of this display [...]

North Carolina at Chapel Hill, University of


Quantifying saltmarsh vegetation and its effect on wave height dissipation: Results from a UK East coast saltmarsh  

NASA Astrophysics Data System (ADS)

The degree to which incident wind waves are attenuated over intertidal surfaces is critical to the development of coastal wetlands, which are, amongst other processes, affected by the delivery, erosion, and/or resuspension of sediment due to wave action. Knowledge of wave attenuation over saltmarsh surfaces is also essential if accurate assessments of their natural sea-defence value are to be made and incorporated into sea defence and management schemes. The aim of this paper is to evaluate the use of a digital photographic method for the quantification of marsh vegetation density and then to investigate the relative roles played by hydrodynamic controls and vegetation density/type in causing the attenuation of incident waves over a macro-tidal saltmarsh. Results show that a significant statistical relationship exists between the density of vegetation measured in side-on photographs and the dry biomass of the photographed vegetation determined through direct harvesting. The potential of the digital photographic method for the spatial and temporal comparison of marsh surface vegetation biomass, density, and canopy structure is highlighted, and the method was applied to assess spatial and seasonal differences in vegetation density and their effect on wave attenuation at three locations on a macro-tidal saltmarsh on the Dengie Peninsula, Essex, UK. In this environmental setting, vegetation density/type did not have a significant direct effect on wave attenuation but modified the process of wave transformation under different hydrodynamic conditions. At the two locations characterised by a relatively tall canopy (15-26 cm) with biomass values of 430-500 g m-2, dominated by Spartina spp. (>70% of total dry biomass), relative incident wave height (wave height/water depth) is identified as a statistically significant dominant positive control on wave attenuation up to a threshold value of 0.55, beyond which wave attenuation showed no significant further increase. 
At the third location, characterised by only slightly less biomass (398 g m-2) but a shorter (6 cm) canopy of the annual Salicornia spp., no significant relationship existed between wave attenuation and relative wave height. Seasonally (between September and December), a significant increase and decrease in vegetation density occurred in one of the Spartina canopies and in the Salicornia canopy, respectively, leading to an expected (but not statistically significant) increase/decrease in wave attenuation. The wider implications of these findings in the context of form-process interactions on saltmarshes and their effect on marsh evolution are also discussed.
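The calibration step described above, regressing photo-derived vegetation density against harvested dry biomass, can be sketched with ordinary least squares. The data pairs below are hypothetical (constructed to be exactly linear for clarity), not the paper's measurements.

```python
# Toy sketch of the calibration regression: side-on photographic vegetation
# density index (x) against harvested dry biomass in g/m^2 (y).
def ols_fit(x, y):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

density = [0.2, 0.35, 0.5, 0.65, 0.8]          # photo-derived index (hypothetical)
biomass = [150.0, 255.0, 360.0, 465.0, 570.0]  # dry biomass, g/m^2 (hypothetical)
slope, intercept = ols_fit(density, biomass)
print(slope, intercept)  # ~700 and ~10 for this constructed data
```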

Möller, I.



Quantifying the response of terrestrial carbon fluxes to future climate change: Results from CMIP5 Earth System Models simulations  

NASA Astrophysics Data System (ADS)

Quantification of the terrestrial ecosystem feedback is crucial for better prediction of the future global climate-carbon cycle. However, previous studies using coupled climate-carbon models show large uncertainties in terrestrial carbon-climate feedbacks. While the largest uncertainty in feedbacks is induced by the models' responses to radiative and physiological CO2, recent studies demonstrate that the N cycle and land cover change could have significant impacts. Here, we explore the responses of carbon fluxes between atmosphere and land to various forcings using five experiments (i.e., historical, RCP45, RCP85, esmFdbk2, esmFixClim2) from 14 CMIP5 Earth System Models (ESMs). The simulated modern global net primary productivity (NPP) ranges from 46.1 Pg C yr-1 in CCSM4 to 90.8 Pg C yr-1 in MPI-ESM-LR, with an ensemble mean of 68.6 Pg C yr-1. Eleven of the fourteen ESMs substantially overestimate current global NPP as calculated from the MOD17A3 dataset (53.5 ± 1.7 Pg C yr-1). All models predicted an increase of global NPP in both RCP concentration scenarios. With N limitation, CCSM4 and NorESM1-M simulated the lowest current NPP and future relative increase of NPP in RCP concentration scenarios. A comparison of the coupled and uncoupled CO2-climate experiments indicates that while the increase of NPP north of 45°N is a combined effect of CO2 fertilization and global warming, the increase of global NPP in the RCP4.5 concentration scenario is dominated by CO2 fertilization. In general, the magnitude of NPP increases caused by CO2-induced warming and/or CO2 fertilization in each model is significantly correlated with its simulated modern NPP value. Similar to NPP, the increases of terrestrial heterotrophic respiration carbon fluxes (RH) and fire emission carbon fluxes (fFire) in RCP concentration scenarios are tightly correlated with their modern intensity among the CMIP5 ESMs. 
Consequently, the simulated response of net ecosystem productivity (NEP = NPP - RH - fFire) in RCP scenarios is correlated with its modern magnitude. Despite different magnitudes, all ESMs predict that the land acts as a sink of carbon in both RCP scenarios without land cover change. In contrast, when including land cover change, two and five ESMs suggest that the land will become a source of carbon with the RCP4.5 and RCP8.5 scenarios, respectively. Thus, future land cover change, which is very uncertain, is consistently predicted to contribute significantly to the terrestrial carbon-climate feedback. In sum, our results suggest that (1) an accurate representation of the modern terrestrial carbon fluxes is critical to constrain future carbon-climate feedbacks and (2) in addition to CO2-fertilization and CO2-induced warming, the N cycle and land cover change are important for predicting future carbon-climate interactions.
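The flux bookkeeping in the abstract (NEP = NPP - RH - fFire) can be made concrete with a trivial sketch. Only the ensemble-mean NPP (68.6 Pg C yr-1) comes from the text; the RH and fire values below are hypothetical placeholders, not CMIP5 output.

```python
# Sign convention follows the abstract: NEP = NPP - RH - fFire,
# so positive NEP means the land gains carbon (acts as a sink).
def net_ecosystem_productivity(npp, rh, f_fire):
    """All fluxes in Pg C per year."""
    return npp - rh - f_fire

# Ensemble-mean modern NPP from the abstract; RH and fire fluxes are
# illustrative placeholders.
nep = net_ecosystem_productivity(npp=68.6, rh=60.0, f_fire=2.0)
print(f"NEP = {nep:.1f} Pg C/yr -> {'sink' if nep > 0 else 'source'}")
```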

Zhou, J.; Riley, W. J.



Quantifying the effect of crops surface albedo variability on GHG budgets in a life cycle assessment approach : methodology and results.  

NASA Astrophysics Data System (ADS)

We tested a new method to estimate the radiative forcing of several crops at the annual and rotation scales, using local measurement data from two ICOS experimental sites. We used jointly 1) the radiative forcing caused by greenhouse gas (GHG) net emissions, calculated by using a Life Cycle Analysis (LCA) approach and in situ measurements (Ceschia et al. 2010), and 2) the radiative forcing caused by rapid changes in surface albedo typical of those ecosystems and resulting from management and crop phenology. The carbon and GHG budgets (GHGB) of 2 crop sites with contrasted management located in South West France (Auradé and Lamasquère sites) were estimated over a complete rotation by combining a classical LCA approach with on-site flux measurements. At both sites, carbon inputs (organic fertilisation and seeds), carbon exports (harvest) and net ecosystem production (NEP), measured with the eddy covariance technique, were calculated. The variability of the different terms and their relative contributions to the net ecosystem carbon budget (NECB) were analysed for all site-years, and the effect of management on NECB was assessed. To account for GHG fluxes that were not directly measured on site, we estimated the emissions caused by field operations (EFO) for each site using emission factors from the literature. The EFO were added to the NECB to calculate the total GHGB for a range of cropping systems and management regimes. N2O emissions were calculated following the IPCC (2007) guidelines, and CH4 emissions were assumed to be negligible compared to other contributions to the net GHGB. Additionally, albedo was calculated continuously using the short-wave incident and reflected radiation measurements in the field (0.3-3 µm) from CNR1 sensors. Mean annual differences in albedo and the deduced radiative forcing relative to a reference value were then compared for all site-years. 
Mean annual differences in radiative forcing were then converted into g C equivalent m-2 in order to add this effect to the GHG budget (Muñoz et al. 2010). Increasing the length of the vegetative period is considered one of the main levers for improving the NECB of crop ecosystems. Therefore, we also tested the effect of adding intermediate crops or maintaining voluntary crop re-growth on both the NECB and the radiative forcing caused by the changes in mean annual surface albedo. We showed that the NEP was improved and, as a consequence, so were the NECB and GHGB. Intermediate crops also increased the mean annual surface albedo and therefore caused a negative radiative forcing (cooling effect) expressed in g C equivalent m-2 (sink). The use of an intermediate crop could in some cases switch the crop from a positive NEP (source) to a negative one (sink), and the change in radiative forcing (up to -110 g C-eq m-2 yr-1) could overwhelm the NEP term.

Ferlicoq, Morgan; Ceschia, Eric; Brut, Aurore; Tallec, Tiphaine



Genomic and Enzymatic Results Show Bacillus cellulosilyticus Uses a Novel Set of LPXTA Carbohydrases to Hydrolyze Polysaccharides  

PubMed Central

Background Alkaliphilic Bacillus species are intrinsically interesting due to the bioenergetic problems posed by growth at high pH and high salt. Three alkaline cellulases have been cloned, sequenced and expressed from Bacillus cellulosilyticus N-4 (Bcell) making it an excellent target for genomic sequencing and mining of biomass-degrading enzymes. Methodology/Principal Findings The genome of Bcell is a single chromosome of 4.7 Mb with no plasmids present and three large phage insertions. The most unusual feature of the genome is the presence of 23 LPXTA membrane anchor proteins; 17 of these are annotated as involved in polysaccharide degradation. These two values are significantly higher than seen in any other Bacillus species. This high number of membrane anchor proteins is seen only in pathogenic Gram-positive organisms such as Listeria monocytogenes or Staphylococcus aureus. Bcell also possesses four sortase D subfamily 4 enzymes that incorporate LPXTA-bearing proteins into the cell wall; three of these are closely related to each other and unique to Bcell. Cell fractionation and enzymatic assay of Bcell cultures show that the majority of polysaccharide degradation is associated with the cell wall LPXTA-enzymes, an unusual feature in Gram-positive aerobes. Genomic analysis and growth studies both strongly argue against Bcell being a truly cellulolytic organism, in spite of its name. Preliminary results suggest that fungal mycelia may be the natural substrate for this organism. Conclusions/Significance Bacillus cellulosilyticus N-4, in spite of its name, does not possess any of the genes necessary for crystalline cellulose degradation, demonstrating the risk of classifying microorganisms without the benefit of genomic analysis. Bcell is the first Gram-positive aerobic organism shown to use predominantly cell-bound, non-cellulosomal enzymes for polysaccharide degradation. 
The LPXTA-sortase system utilized by Bcell may have applications both in anchoring cellulases and other biomass-degrading enzymes to Bcell itself and in anchoring proteins to other Gram-positive organisms. PMID:23593409

Mead, David; Drinkwater, Colleen; Brumm, Phillip J.



Acute Myocardial Infarction and Pulmonary Diseases Result in Two Different Degradation Profiles of Elastin as Quantified by Two Novel ELISAs  

PubMed Central

Background Elastin is a signature protein of the arteries and lungs, thus it was hypothesized that elastin is subject to enzymatic degradation during cardiovascular and pulmonary diseases. The aim was to investigate if different fragments of the same protein entail different information associated with two different diseases and if these fragments have the potential of being diagnostic biomarkers. Methods Monoclonal antibodies were raised against an identified fragment (the ELM-2 neoepitope) generated at amino acid position 552 in elastin by matrix metalloproteinase (MMP)-9/-12. A newly identified ELM neoepitope was generated by the same proteases but at amino acid position 441. The distributions of ELM-2 and ELM in human arterial plaques and fibrotic lung tissues were investigated by immunohistochemistry. A competitive ELISA for ELM-2 was developed. The clinical relevance of the ELM and ELM-2 ELISAs was evaluated in patients with acute myocardial infarction (AMI), no AMI, high coronary calcium, or low coronary calcium. The serological release of ELM-2 in patients with chronic obstructive pulmonary disease (COPD) or idiopathic pulmonary fibrosis (IPF) was compared to controls. Results ELM and ELM-2 neoepitopes were both localized in diseased carotid arteries and fibrotic lungs. In the cardiovascular cohort, ELM-2 levels were 66% higher in serum from AMI patients compared to patients with no AMI (p<0.01). Levels of ELM were not significantly increased in these patients and no correlation was observed between ELM-2 and ELM. ELM-2 was not elevated in the COPD and IPF patients and was not correlated to ELM. ELM was shown to be correlated with smoking habits (p<0.01). Conclusions The ELM-2 neoepitope was related to AMI whereas the ELM neoepitope was related to pulmonary diseases. 
These results indicate that elastin neoepitopes generated by the same proteases but at different amino acid sites provide different tissue-related information depending on the disease in question. PMID:23805173

Skjøt-Arkil, Helene; Clausen, Rikke E.; Rasmussen, Lars M.; Wang, Wanchun; Wang, Yaguo; Zheng, Qinlong; Mickley, Hans; Saaby, Lotte; Diederichsen, Axel C. P.; Lambrechtsen, Jess; Martinez, Fernando J.; Hogaboam, Cory M.; Han, MeiLan; Larsen, Martin R.; Nawrocki, Arkadiusz; Vainer, Ben; Krustrup, Dorrit; Bjørling-Poulsen, Marina; Karsdal, Morten A.; Leeming, Diana J.



Development and application of methods to quantify spatial and temporal hyperpolarized 3He MRI ventilation dynamics: preliminary results in chronic obstructive pulmonary disease  

NASA Astrophysics Data System (ADS)

Hyperpolarized helium-3 (3He) magnetic resonance imaging (MRI) has emerged as a non-invasive research method for quantifying lung structural and functional changes, enabling direct visualization in vivo at high spatial and temporal resolution. Here we describe the development of methods for quantifying ventilation dynamics in response to salbutamol in Chronic Obstructive Pulmonary Disease (COPD). A whole-body 3.0 Tesla Excite 12.0 MRI system was used to obtain multi-slice coronal images acquired immediately after subjects inhaled hyperpolarized 3He gas. Ventilated volume (VV), ventilation defect volume (VDV) and thoracic cavity volume (TCV) were recorded following segmentation of 3He and 1H images respectively, and used to calculate percent ventilated volume (PVV) and ventilation defect percent (VDP). Manual segmentation and Otsu thresholding were significantly correlated for VV (r=.82, p=.001), VDV (r=.87, p=.0002), PVV (r=.85, p=.0005), and VDP (r=.85, p=.0005). The level of agreement between these segmentation methods was also evaluated using Bland-Altman analysis, which showed that manual segmentation was consistently higher for VV (Mean=.22 L, SD=.05) and consistently lower for VDV (Mean=-.13, SD=.05) measurements than Otsu thresholding. To automate the quantification of newly ventilated pixels (NVp) post-bronchodilator, we used translation, rotation, and scaling transformations to register pre- and post-salbutamol images. There was a significant correlation between NVp and VDV (r=-.94, p=.005) and between percent newly ventilated pixels (PNVp) and VDP (r=-.89, p=.02), but not for VV or PVV. Evaluation of 3He MRI ventilation dynamics using Otsu thresholding and landmark-based image registration provides a way to regionally quantify functional changes in COPD subjects after treatment with beta-agonist bronchodilators, a common COPD and asthma therapy.
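The volume-derived metrics named above can be sketched in a few lines. The definitions below (PVV and VDP as percentages of thoracic cavity volume) are the usual ones in the hyperpolarized-gas MRI literature, and the volumes are hypothetical, not the study's data.

```python
# Hedged sketch of the ventilation metrics: PVV = 100*VV/TCV, VDP = 100*VDV/TCV.
def ventilation_metrics(vv, vdv, tcv):
    """vv, vdv, tcv in litres; returns (PVV, VDP) in percent."""
    pvv = 100.0 * vv / tcv
    vdp = 100.0 * vdv / tcv
    return pvv, vdp

# Hypothetical volumes for illustration.
pvv, vdp = ventilation_metrics(vv=4.5, vdv=0.5, tcv=5.0)
print(pvv, vdp)  # 90.0 10.0
```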

Kirby, Miranda; Wheatley, Andrew; McCormack, David G.; Parraga, Grace



Quantifying chain reptation in entangled polymer melts: Topological and dynamical mapping of atomistic simulation results onto the tube model  

NASA Astrophysics Data System (ADS)

The topological state of entangled polymers has been analyzed recently in terms of primitive paths which allowed obtaining reliable predictions of the static (statistical) properties of the underlying entanglement network for a number of polymer melts. Through a systematic methodology that first maps atomistic molecular dynamics (MD) trajectories onto time trajectories of primitive chains and then documents primitive chain motion in terms of a curvilinear diffusion in a tubelike region around the coarse-grained chain contour, we are extending these static approaches here even further by computing the most fundamental function of the reptation theory, namely, the probability ψ(s,t) that a segment s of the primitive chain remains inside the initial tube after time t, accounting directly for contour length fluctuations and constraint release. The effective diameter of the tube is independently evaluated by observing tube constraints either on atomistic displacements or on the displacement of primitive chain segments orthogonal to the initial primitive path. Having computed the tube diameter, the tube itself around each primitive path is constructed by visiting each entanglement strand along the primitive path one after the other and approximating it by the space of a small cylinder having the same axis as the entanglement strand itself and a diameter equal to the estimated effective tube diameter. Reptation of the primitive chain longitudinally inside the effective constraining tube as well as local transverse fluctuations of the chain driven mainly from constraint release and regeneration mechanisms are evident in the simulation results; the latter causes parts of the chains to venture outside their average tube surface for certain periods of time. The computed ψ(s,t) curves account directly for both of these phenomena, as well as for contour length fluctuations, since all of them are automatically captured in the atomistic simulations. 
Linear viscoelastic properties such as the zero shear rate viscosity and the spectra of storage and loss moduli obtained on the basis of the obtained ψ(s,t) curves for three different polymer melts (polyethylene, cis-1,4-polybutadiene, and trans-1,4-polybutadiene) are consistent with experimental rheological data and in qualitative agreement with the double reptation and dual constraint models. The new methodology is general and can be routinely applied to analyze primitive path dynamics and chain reptation in atomistic trajectories (accumulated through long MD simulations) of other model polymers or polymeric systems (e.g., bidisperse, branched, grafted, etc.); it is thus believed to be particularly useful in the future in evaluating proposed tube models and developing more accurate theories for entangled systems.
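For context, pure reptation theory (Doi-Edwards) gives a closed-form benchmark for the tube survival probability, against which simulation-derived ψ(s,t) curves are commonly compared. The formulas below are the textbook results, not taken from this paper:

```latex
% Doi-Edwards tube survival probability for pure reptation:
% s in [0, L] is the contour coordinate, tau_d the disengagement time.
\psi(s,t) \;=\; \sum_{p\ \mathrm{odd}} \frac{4}{p\pi}\,
  \sin\!\left(\frac{p\pi s}{L}\right) e^{-p^{2} t/\tau_d},
\qquad
\mu(t) \;=\; \frac{1}{L}\int_{0}^{L} \psi(s,t)\,ds
       \;=\; \sum_{p\ \mathrm{odd}} \frac{8}{p^{2}\pi^{2}}\, e^{-p^{2} t/\tau_d}.
```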

Stephanou, Pavlos S.; Baig, Chunggi; Tsolou, Georgia; Mavrantzas, Vlasis G.; Kröger, Martin



QUANTIFYING FOREST ABOVEGROUND CARBON POOLS AND FLUXES USING MULTI-TEMPORAL LIDAR A report on field monitoring, remote sensing MMV, GIS integration, and modeling results for forestry field validation test to quantify aboveground tree biomass and carbon  

SciTech Connect

Sound policy recommendations relating to the role of forest management in mitigating atmospheric carbon dioxide (CO2) depend upon establishing accurate methodologies for quantifying forest carbon pools for large tracts of land that can be dynamically updated over time. Light Detection and Ranging (LiDAR) remote sensing is a promising technology for achieving accurate estimates of aboveground biomass and thereby carbon pools; however, not much is known about the accuracy of estimating biomass change and carbon flux from repeat LiDAR acquisitions containing different data sampling characteristics. In this study, discrete return airborne LiDAR data were collected in 2003 and 2009 across approximately 20,000 hectares (ha) of an actively managed, mixed conifer forest landscape in northern Idaho, USA. Forest inventory plots, located via a stratified random sampling design, were established and sampled in 2003 and 2009. The Random Forest machine learning algorithm was used to establish statistical relationships between inventory data and forest structural metrics derived from the LiDAR acquisitions. Aboveground biomass maps were created for the study area based on statistical relationships developed at the plot level. Over this 6-year period, we found that the mean increase in biomass due to forest growth across the non-harvested portions of the study area was 4.8 metric tons/hectare (Mg/ha). In these non-harvested areas, we found a significant difference in biomass increase among forest successional stages, with a higher biomass increase in mature and old forest compared to stand initiation and young forest. Approximately 20% of the landscape had been disturbed by harvest activities during the six-year time period, representing a biomass loss of >70 Mg/ha in these areas. During the study period, these harvest activities outweighed growth at the landscape scale, resulting in an overall loss in aboveground carbon at this site. 
The 30-fold increase in sampling density between the 2003 and 2009 acquisitions did not affect the biomass estimates. Overall, LiDAR data coupled with field reference data offer a powerful method for calculating pools and changes in aboveground carbon in forested systems. The results of our study suggest that multitemporal LiDAR-based approaches are likely to be useful for high-quality estimates of aboveground carbon change in conifer forest systems.
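The plot-level workflow described above (regress field-measured biomass on LiDAR structural metrics with a Random Forest, then predict across the landscape) can be sketched as follows. This is an illustration, not the authors' code: the metric names and the synthetic data are hypothetical.

```python
# Hedged sketch: Random Forest regression of aboveground biomass (Mg/ha)
# on hypothetical per-plot LiDAR metrics (mean height, 95th-percentile
# height, canopy cover fraction).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_plots = 60
# Synthetic plot metrics: columns = mean height, p95 height, cover fraction.
X = rng.uniform([2.0, 5.0, 0.1], [30.0, 45.0, 0.95], size=(n_plots, 3))
# Synthetic biomass loosely tied to the metrics, plus measurement noise.
y = 8.0 * X[:, 0] + 3.0 * X[:, 1] + 50.0 * X[:, 2] + rng.normal(0, 10, n_plots)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print("training-plot R^2:", round(model.score(X, y), 2))
```

In the study itself the fitted model would then be applied wall-to-wall to gridded LiDAR metrics from each acquisition year, and the two biomass maps differenced to estimate carbon change.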

Lee Spangler; Lee A. Vierling; Eva K. Stand; Andrew T. Hudak; Jan U.H. Eitel; Sebastian Martinuzzi



Quantifying Quantumness  

NASA Astrophysics Data System (ADS)

We introduce and study a measure of "quantumness" of a quantum state based on its Hilbert-Schmidt distance from the set of classical states. "Classical states" were defined earlier as states for which a positive P-function exists, i.e. they are mixtures of coherent states [1]. We study invariance properties of the measure, upper bounds, and its relation to entanglement measures. We evaluate the quantumness of a number of physically interesting states and show that for any physical system in thermal equilibrium there is a finite critical temperature above which quantumness vanishes. We then use the measure for identifying the "most quantum" states. Such states are expected to be potentially most useful for quantum information theoretical applications. We find these states explicitly for low-dimensional spin systems, and show that they possess beautiful, highly symmetric Majorana representations. [1] Classicality of spin states, Olivier Giraud, Petr Braun, and Daniel Braun, Phys. Rev. A 78, 042112 (2008)
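The distance computation underlying the measure is easy to illustrate numerically. The snippet below only shows the Hilbert-Schmidt distance itself; as a stand-in for the classical set (whose exact characterization is the paper's subject) it uses the maximally mixed state, which is a mixture of coherent states and hence classical in the sense above.

```python
# Minimal sketch: Hilbert-Schmidt distance between two density matrices,
# ||rho - sigma||_HS = sqrt(Tr[(rho - sigma)(rho - sigma)^dagger]).
import numpy as np

def hs_distance(rho, sigma):
    d = rho - sigma
    return float(np.sqrt(np.real(np.trace(d @ d.conj().T))))

# Pure qubit state |0><0| vs the maximally mixed state I/2 (a classical mixture).
rho = np.array([[1, 0], [0, 0]], dtype=complex)
mixed = np.eye(2) / 2
print(hs_distance(rho, mixed))  # sqrt(1/2) ~ 0.707
```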

Braun, Daniel; Giraud, Olivier; Braun, Peter A.



Reasoning with quantifiers Bart Geurts*  

E-print Network

Reasoning with quantifiers. Bart Geurts, Department of Philosophy, University of Nijmegen, P.O. Box [...]. [...] reasoning is syllogistic inference, which is just a restricted form of reasoning with quantifiers [...] to show how our understanding of syllogistic reasoning may benefit from semantical research on quantification.

van Lambalgen, Michiel


Special Thanks to: Professor Wagdi Habashi, Dr. Isik Ozcer, the NTI staff and CFD lab staff The results show that this optimization procedure produces faster convergence  

E-print Network

The results show that this optimization procedure produces faster convergence rates while maintaining [...]. Reference-list fragments: "int., 2011"; Flight Safety Foundation, "Aviation Safety Network," Accident Description, available from Internet, accessed 28 July 2011.

Barthelat, Francois


Quantifiable Lateral Flow Assay Test Strips  

NASA Technical Reports Server (NTRS)

As easy to read as a home pregnancy test, three Quantifiable Lateral Flow Assay (QLFA) strips used to test water for E. coli show different results. The brightly glowing control line on the far right of each strip indicates that all three tests ran successfully. But the glowing test line on the middle left and bottom strips reveal their samples were contaminated with E. coli bacteria at two different concentrations. The color intensity correlates with concentration of contamination.



Quantifying clinical relevance.  


Communicating clinical trial results should include measures of effect size in order to quantify the clinical relevance of the results. P-values are not informative regarding the size of treatment effects. Cohen's d and its variants are often used but are not easy to understand in terms of applicability to day-to-day clinical practice. Number needed to treat and number needed to harm can provide additional information about effect size that clinicians may find useful in clinical decision making, and although number needed to treat and number needed to harm are limited to dichotomous outcomes, it is recommended that they be considered for inclusion when describing clinical trial results. Examples are given using the fictional antipsychotic medications miracledone and fantastapine for the treatment of acute schizophrenia. PMID:25152844
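The two measures recommended above are simple reciprocals of risk differences. A worked sketch, with hypothetical event rates (the trials named in the abstract are fictional, so no real data are implied):

```python
# Number needed to treat = 1 / absolute risk reduction;
# number needed to harm  = 1 / absolute risk increase.
def nnt(control_event_rate, treatment_event_rate):
    return 1.0 / (control_event_rate - treatment_event_rate)

def nnh(treatment_ae_rate, control_ae_rate):
    return 1.0 / (treatment_ae_rate - control_ae_rate)

# Hypothetical: 40% of controls relapse vs 25% on treatment -> NNT ~ 6.7,
# i.e. treat about 7 patients to prevent one relapse.
print(round(nnt(0.40, 0.25), 1))
# Hypothetical: 12% adverse-event rate on treatment vs 4% on control -> NNH 12.5.
print(round(nnh(0.12, 0.04), 1))
```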

Citrome, Leslie



Working memory mechanism in proportional quantifier verification.  


The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g., "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as "There are seven blue and eight yellow dots". The second study reveals that both types of sentences are correlated with memory storage; however, only proportional sentences are associated with cognitive control. This result suggests that the cognitive mechanism underlying the verification of proportional quantifiers is crucially related to the integration process, in which an individual has to compare in memory the cardinalities of two sets. In the third study we find that the numerical distance between the two cardinalities that must be compared significantly influences verification time and accuracy. The results of our studies are discussed in the broader context of processing complex sentences. PMID:24374596

Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria



Toward quantifying infrared clutter  

NASA Astrophysics Data System (ADS)

Target detection in clutter depends sensitively on the spatial structure of the latter. In particular, it is the ratio of the target size to the clutter inhomogeneity scale which is of crucial importance. Indeed, looking for the leopard in the background of leopard skin is a difficult task. Hence quantifying thermal clutter is essential to the development of successful detection algorithms and signature analysis. This paper describes an attempt at clutter characterization along with several applications using calibrated thermal imagery collected by the Keweenaw Research Center. The key idea is to combine spatial and intensity statistics of the clutter into one number in order to characterize intensity variations over the length scale imposed by the target. Furthermore, when properly normalized, this parameter appears independent of temporal meteorological variation, thereby constituting a background scene invariant. This measure has a basis in analysis of variance and is related to digital signal processing fundamentals. Statistical analysis of thermal images is presented with promising results.
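The abstract describes the metric only in outline: local intensity statistics gathered over windows at the target's length scale, combined into one normalized number. A hedged sketch in that spirit (the RMS of target-sized local standard deviations, normalized by the scene mean; this is an illustration, not the paper's exact definition):

```python
# Toy clutter metric: tile the image into target-sized windows, take the
# population standard deviation in each, combine as an RMS, and normalize
# by the scene mean so the number is invariant to overall intensity scale.
import statistics

def clutter_metric(image, target_size):
    """image: 2-D list of intensities; target_size: window edge in pixels."""
    h, w = len(image), len(image[0])
    local_sds = []
    for i in range(0, h - target_size + 1, target_size):
        for j in range(0, w - target_size + 1, target_size):
            window = [image[i + di][j + dj]
                      for di in range(target_size) for dj in range(target_size)]
            local_sds.append(statistics.pstdev(window))
    rms = (sum(s * s for s in local_sds) / len(local_sds)) ** 0.5
    mean_intensity = statistics.mean(v for row in image for v in row)
    return rms / mean_intensity

flat = [[10] * 4 for _ in range(4)]
print(clutter_metric(flat, 2))  # 0.0 for a uniform scene
```

A uniform scene scores zero; a scene whose intensity varies strongly within each target-sized window (the "leopard on leopard skin" case) scores high.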

Reynolds, William R.
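The abstract describes collapsing spatial and intensity statistics into one number tied to the target length scale, with a basis in analysis of variance. As an illustrative sketch (not the KRC measure itself), a common ANOVA-style clutter metric takes the RMS of per-block intensity variances over cells sized to the target; the function name `rms_clutter` and the block-size choice are assumptions for illustration.

```python
import statistics

def rms_clutter(image, block):
    """RMS clutter: root of the mean per-block intensity variance,
    computed over non-overlapping block x block cells of a 2-D array.
    The block size is meant to be tied to the target length scale."""
    rows, cols = len(image), len(image[0])
    variances = []
    for r in range(0, rows - block + 1, block):
        for c in range(0, cols - block + 1, block):
            cell = [image[i][j] for i in range(r, r + block)
                                for j in range(c, c + block)]
            variances.append(statistics.pvariance(cell))
    return statistics.mean(variances) ** 0.5

# A perfectly uniform scene has zero clutter at any scale.
flat = [[5.0] * 8 for _ in range(8)]
print(rms_clutter(flat, 4))  # -> 0.0
```

Normalizing this number by a global intensity statistic, as the abstract suggests, would make it comparable across meteorological conditions.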



Simple instruments used in monitoring ionospheric perturbations and some observational results showing the ionospheric responses to the perturbations mainly from the lower atmosphere  

NASA Astrophysics Data System (ADS)

Ionospheric disturbances such as SIDs and acoustic gravity waves on different scales are well-known and commonly discussed topics. Some simple ground equipment was designed and used to continuously monitor the effects of these disturbances, especially SWF and SFD. Besides SIDs, the records also clearly reflect acoustic gravity waves on different scales and Spread-F, and these data are an important supplement to traditional ionosonde records. They are of significance for understanding the physical essentials of ionospheric disturbances and for applications in SID warning. In this paper, the design of the instruments is given and results are discussed in detail. Some case studies are introduced as examples, which showed very clearly not only the immediate effects of solar flares but also ionospheric responses to large-scale gravity waves from the lower atmosphere, such as those from typhoons, great earthquakes and volcanic eruptions. In particular, the results showed that acoustic gravity waves play a significant role in seeding ionospheric Spread-F. These examples give evidence that lower-atmospheric activity strongly influences the ionosphere.

Xiao, Zuo; Hao, Yongqiang; Zhang, Donghe; Xiao, Sai-Guan; Huang, Weiquan


Preliminary results on the use of diagnostic ultrasonography as a management tool to quantify egg production potential in breeding ostrich (Struthio camelus australis) females.  


An ostrich breeding flock, joined as individual breeding pairs (n = 136 pairs), was used to investigate the possibility of diagnostic ultrasonography as a method to predict the reproductive performance of ostrich females during a breeding season. Follicular activity was easily detected and quantified by using diagnostic ultrasonography. One to 8 follicles were recorded in 25% of females scanned at the beginning of the 9-month breeding season. At the end of the breeding season, 1-3 follicles were observed in 28.7% of females. Females in which follicular activity was observed came into production earlier than those in which no follicles were observed, with the mean (+/- SE) number of days to the production of the 1st egg being 22.3 +/- 12.5 and 87.4 +/- 7.2 days, respectively. Females in which follicular activity was observed at the beginning of the breeding season produced on average 181% more eggs during the 1st month of the breeding season (P < 0.01) than females in which no follicular activity was observed (6.67 +/- 0.70 vs 2.37 +/- 0.41 eggs). Egg production over the first 2 months of breeding and over the entire breeding season was similarly affected (P < 0.01), with the mean number of eggs produced over the first 2 months of the breeding season being 14.7 +/- 1.5 for females with observed follicular activity and 7.4 +/- 0.9 eggs for females with no observed follicular activity. Females in which follicular activity was observed at the end of the breeding season produced on average 108% more eggs (P < 0.01) during the last month of the breeding season than females in which no follicular activity was observed (2.77 +/- 0.43 vs 1.33 +/- 0.27 eggs). There was a tendency (P = 0.06) for egg production over the last 2 months to be similarly affected (6.10 +/- 0.85 vs 4.19 +/- 0.54 eggs). No relationship with egg production over the entire breeding season was found for the end-of-the-breeding-season observations.
Diagnostic ultrasonography can thus be used as a management tool to identify reproductively healthy ostrich females and also females with a higher egg production potential over a period of 2 months after or prior to assessment. Future studies should focus on the development of the technique to predict reproductive performance over entire breeding seasons for selection purposes. PMID:12240768

Lambrechts, H; Cloete, S W P; Swart, D; Greyling, J P C



Prospects of an alternative treatment against Trypanosoma cruzi based on abietic acid derivatives show promising results in Balb/c mouse model.  


Chagas disease, caused by the protozoan parasite Trypanosoma cruzi, is an example of extended parasitaemia with unmet medical needs. Current treatments based on the old drugs benznidazole (Bz) and nifurtimox are expensive and do not meet the criteria of effectiveness and low toxicity expected of modern drugs. In this work, a group of abietic acid derivatives that are chemically stable and well characterised were introduced as candidates for the treatment of Chagas disease. In vitro and in vivo assays were performed in order to test the effectiveness of these compounds. Finally, those which showed the best activity underwent additional studies in order to elucidate the possible mechanism of action. In vitro results indicated that some compounds have low toxicity (i.e. >150 µM against Vero cells) combined with high efficacy (i.e. <20 µM) against some forms of T. cruzi. Further in vivo studies in mouse models confirmed the expected improvements in infected mice. In vivo tests in the acute phase gave parasitaemia inhibition values higher than those of Bz, and a remarkable decrease in the reactivation of parasitaemia was found in the chronic phase after immunosuppression of the mice treated with one of the compounds. The morphological alterations found in parasites treated with our derivatives confirmed extensive damage; energetic metabolism disturbances were also registered by (1)H NMR. The demonstrated in vivo activity and low toxicity, together with the use of affordable starting products and the lack of synthetic complexity, put these abietic acid derivatives in a remarkable position toward the development of an anti-Chagasic agent. PMID:25462275

Olmo, F; Guardia, J J; Marin, C; Messouri, I; Rosales, M J; Urbanová, K; Chayboun, I; Chahboun, R; Alvarez-Manzaneda, E J; Sánchez-Moreno, M



Molecular analysis of human glycophorin MiIX gene shows a silent segment transfer and untemplated mutation resulting from gene conversion via sequence repeats.  


The human glycophorin (HGp) loci that define the red blood cell surface antigens of the MNSs blood group system exhibit considerable allelic variation. Previous studies have identified gene conversion events involving HGpA(alpha) and HGpB(delta) that produced delta-alpha-delta hybrid genes which differ in the location of breakpoints. This report presents the molecular analysis of HGpMiIX, the first example of a reverse alpha-delta-alpha hybrid gene that specifies a newly described phenotype of the Miltenberger complex. A novel restriction fragment unique to the HGpMiIX gene was detected by Southern blot hybridization. The structure of the genomic region encoding the entire extracellular domain of the MiIX protein was determined. Nucleotide sequencing of amplified genomic DNA showed that a silent segment of the HGpB(delta) gene had been transposed to replace the internal part of exon III in the HGpA(alpha) gene, thereby resulting in the formation of the MiIX allele with an alpha-delta-alpha configuration. The proximal alpha-delta breakpoint was found to be flanked by a direct repeat of the acceptor splice site, whereas the distal delta-alpha breakpoint was localized to a palindromic region. This DNA rearrangement, with a minimal transfer of 16 templated nucleotides and a single mutation of an untemplated adenyl nucleotide, not only created two intraexon hybrid junctions but transactivated the expression of a new stretch of amino acid residues in the MiIX protein. Such a segment replacement may have occurred through the directional transfer from one duplex to the other via the mechanism of gene conversion. The occurrence of HGpMiIX as another hybrid derived from parts of parent genes underlines the role of the recombinational "hotspot" in the generation of allelic diversity in the glycophorin family. PMID:1421409

Huang, C H; Skov, F; Daniels, G; Tippett, P; Blumenfeld, O O



Quantifying proportional variability.  


Real quantities can undergo such a wide variety of dynamics that the mean is often a meaningless reference point for measuring variability. Despite their widespread application, techniques like the Coefficient of Variation are not truly proportional and exhibit pathological properties. The non-parametric measure Proportional Variability (PV) [1] resolves these issues and provides a robust way to summarize and compare variation in quantities exhibiting diverse dynamical behaviour. Instead of being based on deviation from an average value, variation is quantified simply by comparing the numbers to each other, requiring no assumptions about central tendency or underlying statistical distributions. While PV has been introduced before and has already been applied in various contexts to population dynamics, here we present a deeper analysis of this measure, derive analytical expressions for the PV of several general distributions, and present new comparisons with the Coefficient of Variation, demonstrating cases in which PV is the more favorable measure. We show that PV provides an easily interpretable approach for measuring and comparing variation that can be applied generally throughout the sciences, in contexts ranging from stock market stability to climate variation. PMID:24386334

Heath, Joel P; Borowski, Peter
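The pairwise definition described in the abstract ("comparing the numbers to each other" rather than to a mean) can be sketched directly; this assumes the standard formulation of PV as the mean over all pairs of 1 - min/max, for positive-valued series, with the Coefficient of Variation included for comparison.

```python
from itertools import combinations

def proportional_variability(z):
    """Proportional Variability (PV): mean over all pairs (z_i, z_j) of
    1 - min(z_i, z_j) / max(z_i, z_j).  Non-parametric; no reference to
    a mean or an assumed distribution.  Defined here for positive values."""
    pairs = list(combinations(z, 2))
    return sum(1 - min(a, b) / max(a, b) for a, b in pairs) / len(pairs)

def coefficient_of_variation(z):
    """Classic CV = SD / mean, for comparison."""
    m = sum(z) / len(z)
    sd = (sum((x - m) ** 2 for x in z) / len(z)) ** 0.5
    return sd / m

print(proportional_variability([5, 5, 5]))  # -> 0.0 (no variation)
print(proportional_variability([1, 2]))     # -> 0.5
```

Unlike CV, the result is bounded in [0, 1] and unchanged by rescaling the whole series, which is what makes it comparable across quantities with very different dynamics.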



Storytelling Slide Shows to Improve Diabetes and High Blood Pressure Knowledge and Self-Efficacy: Three-Year Results among Community Dwelling Older African Americans  

ERIC Educational Resources Information Center

This study combined the African American tradition of oral storytelling with the Hispanic medium of "Fotonovelas." A staggered pretest posttest control group design was used to evaluate four Storytelling Slide Shows on health that featured community members. A total of 212 participants were recruited for the intervention and 217 for the

Bertera, Elizabeth M.



Visualizing and quantifying the suppressive effects of glucocorticoids on the tadpole immune system in vivo  

NSDL National Science Digital Library

This article presents a laboratory module developed to show students how glucocorticoid receptor activity can be pharmacologically modulated in Xenopus laevis tadpoles and the resulting effects on thymus gland size visualized and quantified in vivo.

Alexander Schreiber (St. Lawrence University)



The Relevance of External Quality Assessment for Molecular Testing for ALK Positive Non-Small Cell Lung Cancer: Results from Two Pilot Rounds Show Room for Optimization  

PubMed Central

Background and Purpose: Molecular profiling should be performed on all advanced non-small cell lung cancer with non-squamous histology to allow treatment selection. Currently, this should include EGFR mutation testing and testing for ALK rearrangements. ROS1 is another emerging target. ALK rearrangement status is a critical biomarker to predict response to tyrosine kinase inhibitors such as crizotinib. To promote high quality testing in non-small cell lung cancer, the European Society of Pathology has introduced an external quality assessment scheme. This article summarizes the results of the first two pilot rounds organized in 2012-2013. Materials and Methods: Tissue microarray slides consisting of cell lines and resection specimens were distributed with the request for routine ALK testing using IHC or FISH. Participation in ALK FISH testing included the interpretation of four digital FISH images. Results: Data from 173 different laboratories were obtained. The results demonstrate decreased error rates in the second round for both ALK FISH and ALK IHC, although the error rates were still high and the need for external quality assessment in laboratories performing ALK testing is evident. Error rates obtained by FISH were lower than by IHC. The lowest error rates were observed for the interpretation of digital FISH images. Conclusion: There was a large variety in FISH enumeration practices. Based on the results from this study, recommendations for the methodology, analysis, interpretation and result reporting were issued. External quality assessment is a crucial element to improve the quality of molecular testing. PMID:25386659

Tembuyser, Lien; Tack, Véronique; Zwaenepoel, Karen; Pauwels, Patrick; Miller, Keith; Bubendorf, Lukas; Kerr, Keith; Schuuring, Ed; Thunnissen, Erik; Dequeker, Elisabeth M. C.



Map Showing Earthquake Shaking and Tsunami Hazard in Guadeloupe and Dominica, as a Result of an M8.0 Earthquake on the Lesser Antilles Megathrust  

USGS Multimedia Gallery

Earthquake shaking (on land) and tsunami (ocean) hazard in Guadeloupe and Dominica, as a result of an M8.0 earthquake on the Lesser Antilles megathrust adjacent to Guadeloupe. Colors on land represent scenario earthquake shaking intensities calculated in USGS ShakeMap software (Wald et al., 20...


Quantifying tumour heterogeneity with CT  

PubMed Central

Abstract Heterogeneity is a key feature of malignancy associated with adverse tumour biology. Quantifying heterogeneity could provide a useful non-invasive imaging biomarker. Heterogeneity on computed tomography (CT) can be quantified using texture analysis which extracts spatial information from CT images (unenhanced, contrast-enhanced and derived images such as CT perfusion) that may not be perceptible to the naked eye. The main components of texture analysis can be categorized into image transformation and quantification. Image transformation filters the conventional image into its basic components (spatial, frequency, etc.) to produce derived subimages. Texture quantification techniques include structural-, model- (fractal dimensions), statistical- and frequency-based methods. The underlying tumour biology that CT texture analysis may reflect includes (but is not limited to) tumour hypoxia and angiogenesis. Emerging studies show that CT texture analysis has the potential to be a useful adjunct in clinical oncologic imaging, providing important information about tumour characterization, prognosis and treatment prediction and response. PMID:23545171

Miles, Kenneth A.
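Of the texture quantification families the abstract lists, the statistical branch is the simplest to illustrate. The sketch below computes first-order (histogram-based) statistics of a region of interest: mean, standard deviation, entropy and uniformity. Function and parameter names are illustrative, and a real pipeline would apply the image-transformation (filtering) step first.

```python
import math
from collections import Counter

def texture_stats(pixels, bins=8, lo=0, hi=256):
    """First-order statistical texture measures of a grey-level region:
    mean, standard deviation, and entropy/uniformity of the histogram.
    High entropy / low uniformity indicates a more heterogeneous region."""
    width = (hi - lo) / bins
    counts = Counter(min(int((p - lo) // width), bins - 1) for p in pixels)
    n = len(pixels)
    probs = [c / n for c in counts.values()]
    mean = sum(pixels) / n
    sd = (sum((p - mean) ** 2 for p in pixels) / n) ** 0.5
    entropy = -sum(p * math.log2(p) for p in probs)
    uniformity = sum(p * p for p in probs)
    return {"mean": mean, "sd": sd, "entropy": entropy, "uniformity": uniformity}

# A perfectly homogeneous region: zero entropy, uniformity 1.
print(texture_stats([100] * 64))
```

Model-based measures such as fractal dimension, also mentioned in the abstract, would replace the histogram step with a scale-dependent analysis.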



Quantifying Qualitative Learning.  

ERIC Educational Resources Information Center

A teacher at an alternative school for at-risk students discusses the development of student assessment that increases students' self-esteem, convinces students that learning is fun, and prepares students to return to traditional school settings. She found that allowing students to participate in the assessment process successfully quantified the

Bogus, Barbara



Value of Fused 18F-Choline-PET/MRI to Evaluate Prostate Cancer Relapse in Patients Showing Biochemical Recurrence after EBRT: Preliminary Results  

PubMed Central

Purpose. We compared the accuracy of 18F-Choline-PET/MRI with that of multiparametric MRI (mMRI), 18F-Choline-PET/CT, 18F-Fluoride-PET/CT, and contrast-enhanced CT (CeCT) in detecting relapse in patients with suspected relapse of prostate cancer (PC) after external beam radiotherapy (EBRT). We assessed the association between standard uptake value (SUV) and apparent diffusion coefficient (ADC). Methods. We evaluated 21 patients with biochemical relapse after EBRT. Patients underwent 18F-Choline-PET/contrast-enhanced (Ce)CT, 18F-Fluoride-PET/CT, and mMRI. Imaging coregistration of PET and mMRI was performed. Results. 18F-Choline-PET/MRI was positive in 18/21 patients, with a detection rate (DR) of 86%. DRs of 18F-Choline-PET/CT, CeCT, and mMRI were 76%, 43%, and 81%, respectively. In terms of DR the only significant difference was between 18F-Choline-PET/MRI and CeCT. On lesion-based analysis, the accuracy of 18F-Choline-PET/MRI, 18F-Choline-PET/CT, CeCT, and mMRI was 99%, 95%, 70%, and 85%, respectively. Accuracy, sensitivity, and NPV of 18F-Choline-PET/MRI were significantly higher than those of both mMRI and CeCT. On whole-body assessment of bone metastases, the sensitivity of 18F-Choline-PET/CT and 18F-Fluoride-PET/CT was significantly higher than that of CeCT. Regarding local and lymph node relapse, we found a significant inverse correlation between ADC and SUV-max. Conclusion. 18F-Choline-PET/MRI is a promising technique in detecting PC relapse. PMID:24877053

Piccardo, Arnoldo; Paparo, Francesco; Picazzo, Riccardo; Naseri, Mehrdad; Ricci, Paolo; Marziano, Andrea; Bacigalupo, Lorenzo; Biscaldi, Ennio; Rollandi, Gian Andrea; Grillo-Ruggieri, Filippo; Farsad, Mohsen



Meditations on Quantified Constraint Satisfaction  

E-print Network

The quantified constraint satisfaction problem (QCSP) is the problem of deciding, given a structure and a first-order prenex sentence whose quantifier-free part is the conjunction of atoms, whether or not the sentence holds on the structure. One obtains a family of problems by defining, for each structure B, the problem QCSP(B) to be the QCSP where the structure is fixed to be B. In this article, we offer a viewpoint on the research program of understanding the complexity of the problems QCSP(B) on finite structures. In particular, we propose and discuss a group of conjectures; throughout, we attempt to place the conjectures in relation to existing results and to emphasize open issues and potential research directions.

Chen, Hubie
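The QCSP definition in the abstract (a prenex sentence whose quantifier-free part is a conjunction of atoms, evaluated on a fixed finite structure) maps directly to a small brute-force evaluator. This is a definition-checker only, exponential in the number of variables; the data layout (relations as sets of tuples, a quantifier prefix as a list of pairs) is an assumption for illustration.

```python
def qcsp(domain, relations, prefix, atoms):
    """Brute-force QCSP evaluator.  `prefix` is a list of
    ('forall'|'exists', var) pairs; `atoms` is a list of
    (relation_name, variable_tuple) whose conjunction forms the
    quantifier-free part of the prenex sentence."""
    def holds(i, assignment):
        if i == len(prefix):
            return all(tuple(assignment[v] for v in vs) in relations[r]
                       for r, vs in atoms)
        q, var = prefix[i]
        results = (holds(i + 1, {**assignment, var: d}) for d in domain)
        return all(results) if q == 'forall' else any(results)
    return holds(0, {})

# Structure B: domain {0,1} with R interpreted as "not equal".
R = {(0, 1), (1, 0)}
print(qcsp({0, 1}, {'R': R}, [('forall', 'x'), ('exists', 'y')],
           [('R', ('x', 'y'))]))  # -> True: every x has an unequal y
print(qcsp({0, 1}, {'R': R}, [('exists', 'x'), ('forall', 'y')],
           [('R', ('x', 'y'))]))  # -> False: no x differs from all y
```

Fixing the structure B and varying only the sentence, as here, is exactly the parameterization QCSP(B) that the article's complexity conjectures concern.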



Human and artificial intelligence acquisition of quantifiers  

E-print Network

This paper is concerned with constraints on learning quantifiers, particularly cognitive constraints on human learning and algorithmic constraints on machine learning, and the resulting implications of those constraints on language ...

Zhou, Samson S



Quantifying air pollution removal by green roofs in Chicago  

E-print Network

The level of air pollution removal by green roofs in Chicago was quantified using a dry deposition model. The results showed that a total of 1675 kg of air pollutants was removed by 19.8 ha of green roofs in one year

Yu, Qian
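The arithmetic behind a dry-deposition estimate like the one above is a flux calculation: deposition flux = deposition velocity x ambient concentration, scaled by area and time. The sketch below uses that generic big-leaf form; the deposition velocity, concentration and function name are hypothetical illustration values, not the paper's model parameters.

```python
def annual_removal_kg(area_m2, conc_ug_m3, vd_cm_s, hours_per_year=8760):
    """Generic dry-deposition estimate: flux = Vd * C (ug m^-2 s^-1),
    integrated over the roof area and one year, converted ug -> kg."""
    vd_m_s = vd_cm_s / 100.0
    flux_ug_m2_s = vd_m_s * conc_ug_m3
    seconds = hours_per_year * 3600.0
    return flux_ug_m2_s * area_m2 * seconds / 1e9

# Hypothetical example: 19.8 ha of roof, a pollutant at 30 ug/m3,
# Vd = 0.3 cm/s (illustrative numbers only).
print(round(annual_removal_kg(19.8e4, 30.0, 0.3), 1))  # -> 562.0
```

A full model, as in the paper, would vary Vd by pollutant, season and leaf-area index rather than using a single constant.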


Referential and nonreferential substitutional quantifiers  

Microsoft Academic Search

It is common to find philosophers claiming that it is possible to free the quantifiers - especially the particular (or so-called existential) quantifier - from questions of reference, existence, and ontology, by having recourse to what is now referred to as the substitutional interpretation of the quantifiers. Although there may be ontologically neutral uses of the substitutional interpretation, it is

Alex Orenstein



On quantifying insect movements  

SciTech Connect

We elaborate on methods described by Turchin and by Odendaal and Rausher for quantifying insect movement pathways. We note the need to scale measurement resolution to the study insects and the questions being asked, and we discuss the use of surveying instrumentation for recording sequential positions of individuals on pathways. We itemize several measures that may be used to characterize movement pathways and illustrate these by comparisons among several Eleodes beetles occurring in shortgrass steppe. The fractal dimension of pathways may provide insights not available from absolute measures of pathway configuration. Finally, we describe a renormalization procedure that may be used to remove sequential interdependence among locations of moving individuals while preserving the basic attributes of the pathway.

Wiens, J.A.; Crist, T.O. (Colorado State Univ., Fort Collins (United States)); Milne, B.T. (Univ. of New Mexico, Albuquerque (United States))
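The fractal dimension of a pathway, mentioned in the abstract, is commonly estimated with the divider (ruler) method: count the steps N(delta) needed to traverse the path with dividers of span delta, then take D = -slope of log N versus log delta. This is a generic sketch of that method, not the authors' exact procedure; function names are illustrative.

```python
import math

def divider_steps(path, delta):
    """Count divider steps of span `delta` along a path of (x, y) points:
    advance the anchor to the first recorded point at straight-line
    distance >= delta, and repeat."""
    steps, anchor = 0, path[0]
    for p in path[1:]:
        if math.dist(anchor, p) >= delta:
            steps += 1
            anchor = p
    return steps

def fractal_dimension(path, deltas):
    """Since N(delta) ~ delta**(-D), D is minus the least-squares slope
    of log N(delta) against log delta."""
    xs = [math.log(d) for d in deltas]
    ys = [math.log(divider_steps(path, d)) for d in deltas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# A straight path has the minimum dimension, D = 1.
line = [(float(i), 0.0) for i in range(1001)]
print(round(fractal_dimension(line, [1.0, 2.0, 4.0, 8.0]), 2))  # -> 1.0
```

Tortuous pathways give D between 1 and 2, which is the scale-spanning information absolute measures of pathway configuration miss.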



Quantifying decoherence in continuous variable systems  

E-print Network

We present a detailed report on the decoherence of quantum states of continuous variable systems under the action of a quantum optical master equation resulting from the interaction with general Gaussian uncorrelated environments. The rate of decoherence is quantified by relating it to the decay rates of various, complementary measures of the quantum nature of a state, such as the purity, some nonclassicality indicators in phase space and, for two-mode states, entanglement measures and total correlations between the modes. Different sets of physically relevant initial configurations are considered, including one- and two-mode Gaussian states, number states, and coherent superpositions. Our analysis shows that, generally, the use of initially squeezed configurations does not help to preserve the coherence of Gaussian states, whereas it can be effective in protecting coherent superpositions of both number states and Gaussian wave packets.

A. Serafini; M. G. A. Paris; F. Illuminati; S. De Siena



Quantifying loopy network architectures.  


Biology presents many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture containing closed loops at many different levels. Although a number of approaches have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework, the hierarchical loop decomposition, that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer generated graphs, such as artificial models and optimal distribution networks, as well as natural graphs extracted from digitized images of dicotyledonous leaves and vasculature of rat cerebral neocortex. We calculate various metrics based on the asymmetry, the cumulative size distribution and the Strahler bifurcation ratios of the corresponding trees and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information (exact location of edges and nodes) from the metric topology (connectivity and edge weight) and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs. PMID:22701593

Katifori, Eleni; Magnasco, Marcelo O
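One of the tree metrics the abstract names, the Strahler bifurcation order, is easy to sketch on the binary trees produced by the hierarchical loop decomposition. This is a generic Strahler implementation on a node-to-children mapping, offered as an illustration rather than the authors' code.

```python
def strahler(tree, node):
    """Strahler order on a tree given as {node: [children]}:
    leaves get 1; a node whose two highest-order children tie gets
    that order + 1, otherwise it inherits the maximum child order."""
    kids = tree.get(node, [])
    if not kids:
        return 1
    orders = sorted(strahler(tree, k) for k in kids)
    if len(orders) >= 2 and orders[-1] == orders[-2]:
        return orders[-1] + 1
    return orders[-1]

# A perfectly balanced binary tree with four leaves has root order 3;
# maximal asymmetry (a chain of single children) stays at order 1.
tree = {'root': ['a', 'b'], 'a': ['a1', 'a2'], 'b': ['b1', 'b2']}
print(strahler(tree, 'root'))  # -> 3
```

Comparing Strahler orders (and the bifurcation ratios derived from them) between trees is one way to compare the nesting architecture of two loopy networks independently of geometry, in the spirit of the decoupling the abstract describes.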



Quantifying T Lymphocyte Turnover  

PubMed Central

Peripheral T cell populations are maintained by production of naive T cells in the thymus, clonal expansion of activated cells, cellular self-renewal (or homeostatic proliferation), and density dependent cell life spans. A variety of experimental techniques have been employed to quantify the relative contributions of these processes. In modern studies lymphocytes are typically labeled with 5-bromo-2'-deoxyuridine (BrdU), deuterium, or the fluorescent dye carboxy-fluorescein diacetate succinimidyl ester (CFSE), their division history has been studied by monitoring telomere shortening and the dilution of T cell receptor excision circles (TRECs) or the dye CFSE, and clonal expansion has been documented by recording changes in the population densities of antigen specific cells. Proper interpretation of such data in terms of the underlying rates of T cell production, division, and death has proven to be notoriously difficult and involves mathematical modeling. We review the various models that have been developed for each of these techniques, discuss which models seem most appropriate for what type of data, reveal open problems that require better models, and pinpoint how the assumptions underlying a mathematical model may influence the interpretation of data. Elaborating various successful cases where modeling has delivered new insights in T cell population dynamics, this review provides quantitative estimates of several processes involved in the maintenance of naive and memory, CD4+ and CD8+ T cell pools in mice and men. PMID:23313150

De Boer, Rob J.; Perelson, Alan S.
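The kind of labeling model the review discusses can be illustrated with the simplest homogeneous case: a population of fixed size in which unlabeled cells are lost by division (both daughters become BrdU-labeled) and by death, so the labeled fraction obeys F(t) = 1 - exp(-(p + d) t). This is a minimal sketch under that assumption, with hypothetical rates, not a model fitted in the review.

```python
import math

def brdu_labeling(p, d, t_end, dt=1e-4):
    """Euler integration of the simplest homogeneous BrdU up-labeling
    model: dU/dt = -(p + d) U for the unlabeled fraction U, where p is
    the per-capita division rate and d the death rate (per day).
    Returns the labeled fraction F = 1 - U at time t_end."""
    U, steps = 1.0, int(t_end / dt)
    for _ in range(steps):
        U += -(p + d) * U * dt
    return 1.0 - U

p = d = 0.05   # hypothetical steady-state turnover of 5%/day
t = 7.0        # one week of label administration
print(round(brdu_labeling(p, d, t), 3))          # -> 0.503
print(round(1 - math.exp(-(p + d) * t), 3))      # -> 0.503 (analytic check)
```

The review's central caution applies even here: the same labeling curve is consistent with different (p, d) pairs, which is why model assumptions drive the interpretation of the data.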



What Do Blood Tests Show?  


Blood tests show whether the levels ... changes may work best. Result Ranges for Common Blood Tests: This section presents the result ranges for ...


Solar Light Show  

NSDL National Science Digital Library

Over the last few days, the Earth has been buffeted by a geomagnetic storm caused by a major solar flare. In addition to disruptions in radio, telecommunications, and electric service, the flare may also produce a dramatic light show as it peaks tonight. Weather permitting, the aurora borealis, or northern lights, may be visible as far south as Washington, D.C. The best viewing time will be local midnight. The sun is currently at the peak of its eleven-year solar cycle, spawning flares and "coronal mass ejections" (CME), violent outbursts of gas from the sun's corona that can carry up to 10 billion tons of electrified gas traveling at speeds as high as 2000 km/s. Geomagnetic storms result when solar winds compress the magnetosphere, sometimes interfering with electric power transmission and satellites, but also creating beautiful aurorae, as many stargazers hope will occur tonight.

de Nie, Michael Willem.


Working Memory Mechanism in Proportional Quantifier Verification  

ERIC Educational Resources Information Center

The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow

Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria



Public medical shows.  


In the second half of the 19th century, Jean-Martin Charcot (1825-1893) became famous for the quality of his teaching and his innovative neurological discoveries, bringing many French and foreign students to Paris. A hunger for recognition, together with progressive and anticlerical ideals, led Charcot to invite writers, journalists, and politicians to his lessons, during which he presented the results of his work on hysteria. These events became public performances, for which physicians and patients were transformed into actors. Major newspapers ran accounts of these consultations, which in some respects resembled theatrical shows. The resultant enthusiasm prompted other physicians in Paris and throughout France to try and imitate them. We will compare the form and substance of Charcot's lessons with those given by Jules-Bernard Luys (1828-1897), Victor Dumontpallier (1826-1899), Ambroise-Auguste Liébeault (1823-1904), Hippolyte Bernheim (1840-1919), Joseph Grasset (1849-1918), and Albert Pitres (1848-1928). We will also note their impact on contemporary cinema and theatre. PMID:25273491

Walusinski, Olivier



Quantifying Information Flow  

E-print Network

February 5, 2002. In many circumstances, some flow of information from High to Low will be inevitable and acceptable. We extend definitions of information flow so as to quantify the amount of information passed; in other words, we give a formal ...

Lowe, Gavin
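One elementary way to quantify information flow (a toy illustration, not Lowe's process-algebraic formalism) is to measure, for a deterministic program viewed as a channel from a High secret to a Low observation, how many bits one observation can reveal: log2 of the number of distinct Low outputs. All names here are hypothetical.

```python
import math

def leaked_bits(channel, high_inputs):
    """Bits of High information recoverable from one Low observation of
    a deterministic channel, assuming High is uniform: log2 of the
    number of distinct outputs the channel can produce."""
    observations = {channel(h) for h in high_inputs}
    return math.log2(len(observations))

# Hypothetical program leaking only the parity of a 2-bit secret:
# four secrets collapse to two observations, so exactly 1 bit flows.
parity = lambda h: (h & 1) ^ ((h >> 1) & 1)
print(leaked_bits(parity, range(4)))  # -> 1.0
```

A constant-output program leaks 0 bits and the identity leaks everything, which matches the intuition that quantitative definitions should interpolate between "no flow" and "total flow".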


Quantifying Spin Hall Effects in Nonmagnetic Metals  

NASA Astrophysics Data System (ADS)

Spin Hall effects intermix spin and charge currents even in nonmagnetic materials and, therefore, offer the possibility to generate and detect spin currents without the need for ferromagnetic materials. In order to gain insight into the underlying physical mechanism and to identify technologically relevant materials, it is important to quantify the spin Hall angle γ, which is a direct measure of the charge-to-spin (and vice versa) conversion efficiency. Towards this end we utilized non-local transport measurements with double Hall bars fabricated from gold and copper [G. Mihajlović, J. E. Pearson, M. A. Garcia, S. D. Bader, and A. Hoffmann, Phys. Rev. Lett. 103, 166601 (2009)]. In principle, this geometry permits the study of spin currents both generated and detected via spin Hall effects. We observe an unusual non-local resistivity that changes sign as a function of temperature. However, this result is quantitatively similar in gold and copper, indicating that the non-local signals are not due to spin transport. An analysis of the data based on a combination of diffusive and quasi-ballistic transport leads to an upper limit of γ < 0.027 for gold at room temperature. Therefore we developed an approach based on spin pumping, which enables us to quantify even small spin Hall angles with high accuracy. Spin pumping utilizes microwave excitation of a ferromagnetic layer adjacent to a normal metal to generate over a macroscopic area a homogeneous dc spin current, which can be quantified from the linewidth of the ferromagnetic resonance. In this geometry voltages from spin Hall effects scale with the device dimension and therefore good signal-to-noise can be obtained even for materials with small spin Hall angles. We integrated ferromagnet/normal-metal bilayers into a co-planar waveguide and determined the spin Hall angle for a variety of non-magnetic materials (Pt, Pd, Au, and Mo) at room temperature. 
Of these materials Pt shows the largest spin Hall angle, with γ = 0.013 ± 0.002 [O. Mosendz, V. Vlaminck, J. E. Pearson, F. Y. Fradin, G. E. W. Bauer, S. D. Bader, and A. Hoffmann, arXiv:1009.5089; O. Mosendz, J. E. Pearson, F. Y. Fradin, G. E. W. Bauer, S. D. Bader, and A. Hoffmann, Phys. Rev. Lett. 104, 046601 (2010)].

Hoffmann, Axel



Television Quiz Show Simulation  

ERIC Educational Resources Information Center

This article explores the simulation of four television quiz shows for students in China studying English as a foreign language (EFL). It discusses the adaptation and implementation of television quiz shows and how the students reacted to them.

Hill, Jonnie Lynn



The Great Cometary Show  

NASA Astrophysics Data System (ADS)

The ESO Very Large Telescope Interferometer, which allows astronomers to scrutinise objects with a precision equivalent to that of a 130-m telescope, is proving itself an unequalled success every day. One of the latest instruments installed, AMBER, has led to a flurry of scientific results, an anthology of which is being published this week as special features in the research journal Astronomy & Astrophysics. ESO PR Photo 06a/07 ESO PR Photo 06a/07 The AMBER Instrument "With its unique capabilities, the VLT Interferometer (VLTI) has created itself a niche in which it provide answers to many astronomical questions, from the shape of stars, to discs around stars, to the surroundings of the supermassive black holes in active galaxies," says Jorge Melnick (ESO), the VLT Project Scientist. The VLTI has led to 55 scientific papers already and is in fact producing more than half of the interferometric results worldwide. "With the capability of AMBER to combine up to three of the 8.2-m VLT Unit Telescopes, we can really achieve what nobody else can do," added Fabien Malbet, from the LAOG (France) and the AMBER Project Scientist. Eleven articles will appear this week in Astronomy & Astrophysics' special AMBER section. Three of them describe the unique instrument, while the other eight reveal completely new results about the early and late stages in the life of stars. ESO PR Photo 06b/07 ESO PR Photo 06b/07 The Inner Winds of Eta Carinae The first results presented in this issue cover various fields of stellar and circumstellar physics. Two papers deal with very young solar-like stars, offering new information about the geometry of the surrounding discs and associated outflowing winds. Other articles are devoted to the study of hot active stars of particular interest: Alpha Arae, Kappa Canis Majoris, and CPD -57o2874. They provide new, precise information about their rotating gas envelopes. An important new result concerns the enigmatic object Eta Carinae. 
Using AMBER with its high spatial and spectral resolution, it was possible to zoom into the very heart of this very massive star. In this innermost region, the observations are dominated by the extremely dense stellar wind that totally obscures the underlying central star. The AMBER observations show that this dense stellar wind is not spherically symmetric, but exhibits a clearly elongated structure. Overall, the AMBER observations confirm that the extremely high mass loss of Eta Carinae's massive central star is non-spherical and much stronger along the poles than in the equatorial plane. This is in agreement with theoretical models that predict such an enhanced polar mass loss in the case of rapidly rotating stars. [ESO PR Photo 06c/07: RS Ophiuchi in Outburst] Several papers from this special feature focus on the later stages in a star's life. One looks at the binary system Gamma 2 Velorum, which contains the closest example of a star known as a Wolf-Rayet. A single AMBER observation allowed the astronomers to separate the spectra of the two components, offering new insights into the modeling of Wolf-Rayet stars, and also made it possible to measure the separation between the two stars. This led to a new determination of the distance of the system, showing that previous estimates were incorrect. The observations also revealed information on the region where the winds from the two stars collide. The famous binary system RS Ophiuchi, an example of a recurrent nova, was observed just 5 days after it was discovered to be in outburst on 12 February 2006, an event that had been anticipated for 21 years. AMBER was able to detect the extension of the expanding nova emission. These observations show a complex geometry and kinematics, far from the simple interpretation of an expanding spherical fireball.
AMBER detected a high-velocity jet, probably perpendicular to the orbital plane of the binary system, and allowed a precise and careful study of the wind and the shock wave coming from the nova. The stream of results from the VLTI and AMBER



Quantifying network heterogeneity.  


Although degree distributions give some insight into how heterogeneous a network is, they fail to provide a unique quantitative characterization of network heterogeneity. This is particularly the case when several different distributions fit the same network, when the number of data points is very scarce due to network size, or when we have to compare two networks with completely different degree distributions. Here we propose a unique characterization of network heterogeneity based on the difference of functions of node degrees for all pairs of linked nodes. We show that this heterogeneity index can be expressed as a quadratic form of the Laplacian matrix of the network, which allows a spectral representation of network heterogeneity. We give bounds for this index, which is equal to zero for any regular network and equal to one only for star graphs. Using it we study random networks, showing that those generated by the Erdős-Rényi algorithm have zero heterogeneity, while those generated by the preferential attachment method of Barabási and Albert display only 11% of the heterogeneity of a star graph. We finally study 52 real-world networks and find that they display a large variety of heterogeneities. We also show that a classification system based on degree distributions does not reflect the heterogeneity properties of real-world networks. PMID:21230700
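The index can be sketched directly from this description: sum, over linked node pairs, a squared difference of a degree function. A minimal sketch, assuming the commonly used normalisation N - 2*sqrt(N-1), which makes any regular graph score 0 and a star graph score 1 (treat the exact normaliser as an assumption rather than the paper's stated formula):

```python
import math

def heterogeneity(edges):
    """Heterogeneity index: sum over edges of (k_i^-1/2 - k_j^-1/2)^2,
    normalised by N - 2*sqrt(N-1) so regular graphs score 0, stars 1."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    n = len(deg)
    rho = sum((deg[u] ** -0.5 - deg[v] ** -0.5) ** 2 for u, v in edges)
    return rho / (n - 2 * math.sqrt(n - 1))

# A 5-node star is maximally heterogeneous; a 5-cycle is regular.
star = [(0, i) for i in range(1, 5)]
cycle = [(i, (i + 1) % 5) for i in range(5)]
print(heterogeneity(star))   # 1.0
print(heterogeneity(cycle))  # 0.0
```

Because every term compares the degrees of two linked nodes, the sum is exactly a quadratic form x^T L x of the graph Laplacian evaluated at x_i = k_i^(-1/2), which is the spectral representation mentioned in the abstract.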

Estrada, Ernesto



Quantifying decoherence in continuous variable systems  

Microsoft Academic Search

We present a detailed report on the decoherence of quantum states of continuous variable systems under the action of a quantum optical master equation resulting from the interaction with general Gaussian uncorrelated environments. The rate of decoherence is quantified by relating it to the decay rates of various, complementary measures of the quantum nature of a state, such as the

A Serafini; M G A Paris; F. Illuminati; S. De Siena




Quantifying the extinction vortex.  


We developed a database of 10 wild vertebrate populations whose declines to extinction were monitored over at least 12 years. We quantitatively characterized the final declines of these well-monitored populations and tested key theoretical predictions about the process of extinction, obtaining two primary results. First, we found evidence of logarithmic scaling of time-to-extinction as a function of population size for each of the 10 populations. Second, two lines of evidence suggested that these extinction-bound populations collectively exhibited dynamics akin to those theoretically proposed to occur in extinction vortices. Specifically, retrospective analyses suggested that a population size of n individuals within a decade of extinction was somehow less valuable to persistence than the same population size was earlier. Likewise, both year-to-year rates of decline and year-to-year variability increased as the time-to-extinction decreased. Together, these results provide key empirical insights into extinction dynamics, an important topic that has received extensive theoretical attention. PMID:16958868

Fagan, William F; Holmes, E E



The Diane Rehm Show  

NSDL National Science Digital Library

The Diane Rehm Show has its origins in a mid-day program at WAMU in Washington, D.C. Diane Rehm came on to host the program in 1979, and in 1984 it was renamed "The Diane Rehm Show". Over the past several decades, Rehm has played host to hundreds of guests, including Archbishop Desmond Tutu, Julie Andrews, and President Bill Clinton. This website contains an archive of her past programs, and visitors can use the interactive calendar to look through past shows. Visitors looking for specific topics can use the "Topics" list on the left-hand side of the page or take advantage of the search engine. The show has a number of social networking links, including a Facebook page and a Twitter feed.



ERIC Educational Resources Information Center

Describes the Collegiate Results Instrument (CRI), which measures a range of collegiate outcomes for alumni 6 years after graduation. The CRI was designed to target alumni from institutions across market segments and assess their values, abilities, work skills, occupations, and pursuit of lifelong learning. (EV)

Zemsky, Robert; Shaman, Susan; Shapiro, Daniel B.



The Ozone Show.  

ERIC Educational Resources Information Center

Uses a talk show activity for a final assessment tool for students to debate about the ozone hole. Students are assessed on five areas: (1) cooperative learning; (2) the written component; (3) content; (4) self-evaluation; and (5) peer evaluation. (SAH)

Mathieu, Aaron



Demonstration Road Show  

NSDL National Science Digital Library

The Idaho State University Department of Physics conducts science demonstration shows at S. E. Idaho schools. Four different presentations are currently available: "Forces and Motion", "States of Matter", "Electricity and Magnetism", and "Sound and Waves". Information provided includes descriptions of the material and links to other resources.



Do Elephants Show Empathy?  

Microsoft Academic Search

Elephants show a rich social organization and display a number of unusual traits. In this paper, we analyse reports collected over a thirty-five year period, describing behaviour that has the potential to reveal signs of empathic understanding. These include coalition formation, the offering of protection and comfort to others, retrieving and 'babysitting' calves, aiding individuals that would otherwise have difficulty

Lucy A. Bates; Phyllis C. Lee; Norah Njiraini; Joyce H. Poole; Katito Sayialel; Soila Sayialel; Cynthia J. Moss; Richard W. Byrne



ISU Demonstration Road Show  

NSDL National Science Digital Library

The Idaho State University Department of Physics conducts science demonstration shows at SE Idaho schools. Four different presentations are currently available: "Forces and Motion", "States of Matter", "Electricity and Magnetism", and "Sound and Waves". Student activities and descriptions of the demonstrated material are also provided.

Shropshire, Steven



Stage a Water Show  

ERIC Educational Resources Information Center

In the author's book titled "The Incredible Water Show," the characters from "Miss Alaineus: A Vocabulary Disaster" used an ocean of information to stage an inventive performance about the water cycle. In this article, the author relates how she turned the story into hands-on science teaching for real-life fifth-grade students. The author also

Frasier, Debra



Quantifying Stimulus Discriminability: A Comparison of Information Theory and Ideal Observer Analysis  

Microsoft Academic Search

Performance in sensory discrimination tasks is commonly quantified using either information theory or ideal observer analysis. These two quantitative frameworks are often assumed to be equivalent. For example, higher mutual information is said to correspond to improved performance of an ideal observer in a stimulus estimation task. To the contrary, drawing on and extending previous results, we show that five

Eric E. Thomson; William B. Kristan Jr.



Show-Me Magazine  

NSDL National Science Digital Library

Come along as the folks at the University of Missouri show you the history of their college days through the Show Me magazine. It's a wonderful collection of college humor published from 1946 to 1963. First-time visitors would do well to read about the magazine's colorful past, courtesy of Jerry Smith. A good place to start is the November 1920 issue (easily found when you browse by date), which contains a number of parody advertisements along with some doggerels poking good-natured fun at the football team and an assortment of deans. Also, it's worth noting that visitors can scroll through issues and save them to an online "bookbag" for later use.



Mars Slide Show  

NASA Technical Reports Server (NTRS)

15 September 2006 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows a landslide that occurred off of a steep slope in Tithonium Chasma, part of the vast Valles Marineris trough system.

Location near: 4.8°S, 84.6°W Image width: 3 km (1.9 mi) Illumination from: upper left Season: Southern Autumn



NPR: The Picture Show  

NSDL National Science Digital Library

National Public Radio's "The Picture Show" photo blog is a great way to avoid sifting through the thousands of less engaging photographs on the web. With a dedicated team of professionals, this blog brings together posts that profile various sets of photographs, covering 19th-century war in Afghanistan, visual memories of WWII, unpublished photographs of JFK's presidential campaign, and abandoned buildings on the islands in Boston Harbor. Visitors can search through previous posts, use social media features to share the photo features with friends, and sign up to receive new materials via RSS feed. There's quite a nice mix of material here, and visitors can also comment on the photos and recommend the collection to friends and others.


Egg: the Arts Show  

NSDL National Science Digital Library

"Egg is a new TV show about people making art across America" from PBS. This accompanying Website presents excerpts from sixteen episodes of the series, with three more "hatching soon," such as Close to Home, profiling three photographers: Jeanine Pohlhaus, whose pictures document her father's struggle with mental illness; Gregory Crewdson's photos of Lee, Massachusetts; and Joseph Rodriguez's photos of Hispanics in New York City. Excerpts include video clips, gallery listings where the artists' work can be seen, and short interviews with artists. Some episodes also offer "peeps," glimpses of material not shown on TV, such as the Space episode's peep, Shooting Stars, that provides directions for astrophotography, taking photographs of star trails. Other sections of the site are airdates, for local listings; see and do usa, where vacationers can search for art events at their destinations; and egg on the arts, a discussion forum.


Quantifying reliability uncertainty : a proof of concept.  

SciTech Connect

This paper develops Classical and Bayesian methods for quantifying the uncertainty in reliability for a system of mixed series and parallel components for which both go/no-go and variables data are available. Classical methods focus on uncertainty due to sampling error. Bayesian methods can explore both sampling error and other knowledge-based uncertainties. To date, the reliability community has focused on qualitative statements about uncertainty because there was no consensus on how to quantify them. This paper provides a proof of concept that workable, meaningful quantification methods can be constructed. In addition, the application of the methods demonstrated that the results from the two fundamentally different approaches can be quite comparable. In both approaches, results are sensitive to the details of how one handles components for which no failures have been seen in relatively few tests.
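The Bayesian side of such an analysis can be illustrated with a minimal Monte Carlo sketch. Everything here is hypothetical (the two-component series system, the go/no-go test counts, and the Jeffreys Beta(0.5, 0.5) prior) and is not the paper's actual system, data, or method:

```python
import random

# Hypothetical two-component series system with go/no-go test data,
# analysed with conjugate Beta posteriors and Monte Carlo propagation.
tests = {"comp_A": (50, 1),   # (number of tests, number of failures)
         "comp_B": (30, 0)}   # zero observed failures: prior-sensitive!

def posterior_draw(n, f):
    # Jeffreys prior Beta(0.5, 0.5) updated with (n - f) successes, f failures.
    return random.betavariate(0.5 + (n - f), 0.5 + f)

random.seed(1)
draws = []
for _ in range(20000):
    r = 1.0
    for n, f in tests.values():
        r *= posterior_draw(n, f)   # series system: all components must work
    draws.append(r)
draws.sort()
mean = sum(draws) / len(draws)
lo, hi = draws[500], draws[19500]   # empirical 90% credible interval
print(f"posterior mean {mean:.3f}, 90% interval ({lo:.3f}, {hi:.3f})")
```

Note how comp_B, with no observed failures in few tests, is the component whose contribution depends most on the prior, mirroring the sensitivity the abstract reports for zero-failure components.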

Diegert, Kathleen V.; Dvorack, Michael A.; Ringland, James T.; Mundt, Michael Joseph; Huzurbazar, Aparna (Los Alamos National Laboratory, Los Alamos, NM); Lorio, John F.; Fatherley, Quinn (Los Alamos National Laboratory, Los Alamos, NM); Anderson-Cook, Christine (Los Alamos National Laboratory, Los Alamos, NM); Wilson, Alyson G. (Los Alamos National Laboratory, Los Alamos, NM); Zurn, Rena M.



A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)  

NASA Astrophysics Data System (ADS)

A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint on the routine use of the DTT assay for integration with large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (coefficient of variation of 12% for standards, 4% for ambient samples) and a reasonably low limit of detection (0.31 nmol min⁻¹). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r² = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution and Epidemiology (SCAPE) to generate an extensive data set on the DTT activity of ambient particles collected from contrasting environments (urban, road-side, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per air volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. However, the greater heterogeneity in the intrinsic DTT activity (per PM mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the instrument can also be used to determine oxidative potential with other acellular assays.
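The quantity being automated, the rate of DTT loss, reduces to a blank-corrected regression slope. A sketch with made-up readings (illustrative numbers, not SCAPE data or the system's actual processing pipeline):

```python
# DTT activity = blank-corrected rate of DTT consumption, taken as the
# ordinary least-squares slope of remaining DTT (nmol) against time (min).
def slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

t = [0, 10, 20, 30]                     # minutes
dtt_sample = [100.0, 88.0, 77.5, 65.0]  # nmol remaining, PM extract present
dtt_blank  = [100.0, 98.9, 97.8, 97.1]  # nmol remaining, no PM (blank)

# Subtracting the blank slope removes DTT loss not caused by the PM.
activity = -(slope(t, dtt_sample) - slope(t, dtt_blank))  # nmol/min
print(round(activity, 3))  # 1.057
```

A value of about 1 nmol/min for an ambient sample sits well above the 0.31 nmol/min detection limit quoted in the abstract, which is the kind of check such a calculation supports.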

Fang, T.; Verma, V.; Guo, H.; King, L. E.; Edgerton, E. S.; Weber, R. J.



Quantifying offshore wind resources from satellite wind maps: study area the North Sea  

NASA Astrophysics Data System (ADS)

Offshore wind resources are quantified from satellite synthetic aperture radar (SAR) and satellite scatterometer observations at local and regional scales, respectively, at the Horns Rev site in Denmark. The method for wind resource estimation from satellite observations interfaces with the wind atlas analysis and application program (WAsP). An estimate of the wind resource at the new project site at Horns Rev is given based on satellite SAR observations. The comparison of offshore satellite scatterometer winds, global model data and in situ data shows good agreement. Furthermore, the wake effect of the Horns Rev wind farm is quantified from satellite SAR images and compared with state-of-the-art wake model results with good agreement. It is a unique method using satellite observations to quantify the spatial extent of the wake behind large offshore wind farms.
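As a rough illustration of the kind of quantity such a resource estimate yields, mean wind power density follows directly from the Weibull parameters a WAsP-style wind atlas stores. The scale A, shape k, and air density below are illustrative values only, not Horns Rev results:

```python
import math

def power_density(A, k, rho=1.225):
    """Mean wind power density (W/m^2) for Weibull(A, k) wind speeds.

    Uses E[v^3] = A^3 * Gamma(1 + 3/k) for a Weibull distribution,
    so P = 0.5 * rho * A^3 * Gamma(1 + 3/k).
    """
    return 0.5 * rho * A ** 3 * math.gamma(1.0 + 3.0 / k)

# Illustrative offshore-like parameters: A = 9.0 m/s, k = 2.2.
print(round(power_density(9.0, 2.2), 1))
```

The cubic dependence on the scale parameter is why modest biases in satellite-derived wind speed translate into large errors in estimated resource, motivating the validation against in situ data described above.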

Hasager, C. B.; Barthelmie, R. J.; Christiansen, M. B.; Nielsen, M.; Pryor, S. C.



Quantifying pulsed laser induced damage to graphene  

SciTech Connect

As an emerging optical material, graphene's ultrafast dynamics are often probed using pulsed lasers yet the region in which optical damage takes place is largely uncharted. Here, femtosecond laser pulses induced localized damage in single-layer graphene on sapphire. Raman spatial mapping, SEM, and AFM microscopy quantified the damage. The resulting size of the damaged area has a linear correlation with the optical fluence. These results demonstrate local modification of sp²-carbon bonding structures with optical pulse fluences as low as 14 mJ/cm², an order-of-magnitude lower than measured and theoretical ablation thresholds.

Currie, Marc; Caldwell, Joshua D.; Bezares, Francisco J.; Robinson, Jeremy; Anderson, Travis; Chun, Hayden; Tadjer, Marko [Optical Sciences Division and Electronics Science and Technology Division, Naval Research Laboratory, Washington DC 20375 (United States)




Technology Transfer Automated Retrieval System (TEKTRAN)

Analytical results from different laboratories have greater variation than those from a single laboratory, and this variation differs by nutrient. Objectives of this presentation are to describe methods for quantifying the analytical reproducibility among and repeatability within laboratories, estim...


Quantifying tissue hemodynamics by NIRS versus DOT: global versus focal changes in cerebral hemodynamics  

NASA Astrophysics Data System (ADS)

Near infrared spectroscopy (NIRS) is used to quantify changes in oxy-hemoglobin (HbO) and deoxy-hemoglobin (Hb) concentrations in tissue. The analysis uses the modified Beer-Lambert law, which is generally valid for quantifying global concentration changes. We examine the errors that result from analyzing focal changes in HbO and Hb concentrations. We find that the measured focal changes in HbO and Hb are linearly proportional to the actual focal changes, but that the proportionality constants are different. Thus relative changes in HbO and Hb cannot, in general, be quantified. However, we show that under certain circumstances it is possible to quantify these relative changes. This builds the case for diffuse optical tomography (DOT), which in general should be able to quantify focal changes in HbO and Hb through the use of image reconstruction algorithms that deconvolve the photon diffusion point-spread-function. We demonstrate the differences between NIRS and DOT using a rat model of somatosensory stimulation.
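The modified Beer-Lambert analysis referred to above amounts to inverting a small linear system: ΔOD(λ) = (ε_HbO(λ)·Δ[HbO] + ε_Hb(λ)·Δ[Hb])·L·DPF. A minimal sketch; the extinction coefficients, source-detector distance, and differential pathlength factor below are placeholder values for illustration, not tabulated constants:

```python
# Placeholder optical parameters for two NIR wavelengths (arbitrary units).
eps = {690: (0.35, 2.10),   # (eps_HbO, eps_Hb) at 690 nm
       830: (1.05, 0.78)}   # (eps_HbO, eps_Hb) at 830 nm
L, dpf = 3.0, 6.0           # source-detector distance (cm), pathlength factor

def mbll(d_od):
    """Invert ΔOD at two wavelengths to (ΔHbO, ΔHb) via Cramer's rule."""
    (a, b), (c, d) = eps[690], eps[830]
    y1 = d_od[690] / (L * dpf)
    y2 = d_od[830] / (L * dpf)
    det = a * d - b * c
    return (y1 * d - y2 * b) / det, (a * y2 - c * y1) / det

# Round trip: synthesise ΔOD from known concentration changes, then invert.
true_hbo, true_hb = 2.0, -1.0
d_od = {w: (e1 * true_hbo + e2 * true_hb) * L * dpf
        for w, (e1, e2) in eps.items()}
print(mbll(d_od))  # recovers (2.0, -1.0) up to rounding
```

The error the abstract analyses enters exactly here: for a focal activation, the effective pathlength through the active region differs from L·DPF and differs between HbO and Hb, so the recovered values are scaled by unequal constants.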

Boas, David A.; Cheng, Xuefeng; Marota, John A.; Mandeville, Joseph B.



A stochastic approach for quantifying immigrant integration: the Spanish test case  

NASA Astrophysics Data System (ADS)

We apply stochastic process theory to the analysis of immigrant integration. Using a unique and detailed data set from Spain, we study the relationship between local immigrant density and two social and two economic immigration quantifiers for the period 1999-2010. As opposed to the classic time-series approach, by letting immigrant density play the role of time and the quantifier the role of space, it becomes possible to analyse the behavior of the quantifiers by means of continuous time random walks. Two classes of results are then obtained. First, we show that social integration quantifiers evolve following a diffusion law, while the evolution of economic quantifiers exhibits ballistic dynamics. Second, we make predictions of best- and worst-case scenarios taking into account large local fluctuations. Our stochastic process approach to integration lends itself to interesting forecasting scenarios which, in the hands of policy makers, have the potential to improve political responses to integration problems. For instance, estimating the standard first-passage time and maximum-span walk reveals local differences in integration performance for different immigration scenarios. Thus, by recognizing the importance of local fluctuations around national means, this research constitutes an important tool to assess the impact of immigration phenomena on municipal budgets and to set up solid multi-ethnic plans at the municipal level as immigration pressures build.
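The diffusive-versus-ballistic distinction can be made concrete with a toy walker simulation (illustrative only; in the study the running variable is immigrant density, not time, and the quantifiers play the role of position):

```python
import random

random.seed(7)

def msd(step, t, walkers=2000):
    """Mean squared displacement after t steps, averaged over walkers."""
    total = 0.0
    for _ in range(walkers):
        x = 0.0
        for _ in range(t):
            x += step()
        total += x * x
    return total / walkers

diffusive = lambda: random.choice((-1.0, 1.0))  # zero-mean random steps
ballistic = lambda: 1.0                          # constant drift

for t in (100, 400):
    print(t, msd(diffusive, t), msd(ballistic, t))
# Diffusive MSD grows like t (x4 when t quadruples);
# ballistic MSD grows like t^2 (x16 when t quadruples).
```

The same scaling-exponent diagnostic, applied to the empirical quantifier trajectories, is what separates the social (diffusive) from the economic (ballistic) behaviour reported above.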

Agliari, Elena; Barra, Adriano; Contucci, Pierluigi; Sandell, Richard; Vernia, Cecilia



Quantifying torso deformity in scoliosis  

NASA Astrophysics Data System (ADS)

Scoliosis affects the alignment of the spine and the shape of the torso. Most scoliosis patients and their families are more concerned about the effect of scoliosis on the torso than its effect on the spine. There is a need to develop robust techniques for quantifying torso deformity based on full torso scans. In this paper, deformation indices obtained from orthogonal maps of full torso scans are used to quantify torso deformity in scoliosis. 'Orthogonal maps' are obtained by applying orthogonal transforms to 3D surface maps. (An 'orthogonal transform' maps a cylindrical coordinate system to a Cartesian coordinate system.) The technique was tested on 361 deformed computer models of the human torso and on 22 scans of volunteers (8 normal and 14 scoliosis). Deformation indices from the orthogonal maps correctly classified up to 95% of the volunteers with a specificity of 1.00 and a sensitivity of 0.91. In addition to classifying scoliosis, the system gives a visual representation of the entire torso in one view and is viable for use in a clinical environment for managing scoliosis.
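The "orthogonal transform" described above can be sketched as a simple unrolling of the torso surface from cylindrical to Cartesian coordinates. A toy version (the grid sizes and the max-radius binning rule are arbitrary choices for illustration, not the paper's implementation):

```python
import math

def orthogonal_map(points, n_theta=8, n_z=4, z_range=(0.0, 1.0)):
    """Project 3D surface points onto a flat (theta, z) grid whose pixel
    value is the radial distance rho from the body axis."""
    grid = [[0.0] * n_theta for _ in range(n_z)]
    z0, z1 = z_range
    for x, y, z in points:
        theta = math.atan2(y, x) % (2 * math.pi)
        i = min(int((z - z0) / (z1 - z0) * n_z), n_z - 1)
        j = min(int(theta / (2 * math.pi) * n_theta), n_theta - 1)
        grid[i][j] = max(grid[i][j], math.hypot(x, y))
    return grid

# A perfectly circular cross-section unrolls to a constant-value row;
# asymmetries of a deformed torso would show up as row variation.
ring = [(math.cos(a), math.sin(a), 0.5)
        for a in [k * 2 * math.pi / 8 + 0.1 for k in range(8)]]
m = orthogonal_map(ring)
print(m[2])
```

Deformation indices can then be computed on this flat map, which is why the whole torso becomes visible in one view as the abstract notes.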

Ajemba, Peter O.; Kumar, Anish; Durdle, Nelson G.; Raso, V. James



Towards quantifying fuzzy stream power  

NASA Astrophysics Data System (ADS)

Deterministic flow direction algorithms such as the D8 have wide application in numerical models of landscape evolution. These simple algorithms play a central role in quantifying drainage basin area, and hence in approximating, via empirically derived relationships from regional flood frequency and hydraulic geometry, stream power or fluvial erosion potential. Here we explore how alternative algorithms that employ a probabilistic choice of flow direction affect quantitative estimates of stream power. We test a probabilistic multi-flow direction algorithm within the MATLAB TopoToolbox in model and real landscapes of low topographic relief and minute gradients, where potentially fuzzy drainage divides are dictated by, among others, alluvial fan dynamics, playa infill, and groundwater fluxes and seepage. We employ a simplistic numerical landscape evolution model that simulates fluvial incision and hillslope diffusion and explicitly models the existence and capture of endorheic basins that prevail in (semi-)arid, low-relief landscapes. We discuss how using this probabilistic multi-flow direction algorithm helps represent and quantify uncertainty about spatio-temporal drainage divide locations and how this bears on quantitative estimates of downstream stream power and fluvial erosion potential as well as their temporal dynamics.
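The deterministic D8 routing these algorithms build on can be sketched in a few lines. A toy illustration (not TopoToolbox code; the 3x3 elevation grid and the highest-to-lowest accumulation pass are made up for the example):

```python
# D8: each cell drains to its steepest-descent 8-neighbour; upslope area
# is accumulated by visiting cells from highest to lowest elevation.
dem = [[9, 8, 7],
       [8, 5, 4],
       [7, 4, 1]]
R, C = len(dem), len(dem[0])
nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_receiver(r, c):
    best, steepest = None, 0.0
    for dr, dc in nbrs:
        rr, cc = r + dr, c + dc
        if 0 <= rr < R and 0 <= cc < C:
            dist = (dr * dr + dc * dc) ** 0.5   # diagonal steps are longer
            s = (dem[r][c] - dem[rr][cc]) / dist
            if s > steepest:
                best, steepest = (rr, cc), s
    return best  # None at a pit or the outlet

acc = [[1.0] * C for _ in range(R)]   # each cell contributes its own area
order = sorted(((r, c) for r in range(R) for c in range(C)),
               key=lambda p: -dem[p[0]][p[1]])
for r, c in order:
    rec = d8_receiver(r, c)
    if rec:
        acc[rec[0]][rec[1]] += acc[r][c]
print(acc[2][2])  # the corner outlet collects all 9 cells -> 9.0
```

A probabilistic multi-flow variant would replace the single argmax receiver with a random draw weighted by downslope gradients, so repeated runs yield a distribution of drainage areas, and hence of stream power, rather than one deterministic value.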

Schwanghart, W.; Korup, O.



Evaluation of two methods for quantifying passeriform lice.  


Two methods commonly used to quantify ectoparasites on live birds are visual examination and dust-ruffling. Visual examination provides an estimate of ectoparasite abundance based on an observer's timed inspection of various body regions on a bird. Dust-ruffling involves application of insecticidal powder to feathers that are then ruffled to dislodge ectoparasites onto a collection surface where they can then be counted. Despite the common use of these methods in the field, the proportion of actual ectoparasites they account for has only been tested with Rock Pigeons (Columba livia), a relatively large-bodied species (238-302 g) with dense plumage. We tested the accuracy of the two methods using European Starlings (Sturnus vulgaris; ~75 g). We first quantified the number of lice (Brueelia nebulosa) on starlings using visual examination, followed immediately by dust-ruffling. Birds were then euthanized and the proportion of lice accounted for by each method was compared to the total number of lice on each bird as determined with a body-washing method. Visual examination and dust-ruffling each accounted for a relatively small proportion of total lice (14% and 16%, respectively), but both were still significant predictors of abundance. The number of lice observed by visual examination accounted for 68% of the variation in total abundance. Similarly, the number of lice recovered by dust-ruffling accounted for 72% of the variation in total abundance. Our results show that both methods can be used to reliably quantify the abundance of lice on European Starlings and other similar-sized passerines. PMID:24039328

Koop, Jennifer A H; Clayton, Dale H



Quantifying Spin Hall Effects from Spin Pumping  

NASA Astrophysics Data System (ADS)

Recent activity in spin transport research has included a focus on spin Hall effects, which arise from spin-orbit interactions. Spin orbit coupling in normal metals (NM) results in a conversion of pure spin currents into charge currents, which are perpendicular to both the spin current direction and the spin polarization. This phenomenon is known as the inverse spin Hall effect and it generates a voltage across a spin-current-carrying sample. The strength of the inverse spin Hall effect is characterized by a single dimensionless parameter, the spin Hall angle, which is materials-specific. Here we present a new method to quantify spin Hall angles for many different materials. We studied the inverse spin Hall effect in Ni80Fe20/NM bilayer structures by generating pure spin currents inside the NM layer through spin pumping at the Ni80Fe20/NM interface. Integrating a patterned Ni80Fe20/NM bilayer into a coplanar waveguide transmission line enables us to excite large angle magnetization precession in Ni80Fe20 via rf excitation, which in turn generates a dc spin current in the adjacent NM. A strong dc signal across the Ni80Fe20/NM is observed at the FMR position, and its magnitude is dependent on the power of the rf excitation and the direction of the applied magnetic field. We identified two distinct contributions to the dc voltage: one symmetric with respect to the FMR resonance position, and the other antisymmetric. Our analysis shows that the antisymmetric contribution is due to anisotropic magnetoresistance (AMR) in the Ni80Fe20 layer and is present even in single-layer Ni80Fe20 films. The second, symmetric, contribution to the dc voltage is attributed to the inverse spin Hall effect. The main advantage of our approach is that this second contribution scales with the device dimension and thus even small spin Hall signals can be detected with large accuracy. 
Using this approach we determined the spin Hall angle for Pt, Au and Mo by fitting the experimental data to a self-consistent theory, which accounts for both AMR and inverse spin Hall effect contributions [O. Mosendz, J. E. Pearson, F. Y. Fradin, G. E. W. Bauer, S. D. Bader, and A. Hoffmann, arXiv:0911.2725]. Our technique allows electrical detection of the spin accumulation in the NM. Using this connection, we also showed that spin pumping is suppressed when an MgO tunneling barrier is inserted at the Ni80Fe20/NM interface [O. Mosendz, J. E. Pearson, F. Y. Fradin, S. D. Bader, and A. Hoffmann, arXiv:0911.3182].

Mosendz, Oleksandr



Quantifying cellular differentiation by physical phenotype using digital holographic microscopy.  


Although the biochemical changes that occur during cell differentiation are well-known, less known is that there are significant, cell-wide physical changes that also occur. Understanding and quantifying these changes can help to better understand the process of differentiation as well as ways to monitor it. Digital holographic microscopy (DHM) is a marker-free quantitative phase microscopy technique for measuring biological processes such as cellular differentiation, alleviating the need for introduction of foreign markers. We found significant changes in subcellular structure and refractive index of differentiating myeloid precursor cells within one day of differentiation induction, and significant differences depending on the type of lineage commitment. We augmented these results by using optical stretching, a laser-trap-based marker-free technique, to show significant changes in the softness of myeloid precursor cells within one day of differentiation induction. DHM and optical stretching therefore provide consequential parameterization of cellular differentiation with sensitivity otherwise difficult to achieve. Thus we provide a way forward to quantify and understand cell differentiation with minimal perturbation using biophotonics. PMID:22262315

Chalut, Kevin J; Ekpenyong, Andrew E; Clegg, Warren L; Melhuish, Isabel C; Guck, Jochen



Use of the Concept of Equivalent Biologically Effective Dose (BED) to Quantify the Contribution of Hyperthermia to Local Tumor Control in Radiohyperthermia Cervical Cancer Trials, and Comparison With Radiochemotherapy Results  

SciTech Connect

Purpose: To express the magnitude of contribution of hyperthermia to local tumor control in radiohyperthermia (RT/HT) cervical cancer trials, in terms of the radiation-equivalent biologically effective dose (BED) and to explore the potential of the combined modalities in the treatment of this neoplasm. Materials and Methods: Local control rates of both arms of each study (RT vs. RT+HT) reported from randomized controlled trials (RCT) on concurrent RT/HT for cervical cancer were reviewed. By comparing the two tumor control probabilities (TCPs) from each study, we calculated the HT-related log cell-kill and then expressed it in terms of the number of 2 Gy fraction equivalents, for a range of tumor volumes and radiosensitivities. We have compared the contribution of each modality and made some exploratory calculations on the TCPs that might be expected from a combined trimodality treatment (RT+CT+HT). Results: The HT-equivalent number of 2-Gy fractions ranges from 0.6 to 4.8 depending on radiosensitivity. Opportunities for clinically detectable improvement by the addition of HT are only available in tumors with an alpha value in the approximate range of 0.22-0.28 Gy⁻¹. A combined treatment (RT+CT+HT) is not expected to improve prognosis in radioresistant tumors. Conclusion: The most significant improvements in TCP, which may result from the combination of RT/CT/HT for locally advanced cervical carcinomas, are likely to be limited only to those patients with tumors of relatively low-intermediate radiosensitivity.
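The comparison logic can be sketched under a Poisson TCP model, TCP = exp(-K·SF): the extra cell kill attributable to HT follows from the two arm TCPs, and dividing by the cell kill of one 2 Gy fraction gives the radiation-equivalent fraction count. All numbers below are illustrative, not values from the reviewed trials, and the α/β = 10 Gy choice is a common assumption rather than the paper's stated parameter:

```python
import math

def ht_equivalent_fractions(tcp_rt, tcp_rtht, alpha, alpha_beta=10.0):
    """Radiation-equivalent number of 2 Gy fractions for the HT effect.

    Poisson model: TCP = exp(-K * SF), so ln(TCP) is proportional to the
    surviving fraction; the ratio of the two ln(TCP) values isolates the
    extra (natural-log) cell kill contributed by hyperthermia.
    """
    extra_kill = math.log(math.log(tcp_rt) / math.log(tcp_rtht))
    d = 2.0
    # ln cell kill per fraction from the LQ model: alpha*d + beta*d^2.
    per_fraction = alpha * d * (1.0 + d / alpha_beta)
    return extra_kill / per_fraction

# Illustrative arm outcomes: 45% local control with RT alone, 60% with RT+HT.
print(round(ht_equivalent_fractions(0.45, 0.60, alpha=0.25), 2))
```

With these made-up inputs the result falls inside the 0.6-4.8 fraction range reported above; the strong dependence on alpha is why the clinically detectable window is confined to a narrow radiosensitivity band.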

Plataniotis, George A. [Department of Oncology, Aberdeen Royal Infirmary, Aberdeen (United Kingdom)]; Dale, Roger G. [Imperial College Healthcare NHS Trust, London (United Kingdom)]



Quantifying entanglement with witness operators  

SciTech Connect

We present a unifying approach to the quantification of entanglement based on entanglement witnesses, which includes several already established entanglement measures such as the negativity, the concurrence, and the robustness of entanglement. We then introduce an infinite family of new entanglement quantifiers, having as its limits the best separable approximation measure and the generalized robustness. Gaussian states, states with symmetry, states constrained by super-selection rules, and states composed of indistinguishable particles are studied from the viewpoint of the witnessed entanglement. We derive new bounds for the fidelity of teleportation d_min, for the distillable entanglement E_D, and for the entanglement of formation. A particular measure, the PPT-generalized robustness, stands out due to its easy calculability and provides sharper bounds to d_min and E_D than the negativity for most states. We illustrate our approach by studying thermodynamical properties of entanglement in the Heisenberg XXX and dimerized models.

Brandao, Fernando G.S.L. [Grupo de Informacao Quantica, Departamento de Fisica, Universidade Federal de Minas Gerais, Caixa Postal 702, Belo Horizonte, 30.123-970, MG (Brazil)]



Quantifying entanglement with scattering experiments  

NASA Astrophysics Data System (ADS)

We show how the entanglement contained in states of spins arranged on a lattice may be lower bounded with observables arising in scattering experiments. We focus on the partial differential cross section obtained in neutron scattering from magnetic materials, but our results are sufficiently general that they may also be applied to, e.g., optical Bragg scattering from ultracold atoms in optical lattices or from ion chains. We discuss resonating valence bond states and ground and thermal states of experimentally relevant models, such as the Heisenberg, Majumdar-Ghosh, and XY models, in different geometries and with different spin numbers. As a by-product, we find that for the one-dimensional XY model in a transverse field such measurements reveal factorization and the quantum phase transition at zero temperature.

Marty, O.; Epping, M.; Kampermann, H.; Bruß, D.; Plenio, M. B.; Cramer, M.



Quantifying utricular stimulation during natural behavior  

PubMed Central

The use of natural stimuli in neurophysiological studies has led to significant insights into the encoding strategies used by sensory neurons. To investigate these encoding strategies in vestibular receptors and neurons, we have developed a method for calculating the stimuli delivered to a vestibular organ, the utricle, during natural (unrestrained) behaviors, using the turtle as our experimental preparation. High-speed digital video sequences are used to calculate the dynamic gravito-inertial (GI) vector acting on the head during behavior. X-ray computed tomography (CT) scans are used to determine the orientation of the otoconial layer (OL) of the utricle within the head, and the calculated GI vectors are then rotated into the plane of the OL. Thus, the method allows us to quantify the spatio-temporal structure of stimuli to the OL during natural behaviors. In the future, these waveforms can be used as stimuli in neurophysiological experiments to understand how natural signals are encoded by vestibular receptors and neurons. We provide one example of the method which shows that turtle feeding behaviors can stimulate the utricle at frequencies higher than those typically used in vestibular studies. This method can be adapted to other species, to other vestibular end organs, and to other methods of quantifying head movements. PMID:22753360
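The rotation of head-frame gravito-inertial vectors into the plane of the otoconial layer can be illustrated with elementary rotation matrices. The pitch/roll parameterization of OL orientation below is a hypothetical simplification; in the study the orientation is derived from CT scans:

```python
import numpy as np

def rot_x(a):
    """Rotation matrix about the x (roll) axis; angle in radians."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    """Rotation matrix about the y (pitch) axis; angle in radians."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def gi_in_ol_plane(gi_head, pitch, roll):
    """Rotate a head-frame gravito-inertial (GI) vector into the frame of
    the otoconial layer.  pitch/roll are hypothetical orientation angles
    standing in for the CT-derived orientation used in the paper."""
    return rot_y(pitch) @ rot_x(roll) @ gi_head
```

Applied frame by frame to the video-derived GI vectors, this yields the in-plane stimulus waveform delivered to the utricle.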

Rivera, Angela R. V.; Davis, Julian; Grant, Wally; Blob, Richard W.; Peterson, Ellengene; Neiman, Alexander B.; Rowe, Michael



Quantifying of bactericide properties of medicinal plants  

PubMed Central

Extended research has been carried out to clarify the ecological role of plant secondary metabolites (SMs). Although their primary ecological function is self-defense, bioactive compounds have long been used in alternative medicine or in biological control of pests. Several members of the family Labiatae are known to have strong antimicrobial capacity. For testing and quantifying antibacterial activity, most often standard microbial protocols are used, assessing inhibitory activity on a selected strain. In this study, the applicability of a microbial ecotoxtest was evaluated to quantify the aggregate bactericide capacity of Labiatae species, based on the bioluminescence inhibition of the bacterium Vibrio fischeri. Striking differences were found amongst herbs, reaching even 10-fold toxicity. Glechoma hederacea L. proved to be the most toxic, with the EC50 of 0.4073 g dried plant/l. LC50 values generated by the standard bioassay seem to be a good indicator of the bactericide property of herbs. Traditional use of the selected herbs shows a good correlation with bioactivity expressed as bioluminescence inhibition, leading to the conclusion that the Vibrio fischeri bioassay can be a good indicator of the overall antibacterial capacity of herbs, at least on a screening level. PMID:21502819
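A minimal sketch of pulling an EC50 out of an inhibition curve, assuming log-linear interpolation between the two doses that bracket 50% inhibition (standard bioassay software fits a full dose-response model instead; the data below are invented):

```python
import math

def ec50_from_curve(concs, inhibition_pct):
    """Estimate the EC50 (concentration giving 50% bioluminescence
    inhibition) by log-linear interpolation between the two tested doses
    that bracket 50%.  Dose-response values here are hypothetical; the
    study reports EC50 = 0.4073 g dried plant/l for Glechoma hederacea."""
    pairs = sorted(zip(concs, inhibition_pct))
    for (c0, i0), (c1, i1) in zip(pairs, pairs[1:]):
        if i0 <= 50.0 <= i1:
            t = (50.0 - i0) / (i1 - i0)
            return 10.0 ** (math.log10(c0) + t * (math.log10(c1) - math.log10(c0)))
    raise ValueError("50% inhibition is not bracketed by the tested doses")
```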

Ács, András; Gölöncsér, Flóra; Barabás, Anikó



Detecting and quantifying topography in neural maps.  


Topographic maps are an often-encountered feature in the brains of many species, yet there are no standard, objective procedures for quantifying topography. Topographic maps are typically identified and described subjectively, but in cases where the scale of the map is close to the resolution limit of the measurement technique, identifying the presence of a topographic map can be a challenging subjective task. In such cases, an objective topography detection test would be advantageous. To address these issues, we assessed seven measures (Pearson distance correlation, Spearman distance correlation, Zrehen's measure, topographic product, topological correlation, path length and wiring length) by quantifying topography in three classes of cortical map model: linear, orientation-like, and clusters. We found that all but one of these measures were effective at detecting statistically significant topography even in weakly-ordered maps, based on simulated noisy measurements of neuronal selectivity and sparse sampling of the maps. We demonstrate the practical applicability of these measures by using them to examine the arrangement of spatial cue selectivity in pallid bat A1. This analysis shows that significantly topographic arrangements of interaural intensity difference and azimuth selectivity exist at the scale of individual binaural clusters. PMID:24505279
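One of the listed measures, the Pearson distance correlation, is straightforward to sketch together with the permutation test used to declare topography statistically significant. Positions and selectivities below are toy data, not the pallid bat recordings:

```python
import random

def _pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def topography_score(positions, selectivity, n_perm=300, seed=0):
    """Pearson distance correlation with a permutation test: correlate
    pairwise cortical distances with pairwise selectivity differences,
    then estimate a p-value by shuffling selectivities across sites.
    A sketch of one of the seven measures compared in the paper."""
    idx = range(len(positions))
    cort = [((positions[i][0] - positions[j][0]) ** 2 +
             (positions[i][1] - positions[j][1]) ** 2) ** 0.5
            for i in idx for j in idx if i < j]

    def feat_dists(sel):
        return [abs(sel[i] - sel[j]) for i in idx for j in idx if i < j]

    r_obs = _pearson(cort, feat_dists(selectivity))
    rng = random.Random(seed)
    sel = list(selectivity)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(sel)
        if _pearson(cort, feat_dists(sel)) >= r_obs:
            hits += 1
    return r_obs, (hits + 1) / (n_perm + 1)
```

A perfectly ordered one-dimensional map yields r near 1 with a small p-value, while a shuffled map gives r near 0 and a large p-value.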

Yarrow, Stuart; Razak, Khaleel A; Seitz, Aaron R; Sériès, Peggy



Fuzzy Entropy Method for Quantifying Supply Chain Networks Complexity  

NASA Astrophysics Data System (ADS)

Supply chains are a special kind of complex network. Their complexity and uncertainty make them very difficult to control and manage, and they face a rising complexity of products, structures, and processes. Because of the strong link between a supply chain's complexity and its efficiency, supply-chain complexity management has become a major challenge of today's business management. The aim of this paper is to quantify the complexity and organization level of an industrial network, working towards the development of a Supply Chain Network Analysis (SCNA). By measuring flows of goods and interaction costs between different sectors of activity within the supply chain borders, a network of flows is built and successively investigated by network analysis. The result of this study shows that our approach can provide an interesting conceptual perspective in which the modern supply network can be framed, and that network analysis can handle these issues in practice.
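The entropy-of-flows idea can be sketched with plain Shannon entropy; the paper's fuzzy-entropy formulation extends this with fuzzy membership grades, which are omitted here:

```python
import math

def flow_entropy_bits(flows):
    """Shannon entropy (bits) of the distribution of goods flows between
    sectors of a supply network: a single dominant flow gives entropy 0,
    and n equal flows give log2(n).  `flows` maps (source, destination)
    pairs, or any hashable edge label, to flow volume."""
    total = sum(flows.values())
    return -sum((f / total) * math.log2(f / total)
                for f in flows.values() if f > 0)
```

For example, four equally loaded links give 2.0 bits, while concentrating all volume on one link gives 0.0, so the measure grows with how evenly (and widely) flow is spread across the network.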

Zhang, Jihui; Xu, Junqin


Quantifying the fluvial autogenic processes: Tank Experiments  

NASA Astrophysics Data System (ADS)

The evolution of deltaic shorelines has long been explained by allogenic changes in the environment such as changes in tectonics, base level, and sediment supply. Recently, the importance of autogenic cyclicity has been recognized in concert with allogenic forcing. Decoupling autogenic variability from allogenic signatures is essential in order to understand depositional systems and the stratigraphic record; however, autogenic behavior in sedimentary environments is not understood well enough to separate it from allogenic factors. Data drawn from model experiments that isolate the autogenic variability from allogenic forcing are the key to understanding and predicting autogenic responses in fluvial and deltaic systems. Here, three experiments using a constant water discharge (Qw) with a varying sediment flux (Qs) were conducted to examine the autogenic variability in a fluviodeltaic system. The experimental basin has dimensions of 1 m x 1 m; the sediment/water mixture delivered into it contained 50% fine sand (0.1 mm) and 50% coarse sand (2 mm) by volume. The delta was built over a flat, non-erodible surface into a standing body of water with a constant base level and no subsidence. The autogenic responses of the fluvial and deltaic systems were captured by time-lapse images, and the shoreline position was mapped to quantify the autogenic processes. The autogenic response to varying sediment supply under constant water supply includes changes in 1) the slope of the fluvial surface, 2) the frequency of autogenic storage and release events, and 3) shoreline roughness. Interestingly, the data show a non-linear relationship between the frequency of autogenic cyclicity and the ratio of sediment supply to water discharge.
The successive increase in the sediment supply, and thus in the ratio of Qs to Qw, caused the slope of the fluvial surface to steepen and the frequency of autogenic sediment storage and release events to rise, but non-linearly: the autogenic frequency does not double when the sediment flux doubles. Since the experimental data suggest that the frequency of autogenic variability is also related to the slope of the fluvial surface, an increase in the fluvial slope would force the fluvial system to experience larger autogenic processes over a longer period of time. These three experiments are part of a larger matrix of nine flume experiments exploring variations in sediment supply, water discharge, and Qs/Qw to better understand fluvial autogenic processes.

Powell, E. J.; Kim, W.; Muto, T.



In favour of the definition "adolescents with idiopathic scoliosis": juvenile and adolescent idiopathic scoliosis braced after ten years of age, do not show different end results. SOSORT award winner 2014  

PubMed Central

Background The most important factor discriminating juvenile (JIS) from adolescent idiopathic scoliosis (AIS) is the risk of deformity progression. Brace treatment can change natural history, even when the risk of progression is high. The aim of this study was to compare the end-of-growth results of JIS subjects, treated after 10 years of age, with the final results of AIS. Methods Design: prospective observational controlled cohort study nested in a prospective database. Setting: outpatient tertiary referral clinic specialized in conservative treatment of spinal deformities. Inclusion criteria: idiopathic scoliosis; European Risser 0-2; 25 to 45 degrees Cobb; start of treatment at age 10 years or more; never treated before. Exclusion criteria: secondary scoliosis, neurological etiology, prior treatment for scoliosis (brace or surgery). Groups: 27 patients met the inclusion criteria for the AJIS group (juvenile idiopathic scoliosis treated in adolescence), demonstrated by an x-ray before 10 years of age and treatment starting after 10 years of age. The AIS group included 45 adolescents with a diagnostic x-ray made after the threshold of age 10 years. Results at the end of growth were analysed; a threshold of 5 Cobb degrees was used to define worsened, improved and stabilized curves. Statistics: mean and SD were used for descriptive statistics of clinical and radiographic changes. Relative risk of failure (RR), chi-square and t-tests of all data were calculated to find differences between the two groups. 95% confidence intervals (CI) of RR and of radiographic changes were calculated. Results We did not find any significant differences in Cobb angle between groups at baseline or at the end of treatment. The only difference was in the number of patients who progressed above 45 degrees, found in the JIS group. The RR of progression of AJIS versus AIS was 1.35 (95% CI 0.57-3.17), which was not statistically significant (p = 0.5338).
Conclusion There are no significant differences in the end-of-growth results of AIS and JIS treated in adolescence in full accordance with the SRS and SOSORT criteria. Brace efficacy can neutralize the risk of progression. PMID:25031608
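The RR statistic reported above can be reproduced in outline with the standard log-normal approximation; the event counts below are hypothetical, chosen only to exercise the formula, not the study's actual failure counts:

```python
import math

def relative_risk(events_a, n_a, events_b, n_b):
    """Relative risk of failure in group A versus group B, with a 95% CI
    from the log-normal approximation: SE of ln(RR) is
    sqrt(1/a - 1/n_a + 1/b - 1/n_b).  Counts here are hypothetical."""
    rr = (events_a / n_a) / (events_b / n_b)
    se_log = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = rr * math.exp(-1.96 * se_log)
    hi = rr * math.exp(1.96 * se_log)
    return rr, lo, hi
```

A CI that straddles 1.0, as in the reported 0.57-3.17, is exactly what "not statistically significant" means for a relative risk.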



A Generalizable Methodology for Quantifying User Satisfaction  

NASA Astrophysics Data System (ADS)

Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
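The survival-analysis core of such a methodology can be sketched with a bare Kaplan-Meier estimator over session times (toy data; "censored" marks sessions still open when measurement ended, which is why plain averages of session times would be biased):

```python
def kaplan_meier(times, event_observed):
    """Kaplan-Meier estimate of the session-time survival function.

    `times` are session durations; `event_observed` is False for
    censored sessions (still running when measurement stopped).
    Returns [(t, S(t))] at each time with at least one observed end.
    """
    data = sorted(zip(times, event_observed))
    at_risk = len(data)
    s = 1.0
    curve = []
    for t in sorted(set(times)):
        ends = sum(1 for tt, e in data if tt == t and e)
        leaving = sum(1 for tt, _ in data if tt == t)
        if ends:
            s *= 1.0 - ends / at_risk
            curve.append((t, s))
        at_risk -= leaving
    return curve
```

Fitting parametric hazards to such curves, and regressing them on QoS factors, is the kind of model-development step the paper describes.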

Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung


Computed tomography to quantify tooth abrasion  

NASA Astrophysics Data System (ADS)

Cone-beam computed tomography, also termed digital volume tomography, has become a standard technique in dentistry, allowing for fast 3D jaw imaging including denture at moderate spatial resolution. More detailed X-ray images of restricted volumes for post-mortem studies in dental anthropology are obtained by means of micro computed tomography. The present study evaluates the impact of the pipe smoking wear on teeth morphology comparing the abraded tooth with its contra-lateral counterpart. A set of 60 teeth, loose or anchored in the jaw, from 12 dentitions have been analyzed. After the two contra-lateral teeth were scanned, one dataset has been mirrored before the two datasets were registered using affine and rigid registration algorithms. Rigid registration provides three translational and three rotational parameters to maximize the overlap of two rigid bodies. For the affine registration, three scaling factors are incorporated. Within the present investigation, affine and rigid registrations yield comparable values. The restriction to the six parameters of the rigid registration is not a limitation. The differences in size and shape between the tooth and its contra-lateral counterpart generally exhibit only a few percent in the non-abraded volume, validating that the contralateral tooth is a reasonable approximation to quantify, for example, the volume loss as the result of long-term clay pipe smoking. Therefore, this approach allows quantifying the impact of the pipe abrasion on the internal tooth morphology including root canal, dentin, and enamel volumes.
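Rigid registration with three rotational and three translational parameters can be sketched with the Kabsch algorithm over matched point sets; this is a generic least-squares formulation, not the registration software actually used in the study:

```python
import numpy as np

def rigid_register(P, Q):
    """Least-squares rigid registration (Kabsch algorithm): find the
    rotation R and translation t minimizing sum ||R p_i + t - q_i||^2
    for matched Nx3 point sets P and Q."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)               # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

The reflection guard matters here: because one tooth dataset is mirrored before registration, an unconstrained fit could otherwise return an improper rotation.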

Kofmehl, Lukas; Schulz, Georg; Deyhle, Hans; Filippi, Andreas; Hotz, Gerhard; Berndt-Dagassan, Dorothea; Kramis, Simon; Beckmann, Felix; Müller, Bert



Stimfit: quantifying electrophysiological data with Python  

PubMed Central

Intracellular electrophysiological recordings provide crucial insights into elementary neuronal signals such as action potentials and synaptic currents. Analyzing and interpreting these signals is essential for a quantitative understanding of neuronal information processing, and requires both fast data visualization and ready access to complex analysis routines. To achieve this goal, we have developed Stimfit, a free software package for cellular neurophysiology with a Python scripting interface and a built-in Python shell. The program supports most standard file formats for cellular neurophysiology and other biomedical signals through the Biosig library. To quantify and interpret the activity of single neurons and communication between neurons, the program includes algorithms to characterize the kinetics of presynaptic action potentials and postsynaptic currents, estimate latencies between pre- and postsynaptic events, and detect spontaneously occurring events. We validate and benchmark these algorithms, give estimation errors, and provide sample use cases, showing that Stimfit represents an efficient, accessible and extensible way to accurately analyze and interpret neuronal signals. PMID:24600389

Guzman, Segundo J.; Schlögl, Alois; Schmidt-Hieber, Christoph



Quantifying Uncertainty in Epidemiological Models  

SciTech Connect

Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs) to describe the dynamics of how a disease may spread within a population and enable the rational design of strategies for intervention that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in trusting the outcomes/predictions solely based on such models. Hence, there is a need to develop approaches such that mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions, when such retrospective analysis cannot be performed. In this paper, we outline an approach to develop uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.
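At its core, statistical model checking reduces to Monte-Carlo estimation of the probability that sampled trajectories satisfy a specification. A toy chain-binomial SIR model with invented parameters and a simple "peak infections stay below a threshold" property (the paper formalizes such properties in spatio-temporal logic and adds rigorous confidence bounds):

```python
import random

def sir_peak(rng, n=100, i0=2, beta=0.3, gamma=0.1, steps=120):
    """One stochastic chain-binomial SIR trajectory (hypothetical
    parameters); returns the peak number simultaneously infected."""
    s, i = n - i0, i0
    peak = i
    for _ in range(steps):
        if i == 0:
            break  # epidemic has died out
        new_inf = sum(1 for _ in range(s) if rng.random() < beta * i / n)
        new_rec = sum(1 for _ in range(i) if rng.random() < gamma)
        s -= new_inf
        i += new_inf - new_rec
        peak = max(peak, i)
    return peak

def p_satisfies(threshold, runs=100, seed=1):
    """Monte-Carlo estimate of P(peak infections < threshold):
    the sampling step of statistical model checking."""
    rng = random.Random(seed)
    return sum(1 for _ in range(runs) if sir_peak(rng) < threshold) / runs
```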

Ramanathan, Arvind [ORNL]; Jha, Sumit Kumar [University of Central Florida]



Quantifying uncertainty from material inhomogeneity.  

SciTech Connect

Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femtosecond laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments.
In both the experiments and simulations, the deformation behavior was found to depend strongly on the character of the nearby microstructure.

Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee



Quantifying the isotopic continental effect  

NASA Astrophysics Data System (ADS)

Since the establishment of the IAEA-WMO precipitation-monitoring network in 1961, it has been observed that isotope ratios in precipitation (δ2H and δ18O) generally decrease from coastal to inland locations, an observation described as the 'continental effect.' While discussed frequently in the literature, there have been few attempts to quantify the variables controlling this effect, despite the fact that isotopic gradients over continents can vary by orders of magnitude. In a number of studies, traditional Rayleigh fractionation has proven inadequate in describing the global variability of isotopic gradients due to its simplified treatment of moisture transport and its lack of moisture recycling processes. In this study, we use a one-dimensional idealized model of water vapor transport along a storm track to investigate the dominant variables controlling isotopic gradients in precipitation across terrestrial environments. We find that the sensitivity of these gradients to progressive rainout is controlled by a combination of the amount of evapotranspiration and the ratio of transport by advection to transport by eddy diffusion, with these variables becoming increasingly important with decreasing length scales of specific humidity. A comparison of modeled gradients with global precipitation isotope data indicates that these variables can account for the majority of variability in observed isotopic gradients between coastal and inland locations. Furthermore, the dependence of the 'continental effect' on moisture recycling allows for the quantification of evapotranspiration fluxes from measured isotopic gradients, with implications for both paleoclimate reconstructions and large-scale monitoring efforts in the context of global warming and a changing hydrologic cycle.
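The Rayleigh baseline that the abstract calls inadequate is one line of algebra; a sketch, with a representative equilibrium fractionation factor (the 1.0094 value near 25 °C is an illustrative assumption, not a number from the paper):

```python
def rayleigh_delta18o(delta0, f, alpha=1.0094):
    """delta-18O (per mil) of the vapor remaining after closed-system
    Rayleigh distillation, with fraction f of the initial vapor left:

        delta = (delta0 + 1000) * f**(alpha - 1) - 1000

    alpha ~ 1.0094 is an illustrative liquid-vapor equilibrium
    fractionation factor near 25 C.  This simple model omits exactly the
    transport and moisture-recycling terms the paper adds."""
    return (delta0 + 1000.0) * f ** (alpha - 1.0) - 1000.0
```

Progressive rainout (decreasing f) drives the remaining vapor, and hence downwind precipitation, isotopically lighter: the continental effect in its most stripped-down form.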

Winnick, Matthew J.; Chamberlain, C. Page; Caves, Jeremy K.; Welker, Jeffrey M.



2013 Goat Shows

E-print Network

[Flattened schedule table for 2013 county goat shows: show dates, names, entry deadlines, eligibility, weigh-in and show times, and contacts; column structure not recoverable.]

Grissino-Mayer, Henri D.


Quantifying tetrodotoxin levels in the California newt using a non-destructive sampling method.  


Toxic or noxious substances often serve as a means of chemical defense for numerous taxa. However, such compounds may also facilitate ecological or evolutionary processes. The neurotoxin tetrodotoxin (TTX), which is found in newts of the genus Taricha, acts as a selection pressure upon predatory garter snakes, serves as a chemical cue that elicits antipredator behavior in conspecific larvae, and may also affect macroinvertebrate foraging behavior. To understand selection patterns and how potential variation might affect ecological and evolutionary processes, it is necessary to quantify TTX levels within individuals and populations. To do so has often required that animals be destructively sampled or removed from breeding habitats and brought into the laboratory. Here we demonstrate a non-destructive method of sampling adult Taricha that obviates the need to capture and collect individuals. We also show that embryos from oviposited California newt (Taricha torosa) egg masses can be individually sampled and TTX quantified from embryos. We employed three different extraction techniques to isolate TTX. Using a custom-fabricated high performance liquid chromatography (HPLC) system we quantified recovery of TTX. We found that a newly developed micro-extraction technique significantly improved recovery compared to previously used methods. Results also indicate our improvements to the HPLC method have high repeatability and increased sensitivity, with a detection limit of 48 pg (0.15 pmol) TTX. The quantified amounts of TTX in adult newts suggest fine geographic variation in toxin levels between sampling localities isolated by as little as 3 km. PMID:24467994

Bucciarelli, Gary M; Li, Amy; Kats, Lee B; Green, David B



Quantifying recrystallization by electron backscatter diffraction.  


The use of high-resolution electron backscatter diffraction in the scanning electron microscope to quantify the volume fraction of recrystallization and the recrystallization kinetics is discussed. Monitoring the changes of high-angle grain boundary (HAGB) content during annealing is shown to be a reliable method of determining the volume fraction of recrystallization during discontinuous recrystallization, where a large increase in the percentage of high-angle boundaries occurs during annealing. The results are shown to be consistent with the standard methods of studying recrystallization, such as quantitative metallography and hardness testing. Application of the method to a highly deformed material has shown that it can be used to identify the transition from discontinuous to continuous recrystallization during which there is no significant change in the percentage of HAGB during annealing. PMID:15009691
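The HAGB-fraction measure is easy to sketch directly; a segment-count approximation over measured misorientation angles is used here, whereas EBSD software typically reports length-weighted fractions:

```python
def hagb_fraction(misorientations_deg, cutoff=15.0):
    """Fraction of boundary segments whose misorientation exceeds the
    conventional 15-degree high-angle cutoff.  Tracking how this
    fraction rises during annealing is the recrystallization measure
    discussed in the paper (discontinuous case)."""
    high = sum(1 for m in misorientations_deg if m > cutoff)
    return high / len(misorientations_deg)
```

Comparing the fraction before and after an anneal distinguishes the two regimes: a large jump signals discontinuous recrystallization, while a near-constant fraction is the continuous-recrystallization signature noted above.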

Jazaeri, H; Humphreys, F J



Quantifying distraction and interruption in urological surgery  

PubMed Central

Background To enhance safety in surgery, it is necessary to develop a variety of tools for measuring and evaluating the system of work. One important consideration for safety in any high-risk work is the frequency and effect of distraction and interruption. Aim To quantify distraction and interruption to the sterile surgical team in urology. Methods Observation of the behaviour of the surgical team and their task activity determined the distractions and interruptions recorded. Using an ordinal scale, an observer rated each salient distraction or interruption observed in relation to the team's involvement. Results The frequency of events and their attached ratings were high, deriving from varying degrees of equipment, procedure and environment problems, telephones, bleepers and conversations. Discussion With further refinement and testing, this method may be useful for distinguishing ordinal levels of work interference in surgery and helpful in raising awareness of its origin for postoperative surgical team debriefing. PMID:17403761

Healey, A N; Primus, C P; Koutantji, M



New Drug Shows Mixed Results Against Early Alzheimer's  




Soccer Tournament ELI Talent Show  

E-print Network

Highlights Soccer Tournament ELI Talent Show Notes from your Teachers Notes from the Office very quickly! ELI Talent Show As you probably already know, the ELI is going to have its second annual Talent Show. The talent show is open to ELI students, faculty, staff, and LAs. Acts can include

Pilyugin, Sergei S.


Quantifying and Improving DNS Availability Casey Deccio  

E-print Network

Quantifying and Improving DNS Availability. By Casey Deccio, B.S. (Brigham Young University) 2002. Abstract: The Domain Name System (DNS) is one of the components most critical to Internet functionality. Nearly all

California at Davis, University of


Processing queries with quantifiers a horticultural approach  

Microsoft Academic Search

Most research on query processing has focussed on quantifier-free conjunctive queries. Existing techniques for processing queries with quantifiers either compile the query into a nested loop program or use variants of Codd's reduction from the Relational Calculus to the Relational Algebra. In this paper we propose an alternative technique that uses an algebra of graft and prune operations on trees.

Umeshwar Dayal



ELI Talent Show Final Exams  

E-print Network

Highlights ELI Talent Show Final Exams Scholarship Nominees Graduate Admissions Workshop Reminders from the Office Manners, Cultures, & Grammar TheELIWeekly ELI Talent Show It's going to be a blast! Come one, come all! The 2nd Annual ELI Talent Show will be on Tuesday, April 15th

Pilyugin, Sergei S.


Using effect size to quantify plantar pressure asymmetry of gait of nondisabled adults and patients with hemiparesis.  


In the literature, numerous statistical analyses are used to quantify asymmetry in gait. This study tested the effect size (ES) statistic for quantifying asymmetry in nondisabled and pathological populations. The plantar pressure peaks on eight footprint locations of 27 nondisabled subjects and 18 patients with hemiparesis were bilaterally compared. Asymmetry quantifications were performed with ES and standard statistical tests (index of asymmetry, symmetry index, and ratio index). The results show an advantage in using ES to quantify asymmetry when confidence limits are also calculated. Conversely, traditional asymmetry indexes immediately implied asymmetry without statistical basis. These findings should be considered when one is attempting to diagnose pathological walking patterns or guide rehabilitation processes. PMID:18247231
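The ES-with-confidence-limits approach can be sketched as Cohen's d with an approximate 95% CI; the pressure values in the test are hypothetical, and the normal-approximation CI is one common choice among several:

```python
import math

def cohens_d_ci(left, right):
    """Cohen's d effect size for bilateral (left vs right) plantar
    pressure peaks, with an approximate 95% CI from the usual
    large-sample standard error of d.  Interpreting asymmetry only when
    the CI excludes 0 is the statistical basis the study argues the
    traditional asymmetry indexes lack."""
    n1, n2 = len(left), len(right)
    m1, m2 = sum(left) / n1, sum(right) / n2
    v1 = sum((x - m1) ** 2 for x in left) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in right) / (n2 - 1)
    sp = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    se = math.sqrt((n1 + n2) / (n1 * n2) + d * d / (2 * (n1 + n2)))
    return d, d - 1.96 * se, d + 1.96 * se
```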

Potdevin, François J; Femery, Virginie G; Decatoire, Aurélien; Bosquet, Laurent; Coello, Yann; Moretto, Pierre



Rectal Swabs Are Suitable for Quantifying the Carriage Load of KPC-Producing Carbapenem-Resistant Enterobacteriaceae  

PubMed Central

It is more convenient and practical to collect rectal swabs than stool specimens to study carriage of colon pathogens. In this study, we examined the ability to use rectal swabs rather than stool specimens to quantify Klebsiella pneumoniae carbapenemase (KPC)-producing carbapenem-resistant Enterobacteriaceae (CRE). We used a quantitative real-time PCR (qPCR) assay to determine the concentration of the blaKPC gene relative to the concentration of 16S rRNA genes and a quantitative culture-based method to quantify CRE relative to total aerobic bacteria. Our results demonstrated that rectal swabs are suitable for quantifying the concentration of KPC-producing CRE and that qPCR showed higher correlation between rectal swabs and stool specimens than the culture-based method. PMID:23295937
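The relative-concentration readout from such a qPCR assay can be sketched with a generic delta-Ct calculation; this assumes near-100% amplification efficiency for both assays and is not the authors' exact calibration:

```python
def relative_carriage_load(ct_blakpc, ct_16s):
    """Abundance of the blaKPC gene relative to total 16S rRNA genes from
    qPCR threshold cycles, under the standard delta-Ct model with ~100%
    amplification efficiency: load = 2 ** -(Ct_target - Ct_reference).
    Lower Ct means earlier detection, i.e. more template."""
    return 2.0 ** -(ct_blakpc - ct_16s)
```

For example, a blaKPC Ct five cycles later than the 16S Ct implies the target is roughly 2^-5, about 3%, of the total bacterial signal.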

Lerner, A.; Romano, J.; Chmelnitsky, I.; Navon-Venezia, S.; Edgar, R.
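The relative quantification step can be illustrated with the standard 2^-ΔCt transform; the function name and the assumption that blaKPC cycle thresholds are normalized against 16S rRNA this way are illustrative, not taken from the study:

```python
def relative_load(ct_target, ct_reference):
    """Relative concentration of a target gene (e.g. blaKPC) versus a
    reference (e.g. 16S rRNA) from qPCR cycle thresholds, using the
    common 2^-dCt transform. A sketch under stated assumptions; the
    study's own normalization may differ."""
    delta_ct = ct_target - ct_reference
    # Each extra cycle to threshold corresponds to a twofold lower load
    return 2.0 ** (-delta_ct)
```

Comparing this ratio between a rectal swab and the paired stool specimen is one way to express the correlation the abstract reports.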



Quantifying diet for nutrigenomic studies  

Technology Transfer Automated Retrieval System (TEKTRAN)

The field of nutrigenomics shows tremendous promise for improved understanding of the effects of dietary intake on health. The knowledge that metabolic pathways may be altered in individuals with genetic variants in the presence of certain dietary exposures offers great potential for personalized nu...


Quantifying immersion in virtual reality  

Microsoft Academic Search

Virtual Reality (VR) has generated much excitement but little formal proof that it is useful. Because VR interfaces are difficult and expensive to build, the computer graphics community needs to be able to predict which applications will benefit from VR. In this paper, we show that users with a VR interface complete a search task faster than users with

Randy F. Pausch; Dennis Proffitt; George H. Williams



Quantifying Climatological Ranges and Anomalies for Pacific Coral Reef Ecosystems  

PubMed Central

Coral reef ecosystems are exposed to a range of environmental forcings that vary on daily to decadal time scales and across spatial scales spanning from reefs to archipelagos. Environmental variability is a major determinant of reef ecosystem structure and function, including coral reef extent and growth rates, and the abundance, diversity, and morphology of reef organisms. Proper characterization of environmental forcings on coral reef ecosystems is critical if we are to understand the dynamics and implications of abiotic-biotic interactions on reef ecosystems. This study combines high-resolution bathymetric information with remotely sensed sea surface temperature, chlorophyll-a and irradiance data, and modeled wave data to quantify environmental forcings on coral reefs. We present a methodological approach to develop spatially constrained, island- and atoll-scale metrics that quantify climatological range limits and anomalous environmental forcings across U.S. Pacific coral reef ecosystems. Our results indicate considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls, with emergent spatial patterns specific to each environmental forcing. For example, wave energy was greatest at northern latitudes and generally decreased with latitude. In contrast, chlorophyll-a was greatest at reef ecosystems proximate to the equator and northern-most locations, showing little synchrony with latitude. In addition, we find that the reef ecosystems with the highest chlorophyll-a concentrations (Jarvis, Howland, Baker, Palmyra and Kingman) are each uninhabited and are characterized by high hard coral cover and large numbers of predatory fishes. Finally, we find that scaling environmental data to the spatial footprint of individual islands and atolls is more likely to capture local environmental forcings, as chlorophyll-a concentrations decreased at relatively short distances (>7 km) from 85% of our study locations. 
These metrics will help identify reef ecosystems most exposed to environmental stress as well as systems that may be more resistant or resilient to future climate change. PMID:23637939

Gove, Jamison M.; Williams, Gareth J.; McManus, Margaret A.; Heron, Scott F.; Sandin, Stuart A.; Vetter, Oliver J.; Foley, David G.



Quantifying Effective Flow and Transport Properties in Heterogeneous Porous Media  

NASA Astrophysics Data System (ADS)

Spatial heterogeneity, the spatial variation in physical and chemical properties, exists at almost all scales and is an intrinsic property of natural porous media. It is important to understand and quantify how small-scale spatial variations determine large-scale "effective" properties in order to predict fluid flow and transport behavior in the natural subsurface. In this work, we aim to systematically understand and quantify the role of the spatial distribution of sand grains of different sizes in determining effective dispersivity and effective permeability using quasi-2D flow-cell experiments and numerical simulations. Two-dimensional flow cells (20 cm by 20 cm) were packed with the same total amount of fine and coarse sand but with different spatial patterns. The homogeneous case completely mixed the fine and coarse sands. The four-zone case distributed the fine sand in four identical square zones within the coarse-sand matrix. The one-square case placed all the fine sand in a single square block. With the one-square pattern, two more experiments were designed to examine the effect of grain size contrast on effective permeability and dispersivity. Effective permeability was calculated based on both experimental and modeling results. Tracer tests were run for all cases. Advection-dispersion equations were solved to match breakthrough data and to obtain average dispersivity. We also used Continuous Time Random Walk (CTRW) to quantify the non-Fickian transport behavior for each case. For the three cases with the same grain size contrast, the results show that the effective permeability does not differ significantly. The effective dispersion coefficient is the smallest for the homogeneous case (0.05 cm) and largest for the four-zone case (0.27 cm). With the same pattern, the dispersivity value was largest with the highest size contrast (0.28 cm), a factor of 2 higher than with the lowest contrast. 
The non-Fickian behavior was quantified by the β value within the CTRW framework. Fickian transport results in β values larger than 2, while deviation from 2 indicates the extent of non-Fickian behavior. Among the three cases with the same grain size contrast, the β value is closest to 2 in the homogeneous case (1.95), while smallest in the four-zone case (1.89). In the one-square case, with the highest size contrast, the β value was 1.57, indicating increasing extent of non-Fickian behavior with higher size contrast. This study is one step toward understanding how small-scale spatial variation in physical properties affects large-scale flow and transport behavior. This step is important in predicting subsurface transport processes that are relevant to earth sciences, environmental engineering, and petroleum engineering.

Heidari, P.; Li, L.
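A common fitting model for such breakthrough data is the one-dimensional advection-dispersion solution for continuous injection; the sketch below keeps only the leading erfc term and assumes the dispersion coefficient enters as D = αv (dispersivity α times velocity v), which is a simplification of the fitting actually performed in the study:

```python
import math

def ade_breakthrough(x, t, v, D):
    """Normalized concentration C/C0 at distance x and time t for continuous
    injection into a 1D semi-infinite column (advection-dispersion equation),
    leading erfc term only. Illustrative sketch, not the authors' code."""
    return 0.5 * math.erfc((x - v * t) / (2.0 * math.sqrt(D * t)))
```

Fitting D (hence α) is then a matter of minimizing the misfit between this curve and the measured breakthrough concentrations; the CTRW analysis replaces this Fickian model when β departs from 2.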



Shakespeare and other English Renaissance authors as characterized by Information Theory complexity quantifiers  

NASA Astrophysics Data System (ADS)

We introduce novel Information Theory quantifiers in a computational linguistic study that involves a large corpus of English Renaissance literature. The 185 texts studied (136 plays and 49 poems in total), with first editions that range from 1580 to 1640, form a representative set of its period. Our data set includes 30 texts unquestionably attributed to Shakespeare; in addition we also included A Lover's Complaint, a poem which generally appears in Shakespeare's collected editions but whose authorship is currently in dispute. Our statistical complexity quantifiers combine the power of the Jensen-Shannon divergence with the entropy variations as computed from a probability distribution function of the observed word use frequencies. Our results show, among other things, that for a given entropy poems display higher complexity than plays, that Shakespeare's work falls into two distinct clusters in entropy, and that his work is remarkable for its homogeneity and for its closeness to overall means.

Rosso, Osvaldo A.; Craig, Hugh; Moscato, Pablo
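One building block of such quantifiers, the Jensen-Shannon divergence between the word-frequency distributions of two texts, can be sketched as follows (a simplified illustration; the paper combines this divergence with entropy in a statistical complexity measure):

```python
import math
from collections import Counter

def jensen_shannon(text_a, text_b):
    """Jensen-Shannon divergence (base 2, so bounded by [0, 1]) between
    the word-frequency distributions of two texts. Minimal sketch of one
    ingredient of the complexity quantifiers described above."""
    pa, pb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    na, nb = sum(pa.values()), sum(pb.values())
    vocab = set(pa) | set(pb)
    P = {w: pa[w] / na for w in vocab}
    Q = {w: pb[w] / nb for w in vocab}
    M = {w: 0.5 * (P[w] + Q[w]) for w in vocab}  # mixture distribution

    def kl(p, m):
        # Kullback-Leibler divergence, skipping zero-probability words
        return sum(p[w] * math.log2(p[w] / m[w]) for w in vocab if p[w] > 0)

    return 0.5 * kl(P, M) + 0.5 * kl(Q, M)
```

Identical texts give 0, texts with disjoint vocabularies give 1, and authorship comparisons live in between.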



Quantifying and Predicting Reactive Transport  

SciTech Connect

This project was led by Dr. Jiamin Wan at Lawrence Berkeley National Laboratory. Peter Burns provided expertise in uranium mineralogy and in identification of uranium minerals in test materials. Dr. Wan conducted column tests regarding uranium transport at LBNL, and samples of the resulting columns were sent to Dr. Burns for analysis. Samples were analyzed for uranium mineralogy by X-ray powder diffraction and by scanning electron microscopy, and results were provided to Dr. Wan for inclusion in the modeling effort. Full details of the project can be found in Dr. Wan's final reports for the associated effort at LBNL.

Peter C. Burns, Department of Civil Engineering and Geological Sciences, University of Notre Dame



Planning a Successful Tech Show  

ERIC Educational Resources Information Center

Tech shows are a great way to introduce prospective students, parents, and local business and industry to a technology and engineering or career and technical education program. In addition to showcasing instructional programs, a tech show allows students to demonstrate their professionalism and skills, practice public presentations, and interact

Nikirk, Martin



Quantifying Coral Reef Ecosystem Services  

EPA Science Inventory

Coral reefs have been declining during the last four decades as a result of both local and global anthropogenic stresses. Numerous research efforts to elucidate the nature, causes, magnitude, and potential remedies for the decline have led to the widely held belief that the recov...


Quantifying Uncertainties in Land-Surface Microwave Emissivity Retrievals  

NASA Technical Reports Server (NTRS)

Uncertainties in the retrievals of microwave land-surface emissivities are quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including the Special Sensor Microwave Imager, the Tropical Rainfall Measuring Mission Microwave Imager, and the Advanced Microwave Scanning Radiometer for Earth Observing System, are studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land-surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally, these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1%-4% (3-12 K) over desert and 1%-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5%-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.

Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko
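The systematic/random error decomposition described above can be sketched as follows, under the abstract's assumption that the true emissivity is constant: each dataset's mean offset from the all-dataset mean approximates its systematic error, and its within-dataset scatter approximates its random error. This is an illustration of the comparison idea, not the authors' pipeline:

```python
import statistics

def emissivity_errors(retrievals):
    """Given several retrieval time series for the same site
    (dict: dataset name -> list of emissivities), return per-dataset
    systematic offsets and random errors. Sketch under the constant-truth
    assumption stated in the abstract."""
    means = {k: statistics.fmean(v) for k, v in retrievals.items()}
    grand = statistics.fmean(means.values())
    # Systematic error: offset of each dataset mean from the grand mean
    systematic = {k: m - grand for k, m in means.items()}
    # Random error: within-dataset standard deviation
    random_err = {k: statistics.stdev(v) for k, v in retrievals.items()}
    return systematic, random_err
```

With real retrievals the offsets would be converted to percent and to brightness-temperature equivalents, as in the 1%-4% (3-12 K) figures quoted above.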



Quantified Energy Dissipation Rates in the Terrestrial Bow Shock  

NASA Astrophysics Data System (ADS)

We present the first observationally quantified measure of the energy dissipation rate due to wave-particle interactions in the transition region of the Earth's collisionless bow shock using data from the THEMIS spacecraft. Each of more than 11 bow shock crossings examined with available wave burst data showed both low frequency (<10 Hz) magnetosonic-whistler waves and high frequency (≥10 Hz) electromagnetic and electrostatic waves throughout the entire transition region and into the magnetosheath. The high frequency waves were identified as combinations of ion-acoustic waves, electron cyclotron drift instability driven waves, electrostatic solitary waves, and electromagnetic whistler mode waves. These waves were found to have: (1) amplitudes capable of exceeding δB ~ 10 nT and δE ~ 300 mV/m, though more typical values were δB ~ 0.1-1.0 nT and δE ~ 10-50 mV/m; (2) energy fluxes in excess of 2000 μW m-2; (3) resistivities > 9000 Ω m; and (4) energy dissipation rates > 3 μW m-3. The high frequency (>10 Hz) electromagnetic waves produce such excessive energy dissipation that they need only be, at times, < 0.01% efficient to produce the observed increase in entropy across the shocks necessary to balance the nonlinear wave steepening that produces the shocks. These results show that wave-particle interactions have the capacity to regulate the global structure and dominate the energy dissipation of collisionless shocks.

Wilson, L. B., III; Sibeck, D. G.; Breneman, A. W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.



Quantifying Uncertainties in Land Surface Microwave Emissivity Retrievals  

NASA Technical Reports Server (NTRS)

Uncertainties in the retrievals of microwave land surface emissivities were quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including SSM/I, TMI and AMSR-E, were studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1-4% (3-12 K) over desert and 1-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.

Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko



Methods for quantifying Staphylococcus aureus in indoor air.  


Staphylococcus aureus has been detected in indoor air and linked to human infection. Quantifying S. aureus by efficient sampling methods followed by appropriate sample storage treatments is essential to characterize the exposure risk of humans. This laboratory study evaluated the effects of sampler type (all-glass impinger (AGI-30), BioSampler, and Andersen one-stage sampler (Andersen 1-STG)), collection fluid (deionized water (DW), phosphate-buffered saline (PBS), and Tween mixture (TM)), and sampling time (3-60 min) on cell recovery. Effects of storage settings on bacterial concentration were also assessed over 48 h. Results showed BioSampler performed better than Andersen 1-STG and AGI-30 (P<0.05) and TM was superior to PBS and DW (P<0.05). An increase in sampling time negatively affected the recoveries of cells in PBS of BioSampler and AGI-30 (P<0.05), whereas cell recoveries in TM were increased at sampling of 6-15 min compared with 3 min. Concentrations of cells collected in PBS were decreased with storage time at 4 and 23°C (P<0.05), while cells stored in TM showed stable concentrations at 4°C (P>0.05) and increased cell counts at 23°C (P<0.05). Overall, sampling by BioSampler with TM followed by sample transportation and storage at 4°C is recommended. PMID:24773454

Chang, C-W; Wang, L-J



Quantifying tomb geometries in resistivity images using watershed algorithms  

Microsoft Academic Search

Quantifying the geometries (defined here as width, height and depth of burial) of archeological structures within resistivity models produced as a result of the regularization constraints used in most inversion algorithms is difficult, especially when structures are closely spaced. Here we apply the watershed by simulated immersion method of boundary detection to smooth 2D resistivity images generated for synthetic and

Mehrez Elwaseif; Lee Slater



Quantifying the Impact of Wind Energy on Market Coupling  

E-print Network

In a context of market coupling, we study analytically the impact of wind farm concentration and of the uncertainty resulting from the introduction of renewable energy on the procurement total cost, on the market

Le Cadre, Hélène; Didier, Mathilde

Paris-Sud XI, Université de


Quantifying Annual Aboveground Net Primary Production in the Intermountain West  

Technology Transfer Automated Retrieval System (TEKTRAN)

As part of a larger project, methods were developed to quantify current year growth on grasses, forbs, and shrubs. Annual aboveground net primary production (ANPP) data are needed for this project to calibrate results from computer simulation models and remote-sensing data. Measuring annual ANPP of ...


Quantifying capital goods for biological treatment of organic waste.  


Materials and energy used for construction of anaerobic digestion (AD) and windrow composting plants were quantified in detail. The two technologies were quantified in collaboration with consultants and producers of the parts used to construct the plants. The composting plants were quantified at capacities of 10,000 or 50,000 tonnes per year for three types of waste: garden and park waste, food waste, and sludge from wastewater treatment. The AD plant was quantified for a capacity of 80,000 tonnes per year. Concrete and steel for the tanks were the main materials for the AD plant. For the composting plants, gravel and concrete slabs for the pavement were used in large amounts. To put the quantification in perspective, environmental impact assessments (EIAs) showed that the steel used for tanks at the AD plant and the concrete slabs at the composting plants made the highest contribution to Global Warming. Compared to the operational impacts reported in the literature, the capital goods of the AD plant made an insignificant contribution of 1-2% to the total impact on Global Warming. For the composting plants, the capital goods accounted for 10-22% of the total impact on Global Warming from composting. PMID:25595291

Brogaard, Line K; Petersen, Per H; Nielsen, Peter D; Christensen, Thomas H



The interpretation of classically quantified sentences: a set-theoretic approach.  


We present a set-theoretic model of the mental representation of classically quantified sentences (All P are Q, Some P are Q, Some P are not Q, and No P are Q). We take inclusion, exclusion, and their negations to be primitive concepts. We show that although these sentences are known to have a diagrammatic expression (in the form of the Gergonne circles) that constitutes a semantic representation, these concepts can also be expressed syntactically in the form of algebraic formulas. We hypothesized that the quantified sentences have an abstract underlying representation common to the formulas and their associated sets of diagrams (models). We derived 9 predictions (3 semantic, 2 pragmatic, and 4 mixed) regarding people's assessment of how well each of the 5 diagrams expresses the meaning of each of the quantified sentences. We report the results from 3 experiments using Gergonne's (1817) circles or an adaptation of Leibniz's (1903/1988) lines as external representations and show them to support the predictions. PMID:21702831

Politzer, Guy; Van der Henst, Jean-Baptiste; Delle Luche, Claire; Noveck, Ira A
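The truth conditions of the four classical forms reduce to set inclusion and exclusion, which can be written down directly; a minimal sketch of the set-theoretic primitives (not the authors' model of mental representation):

```python
def evaluate(quantifier, P, Q):
    """Truth conditions of the four classically quantified sentence forms
    over finite sets, with inclusion and exclusion as the primitives.
    Minimal illustration of the set-theoretic reading."""
    P, Q = set(P), set(Q)
    return {
        "All P are Q": P <= Q,          # inclusion
        "Some P are Q": bool(P & Q),    # non-empty intersection
        "Some P are not Q": bool(P - Q),  # non-empty difference
        "No P are Q": not (P & Q),      # exclusion
    }[quantifier]
```

For example, with P = {sparrows} and Q = {birds}, "All P are Q" holds exactly when P is a subset of Q, which is the inclusion primitive the abstract refers to.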



Quantifying capital goods for waste incineration.  


Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000-240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used amounting to 19,000-26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000-5000 MW h. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7-14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2-3% with respect to kg CO2 per tonne of waste combusted. PMID:23561797

Brogaard, L K; Riber, C; Christensen, T H
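The 7-14 kg CO2 per tonne figure above is an amortization of construction emissions over all waste treated during the plant lifetime; the sketch below uses hypothetical numbers purely for illustration:

```python
def capital_co2_per_tonne(capital_co2_kg, annual_tonnes, lifetime_years):
    """Spread capital-goods (construction) emissions over lifetime
    throughput, giving kg CO2 per tonne of waste combusted.
    Inputs here are hypothetical, not the study's data."""
    return capital_co2_kg / (annual_tonnes * lifetime_years)
```

A plant burning 100,000 tonnes per year for 20 years with 2.0e7 kg of construction-phase CO2 would land at 10 kg CO2 per tonne, inside the range the abstract reports.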



Using Graphs to Show Connections  

NSDL National Science Digital Library

The purpose of this resource is to show how graphs of GLOBE data over time show the interconnectedness of Earth's system components at the local level. Students visit a study site, where they observe and recall their existing knowledge of air, water, soil, and living things to make a list of interconnections among the four Earth system components. They make predictions about the effects of a change in a system, inferring ways these changes affect the characteristics of other related components.

The GLOBE Program, University Corporation for Atmospheric Research (UCAR)



Quantifying the parameters of successful agricultural producers  

E-print Network

The primary purpose of the study was to quantify the parameters of successful agricultural producers. Through the use of the Financial and Risk Management (FARM) Assistance database, this study evaluated economic measures for row-crop producers...

Kaase, Gregory Herman



Interpreting Quantifier Scope Ambiguity: Evidence of Heuristic First, Algorithmic Second Processing  

PubMed Central

The present work suggests that sentence processing requires both heuristic and algorithmic processing streams, where the heuristic processing strategy precedes the algorithmic phase. This conclusion is based on three self-paced reading experiments in which the processing of two-sentence discourses was investigated, where context sentences exhibited quantifier scope ambiguity. Experiment 1 demonstrates that such sentences are processed in a shallow manner. Experiment 2 uses the same stimuli as Experiment 1 but adds questions to ensure deeper processing. Results indicate that reading times are consistent with a lexical-pragmatic interpretation of number associated with context sentences, but responses to questions are consistent with the algorithmic computation of quantifier scope. Experiment 3 shows the same pattern of results as Experiment 2, despite using stimuli with different lexical-pragmatic biases. These effects suggest that language processing can be superficial, and that deeper processing, which is sensitive to structure, only occurs if required. Implications for recent studies of quantifier scope ambiguity are discussed. PMID:24278439

Dwivedi, Veena D.




The quantified patient: a patient participatory culture.  


The Quantified Self Movement, which aims to improve various aspects of life and health through recording and reviewing daily activities and biometrics, is an emerging practice of self-monitoring that holds much promise. Now, the most underutilized resource in ambulatory health care, the patient, can participate like never before, and the patient's Quantified Self can be directly monitored and remotely accessed by health care professionals. PMID:25118077

Appelboom, Geoff; LoPresti, Melissa; Reginster, Jean-Yves; Sander Connolly, E; Dumont, Emmanuel P L



Quantifying Urban Groundwater in Environmental Field Observatories  

NASA Astrophysics Data System (ADS)

Despite the growing footprint of urban landscapes and their impacts on hydrologic and biogeochemical cycles, comprehensive field studies of urban water budgets are few. The cumulative effects of urban infrastructure (buildings, roads, culverts, storm drains, detention ponds, leaking water supply and wastewater pipe networks) on temporal and spatial patterns of groundwater stores, fluxes, and flowpaths are poorly understood. The goal of this project is to develop expertise and analytical tools for urban groundwater systems that will inform future environmental observatory planning and that can be shared with research teams working in urban environments elsewhere. The work plan for this project draws on a robust set of information resources in Maryland provided by ongoing monitoring efforts of the Baltimore Ecosystem Study (BES), USGS, and the U.S. Forest Service working together with university scientists and engineers from multiple institutions. A key concern is to bridge the gap between small-scale intensive field studies and larger-scale and longer-term hydrologic patterns using synoptic field surveys, remote sensing, numerical modeling, data mining and visualization tools. Using the urban water budget as a unifying theme, we are working toward estimating the various elements of the budget in order to quantify the influence of urban infrastructure on groundwater. 
Efforts include: (1) comparison of base flow behavior from stream gauges in a nested set of watersheds at four different spatial scales from 0.8 to 171 km2, with diverse patterns of impervious cover and urban infrastructure; (2) synoptic survey of well water levels to characterize the regional water table; (3) use of airborne thermal infrared imagery to identify locations of groundwater seepage into streams across a range of urban development patterns; (4) use of seepage transects and tracer tests to quantify the spatial pattern of groundwater fluxes to the drainage network in selected subwatersheds; (5) development of a mass balance for precipitation over a 170 km2 area on a 1x1 km2 grid using recording rain gages for bias correction of weather radar products; (6) calculation of urban evapotranspiration using the Penman-Monteith method compared with results from an eddy correlation station; (7) use of a numerical groundwater model in a screening mode to estimate depth of groundwater contributing surface water flow; and (8) data mining of public agency records of potable water and wastewater flows to estimate leakage rates and flowpaths in relation to streamflow and groundwater fluxes.

Welty, C.; Miller, A. J.; Belt, K.; Smith, J. A.; Band, L. E.; Groffman, P.; Scanlon, T.; Warner, J.; Ryan, R. J.; Yeskis, D.; McGuire, M. P.



Results, Results, Results?  

ERIC Educational Resources Information Center

Given the amount of time, energy, and money devoted to provincial achievement exams in Canada, it is disturbing that Alberta students and teachers feel so pressured and that the exams do not accurately reflect what students know. Research shows that intelligence has an (untested) emotional component. (MLH)

Wallace, Dale



Reflection- quantifying a rare good  

E-print Network

Based on a literature review, reflections in written text are rare. The reported proportions of reflection are based on different baselines, making comparisons difficult. In contrast, this research reports the proportion of occurrences of elements of reflection at the sentence level. This metric allows comparison of proportions of elements of reflection across studies. Previous studies are based on courses tailored to foster reflection; the proportions they report reflect the success of a specific instruction more than the proportion of reflection occurring in student writing in general. This study is based on a large sample of course forum posts from a virtual learning environment. In total, 1000 sentences were randomly selected and manually classified according to six elements of reflection. Five raters rated each sentence, and agreement was calculated based on a majority vote. The proportions of elements of reflection are reported and their potential application for course analytics demonstrated. The results indicate that reflections in text are indeed rare, and that there are differences among elements of reflection.

Thomas Daniel Ullmann; Fridolin Wild; Peter Scott
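The majority-vote aggregation over five raters can be sketched as follows; the label names and tie-breaking behavior are illustrative (the study codes six elements of reflection per sentence):

```python
from collections import Counter

def majority_labels(ratings):
    """Collapse each sentence's rater labels into a majority-vote label,
    then report the proportion of each label over the sample.
    Sketch of the aggregation step, with hypothetical labels."""
    labels = []
    for sentence_ratings in ratings:
        # With five raters and two labels, a strict majority always exists
        label, _count = Counter(sentence_ratings).most_common(1)[0]
        labels.append(label)
    total = len(labels)
    return {lab: n / total for lab, n in Counter(labels).items()}
```

Applied to the 1000 classified sentences, the resulting proportions are exactly the per-element figures the study reports.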


Taking the high (or low) road: a quantifier priming perspective on basic anchoring effects.  


Current explanations of basic anchoring effects, defined as the influence of an arbitrary number standard on an uncertain judgment, confound numerical values with vague quantifiers. I show that the consideration of numerical anchors may bias subsequent judgments primarily through the priming of quantifiers, rather than the numbers themselves. Study 1 varied the target of a numerical comparison judgment in a between-participants design, while holding the numerical anchor value constant. This design yielded an anchoring effect consistent with a quantifier priming hypothesis. Study 2 included a direct manipulation of vague quantifiers in the traditional anchoring paradigm. Finally, Study 3 examined the notion that specific associations between quantifiers, reflecting values on separate judgmental dimensions (i.e., the price and height of a target), can affect the direction of anchoring effects. Discussion focuses on the nature of vague quantifier priming in numerically anchored judgments. PMID:23951950

Sleeth-Keppler, David



Quantifying Wrinkle Features of Thin Membrane Structures  

NASA Technical Reports Server (NTRS)

For future micro-systems utilizing membrane based structures, quantified predictions of wrinkling behavior in terms of amplitude, angle and wavelength are needed to optimize the efficiency and integrity of such structures, as well as their associated control systems. For numerical analyses performed in the past, limitations on the accuracy of membrane distortion simulations have often been related to the assumptions made. This work demonstrates that critical assumptions include: effects of gravity, supposed initial or boundary conditions, and the type of element used to model the membrane. In this work, a 0.2 m x 0.2 m membrane is treated as a structural material with non-negligible bending stiffness. Finite element modeling is used to simulate wrinkling behavior due to a constant applied in-plane shear load. Membrane thickness, gravity effects, and initial imperfections with respect to flatness were varied in numerous nonlinear analysis cases. Significant findings include notable variations in wrinkle modes for thickness in the range of 50 microns to 1000 microns, which also depend on the presence of an applied gravity field. However, it is revealed that relationships between overall strain energy density and thickness for cases with differing initial conditions are independent of assumed initial conditions. In addition, analysis results indicate that the relationship between wrinkle amplitude scale (W/t) and structural scale (L/t) is independent of the nonlinear relationship between thickness and stiffness.

Jacobson, Mindy B.; Iwasa, Takashi; Naton, M. C.



Identifying and quantifying urban recharge: a review  

NASA Astrophysics Data System (ADS)

The sources of and pathways for groundwater recharge in urban areas are more numerous and complex than in rural environments. Buildings, roads, and other surface infrastructure combine with man-made drainage networks to change the pathways for precipitation. Some direct recharge is lost, but additional recharge can occur from storm drainage systems. Large amounts of water are imported into most cities for supply, distributed through underground pipes, and collected again in sewers or septic tanks. The leaks from these pipe networks often provide substantial recharge. Sources of recharge in urban areas are identified through piezometry, chemical signatures, and water balances. All three approaches have problems. Recharge is quantified either by individual components (direct recharge, water-mains leakage, septic tanks, etc.) or holistically. Working with individual components requires large amounts of data, much of which is uncertain and is likely to lead to large uncertainties in the final result. Recommended holistic approaches include the use of groundwater modelling and solute balances, where various types of data are integrated. Urban recharge remains an under-researched topic, with few high-quality case studies reported in the literature.

Lerner, David N.



Quantifying selection in immune receptor repertoires  

PubMed Central

The efficient recognition of pathogens by the adaptive immune system relies on the diversity of receptors displayed at the surface of immune cells. T-cell receptor diversity results from an initial random DNA editing process, called VDJ recombination, followed by functional selection of cells according to the interaction of their surface receptors with self and foreign antigenic peptides. Using high-throughput sequence data from the β-chain of human T-cell receptors, we infer factors that quantify the overall effect of selection on the elements of receptor sequence composition: the V and J gene choice and the length and amino acid composition of the variable region. We find a significant correlation between biases induced by VDJ recombination and our inferred selection factors together with a reduction of diversity during selection. Both effects suggest that natural selection acting on the recombination process has anticipated the selection pressures experienced during somatic evolution. The inferred selection factors differ little between donors or between naive and memory repertoires. The number of sequences shared between donors is well-predicted by our model, indicating a stochastic origin of such public sequences. Our approach is based on a probabilistic maximum likelihood method, which is necessary to disentangle the effects of selection from biases inherent in the recombination process. PMID:24941953

Elhanati, Yuval; Murugan, Anand; Callan, Curtis G.; Mora, Thierry; Walczak, Aleksandra M.



Map showing locations of UT's Main, Scott Park, and Health Science Campuses  

E-print Network

Map showing locations of UT's Main, Scott Park, and Health Science Campuses, with student parking areas and a building directory (including Receiving and the Minority Business Development Center).

Viola, Ronald


The OOPSLA trivia show (TOOTS)  

Microsoft Academic Search

OOPSLA has a longstanding tradition of being a forum for discussing the cutting edge of technology in a fun and participatory environment. The type of events sponsored by OOPSLA sometimes border on the unconventional. This event represents an atypical panel that conforms to the concept of a game show that is focused on questions and answers related to OOPSLA themes.

Jeff Gray; Douglas C. Schmidt



Diarrheal Disease in Show Swine  

E-print Network

Diarrheal Disease in Show Swine, by Bruce Lawhorn, Visiting Professor, Swine Practice, College of Veterinary Medicine and Biomedical Sciences, The Texas A&M University System (publication E-439, 3-07). Covers diarrheal disease in show swine, including short-term diarrhea followed by systemic or bloodstream infection...

Lawhorn, D. Bruce



Browse the archive Show summaries  

E-print Network

Show summaries from the archive, including: slide shows for physics (MathType for science and math presentations); a report that a "fast ignition" laser facility could make a significant contribution to fusion research; and fusion using lasers rather than magnets to confine the plasma, to be investigated by the National Ignition Facility (NIF).


Use of Multiple Fractal Dimensions to Quantify Airborne Particle Shape  

Microsoft Academic Search

Fractal dimension has been considered a quite useful index for quantifying irregular or self-similar shapes. However, shapes of airborne particles may not be fully self-similar and may not be well characterized by a single fractal dimension. Alternatively, they may be self-similar at different levels of scale, and multiple fractal dimensions are therefore examined in this study. The results are compared with those obtained from a single fractal dimension.

Ying Xie; Philip K. Hopke; Gary Casuccio; Brad Henderson



Quantifying Performance Losses in Source-Channel Coding  

Microsoft Academic Search

In this paper, we identify and quantify loss factors causing sub-optimal performance in joint source-channel coding. We show that both the loss due to non-Gaussian distributed channel symbols and the loss due to non-Gaussian quantization error equal the relative entropy between the actual distribution and the optimal Gaussian distribution, given an average power constraint and a mean-squared error (MSE) distortion

Fredrik Hekland; Geir E. Øien; Tor A. Ramstad


Proposal for quantifying the Dzyaloshinsky-Moriya interaction by domain walls annihilation measurement  

NASA Astrophysics Data System (ADS)

We show that the magnitude of the Dzyaloshinsky-Moriya interaction (DMI) can be determined by a field-driven domain-walls (DWs) annihilation measurement. It is predicted that the DMI induces antiparallel Néel DWs which form a metastable 360° DW when they approach each other. We find that an annihilation field, over which two DWs collide and vanish, is proportional to the magnitude of the DMI. This result suggests that the DMI can be quantified by a simple measurement of the annihilation field.

Hiramatsu, Ryo; Kim, Kab-Jin; Nakatani, Yoshinobu; Moriyama, Takahiro; Ono, Teruo



Quantifying thermodynamics of collagen thermal denaturation by second harmonic generation imaging  

NASA Astrophysics Data System (ADS)

Time-lapse second harmonic generation (SHG) microscopy was applied for the extraction of thermodynamic parameters of collagen thermal denaturation. We found that at sufficiently high temperatures, the temporal dependence of SHG intensity from the isothermal treatment of chicken dermal collagen was single-exponential and could be modeled by the Arrhenius equation. The activation energy and the frequency factor of chicken dermal collagen thermal denaturation were determined using temporal decays of SHG intensity at different temperatures. Our results show that time-lapse, high-temperature SHG imaging can be used to quantify kinetic properties of collagen thermal denaturation within a microscopic volume of 1 nl.
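The extraction described above (single-exponential decay rates at several temperatures, linearized via the Arrhenius relation ln k = ln A - Ea/(R T)) can be sketched as follows; the temperatures, rates, and fitted constants here are synthetic illustrations, not the study's chicken-dermis values:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def arrhenius_fit(T, k):
    """Return activation energy Ea (J/mol) and frequency factor A (1/s)
    from decay rates k at absolute temperatures T, via a linear fit of
    ln k against 1/T (slope = -Ea/R, intercept = ln A)."""
    slope, intercept = np.polyfit(1.0 / np.asarray(T), np.log(k), 1)
    return -slope * R, np.exp(intercept)

# Synthetic rates generated with Ea = 300 kJ/mol and A = 1e40 1/s,
# as if each k came from fitting I(t) = I0 * exp(-k t) at one temperature
T = np.array([318.0, 321.0, 324.0, 327.0])  # K
k = 1e40 * np.exp(-300e3 / (R * T))
Ea, A = arrhenius_fit(T, k)
```

Because the synthetic data are exactly Arrhenius-distributed, the fit recovers Ea = 300 kJ/mol and A = 1e40 1/s to numerical precision; real SHG decay data would scatter around the line.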

Hovhannisyan, Vladimir A.; Su, Ping-Jung; Lin, Sung-Jan; Dong, Chen-Yuan



Quantifying solute transport processes: are chemically "conservative" tracers electrically conservative?  

USGS Publications Warehouse

The concept of a nonreactive or conservative tracer, commonly invoked in investigations of solute transport, requires additional study in the context of electrical geophysical monitoring. Tracers that are commonly considered conservative may undergo reactive processes, such as ion exchange, thus changing the aqueous composition of the system. As a result, the measured electrical conductivity may reflect not only solute transport but also reactive processes. We have evaluated the impacts of ion exchange reactions, rate-limited mass transfer, and surface conduction on quantifying tracer mass, mean arrival time, and temporal variance in laboratory-scale column experiments. Numerical examples showed that (1) ion exchange can lead to resistivity-estimated tracer mass, velocity, and dispersivity that may be inaccurate; (2) mass transfer leads to an overestimate in the mobile tracer mass and an underestimate in velocity when using electrical methods; and (3) surface conductance does not notably affect estimated moments when high-concentration tracers are used, although this phenomenon may be important at low concentrations or in sediments with high and/or spatially variable cation-exchange capacity. In all cases, colocated groundwater concentration measurements are of high importance for interpreting geophysical data with respect to the controlling transport processes of interest.
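The quantities evaluated in the column experiments (tracer mass, mean arrival time, temporal variance) are standard temporal moments of a concentration breakthrough curve and can be computed roughly as follows; the Gaussian test curve is an invented illustration, not data from the study:

```python
import numpy as np

def _trapz(y, x):
    """Trapezoidal integration (written out to avoid NumPy-version
    differences around np.trapz / np.trapezoid)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

def temporal_moments(t, c):
    """Zeroth moment (proportional to tracer mass), mean arrival time,
    and temporal variance of a breakthrough curve c(t)."""
    m0 = _trapz(c, t)
    mean = _trapz(t * c, t) / m0
    var = _trapz((t - mean) ** 2 * c, t) / m0
    return m0, mean, var

# Synthetic Gaussian breakthrough curve: mean arrival 10 h, spread 2 h
t = np.linspace(0.0, 40.0, 4001)
c = np.exp(-((t - 10.0) ** 2) / (2.0 * 2.0 ** 2))
m0, mean, var = temporal_moments(t, c)
```

The paper's point is that the same moments estimated from electrical resistivity instead of concentration can be biased by ion exchange and mass transfer; the calculation itself is identical once a time series is in hand.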

Singha, Kamini; Li, Li; Day-Lewis, Frederick D.; Regberg, Aaron B.



Quantifying thiol-gold interactions towards the efficient strength control  

NASA Astrophysics Data System (ADS)

The strength of the thiol-gold interaction provides the basis for fabricating robust self-assembled monolayers for diverse applications. Investigation of the stability of thiol-gold interactions has thus become a hot topic. Here we use atomic force microscopy to quantify the stability of individual thiol-gold contacts formed both by isolated single thiols and in self-assembled monolayers on a gold surface. Our results show that an oxidized gold surface can greatly enhance the stability of gold-thiol contacts. In addition, a shift of binding modes from a coordinate bond to a covalent bond with changes in environmental pH and interaction time has been observed experimentally. Furthermore, an isolated thiol-gold contact is found to be more stable than one in a self-assembled monolayer. Our findings reveal mechanisms to control the strength of thiol-gold contacts and will help guide the design of thiol-gold contacts for a variety of practical applications.

Xue, Yurui; Li, Xun; Li, Hongbin; Zhang, Wenke



Statistical physics approach to quantifying differences in myelinated nerve fibers.  


We present a new method to quantify differences in myelinated nerve fibers. These differences range from morphologic characteristics of individual fibers to differences in macroscopic properties of collections of fibers. Our method uses statistical physics tools to improve on traditional measures, such as fiber size and packing density. As a case study, we analyze cross-sectional electron micrographs from the fornix of young and old rhesus monkeys using a semi-automatic detection algorithm to identify and characterize myelinated axons. We then apply a feature selection approach to identify the features that best distinguish between the young and old age groups, achieving a maximum accuracy of 94% when assigning samples to their age groups. This analysis shows that the best discrimination is obtained using the combination of two features: the fraction of occupied axon area and the effective local density. The latter is a modified calculation of axon density, which reflects how closely axons are packed. Our feature analysis approach can be applied to characterize differences that result from biological processes such as aging, damage from trauma or disease or developmental differences, as well as differences between anatomical regions such as the fornix and the cingulum bundle or corpus callosum. PMID:24676146

Comin, César H.; Santos, João R.; Corradini, Dario; Morrison, Will; Curme, Chester; Rosene, Douglas L.; Gabrielli, Andrea; da F. Costa, Luciano; Stanley, H. Eugene





Quantifying singlet fission in novel organic materials using nonlinear optics  

NASA Astrophysics Data System (ADS)

Singlet fission is a form of multiple exciton generation in which two triplet excitons are produced from the decay of a photoexcited singlet exciton. In a small number of organic materials, most notably pentacene, this conversion process has been shown to occur with unity quantum yield on sub-ps timescales. However, a poorly understood mechanism for fission along with strict energy and geometry requirements have so far limited the observation of this process to a few classes of organic materials, with only a subset of these (most notably the polyacenes) showing both efficient fission and long-lived triplets. Here, we utilize novel organic materials to investigate how the efficiency of the fission process depends on the coupling and the energetic driving force between chromophores in both intra- and intermolecular singlet fission materials. We demonstrate how the triplet yield can be accurately quantified using a combination of traditional transient spectroscopies and recently developed excited state saturable absorption techniques. These results allow us to gain mechanistic insight into the fission process and suggest general strategies for generating new materials that can undergo efficient fission.

Busby, Erik; Xia, Jianlong; Yaffe, Omer; Kumar, Bharat; Berkelbach, Timothy; Wu, Qin; Miller, John; Nuckolls, Colin; Zhu, Xiaoyang; Reichman, David; Campos, Luis; Sfeir, Matthew Y.



Quantifying subsurface mixing of groundwater from lowland stream perspective.  

NASA Astrophysics Data System (ADS)

The distribution of the time it takes water from the moment of precipitation to reach the catchment outlet is widely used as a characteristic of catchment discharge behaviour, catchment vulnerability to pollution spreading, and pollutant loads from catchments to downstream waters. However, this distribution tends to vary in time, driven by variability in precipitation and evapotranspiration. Subsurface mixing controls the extent to which dynamics in rainfall and evapotranspiration are translated into dynamics of travel time distributions. This insight into the hydrologic functioning of catchments requires new definitions and concepts that link the dynamics of catchment travel time distributions to the degree of subsurface mixing. In this presentation we propose the concept of STorage Outflow Probability (STOP) functions, which quantify the probability of water parcels stored in a catchment leaving this catchment by discharge or evapotranspiration. We will show how STOPs relate to the topography and subsurface, and how they can be used for deriving the time-varying travel time distributions of a catchment. The presented analyses combine a unique dataset of high-frequency discharge and nitrate concentration measurements with results of a spatially distributed groundwater model and conceptual models of water flow and solute transport. Remarkable findings are the large contrasts in discharge behaviour, expressed in travel time, between lowland and sloping catchments, and the strong relationship between evapotranspiration and stream water nutrient concentration dynamics.

van der Velde, Ype; Torfs, Paul; van der Zee, Sjoerd; Uijlenhoet, Remko



Quantifying food intake in socially housed monkeys: social status effects on caloric consumption  

PubMed Central

Obesity results from a number of factors including socio-environmental influences, and rodent models show that several different stressors increase the preference for calorically dense foods, leading to an obese phenotype. We present here a non-human primate model using socially housed adult female macaques living in long-term stable groups given access to diets of different caloric density. Consumption of a low fat diet (LFD; 15% of calories from fat) and a high fat diet (HFD; 45% of calories from fat) was quantified by means of a custom-built, automated feeder that dispensed a pellet of food when activated by a radiofrequency chip implanted subcutaneously in the animal's wrist. Socially subordinate females showed indices of chronic psychological stress, having reduced glucocorticoid negative feedback and higher frequencies of anxiety-like behavior. Twenty-four hour intakes of both the LFD and HFD were significantly greater in subordinates than in dominants, an effect that persisted whether standard monkey chow (13% of calories from fat) was present or absent. Furthermore, although dominants restricted their food intake to daylight, subordinates continued to feed at night. Total caloric intake was significantly correlated with body weight change. Collectively, these results show that food intake can be reliably quantified in non-human primates living in complex social environments and suggest that socially subordinate females consume more calories; this ethologically relevant model may thus help explain how psychosocial stress changes food preferences and consumption, leading to obesity. PMID:18486158

Wilson, Mark E.; Fisher, Jeff; Fischer, Andrew; Lee, Vanessa; Harris, Ruth B.; Bartness, Timothy J.



Casimir experiments showing saturation effects  

SciTech Connect

We address several different Casimir experiments where theory and experiment disagree. First is the classical Casimir force measurement between two metal half-spaces, both in the form of the torsion pendulum experiment by Lamoreaux and in the form of the Casimir pressure measurement between a gold sphere and a gold plate as performed by Decca et al.; theory predicts a large negative thermal correction, absent in the high-precision experiments. The third experiment is the measurement of the Casimir force between a metal plate and a laser-irradiated semiconductor membrane as performed by Chen et al.; the change in force with laser intensity is larger than predicted by theory. The fourth experiment is the measurement of the Casimir force between an atom and a wall, in the form of the measurement by Obrecht et al. of the change in oscillation frequency of a ⁸⁷Rb Bose-Einstein condensate trapped near a fused silica wall; the change is smaller than predicted by theory. We show that saturation effects can explain the discrepancies between theory and experiment observed in all these cases.

Sernelius, Bo E. [Division of Theory and Modeling, Department of Physics, Chemistry and Biology, Linköping University, SE-581 83 Linköping (Sweden)



Quantifying the reheating temperature of the universe  

NASA Astrophysics Data System (ADS)

The aim of this paper is to determine an exact definition of the reheat temperature for a generic perturbative decay of the inflaton. In order to estimate the reheat temperature, there are two important conditions one needs to satisfy: (a) the decay products of the inflaton must dominate the energy density of the universe, i.e. the universe becomes completely radiation dominated, and (b) the decay products of the inflaton have attained local thermodynamical equilibrium. For some choices of parameters, the latter is the more stringent condition, such that the decay products may thermalise well after the beginning of radiation domination. Consequently, we find that the reheat temperature can be much lower than the standard-lore estimate. In this paper we describe under what conditions our universe could have efficient or inefficient thermalisation, and quantify the reheat temperature for both scenarios. Three regimes are distinguished. Instant thermalisation: the inflaton decay products thermalise instantly upon decay. Efficient thermalisation: the decay products thermalise right at the instant when the radiation epoch starts dominating the universe. Delayed thermalisation: the decay products thermalise deep inside the radiation-dominated epoch, after the transition from inflaton to radiation domination has occurred. The paper is organised as follows. In Section 2 we set the stage and write down the relevant equations for our analysis. The standard lore about the reheating epoch is briefly reviewed in Section 3. Section 4 presents our analysis, in which we study the conditions under which the plasma attains thermalisation. In Section 5 we discuss a concept of reheat temperature that properly captures the issues of thermalisation. Finally, we conclude in Section 6.
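The standard-lore estimate that the paper improves upon is obtained by equating the Hubble rate with the inflaton decay width, H(T_rh) = Γ, assuming instant thermalisation. A rough numerical sketch of that textbook formula (the decay width chosen here is arbitrary, and g_* is taken at its Standard Model high-temperature value):

```python
import math

M_PL = 2.435e18  # reduced Planck mass, GeV

def reheat_temperature(gamma, g_star=106.75):
    """Standard-lore reheat temperature (GeV) from H(T_rh) = Gamma,
    with H^2 = (pi^2 g_* / 90) T^4 / M_pl^2 during radiation domination:
    T_rh = (90 / (pi^2 g_*))**0.25 * sqrt(Gamma * M_pl).
    Assumes the decay products thermalise instantly."""
    return (90.0 / (math.pi ** 2 * g_star)) ** 0.25 * math.sqrt(gamma * M_PL)

# Example: a decay width of 1 GeV gives T_rh of order 1e9 GeV
T_rh = reheat_temperature(1.0)
```

The paper's point is that when thermalisation is delayed, the true reheat temperature falls below this value, which matters for thermal-history-sensitive quantities such as the gravitino abundance.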

Mazumdar, Anupam; Zaldívar, Bryan



DOE: Quantifying the Value of Hydropower in the Electric Grid  

SciTech Connect

The report summarizes research to Quantify the Value of Hydropower in the Electric Grid. This 3-year DOE study focused on defining the value of hydropower assets in a changing electric grid. Methods are described for valuation and planning of pumped storage and conventional hydropower. The project team conducted plant case studies, electric system modeling, market analysis, cost data gathering, and evaluations of operating strategies and constraints. Five other reports detailing these research results are available at the project website. With increasing deployment of wind and solar renewable generation, many owners, operators, and developers of hydropower have recognized the opportunity to provide more flexibility and ancillary services to the electric grid. To quantify the value of these services, this study focused on the Western Electricity Coordinating Council region. A security-constrained, unit commitment and economic dispatch model was used to quantify the role of hydropower for several future energy scenarios up to 2020. This hourly production simulation considered transmission requirements to deliver energy, including future expansion plans. Both energy and ancillary service values were considered. Regarding pumped storage specifically, no single value stream dominated the predicted plant contributions in the various energy futures. Modeling confirmed that service value depends greatly on location and on competition with other available grid support resources. In this summary, ten different value streams related to hydropower are described. These fell into three categories: operational improvements, new technologies, and electricity market opportunities. Of these ten, the study was able to quantify a monetary value for six by applying both present-day and future scenarios for operating the electric grid.
This study confirmed that hydropower resources across the United States contribute significantly to operation of the grid in terms of energy, capacity, and ancillary services. Many potential improvements to existing hydropower plants were found to be cost-effective. Pumped storage is the most likely form of large new hydro asset expansion in the U.S.; however, justifying investments in new pumped storage plants remains very challenging with current electricity market economics. Even over a wide range of possible energy futures up to 2020, no energy future was found to bring quantifiable revenues sufficient to cover the estimated costs of plant construction. Value streams not quantified in this study may provide a different cost-benefit balance and an economic tipping point for hydro. Future studies are essential in the quest to quantify the full potential value. Additional research should consider the value of services provided by advanced storage hydropower and pumped storage at smaller time steps for integration of variable renewable resources, and should include all possible value streams, such as capacity value and portfolio benefits (e.g., reduced cycling of traditional generation).




Quantifying plasticity-independent creep compliance and relaxation of viscoelastoplastic materials under contact loading  

E-print Network

Here we quantify the time-dependent mechanical properties of a linear viscoelastoplastic material under contact loading. For contact load relaxation, we showed that the relaxation modulus can be measured independently of ...

Vandamme, Matthieu


Mimas Showing False Colors #1  

NASA Technical Reports Server (NTRS)

False color images of Saturn's moon, Mimas, reveal variation in either the composition or texture across its surface.

During its approach to Mimas on Aug. 2, 2005, the Cassini spacecraft narrow-angle camera obtained multi-spectral views of the moon from a range of 228,000 kilometers (142,500 miles).

The image at the left is a narrow angle clear-filter image, which was separately processed to enhance the contrast in brightness and sharpness of visible features. The image at the right is a color composite of narrow-angle ultraviolet, green, infrared and clear filter images, which have been specially processed to accentuate subtle changes in the spectral properties of Mimas' surface materials. To create this view, three color images (ultraviolet, green and infrared) were combined into a single black and white picture that isolates and maps regional color differences. This 'color map' was then superimposed over the clear-filter image at the left.

The combination of color map and brightness image shows how the color differences across the Mimas surface materials are tied to geological features. Shades of blue and violet in the image at the right are used to identify surface materials that are bluer in color and have a weaker infrared brightness than average Mimas materials, which are represented by green.

Herschel crater, a 140-kilometer-wide (88-mile) impact feature with a prominent central peak, is visible in the upper right of each image. The unusual bluer materials are seen to broadly surround Herschel crater. However, the bluer material is not uniformly distributed in and around the crater. Instead, it appears to be concentrated on the outside of the crater and more to the west than to the north or south. The origin of the color differences is not yet understood. It may represent ejecta material that was excavated from inside Mimas when the Herschel impact occurred. The bluer color of these materials may be caused by subtle differences in the surface composition or the sizes of grains making up the icy soil.

The images were obtained when the Cassini spacecraft was above 25 degrees south, 134 degrees west latitude and longitude. The Sun-Mimas-spacecraft angle was 45 degrees and north is at the top.

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA's Science Mission Directorate, Washington, D.C. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging operations center is based at the Space Science Institute in Boulder, Colo.

For more information about the Cassini-Huygens mission visit . The Cassini imaging team homepage is at .



Quantifying temporal variability in population abundances  

Microsoft Academic Search

Understanding variability of population abundances is of central concern to theoretical and applied evolutionary ecology, yet quantifying this conceptually simple idea has been substantially problematic. Standard statistical measures of variability are particularly biased by rare events, zero counts and other 'non-Gaussian' behaviour, which are often inappropriately weighted or excluded from analysis. I conjecture that these problems are primarily a function

Joel P. Heath



Quantifying Network Topology Robustness Under Budget Constraints  

E-print Network

Quantifying Network Topology Robustness Under Budget Constraints: General Model and Computational ... The network is to be operated under budget constraints, which previous models did not consider. We generalize previous blocking games by introducing a budget limit on the operator and consider two constraint formulations

Bencsáth, Boldizsár


Quantifying DNS Namespace Influence  

E-print Network

Quantifying DNS Namespace Influence. Casey Deccio, Jeff Sedayao, Krishna Kant, Prasant Mohapatra. University of California Davis, 1 Shields Ave., Davis, CA, USA. Abstract: Name resolution using the Domain Name System (DNS) ... the control of the domain's owner. In this article we review the DNS protocol and several DNS server

California at Davis, University of


Quantifying the Thermal Fatigue of CPV Modules  

SciTech Connect

A method is presented to quantify thermal fatigue in the CPV die-attach from meteorological data. A comparative study between cities demonstrates a significant difference in the accumulated damage. These differences are most sensitive to the number of larger (ΔT) thermal cycles experienced at a location. High-frequency data (<1/min) may be required to most accurately employ this method.
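A minimal sketch of how counted thermal cycles might be turned into accumulated die-attach damage, assuming a Coffin-Manson-type life model with Miner's-rule summation; the constants C and m below are placeholders for illustration, not the report's fitted die-attach values:

```python
def cycles_to_failure(dT, C=1e12, m=4.0):
    """Coffin-Manson-type life model: N_f = C * dT**(-m) cycles to
    failure for a thermal cycle of amplitude dT (K). C and m are
    hypothetical material constants, not fitted values."""
    return C * dT ** (-m)

def accumulated_damage(delta_ts):
    """Miner's-rule damage sum over a set of thermal cycle amplitudes;
    failure is conventionally predicted when the sum reaches 1."""
    return sum(1.0 / cycles_to_failure(dT) for dT in delta_ts if dT > 0)

# One year at a site: 360 daily cycles of 30 K plus five 60 K extremes
damage = accumulated_damage([30.0] * 360 + [60.0] * 5)
```

With m = 4, each 60 K cycle does 16 times the damage of a 30 K cycle, which illustrates why the accumulated damage is most sensitive to the count of larger ΔT cycles, as noted above.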

Bosco, N.; Kurtz, S.



Partial Cylindrical Algebraic Decomposition for Quantifier Elimination  

Microsoft Academic Search

The Cylindrical Algebraic Decomposition method (CAD) decomposes $R^r$ into regions over which given polynomials have constant signs. An important application of CAD is quantifier elimination in elementary algebra and geometry. In this paper we present a method which intermingles CAD construction with truth evaluation so that parts of the CAD are constructed only as needed to further truth evaluation and

George E. Collins; Hoon Hong



Quantifying precipitation suppression due to air Pollution  

E-print Network

Quantifying precipitation suppression due to air pollution. First author: Amir Givati, The Hebrew University, January 2004. Urban and industrial air pollution has been shown qualitatively to suppress precipitation. The evidence suggests that air pollution aerosols that are incorporated in orographic clouds slow down cloud

Li, Zhanqing


Quantifying the Advantage of Looking Forward  

PubMed Central

We introduce a future orientation index to quantify the degree to which Internet users worldwide seek more information about years in the future than years in the past. We analyse Google logs and find a striking correlation between the country's GDP and the predisposition of its inhabitants to look forward. PMID:22482034
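A toy calculation of such a future orientation index, assuming it is the ratio of search volume for the coming year to that for the previous year (the authors' exact normalisation may differ, and the volumes below are invented):

```python
def future_orientation_index(counts, current_year):
    """Ratio of search volume for the year ahead to search volume for
    the year behind; values above 1 indicate a forward-looking bias.
    `counts` maps a year (int) to its observed search volume."""
    return counts[current_year + 1] / counts[current_year - 1]

# Hypothetical per-year query volumes for searches made during 2012
volumes = {2011: 84_000, 2013: 105_000}
foi = future_orientation_index(volumes, 2012)
```

With these invented numbers the index is 1.25, i.e. users searched for the coming year 25% more often than for the past one; the paper's finding is that this ratio correlates with national GDP.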

Preis, Tobias; Moat, Helen Susannah; Stanley, H. Eugene; Bishop, Steven R.



Book Review Quantifying Behavior the JWatcher Way.  

E-print Network

Book Review: Quantifying Behavior the JWatcher Way, by Daniel T. Blumstein and Janice C. Daniel. As a researcher, I have scored countless hours of videotaped behavior of wild animals, and I always knew of JWatcher's ability to guide the user through the process of scoring animal behavior successfully, using the software

Grether, Gregory


Quantifying humics in freshwaters: purpose and methods  

Microsoft Academic Search

Natural organic matter (NOM) plays an important role in many environmentally relevant processes. NOM includes many different types of compounds, not all of which behave similarly. Much effort has gone into characterising some fractions of NOM (e.g. humic substances) in the different environmental compartments, in finding tracers to ascertain their origin, etc. However, few methods exist for quantifying the different

Montserrat Filella



Quantifying volcanic eruption fluxes with infrasound  

Microsoft Academic Search

Under suitable conditions the infrasound radiated during volcanic eruptions may serve as a valuable tool for quantifying eruptive outflux and/or eruption intensity. For example, synchronous video, acoustic, and seismic records of discrete explosions at Karymsky Volcano indicate a strong correlation between acoustic intensity and muzzle velocity, which is not clearly reflected in the seismic records. These observations can

J. B. Johnson



Quantifying thermodynamic bottlenecks of metabolic pathways  

E-print Network

Quantifying thermodynamic bottlenecks of metabolic pathways. Elad Noor, TAU, October 2012. Slide excerpts: how to select the best candidate for synthetic metabolism (Bar-Even, Noor, Lewis, Milo, PNAS, 2010); optimizing bottleneck energetics [kJ/mol]; example: comparing natural and synthetic carbon fixation pathway alternatives.

Beimel, Amos


Comparison of Approaches to Quantify Arterial Damping  

E-print Network

Comparison of Approaches to Quantify Arterial Damping Capacity From Pressurization Tests on Mouse Conduit Arteries. Lian Tian; Zhijie Wang. Large conduit arteries are not purely elastic, but viscoelastic, which affects

Chesler, Naomi C.


A method for quantifying rotational symmetry.  


Here, a new approach for quantifying rotational symmetry based on vector analysis was described and compared with information obtained from a geometric morphometric analysis and a technique based on distance alone. A new method was developed that generates a polygon from the length and angle data of a structure and then quantifies the minimum change necessary to convert that polygon into a regular polygon. This technique yielded an asymmetry score (s) that can range from 0 (perfect symmetry) to 1 (complete asymmetry). Using digital images of Geranium robertianum flowers, this new method was compared with a technique based on lengths alone and with established geometric morphometric methods used to quantify shape variation. Asymmetry scores (s) more clearly described variation in symmetry and were more consistent with a visual assessment of the images than either comparative technique. This procedure is the first to quantify the asymmetry of radial structures accurately, uses easily obtainable measures to calculate the asymmetry score and allows comparisons among individuals and species, even when the comparisons involve structures with different patterns of symmetry. This technique enables the rigorous analysis of polysymmetric structures and provides a foundation for a better understanding of symmetry in nature. PMID:17688593

Frey, Frank M; Robertson, Aaron; Bukoski, Michael
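The abstract above does not give the exact formula for the asymmetry score s, so the following is only a toy sketch under stated assumptions: s is taken as the normalized deviation of arm lengths and inter-arm angles from those of a regular polygon, so that 0 means perfect symmetry, as in the paper's scale.

```python
# Illustrative asymmetry score (NOT the authors' exact formula): mean relative
# deviation of arm lengths from their average and of inter-arm angles from the
# regular-polygon angle 360/n, capped so the score stays in [0, 1].

def asymmetry_score(lengths, angles_deg):
    n = len(lengths)
    mean_len = sum(lengths) / n
    ideal_angle = 360.0 / n
    len_dev = sum(abs(l - mean_len) for l in lengths) / (n * mean_len)
    ang_dev = sum(abs(a - ideal_angle) for a in angles_deg) / (n * ideal_angle)
    return min(1.0, 0.5 * (len_dev + ang_dev))

# A perfectly regular 5-armed structure (e.g. an idealized flower) scores 0.
s_regular = asymmetry_score([1.0] * 5, [72.0] * 5)
```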



Quantifying Global Plasmaspheric Images With in situ Observations  

Microsoft Academic Search

Simultaneous IMAGE EUV plasmaspheric images and Magnetospheric Plasma Analyzer (MPA) data from the Los Alamos National Laboratory's geosynchronous satellites are combined to understand plasmaspheric behavior and to quantify the global images. A brief review of the understanding of the plasmasphere as learned from in situ observations prior to the launch of IMAGE is given to place the results presented here

M. B. Moldwin; B. R. Sandel; M. F. Thomsen; R. C. Elphic



Quantifying population structure on short timescales.  


Quantifying the contribution of the various processes that influence population genetic structure is important, but difficult. One of the reasons is that no single measure appropriately quantifies all aspects of genetic structure. An increasing number of studies is analysing population structure using the statistic D, which measures genetic differentiation, next to G(ST), which quantifies the standardized variance in allele frequencies among populations. Few studies have evaluated which statistic is most appropriate in particular situations. In this study, we evaluated which index is more suitable in quantifying postglacial divergence between three-spined stickleback (Gasterosteus aculeatus) populations from Western Europe. Population structure on this short timescale (10,000 generations) is probably shaped by colonization history, followed by migration and drift. Using microsatellite markers and anticipating that D and G(ST) might have different capacities to reveal these processes, we evaluated population structure at two levels: (i) between lowland and upland populations, aiming to infer historical processes; and (ii) among upland populations, aiming to quantify contemporary processes. In the first case, only D revealed clear clusters of populations, putatively indicative of population ancestry. In the second case, only G(ST) was indicative for the balance between migration and drift. Simulations of colonization and subsequent divergence in a hierarchical stepping stone model confirmed this discrepancy, which becomes particularly strong for markers with moderate to high mutation rates. We conclude that on short timescales, and across strong clines in population size and connectivity, D is useful to infer colonization history, whereas G(ST) is sensitive to more recent demographic events. PMID:22646231

Raeymaekers, Joost A M; Lens, Luc; Van den Broeck, Frederik; Van Dongen, Stefan; Volckaert, Filip A M
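The two statistics compared in the abstract above have standard textbook forms that can be sketched directly: Nei's G_ST = (H_T - H_S)/H_T and Jost's D = ((H_T - H_S)/(1 - H_S)) * n/(n - 1), computed here from allele frequencies in n demes. The example populations are made up for illustration, not taken from the study.

```python
# Minimal sketch of Nei's G_ST and Jost's D for one locus across n demes.

def heterozygosity(freqs):
    """Expected heterozygosity 1 - sum(p_i^2) for one locus."""
    return 1.0 - sum(p * p for p in freqs)

def gst_and_jost_d(pop_freqs):
    n = len(pop_freqs)
    h_s = sum(heterozygosity(f) for f in pop_freqs) / n   # mean within-deme
    pooled = [sum(col) / n for col in zip(*pop_freqs)]    # pooled frequencies
    h_t = heterozygosity(pooled)                          # total heterozygosity
    g_st = (h_t - h_s) / h_t
    d = (h_t - h_s) / (1.0 - h_s) * n / (n - 1)
    return g_st, d

# Two demes fixed for different alleles: maximal differentiation.
g, d = gst_and_jost_d([[1.0, 0.0], [0.0, 1.0]])
```

For demes with identical allele frequencies both statistics are 0; they diverge in intermediate cases, which is the discrepancy the study exploits.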



A Characterization of the Linguistic Quantifier Self  

Microsoft Academic Search

This paper shows that it is possible to characterize the meanings of reflexive pronouns when they operate on two and three place predicates as the only meanings of their types that are complete homomorphisms (c-hom), permutation invariant (pi) and prefixing (pref). This extends previous results by Van Benthem and Keenan concerning the function of reflexives with two place predicates.

Dorit Ben Shalom



Quantifying Nitrogen Loss From Flooded Hawaiian Taro Fields  

NASA Astrophysics Data System (ADS)

In 2004 a field fertilization experiment showed that approximately 80% of the fertilizer nitrogen (N) added to flooded Hawaiian taro (Colocasia esculenta) fields could not be accounted for using classic N balance calculations. To quantify N loss through denitrification and anaerobic ammonium oxidation (anammox) pathways in these taro systems we utilized a slurry-based isotope pairing technique (IPT). Measured nitrification rates and porewater N profiles were also used to model ammonium and nitrate fluxes through the top 10 cm of soil. Quantitative PCR of nitrogen cycling functional genes was used to correlate porewater N dynamics with potential microbial activity. Rates of denitrification calculated using porewater profiles were compared to those obtained using the slurry method. Potential denitrification rates of surficial sediments obtained with the slurry method were found to drastically overestimate the calculated in-situ rates. The largest discrepancies were present in fields greater than one month after initial fertilization, reflecting a microbial community poised to denitrify the initial N pulse. Potential surficial nitrification rates varied between 1.3% of the slurry-measured denitrification potential in a heavily-fertilized site to 100% in an unfertilized site. Compared to the use of urea, fish bone meal fertilizer use resulted in decreased N loss through denitrification in the surface sediment, according to both porewater modeling and IPT measurements. In addition, sub-surface porewater profiles point to root-mediated coupled nitrification/denitrification as a potential N loss pathway that is not captured in surface-based incubations. Profile-based surface plus subsurface coupled nitrification/denitrification estimates were between 1.1 and 12.7 times denitrification estimates from the surface only. 
These results suggest that the use of a classic isotope pairing technique that employs 15NO3- in fertilized agricultural systems can lead to a drastic overestimation of in-situ denitrification rates and that root-associated subsurface coupled nitrification/denitrification may be a major N loss pathway in these flooded agricultural systems.

Deenik, J. L.; Penton, C. R.; Bruland, G. L.; Popp, B. N.; Engstrom, P.; Mueller, J. A.; Tiedje, J.



Quantifying thermal modifications on laser welded skin tissue  

NASA Astrophysics Data System (ADS)

Laser tissue welding is a potential medical treatment method, especially for closing cuts made during any kind of surgery. The photothermal effects of laser on tissue should be quantified in order to determine optimal dosimetry parameters. Polarized light and phase contrast techniques reveal information about the extent of thermal change that occurred in the tissue during laser welding. Changes in collagen structure can be detected in skin tissue samples stained with hematoxylin and eosin. In this study, three different near-infrared laser wavelengths (809 nm, 980 nm and 1070 nm) were compared for skin welding efficiency. 1 cm long cuts were treated by spot-by-spot laser application on Wistar rats' dorsal skin, in vivo. In all laser applications, 0.5 W of optical power was delivered to the tissue for 5 s continuously, resulting in 79.61 J/cm2 energy density (15.92 W/cm2 power density) for each spot. The 1st, 4th, 7th, 14th, and 21st days of the recovery period were determined as control days, and skin samples needed for histology were removed on these particular days. The stained samples were examined under a light microscope. Images were taken with a CCD camera and examined with imaging software. The 809 nm laser was found to be capable of creating strong full-thickness closure, but thermal damage was evident. The thermal damage from 980 nm laser welding was found to be more tolerable. The results showed that 1070 nm laser welding produced noticeably stronger bonds with minimal scar formation.

Tabakoglu, Hasim; Gülsoy, Murat



Raman spectroscopy for quantifying cholesterol in intact coronary artery wall.  


The chemical composition of vascular lesions, an important determinant of plaque progression and rupture, cannot presently be determined in vivo. Prior studies have shown that Raman spectroscopy can accurately quantify the amounts of major lipid classes and calcium salts in homogenized coronary artery tissue. This study determines how the relative cholesterol content, which is calculated from Raman spectra collected at the luminal surface of an artery, is related to its depth in an intact arterial wall. Raman spectra of human atherosclerotic plaques were measured after thin tissue layers were successively placed on them. From these spectra, relative cholesterol contents were calculated and used to determine how cholesterol signal strength is attenuated by overlying tissue. Then, intact artery samples (n = 13) were examined spectroscopically, sectioned and stained specifically for cholesterol. Images of these sections were digitized, and image intensities were related to cholesterol content. These cholesterol amounts were weighted appropriately for depth into the tissue and area-integrated for comparison with spectroscopy results. A decaying exponential curve was fit to the layer study data (r2 = 0.97) and showed that approximately 300 µm of tissue attenuates cholesterol signals by 50%. In intact plaques, the spectroscopically-determined cholesterol amounts correlated strongly and linearly with those determined by digital microscopy (r2 = 0.94). With Raman spectroscopy techniques, the cholesterol content of a lesion can be determined by properly accounting for its depth into an arterial wall. Our results suggest that chemical concentrations in an artery wall could be mapped throughout its thickness, possibly by combining Raman spectroscopy methods with other techniques. PMID:9863544

Römer, T J; Brennan, J F; Schut, T C; Wolthuis, R; van den Hoogen, R C; Emeis, J J; van der Laarse, A; Bruschke, A V; Puppels, G J



Quantifiers more or less quantify online: ERP evidence for partial incremental interpretation  

PubMed Central

Event-related brain potentials were recorded during RSVP reading to test the hypothesis that quantifier expressions are incrementally interpreted fully and immediately. In sentences tapping general knowledge (Farmers grow crops/worms as their primary source of income), Experiment 1 found larger N400s for atypical (worms) than typical objects (crops). Experiment 2 crossed object typicality with non-logical subject-noun phrase quantifiers (most, few). Off-line plausibility ratings exhibited the crossover interaction predicted by full quantifier interpretation: Most farmers grow crops and Few farmers grow worms were rated more plausible than Most farmers grow worms and Few farmers grow crops. Object N400s, although modulated in the expected direction, did not reverse. Experiment 3 replicated these findings with adverbial quantifiers (Farmers often/rarely grow crops/worms). Interpretation of quantifier expressions thus is neither fully immediate nor fully delayed. Furthermore, object atypicality was associated with a frontal slow positivity in few-type/rarely quantifier contexts, suggesting systematic processing differences among quantifier types. PMID:20640044

Urbach, Thomas P.; Kutas, Marta



Wiki surveys: Open and quantifiable social data collection  

E-print Network

Research about attitudes and opinions is central to social science and relies on two common methodological approaches: surveys and interviews. While surveys enable the quantification of large amounts of information quickly and at a reasonable cost, they are routinely criticized for being "top-down" and rigid. In contrast, interviews allow unanticipated information to "bubble up" directly from respondents, but are slow, expensive, and difficult to quantify. Advances in computing technology now enable a hybrid approach that combines the quantifiability of a survey and the openness of an interview; we call this new class of data collection tools wiki surveys. Drawing on principles underlying successful information aggregation projects, such as Wikipedia, we propose three general criteria that wiki surveys should satisfy: they should be greedy, collaborative, and adaptive. We then present results from, a free and open-source website we created that enables groups all over the world to deploy w...

Salganik, Matthew J



Quantifying electrode reliability during Brain-Computer Interface operation.  


One of the problems of non-invasive Brain-Computer Interface (BCI) applications is the occurrence of anomalous (unexpected) signals that might degrade BCI performance. This situation might escape the operator's attention, since raw signals are not usually continuously visualized and monitored during BCI-actuated device operation. Anomalous data can, for instance, be the result of electrode misplacement, degrading impedance or loss of connectivity. Since this problem can develop at run-time, there is a need for a systematic approach to evaluate electrode reliability during online BCI operation. In this paper, we propose two metrics detecting how much each channel is deviating from its expected behavior. This quantifies electrode reliability at run-time, which could be embedded into BCI data processing to increase performance. We assess the effectiveness of these metrics in quantifying signal degradation by conducting three experiments: electrode swap, electrode manipulation and offline artificial degradation of P300 signals. PMID:25376032

Sagha, Hesam; Perdikis, Serafeim; Millán, José del R; Chavarriaga, Ricardo



Choosing appropriate techniques for quantifying groundwater recharge  

Microsoft Academic Search

Various techniques are available to quantify recharge; however, choosing appropriate techniques is often difficult. Important considerations in choosing a technique include space/time scales, range, and reliability of recharge estimates based on different techniques; other factors may limit the application of particular techniques. The goal of the recharge study is important because it may dictate the required space/time scales of

Bridget R. Scanlon; Richard W. Healy; Peter G. Cook



Quantifying Energy Savings by Improving Boiler Operation  

E-print Network

Dayton, OH. ABSTRACT: On/off operation and excess combustion air reduce boiler energy efficiency. This paper presents methods to quantify energy savings from switching to modulation control mode and reducing excess air in natural gas fired boilers. The estimate of savings from reducing excess combustion air accounts for the increased combustion temperature, reduced internal convection coefficient and increased residence time of combustion gases in the boiler. Measured boiler data are used to demonstrate...

Carpenter, K.; Kissock, J. K.



6212013. Arguments without quantifiers; historical numeration systems. Analyzing arguments without quantifiers using truth tables  

E-print Network

Procedure for analyzing arguments without quantifiers using a truth table. Example: If the floor is dirty, I must mop it. The floor is dirty. _____________ I must mop it. Truth table: _______________________. Example: My check arrived on time. Truth table: _______. The argument is ___________________________.

Umble, Ron
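The truth-table procedure in the worksheet above can be mechanized directly: an argument is valid exactly when every row of the truth table that makes all premises true also makes the conclusion true. This sketch models propositions as functions of a truth assignment; the helper names are my own, not from the notes.

```python
from itertools import product

# An argument is valid iff no row makes all premises true and the conclusion
# false (i.e. there is no counterexample row in the truth table).

def is_valid(premises, conclusion, n_vars):
    for row in product([True, False], repeat=n_vars):
        if all(p(*row) for p in premises) and not conclusion(*row):
            return False  # counterexample row found
    return True

# Modus ponens, as in the worksheet example: "If the floor is dirty, I must
# mop it. The floor is dirty. Therefore, I must mop it."  (p -> q, p |- q)
valid = is_valid([lambda p, q: (not p) or q, lambda p, q: p],
                 lambda p, q: q, 2)
```

The same checker flags the fallacy of affirming the consequent (p -> q, q |- p) as invalid, since the row p=False, q=True is a counterexample.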


Crisis of Japanese Vascular Flora Shown By Quantifying Extinction Risks for 1618 Taxa  

PubMed Central

Although many people have expressed alarm that we are witnessing a mass extinction, few projections have been quantified, owing to limited availability of time-series data on threatened organisms, especially plants. To quantify the risk of extinction, we need to monitor changes in population size over time for as many species as possible. Here, we present the world's first quantitative projection of plant species loss at a national level, with stochastic simulations based on the results of population censuses of 1618 threatened plant taxa in 3574 map cells of ca. 100 km2. More than 500 lay botanists helped monitor those taxa in 1994-1995 and in 2003-2004. We projected that between 370 and 561 vascular plant taxa will go extinct in Japan during the next century if past trends of population decline continue. This extinction rate is approximately two to three times the global rate. Using time-series data, we show that existing national protected areas (PAs) covering ca. 7% of Japan will not adequately prevent population declines: even core PAs can protect at best <60% of local populations from decline. Thus, the Aichi Biodiversity Target to expand PAs to 17% of land (and inland water) areas, as committed to by many national governments, is not enough: only 29.2% of currently threatened species will become non-threatened under the assumption that probability of protection success by PAs is 0.5, which our assessment shows is realistic. In countries where volunteers can be organized to monitor threatened taxa, censuses using our method should be able to quantify how fast we are losing species and to assess how effective current conservation measures such as PAs are in preventing species extinction. PMID:24922311

Kadoya, Taku; Takenaka, Akio; Ishihama, Fumiko; Fujita, Taku; Ogawa, Makoto; Katsuyama, Teruo; Kadono, Yasuro; Kawakubo, Nobumitsu; Serizawa, Shunsuke; Takahashi, Hideki; Takamiya, Masayuki; Fujii, Shinji; Matsuda, Hiroyuki; Muneda, Kazuo; Yokota, Masatsugu; Yonekura, Koji; Yahara, Tetsukazu



Quantifying the surface chemistry of 3D matrices in situ  

NASA Astrophysics Data System (ADS)

Despite the major role of the matrix (the insoluble environment around cells) in physiology and pathology, there are very few and limited methods that can quantify the surface chemistry of a 3D matrix such as a biomaterial or tissue ECM. This study describes a novel optical-based methodology that can quantify the surface chemistry (density of adhesion ligands for particular cell adhesion receptors) of a matrix in situ. The methodology utilizes fluorescent analogs (markers) of the receptor of interest and a series of binding assays, where the amount of bound markers on the matrix is quantified via spectral multi-photon imaging. The study provides preliminary results for the quantification of the ligands for the two major collagen-binding integrins (α1β1, α2β1) in porous collagen scaffolds that have been shown to be able to induce maximum regeneration in transected peripheral nerves. The developed methodology opens the way for quantitative descriptions of the insoluble microenvironment of cells in physiology and pathology, and for integrating the matrix in quantitative models of cell signaling.

Tzeranis, Dimitrios S.; So, Peter T. C.; Yannas, Ioannis V.



Quantifying the relative impact of climate and human activities on streamflow  

NASA Astrophysics Data System (ADS)

The objective of this study is to quantify the role of climate and human impacts on streamflow conditions by using historical streamflow records, in conjunction with trend analysis and hydrologic modeling. Four U.S. states, including Indiana, New York, Arizona and Georgia, are used to represent various levels of human activity based on population change and diverse climate conditions. The Mann-Kendall trend analysis is first used to examine the magnitude of changes in precipitation, streamflow and potential evapotranspiration for the four states. Four hydrologic modeling methods, including linear regression, hydrologic simulation, annual balance, and Budyko analysis, are then used to quantify the amount of climate and human impacts on streamflow. All four methods show that the human impact on streamflow is higher at most gauging stations in all four states compared to climate impact. Among the four methods used, the linear regression approach produced the best hydrologic output in terms of a higher Nash-Sutcliffe coefficient. The methodology used in this study is also able to correctly highlight the areas with higher human impact, such as the modified channelized reaches in the northwestern part of Indiana. The results from this study show that population alone cannot capture all the changes caused by human activities in a region. However, this approach provides a starting point towards understanding the role of individual human activities on streamflow changes.

Ahn, Kuk-Hyun; Merwade, Venkatesh
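Two of the tools named in the abstract above have compact standard definitions that can be sketched: the Mann-Kendall S statistic (sum of signs of all pairwise differences, positive for increasing trends) and the Nash-Sutcliffe efficiency used to rank the hydrologic models. The series below are illustrative, not the study's data.

```python
# Mann-Kendall trend statistic and Nash-Sutcliffe model efficiency (NSE).

def mann_kendall_s(series):
    """S > 0 suggests an increasing monotonic trend, S < 0 a decreasing one."""
    s = 0
    for i in range(len(series) - 1):
        for j in range(i + 1, len(series)):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)  # sign of each pairwise difference
    return s

def nash_sutcliffe(observed, simulated):
    """NSE = 1 for a perfect model; <= 0 means no better than the mean."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

s = mann_kendall_s([1, 2, 3, 5, 4])          # mostly increasing series
nse = nash_sutcliffe([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])  # perfect fit
```

In practice S is normalized and tested for significance; this sketch shows only the core statistic.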



Quantifying the statistical complexity of low-frequency fluctuations in semiconductor lasers with optical feedback  

SciTech Connect

Low-frequency fluctuations (LFFs) represent a dynamical instability that occurs in semiconductor lasers when they are operated near the lasing threshold and subject to moderate optical feedback. LFFs consist of sudden power dropouts followed by gradual, stepwise recoveries. We analyze experimental time series of intensity dropouts and quantify the complexity of the underlying dynamics employing two tools from information theory, namely, Shannon's entropy and the Martin, Plastino, and Rosso statistical complexity measure. These measures are computed using a method based on ordinal patterns, by which the relative length and ordering of consecutive interdropout intervals (i.e., the time intervals between consecutive intensity dropouts) are analyzed, disregarding the precise timing of the dropouts and the absolute durations of the interdropout intervals. We show that this methodology is suitable for quantifying subtle characteristics of the LFFs, and in particular the transition to fully developed chaos that takes place when the laser's pump current is increased. Our method shows that the statistical complexity of the laser does not increase continuously with the pump current, but levels off before reaching the coherence collapse regime. This behavior coincides with that of the first- and second-order correlations of the interdropout intervals, suggesting that these correlations, and not the chaotic behavior, are what determine the level of complexity of the laser's dynamics. These results hold for two different dynamical regimes, namely, sustained LFFs and coexistence between LFFs and steady-state emission.

Tiana-Alsina, J.; Torrent, M. C.; Masoller, C.; Garcia-Ojalvo, J. [Departament de Fisica i Enginyeria Nuclear, Universitat Politecnica de Catalunya, Campus de Terrassa, Edif. GAIA, Rambla de Sant Nebridi s/n, Terrassa E-08222 Barcelona (Spain); Rosso, O. A. [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627 Campus Pampulha, C.P. 702, 30123-970 Belo Horizonte, MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, 1428 Ciudad Universitaria, Buenos Aires (Argentina)
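The ordinal-pattern step described in the abstract above is straightforward to sketch: each window of D consecutive interdropout intervals is mapped to the permutation giving their relative ordering, and the normalized Shannon entropy of the pattern distribution is computed. This is only the entropy ingredient; the Martin-Plastino-Rosso complexity measure itself is not reproduced here, and the interval data are invented.

```python
from collections import Counter
import math

# Map each length-d window to its ordinal pattern (argsort permutation),
# ignoring absolute durations, then compute normalized Shannon entropy.

def ordinal_patterns(intervals, d):
    pats = []
    for i in range(len(intervals) - d + 1):
        window = intervals[i:i + d]
        pats.append(tuple(sorted(range(d), key=lambda k: window[k])))
    return pats

def normalized_entropy(pats, d):
    counts = Counter(pats)
    total = len(pats)
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(d))  # 0 = fully ordered, 1 = random

pats = ordinal_patterns([3, 1, 4, 1, 5, 9, 2, 6], 3)
h = normalized_entropy(pats, 3)
```

A strictly monotonic interval series yields a single pattern and entropy 0, while fully developed chaos pushes the normalized entropy toward 1.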



Quantifying nonverbal communicative behavior in face-to-face human dialogues  

NASA Astrophysics Data System (ADS)

The present study is based on the assumption that understanding how humans use nonverbal behavior in dialogues can be very useful in the design of more natural-looking animated talking heads. The goal of the study is twofold: (1) to explore how people use specific facial expressions and head movements to serve important dialogue functions, and (2) to show evidence that it is possible to measure and quantify the extent of these movements with the Qualisys MacReflex motion tracking system. Naturally elicited dialogues between humans have been analyzed with attention focused on those nonverbal behaviors that serve the very relevant functions of regulating the conversational flux (i.e., turn taking) and producing information about the state of communication (i.e., feedback). The results show that eyebrow raising, head nods, and head shakes are typical signals involved during the exchange of speaking turns, as well as in the production and elicitation of feedback. These movements can be easily measured and quantified, and this measure can be implemented in animated talking heads.

Skhiri, Mustapha; Cerrato, Loredana



Mode Level Cognitive Subtraction (MLCS) quantifies spatiotemporal reorganization in large-scale brain topographies  

PubMed Central

Contemporary brain theories of cognitive function posit spatial, temporal and spatiotemporal reorganization as mechanisms for neural information processing. Corresponding brain imaging results underwrite this perspective of large scale reorganization. As we show here, a suitable choice of experimental control tasks allows the disambiguation of the spatial and temporal components of reorganization to a quantifiable degree of certainty. When using electro- or magnetoencephalography (EEG or MEG), our approach relies on the identification of lower dimensional spaces obtained from the high dimensional data of suitably chosen control task conditions. Encephalographic data from task conditions are reconstructed within these control spaces. We show that the residual signal (part of the task signal not captured by the control spaces) allows the quantification of the degree of spatial reorganization, such as recruitment of additional brain networks. PMID:18583154

Banerjee, Arpan; Tognoli, Emmanuelle; Assisi, Collins G.; Kelso, J. A. Scott; Jirsa, Viktor K.



Quantifying near-surface water exchange to assess hydrometeorological models  

NASA Astrophysics Data System (ADS)

Modelling water exchange in the lower atmosphere, crop and soil system using hydrometeorological models allows estimating actual evapotranspiration (ETa), a complex but critical value for numerous hydrological purposes, e.g. hydrological modelling and crop irrigation. This poster presents a summary of the hydrometeorological research activity conducted by our research group. The first purpose of this research is to quantify ETa and drainage of a rainfed potato crop located in South-Eastern Canada. Then, the outputs of the hydrometeorological models under study are compared with the observed turbulent fluxes. Afterwards, the sensitivity of the hydrometeorological models to different inputs is assessed for an environment under a changing climate. ETa was measured with micrometeorological instrumentation (CSAT3, Campbell SCI Inc.; Li7500, LiCor Inc.) and the eddy covariance technique. Near-surface soil heat flux and soil water content at different layers from 10 cm to 100 cm were also measured. Other parameters required by the hydrometeorological models were observed using standard meteorological instrumentation: shortwave and longwave solar radiation, wind speed, air temperature, atmospheric pressure and precipitation. The cumulative ETa during the growth season (123 days) was 331.5 mm, with a daily maximum of 6.5 mm at full coverage; precipitation was 350.6 mm, which is rather small compared with the historical mean (563.3 mm). This experiment allowed calculating crop coefficients that vary over the growth season for a rainfed potato crop. Land surface schemes such as CLASS (Canadian Land Surface Scheme) and c-ISBA (a Canadian version of the model Interaction Sol-Biosphère-Atmosphère) are 1-D physical hydrometeorological models that produce turbulent fluxes (including ETa) for a given crop. The schemes' performances were assessed for both energy and water balance, based on the resulting turbulent fluxes and the given observations. 
CLASS was found to overestimate the turbulent fluxes (including ETa), as well as the fluctuations in the soil heat flux, which were higher than those measured. ETa and runoff were overestimated by c-ISBA, while drainage was weaker compared to CLASS. On the whole, CLASS modelled drainage better. Further work includes: 1- comparing observations and results from CLASS to the French model SURFEX (Surface Externalisée), which uses the scheme ISBA, and 2- assessing the sensitivity of CLASS to different meteorological inputs (i.e. 6 regional climate models) in producing a consistent ETa, in a context of climate change.

Parent, Annie-Claude; Anctil, François; Morais, Anne



Quantifying and managing the risk of information security breaches participants in a supply chain  

E-print Network

Technical integration between companies can result in an increased risk of information security breaches. This thesis proposes a methodology for quantifying information security risk to a supply chain participant. Given a ...

Bellefeuille, Cynthia Lynn



Quantifying Hierarchy Stimuli in Systematic Desensitization Via GSR: A Preliminary Investigation  

ERIC Educational Resources Information Center

The aim of the method for quantifying hierarchy stimuli by Galvanic Skin Resistance recordings is to improve the results of systematic desensitization by attenuating the subjective influences in hierarchy construction which are common in traditional procedures. (Author/CS)

Barabasz, Arreed F.



A time-domain hybrid analysis method for detecting and quantifying T-wave alternans.  


T-wave alternans (TWA) in surface electrocardiograph (ECG) signals has been recognized as a marker of cardiac electrical instability and is hypothesized to be associated with increased risk for ventricular arrhythmias among patients. A novel time-domain TWA hybrid analysis method (HAM) utilizing the correlation method and least squares regression technique is described in this paper. Simulated ECGs containing artificial TWA (cases of absence of TWA and presence of stationary or time-varying or phase-reversal TWA) under different baseline wanderings are used to test the method, and the results show that HAM is better able to quantify TWA amplitude compared with the correlation method (CM) and adapting match filter method (AMFM). The HAM is subsequently used to analyze clinical ECGs, and results produced by the HAM have, in general, demonstrated consistency with those produced by the CM and the AMFM, while the TWA amplitudes quantified by the HAM are universally higher than those by the other two methods. PMID:24803951

Wan, Xiangkui; Yan, Kanghui; Zhang, Linlin; Zeng, Yanjun



A Time-Domain Hybrid Analysis Method for Detecting and Quantifying T-Wave Alternans  

PubMed Central

T-wave alternans (TWA) in surface electrocardiograph (ECG) signals has been recognized as a marker of cardiac electrical instability and is hypothesized to be associated with increased risk for ventricular arrhythmias among patients. A novel time-domain TWA hybrid analysis method (HAM) utilizing the correlation method and least squares regression technique is described in this paper. Simulated ECGs containing artificial TWA (cases of absence of TWA and presence of stationary or time-varying or phase-reversal TWA) under different baseline wanderings are used to test the method, and the results show that HAM has a better ability of quantifying TWA amplitude compared with the correlation method (CM) and adapting match filter method (AMFM). The HAM is subsequently used to analyze the clinical ECGs, and results produced by the HAM have, in general, demonstrated consistency with those produced by the CM and the AMFM, while the quantifying TWA amplitudes by the HAM are universally higher than those by the other two methods. PMID:24803951
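As a rough illustration of what "quantifying TWA amplitude" means (a toy even/odd-beat comparison, not the authors' HAM, the correlation method, or the AMFM), one can compare the mean T-wave templates of alternating beats:

```python
# Illustrative sketch: estimate T-wave alternans amplitude from a
# beat-aligned matrix of T waves by comparing the mean even-indexed
# and odd-indexed T-wave templates.

def twa_amplitude(t_waves):
    """t_waves: list of equal-length lists, one T wave per beat.
    Returns the estimated peak alternans amplitude (input units)."""
    even = [w for i, w in enumerate(t_waves) if i % 2 == 0]
    odd = [w for i, w in enumerate(t_waves) if i % 2 == 1]
    n = len(t_waves[0])
    mean_even = [sum(w[j] for w in even) / len(even) for j in range(n)]
    mean_odd = [sum(w[j] for w in odd) / len(odd) for j in range(n)]
    return max(abs(a - b) for a, b in zip(mean_even, mean_odd)) / 2.0

# Synthetic beats: baseline T wave with +/-0.05 mV alternating deflection
base = [0.0, 0.1, 0.3, 0.1, 0.0]
beats = [[s + (0.05 if i % 2 == 0 else -0.05) for s in base]
         for i in range(8)]
```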

Wan, Xiangkui; Yan, Kanghui; Zhang, Linlin; Zeng, Yanjun



3D Wind: Quantifying wind speed and turbulence intensity  

NASA Astrophysics Data System (ADS)

Integrating measurements and modeling of wind characteristics for wind resource assessment and wind farm control is increasingly challenging as the scale of wind farms increases. Even offshore or in relatively homogeneous landscapes, there are significant gradients of both wind speed and turbulence intensity on scales that typify large wind farms. Our project is, therefore, focused on (i) improving methods to integrate remote sensing and in situ measurements with model simulations to produce a 3-dimensional view of the flow field on wind farm scales and (ii) investigating important controls on the spatiotemporal variability of flow fields within the coastal zone. The instrument suite deployed during the field experiments includes: 3-D sonic and cup anemometers deployed on meteorological masts and buoys, anemometers deployed on tethersondes and an Unmanned Aerial Vehicle, multiple vertically-pointing continuous-wave lidars and scanning Doppler lidars. We also integrate data from satellite-borne instrumentation, specifically synthetic aperture radar and scatterometers, and output from the Weather Research and Forecasting (WRF) model. Spatial wind fields and vertical profiles of wind speed from WRF and from the full in situ observational suite exhibit excellent agreement in a proof-of-principle experiment conducted in northern Indiana, particularly during convective conditions, but showed some discrepancies during the breakdown of the nocturnal stable layer. Our second experiment in May 2013 focused on triangulating a volume above an area of coastal water extending from the port in Cleveland out to an offshore water intake crib (about 5 km) and back to the coast, and includes extremely high-resolution WRF simulations designed to characterize the coastal zone. Vertically pointing continuous-wave lidars were operated at each apex of the triangle, while the scanning Doppler lidar scanned out across the water over a 90-degree azimuth angle.
Preliminary results pertaining to objective (i) indicate there is good agreement between wind and turbulence profiles to 200 m as measured by the vertically pointing lidars with the expected modification of the offshore profiles relative to the profiles at the coastline. However, these profiles do not always fully agree with wind speed and direction profiles measured by the scanning Doppler lidar. Further investigation is required to elucidate these results and to analyze whether these discrepancies occur during particular atmospheric conditions. Preliminary results regarding controls on flow in the coastal zone (i.e. objective ii) include clear evidence that the wind profile to 200 m was modified due to swell during unstable conditions even under moderate to high wind speed conditions. The measurement campaigns will be described in detail, with a view to evaluating optimal strategies for offshore measurement campaigns and in the context of quantifying wind and turbulence in a 3D volume.
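For reference, the turbulence intensity the campaign seeks to map is conventionally the ratio of the standard deviation to the mean of horizontal wind speed over an averaging window. A minimal sketch with synthetic samples:

```python
# Turbulence intensity TI = sigma_u / mean_u from a wind-speed series.
# The sample values are synthetic, not campaign data.
import statistics

def turbulence_intensity(u):
    return statistics.pstdev(u) / statistics.fmean(u)

u = [8.0, 9.0, 10.0, 11.0, 12.0]  # m/s samples over an averaging window
ti = turbulence_intensity(u)
```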

Barthelmie, R. J.; Pryor, S. C.; Wang, H.; Crippa, P.



Quantifying function in the early embryonic heart.  


Congenital heart defects arise during the early stages of development, and studies have linked abnormal blood flow and irregular cardiac function to improper cardiac morphogenesis. The embryonic zebrafish offers superb optical access for live imaging of heart development. Here, we build upon previously used techniques to develop a methodology for quantifying cardiac function in the embryonic zebrafish model. Imaging was performed using bright field microscopy at 1500 frames/s at 0.76 µm/pixel. Heart function was manipulated in a wild-type zebrafish at ~55 h post fertilization (hpf). Blood velocity and luminal diameter were measured at the atrial inlet and atrioventricular junction (AVJ) by analyzing spatiotemporal plots. Control volume analysis was used to estimate the flow rate waveform, retrograde fractions, stroke volume, and cardiac output. The diameter and flow waveforms at the inlet and AVJ are highly repeatable between heart beats. We have developed a methodology for quantifying overall heart function, which can be applied to early stages of zebrafish development. PMID:24231901
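A hedged sketch of the control-volume bookkeeping described above: flow rate from velocity and luminal diameter, integrated over a beat to give stroke volume, and scaled by heart rate for cardiac output. It assumes a circular lumen and plug flow; the numbers are synthetic, not the paper's measurements.

```python
import math

def flow_rate(velocity_um_s, diameter_um):
    """Volumetric flow for a circular lumen, plug-flow assumption."""
    return velocity_um_s * math.pi * diameter_um ** 2 / 4.0  # um^3/s

def stroke_volume(velocities, diameters, dt_s):
    """Integrate the flow-rate waveform over one beat."""
    return sum(flow_rate(v, d) * dt_s for v, d in zip(velocities, diameters))

v = [0.0, 500.0, 1000.0, 500.0, 0.0]   # um/s samples over one beat
d = [50.0] * 5                          # um, luminal diameter
sv = stroke_volume(v, d, dt_s=0.05)     # um^3 per beat
cardiac_output = sv * 2.5               # um^3/s at 2.5 beats/s
```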

Johnson, Brennan M; Garrity, Deborah M; Dasi, Lakshmi Prasad



Quantifying meta-correlations in financial markets  

NASA Astrophysics Data System (ADS)

Financial markets are modular multi-level systems, in which the relationships between the individual components are not constant in time. Sudden changes in these relationships significantly affect the stability of the entire system, and vice versa. Our analysis is based on historical daily closing prices of the 30 components of the Dow Jones Industrial Average (DJIA) from March 15th, 1939 until December 31st, 2010. We quantify the correlation among these components by determining Pearson correlation coefficients, to investigate whether mean correlation of the entire portfolio can be used as a precursor for changes in the index return. To this end, we quantify the meta-correlation - the correlation of mean correlation and index return. We find that changes in index returns are significantly correlated with changes in mean correlation. Furthermore, we study the relationship between the index return and correlation volatility - the standard deviation of correlations for a given time interval. This parameter provides further evidence of the effect of the index on market correlations and their fluctuations. Our empirical findings provide new information and quantification of the index leverage effect, and have implications to risk management, portfolio optimization, and to the increased stability of financial markets.
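A sketch of the portfolio mean correlation used above: the average pairwise Pearson correlation across components in a time window, which can then itself be correlated with the index return (the "meta-correlation"). The three-component, three-day window below is a toy illustration, not DJIA data.

```python
# Mean pairwise Pearson correlation over one window of returns.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def mean_correlation(series):
    """series: list of per-component return lists over one window."""
    pairs = [(i, j) for i in range(len(series))
             for j in range(i + 1, len(series))]
    return sum(pearson(series[i], series[j]) for i, j in pairs) / len(pairs)

window = [[0.01, -0.02, 0.03], [0.02, -0.04, 0.06], [0.01, -0.01, 0.02]]
mc = mean_correlation(window)  # one point of the mean-correlation series
```

Repeating this over rolling windows yields a mean-correlation time series whose Pearson correlation with the index-return series is the meta-correlation.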

Kenett, Dror Y.; Preis, Tobias; Gur-Gershgoren, Gitit; Ben-Jacob, Eshel



Quantifying individual performance in Cricket A network analysis of batsmen and bowlers  

NASA Astrophysics Data System (ADS)

Quantifying individual performance in the game of Cricket is critical for team selection in international matches. The number of runs scored by batsmen and wickets taken by bowlers serves as a natural way of quantifying the performance of a cricketer. Traditionally, batsmen and bowlers are rated on their batting or bowling averages, respectively. However, in a game like Cricket, the manner in which one scores runs or claims a wicket is always important. Scoring runs against a strong bowling line-up or delivering a brilliant performance against a team with a strong batting line-up deserves more credit. A player's average is not able to capture this aspect of the game. In this paper we present a refined method to quantify the quality of runs scored by a batsman or wickets taken by a bowler. We explore the application of Social Network Analysis (SNA) to rate the players by their contribution to team performance. We generate a directed and weighted network of batsmen and bowlers using the player-vs-player information available for Test cricket and ODI cricket. Additionally, we generate a network of batsmen and bowlers based on the dismissal record of batsmen in the history of cricket: Test (1877-2011) and ODI (1971-2011). Our results show that M. Muralitharan is the most successful bowler in the history of Cricket. Our approach could potentially be applied in domestic matches to judge a player's performance, which in turn paves the way for balanced team selection for international matches.
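A hedged sketch of the network idea: build a weighted directed graph with an edge from batsman to bowler weighted by dismissals, then rank players with a PageRank-style score so that dismissing strong batsmen counts for more. Names, weights, and parameters are made up for illustration; this is not the authors' exact formulation.

```python
# PageRank-style rating on a small weighted dismissal network.

def pagerank(edges, nodes, d=0.85, iters=100):
    """edges: dict (src, dst) -> weight; score flows src -> dst."""
    score = {n: 1.0 / len(nodes) for n in nodes}
    out = {n: sum(w for (s, _), w in edges.items() if s == n) for n in nodes}
    for _ in range(iters):
        nxt = {n: (1 - d) / len(nodes) for n in nodes}
        for (s, t), w in edges.items():
            nxt[t] += d * score[s] * w / out[s]
        # dangling nodes (no out-edges) spread their score uniformly
        dangle = sum(score[n] for n in nodes if out[n] == 0)
        for n in nodes:
            nxt[n] += d * dangle / len(nodes)
        score = nxt
    return score

nodes = ["batsman_A", "batsman_B", "bowler_X", "bowler_Y"]
edges = {("batsman_A", "bowler_X"): 5,   # X dismissed A five times
         ("batsman_B", "bowler_X"): 3,
         ("batsman_B", "bowler_Y"): 1}
ranks = pagerank(edges, nodes)           # bowler_X should rank highest
```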

Mukherjee, Satyam



Beyond immunity: quantifying the effects of host anti-parasite behavior on parasite transmission.  


A host's first line of defense in response to the threat of parasitic infection is behavior, yet the efficacy of anti-parasite behaviors in reducing infection are rarely quantified relative to immunological defense mechanisms. Larval amphibians developing in aquatic habitats are at risk of infection from a diverse assemblage of pathogens, some of which cause substantial morbidity and mortality, suggesting that behavioral avoidance and resistance could be significant defensive strategies. To quantify the importance of anti-parasite behaviors in reducing infection, we exposed larval Pacific chorus frogs (Pseudacris regilla) to pathogenic trematodes (Ribeiroia and Echinostoma) in one of two experimental conditions: behaviorally active (unmanipulated) or behaviorally impaired (anesthetized). By quantifying both the number of successful and unsuccessful parasites, we show that host behavior reduces infection prevalence and intensity for both parasites. Anesthetized hosts were 20-39% more likely to become infected and, when infected, supported 2.8-fold more parasitic cysts. Echinostoma had a 60% lower infection success relative to the more deadly Ribeiroia and was also more vulnerable to behaviorally mediated reductions in transmission. For Ribeiroia, increases in host mass enhanced infection success, consistent with epidemiological theory, but this relationship was eroded among active hosts. Our results underscore the importance of host behavior in mitigating disease risk and suggest that, in some systems, anti-parasite behaviors can be as or more effective than immune-mediated defenses in reducing infection. Considering the severe pathologies induced by these and other pathogens of amphibians, we emphasize the value of a broader understanding of anti-parasite behaviors and how co-occurring stressors affect them. PMID:20857146
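The headline comparison above (infection prevalence in behaviorally impaired vs active hosts) reduces to a simple risk ratio; a minimal sketch with synthetic counts, not the experiment's data:

```python
# Prevalence and risk ratio between two exposure groups.

def prevalence(infected_flags):
    return sum(infected_flags) / len(infected_flags)

active = [0, 1, 0, 0, 1, 0, 0, 0]        # 1 = infected
anesthetized = [1, 1, 0, 1, 1, 0, 1, 1]  # behaviorally impaired hosts
risk_ratio = prevalence(anesthetized) / prevalence(active)
```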

Daly, Elizabeth W; Johnson, Pieter T J



Where do roots take up water? A method to quantify local root water uptake  

NASA Astrophysics Data System (ADS)

During the past decades, considerable advances have been made in the conceptual understanding and mathematical description of the root water uptake process. A large number of models of root water uptake with different degrees of complexity are now available. However, effective application of these models to practical situations for a mechanistic description of root water uptake requires proper experimental data. The aim of this study is to introduce and test a non-destructive method for quantifying local water flow from soil to roots. We grew lupin in 30 x 25 x 1 cm containers. Each container was filled with a sandy soil which was partitioned into different compartments using 1 cm-thick layers of coarse sand. Deuterium (D2O) was locally injected into the soil near the root surface of 18-day-old plants. The flow of D2O into transpiring plants (day) and non-transpiring plants (night) was traced using time-series neutron radiography. The results showed that: 1) D2O entered the roots faster during the day than at night; 2) D2O was quickly transported inside the roots towards the shoots during the day, while at night this flow was negligible. Differences between day and night were explained by convective flow of water into the root due to transpiration. To quantify the transport of D2O into roots, we developed a simple convection-diffusion model. The root water uptake predicted by the model was compared with direct measurements of axial water flow in the roots. This new method allows for quantifying local water uptake in different parts of the root system.
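A minimal sketch of a 1-D convection-diffusion step like the class of model used to interpret the D2O transport (explicit finite differences with upwind convection). The diffusivity, velocity, and grid values are illustrative assumptions, not the study's parameters.

```python
# One explicit time step of dc/dt = D d2c/dx2 - v dc/dx (upwind, v > 0).

def step(c, D, v, dx, dt):
    new = c[:]
    for i in range(1, len(c) - 1):
        diff = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2
        conv = -v * (c[i] - c[i - 1]) / dx  # upwind difference for v > 0
        new[i] = c[i] + dt * (diff + conv)
    return new

c = [0.0] * 21
c[0] = 1.0                      # tracer held at the injection boundary
for _ in range(200):            # stable: D*dt/dx^2 and v*dt/dx are small
    c = step(c, D=1e-3, v=5e-2, dx=0.1, dt=0.05)
```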

Zarebanadkouki, M.; Kim, Y.; Carminati, A.



Quantifying VOC emissions from polymers: A case study  

SciTech Connect

Evaluating residual volatile organic compound (VOC) emissions emanating from low-density polyethylene (LDPE) can pose significant challenges. These challenges include quantifying emissions from: (a) multiple process lines with different operating conditions; (b) several different comonomers; (c) variations of comonomer content in each grade; and (d) over 120 grades of LDPE. This presentation is a case study outlining a project to develop grade-specific emission data for low-density polyethylene pellets. This study included extensive laboratory analyses and required the development of a relational database to compile analytical results, calculate the mean concentration and standard deviation, and generate emissions reports.
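The database roll-up described above (per-grade mean and standard deviation of measured concentrations) can be sketched in a few lines. Records and field names are assumptions for illustration, not the study's schema.

```python
# Per-grade mean and sample standard deviation of VOC measurements.
import statistics
from collections import defaultdict

records = [
    {"grade": "LDPE-101", "voc_ppm": 12.0},
    {"grade": "LDPE-101", "voc_ppm": 14.0},
    {"grade": "LDPE-202", "voc_ppm": 30.0},
    {"grade": "LDPE-202", "voc_ppm": 34.0},
]

by_grade = defaultdict(list)
for r in records:
    by_grade[r["grade"]].append(r["voc_ppm"])

summary = {g: (statistics.fmean(v), statistics.stdev(v))
           for g, v in by_grade.items()}
```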

Schulze, J.K.; Qasem, J.S.; Snoddy, R. [C-K Associates, Inc., Baton Rouge, LA (United States)



Quantifying Subsidence in the 1999-2000 Arctic Winter Vortex  

NASA Technical Reports Server (NTRS)

Quantifying the subsidence of the polar winter stratospheric vortex is essential to the analysis of ozone depletion, as chemical destruction often occurs against a large, altitude-dependent background ozone concentration. Using N2O measurements made during SOLVE on a variety of platforms (ER-2, in-situ balloon and remote balloon), the 1999-2000 Arctic winter subsidence is determined from N2O-potential temperature correlations along several N2O isopleths. The subsidence rates are compared to those determined in other winters, and comparison is also made with results from the SLIMCAT stratospheric chemical transport model.
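The diagnostic above can be illustrated simply: track the potential temperature at which a fixed N2O value (an isopleth) is found at two times; the drop in theta over the winter measures vortex subsidence. The profiles below are synthetic, not SOLVE data.

```python
# Interpolate theta on an N2O isopleth and difference between two dates.

def theta_on_isopleth(n2o_profile, theta_levels, n2o_value):
    """Linear interpolation of theta where the profile crosses n2o_value.
    Assumes N2O decreases monotonically with theta."""
    for (t0, t1), (c0, c1) in zip(zip(theta_levels, theta_levels[1:]),
                                  zip(n2o_profile, n2o_profile[1:])):
        if c0 >= n2o_value >= c1:
            frac = (c0 - n2o_value) / (c0 - c1)
            return t0 + frac * (t1 - t0)
    raise ValueError("isopleth not found")

theta = [400.0, 450.0, 500.0, 550.0]   # K
n2o_dec = [250.0, 180.0, 110.0, 60.0]  # ppb, early winter
n2o_mar = [220.0, 140.0, 80.0, 40.0]   # ppb, late winter (descended air)
d_theta = (theta_on_isopleth(n2o_mar, theta, 150.0)
           - theta_on_isopleth(n2o_dec, theta, 150.0))  # negative = descent
```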

Greenblatt, Jeffery B.; Jost, Hans-juerg; Loewenstein, Max; Podolske, James R.; Bui, T. Paul; Elkins, James W.; Moore, Fred L.; Ray, Eric A.; Sen, Bhaswar; Margitan, James J.; Hipskind, R. Stephen (Technical Monitor)



The Physics of Equestrian Show Jumping  

NASA Astrophysics Data System (ADS)

This article discusses the kinematics and dynamics of equestrian show jumping. For some time I have attended a series of show jumping events at Spruce Meadows, an international equestrian center near Calgary, Alberta, often referred to as the "Wimbledon of equestrian jumping." I have always had a desire to write an article such as this one, but when I searched the Internet for information and looked at YouTube presentations, I could only find simplistic references to Newton's laws and the conservation of mechanical energy principle. Nowhere could I find detailed calculations. On the other hand, there were several biomechanical articles with empirical reports of the results of kinetic and dynamic investigations of show jumping using high-speed digital cameras and force plates. They summarize their results in tables that give information about the motion of a horse jumping over high fences (1.40 m) and the magnitudes of the forces encountered when landing. However, they do not describe the physics of these results.

Stinner, Art



Quantifying iridescent coloration in animals: a method for improving repeatability  

E-print Network

Quantifying iridescent coloration in animals: a method for improving repeatability. The characteristics of iridescent colors present particular challenges and opportunities to quantify novel color metrics. Due to the fine-scale angle dependence of iridescent coloration, color metrics

Rutowski, Ronald L.


Quantifying the contributions to stratospheric ozone changes from ozone depleting substances

E-print Network

Quantifying the contributions to stratospheric ozone changes from ozone depleting substances (Atmospheric Chemistry and Physics): ozone changes driven by different combinations of long-lived Greenhouse Gases (GHGs) and Ozone Depleting Substances (ODSs)

Wirosoetisno, Djoko


Quantifying the BICEP2-Planck tension over gravitational waves.  


The recent BICEP2 measurement of B-mode polarization in the cosmic microwave background (r = 0.20 +0.07/-0.05), a possible indication of primordial gravitational waves, appears to be in tension with the upper limit from WMAP (r < 0.13 at 95% C.L.) and Planck (r < 0.11 at 95% C.L.). We carefully quantify the level of tension and show that it is very significant (around 0.1% unlikely) when the observed deficit of large-scale temperature power is taken into account. We show that measurements of TE and EE power spectra in the near future will discriminate between the hypotheses that this tension is either a statistical fluke or a sign of new physics. We also discuss extensions of the standard cosmological model that relieve the tension and some novel ways to constrain them. PMID:25083631
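A toy illustration of "quantifying tension", far cruder than the paper's likelihood analysis: treat the BICEP2 value as a symmetrized Gaussian r = 0.20 +/- 0.06 and ask how improbable values below a Planck-style limit of 0.11 would be under it.

```python
# Gaussian tail probability via the error function (no SciPy needed).
import math

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# P(r < 0.11) under a crude Gaussian N(0.20, 0.06) for the BICEP2 result
p_below_limit = normal_cdf(0.11, mu=0.20, sigma=0.06)
```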

Smith, Kendrick M; Dvorkin, Cora; Boyle, Latham; Turok, Neil; Halpern, Mark; Hinshaw, Gary; Gold, Ben



A Synthetic Phased Array Surface Acoustic Wave Sensor for Quantifying Bolt Tension  

PubMed Central

In this paper, we report our findings on implementing a synthetic phased array surface acoustic wave sensor to quantify bolt tension. Maintaining proper bolt tension is important in many fields, such as ensuring the safe operation of civil infrastructure. Significant advantages of this relatively simple methodology are its capability to assess bolt tension without any contact with the bolt, thus enabling measurement at inaccessible locations, its ability to measure multiple bolts at a time, and the fact that it requires neither data collection during installation nor calibration. We performed detailed experiments on a custom-built flexible bench-top experimental setup consisting of a 1018 steel plate of 12.7 mm (1/2 in) thickness, a 6.4 mm (1/4 in) grade 8 bolt and a stainless steel washer with 19 mm (3/4 in) external diameter. Our results indicate that this method is not only capable of clearly distinguishing properly bolted joints from loosened joints but also capable of quantifying how loose the bolt actually is. We also conducted a detailed signal-to-noise ratio (SNR) analysis and showed that the SNR value for the entire bolt tension range was sufficient for image reconstruction.
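The SNR figure of merit mentioned above is conventionally expressed in decibels from RMS amplitudes; a minimal sketch with placeholder values, not the paper's measurements:

```python
# SNR in dB from the RMS of a signal window and a noise-only window.
import math

def snr_db(signal_rms, noise_rms):
    return 20.0 * math.log10(signal_rms / noise_rms)

snr = snr_db(1.0, 0.1)  # a 10x amplitude ratio corresponds to 20 dB
```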

Martinez, Jairo; Sisman, Alper; Onen, Onursal; Velasquez, Dean; Guldiken, Rasim



Quantifying Mountain Block Recharge by Means of Catchment-Scale Storage-Discharge Relationships  

NASA Astrophysics Data System (ADS)

Despite the hydrologic significance of mountainous catchments in providing freshwater resources, especially in semi-arid regions, little is known about key hydrological processes in these systems, such as mountain block recharge (MBR). We developed an empirical approach based on the storage sensitivity function introduced by Kirchner (2009) to derive storage-discharge relationships from stream flow analysis. We investigated the sensitivity of MBR estimates to uncertainty in the derivation of the catchment storage-discharge relations. We implemented this technique in a semi-arid mountainous catchment in southeastern Arizona, USA (the Marshall Gulch catchment in the Santa Catalina Mountains near Tucson), which has two distinct rainy seasons, winter frontal storms and the summer monsoon, separated by prolonged dry periods. Developing the storage-discharge relation from baseflow data in the dry period allowed us to quantify the change in fractured bedrock storage caused by MBR. The contribution of fractured bedrock to stream flow was confirmed using stable isotope data. Our results show that 1) incorporating scalable time steps to correct for stream flow measurement errors improves the model fit; 2) the quantile method is more suitable for stream flow data binning; 3) the choice of the regression model is more critical when the storage-discharge function is used to predict changes in bedrock storage beyond the maximum observed flow in the catchment; and 4) the use of daily versus hourly flow did not affect the storage-discharge relationship. This methodology allowed MBR to be quantified using stream flow recession analysis from within the mountain system.
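The Kirchner (2009)-style recession analysis referenced above rests on a simple identity: during rain-free periods dS/dt = -Q, so the storage sensitivity dQ/dS can be estimated from -dQ/dt versus Q. A sketch with a synthetic exponential recession (for which dQ/dS is the constant recession coefficient k):

```python
import math

def recession_sensitivity(q, dt):
    """Return (Q_mid, dQ/dS) pairs; during recession dS/dt = -Q,
    so dQ/dS = (dQ/dt) / (-Q)."""
    out = []
    for q0, q1 in zip(q, q[1:]):
        dq_dt = (q1 - q0) / dt
        q_mid = 0.5 * (q0 + q1)
        out.append((q_mid, dq_dt / (-q_mid)))
    return out

# Exponential recession Q = Q0*exp(-k t) with k = 0.1 per time step
q = [10.0 * math.exp(-0.1 * t) for t in range(6)]
pairs = recession_sensitivity(q, dt=1.0)  # sensitivity ~ 0.1 everywhere
```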

Ajami, H.; Troch, P. A.; Maddock, T.; Meixner, T.; Eastoe, C. J.



Quantifying Serum Antiplague Antibody with a Fiber-Optic Biosensor  

Microsoft Academic Search

The fiber-optic biosensor, originally developed to detect hazardous biological agents such as protein toxins or bacterial cells, has been utilized to quantify the concentration of serum antiplague antibodies. This biosensor has been used to detect and quantify the plague fraction 1 antigen in serum, plasma, and whole-blood samples, but its ability to quantify serum antibodies has not been demonstrated. By




On the complexity of quantified linear systems Salvatore Ruggieria,  

E-print Network

On the complexity of quantified linear systems: a fragment of the first-order theory of linear arithmetic. Quantified propositional formulas of linear inequalities with (k - 1) quantifier alternations are log-space complete in Sigma^p_k or Pi^p_k depending on the initial

Ruggieri, Salvatore


Quantifying Meteorite Impact Craters Individual Volume Data Sheet  

E-print Network

Quantifying Meteorite Impact Craters: Individual Volume and Speed Data Sheets, recording three trials at each drop height (150 cm, 100 cm, and 50 cm).

Polly, David


How to quantify conduits in wood?  

PubMed Central

Vessels and tracheids represent the most important xylem cells with respect to long-distance water transport in plants. Wood anatomical studies frequently provide several quantitative details of these cells, such as vessel diameter, vessel density, vessel element length, and tracheid length, while important information on the three-dimensional structure of the hydraulic network is not considered. This paper aims to provide an overview of various techniques, although there is no standard protocol to quantify conduits due to high anatomical variation and the wide range of techniques available. Despite recent progress in image analysis programs and automated methods for measuring cell dimensions, density, and spatial distribution, measuring various characters remains time-consuming and tedious. Quantification of vessels and tracheids is not only important to better understand functional adaptations of tracheary elements to environmental parameters, but will also be essential for linking wood anatomy with other fields such as wood development, xylem physiology, palaeobotany, and dendrochronology. PMID:23507674

Scholz, Alexander; Klepsch, Matthias; Karimi, Zohreh; Jansen, Steven



World Health Organization: Quantifying environmental health impacts  

NSDL National Science Digital Library

The World Health Organization works in a number of public health areas, and their work on quantifying environmental health impacts has been receiving praise from many quarters. This site provides materials on their work in this area and visitors with a penchant for international health relief efforts and policy analysis will find the site invaluable. Along the left-hand side of the site, visitors will find topical sections that include "Methods", "Assessment at national level", "Global estimates", and "Publications". In the "Methods" area, visitors will learn about how the World Health Organization's methodology for studying environmental health impacts has been developed and they can also read a detailed report on the subject. The "Global Estimates" area is worth a look as well, and users can look at their complete report, "Preventing Disease Through Healthy Environments: Towards An Estimate Of the Global Burden Of Disease".


Quantifiers More or Less Quantify On-Line: ERP Evidence for Partial Incremental Interpretation  

ERIC Educational Resources Information Center

Event-related brain potentials were recorded during RSVP reading to test the hypothesis that quantifier expressions are incrementally interpreted fully and immediately. In sentences tapping general knowledge ("Farmers grow crops/worms as their primary source of income"), Experiment 1 found larger N400s for atypical ("worms") than typical objects

Urbach, Thomas P.; Kutas, Marta



Synoptic relationships quantified between surface Chlorophyll-a and diagnostic pigments specific to phytoplankton functional types  

NASA Astrophysics Data System (ADS)

Error-quantified, synoptic-scale relationships between chlorophyll-a (Chla) and phytoplankton pigment groups at the sea surface are presented. A total of nine pigment groups were considered to represent nine phytoplankton functional types (PFTs): microplankton, nanoplankton, picoplankton, diatoms, dinoflagellates, green algae, picoeukaryotes, prokaryotes and Prochlorococcus sp. The observed relationships between Chla and pigment groups were well-defined at the global scale, showing that Chla can be used as an index not only of phytoplankton abundance but also of community structure; large (micro) phytoplankton monotonically increase as Chla increases, whereas the small (pico) phytoplankton community generally decreases. Within these relationships, we also found non-monotonic variations with Chla for certain picoplankton (picoeukaryotes, prokaryotes and Prochlorococcus sp.) and for green algae and nano-sized phytoplankton. The relationships were quantified with a least-squares fitting approach in order to estimate the PFTs from Chla alone. The estimated uncertainty of the relationships depends on both phytoplankton type and Chla concentration. The maximum uncertainty over all groups (34.7% Chla) was found for diatoms at approximately Chla = 1.07 mg m-3. However, the mean uncertainty of the relationships over all groups was 5.8 [% Chla] over the entire Chla range observed (0.02 < Chla < 6.84 mg m-3). The relationships were applied to SeaWiFS satellite Chla data from 1998 to 2009 to show the global climatological fields of the surface distribution of PFTs. Results show that microplankton are present in the mid and high latitudes, constituting ~9.0 [% Chla] of the phytoplankton community at the global surface, of which diatoms explain ~6.0 [% Chla]. Nanoplankton are ubiquitous throughout much of the global surface ocean except the subtropical gyres, acting as a background population constituting ~44.2 [% Chla]. Picoplankton are mostly confined to the subtropical gyres, constituting ~46.8 [% Chla] globally, in which prokaryotes are the major group, explaining 32.3 [% Chla] (Prochlorococcus sp. explaining 21.5 [% Chla]), while picoeukaryotes are notably abundant in the Southern Pacific, explaining ~14.5 [% Chla]. These results may be used to constrain or validate global marine ecosystem models.
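The shape of such Chla-to-size-fraction relationships can be sketched with a common saturating-abundance form: the chlorophyll attributable to small cells saturates as total Chla rises, so their fraction falls while microplankton take over, with nanoplankton as the residual. The coefficients below are made up for illustration and are not the paper's fitted values.

```python
# Size-class fractions of total Chla from a saturating abundance model.
import math

def size_fractions(chla, c_pico_max=0.13, s_pico=7.0,
                   c_piconano_max=0.9, s_pn=1.1):
    """Fractions of total Chla in pico, nano, micro classes.
    Coefficients are illustrative; chla in mg m^-3."""
    pico = c_pico_max * (1 - math.exp(-s_pico * chla)) / chla
    piconano = c_piconano_max * (1 - math.exp(-s_pn * chla)) / chla
    return {"pico": pico,
            "nano": piconano - pico,     # nano as the residual of pico+nano
            "micro": 1.0 - piconano}

frac_low = size_fractions(0.05)   # oligotrophic: pico-dominated
frac_high = size_fractions(5.0)   # eutrophic: micro-dominated
```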

Hirata, T.; Hardman-Mountford, N. J.; Brewin, R. J. W.; Aiken, J.; Barlow, R.; Suzuki, K.; Isada, T.; Howell, E.; Hashioka, T.; Noguchi-Aita, M.; Yamanaka, Y.



Quantifying the Behavioural Relevance of Hippocampal Neurogenesis  

PubMed Central

Few studies that examine the neurogenesis-behaviour relationship formally establish covariation between neurogenesis and behaviour or rule out competing explanations. The behavioural relevance of neurogenesis might therefore be overestimated if other mechanisms account for some, or even all, of the experimental effects. A systematic review of the literature was conducted and the data reanalysed using causal mediation analysis, which can estimate the behavioural contribution of new hippocampal neurons separately from other mechanisms that might be operating. Results from eleven eligible individual studies were then combined in a meta-analysis to increase precision (representing data from 215 animals) and showed that neurogenesis made a negligible contribution to behaviour (standardised effect = 0.15; 95% CI = -0.04 to 0.34; p = 0.128); other mechanisms accounted for the majority of experimental effects (standardised effect = 1.06; 95% CI = 0.74 to 1.38; p = 1.7 x 10^-11). PMID:25426717
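The pooling step behind such confidence intervals can be sketched with inverse-variance fixed-effect meta-analysis (the paper's mediation decomposition is not reproduced here). Study effects and standard errors below are synthetic.

```python
# Inverse-variance fixed-effect pooling of (beta, se) study estimates.
import math

def fixed_effect(betas_ses):
    w = [1.0 / se ** 2 for _, se in betas_ses]
    beta = sum(wi * b for wi, (b, _) in zip(w, betas_ses)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return beta, (beta - 1.96 * se, beta + 1.96 * se)

studies = [(0.10, 0.20), (0.25, 0.30), (0.05, 0.15)]  # synthetic (beta, se)
pooled, ci = fixed_effect(studies)
```

A pooled CI that straddles zero, as here, is what "negligible contribution" means operationally.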

Lazic, Stanley E.; Fuss, Johannes; Gass, Peter



A framework for quantifying net benefits of alternative prognostic models  

PubMed Central

New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright 2011 John Wiley & Sons, Ltd. PMID:21905066
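The classic decision-curve "net benefit" that this framework extends (to life years and time-to-event data, not reproduced here) weighs true positives against false positives at a risk threshold p_t. A sketch with synthetic risks and outcomes:

```python
# Decision-curve net benefit at threshold p_t:
# NB = TP/n - (FP/n) * p_t / (1 - p_t), where "treat" means risk >= p_t.

def net_benefit(risks, outcomes, p_t):
    n = len(risks)
    treated = [(r >= p_t, y) for r, y in zip(risks, outcomes)]
    tp = sum(1 for t, y in treated if t and y == 1)
    fp = sum(1 for t, y in treated if t and y == 0)
    return tp / n - (fp / n) * p_t / (1 - p_t)

risks = [0.05, 0.30, 0.60, 0.80, 0.10, 0.70]   # model-predicted risks
outcomes = [0, 0, 1, 1, 0, 1]                  # 1 = event occurred
nb_model = net_benefit(risks, outcomes, p_t=0.20)
nb_none = 0.0                                  # treat no one
```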

Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G



A Simplified Score to Quantify Comorbidity in COPD  

PubMed Central

Importance Comorbidities are common in COPD, but quantifying their burden is difficult. Currently there is a COPD-specific comorbidity index to predict mortality and another to predict general quality of life. We sought to develop and validate a COPD-specific comorbidity score that reflects comorbidity burden on patient-centered outcomes. Materials and Methods Using the COPDGene study (GOLD II-IV COPD), we developed comorbidity scores to describe patient-centered outcomes employing three techniques: 1) simple count, 2) weighted score, and 3) weighted score based upon a statistical selection procedure. We tested associations, area under the curve (AUC) and calibration statistics to validate scores internally with outcomes of respiratory disease-specific quality of life (St. George's Respiratory Questionnaire, SGRQ), six-minute walk distance (6MWD), modified Medical Research Council (mMRC) dyspnea score and exacerbation risk, ultimately choosing one score for external validation in SPIROMICS. Results Associations between comorbidities and all outcomes were comparable across the three scores. All scores added predictive ability to models including age, gender, race, current smoking status, pack-years smoked and FEV1 (p<0.001 for all comparisons). AUC was similar between all three scores across outcomes: SGRQ (range 0.7624-0.7676), mMRC (0.7590-0.7644), 6MWD (0.7531-0.7560) and exacerbation risk (0.6831-0.6919). Because of similar performance, the comorbidity count was used for external validation. In the SPIROMICS cohort, the comorbidity count performed well to predict SGRQ (AUC 0.7891), mMRC (AUC 0.7611), 6MWD (AUC 0.7086), and exacerbation risk (AUC 0.7341). Conclusions Quantifying comorbidity provides a more thorough understanding of the risk for patient-centered outcomes in COPD. A comorbidity count performs well to quantify comorbidity in a diverse population with COPD. PMID:25514500
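The simplest of the three scores compared above, the unweighted comorbidity count, is just a per-patient tally over a curated condition list. The conditions below are illustrative, not COPDGene's curated set.

```python
# Unweighted comorbidity count from a per-patient condition record.

CONDITIONS = ["gerd", "depression", "osteoarthritis", "diabetes", "chf"]

def comorbidity_count(patient):
    """patient: dict mapping condition name -> bool."""
    return sum(1 for c in CONDITIONS if patient.get(c, False))

p1 = {"gerd": True, "diabetes": True}
p2 = {"depression": True}
```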

Putcha, Nirupama; Puhan, Milo A.; Drummond, M. Bradley; Han, MeiLan K.; Regan, Elizabeth A.; Hanania, Nicola A.; Martinez, Carlos H.; Foreman, Marilyn; Bhatt, Surya P.; Make, Barry; Ramsdell, Joe; DeMeo, Dawn L.; Barr, R. Graham; Rennard, Stephen I.; Martinez, Fernando; Silverman, Edwin K.; Crapo, James; Wise, Robert A.; Hansel, Nadia N.



Quantifying VOC emissions for the strategic petroleum reserve.  

SciTech Connect

A very important aspect of the Department of Energy's (DOE's) Strategic Petroleum Reserve (SPR) program is regulatory compliance. One of the regulatory compliance issues deals with limiting the amount of volatile organic compounds (VOCs) that are emitted into the atmosphere from brine wastes when they are discharged to brine holding ponds. The US Environmental Protection Agency (USEPA) has set limits on the amount of VOCs that can be discharged to the atmosphere. Several attempts have been made to quantify the VOC emissions associated with the brine ponds going back to the late 1970s. There are potential issues associated with each of these quantification efforts. Two efforts were made to quantify VOC emissions by analyzing the VOC content of brine samples obtained from wells. Efforts to measure air concentrations were mentioned in historical reports, but no data have been located to confirm these assertions. A modeling effort was also performed to quantify the VOC emissions. More recently, in 2011-2013, additional brine sampling was performed to update the VOC emissions estimate. An analysis of the statistical confidence in these results is presented here. Arguably, there are uncertainties associated with each of these efforts. The analysis herein indicates that the upper confidence limit on VOC emissions based on recent brine sampling is very close to the 0.42 ton/MMB limit used historically on the project. Refining this estimate would require considerable investment in additional sampling, analysis, and monitoring. An analysis of the VOC emissions at each site suggests that additional discharges could be made while staying within current regulatory limits.
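The upper confidence limit discussed above can be illustrated with a one-sided Student-t bound on a mean emission factor; the sample values and the critical value 1.833 (95% one-sided, 9 degrees of freedom) are assumptions for illustration, not SPR data.

```python
import math

def ucl_mean(samples, t_crit):
    """One-sided upper confidence limit on the mean emission factor.
    t_crit is the one-sided Student-t critical value for len(samples)-1
    degrees of freedom (e.g. 1.833 for 95% confidence with n = 10)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    return mean + t_crit * math.sqrt(var / n)

# Hypothetical VOC emission factors (ton/MMB) from ten brine samples.
vocs = [0.31, 0.40, 0.35, 0.44, 0.38, 0.36, 0.42, 0.33, 0.39, 0.37]
print(round(ucl_mean(vocs, 1.833), 3))
```

With these invented values the bound lands just below a 0.42 ton/MMB limit, mirroring the kind of comparison described in the abstract.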

Knowlton, Robert G.; Lord, David L.



Quantify the accuracy of coal seam gas content  

SciTech Connect

Gas content determination is a critical procedure performed to evaluate the expected gas production rate and producible reserve potential of coal seam reservoirs. The results from a Gas Research Institute (GRI) research project indicate that gas content estimates obtained with many commonly used methods can be low by 50%. These low estimates result in underestimation of gas-in-place reserves, under-prediction of potential gas production rates during primary and enhanced recovery, and under-valuation of the economic worth of investors' assets. The results of the GRI research project quantify the accuracy and comparability of the most commonly used coal seam gas content evaluation procedures. The best methods for accurately estimating the gas-in-place are also identified.

Mavor, M.J. [Tesseract Corp., Park City, UT (United States); Pratt, T.J. [TICORA Geosciences, Denver, CO (United States); Nelson, C.R. [Gas Research Institute, Chicago, IL (United States)



UV-vis spectra as an alternative to the Lowry method for quantify hair damage induced by surfactants.  


It is well known that long-term use of shampoo causes damage to human hair. Although the Lowry method has been widely used to quantify hair damage, it is unsuitable for this determination in the presence of some surfactants, and no other method has been proposed in the literature. In this work, a different method is used to investigate and compare the hair damage induced by four types of surfactants (including three commercial-grade surfactants) and water. Hair samples were immersed in aqueous solutions of surfactants under conditions that resemble a shower (38 °C, constant shaking). These solutions become colored with time of contact with hair, and their UV-vis spectra were recorded. For comparison, the amounts of protein extracted from hair by sodium dodecyl sulfate (SDS) and by water were estimated by the Lowry method. Additionally, non-pigmented vs. pigmented hair, as well as sepia melanin, were used to understand the color of the washing solutions and their spectra. The results presented herein show that hair degradation is mostly caused by the extraction of proteins, cuticle fragments and melanin granules from the hair fiber. It was found that the intensity of the solution color varies with the charge density of the surfactants. Furthermore, the intensity of the solution color can be correlated to the amount of protein quantified by the Lowry method as well as to the degree of hair damage. The UV-vis spectrum of hair washing solutions is a simple and straightforward method to quantify and compare hair damage induced by different commercial surfactants. PMID:25277290

Pires-Oliveira, Rafael; Joekes, Inês



Quantifying transmission by stage of infection in the field: The example of SIV-1 and STLV-1 infecting mandrills.  


The early stage of viral infection is often followed by an important increase in viral load and is generally considered to be the stage with the highest risk of pathogen transmission. Most methods quantifying the relative importance of the different stages of infection were developed for studies aimed at measuring HIV transmission in humans. However, they cannot be transposed to animal populations, in which less information is available. Here we propose a general method to quantify the importance of the early and late stages of infection on micro-organism transmission from field studies. The method is based on a state-space dynamical model parameterized using Bayesian inference. It is illustrated with a 28-year dataset on mandrills infected by Simian Immunodeficiency Virus type-1 (SIV-1) and Simian T-Cell Lymphotropic Virus type-1 (STLV-1). For both viruses we show that transmission is predominant during the early stage of the infection (transmission ratio for SIV-1: 1.16 [0.0009; 18.15] and 9.92 [0.03; 83.8] for STLV-1). However, in terms of the basic reproductive number (R0), which quantifies the weight of both stages in the spread of the virus, the results suggest that the epidemics of SIV-1 and STLV-1 are mainly driven by late transmissions in this population. Am. J. Primatol. 2014 Wiley Periodicals, Inc. PMID:25296992
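The decomposition of R0 into stage-specific contributions can be sketched as (transmission rate) × (mean stage duration) summed over stages; all parameter values below are hypothetical, not the Bayesian estimates from the mandrill data.

```python
def stage_r0(beta_early, dur_early, beta_late, dur_late):
    """Basic reproductive number for a two-stage infection: each stage
    contributes (transmission rate) x (mean duration spent in that stage).
    Returns R0 and the early:late contribution ratio."""
    early = beta_early * dur_early
    late = beta_late * dur_late
    return early + late, early / late

# Hypothetical rates (per year) and stage durations (years): a short,
# highly infectious early stage vs a long, weakly infectious late stage.
r0, ratio = stage_r0(beta_early=2.0, dur_early=0.25,
                     beta_late=0.15, dur_late=10.0)
print(round(r0, 2), round(ratio, 2))
```

Even when the early stage is far more infectious per unit time, its short duration can leave the late stage dominating R0, which is the qualitative point the abstract makes.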

Roussel, Marion; Pontier, Dominique; Kazanji, Mirdad; Ngoubangoye, Barthélémy; Mahieux, Renaud; Verrier, Delphine; Fouchet, David



Boron aluminum crippling strength shows improvement  

NASA Technical Reports Server (NTRS)

Results are presented from an experimental program directed toward improving boron aluminum crippling strength. Laminate changes evaluated were larger filament diameter, improved processing, shape changes, adding steel-aluminum cross plies, reduced filament volume in corners, adding boron aluminum angle plies, and using titanium interleaves. Filament diameter and steel-aluminum cross plies have little effect on crippling. It is shown that better processing combined with appropriate shape changes improved crippling by over 50 percent at both room temperature and 600 °F. Tests also show that crippling improvements ranging from 20 to 40 percent are achieved using angle plies and titanium interleaves.

Otto, O. R.; Bohlmann, R. E.



Quantifying active tectonic processes in orogenic belts from river analysis of the Sierra Nevada (Spain)  

NASA Astrophysics Data System (ADS)

The landscape of active orogenic belts is the result of the interaction between tectonic and surface processes. This interaction can be quantified by analysing bedrock river profiles, as they are the first features to respond to active tectonics. Sierra Nevada is an E-W oriented, ~80 km long, ~40 km wide young mountain chain in southern Spain which is seismically active. The western and southern margins of the chain are defined by normal faults with some strike-slip components, which have been active at least since the Pleistocene. Seismicity is less pronounced in the north, where sedimentary basins now in exhumation are exposed. The geographical location of Sierra Nevada in the southern Mediterranean, its E-W orientation and its pronounced elevation (highest peak, Mulhacén, 3,480 m) result in a climate contrast between the higher-elevation western flank and the eastern flank of the mountain. Precipitation occurs mainly as snowfall during winter; in the west, where peaks have elevations of more than 3,000 m, snow is present for up to 4 months per year. The difference in tectonic activity and climate is reflected in the river profiles. By analysing these, we have divided the Sierra Nevada into three main areas: the northern flank, where rivers show predominantly concave profiles; the western flank, where rivers show tendencies towards non-equilibrium; and the southern flank, where rivers are not in equilibrium and often show convex profiles. The contribution climate makes in shaping these profiles is, however, unknown. We have analysed 16 longitudinal river profiles along the Sierra Nevada, quantified their morphological parameters and identified the presence of knickpoints. These results show a strong correlation between river disequilibrium and active seismicity in the southern flank; this correlation is not as clearly defined in the western area, where, despite the high tectonic activity, rivers are closer to equilibrium. 
The results observed in the northern rivers show that they are in equilibrium. We used stream power maps to model the effects of tectonics and climate on river profiles to provide insight into the erosion and exhumation of the Sierra Nevada. We suggest that the presence of snow and, probably, water condensation increase the erosional power of the rivers in the western flank, and hence we observe river profiles closer to equilibrium in a highly tectonically active area.

Carracedo, A.; Beucher, R.; Persano, C.; Jansen, J.; Codilean, A.; Hoey, T.; Bishop, P.



Quantifying GCM uncertainty for estimating storage requirements in Australian reservoir  

NASA Astrophysics Data System (ADS)

Climate change is anticipated to have enormous impacts on our water resources. Whether or not the existing storage capacity of reservoirs is sufficient to meet future water demands is a question of great interest to water managers and policy makers. Amongst other things, uncertainties in GCM projections make accurate estimation of future water availability and reservoir storage requirements extremely complicated. The present study proposes a new method to quantify GCM uncertainties and their influence on the estimation of reservoir storage. Reliable quantification of GCM uncertainties requires many ensemble runs for each model and emission scenario; however, the climate modeling groups around the world produce only a few ensemble runs for each scenario. Using this limited number of ensemble runs, this study presents a method to quantify GCM uncertainty that varies with space and time as a function of the GCM being assessed. Then, using the GCM projection and its estimated uncertainty, new data series are generated assuming an additive error model, and these are used to ascertain the effects of GCM uncertainties in impact assessment studies. The analysis involves the following important steps: First, standard errors of bias-corrected GCM projections are estimated using multiple model, scenario and ensemble runs conditional on each percentile. Second, assuming an additive error model, several realizations are generated by randomly sampling from a normal distribution. Finally, the generated realizations are applied to evaluate the impacts of climate change on reservoir storage estimation and establish its associated uncertainty. The proposed method is applied to quantify uncertainties in rainfall and temperature projections obtained from six GCMs, three emission scenarios and three ensemble runs after correcting biases using the Nested Bias Correction (NBC). 
Then, thousands of rainfall and temperature realizations are generated using an additive error model for the selected GCM and scenario projection. The temperature data are used to estimate evaporation realizations, which are then used as input (together with rainfall) to a rainfall-runoff model for estimating streamflow. Finally, the streamflow realizations are used to quantify reservoir storage requirements with their associated uncertainties using behavior analysis. Results at the Warragamba dam in Australia reveal that GCM uncertainties will be significantly larger for the future period than for the historical period for both rainfall and temperature at different demand levels. Further, comparison of the effects of rainfall and evaporation uncertainty suggests that reservoir storage uncertainty is introduced mainly by rainfall, rather than evaporation.
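A minimal sketch of the additive error model described above, assuming a hypothetical bias-corrected projection and its standard errors (the real method estimates these percentile-wise from multiple models, scenarios and ensembles):

```python
import random

def gcm_realizations(projection, std_errors, n, seed=1):
    """Generate synthetic series under an additive error model:
    realization[t] = projection[t] + e[t], with e[t] ~ N(0, std_errors[t])."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    return [[p + rng.gauss(0.0, s) for p, s in zip(projection, std_errors)]
            for _ in range(n)]

# Hypothetical bias-corrected annual rainfall projection (mm) and the
# standard errors attached to each year.
rain = [820.0, 790.0, 760.0, 745.0]
se = [40.0, 45.0, 50.0, 55.0]
series = gcm_realizations(rain, se, n=1000)
mean_year0 = sum(s[0] for s in series) / len(series)
print(round(mean_year0))
```

Each realization can then be fed through an evaporation and rainfall-runoff model, so the spread in simulated storage reflects the GCM uncertainty.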

Woldemeskel, Fitsum; Sharma, Ashish; Sivakumar, Bellie; Mehrotra, Raj



Dissociated neural correlates of quantity processing of quantifiers, numbers, and numerosities.  


Quantities can be represented using either mathematical language (i.e., numbers) or natural language (i.e., quantifiers). Previous studies have shown that numerical processing elicits greater activation in the brain regions around the intraparietal sulcus (IPS) relative to other semantic processes. However, little research has been conducted to investigate whether the IPS is also critical for the semantic processing of quantifiers in natural language. In this study, 20 adults were scanned with functional magnetic resonance imaging while they performed semantic distance judgment involving six types of materials (i.e., frequency adverbs, quantity pronouns and nouns, animal names, Arabic digits, number words, and dot arrays). Conjunction analyses of brain activation showed that numbers and dot arrays elicited greater activation in the right IPS than did words (i.e., animal names) or quantifiers (i.e., frequency adverbs and quantity pronouns and nouns). Quantifiers elicited more activation in left middle temporal gyrus and inferior frontal gyrus than did numbers and dot arrays. No differences were found between quantifiers and animal names. These findings suggest that, although quantity processing for numbers and dot arrays typically relies on the right IPS region, quantity processing for quantifiers typically relies on brain regions for general semantic processing. Thus, the IPS does not appear to be the only brain region for quantity processing. PMID:23019128

Wei, Wei; Chen, Chuansheng; Yang, Tao; Zhang, Han; Zhou, Xinlin



Quantifying variability on thermal resistance of Listeria monocytogenes.  


Knowledge of the impact of strain variability and growth history on thermal resistance is needed to provide realistic predictions and an adequate design of thermal treatments. In the present study, in addition to quantifying strain variability in the thermal resistance of Listeria monocytogenes, biological variability and experimental variability were determined to prioritize their importance. Experimental variability was defined as the repeatability of parallel experimental replicates, and biological variability was defined as the reproducibility of biologically independent reproductions. Furthermore, the effect of growth history was quantified. The thermal inactivation curves of 20 L. monocytogenes strains were fitted using the modified Weibull model, resulting in 360 D-value estimates in total. The D-value ranged from 9 to 30 min at 55 °C, from 0.6 to 4 min at 60 °C, and from 0.08 to 0.6 min at 65 °C. The estimated z-values of all strains ranged from 4.4 to 5.7 °C. The strain variability was ten times higher than the experimental variability and four times higher than the biological variability. Furthermore, the effect of growth history on thermal resistance variability was not significantly different from that of strain variability and was mainly determined by the growth phase. PMID:25462932
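A z-value of this kind can be estimated by regressing log10(D) on temperature, since z is the temperature rise producing a ten-fold drop in D; the D-values below are illustrative picks consistent with the reported ranges, not data from the study.

```python
import math

def z_value(temps, d_values):
    """Estimate the z-value by least squares on log10(D) vs temperature:
    log10(D) = a - T/z, so z = -1/slope."""
    ys = [math.log10(d) for d in d_values]
    n = len(temps)
    mt = sum(temps) / n
    my = sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(temps, ys))
             / sum((t - mt) ** 2 for t in temps))
    return -1.0 / slope

# Illustrative D-values (min) at 55, 60 and 65 degrees C.
print(round(z_value([55.0, 60.0, 65.0], [20.0, 2.0, 0.2]), 1))
```

A perfect ten-fold drop per 5 °C, as in this toy input, yields z = 5.0 °C, inside the 4.4-5.7 °C range reported above.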

Aryani, D C; den Besten, H M W; Hazeleger, W C; Zwietering, M H



Quantifying Relative Diver Effects in Underwater Visual Censuses  

PubMed Central

Diver-based Underwater Visual Censuses (UVCs), particularly transect-based surveys, are key tools in the study of coral reef fish ecology. These techniques, however, have inherent problems that make it difficult to collect accurate numerical data. One of these problems is the diver effect (defined as the reaction of fish to a diver). Although widely recognised, its effects have yet to be quantified and the extent of taxonomic variation remains to be determined. We therefore examined relative diver effects on a reef fish assemblage on the Great Barrier Reef. Using common UVC methods, the recorded abundance of seven reef fish groups were significantly affected by the ongoing presence of SCUBA divers. Overall, the diver effect resulted in a 52% decrease in the mean number of individuals recorded, with declines of up to 70% in individual families. Although the diver effect appears to be a significant problem, UVCs remain a useful approach for quantifying spatial and temporal variation in relative fish abundances, especially if using methods that minimise the exposure of fishes to divers. Fixed distance transects using tapes or lines deployed by a second diver (or GPS-calibrated timed swims) would appear to maximise fish counts and minimise diver effects. PMID:21533039

Dickens, Luke C.; Goatley, Christopher H. R.; Tanner, Jennifer K.; Bellwood, David R.



Quantified PIRT and Uncertainty Quantification for Computer Code Validation  

NASA Astrophysics Data System (ADS)

This study investigates and proposes a systematic method for uncertainty quantification in computer code validation applications. Uncertainty quantification has gained increasing attention in recent years. The U.S. Nuclear Regulatory Commission (NRC) requires the use of realistic best-estimate (BE) computer codes following the rigorous Code Scaling, Applicability and Uncertainty (CSAU) methodology. In CSAU, the Phenomena Identification and Ranking Table (PIRT) was developed to identify important contributors to code uncertainty. To support and examine the traditional PIRT with quantified judgments, this study proposes a novel approach, the Quantified PIRT (QPIRT), to identify important code models and parameters for uncertainty quantification. Dimensional analysis of the code field equations to generate dimensionless (pi) groups from code simulation results serves as the foundation for QPIRT. Uncertainty quantification using the DAKOTA code is proposed in this study based on a sampling approach. Nonparametric statistical theory identifies the fixed number of code runs needed to assure 95 percent probability with 95 percent confidence in the code uncertainty intervals.
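The nonparametric run count mentioned at the end follows from the first-order Wilks formula; a minimal sketch, assuming one-sided coverage of a single output quantity:

```python
def wilks_runs(prob=0.95, conf=0.95):
    """Smallest number of code runs n such that the maximum of n samples
    bounds the `prob` quantile with confidence `conf` (one-sided,
    first-order Wilks formula): require 1 - prob**n >= conf."""
    n = 1
    while 1.0 - prob ** n < conf:
        n += 1
    return n

print(wilks_runs())  # the classic 95/95 case
```

For 95 percent probability at 95 percent confidence this gives the well-known answer of 59 code runs; tightening the confidence to 99 percent raises the count to 90.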

Luo, Hu


Land cover change and remote sensing: Examples of quantifying spatiotemporal dynamics in tropical forests  

SciTech Connect

Research on human impacts or natural processes that operate over broad geographic areas must explicitly address issues of scale and spatial heterogeneity. While the tropical forests of Southeast Asia and Mexico have been occupied and used to meet human needs for thousands of years, traditional forest management systems are currently being transformed by rapid and far-reaching demographic, political, economic, and environmental changes. The dynamics of population growth, migration into the remaining frontiers, and responses to national and international market forces result in a demand for land to produce food and fiber. These results illustrate some of the mechanisms that drive current land use changes, especially in the tropical forest frontiers. By linking the outcomes of individual land use decisions with measures of landscape fragmentation and change, the aggregated results show the hierarchy of temporal and spatial events that, in sum, result in global changes to the most complex and sensitive biome: tropical forests. By quantifying the spatial and temporal patterns of tropical forest change, researchers can assist policy makers by showing how landscape systems in these tropical forests are controlled by physical, biological, social, and economic parameters.

Krummel, J.R.; Su, Haiping [Argonne National Lab., IL (United States); Fox, J. [East-West Center, Honolulu, HI (United States); Yarnasan, S.; Ekasingh, M. [Chiang Mai Univ. (Thailand)



Live Cell Interferometry Quantifies Dynamics of Biomass Partitioning during Cytokinesis  

PubMed Central

The equal partitioning of cell mass between daughters is the usual and expected outcome of cytokinesis for self-renewing cells. However, most studies of partitioning during cell division have focused on daughter cell shape symmetry or segregation of chromosomes. Here, we use live cell interferometry (LCI) to quantify the partitioning of daughter cell mass during and following cytokinesis. We use adherent and non-adherent mouse fibroblast and mouse and human lymphocyte cell lines as models and show that, on average, mass asymmetries present at the time of cleavage furrow formation persist through cytokinesis. The addition of multiple cytoskeleton-disrupting agents leads to increased asymmetry in mass partitioning which suggests the absence of active mass partitioning mechanisms after cleavage furrow positioning. PMID:25531652

Zangle, Thomas A.; Teitell, Michael A.; Reed, Jason



Quantifying chaotic dynamics from integrate-and-fire processes.  


Characterizing chaotic dynamics from integrate-and-fire (IF) interspike intervals (ISIs) is relatively easy at high firing rates. When the firing rate is low, correct estimation of the Lyapunov exponents (LEs) describing the dynamical features of the complex oscillations reflected in IF ISI sequences becomes more complicated. In this work we discuss the peculiarities and limitations of quantifying chaotic dynamics from IF point processes. We consider the main factors leading to underestimated LEs and demonstrate a way of improving the numerical determination of LEs from IF ISI sequences. We show that estimation of the two largest LEs can be performed using around 400 mean periods of chaotic oscillations in the regime of phase-coherent chaos. Application to real data is discussed. PMID:25637929

Pavlov, A N; Pavlova, O N; Mohammad, Y K; Kurths, J



Use of short half-life cosmogenic isotopes to quantify sediment mixing and transport in karst conduits  

NASA Astrophysics Data System (ADS)

Particulate inorganic carbon (PIC) transport and flux in karst aquifers are poorly understood. Methods to quantify PIC flux are needed in order to account for total inorganic carbon removal (chemical plus mechanical) from karst settings. Quantifying PIC flux will allow more accurate calculations of landscape denudation and global carbon sink processes. The study concentrates on the critical processes of the suspended sediment component of mass flux: surface soil/stored sediment mixing, transport rates and distances, and sediment storage times. The primary objective of the study is to describe transport and mixing with the resolution of single storm-flow events. To quantify the transport processes, short half-life cosmogenic isotopes are utilized. The isotopes 7Be (t1/2 = 53 d) and 210Pb (t1/2 = 22 y) are the primary isotopes measured, and other potential isotopes such as 137Cs and 241Am are investigated. The study location is at Mammoth Cave National Park within the Logsdon River watershed. The Logsdon River conduit is continuously traversable underground for two kilometers. Background levels and input concentrations of isotopes are determined from soil samples taken at random locations in the catchment area, and suspended sediment collected from the primary sinking stream during a storm event. Suspended sediment was also collected from the downstream end of the conduit during the storm event. After the storm flow receded, fine sediment samples were taken from the cave stream at regular intervals to determine transport distances and mixing ratios along the conduit. Samples were analyzed with a Canberra Industries gamma ray spectrometer, counted for 24 hours to increase detection of low radionuclide activities. The measured activity levels of radionuclides in the samples were adjusted for decay from time of sampling using standard decay curves. 
The results of the study show that surface sediment mixing, transport and storage in karst conduits is a dynamic but potentially quantifiable process at the storm-event scale.
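The standard decay correction mentioned above can be sketched as follows; the measured activity, half-life pairing and counting delay are hypothetical values chosen for illustration.

```python
import math

def decay_correct(measured, half_life_days, delay_days):
    """Back-correct a measured activity to the time of sampling:
    A0 = A_measured * exp(lambda * t), with lambda = ln(2) / half-life."""
    lam = math.log(2.0) / half_life_days
    return measured * math.exp(lam * delay_days)

# Hypothetical 7Be activity (Bq/kg) counted 53 days (one half-life)
# after sampling: the corrected activity is double the measured one.
print(decay_correct(12.0, 53.0, 53.0))
```

The same correction applies to 210Pb with its 22-year half-life, where the adjustment over a laboratory delay is correspondingly small.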

Paylor, R.



Quantifying the labeling and the levels of plant cell wall precursors using ion chromatography tandem mass spectrometry.  


The biosynthesis of cell wall polymers involves enormous fluxes through central metabolism that are not fully delineated and whose regulation is poorly understood. We have established and validated a liquid chromatography tandem mass spectrometry method using multiple reaction monitoring mode to separate and quantify the levels of plant cell wall precursors. Target analytes were identified by their parent/daughter ions and retention times. The method allows the quantification of precursors at low picomole quantities with linear responses up to the nanomole quantity range. When applying the technique to Arabidopsis (Arabidopsis thaliana) T87 cell cultures, 16 hexose-phosphates (hexose-Ps) and nucleotide-sugars (NDP-sugars) involved in cell wall biosynthesis were separately quantified. Using hexose-P and NDP-sugar standards, we have shown that hot water extraction allows good recovery of the target metabolites (over 86%). This method is applicable to quantifying the levels of hexose-Ps and NDP-sugars in different plant tissues, such as Arabidopsis T87 cells in culture and fenugreek (Trigonella foenum-graecum) endosperm tissue, showing higher levels of galacto-mannan precursors in fenugreek endosperm. In Arabidopsis cells incubated with [U-(13)C(Fru)]sucrose, the method was used to track the labeling pattern in cell wall precursors. As the fragmentation of hexose-Ps and NDP-sugars results in high yields of [PO(3)](-)/or [H(2)PO(4)](-) ions, mass isotopomers can be quantified directly from the intensity of selected tandem mass spectrometry transitions. The ability to directly measure (13)C labeling in cell wall precursors makes possible metabolic flux analysis of cell wall biosynthesis based on dynamic labeling experiments. PMID:20442274

Alonso, Ana P; Piasecki, Rebecca J; Wang, Yan; LaClair, Russell W; Shachar-Hill, Yair



Quantifying instantaneous performance in alpine ski racing.  


Alpine ski racing is a popular sport in many countries and a lot of research has gone into optimising athlete performance. Two factors influence athlete performance in a ski race: speed and the chosen path between the gates. However, to date there is no objective, quantitative method to determine instantaneous skiing performance that takes both of these factors into account. The purpose of this short communication was to define a variable quantifying instantaneous skiing performance and to study how this variable depended on the skiers' speed and on their chosen path. Instantaneous skiing performance was defined as time loss per elevation difference dt/dz, which depends on the skier's speed v(z), and the distance travelled per elevation difference ds/dz. Using kinematic data collected in an earlier study, it was evaluated how these variables can be used to assess the individual performance of six ski racers in two slalom turns. The performance analysis conducted in this study might be a useful tool not only for athletes and coaches preparing for competition, but also for sports scientists investigating skiing techniques or engineers developing and testing skiing equipment. PMID:22620279
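The performance variable dt/dz defined above combines the distance travelled per elevation difference with speed; the two skiers' numbers below are invented for illustration, not data from the cited slalom turns.

```python
def time_loss_per_drop(ds, dz, v):
    """Instantaneous performance dt/dz: time spent per metre of elevation
    drop, computed as (distance travelled per elevation difference) / speed."""
    return (ds / dz) / v

# Two hypothetical skiers over the same 1 m elevation drop: a straighter
# line at lower speed vs a longer line carried at higher speed.
a = time_loss_per_drop(ds=2.0, dz=1.0, v=15.0)
b = time_loss_per_drop(ds=2.4, dz=1.0, v=19.0)
print(round(a, 4), round(b, 4))
```

Skier b loses less time per metre of drop despite the longer path, which is exactly the speed-versus-line trade-off the variable is designed to expose.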

Federolf, Peter Andreas



Quantifying force networks in particulate systems  

NASA Astrophysics Data System (ADS)

We present mathematical models based on persistent homology for analyzing force distributions in particulate systems. We define three distinct chain complexes of these distributions: digital, position, and interaction, motivated by different types of data that may be available from experiments and simulations, e.g. digital images, locations of the particles, and the forces between the particles, respectively. We describe how algebraic topology, in particular homology, allows one to obtain algebraic representations of the geometry captured by these complexes. For each complex we define an associated force network from which persistent homology is computed. Using numerical data obtained from discrete element simulations of a system of particles undergoing slow compression, we demonstrate how persistent homology can be used to compare the force distributions in different systems, and discuss the differences between the properties of digital, position, and interaction force networks. To conclude, we formulate well-defined measures quantifying differences between force networks corresponding to the different states of a system, thereby allowing the dynamical properties of force networks to be analyzed in precise terms.

Kramár, Miroslav; Goullet, Arnaud; Kondic, Lou; Mischaikow, Konstantin



Fluorescence imaging to quantify crop residue cover  

NASA Technical Reports Server (NTRS)

Crop residues, the portion of the crop left in the field after harvest, can be an important management factor in controlling soil erosion. Methods to quantify residue cover are needed that are rapid, accurate, and objective. Scenes with known amounts of crop residue were illuminated with long-wave ultraviolet (UV) radiation and fluorescence images were recorded with an intensified video camera fitted with a 453 to 488 nm band-pass filter. A light colored soil and a dark colored soil were used as backgrounds for the weathered soybean stems. Residue cover was determined by counting the proportion of the pixels in the image with fluorescence values greater than a threshold. Soil pixels had the lowest gray levels in the images. The values of the soybean residue pixels spanned nearly the full range of the 8-bit video data. Classification accuracies typically were within 3 (absolute units) of measured cover values. Video imaging can provide an intuitive understanding of the fraction of the soil covered by residue.
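The thresholding step described above amounts to counting the fraction of pixels brighter than a cutoff; the tiny image and the threshold of 100 below are invented for illustration.

```python
def residue_cover(image, threshold):
    """Fraction of pixels whose fluorescence gray level exceeds the
    threshold; bright pixels are classified as crop residue, dark as soil."""
    pixels = [p for row in image for p in row]
    return sum(1 for p in pixels if p > threshold) / len(pixels)

# Hypothetical 4x4 8-bit fluorescence image: low gray levels are soil,
# high gray levels are fluorescing soybean residue.
img = [[ 12,  30, 180, 200],
       [ 25, 160, 210,  18],
       [ 15,  22, 190,  28],
       [140,  20,  26,  24]]
print(residue_cover(img, threshold=100))
```

Six of the sixteen pixels exceed the cutoff, giving a residue cover of 0.375 for this toy scene.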

Daughtry, C. S. T.; Mcmurtrey, J. E., III; Chappelle, E. W.



Quantifying facial expression recognition across viewing conditions.  


Facial expressions are key to social interactions and to assessment of potential danger in various situations. Therefore, our brains must be able to recognize facial expressions when they are transformed in biologically plausible ways. We used synthetic happy, sad, angry and fearful faces to determine the amount of geometric change required to recognize these emotions during brief presentations. Five-alternative forced choice conditions involving central viewing, peripheral viewing and inversion were used to study recognition among the four emotions. Two-alternative forced choice was used to study affect discrimination when spatial frequency information in the stimulus was modified. The results show an emotion and task-dependent pattern of detection. Facial expressions presented with low peak frequencies are much harder to discriminate from neutral than faces defined by either mid or high peak frequencies. Peripheral presentation of faces also makes recognition much more difficult, except for happy faces. Differences between fearful detection and recognition tasks are probably due to common confusions with sadness when recognizing fear from among other emotions. These findings further support the idea that these emotions are processed separately from each other. PMID:16364393

Goren, Deborah; Wilson, Hugh R



Adults with Autism Show Increased Sensitivity to Outcomes at Low Error Rates during Decision-Making  

ERIC Educational Resources Information Center

Decision-making is an important function that can be quantified using a two-choice prediction task. Individuals with Autistic Disorder (AD) often show highly restricted and repetitive behavior that may interfere with adaptive decision-making. We assessed whether AD adults showed repetitive behavior on the choice task that was unaffected by

Minassian, Arpi; Paulus, Martin; Lincoln, Alan; Perry, William



Quantifying Riverscape Connectivity with Graph Theory  

NASA Astrophysics Data System (ADS)

Fluvial catchments convey fluxes of water, sediment, nutrients and aquatic biota. At continental scales, crustal topography defines the overall path of channels whilst at local scales depositional and/or erosional features generally determine the exact path of a channel. Furthermore, constructions such as dams, for either water abstraction or hydropower, often have a significant impact on channel networks. The concept of 'connectivity' is commonly invoked when conceptualising the structure of a river network. This concept is easy to grasp but there have been uneven efforts across the environmental sciences to actually quantify connectivity. Currently there have only been a few studies reporting quantitative indices of connectivity in river sciences, notably, in the study of avulsion processes. However, the majority of current work describing some form of environmental connectivity in a quantitative manner is in the field of landscape ecology. Driven by the need to quantify habitat fragmentation, landscape ecologists have returned to graph theory. Within this formal setting, landscape ecologists have successfully developed a range of indices which can model connectivity loss. Such formal connectivity metrics are currently needed for a range of applications in fluvial sciences. One of the most urgent needs relates to dam construction. In the developed world, hydropower development has generally slowed and in many countries, dams are actually being removed. However, this is not the case in the developing world where hydropower is seen as a key element to low-emissions power-security. For example, several dam projects are envisaged in Himalayan catchments in the next 2 decades. This region is already under severe pressure from climate change and urbanisation, and a better understanding of the network fragmentation which can be expected in this system is urgently needed. In this paper, we apply and adapt connectivity metrics from landscape ecology.
We then examine the connectivity structure of the Gangetic riverscape with fluvial remote sensing. Our study reach extends from the heavily dammed headwaters of the Bhagirathi, Mandakini and Alaknanda rivers which form the source of the Ganga to Allahabad ~900 km downstream on the main stem. We use Landsat-8 imagery as the baseline dataset. Channel width along the Ganga (i.e. Ganges) is often several kilometres. Therefore, the pan-sharpened 15m pixels of Landsat-8 are in fact capable of resolving inner channel features for over 80% of the channel length thus allowing a riverscape approach to be adopted. We examine the following connectivity metrics: size distribution of connected components, betweeness centrality and the integrated index of connectivity. A geographic perspective is added by mapping local (25 km-scale) values for these metrics in order to examine spatial patterns of connectivity. This approach allows us to map impacts of dam construction and has the potential to inform policy decisions in the area as well as open-up new avenues of investigation.
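One of the connectivity metrics named in this record, the size distribution of connected components, reduces to standard graph traversal over the reach network. A minimal sketch follows; the six-reach chain and the dam location are invented for illustration and are unrelated to the Ganga dataset.

```python
from collections import deque

def component_sizes(nodes, edges):
    """Sizes of connected components in an undirected graph.

    Nodes are river reaches; an edge means two reaches remain
    connected (no dam between them). Removing edges at dam sites
    fragments the network into smaller components.
    """
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, sizes = set(), []
    for start in nodes:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:              # breadth-first search of one component
            n = queue.popleft()
            size += 1
            for m in adj[n]:
                if m not in seen:
                    seen.add(m)
                    queue.append(m)
        sizes.append(size)
    return sorted(sizes, reverse=True)

# A 6-reach chain; a dam between reaches 3 and 4 removes that edge.
reaches = [1, 2, 3, 4, 5, 6]
undammed = [(1, 2), (2, 3), (3, 4), (4, 5), (5, 6)]
dammed = [(a, b) for a, b in undammed if (a, b) != (3, 4)]
print(component_sizes(reaches, undammed))  # -> [6]
print(component_sizes(reaches, dammed))    # -> [3, 3]
```

Metrics such as the integrated index of connectivity or betweenness centrality build on exactly this component/path structure.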

Carbonneau, P.; Milledge, D.; Sinha, R.; Tandon, S. K.



Quantifying and Generalizing Hydrologic Responses to Dam Regulation using a Statistical Modeling Approach  

SciTech Connect

Despite the ubiquitous existence of dams within riverscapes, much of our knowledge about dams and their environmental effects remains context-specific. Hydrology, more than any other environmental variable, has been studied in great detail with regard to dam regulation. While much progress has been made in generalizing the hydrologic effects of regulation by large dams, many aspects of hydrology show site-specific fidelity to dam operations, small dams (including diversions), and regional hydrologic regimes. A statistical modeling framework is presented to quantify and generalize hydrologic responses to varying degrees of dam regulation. Specifically, the objectives were to 1) compare the effects of local versus cumulative dam regulation, 2) determine the importance of different regional hydrologic regimes in influencing hydrologic responses to dams, and 3) evaluate how different regulation contexts lead to error in predicting hydrologic responses to dams. Overall, model performance was poor in quantifying the magnitude of hydrologic responses, but performance was sufficient in classifying hydrologic responses as negative or positive. Responses of some hydrologic indices to dam regulation were highly dependent upon hydrologic class membership and the purpose of the dam. The opposing coefficients between local and cumulative-dam predictors suggested that hydrologic responses to cumulative dam regulation are complex, and predicting the hydrology downstream of individual dams, as opposed to multiple dams, may be more easily accomplished using statistical approaches. Results also suggested that particular contexts, including multipurpose dams, high cumulative regulation by multiple dams, diversions, close proximity to dams, and certain hydrologic classes are all sources of increased error when predicting hydrologic responses to dams.
Statistical models, such as the ones presented herein, show promise in their ability to model the effects of dam regulation at large spatial scales and to generalize the directionality of hydrologic responses.

McManamay, Ryan A [ORNL]



Clathrin triskelia show evidence of molecular flexibility.  


The clathrin triskelion, which is a three-legged pinwheel-shaped heteropolymer, is a major component in the protein coats of certain post-Golgi and endocytic vesicles. At low pH, or at physiological pH in the presence of assembly proteins, triskelia will self-assemble to form a closed clathrin cage, or "basket". Recent static light scattering and dynamic light scattering studies of triskelia in solution showed that an individual triskelion has an intrinsic pucker similar to, but differing from, that inferred from a high resolution cryoEM structure of a triskelion in a clathrin basket. We extend the earlier solution studies by performing small-angle neutron scattering (SANS) experiments on isolated triskelia, allowing us to examine a higher q range than that probed by static light scattering. Results of the SANS measurements are consistent with the light scattering measurements, but show a shoulder in the scattering function at intermediate q values (0.016 Å⁻¹), just beyond the Guinier regime. This feature can be accounted for by Brownian dynamics simulations based on flexible bead-spring models of a triskelion, which generate time-averaged scattering functions. Calculated scattering profiles are in good agreement with the experimental SANS profiles when the persistence length of the assumed semiflexible triskelion is close to that previously estimated from the analysis of electron micrographs. PMID:18502808

Ferguson, Matthew L; Prasad, Kondury; Boukari, Hacene; Sackett, Dan L; Krueger, Susan; Lafer, Eileen M; Nossal, Ralph



Quantifying human vitamin kinetics using AMS  

SciTech Connect

Tracing vitamin kinetics at physiologic concentrations has been hampered by a lack of quantitative sensitivity for chemically equivalent tracers that could be used safely in healthy people. Instead, elderly or ill volunteers were sought for studies involving pharmacologic doses with radioisotopic labels. These studies fail to be relevant in two ways: vitamins are inherently micronutrients, whose biochemical paths are saturated and distorted by pharmacological doses; and while vitamins remain important for health in the elderly or ill, their greatest effects may be in preventing slow and cumulative diseases by proper consumption throughout youth and adulthood. Neither the target dose nor the target population is available for nutrient metabolic studies through decay counting of radioisotopes at high levels. Stable isotopic labels are quantified by isotope ratio mass spectrometry at levels that trace physiologic vitamin doses, but the natural background of stable isotopes severely limits the time span over which the tracer is distinguishable. Indeed, study periods seldom ranged over a single biological mean life of the labeled nutrients, failing to provide data on the important final elimination phase of the compound. Kinetic data for the absorption phase is similarly rare in micronutrient research because the phase is rapid, requiring many consecutive plasma samples for accurate representation. However, repeated blood samples of sufficient volume for precise stable or radio-isotope quantitations consume an indefensible amount of the volunteer's blood over a short period. Thus, vitamin pharmacokinetics in humans has often relied on compartmental modeling based upon assumptions and tested only for the short period of maximal blood circulation, a period that poorly reflects absorption or final elimination kinetics except for the most simple models.

Hillegonds, D; Dueker, S; Ognibene, T; Buchholz, B; Lin, Y; Vogel, J; Clifford, A



A new model for quantifying climate episodes  

NASA Astrophysics Data System (ADS)

When long records of climate (precipitation, temperature, stream runoff, etc.) are available, either from instrumental observations or from proxy records, the objective evaluation and comparison of climatic episodes becomes necessary. Such episodes can be quantified in terms of duration (the number of time intervals, e.g. years, the process remains continuously above or below a reference level) and magnitude (the sum of all series values for a given duration). The joint distribution of duration and magnitude is represented here by a stochastic model called BEG, for bivariate distribution with exponential and geometric marginals. The model is based on the theory of random sums, and its mathematical derivation confirms and extends previous empirical findings. Probability statements that can be obtained from the model are illustrated by applying it to a 2300-year dendroclimatic reconstruction of water-year precipitation for the eastern Sierra Nevada-western Great Basin. Using the Dust Bowl drought period as an example, the chance of a longer or greater drought is 8%. Conditional probabilities are much higher, i.e. a drought of that magnitude has a 62% chance of lasting for 11 years or longer, and a drought that lasts 11 years has a 46% chance of having an equal or greater magnitude. In addition, because of the bivariate model, we can estimate a 6% chance of witnessing a drought that is both longer and greater. Additional examples of model application are also provided. This type of information provides a way to place any climatic episode in a temporal perspective, and such numerical statements help with reaching science-based management and policy decisions.
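As a rough illustration of the kind of probability statement the BEG model supports, the sketch below evaluates only the two marginals it is built from: a geometric duration and an exponential magnitude. The parameter values are invented, and the joint (bivariate) dependence structure of the actual BEG model is not reproduced here.

```python
import math

def geometric_sf(n, p):
    """P(duration >= n years) for a geometric duration.

    p is the per-year probability that the episode ends, so the
    episode survives each year with probability 1 - p.
    """
    return (1.0 - p) ** (n - 1)

def exponential_sf(x, mean):
    """P(magnitude > x) for an exponential magnitude with given mean."""
    return math.exp(-x / mean)

# Illustrative numbers only: droughts ending each year with prob 0.25,
# magnitude (summed deficit) with mean 2.0 standardized units.
print(geometric_sf(11, 0.25))    # chance a drought lasts 11+ years
print(exponential_sf(5.0, 2.0))  # chance magnitude exceeds 5 units
```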

Biondi, Franco; Kozubowski, Tomasz J.; Panorska, Anna K.



Talent Show Notes from the Office  

E-print Network

Highlights Paintball Talent Show Notes from the Office Spring B Places of Origin Birthdays's Weekly. Talent Show ­ Tryouts We're so excited about the Talent Show! We have a long list of students, March 24. This is also the last day to sign up to be in the Talent Show. We also need a Master

Pilyugin, Sergei S.


Quantifying the behavior of stock correlations under market stress.  


Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios. PMID:23082242
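The headline statistic in this record, the average pairwise correlation among the 30 DJIA constituents, can be computed as below. The one-factor synthetic returns are illustrative only (not DJIA data); with equal factor and noise variances, each pairwise correlation is about 0.5 in expectation.

```python
import numpy as np

def mean_pairwise_correlation(returns):
    """Average off-diagonal correlation among columns of a returns matrix."""
    c = np.corrcoef(returns, rowvar=False)   # stocks are columns
    n = c.shape[0]
    off_diagonal = c[~np.eye(n, dtype=bool)]
    return float(off_diagonal.mean())

rng = np.random.default_rng(0)
# Synthetic daily returns: one common "market" factor plus idiosyncratic noise.
days, stocks = 250, 30
market = rng.normal(0, 0.01, size=(days, 1))
noise = rng.normal(0, 0.01, size=(days, stocks))
returns = market + noise
print(mean_pairwise_correlation(returns))   # close to 0.5 for these variances
```

The study's analysis then relates a rolling version of this quantity to normalized index returns to quantify how correlation scales with market stress.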

Preis, Tobias; Kenett, Dror Y; Stanley, H Eugene; Helbing, Dirk; Ben-Jacob, Eshel



Quantifying the Relationship Between Financial News and the Stock Market  

PubMed Central

The complex behavior of financial markets emerges from decisions made by many traders. Here, we exploit a large corpus of daily print issues of the Financial Times from 2nd January 2007 until 31st December 2012 to quantify the relationship between decisions taken in financial markets and developments in financial news. We find a positive correlation between the daily number of mentions of a company in the Financial Times and the daily transaction volume of a company's stock both on the day before the news is released, and on the same day as the news is released. Our results provide quantitative support for the suggestion that movements in financial markets and movements in financial news are intrinsically interlinked. PMID:24356666

Alanyali, Merve; Moat, Helen Susannah; Preis, Tobias





Empowering Women? The Oprah Winfrey Show  

Microsoft Academic Search

The Oprah Winfrey Show, the most-watched US daytime talk show, aims to empower women. This article examines the show's representations of gender and how images of `race', sexuality and class cross-cut them. It considers the show's status as television psychology. It explores the show's translation of aspects of black feminism to television, and discusses the social implications of its `super-real'

Corinne Squire



Quantifying Local Radiation-Induced Lung Damage From Computed Tomography  

SciTech Connect

Purpose: Optimal implementation of new radiotherapy techniques requires accurate predictive models for normal tissue complications. Since clinically used dose distributions are nonuniform, local tissue damage needs to be measured and related to local tissue dose. In lung, radiation-induced damage results in density changes that have been measured by computed tomography (CT) imaging noninvasively, but not yet on a localized scale. Therefore, the aim of the present study was to develop a method for quantification of local radiation-induced lung tissue damage using CT. Methods and Materials: CT images of the thorax were made 8 and 26 weeks after irradiation of 100%, 75%, 50%, and 25% lung volume of rats. Local lung tissue structure (S_L) was quantified from local mean and local standard deviation of the CT density in Hounsfield units in 1-mm³ subvolumes. The relation of changes in S_L (ΔS_L) to histologic changes and breathing rate was investigated. Feasibility for clinical application was tested by applying the method to CT images of a patient with non-small-cell lung carcinoma and investigating the local dose-effect relationship of ΔS_L. Results: In rats, a clear dose-response relationship of ΔS_L was observed at different time points after radiation. Furthermore, ΔS_L correlated strongly to histologic endpoints (infiltrates and inflammatory cells) and breathing rate. In the patient, progressive local dose-dependent increases in ΔS_L were observed. Conclusion: We developed a method to quantify local radiation-induced tissue damage in the lung using CT. This method can be used in the development of more accurate predictive models for normal tissue complications.
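The local structure measure in this record is derived from per-subvolume mean and standard deviation of CT density. The sketch below computes those two per-block statistics on a toy 2-D slice; the block size, Hounsfield values, and the final combination of the statistics into the structure measure are assumptions, since the abstract does not give the exact formula.

```python
import numpy as np

def local_stats(ct, block):
    """Per-block mean and standard deviation of CT density (HU).

    Splits a 2-D slice into non-overlapping `block` x `block` tiles,
    mimicking the per-subvolume statistics from which a local
    structure measure can be derived.
    """
    h, w = ct.shape
    h, w = h - h % block, w - w % block          # drop ragged edges
    tiles = ct[:h, :w].reshape(h // block, block, w // block, block)
    tiles = tiles.transpose(0, 2, 1, 3).reshape(-1, block * block)
    return tiles.mean(axis=1), tiles.std(axis=1)

# Toy slice: aerated lung (~ -800 HU), denser damaged tissue (~ -100 HU),
# and intermediate tissue (~ -500 HU).
ct = np.array([[-800., -800., -100., -100.],
               [-800., -800., -100., -100.],
               [-500., -500., -500., -500.],
               [-500., -500., -500., -500.]])
means, stds = local_stats(ct, block=2)
print(means)  # -> [-800. -100. -500. -500.]
print(stds)   # -> [0. 0. 0. 0.]
```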

Ghobadi, Ghazaleh; Hogeweg, Laurens E. [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Faber, Hette [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Department of Cell Biology, Section of Radiation and Stress Cell Biology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Tukker, Wim G.J. [Department of Radiology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Schippers, Jacobus M. [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Accelerator Department, Paul Scherrer Institut, Villigen (Switzerland); Brandenburg, Sytze [Kernfysisch Versneller Instituut, Groningen (Netherlands); Langendijk, Johannes A. [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Coppes, Robert P. [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Department of Cell Biology, Section of Radiation and Stress Cell Biology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Luijk, Peter van, E-mail: p.van.luijk@rt.umcg.n [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands)



Quantifying the Nonlinear, Anisotropic Material Response of Spinal Ligaments  

NASA Astrophysics Data System (ADS)

Spinal ligaments may be a significant source of chronic back pain, yet they are often disregarded by the clinical community due to a lack of information with regard to their material response and innervation characteristics. The purpose of this dissertation was to characterize the material response of spinal ligaments and to review their innervation characteristics. Review of relevant literature revealed that all of the major spinal ligaments are innervated. They cause painful sensations when irritated and provide reflexive control of the deep spinal musculature. As such, including the neurologic implications of iatrogenic ligament damage in the evaluation of surgical procedures aimed at relieving back pain will likely result in more effective long-term solutions. The material response of spinal ligaments has not previously been fully quantified due to limitations associated with standard soft tissue testing techniques. The present work presents and validates a novel testing methodology capable of overcoming these limitations. In particular, the anisotropic, inhomogeneous material constitutive properties of the human supraspinous ligament are quantified and methods for determining the response of the other spinal ligaments are presented. In addition, a method for determining the anisotropic, inhomogeneous pre-strain distribution of the spinal ligaments is presented. The multi-axial pre-strain distributions of the human anterior longitudinal ligament, ligamentum flavum and supraspinous ligament were determined using this methodology. Results from this work clearly demonstrate that spinal ligaments are not uniaxial structures, and that finite element models which account for pre-strain and incorporate the ligaments' complex material properties may provide increased fidelity to the in vivo condition.

Robertson, Daniel J.


Quantifying covalency and metallicity in correlated compounds undergoing metal-insulator transitions  

NASA Astrophysics Data System (ADS)

The tunability of bonding character in transition-metal compounds controls phase transitions and their fascinating properties such as high-temperature superconductivity, colossal magnetoresistance, spin-charge ordering, etc. However, separating out and quantifying the roles of covalency and metallicity derived from the same set of transition-metal d and ligand p electrons remains a fundamental challenge. In this study, we use bulk-sensitive photoelectron spectroscopy and configuration-interaction calculations for quantifying the covalency and metallicity in correlated compounds. The method is applied to study the first-order temperature- (T-) dependent metal-insulator transitions (MITs) in the cubic pyrochlore ruthenates Tl2Ru2O7 and Hg2Ru2O7. Core-level spectroscopy shows drastic T-dependent modifications which are well explained by including ligand-screening and metallic-screening channels. The core-level metallic-origin features get quenched upon gap formation in valence band spectra, while ionic and covalent components remain intact across the MIT. The results establish temperature-driven Mott-Hubbard MITs in three-dimensional ruthenates and reveal three energy scales: (a) 4d electronic changes occur on the largest (eV) energy scale, (b) the band-gap energies/charge gaps (E_g ≈ 160-200 meV) are intermediate, and (c) the lowest-energy scale corresponds to the transition temperature T_MIT (≈10 meV), which is also the spin gap energy of Tl2Ru2O7 and the magnetic-ordering temperature of Hg2Ru2O7. The method is general for doping- and T-induced transitions and is valid for V2O3, CrN, La1-xSrxMnO3, La2-xSrxCuO4, etc. The obtained transition-metal-ligand (d-p) bonding energies (V ≈ 45-90 kcal/mol) are consistent with thermochemical data, and with energies of typical heteronuclear covalent bonds such as C-H, C-O, C-N, etc.
In contrast, the metallic-screening energies of correlated compounds form a weaker class (V* ≈ 10-40 kcal/mol) but are still stronger than van der Waals and hydrogen bonding. The results identify and quantify the roles of covalency and metallicity in 3d and 4d correlated compounds undergoing metal-insulator transitions.

Chainani, Ashish; Yamamoto, Ayako; Matsunami, Masaharu; Eguchi, Ritsuko; Taguchi, Munetaka; Takata, Yasutaka; Takagi, Hidenori; Shin, Shik; Nishino, Yoshinori; Yabashi, Makina; Tamasaku, Kenji; Ishikawa, Tetsuya



Quantifying the impacts of global disasters  

NASA Astrophysics Data System (ADS)

The US Geological Survey, National Oceanic and Atmospheric Administration, California Geological Survey, and other entities are developing a Tsunami Scenario, depicting a realistic outcome of a hypothetical but plausible large tsunami originating in the eastern Aleutian Arc, affecting the west coast of the United States, including Alaska and Hawaii. The scenario includes earth-science effects, damage and restoration of the built environment, and social and economic impacts. Like the earlier ShakeOut and ARkStorm disaster scenarios, the purpose of the Tsunami Scenario is to apply science to quantify the impacts of natural disasters in a way that can be used by decision makers in the affected sectors to reduce the potential for loss. Most natural disasters are local. A major hurricane can destroy a city or damage a long swath of coastline while mostly sparing inland areas. The largest earthquake on record caused strong shaking along 1500 km of Chile, but left the capital relatively unscathed. Previous scenarios have used the local nature of disasters to focus interaction with the user community. However, the capacity for global disasters is growing with the interdependency of the global economy. Earthquakes have disrupted global computer chip manufacturing and caused stock market downturns. Tsunamis, however, can be global in their extent and direct impact. Moreover, the vulnerability of seaports to tsunami damage can increase the global consequences. The Tsunami Scenario is trying to capture the widespread effects while maintaining the close interaction with users that has been one of the most successful features of the previous scenarios. The scenario tsunami occurs in the eastern Aleutians with a source similar to the 2011 Tohoku event. Geologic similarities support the argument that a Tohoku-like source is plausible in Alaska. 
It creates a major nearfield tsunami in the Aleutian arc and peninsula, a moderate tsunami in the US Pacific Northwest, large but not the maximum in Hawaii, and the largest plausible tsunami in southern California. To support the analysis of global impacts, we begin with the Ports of Los Angeles and Long Beach which account for >40% of the imports to the United States. We expand from there throughout California for the first level economic analysis. We are looking to work with Alaska and Hawaii, especially on similar economic issues in ports, over the next year and to expand the analysis to consideration of economic interactions between the regions.

Jones, L. M.; Ross, S.; Wilson, R. I.; Borrero, J. C.; Brosnan, D.; Bwarie, J. T.; Geist, E. L.; Hansen, R. A.; Johnson, L. A.; Kirby, S. H.; Long, K.; Lynett, P. J.; Miller, K. M.; Mortensen, C. E.; Perry, S. C.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Thio, H. K.; Wein, A. M.; Whitmore, P.; Wood, N. J.



Quantifying Permafrost Characteristics with DCR-ERT  

NASA Astrophysics Data System (ADS)

Geophysical methods are an efficient method for quantifying permafrost characteristics for Arctic road design and engineering. In the Alaskan Arctic construction and maintenance of roads requires integration of permafrost; ground that is below 0 degrees C for two or more years. Features such as ice content and temperature are critical for understanding current and future ground conditions for planning, design and evaluation of engineering applications. This study focused on the proposed Foothills West Transportation Access project corridor where the purpose is to construct a new all-season road connecting the Dalton Highway to Umiat. Four major areas were chosen that represented a range of conditions including gravel bars, alluvial plains, tussock tundra (both unburned and burned conditions), high and low centered ice-wedge polygons and an active thermokarst feature. Direct-current resistivity using galvanic contact (DCR-ERT) was applied over transects. In conjunction, complementary site data were obtained, including boreholes, active layer depths, vegetation descriptions and site photographs. The boreholes provided information on soil morphology, ice texture and gravimetric moisture content. Horizontal and vertical resolutions in the DCR-ERT were varied to determine the presence or absence of ground ice; subsurface heterogeneity; and the depth to groundwater (if present). The four main DCR-ERT methods used were: 84 electrodes with 2 m spacing; 42 electrodes with 0.5 m spacing; 42 electrodes with 2 m spacing; and 84 electrodes with 1 m spacing. In terms of identifying the ground ice characteristics the higher horizontal resolution DCR-ERT transects with either 42 or 84 electrodes and 0.5 or 1 m spacing were best able to differentiate wedge-ice. This evaluation is based on a combination of both borehole stratigraphy and surface characteristics. Simulated apparent resistivity values for permafrost areas varied from a low of 4582 Ω·m to a high of 10034 Ω·m.
Previous studies found permafrost conditions with corresponding resistivity values as low as 5000 Ω·m. This work emphasizes the necessity of tailoring the DCR-ERT survey to verified ground ice characteristics.

Schnabel, W.; Trochim, E.; Munk, J.; Kanevskiy, M. Z.; Shur, Y.; Fortier, R.



Quantifying tetrodotoxin levels in the California newt using a non-destructive sampling method  

E-print Network

Quantifying tetrodotoxin levels in the California newt using a non-destructive sampling method Gary processes. The neurotoxin, tetrodotoxin (TTX), which is found in newts of the genus Taricha, acts individuals. We also show that embryos from oviposited California newt (Taricha torosa) egg masses can

Shaffer, H. Bradley


gene encoding enhanced green fluorescent protein to the repressor gene, and quantify  

E-print Network

gene encoding enhanced green fluorescent protein to the repressor gene, and quantify of gene expression in the feedback network, compared with the control networks. They also show concentrations of anhydrotetracycline, a chemical inhibitor of TetR. In past theoretical studies of gene

Weeks, Eric R.


FOR SUBMISSION 1 Quantifying the degree of self-nestedness of trees.  

E-print Network

FOR SUBMISSION 1 Quantifying the degree of self-nestedness of trees. Application to the structural in the problem of approximating trees by trees with a particular self-nested structure. Self-nested trees are such that all their subtrees of a given height are isomorphic. We show that these trees present remarkable

Paris-Sud XI, Université de


Toward quantifying uncertainty in travel time tomography using the null-space shuttle  

E-print Network

Toward quantifying uncertainty in travel time tomography using the null-space shuttle R. W. L. de in travel time tomography using the null-space shuttle, J. Geophys. Res., 117, B03301, doi:10.1029/2011JB the null-space of the forward operator. We show that with the null-space shuttle it is possible to assess

Utrecht, Universiteit


New Hampshire Guide 4-H Dog Shows  

E-print Network

New Hampshire Guide to 4-H Dog Shows UNH Cooperative Extension 4-H Youth Development Moiles House cooperating. NH Guide to 4-H Dog Shows Table of Contents INTRODUCTION Purpose of the 4-H Dog Project

New Hampshire, University of


Quantifying the biodiversity value of tropical primary, secondary, and plantation forests  

PubMed Central

Biodiversity loss from deforestation may be partly offset by the expansion of secondary forests and plantation forestry in the tropics. However, our current knowledge of the value of these habitats for biodiversity conservation is limited to very few taxa, and many studies are severely confounded by methodological shortcomings. We examined the conservation value of tropical primary, secondary, and plantation forests for 15 taxonomic groups using a robust and replicated sample design that minimized edge effects. Different taxa varied markedly in their response to patterns of land use in terms of species richness and the percentage of species restricted to primary forest (varying from 5% to 57%), yet almost all between-forest comparisons showed marked differences in community structure and composition. Cross-taxon congruence in response patterns was very weak when evaluated using abundance or species richness data, but much stronger when using metrics based upon community similarity. Our results show that, whereas the biodiversity indicator group concept may hold some validity for several taxa that are frequently sampled (such as birds and fruit-feeding butterflies), it fails for those exhibiting highly idiosyncratic responses to tropical land-use change (including highly vagile species groups such as bats and orchid bees), highlighting the problems associated with quantifying the biodiversity value of anthropogenic habitats. Finally, although we show that areas of native regeneration and exotic tree plantations can provide complementary conservation services, we also provide clear empirical evidence demonstrating the irreplaceable value of primary forests. PMID:18003934
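The cross-taxon comparisons in this record lean on community-similarity metrics. One simple example of such a metric is the Jaccard index on presence/absence data, sketched below; the species sets are invented, and the study's actual analysis is not limited to this one metric.

```python
def jaccard_similarity(a, b):
    """Jaccard similarity between two species sets (presence/absence).

    Returns shared species / total species across both communities,
    ranging from 0 (no overlap) to 1 (identical composition).
    """
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0          # two empty communities are trivially identical
    return len(a & b) / len(a | b)

# Hypothetical species lists for three forest types.
primary = {"sp1", "sp2", "sp3", "sp4"}
secondary = {"sp2", "sp3", "sp5"}
plantation = {"sp3", "sp6"}
print(jaccard_similarity(primary, secondary))   # 2 shared / 5 total = 0.4
print(jaccard_similarity(primary, plantation))  # 1 shared / 5 total = 0.2
```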

Barlow, J.; Gardner, T. A.; Araujo, I. S.; Ávila-Pires, T. C.; Bonaldo, A. B.; Costa, J. E.; Esposito, M. C.; Ferreira, L. V.; Hawes, J.; Hernandez, M. I. M.; Hoogmoed, M. S.; Leite, R. N.; Lo-Man-Hung, N. F.; Malcolm, J. R.; Martins, M. B.; Mestre, L. A. M.; Miranda-Santos, R.; Nunes-Gutjahr, A. L.; Overal, W. L.; Parry, L.; Peters, S. L.; Ribeiro-Junior, M. A.; da Silva, M. N. F.; da Silva Motta, C.; Peres, C. A.



Inside Gun Shows What Goes On  

E-print Network

Inside Gun Shows: What Goes On When Everybody Thinks Nobody's Watching, Epilogue. In February 2010, I attended a Crossroads of the West gun show at the Arizona State Fairgrounds. Here, an update on each of the Phoenix observations made in the photo-essay portion of Inside Gun

Leistikow, Bruce N.


Inside Gun Shows What Goes On  

E-print Network

Inside Gun Shows: What Goes On When Everybody Thinks Nobody's Watching, Executive Summary. Garen Wintemute, MD, MPH. Gun shows are surrounded by controversy. On the one hand, they are important economic

Nguyen, Danh


Inside Gun Shows What Goes On  

E-print Network

Preface. Inside Gun Shows: What Goes On When Everybody Thinks Nobody's Watching. Contents: Gun Shows Work; Buying and Selling; What's for Sale; Culture; Politics; Interventions. ... She put gun shows on my radar and is an ace straw-purchase spotter. Thanks also to Barbara Claire

Leistikow, Bruce N.


Quantifying Spatial Variability of Selected Soil Trace Elements and Their Scaling Relationships Using Multifractal Techniques  

PubMed Central

Multifractal techniques were utilized to quantify the spatial variability of selected soil trace elements and their scaling relationships in a 10.24-ha agricultural field in northeast China. 1024 soil samples were collected from the field and available Fe, Mn, Cu and Zn were measured in each sample. Descriptive results showed that Mn deficiencies were widespread throughout the field while Fe and Zn deficiencies tended to occur in patches. By estimating single multifractal spectra, we found that available Fe, Cu and Zn in the study soils exhibited high spatial variability and the existence of anomalies ([α(q)max - α(q)min] ≥ 0.54), whereas available Mn had a relatively uniform distribution ([α(q)max - α(q)min] ≤ 0.10). The joint multifractal spectra revealed that the strong positive relationships (r ≥ 0.86, P < 0.001) among available Fe, Cu and Zn were all valid across a wider range of scales and over the full range of data values, whereas available Mn was weakly related to available Fe and Zn (r ≤ 0.18, P < 0.01) but not related to available Cu (r = -0.03, P = 0.40). These results show that the variability and singularities of selected soil trace elements as well as their scaling relationships can be characterized by single and joint multifractal parameters. The findings presented in this study could be extended to predict selected soil trace elements at larger regional scales with the aid of geographic information systems. PMID:23874944

Zhang, Fasheng; Yin, Guanghua; Wang, Zhenying; McLaughlin, Neil; Geng, Xiaoyuan; Liu, Zuoxin



Plant species descriptions show signs of disease.  

PubMed Central

It is well known that diseases can greatly influence the morphology of plants, but often the incidence of disease is either too rare or the symptoms too obvious for the 'abnormalities' to cause confusion in systematics. However, we have recently come across several misinterpretations of disease-induced traits that may have been perpetuated into modern species inventories. Anther-smut disease (caused by the fungus Microbotryum violaceum) is common in many members of the Caryophyllaceae and related plant families. This disease causes anthers of infected plants to be filled with dark-violet fungal spores rather than pollen. Otherwise, their vegetative morphology is within the normal range of healthy plants. Here, we present the results of a herbarium survey showing that a number of type specimens (on which the species name and original description are based) in the genus Silene from Asia are diseased with anther smut. The primary visible disease symptom, namely the dark-violet anthers, is incorporated into the original species descriptions and some of these descriptions have persisted unchanged into modern floras. This raises the question of whether diseased type specimens have erroneously been given unique species names. PMID:14667368

Hood, Michael E; Antonovics, Janis



A new time quantifiable Monte Carlo method in simulating magnetization reversal process  

E-print Network

We propose a new time quantifiable Monte Carlo (MC) method to simulate the thermally induced magnetization reversal for an isolated single domain particle system. The MC method involves the determination of density of states, and the use of Master equation for time evolution. We derive an analytical factor to convert MC steps into real time intervals. Unlike a previous time quantified MC method, our method is readily scalable to arbitrarily long time scales, and can be repeated for different temperatures with minimal computational effort. Based on the conversion factor, we are able to make a direct comparison between the results obtained from MC and Langevin dynamics methods, and find excellent agreement between them. An analytical formula for the magnetization reversal time is also derived, which agrees very well with both numerical Langevin and time-quantified MC results, over a large temperature range and for parallel and oblique easy axis orientations.

X. Z. Cheng; M. B. A. Jalil; H. K. Lee; Y. Okabe



Shortcuts to Quantifier Interpretation in Children and Adults  

ERIC Educational Resources Information Center

Errors involving universal quantification are common in contexts depicting sets of individuals in partial, one-to-one correspondence. In this article, we explore whether quantifier-spreading errors are more common with distributive quantifiers each and every than with all. In Experiments 1 and 2, 96 children (5- to 9-year-olds) viewed pairs of

Brooks, Patricia J.; Sekerina, Irina



Children's Knowledge of the Quantifier "Dou" in Mandarin Chinese  

ERIC Educational Resources Information Center

The quantifier "dou" (roughly corresponding to English "all") in Mandarin Chinese has been the topic of much discussion in the theoretical literature. This study investigated children's knowledge of this quantifier using a new methodological technique, which we dubbed the Question-Statement Task. Three questions were addressed: (i) whether young

Zhou, Peng; Crain, Stephen



Quantifier elimination for real closed fields by cylindrical algebraic decomposition  

Microsoft Academic Search

Tarski in 1948 [18] published a quantifier elimination method for the elementary theory of real closed fields (which he had discovered in 1930). As noted by Tarski, any quantifier elimination method for this theory also provides a decision method, which enables one to decide whether any sentence of the theory is true or false. Since many important and difficult mathematical

George E. Collins



Quantifying Offshore Wind Resources from Satellite Wind Maps  

E-print Network

Quantifying Offshore Wind Resources from Satellite Wind Maps: Study Area the North Sea. ... National Laboratory, Roskilde, Denmark. Offshore wind resources are quantified from satellite synthetic aperture radar ... the spatial extent of the wake behind large offshore wind farms. Copyright © 2006 John Wiley & Sons, Ltd

Pryor, Sara C.


Resolving Quantifier and Number Restriction to Question OWL Ontologies  

Microsoft Academic Search

This paper describes an approach to resolve quantifiers and number restrictions in natural language questions to query ontologies. Incorporating this feature enables natural language query interfaces to capture a wider range of user queries. To deal with quantifiers and number restrictions, we analyzed a corpus of such questions and derived constraints at the syntactic level to recognize and parse them.

Shamima Mithun; Leila Kosseim; Volker Haarslev



Visual Attention and Quantifier-Spreading in Heritage Russian Bilinguals  

ERIC Educational Resources Information Center

It is well established in language acquisition research that monolingual children and adult second language learners misinterpret sentences with the universal quantifier "every" and make quantifier-spreading errors that are attributed to a preference for a match in number between two sets of objects. The present Visual World eye-tracking

Sekerina, Irina A.; Sauermann, Antje



Edinburgh Research Explorer Detecting and Quantifying Topography in Neural Maps  

E-print Network

Citation for published version: Yarrow, S., Seitz, A. R., Seriès, P. & Razak, K. 2014, 'Detecting and Quantifying Topography in Neural Maps'. Date: 14 Jun. 2014. Detecting and Quantifying Topography in Neural Maps. Stuart Yarrow, Khaleel A. ... at the scale of individual binaural clusters. Citation: Yarrow S, Razak KA, Seitz AR, Seriès P (2014

Millar, Andrew J.


FixBag : A Fixpoint Calculator for Quantified Bag Constraints  

E-print Network

FixBag: A Fixpoint Calculator for Quantified Bag Constraints. Tuan-Hung Pham, Minh-Thai Trinh ... of programs, we have developed a tool to compute symbolic fixpoints for the quantified bag domain. This domain ... and method pre/post conditions via fixpoint analysis of recursive bag constraints. To support better

Minnesota, University of


FixBag : A Fixpoint Calculator for Quantified Bag Constraints  

E-print Network

FixBag: A Fixpoint Calculator for Quantified Bag Constraints. Tuan-Hung Pham, Minh-Thai Trinh ... range of programs, we have developed a tool to compute symbolic fixpoints for the quantified bag domain ... invariants and method pre/post conditions via fixpoint analysis of recursive bag constraints. To support

Chin, Wei Ngan


Quantifier elimination for formulas constrained by quadratic equations  

Microsoft Academic Search

An algorithm is given for constructing a quantifier free formula (a boolean expression of polynomial equations and inequalities) equivalent to a given formula of the form: (∃z ∈ R)[a2z² + a1z + a0 = 0 ∧ F], where F is a quantifier free formula in z1, ..., zn, z, and a2, a1, a0 are polynomials in z

Hoon Hong



Quantifying post-fire recovery of forest canopy structure and its environmental drivers using satellite image time-series  

NASA Astrophysics Data System (ADS)

Fire is a recurring disturbance in most of Australia's forests. Depending on fire severity, impacts on forest canopies vary from light scorching to complete defoliation, with related variation in the magnitude and duration of post-fire gas exchange by that canopy. Estimates of fire impacts on forest canopy structure and carbon uptake for south-eastern Australia's forests do not exist. Here, we use 8-day composite measurements of the fraction of Absorbed Photosynthetically Active radiation (FPAR) as recorded by the Moderate-resolution Imaging Spectroradiometer (MODIS) to characterise forest canopies before and after fire and to compare burnt and unburnt sites. FPAR is a key biophysical canopy variable and primary input for estimating Gross Primary Productivity (GPP). Post-fire FPAR loss was quantified for all forest areas burnt between 2001 and 2010, showing good agreement with independent assessments of fire severity patterns of 2009 Black Saturday fires. A new method was developed to determine the duration of post-fire recovery from MODIS-FPAR time-series. The method involves a spatial-mode principal component analysis on full FPAR time series followed by a K-means clustering to group pixels based on similarity in temporal patterns. Using fire history data, time series of FPAR for burnt and unburnt pixels in each cluster were then compared to quantify the duration of the post-fire recovery period, which ranged from less than 1 to 8 years. The results show that time series of MODIS FPAR are well suited to detect and quantify disturbances of forest canopy structure and function in large areas of highly variable climate and phenology. Finally, the role of post-fire climate conditions and previous fire history on the duration of the post-fire recovery of the forest canopy was examined using generalized additive models.
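The recovery-clustering step described above, a spatial-mode principal component analysis over the FPAR time series followed by K-means grouping of pixels with similar temporal patterns, can be sketched as follows. This is a minimal numpy-only illustration on synthetic data; the array sizes, the fire-response ramp, and the deterministic k-means initialization are assumptions for illustration, not the study's actual processing chain.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for an 8-day FPAR stack: 200 pixels x 230 composites.
# Pixels 100-199 "burn" at step 60 and recover linearly (an assumed response).
n_pix, n_t = 200, 230
t = np.arange(n_t)
seasonal = 0.6 + 0.2 * np.sin(2 * np.pi * t / 46.0)
fpar = np.tile(seasonal, (n_pix, 1)) + 0.02 * rng.standard_normal((n_pix, n_t))
fpar[100:, 60:] *= np.linspace(0.4, 1.0, n_t - 60)

# Spatial-mode PCA: pixels are samples, time steps are variables.
X = fpar - fpar.mean(axis=0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt[:5].T  # keep the 5 leading components

def kmeans(X, k, n_iter=50):
    """Minimal Lloyd's k-means with a deterministic spread-out initialization."""
    order = np.argsort(X[:, 0])
    centers = X[order[np.linspace(0, len(X) - 1, k).astype(int)]]
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels

labels = kmeans(scores, k=2)
# Pixels sharing a temporal pattern (burnt vs. unburnt here) fall into the same
# cluster; comparing each cluster's mean FPAR curve with fire-history data
# then yields the post-fire recovery duration.
```

Comparing the mean FPAR trajectory of burnt and unburnt pixels within each cluster is what then yields the recovery duration per cluster.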

Khanal, Shiva; Duursma, Remko; Boer, Matthias



Quantifying Digit Force Vector Coordination during Precision Pinch  

PubMed Central

A methodology was established to investigate the contact mechanics of the thumb and the index finger at the digit-object interface during precision pinch. Two force/torque transducers were incorporated into an apparatus designed to overcome the thickness of each transducer and provide a flexible pinch span for digit placement and force application. To demonstrate the utility of the device, five subjects completed a pinch task with the pulps of their thumb and index finger. Inter-digit force vector coordination was quantified by examining the 1) force vector component magnitudes, 2) resultant force vector magnitudes, 3) coordination angle, the angle formed by the resultant vectors of each digit, 4) direction angles, the angles formed by each vector and the coordinate axes, and 5) center of pressure locations. It was shown that the resultant force magnitude of the index finger exceeded that of the thumb by 0.8 ± 0.3 N and that the coordination angle between the digit resultant force vectors was 160.2 ± 4.6°. The experimental apparatus and analysis methods provide a valuable tool for the quantitative examination of biomechanics and motor control during dexterous manipulation. PMID:24443624
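The inter-digit metrics listed above reduce to standard vector arithmetic. A minimal numpy sketch, with hypothetical force readings in newtons rather than the paper's data or sign conventions:

```python
import numpy as np

def resultant_magnitude(f):
    """Resultant magnitude of a 3-D digit force vector [Fx, Fy, Fz], in newtons."""
    return float(np.linalg.norm(f))

def coordination_angle(f_thumb, f_index):
    """Angle (degrees) between the two digits' resultant force vectors.
    Direct thumb-index opposition gives 180 degrees; smaller values mean
    the digits deviate from opposition."""
    a, b = np.asarray(f_thumb, float), np.asarray(f_index, float)
    cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

def direction_angles(f):
    """Angles (degrees) between a force vector and the x, y, z coordinate axes."""
    f = np.asarray(f, float)
    return np.degrees(np.arccos(f / np.linalg.norm(f)))

# Hypothetical readings: index finger presses slightly harder than the thumb
thumb = [0.5, 0.2, -4.0]
index = [-0.3, 0.1, 4.6]
print(resultant_magnitude(index) - resultant_magnitude(thumb))  # ~0.57 N
print(coordination_angle(thumb, index))                         # ~175 degrees
```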

Marquardt, Tamara L.; Li, Zong-Ming



A method to quantify transesterification activities of lipases in litters.  


Lipases are glycerol ester hydrolases (EC produced by a wide range of microorganisms. They catalyse the hydrolysis of different esters, but this reaction is reversible, depending on the water content of the reaction medium, via esterification and transesterification. The synthetic activity of lipases can be of major importance in natural ecosystems since it can be involved in carbon storage in soils or litters. Here, the detection of transesterification activities of lipases in litter is reported for the first time. We used two different litters: litter of Quercus pubescens (QP) and litter of both Q. pubescens and Q. ilex. Different p-nitrophenyl esters and pentanol were used to test transesterification in a reaction medium with an organic solvent (heptane). We showed that these activities were proportional to the amount of litter, the incubation time and the substrate concentration, and that they increased with temperature. Furthermore, the lipases from the litters studied were very thermostable since they were still active after 2 h at 70 degrees C. These activities showed common properties of lipases: the highest activities were obtained with a medium acyl-chain substrate, p-nitrophenyl caprylate, and transesterification activities were correlated to water activity, a(w). The following parameters are recommended to quantify transesterification activities in litter: 10 mM of p-nitrophenyl caprylate, 1 g of litter, 500 microL of pentanol, q.s.p. 4 mL of heptane, incubated at 30 degrees C for 2 h. PMID:19426767

Goujard, L; Ferre, E; Gil, G; Ruaudel, F; Farnet, A M



Quantifying the Spatial Dimension of Dengue Virus Epidemic Spread within a Tropical Urban Environment  

PubMed Central

Background Dengue infection spread in naive populations occurs in an explosive and widespread fashion primarily due to the absence of population herd immunity, the population dynamics and dispersal of Ae. aegypti, and the movement of individuals within the urban space. Knowledge on the relative contribution of such factors to the spatial dimension of dengue virus spread has been limited. In the present study we analyzed the spatio-temporal pattern of a large dengue virus-2 (DENV-2) outbreak that affected the Australian city of Cairns (north Queensland) in 2003, quantified the relationship between dengue transmission and distance to the epidemic's index case (IC), evaluated the effects of indoor residual spraying (IRS) on the odds of dengue infection, and generated recommendations for city-wide dengue surveillance and control. Methods and Findings We retrospectively analyzed data from 383 DENV-2 confirmed cases and 1,163 IRS applications performed during the 25-week epidemic period. Spatial (local k-function, angular wavelets) and space-time (Knox test) analyses quantified the intensity and directionality of clustering of dengue cases, whereas a semi-parametric Bayesian space-time regression assessed the impact of IRS and spatial autocorrelation on the odds of weekly dengue infection. About 63% of the cases clustered up to 800 m around the IC's house. Most cases were distributed in the NW-SE axis as a consequence of the spatial arrangement of blocks within the city and, possibly, the prevailing winds. Space-time analysis showed that DENV-2 infection spread rapidly, generating 18 clusters (comprising 65% of all cases), and that these clusters varied in extent as a function of their distance to the IC's residence. IRS applications had a significant protective effect against the further occurrence of dengue cases, but only when they reached coverage of 60% or more of the neighboring premises of a house. 
Conclusion By applying sound statistical analysis to a very detailed dataset from one of the largest outbreaks that affected the city of Cairns in recent times, we not only described the spread of dengue virus with high detail but also quantified the spatio-temporal dimension of dengue virus transmission within this complex urban environment. In areas susceptible to non-periodic dengue epidemics, effective disease prevention and control would depend on the prompt response to introduced cases. We foresee that some of the results and recommendations derived from our study may also be applicable to other areas currently affected or potentially subject to dengue epidemics. PMID:21200419
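The Knox space-time analysis named in the Methods can be sketched as a permutation test: count case pairs that are close in both space and time, then compare that count with its distribution when onset times are randomly re-assigned. The distance and time thresholds, coordinates, and case counts below are illustrative assumptions, not the study's data.

```python
import numpy as np

def knox_test(xy, onset_week, ds=800.0, dt=2, n_perm=199, seed=0):
    """Knox space-time interaction test, permutation version.

    Counts case pairs that are both spatially close (< ds metres) and
    temporally close (<= dt weeks apart), then compares the observed count
    with its distribution under random shuffling of onset times.
    """
    xy = np.asarray(xy, float)
    t = np.asarray(onset_week)
    d_space = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    iu = np.triu_indices(len(t), k=1)      # each unordered pair once
    close_space = d_space[iu] < ds

    def stat(tt):
        return int(np.sum(close_space & (np.abs(tt[:, None] - tt[None, :])[iu] <= dt)))

    observed = stat(t)
    rng = np.random.default_rng(seed)
    perms = [stat(rng.permutation(t)) for _ in range(n_perm)]
    p = (1 + sum(s >= observed for s in perms)) / (n_perm + 1)
    return observed, p

# Toy outbreak: 20 cases clustered near an index case in space and time,
# 20 background cases scattered over the city and the 25-week period
rng = np.random.default_rng(1)
cluster_xy = rng.normal(0.0, 100.0, (20, 2))   # metres around the index case
cluster_wk = rng.integers(1, 4, 20)            # onset in weeks 1-3
backgd_xy = rng.uniform(-5000, 5000, (20, 2))
backgd_wk = rng.integers(1, 26, 20)
obs, p = knox_test(np.vstack([cluster_xy, backgd_xy]),
                   np.concatenate([cluster_wk, backgd_wk]))
# A genuine space-time cluster drives the observed pair count far above
# the permutation distribution, giving a small p-value.
```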

Vazquez-Prokopec, Gonzalo M.; Kitron, Uriel; Montgomery, Brian; Horne, Peter; Ritchie, Scott A.



Quantifying Qualitative Data Using Cognitive Maps  

ERIC Educational Resources Information Center

The aim of the article is to show how substantial qualitative material consisting of graphic cognitive maps can be analysed by using digital CmapTools, Excel and SPSS. Evidence is provided of how qualitative and quantitative methods can be combined in educational research by transforming qualitative data into quantitative data to facilitate

Scherp, Hans-Åke



Quantifying Phycocyanin Concentration in Cyanobacterial Algal Blooms from Remote Sensing Reflectance-A Quasi Analytical Approach  

NASA Astrophysics Data System (ADS)

Cyanobacterial harmful algal blooms (CHAB) are notorious for depleting dissolved oxygen levels, producing various toxins, causing threats to aquatic life, and altering the food-web dynamics and the overall ecosystem functioning in inland lakes, estuaries, and coastal waters. Most of these algal blooms produce toxins that can damage cells and tissues and even cause mortality of living organisms. Frequent monitoring of water quality at a synoptic scale has been made possible by remote sensing techniques. In this research, we present a novel technique to monitor CHAB using remote sensing reflectance products. We have modified a multi-band quasi-analytical algorithm that determines phytoplankton absorption coefficients from above-surface remote sensing reflectance measurements using an inversion method. In situ hyperspectral remote sensing reflectance data were collected from several highly turbid and productive aquaculture ponds. A novel technique was developed to further decompose the phytoplankton absorption coefficients at 620 nm and obtain the phycocyanin absorption coefficient at the same wavelength. An empirical relationship was established between phycocyanin absorption coefficients at 620 nm and measured phycocyanin concentrations. Model calibration showed a strong relationship between phycocyanin absorption coefficients and phycocyanin pigment concentration (r² = 0.94). Validation of the model on a separate dataset produced a root mean squared error of 167 mg m-3 (phycocyanin range: 26-1012 mg m-3). Results demonstrate that the new approach will be suitable for quantifying phycocyanin concentration in cyanobacteria-dominated turbid productive waters. The band architecture of the model matches the band configuration of the Medium Resolution Imaging Spectrometer (MERIS) and assures that MERIS reflectance products can be used to quantify phycocyanin in cyanobacterial harmful algal blooms in optically complex waters.
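The final calibration and validation steps, an empirical relationship between the 620 nm phycocyanin absorption coefficient and measured phycocyanin plus an RMSE check, can be sketched as below. The linear model form and all numbers are assumptions for illustration; the abstract specifies only an empirical relationship with r² = 0.94 and the reported RMSE.

```python
import numpy as np

def fit_pc_calibration(apc620, pc):
    """Least-squares linear calibration PC = b0 + b1 * a_pc(620).
    The linear form is an assumption; the study reports only an empirical
    relationship between the two quantities."""
    b1, b0 = np.polyfit(apc620, pc, 1)
    return b0, b1

def rmse(pred, obs):
    """Root mean squared error between predictions and observations."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

# Synthetic calibration set: absorption (m^-1) vs. phycocyanin (mg m^-3)
apc = np.linspace(0.05, 2.0, 30)
pc = 500.0 * apc + 10.0 + np.random.default_rng(2).normal(0.0, 20.0, 30)
b0, b1 = fit_pc_calibration(apc, pc)
print(rmse(b0 + b1 * apc, pc))  # in-sample RMSE, mg m^-3
```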

Mishra, S.; Mishra, D. R.; Tucker, C.



Quantifying metal ions binding onto dissolved organic matter using log-transformed absorbance spectra.  


This study introduces the concept of consistent examination of changes of log-transformed absorbance spectra of dissolved organic matter (DOM) at incrementally increasing concentrations of heavy metal cations such as copper, cadmium, and aluminum at environmentally relevant concentrations. The approach is designed to highlight contributions of low-intensity absorbance features that appear to be especially sensitive to DOM reactions. In accord with this approach, log-transformed absorbance spectra of fractions of DOM from the Suwannee River were acquired at varying pHs and concentrations of copper, cadmium, and aluminum. These log-transformed spectra were processed using the differential approach and used to examine the nature of the observed changes of DOM absorbance and correlate them with the extent of Me-DOM complexation. Two alternative parameters, namely the change of the spectral slope in the range of wavelengths 325-375 nm (ΔSlope325-375) and the differential logarithm of DOM absorbance at 350 nm (ΔLnA350), were introduced to quantify Cu(II), Cd(II), and Al(III) binding onto DOM. The ΔLnA350 and ΔSlope325-375 datasets were compared with the amount of DOM-bound Cu(II), Cd(II), and Al(III) estimated based on NICA-Donnan model calculations. This examination showed that the ΔLnA350 and ΔSlope325-375 values acquired at various pH values, metal ion concentrations, and DOM types were strongly and unambiguously correlated with the concentration of DOM-bound metal ions. The obtained experimental results and their interpretation indicate that the introduced ΔSlope325-375 and ΔLnA350 parameters are predictive of and can be used to quantify in situ metal ion interactions with DOM. The presented approach can be used to gain more information about DOM-metal interactions and for further optimization of existing formal models of metal-DOM complexation. PMID:23490103
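A sketch of how the two spectral parameters might be computed from a pair of absorbance spectra (reference vs. metal-amended). The sign conventions, the exponential-slope fit, and the synthetic spectra are assumptions; the study defines the parameters only as the change of the 325-375 nm spectral slope and the differential logarithm of absorbance at 350 nm.

```python
import numpy as np

def delta_lna(a_ref, a_metal, wl, wl0=350.0):
    """Differential logarithm of DOM absorbance at wl0 nm (a 350 nm analogue)."""
    wl = np.asarray(wl, float)
    i = int(np.argmin(np.abs(wl - wl0)))
    return float(np.log(a_metal[i]) - np.log(a_ref[i]))

def spectral_slope(a, wl, lo=325.0, hi=375.0):
    """Exponential spectral slope S fitted over [lo, hi] nm: ln A = b - S * wl."""
    wl, a = np.asarray(wl, float), np.asarray(a, float)
    m = (wl >= lo) & (wl <= hi)
    fit_slope, _ = np.polyfit(wl[m], np.log(a[m]), 1)
    return -fit_slope  # reported as a positive value, per nm

def delta_slope(a_ref, a_metal, wl):
    """Change of the 325-375 nm spectral slope upon metal addition."""
    return spectral_slope(a_metal, wl) - spectral_slope(a_ref, wl)

# Synthetic exponential CDOM-like spectra: metal addition dims the 350 nm
# absorbance by 10% and steepens the slope from 0.016 to 0.018 per nm
wl = np.arange(250, 451)
ref = np.exp(-0.016 * (wl - 350))
bound = 0.9 * np.exp(-0.018 * (wl - 350))
print(delta_lna(ref, bound, wl))    # ln(0.9), about -0.105
print(delta_slope(ref, bound, wl))  # about +0.002 per nm
```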

Yan, Mingquan; Wang, Dongsheng; Korshin, Gregory V; Benedetti, Marc F



Quantifying the kinetic stability of hyperstable proteins via time-dependent SDS trapping.  


Globular proteins are usually in equilibrium with unfolded conformations, whereas kinetically stable proteins (KSPs) are conformationally trapped by their high unfolding transition state energy. Kinetic stability (KS) could allow proteins to maintain their activity under harsh conditions, increase a protein's half-life, or protect against misfolding-aggregation. Here we show the development of a simple method for quantifying a protein's KS that involves incubating a protein in SDS at high temperature as a function of time, running the unheated samples on SDS-PAGE, and quantifying the bands to determine the time-dependent loss of a protein's SDS resistance. Six diverse proteins, including two monomers, two dimers, and two tetramers, were studied by this method, and the kinetics of the loss of SDS resistance correlated linearly with their unfolding rate determined by circular dichroism. These results imply that the mechanism by which SDS denatures proteins involves conformational trapping, with a trapping rate that is determined and limited by the rate of protein unfolding. We applied the SDS trapping of proteins (S-TraP) method to superoxide dismutase (SOD) and transthyretin (TTR), which are highly kinetically stable proteins with native unfolding rates that are difficult to measure by conventional spectroscopic methods. A combination of S-TraP experiments between 75 and 90 °C combined with Eyring plot analysis yielded an unfolding half-life of 70 ± 37 and 18 ± 6 days at 37 °C for SOD and TTR, respectively. The S-TraP method shown here is extremely accessible, sample-efficient, cost-effective, compatible with impure or complex samples, and will be useful for exploring the biological and pathological roles of kinetic stability. PMID:22106876
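The Eyring-plot extrapolation used to turn high-temperature unfolding rates into a 37 °C half-life can be sketched as follows. The rate constants below are hypothetical stand-ins, not the SOD/TTR measurements.

```python
import numpy as np

R = 8.314462618            # gas constant, J mol^-1 K^-1
KB_OVER_H = 2.083661912e10  # Boltzmann/Planck, s^-1 K^-1

def eyring_fit(T_K, k_s):
    """Fit ln(k/T) = ln(kB/h) + dS/R - dH/(R*T) to rates measured at several
    temperatures; returns activation enthalpy dH (J/mol) and entropy dS (J/mol/K)."""
    y = np.log(np.asarray(k_s, float) / np.asarray(T_K, float))
    slope, intercept = np.polyfit(1.0 / np.asarray(T_K, float), y, 1)
    dH = -slope * R
    dS = (intercept - np.log(KB_OVER_H)) * R
    return dH, dS

def half_life_days(T_K, dH, dS):
    """Extrapolated unfolding half-life ln(2)/k at temperature T_K, in days."""
    k = KB_OVER_H * T_K * np.exp(dS / R - dH / (R * T_K))
    return np.log(2.0) / k / 86400.0

# Hypothetical high-temperature unfolding rates (75-90 °C), not the paper's data
T = np.array([348.15, 353.15, 358.15, 363.15])
k = np.array([2.0e-6, 8.0e-6, 3.0e-5, 1.0e-4])  # s^-1
dH, dS = eyring_fit(T, k)
print(half_life_days(310.15, dH, dS))  # extrapolated half-life at 37 °C, in days
```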

Xia, Ke; Zhang, Songjie; Bathrick, Brendan; Liu, Shuangqi; Garcia, Yeidaliz; Colón, Wilfredo



Quantifying fluvial topography using UAS imagery and SfM photogrammetry  

NASA Astrophysics Data System (ADS)

The measurement and monitoring of fluvial topography at high spatial and temporal resolutions is in increasing demand for a range of river science and management applications, including change detection, hydraulic models, habitat assessments, river restorations and sediment budgets. Existing approaches are yet to provide a single technique for rapidly quantifying fluvial topography in both exposed and submerged areas, with high spatial resolution, reach-scale continuous coverage, high accuracy and reasonable cost. In this paper, we explore the potential of using imagery acquired from a small unmanned aerial system (UAS) and processed using Structure-from-Motion (SfM) photogrammetry for filling this gap. We use a rotary winged hexacopter known as the Draganflyer X6, a consumer grade digital camera (Panasonic Lumix DMC-LX3) and the commercially available PhotoScan Pro SfM software (Agisoft LLC). We test the approach on three contrasting river systems; a shallow margin of the San Pedro River in the Valdivia region of south-central Chile, the lowland River Arrow in Warwickshire, UK, and the upland Coledale Beck in Cumbria, UK. Digital elevation models (DEMs) and orthophotos of hyperspatial resolution (0.01-0.02 m) are produced. Mean elevation errors are found to vary somewhat between sites, dependent on vegetation coverage and the spatial arrangement of ground control points (GCPs) used to georeference the data. Mean errors are in the range 4-44 mm for exposed areas and 17-89 mm for submerged areas. Errors in submerged areas can be improved to 4-56 mm with the application of a simple refraction correction procedure. Multiple surveys of the River Arrow site show consistently high quality results, indicating the repeatability of the approach. This work therefore demonstrates the potential of a UAS-SfM approach for quantifying fluvial topography.

Woodget, Amy; Carbonneau, Patrice; Visser, Fleur; Maddock, Ian; Habit, Evelyn



Quantifying instantaneous regeneration rates of plant leaf waxes using stable hydrogen isotope labeling.  


Leaf waxes protect terrestrial plants from biotic and abiotic stresses and are important sedimentary biomarkers for terrestrial plants. Thus, understanding the production and ablation of leaf waxes is critical in plant physiology and for geochemical studies. However, there have been no accurate approaches to quantify leaf wax production at different time scales. In this study, we demonstrate a novel approach to study leaf wax regeneration by irrigating plants with a pulse of deuterium-enriched water, followed by measurements of leaf wax D/H ratios by gas chromatography/isotope-ratio mass spectrometry (GC/IRMS). We demonstrate the efficacy of this approach using the grass species Phleum pratense in a greenhouse environment. Using a binary isotope mass balance model, we are able to quantify the regeneration rates of the C(16), C(18) acids and leaf waxes (C(23)-C(31) n-alkanes; C(22)-C(30) n-acids) over a diurnal cycle. Our results show that within one day 33-47% of C(16) and C(18) acids are regenerated, and thus the recycling time for these compounds is 2-3 days. For C(22)-C(26) n-alkyl lipids, 7-21% are regenerated within one day and thus they require 5-16 days to recycle. In comparison, the recycling time for long-chain n-alkyl lipids (C(27)-C(31)) is as long as 71-128 days. Our approach can be applied to different plants at shorter or longer time scales by adjusting the degree of isotopic labeling, sampling intervals and the amount of irrigation water. PMID:22173799
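The binary isotope mass balance behind these regeneration fractions is a two-endmember mixing calculation. A minimal sketch with hypothetical δD values (per mil), not the study's measurements:

```python
def fraction_regenerated(delta_measured, delta_old, delta_new):
    """Two-endmember (binary) isotope mass balance.
    delta_old: dD of wax synthesized before the D-enriched water pulse
    delta_new: dD of wax synthesized entirely after the pulse
    Returns the fraction of the wax pool produced after the pulse."""
    return (delta_measured - delta_old) / (delta_new - delta_old)

def recycling_time_days(daily_fraction):
    """Days needed to turn over the whole pool at the observed per-day fraction."""
    return 1.0 / daily_fraction

# Hypothetical values (per mil), not the study's measurements: a fatty acid
# pool moves from -150 toward a +800 labeled endmember within one day
f = fraction_regenerated(delta_measured=180.0, delta_old=-150.0, delta_new=800.0)
print(round(f, 3))                       # ~0.35: about 35% regenerated in one day
print(round(recycling_time_days(f), 1))  # ~2.9 days to recycle the pool
```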

Gao, Li; Burnier, Andre; Huang, Yongsong



Quantifying human response capabilities towards tsunami threats at community level  

NASA Astrophysics Data System (ADS)

Decision makers at the community level need detailed information on tsunami risks in their area. Knowledge on potential hazard impact, exposed elements such as people, critical facilities and lifelines, people's coping capacity and recovery potential are crucial to plan precautionary measures for adaptation and to mitigate potential impacts of tsunamis on society and the environment. A crucial point within a people-centred tsunami risk assessment is to quantify the human response capabilities towards tsunami threats. Based on this quantification and its spatial representation in maps, tsunami-affected and safe areas, difficult-to-evacuate areas, evacuation target points and evacuation routes can be assigned and used as an important contribution to e.g. community-level evacuation planning. A major component in the quantification of human response capabilities towards tsunami impacts is the factor time. The human response capabilities depend on the estimated time of arrival (ETA) of a tsunami, the time until technical or natural warning signs (ToNW) can be received, the reaction time (RT) of the population (human understanding of a tsunami warning and the decision to take appropriate action), and the evacuation time (ET, the time people need to reach a safe area); the actual available response time is RsT = ETA - ToNW - RT. If RsT is larger than ET, people in the respective areas are able to reach a safe area and rescue themselves. Critical areas possess RsT values equal to or even smaller than ET, and hence people within these areas will be directly affected by a tsunami. Quantifying the factor time is challenging, and an attempt at this is presented here. The ETA can be derived by analyzing pre-computed tsunami scenarios for a respective area. For ToNW we assume that the early warning center is able to fulfil the Indonesian presidential decree to issue a warning within 5 minutes. 
Quantifying RT is difficult, as intrinsic human factors such as educational level, beliefs, tsunami knowledge and experience, among others, play a role. An attempt to quantify this variable under high uncertainty is also presented. Quantifying ET is based on GIS modelling using a cost-weighted distance approach. The basic principle is to define the best evacuation path from a given point to the next safe area (shelter location); the fastest path from that point to the shelter location has to be found. The impacts of land cover, slope, population density, and population age and gender distribution are taken into account, as literature studies show these factors to be highly important. Knowing the fastest path and the distance to the next safe area, together with a spatially distributed pattern of evacuation speed, delivers the time needed from each location to a safe area. By comparing the obtained time value with RsT, the coverage area of an evacuation target point (safe area) can be assigned. Incorporating knowledge on the people capacity of an evacuation target point, the respective coverage area is refined. Hence areas with weak, moderate and good human response capabilities can be detected. This allows calculation of the potential number of people affected (dead or injured) and the number of people dislocated. First results for Kuta (Bali) for a worst-case tsunami event deliver approximately 25,000 people affected when RT = 0 minutes (immediate evacuation upon receiving a tsunami warning) and up to 120,000 when RT > ETA (no evacuation action before the tsunami hits the land). Additionally, the fastest evacuation routes to the evacuation target points can be assigned. Areas with weak response capabilities can be designated as priority areas, e.g. to install additional evacuation target points or to increase tsunami knowledge and awareness and so promote a faster reaction time. 
In particular, analyzing the underlying socio-economic factors that cause deficiencies in responding to a tsunami threat can yield valuable information and directly inform the planning of adaptation measures. Keywords: Community level, Risk and vulnerability assessment, Early warning, Disaster management, Tsunami, Indonesia
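The timing balance described in this record (RsT = ETA - ToNW - RT, with evacuation feasible when RsT is at least ET) can be sketched in a few lines. This is a minimal illustration, not the study's model; all function names and numbers below are assumptions chosen for the example.

```python
# Sketch of the available-response-time check from the abstract.
# RsT = ETA - ToNW - RT; an area is evacuable only if RsT >= ET.
# All values are illustrative, not taken from the study.

def response_time(eta_min, tonw_min, rt_min):
    """Available response time RsT = ETA - ToNW - RT, in minutes."""
    return eta_min - tonw_min - rt_min

def is_evacuable(eta_min, tonw_min, rt_min, et_min):
    """True if people can reach a safe area before the tsunami arrives."""
    return response_time(eta_min, tonw_min, rt_min) >= et_min

# Example: tsunami arrives 35 min after the quake, warning issued within
# 5 min (per the presidential decree), people react after 10 min, and the
# modelled evacuation time for this map cell is 15 min.
print(response_time(35, 5, 10))       # RsT in minutes
print(is_evacuable(35, 5, 10, 15))    # feasible or not
```

Mapping this check over every cell of an ET raster is what separates the "weak, moderate and good" response-capability areas the abstract refers to.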

Post, J.; Mück, M.; Zosseder, K.; Wegscheider, S.; Taubenböck, H.; Strunz, G.; Muhari, A.; Anwar, H. Z.; Birkmann, J.; Gebert, N.



Solar System Odyssey - Fulldome Digital Planetarium Show  

NSDL National Science Digital Library

This is a Fulldome Digital Planetarium Show in which learners go on a futuristic journey through our Solar System. They explore the inner and outer planets, then the moons Titan, Europa, and Callisto as possible places to establish a human colony. A full-length preview of the show is available on the website; scroll down about three-quarters of the page to the section on children's shows (a direct link is not available).


New methods to quantify the cracking performance of cementitious systems made with internal curing  

NASA Astrophysics Data System (ADS)

The use of high performance concretes with low water-cement ratios has been promoted for infrastructure based on their potential to increase durability and service life, because such concretes are stronger and less porous. Unfortunately, these benefits are not always realized due to the susceptibility of high performance concrete to early-age cracking caused by shrinkage. This problem is widespread and affects federal, state, and local budgets, which must pay to maintain or replace infrastructure deteriorated by cracking. As a result, methods to reduce or eliminate early-age shrinkage cracking have been investigated. Internal curing is one such method, in which a prewetted lightweight sand is incorporated into the concrete mixture to provide internal water as the concrete cures. This can significantly reduce or eliminate shrinkage and in some cases causes a beneficial early-age expansion. Standard laboratory tests have been developed to quantify the shrinkage cracking potential of concrete. Unfortunately, many of these tests may not be appropriate for use with internally cured mixtures and provide only limited information; in particular, most standard tests are not designed to capture the expansive behavior of internally cured mixtures. This thesis describes the design and implementation of two new testing devices that overcome the limitations of current standards. The first device discussed in this thesis is called the dual ring, a testing device that quantifies the early-age restrained shrinkage performance of cementitious mixtures. The design of the dual ring is based on the current ASTM C 1581-04 standard test, which uses one steel ring to restrain a cementitious specimen. The dual ring overcomes two important limitations of the standard test. First, the standard single-ring test cannot restrain the expansion that takes place at early ages, which is not representative of field conditions.
The dual ring incorporates a second restraining ring, located outside the sample, to provide restraint against expansion. Second, the standard ring test is a passive test that relies only on the autogenous and drying shrinkage of the mixture to induce cracking. The dual ring test can be an active test because it can vary the temperature of the specimen to induce thermal stress and produce cracking. This makes it possible to study the restrained cracking capacity as the mixture ages and to identify crack-sensitive periods. Measurements made with the dual ring quantify the benefits of using larger amounts of internal curing: mixtures that resupplied internal curing water to match chemical shrinkage could sustain three times the magnitude of thermal change before cracking. The second device discussed in this thesis is a large-scale slab testing device, which tests the cracking potential of 15' long by 4" thick by 24" wide slab specimens in an environmentally controlled chamber. The current standard testing devices can be considered small scale and encounter problems when linking their results to the field due to size effects. The large-scale slab testing device was therefore developed to calibrate the results of smaller-scale tests to real-world field conditions such as a pavement or bridge deck. Measurements made with the large-scale testing device showed that the cracking propensity of the internally cured mixtures was reduced and that a significant benefit could be realized.

Schlitter, John L.



EPA Science Inventory

A significant limitation in defining remediation needs at contaminated sites often results from an insufficient understanding of the transport processes that control contaminant migration. The objectives of this research were to help resolve this dilemma by providing an improved ...


Quantifying commuter exposures to volatile organic compounds  

NASA Astrophysics Data System (ADS)

Motor vehicles can be a predominant source of air pollution in cities, and traffic-related air pollution is often unavoidable for people who live in populous areas. Commuters may have high exposures to traffic-related air pollution because they are close to vehicle tailpipes. Volatile organic compounds (VOCs) are one class of air pollutants of concern because exposure to VOCs carries risk of adverse health effects. Specific VOCs of interest for this work include benzene, toluene, ethylbenzene, and xylenes (BTEX), which are often found in gasoline and combustion products. Although methods exist to measure time-integrated personal exposures to BTEX, there are few practical methods to measure a commuter's time-resolved BTEX exposure, which could identify peak exposures that would be concealed by a time-integrated measurement. This study evaluated the ability of a photoionization detector (PID) to measure commuters' exposure to BTEX, using Tenax TA samples as a reference, and quantified the difference in BTEX exposure between cyclists and drivers with windows open and closed. To determine the suitability of the two measurement methods (PID and Tenax TA) for use in this study, the precision, linearity, and limits of detection (LODs) of both methods were determined in the laboratory with standard BTEX calibration gases. Volunteers commuted from their homes to their workplaces by cycling or driving while wearing a personal exposure backpack containing a collocated PID and Tenax TA sampler, and completed a survey indicating whether the windows in their vehicle were open or closed. Comparing pairs of exposure data from the Tenax TA and PID sampling methods determined the suitability of the PID for measuring the BTEX exposures of commuters, and the difference between the BTEX exposures of cyclists and drivers with windows open and closed in Fort Collins was determined.
Both the PID and Tenax TA measurement methods were precise and linear when evaluated in the laboratory using standard BTEX gases. The LODs for the Tenax TA sampling tubes (determined with a sample volume of 1,000 standard cubic centimeters, close to the approximate commuter sample volumes collected) were orders of magnitude lower (0.04 to 0.7 parts per billion (ppb) for individual BTEX compounds) than the PID's LODs (9.3 to 15 ppb of a BTEX mixture), which makes the Tenax TA sampling method more suitable for measuring BTEX concentrations in the sub-ppb range. PID and Tenax TA data for commuter exposures were inversely related: the concentrations of VOCs measured by the PID were substantially higher than the BTEX concentrations measured by collocated Tenax TA samplers. The inverse trend and the large difference in magnitude between PID responses and Tenax TA BTEX measurements indicate that the two methods may have been measuring different, negatively correlated air pollutants. Drivers in Fort Collins, Colorado with closed windows experienced greater time-weighted average BTEX exposures than cyclists (p = 0.04). Commuter BTEX exposures measured in Fort Collins were lower than commuter exposures measured in prior studies conducted in larger cities (Boston and Copenhagen). Although route and intake may affect a commuter's BTEX dose, these variables are outside the scope of this study. Within the limitations of this study (including the small sample size, the small representative area of Fort Collins, and respiration rates not being taken into account), it appears that health risks associated with traffic-related BTEX exposures may be reduced by commuting by bicycle instead of driving with windows closed, and by living in a less populous area with less vehicle traffic. Although the PID did not reliably measure low-level commuter BTEX exposures, the Tenax TA sampling method did.
The PID measured BTEX concentrations reliably in a controlled environment, at high concentrations (300-800 ppb), and in the absence of other air pollutants. In environments where there could be multiple chemicals present that may produce a PID signal (such a

Kayne, Ashleigh


The Physics of Equestrian Show Jumping  

ERIC Educational Resources Information Center

This article discusses the kinematics and dynamics of equestrian show jumping. For some time I have attended a series of show jumping events at Spruce Meadows, an international equestrian center near Calgary, Alberta, often referred to as the "Wimbledon of equestrian jumping." I have always had a desire to write an article such as this

Stinner, Art



Salton Sea Satellite Image Showing Fault Slip  

USGS Multimedia Gallery

Landsat satellite image (LE70390372003084EDC00) showing location of surface slip triggered along faults in the greater Salton Trough area. Red bars show the generalized location of 2010 surface slip along faults in the central Salton Trough and many additional faults in the southwestern section of t...


International Plowing Match & Farm Machinery Show  

NSDL National Science Digital Library

The 1995 International Plowing Match & Farm Machinery Show in Ontario, Canada has a site on the Web. The IPM is a non-profit organization of volunteers which annually organizes Canada's largest farm machinery show. The event is both commercial and educational: thousands of school children and educators attend and participate in organized educational activities.



The Language of Show Biz: A Dictionary.  

ERIC Educational Resources Information Center

This dictionary of the language of show biz provides the layman with definitions and essays on terms and expressions often used in show business. The overall pattern of selection was intended to be more rather than less inclusive, though radio, television, and film terms were deliberately omitted. Lengthy explanations are sometimes used to express

Sergel, Sherman Louis, Ed.


Livestock Shows Quick Reference Health Requirements  

E-print Network

MARKET STEER INFO: Knoxville Spring Junior Cattle Exposition. Ownership deadline: March 1. Tattooed: 90 days prior to show. Tattooed and ear tagged: 90 days prior to show. Entries due to UT: May 1. ... Knoxville Spring Junior Cattle Exposition. Ownership deadline: March 15. Tattooed and ear tagged

Grissino-Mayer, Henri D.


Differential GPS measurements as a tool to quantify Late Cenozoic crustal deformation (Oman, Arabian Peninsula)  

NASA Astrophysics Data System (ADS)

The Sultanate of Oman is situated in the north-eastern part of the Arabian Plate and therefore represents its leading edge as the plate drifts north relative to the Eurasian Plate. The movement results in continent-continent collision in the northwest (Zagros fold and thrust belt) and ocean-continent collision in the northeast (Makran subduction zone). We follow the hypothesis that this plate tectonic setting results in internal deformation of the Arabian Plate. The study presented here is part of a larger project that aims at quantifying the forcing factors of coastal evolution (Hoffmann et al. 2012). Sea level development, climate (and the associated rates of weathering and sediment supply) and differential land movement (neotectonics) are identified as key factors during the Late Cenozoic. Recent vertical land movement is obvious and is expressed in differences in coastal morphology. Parts of the coastline are subsiding: these areas show drowned wadi mouths. Other parts are characterised by a straightened coastline, with raised wave-cut terraces evident well above present mean sea level. In addition to these erosional terraces, depositional terraces on alluvial fans are encountered in close vicinity to the mountain chain. Detailed topographic profile measurements were carried out using a LEICA Viva GNSS-GS15 differential GPS, which yields data with an accuracy of 1-2 cm relative to the base station. The profile measurements are oriented perpendicular to the coastline and therefore perpendicular to the raised wave-cut terraces. Up to six terraces are encountered at elevations of up to 400 m above present sea level, the older ones being the highest. The data allow calculation of the scarp height, tread length and tread angle of the terraces. The results indicate that the terraces show increased seaward tilting with age, an observation interpreted as reflecting ongoing uplift.
A coast-parallel deformation pattern becomes obvious when comparing parallel profiles. Profiles measured along depositional fluvial terraces also indicate a direct correlation between the age of the deposits and the dip angle of the surface. Further evidence for ongoing uplift is that the older fluvial terraces are situated further inland. Additional dating evidence is needed to quantify the uplift and to resolve the differential land movement in time and space.

Rupprechter, M.; Roepert, A.; Hoffmann, G.



Incorporating both physical and kinetic limitations in quantifying dissolved oxygen flux to aquatic sediments  

USGS Publications Warehouse

Traditionally, dissolved oxygen (DO) fluxes have been calculated using the thin-film theory with DO microstructure data in systems characterized by fine sediments and low velocities. However, recent experimental evidence of fluctuating DO concentrations near the sediment-water interface suggests that turbulence and coherent motions control the mass transfer, and the surface renewal theory gives a more mechanistic model for quantifying fluxes. Both models involve quantifying the mass transfer coefficient (k) and the relevant concentration difference (ΔC). This study compared several empirical models for quantifying k based on both thin-film and surface renewal theories, and presents a new method for quantifying ΔC (the dynamic approach) that is consistent with the observed DO concentration fluctuations near the interface. Data were used from a series of flume experiments that include both physical and kinetic uptake limitations of the flux. Results indicated that methods for quantifying k and ΔC using the surface renewal theory better estimated the DO flux across a range of fluid-flow conditions. © 2009 ASCE.
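The parameterization shared by both theories in this record is flux = k * ΔC. As a minimal sketch (not the study's empirical models), the classical Danckwerts surface-renewal expression k = sqrt(D * s), where s is a surface renewal frequency, is one well-known way to obtain a surface-renewal-based k; the numbers below are illustrative assumptions, not values from the flume experiments.

```python
# Sketch: DO flux = k * dC, with k from the Danckwerts surface-renewal
# model (k = sqrt(D * s)). Values are illustrative only.
import math

def flux(k, dC):
    """Interfacial flux for mass transfer coefficient k (m/s) and
    concentration difference dC (g/m^3); returns g m^-2 s^-1."""
    return k * dC

def k_surface_renewal(D, s):
    """Danckwerts surface renewal: k = sqrt(D * s), with molecular
    diffusivity D (m^2/s) and renewal frequency s (1/s)."""
    return math.sqrt(D * s)

k = k_surface_renewal(D=2.1e-9, s=0.2)  # D of O2 in water ~2.1e-9 m^2/s
print(flux(k, dC=2.0))
```

The study's point, under this framing, is that both k and the choice of ΔC matter: the surface-renewal k paired with a dynamically defined ΔC tracked the measured fluxes best.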

O'Connor, B.L.; Hondzo, M.; Harvey, J.W.



Quantifying the Fate of Stablised Criegee Intermediates under Atmospheric Conditions  

NASA Astrophysics Data System (ADS)

The products of alkene ozonolysis have been shown in field experiments to convert SO2 to H2SO4. One fate of H2SO4 formed in the atmosphere is the formation of sulphate aerosol, which has been reported to contribute −0.4 W m−2 to anthropogenic radiative forcing via the direct aerosol effect and can also contribute to the indirect aerosol effect, currently one of the greatest uncertainties in climate modelling. The observed SO2 oxidation has been proposed to arise from reactions with SO2 of the carbonyl oxide, or Criegee Intermediate (CI), formed during alkene ozonolysis. Direct laboratory experiments have confirmed that stabilised CIs (SCIs) react more quickly with SO2 (k > 10−11 cm3 s−1) than was previously thought. The major sink for SCI in the troposphere is reaction with water vapour. Modelling work has shown that the importance of the SO2 + SCI reaction in H2SO4 formation is critically dependent on the ratio of the rate constants for the reaction of the SCI with SO2 and with H2O, and has suggested that the SCI + SO2 reaction is only likely to be important in regions with high alkene emissions, e.g. forests. Here we present results from a series of ozonolysis experiments performed at the EUPHORE atmospheric simulation chamber, Valencia. These experiments measure the loss of SO2, in the presence of an alkene (ethene, cis-but-2-ene and 2,3-dimethylbutene), as a function of water vapour. From these experiments we quantify the relative rates of reaction of the three smallest SCIs with water and SO2, and their decomposition rates. In addition, the results appear to suggest that the conversion of SO2 to H2SO4 during alkene ozonolysis may be inconsistent with the SCI + SO2 mechanism alone, particularly at high relative humidities. The results suggest that SCIs are likely to provide at least an equivalent sink for SO2 to that of OH in the troposphere, in agreement with field observations.
This work highlights the importance of alkene ozonolysis not only as a non-photolytic source of HOx but also as a source of other important atmospheric oxidants, and moves towards quantifying some of the important sinks of SCI in the atmosphere.

Newland, Mike; Rickard, Andrew; Alam, Mohammed; Vereecken, Luc; Muñoz, Amalia; Ródenas, Milagros; Bloss, William



Quantified Self and Comprehensive Geriatric Assessment: Older Adults Are Able to Evaluate Their Own Health and Functional Status  

PubMed Central

Background: There is increased interest among individuals in quantifying their own health and functional status. The aim of this study was to examine the concordance of answers to a self-administered questionnaire exploring health and functional status with information collected during a full clinical examination performed by a physician, among cognitively healthy individuals (CHI) and older patients with mild cognitive impairment (MCI) or mild-to-moderate Alzheimer disease (AD). Methods: Based on a cross-sectional design, a total of 60 older adults (20 CHI, 20 patients with MCI, and 20 patients with mild-to-moderate AD) were recruited in the memory clinic of Angers, France. All participants completed a self-administered paper questionnaire composed of 33 items exploring age, gender, nutrition, place of living, social resources, drugs taken daily, memory complaint, mood and general feeling, fatigue, activities of daily living, physical activity and history of falls. Participants then underwent a full clinical examination by a physician exploring the same domains. Results: High concordance between the self-administered questionnaire and the physician's clinical examination was shown. The few divergences were related to cognitive status, with the answers of AD and MCI patients to the self-administered questionnaire being less reliable than those of CHI. Conclusion: Older adults are able to evaluate their own health and functional status, regardless of their cognitive status. This result needs to be confirmed; it opens new perspectives for the quantified-self trend and could be helpful in the daily clinical practice of primary care. PMID:24968016

Beauchet, Olivier; Launay, Cyrille P.; Merjagnan, Christine; Kabeshova, Anastasiia; Annweiler, Cédric



Comparison of Weather Shows in Eastern Europe  

NASA Astrophysics Data System (ADS)

Television weather shows in Eastern Europe are in most cases of a high graphical standard, though there are vast differences in the duration and information content of the shows. There are a few signs and regularities by which the character of a weather show can be recognized. The main differences are largely driven by the income structure of the TV station: either it is a fully privately funded station relying on income from commercials, or it is a public service station funded mainly by the national budget or by a fixed fee or tax. There are vast differences in duration and even in the graphical presentation of the weather. The next important aspect is the supplier and/or processor of the weather information. In short, when the show is produced by the national met office, it contains more scientific terms, synoptic maps, satellite imagery, etc. If the supplier is a private meteorological company, the weather show is more user-friendly and accessible to laypeople, with fewer scientific terms. We are experiencing a massive shift in public weather knowledge and in the demand for information. In the past, weather shows consisted only of maps with weather icons. In today's world, even weather shows aimed at laypeople consist partly of numerical weather model outputs, which are of course designed to be understandable and graphically attractive. Outputs of numerical weather models used to be part of the daily life only of professional meteorologists; today they are a common part of the lives of ordinary people. Video samples are a part of this presentation.

Najman, M.



Quantifying Effects Of Water Stress On Sunflowers  

Technology Transfer Automated Retrieval System (TEKTRAN)

This poster presentation describes the data collection and analysis procedures and results for 2009 from a research grant funded by the National Sunflower Association. The primary objective was to evaluate the use of crop canopy temperature measured with infrared temperature sensors, as a more time ...


Which method for quantifying microalbuminuria in diabetics?  

Microsoft Academic Search

We have compared the chemical and clinical characteristics of an immunonephelometric assay (INA), two immunoturbidimetric assays (ITA) and two semiquantitative methods with those of a solid-phase radioimmunoassay (RIA) for measurement of urinary albumin (UA) concentration in 136 diabetic patients. INA and RIA had similar accuracy, and provided comparable results. However, RIA has slightly greater sensitivity than INA, which is easier

Ottavio Giampietro; Giuseppe Penno; Aldo Clerico; Lorella Cruschelli; Amalia Lucchetti; Monica Nannipieri; Mauro Cecere; Loredana Rizzo; Renzo Navalesi



Quantifying the Ease of Scientific Discovery  

PubMed Central

It has long been known that scientific output proceeds on an exponential increase or, more properly, a logistic growth curve. The interplay between effort and discovery is clear, and the nature of the functional form has been thought to be due to many changes in the scientific process over time. Here I show a quantitative method for examining the ease of scientific progress, another necessary component in understanding scientific discovery. Using examples from three different scientific disciplines (mammalian species, chemical elements, and minor planets), I find the ease of discovery to conform to an exponential decay. In addition, I show how the pace of scientific discovery can be best understood as the outcome of both scientific output and ease of discovery. A quantitative study of the ease of scientific discovery in the aggregate, such as the one done here, has the potential to provide a great deal of insight into both the nature of future discoveries and the technical processes behind discoveries in science. PMID:22328796
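The exponential-decay finding above amounts to fitting log(ease) linearly against time. A hedged sketch with synthetic data (not the paper's data; the decay rate 0.08 is an arbitrary assumption for the example):

```python
# Sketch: recover an exponential decay rate by a linear fit in log space,
# log(ease) = log(A) - r * t. Synthetic, noise-free data for illustration.
import numpy as np

t = np.arange(0, 50, 1.0)            # time, e.g. years
ease = 100.0 * np.exp(-0.08 * t)     # assumed true decay rate r = 0.08

slope, intercept = np.polyfit(t, np.log(ease), 1)
print(-slope)            # recovered decay rate r
print(np.exp(intercept)) # recovered amplitude A
```

With real discovery counts one would add noise handling (e.g. weighting or a nonlinear least-squares fit), but the log-linear fit captures the essential model.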

Arbesman, Samuel



Quantifying the value of redundant measurements at GCOS Reference Upper-Air Network sites  

NASA Astrophysics Data System (ADS)

The potential for measurement redundancy to reduce uncertainty in atmospheric variables has not been investigated comprehensively for climate observations. We evaluated the usefulness of entropy and mutual correlation concepts, as defined in information theory, for quantifying random uncertainty and redundancy in time series of the integrated water vapour (IWV) and water vapour mixing ratio profiles provided by five highly instrumented GRUAN (GCOS Reference Upper-Air Network, where GCOS is the Global Climate Observing System) stations in 2010-2012. Results show that the random uncertainties in the IWV measured with radiosonde, global positioning system, microwave and infrared radiometer, and Raman lidar measurements differed by less than 8%. Comparisons of time series of IWV content from ground-based remote sensing instruments with in situ soundings showed that microwave radiometers have the highest redundancy with the IWV time series measured by radiosondes and therefore the highest potential to reduce the random uncertainty of the radiosonde time series. Moreover, the random uncertainty of a time series from one instrument can be reduced by ~60% by constraining the measurements with those from another instrument; the best reduction of random uncertainty is achieved by conditioning Raman lidar measurements on microwave radiometer measurements. Specific instruments are recommended for atmospheric water vapour measurements at GRUAN sites. This approach can be applied to the study of redundant measurements for other climate variables.
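The information-theoretic quantities used here can be illustrated with a histogram-based estimator on synthetic, correlated "IWV" series. This is a generic sketch of entropy and mutual information, not the GRUAN processing chain; the instrument names, bin count and noise levels are assumptions for the example.

```python
# Sketch: entropy and mutual information between two synthetic time series
# via histogram binning. Redundant instruments share high mutual information.
import numpy as np

rng = np.random.default_rng(1)
radiosonde = rng.normal(20.0, 5.0, 5000)               # IWV-like, kg m^-2
radiometer = radiosonde + rng.normal(0.0, 1.0, 5000)   # correlated copy

def entropy(x, bins=30):
    """Shannon entropy (bits) of a binned 1-D sample."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=30):
    """Mutual information (bits) from a joint 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

print(entropy(radiosonde))
print(mutual_information(radiosonde, radiometer))               # high: redundant
print(mutual_information(radiosonde, rng.normal(20, 5, 5000)))  # near zero
```

In the paper's terms, high mutual information between two instruments is what makes one useful for constraining (reducing the random uncertainty of) the other.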

Madonna, F.; Rosoldi, M.; Güldner, J.; Haefele, A.; Kivi, R.; Cadeddu, M. P.; Sisterson, D.; Pappalardo, G.



Identifying and quantifying the stromal fibrosis in muscularis propria of colorectal carcinoma by multiphoton microscopy  

NASA Astrophysics Data System (ADS)

The examination of stromal fibrosis within colorectal cancer is overlooked, not only because routine pathological examinations focus more on tumour staging and precise surgical margins, but also because of the lack of efficient diagnostic methods. Multiphoton microscopy (MPM) can be used to study the muscularis stroma of normal and colorectal carcinoma tissue at the molecular level. In this work, we attempt to demonstrate the feasibility of MPM for discerning the microstructure of the normal human rectal muscle layer and of fibrotic colorectal carcinoma tissue. Three types of muscularis propria stromal fibrosis beneath colorectal cancer infiltration were first observed through the MPM imaging system, which provides intercellular microstructural details in fresh, unstained tissue samples. Our approach is also capable of quantifying the extent of stromal fibrosis from both the amount and the orientation of collagen, which may further characterize the severity of fibrosis. Comparison with pathology analysis shows that MPM has the potential to become a histological tool for detecting stromal fibrosis and collecting prognostic evidence, which may help guide subsequent therapy for patients toward a good prognosis.

Chen, Sijia; Yang, Yinghong; Jiang, Weizhong; Feng, Changyin; Chen, Zhifen; Zhuo, Shuangmu; Zhu, Xiaoqin; Guan, Guoxian; Chen, Jianxin



Quantifying the Spatial Patterns of Soil Redistribution and Soil Quality on two Contrasting Hillslopes  

Microsoft Academic Search

The soil redistribution from erosion processes may result in the spatial variability patterns of soil quality within the landscape. The objectives of this study were to (i) quantify spatial patterns and controlling processes of soil redistribution due to water and tillage erosion, and (ii) correlate soil quality parameters with soil redistribution along the hillslope transects for different land use management

Y. Li; M. J. Lindstrom; M. Frielinghaus; H. R. Bork


Quantifying nonstationary radioactivity concentration fluctuations near Chernobyl: A complete statistical description  

E-print Network

Quantifying nonstationary radioactivity concentration fluctuations near Chernobyl: a complete statistical description. ... Chernobyl after the 1986 disaster and find three new results: (i) the histogram of fluctuations is well ... PACS numbers: 89.60.-x, 02.50.Fz, 05.45.Tp, 87.66.Na. I. INTRODUCTION. Chernobyl's No. 4 reactor was completely destroyed

Stanley, H. Eugene


Quantifying nonstationary radioactivity concentration fluctuations near Chernobyl: A complete statistical description  

E-print Network

Quantifying nonstationary radioactivity concentration fluctuations near Chernobyl: a complete statistical description. ... fluctuations measured near Chernobyl after the 1986 disaster and find three new results: (i) the histogram ... patterns. PACS numbers: 89.60.-x, 02.50.Fz, 05.45.Tp, 87.66.Na. I. INTRODUCTION. Chernobyl's No. 4 reactor

Shlyakhter, Ilya



EPA Science Inventory

The report gives results of (1) a comparison of the hood and chamber techniques for quantifying pollutant emission rates from unvented combustion appliances, and (2) an assessment of the semivolatile and nonvolatile organic-compound emissions from unvented kerosene space heaters. In ...



E-print Network

QUANTIFYING ACCELERATED SOIL EROSION THROUGH ECOLOGICAL SITE-BASED ASSESSMENTS OF WIND AND WATER EROSION. Contact: Nicholas Webb, phone: 575-646-3584. ... change and intensification have resulted in accelerated rates of soil erosion in many areas of the world


Signal processing system to quantify bilirubin in the jaundice clinical model spectra  

Microsoft Academic Search

Neonatal jaundice is a medical condition which occurs in newborns as a result of an imbalance between the production and elimination of bilirubin. Excess bilirubin in the blood stream diffuses into the surrounding tissue leading to a yellowing of the skin. An optical system integrated with a signal processing system is used as a platform to noninvasively quantify bilirubin concentration

Suresh K Alla; Adam Huddle; Joseph F Clark; Fred R Beyette Jr



Quantifying supercoiling-induced denaturation bubbles in DNA Jozef Adamcik,a  

E-print Network

Quantifying supercoiling-induced denaturation bubbles in DNA ... single DNA plasmid imaging. We demonstrate that long-lived single-stranded denaturation bubbles exist ... and temperature conditions. The results presented herein underline the important role of denaturation bubbles

Potsdam, Universität


Image Processing to quantify the Trajectory of a Visualized Air Jet  

Microsoft Academic Search

In a ventilated space, the incoming air jet and the resulting airflow pattern play key roles in the removal or supply of heat, moisture, and harmful gases from or to living organisms (man, animal and plant). In this research, an image processing method was developed to visualize and quantify the two-dimensional trajectory and the deflection angle of an air jet

A. Van Brecht; K. Janssens; D. Berckmans; E. Vranken



Quantifying the mechanical and hydrologic effects of riparian vegetation on streambank stability  

Microsoft Academic Search

Riparian vegetation strips are widely used by river managers to increase streambank stability, among other purposes. However, though the effects of vegetation on bank stability are widely discussed, they are rarely quantified, and discussions generally underemphasize the importance of hydrologic processes, some of which may be detrimental. This paper presents results from an experiment in which the hydrologic and mechanical effects

Andrew Simon; Andrew J. C. Collison



Monte Carlo simulation is used to quantify the uncertainty ...

E-print Network

Abstract: Monte Carlo simulation is used to quantify the uncertainty in the results ... or perceived uncertainty in the inputs to the assessment model. Monte Carlo simulation is then used ... proscribed limits while keeping the catch quota stable. We illustrate the use of the Monte Carlo approach


Transplant Problems That May Show Up Later  


Transplant problems that may show up later. The type ... called Second Cancers Caused by Cancer Treatment. Post-transplant lymphoproliferative disorder: post-transplant lymphoproliferative ( lim -fo-pruh- ...


Do dogs (Canis familiaris) show contagious yawning?  


We report an experimental investigation into whether domesticated dogs display contagious yawning. Fifteen dogs were shown video clips of (1) humans and (2) dogs displaying yawns and open-mouth expressions (not yawns) to investigate whether dogs showed contagious yawning to either of these social stimuli. Only one dog performed significantly more yawns during or shortly after viewing the yawning videos than the open-mouth videos, and most of these yawns occurred in response to the human videos. No dogs showed significantly more yawning to the open-mouth videos (human or dog). The percentage of dogs showing contagious yawning was lower than that reported for chimpanzees and humans, and considerably lower than in a recently published report investigating this behavior in dogs (Joly-Mascheroni et al. in Biol Lett 4:446-448, 2008). PMID:19452178

Harr, Aimee L; Gilbert, Valerie R; Phillips, Kimberley A



Nutrition and Feeding of Show Poultry  

E-print Network

The championship potential of a chicken or turkey is determined by genetics, but proper nutrition can help an animal achieve that genetic potential. This publication outlines four principles critical to developing a nutrition program for show...

Cartwright, A. Lee



Power spectrum scale invariance quantifies limbic dysregulation in trait anxious adults using fMRI: adapting methods optimized for characterizing autonomic dysregulation to neural dynamic time series.  


In a well-regulated control system, excitatory and inhibitory components work closely together with minimum lag; in response to inputs of finite duration, outputs should show rapid rise and, following the input's termination, immediate return to baseline. The efficiency of this response can be quantified using the power spectrum density's scaling parameter beta, a measure of self-similarity, applied to the first derivative of the raw signal. In this study, we adapted power spectrum density methods, previously used to quantify autonomic dysregulation (heart rate variability), to neural time series obtained via functional MRI. The negative feedback loop we investigated was the limbic system, using affect-valent faces as stimuli. We hypothesized that trait anxiety would be related to efficiency of regulation of limbic responses, as quantified by power-law scaling of fMRI time series. Our results supported this hypothesis, showing moderate to strong correlations of trait anxiety and beta (r=0.45-0.54) for the amygdala, orbitofrontal cortex, hippocampus, superior temporal gyrus, posterior insula, and anterior cingulate. Strong anticorrelations were also found between the amygdala's beta and wake heart rate variability (r=-0.61), suggesting a robust relationship between dysregulated limbic outputs and their autonomic consequences. PMID:20025979

Tolkunov, Denis; Rubin, Denis; Mujica-Parodi, Lr
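The β estimator described above amounts to a log-log regression on the power spectrum of the first-differenced signal (P(f) ~ f^-β). A pure-Python sketch under that interpretation, using a naive DFT periodogram and a synthetic random walk rather than the authors' fMRI pipeline:

```python
import cmath
import math
import random

def periodogram(x):
    # Naive DFT periodogram over positive frequencies (DC excluded); fine for short series.
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * f * t / n) for t in range(n))) ** 2 / n
            for f in range(1, n // 2)]

def psd_beta(x):
    # beta is the negative slope of log-power vs log-frequency for the
    # first-differenced series: P(f) ~ f ** (-beta).
    dx = [b - a for a, b in zip(x, x[1:])]
    p = periodogram(dx)
    lf = [math.log(f) for f in range(1, len(p) + 1)]
    lp = [math.log(v) for v in p]
    mf, mp = sum(lf) / len(lf), sum(lp) / len(lp)
    slope = (sum((a - mf) * (b - mp) for a, b in zip(lf, lp))
             / sum((a - mf) ** 2 for a in lf))
    return -slope

rng = random.Random(0)
walk = [0.0]
for _ in range(255):
    walk.append(walk[-1] + rng.gauss(0.0, 1.0))
beta = psd_beta(walk)  # the walk's first difference is white noise, so beta should be near 0
```

For a well-regulated signal in the paper's sense, the first derivative is close to white and β sits near zero; dysregulated, slowly recovering outputs push β upward.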



Quantifying signal dispersion in a hybrid ice core melting system.  


We describe a microcontroller-based ice core melting and data logging system allowing simultaneous depth coregistration of a continuous flow analysis (CFA) system (for microparticle and conductivity measurement) and a discrete sample analysis system (for geochemistry and microparticles), both supplied from the same melted ice core section. This hybrid melting system employs an ice parcel tracking algorithm which calculates real-time sample transport through all portions of the meltwater handling system, enabling accurate (1 mm) depth coregistration of all measurements. Signal dispersion is analyzed using residence time theory, experimental results of tracer injection tests and antiparallel melting of replicate cores to rigorously quantify the signal dispersion in our system. Our dispersion-limited resolution is 1.0 cm in ice and ~2 cm in firn. We experimentally observe the peak lead phenomenon, where signal dispersion causes the measured CFA peak associated with a given event to be depth assigned ~1 cm shallower than the true event depth. Dispersion effects on resolution and signal depth assignment are discussed in detail. Our results have implications for comparisons of chemistry and physical properties data recorded using multiple instruments and for deconvolution methods of enhancing CFA depth resolution. PMID:23050603

Breton, Daniel J; Koffman, Bess G; Kurbatov, Andrei V; Kreutz, Karl J; Hamilton, Gordon S
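The dispersion analysis rests on residence time theory: the measured signal is approximately the true depth signal convolved with the system's residence-time distribution (RTD). A toy sketch assuming a single exponential mixing volume (all parameters illustrative, not the instrument's), showing how smearing broadens a sharp event, lowers its amplitude, and offsets the depth assigned from the smeared maximum:

```python
import math

def exp_rtd(tau, n):
    # Normalized exponential residence-time distribution (one mixing volume).
    g = [math.exp(-i / tau) for i in range(n)]
    s = sum(g)
    return [v / s for v in g]

def convolve(x, h):
    # Causal discrete convolution, truncated to the length of x.
    y = [0.0] * len(x)
    for i in range(len(x)):
        for j in range(len(h)):
            if i >= j:
                y[i] += x[i - j] * h[j]
    return y

# True depth signal: a narrow Gaussian "event" centred at sample 50 (1 sample = 1 mm, say).
true = [math.exp(-((i - 50) / 3.0) ** 2 / 2.0) for i in range(150)]
measured = convolve(true, exp_rtd(5.0, 60))
# The smeared peak is broader, lower, and its maximum no longer sits at the true event depth.
```

Deconvolving with a measured RTD (from tracer injection tests, as in the paper) is the route to recovering resolution and correcting the peak offset.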




SciTech Connect

Depending on how invasive the waste management activities are, excessive concentrations of mists, vapors, gases, dusts, or fumes may be present, creating hazards to the employee through either inhalation into the lungs or absorption through the skin. To address these hazards, similar exposure groups and an exposure profile consisting of (1) a hazard index (concentration), (2) an exposure rating (monitoring results or exposure probabilities), and (3) a frequency rating (hours of potential exposure per week) are used to assign an exposure risk rating (ERR). The ERR determines whether the potential hazards pose significant risks to employees, linking potential exposure to breathing zone (BZ) monitoring requirements. Three case studies, consisting of (1) a hazard-task approach, (2) a hazard-job classification-task approach, and (3) a hazard approach, demonstrate how to conduct exposure assessments using this methodology. Environment, safety and health professionals can then categorize levels of risk and evaluate the need for BZ monitoring, thereby quantifying employee exposure levels accurately.

Thompson, Aaron L.; Hylko, James M.



Quantifying the direct use value of Condor seamount  

NASA Astrophysics Data System (ADS)

Seamounts often satisfy numerous uses and interests. Multiple uses can generate multiple benefits but also conflicts and impacts, calling, therefore, for integrated and sustainable management. To assist in developing comprehensive management strategies, policymakers recognise the need to include measures of socioeconomic analysis alongside ecological data so that practical compromises can be made. This study assessed the direct output impact (DOI) of the relevant marine activities operating at Condor seamount (Azores, central northeast Atlantic) as proxies of the direct use values provided by the resource system. Results demonstrated that Condor seamount supported a wide range of uses yielding distinct economic outputs. Demersal fisheries, scientific research and shark diving were the top-three activities generating the highest revenues, while tuna fisheries, whale watching and scuba-diving had marginal economic significance. Results also indicated that the economic importance of non-extractive uses of Condor is considerable, highlighting the importance of these uses as alternative income-generating opportunities for local communities. It is hoped that quantifying the direct use values provided by Condor seamount will contribute to the decision making process towards its long-term conservation and sustainable use.

Ressurreição, Adriana; Giacomello, Eva



Quantifying Repetitive Speech in Autism Spectrum Disorders and Language Impairment  

PubMed Central

We report on an automatic technique for quantifying two types of repetitive speech: repetitions of what the child says him/herself (self-repeats) and of what is uttered by an interlocutor (echolalia). We apply this technique to a sample of 111 children between the ages of four and eight: 42 typically developing children (TD), 19 children with specific language impairment (SLI), 25 children with autism spectrum disorders (ASD) plus language impairment (ALI), and 25 children with ASD with normal, non-impaired language (ALN). The results indicate robust differences in echolalia between the TD and ASD groups as a whole (ALN + ALI), and between TD and ALN children. There were no significant differences between ALI and SLI children for echolalia or self-repetitions. The results confirm previous findings that children with ASD repeat the language of others more than other populations of children. On the other hand, self-repetition does not appear to be significantly more frequent in ASD, nor does it matter whether the child's echolalia occurred within one (immediate) or two turns (near-immediate) of the adult's original utterance. Furthermore, non-significant differences between ALN and SLI, between TD and SLI, and between ALI and TD are suggestive that echolalia may not be specific to ALN or to ASD in general. One important innovation of this work is an objective fully automatic technique for assessing the amount of repetition in a transcript of a child's utterances. PMID:23661504

van Santen, Jan P. H.; Sproat, Richard W.; Hill, Alison Presmanes
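The abstract does not spell out the detection algorithm, but the general idea of automatic echolalia counting can be sketched: flag a child utterance as (near-)immediate echolalia when it largely repeats an adult utterance at most two turns back. The speaker codes, overlap rule, and 0.8 threshold below are all assumptions for illustration, not the paper's method:

```python
def overlap(child_utt, adult_utt):
    # Fraction of the child's word types that also appear in the adult utterance.
    a = set(child_utt.lower().split())
    b = set(adult_utt.lower().split())
    return len(a & b) / max(len(a), 1)

def count_echolalia(turns, window=2, thresh=0.8):
    # turns: list of (speaker, utterance); count child turns that largely
    # repeat an adult utterance at most `window` turns back.
    n = 0
    for i, (spk, utt) in enumerate(turns):
        if spk != "CHI":
            continue
        recent_adult = [u for s, u in turns[max(0, i - window):i] if s == "ADU"]
        if any(overlap(utt, u) >= thresh for u in recent_adult):
            n += 1
    return n

dialog = [("ADU", "do you want the ball"), ("CHI", "want the ball"),
          ("ADU", "what color is it"), ("CHI", "it is red")]
echo = count_echolalia(dialog)  # only the first child turn qualifies
```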



Quantifying Russian wheat aphid pest intensity across the Great Plains.  


Wheat, the most important cereal crop in the Northern Hemisphere, is at risk of an approximately 10% reduction in worldwide production because of animal pests. The potential economic impact of cereal crop pests has resulted in substantial research efforts into the understanding of pest agroecosystems and the development of pest management strategy. Management strategy is frequently informed by models that describe the population dynamics of important crop pests, and because of the economic impact of these pests, many models have been developed. Yet little effort has been made to compare and contrast models for their strategic applicability and quality. One of the most damaging pests of wheat in North America is the Russian wheat aphid, Diuraphis noxia (Kurdjumov). Eighteen D. noxia population dynamic models were developed from the literature to describe pest intensity. The strongest models quantified the negative effects of fall and spring precipitation on aphid intensity, and the positive effects associated with alternate food source availability. Population dynamic models were transformed into spatially explicit models and combined to form a spatially explicit, model-averaged result. Our findings were used to delineate pest intensity on winter wheat across much of the Great Plains and will help improve D. noxia management strategy. PMID:23321099

Merrill, Scott C; Peairs, Frank B




SciTech Connect

The topical and controversial issue of parameterizing the magnetic structure of solar active regions has vital implications in the understanding of how these structures form, evolve, produce solar flares, and decay. This interdisciplinary and ill-constrained problem of quantifying complexity is addressed by using a two-dimensional wavelet transform modulus maxima (WTMM) method to study the multifractal properties of active region photospheric magnetic fields. The WTMM method provides an adaptive space-scale partition of a fractal distribution, from which one can extract the multifractal spectra. The use of a novel segmentation procedure allows us to remove the quiet Sun component and reliably study the evolution of active region multifractal parameters. It is shown that prior to the onset of solar flares, the magnetic field undergoes restructuring as Dirac-like features (with a Hölder exponent, h = -1) coalesce to form step functions (where h = 0). The resulting configuration has a higher concentration of gradients along neutral line features. We propose that when sufficient flux is present in an active region for a period of time, it must be structured with a fractal dimension greater than 1.2, and a Hölder exponent greater than -0.7, in order to produce M- and X-class flares. This result has immediate applications in the study of the underlying physics of active region evolution and space weather forecasting.

Conlon, Paul A.; McAteer, R.T. James; Gallagher, Peter T.; Fennell, Linda, E-mail: mcateer@nmsu.ed [School of Physics, Trinity College Dublin, Dublin 2 (Ireland)



Quantifying food losses and the potential for reduction in Switzerland.  


A key element in making our food systems more efficient is the reduction of food losses across the entire food value chain. Nevertheless, food losses are often neglected. This paper quantifies food losses in Switzerland at the various stages of the food value chain (agricultural production, postharvest handling and trade, processing, food service industry, retail, and households), identifies hotspots and analyses the reasons for losses. Twenty-two food categories are modelled separately in a mass and energy flow analysis, based on data from 31 companies within the food value chain, and from public institutions, associations, and the literature. The energy balance shows that 48% of the total calories produced (edible crop yields at harvest time and animal products, including slaughter waste) are lost across the whole food value chain. Half of these losses would be avoidable given appropriate mitigation measures. Most avoidable food losses occur at the household, processing, and agricultural production stages of the food value chain. Households are responsible for almost half of the total avoidable losses (in terms of calorific content). PMID:23270687

Beretta, Claudio; Stoessel, Franziska; Baier, Urs; Hellweg, Stefanie



Quantifying photometric observing conditions on Paranal using an IR camera  

NASA Astrophysics Data System (ADS)

A Low Humidity and Temperature Profiling (LHATPRO) microwave radiometer, manufactured by Radiometer Physics GmbH (RPG), is used to monitor sky conditions over ESO's Paranal observatory in support of VLT science operations. In addition to measuring precipitable water vapour (PWV) the instrument also contains an IR camera measuring sky brightness temperature at 10.5 µm. Due to its extended operating range down to -100 °C it is capable of detecting very cold and very thin, even sub-visual, cirrus clouds. We present a set of instrument flux calibration values as compared with a detrended fluctuation analysis (DFA) of the IR camera zenith-looking sky brightness data measured above Paranal taken over the past two years. We show that it is possible to quantify photometric observing conditions and that the method is highly sensitive to the presence of even very thin clouds but robust against variations of sky brightness caused by effects other than clouds, such as variations of precipitable water vapour. Hence it can be used to determine photometric conditions for science operations. About 60 % of nights are free of clouds on Paranal. More work will be required to classify the clouds using this technique. For the future this approach might become part of VLT science operations for evaluating nightly sky conditions.

Kerber, Florian; Querel, Richard R.; Hanuschik, Reinhard
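The detrended fluctuation analysis (DFA) mentioned above has a standard form: integrate the mean-removed series, remove a linear trend within windows of several sizes, and take the log-log slope of the RMS fluctuation. A minimal pure-Python DFA-1 on synthetic white noise (scales and data are illustrative, not the sky-brightness pipeline):

```python
import math
import random

def linfit_slope(xs, ys):
    # Ordinary least-squares slope.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / sum((a - mx) ** 2 for a in xs)

def dfa_alpha(x, scales=(4, 8, 16, 32)):
    # DFA-1: integrate the mean-removed series, detrend linearly in each
    # window of size w, and take the log-log slope of the RMS fluctuation F(w).
    mean = sum(x) / len(x)
    y, s = [], 0.0
    for v in x:
        s += v - mean
        y.append(s)
    logw, logf = [], []
    for w in scales:
        sq, cnt = 0.0, 0
        for start in range(0, len(y) - w + 1, w):
            seg = y[start:start + w]
            t = list(range(w))
            b = linfit_slope(t, seg)
            a0 = sum(seg) / w - b * (w - 1) / 2.0
            sq += sum((c - (a0 + b * i)) ** 2 for i, c in zip(t, seg))
            cnt += w
        logw.append(math.log(w))
        logf.append(0.5 * math.log(sq / cnt))
    return linfit_slope(logw, logf)

rng = random.Random(1)
white = [rng.gauss(0.0, 1.0) for _ in range(1024)]
alpha = dfa_alpha(white)  # uncorrelated noise should give alpha near 0.5
```

Clear-sky brightness fluctuates like weakly correlated noise, while passing cirrus introduces long-range correlations that raise α, which is what makes the DFA statistic cloud-sensitive.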



Quantifying the benefits of vehicle pooling with shareability networks.  


Taxi services are a vital part of urban transportation, and a considerable contributor to traffic congestion and air pollution causing substantial adverse effects on human health. Sharing taxi trips is a possible way of reducing the negative impact of taxi services on cities, but this comes at the expense of passenger discomfort quantifiable in terms of a longer travel time. Due to computational challenges, taxi sharing has traditionally been approached on small scales, such as within airport perimeters, or with dynamical ad hoc heuristics. However, a mathematical framework for the systematic understanding of the tradeoff between collective benefits of sharing and individual passenger discomfort is lacking. Here we introduce the notion of shareability network, which allows us to model the collective benefits of sharing as a function of passenger inconvenience, and to efficiently compute optimal sharing strategies on massive datasets. We apply this framework to a dataset of millions of taxi trips taken in New York City, showing that with increasing but still relatively low passenger discomfort, cumulative trip length can be cut by 40% or more. This benefit comes with reductions in service cost, emissions, and with split fares, hinting toward a wide passenger acceptance of such a shared service. Simulation of a realistic online system demonstrates the feasibility of a shareable taxi service in New York City. Shareability as a function of trip density saturates fast, suggesting effectiveness of the taxi sharing system also in cities with much sparser taxi fleets or when willingness to share is low. PMID:25197046

Santi, Paolo; Resta, Giovanni; Szell, Michael; Sobolevsky, Stanislav; Strogatz, Steven H; Ratti, Carlo
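A shareability network links trips that can be pooled within a tolerated passenger delay; sharing strategies are then matchings in that graph. A deliberately tiny sketch with a made-up 1-D compatibility rule and a greedy (suboptimal) matching, not the paper's optimal algorithm on NYC data:

```python
def shareable(t1, t2, max_delay=5):
    # Hypothetical pooling rule: trips can share a vehicle if their start times
    # are close and their pickup/dropoff points are near each other.
    # (1-D positions keep the sketch simple; the paper uses real street routes.)
    (s1, p1, d1), (s2, p2, d2) = t1, t2
    return abs(s1 - s2) <= max_delay and abs(p1 - p2) <= 1 and abs(d1 - d2) <= 1

def greedy_pooling(trips):
    # Pair compatible trips greedily; each matched pair saves roughly one
    # vehicle trip.  The paper computes optimal matchings on the shareability
    # network, so greedy gives only a lower bound on the benefit.
    used = set()
    pairs = []
    for i in range(len(trips)):
        if i in used:
            continue
        for j in range(i + 1, len(trips)):
            if j not in used and shareable(trips[i], trips[j]):
                used.update({i, j})
                pairs.append((i, j))
                break
    return pairs

# (start_minute, pickup_pos, dropoff_pos) -- toy data
trips = [(0, 0.0, 10.0), (2, 0.5, 10.2), (0, 5.0, 2.0), (30, 0.0, 10.0)]
pairs = greedy_pooling(trips)  # only the first two trips are compatible
```

On real data one would replace `shareable` with a routing-based delay test and the greedy pass with a maximum matching.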



Quantifying Self-Organization with Optimal Predictors  

Microsoft Academic Search

Despite broad interest in self-organizing systems, there are few quantitative, experimentally applicable criteria for self-organization. The existing criteria all give counter-intuitive results for important cases. In this Letter, we propose a new criterion, namely, an internally generated increase in the statistical complexity, the amount of information required for optimal prediction of the system's dynamics. We precisely define this complexity for

Cosma Rohilla Shalizi; Kristina Lisa Shalizi; Robert Haslinger



Quantifying Uncertainties in Tephra Thickness and Volume Estimates  

NASA Astrophysics Data System (ADS)

Characterization of explosive volcanic eruptive processes from interpretations of deposits is a key to assessing long-term volcanic hazards and risks, particularly for large explosive eruptions which occur relatively infrequently and others whose deposits, particularly distal deposits, are transient in the geological record. Whilst eruption size - determined by measurement and interpretation of tephra fall deposits - is of particular importance, uncertainties for such measurements and volume estimates are rarely presented. In this study, we quantify the main sources of variance in determining tephra volume from thickness measurements and isopachs in terms of number and spatial distribution of such measurements, using the Fogo A deposit, São Miguel, Azores as an example. Analysis of Fogo A fall deposits shows measurement uncertainties are approximately 9 % of measured thickness, while uncertainty associated with natural deposit variability ranges between 10 % and 40 % of average thickness, with an average variation of 30 %. Correlations between measurement uncertainties and natural deposit variability are complex and depend on a unit's thickness, position within a succession, distance from source and local topography. The degree to which thickness measurement errors impact on volume uncertainty depends on the number of measurements in a given dataset and their associated individual uncertainties. For Fogo A, the consequent uncertainty in volume associated with thickness measurement uncertainty is 1.3 %, equivalent to a volume estimate of 1.5 ± 0.02 km3. Uncertainties also arise in producing isopach maps: the spatial relationships between source location and different deposit thicknesses are described by contours subjectively drawn to encompass measurements of a given thickness, generally by eye. Recent advances in volume estimation techniques involve the application of mathematical models directly to tephra thickness data. 
Here, uncertainties in tephra volumes derived from isopach maps were investigated by modelling raw thickness data as bicubic splines under tension. In this way, isopachs are objectively determined in relation to the original data. This enables limitations in volume estimates to be identified in previously published maps where a mathematically formal fitting procedure was not used. Eruption volumes derived using these spline isopachs are in general smaller than published traditional estimates. Using the bicubic spline method, volume uncertainties are correlated with number of data points and decrease from as much as 40 % relative to the mean estimate for a case with 30 measurements to 10 % when 120 measurements or more are available. Thus the accuracy of volume estimation using this method depends on the number of data points, their spatial distribution and their associated measurement uncertainties, and the estimate reliability can be fully quantified on these terms; comprehensive uncertainty assessment is not feasible for most conventional tephra volume estimates determined using hand drawn isopachs.

Engwell, S. L.; Aspinall, W.; Sparks, R. S.
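The interplay of per-measurement error and volume integration can be illustrated by Monte Carlo: perturb each isopach thickness by the roughly 9 % measurement uncertainty quoted above and re-integrate. The isopach values and the simple trapezoidal scheme below are invented for illustration, not the Fogo A data or the bicubic-spline method:

```python
import random

def volume(thick, area):
    # thick[i]: isopach thickness (m); area[i]: area enclosed by that isopach (m^2),
    # areas increasing as thickness decreases.  Trapezoidal integral of T dA,
    # with a crude constant-thickness core term inside the innermost isopach.
    v = thick[0] * area[0]
    for i in range(len(thick) - 1):
        v += 0.5 * (thick[i] + thick[i + 1]) * (area[i + 1] - area[i])
    return v

def volume_uncertainty(thick, area, rel_err=0.09, n=2000, seed=7):
    # Perturb each thickness independently by the relative measurement error
    # and recompute the volume; report the mean and standard deviation.
    rng = random.Random(seed)
    vols = []
    for _ in range(n):
        t = [max(x * rng.gauss(1.0, rel_err), 0.0) for x in thick]
        vols.append(volume(t, area))
    m = sum(vols) / n
    sd = (sum((v - m) ** 2 for v in vols) / n) ** 0.5
    return m, sd

thick = [2.0, 1.0, 0.5, 0.1]   # m (hypothetical)
area = [1e6, 5e6, 2e7, 1e8]    # m^2 (hypothetical)
m, sd = volume_uncertainty(thick, area)
```

Because independent thickness errors partly average out in the integral, the relative volume uncertainty comes out smaller than the per-measurement 9 %, consistent with the small volume uncertainty the abstract reports.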



Quantifiable effectiveness of experimental scaling of river- and delta morphodynamics and stratigraphy  

NASA Astrophysics Data System (ADS)

Laboratory experiments to simulate landscapes and stratigraphy often suffer from scale effects, because reducing length and time scales leads to different behaviour of water and sediment. Classically, scaling proceeded from dimensional analysis of the equations of motion and sediment transport, and minor concessions, such as vertical length scale distortion, led to acceptable results. In the past decade many experiments were done that seriously violated these scaling rules, but nevertheless produced significant and insightful results that resemble the real world in quantifiable ways. Here we focus on self-formed fluvial channels and channel patterns in experiments. The objectives of this paper are 1) to identify which aspects of scaling considerations are most important for experiments that simulate morphodynamics and stratigraphy of rivers and deltas, and 2) to establish a design strategy for experiments based on a combination of relaxed classical scale rules, theory of bars and meanders, and small-scale experiments focussed on specific processes. We present a number of small laboratory setups and protocols that we use to rapidly quantify erosional and depositional types of forms and dynamics that develop in the landscape experiments as a function of detailed properties, such as effective material strength, and to assess potential scale effects. Most importantly, the width-to-depth ratio of channels determines the bar pattern and meandering tendency. The strength of floodplain material determines these channel dimensions, and theory predicts that laboratory rivers should have 1.5 times larger width-to-depth ratios for the same bar pattern. We show how floodplain formation can be controlled by adding silt-sized silica flour, bentonite, Medicago sativa (alfalfa) or Partially Hydrolyzed PolyAcrylamide (a synthetic polymer) to poorly sorted sediment. 
The experiments demonstrate that there is a narrow range of conditions between no mobility of bed or banks, and too much mobility. The density of vegetation and the volume proportion of silt allow well-controllable channel dimensions whereas the polymer proved difficult to control. The theory, detailed methods of quantification, and experimental setups presented here show that the rivers and deltas created in the laboratory seem to behave as natural rivers when the experimental conditions adhere to the relaxed scaling rules identified herein, and that required types of fluvio-deltaic morphodynamics can be reproduced based on conditions and sediments selected on the basis of a series of small-scale experiments.

Kleinhans, Maarten G.; van Dijk, Wout M.; van de Lageweg, Wietse I.; Hoyal, David C. J. D.; Markies, Henk; van Maarseveen, Marcel; Roosendaal, Chris; van Weesep, Wendell; van Breemen, Dimitri; Hoendervoogt, Remko; Cheshier, Nathan




Quantifying oil filtration effects on bearing life  

NASA Technical Reports Server (NTRS)

Rolling-element bearing life is influenced by the number, size, and material properties of particles entering the Hertzian contact of the rolling element and raceway. In general, rolling-element bearing life increases with increasing level of oil filtration. Based upon test results, two equations are presented which allow for the adjustment of bearing L10 (catalog) life based upon oil filter rating. It is recommended that where no oil filtration is used, catalog life be reduced by 50 percent.

Needelman, William M.; Zaretsky, Erwin V.



Numerical Green's function method: Application to quantifying ground motion variations of M7 earthquakes  

NASA Astrophysics Data System (ADS)

The concept of numerical Green's functions (NGF, or Green's function database) is developed. The basic idea is: a large seismic fault is divided into subfaults of appropriate size, for which synthetic Green's functions at the surface (NGF) are calculated and stored. Ground motions from arbitrary kinematic sources can then be simulated rapidly, for the whole fault or parts of it, by superposition. The target fault is a simplified, vertical model of the Newport-Inglewood fault in the Los Angeles basin. The approach and its functionality are illustrated by investigating the variations of ground motions (e.g. peak ground velocity and synthetic seismograms) due to source complexity. Source complexity is considered in two respects: hypocenter location and slip history. The results show a complex behavior, with the absolute peak ground velocity and its variation depending on source process directionality, hypocenter location, local structure, and static slip asperity location. We conclude that combining the effects of 3-D structure and a finite source is necessary to quantify ground motion characteristics and their variations. Our results will facilitate earthquake hazard assessment projects.

Wang, Haijiang; Igel, Heiner; Gallovic, Frantisek
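The NGF idea, precomputed subfault Green's functions combined by superposition for any kinematic source, reduces to summing each subfault's slip-rate history convolved with its stored Green's function. A toy discrete sketch with hypothetical impulse Green's functions (real NGF databases store full synthetic waveforms per subfault and receiver):

```python
def simulate(ngf, slip, nt):
    # ngf[k][t]: stored surface Green's function for unit slip on subfault k.
    # slip[k][t]: slip-rate time history on subfault k.
    # Ground motion = sum over subfaults of (slip convolved with Green's function).
    out = [0.0] * nt
    for g, s in zip(ngf, slip):
        for t in range(nt):
            acc = 0.0
            for tau in range(min(t + 1, len(s))):
                if t - tau < len(g):
                    acc += s[tau] * g[t - tau]
            out[t] += acc
    return out

# Two subfaults with delayed-impulse Green's functions (hypothetical values):
ngf = [[0, 0, 1, 0, 0], [0, 0, 0, 1, 0]]
slip = [[1, 0, 0], [1, 0, 0]]   # both subfaults slip at t = 0
motion = simulate(ngf, slip, 6)
```

Changing `slip` (hypocenter location, rupture timing, asperity placement) reuses the same stored `ngf`, which is exactly the speed-up the database approach provides.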



Technical Note: Mesocosm approach to quantify dissolved inorganic carbon percolation fluxes  

NASA Astrophysics Data System (ADS)

Dissolved inorganic carbon (DIC) fluxes across the vadose zone are influenced by a complex interplay of biological, chemical and physical factors. A novel soil mesocosm system was evaluated as a tool for providing information on the mechanisms behind DIC percolation to the groundwater from unplanted soil. Carbon dioxide partial pressure (pCO2), alkalinity, soil moisture and temperature were measured with depth and time, and DIC in the percolate was quantified using a sodium hydroxide trap. Results showed good reproducibility between two replicate mesocosms. The pCO2 varied between 0.2 and 1.1%, and the alkalinity was 0.1-0.6 meq L-1. The measured cumulative effluent DIC flux over the 78-day experimental period was 185-196 mg L-1 m-2 and in the same range as estimates derived from pCO2 and alkalinity in samples extracted from the side of the mesocosm column and the drainage flux. Our results indicate that the mesocosm system is a promising tool for studying DIC percolation fluxes and other biogeochemical transport processes in unsaturated environments.

Thaysen, E. M.; Jessen, S.; Ambus, P.; Beier, C.; Postma, D.; Jakobsen, I.



Quantifying complexity of financial short-term time series by composite multiscale entropy measure  

NASA Astrophysics Data System (ADS)

It is significant to study the complexity of financial time series since the financial market is a complex, evolving dynamic system. Multiscale entropy is a prevailing method used to quantify the complexity of a time series. Because its entropy estimates are less reliable for short-term time series at large time scales, a modified method, composite multiscale entropy (CMSE), is applied to the financial market. To verify its effectiveness, its applications to synthetic white noise and 1/f noise with different data lengths are first reproduced in the present paper. The method is then introduced for the first time in a reliability test with two Chinese stock indices. Applied to short-term return series, the CMSE method reduces deviations in entropy estimation and yields more stable and reliable results than the conventional MSE algorithm. Finally, the composite multiscale entropy of six important stock indices from the world financial markets is investigated, and some useful and interesting empirical results are obtained.

Niu, Hongli; Wang, Jun
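Composite multiscale entropy averages the sample entropy over all offset coarse-grainings at each scale, which is what stabilizes the estimate for short series. A compact pure-Python sketch (the tolerance choice and synthetic data are illustrative; for white noise, entropy should fall as the scale grows):

```python
import math
import random

def sample_entropy(x, m=2, tol=0.15):
    # SampEn(m, tol) = -ln(A/B), where B and A count template pairs of length
    # m and m+1 matching within an absolute tolerance `tol`.
    def count(mm):
        n = len(x) - mm
        c = 0
        for i in range(n):
            for j in range(i + 1, n):
                if all(abs(x[i + k] - x[j + k]) <= tol for k in range(mm)):
                    c += 1
        return c
    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def cmse(x, scale, tol):
    # Composite MSE: average SampEn over all `scale` offset coarse-grainings.
    ents = []
    for offset in range(scale):
        cg = [sum(x[i:i + scale]) / scale
              for i in range(offset, len(x) - scale + 1, scale)]
        ents.append(sample_entropy(cg, tol=tol))
    return sum(ents) / scale

rng = random.Random(3)
noise = [rng.gauss(0.0, 1.0) for _ in range(500)]
tol = 0.15  # fixed from the original series (sd = 1), as in standard MSE
e1, e3 = cmse(noise, 1, tol), cmse(noise, 3, tol)
```

Averaging over the `scale` offset grainings is the composite step; conventional MSE uses only the offset-0 graining and is therefore noisier on short series.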



Quantifying emissions reductions from New England offshore wind energy resources  

E-print Network

Access to straightforward yet robust tools to quantify the impact of renewable energy resources on air emissions from fossil fuel power plants is important to governments aiming to improve air quality and reduce greenhouse ...

Berlinski, Michael Peter



Study Quantifies Physical Demands of Yoga in Seniors  


... A recent NCCAM-funded study measured the physical demands associated with seven commonly practiced yoga poses in older adults. Findings from the study ...



EPA Science Inventory

Methods to quantify instability of autonomic systems such as temperature regulation should be important in toxicant and drug safety studies. Stability of core temperature (Tc) in laboratory rodents is susceptible to a variety of stimuli. Calculating the temperature differential o...


Liquid Crystal Research Shows Deformation By Drying  

NASA Technical Reports Server (NTRS)

These images, from David Weitz's liquid crystal research, show ordered uniform sized droplets (upper left) before they are dried from their solution. After the droplets are dried (upper right), they are viewed with crossed polarizers that show the deformation caused by drying, a process that orients the bipolar structure of the liquid crystal within the droplets. When an electric field is applied to the dried droplets (lower left), and then increased (lower right), the liquid crystal within the droplets switches its alignment, thereby reducing the amount of light that can be scattered by the droplets when a beam is shone through them.



Quantifying deformation in a magma reservoir - a rheology study of the Listino Ring Structure, Adamello Massif, N-Italy  

NASA Astrophysics Data System (ADS)

The deformation and movement of magma in a reservoir during and after emplacement are vital to our understanding of the formation and evolution of such reservoirs. The general absence of marker beds and difficulties in interpreting foliation patterns [1] make it difficult to trace and quantify deformation inside plutons. However, syn-plutonic dikes, sheets and enclaves can possibly be used as markers in order to quantify the deformation that occurred after the injection of the magma from which they formed. Furthermore, they provide a means to quantify the rheology of the magmas involved, providing quantitative insight into the deformation that goes on within magma chambers. A series of analog experiments [2] was conducted to mimic the deformation of a dike being injected into a crystal mush, using a cylindrical tank filled with corn syrup into which a tube of particle-fluid mixture is injected and subsequently sheared by rotating a rigid plate on top of the syrup. The experiments were designed to characterize the deformation of the tubes with respect to variations in yield strength of the injected material, buoyancy, and ambient flow behavior. Results show three possible deformation regimes: no break-up, boudinaged dikes, or break-up into enclaves. Here we present a case study of the Listino Ring Structure of the Adamello Batholith, N-Italy, where field evidence ranging from undeformed dikes to disaggregated dikes and sheets and enclave trains provides a clear indication of spatial and/or temporal changes of the deformation regime during pluton evolution. Both dike disaggregation and enclave formation were examined using equations derived from the experiments mentioned above, relating preserved length scales (dike width, enclave size) to the yield strength of the magmas and the chamber stirring velocity. This allows us to quantify the observed changes in the deformation regime in terms of changing magma rheology and strain rate. 
Results will be combined with petrologic constraints, and consequences for the interpretation of the deformation history will be discussed. [1] Paterson, S.R., T.K. Fowler Jr, K.L. Schmidt, A.S. Yoshinobu, E.S. Yuan, and R.B. Miller, 1998. Interpreting magmatic fabric patterns in plutons. Lithos 44: 53-82. [2] Hodge, K.F., G. Carazzo, M. Jellinek, 2010. Field and experimental constraints on the deformation and break-up of injected magma, Abstract V54B-04 presented at 2010 Fall Meeting, AGU, San Francisco, California, 13-17 Dec.

Verberne, R.; Hodge, K. F.; Ulmer, P.; Muntener, O.



Quantifying RDX biodegradation in groundwater using δ15N isotope analysis  

NASA Astrophysics Data System (ADS)

Isotope analysis was used to examine the extent of hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) biodegradation in groundwater along a ca. 1.35-km contamination plume. Biodegradation was proposed as a natural attenuation remediation method for the contaminated aquifer. By isotope analysis of RDX, the extent of biodegradation was found to reach up to 99.5% of the initial mass at a distance of 1.15-1.35 km downgradient from the contamination sources. A range of first-order biodegradation rates was calculated based on the degradation extents, with average half-life values ranging between 4.4 and 12.8 years for RDX biodegradation in the upper 15 m of the aquifer, assuming purely aerobic biodegradation, and between 10.9 and 31.2 years, assuming purely anaerobic biodegradation. Based on the geochemical data, an aerobic biodegradation pathway was suggested as the dominant attenuation process at the site. The calculated biodegradation rate was correlated with depth, showing decreasing degradation rates in deeper groundwater layers. Exceptionally low first-order kinetic constants were found in a borehole penetrating the bottom of the aquifer, with half-lives ranging between 85.0 and 161.5 years, assuming purely aerobic biodegradation, and between 207.5 and 394.3 years, assuming purely anaerobic biodegradation. The study showed that stable isotope fractionation analysis is a suitable tool to detect biodegradation of RDX in the environment. Our findings clearly indicated that RDX is naturally biodegraded in the contaminated aquifer. To the best of our knowledge, this is the first reported use of RDX isotope analysis to quantify its biodegradation in contaminated aquifers.
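The first-order kinetics behind these half-life estimates can be sketched as follows; the 40-year residence time used in the example is a hypothetical stand-in, not a value from the study:

```python
import math

# First-order kinetics linking degradation extent, travel time and half-life.
def first_order_rate(extent, residence_time_years):
    # extent: fraction of the initial mass degraded (0 < extent < 1)
    return -math.log(1.0 - extent) / residence_time_years

def half_life(rate_per_year):
    return math.log(2.0) / rate_per_year

k = first_order_rate(0.995, 40.0)  # 99.5% degraded over an assumed 40 years
t_half = half_life(k)
```

With 99.5% of the mass degraded, the inferred half-life is a small fraction of the residence time, which is why even slow plume transport can accommodate multi-year half-lives.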

Bernstein, Anat; Adar, Eilon; Ronen, Zeev; Lowag, Harald; Stichler, Willibald; Meckenstock, Rainer U.



The object-oriented trivia show (TOOTS)  

Microsoft Academic Search

OOPSLA has a longstanding tradition of being a forum for discussing the cutting edge of technology in a fun and participatory environment. The type of events sponsored by OOPSLA sometimes border on the unconventional. This event represents an atypical panel that conforms to the concept of a game show that is focused on questions and answers related to SPLASH, OOPSLA,

Jeff Gray; Jules White



Showing R-Rated Videos in School.  

ERIC Educational Resources Information Center

Since 1990, there have been at least six published court decisions concerning teachers' use of controversial videos in public schools. A relevant district policy led the Colorado Supreme Court to uphold a teacher's termination for showing 12th graders "1900," an R-rated Bertolucci film on fascism. Implications are discussed. (MLH)

Zirkel, Perry A.



Show Them You Really Want the Job  

ERIC Educational Resources Information Center

Showing that one really "wants" the job entails more than just really wanting the job. An interview is part Broadway casting call, part intellectual dating game, part personality test, and part, well, job interview. When there are 300 applicants for a position, many of them will "fit" the required (and even the preferred) skills listed in the job

Perlmutter, David D.




E-print Network

THE CHARLIE ROSE SHOW, December 20, 2004: Discussion with Sir Paul Nurse. CHARLIE ROSE, HOST: ...are going to pop out really new ways that we can think about disease. (END VIDEO CLIP) CHARLIE ROSE: Paul Nurse for the hour, next. (COMMERCIAL BREAK) CHARLIE ROSE: Sir Paul Nurse is here. In 2001, he received

Papavasiliou, F. Nina


Type VII secretion mycobacteria show the way  

Microsoft Academic Search

Recent evidence shows that mycobacteria have developed novel and specialized secretion systems for the transport of extracellular proteins across their hydrophobic, and highly impermeable, cell wall. Strikingly, mycobacterial genomes encode up to five of these transport systems. Two of these systems, ESX-1 and ESX-5, are involved in virulence; both affect the cell-to-cell migration of pathogenic mycobacteria. Here, we

Abdallah M. Abdallah; Nicolaas C. Gey van Pittius; Patricia A. DiGiuseppe Champion; Jeffery Cox; Joen Luirink; Christina M. J. E. Vandenbroucke-Grauls; Ben J. Appelmelk; Wilbert Bitter



Spotted Wing Drosophila adult female. Inset shows  

E-print Network

Spotted Wing Drosophila adult female; the inset shows the saw-like ovipositor used to cut into fruit skin. Spotted Wing Drosophila flies are small and are not true fruit flies like blueberry maggot or cherry fruit fly. Crops that are not soft-fleshed fruit are not at risk.

Goodman, Robert M.


A Talk Show from the Past.  

ERIC Educational Resources Information Center

Describes a two-day activity in which elementary students examine voting rights, the right to assemble, and women's suffrage. Explains the game, "Assemble, Reassemble," and a student-produced talk show with five students playing the roles of leaders of the women's suffrage movement. Profiles Elizabeth Cady Stanton, Lucretia Mott, Susan B. Anthony,

Gallagher, Arlene F.



INTRODUCTION Mitotic metaphase chromosomes show sister chromatids  

E-print Network

Meiosis I bivalents, as mitotic chromosomes, show sister-chromatid centromere and arm cohesion; arm cohesion is released during meiosis I, and centromere cohesion during meiosis II (for review see Moore and Orr-Weaver, 1998). Consequently, this sequential loss of cohesion during meiosis might be precisely

Villefranche sur mer


George Arcement Shows Locations of USGS Streamgages  

USGS Multimedia Gallery

USGS Louisiana Water Science Center Director George Arcement shows the locations of USGS' streamgage network to WAFB Meteorologist Jay Grymes. USGS maintains more than 30 real-time streamgages throughout the area affected by the 2011 Flood. In addition, more than 50 non-real-time gages were...


ShowOrHide 1.0  

NSDL National Science Digital Library

Many Mac users have hidden files located on their computers that they might not know about. ShowOrHide is a utility designed to locate invisible files and folders so that users will have more knowledge about such items. This program is compatible with computers running Mac OS X 10.5 or later.



Quantifying Ocean Acidification During the PETM  

NASA Astrophysics Data System (ADS)

The ocean will absorb increasing amounts of fossil fuel CO2 in the future, with the pH of surface waters decreasing by up to 0.5-0.6 pH units [Caldeira and Wickett, 2003]. The Palaeocene-Eocene Thermal Maximum (PETM) has been suggested as a close palaeo-analogue for future climate change and ocean acidification [Zachos, et al., 2005], as the carbon release is thought to be comparable to that possible over the coming centuries. However, a prerequisite for using the ecological response evaluated during the PETM as a constraint on the future impacts of fossil fuel acidification on ecosystems is knowing how the palaeo-pH changed at this time. The boron isotopic composition (δ11B) of foraminiferal calcite is a proxy for pH [Hemming and Hanson, 1992], but the lack of sufficient amounts of unrecrystallized, single-species foraminiferal calcite from this time interval has prevented the application of established pH proxies. We use in-situ, high-spatial-resolution secondary ionization mass spectrometry (SIMS) to characterize δ11B and B/Ca across the PETM in the benthic foraminifer Oridorsalis umbonatus at deep-sea Maud Rise (Site 690B) and in shelf-depth Lenticulina sp. at Bass River. Mg/Ca indicates a two-step temperature increase from 12.7 °C to 18.5 °C, in agreement with previous work at Maud Rise. Since the boron isotope composition of Paleocene seawater is unknown, we applied the pH estimated by an Earth system model as a starting value. The reconstructed pH record across the PETM shows a large, two-step reduction coeval with the temperature rise, with a recovery to pre-event values significantly more drawn out than that of the isotopic composition of the ocean. Caldeira, K., and M. E. Wickett (2003), Nature, 425, 365. Hemming, N. G., and G. N. Hanson (1992), GCA, 56, 537-543. Zachos, J. C., et al. (2005), Science, 308, 1611-1615.
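The boron-isotope pH proxy referred to above can be sketched numerically. All constants below are illustrative assumptions (modern-seawater δ11B, a nominal fractionation factor, and a nominal pK*B), not the study's calibration; as the abstract notes, the Paleocene seawater value is unknown, which is why the reconstruction is seeded with a model-derived pH:

```python
import math

# Sketch of the boron-isotope pH proxy. Assumed constants: modern-seawater
# d11B_sw = 39.61 permil, fractionation factor alpha = 1.0272, pK*B = 8.6.
def ph_from_d11b(d11b_borate, d11b_sw=39.61, alpha=1.0272, pkb=8.6):
    eps = 1000.0 * (alpha - 1.0)  # permil-offset equivalent of alpha
    return pkb - math.log10(-(d11b_sw - d11b_borate)
                            / (d11b_sw - alpha * d11b_borate - eps))

ph = ph_from_d11b(20.0)  # hypothetical measured borate d11B of 20 permil
```

Higher measured δ11B in the carbonate maps to higher pH, so a negative δ11B excursion across the PETM corresponds to the acidification signal discussed here.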

Schmidt, D. N.; Ridgwell, A.; Kasemann, S. A.; Thomas, E.



Quantifying catchment-scale mixing and its effect on time-varying travel time distributions  

NASA Astrophysics Data System (ADS)

Travel time distributions are often used to characterize catchment discharge behavior, catchment vulnerability to pollution and pollutant loads from catchments to downstream waters. However, these distributions vary with time because they are a function of rainfall and evapotranspiration. It is important to account for these variations when the time scale of interest is smaller than the typical time-scale over which average travel time distributions can be derived. Recent studies have suggested that subsurface mixing controls how rainfall and evapotranspiration affect the variability in travel time distributions of discharge. To quantify this relation between subsurface mixing and dynamics of travel time distributions, we propose a new transformation of travel time that yields transformed travel time distributions, which we call Storage Outflow Probability (STOP) functions. STOP functions quantify the probability for water parcels in storage to leave a catchment via discharge or evapotranspiration. We show that this is equal to quantifying mixing within a catchment. Compared to the similar Age function introduced by Botter et al. (2011), we show that STOP functions are more constant in time, have a clearer physical meaning and are easier to parameterize. Catchment-scale STOP functions can be approximated by a two-parameter beta distribution. One parameter quantifies the catchment preference for discharging young water; the other parameter quantifies the preference for discharging old water from storage. Because of this simple parameterization, the STOP function is an innovative tool to explore the effects of catchment mixing behavior, seasonality and climate change on travel time distributions and the related catchment vulnerability to pollution spreading.
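The proposed two-parameter beta parameterization of a STOP function can be sketched as follows (variable names and the integration check are ours, not the paper's notation):

```python
import math

# Two-parameter beta density as a STOP-function sketch over a normalized
# storage coordinate p in (0, 1): one parameter controls the preference for
# releasing young water (near p = 0), the other for old water (near p = 1).
def stop_pdf(p, a, b):
    norm = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return p ** (a - 1) * (1.0 - p) ** (b - 1) / norm

# midpoint-rule check that the density integrates to ~1 over (0, 1)
n = 10000
mass = sum(stop_pdf((i + 0.5) / n, 2.0, 5.0) for i in range(n)) / n
```

With a < b the density concentrates near p = 0, i.e. a catchment preference for discharging young water; swapping the parameters expresses a preference for old water.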

van der Velde, Y.; Torfs, P. J. J. F.; van der Zee, S. E. A. T. M.; Uijlenhoet, R.



Quantum Process Tomography Quantifies Coherence Transfer Dynamics in Vibrational Exciton  

PubMed Central

Quantum coherence has been a subject of great interest in many scientific disciplines. However, detailed characterization of the quantum coherence in molecular systems, especially its transfer and relaxation mechanisms, still remains a major challenge. The difficulties arise in part because the spectroscopic signatures of the coherence transfer are typically overwhelmed by other excitation relaxation processes. We use quantum process tomography (QPT) via two-dimensional infrared spectroscopy to quantify the rate of the elusive coherence transfer between two vibrational exciton states. QPT retrieves the dynamics of the dissipative quantum system directly from the experimental observables. It thus serves as an experimental alternative to theoretical models of the system-bath interaction, and can be used to validate these theories. Our results for coupled carbonyl groups of a diketone molecule in chloroform, used as a benchmark system, reveal the non-secular nature of the interaction between the exciton and the Markovian bath and open the door for the systematic studies of the dissipative quantum systems dynamics in detail. PMID:24079417

Chuntonov, Lev; Ma, Jianqiang



Quantifying the abnormal hemodynamics of sickle cell anemia  

NASA Astrophysics Data System (ADS)

Sickle red blood cells (SS-RBC) exhibit heterogeneous morphologies and abnormal hemodynamics in deoxygenated states. A multi-scale model for SS-RBC is developed based on the Dissipative Particle Dynamics (DPD) method. Different cell morphologies (sickle, granular, elongated shapes) typically observed in deoxygenated states are constructed and quantified by the Asphericity and Elliptical shape factors. The hemodynamics of SS-RBC suspensions is studied in both shear and pipe flow systems. The flow resistance obtained from both systems exhibits a larger value than that of healthy blood flow due to the abnormal cell properties. Moreover, SS-RBCs exhibit abnormal adhesive interactions with both the vessel endothelium cells and the leukocytes. The effect of the abnormal adhesive interactions on the hemodynamics of sickle blood is investigated using the current model. It is found that both the SS-RBC - endothelium and the SS-RBC - leukocyte interactions can potentially trigger the vicious "sickling and entrapment" cycles, resulting in the vaso-occlusion phenomena widely observed in micro-circulation experiments.

Lei, Huan; Karniadakis, George



Rapidly quantifying the relative distention of a human bladder  

NASA Technical Reports Server (NTRS)

A device and method of rapidly quantifying the relative distention of the bladder in a human subject are disclosed. The ultrasonic transducer, which is positioned on the subject in proximity to the bladder, is excited by a pulser under the command of a microprocessor to launch an acoustic wave into the patient. This wave interacts with the bladder walls and is reflected back to the ultrasonic transducer, where it is received, amplified and processed by the receiver. The resulting signal is digitized by an analog-to-digital converter under the command of the microprocessor and is stored in the data memory. The software in the microprocessor determines the relative distention of the bladder as a function of the propagated ultrasonic energy and, based on programmed scientific measurements and the individual, anatomical, and behavioral characteristics of the specific subject as contained in the program memory, sends out a signal to turn on any or all of the audible alarm, the visible alarm, the tactile alarm, and the remote wireless alarm.

Companion, John A. (inventor); Heyman, Joseph S. (inventor); Mineo, Beth A. (inventor); Cavalier, Albert R. (inventor); Blalock, Travis N. (inventor)



Rapidly quantifying the relative distention of a human bladder  

NASA Technical Reports Server (NTRS)

A device and method were developed to rapidly quantify the relative distention of the bladder of a human subject. An ultrasonic transducer is positioned on the human subject near the bladder. A microprocessor-controlled pulser excites the transducer to send an acoustic wave into the human subject. This wave interacts with the bladder walls and is reflected back to the ultrasonic transducer, where it is received, amplified, and processed by the receiver. The resulting signal is digitized by an analog-to-digital converter, again controlled by the microprocessor, and is stored in data memory. The software in the microprocessor determines the relative distention of the bladder as a function of the propagated ultrasonic energy. Based on programmed scientific measurements and the human subject's past history as contained in program memory, the microprocessor sends out a signal to turn on any or all of the available alarms. The alarm system includes an audible alarm, a visible alarm, a tactile alarm, and a remote wireless alarm.

Companion, John A. (inventor); Heyman, Joseph S. (inventor); Mineo, Beth A. (inventor); Cavalier, Albert R. (inventor); Blalock, Travis N. (inventor)



Graphical methods for quantifying macromolecules through bright field imaging.  


Bright field imaging of biological samples stained with antibodies and/or special stains provides a rapid protocol for visualizing various macromolecules. However, this method of sample staining and imaging is rarely employed for direct quantitative analysis due to variations in sample fixations, ambiguities introduced by color composition and the limited dynamic range of imaging instruments. We demonstrate that, through the decomposition of color signals, staining can be scored on a cell-by-cell basis. We have applied our method to fibroblasts grown from histologically normal breast tissue biopsies obtained from two distinct populations. Initially, nuclear regions are segmented through conversion of color images into gray scale, and detection of dark elliptic features. Subsequently, the strength of staining is quantified by a color decomposition model that is optimized by a graph cut algorithm. In rare cases where nuclear signal is significantly altered as a result of sample preparation, nuclear segmentation can be validated and corrected. Finally, segmented stained patterns are associated with each nuclear region following region-based tessellation. Compared to classical non-negative matrix factorization, the proposed method (i) improves color decomposition, (ii) has better noise immunity, (iii) is more invariant to initial conditions and (iv) has superior computing performance. PMID:18703588
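As a general illustration of color decomposition (a simpler Beer-Lambert least-squares unmixing, NOT the authors' graph-cut-optimized model), per-pixel stain contributions can be separated in optical-density space; the two stain vectors below are hypothetical:

```python
import math

# Beer-Lambert optical-density unmixing for two stains, per pixel,
# solved by least squares. Stain vectors are hypothetical examples.
def optical_density(rgb, blank=255.0):
    return [math.log10(blank / max(v, 1e-6)) for v in rgb]

def unmix(pixel_rgb, stain_a, stain_b):
    # solve min ||od - x_a * a - x_b * b||^2 via 2x2 normal equations
    y = optical_density(pixel_rgb)
    a, b = stain_a, stain_b
    aa = sum(u * u for u in a)
    bb = sum(u * u for u in b)
    ab = sum(u * v for u, v in zip(a, b))
    ay = sum(u * v for u, v in zip(a, y))
    by = sum(u * v for u, v in zip(b, y))
    det = aa * bb - ab * ab
    return (ay * bb - by * ab) / det, (by * aa - ay * ab) / det

stain_a = [0.65, 0.70, 0.29]  # hypothetical absorption vector, stain A
stain_b = [0.07, 0.99, 0.11]  # hypothetical absorption vector, stain B
pixel = [255.0 * 10.0 ** (-c) for c in stain_a]  # pixel of pure stain A
x_a, x_b = unmix(pixel, stain_a, stain_b)
```

The paper's contribution is replacing this kind of independent per-pixel estimate with a spatially regularized decomposition optimized by graph cuts.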

Chang, Hang; DeFilippis, Rosa Anna; Tlsty, Thea D; Parvin, Bahram



Quantifying uncertainty in brain network measures using Bayesian connectomics  

PubMed Central

The wiring diagram of the human brain can be described in terms of graph measures that characterize structural regularities. These measures require an estimate of whole-brain structural connectivity for which one may resort to deterministic or thresholded probabilistic streamlining procedures. While these procedures have provided important insights about the characteristics of human brain networks, they ultimately rely on unwarranted assumptions such as those of noise-free data or the use of an arbitrary threshold. Therefore, resulting structural connectivity estimates as well as derived graph measures fail to fully take into account the inherent uncertainty in the structural estimate. In this paper, we illustrate an easy way of obtaining posterior distributions over graph metrics using Bayesian inference. It is shown that this posterior distribution can be used to quantify uncertainty about graph-theoretical measures at the single subject level, thereby providing a more nuanced view of the graph-theoretical properties of human brain connectivity. We refer to this model-based approach to connectivity analysis as Bayesian connectomics. PMID:25339896
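The sampling idea can be sketched as follows, with `edge_prob` standing in for posterior edge probabilities obtained from Bayesian inference (the inference step itself is omitted; values below are hypothetical):

```python
import random
import statistics

# Sketch: propagate uncertainty in structural connectivity into a graph metric.
# Each sample draws one plausible adjacency matrix from the posterior edge
# probabilities, yielding a posterior distribution over the metric.
def sample_density(edge_prob, n_samples=1000, seed=0):
    rng = random.Random(seed)
    n = len(edge_prob)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    densities = []
    for _ in range(n_samples):
        edges = sum(1 for i, j in pairs if rng.random() < edge_prob[i][j])
        densities.append(edges / len(pairs))
    return densities

# hypothetical 4-node posterior edge probabilities (symmetric, zero diagonal)
p = [[0.0, 0.9, 0.1, 0.5],
     [0.9, 0.0, 0.7, 0.2],
     [0.1, 0.7, 0.0, 0.4],
     [0.5, 0.2, 0.4, 0.0]]
densities = sample_density(p)
mean_density = statistics.mean(densities)
```

A thresholded estimate would report a single density; the sampled distribution instead quantifies how uncertain that number is for a single subject.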

Janssen, Ronald J.; Hinne, Max; Heskes, Tom; van Gerven, Marcel A. J.



Quantifying the Rheological and Hemodynamic Characteristics of Sickle Cell Anemia  

PubMed Central

Sickle erythrocytes exhibit abnormal morphology and membrane mechanics under deoxygenated conditions due to the polymerization of hemoglobin S. We employed dissipative particle dynamics to extend a validated multiscale model of red blood cells (RBCs) to represent different sickle cell morphologies based on a simulated annealing procedure and experimental observations. We quantified cell distortion using asphericity and elliptical shape factors, and the results were consistent with a medical image analysis. We then studied the rheology and dynamics of sickle RBC suspensions under constant shear and in a tube. In shear flow, the transition from shear-thinning to shear-independent flow revealed a profound effect of cell membrane stiffening during deoxygenation, with granular RBC shapes leading to the greatest viscosity. In tube flow, the increase of flow resistance by granular RBCs was also greater than the resistance of blood flow with sickle-shape RBCs. However, no occlusion was observed in a straight tube under any conditions unless an adhesive dynamics model was explicitly incorporated into simulations that partially trapped sickle RBCs, which led to full occlusion in some cases. PMID:22339854
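One common way to quantify cell distortion is the asphericity shape factor computed from gyration-tensor eigenvalues; this is a sketch of that standard definition, and the paper's exact shape factors may differ in detail:

```python
# Asphericity from gyration-tensor eigenvalues l1 >= l2 >= l3:
# 0 for a sphere, 1 for an infinitely thin rod.
def asphericity(l1, l2, l3):
    s = l1 + l2 + l3
    return ((l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2) / (2.0 * s * s)
```

Intermediate values separate the near-spherical granular shapes from the strongly elongated sickle and elongated morphologies.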

Lei, Huan; Karniadakis, George Em



Quantifying seabed properties in shelf waters using a parametric sonar  

NASA Astrophysics Data System (ADS)

Defence Research Establishment Atlantic is developing a bottom-tethered, wide-band sonar for collecting acoustic data in the open ocean. The transmitter, a parametric array, offers three advantages: a wide bandwidth (1-10 kHz), a narrow beamwidth (≈3°) and virtually no sidelobes. These features allow direct measurement of seabed parameters in shallow water. Direct in this context means the absence of complications resulting from unwanted interactions of the acoustic pulse with ocean boundaries. This makes the parametric sonar an ideal tool with which to interrogate the seabed in shelf waters and quantify several geo-acoustic properties. To complement the narrow-beam active sonar, a six-channel superdirective/intensity array has been developed for the receiver. The superdirective receiver obtains a significantly narrower beam for a given array aperture than that obtained using a conventional acoustic receiver. A 900 MHz rf command link is used to steer the array to any combination of azimuth and tilt angle. Together with control over azimuth and tilt angle, the sonar frame is instrumented to monitor depth, roll and vertical acceleration to ensure quality control of the data. Data transmission back to the ship is accomplished via a 2.3 GHz rf data link capable of a data-transfer rate of up to 8 Mbits s-1. This paper describes the system's technical functionality, its acoustic principles of operation and its measurement application.

Hines, Paul C.



Quantifying seismic survey reverberation off the Alaskan North Slope.  


Shallow-water airgun survey activities off the North Slope of Alaska generate impulsive sounds that are the focus of much regulatory attention. Reverberation from repetitive airgun shots, however, can also increase background noise levels, which can decrease the detection range of nearby passive acoustic monitoring (PAM) systems. Typical acoustic metrics for impulsive signals provide no quantitative information about reverberation or its relative effect on the ambient acoustic environment. Here, two conservative metrics are defined for quantifying reverberation: a minimum level metric measures reverberation levels that exist between airgun pulse arrivals, while a reverberation metric estimates the relative magnitude of reverberation vs expected ambient levels in the hypothetical absence of airgun activity, using satellite-measured wind data. The metrics are applied to acoustic data measured by autonomous recorders in the Alaskan Beaufort Sea in 2008 and demonstrate how seismic surveys can increase the background noise over natural ambient levels by 30-45 dB within 1 km of the activity, by 10-25 dB within 15 km of the activity, and by a few dB at 128 km range. These results suggest that shallow-water reverberation would reduce the performance of nearby PAM systems when monitoring for marine mammals within a few kilometers of shallow-water seismic surveys. PMID:22087932
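The two metrics are described only qualitatively above; a minimal sketch of how they might be computed from a sampled received-level series follows (the guard interval around each shot and the example values are our assumptions):

```python
# Two conservative reverberation metrics on a series of received levels (dB):
# a minimum-level metric between consecutive airgun pulses, and a
# reverberation metric relative to a wind-predicted ambient level.
def minimum_level_metric(levels_db, shot_indices, guard=1):
    mins = []
    for a, b in zip(shot_indices, shot_indices[1:]):
        window = levels_db[a + guard:b - guard]  # exclude guard around shots
        if window:
            mins.append(min(window))
    return mins

def reverberation_metric(min_levels_db, predicted_ambient_db):
    # dB above the ambient expected in the hypothetical absence of airguns
    return [m - predicted_ambient_db for m in min_levels_db]

levels = [120, 90, 85, 84, 86, 119, 88, 83, 85, 120]  # synthetic series
mins = minimum_level_metric(levels, [0, 5, 9])
excess = reverberation_metric(mins, 70)
```

Using the minimum between pulses keeps the metric conservative: it reports the floor that reverberation imposes on the ambient, not the pulse levels themselves.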

Guerra, Melania; Thode, Aaron M; Blackwell, Susanna B; Michael Macrander, A



Quantifying Square Membrane Wrinkle Behavior Using MITC Shell Elements  

NASA Technical Reports Server (NTRS)

For future membrane-based structures, quantified predictions of membrane wrinkling behavior in terms of amplitude, angle and wavelength are needed to optimize the efficiency and integrity of such structures, as well as their associated control systems. For numerical analyses performed in the past, limitations on the accuracy of membrane distortion simulations have often been related to the assumptions made while using finite elements. Specifically, this work demonstrates that critical assumptions include the effects of gravity, assumed initial or boundary conditions, and the type of element used to model the membrane. In this work, a 0.2 square meter membrane is treated as a structural material with non-negligible bending stiffness. Mixed Interpolation of Tensorial Components (MITC) shell elements are used to simulate wrinkling behavior due to a constant applied in-plane shear load. Membrane thickness, gravity effects, and initial imperfections with respect to flatness were varied in numerous nonlinear analysis cases. Significant findings include notable variations in wrinkle modes for thicknesses in the range of 50 microns to 1000 microns, which also depend on the presence of an applied gravity field. However, it is revealed that relationships between overall strain energy density for cases with differing initial conditions are independent of the assumed initial conditions. In addition, analysis results indicate that the relationship between amplitude scale (W/t) and structural scale (L/t) is linear in the presence of a gravity field.

Jacobson, Mindy B.; Iwasa, Takashi; Natori, M. C.



Quantifying the Benefits of Combining Offshore Wind and Wave Energy  

NASA Astrophysics Data System (ADS)

For many locations the offshore wind resource and the wave energy resource are collocated, which suggests a natural synergy if both technologies are combined into one offshore marine renewable energy plant. Initial meteorological assessments of the western coast of the United States suggest only a weak correlation in power levels of wind and wave energy at any given hour associated with the large ocean basin wave dynamics and storm systems of the North Pacific. This finding indicates that combining the two power sources could reduce the variability in electric power output from a combined wind and wave offshore plant. A combined plant is modeled with offshore wind turbines and Pelamis wave energy converters with wind and wave data from meteorological buoys operated by the US National Buoy Data Center off the coast of California, Oregon, and Washington. This study will present results of quantifying the benefits of combining wind and wave energy for the electrical power system to facilitate increased renewable energy penetration to support reductions in greenhouse gas emissions, and air and water pollution associated with conventional fossil fuel power plants.
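The variability reduction follows from the variance of a sum of two power time series; a sketch:

```python
# Standard deviation of combined wind + wave power output for correlation rho:
# Var(W + V) = Var(W) + Var(V) + 2*rho*sd(W)*sd(V). A weak correlation
# (rho near 0) gives a smaller combined spread than the fully correlated case.
def combined_std(sigma_wind, sigma_wave, rho):
    return (sigma_wind ** 2 + sigma_wave ** 2
            + 2.0 * rho * sigma_wind * sigma_wave) ** 0.5
```

With equal resource variabilities, zero correlation reduces the combined standard deviation by a factor of √2 relative to the perfectly correlated case, which is the smoothing benefit the study quantifies.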

Stoutenburg, E.; Jacobson, M. Z.



Quantifying the limitations of small animal positron emission tomography  

NASA Astrophysics Data System (ADS)

The application of position-sensitive semiconductor detectors in medical imaging is a field of global research interest. The Monte-Carlo simulation toolkit GEANT4 [] was employed to improve the understanding of detailed γ-ray interactions within the small animal Positron Emission Tomography (PET), high-purity germanium (HPGe) imaging system, SmartPET [A.J. Boston, et al., Oral contribution, ANL, Chicago, USA, 2006]. This system has shown promising results in the fields of PET [R.J. Cooper, et al., Nucl. Instr. and Meth. A (2009), accepted for publication] and Compton camera imaging [J.E. Gillam, et al., Nucl. Instr. and Meth. A 579 (2007) 76]. Images for a selection of single and multiple point, line and phantom sources were successfully reconstructed using both filtered back-projection (FBP) [A.R. Mather, Ph.D. Thesis, University of Liverpool, 2007] and an iterative reconstruction algorithm [A.R. Mather, Ph.D. Thesis, University of Liverpool, 2007]. Simulated data were exploited as an alternative route to a reconstructed image, allowing full quantification of the image distortions introduced in each phase of the data processing. Quantifying the contribution of uncertainty in all system components, from detector to reconstruction algorithm, allows the areas in need of most attention in the SmartPET project and semiconductor PET to be addressed.

Oxley, D. C.; Boston, A. J.; Boston, H. C.; Cooper, R. J.; Cresswell, J. R.; Grint, A. N.; Nolan, P. J.; Scraggs, D. P.; Lazarus, I. H.; Beveridge, T. E.



Quantifying determinants of cash crop expansion and their relative effects using logistic regression modeling and variance partitioning  

NASA Astrophysics Data System (ADS)

Cash crop expansion has been a major land use change in tropical and subtropical regions worldwide. Quantifying the determinants of cash crop expansion should provide deeper spatial insights into the dynamics and ecological consequences of cash crop expansion. This paper investigated the process of cash crop expansion in Hangzhou region (China) from 1985 to 2009 using remotely sensed data. The corresponding determinants (neighborhood, physical, and proximity) and their relative effects during three periods (1985-1994, 1994-2003, and 2003-2009) were quantified by logistic regression modeling and variance partitioning. Results showed that the total area of cash crops increased from 58,874.1 ha in 1985 to 90,375.1 ha in 2009, a net growth of 53.5%. Cash crops were more likely to be grown in loam soils, while steep, higher-elevation areas were less likely to experience cash crop expansion. A consistently higher probability of cash crop expansion was found in places with abundant farmland and forest cover in all three periods. In addition, distance to rivers and lakes, distance to the county center, and distance to provincial roads were decisive determinants of farmers' choice of cash crop plantation. Different categories of determinants and their combinations exerted different influences on cash crop expansion. The joint effects of neighborhood and proximity determinants were the strongest, and the unique effect of physical determinants decreased with time. Our study contributes to understanding of the proximate drivers of cash crop expansion in subtropical regions.
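The variance-partitioning arithmetic behind "unique" and "joint" effects can be sketched for two determinant groups (the study partitions three; the pseudo-R² values below are hypothetical):

```python
# Variance partitioning for two determinant groups A (e.g. neighborhood) and
# B (e.g. proximity) from the pseudo-R^2 of three fitted logistic models:
# A alone, B alone, and A + B together. All R^2 values are hypothetical.
def partition(r2_a, r2_b, r2_ab):
    unique_a = r2_ab - r2_b          # explained only when A is included
    unique_b = r2_ab - r2_a          # explained only when B is included
    joint = r2_a + r2_b - r2_ab      # shared (joint) effect; can be negative
    return unique_a, unique_b, joint

unique_a, unique_b, joint = partition(0.30, 0.25, 0.40)
```

The two unique fractions and the joint fraction sum back to the full model's R², which is what lets the paper compare the relative weight of each determinant category over time.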

Xiao, Rui; Su, Shiliang; Mai, Gengchen; Zhang, Zhonghao; Yang, Chenxue



Comparing 3D Gyrification Index and area-independent curvature-based measures in quantifying neonatal brain folding  

NASA Astrophysics Data System (ADS)

In this work we compare 3D Gyrification Index and our recently proposed area-independent curvature-based surface measures [26] for the in-vivo quantification of brain surface folding in clinically acquired neonatal MR image data. A meaningful comparison of gyrification across brains of different sizes and their subregions will only be possible through the quantification of folding with measures that are independent of the area of the region of analysis. This work uses a 3D implementation of the classical Gyrification Index, a 2D measure that quantifies folding based on the ratio of the inner and outer contours of the brain and which has been used to study gyral patterns in adults with schizophrenia, among other conditions. The new surface curvature-based measures and the 3D Gyrification Index were calculated on twelve premature infants (age 28-37 weeks) from which surfaces of cerebrospinal fluid/gray matter (CSF/GM) interface and gray matter/white matter (GM/WM) interface were extracted. Experimental results show that our measures better quantify folding on the CSF/GM interface than Gyrification Index, and perform similarly on the GM/WM interface.
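The 3D Gyrification Index reduces to a surface-area ratio; a minimal sketch showing its invariance to uniform scaling (the area values are illustrative):

```python
# 3D Gyrification Index: folded cortical surface area over the area of its
# outer hull. Uniform scaling by s multiplies both areas by s**2 and cancels,
# so the index is comparable across brains of different overall size.
def gyrification_index(folded_area, hull_area):
    return folded_area / hull_area

gi = gyrification_index(4.0, 2.0)
scale2 = 1.3 ** 2  # area factor for a 1.3x uniform scaling
gi_scaled = gyrification_index(4.0 * scale2, 2.0 * scale2)
```

Scale invariance is distinct from area independence: the ratio still depends on how large a region of analysis is drawn, which is the limitation the curvature-based measures in this paper are designed to avoid.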

Rodriguez-Carranza, Claudia E.; Mukherjee, P.; Vigneron, Daniel; Barkovich, James; Studholme, Colin



Analysing, quantifying and modelling soil erosion on steep hillslopes in different climatic areas using LiDAR and SFM DEMs  

NASA Astrophysics Data System (ADS)

Soil erosion is a well-known problem worldwide and has therefore been subject to various scientific studies, especially on agricultural areas. However, soil erosion on steep hillslopes in mountainous drainage basins can be a threat to human infrastructure, as it supplies material, e.g. for debris flows to torrents. The study presented here aims to analyse, quantify and model soil erosion on (very) steep hillslopes free of vegetation in different climatic areas ranging from South Germany to Central Italy. Multitemporal digital elevation models were acquired with terrestrial laser scanning and from terrestrial and aerial structure-from-motion-based imagery. Analysis of erosion is mainly based on slope wash and rill erosion during summer months, as well as erosion through freezing and melting processes during winter months, in catchments of the Bavarian Alps. Erosional processes in the Mediterranean are mainly controlled by different precipitation regimes throughout the year. Annual erosion and accumulation rates are quantified and used for modelling purposes. First results of the project show that the amount of material eroded is mainly controlled by the size of the sediment-contributing area. However, there are also other controlling factors, such as slope angle, slope length and vegetation cover, which are investigated within this project.

Neugirg, Fabian; Haas, Florian; Kaiser, Andreas; Schmidt, Jürgen; Becht, Michael



Worldwide trends show oropharyngeal cancer rate increasing

DCEG scientists report that the incidence of oropharyngeal cancer significantly increased in economically developed countries during the period 1983-2002. The results of this study appeared online in the Journal of Clinical Oncology on November 18, 2013.


Visual modeling shows that avian host parents use multiple visual cues in rejecting parasitic eggs  

PubMed Central

One of the most striking outcomes of coevolution between species is egg mimicry by brood parasitic birds, resulting from rejection behavior by discriminating host parents. Yet, how exactly does a host detect a parasitic egg? Brood parasitism and egg rejection behavior provide a model system for exploring the relative importance of different visual cues used in a behavioral task. Although hosts are discriminating, we do not know exactly what cues they use, and to answer this it is crucial to account for the receiver's visual perception. Color, luminance (perceived lightness) and pattern information have never been simultaneously quantified and experimentally tested through a bird's eye. The cuckoo finch Anomalospiza imberbis and its hosts show spectacular polymorphisms in egg appearance, providing a good opportunity for investigating visual discrimination owing to the large range of patterns and colors involved. Here we combine field experiments in Africa with modeling of avian color vision and pattern discrimination to identify the specific visual cues used by hosts in making rejection decisions. We found that disparity between host and foreign eggs in both color and several aspects of pattern (dispersion, principal marking size, and variability in marking size) were important predictors of rejection, especially color. These cues correspond exactly to the principal differences between host and parasitic eggs, showing that hosts use the most reliable available cues in making rejection decisions, and select for parasitic eggs that are increasingly mimetic in a range of visual attributes. PMID:20421497

Spottiswoode, Claire N.; Stevens, Martin



Idaho State University Physics Road Show  

NASA Astrophysics Data System (ADS)

The ISU Physics Road Show serves over 40 schools and 12,000 students each year. Exciting and informative demonstration shows are conducted during assemblies at elementary, middle, and junior high schools. Discussion will focus on efforts taken to maximize the educational impact on students and teachers. These efforts include supplemental information and materials provided to teachers, teacher workshops, and careful tailoring of subject material to state and national education standards. A few sample demonstrations will be performed, including the boiling green water sucker, a magnet strongly repelled from a cooled copper disc, an artificial geyser that shoots water 6 meters, and a few liquid nitrogen tricks. This program is supported in part by a grant from the Idaho Community Foundation.

Shropshire, Steve



Do dogs ( Canis familiaris ) show contagious yawning?  

Microsoft Academic Search

We report an experimental investigation into whether domesticated dogs display contagious yawning. Fifteen dogs were shown video clips of (1) humans and (2) dogs displaying yawns and open-mouth expressions (not yawns) to investigate whether dogs showed contagious yawning to either of these social stimuli. Only one dog performed significantly more yawns during or shortly after viewing yawning videos than to

Aimee L. Harr; Valerie R. Gilbert; Kimberley A. Phillips



Quantifying Mixing using Magnetic Resonance Imaging  

PubMed Central

Mixing is a unit operation that combines two or more components into a homogeneous mixture. This work involves mixing two viscous liquid streams using an in-line static mixer. The mixer is a split-and-recombine design that employs shear and extensional flow to increase the interfacial contact between the components. A prototype split-and-recombine (SAR) mixer was constructed by aligning a series of thin laser-cut poly(methyl methacrylate) (PMMA) plates held in place in a PVC pipe. Mixing in this device is illustrated in the photograph in Fig. 1. Red dye was added to a portion of the test fluid and used as the minor component being mixed into the major (undyed) component. At the inlet of the mixer, the injected layer of tracer fluid is split into two layers as it flows through the mixing section. On each subsequent mixing section, the number of horizontal layers is doubled. Ultimately, the single stream of dye is uniformly dispersed throughout the cross section of the device. Using a non-Newtonian test fluid of 0.2% Carbopol and a doped tracer fluid of similar composition, mixing in the unit is visualized using magnetic resonance imaging (MRI). MRI is a very powerful experimental probe of molecular chemical and physical environment as well as sample structure on length scales from microns to centimeters. This sensitivity has resulted in broad application of these techniques to characterize physical, chemical and/or biological properties of materials ranging from humans to foods to porous media (1, 2). The equipment and conditions used here are suitable for imaging liquids containing substantial amounts of NMR-mobile 1H such as ordinary water and organic liquids including oils. Traditionally, MRI has utilized superconducting magnets, which are not suitable for industrial environments and are not portable within a laboratory (Fig. 2). 
Recent advances in magnet technology have permitted the construction of large volume industrially compatible magnets suitable for imaging process flows. Here, MRI provides spatially resolved component concentrations at different axial locations during the mixing process. This work documents real-time mixing of highly viscous fluids via distributive mixing with an application to personal care products. PMID:22314707

Tozzi, Emilio J.; McCarthy, Kathryn L.; Bacca, Lori A.; Hartt, William H.; McCarthy, Michael J.



Isotopes in Urban Cheatgrass Quantify Atmospheric Pollution  

NASA Astrophysics Data System (ADS)

This study presents evidence that the nitrogen and carbon stable isotope values of vegetation can be used as integrators of ephemeral atmospheric pollution signals. Leaves and stems of Bromus tectorum and soil samples were collected in the urban Salt Lake Valley and in the rural Skull Valley of Utah. These samples were used to develop a map of the spatial distribution of δ13C and δ15N values of leaves and stems of Bromus tectorum and soils around each valley. The spatial distributions of δ15N values of leaves and stems of Bromus tectorum and of associated soils were significantly correlated. The average δ15N value for Salt Lake Valley Bromus tectorum leaves and stems was 2.37‰ while the average value for Skull Valley Bromus tectorum leaves and stems was 4.76‰. It is possible that the higher concentration of atmospheric nitrogen pollutants measured in the Salt Lake Valley provided the δ15N-depleted nitrogen source for uptake by plants and deposition on soils, though the δ15N value of source nitrogen was not measured directly. The presence of a seasonal difference in δ15N values of leaves and stems of Bromus tectorum sampled in Salt Lake Valley but not in Skull Valley further supports this idea. Leaves and stems of Bromus tectorum sampled in the Salt Lake Valley in April 2003 had a statistically more positive average δ15N value of 2.4‰ than samples collected in August 2003, which had an average δ15N value of 0.90‰. The carbon isotope values of leaves and stems of Bromus tectorum and air samples collected in Salt Lake Valley were more negative than values measured in Skull Valley samples (Salt Lake δ13Cplant = -28.50‰ and δ13Cair = -9.32‰; Skull Valley δ13Cplant = -27.58‰ and δ13Cair = -8.52‰). This supports the idea that differences in stable isotope values of source air are correlated with differences in stable isotope values of exposed vegetation. 
Overall, the results of this study suggest that the carbon and nitrogen stable isotope values measured in vegetation are useful indicators of differences in atmospheric pollutant concentration in urban and rural areas.
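The δ13C and δ15N values above use standard delta notation: the per-mil deviation of a sample's isotope ratio from a reference standard. A sketch with hypothetical ratios (atmospheric N2 is the conventional δ15N reference):

```python
def delta_permil(r_sample, r_standard):
    """Delta value in per mil: (R_sample / R_standard - 1) * 1000,
    where R is an isotope ratio such as 15N/14N or 13C/12C."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Hypothetical sample and standard 15N/14N ratios.
d15n = delta_permil(0.0036786, 0.0036765)  # slightly enriched, so positive
```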

Kammerdiener, S. A.; Ehleringer, J. R.



Quantifying Regional Measurement Requirements for ASCENDS  

NASA Astrophysics Data System (ADS)

Quantification of greenhouse gas fluxes at regional and local scales is required by the Kyoto protocol and potential follow-up agreements, and their accompanying implementation mechanisms (e.g., cap-and-trade schemes and treaty verification protocols). Dedicated satellite observations, such as those provided by the Greenhouse gases Observing Satellite (GOSAT), the upcoming Orbiting Carbon Observatory (OCO-2), and future active missions, particularly Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) and Advanced Space Carbon and Climate Observation of Planet Earth (A-SCOPE), are poised to play a central role in this endeavor. In order to prepare for the ASCENDS mission, we are applying the Stochastic Time-Inverted Lagrangian Transport (STILT) model driven by meteorological fields from a customized version of the Weather Research and Forecasting (WRF) model to generate surface influence functions for ASCENDS observations. These "footprints" (or adjoint) express the sensitivity of observations to surface fluxes in the upwind source regions and thus enable the computation of a posteriori flux error reductions resulting from the inclusion of satellite observations (taking into account the vertical sensitivity and error characteristics of the latter). The overarching objective of this project is the specification of the measurement requirements for the ASCENDS mission, with a focus on policy-relevant regional scales. Several features make WRF-STILT an attractive tool for regional analysis of satellite observations: 1) WRF meteorology is available at higher resolution than for global models and is thus more realistic, 2) The Lagrangian approach minimizes numerical diffusion present in Eulerian models, 3) The WRF-STILT coupling has been specifically designed to achieve good mass conservation characteristics, and 4) The receptor-oriented approach offers a relatively straightforward way to compute the adjoint of the transport model. 
These aspects allow the model to compute surface influences for satellite observations at high spatiotemporal resolution and to generate realistic flux error and flux estimates at policy-relevant scales. The main drawbacks of the Lagrangian approach to satellite simulations are inefficiency and storage requirements, but these obstacles can be overcome by taking advantage of modern computing resources (the current runs are being performed on the NASA Pleiades supercomputer). We gratefully acknowledge funding by the NASA Atmospheric CO2 Observations from Space Program (grant NNX10AT87G).
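The a posteriori flux error reduction mentioned above follows the standard Bayesian update, whose posterior covariance is (B^-1 + H^T R^-1 H)^-1. A scalar sketch for a single flux, with hypothetical footprint sensitivities and error variances (not the project's actual WRF-STILT configuration):

```python
def posterior_flux_variance(prior_var, obs_err_var, sensitivities):
    """Posterior variance of one flux given its prior variance, the
    observation error variance, and footprint sensitivities h_i
    (scalar form of (B^-1 + H^T R^-1 H)^-1)."""
    precision = 1.0 / prior_var + sum(h * h / obs_err_var for h in sensitivities)
    return 1.0 / precision

prior_var = 1.0
post_var = posterior_flux_variance(prior_var, obs_err_var=0.5,
                                   sensitivities=[0.6, 0.8])
# Fractional reduction in flux uncertainty (standard deviation).
error_reduction = 1.0 - (post_var / prior_var) ** 0.5
```

Adding observations (more sensitivities) only increases the precision sum, so the posterior variance can never exceed the prior, mirroring how satellite data tighten regional flux estimates.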

Mountain, M. E.; Eluszkiewicz, J.; Nehrkorn, T.; Hegarty, J. D.; Aschbrenner, R.; Henderson, J.; Zaccheo, S.



Quantifying the Carbon Intensity of Biomass Energy  

NASA Astrophysics Data System (ADS)

Regulatory agencies at the national and regional level have recognized the importance of quantitative information about greenhouse gas emissions from biomass used in transportation fuels or in electricity generation. For example, in the recently enacted California Low-Carbon Fuel Standard, the California Air Resources Board conducted a comprehensive study to determine an appropriate methodology for setting carbon intensities for biomass-derived transportation fuels. Furthermore, the U.S. Environmental Protection Agency is currently conducting a multi-year review to develop a methodology for estimating biogenic carbon dioxide (CO2) emissions from stationary sources. Our study develops and explores a methodology to compute carbon emission intensities (CIs) per unit of biomass energy, which is a metric that could be used to inform future policy development exercises. To compute CIs for biomass, we use the Global Change Assessment Model (GCAM), which is an integrated assessment model that represents global energy, agriculture, land and physical climate systems with regional, sectoral, and technological detail. The GCAM land use and land cover component includes both managed and unmanaged land cover categories such as food crop production, forest products, and various non-commercial land uses, and it is subdivided into 151 global land regions, ten of which are located in the U.S. To illustrate a range of values for different biomass resources, we use GCAM to compute CIs for a variety of biomass crops grown in different land regions of the U.S. We investigate differences in emissions for biomass crops such as switchgrass, miscanthus and willow. Specifically, we use GCAM to compute global carbon emissions from the land use change caused by a marginal increase in the amount of biomass crop grown in a specific model region. Thus, we are able to explore how land use change emissions vary by the type and location of biomass crop grown in the U.S. 
Direct emissions occur when biomass production used for energy displaces land used for food crops, forest products, pasture, or other arable land in the same region. Indirect emissions occur when increased food crop production, compensating for displaced food crop production in the biomass production region, displaces land in regions outside of the region of biomass production. Initial results from this study suggest that indirect land use emissions, mainly from converting unmanaged forest land, are likely to be as important as direct land use emissions in determining the carbon intensity of biomass energy. Finally, we value the emissions of a marginal unit of biomass production for a given carbon price path and a range of assumed social discount rates. We also compare the cost of bioenergy emissions as valued by a hypothetical private actor to the relevant cost of emissions from conventional fossil fuels, such as coal or natural gas.
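Valuing the emissions of a marginal unit of biomass production for a given carbon price path and discount rate, as described above, is a present-value sum. A sketch with hypothetical emission and price streams:

```python
def discounted_emission_cost(emissions, carbon_prices, rate):
    """Present value of annual emissions (tCO2) valued at a carbon
    price path ($/tCO2) under a constant social discount rate."""
    return sum(e * p / (1.0 + rate) ** t
               for t, (e, p) in enumerate(zip(emissions, carbon_prices)))

# Hypothetical: 10 tCO2/yr for two years at a flat $50/tCO2 price.
cost = discounted_emission_cost([10.0, 10.0], [50.0, 50.0], rate=0.05)
```

A higher assumed discount rate shrinks the weight of later-year land-use-change emissions, which is why the choice of social discount rate matters for the comparison with fossil fuels.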

Hodson, E. L.; Wise, M.; Clarke, L.; McJeon, H.; Mignone, B.



A new methodology for quantifying the impact of water repellency on the filtering function of soils  

NASA Astrophysics Data System (ADS)

Soils deliver a range of ecosystem services, and some of the most valuable relate to the regulating services resulting from the buffering and filtering of solutes by soil. However, it is commonly accepted that soil water repellency (SWR) can lead to finger flow and preferential flow. Yet, there have been few attempts to quantify the impact of such flow phenomena on the buffering and filtering of solutes. No method is available to quantify directly how SWR affects the transport of reactive solutes. We have closed this gap and developed a new method for quantifying solute transport by novel experiments with water-repellent soils. It involves sequentially applying two liquids, one water, and the other a reference fully wetting liquid, namely, aqueous ethanol, to the same intact soil core with air-drying between the application of the two liquids. Our results highlight that sorption experiments are necessary to complement our new method to ascertain directly the impact of SWR on the filtering of a solute. We conducted transport and sorption experiments, by applying our new method, with the herbicide 2,4-Dichlorophenoxyacetic acid and two Andosol top-soils; one from Japan and the other one from New Zealand. Breakthrough curves from the water experiments were characterized by preferential flow with high initial concentrations, tailing and a long prevalence of solutes remaining in the soil. Our results clearly demonstrate and quantify the impact of SWR on the leaching of this herbicide. This technique for quantifying the reduction of the soil's filtering efficiency by SWR enables assessment of the increased risk of groundwater contamination by solutes exogenously applied to water-repellent soils.

Müller, Karin; Deurer, Markus; Kawamoto, Ken; Hiradate, Syuntaro; Komatsu, Toshiko; Clothier, Brent



Software for portable laser light show system  

NASA Astrophysics Data System (ADS)

The portable laser light show system LS-3500-10M is connected to the parallel port of an IBM PC/AT-compatible computer. The computer outputs digital control data describing the images, and a specially designed control device converts the digital data coming from the parallel port into the analog signal driving the scanner. The capabilities of even an inexpensive 286 computer are quite sufficient for laser graphics control. The scanning technology used in the LS-3500-10M differs essentially from the widespread systems based on galvanometers with a mobile core or a mobile magnet, which work on the same principle as an electrically driven servo-mechanism. As the scanner we use an open-loop elastic system with hydraulically damped oscillations. For most laser graphics applications such a system provides satisfactory precision and speed of scanning. The LS-3500-10M software lets the user create and play his own laser graphics demonstrations on a PC. It is possible to render recognizable text and pictures using different styles, 3D and abstract animation. All types of demonstrations can be mixed in a slide-show, and time synchronization is supported. The software has the following features: (1) Different types of text output, with a built-in text editor for typing and editing textual information. Different fonts can be used to display text, and the user can create his own fonts using a specially developed font editor. (2) An editor of 3D animation with a library of predefined shapes. (3) Abstract animation provided by software routines. (4) Support of different graphics file formats (PCX or DXF); an original algorithm of raster image tracing was implemented. (5) A built-in slide-show editor.

Buruchin, Dmitrey J.; Leonov, Alexander F.



Latest European coelacanth shows Gondwanan affinities  

PubMed Central

The last European fossil occurrence of a coelacanth is from the Mid-Cretaceous of the English Chalk (Turonian, 90 million years ago). Here, we report the discovery of a coelacanth from Late Cretaceous non-marine rocks in southern France. It consists of a left angular bone showing structures that imply close phylogenetic affinities with some extinct Mawsoniidae. The closest relatives are otherwise known from Cretaceous continental deposits of southern continents and suggest that the dispersal of freshwater organisms from Africa to Europe occurred in the Late Cretaceous. PMID:17148159

Cavin, Lionel; Forey, Peter L; Buffetaut, Eric; Tong, Haiyan



Comparative study of two chromatographic methods for quantifying 2,4,6-trichloroanisole in wines  

Microsoft Academic Search

Here we present the validation and the comparative study of two chromatographic methods for quantifying 2,4,6-trichloroanisole (TCA) in wines (red, rosé and white wines). The first method involves headspace solid-phase microextraction and gas chromatography with electron-capture detection (ECD). The evaluation of the performance parameters shows a limit of detection of 0.3 ng l-1, a limit of quantification of 1.0 ng l-1, recoveries around 100% and repeatability

M. Riu; M. Mestres; O. Busto; J. Guasch



Showing Progress in Early Intervention Programs.  

ERIC Educational Resources Information Center

This evaluation report of Oregon's early intervention programs describes the Oregon Preschool Assessment System, presents demographic information, and summarizes results of analysis of data on children's progress. It concludes that the infants and children enrolled (2,740 in 1991) are making substantial gains in all areas assessed. These gains

Wilson, Darla; Brodsky, Meredith


Lighthouse: Showing the Way to Relevant Information  

Microsoft Academic Search

Lighthouse is an on-line interface for a Web-based information retrieval system. It accepts queries from a user, collects the retrieved documents from the search engine, organizes and presents them to the user. The system integrates two known presentations of the retrieved results - the ranked list and clustering visualization - in a novel and effective way.

Anton Leuski; James Allan



Show Me the Invisible: Visualizing Hidden Content  

PubMed Central

Content on computer screens is often inaccessible to users because it is hidden, e.g., occluded by other windows, outside the viewport, or overlooked. In search tasks, the efficient retrieval of sought content is important. Current software, however, only provides limited support to visualize hidden occurrences and rarely supports search synchronization crossing application boundaries. To remedy this situation, we introduce two novel visualization methods to guide users to hidden content. Our first method generates awareness for occluded or out-of-viewport content using see-through visualization. For content that is either outside the screen's viewport or for data sources not opened at all, our second method shows off-screen indicators and an on-demand smart preview. To reduce the chances of overlooking content, we use visual links, i.e., visible edges, to connect the visible content or the visible representations of the hidden content. We show the validity of our methods in a user study, which demonstrates that our technique enables a faster localization of hidden content compared to traditional search functionality and thereby assists users in information retrieval tasks. PMID:25325078

Geymayer, Thomas; Steinberger, Markus; Lex, Alexander; Streit, Marc; Schmalstieg, Dieter



VLA Shows "Boiling" in Atmosphere of Betelgeuse  

NASA Astrophysics Data System (ADS)

A team of astronomers says that observations with the National Science Foundation's Very Large Array (VLA) radio telescope show that a neighboring bloated star has giant convective plumes propelling gas from its surface (photosphere) up into the star's atmosphere. This new information contradicts long-held ideas that such stellar atmospheres are more uniform, and may resolve questions about how the star's atmosphere attains its enormous size as well as how dust and gas are driven away from the star. Jeremy Lim of the Academia Sinica Institute of Astronomy & Astrophysics in Taiwan; Chris Carilli, Anthony Beasley, and Ralph Marson of the National Radio Astronomy Observatory (NRAO) in Socorro, NM; and Stephen White of the University of Maryland studied the red-supergiant star Betelgeuse, about 430 light-years away in the constellation Orion. They reported their findings in the April 9 issue of the scientific journal Nature. "These radio-telescope images confirm that Betelgeuse -- already more than 600 times larger than our Sun -- has a dense atmosphere that extends to many times larger still than the star itself," said Lim. "The highest-resolution image shows the star's atmosphere to have a remarkably complex structure." "To our surprise," added White, "the images also show that most of the gas in the atmosphere is only about as hot as that on the surface. Previously, all of it was thought to be very much hotter." The astronomers used the VLA to make images of Betelgeuse at a variety of radio frequencies. The series of radio observations measured the temperature of the star's atmosphere at different heights. Previous observations with the Hubble Space Telescope (HST) at ultraviolet wavelengths showed that the star's atmosphere contains very hot gas at about twice the surface temperature. The VLA images showed that there also is lower-temperature gas throughout the atmosphere. 
This gas is near the surface temperature at low heights and decreases in temperature progressively outwards. Although its existence was not previously suspected, this lower-temperature gas turns out to be the most abundant constituent of Betelgeuse's atmosphere. "This alters our basic understanding of red-supergiant star atmospheres," explains Lim. "Instead of the star's atmosphere expanding uniformly because of gas heated to very high temperatures near its surface, it now appears that several giant convection cells propel gas from the star's surface into its atmosphere. This creates the complex structure we observe for Betelgeuse's atmosphere." Betelgeuse can be likened to an enormous "boiling" ball of gas heated by the release of energy from nuclear fusion in its core. The circulating boiling pattern -- convection -- appears as large regions of hot upwelling gas on the star's surface. "The idea that red-supergiant stars have enormous convection cells is not new," noted Marson. "This was suggested by Martin Schwarzschild more than 20 years ago, and was seen in optical images of Betelgeuse's surface in 1990." The new picture of Betelgeuse's atmosphere also helps resolve the mystery of how massive amounts of dust and gas are expelled from red supergiant stars, an important source of enrichment for the interstellar medium. If their atmospheres were entirely very hot at lower levels, dust grains would not be able to condense there. Dust grains could possibly condense at higher levels, but there they would not get enough "push" from the star's radiation to explain their outward movement. In the new picture, the relatively cool environment at lower levels allows dust grains to condense effectively; here they can be strongly propelled by the more-intense starlight, carrying gas with them. Indeed, dust has previously been inferred to form sporadically near Betelgeuse's surface, but its presence there was difficult to reconcile with the old picture. 
"This method for propelling the mass outflows of red giant and supergiant stars was proposed by Sun Kwok i



Quantifying spatial and temporal trends in beach-dune volumetric changes using spatial statistics  

NASA Astrophysics Data System (ADS)

Spatial statistics are generally underutilized in coastal geomorphology, despite offering great potential for identifying and quantifying spatial-temporal trends in landscape morphodynamics. In particular, local Moran's I_i provides a statistical framework for detecting clusters of significant change in an attribute (e.g., surface erosion or deposition) and quantifying how this changes over space and time. This study analyzes and interprets spatial-temporal patterns in sediment volume changes in a beach-foredune-transgressive dune complex following removal of invasive marram grass (Ammophila spp.). Results are derived by detecting significant changes in post-removal repeat DEMs derived from topographic surveys and airborne LiDAR. The study site was separated into discrete, linked geomorphic units (beach, foredune, transgressive dune complex) to facilitate sub-landscape scale analysis of volumetric change and sediment budget responses. Difference surfaces derived from a pixel-subtraction algorithm between interval DEMs and the LiDAR baseline DEM were filtered using the local Moran's I_i method and two different spatial weights (1.5 and 5 m) to detect statistically significant change. Moran's I_i results were compared with those derived from a more spatially uniform statistical method that uses a simpler Student's t distribution threshold for change detection. Morphodynamic patterns and volumetric estimates were similar between the uniform geostatistical method and Moran's I_i at a spatial weight of 5 m while the smaller spatial weight (1.5 m) consistently indicated volumetric changes of less magnitude. The larger 5 m spatial weight was most representative of broader site morphodynamics and spatial patterns while the smaller spatial weight provided volumetric changes consistent with field observations. 
All methods showed foredune deflation immediately following removal with increased sediment volumes into the spring via deposition at the crest and on lobes in the lee, despite erosion on the stoss slope and dune toe. Generally, the foredune became wider by landward extension and the seaward slope recovered from erosion to a similar height and form to that of pre-restoration despite remaining essentially free of vegetation.
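Local Moran's Ii, used above for change detection, measures how similar each observation is to its spatial neighbours. A sketch using the common standardized form I_i = z_i * Σ_j w_ij z_j on a hypothetical 1D transect with binary contiguity weights (not the study's DEM pipeline):

```python
def local_morans_i(values, weights):
    """Local Moran's I_i for each observation, in the standardized form
    I_i = z_i * sum_j(w_ij * z_j), where z are z-scored values and
    `weights` is an n x n spatial weight matrix with w_ii = 0."""
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    z = [(v - mean) / std for v in values]
    return [z[i] * sum(weights[i][j] * z[j] for j in range(n))
            for i in range(n)]

# Hypothetical transect with two clusters and adjacent-neighbour weights.
vals = [1.0, 1.0, 1.0, 5.0, 5.0, 5.0]
n = len(vals)
w = [[1.0 if abs(i - j) == 1 else 0.0 for j in range(n)] for i in range(n)]
li = local_morans_i(vals, w)  # positive inside clusters, ~0 at the boundary
```

High positive I_i flags clusters of similar change (e.g., contiguous erosion patches), which is exactly the filtering behaviour exploited in the study.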

Eamer, Jordan B. R.; Walker, Ian J.



A simple and effective method for detecting and quantifying forest disturbances and regeneration using Landsat imagery  

NASA Astrophysics Data System (ADS)

Disturbances of any size and magnitude of intensity, whether natural or human-caused, change existing forest conditions and initiate succession to create dynamic and new ecological communities. Effective management of these forest resources, both public and private, requires reliable and timely information about their status and trends. As part of the National Land Cover Database, we have developed a focused change detection method using Landsat imagery to improve the efficiency and effectiveness of existing forest change monitoring capabilities. The Normalized Burn Ratio (NBR) derived from Landsat imagery has been widely used for monitoring fire disturbance, and the Normalized Difference Vegetation Index (NDVI) has been extensively used for indicating vegetation biomass, or health and vitality status. By integrating these two indices derived from two Landsat images acquired within a growing season, a model was developed to map the location and quantify the magnitude of forest disturbance and regeneration processes. The model has been tested on four image pairs from different forest regions (Northeast, Southeast, Northwest, and Southwest) of the United States. Initial results showed that the method can map high-intensity forest disturbance such as forest harvest and forest fire with high accuracy; it is also sensitive to subtle changes such as forest regeneration, forest commercial thinning, and forest degradation caused by insect damage. The model is simple, effective, and applicable to other regions with forest cover. The approach can provide critical and objective change information on status and trends on forested land for management planning purposes.
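The two indices combined by the model above are simple normalized band differences. A sketch of their usual definitions from surface reflectance (the band choices are the conventional Landsat ones, stated here as an assumption rather than taken from the paper):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    return (nir - red) / (nir + red)

def nbr(nir, swir2):
    """Normalized Burn Ratio: (NIR - SWIR2) / (NIR + SWIR2)."""
    return (nir - swir2) / (nir + swir2)

def differenced_index(pre, post):
    """Two-date difference (pre minus post); larger positive values
    indicate stronger vegetation loss or burn severity."""
    return pre - post

# Hypothetical reflectances for one pixel before and after a disturbance.
d_nbr = differenced_index(nbr(0.4, 0.1), nbr(0.2, 0.3))
```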

Jin, S.; Yang, L.; Danielson, P.; Homer, C.; Fry, J.



Quantifying the effect size of changing environmental controls on C release from permafrost soils  

NASA Astrophysics Data System (ADS)

Microbial decomposition of soil organic matter is controlled by substrate quality, physical protection by soil minerals, and environmental conditions (e.g. temperature, soil moisture). Increasing temperatures in high latitude ecosystems not only increase carbon (C) emissions from previously frozen C in permafrost but also indirectly affect the C cycle through changes in regional and local hydrology. For instance, increasing active layer thickness due to permafrost thaw can cause better drainage in uplands but poorly drained soil conditions in lowlands, both of which influence the amount and form of C being released. We have compiled a database of more than 40 incubation studies with soils from across the entire permafrost zone to quantify the effect size of increasing temperatures and changes in hydrology on CO2 emissions. The difference in cumulative CO2 release for a temperature increase of 10 K ranged from less than 1% initially to up to 48% after one year of incubation. Drier soil incubation conditions stimulated CO2 release by 2-19% relative to saturated treatments, and there was a positive interaction with temperature. These preliminary results show that a 10 K increase in temperature and a shift from wetter to drier soil conditions might similarly enhance CO2 release from permafrost. However, near-saturated soil conditions likely stimulate C release in the form of the more potent greenhouse gas methane (CH4), which needs to be considered to fully estimate changes in C release from permafrost soils under changing environmental conditions.
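The temperature effect size described above is conventionally summarized as a Q10 coefficient, the factor by which respiration increases per 10 K of warming. A small sketch of that standard formula follows; the incubation temperatures are illustrative, and the 48% figure is the upper bound quoted in the text.

```python
def q10(rate_low, rate_high, t_low, t_high):
    """Q10 temperature coefficient: factor by which a rate increases per
    10 K of warming, from rates measured at two temperatures."""
    return (rate_high / rate_low) ** (10.0 / (t_high - t_low))

# If cumulative CO2 release at 15 C is 48% higher than at 5 C (the upper
# bound reported above; temperatures are illustrative), the implied Q10 is:
q = q10(1.0, 1.48, 5.0, 15.0)
```

Here the two temperatures are exactly 10 K apart, so the implied Q10 equals the release ratio itself, 1.48.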

Schaedel, C.; Schuur, E. A.; Bracho, R.; Elberling, B.; Lupascu, M.; Natali, S.; O'Donnell, J. A.; Waldrop, M. P.



Quantifying lubricant droplet spreading on a flat substrate using molecular dynamics  

NASA Astrophysics Data System (ADS)

Understanding the physical behavior of polymer-based lubricants on the nanoscale is of critical importance to a myriad of engineering applications and devices. We have used molecular dynamics simulations to quantitatively evaluate the physical mechanisms underlying perfluoropolyether lubricant spreading on a solid substrate. We quantify the effect of molecular mass, molecule length, and lubricant and substrate functional end groups on lubricant spreading. The results show that lubricant functional end groups play a critical role in lubricant spreading on the nanoscale. Lubricant spreading increases with increasing molecule length for lubricant with functional end groups, but decreases with the increase in molecule length for lubricant without functional end groups. In the former case, the fraction of the lubricant chain that is functional is the primary driving factor for lubricant spreading, while in the latter case, the molecular mass is most important. For both lubricants with and without functional end groups, spreading is inhibited by molecule entanglement beyond a critical molecule length, and spreading becomes independent of lubricant functional end groups and molecular mass.

Noble, Brooklyn; Ovcharenko, Andrey; Raeymaekers, Bart



Using nonlinear methods to quantify changes in infant limb movements and vocalizations  

PubMed Central

The pairing of dynamical systems theory and complexity science brings novel concepts and methods to the study of infant motor development. Accordingly, this longitudinal case study presents a new approach to characterizing the dynamics of infant limb and vocalization behaviors. A single infant's vocalizations and limb movements were recorded from 51 to 305 days of age. On each recording day, accelerometers were placed on all four of the infant's limbs and an audio recorder was worn on the child's chest. Using nonlinear time series analysis methods, such as recurrence quantification analysis and the Allan factor, we quantified changes in the stability and multiscale properties of the infant's behaviors across age, as well as how these dynamics relate across modalities and effectors. We observed that particular changes in these dynamics preceded or coincided with the onset of various developmental milestones. For example, the largest changes in vocalization dynamics preceded the onset of canonical babbling. The results show that nonlinear analyses can help to understand the functional co-development of different aspects of infant behavior. PMID:25161629
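Recurrence quantification analysis, mentioned above, builds on a recurrence matrix of pairwise distances between samples. The sketch below shows only its simplest measure (recurrence rate) on a raw 1-D series; real RQA, as used in the study, works on time-delay-embedded data and reports further measures such as determinism.

```python
def recurrence_rate(series, eps):
    """Fraction of sample pairs closer than eps: the simplest recurrence
    quantification analysis (RQA) measure. Unembedded and 1-D here for
    illustration; real analyses embed the series first and compute
    further measures such as determinism and laminarity."""
    n = len(series)
    hits = sum(1 for i in range(n) for j in range(n)
               if abs(series[i] - series[j]) < eps)
    return hits / (n * n)

# A perfectly repetitive signal recurs often; a monotone trend barely does.
flat = recurrence_rate([1.0, 2.0] * 8, 0.5)
trend = recurrence_rate([float(t) for t in range(16)], 0.5)
```

The repetitive signal gives a recurrence rate of 0.5 (each sample matches exactly half the series), while the trend only recurs on the diagonal, 1/16.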

Abney, Drew H.; Warlaumont, Anne S.; Haussman, Anna; Ross, Jessica M.; Wallot, Sebastian



Quantifying scattered sound energy from a single tree by means of reverberation time.  


Trees in urban spaces surrounded by buildings may be effective in dispersing sound energy, and this could affect sound level distribution and street canyon reverberation. To quantify this effect of trees with a view to including it in numerical predictions, this paper examines sound scattering from a single tree in open field by means of reverberation time (RT). Five trees of different species and crown sizes were considered. The influence of ground condition, receiver height, crown size and shape, foliage condition, and source-receiver angle and distance has been assessed. The results show that RT20 is proportional to the tree crown size, which is the most important factor. The maximum RT20 measured was 0.28 s at 4000 Hz for the studied trees when in leaf (with foliage). The presence of leaves increased RT20 at high frequencies, typically by 0.08 s at 4000 Hz. It was also demonstrated that the source-receiver angle can affect the characteristics of decay curves significantly. With increasing source-receiver distance within 40 m, RT20 was slightly changed. It was shown that ground condition and receiver height affect the decay curves, especially at low and mid frequencies, where sound scattering is of relatively limited importance. PMID:23862804
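RT20, the quantity reported above, is the reverberation time extrapolated from a 20 dB portion of the decay curve: a line is fitted between -5 and -25 dB and scaled to a 60 dB decay (the usual ISO 3382 convention, assumed here). The sketch below applies that estimate to a synthetic, perfectly linear decay rather than measured data.

```python
def rt20_from_decay(times, levels_db):
    """Estimate RT20: least-squares line through the decay curve between
    -5 and -25 dB, extrapolated to the time of a 60 dB decay."""
    pts = [(t, L) for t, L in zip(times, levels_db) if -25.0 <= L <= -5.0]
    n = len(pts)
    mean_t = sum(t for t, _ in pts) / n
    mean_L = sum(L for _, L in pts) / n
    slope = (sum((t - mean_t) * (L - mean_L) for t, L in pts)
             / sum((t - mean_t) ** 2 for t, _ in pts))  # dB per second
    return -60.0 / slope  # seconds for a 60 dB drop

# Ideal linear decay of -200 dB/s should give RT20 = 60/200 = 0.3 s.
times = [i * 0.01 for i in range(40)]   # 0 .. 0.39 s
levels = [-200.0 * t for t in times]    # decay level in dB
rt = rt20_from_decay(times, levels)
```

On measured decays the fitted segment is noisy, which is why the study compares whole decay-curve shapes as well as the extrapolated times.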

Yang, Hong-Seok; Kang, Jian; Cheal, Chris; Van Renterghem, Timothy; Botteldooren, Dick



Quantifying solar spectral irradiance in aquatic habitats for the assessment of photoenhanced toxicity  

USGS Publications Warehouse

The spectra and intensity of solar radiation (solar spectral irradiance [SSI]) was quantified in selected aquatic habitats in the vicinity of an oil field on the California coast. Solar spectral irradiance measurements consisted of spectral scans (280-700 nm) and radiometric measurements of ultraviolet (UV): UVB (280-320 nm) and UVA (320-400 nm). Solar spectral irradiance measurements were taken at the surface and at various depths in two marsh ponds, a shallow wetland, an estuary lagoon, and the intertidal area of a high-energy sandy beach. Daily fluctuation in SSI showed a general parabolic relationship with time; maximum SSI was observed at approximate solar noon. Solar spectral irradiance measurements taken at 10-cm depth at approximate solar noon in multiple aquatic habitats exhibited only a twofold variation in visible light and UVA and a 4.5-fold variation in UVB. Visible light ranged from 11,000 to 19,000 μW/cm2, UVA ranged from 460 to 1,100 μW/cm2, and UVB ranged from 8.4 to 38 μW/cm2. In each habitat, the attenuation of light intensity with increasing water depth was differentially affected over specific wavelengths of SSI. The study results allowed the development of environmentally realistic light regimes necessary for photoenhanced toxicity studies.
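The depth attenuation noted above is commonly summarized with a diffuse attenuation coefficient Kd under an exponential (Beer-Lambert) decay model. The sketch below assumes that model with illustrative irradiance numbers; the study itself reports measured profiles rather than a fitted coefficient.

```python
import math

def diffuse_attenuation(i_surface, i_depth, depth_m):
    """Diffuse attenuation coefficient Kd (1/m) from irradiance at the
    surface and at one depth, assuming exponential (Beer-Lambert) decay.
    This model and the numbers below are illustrative assumptions."""
    return math.log(i_surface / i_depth) / depth_m

def irradiance_at(i_surface, kd, depth_m):
    """Predicted irradiance at a given depth under the same model."""
    return i_surface * math.exp(-kd * depth_m)

# If irradiance halves over the first metre, Kd = ln(2) per metre.
kd = diffuse_attenuation(100.0, 50.0, 1.0)
```

Because Kd is wavelength-dependent (UVB attenuates much faster than visible light), a separate coefficient would be fitted per band.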

Barron, M.G.; Little, E.E.; Calfee, R.; Diamond, S.



Quantifying dispersal and establishment limitation in a population of an epiphytic lichen.  


Dispersal is a process critical for the dynamics and persistence of metapopulations, but it is difficult to quantify. It has been suggested that the old-forest lichen Lobaria pulmonaria is limited by insufficient dispersal ability. We analyzed 240 DNA extracts derived from snow samples by a L. pulmonaria-specific real-time PCR (polymerase chain reaction) assay of the ITS (internal transcribed spacer) region allowing for the discrimination among propagules originating from a single, isolated source tree or propagules originating from other locations. Samples that were detected as positives by real-time PCR were additionally genotyped for five L. pulmonaria microsatellite loci. Both molecular approaches demonstrated substantial dispersal from other than local sources. In a landscape approach, we additionally analyzed 240 snow samples with real-time PCR of ITS and detected propagules not only in forests where L. pulmonaria was present, but also in large unforested pasture areas and in forest patches where L. pulmonaria was not found. Monitoring of soredia of L. pulmonaria transplanted to maple bark after two vegetation periods showed high variance in growth among forest stands, but no significant differences among different transplantation treatments. Hence, it is probably not dispersal limitation that hinders colonization in the old-forest lichen L. pulmonaria, but ecological constraints at the stand level that can result in establishment limitation. Our study exemplifies that care has to be taken to adequately separate the effects of dispersal limitation from a limitation of establishment. PMID:16937643

Werth, Silke; Wagner, Helene H; Gugerli, Felix; Holderegger, Rolf; Csencsics, Daniela; Kalwij, Jesse M; Scheidegger, Christoph



Quantifying solar spectral irradiance in aquatic habitats for the assessment of photoenhanced toxicity  

SciTech Connect

The spectra and intensity of solar radiation (solar spectral irradiance [SSI]) was quantified in selected aquatic habitats in the vicinity of an oil field on the California coast. Solar spectral irradiance measurements consisted of spectral scans and radiometric measurements of ultraviolet (UV): UVB and UVA. Solar spectral irradiance measurements were taken at the surface and at various depths in two marsh ponds, a shallow wetland, an estuary lagoon, and the intertidal area of a high-energy sandy beach. Daily fluctuation in SSI showed a general parabolic relationship with time; maximum SSI was observed at approximate solar noon. Solar spectral irradiance measurements taken at 10-cm depth at approximate solar noon in multiple aquatic habitats exhibited only a twofold variation in visible light and UVA and a 4.5-fold variation in UVB. Visible light ranged from 11,000 to 19,000 μW/cm2, UVA ranged from 460 to 1,100 μW/cm2, and UVB ranged from 8.4 to 38 μW/cm2. In each habitat, the attenuation of light intensity with increasing water depth was differentially affected over specific wavelengths of SSI. The study results allowed the development of environmentally realistic light regimes necessary for photoenhanced toxicity studies.

Barron, M.G.; Little, E.E.; Calfee, R.; Diamond, S.



Quantifying the impacts of dust on the Caspian Sea using a regional climate model  

NASA Astrophysics Data System (ADS)

The Karakum desert and the surrounding areas of the Caspian Sea (CS) are a significant source of dust for the region. Local dust events can have a substantial impact on SSTs and evaporation from the Sea through direct radiative effects. Given the high interest in projected changes in the Caspian Sea Level (CSL), it is critical that we understand these effects in order to accurately model net sea evaporation, a major component of the CS hydrological budget. In this study, we employ a regional climate model (RegCM4) coupled to the 1D Hostetler lake model to explore the impact of dust on the CS. Dust is simulated in RegCM4 through an interactive dust emission and transport model coupled to the radiation scheme, along with a representation of anthropogenic aerosols. The first part of this study focuses on evaluating the ability of RegCM4 to simulate dust in the region by comparing 1) seasonal climatologies of modelled aerosol optical depth (AOD) to a range of satellite sources, and 2) a climatology of dust events, as well as their decadal variability, to observations derived from visibility measurements. The second part of this study attempts to quantify the impact of dust on Caspian SSTs, evaporation and heat flux components. The results of this study show that simulating the effects of dust on the CS is necessary for accurately modeling the Sea's hydrological budget.

Elguindi, N.; Solmon, F.; Turuncoglu, U.



Quantifying variable erosion rates to understand the coupling of surface processes in the Teton Range, Wyoming  

NASA Astrophysics Data System (ADS)

Short-term geomorphic processes (fluvial, glacial, and hillslope erosion) and long-term exhumation control transient alpine landscapes. Long-term measurements of exhumation are not sufficient to capture the processes driving transient responses associated with short-term climatic oscillations, because of high variability of individual processes across space and time. This study compares the efficacy of different erosional agents to assess the importance of variability in tectonically active landscapes responding to fluctuations in Quaternary climate. We focus on the Teton Range, where erosional mechanisms include hillslope, glacial, and fluvial processes. Erosion rates were quantified using sediment accumulation and cosmogenic dating (bedrock and stream sediments). Results show that rates of erosion are highly variable, with average short-term rockfall rates (0.8 mm/y) occurring faster than either apparent basin-averaged (0.2 mm/y) and long-term ridge erosion rates (0.02 mm/y). Examining erosion rates separately also demonstrates the coupling between glacial, fluvial, and hillslope processes. Apparent basin-averaged erosion rates amalgamate valley wall and ridge erosion with stream and glacial rates. Climate oscillations drive the short-term response of a single erosional process (e.g., rockfalls or other mass wasting) that may enhance or limit the erosional efficiency of other processes (glacial or fluvial). While the Teton landscape may approach long-term equilibrium, stochastic processes and rapid response to short-term climate change actively perpetuate the transient ruggedness of the topography.

Tranel, Lisa M.; Spotila, James A.; Binnie, Steven A.; Freeman, Stewart P. H. T.



A model for quantifying uncertainty in the estimation of noise-contaminated measurements of transmissibility  

NASA Astrophysics Data System (ADS)

System identification in the frequency domain is a very important process in many aspects of engineering. Among many forms of frequency domain system identification such as frequency response function analysis and modal decomposition, transmissibility (output-to-output relationship) estimation has been regarded as one of the most practical tools for its clear physical interpretation, its compatibility with output-only data, and its sensitivity to local changes of structural parameters. Due to operational and environmental variability in any real system, quantization and estimation error, and extraneous measurement noise, the computation of transmissibility may contain a significant level of uncertainty and variability, and these sources propagate to degrade system identification quality and in some cases to system mischaracterization. In this paper, the uncertainty of the magnitude of a transmissibility estimator via output auto-power density spectra is quantified, an exact probability density function for the estimates is derived analytically via a Chi-square bivariate approach, and it is validated with Monte Carlo simulation. Validation shows very consistent results between the observed histogram and predicted distribution for different estimation and noise conditions.
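A transmissibility magnitude estimator of the kind analyzed above takes the square root of the ratio of averaged output auto-power spectra. The toy Monte Carlo sketch below (pure-Python DFT at a single frequency bin, with an invented gain and noise level) only illustrates that the estimator recovers the output-to-output gain; it does not reproduce the paper's Chi-square bivariate distribution derivation.

```python
import math
import random

def auto_power(x, k):
    """Single-record auto-power at DFT bin k: |X_k|^2 / N."""
    n = len(x)
    re = sum(x[t] * math.cos(2.0 * math.pi * k * t / n) for t in range(n))
    im = -sum(x[t] * math.sin(2.0 * math.pi * k * t / n) for t in range(n))
    return (re * re + im * im) / n

def transmissibility_mag(records_a, records_b, k):
    """|T| estimate at bin k: sqrt of the ratio of the two channels'
    auto-powers, each averaged over many records."""
    s_aa = sum(auto_power(x, k) for x in records_a) / len(records_a)
    s_bb = sum(auto_power(x, k) for x in records_b) / len(records_b)
    return math.sqrt(s_bb / s_aa)

# Synthetic output pair: channel b is channel a scaled by 2, plus noise.
random.seed(0)
n, k, gain = 64, 5, 2.0
recs_a, recs_b = [], []
for _ in range(200):
    phase = random.uniform(0.0, 2.0 * math.pi)
    a = [math.sin(2.0 * math.pi * k * t / n + phase) for t in range(n)]
    b = [gain * v + random.gauss(0.0, 0.05) for v in a]
    recs_a.append(a)
    recs_b.append(b)
t_hat = transmissibility_mag(recs_a, recs_b, k)
```

With more noise or fewer averaged records, t_hat spreads around the true gain; characterizing that spread analytically is exactly what the paper's probability density derivation provides.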

Mao, Zhu; Todd, Michael



Quantifying the Relative Impacts from Climate Variables and Land Use on Streamflow  

NASA Astrophysics Data System (ADS)

Watershed runoff is affected by both climate variables and land use changes. The objective of this study is to quantify the relative effects of climate variables and land use change on watershed runoff by analyzing historical weather and hydrologic information. Three methodologies have been employed: multiple regression, hydrologic sensitivity analysis, and hydrologic modeling. The data for climate variables are obtained from the Global Historical Climatology Network (GHCN) version 2 and the Climate Anomaly Monitoring System (CAMS) provided by the Climate Prediction Center (CPC). Hydrologic data, specifically streamflow, are obtained from the United States Geological Survey National Water Information System (NWIS). Analysis using the three methodologies is conducted for the data from 1950 to 2010 by splitting the record into two periods, 1950 to 1979 and 1980 to 2010. Three states, Arizona, Indiana, and New York, are included in the analysis to represent areas that have undergone various levels of land use change in the last 30 years. Preliminary results from the study show that streamflow is more affected by the changes in land use over the last 30 years than by changes in climate variables. Details of the three methodologies, findings, and future work will be presented in the poster.

Ahn, K.; Merwade, V.



Quantifying tsunami risk at the Pisco, Peru LNG terminal project  

NASA Astrophysics Data System (ADS)

We examine and quantify the tsunami risk near Pisco, Peru, where a major Liquefied Natural Gas facility is planned at Playa Loberia. We re-assess the historical record of tsunami damage along the coast of Central and Southern Peru, from 9 deg. S (Chimbote) to 19 deg. S (Arica), building seismic models of the events involved, and conducting numerical simulations of the run-up at Pisco that such models predict. We then evaluate possible return periods for the main seismic events under consideration, from a combination of historical datasets and plate tectonics arguments. We classify tsunami hazard according to the amplitude of the run-up on the coast: decimetric tsunamis (0.1 to 1 m) do not carry a specific hazard over and beyond that presented by storm waves. Metric tsunamis (a few meters) can inflict severe damage on coastal and harbor communities, and result in inundation distances of up to 1 or 2 km. Finally, dekametric tsunamis (10 m and above) are catastrophic events leading to total destruction. We estimate that a scenario of metric run-up, which could substantially damage port facilities and lead to a number of fatalities, may have a repeat time at Pisco of about 60 years. A catastrophic tsunami of dekametric amplitude, capable of totally destroying harbor infrastructures, may have a repeat time of about 110 years. This result is also consistent with the "back-of-the-envelope" observation that the city was destroyed four times over the past 400 years. The last such tsunami took place 136 years ago.
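The hazard bands above can be written as a tiny run-up classifier. The band edges come straight from the text; the label for run-up below 0.1 m is an assumption, since the text does not name that case.

```python
def classify_runup(runup_m):
    """Tsunami hazard class by run-up amplitude, using the bands quoted
    in the abstract. The 'negligible' label below 0.1 m is an assumption."""
    if runup_m < 0.1:
        return "negligible"    # below the decimetric band (assumption)
    if runup_m < 1.0:
        return "decimetric"    # hazard comparable to storm waves
    if runup_m < 10.0:
        return "metric"        # severe coastal/harbor damage
    return "dekametric"        # catastrophic, total destruction
```

For example, the roughly 60-year scenario discussed above falls in the metric band, and the roughly 110-year scenario in the dekametric band.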

Synolakis, C. E.; Okal, E. A.; Borrero, J. C.



Quantifying Community Dynamics of Nitrifiers in Functionally Stable Reactors?  

PubMed Central

A sequential batch reactor (SBR) and a membrane bioreactor (MBR) were inoculated with the same sludge from a municipal wastewater treatment plant, supplemented with ammonium, and operated in parallel for 84 days. It was investigated whether the functional stability of the nitrification process corresponded with a static ammonia-oxidizing bacterial (AOB) community. The SBR provided complete nitrification during nearly the whole experimental run, whereas the MBR showed a buildup of 0 to 2 mg nitrite-N liter⁻¹ from day 45 until day 84. Based on the denaturing gradient gel electrophoresis profiles, two novel approaches were introduced to characterize and quantify the community dynamics and interspecies abundance ratios: (i) the rate of change [Δt(week)] parameter and (ii) the Pareto-Lorenz curve distribution pattern. During the whole sampling period, it was observed that neither of the reactor types maintained a static microbial community and that the SBR evolved more gradually than the MBR, particularly with respect to AOB (i.e., average weekly community changes of 12.6% ± 5.2% for the SBR and 24.6% ± 14.3% for the MBR). Based on the Pareto-Lorenz curves, it was observed that only a small group of AOB species played a numerically dominant role in the nitritation of both reactors, and this was true especially for the MBR. The remaining less dominant species were speculated to constitute a reserve of AOB which can proliferate to replace the dominant species. The value of these parameters in terms of tools to assist the operation of activated-sludge systems is discussed. PMID:17981943
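The Pareto-Lorenz representation mentioned above plots the cumulative share of total abundance against the cumulative share of species, ranked from most to least abundant; a strongly bowed curve means a few species dominate. A minimal sketch follows, with an invented 80/20 community rather than the study's band-intensity data.

```python
def pareto_lorenz(abundances):
    """Cumulative-share points (fraction of species, fraction of total
    abundance), species ranked from most to least abundant -- the
    Pareto-Lorenz evenness representation. Input values are illustrative."""
    ranked = sorted(abundances, reverse=True)
    total = float(sum(ranked))
    n = len(ranked)
    pts, cum = [], 0.0
    for i, a in enumerate(ranked, 1):
        cum += a
        pts.append((i / n, cum / total))
    return pts

# A strongly uneven community: 20% of the species hold 80% of the abundance.
pts = pareto_lorenz([80, 5, 5, 5, 5])
```

A perfectly even community would give points on the diagonal; the distance from that diagonal is what the study uses to compare the two reactors.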

Wittebolle, Lieven; Vervaeren, Han; Verstraete, Willy; Boon, Nico



Quantifying sensitivity to droughts - an experimental modeling approach  

NASA Astrophysics Data System (ADS)

Meteorological droughts like those in summer 2003 or spring 2011 in Europe are expected to become more frequent in the future. Although the spatial extent of these drought events was large, not all regions were affected in the same way. Many catchments reacted strongly to the meteorological droughts showing low levels of streamflow and groundwater, while others hardly reacted. The extent of the hydrological drought for specific catchments was also different between these two historical events due to different initial conditions and drought propagation processes. This leads to the important question of how to detect and quantify the sensitivity of a catchment to meteorological droughts. To assess this question we designed hydrological model experiments using a conceptual rainfall-runoff model. Two drought scenarios were constructed by selecting precipitation and temperature observations based on certain criteria: one scenario was a modest but constant progression of drying based on sorting the years of observations according to annual precipitation amounts. The other scenario was a more extreme progression of drying based on selecting months from different years, forming a year with the wettest months through to a year with the driest months. Both scenarios retained the typical intra-annual seasonality for the region. The sensitivity of 24 Swiss catchments to these scenarios was evaluated by analyzing the simulated discharge time series and modeled storages. Mean catchment elevation, slope and size were found to be the main controls on the sensitivity of catchment discharge to precipitation. Generally, catchments at higher elevation and with steeper slopes seemed to be less sensitive to meteorological droughts than catchments at lower elevations with less steep slopes.
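The first scenario construction described above amounts to reordering observed years by their annual precipitation totals. The sketch below shows only that selection criterion, with invented totals; the authors' full workflow also builds a second, more extreme scenario from individual months and preserves intra-annual seasonality.

```python
def drying_scenario(annual_precip_by_year):
    """Order years from wettest to driest annual precipitation total: the
    selection criterion behind the 'modest but constant progression of
    drying' scenario. A sketch with illustrative totals, not the authors'
    full workflow."""
    return sorted(annual_precip_by_year,
                  key=annual_precip_by_year.get,
                  reverse=True)

# Illustrative annual totals in mm; the ordered list feeds the model
# as successively drier forcing years.
order = drying_scenario({2001: 900, 2002: 1200, 2003: 700, 2004: 1050})
```

Driving the rainfall-runoff model with this sequence lets one watch simulated discharge and storages respond to a steadily intensifying drought.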

Staudinger, M.; Wei