Sample records for quantified results show

  1. Quantified security is a weak hypothesis: a critical survey of results and assumptions

    Microsoft Academic Search

    Vilhelm Verendel

    2009-01-01

    This paper critically surveys previous work on quantitative representation and analysis of security. Such quantified security has been presented as a general approach to precisely assess and control security. We classify a significant part of the work between 1981 and 2008 with respect to security perspective, target of quantification, underlying assumptions and type of validation. The result shows how the

  2. 13. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF POOR CONSTRUCTION WORK. THOUGH NOT A SERIOUS STRUCTURAL DEFICIENCY, THE 'HONEYCOMB' TEXTURE OF THE CONCRETE SURFACE WAS THE RESULT OF INADEQUATE TAMPING AT THE TIME OF THE INITIAL 'POUR'. - Hume Lake Dam, Sequoia National Forest, Hume, Fresno County, CA

  3. Emerging Trends in Contextual Learning Show Positive Results for Students.

    ERIC Educational Resources Information Center

    WorkAmerica, 2001

    2001-01-01

    This issue focuses on contextual learning (CL), in which students master rigorous academic content in real-world or work-based learning experiences. "Emerging Trends in CL Show Positive Results for Students" discusses CL as an important strategy for improving student achievement. It describes: how CL raises the bar for all students, challenging…

  4. Quantifying IOHDR brachytherapy underdosage resulting from an incomplete scatter environment

    SciTech Connect

    Raina, Sanjay [Department of Radiation Medicine, Roswell Park Cancer Institute, Buffalo, NY (United States); Avadhani, Jaiteerth S. [Department of Radiation Medicine, Roswell Park Cancer Institute, Buffalo, NY (United States); Oh, Moonseong [Department of Radiation Medicine, Roswell Park Cancer Institute, Buffalo, NY (United States); Malhotra, Harish K. [Department of Radiation Medicine, Roswell Park Cancer Institute, Buffalo, NY (United States); Jaggernauth, Wainwright [Department of Radiation Medicine, Roswell Park Cancer Institute, Buffalo, NY (United States); Kuettel, Michael R. [Department of Radiation Medicine, Roswell Park Cancer Institute, Buffalo, NY (United States); Podgorsak, Matthew B. [Department of Radiation Medicine, Roswell Park Cancer Institute, Buffalo, NY (United States)]. E-mail: matthew.podgorsak@roswellpark.org

    2005-04-01

    Purpose: Most brachytherapy planning systems are based on a dose calculation algorithm that assumes an infinite scatter environment surrounding the target volume and applicator. In conventional implants, dosimetric errors from this assumption are negligible. However, in intraoperative high-dose-rate brachytherapy (IOHDR) where treatment catheters are typically laid either directly on a tumor bed or within applicators that may have little or no scatter material above them, the lack of scatter from one side of the applicator can result in underdosage during treatment. This study was carried out to investigate the magnitude of this underdosage. Methods: IOHDR treatment geometries were simulated using a solid water phantom beneath an applicator with varying amounts of bolus material on the top and sides of the applicator to account for missing tissue. Treatment plans were developed for 3 different treatment surface areas (4 x 4, 7 x 7, 12 x 12 cm²), each with prescription points located at 3 distances (0.5 cm, 1.0 cm, and 1.5 cm) from the source dwell positions. Ionization measurements were made with a liquid-filled ionization chamber linear array with a dedicated electrometer and data acquisition system. Results: Measurements showed that the magnitude of the underdosage varies from about 8% to 13% of the prescription dose as the prescription depth is increased from 0.5 cm to 1.5 cm. This treatment error was found to be independent of the irradiated area and strongly dependent on the prescription distance. Furthermore, for a given prescription depth, measurements in planes parallel to an applicator at distances up to 4.0 cm from the applicator plane showed that the dose delivery error is equal in magnitude throughout the target volume. Conclusion: This study demonstrates the magnitude of underdosage in IOHDR treatments delivered in a geometry that may not result in a full scatter environment around the applicator. This implies that the target volume and, specifically, the prescription depth (tumor bed) may get a dose significantly less than prescribed. It might be clinically relevant to correct for this inaccuracy.

  5. Comb-Push Ultrasound Shear Elastography of Breast Masses: Initial Results Show Promise

    PubMed Central

    Song, Pengfei; Fazzio, Robert T.; Pruthi, Sandhya; Whaley, Dana H.; Chen, Shigao; Fatemi, Mostafa

    2015-01-01

    Purpose or Objective: To evaluate the performance of Comb-push Ultrasound Shear Elastography (CUSE) for classification of breast masses. Materials and Methods: CUSE is an ultrasound-based quantitative two-dimensional shear wave elasticity imaging technique, which utilizes multiple laterally distributed acoustic radiation force (ARF) beams to simultaneously excite the tissue and induce shear waves. Female patients who were categorized as having suspicious breast masses underwent CUSE evaluations prior to biopsy. An elasticity estimate within the breast mass was obtained from the CUSE shear wave speed map. Elasticity estimates of various types of benign and malignant masses were compared with biopsy results. Results: Fifty-four female patients with suspicious breast masses from our ongoing study are presented. Our cohort included 31 malignant and 23 benign breast masses. Our results indicate that the mean shear wave speed was significantly higher in malignant masses (6 ± 1.58 m/s) in comparison to benign masses (3.65 ± 1.36 m/s). Therefore, the stiffness of the mass quantified by the Young's modulus is significantly higher in malignant masses. According to the receiver operating characteristic curve (ROC), the optimal cut-off value of 83 kPa yields 87.10% sensitivity, 82.61% specificity, and 0.88 for the area under the curve (AUC). Conclusion: CUSE has the potential for clinical utility as a quantitative diagnostic imaging tool adjunct to B-mode ultrasound for differentiation of malignant and benign breast masses. PMID:25774978
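
    For context on the speed-to-stiffness conversion used above: in shear wave elastography the Young's modulus is conventionally estimated as E = 3ρc², treating tissue as incompressible with density ρ ≈ 1000 kg/m³. A minimal sketch under that assumption (the function and constants are illustrative, not from the paper):

    ```python
    # Shear wave speed (m/s) -> Young's modulus (kPa), assuming E = 3 * rho * c^2
    # for incompressible soft tissue with rho ~ 1000 kg/m^3 (assumed density).
    RHO = 1000.0  # kg/m^3

    def youngs_modulus_kpa(speed_m_s: float) -> float:
        return 3.0 * RHO * speed_m_s ** 2 / 1000.0  # Pa -> kPa

    print(youngs_modulus_kpa(6.0))   # malignant mean speed -> ~108 kPa
    print(youngs_modulus_kpa(3.65))  # benign mean speed -> ~40 kPa
    # The reported 83 kPa cut-off corresponds to c = (83000 / 3000) ** 0.5 ~ 5.3 m/s.
    ```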

  6. 1. ABSTRACT We show results from joint TES-OMI retrievals for

    E-print Network

    1. ABSTRACT We show results from joint TES-OMI retrievals for May-August, 2006. We combine TES and OMI data by linear updates from the spectral residuals. Combined retrievals from the UV and IR, and of particular interest, increased sensitivity to the planetary boundary layer. Results are compared to the OMI

  7. Quantifying the effects of root reinforcing on slope stability: results of the first tests with a new shearing device

    NASA Astrophysics Data System (ADS)

    Rickli, Christian; Graf, Frank

    2013-04-01

    The role of vegetation in preventing shallow soil mass movements such as shallow landslides and soil erosion is generally well recognized and, correspondingly, soil bioengineering on steep slopes has been widely used in practice. However, the precise effectiveness of vegetation regarding slope stability is still difficult to determine. A recently designed inclinable shearing device for large scale vegetated soil samples allows quantitative evaluation of the additional shear strength provided by roots of specific plant species. In the following we describe the results of a first series of shear strength experiments with this apparatus focusing on root reinforcement of White Alder (Alnus incana) and Silver Birch (Betula pendula) in large soil block samples (500 x 500 x 400 mm). The specimens, with partly saturated soil of a maximum grain size of 10 mm, were slowly sheared at an inclination of 35° with low normal stresses of 3.2 kPa accounting for natural conditions on a typical slope prone to mass movements. Measurements during the experiments involved shear stress, shear displacement and normal displacement, all recorded with high accuracy. In addition, dry weights of sprout and roots were measured to quantify plant growth of the planted specimens. The results with the new apparatus indicate a considerable reinforcement of the soil due to plant roots, i.e. maximum shear stresses of the vegetated specimens were substantially higher compared to non-vegetated soil and the additional strength was a function of species and growth. Soil samples with seedlings planted five months prior to the test yielded a marked increase in maximum shear stress of 250% for White Alder and 240% for Silver Birch compared to non-vegetated soil. The results of a second test series with 12-month-old plants showed even clearer enhancements in maximum shear stress (390% for Alder and 230% for Birch). Overall the results of this first series of shear strength experiments with the new apparatus using planted and unplanted soil specimens confirm the importance of plants in soil stabilisation. Furthermore, they demonstrate the suitability of the apparatus to quantify the additional strength of specific vegetation as a function of species and growth under clearly defined conditions in the laboratory.
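
    In slope stability analyses, results of this kind are conventionally expressed as an added root cohesion term in the Mohr-Coulomb failure criterion; the following standard formulation is given for context and is not taken from the abstract:

    \[ \tau_f = c' + c_r + \sigma_n \tan\varphi' \]

    where c_r is the additional cohesion supplied by roots, which such experiments estimate from the difference in maximum shear stress between planted and non-vegetated specimens at the applied normal stress (here 3.2 kPa).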

  8. Nanotribology Results Show that DNA Forms a Mechanically Resistant 2D Network in Metaphase Chromatin Plates

    PubMed Central

    Gállego, Isaac; Oncins, Gerard; Sisquella, Xavier; Fernàndez-Busquets, Xavier; Daban, Joan-Ramon

    2010-01-01

    In a previous study, we found that metaphase chromosomes are formed by thin plates, and here we have applied atomic force microscopy (AFM) and friction force measurements at the nanoscale (nanotribology) to analyze the properties of these planar structures in aqueous media at room temperature. Our results show that high concentrations of NaCl and EDTA and extensive digestion with protease and nuclease enzymes cause plate denaturation. Nanotribology studies show that native plates under structuring conditions (5 mM Mg2+) have a relatively high friction coefficient (μ ≈ 0.3), which is markedly reduced when high concentrations of NaCl or EDTA are added (μ ≈ 0.1). This lubricant effect can be interpreted considering the electrostatic repulsion between DNA phosphate groups and the AFM tip. Protease digestion increases the friction coefficient (μ ≈ 0.5), but the highest friction is observed when DNA is cleaved by micrococcal nuclease (μ ≈ 0.9), indicating that DNA is the main structural element of plates. Whereas nuclease-digested plates are irreversibly damaged after the friction measurement, native plates can absorb kinetic energy from the AFM tip without suffering any damage. These results suggest that plates are formed by a flexible and mechanically resistant two-dimensional network which allows the safe storage of DNA during mitosis. PMID:21156137
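
    For reference, the friction coefficient reported here has its usual tribological meaning, the ratio of lateral (friction) force to applied normal force, which in AFM nanotribology is typically obtained from the slope of friction force versus load (a standard definition, not a detail given in the abstract):

    \[ \mu = F_{\mathrm{friction}} / F_{\mathrm{normal}} \]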

  9. Nanotribology results show that DNA forms a mechanically resistant 2D network in metaphase chromatin plates.

    PubMed

    Gállego, Isaac; Oncins, Gerard; Sisquella, Xavier; Fernàndez-Busquets, Xavier; Daban, Joan-Ramon

    2010-12-15

    In a previous study, we found that metaphase chromosomes are formed by thin plates, and here we have applied atomic force microscopy (AFM) and friction force measurements at the nanoscale (nanotribology) to analyze the properties of these planar structures in aqueous media at room temperature. Our results show that high concentrations of NaCl and EDTA and extensive digestion with protease and nuclease enzymes cause plate denaturation. Nanotribology studies show that native plates under structuring conditions (5 mM Mg2+) have a relatively high friction coefficient (μ ≈ 0.3), which is markedly reduced when high concentrations of NaCl or EDTA are added (μ ≈ 0.1). This lubricant effect can be interpreted considering the electrostatic repulsion between DNA phosphate groups and the AFM tip. Protease digestion increases the friction coefficient (μ ≈ 0.5), but the highest friction is observed when DNA is cleaved by micrococcal nuclease (μ ≈ 0.9), indicating that DNA is the main structural element of plates. Whereas nuclease-digested plates are irreversibly damaged after the friction measurement, native plates can absorb kinetic energy from the AFM tip without suffering any damage. These results suggest that plates are formed by a flexible and mechanically resistant two-dimensional network which allows the safe storage of DNA during mitosis. PMID:21156137

  10. Long-Term Trial Results Show No Mortality Benefit from Annual Prostate Cancer Screening

    Cancer.gov

    Thirteen-year follow-up data from the Prostate, Lung, Colorectal and Ovarian (PLCO) cancer screening trial show higher incidence but similar mortality among men screened annually with the prostate-specific antigen (PSA) test and digital rectal examination (DRE).

  11. Computational Methods continued In previous work, we showed that while the LDA results systematically under-

    E-print Network

    Holzwarth, Natalie

    electrolytes such as Li3PS4.[1] For a variety of interface configurations, computer modeling studies show that Li3PS4 surfaces are structurally and chemically altered by the presence of Li metal. On the other hand, experiments have shown [1] that an electrochemical cell of Li/Li3PS4/Li can be cycled many times

  12. Results of Wedge Resection for Focal Bronchioloalveolar Carcinoma Showing Pure Ground-Glass Attenuation on Computed Tomography

    Microsoft Academic Search

    Shun-ichi Watanabe; Toshio Watanabe; Kazunori Arai; Takahiko Kasai; Joji Haratake; Hiroshi Urayama

    2010-01-01

    Background. Focal bronchioloalveolar carcinoma (BAC) showing pure ground-glass attenuation (GGA) on thin-section computed tomography (CT), which is considered to be an early-stage adenocarcinoma, has been diagnosed with increasing frequency due to the development and spread of the helical CT scanner. We discussed the appropriateness of limited resection for this type of lesion. Methods. Between July 1996 and June 2001,

  13. NIH trial shows promising results in treating a lymphoma in young people

    Cancer.gov

    Patients with a type of cancer known as primary mediastinal B-cell lymphoma who received infusions of chemotherapy, but who did not have radiation therapy to an area of the thorax known as the mediastinum, had excellent outcomes, according to clinical trial results.

  14. Trial results show high remission rate in leukemia following immune cell therapy

    Cancer.gov

    Children and young adults (age 1 to age 30) with chemotherapy-resistant B-cell acute lymphoblastic leukemia (ALL) experienced high remission rates following treatment with an experimental immunotherapy. Results demonstrated that the immunotherapy treatment had anti-leukemia effects in patients and that the treatment was feasible and safe.

  15. Stem cells show promising results for lymphoedema treatment - A literature review.

    PubMed

    Toyserkani, Navid Mohamadpour; Christensen, Marlene Louise; Sheikh, Søren Paludan; Sørensen, Jens Ahm

    2015-04-01

    Lymphoedema is a debilitating condition, manifesting in excess lymphatic fluid and swelling of subcutaneous tissues. Lymphoedema is as yet an incurable condition and current treatment modalities are not satisfactory. The capacity of mesenchymal stem cells to promote angiogenesis, secrete growth factors, regulate the inflammatory process, and differentiate into multiple cell types makes them a potentially ideal therapy for lymphoedema. Adipose tissue is the richest and most accessible source of mesenchymal stem cells and they can be harvested, isolated, and used for therapy in a single-stage procedure as an autologous treatment. The aim of this paper was to review all studies using mesenchymal stem cells for lymphoedema treatment with a special focus on the potential use of adipose-derived stem cells. A systematic search was performed and five preclinical and two clinical studies were found. Different stem cell sources and lymphoedema models were used in the described studies. Most studies showed a decrease in lymphoedema and an increased lymphangiogenesis when treated with stem cells and this treatment modality has so far shown great potential. The present studies are, however, subject to bias and more preclinical studies and large-scale, high-quality clinical trials are needed to show if this emerging therapy can satisfy expectations. PMID:25272309

  16. Hector's dolphin risk assessments: old and new analyses show consistent results

    Microsoft Academic Search

    E Slooten; N Davies

    2012-01-01

    We review results of previous research and present new estimates of Hector's dolphin (Cephalorhynchus hectori) bycatch. Before 2008, an estimated total of 110–150 individuals were caught annually, with 35–46 caught off the east coast South Island (ECSI). We estimate that 23 Hector's dolphins were caught off ECSI during 1 May 2009–30 April 2010 (CV 0.21) based on fisheries observer data.

  17. Hector's dolphin risk assessments: old and new analyses show consistent results

    Microsoft Academic Search

    E Slooten; N Davies

    2011-01-01

    We review results of previous research and present new estimates of Hector's dolphin (Cephalorhynchus hectori) bycatch. Before 2008, an estimated total of 110–150 individuals were caught annually, with 35–46 caught off the east coast South Island (ECSI). We estimate that 23 Hector's dolphins were caught off ECSI during 1 May 2009–30 April 2010 (CV 0.21) based on fisheries observer data.

  18. Lung cancer trial results show mortality benefit with low-dose CT:

    Cancer.gov

    The NCI has released initial results from a large-scale test of screening methods to reduce deaths from lung cancer by detecting cancers at relatively early stages. The National Lung Screening Trial, a randomized national trial involving more than 53,000 current and former heavy smokers ages 55 to 74, compared the effects of two screening procedures for lung cancer -- low-dose helical computed tomography (CT) and standard chest X-ray -- on lung cancer mortality and found 20 percent fewer lung cancer deaths among trial participants screened with low-dose helical CT.

  19. International gene therapy trial for 'bubble boy' disease shows promising early results

    Cancer.gov

    Researchers reported promising outcomes data for the first group of boys with X-linked severe combined immunodeficiency syndrome (SCID-X1), a fatal genetic immunodeficiency also known as "bubble boy" disease, who were treated as part of an international clinical study of a new form of gene therapy. The mechanism used to deliver the gene therapy is designed to prevent the serious complication of leukemia that arose a decade ago in a similar trial in Europe, when one-quarter of boys treated developed the blood cancer. Researchers from the Dana-Farber Cancer Institute presented the study results at the annual meeting of the American Society of Hematology, on behalf of the Transatlantic Gene Therapy Consortium.

  20. Updated clinical results show experimental agent ibrutinib as highly active in CLL patients

    Cancer.gov

    Updated results from a Phase Ib/II clinical trial led by the Ohio State University Comprehensive Cancer Center – Arthur G. James Cancer Hospital and Richard J. Solove Research Institute indicate that a novel therapeutic agent for chronic lymphocytic leukemia (CLL) is highly active and well tolerated in patients who have relapsed and are resistant to other therapy. The agent, ibrutinib (PCI-32765), is the first drug designed to target Bruton's tyrosine kinase (BTK), a protein essential for CLL-cell survival and proliferation. CLL is the most common form of leukemia, with about 15,000 new cases annually in the U.S. About 4,400 Americans die of the disease each year.

  1. Results From Mars Show Electrostatic Charging of the Mars Pathfinder Sojourner Rover

    NASA Technical Reports Server (NTRS)

    Kolecki, Joseph C.; Siebert, Mark W.

    1998-01-01

    Indirect evidence (dust accumulation) has been obtained indicating that the Mars Pathfinder rover, Sojourner, experienced electrostatic charging on Mars. Lander camera images of the Sojourner rover provide distinctive evidence of dust accumulation on rover wheels during traverses, turns, and crabbing maneuvers. The sol 22 (22nd Martian "day" after Pathfinder landed) end-of-day image clearly shows fine red dust concentrated around the wheel edges with additional accumulation in the wheel hubs. A sol 41 image of the rover near the rock "Wedge" (see the next image) shows a more uniform coating of dust on the wheel drive surfaces with accumulation in the hubs similar to that in the previous image. In the sol 41 image, note particularly the loss of black-white contrast on the Wheel Abrasion Experiment strips (center wheel). This loss of contrast was also seen when dust accumulated on test wheels in the laboratory. We believe that this accumulation occurred because the Martian surface dust consists of clay-sized particles, similar to those detected by Viking, which have become electrically charged. By adhering to the wheels, the charged dust carries a net nonzero charge to the rover, raising its electrical potential relative to its surroundings. Similar charging behavior was routinely observed in an experimental facility at the NASA Lewis Research Center, where a Sojourner wheel was driven in a simulated Martian surface environment. There, as the wheel moved and accumulated dust (see the following image), electrical potentials in excess of 100 V (relative to the chamber ground) were detected by a capacitively coupled electrostatic probe located 4 mm from the wheel surface. The measured wheel capacitance was approximately 80 picofarads (pF), and the calculated charge, 8 x 10^-9 coulombs (C). Voltage differences of 100 V and greater are believed sufficient to produce Paschen electrical discharge in the Martian atmosphere. With an accumulated net charge of 8 x 10^-9 C, and an average arc time of 1 msec, arcs can also occur with estimated arc currents approaching 10 milliamperes (mA). Discharges of this magnitude could interfere with the operation of sensitive electrical or electronic elements and logic circuits. [Image caption: Sojourner rover wheel tested in the laboratory before launch to Mars.] Before launch, we believed that the dust would become triboelectrically charged as it was moved about and compacted by the rover wheels. In all cases observed in the laboratory, the test wheel charged positively, and the wheel tracks charged negatively. Dust samples removed from the laboratory wheel averaged a few to tens of micrometers in size (clay size). Coarser grains were left behind in the wheel track. On Mars, grain size estimates of 2 to 10 µm were derived for the Martian surface materials from the Viking Gas Exchange Experiment. These size estimates approximately match the laboratory samples. Our tentative conclusion for the Sojourner observations is that fine clay-sized particles acquired an electrostatic charge during rover traverses and adhered to the rover wheels, carrying electrical charge to the rover. Since the Sojourner rover carried no instruments to measure this mission's onboard electrical charge, confirmatory measurements from future rover missions on Mars are desirable so that the physical and electrical properties of the Martian surface dust can be characterized. Sojourner was protected by discharge points, and Faraday cages were placed around sensitive electronics.
But larger systems than Sojourner are being contemplated for missions to the Martian surface in the foreseeable future. The design of such systems will require a detailed knowledge of how they will interact with their environment. Validated environmental interaction models and guidelines for the Martian surface must be developed so that design engineers can test new ideas prior to cutting hardware. These models and guidelines cannot be validated without actual flight data. Electrical charging of vehicles and, one day, astronauts moving across t
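
    The reported charge is just the measured capacitance times the detected potential; as a quick consistency check using the abstract's own numbers:

    \[ Q = C\,V = (80\ \mathrm{pF}) \times (100\ \mathrm{V}) = 8 \times 10^{-9}\ \mathrm{C} \]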

  2. Animation shows promise in initiating timely cardiopulmonary resuscitation: results of a pilot study.

    PubMed

    Attin, Mina; Winslow, Katheryn; Smith, Tyler

    2014-04-01

    Delayed responses during cardiac arrest are common. Timely interventions during cardiac arrest have a direct impact on patient survival. Integration of technology in nursing education is crucial to enhance teaching effectiveness. The goal of this study was to investigate the effect of animation on nursing students' response time to cardiac arrest, including initiation of timely chest compression. Nursing students were randomized into experimental and control groups prior to practicing in a high-fidelity simulation laboratory. The experimental group was educated, by discussion and animation, about the importance of starting cardiopulmonary resuscitation upon recognizing an unresponsive patient. Afterward, a discussion session allowed students in the experimental group to gain more in-depth knowledge about the most recent changes in the cardiac resuscitation guidelines from the American Heart Association. A linear mixed model was run to investigate differences in time of response between the experimental and control groups while controlling for differences in those with additional degrees, prior code experience, and basic life support certification. The experimental group had a faster response time compared with the control group and initiated timely cardiopulmonary resuscitation upon recognition of deteriorating conditions (P < .0001). The results demonstrated the efficacy of combined teaching modalities for timely cardiopulmonary resuscitation. Providing opportunities for repetitious practice when a patient's condition is deteriorating is crucial for teaching safe practice. PMID:24473120
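
    A minimal sketch of the kind of analysis described, a linear mixed model for response time with the listed covariates as fixed effects, assuming repeated simulation measurements nested within students (the file name, column names, and random-intercept structure are illustrative assumptions, not details from the abstract):

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical long-format data: one row per student per simulation run.
    df = pd.read_csv("cpr_response_times.csv")  # illustrative file name

    # Fixed effects: group plus the covariates named in the abstract;
    # random intercept per student to handle repeated measurements.
    model = smf.mixedlm(
        "response_time ~ group + extra_degree + code_experience + bls_certified",
        data=df,
        groups=df["student_id"],
    )
    print(model.fit().summary())
    ```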

  3. QUantifying the Aerosol Direct and Indirect Effect over Eastern Mediterranean from Satellites (QUADIEEMS): Overview and preliminary results

    NASA Astrophysics Data System (ADS)

    Georgoulias, Aristeidis K.; Zanis, Prodromos; Pöschl, Ulrich; Kourtidis, Konstantinos A.; Alexandri, Georgia; Ntogras, Christos; Marinou, Eleni; Amiridis, Vassilis

    2013-04-01

    An overview and preliminary results from the research implemented within the framework of the QUADIEEMS project are presented. For the purposes of the project, satellite data from five sensors (MODIS aboard EOS TERRA, MODIS aboard EOS AQUA, TOMS aboard Earth Probe, OMI aboard EOS AURA and CALIOP aboard CALIPSO) are used in conjunction with meteorological data from ECMWF ERA-interim reanalysis and data from a global chemical-aerosol-transport model as well as simulation results from a regional climate model (RegCM4) coupled with a simplified aerosol scheme. QUADIEEMS focuses on the Eastern Mediterranean [30°N-45°N, 17.5°E-37.5°E], a region situated at the crossroads of different aerosol types and thus ideal for the investigation of the direct and indirect effects of various aerosol types at a high spatial resolution. The project consists of five components. First, raw data from various databases are acquired, analyzed and spatially homogenized with the outcome being a high resolution (0.1x0.1 degree) and a moderate resolution (1.0x1.0 degree) gridded dataset of aerosol and cloud optical properties. The marine, dust and anthropogenic fraction of aerosols over the region is quantified making use of the homogenized dataset. Regional climate model simulations with RegCM4/aerosol are also implemented for the greater European region for the period 2000-2010 at a resolution of 50 km. RegCM4's ability to simulate AOD550 over Europe is evaluated. The aerosol-cloud relationships, for sub-regions of the Eastern Mediterranean characterized by the presence of predominant aerosol types, are examined. The aerosol-cloud relationships are also examined taking into account the relative position of aerosol and cloud layers as defined by CALIPSO observations. Within the final component of the project, results and data that emerged from all the previous components are used in satellite-based parameterizations in order to quantify the direct and indirect (first) radiative effect of the different aerosol types at a resolution of 0.1x0.1 degrees. The procedure is repeated using a 1.0x1.0 degree resolution, in order to examine the footprint of the aerosol direct and indirect effects. The project ends with the evaluation of RegCM4's ability to simulate the aerosol direct radiative effect over the region. QUADIEEMS is co-financed by the European Social Fund (ESF) and national resources under the operational programme Education and Lifelong Learning (EdLL) within the framework of the Action "Supporting Postdoctoral Researchers".

  4. Quantifying dust input to the Subarctic North Pacific - Results from surface sediments and sea water thorium isotope measurements

    NASA Astrophysics Data System (ADS)

    Winckler, G.; Serno, S.; Hayes, C.; Anderson, R. F.; Gersonde, R.; Haug, G. H.

    2012-12-01

    The Subarctic North Pacific is one of the three primary high-nutrient-low chlorophyll regions of the modern ocean, where the biological pump is relatively inefficient at transferring carbon from the atmosphere to the deep sea. The system is thought to be iron-limited. Aeolian dust is a significant source of iron and other nutrients that are essential for the health of marine ecosystems and potentially a controlling factor of the high-nutrient-low chlorophyll status of the Subarctic North Pacific. However, constraining the size of the dust flux to the surface ocean remains difficult. Here we apply two different approaches, based on surface sediment and water column samples, respectively, obtained during the SO202/INOPEX research cruise to the Subarctic North Pacific in 2009. We map the spatial patterns of Th/U isotopes, helium isotopes and rare earth elements across surface sediments from 37 multi-core core-top sediments across the Subarctic North Pacific. In order to deconvolve the detrital endmembers in regions of the North Pacific affected by volcanic material, IRD and hemipelagic input, we use a combination of trace elements with distinct characteristics in the different endmembers. This approach allows us to calculate the relative aeolian fraction, and in combination with Thorium-230-normalized mass flux data, to quantify the dust supply. Secondly, we present an innovative approach to use paired Thorium-232 and Thorium-230 concentrations of upper-ocean seawater at 7 stations along the INOPEX track. Thorium-232 in the upper water column is dominantly derived from dissolution of aeolian dust, whereas Thorium-230 data provide a measure of the thorium removal from the surface waters and, thus, allow us to derive Thorium-232 fluxes. Combined with a mean Thorium-232 concentration in dust and an estimate of the thorium solubility, the Thorium-232 flux can be translated into a dust flux to the surface ocean. Dust flux estimates for the Subarctic North Pacific will be compared to results from model simulations from Mahowald et al. (2006).
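
    In equation form, the final conversion step reads as follows (the symbols are introduced here for illustration and are not the authors' notation):

    \[ F_{\mathrm{dust}} = \frac{F_{\mathrm{Th\text{-}232}}}{[\mathrm{Th\text{-}232}]_{\mathrm{dust}} \; f_{\mathrm{sol}}} \]

    where F_Th-232 is the seawater-derived Thorium-232 flux, [Th-232]_dust the assumed mean Thorium-232 concentration in dust, and f_sol the assumed fractional solubility of thorium from dust.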

  5. News Note: Long-term Results from Study of Tamoxifen and Raloxifene Shows Lower Toxicities of Raloxifene

    Cancer.gov

    Initial results in 2006 of the NCI-sponsored Study of Tamoxifen and Raloxifene (STAR) showed that a common osteoporosis drug, raloxifene, prevented breast cancer to the same degree as the drug tamoxifen, which had been in use for many years for breast cancer prevention as well as treatment, but with fewer serious side effects. The longer-term results show that raloxifene retained 76 percent of the effectiveness of tamoxifen in preventing invasive disease and grew closer to tamoxifen in preventing noninvasive disease, while remaining far less toxic – in particular, there was significantly less endometrial cancer with raloxifene use.

  6. Exhibitor Search I Product Category Search ISession Search I MyShow Return to Search Results or Search Again

    E-print Network

    Olsen, Paul E.

    Climate Models in Deep Time Require Tight Temporal and Latitudinal Constraints: The Colorado on are often woefully inadequate for global comparisons of climatically relevant data as demonstrated by the 10

  7. We report results showing that working memory for American Sign Language (ASL) is sensitive to irrelevant signed input

    E-print Network

    We report results showing that working memory for American Sign Language (ASL) is sensitive for sign language involves visual or quasi-visual representations, suggesting parallels to visuospatial working memory. We have previously argued that the structure of working memory for sign language

  8. Seeking to quantify the ferromagnetic-to-antiferromagnetic interface coupling resulting in exchange bias with various thin-film conformations

    SciTech Connect

    Hsiao, C. H.; Wang, S.; Ouyang, H., E-mail: houyang@mx.nthu.edu.tw [Department of Materials Science and Engineering, National Tsing Hua University, Hsinchu 300, Taiwan (China); Desautels, R. D.; Lierop, J. van, E-mail: Johan.van.Lierop@umanitoba.ca [Department of Physics and Astronomy, University of Manitoba, Winnipeg, MB R3T 2N2 (Canada); Lin, K. W. [Department of Materials Science and Engineering, National Chung Hsing University, Taichung 402, Taiwan (China)

    2014-08-07

    Ni3Fe/(Ni, Fe)O thin films with bilayer and nanocrystallite dispersion morphologies are prepared with a dual ion beam deposition technique permitting precise control of nanocrystallite growth, composition, and admixtures. A bilayer morphology provides a Ni3Fe-to-NiO interface, while the dispersion films have different mixtures of Ni3Fe, NiO, and FeO nanocrystallites. Using detailed analyses of high resolution transmission electron microscopy images with Multislice simulations, the nanocrystallites' structures and phases are determined, and the intermixing between the Ni3Fe, NiO, and FeO interfaces is quantified. From field-cooled hysteresis loops, the exchange bias loop shift from spin interactions at the interfaces is determined. With similar interfacial molar ratios of FM-to-AF, we find the exchange bias field essentially unchanged. However, when the interfacial ratio of FM to AF was FM rich, the exchange bias field increases. Since the FM/AF interface ‘contact’ areas in the nanocrystallite dispersion films are larger than that of the bilayer film, and the nanocrystallite dispersions exhibit larger FM-to-AF interfacial contributions to the magnetism, we attribute the changes in the exchange bias to increases in the interfacial segments that suffer defects (such as vacancies and bond distortions), which also affect the coercive fields.
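
    The scaling the authors invoke, more ferromagnet-to-antiferromagnet interfacial contact producing a larger loop shift, is conventionally captured by the Meiklejohn-Bean estimate, quoted here as standard background rather than from the paper:

    \[ H_{\mathrm{eb}} = \frac{\Delta E}{\mu_0 M_{\mathrm{F}} t_{\mathrm{F}}} \]

    where ΔE is the interfacial exchange energy per unit area and M_F and t_F are the magnetization and thickness of the ferromagnetic layer; a larger effective contact area raises ΔE and with it the loop shift.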

  9. Recombinant PNPLA3 protein shows triglyceride hydrolase activity and its I148M mutation results in loss of function.

    PubMed

    Pingitore, Piero; Pirazzi, Carlo; Mancina, Rosellina M; Motta, Benedetta M; Indiveri, Cesare; Pujia, Arturo; Montalcini, Tiziana; Hedfalk, Kristina; Romeo, Stefano

    2014-04-01

    The patatin-like phospholipase domain containing 3 (PNPLA3, also called adiponutrin, ADPN) is a membrane-bound protein highly expressed in the liver. The genetic variant I148M (rs738409) was found to be associated with progression of chronic liver disease. We aimed to establish a protein purification protocol in a yeast system (Pichia pastoris) and to examine the human PNPLA3 enzymatic activity, substrate specificity and the I148M mutation effect. hPNPLA3 148I wild type and 148M mutant cDNA were cloned into P. pastoris expression vectors. Yeast cells were grown in 3L fermentors. PNPLA3 protein was purified from membrane fractions by Ni-affinity chromatography. Enzymatic activity was assessed using radiolabeled substrates. Both 148I wild type and 148M mutant proteins are localized to the membrane. The wild type protein shows a predominant lipase activity with mild lysophosphatidic acid acyl transferase activity (LPAAT) and the I148M mutation results in a loss of function of both these activities. Our data show that PNPLA3 has a predominant lipase activity and I148M mutation results in a loss of function. PMID:24369119

  10. Unbalanced GLA mRNAs ratio quantified by real-time PCR in Fabry patients' fibroblasts results in Fabry disease.

    PubMed

    Filoni, Camilla; Caciotti, Anna; Carraresi, Laura; Donati, Maria Alice; Mignani, Renzo; Parini, Rossella; Filocamo, Mirella; Soliani, Fausto; Simi, Lisa; Guerrini, Renzo; Zammarchi, Enrico; Morrone, Amelia

    2008-11-01

    Total or partial deficiency of the human lysosomal hydrolase alpha-galactosidase A is responsible for Fabry disease, the X-linked inborn error of glycosphingolipid metabolism. Together with the predominant alpha-galactosidase A gene mRNA product encoding the lysosomal enzyme, a weakly regulated alternatively spliced alpha-galactosidase A transcript is expressed in normal tissues, but its overexpression, due to the intronic g.9331G>A mutation, leads to the cardiac variant. We report the molecular characterization of five Fabry patients including two siblings. Sequencing analysis of the alpha-galactosidase A gene coding region and intron/exon boundaries identified the new c.124A>G (p.M42V) genetic lesion as well as a known deletion in three patients, whereas in the two remaining patients, no mutations were identified. To evaluate possible alpha-galactosidase A gene transcription alterations, both predominant and alternatively spliced mRNAs were quantified by absolute real-time PCR on total RNA preparations from the patients' fibroblasts. An impressive reduction in the predominant alpha-galactosidase A transcript was detected in the last patients (Pt 4 and Pt 5). However, the alternatively spliced mRNA was dramatically overexpressed in one of them, carrying a new intronic lesion (g.9273C>T). These findings strongly suggest a correlation between this new intronic mutation and the unbalanced alpha-galactosidase A mRNAs ratio, which could therefore be responsible for the reduced enzyme activity causing Fabry disease. The real-time assay developed here to investigate the two alpha-galactosidase A mRNAs might play a crucial role in revealing possible genetic lesions and in confirming the pathogenetic mechanisms underlying Fabry disease. PMID:18560446
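
    Absolute quantification of this kind conventionally rests on a standard curve relating the threshold cycle C_t to the starting copy number N_0; the generic relation (not specific to this study) is

    \[ C_t = m \log_{10} N_0 + b \quad\Longrightarrow\quad N_0 = 10^{(C_t - b)/m} \]

    with slope m and intercept b fitted from serial dilutions of a known template, after which the two GLA transcripts can be compared on an absolute copy-number scale.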

  11. Not all Surface Waters show a Strong Relation between DOC and Hg Species: Results from an Adirondack Mountain Watershed

    NASA Astrophysics Data System (ADS)

    Burns, D. A.; Schelker, J.; Murray, K. R.; Brigham, M. E.; Aiken, G.

    2009-12-01

    Several recent papers have highlighted the strong statistical correlation between dissolved organic carbon (DOC) concentrations and total dissolved mercury (THgd) and/or dissolved methyl Hg (MeHgd). These relations of organic carbon with Hg species are often even stronger when a measurement that reflects some fraction of the DOC is used such as UV absorbance at 254 nm or the hydrophobic acid fraction. These strong relations are not surprising given the pivotal role DOC plays in binding and transporting Hg, which is otherwise relatively insoluble in dilute waters. In this study, we show data collected monthly and during some storms and snowmelt over 2.5 years from the 65 km2 Fishing Brook watershed in the Adirondack Mountains of New York. This dataset is noteworthy because of a weak and statistically non-significant (p > 0.05) relationship between DOC and either of THgd or MeHgd over the entire study period. We believe that the lack of a strong DOC-Hg relation in Fishing Brook reflects the combined effects of the heterogeneous land cover and the presence of three ponds within the watershed. The watershed is dominantly (89.3%) hardwood and coniferous forest with 8% wetland area, and 2.7% open water. Despite the lack of a strong relation between DOC and Hg species across the annual hydrograph, the dataset shows strong within-season correlations that have different y-intercepts and slopes between the growing season (May 1 - Sept. 30) and dormant season (Oct. 1 - April 30), as well as strong, but seasonally varying DOC-Hg correlations at smaller spatial scales in data collected on several occasions in 10 sub-watersheds of Fishing Brook. We hypothesize that a combination of several factors can account for these annually weak, but seasonally and spatially strong DOC-Hg correlations: (1) seasonal variations in runoff generation processes from upland and wetland areas that may yield DOC with varying Hg-binding characteristics, (2) photo-induced losses of Hg species and DOC in ponded areas, and (3) the effects of the widely varying seasonal temperature and snow cover on the rates of microbial processes such as the decomposition of soil organic matter and methylation of Hg. These results emphasize that not all watersheds show simple linear relations between DOC and Hg species on an annual basis, and provide a caution that measurements such as the optical properties of waters are not always a strong surrogate for Hg.
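
    A minimal sketch of the season-split regression described above, assuming a simple table of paired DOC and dissolved Hg concentrations (the file and column names are illustrative; the May-September growing season follows the abstract):

    ```python
    import numpy as np
    import pandas as pd

    # Hypothetical data: one row per sample with date, DOC (mg/L), THgd (ng/L).
    df = pd.read_csv("fishing_brook_samples.csv", parse_dates=["date"])
    df["season"] = np.where(df["date"].dt.month.between(5, 9), "growing", "dormant")

    # Separate DOC-THgd regressions by season, each with its own slope/intercept.
    for season, grp in df.groupby("season"):
        slope, intercept = np.polyfit(grp["doc_mg_l"], grp["thgd_ng_l"], 1)
        r = grp["doc_mg_l"].corr(grp["thgd_ng_l"])
        print(season, f"slope={slope:.3f}", f"intercept={intercept:.3f}", f"r={r:.2f}")
    ```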

  12. QUANTIFYING SPICULES

    SciTech Connect

    Pereira, Tiago M. D. [NASA Ames Research Center, Moffett Field, CA 94035 (United States); De Pontieu, Bart [Lockheed Martin Solar and Astrophysics Laboratory, Org. A021S, Building 252, 3251 Hanover Street, Palo Alto, CA 94304 (United States); Carlsson, Mats, E-mail: tiago.pereira@nasa.gov [Institute of Theoretical Astrophysics, P.O. Box 1029, Blindern, NO-0315 Oslo (Norway)

    2012-11-01

    Understanding the dynamic solar chromosphere is fundamental in solar physics. Spicules are an important feature of the chromosphere, connecting the photosphere to the corona, potentially mediating the transfer of energy and mass. The aim of this work is to study the properties of spicules over different regions of the Sun. Our goal is to investigate if there is more than one type of spicule, and how spicules behave in the quiet Sun, coronal holes, and active regions. We make use of high cadence and high spatial resolution Ca II H observations taken by Hinode/Solar Optical Telescope. Making use of a semi-automated detection algorithm, we self-consistently track and measure the properties of 519 spicules over different regions. We find clear evidence of two types of spicules. Type I spicules show a rise and fall and have typical lifetimes of 150-400 s and maximum ascending velocities of 15-40 km/s, while type II spicules have shorter lifetimes of 50-150 s, faster velocities of 30-110 km/s, and are not seen to fall down, but rather fade at around their maximum length. Type II spicules are the most common, seen in the quiet Sun and coronal holes. Type I spicules are seen mostly in active regions. There are regional differences between quiet-Sun and coronal hole spicules, likely attributable to the different field configurations. The properties of type II spicules are consistent with published results of rapid blueshifted events (RBEs), supporting the hypothesis that RBEs are their disk counterparts. For type I spicules we find the relations between their properties to be consistent with a magnetoacoustic shock wave driver, and with dynamic fibrils as their disk counterpart. The driver of type II spicules remains unclear from limb observations.

  13. Genomic and Enzymatic Results Show Bacillus cellulosilyticus Uses a Novel Set of LPXTA Carbohydrases to Hydrolyze Polysaccharides

    PubMed Central

    Mead, David; Drinkwater, Colleen; Brumm, Phillip J.

    2013-01-01

    Background: Alkaliphilic Bacillus species are intrinsically interesting due to the bioenergetic problems posed by growth at high pH and high salt. Three alkaline cellulases have been cloned, sequenced and expressed from Bacillus cellulosilyticus N-4 (Bcell) making it an excellent target for genomic sequencing and mining of biomass-degrading enzymes. Methodology/Principal Findings: The genome of Bcell is a single chromosome of 4.7 Mb with no plasmids present and three large phage insertions. The most unusual feature of the genome is the presence of 23 LPXTA membrane anchor proteins; 17 of these are annotated as involved in polysaccharide degradation. These two values are significantly higher than seen in any other Bacillus species. This high number of membrane anchor proteins is seen only in pathogenic Gram-positive organisms such as Listeria monocytogenes or Staphylococcus aureus. Bcell also possesses four sortase D subfamily 4 enzymes that incorporate LPXTA-bearing proteins into the cell wall; three of these are closely related to each other and unique to Bcell. Cell fractionation and enzymatic assay of Bcell cultures show that the majority of polysaccharide degradation is associated with the cell wall LPXTA-enzymes, an unusual feature in Gram-positive aerobes. Genomic analysis and growth studies both strongly argue against Bcell being a truly cellulolytic organism, in spite of its name. Preliminary results suggest that fungal mycelia may be the natural substrate for this organism. Conclusions/Significance: Bacillus cellulosilyticus N-4, in spite of its name, does not possess any of the genes necessary for crystalline cellulose degradation, demonstrating the risk of classifying microorganisms without the benefit of genomic analysis. Bcell is the first Gram-positive aerobic organism shown to use predominantly cell-bound, non-cellulosomal enzymes for polysaccharide degradation. The LPXTA-sortase system utilized by Bcell may have applications both in anchoring cellulases and other biomass-degrading enzymes to Bcell itself and in anchoring proteins in other Gram-positive organisms. PMID:23593409

  14. Quantifying chain reptation in entangled polymer melts: topological and dynamical mapping of atomistic simulation results onto the tube model.

    PubMed

    Stephanou, Pavlos S; Baig, Chunggi; Tsolou, Georgia; Mavrantzas, Vlasis G; Kröger, Martin

    2010-03-28

    The topological state of entangled polymers has been analyzed recently in terms of primitive paths which allowed obtaining reliable predictions of the static (statistical) properties of the underlying entanglement network for a number of polymer melts. Through a systematic methodology that first maps atomistic molecular dynamics (MD) trajectories onto time trajectories of primitive chains and then documents primitive chain motion in terms of a curvilinear diffusion in a tubelike region around the coarse-grained chain contour, we are extending these static approaches here even further by computing the most fundamental function of the reptation theory, namely, the probability psi(s,t) that a segment s of the primitive chain remains inside the initial tube after time t, accounting directly for contour length fluctuations and constraint release. The effective diameter of the tube is independently evaluated by observing tube constraints either on atomistic displacements or on the displacement of primitive chain segments orthogonal to the initial primitive path. Having computed the tube diameter, the tube itself around each primitive path is constructed by visiting each entanglement strand along the primitive path one after the other and approximating it by the space of a small cylinder having the same axis as the entanglement strand itself and a diameter equal to the estimated effective tube diameter. Reptation of the primitive chain longitudinally inside the effective constraining tube as well as local transverse fluctuations of the chain driven mainly from constraint release and regeneration mechanisms are evident in the simulation results; the latter causes parts of the chains to venture outside their average tube surface for certain periods of time. The computed psi(s,t) curves account directly for both of these phenomena, as well as for contour length fluctuations, since all of them are automatically captured in the atomistic simulations. Linear viscoelastic properties such as the zero shear rate viscosity and the spectra of storage and loss moduli obtained on the basis of the obtained psi(s,t) curves for three different polymer melts (polyethylene, cis-1,4-polybutadiene, and trans-1,4-polybutadiene) are consistent with experimental rheological data and in qualitative agreement with the double reptation and dual constraint models. The new methodology is general and can be routinely applied to analyze primitive path dynamics and chain reptation in atomistic trajectories (accumulated through long MD simulations) of other model polymers or polymeric systems (e.g., bidisperse, branched, grafted, etc.); it is thus believed to be particularly useful in the future in evaluating proposed tube models and developing more accurate theories for entangled systems. PMID:20370147
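
    For orientation, the benchmark against which computed psi(s,t) curves are conventionally compared is the Doi-Edwards pure-reptation prediction for the contour-averaged tube survival probability, a textbook result not quoted from the paper:

    \[ \psi(t) = \sum_{p\,\mathrm{odd}} \frac{8}{p^{2}\pi^{2}} \exp\!\left(-\frac{p^{2} t}{\tau_{d}}\right) \]

    where τ_d is the reptation (disentanglement) time; contour length fluctuations and constraint release, both captured in the atomistic trajectories, appear as deviations from this form.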

  15. QUANTIFYING FOREST ABOVEGROUND CARBON POOLS AND FLUXES USING MULTI-TEMPORAL LIDAR: A report on field monitoring, remote sensing MMV, GIS integration, and modeling results for a forestry field validation test to quantify aboveground tree biomass and carbon

    SciTech Connect

    Lee Spangler; Lee A. Vierling; Eva K. Strand; Andrew T. Hudak; Jan U.H. Eitel; Sebastian Martinuzzi

    2012-04-01

    Sound policy recommendations relating to the role of forest management in mitigating atmospheric carbon dioxide (CO2) depend upon establishing accurate methodologies for quantifying forest carbon pools for large tracts of land that can be dynamically updated over time. Light Detection and Ranging (LiDAR) remote sensing is a promising technology for achieving accurate estimates of aboveground biomass and thereby carbon pools; however, not much is known about the accuracy of estimating biomass change and carbon flux from repeat LiDAR acquisitions containing different data sampling characteristics. In this study, discrete return airborne LiDAR data were collected in 2003 and 2009 across approximately 20,000 hectares (ha) of an actively managed, mixed conifer forest landscape in northern Idaho, USA. Forest inventory plots were established via a random stratified sampling design and sampled in 2003 and 2009. The Random Forest machine learning algorithm was used to establish statistical relationships between inventory data and forest structural metrics derived from the LiDAR acquisitions. Aboveground biomass maps were created for the study area based on statistical relationships developed at the plot level. Over this 6-year period, we found that the mean increase in biomass due to forest growth across the non-harvested portions of the study area was 4.8 metric tons per hectare (Mg/ha). In these non-harvested areas, we found a significant difference in biomass increase among forest successional stages, with a higher biomass increase in mature and old forest compared to stand initiation and young forest. Approximately 20% of the landscape had been disturbed by harvest activities during the six-year time period, representing a biomass loss of >70 Mg/ha in these areas. During the study period, these harvest activities outweighed growth at the landscape scale, resulting in an overall loss in aboveground carbon at this site. The 30-fold increase in sampling density between the 2003 and 2009 acquisitions did not affect the biomass estimates. Overall, LiDAR data coupled with field reference data offer a powerful method for calculating pools and changes in aboveground carbon in forested systems. The results of our study suggest that multitemporal LiDAR-based approaches are likely to be useful for high-quality estimates of aboveground carbon change in conifer forest systems.
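
    A minimal sketch of the plot-level modeling step described above, assuming a table of per-plot LiDAR structural metrics with field-measured aboveground biomass (the file name, predictor columns, and hyperparameters are illustrative assumptions; the study's actual feature set is not given in the abstract):

    ```python
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor

    # Hypothetical plot table: LiDAR height/cover metrics plus field biomass.
    plots = pd.read_csv("plots_2009.csv")  # illustrative file name
    predictors = ["mean_height", "p95_height", "canopy_cover", "return_density"]
    X, y = plots[predictors], plots["biomass_mg_ha"]

    # Random Forest regression, as in the study (settings assumed, not reported).
    rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
    rf.fit(X, y)
    print("Out-of-bag R^2:", rf.oob_score_)
    # The fitted model is then applied to gridded LiDAR metrics to map biomass.
    ```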

  16. Results of wedge resection for focal bronchioloalveolar carcinoma showing pure ground-glass attenuation on computed tomography

    Microsoft Academic Search

    Shun-ichi Watanabe; Toshio Watanabe; Kazunori Arai; Takahiko Kasai; Joji Haratake; Hiroshi Urayama

    2002-01-01

    Background. Focal bronchioloalveolar carcinoma (BAC) showing pure ground-glass attenuation (GGA) on thin-section computed tomography (CT), which is considered to be an early-stage adenocarcinoma, has been diagnosed with increasing frequency due to the development and spread of the helical CT scanner. We discussed the appropriateness of limited resection for this type of lesion. Methods. Between July 1996 and June 2001, 17 patients

  17. Early Results Show Reduced Infection Rate Using No-touch Technique for Expander/ADM Breast Reconstruction

    PubMed Central

    2015-01-01

    Summary: Infection is a common complication of immediate breast reconstruction that often leads to device removal, a result emotionally devastating to the patient and frustrating for her surgeon. “No-touch” techniques have been used in other surgical disciplines and plastic surgery, but they have not been reported for breast reconstruction with tissue expanders or implants and acellular dermis. We report a novel technique of tissue expander and acellular dermis placement using no-touch principles with a self-retaining retractor system that holds promise to decrease infectious complications of breast reconstruction. PMID:25878928

  18. Quantifying Dark Gas

    E-print Network

    Li, Di; Heiles, Carl; Pan, Zhichen; Tang, Ningyu

    2015-01-01

    A growing body of evidence has been supporting the existence of so-called "dark molecular gas" (DMG), which is invisible in the most common tracer of molecular gas, i.e., CO rotational emission. DMG is believed to be the main gas component of the intermediate extinction region between Av ~ 0.05-2, roughly corresponding to the self-shielding threshold of H2 and 13CO. To quantify DMG relative to HI and CO, we are pursuing three observational techniques, namely, HI self-absorption, OH absorption, and terahertz C+ emission. In this paper, we focus on preliminary results from a CO and OH absorption survey of DMG candidates. Our analysis shows that the OH excitation temperature is close to that of the Galactic continuum background and that OH is a good DMG tracer co-existing with molecular hydrogen in regions without CO. Through systematic "absorption mapping" by the Square Kilometer Array (SKA) and ALMA, we will have unprecedented, comprehensive knowledge of the ISM components including DMG in terms of...
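
    The leverage of absorption measurements here follows from the standard radiative transfer expression for a line seen against a background of brightness temperature T_bg:

    \[ \Delta T_{\mathrm{line}} = (T_{\mathrm{ex}} - T_{\mathrm{bg}})(1 - e^{-\tau}) \]

    When T_ex ≈ T_bg, as found for OH above, the line is essentially invisible in emission regardless of optical depth, yet it still produces a measurable dip against a bright discrete continuum source, which is why an absorption survey can recover this gas.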

  19. Results of assessments by year of cohort The following pages show the results of the assessments carried out over the six-year period of the

    E-print Network

    Bradbeer, Robin Sarah

    ways, always comparing the inter-cohort analysis. Box and whisker plots: The first is a box and whisker plot generated using the Comparative Statistics package that comes with Analyse-It, an Excel add-on. Box-plots and parametric statistics: The first table below the box-plot chart shows the data used to draw the box-plots

  20. Magic Show

    Microsoft Academic Search

    Zachary Brass

    2012-01-01

    With a concentration in theatre, I created a magic show from scratch. Over the course of the semester, I researched both the effects (more commonly known as magic tricks) in a variety of styles, especially mentalism, and the patter, or script, that is integral to making a good effect into something utterly amazing. I chose a certain set of

  1. Assessing the use of Geoscience Laser Altimeter System data to quantify forest structure change resultant from large-scale forest disturbance events- Case Study Hurricane Katrina

    Microsoft Academic Search

    K. A. Dolan; G. C. Hurtt; J. Q. Chambers; R. Dubayah; S. E. Frolking; J. Masek

    2009-01-01

    The biodiversity, structure, and functioning of forest systems in most areas are strongly influenced by disturbances. Forest structure can both influence and help indicate forest functions such as the storage and transfer of carbon between the land surface and the atmosphere. A 2007 report published by the National Research Council states that 'Quantifying changes in the size of the [vegetation

  2. Quantifiable Lateral Flow Assay Test Strips

    NASA Technical Reports Server (NTRS)

    2003-01-01

    As easy to read as a home pregnancy test, three Quantifiable Lateral Flow Assay (QLFA) strips used to test water for E. coli show different results. The brightly glowing control line on the far right of each strip indicates that all three tests ran successfully. But the glowing test lines on the middle and bottom strips reveal that their samples were contaminated with E. coli bacteria at two different concentrations. The color intensity correlates with the concentration of contamination.

  3. Who is eating seafood? On an annual basis, results from the survey screener showed that 65% of U.S. households purchased

    E-print Network

    Who is eating seafood? On an annual basis, results from the survey screener showed that 65% of U.S. households purchased seafood for at-home consumption at least once in the previous year while 83% of households purchased seafood in a restaurant during the same period. As shown in Figures 1a-c, retail seafood

  4. Toward quantifying infrared clutter

    NASA Astrophysics Data System (ADS)

    Reynolds, William R.

    1990-09-01

    Target detection in clutter depends sensitively on the spatial structure of the latter. In particular, it is the ratio of the target size to the clutter inhomogeneity scale which is of crucial importance. Indeed, looking for the leopard in the background of leopard skin is a difficult task. Hence quantifying thermal clutter is essential to the development of successful detection algorithms and signature analysis. This paper describes an attempt at clutter characterization along with several applications using calibrated thermal imagery collected by the Keweenaw Research Center. The key idea is to combine spatial and intensity statistics of the clutter into one number in order to characterize intensity variations over the length scale imposed by the target. Furthermore, when properly normalized, this parameter appears independent of temporal meteorological variation, thereby constituting a background scene invariant. This measure has a basis in analysis of variance and is related to digital signal processing fundamentals. Statistical analysis of thermal images is presented with promising results.
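
    The "one number" approach described here resembles block-variance clutter measures such as the Schmieder-Weathersby metric: compute the intensity standard deviation in blocks sized to the target, then RMS-combine them over the scene. The sketch below (Python; function names, the toy scene, and the metric's exact normalization are our assumptions, not the paper's) illustrates that style of statistic.

        import numpy as np

        def block_clutter(image, block):
            """RMS of per-block intensity standard deviations; the block
            size is matched to the target's apparent size (assumption)."""
            h, w = image.shape
            sigmas = [image[i:i + block, j:j + block].std()
                      for i in range(0, h - block + 1, block)
                      for j in range(0, w - block + 1, block)]
            return float(np.sqrt(np.mean(np.square(sigmas))))

        rng = np.random.default_rng(0)
        scene = rng.normal(0.0, 1.0, (128, 128)).cumsum(axis=1)  # toy spatially correlated scene
        print(block_clutter(scene, block=16))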

  5. Two heteronuclear dipolar results at the price of one: quantifying Na/P contacts in phosphosilicate glasses and biomimetic hydroxy-apatite.

    PubMed

    Stevensson, Baltzar; Mathew, Renny; Yu, Yang; Edén, Mattias

    2015-02-01

    The analysis of S{I} recoupling experiments applied to amorphous solids yields a heteronuclear second moment M(2)(S-I) that represents the effective through-space dipolar interaction between the detected S spins and the neighboring I-spin species. We show that both M(2)(S-I) and M(2)(I-S) values are readily accessible from a sole S{I} or I{S} experiment, which may involve either S or I detection, and is naturally selected as the most favorable option under the given experimental conditions. For the common case where I has half-integer spin, an I{S} REDOR implementation is preferred to the S{I} REAPDOR counterpart. We verify the procedure by (23)Na{(31)P} REDOR and (31)P{(23)Na} REAPDOR NMR applied to Na(2)O-CaO-SiO(2)-P(2)O(5) glasses and biomimetic hydroxyapatite, where the M(2)(P-Na) values directly determined by REAPDOR agree very well with those derived from the corresponding M(2)(Na-P) results measured by REDOR. Moreover, we show that dipolar second moments are readily extracted from the REAPDOR NMR protocol by a straightforward numerical fitting of the initial dephasing data, in direct analogy with the well-established procedure to determine M(2)(S-I) values from REDOR NMR experiments applied to amorphous materials; this avoids the problems with time-consuming numerically exact simulations whose accuracy is limited for describing the dynamics of a priori unknown multi-spin systems in disordered structures. PMID:25557863
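
    For orientation, heteronuclear dipolar second moments of the kind discussed are conventionally written via the van Vleck expression (our addition for context; the paper's prefactors and powder-averaging conventions may differ):

        M_2^{S-I} = \frac{4}{15} \left( \frac{\mu_0}{4\pi} \right)^2 \gamma_S^2 \gamma_I^2 \hbar^2 \, I(I+1) \sum_j r_{Sj}^{-6}

    A sole experiment thus fixes the common sum over internuclear distances r_{Sj}, and the complementary moment follows from the same distance information with the roles of the two spin species exchanged.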

  6. Quantifying Proportional Variability

    PubMed Central

    Heath, Joel P.; Borowski, Peter

    2013-01-01

    Real quantities can undergo such a wide variety of dynamics that the mean is often a meaningless reference point for measuring variability. Despite their widespread application, techniques like the Coefficient of Variation are not truly proportional and exhibit pathological properties. The non-parametric measure Proportional Variability (PV) [1] resolves these issues and provides a robust way to summarize and compare variation in quantities exhibiting diverse dynamical behaviour. Instead of being based on deviation from an average value, variation is simply quantified by comparing the numbers to each other, requiring no assumptions about central tendency or underlying statistical distributions. While PV has been introduced before and has already been applied in various contexts to population dynamics, here we present a deeper analysis of this new measure, derive analytical expressions for the PV of several general distributions and present new comparisons with the Coefficient of Variation, demonstrating cases in which PV is the more favorable measure. We show that PV provides an easily interpretable approach for measuring and comparing variation that can be generally applied throughout the sciences, from contexts ranging from stock market stability to climate variation. PMID:24386334
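
    Under our reading of the pairwise definition (compare the numbers to each other; no mean, no distributional assumption), PV reduces to a few lines of code. This is a sketch of Heath's measure as we understand it, not the authors' reference implementation:

        from itertools import combinations

        def proportional_variability(values):
            """PV: mean of 1 - min/max over all pairs of (positive) values."""
            pairs = list(combinations(values, 2))
            return sum(1 - min(a, b) / max(a, b) for a, b in pairs) / len(pairs)

        print(proportional_variability([1, 2, 4, 8]))     # substantial proportional change
        print(proportional_variability([5, 5, 5, 5.1]))   # near zero: almost constant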

  7. Wireless quantified reflex device

    NASA Astrophysics Data System (ADS)

    Lemoyne, Robert Charles

    The deep tendon reflex is a fundamental aspect of a neurological examination. The two major parameters of the tendon reflex are response and latency, which are presently evaluated qualitatively during a neurological examination. The reflex loop is capable of providing insight into the status and therapy response of both upper and lower motor neuron syndromes. Attempts have been made to ascertain reflex response and latency; however, these systems are relatively complex and resource-intensive, and have had issues achieving consistent and reliable accuracy. The solution presented is a wireless quantified reflex device using tandem three-dimensional wireless accelerometers to obtain response, based on acceleration waveform amplitude, and latency, derived from temporal acceleration waveform disparity. Three specific aims have been established for the proposed wireless quantified reflex device: 1. Demonstrate the wireless quantified reflex device is reliably capable of ascertaining quantified reflex response and latency using a quantified input. 2. Evaluate the precision of the device using an artificial reflex system. 3. Conduct a longitudinal study of subjects with healthy patellar tendon reflexes, using the wireless quantified reflex evaluation device to obtain quantified reflex response and latency. Aim 1 has led to the steady evolution of the wireless quantified reflex device from a single two-dimensional wireless accelerometer capable of measuring reflex response to a tandem three-dimensional wireless accelerometer system capable of reliably measuring reflex response and latency. The hypothesis for aim 1 is that a reflex quantification device can be established for reliably measuring reflex response and latency for the patellar tendon reflex, comprised of an integrated system of wireless three-dimensional MEMS accelerometers. Aim 2 further emphasized the reliability of the wireless quantified reflex device by evaluating an artificial reflex system. The hypothesis for aim 2 is that the wireless quantified reflex device can obtain reliable reflex parameters (response and latency) from an artificial reflex device. Aim 3 synthesizes the findings relevant to aims 1 and 2, while applying the wireless accelerometer reflex quantification device to a longitudinal study of healthy patellar tendon reflexes. The hypothesis for aim 3 is that during a longitudinal evaluation of the deep tendon reflex the parameters for reflex response and latency can be measured with a considerable degree of accuracy, reliability, and reproducibility. Enclosed is a detailed description of a wireless quantified reflex device with research findings and potential utility of the system, inclusive of a comprehensive description of tendon reflexes, prior reflex quantification systems, and correlated applications.

  8. Visualizing and quantifying the suppressive effects of glucocorticoids on the tadpole immune system in vivo

    NSDL National Science Digital Library

    Alexander Schreiber (St. Lawrence University)

    2011-12-01

    This article presents a laboratory module developed to show students how glucocorticoid receptor activity can be pharmacologically modulated in Xenopus laevis tadpoles and the resulting effects on thymus gland size visualized and quantified in vivo.

  9. Rapamycin and Chloroquine: The In Vitro and In Vivo Effects of Autophagy-Modifying Drugs Show Promising Results in Valosin Containing Protein Multisystem Proteinopathy

    PubMed Central

    Nalbandian, Angèle; Llewellyn, Katrina J.; Nguyen, Christopher; Yazdi, Puya G.; Kimonis, Virginia E.

    2015-01-01

    Mutations in the valosin containing protein (VCP) gene cause hereditary inclusion body myopathy (hIBM) associated with Paget disease of bone (PDB) and frontotemporal dementia (FTD), more recently termed multisystem proteinopathy (MSP). Affected individuals exhibit scapular winging and die from progressive muscle weakness and cardiac and respiratory failure, typically in their 40s to 50s. Histologically, patients show the presence of rimmed vacuoles and TAR DNA-binding protein 43 (TDP-43)-positive large ubiquitinated inclusion bodies in the muscles. We have generated a VCPR155H/+ mouse model which recapitulates the disease phenotype and impaired autophagy typically observed in patients with VCP disease. Autophagy-modifying agents, such as rapamycin and chloroquine, at pharmacological doses have previously been shown to alter the autophagic flux. Herein, we report the results of administering rapamycin, a specific inhibitor of the mechanistic target of rapamycin (mTOR) signaling pathway, and chloroquine, a lysosomal inhibitor that blocks autophagy by accumulating in lysosomes, to 20-month-old VCPR155H/+ mice. Rapamycin-treated mice demonstrated significant improvement in muscle performance and quadriceps histological analysis, and rescue of ubiquitin and TDP-43 pathology and of defective autophagy, as indicated by decreased protein expression levels of LC3-I/II, p62/SQSTM1, and optineurin and by inhibition of the mTORC1 substrates. Conversely, chloroquine-treated VCPR155H/+ mice revealed progressive muscle weakness, cytoplasmic accumulation of TDP-43, ubiquitin-positive inclusion bodies, and increased LC3-I/II, p62/SQSTM1, and optineurin expression levels. Our in vitro studies of patient myoblasts treated with rapamycin demonstrated an overall improvement in the autophagy markers. Targeting the mTOR pathway ameliorates an increasing list of disorders, and these findings suggest that VCP disease and related neurodegenerative multisystem proteinopathies can now be included as disorders that can potentially be ameliorated by rapalogs. PMID:25884947

  10. Quantifying the RR of harm to self and others from substance misuse: results from a survey of clinical experts across Scotland

    PubMed Central

    Mackay, Kirsty; Murphy, Jen; McIntosh, Andrew; McIntosh, Claire; Anderson, Seonaid; Welch, Killian

    2012-01-01

    Objective: To produce an expert consensus hierarchy of harm to self and others from legal and illegal substance use. Design: Structured questionnaire with nine scored categories of harm for 19 different commonly used substances. Setting/participants: 292 clinical experts from across Scotland. Results: There was no stepped categorical distinction in harm between the different legal and illegal substances. Heroin was viewed as the most harmful, and cannabis the least harmful of the substances studied. Alcohol was ranked as the fourth most harmful substance, with alcohol, nicotine and volatile solvents being viewed as more harmful than some class A drugs. Conclusions: The harm rankings of 19 commonly used substances did not match the A, B, C classification under the Misuse of Drugs Act. The legality of a substance of misuse is not correlated with its perceived harm. These results could inform any legal review of drug misuse and help shape public health policy and practice. PMID:22833648

  11. Meditations on Quantified Constraint Satisfaction

    E-print Network

    Chen, Hubie

    2012-01-01

    The quantified constraint satisfaction problem (QCSP) is the problem of deciding, given a structure and a first-order prenex sentence whose quantifier-free part is the conjunction of atoms, whether or not the sentence holds on the structure. One obtains a family of problems by defining, for each structure B, the problem QCSP(B) to be the QCSP where the structure is fixed to be B. In this article, we offer a viewpoint on the research program of understanding the complexity of the problems QCSP(B) on finite structures. In particular, we propose and discuss a group of conjectures; throughout, we attempt to place the conjectures in relation to existing results and to emphasize open issues and potential research directions.
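
    To make the problem statement concrete, a naive exponential-time evaluator for such prenex sentences over a finite structure can be written directly from the definition (a Python sketch; the encoding of prefixes and atoms is ours, and nothing here reflects the complexity-theoretic techniques the article surveys):

        def qcsp_holds(domain, relations, prefix, atoms, assignment=None):
            """Check a prenex sentence whose quantifier-free part is a
            conjunction of atoms. prefix: [('forall'|'exists', var), ...];
            atoms: [(relation_name, (var, ...)), ...]."""
            assignment = assignment or {}
            if not prefix:
                return all(tuple(assignment[v] for v in args) in relations[r]
                           for r, args in atoms)
            (q, var), rest = prefix[0], prefix[1:]
            branch = (qcsp_holds(domain, relations, rest, atoms,
                                 {**assignment, var: d}) for d in domain)
            return all(branch) if q == 'forall' else any(branch)

        # "forall x exists y: E(x, y)" holds on a directed 2-cycle
        E = {('a', 'b'), ('b', 'a')}
        print(qcsp_holds({'a', 'b'}, {'E': E},
                         [('forall', 'x'), ('exists', 'y')], [('E', ('x', 'y'))]))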

  12. A high-density wireless underground sensor network (WUSN) to quantify hydro-ecological interactions for a UK floodplain; project background and initial results

    NASA Astrophysics Data System (ADS)

    Verhoef, A.; Choudhary, B.; Morris, P. J.; McCann, J.

    2012-04-01

    Floodplain meadows support some of the most diverse vegetation in the UK, and also perform key ecosystem services, such as flood storage and sediment retention. However, the UK now has less than 1500 ha of this unique habitat remaining. In order to conserve and better exploit the services provided by this grassland, an improved understanding of its functioning is essential. Vegetation functioning and species composition are known to be tightly correlated to the hydrological regime, and the related temperature and nutrient regimes, but the mechanisms controlling these relationships are not well established. The FUSE* project aims to investigate the spatiotemporal variability in vegetation functioning (e.g. photosynthesis and transpiration) and plant community composition in a floodplain meadow near Oxford, UK (Yarnton Mead), and their relationship to key soil physical variables (soil temperature and moisture content), soil nutrient levels and the water and energy balance. A distributed, high-density Wireless Underground Sensor Network (WUSN) is in the process of being established on Yarnton Mead. The majority, or ideally all, of the sensing and transmitting components will be installed below ground, because Yarnton Mead is an SSSI (Site of Special Scientific Interest, due to its unique plant community) and because sheep or cattle occasionally graze on it and could damage the nodes. This prerequisite has implications for the maximum spacing between underground nodes and their communications technologies, in terms of signal strength, path losses and requirements for battery life. The success of underground wireless communication is highly dependent on the soil type and water content. This floodplain environment is particularly challenging in this context because the soil contains a large amount of clay near the surface and is therefore less favourable to EM wave propagation than sandy soils. Furthermore, due to high relative saturation levels (as a result of high groundwater levels and occasional overland flooding) considerable path losses are expected. Finally, the long-term below-ground installation of the nodes means that batteries cannot be replaced easily; therefore, energy conservation schemes need to be deployed on the nodes. We present a brief overview of the project and initial findings of the approach we have adopted to address these wireless communication issues. This involves tests covering a range of transmission frequencies, antennae types, and node placements. *FUSE (Floodplain Underground SEnsors) is funded by the UK Natural Environment Research Council, NE/I007288/1; start date 1-3-2011.
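
    The path-loss concern can be made concrete with a first-order link budget: free-space spreading plus an absorption term that grows steeply with soil moisture and clay content. The sketch below is ours; the attenuation constant is a placeholder assumption, not a value measured at Yarnton Mead, and the project's actual channel model may differ.

        import math

        def underground_path_loss_db(d_m, freq_hz, alpha_np_per_m):
            """Free-space spreading plus soil absorption (alpha in Np/m,
            converted to dB); alpha rises sharply in wet, clay-rich soil."""
            lam = 3e8 / freq_hz
            spreading = 20 * math.log10(4 * math.pi * d_m / lam)
            absorption = 8.686 * alpha_np_per_m * d_m
            return spreading + absorption

        # 433 MHz link over 5 m in wet clay (alpha assumed, illustrative only)
        print(underground_path_loss_db(5.0, 433e6, alpha_np_per_m=2.0))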

  13. Storytelling Slide Shows to Improve Diabetes and High Blood Pressure Knowledge and Self-Efficacy: Three-Year Results among Community Dwelling Older African Americans

    ERIC Educational Resources Information Center

    Bertera, Elizabeth M.

    2014-01-01

    This study combined the African American tradition of oral storytelling with the Hispanic medium of "Fotonovelas." A staggered pretest posttest control group design was used to evaluate four Storytelling Slide Shows on health that featured community members. A total of 212 participants were recruited for the intervention and 217 for the…

  14. The Relevance of External Quality Assessment for Molecular Testing for ALK Positive Non-Small Cell Lung Cancer: Results from Two Pilot Rounds Show Room for Optimization

    PubMed Central

    Tembuyser, Lien; Tack, Véronique; Zwaenepoel, Karen; Pauwels, Patrick; Miller, Keith; Bubendorf, Lukas; Kerr, Keith; Schuuring, Ed; Thunnissen, Erik; Dequeker, Elisabeth M. C.

    2014-01-01

    Background and Purpose: Molecular profiling should be performed on all advanced non-small cell lung cancer with non-squamous histology to allow treatment selection. Currently, this should include EGFR mutation testing and testing for ALK rearrangements. ROS1 is another emerging target. ALK rearrangement status is a critical biomarker to predict response to tyrosine kinase inhibitors such as crizotinib. To promote high quality testing in non-small cell lung cancer, the European Society of Pathology has introduced an external quality assessment scheme. This article summarizes the results of the first two pilot rounds organized in 2012-2013. Materials and Methods: Tissue microarray slides consisting of cell-lines and resection specimens were distributed with the request for routine ALK testing using IHC or FISH. Participation in ALK FISH testing included the interpretation of four digital FISH images. Results: Data from 173 different laboratories were obtained. Results demonstrate decreased error rates in the second round for both ALK FISH and ALK IHC, although the error rates were still high and the need for external quality assessment in laboratories performing ALK testing is evident. Error rates obtained by FISH were lower than by IHC. The lowest error rates were observed for the interpretation of digital FISH images. Conclusion: There was a large variety in FISH enumeration practices. Based on the results from this study, recommendations for the methodology, analysis, interpretation and result reporting were issued. External quality assessment is a crucial element to improve the quality of molecular testing. PMID:25386659

  15. Map Showing Earthquake Shaking and Tsunami Hazard in Guadeloupe and Dominica, as a Result of an M8.0 Earthquake on the Lesser Antilles Megathrust

    USGS Multimedia Gallery

    Earthquake shaking (onland) and tsunami (ocean) hazard in Guadeloupe and Dominica, as a result of an M8.0 earthquake on the Lesser Antilles megathrust adjacent to Guadeloupe. Colors onland represent scenario earthquake shaking intensities calculated in USGS ShakeMap software (Wald et al. 20...

  16. timid, or easily manipulated. This is not compassion. A marine drill sergeant may be demanding and results-driven, but can show compassion

    E-print Network

    Kim, Duck O.

    this case, termination) while an employee is in a treatment facility or after discharge from treatment? ... a drug/alcohol-related crisis resulted or someone got killed? ... not rely upon the opinion of your employee ... enabling a friend or coworker with a severe personal problem requires making choices that may create

  17. [Medical teaching and humanistic-communicative aspects: two step-children of quality evaluation. An example of a compact teaching experiment with integrated final examination shows better results].

    PubMed

    Barolin, G S

    1997-01-01

    Evaluation of teaching with regard to its humanitarian-communicative aspect cannot take place without controls. However, isolated controls without other evaluative measures are irrational instruments; this holds especially true for final exams. It seems important to use methods of evaluation that improve quality at the same time and do not, in the negative sense of a final exam, knock out and disqualify (Tab. 2). We can give some examples from daily routine which, alas, are very far-reaching. The humanitarian-communicative factor, an integral part of medicine, is strongly disregarded in favour of education towards a mechanistic, apparatchik-like approach, namely in medical education, careers, and the supervision of hospitals. We need strong good will for reform! We do not only raise postulates here but can show an example of a teaching experiment in which students were put in small groups for intensive teaching with an integrated final exam. This was possible without additional costs, but with more human engagement on the part of the teachers. PMID:9487616

  18. How to quantify structural anomalies in fluids?

    NASA Astrophysics Data System (ADS)

    Fomin, Yu. D.; Ryzhov, V. N.; Klumov, B. A.; Tsiok, E. N.

    2014-07-01

    Some fluids are known to behave anomalously. The so-called structural anomaly, which means that the fluid becomes less structured under isothermal compression, is among the most frequently discussed ones. Several methods for quantifying the degree of structural order are described in the literature and are used for calculating the region of structural anomaly. It is generally thought that all of the structural order determinations yield qualitatively identical results. However, no explicit comparison had been made. This paper presents such a comparison for the first time. The results of some definitions are shown to contradict the intuitive notion of a fluid. On the basis of this comparison, we show that the region of structural anomaly can be most reliably determined from the behavior of the excess entropy.
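
    For reference, the excess entropy is usually evaluated in this literature at the two-body level from the pair correlation function g(r) (our addition for context; the paper may use the full excess entropy rather than this truncation):

        s_2 / k_B = -2\pi\rho \int_0^\infty \left[ g(r)\ln g(r) - g(r) + 1 \right] r^2 \, dr

    The structurally anomalous region is then the density range where s_2 increases under isothermal compression instead of decreasing.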

  19. Quantifying air pollution removal by green roofs in Chicago

    E-print Network

    Yu, Qian

    The level of air pollution removal by green roofs in Chicago was quantified using a dry deposition model. The result showed that a total of 1675 kg of air pollutants was removed by 19.8 ha of green roofs in one year

  20. New results show that the long term stability of Large Low Shear Wave Velocity Provinces (LLSVPs) on the CMB has lasted for at least 540 My

    NASA Astrophysics Data System (ADS)

    Burke, K. C.; Torsvik, T. H.

    2010-12-01

    We have shown previously that ca. 15 active volcanic hot spots with parent ages as old as 134 Ma and ca. 25 Large Igneous Provinces (LIPs) with ages as old as ca. 300 Ma (the latter rotated using a hybrid method to their locations at the times of their eruptions) lie vertically above narrow Plume Generation Zones (PGZs) on the Core/Mantle Boundary (CMB) at the margins of the Earth's two LLSVPs. We infer that those PGZs, and therefore the LLSVPs themselves, have remained in their present locations, antipodal on the equator, for at least 300 My. We have now found that 80% of the eruption sites of a rotated population of ca. 1400 kimberlitic volcanic rocks with ages of < 320 My were similarly erupted above the same PGZs. A bootstrap operation (i) assuming that PGZ stability extended back to ca. 500 Ma and (ii) rotating two LIPs of ages 360 Ma and 510 Ma so that their eruption sites overlay PGZs enabled us to describe the locations of ancient continents in longitude. The strength of the idea that LLSVP stability extended back to ca. 500 Ma was then tested by rotating ca. 200 kimberlitic volcanic rock localities of known ages between 344 and 542 Ma and known present locations within the continents. Because those older kimberlites also lay over PGZs, we consider that the long-term stability of the two LLSVPs has been confirmed for the entire Phanerozoic. Models of mantle structure with stable LLSVPs will surely help in showing how the deep Earth behaves.

  1. A second generation cervico-vaginal lavage device shows similar performance as its preceding version with respect to DNA yield and HPV DNA results

    PubMed Central

    2013-01-01

    Background: Attendance rates of cervical screening programs can be increased by offering HPV self-sampling to non-attendees. Acceptability, DNA yield, lavage volumes and choice of hrHPV test can influence the effectiveness of self-sampling procedures and could therefore play a role in recruiting non-attendees. To increase user-friendliness, a frequently used lavage sampler was modified. In this study, we compared this second generation lavage device with the first generation device within similar birth cohorts. Methods: Within a large self-sampling cohort-study among non-responders of the Dutch cervical screening program, a subset of 2,644 women received a second generation self-sampling lavage device, while 11,977 women, matched for age and ZIP-code, received the first generation model. The second generation device was different in shape, color, lavage volume, and packaging, in comparison to its first generation model. Cochran's test was used to compare both devices for hrHPV positivity rate and response rate. To correct for possible heterogeneity between age and ZIP codes in both groups the Breslow-Day test of homogeneity was used. A t-test was utilized to compare DNA yields of the obtained material in both groups. Results: Median DNA yields were 90.4 µg/ml (95% CI 83.2-97.5) and 91.1 µg/ml (95% CI 77.8-104.4, p = 0.726), and hrHPV positivity rates were 8.2% and 6.9% (p = 0.419), per sample self-collected by the second and the first generation of the device, respectively. In addition, response rates were comparable for the two models (35.4% versus 34.4%, p = 0.654). Conclusions: Replacing the first generation self-sampling device by an ergonomically improved, second generation device resulted in equal DNA yields, comparable hrHPV positivity rates and similar response rates. Therefore, it can be concluded that the clinical performance of the first and second generation models is similar. Moreover, participation of non-attendees in cervical cancer screening is probably not predominantly determined by the type of self-collection device. PMID:23639287

  2. Value of Fused 18F-Choline-PET/MRI to Evaluate Prostate Cancer Relapse in Patients Showing Biochemical Recurrence after EBRT: Preliminary Results

    PubMed Central

    Piccardo, Arnoldo; Paparo, Francesco; Picazzo, Riccardo; Naseri, Mehrdad; Ricci, Paolo; Marziano, Andrea; Bacigalupo, Lorenzo; Biscaldi, Ennio; Rollandi, Gian Andrea; Grillo-Ruggieri, Filippo; Farsad, Mohsen

    2014-01-01

    Purpose. We compared the accuracy of 18F-Choline-PET/MRI with that of multiparametric MRI (mMRI), 18F-Choline-PET/CT, 18F-Fluoride-PET/CT, and contrast-enhanced CT (CeCT) in detecting relapse in patients with suspected relapse of prostate cancer (PC) after external beam radiotherapy (EBRT). We assessed the association between standard uptake value (SUV) and apparent diffusion coefficient (ADC). Methods. We evaluated 21 patients with biochemical relapse after EBRT. Patients underwent 18F-Choline-PET/contrast-enhanced (Ce)CT, 18F-Fluoride-PET/CT, and mMRI. Imaging coregistration of PET and mMRI was performed. Results. 18F-Choline-PET/MRI was positive in 18/21 patients, with a detection rate (DR) of 86%. DRs of 18F-Choline-PET/CT, CeCT, and mMRI were 76%, 43%, and 81%, respectively. In terms of DR the only significant difference was between 18F-Choline-PET/MRI and CeCT. On lesion-based analysis, the accuracy of 18F-Choline-PET/MRI, 18F-Choline-PET/CT, CeCT, and mMRI was 99%, 95%, 70%, and 85%, respectively. Accuracy, sensitivity, and NPV of 18F-Choline-PET/MRI were significantly higher than those of both mMRI and CeCT. On whole-body assessment of bone metastases, the sensitivity of 18F-Choline-PET/CT and 18F-Fluoride-PET/CT was significantly higher than that of CeCT. Regarding local and lymph node relapse, we found a significant inverse correlation between ADC and SUV-max. Conclusion. 18F-Choline-PET/MRI is a promising technique in detecting PC relapse. PMID:24877053

  3. Floating numerals and floating quantifiers

    Microsoft Academic Search

    Mana Kobuchi-Philip

    2007-01-01

    This paper investigates the basic licensing conditions of the floating quantifier, e.g. all in The boys all took a card. It proposes a new analysis in which the quantificational operation of the floating quantifier is distinct from that of the quantifier local to a DP, e.g. all in All boys took a card, in the sense that the former takes

  4. "First Things First" Shows Promising Results

    ERIC Educational Resources Information Center

    Hendrie, Caroline

    2005-01-01

    In this article, the author discusses a school improvement model, First Things First, developed by James P. Connell, a former tenured professor of psychology at the University of Rochester in New York. The model has three pillars for the high school level: (1) small, themed learning communities that each keep a group of students together…

  5. Quantifying the Arctic methane budget

    NASA Astrophysics Data System (ADS)

    Warwick, Nicola; Cain, Michelle; Pyle, John

    2014-05-01

    The Arctic is a major source of atmospheric methane, containing climate-sensitive emissions from natural wetlands and gas hydrates, as well as the fossil fuel industry. Both wetland and gas hydrate methane emissions from the Arctic may increase with increasing temperature, resulting in a positive feedback leading to enhancement of climate warming. It is important that these poorly-constrained sources are quantified by location and strength and that their vulnerability to change be assessed. The MAMM project ('Methane and other greenhouse gases in the Arctic: Measurements, process studies and Modelling') addresses these issues as part of the UK NERC Arctic Programme. A global chemistry transport model has been used, along with MAMM and other long-term observations, to assess our understanding of the different source and sink terms in the Arctic methane budget. Simulations including methane coloured by source and latitude are used to distinguish between Arctic seasonal variability arising from transport and that arising from changes in Arctic sources and sinks. Methane isotopologue tracers provide a further constraint on modelled methane variability, distinguishing between isotopically light and heavy sources (e.g. wetlands and gas fields). We focus on quantifying the magnitude and seasonal variability of Arctic wetland emissions.
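
    The isotopologue constraint rests on simple mass balance: the δ13C of a methane mixture is the flux-weighted mean of its source signatures, so isotopically light wetland methane can be separated from heavier fossil sources. A toy sketch (Python; the signature values are illustrative assumptions, not MAMM results):

        def mixed_delta13C(fractions, deltas):
            """Flux-weighted mean d13C of a mixture; fractions sum to 1."""
            assert abs(sum(fractions) - 1.0) < 1e-9
            return sum(f * d for f, d in zip(fractions, deltas))

        # 70% wetland (~ -70 permil) + 30% fossil (~ -45 permil), assumed values
        print(mixed_delta13C([0.7, 0.3], [-70.0, -45.0]))  # -62.5 permil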

  6. Quantifying T Lymphocyte Turnover

    PubMed Central

    De Boer, Rob J.; Perelson, Alan S.

    2013-01-01

    Peripheral T cell populations are maintained by production of naive T cells in the thymus, clonal expansion of activated cells, cellular self-renewal (or homeostatic proliferation), and density dependent cell life spans. A variety of experimental techniques have been employed to quantify the relative contributions of these processes. In modern studies lymphocytes are typically labeled with 5-bromo-2'-deoxyuridine (BrdU), deuterium, or the fluorescent dye carboxy-fluorescein diacetate succinimidyl ester (CFSE), their division history has been studied by monitoring telomere shortening and the dilution of T cell receptor excision circles (TRECs) or the dye CFSE, and clonal expansion has been documented by recording changes in the population densities of antigen specific cells. Proper interpretation of such data in terms of the underlying rates of T cell production, division, and death has proven to be notoriously difficult and involves mathematical modeling. We review the various models that have been developed for each of these techniques, discuss which models seem most appropriate for what type of data, reveal open problems that require better models, and pinpoint how the assumptions underlying a mathematical model may influence the interpretation of data. Elaborating various successful cases where modeling has delivered new insights in T cell population dynamics, this review provides quantitative estimates of several processes involved in the maintenance of naive and memory, CD4+ and CD8+ T cell pools in mice and men. PMID:23313150

  7. Quantifying traffic exposure.

    PubMed

    Pratt, Gregory C; Parson, Kris; Shinoda, Naomi; Lindgren, Paula; Dunlap, Sara; Yawn, Barbara; Wollan, Peter; Johnson, Jean

    2014-01-01

    Living near traffic adversely affects health outcomes. Traffic exposure metrics include distance to high-traffic roads, traffic volume on nearby roads, traffic within buffer distances, measured pollutant concentrations, land-use regression estimates of pollution concentrations, and others. We used Geographic Information System software to explore a new approach using traffic count data and a kernel density calculation to generate a traffic density surface with a resolution of 50 m. The density value in each cell reflects all the traffic on all the roads within the distance specified in the kernel density algorithm. The effect of a given roadway on the raster cell value depends on the amount of traffic on the road segment, its distance from the raster cell, and the form of the algorithm. We used a Gaussian algorithm in which traffic influence became insignificant beyond 300 m. This metric integrates the deleterious effects of traffic rather than focusing on one pollutant. The density surface can be used to impute exposure at any point, and it can be used to quantify integrated exposure along a global positioning system route. The traffic density calculation compares favorably with other metrics for assessing traffic exposure and can be used in a variety of applications. PMID:24045427
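
    The kernel-density construction is straightforward to sketch: every road point spreads its traffic count over nearby 50 m cells with a Gaussian weight whose influence is negligible beyond roughly three kernel widths (so sigma near 100 m reproduces the stated 300 m cut-off). All names, the grid set-up and sigma below are our assumptions, not the authors' GIS implementation:

        import numpy as np

        def traffic_density_surface(points, counts, cell=50.0, size=2000.0, sigma=100.0):
            """Sum of Gaussian kernels, one per road point, weighted by
            traffic volume, evaluated on a cell-by-cell raster."""
            n = int(size / cell)
            centers = (np.arange(n) + 0.5) * cell
            gx, gy = np.meshgrid(centers, centers)
            surface = np.zeros((n, n))
            for (px, py), c in zip(points, counts):
                d2 = (gx - px) ** 2 + (gy - py) ** 2
                surface += c * np.exp(-d2 / (2 * sigma ** 2))
            return surface

        pts = np.array([[500.0, 500.0], [1500.0, 800.0]])   # toy road points (metres)
        dens = traffic_density_surface(pts, counts=[20000, 5000])
        print(dens.shape, float(dens.max()))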

  8. Quantifying Loopy Network Architectures

    PubMed Central

    Katifori, Eleni; Magnasco, Marcelo O.

    2012-01-01

    Biology presents many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture containing closed loops at many different levels. Although a number of approaches have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework, the hierarchical loop decomposition, that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer generated graphs, such as artificial models and optimal distribution networks, as well as natural graphs extracted from digitized images of dicotyledonous leaves and vasculature of rat cerebral neocortex. We calculate various metrics based on the asymmetry, the cumulative size distribution and the Strahler bifurcation ratios of the corresponding trees and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information (exact location of edges and nodes) from the metric topology (connectivity and edge weight) and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs. PMID:22701593

  9. What Do Blood Tests Show?

    MedlinePLUS

    Blood tests show whether the levels ... changes may work best. Result Ranges for Common Blood Tests: This section presents the result ranges for ...

  10. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  11. Solar Light Show

    NSDL National Science Digital Library

    de Nie, Michael Willem.

    Over the last few days, the Earth has been buffeted by a geomagnetic storm caused by a major solar flare. In addition to disruptions in radio, telecommunications, and electric service, the flare may also produce a dramatic light show as it peaks tonight. Weather permitting, the aurora borealis, or northern lights, may be visible as far south as Washington, D.C. The best viewing time will be local midnight. The sun is currently at the peak of its eleven-year solar cycle, spawning flares and "coronal mass ejections" (CME), violent outbursts of gas from the sun's corona that can carry up to 10 billion tons of electrified gas traveling at speeds as high as 2000 km/s. Geomagnetic storms result when solar winds compress the magnetosphere, sometimes interfering with electric power transmission and satellites, but also creating beautiful aurorae, as many stargazers hope will occur tonight.

  12. Show-Me Center

    NSDL National Science Digital Library

    The Show-Me Center, located at the University of Missouri, is a math education project of the National Science Foundation. The center's Web site "provides information and resources needed to support selection and implementation of standards-based middle grades mathematic curricula." There are some sample lesson plans offered, but most of the material is solely for use by teachers. Five different middle grade math curriculums were started in 1992, and now, the implementation and results of each curriculum are presented on this site. Teachers can examine each one, view video clips, and read case studies and other reports to choose which parts of the curriculums would fit best into their own classes.

  13. Quantified measurement of subacromial impingement

    Microsoft Academic Search

    Lieven De Wilde; Frank Plasschaert; Bart Berghs; Mike Van Hoecke; Koenraad Verstraete; René Verdonk

    2003-01-01

    We modified the Hawkins impingement maneuver in order to develop a quantifiable and reproducible impingement test. The involved anatomic structures were examined with magnetic resonance imaging of 3 cadaveric shoulders. The reproducibility of the clinical sign was assessed with an interobserver and intraobserver reliability test, with calculation of the intraclass correlation coefficient (ICC). The quantified Hawkins maneuver appears to be

  14. Public medical shows.

    PubMed

    Walusinski, Olivier

    2014-01-01

    In the second half of the 19th century, Jean-Martin Charcot (1825-1893) became famous for the quality of his teaching and his innovative neurological discoveries, bringing many French and foreign students to Paris. A hunger for recognition, together with progressive and anticlerical ideals, led Charcot to invite writers, journalists, and politicians to his lessons, during which he presented the results of his work on hysteria. These events became public performances, for which physicians and patients were transformed into actors. Major newspapers ran accounts of these consultations, more like theatrical shows in some respects. The resultant enthusiasm prompted other physicians in Paris and throughout France to try and imitate them. We will compare the form and substance of Charcot's lessons with those given by Jules-Bernard Luys (1828-1897), Victor Dumontpallier (1826-1899), Ambroise-Auguste Liébault (1823-1904), Hippolyte Bernheim (1840-1919), Joseph Grasset (1849-1918), and Albert Pitres (1848-1928). We will also note their impact on contemporary cinema and theatre. PMID:25273491

  15. Investigations of information quantifiers for the Tavis-Cummings model

    NASA Astrophysics Data System (ADS)

    Obada, A.-S. F.; Abdel-Khalek, S.; Berrada, K.; Shaheen, M. E.

    2013-12-01

    In this article, a system of two two-level atoms interacting with a single-mode quantized electromagnetic field in a lossless resonant cavity via a multi-photon transition is considered. The quantum Fisher information, negativity, classical Fisher information, and reduced von Neumann entropy for the two atoms are investigated. We found that the number of photon transitions plays an important role in the dynamics of different information quantifiers in the cases of two symmetric and two asymmetric atoms. Our results show that there is a close relationship between the different quantifiers. Also, the quantum and classical Fisher information can be useful for studying the properties of quantum states which are important in quantum optics and information.
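
    Of the quantifiers named, the reduced von Neumann entropy is the most direct to compute once a reduced density matrix is in hand. A minimal sketch (Python; base-2 logarithm assumed, and the matrix below is a toy, not a Tavis-Cummings state):

        import numpy as np

        def von_neumann_entropy(rho):
            """S(rho) = -Tr(rho log2 rho); tiny eigenvalues dropped to avoid log(0)."""
            evals = np.linalg.eigvalsh(rho)
            evals = evals[evals > 1e-12]
            return float(-(evals * np.log2(evals)).sum())

        rho = np.array([[0.5, 0.0], [0.0, 0.5]])  # maximally mixed qubit
        print(von_neumann_entropy(rho))           # 1.0 bit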

  16. The Great Cometary Show

    NASA Astrophysics Data System (ADS)

    2007-01-01

    The ESO Very Large Telescope Interferometer, which allows astronomers to scrutinise objects with a precision equivalent to that of a 130-m telescope, is proving itself an unequalled success every day. One of the latest instruments installed, AMBER, has led to a flurry of scientific results, an anthology of which is being published this week as special features in the research journal Astronomy & Astrophysics. ESO PR Photo 06a/07: The AMBER Instrument. "With its unique capabilities, the VLT Interferometer (VLTI) has created itself a niche in which it provides answers to many astronomical questions, from the shape of stars, to discs around stars, to the surroundings of the supermassive black holes in active galaxies," says Jorge Melnick (ESO), the VLT Project Scientist. The VLTI has led to 55 scientific papers already and is in fact producing more than half of the interferometric results worldwide. "With the capability of AMBER to combine up to three of the 8.2-m VLT Unit Telescopes, we can really achieve what nobody else can do," added Fabien Malbet, from the LAOG (France) and the AMBER Project Scientist. Eleven articles will appear this week in Astronomy & Astrophysics' special AMBER section. Three of them describe the unique instrument, while the other eight reveal completely new results about the early and late stages in the life of stars. ESO PR Photo 06b/07: The Inner Winds of Eta Carinae. The first results presented in this issue cover various fields of stellar and circumstellar physics. Two papers deal with very young solar-like stars, offering new information about the geometry of the surrounding discs and associated outflowing winds. Other articles are devoted to the study of hot active stars of particular interest: Alpha Arae, Kappa Canis Majoris, and CPD -57°2874. They provide new, precise information about their rotating gas envelopes. An important new result concerns the enigmatic object Eta Carinae. Using AMBER with its high spatial and spectral resolution, it was possible to zoom into the very heart of this very massive star. In this innermost region, the observations are dominated by the extremely dense stellar wind that totally obscures the underlying central star. The AMBER observations show that this dense stellar wind is not spherically symmetric, but exhibits a clearly elongated structure. Overall, the AMBER observations confirm that the extremely high mass loss of Eta Carinae's massive central star is non-spherical and much stronger along the poles than in the equatorial plane. This is in agreement with theoretical models that predict such an enhanced polar mass-loss in the case of rapidly rotating stars. ESO PR Photo 06c/07: RS Ophiuchi in Outburst. Several papers from this special feature focus on the later stages in a star's life. One looks at the binary system Gamma 2 Velorum, which contains the closest example of a star known as a Wolf-Rayet. A single AMBER observation allowed the astronomers to separate the spectra of the two components, offering new insights in the modeling of Wolf-Rayet stars, but made it also possible to measure the separation between the two stars. This led to a new determination of the distance of the system, showing that previous estimates were incorrect. The observations also revealed information on the region where the winds from the two stars collide.
The famous binary system RS Ophiuchi, an example of a recurrent nova, was observed just 5 days after it was discovered to be in outburst on 12 February 2006, an event that has been expected for 21 years. AMBER was able to detect the extension of the expanding nova emission. These observations show a complex geometry and kinematics, far from the simple interpretation of a spherical fireball in extension. AMBER has detected a high velocity jet probably perpendicular to the orbital plane of the binary system, and allowed a precise and careful study of the wind and the shockwave coming from the nova. The stream of results from the VLTI and AMBER

  17. Automata and quantifier hierarchies

    Microsoft Academic Search

    Wolfgang Thomas; RWTH Aachen; Lehrstuhl für Informatik

    1988-01-01

    The paper discusses results on ω-languages in a recursion-theoretic framework which is adapted to the treatment of formal languages. We consider variants of the arithmetical hierarchy which are not based on the recursive sets but on sets defined in terms of finite automata. In particular, it is shown how the theorems of Büchi and McNaughton on regular ω-languages can

  18. Quantifying the quiet epidemic

    PubMed Central

    2014-01-01

    During the late 20th century numerical rating scales became central to the diagnosis of dementia and helped transform attitudes about its causes and prevalence. Concentrating largely on the development and use of the Blessed Dementia Scale, I argue that rating scales served professional ends during the 1960s and 1970s. They helped old age psychiatrists establish jurisdiction over conditions such as dementia and present their field as a vital component of the welfare state, where they argued that ‘reliable modes of diagnosis’ were vital to the allocation of resources. I show how these arguments appealed to politicians, funding bodies and patient groups, who agreed that dementia was a distinct disease and claimed research on its causes and prevention should be designated ‘top priority’. But I also show that worries about the replacement of clinical acumen with technical and depersonalized methods, which could conceivably be applied by anyone, led psychiatrists to stress that rating scales had their limits and could be used only by trained experts.

  19. Quantifying decoherence in continuous variable systems

    Microsoft Academic Search

    A Serafini; M G A Paris; F. Illuminati; S. De Siena

    2005-01-01

    We present a detailed report on the decoherence of quantum states of continuous variable systems under the action of a quantum optical master equation resulting from the interaction with general Gaussian uncorrelated environments. The rate of decoherence is quantified by relating it to the decay rates of various, complementary measures of the quantum nature of a state, such as the

  20. Quantifying periodicity in omics data

    PubMed Central

    Amariei, Cornelia; Tomita, Masaru; Murray, Douglas B.

    2014-01-01

    Oscillations play a significant role in biological systems, with many examples in the fast, ultradian, circadian, circalunar, and yearly time domains. However, determining periodicity in such data can be problematic. There are a number of computational methods to identify the periodic components in large datasets, such as signal-to-noise based Fourier decomposition, Fisher's g-test and autocorrelation. However, the available methods assume a sinusoidal model and do not attempt to quantify the waveform shape and the presence of multiple periodicities, which provide vital clues in determining the underlying dynamics. Here, we developed a Fourier based measure that generates a de-noised waveform from multiple significant frequencies. This waveform is then correlated with the raw data from the respiratory oscillation found in yeast, to provide oscillation statistics including waveform metrics and multi-periods. The method is compared and contrasted to commonly used statistics. Moreover, we show the utility of the program in the analysis of noisy datasets and other high-throughput analyses, such as metabolomics and flow cytometry, respectively. PMID:25364747
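
    A minimal sketch of the general idea, reconstructing a de-noised waveform from dominant Fourier components and correlating it with the raw series, is given below (Python; keeping the top-k amplitudes is our simplification of the paper's significance-based frequency selection):

        import numpy as np

        def denoised_waveform(x, keep=3):
            """Invert an FFT that retains only the mean and the `keep`
            largest-amplitude frequencies."""
            X = np.fft.rfft(x)
            amps = np.abs(X)
            amps[0] = 0.0                     # rank oscillatory terms only
            top = np.argsort(amps)[-keep:]
            kept = np.zeros_like(X)
            kept[0] = X[0]                    # retain the mean
            kept[top] = X[top]
            return np.fft.irfft(kept, n=len(x))

        t = np.linspace(0.0, 10.0, 500, endpoint=False)
        raw = np.sin(2 * np.pi * 0.5 * t) + 0.5 * np.random.randn(500)
        wave = denoised_waveform(raw)
        print(np.corrcoef(raw, wave)[0, 1])   # correlation as an oscillation statistic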

  1. Quantifying the extinction vortex

    Microsoft Academic Search

    William F. Fagan

    2006-01-01

    We developed a database of 10 wild vertebrate populations whose declines to extinction were monitored over at least 12 years. We quantitatively characterized the final declines of these well-monitored populations and tested key theoretical predictions about the process of extinction, obtaining two primary results. First, we found evidence of logarithmic scaling of time-to-extinction as a function of population size for

  2. Quantifying cosmic variance

    NASA Astrophysics Data System (ADS)

    Driver, Simon P.; Robotham, Aaron S. G.

    2010-10-01

    We determine an expression for the cosmic variance of any 'normal' galaxy survey based on examination of M* +/- 1 mag galaxies in the Sloan Digital Sky Survey (SDSS) Data Release 7 (DR7) data cube. We find that cosmic variance will depend on a number of factors, principally: total survey volume, survey aspect ratio and whether the area surveyed is contiguous or comprises independent sightlines. As a rule of thumb, cosmic variance falls below 10 per cent once a volume of 10^7 h_0.7^-3 Mpc^3 is surveyed for a single contiguous region with a 1:1 aspect ratio. Cosmic variance will be lower for higher aspect ratios and/or non-contiguous surveys. Extrapolating outside our test region we infer that cosmic variance in the entire SDSS DR7 main survey region is ~7 per cent to z < 0.1. The equation obtained from the SDSS DR7 region can be generalized to estimate the cosmic variance for any density measurement determined from normal galaxies (e.g. luminosity densities, stellar mass densities and cosmic star formation rates) within the volume range 10^3-10^7 h_0.7^-3 Mpc^3. We apply our equation to show that two sightlines are required to ensure that cosmic variance is <10 per cent in any ASKAP galaxy survey (divided into Δz ~ 0.1 intervals, i.e. ~1 Gyr intervals for z < 0.5). Likewise 10 MeerKAT sightlines will be required to meet the same conditions. GAMA, VVDS and zCOSMOS all suffer less than 10 per cent cosmic variance (~3-8 per cent) in Δz intervals of 0.1, 0.25 and 0.5, respectively. Finally we show that cosmic variance is potentially at the 50-70 per cent level, or greater, in the Hubble Space Telescope (HST) Ultra Deep Field, depending on assumptions as to the evolution of clustering. 100 or 10 independent sightlines will be required to reduce cosmic variance to a manageable level (<10 per cent) for HST ACS or HST WFC3 surveys, respectively (in Δz ~ 1 intervals). Cosmic variance is therefore a significant factor in the z > 6 HST studies currently underway.
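
    The empirical expression itself is fit to the SDSS cells, but the quantity being fit is the standard counts-in-cells variance with the Poisson term removed. A sketch of that estimator (Python; the cell counts are toy numbers, not SDSS values):

        import numpy as np

        def relative_cosmic_variance(counts):
            """Relative variance among equal-volume cells minus the
            Poisson shot-noise contribution 1/<N>."""
            n = np.asarray(counts, dtype=float)
            mean = n.mean()
            return n.var() / mean ** 2 - 1.0 / mean

        cells = [950, 1100, 1030, 880, 1240]
        print(np.sqrt(relative_cosmic_variance(cells)))  # fractional cosmic variance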

  3. Homemade Laser Show

    NSDL National Science Digital Library

    Children's Museum of Houston

    2011-01-01

    With a laser pointer and some household items, learners can create their own laser light show. They can explore diffuse reflection, refraction and diffraction. The webpage includes a video which shows how to set up the activity and also includes scientific explanation. Because this activity involves lasers, it requires adult supervision.

  4. Stretched View Showing 'Victoria'

    NASA Technical Reports Server (NTRS)

    2006-01-01

    [figure removed for brevity, see original site]

    This pair of images from the panoramic camera on NASA's Mars Exploration Rover Opportunity served as initial confirmation that the two-year-old rover is within sight of 'Victoria Crater,' which it has been approaching for more than a year. Engineers on the rover team were unsure whether Opportunity would make it as far as Victoria, but scientists hoped for the chance to study such a large crater with their roving geologist. Victoria Crater is 800 meters (nearly half a mile) in diameter, about six times wider than 'Endurance Crater,' where Opportunity spent several months in 2004 examining rock layers affected by ancient water.

    When scientists using orbital data calculated that they should be able to detect Victoria's rim in rover images, they scrutinized frames taken in the direction of the crater by the panoramic camera. To positively characterize the subtle horizon profile of the crater and some of the features leading up to it, researchers created a vertically-stretched image (top) from a mosaic of regular frames from the panoramic camera (bottom), taken on Opportunity's 804th Martian day (April 29, 2006).

    The stretched image makes mild nearby dunes look like more threatening peaks, but that is only a result of the exaggerated vertical dimension. This vertical stretch technique was first applied to Viking Lander 2 panoramas by Philip Stooke, of the University of Western Ontario, Canada, to help locate the lander with respect to orbiter images. Vertically stretching the image allows features to be more readily identified by the Mars Exploration Rover science team.

    The bright white dot near the horizon to the right of center (barely visible without labeling or zoom-in) is thought to be a light-toned outcrop on the far wall of the crater, suggesting that the rover can see over the low rim of Victoria. In figure 1, the northeast and southeast rims are labeled in bright green. Finally, the light purple lines and arrow highlight a small crater.

  5. The Diane Rehm Show

    NSDL National Science Digital Library

    The Diane Rehm Show has its origins in a mid-day program at WAMU in Washington, D.C. Diane Rehm came on to host the program in 1979, and in 1984 it was renamed "The Diane Rehm Show". Over the past several decades, Rehm has played host to hundreds of guests, include Archbishop Desmond Tutu, Julie Andrews, and President Bill Clinton. This website contains an archive of her past programs, and visitors can use the interactive calendar to look through past shows. Those visitors looking for specific topics can use the "Topics" list on the left-hand side of the page, or also take advantage of the search engine. The show has a number of social networking links, including a Facebook page and a Twitter feed.

  6. Producing Turkeys for Show

    E-print Network

    Thornberry, Fredrick D.

    2005-12-14

    Use top-quality feeds. 4. Follow recommended management practices during the entire brooding and growing period. 5. Cull birds closely and select the show entry properly. Purchasing Poults: Most youth livestock shows have rules and regulations governing... Check turkeys monthly for parasites. Pay particular attention to skin around the vent area. Control external parasites (lice, mites, etc.) with applications of Sevin® dust. Fire ants can cause skin blisters and must...

  7. Quantifying biofilm structure: facts and fiction.

    PubMed

    Beyenal, Haluk; Lewandowski, Zbigniew; Harkin, Gary

    2004-02-01

    There is no doubt among biofilm researchers that biofilm structure is important to many biofilm processes, such as the transport of nutrients to deeper layers of the biofilm. However, biofilm structure is an elusive term understood only qualitatively, and as such it cannot be directly correlated with any measurable parameters characterizing biofilm performance. To correlate biofilm structure with the parameters characterizing biofilm performance, such as the rate of nutrient transport within the space occupied by the biofilms, biofilm structure must first be quantified and expressed numerically on an appropriate scale. The task of extracting numerical parameters quantifying biofilm structure relies on biofilm imaging and image analysis. Although defining parameters characterizing biofilm structure is relatively straightforward, and multiple parameters have been described in the computer science literature, interpreting the results of such analyses is not trivial. Existing computer software developed by several research groups, including ours, for the sole purpose of analyzing biofilm images helps quantify parameters from biofilm images but does nothing to help interpret the results of such analyses. Although computing structural parameters from biofilm images permits correlating biofilm structure with other biofilm processes, the meaning of the results is not obvious. The first step toward understanding the quantification of biofilm structure, developing image analysis methods to extract information from biofilm images, has been made by several research groups. The next step is to explain the meaning of these analyses. This presentation explains the meaning of several parameters commonly used to characterize biofilm structure. It also reviews the authors' research and experience in quantifying biofilm structure and their attempts to quantitatively relate biofilm structure to fundamental biofilm processes. PMID:15079889
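
    As one concrete example of such a parameter, areal porosity, the void fraction of a thresholded biofilm image, is among the simplest and most widely reported. A minimal sketch (Python; segmentation into biomass/void is assumed already done, and this is our illustration rather than the authors' software):

        import numpy as np

        def areal_porosity(binary_image):
            """Fraction of pixels not occupied by biomass (1 = biomass, 0 = void)."""
            return 1.0 - float(binary_image.mean())

        img = np.zeros((100, 100))
        img[20:60, 30:80] = 1.0        # synthetic biomass cluster (40 x 50 pixels)
        print(areal_porosity(img))     # 0.8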

  8. The main results show that 500 p.p.m. of acetyl salicylic acid improve growth, food intake and food efficiency as well as the health of the animals.

    E-print Network

    Paris-Sud XI, Université de

    …compared the nutritional value of SO4Cu used as a food additive at relatively high doses (125 and 250 p.p.m.)… The main results show that 500 p.p.m. of acetyl salicylic acid improve growth, food intake and food efficiency, owing to an increase of the food intake and an increase of the food efficiency. The effect of the antibiotic…

  9. Quantifying spatiotemporal chaos in Rayleigh-Bénard convection

    NASA Astrophysics Data System (ADS)

    Karimi, A.; Paul, M. R.

    2012-04-01

    Using large-scale parallel numerical simulations we explore spatiotemporal chaos in Rayleigh-Bénard convection in a cylindrical domain with experimentally relevant boundary conditions. We use the variation of the spectrum of Lyapunov exponents and the leading-order Lyapunov vector with system parameters to quantify states of high-dimensional chaos in fluid convection. We explore the relationship between the time dynamics of the spectrum of Lyapunov exponents and the pattern dynamics. For chaotic dynamics we find that all of the Lyapunov exponents are positively correlated with the leading-order Lyapunov exponent, and we quantify the details of their response to the dynamics of defects. The leading-order Lyapunov vector is used to identify topological features of the fluid patterns that contribute significantly to the chaotic dynamics. Our results show a transition from boundary-dominated dynamics to bulk-dominated dynamics as the system size is increased. The spectrum of Lyapunov exponents is used to compute the variation of the fractal dimension with system parameters to quantify how the underlying high-dimensional strange attractor accommodates a range of different chaotic dynamics.
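
    As a hedged illustration of the final step, the fractal dimension of a strange attractor is commonly estimated from a Lyapunov spectrum via the Kaplan-Yorke formula; the sketch below assumes a generic, sorted spectrum, and the sample values are illustrative, not data from this study.

    ```python
    import numpy as np

    def kaplan_yorke_dimension(exponents):
        """Estimate the fractal (Kaplan-Yorke) dimension from a
        Lyapunov spectrum, sorted here in descending order."""
        lam = np.sort(np.asarray(exponents, dtype=float))[::-1]
        csum = np.cumsum(lam)
        # j is the largest index with a non-negative cumulative sum
        nonneg = np.where(csum >= 0)[0]
        if len(nonneg) == 0:
            return 0.0              # fully contracting: dimension 0
        j = nonneg[-1]
        if j == len(lam) - 1:
            return float(len(lam))  # cumulative sum never turns negative
        return (j + 1) + csum[j] / abs(lam[j + 1])

    # Illustrative spectrum (not data from the paper)
    print(kaplan_yorke_dimension([0.9, 0.3, 0.0, -0.5, -1.2]))
    ```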

  10. Showing What They Know

    ERIC Educational Resources Information Center

    Cech, Scott J.

    2008-01-01

    Having students show their skills in three dimensions, known as performance-based assessment, dates back at least to Socrates. Individual schools such as Barrington High School--located just outside of Providence--have been requiring students to actively demonstrate their knowledge for years. Rhode Island's high school graduating class became…

  11. Demonstration Road Show

    NSDL National Science Digital Library

    Shropshire, Steven

    2009-04-06

    The Idaho State University Department of Physics conducts science demonstration shows at S. E. Idaho schools. Four different presentations are currently available; "Forces and Motion", "States of Matter", "Electricity and Magnetism", and "Sound and Waves". Information provided includes descriptions of the material and links to other resources.

  12. Hide / Show Animal Ethics

    E-print Network

    New South Wales, University of

    …contact the Ethics Secretariat for information on Animal Ethics Courses available at UNSW. Modification form for an approved application: new personnel added or roles updated since the last approval; new person nominated since the last approval.

  13. Stage a Water Show

    ERIC Educational Resources Information Center

    Frasier, Debra

    2008-01-01

    In the author's book titled "The Incredible Water Show," the characters from "Miss Alaineus: A Vocabulary Disaster" used an ocean of information to stage an inventive performance about the water cycle. In this article, the author relates how she turned the story into hands-on science teaching for real-life fifth-grade students. The author also…

  14. What Do Maps Show?

    ERIC Educational Resources Information Center

    Geological Survey (Dept. of Interior), Reston, VA.

    This curriculum packet, appropriate for grades 4-8, features a teaching poster which shows different types of maps (different views of Salt Lake City, Utah), as well as three reproducible maps and reproducible activity sheets which complement the maps. The poster provides teacher background, including step-by-step lesson plans for four geography…

  15. The Ozone Show.

    ERIC Educational Resources Information Center

    Mathieu, Aaron

    2000-01-01

    Uses a talk show activity for a final assessment tool for students to debate about the ozone hole. Students are assessed on five areas: (1) cooperative learning; (2) the written component; (3) content; (4) self-evaluation; and (5) peer evaluation. (SAH)

  16. Show Me the Way

    ERIC Educational Resources Information Center

    Dicks, Matthew J.

    2005-01-01

    Because today's students have grown up steeped in video games and the Internet, most of them expect feedback, and usually gratification, very soon after they expend effort on a task. Teachers can get quick feedback to students by showing them videotapes of their learning performances. The author, a 3rd grade teacher, describes how the seemingly…

  17. Show-Me Magazine

    NSDL National Science Digital Library

    2008-01-01

    Come along as the folks at the University of Missouri show you the history of their college days through the Show Me magazine. It's a wonderful collection of college humor published from 1946 to 1963. First-time visitors would do well to read about the magazine's colorful past, courtesy of Jerry Smith. A good place to start is the November 1920 issue (easily found when you browse by date), which contains a number of parody advertisements along with some doggerel poking good-natured fun at the football team and an assortment of deans. Also, it's worth noting that visitors can scroll through issues and save them to an online "bookbag" for later use.

  18. Keeping Show Pigs Healthy

    E-print Network

    Lawhorn, D. Bruce

    2006-10-13

    …within a well-managed farm; vaccinating to prevent serious diseases; deworming the pigs routinely; having sick pigs promptly diagnosed and treated; and using prescribed drugs properly. Starting with Healthy Pigs: To prevent disease outbreaks in show… of disease problems. Antibiotics are totally ineffective in preventing common viral diseases such as transmissible gastroenteritis and swine influenza. Also, vaccines are not available for all swine diseases and must be given long before the pigs…

  19. Viewing television talk shows

    Microsoft Academic Search

    Alan M. Rubin; Mary M. Step

    1997-01-01

    We examined how motivation, audience activity, and attitudes influenced the likelihood of watching societal-issue and relational topics on television talk programs. Path analysis supported differences in ritualized and instrumental motives for watching talk shows. Information and exciting-entertainment motivation predicted greater realism of, affinity with, involvement with, and intent to watch talk television. Pass-time motivation predicted reduced affinity with and intent…

  20. Mars Slide Show

    NASA Technical Reports Server (NTRS)

    2006-01-01

    15 September 2006 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows a landslide that occurred off of a steep slope in Tithonium Chasma, part of the vast Valles Marineris trough system.

    Location near: 4.8°S, 84.6°W. Image width: 3 km (1.9 mi). Illumination from: upper left. Season: Southern Autumn.

  1. Quantifying potential recharge in mantled sinkholes using ERT.

    PubMed

    Schwartz, Benjamin F; Schreiber, Madeline E

    2009-01-01

    Potential recharge through thick soils in mantled sinkholes was quantified using differential electrical resistivity tomography (ERT). Conversion of time series two-dimensional (2D) ERT profiles into 2D volumetric water content profiles using a numerically optimized form of Archie's law allowed us to monitor temporal changes in water content in soil profiles up to 9 m in depth. Combining Penman-Monteith daily potential evapotranspiration (PET) and daily precipitation data with potential recharge calculations for three sinkhole transects indicates that potential recharge occurred only during brief intervals over the study period and ranged from 19% to 31% of cumulative precipitation. Spatial analysis of ERT-derived water content showed that infiltration occurred both on sinkhole flanks and in sinkhole bottoms. Results also demonstrate that mantled sinkholes can act as regions of both rapid and slow recharge. Rapid recharge is likely the result of flow through macropores (such as root casts and thin gravel layers), while slow recharge is the result of unsaturated flow through fine-grained sediments. In addition to developing a new method for quantifying potential recharge at the field scale in unsaturated conditions, we show that mantled sinkholes are an important component of storage in a karst system. PMID:18823398
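
    A minimal sketch of the resistivity-to-water-content conversion step, assuming the standard form of Archie's law; the parameter values (a, m, n, porosity, pore-water resistivity) are illustrative placeholders, not the numerically optimized values used in the study.

    ```python
    import numpy as np

    def water_content_from_resistivity(rho_bulk, rho_w, phi, a=1.0, m=2.0, n=2.0):
        """Invert Archie's law for volumetric water content.

        rho_bulk : bulk resistivity from ERT (ohm-m)
        rho_w    : pore-water resistivity (ohm-m)
        phi      : porosity (-)
        a, m, n  : Archie parameters (illustrative defaults)
        """
        # Archie: rho_bulk = a * rho_w * phi**(-m) * Sw**(-n)
        sw = (a * rho_w / (rho_bulk * phi**m)) ** (1.0 / n)
        sw = np.clip(sw, 0.0, 1.0)   # saturation is physically bounded
        return phi * sw              # theta = phi * Sw

    # Illustrative values, not data from the study
    rho = np.array([120.0, 300.0, 800.0])   # ohm-m
    print(water_content_from_resistivity(rho, rho_w=20.0, phi=0.35))
    ```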

  2. "Cosmic Collisions" Planetarium Show

    E-print Network

    Mathis, Wayne N.

    …of the Moon · collisions between hydrogen atoms in the solar interior, resulting in nuclear fusion · charged… in the Exploring the Planets Gallery (and there is one you can touch at SI's National Museum of Natural History)… of hydrogen atoms to fuse and form helium to provide the Sun's power, is an explicit subset of this particular…

  3. A Dichotomy Theorem for Learning Quantified Boolean Formulas

    E-print Network

    Dalmau, Victor

    Víctor Dalmau, Departament LSI. …the following classes of quantified Boolean formulas: fix a finite set of basic Boolean functions… For any set of basic Boolean functions, the resulting set of formulas is either polynomially learnable from equivalence…

  4. Automatic quantification of light sleep shows differences between apnea patients and healthy subjects

    Microsoft Academic Search

    Eero Huupponen; Sari-Leena Himanen; Joel Hasan; Alpo Värri

    2004-01-01

    A fully automatic method to quantify sleep depth during the night was developed in the present work. The method was tested using 20 all-night recordings from 10 healthy control subjects and 10 sleep apnea patients. The results showed statistically significant differences in sleep depth between control subjects and sleep apnea patients. The overall sleep was lighter in apnea patients than

  5. American History Picture Show

    NSDL National Science Digital Library

    Ms. Bennion

    2009-11-23

    In class we read Katie's Picture Show, a book about a girl who discovers art first-hand one day at an art museum in London. She realizes she can climb into the paintings, explore her surroundings, and even solve problems for the subjects of the paintings. As part of our unit on American history, we are going to use art to further learn about some of the important events we have been discussing. Each of these works of art depicts an important event in American History. When you click on a picture, you will be able to see the name of the event as well as the artist who created it. You will be using all three pictures for this assignment. Use the websites ...

  6. Quantifying the value of IT-investments

    Microsoft Academic Search

    Chris Verhoef

    2005-01-01

    We described a method to quantify the value of investments in software systems. For that, we adopted the classical risk-adjusted discounted cash flow model and geared it towards the field of information technology. This resulted in a scenario-based approach incorporating two IT-specific risks that can substantially influence IT appraisals. They are requirements creep and time compression. To account for…
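
    For orientation, a minimal sketch of the classical risk-adjusted discounted cash flow valuation the method builds on; the IT-specific adjustments for requirements creep and time compression described above are not modeled here, and all figures are illustrative.

    ```python
    def npv(cash_flows, risk_free_rate, risk_premium):
        """Risk-adjusted discounted cash flow: discount each year's
        cash flow at the risk-free rate plus a project risk premium."""
        r = risk_free_rate + risk_premium
        return sum(cf / (1.0 + r) ** t for t, cf in enumerate(cash_flows))

    # Illustrative IT project: initial outlay, then yearly benefits
    flows = [-1000.0, 300.0, 400.0, 500.0, 400.0]
    print(round(npv(flows, risk_free_rate=0.04, risk_premium=0.06), 2))
    ```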

  7. Quantifying magnetite magnetofossil contributions to sedimentary magnetizations

    NASA Astrophysics Data System (ADS)

    Heslop, David; Roberts, Andrew P.; Chang, Liao; Davies, Maureen; Abrajevitch, Alexandra; De Deckker, Patrick

    2013-11-01

    Under suitable conditions, magnetofossils (the inorganic remains of magnetotactic bacteria) can contribute to the natural remanent magnetization (NRM) of sediments. In recent years, magnetofossils have been shown to be preserved commonly in marine sediments, which makes it essential to quantify their importance in palaeomagnetic recording. In this study, we examine a deep-sea sediment core from offshore of northwestern Western Australia. The magnetic mineral assemblage is dominated by continental detritus and magnetite magnetofossils. By separating magnetofossil and detrital components based on their different demagnetization characteristics, it is possible to quantify their respective contributions to the sedimentary NRM throughout the Brunhes chron. In the studied core, the contribution of magnetofossils to the NRM is controlled by large-scale climate changes, with their relative importance increasing during glacial periods when detrital inputs were low. Our results demonstrate that magnetite magnetofossils can dominate sedimentary NRMs in settings where they are preserved in significant abundances.

  8. A stochastic approach for quantifying immigrant integration: the Spanish test case

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Barra, Adriano; Contucci, Pierluigi; Sandell, Richard; Vernia, Cecilia

    2014-10-01

    We apply stochastic process theory to the analysis of immigrant integration. Using a unique and detailed data set from Spain, we study the relationship between local immigrant density and two social and two economic immigration quantifiers for the period 1999-2010. As opposed to the classic time-series approach, by letting immigrant density play the role of ‘time’ and the quantifier the role of ‘space,’ it becomes possible to analyse the behavior of the quantifiers by means of continuous time random walks. Two classes of results are then obtained. First, we show that social integration quantifiers evolve following diffusion law, while the evolution of economic quantifiers exhibits ballistic dynamics. Second, we make predictions of best- and worst-case scenarios taking into account large local fluctuations. Our stochastic process approach to integration lends itself to interesting forecasting scenarios which, in the hands of policy makers, have the potential to improve political responses to integration problems. For instance, estimating the standard first-passage time and maximum-span walk reveals local differences in integration performance for different immigration scenarios. Thus, by recognizing the importance of local fluctuations around national means, this research constitutes an important tool to assess the impact of immigration phenomena on municipal budgets and to set up solid multi-ethnic plans at the municipal level as immigration pressures build.

  9. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    NASA Astrophysics Data System (ADS)

    Fang, T.; Verma, V.; Guo, H.; King, L. E.; Edgerton, E. S.; Weber, R. J.

    2014-07-01

    A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint in the routine use of the DTT assay for integration with large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (coefficient of variation of 12% for standards, 4% for ambient samples) and a reasonably low limit of detection (0.31 nmol min-1). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution and Epidemiology (SCAPE) to generate an extensive data set on DTT activity of ambient particles collected from contrasting environments (urban, road-side, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per-air-volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. However, the greater heterogeneity in the intrinsic DTT activity (per-PM-mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the instrument can also be used to determine oxidative potential with other acellular assays.
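
    A hedged sketch of the core DTT-activity calculation: activity is conventionally reported as the rate of DTT depletion, estimated from a linear fit of remaining DTT versus reaction time. The measurement values below are illustrative, not data from SCAPE.

    ```python
    import numpy as np

    # Illustrative measurements: remaining DTT (nmol) at each time point (min)
    t   = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
    dtt = np.array([100.0, 92.5, 85.8, 78.1, 71.0])

    # Linear fit; the negative of the slope is the depletion rate
    slope, intercept = np.polyfit(t, dtt, 1)
    activity = -slope   # nmol of DTT consumed per minute
    print(f"DTT activity: {activity:.2f} nmol/min")
    ```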

  10. An algorithm for quantifying dependence in multivariate data sets

    NASA Astrophysics Data System (ADS)

    Feindt, M.; Prim, M.

    2013-01-01

    We describe an algorithm to quantify dependence in a multivariate data set. The algorithm is able to identify any linear and non-linear dependence in the data set by performing a hypothesis test for two variables being independent. As a result we obtain a reliable measure of dependence. In high energy physics understanding dependencies is especially important in multidimensional maximum likelihood analyses. We therefore describe the problem of a multidimensional maximum likelihood analysis applied on a multivariate data set with variables that are dependent on each other. We review common procedures used in high energy physics and show that general dependence is not the same as linear correlation and discuss their limitations in practical application. Finally we present the tool CAT, which is able to perform all reviewed methods in a fully automatic mode and creates an analysis report document with numeric results and visual review.
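
    As a hedged illustration of a dependence measure that, unlike Pearson correlation, also detects non-linear relationships, the sketch below implements distance correlation; this is a generic example, not the CAT tool described in the abstract.

    ```python
    import numpy as np

    def distance_correlation(x, y):
        """Distance correlation: zero (in the population) iff x and y
        are independent; sensitive to non-linear dependence."""
        x = np.asarray(x, float).reshape(-1, 1)
        y = np.asarray(y, float).reshape(-1, 1)
        a = np.abs(x - x.T)                 # pairwise distance matrices
        b = np.abs(y - y.T)
        # double-center: subtract row and column means, add grand mean
        A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()
        B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
        dcov2 = (A * B).mean()
        dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
        return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

    rng = np.random.default_rng(0)
    x = rng.normal(size=500)
    # Non-linear dependence: high distance correlation, near-zero Pearson
    print(distance_correlation(x, x**2))
    ```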

  11. Quantifying mixing using equilibrium reactions

    SciTech Connect

    Wheat, Philip M. [Department of Mechanical and Aerospace Engineering, Arizona State University, Tempe, Arizona 85287-6106 (United States); Posner, Jonathan D. [Department of Mechanical and Aerospace Engineering, Arizona State University, Tempe, Arizona 85287-6106 (United States); Department of Chemical Engineering, Arizona State University, Tempe, Arizona 85287-6106 (United States)

    2009-03-15

    A method of quantifying equilibrium reactions in a microchannel using a fluorometric reaction of Fluo-4 and Ca{sup 2+} ions is presented. Under the proper conditions, equilibrium reactions can be used to quantify fluid mixing without the challenges associated with constituent mixing measures such as limited imaging spatial resolution and viewing angle coupled with three-dimensional structure. Quantitative measurements of CaCl and calcium-indicating fluorescent dye Fluo-4 mixing are measured in Y-shaped microchannels. Reactant and product concentration distributions are modeled using Green's function solutions and a numerical solution to the advection-diffusion equation. Equilibrium reactions provide for an unambiguous, quantitative measure of mixing when the reactant concentrations are greater than 100 times their dissociation constant and the diffusivities are equal. At lower concentrations and for dissimilar diffusivities, the area averaged fluorescence signal reaches a maximum before the species have interdiffused, suggesting that reactant concentrations and diffusivities must be carefully selected to provide unambiguous, quantitative mixing measures. Fluorometric equilibrium reactions work over a wide range of pH and background concentrations such that they can be used for a wide variety of fluid mixing measures including industrial or microscale flows.

  12. Evaluation of two methods for quantifying passeriform lice

    PubMed Central

    Koop, Jennifer A. H.; Clayton, Dale H.

    2013-01-01

    Two methods commonly used to quantify ectoparasites on live birds are visual examination and dust-ruffling. Visual examination provides an estimate of ectoparasite abundance based on an observer’s timed inspection of various body regions on a bird. Dust-ruffling involves application of insecticidal powder to feathers that are then ruffled to dislodge ectoparasites onto a collection surface where they can then be counted. Despite the common use of these methods in the field, the proportion of actual ectoparasites they account for has only been tested with Rock Pigeons (Columba livia), a relatively large-bodied species (238–302 g) with dense plumage. We tested the accuracy of the two methods using European Starlings (Sturnus vulgaris; ~75 g). We first quantified the number of lice (Brueelia nebulosa) on starlings using visual examination, followed immediately by dust-ruffling. Birds were then euthanized and the proportion of lice accounted for by each method was compared to the total number of lice on each bird as determined with a body-washing method. Visual examination and dust-ruffling each accounted for a relatively small proportion of total lice (14% and 16%, respectively), but both were still significant predictors of abundance. The number of lice observed by visual examination accounted for 68% of the variation in total abundance. Similarly, the number of lice recovered by dust-ruffling accounted for 72% of the variation in total abundance. Our results show that both methods can be used to reliably quantify the abundance of lice on European Starlings and other similar-sized passerines. PMID:24039328

  13. Evaluation of two methods for quantifying passeriform lice.

    PubMed

    Koop, Jennifer A H; Clayton, Dale H

    2013-06-01

    Two methods commonly used to quantify ectoparasites on live birds are visual examination and dust-ruffling. Visual examination provides an estimate of ectoparasite abundance based on an observer's timed inspection of various body regions on a bird. Dust-ruffling involves application of insecticidal powder to feathers that are then ruffled to dislodge ectoparasites onto a collection surface where they can then be counted. Despite the common use of these methods in the field, the proportion of actual ectoparasites they account for has only been tested with Rock Pigeons (Columba livia), a relatively large-bodied species (238-302 g) with dense plumage. We tested the accuracy of the two methods using European Starlings (Sturnus vulgaris; ~75 g). We first quantified the number of lice (Brueelia nebulosa) on starlings using visual examination, followed immediately by dust-ruffling. Birds were then euthanized and the proportion of lice accounted for by each method was compared to the total number of lice on each bird as determined with a body-washing method. Visual examination and dust-ruffling each accounted for a relatively small proportion of total lice (14% and 16%, respectively), but both were still significant predictors of abundance. The number of lice observed by visual examination accounted for 68% of the variation in total abundance. Similarly, the number of lice recovered by dust-ruffling accounted for 72% of the variation in total abundance. Our results show that both methods can be used to reliably quantify the abundance of lice on European Starlings and other similar-sized passerines. PMID:24039328

  14. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    NASA Astrophysics Data System (ADS)

    Fang, T.; Verma, V.; Guo, H.; King, L. E.; Edgerton, E. S.; Weber, R. J.

    2015-01-01

    A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint in the routine use of the DTT assay for integrating it with large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (coefficient of variation of 15% for positive control, 4% for ambient samples) and reasonably low limit of detection (0.31 nmol min-1). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution & Epidemiology (SCAPE) to generate an extensive data set on DTT activity of ambient particles collected from contrasting environments (urban, roadside, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per-air-volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. The correlation may also suggest a mechanistic explanation (oxidative stress) for observed PM2.5 mass-health associations. The heterogeneity in the intrinsic DTT activity (per-PM-mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the instrument can also be used to determine oxidative potential with other acellular assays.

  15. Measuring political polarization: Twitter shows the two sides of Venezuela.

    PubMed

    Morales, A J; Borondo, J; Losada, J C; Benito, R M

    2015-03-01

    We say that a population is perfectly polarized when divided in two groups of the same size and opposite opinions. In this paper, we propose a methodology to study and measure the emergence of polarization from social interactions. We begin by proposing a model to estimate opinions in which a minority of influential individuals propagate their opinions through a social network. The result of the model is an opinion probability density function. Next, we propose an index to quantify the extent to which the resulting distribution is polarized. Finally, we apply the proposed methodology to a Twitter conversation about the late Venezuelan president, Hugo Chávez, finding a good agreement between our results and offline data. Hence, we show that our methodology can detect different degrees of polarization, depending on the structure of the network. PMID:25833436
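
    A minimal sketch in the spirit of the proposed index, assuming opinions scored on [-1, 1]: polarization is high when the two opinion groups are equal in size and far apart. The functional form below is an assumption for illustration, not necessarily the exact index of the paper.

    ```python
    import numpy as np

    def polarization_index(opinions):
        """Illustrative polarization index: 1.0 for two equal-sized
        groups at opposite extremes of the opinion axis [-1, 1]."""
        x = np.asarray(opinions, float)
        pos, neg = x[x > 0], x[x < 0]
        if len(pos) == 0 or len(neg) == 0:
            return 0.0
        # population imbalance between the two poles
        dA = abs(len(pos) - len(neg)) / len(x)
        # distance between the poles' centres of gravity, scaled to [0, 1]
        d = (pos.mean() - neg.mean()) / 2.0
        return (1.0 - dA) * d

    # Two equal camps at +/-0.9: strongly polarized
    print(polarization_index(np.concatenate([np.full(500, 0.9),
                                             np.full(500, -0.9)])))
    ```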

  16. Quantifying HIV-1 transmission due to contaminated injections

    PubMed Central

    White, Richard G.; Cooper, Ben S.; Kedhar, Anusha; Orroth, Kate K.; Biraro, Sam; Baggaley, Rebecca F.; Whitworth, Jimmy; Korenromp, Eline L.; Ghani, Azra; Boily, Marie-Claude; Hayes, Richard J.

    2007-01-01

    Assessments of the importance of different routes of HIV-1 (HIV) transmission are vital for prioritization of control efforts. Lack of consistent direct data and large uncertainty in the risk of HIV transmission from HIV-contaminated injections has made quantifying the proportion of transmission caused by contaminated injections in sub-Saharan Africa difficult and unavoidably subjective. Depending on the risk assumed, estimates have ranged from 2.5% to 30% or more. We present a method based on an age-structured transmission model that allows the relative contribution of HIV-contaminated injections, and other routes of HIV transmission, to be robustly estimated, both fully quantifying and substantially reducing the associated uncertainty. To do this, we adopt a Bayesian perspective, and show how prior beliefs regarding the safety of injections and the proportion of HIV incidence due to contaminated injections should, in many cases, be substantially modified in light of age-stratified incidence and injection data, resulting in improved (posterior) estimates. Applying the method to data from rural southwest Uganda, we show that the highest estimates of the proportion of incidence due to injections are reduced from 15.5% (95% credible interval: 0.7%, 44.9%) to 5.2% (0.5%, 17.0%) if random mixing is assumed, and from 14.6% (0.7%, 42.5%) to 11.8% (1.2%, 32.5%) under assortative mixing. Lower, and more widely accepted, estimates remain largely unchanged, between 1% and 3% (0.1-6.3%). Although important uncertainty remains, our analysis shows that in rural Uganda, contaminated injections are unlikely to account for a large proportion of HIV incidence. This result is likely to be generalizable to many other populations in sub-Saharan Africa. PMID:17522260

  17. Quantifying a cellular automata simulation of electric vehicles

    NASA Astrophysics Data System (ADS)

    Hill, Graeme; Bell, Margaret; Blythe, Phil

    2014-12-01

    Within this work the Nagel-Schreckenberg (NS) cellular automaton is used to simulate a basic cyclic road network. Results from SwitchEV, a real-world electric vehicle trial which has collected more than two years of detailed electric vehicle data, are used to quantify the results of the NS automaton, demonstrating similar power consumption behavior to that observed in the experimental results. In particular, the efficiency of the electric vehicles reduces as the vehicle density increases, due in part to the reduced efficiency of EVs at low speeds, but also due to the energy consumption inherent in changing speeds. Further work shows the results of introducing spatially restricted speed limits. In general it can be seen that induced congestion from spatially transient events propagates back through the road network and alters the energy and efficiency profile of the simulated vehicles, both before and after the speed restriction. Vehicles upstream from the restriction show a reduced energy usage and an increased efficiency, and vehicles downstream show an initial large increase in energy usage as they accelerate away from the speed restriction.
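
    A minimal sketch of the Nagel-Schreckenberg update rule on a cyclic road (accelerate, brake, random slowdown, move); the parameters are illustrative, and the EV energy model from the study is not included.

    ```python
    import numpy as np

    def nagel_schreckenberg(length=100, n_cars=30, v_max=5, p_slow=0.3,
                            steps=200, seed=0):
        """Minimal NS cellular automaton on a cyclic road; returns the
        mean speed after the final step (illustrative only)."""
        rng = np.random.default_rng(seed)
        pos = np.sort(rng.choice(length, n_cars, replace=False))
        vel = np.zeros(n_cars, dtype=int)
        for _ in range(steps):
            gaps = (np.roll(pos, -1) - pos - 1) % length   # empty cells ahead
            vel = np.minimum(vel + 1, v_max)               # 1. accelerate
            vel = np.minimum(vel, gaps)                    # 2. brake to avoid collision
            slow = rng.random(n_cars) < p_slow             # 3. random slowdown
            vel[slow] = np.maximum(vel[slow] - 1, 0)
            pos = (pos + vel) % length                     # 4. move
        return vel.mean()

    print(nagel_schreckenberg())
    ```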

  18. Quantifying properties of ICM inhomogeneities

    NASA Astrophysics Data System (ADS)

    Zhuravleva, I.; Churazov, E.; Kravtsov, A.; Lau, E. T.; Nagai, D.; Sunyaev, R.

    2013-02-01

    We present a new method to identify and characterize the structure of the intracluster medium (ICM) in simulated galaxy clusters. The method uses the median of gas properties, such as density and pressure, which we show to be very robust to the presence of gas inhomogeneities. In particular, we show that the radial profiles of median gas properties in cosmological simulations of clusters are smooth and do not exhibit fluctuations at locations of massive clumps in contrast to mean and mode properties. Analysis of simulations shows that distribution of gas properties in a given radial shell can be well described by a log-normal probability density function and a tail. The former corresponds to a nearly hydrostatic bulk component, accounting for ~99 per cent of the volume, while the tail corresponds to high-density inhomogeneities. The clumps can thus be easily identified with the volume elements corresponding to the tail of the distribution. We show that this results in a simple and robust separation of the diffuse and clumpy components of the ICM. The full width at half-maximum of the density distribution in simulated clusters is a growing function of radius and varies from ~0.15 dex in cluster centre to ~0.5 dex at 2 r500 in relaxed clusters. The small scatter in the width between relaxed clusters suggests that the degree of inhomogeneity is a robust characteristic of the ICM. It broadly agrees with the amplitude of density perturbations found in the Coma cluster core. We discuss the origin of ICM density variations in spherical shells and show that less than 20 per cent of the width can be attributed to the triaxiality of the cluster gravitational potential. As a link to X-ray observations of real clusters we evaluated the ICM clumping factor, weighted with the temperature-dependent X-ray emissivity, with and without high-density inhomogeneities. We argue that these two cases represent upper and lower limits on the departure of the observed X-ray emissivity from the median value. We find that the typical value of the clumping factor in the bulk component of relaxed clusters varies from ~1.1-1.2 at r500 up to ~1.3-1.4 at r200, in broad agreement with recent observations.
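
    As a hedged illustration of the last step, the clumping factor commonly used in X-ray studies is C = <rho^2>/<rho>^2 within a radial shell; the log-normal bulk plus high-density tail below is synthetic, not simulation data from the paper.

    ```python
    import numpy as np

    def clumping_factor(density, weights=None):
        """Clumping factor C = <rho^2> / <rho>^2 within a radial shell;
        C = 1 for a homogeneous shell, C > 1 in the presence of clumps."""
        rho = np.asarray(density, float)
        mean = np.average(rho, weights=weights)
        mean_sq = np.average(rho**2, weights=weights)
        return mean_sq / mean**2

    # Synthetic shell: log-normal bulk plus a high-density clump tail
    rng = np.random.default_rng(1)
    bulk = rng.lognormal(mean=0.0, sigma=0.3, size=9900)
    clumps = rng.lognormal(mean=1.5, sigma=0.5, size=100)
    shell = np.concatenate([bulk, clumps])
    print(clumping_factor(bulk), clumping_factor(shell))  # tail raises C
    ```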

  19. National Orange Show Photovoltaic Demonstration

    SciTech Connect

    Dan Jimenez (NOS)Sheri Raborn, CPA (National Orange Show); Tom Baker (California Construction Authority)

    2008-03-31

    National Orange Show Photovoltaic Demonstration created a 400KW Photovoltaic self-generation plant at the National Orange Show Events Center (NOS). The NOS owns a 120-acre state fairground where it operates an events center and produces an annual citrus fair known as the Orange Show. The NOS governing board wanted to employ cost-saving programs for annual energy expenses. It is hoped the Photovoltaic program will result in overall savings for the NOS, help reduce the State's energy demands as relating to electrical power consumption, improve quality of life within the affected grid area as well as increase the energy efficiency of buildings at our venue. In addition, the potential to reduce operational expenses would have a tremendous effect on the ability of the NOS to service its community.

  20. Quantifying entanglement with witness operators

    SciTech Connect

    Brandao, Fernando G.S.L. [Grupo de Informacao Quantica, Departamento de Fisica, Universidade Federal de Minas Gerais, Caixa Postal 702, Belo Horizonte, 30.123-970, MG (Brazil)

    2005-08-15

    We present a unifying approach to the quantification of entanglement based on entanglement witnesses, which includes several already established entanglement measures such as the negativity, the concurrence, and the robustness of entanglement. We then introduce an infinite family of new entanglement quantifiers, having as its limits the best separable approximation measure and the generalized robustness. Gaussian states, states with symmetry, states constrained to super-selection rules, and states composed of indistinguishable particles are studied under the view of the witnessed entanglement. We derive new bounds to the fidelity of teleportation d{sub min}, for the distillable entanglement E{sub D} and for the entanglement of formation. A particular measure, the PPT-generalized robustness, stands out due to its easy calculability and provides sharper bounds to d{sub min} and E{sub D} than the negativity in most of the states. We illustrate our approach studying thermodynamical properties of entanglement in the Heisenberg XXX and dimerized models.
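
    A minimal sketch of one of the established measures mentioned above, the negativity, computed from the partial transpose of a two-qubit density matrix; this is a generic example, not the witness-based quantifiers introduced in the paper.

    ```python
    import numpy as np

    def negativity(rho, dims=(2, 2)):
        """Negativity of a bipartite state: the absolute sum of the
        negative eigenvalues of the partial transpose, equivalently
        (||rho^{T_B}||_1 - 1) / 2 for a trace-one state."""
        dA, dB = dims
        r = rho.reshape(dA, dB, dA, dB)
        # transpose only subsystem B: swap the two B indices
        r_pt = r.transpose(0, 3, 2, 1).reshape(dA * dB, dA * dB)
        eigvals = np.linalg.eigvalsh(r_pt)
        return float(-eigvals[eigvals < 0].sum())

    # Bell state (|00> + |11>)/sqrt(2): maximally entangled, N = 1/2
    psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
    print(negativity(np.outer(psi, psi)))
    ```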

  1. [How to quantify limb edema?].

    PubMed

    Boulon, C; Becker, F; Vignes, S

    2010-06-01

    Edema of venous or lymphatic origin is frequently encountered in vasculopathies. Clinical diagnosis is readily made, but precise quantification of edema is difficult. Various procedures have been proposed to quantify edema volume or analyze its composition. Water volumetry remains the gold standard, but volumes calculated from circumferential measurements using cylinder or frustum formulas are reproducible, with high interrater and intrarater reliability. Automated measurement systems are expensive and reserved for research. Ideally, volume measurements for a given patient during follow-up should be made by the same practitioner using the same method, because the different methods are not interchangeable. Notably, edema composition can be evaluated by high-frequency ultrasound, CT scan, MRI or bioimpedance. Edema quantification is essential during patient follow-up, and the method to be used depends on the objectives to be met. PMID:20363084
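
    A hedged sketch of the frustum (truncated-cone) method mentioned above: limb volume is estimated from circumferences measured at fixed height intervals. The measurement values and the 4 cm spacing are illustrative.

    ```python
    import math

    def limb_volume(circumferences_cm, segment_height_cm=4.0):
        """Frustum model: limb volume in mL from circumferences taken
        at fixed height intervals along the limb.
        V_segment = h * (C1^2 + C1*C2 + C2^2) / (12 * pi)."""
        total = 0.0
        for c1, c2 in zip(circumferences_cm, circumferences_cm[1:]):
            total += segment_height_cm * (c1**2 + c1 * c2 + c2**2) / (12.0 * math.pi)
        return total   # cm^3, i.e. mL

    # Illustrative ankle-to-knee measurements every 4 cm
    print(round(limb_volume([22.0, 23.5, 26.0, 30.0, 33.0, 35.0]), 1))
    ```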

  2. Monsters begat by quantifiers? Fabio Del Prete

    E-print Network

    Paris-Sud XI, Université de

    Fabio Del Prete (CNRS, Lab CLLE-ERSS, Toulouse) and Sandro Zucchi. …An apparent consequence of this approach is that quantifiers … are monsters in Kaplan's sense. We argue… While some…

  3. Lindström Quantifiers and Leaf Language Definability

    E-print Network

    Vollmer, Heribert

    Hans-Jörg Burtschick, Fachbereich Informatik. …and examine analogies to the concept of leaf language definability. The quantifier structure in a second-order sentence defining a language and the quantifier structure in a first-order sentence characterizing…

  4. A Resolution Method for Quantified Boolean Formulas

    E-print Network

    Eckmiller, Rolf

    Marek Karpinski, Department of Computer Science. …A resolution method for quantified Boolean formulas is presented. If we restrict the resolution to unit resolution, then completeness and soundness for extended quantified Horn formulas is shown. We prove that the truth of a quantified Horn…

  5. Quantifying cell behaviors during embryonic wound healing

    NASA Astrophysics Data System (ADS)

    Mashburn, David; Ma, Xiaoyan; Crews, Sarah; Lynch, Holley; McCleery, W. Tyler; Hutson, M. Shane

    2011-03-01

    During embryogenesis, internal forces induce motions in cells leading to widespread motion in tissues. We previously developed laser hole-drilling as a consistent, repeatable way to probe such epithelial mechanics. The initial recoil (less than 30s) gives information about physical properties (elasticity, force) of cells surrounding the wound, but the long-term healing process (tens of minutes) shows how cells adjust their behavior in response to stimuli. To study this biofeedback in many cells through time, we developed tools to quantify statistics of individual cells. By combining watershed segmentation with a powerful and efficient user interaction system, we overcome problems that arise in any automatic segmentation from poor image quality. We analyzed cell area, perimeter, aspect ratio, and orientation relative to wound for a wide variety of laser cuts in dorsal closure. We quantified statistics for different regions as well, i.e. cells near to and distant from the wound. Regional differences give a distribution of wound-induced changes, whose spatial localization provides clues into the physical/chemical signals that modulate the wound healing response.
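
    A generic marker-based watershed sketch using scikit-image, assuming a pre-computed binary mask of the epithelium; this illustrates the segmentation step only, not the authors' custom user-interaction tools.

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.segmentation import watershed
    from skimage.feature import peak_local_max

    def segment_cells(binary_mask):
        """Marker-based watershed on a boolean mask of cells: seeds are
        local maxima of the distance transform, one label per cell."""
        distance = ndi.distance_transform_edt(binary_mask)
        coords = peak_local_max(distance, min_distance=5, labels=binary_mask)
        markers = np.zeros(distance.shape, dtype=int)
        markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
        return watershed(-distance, markers, mask=binary_mask)

    # Per-cell statistics (area, perimeter, orientation) can then be
    # read off the label image with skimage.measure.regionprops.
    ```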

  6. Detecting and Quantifying Topography in Neural Maps

    PubMed Central

    Yarrow, Stuart; Razak, Khaleel A.; Seitz, Aaron R.; Seriès, Peggy

    2014-01-01

    Topographic maps are an often-encountered feature in the brains of many species, yet there are no standard, objective procedures for quantifying topography. Topographic maps are typically identified and described subjectively, but in cases where the scale of the map is close to the resolution limit of the measurement technique, identifying the presence of a topographic map can be a challenging subjective task. In such cases, an objective topography detection test would be advantageous. To address these issues, we assessed seven measures (Pearson distance correlation, Spearman distance correlation, Zrehen's measure, topographic product, topological correlation, path length and wiring length) by quantifying topography in three classes of cortical map model: linear, orientation-like, and clusters. We found that all but one of these measures were effective at detecting statistically significant topography even in weakly-ordered maps, based on simulated noisy measurements of neuronal selectivity and sparse sampling of the maps. We demonstrate the practical applicability of these measures by using them to examine the arrangement of spatial cue selectivity in pallid bat A1. This analysis shows that significantly topographic arrangements of interaural intensity difference and azimuth selectivity exist at the scale of individual binaural clusters. PMID:24505279

  7. Quantifying of bactericide properties of medicinal plants

    PubMed Central

    Ács, András; Gölöncsér, Flóra; Barabás, Anikó

    2011-01-01

    Extended research has been carried out to clarify the ecological role of plant secondary metabolites (SMs). Although their primary ecological function is self-defense, bioactive compounds have long been used in alternative medicine or in biological control of pests. Several members of the family Labiatae are known to have strong antimicrobial capacity. For testing and quantifying antibacterial activity, most often standard microbial protocols are used, assessing inhibitory activity on a selected strain. In this study, the applicability of a microbial ecotoxtest was evaluated to quantify the aggregate bactericide capacity of Labiatae species, based on the bioluminescence inhibition of the bacterium Vibrio fischeri. Striking differences were found amongst herbs, reaching even 10-fold toxicity. Glechoma hederacea L. proved to be the most toxic, with the EC50 of 0.4073 g dried plant/l. LC50 values generated by the standard bioassay seem to be a good indicator of the bactericide property of herbs. Traditional use of the selected herbs shows a good correlation with bioactivity expressed as bioluminescence inhibition, leading to the conclusion that the Vibrio fischeri bioassay can be a good indicator of the overall antibacterial capacity of herbs, at least on a screening level. PMID:21502819
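
    A hedged sketch of how an EC50 such as the one reported for Glechoma hederacea can be estimated, assuming a two-parameter log-logistic (Hill) fit to bioluminescence-inhibition data; the dose-response values below are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(dose, ec50, slope):
        """Two-parameter log-logistic (Hill) inhibition curve, 0-100%."""
        return 100.0 / (1.0 + (dose / ec50) ** (-slope))

    # Illustrative data: % bioluminescence inhibition vs. g dried plant / L
    dose = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6])
    inhib = np.array([6.0, 14.0, 30.0, 52.0, 74.0, 89.0])

    (ec50, slope), _ = curve_fit(hill, dose, inhib, p0=[0.4, 1.5])
    print(f"EC50 = {ec50:.3f} g dried plant/L")
    ```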

  8. Quantifying utricular stimulation during natural behavior

    PubMed Central

    Rivera, Angela R. V.; Davis, Julian; Grant, Wally; Blob, Richard W.; Peterson, Ellengene; Neiman, Alexander B.; Rowe, Michael

    2012-01-01

    The use of natural stimuli in neurophysiological studies has led to significant insights into the encoding strategies used by sensory neurons. To investigate these encoding strategies in vestibular receptors and neurons, we have developed a method for calculating the stimuli delivered to a vestibular organ, the utricle, during natural (unrestrained) behaviors, using the turtle as our experimental preparation. High-speed digital video sequences are used to calculate the dynamic gravito-inertial (GI) vector acting on the head during behavior. X-ray computed tomography (CT) scans are used to determine the orientation of the otoconial layer (OL) of the utricle within the head, and the calculated GI vectors are then rotated into the plane of the OL. Thus, the method allows us to quantify the spatio-temporal structure of stimuli to the OL during natural behaviors. In the future, these waveforms can be used as stimuli in neurophysiological experiments to understand how natural signals are encoded by vestibular receptors and neurons. We provide one example of the method which shows that turtle feeding behaviors can stimulate the utricle at frequencies higher than those typically used in vestibular studies. This method can be adapted to other species, to other vestibular end organs, and to other methods of quantifying head movements. PMID:22753360

  9. Quantifying capital goods for waste incineration

    SciTech Connect

    Brogaard, L.K., E-mail: lksb@env.dtu.dk [Department of Environmental Engineering, Building 115, Technical University of Denmark, DK-2800 Kongens Lyngby (Denmark); Riber, C. [Ramboll, Consulting Engineers, Hannemanns Allé 53, DK-2300 Copenhagen S (Denmark); Christensen, T.H. [Department of Environmental Engineering, Building 115, Technical University of Denmark, DK-2800 Kongens Lyngby (Denmark)

    2013-06-15

    Highlights: • Materials and energy used for the construction of waste incinerators were quantified. • The data was collected from five incineration plants in Scandinavia. • Included were six main materials, electronic systems, cables and all transportation. • The capital goods contributed 2–3% compared to the direct emissions impact on GW. - Abstract: Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000–240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used amounting to 19,000–26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000–5000 MW h. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7–14 kg CO{sub 2} per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2–3% with respect to kg CO{sub 2} per tonne of waste combusted.

  10. Quantifying Drosophila food intake: comparative analysis of current methodology.

    PubMed

    Deshpande, Sonali A; Carvalho, Gil B; Amador, Ariadna; Phillips, Angela M; Hoxha, Sany; Lizotte, Keith J; Ja, William W

    2014-05-01

    Food intake is a fundamental parameter in animal studies. Despite the prevalent use of Drosophila in laboratory research, precise measurements of food intake remain challenging in this model organism. Here, we compare several common Drosophila feeding assays: the capillary feeder (CAFE), food labeling with a radioactive tracer or colorimetric dye and observations of proboscis extension (PE). We show that the CAFE and radioisotope labeling provide the most consistent results, have the highest sensitivity and can resolve differences in feeding that dye labeling and PE fail to distinguish. We conclude that performing the radiolabeling and CAFE assays in parallel is currently the best approach for quantifying Drosophila food intake. Understanding the strengths and limitations of methods for measuring food intake will greatly advance Drosophila studies of nutrition, behavior and disease. PMID:24681694

  11. Quantifying Global Uncertainties in a Simple Microwave Rainfall Algorithm

    NASA Technical Reports Server (NTRS)

    Kummerow, Christian; Berg, Wesley; Thomas-Stahle, Jody; Masunaga, Hirohiko

    2006-01-01

    While a large number of methods exist in the literature for retrieving rainfall from passive microwave brightness temperatures, little has been written about the quantitative assessment of the expected uncertainties in these rainfall products at various time and space scales. The latter is the result of two factors: sparse validation sites over most of the world's oceans, and algorithm sensitivities to rainfall regimes that cause inconsistencies against validation data collected at different locations. To make progress in this area, a simple probabilistic algorithm is developed. The algorithm uses an a priori database constructed from the Tropical Rainfall Measuring Mission (TRMM) radar data coupled with radiative transfer computations. Unlike efforts designed to improve rainfall products, this algorithm takes a step backward in order to focus on uncertainties. In addition to inversion uncertainties, the construction of the algorithm allows errors resulting from incorrect databases, incomplete databases, and time- and space-varying databases to be examined. These are quantified. Results show that the simple algorithm reduces errors introduced by imperfect knowledge of precipitation radar (PR) rain by a factor of 4 relative to an algorithm that is tuned to the PR rainfall. Database completeness does not introduce any additional uncertainty at the global scale, while climatologically distinct space/time domains add approximately 25% uncertainty that cannot be detected by a radiometer alone. Of this value, 20% is attributed to changes in cloud morphology and microphysics, while 5% is a result of changes in the rain/no-rain thresholds. All but 2%-3% of this variability can be accounted for by considering the implicit assumptions in the algorithm. Additional uncertainties introduced by the details of the algorithm formulation are not quantified in this study because of the need for independent measurements that are beyond the scope of this paper. A validation strategy for these errors is outlined.

  12. Results.

    ERIC Educational Resources Information Center

    Zemsky, Robert; Shaman, Susan; Shapiro, Daniel B.

    2001-01-01

    Describes the Collegiate Results Instrument (CRI), which measures a range of collegiate outcomes for alumni 6 years after graduation. The CRI was designed to target alumni from institutions across market segments and assess their values, abilities, work skills, occupations, and pursuit of lifelong learning. (EV)

  13. Computed tomography to quantify tooth abrasion

    NASA Astrophysics Data System (ADS)

    Kofmehl, Lukas; Schulz, Georg; Deyhle, Hans; Filippi, Andreas; Hotz, Gerhard; Berndt-Dagassan, Dorothea; Kramis, Simon; Beckmann, Felix; Müller, Bert

    2010-09-01

    Cone-beam computed tomography, also termed digital volume tomography, has become a standard technique in dentistry, allowing for fast 3D jaw imaging including denture at moderate spatial resolution. More detailed X-ray images of restricted volumes for post-mortem studies in dental anthropology are obtained by means of micro computed tomography. The present study evaluates the impact of the pipe smoking wear on teeth morphology comparing the abraded tooth with its contra-lateral counterpart. A set of 60 teeth, loose or anchored in the jaw, from 12 dentitions have been analyzed. After the two contra-lateral teeth were scanned, one dataset has been mirrored before the two datasets were registered using affine and rigid registration algorithms. Rigid registration provides three translational and three rotational parameters to maximize the overlap of two rigid bodies. For the affine registration, three scaling factors are incorporated. Within the present investigation, affine and rigid registrations yield comparable values. The restriction to the six parameters of the rigid registration is not a limitation. The differences in size and shape between the tooth and its contra-lateral counterpart generally exhibit only a few percent in the non-abraded volume, validating that the contralateral tooth is a reasonable approximation to quantify, for example, the volume loss as the result of long-term clay pipe smoking. Therefore, this approach allows quantifying the impact of the pipe abrasion on the internal tooth morphology including root canal, dentin, and enamel volumes.
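
    As a hedged illustration of the rigid-registration step (three translations, three rotations), the Kabsch algorithm gives the least-squares rotation and translation between two point sets; affine registration would add the three scaling factors. This is a generic sketch, not the study's registration pipeline.

    ```python
    import numpy as np

    def rigid_register(source, target):
        """Kabsch algorithm: least-squares rotation R and translation t
        such that target ~= source @ R.T + t (six rigid parameters)."""
        src_c, tgt_c = source.mean(axis=0), target.mean(axis=0)
        H = (source - src_c).T @ (target - tgt_c)   # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = tgt_c - R @ src_c
        return R, t

    # Illustrative check: recover a known 10-degree rotation about z
    theta = np.radians(10.0)
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
    pts = np.random.default_rng(0).normal(size=(100, 3))
    R, t = rigid_register(pts, pts @ Rz.T + np.array([1.0, 2.0, 3.0]))
    print(np.allclose(R, Rz), np.round(t, 6))
    ```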

  14. A Generalizable Methodology for Quantifying User Satisfaction

    NASA Astrophysics Data System (ADS)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
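
    A minimal sketch of the survival-analysis starting point, assuming session times with right-censoring (sessions still active when measurement ends): the Kaplan-Meier product-limit estimator of the survivor function S(t). The data values are illustrative.

    ```python
    import numpy as np

    def kaplan_meier(times, censored):
        """Kaplan-Meier estimate of S(t) from session times; `censored`
        marks sessions still in progress when measurement stopped."""
        order = np.argsort(times)
        times = np.asarray(times)[order]
        censored = np.asarray(censored)[order]
        s, surv = 1.0, []
        uniq = np.unique(times[~censored])          # observed end times
        for t in uniq:
            at_risk = np.sum(times >= t)
            ended = np.sum((times == t) & ~censored)
            s *= 1.0 - ended / at_risk              # product-limit update
            surv.append(s)
        return uniq, np.array(surv)

    # Illustrative session lengths in minutes; True = censored
    t = [5, 8, 8, 12, 20, 20, 33, 47]
    c = [False, False, True, False, False, True, False, True]
    print(kaplan_meier(t, c))
    ```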

  15. Towards quantifying complexity with quantum mechanics

    NASA Astrophysics Data System (ADS)

    Tan, Ryan; R. Terno, Daniel; Thompson, Jayne; Vedral, Vlatko; Gu, Mile

    2014-09-01

    While we have intuitive notions of structure and complexity, the formalization of this intuition is non-trivial. The statistical complexity is a popular candidate. It is based on the idea that the complexity of a process can be quantified by the complexity of its simplest mathematical model—the model that requires the least past information for optimal future prediction. Here we review how such models, known as ε-machines, can be further simplified through quantum logic, and explore the resulting consequences for understanding complexity. In particular, we propose a new measure of complexity based on quantum ε-machines. We apply this to a simple system undergoing constant thermalization. The resulting quantum measure of complexity aligns more closely with our intuition of how complexity should behave.

  16. Stimfit: quantifying electrophysiological data with Python.

    PubMed

    Guzman, Segundo J; Schlögl, Alois; Schmidt-Hieber, Christoph

    2014-01-01

    Intracellular electrophysiological recordings provide crucial insights into elementary neuronal signals such as action potentials and synaptic currents. Analyzing and interpreting these signals is essential for a quantitative understanding of neuronal information processing, and requires both fast data visualization and ready access to complex analysis routines. To achieve this goal, we have developed Stimfit, a free software package for cellular neurophysiology with a Python scripting interface and a built-in Python shell. The program supports most standard file formats for cellular neurophysiology and other biomedical signals through the Biosig library. To quantify and interpret the activity of single neurons and communication between neurons, the program includes algorithms to characterize the kinetics of presynaptic action potentials and postsynaptic currents, estimate latencies between pre- and postsynaptic events, and detect spontaneously occurring events. We validate and benchmark these algorithms, give estimation errors, and provide sample use cases, showing that Stimfit represents an efficient, accessible and extensible way to accurately analyze and interpret neuronal signals. PMID:24600389

  17. Message passing for quantified Boolean formulas

    NASA Astrophysics Data System (ADS)

    Zhang, Pan; Ramezanpour, Abolfazl; Zdeborová, Lenka; Zecchina, Riccardo

    2012-05-01

    We introduce two types of message passing algorithms for quantified Boolean formulas (QBF). The first type is a message passing based heuristics that can prove unsatisfiability of the QBF by assigning the universal variables in such a way that the remaining formula is unsatisfiable. In the second type, we use message passing to guide branching heuristics of a Davis-Putnam-Logemann-Loveland (DPLL) complete solver. Numerical experiments show that on random QBFs our branching heuristics give robust exponential efficiency gain with respect to state-of-the-art solvers. We also manage to solve some previously unsolved benchmarks from the QBFLIB library. Apart from this, our study sheds light on using message passing in small systems and as subroutines in complete solvers.

  18. Stimfit: quantifying electrophysiological data with Python

    PubMed Central

    Guzman, Segundo J.; Schlögl, Alois; Schmidt-Hieber, Christoph

    2013-01-01

    Intracellular electrophysiological recordings provide crucial insights into elementary neuronal signals such as action potentials and synaptic currents. Analyzing and interpreting these signals is essential for a quantitative understanding of neuronal information processing, and requires both fast data visualization and ready access to complex analysis routines. To achieve this goal, we have developed Stimfit, a free software package for cellular neurophysiology with a Python scripting interface and a built-in Python shell. The program supports most standard file formats for cellular neurophysiology and other biomedical signals through the Biosig library. To quantify and interpret the activity of single neurons and communication between neurons, the program includes algorithms to characterize the kinetics of presynaptic action potentials and postsynaptic currents, estimate latencies between pre- and postsynaptic events, and detect spontaneously occurring events. We validate and benchmark these algorithms, give estimation errors, and provide sample use cases, showing that Stimfit represents an efficient, accessible and extensible way to accurately analyze and interpret neuronal signals. PMID:24600389

  19. Quantifying the Shape of Aging

    PubMed Central

    Wrycza, Tomasz F.; Missov, Trifon I.; Baudisch, Annette

    2015-01-01

    In Biodemography, aging is typically measured and compared based on aging rates. We argue that this approach may be misleading, because it confounds the time aspect with the mere change aspect of aging. To disentangle these aspects, here we utilize a time-standardized framework and, instead of aging rates, suggest the shape of aging as a novel and valuable alternative concept for comparative aging research. The concept of shape captures the direction and degree of change in the force of mortality over age, which—on a demographic level—reflects aging. We 1) provide a list of shape properties that are desirable from a theoretical perspective, 2) suggest several demographically meaningful and non-parametric candidate measures to quantify shape, and 3) evaluate performance of these measures based on the list of properties as well as based on an illustrative analysis of a simple dataset. The shape measures suggested here aim to provide a general means to classify aging patterns independent of any particular mortality model and independent of any species-specific time-scale. Thereby they support systematic comparative aging research across different species or between populations of the same species under different conditions and constitute an extension of the toolbox available to comparative research in Biodemography. PMID:25803427

  20. Quantifying uncertainty from material inhomogeneity.

    SciTech Connect

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femtosecond laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the deformation behavior was found to depend strongly on the character of the nearby microstructure.

  1. Quantifying the isotopic ‘continental effect’

    NASA Astrophysics Data System (ADS)

    Winnick, Matthew J.; Chamberlain, C. Page; Caves, Jeremy K.; Welker, Jeffrey M.

    2014-11-01

Since the establishment of the IAEA-WMO precipitation-monitoring network in 1961, it has been observed that isotope ratios in precipitation (δ2H and δ18O) generally decrease from coastal to inland locations, an observation described as the 'continental effect.' While discussed frequently in the literature, there have been few attempts to quantify the variables controlling this effect despite the fact that isotopic gradients over continents can vary by orders of magnitude. In a number of studies, traditional Rayleigh fractionation has proven inadequate in describing the global variability of isotopic gradients due to its simplified treatment of moisture transport and its lack of moisture recycling processes. In this study, we use a one-dimensional idealized model of water vapor transport along a storm track to investigate the dominant variables controlling isotopic gradients in precipitation across terrestrial environments. We find that the sensitivity of these gradients to progressive rainout is controlled by a combination of the amount of evapotranspiration and the ratio of transport by advection to transport by eddy diffusion, with these variables becoming increasingly important with decreasing length scales of specific humidity. A comparison of modeled gradients with global precipitation isotope data indicates that these variables can account for the majority of variability in observed isotopic gradients between coastal and inland locations. Furthermore, the dependence of the 'continental effect' on moisture recycling allows for the quantification of evapotranspiration fluxes from measured isotopic gradients, with implications for both paleoclimate reconstructions and large-scale monitoring efforts in the context of global warming and a changing hydrologic cycle.
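
    For context, classical Rayleigh fractionation, the baseline this study finds inadequate on its own, relates the isotopic composition of the remaining vapor to the fraction of vapor left after progressive rainout. A minimal sketch is below; the fractionation factor is an illustrative equilibrium value, not a number from the paper.

    ```python
    import numpy as np

    def rayleigh_delta(delta0, f, alpha):
        """
        Isotopic composition (permil) of the remaining vapor after Rayleigh
        rainout: delta = (delta0 + 1000) * f**(alpha - 1) - 1000, where f is
        the fraction of vapor remaining and alpha the liquid-vapor
        equilibrium fractionation factor.
        """
        return (delta0 + 1000.0) * f ** (alpha - 1.0) - 1000.0

    # d18O of vapor as rainout proceeds; alpha = 1.0094 is an illustrative
    # equilibrium value near 25 degC
    f = np.linspace(1.0, 0.2, 5)
    print(rayleigh_delta(-12.0, f, 1.0094))
    ```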

  2. Quantifying flow-assistance and implications for movement research.

    PubMed

    Kemp, Michael U; Shamoun-Baranes, Judy; van Loon, E Emiel; McLaren, James D; Dokter, Adriaan M; Bouten, Willem

    2012-09-01

The impact that flows of air and water have on organisms moving through these environments has received a great deal of attention in theoretical and empirical studies. There are many behavioral strategies that animals can adopt to interact with these flows, and by assuming one of these strategies a researcher can quantify the instantaneous assistance an animal derives from a particular flow. Calculating flow-assistance in this way can provide an elegant simplification of a multivariate problem to a univariate one and has many potential uses; however, the resultant flow-assistance values are inseparably linked to the specific behavioral strategy assumed. We expect that flow-assistance may differ considerably depending on the behavioral strategy assumed and the accuracy of the assumptions associated with that strategy. Further, we expect that the magnitude of these differences may depend on the specific flow conditions. We describe equations to quantify flow-assistance of increasing complexity (i.e. more assumptions), focusing on the behavioral strategies assumed by each. We illustrate differences in suggested flow-assistance between these equations and calculate the sensitivity of each equation to uncertainty in its particular assumptions for a range of theoretical flow conditions. We then simulate trajectories that occur if an animal behaves according to the assumptions inherent in these equations. We find large differences in flow-assistance between the equations, particularly with increasing lateral flow and increasingly supportive axial flow. We find that the behavioral strategy assumed is generally more influential on the perception of flow-assistance than a small amount of uncertainty in the specification of an animal's speed (i.e. <5 m s-1) or preferred direction of movement (i.e. <10°). Using simulated trajectories, we show that differences between flow-assistance equations can accumulate over time and distance. The appropriateness and potential biases of an equation to quantify flow-assistance, and the behavioral assumptions the equation implies, must be considered in the context of the system being studied, particularly when interpreting results. Thus, we offer this framework for researchers to evaluate the suitability of a particular flow-assistance equation and assess the implications of its use. PMID:22683380
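
    One of the simplest flow-assistance formulations in this literature decomposes the flow vector into a component along the animal's preferred direction of movement (the assisting 'tailwind') and a lateral component; more complex equations layer on behavioral assumptions such as full or partial drift compensation. A minimal sketch of the along/across decomposition, with function and variable names of our own choosing:

    ```python
    import numpy as np

    def flow_components(u, v, goal_dir_deg):
        """
        Split a flow vector into the component along the preferred movement
        direction (positive = assisting 'tailwind') and the signed lateral
        ('crosswind') component. u is the eastward and v the northward flow
        speed (m/s); goal_dir_deg is a compass bearing in degrees.
        """
        theta = np.deg2rad(goal_dir_deg)
        gx, gy = np.sin(theta), np.cos(theta)  # unit vector, compass convention
        tail = u * gx + v * gy                 # along-track assistance
        cross = gx * v - gy * u                # signed lateral component
        return tail, cross

    print(flow_components(5.0, 5.0, 45.0))  # pure tailwind ~7.07 m/s, zero cross
    ```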

  3. Quantifying anatomical shape variations in neurological disorders.

    PubMed

    Singh, Nikhil; Fletcher, P Thomas; Preston, J Samuel; King, Richard D; Marron, J S; Weiner, Michael W; Joshi, Sarang

    2014-04-01

We develop a multivariate analysis of brain anatomy to identify the relevant shape deformation patterns and quantify the shape changes that explain corresponding variations in clinical neuropsychological measures. We use kernel Partial Least Squares (PLS) and formulate a regression model in the tangent space of the manifold of diffeomorphisms characterized by deformation momenta. The scalar deformation momenta completely encode the diffeomorphic changes in anatomical shape. In this model, the clinical measures are the response variables, while the anatomical variability is treated as the independent variable. To better understand the "shape-clinical response" relationship, we also control for demographic confounders, such as age, gender, and years of education in our regression model. We evaluate the proposed methodology on the Alzheimer's Disease Neuroimaging Initiative (ADNI) database using baseline structural MR imaging data and neuropsychological evaluation test scores. We demonstrate the ability of our model to quantify the anatomical deformations in units of clinical response. Our results also demonstrate that the proposed method is generic and generates reliable shape deformations both in terms of the extracted patterns and the amount of shape changes. We found that while the hippocampus and amygdala emerge as mainly responsible for changes in test scores for global measures of dementia and memory function, they are not a determinant factor for executive function. Another critical finding was the appearance of the thalamus and putamen as the most important regions relating to executive function. These resulting anatomical regions were consistent with very high confidence irrespective of the size of the population used in the study. This data-driven global analysis of brain anatomy was able to reach similar conclusions as other studies in Alzheimer's disease based on predefined ROIs, together with the identification of other new patterns of deformation. The proposed methodology thus holds promise for discovering new patterns of shape changes in the human brain that could add to our understanding of disease progression in neurological disorders. PMID:24667299

  4. 2013 Goat Shows Show Date Show Name Entries Due Eligibility Weigh In Show Time Contact Phone Extra Info

    E-print Network

    Grissino-Mayer, Henri D.

2013 goat show schedule (columns: Show Date, Show Name, Entries Due, Eligibility, Weigh In, Show Time, Contact, Phone, Extra Info). Legible rows: 7/13/2013, Cannon Co., entries due day of show, youth, weigh-in 8 a.m. to 12 p.m., contact Carol, 615-563-5260; 7/20/2013, Overton Co., entries due day of show, youth, weigh-in before 5 p.m.

  5. Quantifying Inductive Bias: AI Learning Algorithms and Valiant's Learning Framework

    Microsoft Academic Search

    David Haussler

    1988-01-01

We show that the notion of inductive bias in concept learning can be quantified in a way that directly relates to learning performance in the framework recently introduced by Valiant. Our measure of bias is based on the growth function introduced by Vapnik and Chervonenkis, and on the Vapnik-Chervonenkis dimension. We measure some common language biases, including restriction to conjunctive

  6. Quantifying extreme bedrock erosion during Icelandic Megafloods

    E-print Network

Field report fragments: some areas could not be surveyed due to restricted access (heavy vegetation, etc.), tree coverage, and the size of the canyon; where dense vegetation cover prevented ground survey of the canyon, aerial imagery (source: LMI) was used.

  7. Subclasses of Quantified Boolean Formulas Andreas Flogel

    E-print Network

    Eckmiller, Rolf

Subclasses of Quantified Boolean Formulas Andreas Flögel Department of Computer Science University of the authors, for certain subclasses of quantified Boolean formulas it is shown that the evaluation problem is coNP-complete. These subclasses can be seen as extensions of Horn and 2-CNF formulas. Further

  8. Quantifying diet for nutrigenomic studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The field of nutrigenomics shows tremendous promise for improved understanding of the effects of dietary intake on health. The knowledge that metabolic pathways may be altered in individuals with genetic variants in the presence of certain dietary exposures offers great potential for personalized nu...

  9. Long-term results show triple stapling facilitates safe low colorectal and coloanal anastomosis and is associated with low rates of local recurrence after anterior resection for rectal cancer

    Microsoft Academic Search

    D. P. Edwards; R. Sexton; R. J. Heald; B. J. Moran

    2007-01-01

Background: During low anterior resection (AR), placement of a staple line distal to an occlusion clamp is often difficult due to the confines of a narrow bony pelvis. This study reviewed the results of AR with a technique in which a linear staple line is fired below the tumour as an oncologically safe occlusion clamp. Methods: Between 1995 and 2000,

  10. The "Life Potential": a new complex algorithm to assess "Heart Rate Variability" from Holter records for cognitive and diagnostic aims. Preliminary experimental results showing its dependence on age, gender and health conditions

    E-print Network

    Barra, Orazio A

    2013-01-01

    Although HRV (Heart Rate Variability) analyses have been carried out for several decades, several limiting factors still make these analyses useless from a clinical point of view. The present paper aims at overcoming some of these limits by introducing the "Life Potential" (BMP), a new mathematical algorithm which seems to exhibit surprising cognitive and predictive capabilities. BMP is defined as a linear combination of five HRV Non-Linear Variables, in turn derived from the thermodynamic formalism of chaotic dynamic systems. The paper presents experimental measurements of BMP (Average Values and Standard Deviations) derived from 1048 Holter tests, matched in age and gender, including a control group of 356 healthy subjects. The main results are: (a) BMP always decreases when the age increases, and its dependence on age and gender is well established; (b) the shape of the age dependence within "healthy people" is different from that found in the general group: this behavior provides evidence of possible illn...

  11. Quantifying Effective Flow and Transport Properties in Heterogeneous Porous Media

    NASA Astrophysics Data System (ADS)

    Heidari, P.; Li, L.

    2012-12-01

Spatial heterogeneity, the spatial variation in physical and chemical properties, exists at almost all scales and is an intrinsic property of natural porous media. It is important to understand and quantify how small-scale spatial variations determine large-scale "effective" properties in order to predict fluid flow and transport behavior in the natural subsurface. In this work, we aim to systematically understand and quantify the role of the spatial distribution of sand grains of different sizes in determining effective dispersivity and effective permeability using quasi-2D flow-cell experiments and numerical simulations. Two dimensional flow cells (20 cm by 20 cm) were packed with the same total amount of fine and coarse sands, however with different spatial patterns. The homogeneous case has the completely mixed fine and coarse sands. The four zone case distributes the fine sand in four identical square zones within the coarse sand matrix. The one square case has all the fine sands in one square block. With the one square case pattern, two more experiments were designed in order to examine the effect of grain size contrast on effective permeability and dispersivity. Effective permeability was calculated based on both experimental and modeling results. Tracer tests were run for all cases. Advection dispersion equations were solved to match breakthrough data and to obtain average dispersivity. We also used Continuous Time Random Walk (CTRW) to quantify the non-Fickian transport behavior for each case. For the three cases with the same grain size contrast, the results show that the effective permeability does not differ significantly. The effective dispersivity is smallest for the homogeneous case (0.05 cm) and largest for the four zone case (0.27 cm). With the same pattern, the dispersivity is largest for the highest size contrast (0.28 cm), exceeding that of the lowest-contrast case by a factor of 2. The non-Fickian behavior was quantified by the β value within the CTRW framework. Fickian transport results in β values larger than 2, and the deviation from 2 indicates the extent of non-Fickian behavior. Among the three cases with the same grain size contrast, the β value is closest to 2 in the homogeneous case (1.95) and smallest in the four zone case (1.89). In the one square case, with the highest size contrast, the β value was 1.57, indicating an increasing extent of non-Fickian behavior with higher size contrast. This study is one step toward understanding how small-scale spatial variations in physical properties affect large-scale flow and transport behavior. This step is important in predicting subsurface transport processes that are relevant to earth sciences, environmental engineering, and petroleum engineering.
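
    The dispersivity estimates above come from matching an advection-dispersion model to tracer breakthrough data. A minimal sketch of that step, fitting the standard 1-D continuous-injection solution by least squares; the distance, velocity, and dispersion values are illustrative, not the experiment's:

    ```python
    import numpy as np
    from scipy.special import erfc
    from scipy.optimize import curve_fit

    def ade_breakthrough(t, v, D, x=20.0):
        """
        1-D advection-dispersion breakthrough curve for a continuous
        injection observed at distance x:
        C/C0 = 0.5 * erfc((x - v*t) / (2*sqrt(D*t))).
        """
        return 0.5 * erfc((x - v * t) / (2.0 * np.sqrt(D * t)))

    # synthetic breakthrough data
    rng = np.random.default_rng(0)
    t = np.linspace(1.0, 600.0, 200)                 # minutes
    obs = ade_breakthrough(t, 0.1, 0.027) + 0.005 * rng.normal(size=t.size)

    # fit v and D (x keeps its default), then dispersivity alpha = D / v
    (v_fit, D_fit), _ = curve_fit(ade_breakthrough, t, obs, p0=(0.05, 0.01),
                                  bounds=([1e-4, 1e-4], [1.0, 1.0]))
    print("dispersivity (cm):", D_fit / v_fit)
    ```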

  12. Managing Beef Cattle for Show

    E-print Network

    Herd, Dennis B.; Boleman, Chris; Boleman, Larry L.

    2001-11-16

    This publication gives advice on raising beef cattle to exhibit at shows. Topics include animal selection, feeding, general health management, disease prevention, calf handling, and preparing for the show....

  13. Asia: Showing the Changing Seasons

    NSDL National Science Digital Library

    Jesse Allen

    1998-09-09

    SeaWiFS false color data showing seasonal change in the oceans and on land for Asia. The data is seasonally averaged, and shows the sequence: fall, winter, spring, summer, fall, winter, spring (for the Northern Hemisphere).

  14. ELI Talent Show Final Exams

    E-print Network

    Pilyugin, Sergei S.

Highlights: ELI Talent Show, Final Exams, Scholarship Nominees, Graduate Admissions Workshop, Reminders from the Office, Manners, Cultures, & Grammar (TheELIWeekly). ELI Talent Show: It's going to be a blast! Come one, come all! The 2nd Annual ELI Talent Show will be on Tuesday, April 15th

  15. Results of the HepZero study comparing heparin-grafted membrane and standard care show that heparin-grafted dialyzer is safe and easy to use for heparin-free dialysis.

    PubMed

Laville, Maurice; Dorval, Marc; Fort Ros, Joan; Fay, Renaud; Cridlig, Joëlle; Nortier, Joëlle L; Juillard, Laurent; Dębska-Ślizień, Alicja; Fernández Lorente, Loreto; Thibaudin, Damien; Franssen, Casper; Schulz, Michael; Moureau, Frédérique; Loughraieb, Nathalie; Rossignol, Patrick

    2014-12-01

Heparin is used to prevent clotting during hemodialysis, but heparin-free hemodialysis is sometimes needed to decrease the risk of bleeding. The HepZero study is a randomized, multicenter international controlled open-label trial comparing no-heparin hemodialysis strategies designed to assess non-inferiority of a heparin-grafted dialyzer (NCT01318486). A total of 251 maintenance hemodialysis patients at increased risk of hemorrhage were randomly allocated for up to three heparin-free hemodialysis sessions using a heparin-grafted dialyzer or the center standard-of-care consisting of regular saline flushes or pre-dilution. The first heparin-free hemodialysis session was considered successful when there was neither complete occlusion of air traps or dialyzer, nor additional saline flushes, changes of dialyzer or bloodlines, or premature termination. The current standard-of-care resulted in high failure rates (50%). The success rate in the heparin-grafted membrane arm was significantly higher than in the control group (68.5% versus 50.4%), which was consistent for both standard-of-care modalities. The absolute difference between the heparin-grafted membrane and the controls was 18.2%, with a lower bound of the 90% confidence interval equal to plus 7.9%. The hypothesis of the non-inferiority at the minus 15% level was accepted, although superiority at the plus 15% level was not reached. Thus, use of a heparin-grafted membrane is a safe, helpful, and easy-to-use method for heparin-free hemodialysis in patients at increased risk of hemorrhage. PMID:25007166

  16. Quantifying Uncertainties in Land-Surface Microwave Emissivity Retrievals

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko

    2013-01-01

Uncertainties in the retrievals of microwave land-surface emissivities are quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including the Special Sensor Microwave Imager, the Tropical Rainfall Measuring Mission Microwave Imager, and the Advanced Microwave Scanning Radiometer for Earth Observing System, are studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land-surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally, these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1%-4% (3-12 K) over desert and 1%-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5%-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.

  17. Quantified Energy Dissipation Rates in the Terrestrial Bow Shock

    NASA Astrophysics Data System (ADS)

    Wilson, L. B., III; Sibeck, D. G.; Breneman, A. W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.

    2013-12-01

We present the first observationally quantified measure of the energy dissipation rate due to wave-particle interactions in the transition region of the Earth's collisionless bow shock using data from the THEMIS spacecraft. Each of more than 11 bow shock crossings examined with available wave burst data showed both low frequency (<10 Hz) magnetosonic-whistler waves and high frequency (≥10 Hz) electromagnetic and electrostatic waves throughout the entire transition region and into the magnetosheath. The high frequency waves were identified as combinations of ion-acoustic waves, electron cyclotron drift instability driven waves, electrostatic solitary waves, and electromagnetic whistler mode waves. These waves were found to have: (1) amplitudes capable of exceeding δB ~ 10 nT and δE ~ 300 mV/m, though more typical values were δB ~ 0.1-1.0 nT and δE ~ 10-50 mV/m; (2) energy fluxes in excess of 2000 μW m-2; (3) resistivities > 9000 Ω m; and (4) energy dissipation rates > 3 μW m-3. The high frequency (>10 Hz) electromagnetic waves produce such excessive energy dissipation that they need only be, at times, < 0.01% efficient to produce the observed increase in entropy across the shocks necessary to balance the nonlinear wave steepening that produces the shocks. These results show that wave-particle interactions have the capacity to regulate the global structure and dominate the energy dissipation of collisionless shocks.

  18. Quantifying nanosheet graphene oxide using electrospray-differential mobility analysis.

    PubMed

    Tai, Jui-Ting; Lai, Yen-Chih; Yang, Jian-He; Ho, Hsin-Chia; Wang, Hsiao-Fang; Ho, Rong-Ming; Tsai, De-Hao

    2015-04-01

We report a high-resolution, traceable method to quantify number concentrations and dimensional properties of nanosheet graphene oxide (N-GO) colloids using electrospray-differential mobility analysis (ES-DMA). Transmission electron microscopy (TEM) was employed orthogonally to provide complementary data and imagery of N-GOs. Results show that the equivalent mobility sizes, size distributions, and number concentrations of N-GOs were able to be successfully measured by ES-DMA. Colloidal stability and filtration efficiency of N-GOs were shown to be effectively characterized based on the change of size distributions and number concentrations. Through the use of an analytical model, the DMA data were able to be converted into lateral size distributions, showing the average lateral size of N-GOs was ~32 nm with an estimated thickness ~0.8 nm. This prototype study demonstrates the proof of concept of using ES-DMA to quantitatively characterize N-GOs and provides traceability for applications involving the formulation of N-GOs. PMID:25783039

  19. Quantifying Uncertainties in Land Surface Microwave Emissivity Retrievals

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko

    2012-01-01

Uncertainties in the retrievals of microwave land surface emissivities were quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including SSM/I, TMI and AMSR-E, were studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1-4% (3-12 K) over desert and 1-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.

  20. Quantifying the value of information

    SciTech Connect

    Riis, T. [Caesar Petroleum Systems, Houston, TX (United States)

    1999-06-01

    Every oil and gas company frequently makes decisions in situations where the result is not directly measurable in terms of impact on costs and revenue. This article presents the concept of Value of Information and discusses how this approach can assist in the decision process, using a simple example and a more realistic case.

  1. Quantifying Coral Reef Ecosystem Services

    EPA Science Inventory

    Coral reefs have been declining during the last four decades as a result of both local and global anthropogenic stresses. Numerous research efforts to elucidate the nature, causes, magnitude, and potential remedies for the decline have led to the widely held belief that the recov...

  2. Quantifying the Risk of Blood Exposure in Optometric Clinical Education.

    ERIC Educational Resources Information Center

    Hoppe, Elizabeth

    1997-01-01

    A study attempted to quantify risk of blood exposure in optometric clinical education by surveying optometric interns in their fourth year at the Southern California College of Optometry concerning their history of exposure or use of a needle. Results indicate blood exposure or needle use ranged from 0.95 to 18.71 per 10,000 patient encounters.…

  3. Quantifying Annual Aboveground Net Primary Production in the Intermountain West

    Technology Transfer Automated Retrieval System (TEKTRAN)

    As part of a larger project, methods were developed to quantify current year growth on grasses, forbs, and shrubs. Annual aboveground net primary production (ANPP) data are needed for this project to calibrate results from computer simulation models and remote-sensing data. Measuring annual ANPP of ...

  4. Development of a metric to quantify diesel engine irregularities

    Microsoft Academic Search

    G. Lowet; P. Van de Ponseele; S. Pauwels; P. Sas

This paper describes the development of a metric to quantify the perception of irregularity in diesel engine sound. This development started from the formulation of hypotheses on signal parameters that possibly had an influence on the perception. Using dedicated signals, the validity of each of the hypotheses was tested using a listening test. Based on the results of the tests,

  5. Quantifying Flaw Characteristics from IR NDE Data

    SciTech Connect

    Miller, W; Philips, N R; Burke, M W; Robbins, C L

    2003-02-14

    Work is presented which allows flaw characteristics to be quantified from the transient IR NDE signature. The goal of this effort was to accurately determine the type, size and depth of flaws revealed with IR NDE, using sonic IR as the example IR NDE technique. Typically an IR NDE experiment will result in a positive qualitative indication of a flaw such as a cold or hot spot in the image, but will not provide quantitative data thereby leaving the practitioner to make educated guesses as to the source of the signal. The technique presented here relies on comparing the transient IR signature to exact heat transfer analytical results for prototypical flaws, using the flaw characteristics as unknown fitting parameters. A nonlinear least squares algorithm is used to evaluate the fitting parameters, which then provide a direct measure of the flaw characteristics that can be mapped to the imaged surface for visual reference. The method uses temperature data for the heat transfer analysis, so radiometric calibration of the IR signal is required. The method provides quantitative data with a single thermal event (e.g. acoustic pulse or flash), as compared to phase-lock techniques that require many events. The work has been tested with numerical data but remains to be validated by experimental data, and that effort is underway.
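
    A minimal sketch of the fitting idea described above: a synthetic transient surface-temperature signature is matched to a simple 1-D analytical flaw model by nonlinear least squares, with flaw depth as a fitting parameter. The model form and all numbers are illustrative stand-ins for the report's "exact heat transfer analytical results for prototypical flaws."

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def flaw_signature(t, depth, amp, kappa=1.0):
        """
        Illustrative 1-D model: surface temperature perturbation from a
        buried planar source at `depth` (mm) after an instantaneous thermal
        event, with thermal diffusivity kappa (mm^2/s).
        """
        return amp / np.sqrt(t) * np.exp(-depth**2 / (4.0 * kappa * t))

    # synthetic calibrated (radiometric) data for a flaw at 0.8 mm depth
    rng = np.random.default_rng(0)
    t = np.linspace(0.05, 2.0, 80)                    # seconds
    data = flaw_signature(t, 0.8, 3.0) + 0.01 * rng.normal(size=t.size)

    # nonlinear least squares with depth and amplitude as fit parameters
    resid = lambda p: flaw_signature(t, p[0], p[1]) - data
    fit = least_squares(resid, x0=[0.3, 1.0],
                        bounds=([0.01, 0.0], [5.0, 10.0]))
    print("estimated depth (mm):", fit.x[0])
    ```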

  6. Quantifying Diet for Nutrigenomic Studies

    PubMed Central

    Tucker, Katherine L.; Smith, Caren E.; Lai, Chao-Qiang; Ordovas, Jose M.

    2015-01-01

The field of nutrigenomics shows tremendous promise for improved understanding of the effects of dietary intake on health. The knowledge that metabolic pathways may be altered in individuals with genetic variants in the presence of certain dietary exposures offers great potential for personalized nutrition advice. However, although considerable resources have gone into improving technology for measurement of the genome and biological systems, dietary intake assessment remains inadequate. Each of the methods currently used has limitations that may be exaggerated in the context of gene x nutrient interaction in large multiethnic studies. Because of the specificity of most gene x nutrient interactions, valid data are needed for nutrient intakes at the individual level. Most statistical adjustment efforts are designed to improve estimates of nutrient intake distributions in populations and are unlikely to solve this problem. An improved method of direct measurement of individual usual dietary intake that is unbiased across populations is urgently needed. PMID:23642200

  7. Quantifying capital goods for waste incineration.

    PubMed

    Brogaard, L K; Riber, C; Christensen, T H

    2013-06-01

    Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000-240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used amounting to 19,000-26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000-5000 MW h. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7-14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2-3% with respect to kg CO2 per tonne of waste combusted. PMID:23561797

  8. Cortical, but Not Posterior Subcapsular, Cataract Shows Significant Familial

    E-print Network

    Broman, Karl W.

cataract surgery [1] and the most common form of cataract in clinical surgical series [2]. Cortical, but Not Posterior Subcapsular, Cataract Shows Significant Familial Aggregation. Wojciechowski, OD, MPH; Sheila K. West, PhD. Purpose: To quantify the risk for age-related cortical cataract

  9. Planning a Successful Tech Show

    ERIC Educational Resources Information Center

    Nikirk, Martin

    2011-01-01

    Tech shows are a great way to introduce prospective students, parents, and local business and industry to a technology and engineering or career and technical education program. In addition to showcasing instructional programs, a tech show allows students to demonstrate their professionalism and skills, practice public presentations, and interact…

  10. Quantifying Magnetic Stellar Wind Torques

    NASA Astrophysics Data System (ADS)

    Matt, Sean; MacGregor, K. B.; Pinsonneault, M. H.; Greene, T. P.

    2011-01-01

    In order to be able to understand the evolution of stellar spin rates and differential rotation, it is necessary to have a rigorous theory for predicting angular momentum loss via magnetic stellar winds that is applicable over a wide range of conditions. Based upon the results of multidimensional, numerical simulations and semi-analytic calculations, we present an improved formulation for predicting the stellar wind torque, which is valid for varying degrees of magnetization in the wind, as well as for stellar spin rates that range from the slow- to the fast-magnetic-rotator regimes.

  11. Quantifying hybridization in realistic time.

    PubMed

    Collins, Joshua; Linz, Simone; Semple, Charles

    2011-10-01

Recently, numerous practical and theoretical studies in evolutionary biology aim at calculating the extent to which reticulation (for example, horizontal gene transfer, hybridization, or recombination) has influenced the evolution of a set of present-day species. It has been shown that inferring the minimum number of hybridization events that is needed to simultaneously explain the evolutionary history for a set of trees is an NP-hard and also fixed-parameter tractable problem. In this article, we give a new fixed-parameter algorithm for computing the minimum number of hybridization events when two rooted binary phylogenetic trees are given. This newly developed algorithm is based on interleaving, a technique using repeated kernelization steps that are applied throughout the exhaustive search part of a fixed-parameter algorithm. To show that our algorithm runs efficiently enough to be applicable to a wide range of practical problem instances, we apply it to a grass data set and highlight the significant improvements in terms of running times in comparison to an algorithm that has previously been implemented. PMID:21210735

  12. Quantifying avoided emissions from renewable generation

    E-print Network

    Gomez, Gabriel R. (Gabriel Rodriguez)

    2009-01-01

    Quantifying the reduced emissions due to renewable power integration and providing increasingly accurate emissions analysis has become more important for policy makers in the age of renewable portfolio standards (RPS) and ...

  13. New Drug Shows Mixed Results Against Early Alzheimer's

    MedlinePLUS


  14. Showing results, 3 Energy technology and energy planning

    E-print Network

Contents fragments: Energy technology and energy planning; energy loads; Energy materials and new energy technologies: fuel cells, superconductors, fusion; techniques for industry; Wind energy: wind turbines, wind energy systems, wind resources and wind ... The progress of the fuel cell project deserves attention. The project, which is managed by Risø, is so far

  15. Quantifying Comparisons of Threshold Resummations

    E-print Network

    George Sterman; Mao Zeng

    2014-05-29

    We explore similarities and differences between widely-used threshold resummation formalisms, employing electroweak boson production as an instructive example. Resummations based on both full QCD and soft-collinear effective theory (SCET) share common underlying factorizations and resulting evolution equations. The formalisms differ primarily in their choices of boundary conditions for evolution, in moment space for many treatments based on full QCD, and in momentum space for treatments based on soft-collinear effective theory. At the level of factorized hadronic cross sections, these choices lead to quite different expressions. Nevertheless, we can identify a natural expansion for parton luminosity functions, in which SCET and full QCD resummations agree for the first term, and for which subsequent terms provide differences that are small in most cases. We also clarify the roles of the non-leading resummation constants in the two formalisms, and observe a relationship of the QCD resummation function $D(\\alpha_s)$ to the web expansion.

  16. Quantifying comparisons of threshold resummations

    NASA Astrophysics Data System (ADS)

    Sterman, George; Zeng, Mao

    2014-05-01

We explore similarities and differences between widely-used threshold resummation formalisms, employing electroweak boson production as an instructive example. Resummations based on both full QCD and soft-collinear effective theory (SCET) share common underlying factorizations and resulting evolution equations. The formalisms differ primarily in their choices of boundary conditions for evolution, in moment space for many treatments based on full QCD, and in momentum space for treatments based on soft-collinear effective theory. At the level of factorized hadronic cross sections, these choices lead to quite different expressions. Nevertheless, we can identify a natural expansion for parton luminosity functions, in which SCET and full QCD resummations agree for the first term, and for which subsequent terms provide differences that are small in most cases. We also clarify the roles of the non-leading resummation constants in the two formalisms, and observe a relationship of the QCD resummation function D(αs) to the web expansion.

  17. Efficiently solving quantified bit-vector formulas

    Microsoft Academic Search

    Christoph M. Wintersteiger; Youssef Hamadi; Leonardo Mendonça de Moura

    2010-01-01

    In recent years, bit-precise reasoning has gained importance in hardware and software verification. Of renewed interest is the use of symbolic reasoning for synthesising loop invariants, ranking functions, or whole program fragments and hardware circuits. Solvers for the quantifier-free fragment of bit-vector logic exist and often rely on SAT solvers for efficiency. However, many techniques require quantifiers in bit-vector for-

  18. Using Graphs to Show Connections

    NSDL National Science Digital Library

    The GLOBE Program, University Corporation for Atmospheric Research (UCAR)

    2003-08-01

    The purpose of this resource is to show how graphs of GLOBE data over time show the interconnectedness of Earth's system components at the local level. Students visit a study site, where they observe and recall their existing knowledge of air, water, soil, and living things to make a list of interconnections among the four Earth system components. They make predictions about the effects of a change in a system, inferring ways these changes affect the characteristics of other related components.

  19. Quantifying Urban Groundwater in Environmental Field Observatories

    NASA Astrophysics Data System (ADS)

    Welty, C.; Miller, A. J.; Belt, K.; Smith, J. A.; Band, L. E.; Groffman, P.; Scanlon, T.; Warner, J.; Ryan, R. J.; Yeskis, D.; McGuire, M. P.

    2006-12-01

Despite the growing footprint of urban landscapes and their impacts on hydrologic and biogeochemical cycles, comprehensive field studies of urban water budgets are few. The cumulative effects of urban infrastructure (buildings, roads, culverts, storm drains, detention ponds, leaking water supply and wastewater pipe networks) on temporal and spatial patterns of groundwater stores, fluxes, and flowpaths are poorly understood. The goal of this project is to develop expertise and analytical tools for urban groundwater systems that will inform future environmental observatory planning and that can be shared with research teams working in urban environments elsewhere. The work plan for this project draws on a robust set of information resources in Maryland provided by ongoing monitoring efforts of the Baltimore Ecosystem Study (BES), USGS, and the U.S. Forest Service working together with university scientists and engineers from multiple institutions. A key concern is to bridge the gap between small-scale intensive field studies and larger-scale and longer-term hydrologic patterns using synoptic field surveys, remote sensing, numerical modeling, data mining and visualization tools. Using the urban water budget as a unifying theme, we are working toward estimating the various elements of the budget in order to quantify the influence of urban infrastructure on groundwater. Efforts include: (1) comparison of base flow behavior from stream gauges in a nested set of watersheds at four different spatial scales from 0.8 to 171 km2, with diverse patterns of impervious cover and urban infrastructure; (2) synoptic survey of well water levels to characterize the regional water table; (3) use of airborne thermal infrared imagery to identify locations of groundwater seepage into streams across a range of urban development patterns; (4) use of seepage transects and tracer tests to quantify the spatial pattern of groundwater fluxes to the drainage network in selected subwatersheds; (5) development of a mass balance for precipitation over a 170 km2 area on a 1x1 km2 grid using recording rain gages for bias correction of weather radar products; (6) calculation of urban evapotranspiration using the Penman-Monteith method compared with results from an eddy correlation station; (7) use of a numerical groundwater model in a screening mode to estimate the depth of groundwater contributing surface water flow; and (8) data mining of public agency records of potable water and wastewater flows to estimate leakage rates and flowpaths in relation to streamflow and groundwater fluxes.
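
    Item (6) computes evapotranspiration with the Penman-Monteith method. A minimal sketch of the standard FAO-56 daily reference form follows; the input values are illustrative, not taken from the Baltimore data.

    ```python
    from math import exp

    def fao56_reference_et(Rn, G, T, u2, es, ea, P=101.3):
        """
        FAO-56 Penman-Monteith reference evapotranspiration (mm/day).
        Rn, G: net radiation and soil heat flux (MJ m-2 day-1); T: mean air
        temperature (degC); u2: wind speed at 2 m (m/s); es, ea: saturation
        and actual vapour pressure (kPa); P: atmospheric pressure (kPa).
        """
        # slope of the saturation vapour pressure curve (kPa/degC)
        delta = 4098.0 * 0.6108 * exp(17.27 * T / (T + 237.3)) / (T + 237.3) ** 2
        gamma = 0.000665 * P                  # psychrometric constant, kPa/degC
        num = (0.408 * delta * (Rn - G)
               + gamma * (900.0 / (T + 273.0)) * u2 * (es - ea))
        return num / (delta + gamma * (1.0 + 0.34 * u2))

    # illustrative mid-summer day: ~5.3 mm/day
    print(fao56_reference_et(Rn=15.0, G=1.0, T=25.0, u2=2.0, es=3.17, ea=1.90))
    ```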

  20. Quantifying selection in immune receptor repertoires

    PubMed Central

    Elhanati, Yuval; Murugan, Anand; Callan, Curtis G.; Mora, Thierry; Walczak, Aleksandra M.

    2014-01-01

The efficient recognition of pathogens by the adaptive immune system relies on the diversity of receptors displayed at the surface of immune cells. T-cell receptor diversity results from an initial random DNA editing process, called VDJ recombination, followed by functional selection of cells according to the interaction of their surface receptors with self and foreign antigenic peptides. Using high-throughput sequence data from the β-chain of human T-cell receptors, we infer factors that quantify the overall effect of selection on the elements of receptor sequence composition: the V and J gene choice and the length and amino acid composition of the variable region. We find a significant correlation between biases induced by VDJ recombination and our inferred selection factors together with a reduction of diversity during selection. Both effects suggest that natural selection acting on the recombination process has anticipated the selection pressures experienced during somatic evolution. The inferred selection factors differ little between donors or between naive and memory repertoires. The number of sequences shared between donors is well-predicted by our model, indicating a stochastic origin of such public sequences. Our approach is based on a probabilistic maximum likelihood method, which is necessary to disentangle the effects of selection from biases inherent in the recombination process. PMID:24941953

  1. Quantifying Ant Activity Using Vibration Measurements

    PubMed Central

    Oberst, Sebastian; Baro, Enrique Nava; Lai, Joseph C. S.; Evans, Theodore A.

    2014-01-01

    Ant behaviour is of great interest due to their sociality. Ant behaviour is typically observed visually, however there are many circumstances where visual observation is not possible. It may be possible to assess ant behaviour using vibration signals produced by their physical movement. We demonstrate through a series of bioassays with different stimuli that the level of activity of meat ants (Iridomyrmex purpureus) can be quantified using vibrations, corresponding to observations with video. We found that ants exposed to physical shaking produced the highest average vibration amplitudes followed by ants with stones to drag, then ants with neighbours, illuminated ants and ants in darkness. In addition, we devised a novel method based on wavelet decomposition to separate the vibration signal owing to the initial ant behaviour from the substrate response, which will allow signals recorded from different substrates to be compared directly. Our results indicate the potential to use vibration signals to classify some ant behaviours in situations where visual observation could be difficult. PMID:24658467
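
    The wavelet separation step can be illustrated generically: decompose the recorded vibration, keep only selected coefficient levels, and reconstruct. This is a sketch using the PyWavelets package, not the authors' exact decomposition; the wavelet choice, decomposition level, and retained bands are all assumptions.

    ```python
    import numpy as np
    import pywt

    def keep_detail_levels(signal, wavelet="db4", level=5, keep=(4, 5)):
        """
        Generic wavelet separation: decompose, zero all coefficient arrays
        except the chosen ones, and reconstruct. Index 0 is the coarse
        approximation (slow substrate response); higher indices are
        progressively finer details (fast transients).
        """
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        filtered = [c if i in keep else np.zeros_like(c)
                    for i, c in enumerate(coeffs)]
        return pywt.waverec(filtered, wavelet)

    # toy record: 5 Hz substrate drift plus brief 200 Hz bursts, fs = 1 kHz
    fs = 1000.0
    t = np.arange(0.0, 2.0, 1.0 / fs)
    x = (np.sin(2 * np.pi * 5 * t)
         + (t % 0.5 < 0.02) * np.sin(2 * np.pi * 200 * t))
    fast = keep_detail_levels(x)    # bursts survive, slow drift is removed
    ```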

  2. Quantifying the Complexity of Flaring Active Regions

    NASA Technical Reports Server (NTRS)

    Stark, B.; Hagyard, M. J.

    1997-01-01

While solar physicists have a better understanding of the importance magnetic fields play in the solar heating mechanism, it is still not possible to predict whether or when an active region will flare. In recent decades, qualitative studies of the changes in active region morphology have shown that there is generally an increase in the complexity of the spatial configuration of a solar active region leading up to a flare event. In this study, we quantify the spatial structure of the region using the differential Box-Counting Method (DBC) of fractal analysis. We analyze data from NASA/Marshall Space Flight Center's vector magnetograph from two flaring active regions: AR 6089 from June 10, 1990, which produced one M1.7 flare, and AR 6659 from June 8, 9, and 10, 1991, this data set including one C5.7 and two M-class (6.4 and 3.2) flares. (AR 6659 produced several other flares.) Several magnetic parameters are studied, including the transverse and longitudinal magnetic field components (Bt and Bl), the total field (Bmag), and the magnetic shear, which describes the non-potentiality of the field. Results are presented for the time series of magnetograms in relation to the timing of flare events.
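
    A minimal sketch of one common differential box-counting variant for a 2-D grayscale array; magnetogram-specific preprocessing is omitted, and the box sizes and gray-level box-height rule are standard choices, not necessarily identical to the study's implementation.

    ```python
    import numpy as np

    def differential_box_counting(img, sizes=(2, 4, 8, 16)):
        """
        Differential box counting (DBC): for each box size s, tile the image
        into s x s columns and count how many gray-level boxes of height h
        are spanned between the min and max intensity in each column; the
        fractal dimension is the slope of log N versus log(1/s).
        """
        img = np.asarray(img, dtype=float)
        g = img.max() + 1.0                      # gray-level range
        counts = []
        for s in sizes:
            h = g * s / img.shape[0]             # box height in gray levels
            n = 0
            for i in range(0, img.shape[0] - s + 1, s):
                for j in range(0, img.shape[1] - s + 1, s):
                    block = img[i:i + s, j:j + s]
                    n += int(np.ceil((block.max() - block.min() + 1.0) / h))
            counts.append(n)
        slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)),
                              np.log(counts), 1)
        return slope

    rng = np.random.default_rng(0)
    print(differential_box_counting(rng.integers(0, 256, (64, 64))))
    ```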

  3. Quantifying measurement uncertainty in full-scale compost piles using organic micro-pollutant concentrations.

    PubMed

    Sadef, Yumna; Poulsen, Tjalfe G; Bester, Kai

    2014-05-01

Reductions in measurement uncertainty for organic micro-pollutant concentrations in full-scale compost piles using comprehensive sampling and allowing equilibration time before sampling were quantified. Results showed that both application of a comprehensive sampling procedure (involving sample crushing) and allowing one week of equilibration time before sampling reduce measurement uncertainty by about 50%. Results further showed that for measurements carried out on samples collected using a comprehensive procedure, measurement uncertainty was associated exclusively with the analytical methods applied. Application of statistical analyses confirmed that these results were significant at the 95% confidence level. The overall implications of these results are (1) that it is possible to eliminate uncertainty associated with material inhomogeneity and (2) that in order to reduce uncertainty, the sampling procedure is very important early in the composting process but less so later in the process. PMID:24729348

  4. Cross-Linguistic Relations between Quantifiers and Numerals in Language Acquisition: Evidence from Japanese

    ERIC Educational Resources Information Center

    Barner, David; Libenson, Amanda; Cheung, Pierina; Takasaki, Mayu

    2009-01-01

    A study of 104 Japanese-speaking 2- to 5-year-olds tested the relation between numeral and quantifier acquisition. A first study assessed Japanese children's comprehension of quantifiers, numerals, and classifiers. Relative to English-speaking counterparts, Japanese children were delayed in numeral comprehension at 2 years of age but showed no…

  5. Sensitivity of edge detection methods for quantifying cell migration assays.

    PubMed

    Treloar, Katrina K; Simpson, Matthew J

    2013-01-01

    Quantitative imaging methods to analyze cell migration assays are not standardized. Here we present a suite of two-dimensional barrier assays describing the collective spreading of an initially-confined population of 3T3 fibroblast cells. To quantify the motility rate we apply two different automatic image detection methods to locate the position of the leading edge of the spreading population after 24, 48 and 72 hours. These results are compared with a manual edge detection method where we systematically vary the detection threshold. Our results indicate that the observed spreading rates are very sensitive to the choice of image analysis tools and we show that a standard measure of cell migration can vary by as much as 25% for the same experimental images depending on the details of the image analysis tools. Our results imply that it is very difficult, if not impossible, to meaningfully compare previously published measures of cell migration since previous results have been obtained using different image analysis techniques and the details of these techniques are not always reported. Using a mathematical model, we provide a physical interpretation of our edge detection results. The physical interpretation is important since edge detection algorithms alone do not specify any physical measure, or physical definition, of the leading edge of the spreading population. Our modeling indicates that variations in the image threshold parameter correspond to a consistent variation in the local cell density. This means that varying the threshold parameter is equivalent to varying the location of the leading edge in the range of approximately 1-5% of the maximum cell density. PMID:23826283
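
    The threshold sensitivity reported above is easy to reproduce in miniature: define the leading edge as the outermost position where normalized cell density exceeds a threshold, and watch the detected position move as the threshold varies over a few percent of maximum density. A sketch with a synthetic density front (all values illustrative):

    ```python
    import numpy as np

    def leading_edge_index(density, threshold):
        """
        Leading edge of a spreading population: outermost position where the
        normalized cell density still exceeds `threshold`. `density` is a
        1-D profile ordered from the initial barrier outward.
        """
        above = np.nonzero(density >= threshold)[0]
        return int(above[-1]) if above.size else 0

    # smooth synthetic front; thresholds spanning ~1-5% of maximum density
    x = np.linspace(0.0, 1.0, 500)
    density = 1.0 / (1.0 + np.exp((x - 0.6) / 0.05))
    for thr in (0.01, 0.02, 0.05):
        print(thr, x[leading_edge_index(density, thr)])
    ```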

  6. Quantifiable diagnosis of muscular dystrophies and neurogenic atrophies through network analysis

    PubMed Central

    2013-01-01

Background: The diagnosis of neuromuscular diseases is strongly based on the histological characterization of muscle biopsies. However, this morphological analysis is mostly a subjective process and difficult to quantify. We have tested whether network science can provide a novel framework to extract useful information from muscle biopsies, developing a novel method that analyzes muscle samples in an objective, automated, fast and precise manner. Methods: Our database consisted of 102 muscle biopsy images from 70 individuals (including controls, patients with neurogenic atrophies and patients with muscular dystrophies). We used this to develop a new method, Neuromuscular DIseases Computerized Image Analysis (NDICIA), that uses network science analysis to capture the defining signature of muscle biopsy images. NDICIA characterizes muscle tissues by representing each image as a network, with fibers serving as nodes and fiber contacts as links. Results: After a 'training' phase with control and pathological biopsies, NDICIA was able to quantify the degree of pathology of each sample. We validated our method by comparing NDICIA quantification of the severity of muscular dystrophies with a pathologist's evaluation of the degree of pathology, resulting in a strong correlation (R = 0.900, P < 0.00001). Importantly, our approach can be used to quantify new images without the need for prior 'training'. Therefore, we show that network science analysis captures the useful information contained in muscle biopsies, helping the diagnosis of muscular dystrophies and neurogenic atrophies. Conclusions: Our novel network analysis approach will serve as a valuable tool for assessing the etiology of muscular dystrophies or neurogenic atrophies, and has the potential to quantify treatment outcomes in preclinical and clinical trials. PMID:23514382
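
    The network representation described above is straightforward to sketch with networkx: fibers become nodes and fiber-fiber contacts become edges, after which standard graph features can feed a classifier. The segmentation step that produces centroids and contacts is assumed to happen upstream, and the names here are ours, not NDICIA's.

    ```python
    import networkx as nx

    def fiber_network(centroids, contacts):
        """
        Biopsy graph in the NDICIA spirit: one node per muscle fiber, one
        edge per pair of fibers in physical contact. `centroids` and
        `contacts` come from an upstream segmentation step (assumed here).
        """
        g = nx.Graph()
        for idx, (x, y) in enumerate(centroids):
            g.add_node(idx, pos=(x, y))
        g.add_edges_from(contacts)
        return g

    g = fiber_network([(0, 0), (1, 0), (0, 1), (1, 1)],
                      [(0, 1), (0, 2), (1, 3), (2, 3)])
    # simple graph features of the kind a trained classifier could consume
    mean_degree = sum(d for _, d in g.degree()) / g.number_of_nodes()
    print(mean_degree, nx.average_clustering(g))
    ```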

  7. Diarrheal Disease in Show Swine

    E-print Network

    Lawhorn, D. Bruce

    2007-02-27

Diarrhea is one of the most important problems in show pigs. It can occur at any time, from first obtaining a show pig through the last day of exhibition. It can become a chronic condition that persists for weeks. Diarrhea can be caused..., in itself, cause loose stools or diarrhea. If a pig is growing well but has a chronically loose stool, reduce the protein and/or increase fiber in the diet. If the stool does not firm up, there are probably other causes of diarrhea. Parasitic Causes...

  8. Progress toward quantifying landscape-scale movement patterns of the glassy-winged sharpshooter and its natural enemies using a novel mark-capture technique

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Here we present the results of the first year of our research targeted at quantifying the landscape-level movement patterns of GWSS and its natural enemies. We showed that protein markers can be rapidly acquired and retained on insects for several weeks after marking directly in the field. Specifica...

  9. Quantifying the Ripple: Word-of-Mouth and Advertising Effectiveness

    Microsoft Academic Search

    JOHN E. HOGAN; KATHERINE N. LEMON; BARAK LIBAI

    2004-01-01

    In this article the authors demonstrate how a customer lifetime value approach can provide a better assessment of advertising effectiveness that takes into account postpurchase behaviors such as word-of-mouth. Although for many advertisers word-of-mouth is viewed as an alternative to advertising, the authors show that it is possible to quantify the way in which word-of-mouth often complements and extends the

  10. Midterm Picnic ELI Talent Show

    E-print Network

    Pilyugin, Sergei S.

Highlights: Midterm Picnic, ELI Talent Show, Notes from the Office, Birthdays, Manners, Grammar (TheELIWeekly). All ELI students, staff, and friends are invited to the Midterm/Welcome picnic for the new ELI students. Where: Broward Beach (behind Broward and Yulee Halls, across 13th Street from the ELI). What to Bring

  11. Producing Talent and Variety Shows.

    ERIC Educational Resources Information Center

    Szabo, Chuck

    1995-01-01

    Identifies key aspects of producing talent shows and outlines helpful hints for avoiding pitfalls and ensuring a smooth production. Presents suggestions concerning publicity, scheduling, and support personnel. Describes types of acts along with special needs and problems specific to each act. Includes a list of resources. (MJP)

  12. Image Restoration for Quantifying TFT-LCD Defect Levels

    NASA Astrophysics Data System (ADS)

    Choi, Kyu Nam; Park, No Kap; Yoo, Suk In

    Though machine vision systems for automatically detecting visual defects, called mura, have been developed for thin-film transistor liquid crystal display (TFT-LCD) panels, they have not yet reached a level of reliability that allows them to replace human inspectors. To establish an objective criterion for identifying real defects, index functions for quantifying defect levels based on human perception have recently been researched. However, while these functions have been verified in the laboratory, further consideration is needed in order to apply them to real systems in the field. To begin with, we should correct the distortion introduced when capturing the panels. Distortion can cause the defect level in the observed image to differ from that in the panel. There are several known methods to restore the observed image in general vision systems. However, TFT-LCD panel images have a unique background degradation, composed of background non-uniformity and a vignetting effect, which cannot easily be restored through traditional methods. Therefore, in this paper we present a new method to correct background degradation of TFT-LCD panel images using principal component analysis (PCA). Experimental results show that our method properly restores the given observed images and that the transformed shape of muras closely approaches the original undistorted shape.
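
    The general idea of modeling a shared, smooth background with leading principal components can be sketched compactly. The following is a hedged illustration with synthetic images, not the authors' exact algorithm: leading components capture the common background (non-uniformity plus vignetting), and the residual keeps image-specific structure such as mura.

```python
# A minimal sketch of PCA-based background correction, assuming a stack of
# captured panel images is available as a (n_images, n_pixels) array.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_images, h, w = 50, 64, 64
images = rng.normal(0.0, 0.01, (n_images, h * w))  # stand-in for real captures

# Add a shared smooth background (vignetting-like) to every image.
yy, xx = np.mgrid[0:h, 0:w]
vignette = np.exp(-((xx - w / 2) ** 2 + (yy - h / 2) ** 2) / (2 * (w / 2) ** 2))
images += vignette.ravel()

# Leading components capture the common background; subtracting the
# reconstruction leaves image-specific structure (e.g., mura defects).
pca = PCA(n_components=3)
background = pca.inverse_transform(pca.fit_transform(images))
restored = images - background + background.mean(axis=0)
print(restored.shape)
```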

  13. Statistical physics approach to quantifying differences in myelinated nerve fibers

    NASA Astrophysics Data System (ADS)

    Comin, César H.; Santos, João R.; Corradini, Dario; Morrison, Will; Curme, Chester; Rosene, Douglas L.; Gabrielli, Andrea; da F. Costa, Luciano; Stanley, H. Eugene

    2014-03-01

    We present a new method to quantify differences in myelinated nerve fibers. These differences range from morphologic characteristics of individual fibers to differences in macroscopic properties of collections of fibers. Our method uses statistical physics tools to improve on traditional measures, such as fiber size and packing density. As a case study, we analyze cross-sectional electron micrographs from the fornix of young and old rhesus monkeys using a semi-automatic detection algorithm to identify and characterize myelinated axons. We then apply a feature selection approach to identify the features that best distinguish between the young and old age groups, achieving a maximum accuracy of 94% when assigning samples to their age groups. This analysis shows that the best discrimination is obtained using the combination of two features: the fraction of occupied axon area and the effective local density. The latter is a modified calculation of axon density, which reflects how closely axons are packed. Our feature analysis approach can be applied to characterize differences that result from biological processes such as aging, damage from trauma or disease or developmental differences, as well as differences between anatomical regions such as the fornix and the cingulum bundle or corpus callosum.
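
    To make the two-feature discrimination step concrete, here is a toy sketch: synthetic stand-ins for the fraction of occupied axon area and the effective local density are fed to an off-the-shelf classifier and scored by cross-validation. The numbers and the choice of classifier are illustrative assumptions, not the paper's pipeline.

```python
# A toy sketch of two-feature age-group discrimination with synthetic data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Columns: [occupied_area_fraction, effective_local_density]
young = rng.normal([0.55, 1.0], 0.05, (40, 2))
old = rng.normal([0.45, 0.8], 0.05, (40, 2))
X = np.vstack([young, old])
y = np.array([0] * 40 + [1] * 40)

# Accuracy of assigning samples to age groups from the two features.
print(cross_val_score(SVC(), X, y, cv=5).mean())
```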

  14. Quantifying thiol-gold interactions towards the efficient strength control

    NASA Astrophysics Data System (ADS)

    Xue, Yurui; Li, Xun; Li, Hongbin; Zhang, Wenke

    2014-07-01

    The strength of thiol-gold interactions provides the basis for fabricating robust self-assembled monolayers for diverse applications. Investigation of the stability of thiol-gold interactions has thus become a hot topic. Here we use atomic force microscopy to quantify the stability of individual thiol-gold contacts formed both by isolated single thiols and in self-assembled monolayers on a gold surface. Our results show that an oxidized gold surface can greatly enhance the stability of gold-thiol contacts. In addition, a shift of binding modes from a coordinate bond to a covalent bond with changes in environmental pH and interaction time has been observed experimentally. Furthermore, isolated thiol-gold contacts are found to be more stable than those in self-assembled monolayers. Our findings reveal mechanisms to control the strength of thiol-gold contacts and will help guide the design of thiol-gold contacts for a variety of practical applications.

  15. Identifying and quantifying radiation damage at the atomic level.

    PubMed

    Gerstel, Markus; Deane, Charlotte M; Garman, Elspeth F

    2015-03-01

    Radiation damage impedes macromolecular diffraction experiments. Alongside the well known effects of global radiation damage, site-specific radiation damage affects data quality and the veracity of biological conclusions on protein mechanism and function. Site-specific radiation damage follows a relatively predetermined pattern, in that different structural motifs are affected at different dose regimes: in metal-free proteins, disulfide bonds tend to break first followed by the decarboxylation of aspartic and glutamic acids. Even within these damage motifs the decay does not progress uniformly at equal rates. Within the same protein, radiation-induced electron density decay of a particular chemical group is faster than for the same group elsewhere in the protein: an effect known as preferential specific damage. Here, BDamage, a new atomic metric, is defined and validated to recognize protein regions susceptible to specific damage and to quantify the damage at these sites. By applying BDamage to a large set of known protein structures in a statistical survey, correlations between the rates of damage and various physicochemical parameters were identified. Results indicate that specific radiation damage is independent of secondary protein structure. Different disulfide bond groups (spiral, hook, and staple) show dissimilar radiation damage susceptibility. There is a consistent positive correlation between specific damage and solvent accessibility. PMID:25723922

  16. Identifying and quantifying radiation damage at the atomic level

    PubMed Central

    Gerstel, Markus; Deane, Charlotte M.; Garman, Elspeth F.

    2015-01-01

    Radiation damage impedes macromolecular diffraction experiments. Alongside the well known effects of global radiation damage, site-specific radiation damage affects data quality and the veracity of biological conclusions on protein mechanism and function. Site-specific radiation damage follows a relatively predetermined pattern, in that different structural motifs are affected at different dose regimes: in metal-free proteins, disulfide bonds tend to break first followed by the decarboxylation of aspartic and glutamic acids. Even within these damage motifs the decay does not progress uniformly at equal rates. Within the same protein, radiation-induced electron density decay of a particular chemical group is faster than for the same group elsewhere in the protein: an effect known as preferential specific damage. Here, BDamage, a new atomic metric, is defined and validated to recognize protein regions susceptible to specific damage and to quantify the damage at these sites. By applying BDamage to a large set of known protein structures in a statistical survey, correlations between the rates of damage and various physicochemical parameters were identified. Results indicate that specific radiation damage is independent of secondary protein structure. Different disulfide bond groups (spiral, hook, and staple) show dissimilar radiation damage susceptibility. There is a consistent positive correlation between specific damage and solvent accessibility. PMID:25723922

  17. Quantifying singlet fission in novel organic materials using nonlinear optics

    NASA Astrophysics Data System (ADS)

    Busby, Erik; Xia, Jianlong; Yaffe, Omer; Kumar, Bharat; Berkelbach, Timothy; Wu, Qin; Miller, John; Nuckolls, Colin; Zhu, Xiaoyang; Reichman, David; Campos, Luis; Sfeir, Matthew Y.

    2014-10-01

    Singlet fission is a form of multiple exciton generation in which two triplet excitons are produced from the decay of a photoexcited singlet exciton. In a small number of organic materials, most notably pentacene, this conversion process has been shown to occur with unity quantum yield on sub-ps timescales. However, a poorly understood mechanism for fission along with strict energy and geometry requirements have so far limited the observation of this process to a few classes of organic materials, with only a subset of these (most notably the polyacenes) showing both efficient fission and long-lived triplets. Here, we utilize novel organic materials to investigate how the efficiency of the fission process depends on the coupling and the energetic driving force between chromophores in both intra- and intermolecular singlet fission materials. We demonstrate how the triplet yield can be accurately quantified using a combination of traditional transient spectroscopies and recently developed excited state saturable absorption techniques. These results allow us to gain mechanistic insight into the fission process and suggest general strategies for generating new materials that can undergo efficient fission.

  18. Statistical physics approach to quantifying differences in myelinated nerve fibers

    PubMed Central

    Comin, César H.; Santos, João R.; Corradini, Dario; Morrison, Will; Curme, Chester; Rosene, Douglas L.; Gabrielli, Andrea; da F. Costa, Luciano; Stanley, H. Eugene

    2014-01-01

    We present a new method to quantify differences in myelinated nerve fibers. These differences range from morphologic characteristics of individual fibers to differences in macroscopic properties of collections of fibers. Our method uses statistical physics tools to improve on traditional measures, such as fiber size and packing density. As a case study, we analyze cross-sectional electron micrographs from the fornix of young and old rhesus monkeys using a semi-automatic detection algorithm to identify and characterize myelinated axons. We then apply a feature selection approach to identify the features that best distinguish between the young and old age groups, achieving a maximum accuracy of 94% when assigning samples to their age groups. This analysis shows that the best discrimination is obtained using the combination of two features: the fraction of occupied axon area and the effective local density. The latter is a modified calculation of axon density, which reflects how closely axons are packed. Our feature analysis approach can be applied to characterize differences that result from biological processes such as aging, damage from trauma or disease or developmental differences, as well as differences between anatomical regions such as the fornix and the cingulum bundle or corpus callosum. PMID:24676146

  19. Quantifying the impact of ocean acidification on our future climate

    NASA Astrophysics Data System (ADS)

    Matear, R. J.; Lenton, A.

    2014-07-01

    Ocean acidification (OA) is the consequence of rising atmospheric CO2 levels, and it is occurring in conjunction with global warming. Observational studies show that OA will impact ocean biogeochemical cycles. Here, we use an Earth system model under the RCP8.5 emission scenario to evaluate and quantify the first-order impacts of OA on marine biogeochemical cycles, and its potential feedback on our future climate. We find that OA has only a small impact on future atmospheric CO2 (less than 45 ppm) and global warming (less than 0.25 K) by 2100. While the climate change feedbacks are small, OA impacts may significantly alter the distribution of biological production and remineralisation, which would alter the dissolved oxygen distribution in the ocean interior. Our results demonstrate that the consequences of OA will arise not through its impact on climate change, but through how it alters the flow of energy in marine ecosystems, which may significantly affect their productivity, composition and diversity.

  20. Quantifying transfer after perceptual-motor sequence learning: how inflexible is implicit learning?

    PubMed

    Sanchez, Daniel J; Yarnik, Eric N; Reber, Paul J

    2015-03-01

    Studies of implicit perceptual-motor sequence learning have often shown learning to be inflexibly tied to the training conditions during learning. Since sequence learning is seen as a model task of skill acquisition, limits on the ability to transfer knowledge from the training context to a performance context indicate important constraints on skill learning approaches. Lack of transfer across contexts has been demonstrated by showing that when task elements are changed following training, this leads to a disruption in performance. These results have typically been taken as suggesting that the sequence knowledge relies on integrated representations across task elements (Abrahamse, Jiménez, Verwey, & Clegg, Psychon Bull Rev 17:603-623, 2010a). Using a relatively new sequence learning task, serial interception sequence learning, three experiments are reported that quantify the magnitude of performance disruption after selectively manipulating individual aspects of motor performance or perceptual information. In Experiment 1, selective disruption of the timing or order of sequential actions was examined using a novel response manipulandum that allowed for separate analysis of these two motor response components. In Experiments 2 and 3, transfer was examined after selective disruption of perceptual information that left the motor response sequence intact. All three experiments provided quantifiable estimates of partial transfer to novel contexts that suggest some level of information integration across task elements. However, the ability to identify quantifiable levels of successful transfer indicates that integration is not all-or-none and that measurement sensitivity is key to understanding sequence knowledge representations. PMID:24668505

  1. Quantifying Shape Changes and Tissue Deformation in Leaf Development

    PubMed Central

    Rolland-Lagan, Anne-Gaëlle; Remmler, Lauren; Girard-Bock, Camille

    2014-01-01

    The analysis of biological shapes has applications in many areas of biology, and tools exist to quantify organ shape and detect shape differences between species or among variants. However, such measurements do not provide any information about the mechanisms of shape generation. Quantitative data on growth patterns may provide insights into morphogenetic processes, but since growth is a complex process occurring in four dimensions, growth patterns alone cannot intuitively be linked to shape outcomes. Here, we present computational tools to quantify tissue deformation and surface shape changes over the course of leaf development, applied to the first leaf of Arabidopsis (Arabidopsis thaliana). The results show that the overall leaf shape does not change notably during the developmental stages analyzed, yet there is a clear upward radial deformation of the leaf tissue in early time points. This deformation pattern may provide an explanation for how the Arabidopsis leaf maintains a relatively constant shape despite spatial heterogeneities in growth. These findings highlight the importance of quantifying tissue deformation when investigating the control of leaf shape. More generally, experimental mapping of deformation patterns may help us to better understand the link between growth and shape in organ development. PMID:24710066

  2. Quantifying disturbed hill dipterocarp forest lands in Ulu Tembeling, Malaysia with HRV/SPOT images

    NASA Astrophysics Data System (ADS)

    Jusoff, Kamaruzaman; D'Souza, Giles

    A satellite remote sensing survey was conducted in a disturbed logged-over hill dipterocarp forest of Ulu Tembeling in northern Pahang, Malaysia to identify and quantify the site disturbance classes due to road construction and logging activities. The merged SPOT data path/row K271-J341 was acquired on October 3, 1988. The SPOT scene was obtained in a computer-compatible tape format, and manual analysis was initiated by selecting a representative subsection of the scene that covered the study area. Ground truthing/field work was carried out, and parameters such as causal factors, forms, sizes, shapes and patterns of soil disturbance were recorded, measured and correlated with the image classification. Image interpretation, registration and classification were conducted for visual image analysis. The results showed that forest soil disturbance could be easily detected and monitored with 93% accuracy. Six classes of soil disturbance were recognized and quantified, namely (i) primary forest road, (ii) secondary forest road, (iii) skid road, (iv) skid trail, (v) secondary landing, and (vi) primary landing. By reference to maps of licensing applications and logging permits, legal and unpermitted logging activity could be readily identified. However, it is emphasized that interpreters should have prior knowledge of the topography and the soil disturbance pattern of the logged-over forest area to be quantified before actually beginning image interpretation.

  3. Rocks and Minerals Slide Show

    NSDL National Science Digital Library

    This interactive slide show of common rocks and minerals allows students to choose from two sets of minerals and click on a thumbnail to see a larger photograph with a full description of the mineral including color, streak, hardness, cleavage/fracture, and chemical composition. Also included are its use and where it is found. The rocks are divided into igneous, sedimentary, and metamorphic and can be accessed in the same manner. They are described on the basis of crystal size and mineral composition as well as use.

  4. DOE: Quantifying the Value of Hydropower in the Electric Grid

    SciTech Connect

    None

    2012-12-31

    The report summarizes research to Quantify the Value of Hydropower in the Electric Grid. This 3-year DOE study focused on defining the value of hydropower assets in a changing electric grid. Methods are described for valuation and planning of pumped storage and conventional hydropower. The project team conducted plant case studies, electric system modeling, market analysis, cost data gathering, and evaluations of operating strategies and constraints. Five other reports detailing these research results are available at the project website, www.epri.com/hydrogrid. With increasing deployment of wind and solar renewable generation, many owners, operators, and developers of hydropower have recognized the opportunity to provide more flexibility and ancillary services to the electric grid. To quantify the value of services, this study focused on the Western Electricity Coordinating Council region. A security-constrained unit commitment and economic dispatch model was used to quantify the role of hydropower for several future energy scenarios up to 2020. This hourly production simulation considered transmission requirements to deliver energy, including future expansion plans. Both energy and ancillary service values were considered. Addressing specifically the quantification of pumped storage value, no single value stream dominated predicted plant contributions in the various energy futures. Modeling confirmed that service value depends greatly on location and on competition with other available grid support resources. In this summary, ten different value streams related to hydropower are described. These fall into three categories: operational improvements, new technologies, and electricity market opportunities. Of these ten, the study was able to quantify a monetary value for six by applying both present-day and future scenarios for operating the electric grid. This study confirmed that hydropower resources across the United States contribute significantly to operation of the grid in terms of energy, capacity, and ancillary services. Many potential improvements to existing hydropower plants were found to be cost-effective. Pumped storage is the most likely form of large new hydro asset expansion in the U.S.; however, justifying investments in new pumped storage plants remains very challenging with current electricity market economics. Even over a wide range of possible energy futures up to 2020, no energy future was found to bring quantifiable revenues sufficient to cover the estimated costs of plant construction. Value streams not quantified in this study may provide a different cost-benefit balance and an economic tipping point for hydro. Future studies are essential in the quest to quantify the full potential value. Additional research should consider the value of services provided by advanced storage hydropower and pumped storage at smaller time steps for integration of variable renewable resources, and should include all possible value streams such as capacity value and portfolio benefits, e.g., reduced cycling of traditional generation.

  5. A standardized method for quantifying unidirectional genetic introgression

    PubMed Central

    Karlsson, Sten; Diserud, Ola H; Moen, Thomas; Hindar, Kjetil

    2014-01-01

    Genetic introgression from domesticated to wild conspecifics is of great concern for the genetic integrity and viability of wild populations. Therefore, we need tools that can be used to monitor unidirectional gene flow from domesticated to wild populations. A challenge in quantifying unidirectional gene flow is that both the donor and the recipient population may be genetically substructured and that the subpopulations are subject to genetic drift and may exchange migrants with one another. We develop a standardized method for quantifying and monitoring domesticated-to-wild gene flow and demonstrate its usefulness using farm and wild Atlantic salmon as a model species. The challenge of having several wild and farm populations was circumvented by generating, in silico, one analytical center point each for farm and wild salmon. Distributions of the probability that an individual is wild were generated from individual-based analyses of observed wild and farm genotypes using STRUCTURE. We show that estimates of the proportion of the genome that is of domesticated origin in a particular wild population can be obtained without a historical reference sample for the same population. The main advantages of the method presented are the standardized way in which genetic processes within and between populations are taken into account, and the individual-based analyses giving estimates for each individual independent of other individuals. The method makes use of established software, and as long as genetic markers showing generic genetic differences between domesticated and wild populations are available, it can be applied to all species with unidirectional gene flow. Results from our method are easy to interpret and understand, and will serve as a powerful tool for management, especially because there is no need for a specific historical wild reference sample. PMID:25473478

  6. Obtaining Laws Through Quantifying Experiments: Justifications of Pre-service Physics Teachers in the Case of Electric Current, Voltage and Resistance

    NASA Astrophysics Data System (ADS)

    Mäntylä, Terhi; Hämäläinen, Ari

    2015-03-01

    The language of physics is mathematics, and physics ideas, laws and models describing phenomena are usually represented in mathematical form. Therefore, an understanding of how to navigate between phenomena and the models representing them in mathematical form is important for a physics teacher, so that the teacher can make physics understandable to students. Here, the focus is on "experimental mathematization": how laws are established through quantifying experiments. A sequence from qualitative experiments to mathematical formulations through quantifying experiments on electric current, voltage and resistance in pre-service physics teachers' laboratory reports is examined. The way students reason about and justify the mathematical formulation of the measurement results, and how they connect the treatment and presentation of empirical data to their justifications, are analyzed. The results show that pre-service physics teachers understand the basic idea of how quantifying experiments establish quantities and laws, but are not able to argue for it in a justified manner.
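
    As a concrete illustration of such a quantifying experiment, the sketch below fits Ohm's law I = U/R to a set of invented current-voltage readings; the numbers are placeholders, not data from the study.

```python
# A minimal example of "experimental mathematization": turning measured
# current-voltage pairs into a law by a least-squares fit of I = U/R.
import numpy as np

voltage = np.array([1.0, 2.0, 3.0, 4.0, 5.0])        # V
current = np.array([0.21, 0.39, 0.62, 0.80, 1.01])   # A, with noise

# Least-squares slope through the origin gives the conductance 1/R.
conductance = (voltage @ current) / (voltage @ voltage)
print(f"R = {1.0 / conductance:.1f} ohm (fitted)")
```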

  7. Casimir experiments showing saturation effects

    SciTech Connect

    Sernelius, Bo E. [Division of Theory and Modeling, Department of Physics, Chemistry and Biology, Linkoeping University, SE-581 83 Linkoeping (Sweden)

    2009-10-15

    We address several different Casimir experiments where theory and experiment disagree. The first two are classical Casimir force measurements between two metal half-spaces: the torsion-pendulum experiment by Lamoreaux and the Casimir pressure measurement between a gold sphere and a gold plate performed by Decca et al.; theory predicts a large negative thermal correction, absent in these high-precision experiments. The third experiment is the measurement of the Casimir force between a metal plate and a laser-irradiated semiconductor membrane performed by Chen et al.; the change in force with laser intensity is larger than predicted by theory. The fourth experiment is the measurement of the Casimir force between an atom and a wall, in the form of the measurement by Obrecht et al. of the change in oscillation frequency of a {sup 87}Rb Bose-Einstein condensate trapped near a fused silica wall; the change is smaller than predicted by theory. We show that saturation effects can explain the discrepancies between theory and experiment observed in all these cases.

  8. Quantifying multipartite entanglement Tzu-Chieh Wei

    E-print Network

    Goldbart, Paul M.

    ... pure and mixed states. It is determined analytically for arbitrary two-qubit mixed states ... the degree of entanglement for a pure quantum state is to compare how far this state is from the set of all ...

  9. Quantifying Content Consistency Improvements Through Opportunistic Contacts

    E-print Network

    Chaintreau, Augustin

    ... infrastructure-based mechanisms. While the benefits of such opportunistic sharing are intuitive, quantifying ... on the structure of the social networks users belong to. Furthermore, social connectivity influences not only ... and willingness to share content, e.g., only to the members of their own social networks. We establish ...

  10. Comparison of Approaches to Quantify Arterial Damping

    E-print Network

    Chesler, Naomi C.

    Comparison of Approaches to Quantify Arterial Damping Capacity From Pressurization Tests on Mouse Conduit Arteries — Lian Tian, Zhijie Wang, Department of ... Large conduit arteries are not purely elastic, but viscoelastic, which affects ...

  11. Quantifying Loading Efficiency Losses on Lithography Clusters

    Microsoft Academic Search

    J. Foster; M. Mohile; J. Matthews

    2008-01-01

    The high cost of lithography clusters requires fabs to continuously work to optimize their utilization and output. Unfortunately, due to their highly complex nature and the parallel processing capability of the cluster, accurately determining the utilization and output detractors can be very difficult. Without the ability to accurately identify and quantify the equipment capacity loss, it is obviously very difficult ...

  12. Book Review Quantifying Behavior the JWatcher Way.

    E-print Network

    Grether, Gregory

    Quantifying Behavior the JWatcher Way, by Daniel T. Blumstein and Janice C. Daniel. ... researcher, I have scored countless hours of videotaped behavior of wild animals. I always knew JWatcher ... ability to guide the user through the process of scoring animal behavior successfully, using the software ...

  13. Quantifying precipitation suppression due to air Pollution

    E-print Network

    Li, Zhanqing

    Quantifying precipitation suppression due to air pollution. First author: Amir Givati, The Hebrew ... January 2004. ABSTRACT: Urban and industrial air pollution has been shown qualitatively to suppress ... The evidence suggests that air pollution aerosols that are incorporated in orographic clouds slow down cloud ...

  14. Quantifying the Thermal Fatigue of CPV Modules

    SciTech Connect

    Bosco, N.; Kurtz, S.

    2011-02-01

    A method is presented to quantify thermal fatigue in the CPV die-attach from meteorological data. A comparative study between cities demonstrates a significant difference in the accumulated damage. These differences are most sensitive to the number of larger (ΔT) thermal cycles experienced at a location. High-frequency data (<1/min) may be required to most accurately employ this method.
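
    The general recipe behind such an analysis can be sketched briefly: extract temperature swings from a meteorological record and accumulate damage with a fatigue law. The sketch below pairs a simple successive-extrema cycle extraction (not full rainflow counting) with a Coffin-Manson style law N_f = A * ΔT^(-b); A and b are hypothetical placeholder constants, not values from the report.

```python
# A hedged sketch of turning a temperature history into accumulated
# die-attach damage via Miner's rule.
import numpy as np

temps = np.array([20, 55, 25, 60, 18, 50, 22], dtype=float)  # module temps, C
delta_t = np.abs(np.diff(temps))          # swing of each half-cycle

A, b = 1e6, 2.0                           # hypothetical material constants
cycles_to_failure = A * delta_t ** (-b)   # Coffin-Manson style life estimate

# Miner's rule: each half-cycle consumes 0.5 / N_f of the lifetime.
damage = np.sum(0.5 / cycles_to_failure)
print(f"accumulated damage fraction: {damage:.2e}")
```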

  15. Quantifying Item Dependency by Fisher's Z.

    ERIC Educational Resources Information Center

    Shen, Linjun

    Three aspects of the usual approach to assessing local item dependency, Yen's "Q" (H. Huynh, H. Michaels, and S. Ferrara, 1995), deserve further investigation. Pearson correlation coefficients are not normally distributed when the coefficients are large, and thus cannot quantify the dependency well. In the second place, the accuracy of item response…

  16. Quantifying the Reuse of Learning Objects

    ERIC Educational Resources Information Center

    Elliott, Kristine; Sweeney, Kevin

    2008-01-01

    This paper reports the findings of one case study from a larger project, which aims to quantify the claimed efficiencies of reusing learning objects to develop e-learning resources. The case study describes how an online inquiry project "Diabetes: A waste of energy" was developed by searching for, evaluating, modifying and then integrating as many…

  17. Partial Cylindrical Algebraic Decomposition for Quantifier Elimination

    Microsoft Academic Search

    George E. Collins; Jay H. Hong

    1991-01-01

    The Cylindrical Algebraic Decomposition method (CAD) decomposes $R^r$ into regions over which given polynomials have constant signs. An important application of CAD is quantifier elimination in elementary algebra and geometry. In this paper we present a method which intermingles CAD construction with truth evaluation so that parts of the CAD are constructed only as needed to further truth evaluation and

  18. Improved estimates show large circumpolar stocks of permafrost carbon while quantifying substantial uncertainty ranges and identifying remaining data gaps

    NASA Astrophysics Data System (ADS)

    Hugelius, G.; Strauss, J.; Zubrzycki, S.; Harden, J. W.; Schuur, E. A. G.; Ping, C. L.; Schirrmeister, L.; Grosse, G.; Michaelson, G. J.; Koven, C. D.; O'Donnell, J. A.; Elberling, B.; Mishra, U.; Camill, P.; Yu, Z.; Palmtag, J.; Kuhry, P.

    2014-03-01

    Soils and other unconsolidated deposits in the northern circumpolar permafrost region store large amounts of soil organic carbon (SOC). This SOC is potentially vulnerable to remobilization following soil warming and permafrost thaw, but stock estimates are poorly constrained and quantitative error estimates were lacking. This study presents revised estimates of the permafrost SOC pool, including quantitative uncertainty estimates, in the 0-3 m depth range in soils as well as for deeper sediments (>3 m) in deltaic deposits of major rivers and in the Yedoma region of Siberia and Alaska. The revised estimates are based on significantly larger databases compared to previous studies. Compared to previous studies, the number of individual sites/pedons has increased by a factor of ×8-11 for soils in the 1-3 m depth range, a factor of ×8 for deltaic alluvium and a factor of ×5 for Yedoma region deposits. Upscaled based on regional soil maps, estimated permafrost region SOC stocks are 217 ± 15 and 472 ± 34 Pg for the 0-0.3 m and 0-1 m soil depths, respectively (±95% confidence intervals). Depending on the regional subdivision used to upscale 1-3 m soils (following physiography or continents), estimated 0-3 m SOC storage is 1034 ± 183 Pg or 1104 ± 133 Pg. Of this, 34 ± 16 Pg C is stored in thin soils of the High Arctic. Based on generalised calculations, storage of SOC in deep deltaic alluvium (>3 m to ∼60 m depth) of major Arctic rivers is estimated at 91 ± 39 Pg (of which 69 ± 34 Pg is in permafrost). In the Yedoma region, estimated >3 m SOC stocks are 178 +140/-146 Pg, of which 74 +54/-57 Pg is stored in intact, frozen Yedoma (late Pleistocene ice- and organic-rich silty sediments) with the remainder in refrozen thermokarst deposits (16th/84th percentiles of bootstrapped estimates). A total estimated mean storage for the permafrost region of ca. 1300-1370 Pg with an uncertainty range of 930-1690 Pg encompasses the combined revised estimates. Of this, ∼819-836 Pg is perennially frozen. While some components of the revised SOC stocks are similar in magnitude to those previously reported for this region, there are also substantial differences in individual components. There is evidence of remaining regional data-gaps. Estimates remain particularly poorly constrained for soils in the High Arctic region and physiographic regions with thin sedimentary overburden (mountains, highlands and plateaus) as well as for >3 m depth deposits in deltas and the Yedoma region.
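
    The bootstrapped percentile ranges quoted above follow a generic recipe that can be sketched compactly: resample observed pedon carbon densities with replacement, upscale each resampled mean by a mapped area, and read off percentiles. The pedon values and area below are synthetic stand-ins, not the study's data.

```python
# A minimal sketch of bootstrap percentile uncertainty for an upscaled stock.
import numpy as np

rng = np.random.default_rng(42)
pedon_soc = rng.lognormal(mean=3.0, sigma=0.5, size=200)  # fake SOC, kg C m-2
region_area_m2 = 1.0e12                                   # assumed map area

boot_means = np.array([
    rng.choice(pedon_soc, size=pedon_soc.size, replace=True).mean()
    for _ in range(10_000)
])
stock_pg = boot_means * region_area_m2 * 1e-12            # kg C -> Pg C

lo, hi = np.percentile(stock_pg, [16, 84])                # 16th/84th percentiles
print(f"stock: {stock_pg.mean():.0f} Pg ({lo:.0f}-{hi:.0f} Pg)")
```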

  19. Entropy generation method to quantify thermal comfort

    NASA Technical Reports Server (NTRS)

    Boregowda, S. C.; Tiwari, S. N.; Chaturvedi, S. K.

    2001-01-01

    The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and quantify thermal comfort. The approach involves developing an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and entropy generation. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite-element-based KSU human thermal computer model is utilized as a "computational environmental chamber" to conduct a series of simulations examining human thermal responses to different environmental conditions. The outputs from the simulation, which include human thermal responses, and input data consisting of environmental conditions are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The Objective Thermal Comfort values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values, where the PMV values are generated by inserting the air temperatures and vapor pressures used in the computer simulation into the regression equation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. However, an experimental study is needed in the future to fully establish the validity of the OTCI formula and the model. One practical application of this index is that it could be integrated into thermal control systems to develop human-centered environmental control systems for potential use in aircraft, mass transit vehicles, intelligent building systems, and space vehicles.
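
    At the core of this approach is a second-law bookkeeping term. A minimal sketch, assuming steady heat flow Q from the body at T_body to the environment at T_env, is the textbook entropy generation rate S_gen = Q(1/T_env - 1/T_body); mapping S_gen to the OTCI involves physiological modeling not shown here, and the numbers below are illustrative.

```python
# A simplified illustration of the second-law term behind the OTCI idea.
def entropy_generation(q_watts: float, t_body_k: float, t_env_k: float) -> float:
    """Entropy generation rate (W/K) for steady heat transfer body -> environment."""
    return q_watts * (1.0 / t_env_k - 1.0 / t_body_k)

# Example: ~100 W heat loss from skin at 34 C to air at 22 C.
print(f"{entropy_generation(100.0, 307.15, 295.15):.4f} W/K")
```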

  20. Quantifying uncertainties in U.S. wildland fire emissions across space and time scales

    NASA Astrophysics Data System (ADS)

    Larkin, N. K.; Strand, T. T.; Raffuse, S. M.; Drury, S.

    2011-12-01

    Smoke from wildland fire is a growing concern as air quality regulations tighten and public acceptance declines. Wildland fire emissions inventories are important not only for understanding smoke impacts on air quality but also for quantifying sources of greenhouse gas emissions. Wildland fire emissions can be calculated using a number of models and methods. We show an overview of results from the Smoke and Emissions Model Intercomparison Project (SEMIP) describing uncertainties in calculations of U.S. wildland fire emissions across space and time scales, from single fires to annual national totals. Differences are shown between emissions calculated with different models and systems, and between satellite algorithms and ground-based systems. The relative importance of uncertainties in fire size and available fuel data, consumption modeling techniques, and emission factors is compared and quantified, and can be applied to various use cases that include air quality impact modeling and greenhouse gas accounting. The results of this work show where additional information and updated models can most improve wildland fire emission inventories.
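
    The bottom-up calculation that such inventories implement can be summarized in one line: emissions = burned area × fuel loading × fraction consumed × emission factor; uncertainty in any factor propagates multiplicatively into the total. A hedged sketch with placeholder numbers:

```python
# A sketch of the standard bottom-up fire emissions calculation.
def fire_emissions_kg(area_ha: float, fuel_kg_per_ha: float,
                      consumed_fraction: float, ef_g_per_kg: float) -> float:
    """Pollutant emissions (kg) for one fire."""
    burned_fuel_kg = area_ha * fuel_kg_per_ha * consumed_fraction
    return burned_fuel_kg * ef_g_per_kg / 1000.0

# Example: 500 ha fire, 20 t/ha fuel, 40% consumed, PM2.5 EF ~ 12 g/kg
# (all values are illustrative assumptions).
print(f"{fire_emissions_kg(500, 20_000, 0.40, 12.0):,.0f} kg PM2.5")
```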

  1. COMPLEXITY&APPROXIMABILITY OF QUANTIFIED&STOCHASTIC CONSTRAINT SATISFACTION PROBLEMS

    SciTech Connect

    Hunt, H. B. (Harry B.); Marathe, M. V. (Madhav V.); Stearns, R. E. (Richard E.)

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S and T be arbitrary finite sets of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SATc(S)). Here, we study simultaneously the complexity of decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. We present simple yet general techniques to characterize simultaneously the complexity or efficient approximability of a number of versions/variants of the problems SAT(S), Q-SAT(S), S-SAT(S), MAX-Q-SAT(S), etc., for many different such D, C, S, T. These versions/variants include decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. Our unified approach is based on the following two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Some of the results extend earlier results in [Pa85,LMP99,CF+93,CF+94]. Our techniques and results reported here also provide significant steps towards obtaining dichotomy theorems for a number of the problems above, including the problems MAX-Q-SAT(S) and MAX-S-SAT(S). The discovery of such dichotomy theorems, for unquantified formulas, has received significant recent attention in the literature [CF+93,CF+94,Cr95,KSW97].

  2. Effect of soil structure on the growth of bacteria in soil quantified using CARD-FISH

    NASA Astrophysics Data System (ADS)

    Juyal, Archana; Eickhorst, Thilo; Falconer, Ruth; Otten, Wilfred

    2014-05-01

    It has been reported that compaction of soil due to the use of heavy machinery results in reduced crop yield. Compaction affects physical properties of soil such as bulk density, soil strength and porosity. This causes an alteration in soil structure which limits the mobility of nutrients, water and air infiltration, and root penetration in soil. Several studies have explored the effect of soil compaction on plant growth and development. However, there is scant information on the effect of soil compaction on the microbial community and its activities in soil. Understanding the effect of soil compaction on the microbial community is essential, as microbial activities are very sensitive to abrupt environmental changes in soil. Therefore, the aim of this work was to investigate the effect of soil structure on the growth of bacteria in soil. The bulk density of soil was used as a physical parameter to quantify the effect of soil compaction. To detect and quantify bacteria in soil, the method of catalyzed reporter deposition-fluorescence in situ hybridization (CARD-FISH) was used. This technique yields high-intensity fluorescent signals, which makes it easy to quantify bacteria against the high levels of autofluorescence emitted by soil particles and organic matter. In this study, the bacterial strains Pseudomonas fluorescens SBW25 and Bacillus subtilis DSM10 were used. Soils of aggregate size 2-1 mm were packed at five different bulk densities in polyethylene rings (4.25 cm3). The soil rings were sampled on four different days. Results showed that the total number of bacteria counts was reduced significantly (P…

  3. Quantifying structural redundancy in ecological communities

    Microsoft Academic Search

    K. R. Clarke; R. M. Warwick

    1998-01-01

    In multivariate analyses of the effects of both natural and anthropogenic environmental variability on community composition, many species are interchangeable in the way that they characterise the samples, giving rise to the concept of structural redundancy in community composition. Here, we develop a method of quantifying the extent of this redundancy by extracting a series of subsets of species, the

  4. Crisis of Japanese Vascular Flora Shown By Quantifying Extinction Risks for 1618 Taxa

    PubMed Central

    Kadoya, Taku; Takenaka, Akio; Ishihama, Fumiko; Fujita, Taku; Ogawa, Makoto; Katsuyama, Teruo; Kadono, Yasuro; Kawakubo, Nobumitsu; Serizawa, Shunsuke; Takahashi, Hideki; Takamiya, Masayuki; Fujii, Shinji; Matsuda, Hiroyuki; Muneda, Kazuo; Yokota, Masatsugu; Yonekura, Koji; Yahara, Tetsukazu

    2014-01-01

    Although many people have expressed alarm that we are witnessing a mass extinction, few projections have been quantified, owing to limited availability of time-series data on threatened organisms, especially plants. To quantify the risk of extinction, we need to monitor changes in population size over time for as many species as possible. Here, we present the world's first quantitative projection of plant species loss at a national level, with stochastic simulations based on the results of population censuses of 1618 threatened plant taxa in 3574 map cells of ca. 100 km2. More than 500 lay botanists helped monitor those taxa in 1994–1995 and in 2003–2004. We projected that between 370 and 561 vascular plant taxa will go extinct in Japan during the next century if past trends of population decline continue. This extinction rate is approximately two to three times the global rate. Using time-series data, we show that existing national protected areas (PAs) covering ca. 7% of Japan will not adequately prevent population declines: even core PAs can protect at best <60% of local populations from decline. Thus, the Aichi Biodiversity Target to expand PAs to 17% of land (and inland water) areas, as committed to by many national governments, is not enough: only 29.2% of currently threatened species will become non-threatened under the assumption that probability of protection success by PAs is 0.5, which our assessment shows is realistic. In countries where volunteers can be organized to monitor threatened taxa, censuses using our method should be able to quantify how fast we are losing species and to assess how effective current conservation measures such as PAs are in preventing species extinction. PMID:24922311
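
    The projection logic described above can be illustrated with a deliberately simplified stochastic simulation: populations decline at noisy per-year rates, and a taxon counts as extinct once all of its populations fall below one individual. All rates and sizes below are invented; the study's census-based model is considerably more elaborate.

```python
# A minimal stochastic projection of taxon loss with invented parameters.
import numpy as np

rng = np.random.default_rng(11)
n_taxa, n_pops, years = 1618, 5, 100
pops = rng.uniform(50, 500, (n_taxa, n_pops))       # initial population sizes
decline = rng.normal(0.02, 0.01, (n_taxa, 1))       # mean yearly decline rate

for _ in range(years):
    growth = 1.0 - decline + rng.normal(0.0, 0.02, (n_taxa, n_pops))
    pops *= np.clip(growth, 0.0, None)

extinct = (pops < 1.0).all(axis=1).sum()
print(f"projected extinctions after {years} years: {extinct}")
```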

  5. Quantifying occupant energy behavior using pattern analysis techniques

    SciTech Connect

    Emery, A. [Univ. of Washington, Seattle, WA (United States). Dept. of Mechanical Engineering; Gartland, L. [Lawrence Berkeley National Lab., CA (United States). Energy and Environment Div.

    1996-08-01

    Occupant energy behavior is widely agreed to have a major influence on the amount of energy used in buildings. Few attempts have been made to quantify this energy behavior, even though vast amounts of end-use data containing useful information lie fallow. This paper describes analysis techniques developed to extract behavioral information from collected residential end-use data. Analysis of the averages, standard deviations and frequency distributions of hourly data can yield important behavioral information. Pattern analysis can be used to group similar daily energy patterns together for a particular end-use or set of end-uses. The resulting pattern groups can then be examined statistically using multinomial logit modeling to find their likelihood of occurrence for a given set of daily conditions. These techniques were tested successfully using end-use data for families living in four heavily instrumented residences. Energy behaviors were analyzed for individual families during each heating season of the study. These behaviors (indoor temperature, ventilation load, water heating, large appliance energy, and miscellaneous outlet energy) capture how occupants directly control the residence. The pattern analysis and multinomial logit model were able to match the occupant behavior correctly 40 to 70% of the time. The steadier behaviors of indoor temperature and ventilation were matched most successfully. Simple changes to capture more detail during pattern analysis can increase accuracy for the more variable behavior patterns. The methods developed here show promise for extracting meaningful and useful information about occupant energy behavior from the stores of existing end-use data.
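
    The two-stage analysis described above (group daily profiles into patterns, then model pattern likelihood from daily conditions) can be sketched compactly. The data here are synthetic stand-ins, and the specific clustering and logit implementations are illustrative choices, not the paper's.

```python
# A compact sketch: cluster daily end-use profiles, then fit a multinomial
# logit predicting the pattern group from a daily condition variable.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
daily_profiles = rng.random((365, 24))        # one 24-h profile per day
outdoor_temp = rng.normal(10, 8, 365)         # daily condition variable

# Stage 1: group similar daily energy patterns.
groups = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(daily_profiles)

# Stage 2: multinomial logit for the likelihood of each pattern group
# given the day's conditions.
logit = LogisticRegression(max_iter=1000).fit(outdoor_temp.reshape(-1, 1), groups)
print(logit.predict_proba([[0.0]]))           # group probabilities on a cold day
```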

  6. A mass-balance model to separate and quantify colloidal and solute redistributions in soil

    USGS Publications Warehouse

    Bern, C.R.; Chadwick, O.A.; Hartshorn, A.S.; Khomo, L.M.; Chorover, J.

    2011-01-01

    Studies of weathering and pedogenesis have long used calculations based upon low solubility index elements to determine mass gains and losses in open systems. One of the questions currently unanswered in these settings is the degree to which mass is transferred in solution (solutes) versus suspension (colloids). Here we show that differential mobility of the low solubility, high field strength (HFS) elements Ti and Zr can trace colloidal redistribution, and we present a model for distinguishing between mass transfer in suspension and solution. The model is tested on a well-differentiated granitic catena located in Kruger National Park, South Africa. Ti and Zr ratios from parent material, soil and colloidal material are substituted into a mixing equation to quantify colloidal movement. The results show zones of both colloid removal and augmentation along the catena. Colloidal losses of 110 kg m-2 (-5% relative to parent material) are calculated for one eluviated soil profile. A downslope illuviated profile has gained 169 kg m-2 (10%) colloidal material. Elemental losses by mobilization in true solution are ubiquitous across the catena, even in zones of colloidal accumulation, and range from 1418 kg m-2 (-46%) for an eluviated profile to 195 kg m-2 (-23%) at the bottom of the catena. Quantification of simultaneous mass transfers in solution and suspension provides greater specificity on processes within soils and across hillslopes. Additionally, because colloids include both HFS and other elements, the ability to quantify their redistribution has implications for standard calculations of soil mass balances using such index elements.
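
    The mixing-equation step can be illustrated with a linearized sketch: treat a soil horizon's Ti/Zr ratio as a mass-weighted blend of parent-derived and colloid-derived material and solve for the colloidal fraction. This is a simplification of the paper's mass-balance model (ratios do not mix exactly linearly in mass fraction), and the ratio values below are invented.

```python
# A linearized sketch of inferring colloid fraction from a Ti/Zr ratio.
def colloid_fraction(r_soil: float, r_parent: float, r_colloid: float) -> float:
    """Mass fraction of colloid-derived material implied by a Ti/Zr ratio.

    Solves r_soil = f * r_colloid + (1 - f) * r_parent for f.
    """
    return (r_soil - r_parent) / (r_colloid - r_parent)

# Hypothetical ratios: soil shifted 10% of the way toward the colloid end-member.
print(colloid_fraction(r_soil=21.0, r_parent=20.0, r_colloid=30.0))  # -> 0.1
```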

  7. Quantifying nonverbal communicative behavior in face-to-face human dialogues

    NASA Astrophysics Data System (ADS)

    Skhiri, Mustapha; Cerrato, Loredana

    2002-11-01

    The study referred to here is based on the assumption that understanding how humans use nonverbal behavior in dialogues can be very useful in the design of more natural-looking animated talking heads. The goal of the study is twofold: (1) to explore how people use specific facial expressions and head movements to serve important dialogue functions, and (2) to show evidence that it is possible to measure and quantify the extent of these movements with the Qualisys MacReflex motion tracking system. Naturally elicited dialogues between humans have been analyzed, with attention focused on those nonverbal behaviors that serve the very relevant functions of regulating the conversational flow (i.e., turn taking) and producing information about the state of communication (i.e., feedback). The results show that eyebrow raising, head nods, and head shakes are typical signals involved in the exchange of speaking turns, as well as in the production and elicitation of feedback. These movements can be easily measured and quantified, and this measure can be implemented in animated talking heads.

  8. Quantifying litter decomposition losses to dissolved organic carbon and respiration

    NASA Astrophysics Data System (ADS)

    Soong, J.; Parton, W. J.; Calderon, F. J.; Guilbert, K.; Cotrufo, M.

    2013-12-01

    As litter decomposes, its carbon is lost from the litter layer, largely through microbial processing. However, much of the carbon lost from the surface litter layer during decomposition is not truly lost from the ecosystem but is transferred to the soil through fragmentation and leaching of dissolved organic carbon (DOC). This DOC in the soil acts as a stock of soil organic matter (SOM) to be utilized by soil microbes, stabilized in the soil, or leached further through the soil profile. The total amount of C that leaches from litter to the soil, as well as its chemical composition, has important implications for the residence time of decomposing litter C in the soil and is not currently well parameterized in models. In this study we aim to quantify the proportional relationship between CO2 efflux and DOC partitioning during decomposition of fresh leaf litters with distinct structural and chemical composition. The results from this one-year laboratory incubation show a clear relationship between the lignin to cellulose ratio of litter and DOC to CO2 partitioning during four distinct phases of litter decomposition. For example, bluestem grass litter with a low lignin to cellulose ratio loses almost 50% of its C as DOC, whereas pine needles with a high lignin to cellulose ratio lose only 10% of their C as DOC, indicating a potential ligno-cellulose complexation effect on carbon use efficiency during litter decomposition. DOC production also decreases with time during decomposition, correlating with the increasing lignin to cellulose ratio as decomposition progresses. Initial DOC leaching can be predicted from the amount of labile fraction in each litter type. Field data using stable-isotope-labeled bluestem grass show that about 18% of the surface litter C lost in 18 months of decomposition is stored in the soil, and that over 50% of this is recovered in mineral-associated heavy SOM fractions, not as litter fragments, confirming the relative importance of the DOC flux from the litter layer to the soil for stable SOM formation. These results are being used to parameterize a new litter decomposition sub-model to more accurately represent the movement of decomposing surface litter C to CO2 and the mineral soil. This surface litter sub-model can be used to strengthen our understanding of the litter C and microbial processes that feed into larger ecosystem models such as Daycent.

  9. Quantifying near-surface water exchange to assess hydrometeorological models

    NASA Astrophysics Data System (ADS)

    Parent, Annie-Claude; Anctil, François; Morais, Anne

    2013-04-01

    Modelling water exchange in the lower atmosphere-crop-soil system using hydrometeorological models allows estimating actual evapotranspiration (ETa), a complex but critical value for numerous hydrological purposes, e.g., hydrological modelling and crop irrigation. This poster presents a summary of the hydrometeorological research activity conducted by our research group. The first purpose of this research is to quantify ETa and drainage of a rainfed potato crop located in South-Eastern Canada. Then, the outputs of the hydrometeorological models under study are compared with the observed turbulent fluxes. Afterwards, the sensitivity of the hydrometeorological models to different inputs is assessed for an environment under a changing climate. ETa was measured with micrometeorological instrumentation (CSAT3, Campbell SCI Inc.; Li7500, LiCor Inc.) and the eddy covariance technique. Near-surface soil heat flux and soil water content at different layers from 10 cm to 100 cm were also measured. Other parameters required by the hydrometeorological models were observed using standard meteorological instrumentation: shortwave and longwave solar radiation, wind speed, air temperature, atmospheric pressure and precipitation. The cumulative ETa during the growing season (123 days) was 331.5 mm, with a daily maximum of 6.5 mm at full coverage; precipitation was 350.6 mm, which is rather small compared with the historical mean (563.3 mm). This experiment allowed calculating crop coefficients that vary over the growing season for a rainfed potato crop. Land surface schemes such as CLASS (Canadian Land Surface Scheme) and c-ISBA (a Canadian version of the model Interaction Sol-Biosphère-Atmosphère) are 1-D physical hydrometeorological models that produce turbulent fluxes (including ETa) for a given crop. The schemes' performances were assessed for both the energy and water balance, based on the resulting turbulent fluxes and the given observations. CLASS overestimated the turbulent fluxes (including ETa), and the fluctuations in its soil heat flux were higher than those measured. ETa and runoff were overestimated by c-ISBA, while drainage was weaker compared to CLASS. On the whole, CLASS modelled drainage better. Further work includes: 1- comparing observations and results from CLASS to the French model SURFEX (Surface Externalisée), which uses the scheme ISBA, and 2- assessing the sensitivity of CLASS to different meteorological inputs (i.e., 6 regional climate models) in producing a consistent ETa, in a context of climate change.

  10. Quantifying and managing the risk of information security breaches participants in a supply chain

    E-print Network

    Bellefeuille, Cynthia Lynn

    2005-01-01

    Technical integration between companies can result in an increased risk of information security breaches. This thesis proposes a methodology for quantifying information security risk to a supply chain participant. Given a ...

  11. Quantifying and scaling airplane performance in turbulence

    NASA Astrophysics Data System (ADS)

    Richardson, Johnhenri R.

    This dissertation studies the effects of turbulent wind on airplane airspeed and normal load factor, determining how these effects scale with airplane size and developing envelopes to account for them. The results have applications in the design and control of aircraft, especially small-scale aircraft, for robustness with respect to turbulence. Using linearized airplane dynamics and the Dryden gust model, this dissertation presents analytical and numerical scaling laws for airplane performance in gusts, safety margins that guarantee, with specified probability, that steady flight can be maintained when stochastic wind gusts act upon an airplane, and envelopes to visualize these safety margins. Presented here for the first time are scaling laws for the phugoid natural frequency, phugoid damping ratio, airspeed variance in turbulence, and flight path angle variance in turbulence. The results show that small aircraft are more susceptible to high-frequency gusts, that the phugoid damping ratio does not depend directly on airplane size, that the airspeed and flight path angle variances can be parameterized by the ratio of the phugoid natural frequency to a characteristic turbulence frequency, and that the coefficient of variation of the airspeed decreases with increasing airplane size. Accompanying numerical examples validate the results using eleven different airplane models, focusing on NASA's hypothetical Boeing 757 analog, the Generic Transport Model, and its operational 5.5% scale model, the NASA T2. Also presented here for the first time are stationary flight, where the flight state is a stationary random process, and the stationary flight envelope, an adjusted steady flight envelope to visualize safety margins for stationary flight. The dissertation shows that driving the linearized airplane equations of motion with stationary, stochastic gusts results in stationary flight. It also shows how feedback control can enlarge the stationary flight envelope by alleviating gust loads, though the enlargement is significantly limited by control surface saturation. The dissertation ends with a numerical example of a Navion general aviation aircraft performing various steady flight maneuvers in moderate turbulence, showing substantial reductions in the steady flight envelope for some combinations of maneuvers, turbulence, and safety margins.
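
    The Dryden gust model named above has a standard longitudinal spectrum that is easy to write down; the sketch below integrates it numerically to recover the gust variance. The sigma, scale length, and airspeed values are assumed placeholders, and scaling L/V illustrates why smaller, slower aircraft see relatively higher-frequency gust energy.

```python
# A small sketch of the Dryden longitudinal (u) gust spectrum:
#   Phi_u(w) = sigma^2 * (2L/(pi*V)) / (1 + (L*w/V)^2)
import numpy as np

def dryden_u_psd(omega, sigma=1.5, scale_length=533.0, airspeed=60.0):
    """One-sided Dryden u-gust PSD, (m/s)^2 per (rad/s)."""
    lv = scale_length / airspeed
    return sigma**2 * (2.0 * lv / np.pi) / (1.0 + (lv * omega) ** 2)

# Trapezoid-rule integral of the PSD recovers the gust variance sigma^2.
omega = np.linspace(0.0, 200.0, 200_001)
psd = dryden_u_psd(omega)
variance = np.sum(0.5 * (psd[1:] + psd[:-1]) * np.diff(omega))
print(f"integrated variance: {variance:.3f} (sigma^2 = {1.5**2:.3f})")
```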

  12. Quantifying tissue level ischemia: hypoxia response element-luciferase transfection in a rabbit ear model.

    PubMed

    Said, Hakim K; Roy, Nakshatra K; Gurjala, Anandev N; Mustoe, Thomas A

    2009-01-01

    Ischemia is a common underlying factor in a number of pathologic conditions ranging from cardiac dysfunction to delayed wound healing. Previous efforts have shown the resulting hypoxia activates the hypoxia inducible factor, a transcription factor with signaling effects through an intranuclear hypoxia response element (HRE). We hypothesized that ischemic conditions should activate these hypoxic signaling pathways in a measurable manner. We tested our hypothesis using variations of an established rabbit ear ischemic wound model and an HRE-luciferase-reporter gene construct. This plasmid construct was transfected into the ears of young, female New Zealand White rabbits, harvested at day 7 and processed to yield a reactive solution. Luminometry was used to quantify luciferase expression in each solution as a marker for HRE activation in each wound. Quantitative readings of hypoxic signaling as measured by luminescence yielded profound and statistically significant differences between the various ischemic models. Our results suggest that the biologic systems for hypoxic signaling can be used to detect local ischemia. HRE-luciferase transfection is an effective tool for quantifying the degree of tissue hypoxia. The caudal ischemic rabbit ear model showed significantly higher levels of hypoxia. Use of a validated model that produces sufficient tissue levels of hypoxia is recommended for meaningful study of ischemic wound healing. PMID:19614911

  13. A Time-Domain Hybrid Analysis Method for Detecting and Quantifying T-Wave Alternans

    PubMed Central

    Wan, Xiangkui; Yan, Kanghui; Zhang, Linlin; Zeng, Yanjun

    2014-01-01

    T-wave alternans (TWA) in surface electrocardiograph (ECG) signals has been recognized as a marker of cardiac electrical instability and is hypothesized to be associated with increased risk for ventricular arrhythmias among patients. A novel time-domain TWA hybrid analysis method (HAM) utilizing the correlation method and a least squares regression technique is described in this paper. Simulated ECGs containing artificial TWA (cases of absent TWA and of stationary, time-varying, or phase-reversal TWA) under different baseline wanderings are used to test the method, and the results show that HAM quantifies TWA amplitude better than the correlation method (CM) and the adaptive match filter method (AMFM). The HAM is subsequently used to analyze clinical ECGs, and results produced by the HAM have, in general, demonstrated consistency with those produced by the CM and the AMFM, although the TWA amplitudes quantified by the HAM are consistently higher than those from the other two methods. PMID:24803951

  14. An index for quantifying flocking behavior.

    PubMed

    Quera, Vicenç; Herrando, Salvador; Beltran, Francesc S; Salas, Laura; Miñano, Meritxell

    2007-12-01

    One of the classic research topics in adaptive behavior is the collective displacement of groups of organisms such as flocks of birds, schools of fish, herds of mammals, and crowds of people. However, most agent-based simulations of group behavior do not provide a quantitative index for determining the point at which the flock emerges. An index of the aggregation of moving individuals in a flock was developed, and an example is provided of how it can be used to quantify the degree to which a group of moving individuals actually forms a flock. PMID:18229552

  15. 3D Wind: Quantifying wind speed and turbulence intensity

    NASA Astrophysics Data System (ADS)

    Barthelmie, R. J.; Pryor, S. C.; Wang, H.; Crippa, P.

    2013-12-01

    Integrating measurements and modeling of wind characteristics for wind resource assessment and wind farm control is increasingly challenging as the scale of wind farms increases. Even offshore or in relatively homogeneous landscapes, there are significant gradients of both wind speed and turbulence intensity on scales that typify large wind farms. Our project is, therefore, focused on (i) improving methods to integrate remote sensing and in situ measurements with model simulations to produce a 3-dimensional view of the flow field on wind farm scales and (ii) investigating important controls on the spatiotemporal variability of flow fields within the coastal zone. The instrument suite deployed during the field experiments includes: 3-D sonic and cup anemometers deployed on meteorological masts and buoys, anemometers deployed on tethersondes and an Unmanned Aerial Vehicle, multiple vertically-pointing continuous-wave lidars, and scanning Doppler lidars. We also integrate data from satellite-borne instrumentation, specifically synthetic aperture radar and scatterometers, and output from the Weather Research and Forecasting (WRF) model. Spatial wind fields and vertical profiles of wind speed from WRF and from the full in situ observational suite exhibited excellent agreement in a proof-of-principle experiment conducted in north Indiana, particularly during convective conditions, but showed some discrepancies during the breakdown of the nocturnal stable layer. Our second experiment, in May 2013, focused on triangulating a volume above an area of coastal water extending from the port in Cleveland out to an offshore water intake crib (about 5 km) and back to the coast, and included extremely high resolution WRF simulations designed to characterize the coastal zone. Vertically pointing continuous-wave lidars were operated at each apex of the triangle, while the scanning Doppler lidar scanned out across the water over a 90 degree azimuth angle. Preliminary results pertaining to objective (i) indicate good agreement between wind and turbulence profiles to 200 m as measured by the vertically pointing lidars, with the expected modification of the offshore profiles relative to the profiles at the coastline. However, these profiles do not always fully agree with wind speed and direction profiles measured by the scanning Doppler lidar. Further investigation is required to elucidate these results and to analyze whether these discrepancies occur during particular atmospheric conditions. Preliminary results regarding controls on flow in the coastal zone (i.e. objective ii) include clear evidence that the wind profile to 200 m was modified by swell during unstable conditions, even under moderate to high wind speeds. The measurement campaigns will be described in detail, with a view to evaluating optimal strategies for offshore measurement campaigns and in the context of quantifying wind and turbulence in a 3D volume.
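
    For reference, the turbulence intensity of the title is conventionally the standard deviation of wind speed divided by its mean over an averaging window. A minimal sketch with a synthetic anemometer record (all numbers invented):

    ```python
    import numpy as np

    # Synthetic 10-minute record at 10 Hz: mean 8 m/s with 1 m/s fluctuations.
    u = np.random.default_rng(0).normal(8.0, 1.0, 6000)
    print(f"turbulence intensity TI = {u.std() / u.mean():.2f}")   # ~0.12
    ```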

  16. Can we quantify local groundwater recharge using electrical resistivity tomography?

    NASA Astrophysics Data System (ADS)

    Noell, U.; Günther, T.; Ganz, C.; Lamparter, A.

    2012-04-01

    Electrical resistivity tomography (ERT) has become a common tool for observing flow processes in the saturated and unsaturated zones. While it is still doubtful whether the method can reliably yield quantitative results, its qualitative success has been shown in "numerous" examples. Quantifying the rate at which rainfall reaches the groundwater table remains a problematic venture, owing to an unfortunate combination of several physical and mathematical obstacles that can lead to large errors. In 2007 an infiltration experiment was performed and observed using 3D array ERT. The site is located close to Hannover, Germany, on a well-studied sandy soil. The groundwater table at this site was at a depth of about 1.3 m. The inversion results of the ERT data yield reliable-looking images of the infiltration process. Later experiments nearby, using tracer fluid and combined TDR and resistivity measurements in the subsurface, strongly supported the assumption that the resistivity images indeed depict the water distribution during infiltration reliably. The quantitative interpretation shows that two days after infiltration about 40% of the water had reached the groundwater. However, the question remains how reliable this quantitative interpretation actually is. The first obstacle: the inversion of the ERT data gives one possible resistivity distribution within the subsurface that can explain the data. It is not necessarily the right one, and the result depends on the error model, the inversion parameters and the method. For these measurements we assumed the same error for every single quadrupole (3%) and applied the Gauss-Newton method with minimum-length constraints in order to reduce the smoothing to a minimum (very small lambda). Numerical experiments showed little smoothing with this approach, and smoothing must be suppressed if preferential flow is to be seen. The inversion showed artefacts of minor amplitude compared with other inversion parameter settings. The second obstacle: the petrophysical function that relates resistivity changes to water content changes is uncertain. This relationship was constructed in two ways: first, by comparing in situ measured water contents with the ERT inversion results; second, by laboratory measurements of soil samples taken at different depths. The results of the two methods differ; moreover, heterogeneity in the subsurface may cause an even greater variability in this relationship. For the calculation an "average" function was applied. The third obstacle: the pore water conductivity may change during infiltration due to exchange of pore water. This effect is neglected for this experiment on account of the very similar resistivities of the original pore water and the infiltrated water. The effect is, however, of great importance if saline water is used for infiltration experiments, and it will also hamper the quantitative interpretation if dissolution and precipitation processes within the soil are expected during infiltration. The fourth obstacle: the disadvantageous shape of the function relating resistivity and water content. Unfortunately, at high water contents only very little change in resistivity is observed as the water content increases or decreases; the function is steep at small and medium water contents but very flat at high water contents. We conclude from the combination of these four obstacles that quantitative interpretation of recharge with ERT is possible only in fortunate cases. ERT can enable us to actually measure recharge processes. However, if the conditions are not fortunate, the interpretation of the ERT data will only support a conclusion as to whether recharge occurs; the quantitative value will remain doubtful unless additional measurements are taken to narrow the uncertainties. TDR/resistivity measurements with the same probe are particularly helpful for obtaining information about the mixing of the pore water.
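
    The petrophysical "second obstacle" can be made concrete with Archie's law, a common resistivity-saturation relation for sandy soils; the authors' site-specific calibration is not given, so every parameter below is a generic assumption:

    ```python
    import numpy as np

    def water_content_from_resistivity(rho, rho_w=20.0, phi=0.35, a=1.0, m=1.5, n=2.0):
        """Invert Archie's law rho = a * rho_w * phi**(-m) * S**(-n) for saturation S,
        then return volumetric water content theta = phi * S. Parameters are generic."""
        S = (a * rho_w / (rho * phi**m)) ** (1.0 / n)
        return phi * np.clip(S, 0.0, 1.0)

    rho = np.array([100.0, 300.0, 1000.0])          # bulk resistivities (ohm m)
    print(water_content_from_resistivity(rho))      # theta drops as rho rises
    # The flat high-saturation end of this curve is exactly the 'fourth obstacle':
    # near saturation, large water-content changes barely change resistivity.
    ```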

  17. The SEGUE K Giant Survey. III. Quantifying Galactic Halo Substructure

    E-print Network

    Janesh, William; Ma, Zhibo; Harding, Paul; Rockosi, Constance; Starkenburg, Else; Xue, Xiang Xiang; Rix, Hans-Walter; Beers, Timothy C; Johnson, Jennifer; Lee, Young Sun; Schneider, Donald P

    2015-01-01

    We statistically quantify the amount of substructure in the Milky Way stellar halo using a sample of 4568 halo K giant stars at Galactocentric distances ranging over 5-125 kpc. These stars have been selected photometrically and confirmed spectroscopically as K giants from the Sloan Digital Sky Survey's SEGUE project. We use a position-velocity clustering estimator (the 4distance) and a smooth stellar halo model to quantify the amount of substructure in the halo. Overall, we find that the halo as a whole is highly structured, and confirm earlier work using BHB stars which showed that there is an increasing amount of substructure with increasing Galactocentric radius. In addition, we find that the amount of substructure in the halo increases with increasing metallicity, and that the K giant sample shows significantly stronger substructure than the BHB stars, which only sample the most metal poor stars. Using a friends-of-friends algorithm to identify groups, we find that a large fraction ($\\sim 33\\%$) of the st...

  18. Precise thermal NDE for quantifying structural damage

    SciTech Connect

    Del Grande, N.K.; Durbin, P.F.

    1995-09-18

    The authors demonstrated a fast, wide-area, precise thermal NDE imaging system to quantify aircraft corrosion damage, such as percent metal loss, above a threshold of 5% with 3% overall uncertainties. The DBIR precise thermal imaging and detection method has been used successfully to characterize defect types, and their respective depths, in aircraft skins, and multi-layered composite materials used for wing patches, doublers and stiffeners. This precise thermal NDE inspection tool has long-term potential benefits to evaluate the structural integrity of airframes, pipelines and waste containers. They proved the feasibility of the DBIR thermal NDE imaging system to inspect concrete and asphalt-concrete bridge decks. As a logical extension to the successful feasibility study, they plan to inspect a concrete bridge deck from a moving vehicle to quantify the volumetric damage within the deck and the percent of the deck which has subsurface delaminations. Potential near-term benefits are in-service monitoring from a moving vehicle to inspect the structural integrity of the bridge deck. This would help prioritize the repair schedule for a reported 200,000 bridge decks in the US which need substantive repairs. Potential long-term benefits are affordable, and reliable, rehabilitation for bridge decks.

  19. Quantifying Dirac hydrogenic effects via complexity measures

    E-print Network

    P. A. Bouvrie; S. López-Rosa; J. S. Dehesa

    2014-08-29

    The primary dynamical Dirac relativistic effects can only be seen in hydrogenic systems, without the complications introduced by electron-electron interactions in many-electron systems. They are known to be the contraction towards the origin of the electronic charge in hydrogenic systems and the nodal disappearance (because of the raising of all the non-relativistic minima) in the electron density of the excited states of these systems. In addition we point out the (largely ignored) gradient reduction of the charge density near to and far from the nucleus. In this work we quantify these effects by means of single (Fisher information) and composite (Fisher-Shannon complexity, LMC complexity) information-theoretic measures. While the Fisher information measures the gradient content of the density, the (dimensionless) composite information-theoretic quantities grasp two-fold facets of the electronic distribution: the Fisher-Shannon complexity measures the combined balance of the gradient content and the total extent of the electronic charge, and the LMC complexity quantifies the disequilibrium jointly with the spreading of the density in configuration space. In contrast to other complexity notions (e.g., computational and algorithmic complexities), these two quantities describe intrinsic properties of the system because they do not depend on the context but are functionals of the electron density. Moreover, they are closely related to the intuitive notion of complexity because they are minimal for the two extreme (or least complex) distributions of perfect order and maximum disorder.

  20. Beyond immunity: quantifying the effects of host anti-parasite behavior on parasite transmission.

    PubMed

    Daly, Elizabeth W; Johnson, Pieter T J

    2011-04-01

    A host's first line of defense in response to the threat of parasitic infection is behavior, yet the efficacy of anti-parasite behaviors in reducing infection is rarely quantified relative to immunological defense mechanisms. Larval amphibians developing in aquatic habitats are at risk of infection from a diverse assemblage of pathogens, some of which cause substantial morbidity and mortality, suggesting that behavioral avoidance and resistance could be significant defensive strategies. To quantify the importance of anti-parasite behaviors in reducing infection, we exposed larval Pacific chorus frogs (Pseudacris regilla) to pathogenic trematodes (Ribeiroia and Echinostoma) in one of two experimental conditions: behaviorally active (unmanipulated) or behaviorally impaired (anesthetized). By quantifying both the number of successful and unsuccessful parasites, we show that host behavior reduces infection prevalence and intensity for both parasites. Anesthetized hosts were 20-39% more likely to become infected and, when infected, supported 2.8-fold more parasitic cysts. Echinostoma had a 60% lower infection success relative to the more deadly Ribeiroia and was also more vulnerable to behaviorally mediated reductions in transmission. For Ribeiroia, increases in host mass enhanced infection success, consistent with epidemiological theory, but this relationship was eroded among active hosts. Our results underscore the importance of host behavior in mitigating disease risk and suggest that, in some systems, anti-parasite behaviors can be as effective as, or more effective than, immune-mediated defenses in reducing infection. Considering the severe pathologies induced by these and other pathogens of amphibians, we emphasize the value of a broader understanding of anti-parasite behaviors and how co-occurring stressors affect them. PMID:20857146

  1. Where do roots take up water? A method to quantify local root water uptake

    NASA Astrophysics Data System (ADS)

    Zarebanadkouki, M.; Kim, Y.; Carminati, A.

    2012-04-01

    During the past decades, considerable advances have been made in the conceptual understanding and mathematical description of the root water uptake process. A large number of models of root water uptake with different degrees of complexity are now available. However, effective application of these models to practical situations for a mechanistic description of root water uptake requires proper experimental data. The aim of this study is to introduce and test a non-destructive method for quantifying local water flow from soil to roots. We grew lupins in 30 × 25 × 1 cm containers. Each container was filled with a sandy soil which was partitioned into different compartments using 1 cm thick layers of coarse sand. Deuterated water (D2O) was locally injected into the soil near the root surface of 18-day-old plants. The flow of D2O into transpiring plants (day) and non-transpiring plants (night) was traced using time-series neutron radiography. The results showed that: 1) D2O entered the roots faster during the day than at night; 2) D2O was quickly transported inside the roots towards the shoots during the day, while at night this flow was negligible. The differences between day and night were explained by convective flow of water into the root due to transpiration. To quantify the transport of D2O into roots, we developed a simple convection-diffusion model. The root water uptake predicted by the model was compared with direct measurements of axial water flow in the roots. This new method allows for quantifying local water uptake in different parts of the root system.
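
    A minimal sketch of such a convection-diffusion model, assuming a 1-D domain, an explicit finite-difference scheme and a hypothetical transpiration-driven velocity (the paper's calibrated parameters are not given in the abstract):

    ```python
    import numpy as np

    def d2o_profile(v, steps=1000, L=0.05, nx=101, D=2.3e-9):
        """March dC/dt = D d2C/dx2 - v dC/dx (FTCS diffusion, upwind convection),
        holding C = 1 at the injection point. D is D2O diffusivity in water."""
        dx = L / (nx - 1)
        dt = 0.4 * dx**2 / D                  # explicit stability limit
        C = np.zeros(nx)
        C[0] = 1.0
        for _ in range(steps):
            Ci = C.copy()
            C[1:-1] = (Ci[1:-1]
                       + D * dt / dx**2 * (Ci[2:] - 2 * Ci[1:-1] + Ci[:-2])
                       - v * dt / dx * (Ci[1:-1] - Ci[:-2]))
            C[0] = 1.0
        return C, dx

    # Day: transpiration adds convection; night: diffusion only (v = 0).
    for label, v in [("day", 3e-7), ("night", 0.0)]:      # v in m/s, hypothetical
        C, dx = d2o_profile(v)
        front = dx * np.argmax(C < 0.05)                  # position of the D2O front
        print(f"{label}: front at {1000 * front:.1f} mm")
    ```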

  2. Quantifying individual performance in Cricket — A network analysis of batsmen and bowlers

    NASA Astrophysics Data System (ADS)

    Mukherjee, Satyam

    2014-01-01

    Quantifying individual performance in the game of Cricket is critical for team selection in international matches. The number of runs scored by batsmen and the number of wickets taken by bowlers serve as natural measures of a cricketer's performance. Traditionally, batsmen and bowlers are rated on their batting or bowling averages respectively. However, in a game like Cricket the manner in which one scores runs or claims a wicket is also important. Scoring runs against a strong bowling line-up or delivering a brilliant performance against a team with a strong batting line-up deserves more credit. A player's average is not able to capture this aspect of the game. In this paper we present a refined method to quantify the 'quality' of runs scored by a batsman or wickets taken by a bowler. We explore the application of Social Network Analysis (SNA) to rate the players on their performance within a team. We generate a directed and weighted network of batsmen and bowlers using the player-vs-player information available for Test cricket and ODI cricket. Additionally we generate a network of batsmen and bowlers based on the dismissal record of batsmen in the history of cricket: Test (1877-2011) and ODI (1971-2011). Our results show that M. Muralitharan is the most successful bowler in the history of Cricket. Our approach could potentially be applied in domestic matches to judge a player's performance, which in turn paves the way for balanced team selection for international matches.
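
    The abstract does not spell out the exact network metric, so the sketch below uses PageRank on a toy batsman-to-bowler dismissal network as one standard SNA choice; the edge data are invented for illustration:

    ```python
    import networkx as nx

    # Edge batsman -> bowler, weighted by number of dismissals (toy numbers).
    dismissals = [
        ("Tendulkar", "Muralitharan", 8),
        ("Ponting", "Muralitharan", 5),
        ("Lara", "McGrath", 15),
        ("Tendulkar", "McGrath", 6),
        ("Ponting", "Harbhajan", 10),
    ]
    G = nx.DiGraph()
    for batsman, bowler, n in dismissals:
        G.add_edge(batsman, bowler, weight=n)

    # A bowler who dismisses highly rated batsmen accumulates a high score,
    # capturing the 'quality' of a wicket rather than just the count.
    rank = nx.pagerank(G, weight="weight")
    for player, score in sorted(rank.items(), key=lambda kv: -kv[1]):
        print(f"{player:15s} {score:.3f}")
    ```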

  3. Quantifying Proteinuria in Hypertensive Disorders of Pregnancy

    PubMed Central

    Amin, Sapna V.; Illipilla, Sireesha; Rai, Lavanya; Kumar, Pratap; Pai, Muralidhar V.

    2014-01-01

    Background. Progressive proteinuria indicates worsening of the condition in hypertensive disorders of pregnancy, and hence its quantification guides clinicians in decision making and treatment planning. Objective. To evaluate the efficacy of spot dipstick analysis and the urinary protein-creatinine ratio (UPCR) in hypertensive disease of pregnancy for predicting 24-hour proteinuria. Subjects and Methods. A total of 102 patients meeting the inclusion criteria were evaluated with a preadmission urine dipstick test and UPCR performed on a spot voided sample. After admission, the entire 24-hour urine sample was collected and analysed for daily protein excretion. Dipstick estimation and UPCR were compared to the 24-hour results. Results. Seventy-eight patients (76.5%) had significant proteinuria of more than 300 mg/24 h. The dipstick method showed 59% sensitivity and 67% specificity for prediction of significant proteinuria. The area under the curve for UPCR was 0.89 (95% CI: 0.83 to 0.95, P < 0.001), showing 82% sensitivity and a 12.5% false positive rate for a cutoff value of 0.45. Higher cutoff values (1.46 and 1.83) predicted heavy proteinuria (2 g and 3 g/24 h, respectively). Conclusion. This study suggests that the random urinary protein:creatinine ratio is a reliable investigation compared to the dipstick method for assessing proteinuria in hypertensive pregnant women. However, clinical laboratories should standardize the reference values for their setup. PMID:25302114
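
    The cutoff analysis can be reproduced in outline with synthetic data (not the study's): compute the ROC area, then the sensitivity and false-positive rate at the 0.45 UPCR cutoff:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    # Synthetic spot UPCR values: 78 patients with and 24 without significant
    # proteinuria (>300 mg/24 h), mirroring only the study's group sizes.
    rng = np.random.default_rng(0)
    upcr_neg = rng.lognormal(np.log(0.25), 0.5, 24)
    upcr_pos = rng.lognormal(np.log(1.0), 0.7, 78)
    y = np.r_[np.zeros(24), np.ones(78)]
    x = np.r_[upcr_neg, upcr_pos]

    print(f"AUC: {roc_auc_score(y, x):.2f}")
    cutoff = 0.45
    sens = np.mean(upcr_pos >= cutoff)     # true-positive rate at the cutoff
    fpr = np.mean(upcr_neg >= cutoff)      # false-positive rate at the cutoff
    print(f"cutoff {cutoff}: sensitivity {sens:.2f}, false-positive rate {fpr:.2f}")
    ```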

  4. Using nitrate to quantify quick flow in a karst aquifer.

    PubMed

    Mahler, Barbara J; Garner, Bradley D

    2009-01-01

    In karst aquifers, contaminated recharge can degrade spring water quality, but quantifying the rapid recharge (quick flow) component of spring flow is challenging because of its temporal variability. Here, we investigate the use of nitrate in a two-endmember mixing model to quantify quick flow in Barton Springs, Austin, Texas. Historical nitrate data from recharging creeks and Barton Springs were evaluated to determine a representative nitrate concentration for the aquifer water endmember (1.5 mg/L) and the quick flow endmember (0.17 mg/L for nonstormflow conditions and 0.25 mg/L for stormflow conditions). Under nonstormflow conditions for 1990 to 2005, model results indicated that quick flow contributed from 0% to 55% of spring flow. The nitrate-based two-endmember model was applied to the response of Barton Springs to a storm and results compared to those produced using the same model with delta(18)O and specific conductance (SC) as tracers. Additionally, the mixing model was modified to allow endmember quick flow values to vary over time. Of the three tracers, nitrate appears to be the most advantageous because it is conservative and because the difference between the concentrations in the two endmembers is large relative to their variance. The delta(18)O-based model was very sensitive to variability within the quick flow endmember, and SC was not conservative over the timescale of the storm response. We conclude that a nitrate-based two-endmember mixing model might provide a useful approach for quantifying the temporally variable quick flow component of spring flow in some karst systems. PMID:18800970
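
    The two-endmember model reduces to one line of algebra: with c_spring = f*c_quick + (1 - f)*c_aquifer, the quick-flow fraction is f = (c_aquifer - c_spring)/(c_aquifer - c_quick). A sketch using the endmember concentrations quoted above:

    ```python
    def quick_flow_fraction(c_spring, c_aquifer=1.5, c_quick=0.17):
        """Two-endmember mixing: solve c_spring = f*c_quick + (1-f)*c_aquifer for f,
        the fraction of spring discharge contributed by quick flow (nonstormflow
        endmember values from the abstract, mg/L nitrate)."""
        return (c_aquifer - c_spring) / (c_aquifer - c_quick)

    # Example: a measured spring nitrate concentration of 1.0 mg/L
    print(f"quick-flow fraction: {quick_flow_fraction(1.0):.2f}")   # ~0.38
    ```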

  5. Automated Reasoning in Quantified Modal and Temporal Logics 

    E-print Network

    Castellini, Claudio

    This thesis is about automated reasoning in quantified modal and temporal logics, with an application to formal methods. Quantified modal and temporal logics are extensions of classical first-order logic in which the notion of truth is extended...

  6. Quantifying the BICEP2-Planck tension over gravitational waves.

    PubMed

    Smith, Kendrick M; Dvorkin, Cora; Boyle, Latham; Turok, Neil; Halpern, Mark; Hinshaw, Gary; Gold, Ben

    2014-07-18

    The recent BICEP2 measurement of B-mode polarization in the cosmic microwave background (r = 0.2(-0.05)(+0.07)), a possible indication of primordial gravity waves, appears to be in tension with the upper limit from WMAP (r < 0.13 at 95% C.L.) and Planck (r < 0.11 at 95% C.L.). We carefully quantify the level of tension and show that it is very significant (around 0.1% unlikely) when the observed deficit of large-scale temperature power is taken into account. We show that measurements of TE and EE power spectra in the near future will discriminate between the hypotheses that this tension is either a statistical fluke or a sign of new physics. We also discuss extensions of the standard cosmological model that relieve the tension and some novel ways to constrain them. PMID:25083631

  7. Quantifying the BICEP2-Planck Tension over Gravitational Waves

    NASA Astrophysics Data System (ADS)

    Smith, Kendrick M.; Dvorkin, Cora; Boyle, Latham; Turok, Neil; Halpern, Mark; Hinshaw, Gary; Gold, Ben

    2014-07-01

    The recent BICEP2 measurement of B-mode polarization in the cosmic microwave background (r = 0.2(-0.05)(+0.07)), a possible indication of primordial gravity waves, appears to be in tension with the upper limit from WMAP (r < 0.13 at 95% C.L.) and Planck (r < 0.11 at 95% C.L.). We carefully quantify the level of tension and show that it is very significant (around 0.1% unlikely) when the observed deficit of large-scale temperature power is taken into account. We show that measurements of TE and EE power spectra in the near future will discriminate between the hypotheses that this tension is either a statistical fluke or a sign of new physics. We also discuss extensions of the standard cosmological model that relieve the tension and some novel ways to constrain them.

  8. Quantifying touch feel perception: tribological aspects

    NASA Astrophysics Data System (ADS)

    Liu, X.; Yue, Z.; Cai, Z.; Chetwynd, D. G.; Smith, S. T.

    2008-08-01

    We report a new investigation into how surface topography and friction affect human touch-feel perception. In contrast with previous work based on micro-scale mapping of surface mechanical and tribological properties, this investigation focuses on the direct measurement of the friction generated when a fingertip is stroked on a test specimen. A special friction apparatus was built for the in situ testing, based on a linear flexure mechanism with both contact force and frictional force measured simultaneously. Ten specimens, already independently assessed in a 'perception clinic', with materials including natural wood, leather, engineered plastics and metal were tested and the results compared with the perceived rankings. Because surface geometrical features are suspected to play a significant role in perception, a second set of samples, all of one material, were prepared and tested in order to minimize the influence of properties such as hardness and thermal conductivity. To minimize subjective effects, all specimens were also tested in a roller-on-block configuration based upon the same friction apparatus, with the roller materials being steel, brass and rubber. This paper reports the detailed design and instrumentation of the friction apparatus, the experimental set-up and the friction test results. Attempts have been made to correlate the measured properties and the perceived feelings for both roughness and friction. The results show that the measured roughness and friction coefficient both have a strong correlation with the rough-smooth and grippy-slippery feelings.

  9. Approach to quantify human dermal skin aging using multiphoton laser scanning microscopy

    NASA Astrophysics Data System (ADS)

    Puschmann, Stefan; Rahn, Christian-Dennis; Wenck, Horst; Gallinat, Stefan; Fischer, Frank

    2012-03-01

    Extracellular skin structures in human skin are impaired during intrinsic and extrinsic aging. Assessment of these dermal changes is conducted by subjective clinical evaluation and histological and molecular analysis. We aimed to develop a new parameter for the noninvasive quantitative determination of dermal skin alterations utilizing the high-resolution three-dimensional multiphoton laser scanning microscopy (MPLSM) technique. To quantify structural differences between chronically sun-exposed and sun-protected human skin, the respective collagen-specific second harmonic generation and the elastin-specific autofluorescence signals were recorded in young and elderly volunteers using the MPLSM technique. After image processing, the elastin-to-collagen ratio (ELCOR) was calculated. Results show that the ELCOR parameter of volar forearm skin significantly increases with age. For elderly volunteers, the ELCOR value calculated for the chronically sun-exposed temple area is significantly augmented compared to the sun-protected upper arm area. Based on the MPLSM technology, we introduce the ELCOR parameter as a new means to quantify accurately age-associated alterations in the extracellular matrix.
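
    One plausible reading of the ELCOR computation, assuming co-registered elastin-autofluorescence and collagen-SHG stacks and a simple intensity-sum ratio (the authors' exact image-processing pipeline is not detailed in the abstract):

    ```python
    import numpy as np

    def elcor(elastin_af, collagen_shg, threshold=0.1):
        """Elastin-to-collagen ratio: total elastin autofluorescence signal divided
        by total collagen SHG signal, above a noise threshold. Illustrative only."""
        e = elastin_af[elastin_af > threshold].sum()
        c = collagen_shg[collagen_shg > threshold].sum()
        return e / c

    # Toy 3-D stacks (slices x rows x cols) standing in for the two MPLSM channels.
    rng = np.random.default_rng(1)
    elastin = rng.random((20, 128, 128))
    collagen = rng.random((20, 128, 128)) * 2.0
    print(f"ELCOR = {elcor(elastin, collagen):.3f}")
    ```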

  10. Quantifying uncertainties of cloud microphysical property retrievals with a perturbation method

    NASA Astrophysics Data System (ADS)

    Zhao, Chuanfeng; Xie, Shaocheng; Chen, Xiao; Jensen, Michael P.; Dunn, Maureen

    2014-05-01

    Quantifying the uncertainty of cloud retrievals is an emerging topic important for both cloud process studies and modeling studies. This paper presents a general approach to estimate uncertainties in ground-based retrievals of cloud properties. This approach, called the perturbation method, quantifies the cloud retrieval uncertainties by perturbing the influential factors of the retrieval (such as inputs and parameters) within their error ranges. The error ranges for the cloud retrieval inputs and parameters are determined either by instrument limitations or by comparisons against aircraft observations. With knowledge from observations and the retrieval algorithms, the perturbation method can provide an estimate of the cloud retrieval uncertainties regardless of the complexity (e.g., nonlinearity) of the retrieval algorithm. The relative contributions to the uncertainties of retrieved cloud properties from the inputs, assumptions, and parameterizations can also be assessed with this perturbation method. As an example, we apply this approach to the Atmospheric Radiation Measurement Program baseline retrieval, MICROBASE. Only nonprecipitating single-phase (liquid or ice) clouds have been examined in this study. Results reveal that different influential factors play the dominant contributing role in the uncertainties of different cloud properties. To reduce uncertainties in cloud retrievals, future efforts should focus on the major contributing factors for the cloud properties under consideration. This study also shows a high sensitivity of cloud retrieval uncertainties to cloud type, with the largest uncertainties for deep convective clouds. Limitations of, and further efforts for, this uncertainty quantification method are discussed.
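
    In outline, the perturbation method is a Monte Carlo propagation: perturb each influential factor within its error range, re-run the retrieval, and read off the output spread. The sketch below uses a hypothetical Z-to-LWC power law as a stand-in for the MICROBASE algorithm:

    ```python
    import numpy as np

    def retrieval(Z_dbz, a=0.08, b=0.58):
        # Hypothetical reflectivity-to-liquid-water-content power law (stand-in).
        return a * (10 ** (Z_dbz / 10)) ** b

    rng = np.random.default_rng(42)
    n = 10000
    Z0, Z_err = -20.0, 2.0        # input and its error range (dBZ, hypothetical)
    a0, a_err = 0.08, 0.02        # parameter and its error range (hypothetical)

    Z = rng.normal(Z0, Z_err, n)  # perturb the input within its error range
    a = rng.normal(a0, a_err, n)  # perturb the parameter within its error range
    lwc = retrieval(Z, a)

    print(f"retrieved LWC: {retrieval(Z0, a0):.4f}, 1-sigma spread: {lwc.std():.4f}")
    # Perturbing one factor at a time isolates its relative contribution:
    print(f"input-only: {retrieval(Z, a0).std():.4f}, parameter-only: {retrieval(Z0, a).std():.4f}")
    ```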

  11. Quantifying the effect of environment stability on the transcription factor repertoire of marine microbes

    PubMed Central

    2011-01-01

    Background DNA-binding transcription factors (TFs) regulate cellular functions in prokaryotes, often in response to environmental stimuli. Thus, the environment exerts constant selective pressure on the TF gene content of microbial communities. Recently a study on marine Synechococcus strains detected differences in their genomic TF content related to environmental adaptation, but so far the effect of environmental parameters on the content of TFs in bacterial communities has not been systematically investigated. Results We quantified the effect of environment stability on the transcription factor repertoire of marine pelagic microbes from the Global Ocean Sampling (GOS) metagenome using interpolated physico-chemical parameters and multivariate statistics. Thirty-five percent of the difference in relative TF abundances between samples could be explained by environment stability. Six percent was attributable to spatial distance but none to a combination of both spatial distance and stability. Some individual TFs showed a stronger relationship to environment stability and space than the total TF pool. Conclusions Environmental stability appears to have a clearly detectable effect on TF gene content in bacterioplanktonic communities described by the GOS metagenome. Interpolated environmental parameters were shown to compare well to in situ measurements and were essential for quantifying the effect of the environment on the TF content. It is demonstrated that comprehensive and well-structured contextual data will strongly enhance our ability to interpret the functional potential of microbes from metagenomic data. PMID:22587903

  12. A Synthetic Phased Array Surface Acoustic Wave Sensor for Quantifying Bolt Tension

    PubMed Central

    Martinez, Jairo; Sisman, Alper; Onen, Onursal; Velasquez, Dean; Guldiken, Rasim

    2012-01-01

    In this paper, we report our findings on implementing a synthetic phased array surface acoustic wave sensor to quantify bolt tension. Maintaining proper bolt tension is important in many fields, such as ensuring the safe operation of civil infrastructure. Significant advantages of this relatively simple methodology are its capability to assess bolt tension without any contact with the bolt (enabling measurement at inaccessible locations), its ability to measure multiple bolts at a time, and the fact that it requires neither data collection during installation nor calibration. We performed detailed experiments on a custom-built flexible bench-top experimental setup consisting of a 1018 steel plate of 12.7 mm (½ in) thickness, a 6.4 mm (¼ in) grade 8 bolt and a stainless steel washer with 19 mm (¾ in) external diameter. Our results indicate that this method is capable not only of clearly distinguishing properly bolted joints from loosened joints but also of quantifying how loose the bolt actually is. We also conducted a detailed signal-to-noise ratio (SNR) analysis and showed that the SNR value over the entire bolt tension range was sufficient for image reconstruction.

  13. Quantifying the threat of extinction from Muller's ratchet in the diploid Amazon molly (Poecilia formosa)

    PubMed Central

    2008-01-01

    Background The Amazon molly (Poecilia formosa) is a small unisexual fish that has been suspected of being threatened by extinction from the stochastic accumulation of slightly deleterious mutations that is caused by Muller's ratchet in non-recombining populations. However, no detailed quantification of the extent of this threat is available. Results Here we quantify genomic decay in this fish by using a simple model of Muller's ratchet with the most realistic parameter combinations available employing the evolution@home global computing system. We also describe simple extensions of the standard model of Muller's ratchet that allow us to deal with selfing diploids, triploids and mitotic recombination. We show that Muller's ratchet creates a threat of extinction for the Amazon molly for many biologically realistic parameter combinations. In most cases, extinction is expected to occur within a time frame that is less than previous estimates of the age of the species, leading to a genomic decay paradox. Conclusion How then does the Amazon molly survive? Several biological processes could individually or in combination solve this genomic decay paradox, including paternal leakage of undamaged DNA from sexual sister species, compensatory mutations and many others. More research is needed to quantify the contribution of these potential solutions towards the survival of the Amazon molly and other (ancient) asexual species. PMID:18366680
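
    The standard model of Muller's ratchet can be sketched as a Wright-Fisher simulation in a few lines; the parameters are illustrative, not the evolution@home settings:

    ```python
    import numpy as np

    # Each individual carries a count of deleterious mutations; selection weights
    # reproduction, mutations accumulate, and without recombination the count of
    # the least-loaded class can only ratchet upward.
    rng = np.random.default_rng(0)
    N, U, s, generations = 1000, 0.1, 0.01, 2000   # pop size, mutation rate, selection

    muts = np.zeros(N, dtype=int)
    for g in range(generations):
        fitness = (1 - s) ** muts
        parents = rng.choice(N, size=N, p=fitness / fitness.sum())
        muts = muts[parents] + rng.poisson(U, size=N)
        if g % 500 == 0:
            print(f"gen {g}: least-loaded class {muts.min()}, mean load {muts.mean():.1f}")
    # Every irreversible increase of muts.min() is one 'click' of the ratchet.
    ```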

  14. The Physics of Equestrian Show Jumping

    NASA Astrophysics Data System (ADS)

    Stinner, Art

    2014-04-01

    This article discusses the kinematics and dynamics of equestrian show jumping. For some time I have attended a series of show jumping events at Spruce Meadows, an international equestrian center near Calgary, Alberta, often referred to as the "Wimbledon of equestrian jumping." I have always had a desire to write an article such as this one, but when I searched the Internet for information and looked at YouTube presentations, I could only find simplistic references to Newton's laws and the conservation of mechanical energy principle. Nowhere could I find detailed calculations. On the other hand, there were several biomechanical articles with empirical reports of the results of kinetic and dynamic investigations of show jumping using high-speed digital cameras and force plates. They summarize their results in tables that give information about the motion of a horse jumping over high fences (1.40 m) and the magnitudes of the forces encountered when landing. However, they do not describe the physics of these results.

  15. World Health Organization: Quantifying environmental health impacts

    NSDL National Science Digital Library

    The World Health Organization works in a number of public health areas, and their work on quantifying environmental health impacts has been receiving praise from many quarters. This site provides materials on their work in this area and visitors with a penchant for international health relief efforts and policy analysis will find the site invaluable. Along the left-hand side of the site, visitors will find topical sections that include "Methods", "Assessment at national level", "Global estimates", and "Publications". In the "Methods" area, visitors will learn about how the World Health Organization's methodology for studying environmental health impacts has been developed and they can also read a detailed report on the subject. The "Global Estimates" area is worth a look as well, and users can look at their complete report, "Preventing Disease Through Healthy Environments: Towards An Estimate Of the Global Burden Of Disease".

  16. Quantifying mitochondrial content in living cells.

    PubMed

    Viana, Matheus Palhares; Lim, Swee; Rafelski, Susanne M

    2015-01-01

    We describe a novel version of MitoGraph, our fully automated image processing method and software, dedicated to calculating the volume of 3D intracellular structures and organelles in live cells. MitoGraph is optimized and validated for quantifying the volume of tubular mitochondrial networks in budding yeast. We therefore include the experimental protocol, microscopy conditions, and software parameters focusing on mitochondria in budding yeast. However, MitoGraph can also be applied to mitochondria in other cell types and possibly other intracellular structures. We begin with our protocol and then include substantial discussion of the validation, requirements, and limits of MitoGraph to aid a wide range of potential users in applying MitoGraph to their data and troubleshooting any potential problems that arise. MitoGraph is freely available at the Web site http://rafelski.com/susanne/MitoGraph. PMID:25640425

  17. Quantifying fault recovery in multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Malek, Miroslaw; Harary, Frank

    1990-01-01

    Various aspects of reliable computing are formalized and quantified with emphasis on efficient fault recovery. The mathematical model which proves to be most appropriate is provided by the theory of graphs. New measures for fault recovery are developed, and the values of the elements of the fault recovery vector are observed to depend not only on the computation graph H and the architecture graph G, but also on the specific location of a fault. In the examples, a hypercube is chosen as a representative parallel computer architecture, and a pipeline as a typical configuration for program execution. The dependability qualities of such a system are defined with or without a fault. These qualities are determined by the resiliency triple defined by three parameters: multiplicity, robustness, and configurability. Parameters for measuring the recovery effectiveness are also introduced in terms of distance, time, and the number of new, used, and moved nodes and edges.

  18. Animal biometrics: quantifying and detecting phenotypic appearance.

    PubMed

    Kühl, Hjalmar S; Burghardt, Tilo

    2013-07-01

    Animal biometrics is an emerging field that develops quantified approaches for representing and detecting the phenotypic appearance of species, individuals, behaviors, and morphological traits. It operates at the intersection between pattern recognition, ecology, and information sciences, producing computerized systems for phenotypic measurement and interpretation. Animal biometrics can benefit a wide range of disciplines, including biogeography, population ecology, and behavioral research. Currently, real-world applications are gaining momentum, augmenting the quantity and quality of ecological data collection and processing. However, to advance animal biometrics will require integration of methodologies among the scientific disciplines involved. Such efforts will be worthwhile because the great potential of this approach rests with the formal abstraction of phenomics, to create tractable interfaces between different organizational levels of life. PMID:23537688

  19. How to quantify conduits in wood?

    PubMed Central

    Scholz, Alexander; Klepsch, Matthias; Karimi, Zohreh; Jansen, Steven

    2013-01-01

    Vessels and tracheids represent the most important xylem cells with respect to long-distance water transport in plants. Wood anatomical studies frequently provide several quantitative details of these cells, such as vessel diameter, vessel density, vessel element length, and tracheid length, while important information on the three-dimensional structure of the hydraulic network is not considered. This paper aims to provide an overview of the various techniques, although there is no standard protocol to quantify conduits due to high anatomical variation and the wide range of techniques available. Despite recent progress in image analysis programs and automated methods for measuring cell dimensions, density, and spatial distribution, quantifying various characters remains time-consuming and tedious. Quantification of vessels and tracheids is not only important to better understand functional adaptations of tracheary elements to environmental parameters, but will also be essential for linking wood anatomy with other fields such as wood development, xylem physiology, palaeobotany, and dendrochronology. PMID:23507674

  20. Children's ambiguous understanding of weak and strong quantifiers

    Microsoft Academic Search

    Erik-Jan Smits; Tom Roeper; Bart Hollebrandse

    Despite suggestions in the literature that the semantics of many might be the key to understanding children's non-adult-like interpretations of quantified sentences (cf. Drozd 2001, Geurts 2003), experimental data on the acquisition of weak quantifiers like many are rare. This paper investigates children's comprehension of weak (many) versus strong (many of, all) quantifiers in English. In particular, by means of

  1. Quantifying the Behavioural Relevance of Hippocampal Neurogenesis

    PubMed Central

    Lazic, Stanley E.; Fuss, Johannes; Gass, Peter

    2014-01-01

    Few studies that examine the neurogenesis-behaviour relationship formally establish covariation between neurogenesis and behaviour or rule out competing explanations. The behavioural relevance of neurogenesis might therefore be overestimated if other mechanisms account for some, or even all, of the experimental effects. A systematic review of the literature was conducted and the data reanalysed using causal mediation analysis, which can estimate the behavioural contribution of new hippocampal neurons separately from other mechanisms that might be operating. Results from eleven eligible individual studies were then combined in a meta-analysis to increase precision (representing data from 215 animals) and showed that neurogenesis made a negligible contribution to behaviour (standardised effect = 0.15; 95% CI = -0.04 to 0.34; p = 0.128); other mechanisms accounted for the majority of experimental effects (standardised effect = 1.06; 95% CI = 0.74 to 1.38; p = 1.7×10^-11). PMID:25426717
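
    Causal mediation analysis in its simplest product-of-coefficients form can be sketched as two regressions; the data below are synthetic, not the meta-analysed studies:

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Treatment -> mediator (neurogenesis) -> behaviour, plus a direct path.
    rng = np.random.default_rng(3)
    n = 215
    treatment = rng.integers(0, 2, n).astype(float)
    neurogenesis = 0.2 * treatment + rng.normal(0, 1, n)                   # weak mediator path
    behaviour = 1.0 * treatment + 0.1 * neurogenesis + rng.normal(0, 1, n)

    a = sm.OLS(neurogenesis, sm.add_constant(treatment)).fit().params[1]
    fit_b = sm.OLS(behaviour, sm.add_constant(np.column_stack([treatment, neurogenesis]))).fit()
    direct, b = fit_b.params[1], fit_b.params[2]

    print(f"indirect effect via neurogenesis: {a * b:.3f}")   # analogous to the small estimate
    print(f"direct effect (other mechanisms): {direct:.3f}")  # analogous to the large estimate
    ```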

  2. Quantifiers More or Less Quantify On-Line: ERP Evidence for Partial Incremental Interpretation

    ERIC Educational Resources Information Center

    Urbach, Thomas P.; Kutas, Marta

    2010-01-01

    Event-related brain potentials were recorded during RSVP reading to test the hypothesis that quantifier expressions are incrementally interpreted fully and immediately. In sentences tapping general knowledge ("Farmers grow crops/worms as their primary source of income"), Experiment 1 found larger N400s for atypical ("worms") than typical objects…

  3. Quantifying copepod kinematics in a laboratory turbulence apparatus

    NASA Astrophysics Data System (ADS)

    Yen, J.; Rasberry, K. D.; Webster, D. R.

    We describe the application of a new apparatus that permits simultaneous detailed observations of plankton behavior and turbulent velocities. We are able to acquire 3D trajectories amenable to statistical analyses for comparing copepod responses to well-quantified turbulence intensities that match those found in the coastal ocean environment. The turbulence consists of nearly isotropic and homogeneous velocity fluctuation statistics in the observation region. In the apparatus, three species of copepods, Acartia hudsonica, Temora longicornis, and Calanus finmarchicus, were exposed separately to stagnant water plus four sequentially increasing levels of turbulence intensity. Copepod kinematics were quantified via several measures, including transport speed, motility number, net-to-gross displacement ratio, number of escape events, and number of animals phototactically aggregating per minute. The results suggest that these copepods could control their position and movements at low turbulence intensity. At higher turbulence intensity, the copepods' movement was dominated by the water motion, although species-specific modifications due to the size and swimming mode of the copepod influenced the results. Several trends support a dome-shaped variation of copepod kinematics with increasing turbulence. These species-specific trends and threshold quantities provide a data set for future comparative analyses of copepod responses to turbulence of varying duration as well as intensity.

  4. A revised metric for quantifying body shape in vertebrates.

    PubMed

    Collar, David C; Reynaga, Crystal M; Ward, Andrea B; Mehta, Rita S

    2013-08-01

    Vertebrates exhibit tremendous diversity in body shape, though quantifying this variation has been challenging. In the past, researchers have used simplified metrics that either describe overall shape but reveal little about its anatomical basis or that characterize only a subset of the morphological features that contribute to shape variation. Here, we present a revised metric of body shape, the vertebrate shape index (VSI), which combines the four primary morphological components that lead to shape diversity in vertebrates: head shape, length of the second major body axis (depth or width), and shape of the precaudal and caudal regions of the vertebral column. We illustrate the usefulness of VSI on a data set of 194 species, primarily representing five major vertebrate clades: Actinopterygii, Lissamphibia, Squamata, Aves, and Mammalia. We quantify VSI diversity within each of these clades and, in the course of doing so, show how measurements of the morphological components of VSI can be obtained from radiographs, articulated skeletons, and cleared and stained specimens. We also demonstrate that head shape, secondary body axis, and vertebral characteristics are important independent contributors to body shape diversity, though their importance varies across vertebrate groups. Finally, we present a functional application of VSI to test a hypothesized relationship between body shape and the degree of axial bending associated with locomotor modes in ray-finned fishes. Altogether, our study highlights the promise VSI holds for identifying the morphological variation underlying body shape diversity as well as the selective factors driving shape evolution. PMID:23746908

  5. SANTA: Quantifying the Functional Content of Molecular Networks

    PubMed Central

    Cornish, Alex J.; Markowetz, Florian

    2014-01-01

    Linking networks of molecular interactions to cellular functions and phenotypes is a key goal in systems biology. Here, we adapt concepts of spatial statistics to assess the functional content of molecular networks. Based on the guilt-by-association principle, our approach (called SANTA) quantifies the strength of association between a gene set and a network, and functionally annotates molecular networks like other enrichment methods annotate lists of genes. As a general association measure, SANTA can (i) functionally annotate experimentally derived networks using a collection of curated gene sets and (ii) annotate experimentally derived gene sets using a collection of curated networks, as well as (iii) prioritize genes for follow-up analyses. We exemplify the efficacy of SANTA in several case studies using the S. cerevisiae genetic interaction network and genome-wide RNAi screens in cancer cell lines. Our theory, simulations, and applications show that SANTA provides a principled statistical way to quantify the association between molecular networks and cellular functions and phenotypes. SANTA is available from http://bioconductor.org/packages/release/bioc/html/SANTA.html. PMID:25210953

  6. Using multiscale norms to quantify mixing and transport

    NASA Astrophysics Data System (ADS)

    Thiffeault, Jean-Luc

    2012-02-01

    Mixing is relevant to many areas of science and engineering, including the pharmaceutical and food industries, oceanography, atmospheric sciences and civil engineering. In all these situations one goal is to quantify and often then to improve the degree of homogenization of a substance being stirred, referred to as a passive scalar or tracer. A classical measure of mixing is the variance of the concentration of the scalar, which is the L2 norm of a mean-zero concentration field. Recently, other norms have been used to quantify mixing, in particular the mix-norm as well as negative Sobolev norms. These norms have the advantage that unlike variance they decay even in the absence of diffusion, and their decay corresponds to the flow being mixing in the sense of ergodic theory. General Sobolev norms weigh scalar gradients differently, and are known as multiscale norms for mixing. We review the applications of such norms to mixing and transport, and show how they can be used to optimize the stirring and mixing of a decaying passive scalar. We then review recent work on the less-studied case of a continuously replenished scalar field—the source-sink problem. In that case the flows that optimally reduce the norms are associated with transport rather than mixing: they push sources onto sinks, and vice versa.
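
    A minimal sketch of a Sobolev-norm computation for a mean-zero periodic 2-D scalar field: stirring variance to small scales lowers a negative-index norm (here s = -1/2, the mix-norm) while leaving the L2 norm (s = 0) unchanged:

    ```python
    import numpy as np

    def sobolev_norm(theta, s):
        """H^s norm of a mean-zero periodic field on [0, 2*pi)^2 via FFT:
        sqrt(sum_k |k|^(2s) |theta_hat_k|^2). Negative s damps small scales."""
        n = theta.shape[0]
        k = np.fft.fftfreq(n, d=1.0 / n)              # integer wavenumbers
        kx, ky = np.meshgrid(k, k, indexing="ij")
        k2 = kx**2 + ky**2
        k2[0, 0] = 1.0                                # dummy; the mean mode is zeroed
        that = np.fft.fft2(theta) / n**2
        that[0, 0] = 0.0
        return np.sqrt(np.sum(k2**s * np.abs(that)**2))

    n = 64
    x = np.linspace(0, 2 * np.pi, n, endpoint=False)
    X, _ = np.meshgrid(x, x, indexing="ij")
    for name, f in [("coarse cos(x)", np.cos(X)), ("fine cos(8x)", np.cos(8 * X))]:
        print(f"{name}: L2 = {sobolev_norm(f, 0):.3f}, mix-norm = {sobolev_norm(f, -0.5):.3f}")
    ```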

  7. A framework for quantifying net benefits of alternative prognostic models.

    PubMed

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-30

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. PMID:21905066

  8. Quantifying fiber formation in meat analogs under high moisture extrusion using image processing

    NASA Astrophysics Data System (ADS)

    Ranasinghesagara, J.; Hsieh, F.; Yao, G.

    2005-11-01

    High moisture extrusion using twin-screw extruders shows great promise for producing meat analog products from vegetable proteins. The resulting products have well-defined fiber formation and resemble real meat in both visual appearance and taste sensation. Developing reliable non-destructive techniques to quantify the textural properties of extrudates is important for quality control in the manufacturing process. In this study, we developed an image processing technique to automatically characterize sample fiber formation using digital imaging. The algorithm is based on a statistical analysis of the Hough transform. This objective method can serve as a standard for evaluating other non-invasive methods. We compared the fiber formation indices measured using this technique and a non-invasive fluorescence polarization method and obtained a high correlation.
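
    A hedged sketch of Hough-based fiber quantification: detect line orientations in an edge image and summarize how concentrated they are. The paper's actual index comes from its own statistical analysis of the Hough transform; the circular-variance summary below is only illustrative:

    ```python
    import numpy as np
    from skimage.feature import canny
    from skimage.transform import hough_line, hough_line_peaks

    def fiber_alignment_index(image):
        """1.0 = perfectly aligned line orientations, 0.0 = isotropic."""
        edges = canny(image)
        h, angles, dists = hough_line(edges)
        _, peak_angles, _ = hough_line_peaks(h, angles, dists, num_peaks=50)
        if len(peak_angles) == 0:
            return 0.0
        # Mean resultant length of doubled angles (axial data).
        return float(np.abs(np.mean(np.exp(2j * np.asarray(peak_angles)))))

    img = np.zeros((200, 200))
    img[::10, :] = 1.0                  # toy image: parallel horizontal 'fibers'
    print(f"alignment index: {fiber_alignment_index(img):.2f}")   # ~1.00
    ```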

  9. Quantifying and transferring contextual information in object detection.

    PubMed

    Zheng, Wei-Shi; Gong, Shaogang; Xiang, Tao

    2012-04-01

    Context is critical for reducing the uncertainty in object detection. However, context modeling is challenging because there are often many different types of contextual information coexisting with different degrees of relevance to the detection of target object(s) in different images. It is therefore crucial to devise a context model to automatically quantify and select the most effective contextual information for assisting in detecting the target object. Nevertheless, the diversity of contextual information means that learning a robust context model requires a larger training set than learning the target object appearance model, which may not be available in practice. In this work, a novel context modeling framework is proposed without the need for any prior scene segmentation or context annotation. We formulate a polar geometric context descriptor for representing multiple types of contextual information. In order to quantify context, we propose a new maximum margin context (MMC) model to evaluate and measure the usefulness of contextual information directly and explicitly through a discriminant context inference method. Furthermore, to address the problem of context learning with limited data, we exploit the idea of transfer learning based on the observation that although two categories of objects can have very different visual appearance, there can be similarity in their context and/or the way contextual information helps to distinguish target objects from nontarget objects. To that end, two novel context transfer learning models are proposed which utilize training samples from source object classes to improve the learning of the context model for a target object class based on a joint maximum margin learning framework. Experiments are carried out on PASCAL VOC2005 and VOC2007 data sets, a luggage detection data set extracted from the i-LIDS data set, and a vehicle detection data set extracted from outdoor surveillance footage. Our results validate the effectiveness of the proposed models for quantifying and transferring contextual information, and demonstrate that they outperform related alternative context models. PMID:21844619

  10. UV-vis spectra as an alternative to the Lowry method for quantifying hair damage induced by surfactants.

    PubMed

    Pires-Oliveira, Rafael; Joekes, Inés

    2014-11-01

    It is well known that long-term use of shampoo causes damage to human hair. Although the Lowry method has been widely used to quantify hair damage, it is unsuitable in the presence of some surfactants, and no other method has been proposed in the literature. In this work, a different method is used to investigate and compare the hair damage induced by water and by four types of surfactants (including three commercial-grade surfactants). Hair samples were immersed in aqueous surfactant solutions under conditions that resemble a shower (38 °C, constant shaking). These solutions became colored with time of contact with hair, and their UV-vis spectra were recorded. For comparison, the amounts of protein extracted from hair by sodium dodecyl sulfate (SDS) and by water were estimated by the Lowry method. Additionally, non-pigmented vs. pigmented hair as well as sepia melanin were used to understand the color and spectra of the washing solutions. The results presented herein show that hair degradation is mostly caused by the extraction of proteins, cuticle fragments and melanin granules from the hair fiber. The intensity of the solution color was found to vary with the charge density of the surfactants. Furthermore, the intensity of the solution color can be correlated with the amount of protein quantified by the Lowry method as well as with the degree of hair damage. UV-vis spectroscopy of hair washing solutions is thus a simple and straightforward method to quantify and compare the hair damage induced by different commercial surfactants. PMID:25277290
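
    The correlation step reported here is generic and easy to reproduce; the sketch below regresses Lowry protein values on washing-solution absorbance with SciPy. The wavelength choice and all numbers are placeholders, not the authors' data.

      import numpy as np
      from scipy import stats

      absorbance = np.array([0.08, 0.15, 0.21, 0.33, 0.40])   # hypothetical absorbance readings
      protein_mg = np.array([0.6, 1.1, 1.7, 2.6, 3.1])        # hypothetical Lowry results, mg/mL

      fit = stats.linregress(absorbance, protein_mg)
      print(f"slope={fit.slope:.2f}, r={fit.rvalue:.3f}")     # strength of the correlation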

  11. Quantifying transmission by stage of infection in the field: the example of SIV-1 and STLV-1 infecting mandrills.

    PubMed

    Roussel, Marion; Pontier, Dominique; Kazanji, Mirdad; Ngoubangoye, Barthélémy; Mahieux, Renaud; Verrier, Delphine; Fouchet, David

    2015-03-01

    The early stage of viral infection is often followed by an important increase in viral load and is generally considered the period of highest risk for pathogen transmission. Most methods quantifying the relative importance of the different stages of infection were developed for studies aimed at measuring HIV transmission in humans. However, they cannot be transposed to animal populations for which less information is available. Here we propose a general method to quantify, from field studies, the importance of the early and late stages of infection for micro-organism transmission. The method is based on a state-space dynamical model parameterized using Bayesian inference. It is illustrated by a 28-year dataset on mandrills infected by simian immunodeficiency virus type 1 (SIV-1) and simian T-cell lymphotropic virus type 1 (STLV-1). For both viruses we show that transmission is predominant during the early stage of infection (transmission ratio for SIV-1: 1.16 [0.0009; 18.15], and 9.92 [0.03; 83.8] for STLV-1). However, in terms of the basic reproductive number (R0), which quantifies the weight of both stages in the spread of the virus, the results suggest that the epidemics of SIV-1 and STLV-1 are mainly driven by late transmissions in this population. PMID:25296992
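
    The distinction between a high per-stage transmission ratio and a late-stage-driven R0 can be illustrated with back-of-the-envelope arithmetic: each stage contributes (rate x duration) to R0, so a short, intense early stage can still be outweighed by a long chronic stage. The rates and durations below are invented for illustration and are not the mandrill estimates.

      # Contribution of each infection stage to the basic reproductive number R0.
      beta_early, d_early = 2.0, 0.25     # high transmission rate, short stage (years)
      beta_late, d_late = 0.15, 10.0      # low rate, long chronic stage

      r0_early = beta_early * d_early     # 0.5 secondary cases from the early stage
      r0_late = beta_late * d_late        # 1.5 secondary cases from the late stage
      print(r0_early / r0_late)           # < 1: the late stage drives the epidemic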

  12. Quantifying fluvial bedrock erosion using repeat terrestrial Lidar

    NASA Astrophysics Data System (ADS)

    Cook, Kristen

    2013-04-01

    The Da'an River Gorge in western Taiwan provides a unique opportunity to observe the formation and evolution of a natural bedrock gorge. The 1.2 km long and up to 20 m deep gorge has formed since 1999 in response to uplift of the riverbed during the Chi-Chi earthquake. The extremely rapid pace of erosion enables us to observe both downcutting and channel widening over short time periods. We have monitored the evolution of the gorge since 2009 using repeat RTK GPS surveys and terrestrial Lidar scans. GPS surveys of the channel profile are conducted frequently, with 24 surveys to date, while Lidar scans are conducted after major floods, or after 5-9 months without a flood, for a total of 8 scans to date. The Lidar data are most useful for recording erosion of channel walls, which is quite episodic and highly variable along the channel. By quantifying the distribution of wall erosion in space and time, we can improve our understanding of channel widening processes and of the development of the channel planform, particularly the growth of bends. During the summer of 2012, the Da'an catchment experienced two large storm events, a meiyu (plum rain) event on June 10-13 that brought 800 mm of rain and a typhoon on August 1-3 that brought 650 mm of rain. The resulting floods had significant geomorphic effects on the Da'an gorge, including up to tens of meters of erosion in some sections of the gorge walls. We quantify these changes using Lidar surveys conducted on June 7, July 3, and August 30. Channel wall collapses also occur in the absence of large floods, and we use scans from August 23, 2011 and June 7, 2012 to quantify erosion during a period that included a number of small floods, but no large ones. This allows us to compare the impact of 9 months of normal conditions to the impact of short-duration extreme events. The observed variability of erosion in space and time highlights the need for 3D techniques such as terrestrial Lidar to properly quantify erosion in this setting.
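
    A common way to quantify wall retreat between repeat scans is a nearest-neighbour (cloud-to-cloud) distance between epochs; the sketch below uses a SciPy k-d tree and is a deliberate simplification of production change-detection workflows such as M3C2. The point arrays are placeholders.

      import numpy as np
      from scipy.spatial import cKDTree

      def cloud_to_cloud_distance(scan_before, scan_after):
          """Distance from each point of the earlier scan to the nearest later point."""
          tree = cKDTree(scan_after)
          dist, _ = tree.query(scan_before, k=1)
          return dist                                    # metres of apparent change

      before = np.random.rand(10_000, 3) * 20            # placeholder 20 m wall section
      after = before + np.array([0.05, 0.0, 0.0])        # uniform 5 cm retreat, for testing
      print(cloud_to_cloud_distance(before, after).mean())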

  13. Quantifying GCM uncertainty for estimating storage requirements in Australian reservoir

    NASA Astrophysics Data System (ADS)

    Woldemeskel, Fitsum; Sharma, Ashish; Sivakumar, Bellie; Mehrotra, Raj

    2014-05-01

    Climate change is anticipated to have enormous impacts on our water resources. Whether or not the existing storage capacity of reservoirs is sufficient to meet future water demands is a question of great interest to water managers and policy makers. Amongst other things, uncertainties in GCM projections make accurate estimation of future water availability and reservoir storage requirements extremely complicated. The present study proposes a new method to quantify GCM uncertainties and their influence on the estimation of reservoir storage. Reliable quantification of GCM uncertainties requires many ensemble runs for each model and emission scenario. However, the climate modeling groups around the world produce only a few ensemble runs for each scenario. Using this limited number of ensemble runs, this study presents a method to quantify GCM uncertainty that varies with space and time as a function of the GCM being assessed. Then, using the GCM projections and the estimated associated uncertainty, new data series are generated under an additive error model and used to ascertain the effects of GCM uncertainties in impact assessment studies. The analysis involves the following important steps: First, standard errors of bias-corrected GCM projections are estimated using multiple model, scenario and ensemble runs, conditional on each percentile. Second, assuming an additive error model, several realizations are generated by randomly sampling from a normal distribution. Finally, the generated realizations are applied to evaluate the impacts of climate change on reservoir storage estimation and to establish its associated uncertainty. The proposed method is applied to quantify uncertainties in rainfall and temperature projections obtained from six GCMs, three emission scenarios and three ensemble runs after correcting biases using the Nested Bias Correction (NBC). Then, thousands of rainfall and temperature realizations are generated using an additive error model for a selected GCM and scenario projection. The temperature data are used to estimate evaporation realizations, which are then used as input (together with rainfall) to a rainfall-runoff model for estimating streamflow. Finally, the streamflow realizations are used to quantify reservoir storage requirements with their associated uncertainties using behavior analysis. Results at the Warragamba dam in Australia reveal that GCM uncertainties are significantly larger for the future period than for the historical period for both rainfall and temperature at different demand levels. Further, comparison of the effects of rainfall and evaporation uncertainty suggests that reservoir storage uncertainty derives mainly from rainfall rather than evaporation.
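
    The additive error model at the core of the method can be sketched directly: given a bias-corrected projection and a standard error, realizations are drawn as projection + N(0, se). The percentile-conditional error estimation is omitted, and array shapes and numbers are illustrative.

      import numpy as np

      rng = np.random.default_rng(42)
      projection = np.array([80.0, 95.0, 110.0, 150.0])   # bias-corrected rainfall, mm
      std_error = np.array([8.0, 10.0, 14.0, 25.0])       # estimated GCM standard error

      n_realizations = 1000
      noise = rng.normal(0.0, std_error, size=(n_realizations, projection.size))
      realizations = projection + noise                   # inputs to the rainfall-runoff model
      print(realizations.mean(axis=0), realizations.std(axis=0))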

  14. Using Insar to Quantify Seasonal Fluctuations in Landslide Velocity, Eel River, Northern California

    NASA Astrophysics Data System (ADS)

    Handwerger, A. L.; Roering, J. J.; Schmidt, D. A.

    2011-12-01

    Large, slow-moving, deep-seated landslides are hydrologically driven and respond to precipitation over seasonal time scales. Precipitation causes changes in pore pressure, which alters effective stress and landslide velocity. Here, we use InSAR to quantify changes in landslide velocity for 32 landslides between February 2007 and January 2011 in the Eel River catchment, northern California. We investigate relationships between lithology and landslide properties (including aspect ratio, planform area, depth) and landslide dynamics. The time series behavior of each landslide was calculated by performing an inversion of small-baseline interferograms. We produced 165 differential interferograms with a minimum satellite return interval of 46 days using ALOS PALSAR data from tracks 223 and 224 with the ROI_PAC processing package. Climatic data and geologic maps were provided by NOAA and the California State Geological Survey, respectively. For each landslide we analyzed the planform area, depth, slope, and drainage area using DEMs derived from LiDAR and SRTM data. To quantify the resolution of our time series methodology, we performed a sensitivity analysis using a synthetic data set to determine the minimum detectable temporal signal given the temporal distribution of interferograms. This analysis shows that the temporal sampling of the data set is sufficient to resolve a seasonal signal with a wavelength of ~1 year, which is consistent with the expected seasonal response time of these landslides. Preliminary results show that holding lithology and climate constant, landslides move continuously through the year, accelerating well into the wet season and decelerating during the dry season with a lag time of weeks to months. The 32 identified landslides move at line-of-sight rates ranging from 0.1 m/yr to 0.45 m/yr, and have dimensions ranging from 0.5 to 5 km long and 0.27 to 3 km wide. Each landslide has distinct kinematic zones (e.g. source, transport, toe) that exhibit different seasonal behaviors; the largest seasonal response occurs in the source and toe zones for the largest landslides. Landslide size (i.e. planform area) also appears to influence temporal changes in velocity as smaller landslides respond to precipitation before larger landslides. Because landslide area scales with depth, this implies that the depth to the shear zone modulates landslide response time. Future work is aimed at quantifying relationships between topography (e.g. drainage area, slope, convergent vs. divergent drainage) and landslide velocity. We hypothesize that landslides with large upslope convergent zones receive high recharge. As a result, they experience shorter response times than landslides that lie within planar or divergent areas.
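
    In its simplest form, the small-baseline time-series inversion mentioned above is a linear least-squares problem: each interferogram measures the displacement difference between two acquisition dates, and incremental displacements are solved from the resulting design matrix. The network below is a toy example, not the ALOS PALSAR data set.

      import numpy as np

      dates = [0, 46, 92, 138]                       # acquisition days (46-day revisit)
      pairs = [(0, 1), (1, 2), (0, 2), (2, 3)]       # interferogram date-index pairs
      obs = np.array([1.2, 1.5, 2.7, 1.1])           # LOS change per interferogram, cm

      # Design matrix: each row sums the incremental displacements it spans.
      n_inc = len(dates) - 1
      A = np.zeros((len(pairs), n_inc))
      for row, (i, j) in enumerate(pairs):
          A[row, i:j] = 1.0

      increments, *_ = np.linalg.lstsq(A, obs, rcond=None)
      displacement = np.concatenate([[0.0], np.cumsum(increments)])
      print(displacement)                            # cumulative LOS displacement per date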

  15. Land cover change and remote sensing: Examples of quantifying spatiotemporal dynamics in tropical forests

    SciTech Connect

    Krummel, J.R.; Su, Haiping [Argonne National Lab., IL (United States); Fox, J. [East-West Center, Honolulu, HI (United States); Yarnasan, S.; Ekasingh, M. [Chiang Mai Univ. (Thailand)

    1995-06-01

    Research on human impacts or natural processes that operate over broad geographic areas must explicitly address issues of scale and spatial heterogeneity. While the tropical forests of Southeast Asia and Mexico have been occupied and used to meet human needs for thousands of years, traditional forest management systems are currently being transformed by rapid and far-reaching demographic, political, economic, and environmental changes. The dynamics of population growth, migration into the remaining frontiers, and responses to national and international market forces result in a demand for land to produce food and fiber. These results illustrate some of the mechanisms that drive current land use changes, especially in the tropical forest frontiers. By linking the outcome of individual land use decisions with measures of landscape fragmentation and change, the aggregated results show the hierarchy of temporal and spatial events that in summation result in global changes to the most complex and sensitive biome -- tropical forests. By quantifying the spatial and temporal patterns of tropical forest change, researchers can assist policy makers by showing how landscape systems in these tropical forests are controlled by physical, biological, social, and economic parameters.

  16. Methodology for quantifying the tonal prominence of frequency modulated tones

    NASA Astrophysics Data System (ADS)

    Lee, Kyoung Hoon; Davies, Patricia; Surprenant, Aimee

    2003-10-01

    The paper is focused on research conducted to develop a metric to quantify the tonal prominence of frequency modulated tones in noise. Frequency modulated tones are commonly encountered in machinery noise because of RPM variations caused by changing loads and poor control in timing and repeatability of combustion events in engines, particularly diesel engines. When tonal components are noticeable they increase annoyance; thus it is important to quantify both their strength and the resulting contribution to annoyance. In preliminary work on tones with randomly varying frequencies, it was found that it is possible to remove the trackable portion of the frequency variation and use established tonal prominence metrics (Aures' Tonality and Tone-to-Noise Ratio) on the modified signal to predict the perceived tonalness of the sounds (Lee, Hastings, Davies and Surprenant, Internoise 2003). The work presented is an extension of this earlier work: deterministic frequency variations are studied along with different levels of background noise. Modifications to the procedure to determine the strength of the tonal feature are described, as well as the issues that must be addressed before using this approach to analyze more complex machinery sounds. [Work sponsored by Caterpillar Inc.]
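
    The "remove the trackable frequency variation" step can be sketched with a Hilbert-transform instantaneous-frequency estimate: track the tone's frequency, then demodulate the signal to a constant frequency before applying a stationary tonality metric. This is a generic signal-processing sketch, not the authors' exact procedure; in practice the tracked frequency would be smoothed so only the slow, trackable wander is removed.

      import numpy as np
      from scipy.signal import hilbert

      fs = 8000
      t = np.arange(0, 2.0, 1 / fs)
      phase = 2 * np.pi * 500 * t + 5.0 * np.sin(2 * np.pi * 0.5 * t)  # slow FM wander
      tone = np.sin(phase)

      analytic = hilbert(tone)
      inst_phase = np.unwrap(np.angle(analytic))
      inst_freq = np.gradient(inst_phase, 1 / fs) / (2 * np.pi)        # tracked frequency, Hz

      # Demodulate: subtract the tracked phase so the tone sits at a fixed 500 Hz.
      demod = np.real(analytic * np.exp(-1j * (inst_phase - 2 * np.pi * 500 * t)))
      print(inst_freq.mean())                                          # ~500 Hz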

  17. Quantifying complexity in translational research: an integrated approach

    PubMed Central

    Munoz, David A.; Nembhard, Harriet Black; Kraschnewski, Jennifer L.

    2014-01-01

    Purpose This article quantifies complexity in translational research. The impact of major operational steps and technical requirements (TR) is calculated with respect to their ability to accelerate moving new discoveries into clinical practice. Design/Methodology/Approach A three-phase integrated Quality Function Deployment (QFD) and Analytic Hierarchy Process (AHP) method was used to quantify complexity in translational research. A case study in obesity was used to assess usability. Findings Generally, the evidence generated was valuable for understanding various components in translational research. In particular, we found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. Research limitations/implications As the method is mainly based on subjective opinion, some argue that the results may be biased. However, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. Practical implications The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Originality/value Current conceptual models in translational research provide little or no clue to assess complexity. The proposed method aims to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research. PMID:25417380
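
    The consistency ratio used here as a guard against subjectivity is standard AHP: the weights are the principal eigenvector of the pairwise-comparison matrix, and CR = (lambda_max - n)/(n - 1) divided by Saaty's random index. A minimal sketch follows; the comparison matrix is invented.

      import numpy as np

      # Pairwise comparisons of three criteria on Saaty's 1-9 scale (illustrative).
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()                       # priority vector

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
      ri = 0.58                                      # Saaty's random index for n = 3
      print(weights, ci / ri)                        # CR < 0.1 is conventionally acceptable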

  18. Quantifying Relative Diver Effects in Underwater Visual Censuses

    PubMed Central

    Dickens, Luke C.; Goatley, Christopher H. R.; Tanner, Jennifer K.; Bellwood, David R.

    2011-01-01

    Diver-based Underwater Visual Censuses (UVCs), particularly transect-based surveys, are key tools in the study of coral reef fish ecology. These techniques, however, have inherent problems that make it difficult to collect accurate numerical data. One of these problems is the diver effect (defined as the reaction of fish to a diver). Although widely recognised, its effects have yet to be quantified and the extent of taxonomic variation remains to be determined. We therefore examined relative diver effects on a reef fish assemblage on the Great Barrier Reef. Using common UVC methods, the recorded abundances of seven reef fish groups were significantly affected by the ongoing presence of SCUBA divers. Overall, the diver effect resulted in a 52% decrease in the mean number of individuals recorded, with declines of up to 70% in individual families. Although the diver effect appears to be a significant problem, UVCs remain a useful approach for quantifying spatial and temporal variation in relative fish abundances, especially if using methods that minimise the exposure of fishes to divers. Fixed distance transects using tapes or lines deployed by a second diver (or GPS-calibrated timed swims) would appear to maximise fish counts and minimise diver effects. PMID:21533039

  19. Quantifiable outcomes from corporate and higher education learning collaborations

    NASA Astrophysics Data System (ADS)

    Devine, Thomas G.

    The study investigated the existence of measurable learning outcomes that emerged out of the shared strengths of collaborating sponsors. The study identified quantifiable learning outcomes that confirm corporate, academic and learner participation in learning collaborations. Each of the three hypotheses and the synergy indicator quantitatively and qualitatively confirmed learning outcomes benefiting participants. The academic indicator quantitatively confirmed that learning outcomes attract learners to the institution. The corporate indicator confirmed that learning outcomes include knowledge exchange and enhanced workforce talents for careers in the energy-utility industry. The learner indicator confirmed that learning outcomes provide professional development opportunities for employment. The synergy indicator confirmed that best learning practices in learning collaborations emanate from the sponsors' shared strengths, and that partnerships can be elevated to strategic alliances, going beyond responding to the desires of sponsors to creating learner-centered cultures. It also confirmed the value of organizational processes that elevate sponsors' interactions to shared strengths, creating a learner-centered culture. The study's series of qualitative questions confirmed prior success factors, while verifying the hypothesis results and providing insight not available from quantitative data. The direct beneficiaries of the study are the energy-utility learning-collaboration participants, and the corporations, academic institutions, and learners of the collaboration. The indirect beneficiaries are the stakeholders of future learning collaborations, through improved knowledge of the existence or absence of quantifiable learning outcomes.

  20. Quantifying chaotic dynamics from integrate-and-fire processes.

    PubMed

    Pavlov, A N; Pavlova, O N; Mohammad, Y K; Kurths, J

    2015-01-01

    Characterizing chaotic dynamics from integrate-and-fire (IF) interspike intervals (ISIs) is relatively easily performed at high firing rates. When the firing rate is low, a correct estimation of the Lyapunov exponents (LEs) describing the dynamical features of complex oscillations reflected in IF ISI sequences becomes more complicated. In this work we discuss peculiarities and limitations of quantifying chaotic dynamics from IF point processes. We consider the main factors leading to underestimated LEs and demonstrate a way of improving the numerical determination of LEs from IF ISI sequences. We show that estimation of the two largest LEs can be performed using around 400 mean periods of chaotic oscillations in the regime of phase-coherent chaos. Application to real data is discussed. PMID:25637929
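
    The abstract's estimator is specialised to IF interspike intervals, but its core follows standard nearest-neighbour divergence methods (e.g. Rosenstein et al.): delay-embed the scalar series, find each point's nearest neighbour outside a Theiler window, and fit the slope of the mean log-divergence. The sketch below is that generic estimator, with all parameters illustrative; it does not include the low-firing-rate corrections the paper develops.

      import numpy as np
      from scipy.spatial import cKDTree

      def largest_le(x, dim=4, lag=1, horizon=30, theiler=20):
          """Rosenstein-style largest Lyapunov exponent from a scalar series x."""
          n = len(x) - (dim - 1) * lag
          emb = np.column_stack([x[i * lag:i * lag + n] for i in range(dim)])
          _, idx = cKDTree(emb).query(emb, k=theiler + 2)
          # Nearest neighbour outside the Theiler window (fall back to farthest found).
          nbr = np.array([next((j for j in row if abs(j - i) > theiler), row[-1])
                          for i, row in enumerate(idx)])
          base = np.arange(n - horizon)
          base = base[nbr[base] < n - horizon]           # pairs that can be followed forward
          log_div = [np.mean(np.log(np.linalg.norm(emb[base + k] - emb[nbr[base] + k],
                                                   axis=1) + 1e-12))
                     for k in range(horizon)]
          return np.polyfit(np.arange(horizon), log_div, 1)[0]   # nats per interval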

  1. Quantifying chaotic dynamics from integrate-and-fire processes

    NASA Astrophysics Data System (ADS)

    Pavlov, A. N.; Pavlova, O. N.; Mohammad, Y. K.; Kurths, J.

    2015-01-01

    Characterizing chaotic dynamics from integrate-and-fire (IF) interspike intervals (ISIs) is relatively easily performed at high firing rates. When the firing rate is low, a correct estimation of the Lyapunov exponents (LEs) describing the dynamical features of complex oscillations reflected in IF ISI sequences becomes more complicated. In this work we discuss peculiarities and limitations of quantifying chaotic dynamics from IF point processes. We consider the main factors leading to underestimated LEs and demonstrate a way of improving the numerical determination of LEs from IF ISI sequences. We show that estimation of the two largest LEs can be performed using around 400 mean periods of chaotic oscillations in the regime of phase-coherent chaos. Application to real data is discussed.

  2. Quantifying spin Hall angles from spin pumping: experiments and theory.

    PubMed

    Mosendz, O; Pearson, J E; Fradin, F Y; Bauer, G E W; Bader, S D; Hoffmann, A

    2010-01-29

    Spin Hall effects intermix spin and charge currents even in nonmagnetic materials and, therefore, ultimately may allow the use of spin transport without the need for ferromagnets. We show how spin Hall effects can be quantified by integrating Ni80Fe20|normal metal (N) bilayers into a coplanar waveguide. A dc spin current in N can be generated by spin pumping in a controllable way by ferromagnetic resonance. The transverse dc voltage detected along the Ni80Fe20|N bilayer has contributions from both the anisotropic magnetoresistance and the spin Hall effect, which can be distinguished by their symmetries. We developed a theory that accounts for both. In this way, we determine the spin Hall angle quantitatively for Pt, Au, and Mo. This approach can readily be adapted to any conducting material with even very small spin Hall angles. PMID:20366725
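
    The symmetry separation mentioned in the abstract is implemented in practice by fitting the measured dc voltage to a sum of symmetric and antisymmetric Lorentzians about the resonance field; the two fitted amplitudes isolate the two contributions. A generic curve-fit sketch with invented data follows (the geometry-dependent assignment of amplitudes to mechanisms is the paper's, not reproduced here).

      import numpy as np
      from scipy.optimize import curve_fit

      def lineshape(H, H0, dH, Vs, Va, offset):
          """Symmetric + antisymmetric Lorentzian about the resonance field H0."""
          sym = dH**2 / ((H - H0)**2 + dH**2)
          asym = dH * (H - H0) / ((H - H0)**2 + dH**2)
          return Vs * sym + Va * asym + offset

      H = np.linspace(800, 1200, 400)                       # field sweep, Oe (illustrative)
      rng = np.random.default_rng(1)
      data = lineshape(H, 1000, 30, 2.0, 0.8, 0.1) + rng.normal(0, 0.02, H.size)

      popt, _ = curve_fit(lineshape, H, data, p0=(990, 25, 1, 1, 0))
      print(popt[2], popt[3])                               # symmetric, antisymmetric amplitudes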

  3. Live Cell Interferometry Quantifies Dynamics of Biomass Partitioning during Cytokinesis

    PubMed Central

    Zangle, Thomas A.; Teitell, Michael A.; Reed, Jason

    2014-01-01

    The equal partitioning of cell mass between daughters is the usual and expected outcome of cytokinesis for self-renewing cells. However, most studies of partitioning during cell division have focused on daughter cell shape symmetry or segregation of chromosomes. Here, we use live cell interferometry (LCI) to quantify the partitioning of daughter cell mass during and following cytokinesis. We use adherent and non-adherent mouse fibroblast and mouse and human lymphocyte cell lines as models and show that, on average, mass asymmetries present at the time of cleavage furrow formation persist through cytokinesis. The addition of multiple cytoskeleton-disrupting agents leads to increased asymmetry in mass partitioning, which suggests the absence of active mass partitioning mechanisms after cleavage furrow positioning. PMID:25531652

  4. Use of short half-life cosmogenic isotopes to quantify sediment mixing and transport in karst conduits

    NASA Astrophysics Data System (ADS)

    Paylor, R.

    2011-12-01

    Particulate inorganic carbon (PIC) transport and flux in karst aquifers is poorly understood. Methods to quantify PIC flux are needed in order to account for total inorganic carbon removal (chemical plus mechanical) from karst settings. Quantifying PIC flux will allow more accurate calculations of landscape denudation and global carbon sink processes. The study concentrates on the critical processes of the suspended sediment component of mass flux - surface soil/stored sediment mixing, transport rates and distance, and sediment storage times. The primary objective of the study is to describe transport and mixing with the resolution of single storm-flow events. To quantify the transport processes, short half-life cosmogenic isotopes are utilized. The isotopes 7Be (t1/2 = 53 d) and 210Pb (t1/2 = 22 y) are the primary isotopes measured, and other potential isotopes such as 137Cs and 241Am are investigated. The study location is at Mammoth Cave National Park within the Logsdon River watershed. The Logsdon River conduit is continuously traversable underground for two kilometers. Background levels and input concentrations of isotopes are determined from soil samples taken at random locations in the catchment area, and suspended sediment collected from the primary sinking stream during a storm event. Suspended sediment was also collected from the downstream end of the conduit during the storm event. After the storm flow receded, fine sediment samples were taken from the cave stream at regular intervals to determine transport distances and mixing ratios along the conduit. Samples were analyzed with a Canberra Industries gamma ray spectrometer, counted for 24 hours to increase detection of low radionuclide activities. The measured activity levels of radionuclides in the samples were adjusted for decay from time of sampling using standard decay curves. The results of the study show that surface sediment mixing, transport and storage in karst conduits is a dynamic but potentially quantifiable process at the storm-event scale.
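
    The decay correction described at the end of the abstract is a one-line application of the decay law A0 = A * exp(ln2 * t / t_half); the sketch below applies it to the two primary isotopes, with activities and delays invented for illustration.

      import numpy as np

      def decay_correct(activity, days_elapsed, half_life_days):
          """Back-correct a measured activity to the time of sampling."""
          return activity * np.exp(np.log(2) * days_elapsed / half_life_days)

      # 7Be (t1/2 = 53 d) counted 30 days after sampling; 210Pb barely decays.
      print(decay_correct(12.4, 30, 53))         # illustrative activity at sampling time
      print(decay_correct(45.0, 30, 22 * 365))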

  5. Quantifying bank storage of variably saturated aquifers.

    PubMed

    Li, Hailong; Boufadel, Michel C; Weaver, James W

    2008-01-01

    Numerical simulations were conducted to quantify bank storage in a variably saturated, homogeneous, and anisotropic aquifer abutting a stream during rising stream stage. Seepage faces and bank slopes ranging from 1/3 to 100/3 were simulated. The initial conditions were assumed to be steady-state flow with water draining toward the stream. Then, the stream level rose at a constant rate to the specified elevation of the water table given by the landward boundary condition and stayed there until the system reached a new steady state. This represents a highly simplified version of a real-world hydrograph. For the specific examples considered, the following conclusions can be made. The volume of surface water entering the bank increased with the rate of stream level rise, became negligible when the rate of rise was slow, and approached a positive constant when the rate was large. Also, the volume decreased with the dimensionless parameter M (the product of the anisotropy ratio and the square of the domain's aspect ratio). When M was large (>10), bank storage was small because most pore space was initially saturated with ground water due to the presence of a significant seepage face. When M was small, the seepage face became insignificant and capillarity began to play a role. The weaker the capillary effect, the easier it was for surface water to enter the bank. The effect of the capillary forces on the volume of surface water entering the bank was significant and could not be neglected. PMID:18657116

  6. Quantifying the transmission potential of pandemic influenza

    NASA Astrophysics Data System (ADS)

    Chowell, Gerardo; Nishiura, Hiroshi

    2008-03-01

    This article reviews quantitative methods to estimate the basic reproduction number of pandemic influenza, a key threshold quantity that helps determine the intensity of interventions required to control the disease. Although it is difficult to assess the transmission potential of a probable future pandemic, historical epidemiologic data are readily available from previous pandemics, and as a reference quantity for future pandemic planning, mathematical and statistical analyses of historical data are crucial. In particular, because many historical records tend to document only the temporal distribution of cases or deaths (i.e. the epidemic curve), our review focuses on methods to maximize the utility of time-evolution data and to clarify the detailed mechanisms of the spread of influenza. First, we highlight structured epidemic models and their parameter estimation methods, which can quantify detailed disease dynamics including those we cannot observe directly. Duration-structured epidemic systems are subsequently presented, offering a firm understanding of the definition of the basic and effective reproduction numbers. When the initial growth phase of an epidemic is investigated, the distribution of the generation time is the key statistical information for appropriately estimating the transmission potential from the intrinsic growth rate. Applications of stochastic processes are also highlighted to estimate the transmission potential from similar data. Critically important characteristics of influenza data are subsequently summarized, followed by our conclusions suggesting potential future methodological improvements.
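
    For the initial growth phase, a standard estimator (Wallinga and Lipsitch) sets R0 = 1 / M(-r), the reciprocal of the generation-time moment-generating function evaluated at the intrinsic growth rate r; for a gamma-distributed generation time this has the closed form below. Parameter values are illustrative only.

      import numpy as np

      def r0_gamma(r, mean_gt, shape):
          """R0 from growth rate r and a gamma generation time (Wallinga-Lipsitch)."""
          scale = mean_gt / shape
          return (1.0 + r * scale) ** shape      # 1 / M(-r) for the gamma distribution

      r = np.log(2) / 3.0                        # cases doubling every 3 days
      print(r0_gamma(r, mean_gt=3.6, shape=4.0)) # influenza-like generation time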

  7. Fluorescence imaging to quantify crop residue cover

    NASA Technical Reports Server (NTRS)

    Daughtry, C. S. T.; Mcmurtrey, J. E., III; Chappelle, E. W.

    1994-01-01

    Crop residues, the portion of the crop left in the field after harvest, can be an important management factor in controlling soil erosion. Methods to quantify residue cover are needed that are rapid, accurate, and objective. Scenes with known amounts of crop residue were illuminated with long-wave ultraviolet (UV) radiation and fluorescence images were recorded with an intensified video camera fitted with a 453 to 488 nm band pass filter. A light colored soil and a dark colored soil were used as background for the weathered soybean stems. Residue cover was determined by counting the proportion of the pixels in the image with fluorescence values greater than a threshold. Soil pixels had the lowest gray levels in the images. The values of the soybean residue pixels spanned nearly the full range of the 8-bit video data. Classification accuracies typically were within 3 (absolute units) of measured cover values. Video imaging can provide an intuitive understanding of the fraction of the soil covered by residue.
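
    The cover calculation described above reduces to a threshold-and-count on the fluorescence image; a NumPy sketch follows, with the threshold and simulated gray levels as placeholders.

      import numpy as np

      def residue_cover(image, threshold=60):
          """Fraction of pixels whose fluorescence exceeds the soil background."""
          return float(np.mean(image > threshold))

      rng = np.random.default_rng(7)
      soil = rng.normal(30, 8, (480, 640))              # dark soil background, 8-bit levels
      image = soil.copy()
      image[:, :200] = rng.normal(150, 40, (480, 200))  # bright residue strip, ~31% of frame
      print(residue_cover(image))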

  8. Choosing appropriate techniques for quantifying groundwater recharge

    USGS Publications Warehouse

    Scanlon, B.R.; Healy, R.W.; Cook, P.G.

    2002-01-01

    Various techniques are available to quantify recharge; however, choosing appropriate techniques is often difficult. Important considerations in choosing a technique include space/time scales, range, and reliability of recharge estimates based on different techniques; other factors may limit the application of particular techniques. The goal of the recharge study is important because it may dictate the required space/time scales of the recharge estimates. Typical study goals include water-resource evaluation, which requires information on recharge over large spatial scales and on decadal time scales; and evaluation of aquifer vulnerability to contamination, which requires detailed information on spatial variability and preferential flow. The range of recharge rates that can be estimated using different approaches should be matched to expected recharge rates at a site. The reliability of recharge estimates using different techniques is variable. Techniques based on surface-water and unsaturated-zone data provide estimates of potential recharge, whereas those based on groundwater data generally provide estimates of actual recharge. Uncertainties in each approach to estimating recharge underscore the need for application of multiple techniques to increase reliability of recharge estimates.

  9. Quantifying Climate Risks for Urban Environments

    NASA Astrophysics Data System (ADS)

    Hayhoe, K.; Stoner, A. K.; Dickson, L.

    2013-12-01

    High-density urban areas are both uniquely vulnerable and uniquely able to adapt to climate change. Enabling this potential requires identifying the vulnerabilities, however, and these depend strongly on location: the challenges climate change poses for a southern coastal city such as Miami, for example, have little in common with those facing a northern inland city such as Chicago. By combining local knowledge with climate science, risk assessment, engineering analysis, and adaptation planning, it is possible to develop relevant climate information that feeds directly into vulnerability assessment and long-term planning. Key steps include: developing climate projections tagged to long-term weather stations within the city itself that reflect local characteristics; mining local knowledge to identify existing vulnerabilities to, and costs of, weather and climate extremes; understanding how future projections can be integrated into the planning process; and identifying ways in which the city may adapt. Using examples from our work in the cities of Boston, Chicago, and Mobile, we illustrate the practical application of this approach to quantify the impacts of climate change on these cities and identify robust adaptation options as diverse as reducing the urban heat island effect, protecting essential infrastructure, changing design standards and building codes, developing robust emergency management plans, and rebuilding storm sewer systems.

  10. Data Used in Quantified Reliability Models

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Kleinhammer, Roger K.; Kahn, C. J.

    2014-01-01

    Data is the crux of developing quantitative risk and reliability models; without data there is no quantification. Finding and identifying reliability data or failure numbers to quantify fault tree models during the conceptual and design phases is often the quagmire that precludes early decision makers' consideration of potential risk drivers that will influence design. The analyst tasked with assessing system or product reliability depends on the availability of data. But where does that data come from, and what does it really apply to? Commercial industries, government agencies, and other international sources might have data similar to what you are looking for. In general, internal and external technical reports and data based on similar and dissimilar equipment are often the first and only places checked. A common philosophy is "I have a number - that is good enough". But is it? Have you ever considered the difference in reported data from various federal datasets and technical reports when compared to similar sources from national and/or international datasets? Just how well does your data compare? Understanding how the reported data was derived, and interpreting the information and details associated with it, is as important as the data itself.

  11. Quantifying capital goods for waste landfilling.

    PubMed

    Brogaard, Line K; Stentsøe, Steen; Willumsen, Hans Christian; Christensen, Thomas H

    2013-06-01

    Materials and energy used for construction of a hill-type landfill of 4 million m(3) were quantified in detail. The landfill is engineered with a liner and leachate collection system, as well as a gas collection and control system. Gravel and clay were the most common materials used, amounting to approximately 260 kg per tonne of waste landfilled. The environmental burdens from the extraction and manufacturing of the materials used in the landfill, as well as from the construction of the landfill, were modelled as potential environmental impacts. For example, the potential impact on global warming was 2.5 kg carbon dioxide (CO2) equivalents or 0.32 milli person equivalents per tonne of waste. The potential impacts from the use of materials and construction of the landfill are low-to-insignificant compared with data reported in the literature on impact potentials of landfills in operation. The construction of the landfill is a significant contributor only to the impact of resource depletion, owing to the high use of gravel and steel. PMID:23535149

  12. Optical metabolic imaging quantifies heterogeneous cell populations

    PubMed Central

    Walsh, Alex J.; Skala, Melissa C.

    2015-01-01

    The genetic and phenotypic heterogeneity of cancers can contribute to tumor aggressiveness, invasion, and resistance to therapy. Fluorescence imaging occupies a unique niche to investigate tumor heterogeneity due to its high resolution and molecular specificity. Here, heterogeneous populations are identified and quantified by combined optical metabolic imaging and subpopulation analysis (OMI-SPA). OMI probes the fluorescence intensities and lifetimes of metabolic enzymes in cells to provide images of cellular metabolism, and SPA models cell populations as mixed Gaussian distributions to identify cell subpopulations. In this study, OMI-SPA is characterized by simulation experiments and validated with cell experiments. To generate heterogeneous populations, two breast cancer cell lines, SKBr3 and MDA-MB-231, were co-cultured at varying proportions. OMI-SPA correctly identifies two populations with minimal mean and proportion error using the optical redox ratio (fluorescence intensity of NAD(P)H divided by the intensity of FAD), mean NAD(P)H fluorescence lifetime, and OMI index. Simulation experiments characterized the relationships between sample size, data standard deviation, and subpopulation mean separation distance required for OMI-SPA to identify subpopulations. PMID:25780745
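
    The SPA step, modelling a measured OMI parameter as a mixture of Gaussians, can be sketched with scikit-learn; the two simulated components below stand in for co-cultured cell lines, and all numbers are invented rather than taken from the paper.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(3)
      # Hypothetical optical redox ratios for a 70:30 mix of two cell lines.
      values = np.concatenate([rng.normal(0.35, 0.05, 700),
                               rng.normal(0.55, 0.05, 300)]).reshape(-1, 1)

      gmm = GaussianMixture(n_components=2, random_state=0).fit(values)
      print(gmm.means_.ravel(), gmm.weights_)    # recovered subpopulation means, proportions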

  13. Quantifying traction stresses in adherent cells.

    PubMed

    Kraning-Rush, Casey M; Carey, Shawn P; Califano, Joseph P; Reinhart-King, Cynthia A

    2012-01-01

    Contractile force generation plays a critical role in cell adhesion, migration, and extracellular matrix reorganization in both 2D and 3D environments. Characterization of cellular forces has led to a greater understanding of cell migration, cellular mechanosensing, tissue formation, and disease progression. Methods to characterize cellular traction stresses now date back over 30 years, and they have matured from qualitative comparisons of cell-mediated substrate movements to high-resolution, highly quantitative measures of cellular force. Here, we will provide an overview of common methods used to measure forces in both 2D and 3D microenvironments. Specific focus will be placed on traction force microscopy, which measures the force exerted by cells on 2D planar substrates, and the use of confocal reflectance microscopy, which can be used to quantify collagen fibril compaction as a metric for 3D traction forces. In addition to providing experimental methods to analyze cellular forces, we discuss the application of these techniques to a large range of biomedical problems and some of the significant challenges that still remain in this field. PMID:22482948

  14. Automated Counting of Particles To Quantify Cleanliness

    NASA Technical Reports Server (NTRS)

    Rhode, James

    2005-01-01

    A machine vision system, similar to systems used in microbiological laboratories to count cultured microbes, has been proposed for quantifying the cleanliness of nominally precisely cleaned hardware by counting residual contaminant particles. The system would include a microscope equipped with an electronic camera and circuitry to digitize the camera output, a personal computer programmed with machine-vision and interface software, and digital storage media. A filter pad, through which had been aspirated solvent from rinsing the hardware in question, would be placed on the microscope stage. A high-resolution image of the filter pad would be recorded. The computer would analyze the image and present a histogram of sizes of particles on the filter. On the basis of the histogram and a measure of the desired level of cleanliness, the hardware would be accepted or rejected. If the hardware were accepted, the image would be saved, along with other information, as a quality record. If the hardware were rejected, the histogram and ancillary information would be recorded for analysis of trends. The software would perceive particles that are too large or too numerous to meet a specified particle-distribution profile. Anomalous particles or fibrous material would be flagged for inspection.
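
    The core of the proposed analysis, segmenting and size-binning particles on the filter image, can be sketched with SciPy's connected-component labelling. The threshold, bin edges and synthetic image below are illustrative, not part of the proposal.

      import numpy as np
      from scipy import ndimage

      def particle_histogram(image, threshold=128, bins=(1, 5, 25, 100, 1000)):
          """Label particles above threshold and histogram their pixel areas."""
          mask = image > threshold
          labels, n = ndimage.label(mask)
          areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
          return n, np.histogram(areas, bins=bins)[0]

      rng = np.random.default_rng(5)
      img = (rng.random((512, 512)) > 0.999).astype(float) * 255   # sparse bright specks
      count, hist = particle_histogram(img)
      print(count, hist)                    # compare against a cleanliness profile to accept/reject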

  15. Quantifying and Generalizing Hydrologic Responses to Dam Regulation using a Statistical Modeling Approach

    SciTech Connect

    McManamay, Ryan A [ORNL

    2014-01-01

    Despite the ubiquitous existence of dams within riverscapes, much of our knowledge about dams and their environmental effects remains context-specific. Hydrology, more than any other environmental variable, has been studied in great detail with regard to dam regulation. While much progress has been made in generalizing the hydrologic effects of regulation by large dams, many aspects of hydrology show site-specific fidelity to dam operations, small dams (including diversions), and regional hydrologic regimes. A statistical modeling framework is presented to quantify and generalize hydrologic responses to varying degrees of dam regulation. Specifically, the objectives were to 1) compare the effects of local versus cumulative dam regulation, 2) determine the importance of different regional hydrologic regimes in influencing hydrologic responses to dams, and 3) evaluate how different regulation contexts lead to error in predicting hydrologic responses to dams. Overall, model performance was poor in quantifying the magnitude of hydrologic responses, but performance was sufficient in classifying hydrologic responses as negative or positive. Responses of some hydrologic indices to dam regulation were highly dependent upon hydrologic class membership and the purpose of the dam. The opposing coefficients between local and cumulative-dam predictors suggested that hydrologic responses to cumulative dam regulation are complex, and that predicting the hydrology downstream of individual dams, as opposed to multiple dams, may be more easily accomplished using statistical approaches. Results also suggested that particular contexts, including multipurpose dams, high cumulative regulation by multiple dams, diversions, close proximity to dams, and certain hydrologic classes are all sources of increased error when predicting hydrologic responses to dams. Statistical models, such as the ones presented herein, show promise in their ability to model the effects of dam regulation at large spatial scales and to generalize the directionality of hydrologic responses.

  16. Quantifying collective attention from tweet stream.

    PubMed

    Sasahara, Kazutoshi; Hirata, Yoshito; Toyoda, Masashi; Kitsuregawa, Masaru; Aihara, Kazuyuki

    2013-01-01

    Online social media are increasingly facilitating our social interactions, thereby making available a massive "digital fossil" of human behavior. Discovering and quantifying distinct patterns using these data is important for studying social behavior, although the rapid time-variant nature and large volumes of these data make this task difficult and challenging. In this study, we focused on the emergence of "collective attention" on Twitter, a popular social networking service. We propose a simple method for detecting and measuring the collective attention evoked by various types of events. This method exploits the fact that tweeting activity exhibits a burst-like increase and an irregular oscillation when a particular real-world event occurs; otherwise, it follows regular circadian rhythms. The difference between regular and irregular states in the tweet stream was measured using the Jensen-Shannon divergence, which corresponds to the intensity of collective attention. We then associated irregular incidents with their corresponding events that attracted the attention and elicited responses from large numbers of people, based on the popularity and the enhancement of key terms in posted messages or "tweets." Next, we demonstrate the effectiveness of this method using a large dataset that contained approximately 490 million Japanese tweets by over 400,000 users, in which we identified 60 cases of collective attentions, including one related to the Tohoku-oki earthquake. "Retweet" networks were also investigated to understand collective attention in terms of social interactions. This simple method provides a retrospective summary of collective attention, thereby contributing to the fundamental understanding of social behavior in the digital era. PMID:23637913
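
    The divergence at the heart of this method is standard; SciPy exposes the Jensen-Shannon distance (the square root of the divergence), so the burst-intensity measure can be sketched as below with two hypothetical hourly tweet-rate histograms.

      import numpy as np
      from scipy.spatial.distance import jensenshannon

      regular = np.array([1, 1, 1, 1, 2, 3, 5, 7, 8, 8, 8, 8,
                          8, 8, 8, 8, 9, 10, 11, 12, 11, 8, 4, 2], float)
      burst = regular.copy()
      burst[14:17] = [60, 45, 30]                # sudden event-driven spike in activity

      p = regular / regular.sum()                # distributions over hours of the day
      q = burst / burst.sum()
      jsd = jensenshannon(p, q) ** 2             # Jensen-Shannon divergence (base e)
      print(jsd)                                 # larger = stronger collective attention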

  17. Quantifying the mixing due to bars

    NASA Astrophysics Data System (ADS)

    Sanchez-Blazquez, Patricia

    2015-03-01

    We will present star formation histories and the stellar and gaseous metallicity gradients in the disks of a sample of 50 face-on spiral galaxies with and without bars observed with the integral field unit spectrograph PMAS. The final aim is to quantify the redistribution of mass and angular momentum in galactic disks due to bars by comparing both the gas-phase and stellar metallicity gradients in the disks of barred and non-barred galaxies. Numerical simulations have shown that strong gravitational torques from non-axisymmetric components induce evolutionary processes such as redistribution of mass and angular momentum in galactic disks (Sellwood & Binney 2002), with consequent changes in chemical abundance profiles. If we hope to understand chemical abundance gradients and their evolution, we must understand the secular processes and re-arrangement of material driven by non-axisymmetric components, and vice versa. Furthermore, the re-arrangement of stellar disk material influences the interpretation of various critical observed metrics of Galaxy evolution, including the age-metallicity relation in the solar neighborhood and the local G-dwarf metallicity distribution. Perhaps the most obvious of these non-axisymmetric components are bars - at least 2/3 of spiral galaxies host a bar, and possibly all disk galaxies have hosted a bar at some point in their evolution. While observations have shown that barred galaxies have shallower gas-phase metallicity gradients than non-barred galaxies, a complementary analysis of the stellar abundance profiles has not yet been undertaken. This is unfortunate because the study of both gas and stars is important in providing a complete picture, as the two components undergo (and suffer from) very different evolutionary processes.

  18. Quantifying Collective Attention from Tweet Stream

    PubMed Central

    Sasahara, Kazutoshi; Hirata, Yoshito; Toyoda, Masashi; Kitsuregawa, Masaru; Aihara, Kazuyuki

    2013-01-01

    Online social media are increasingly facilitating our social interactions, thereby making available a massive “digital fossil” of human behavior. Discovering and quantifying distinct patterns using these data is important for studying social behavior, although the rapid time-variant nature and large volumes of these data make this task difficult and challenging. In this study, we focused on the emergence of “collective attention” on Twitter, a popular social networking service. We propose a simple method for detecting and measuring the collective attention evoked by various types of events. This method exploits the fact that tweeting activity exhibits a burst-like increase and an irregular oscillation when a particular real-world event occurs; otherwise, it follows regular circadian rhythms. The difference between regular and irregular states in the tweet stream was measured using the Jensen-Shannon divergence, which corresponds to the intensity of collective attention. We then associated irregular incidents with their corresponding events that attracted the attention and elicited responses from large numbers of people, based on the popularity and the enhancement of key terms in posted messages or “tweets.” Next, we demonstrate the effectiveness of this method using a large dataset that contained approximately 490 million Japanese tweets by over 400,000 users, in which we identified 60 cases of collective attentions, including one related to the Tohoku-oki earthquake. “Retweet” networks were also investigated to understand collective attention in terms of social interactions. This simple method provides a retrospective summary of collective attention, thereby contributing to the fundamental understanding of social behavior in the digital era. PMID:23637913

  19. Quantifying Missing Heritability at Known GWAS Loci

    PubMed Central

    Gusev, Alexander; Bhatia, Gaurav; Zaitlen, Noah; Vilhjalmsson, Bjarni J.; Diogo, Dorothée; Stahl, Eli A.; Gregersen, Peter K.; Worthington, Jane; Klareskog, Lars; Raychaudhuri, Soumya; Plenge, Robert M.; Pasaniuc, Bogdan; Price, Alkes L.

    2013-01-01

    Recent work has shown that much of the missing heritability of complex traits can be resolved by estimates of heritability explained by all genotyped SNPs. However, it is currently unknown how much heritability is missing due to poor tagging or additional causal variants at known GWAS loci. Here, we use variance components to quantify the heritability explained by all SNPs at known GWAS loci in nine diseases from WTCCC1 and WTCCC2. After accounting for expectation, we observed all SNPs at known GWAS loci to explain more heritability than GWAS-associated SNPs on average. For some diseases, this increase was individually significant: for Multiple Sclerosis (MS) and for Crohn's Disease (CD); all analyses of autoimmune diseases excluded the well-studied MHC region. Additionally, we found that GWAS loci from other related traits also explained significant heritability. The union of all autoimmune disease loci explained more MS heritability than known MS SNPs and more CD heritability than known CD SNPs, with an analogous increase for all autoimmune diseases analyzed. We also observed significant increases in an analysis of Rheumatoid Arthritis (RA) samples typed on ImmunoChip, with more heritability from all SNPs at GWAS loci and more heritability from all autoimmune disease loci compared to known RA SNPs (including those identified in this cohort). Our methods adjust for LD between SNPs, which can bias standard estimates of heritability from SNPs even if all causal variants are typed. By comparing adjusted estimates, we hypothesize that the genome-wide distribution of causal variants is enriched for low-frequency alleles, but that causal variants at known GWAS loci are skewed towards common alleles. These findings have important ramifications for fine-mapping study design and our understanding of complex disease architecture. PMID:24385918

  20. A new model for quantifying climate episodes

    NASA Astrophysics Data System (ADS)

    Biondi, Franco; Kozubowski, Tomasz J.; Panorska, Anna K.

    2005-07-01

    When long records of climate (precipitation, temperature, stream runoff, etc.) are available, either from instrumental observations or from proxy records, the objective evaluation and comparison of climatic episodes becomes necessary. Such episodes can be quantified in terms of duration (the number of time intervals, e.g. years, the process remains continuously above or below a reference level) and magnitude (the sum of all series values for a given duration). The joint distribution of duration and magnitude is represented here by a stochastic model called BEG, for bivariate distribution with exponential and geometric marginals. The model is based on the theory of random sums, and its mathematical derivation confirms and extends previous empirical findings. Probability statements that can be obtained from the model are illustrated by applying it to a 2300-year dendroclimatic reconstruction of water-year precipitation for the eastern Sierra Nevada-western Great Basin. Using the Dust Bowl drought period as an example, the chance of a longer or greater drought is 8%. Conditional probabilities are much higher, i.e. a drought of that magnitude has a 62% chance of lasting for 11 years or longer, and a drought that lasts 11 years has a 46% chance of having an equal or greater magnitude. In addition, because of the bivariate model, we can estimate a 6% chance of witnessing a drought that is both longer and greater. Additional examples of model application are also provided. This type of information provides a way to place any climatic episode in a temporal perspective, and such numerical statements help with reaching science-based management and policy decisions.
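
    The BEG construction, a geometric number of years each contributing an exponential magnitude, is straightforward to simulate, which gives an easy check on joint probability statements like those quoted for the Dust Bowl. The parameters below are illustrative, not the Sierra Nevada reconstruction values.

      import numpy as np

      rng = np.random.default_rng(11)

      def simulate_beg(p, mean_annual, size):
          """Draw (duration, magnitude) pairs: geometric N, sum of N exponentials."""
          duration = rng.geometric(p, size)                  # years continuously below the mean
          magnitude = np.array([rng.exponential(mean_annual, n).sum() for n in duration])
          return duration, magnitude

      dur, mag = simulate_beg(p=0.3, mean_annual=40.0, size=100_000)
      # Joint exceedance: chance an episode is both longer AND greater than a reference.
      print(np.mean((dur >= 11) & (mag >= 500.0)))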

  1. A method to quantify organic functional groups and inorganic compounds in ambient aerosols using attenuated total reflectance FTIR spectroscopy and multivariate chemometric techniques

    NASA Astrophysics Data System (ADS)

    Coury, Charity; Dillner, Ann M.

    An attenuated total reflectance-Fourier transform infrared (ATR-FTIR) spectroscopic technique and a multivariate calibration method were developed to quantify ambient aerosol organic functional groups and inorganic compounds. These methods were applied to size-resolved particulate matter samples collected in winter and summer of 2004 at three sites: a downtown Phoenix, Arizona location, a rural site near Phoenix, and an urban fringe site between the urban and rural site. Ten organic compound classes, including four classes which contain a carbonyl functional group, and three inorganic species were identified in the ambient samples. A partial least squares calibration was developed and applied to the ambient spectra, and 13 functional groups related to organic compounds (aliphatic and aromatic CH, methylene, methyl, alkene, aldehydes/ketones, carboxylic acids, esters/lactones, acid anhydrides, carbohydrate hydroxyl and ethers, amino acids, and amines) as well as ammonium sulfate and ammonium nitrate were quantified. Comparison of the sum of the mass measured by the ATR-FTIR technique and gravimetric mass indicates that this method can quantify nearly all of the aerosol mass on sub-micrometer size-segregated samples. Analysis of sample results shows that differences in organic functional group and inorganic compound concentrations at the three sampling sites can be measured with these methods. Future work will analyze the quantified data from these three sites in detail.
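
    The calibration step, partial least squares regression from ATR-FTIR spectra to functional-group amounts, can be sketched with scikit-learn; the spectra and reference values here are synthetic placeholders rather than the Phoenix samples.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(2)
      n_samples, n_wavenumbers = 60, 400
      spectra = rng.random((n_samples, n_wavenumbers))            # absorbance matrix
      true_weights = np.zeros(n_wavenumbers)
      true_weights[150:160] = 1.0                                 # one absorbing band
      mass = spectra @ true_weights + rng.normal(0, 0.05, n_samples)

      pls = PLSRegression(n_components=5).fit(spectra, mass)
      print(pls.score(spectra, mass))                             # calibration R^2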

  2. Prostatic ductal adenocarcinoma showing Bcl-2 expression.

    PubMed

    Tulunay, Ozden; Orhan, Diclehan; Baltaci, Sümer; Gögüş, Cagatay; Müftüoglu, Yusuf Z

    2004-09-01

    Prostatic ductal adenocarcinoma represents a rare histological variant of prostatic carcinoma with the features of a papillary lesion at cystoscopy. There is controversy regarding the existence, origin, staging, grading, treatment and clinical behavior of this tumor. The aim of the present study was to examine the expression of Bcl-2 and p53 in prostatic ductal adenocarcinoma and to evaluate its origin by analyzing prostate specific antigen, prostate specific acid phosphatase, cytokeratin, epithelial membrane antigen and carcinoembryonic antigen expression. The results confirmed the expression of prostate specific antigen and prostate specific acid phosphatase in prostatic ductal adenocarcinoma. The demonstrated Bcl-2 expression was predominant in the better-differentiated tumor. Bcl-2 expression appears not to be associated with neuroendocrine differentiation as assessed by chromogranin A reactivity. Thus, the first case of a prostatic ductal adenocarcinoma showing Bcl-2 expression is presented. The tumor was negative for p53. PMID:15379952

  3. Lemurs and macaques show similar numerical sensitivity.

    PubMed

    Jones, Sarah M; Pearson, John; DeWind, Nicholas K; Paulsen, David; Tenekedjieva, Ana-Maria; Brannon, Elizabeth M

    2014-05-01

    We investigated the precision of the approximate number system (ANS) in three lemur species (Lemur catta, Eulemur mongoz, and Eulemur macaco flavifrons), one Old World monkey species (Macaca mulatta) and humans (Homo sapiens). In Experiment 1, four individuals of each nonhuman primate species were trained to select the numerically larger of two visual arrays on a touchscreen. We estimated numerical acuity by modeling Weber fractions (w) and found quantitatively equivalent performance among all four nonhuman primate species. In Experiment 2, we tested adult humans in a similar procedure, and they outperformed the four nonhuman species but showed qualitatively similar performance. These results indicate that the ANS is conserved over the primate order. PMID:24068469
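
    Weber fractions of the kind modelled in Experiment 1 are conventionally obtained by fitting P(correct) = Phi((n2 - n1) / (w * sqrt(n1^2 + n2^2))) to the choice data; a maximum-likelihood sketch with simulated trials follows. All numbers are invented, and this is the standard ANS model rather than the authors' exact fitting code.

      import numpy as np
      from scipy.optimize import minimize_scalar
      from scipy.stats import norm

      rng = np.random.default_rng(9)
      n1 = rng.integers(4, 24, 2000)                   # numerosities of the two arrays
      n2 = n1 + rng.integers(1, 12, 2000)              # n2 is always the larger array
      true_w = 0.25
      p_correct = norm.cdf((n2 - n1) / (true_w * np.sqrt(n1**2 + n2**2)))
      correct = rng.random(2000) < p_correct           # simulated touchscreen choices

      def neg_log_lik(w):
          p = norm.cdf((n2 - n1) / (w * np.sqrt(n1**2 + n2**2)))
          p = np.clip(p, 1e-9, 1 - 1e-9)
          return -np.sum(np.where(correct, np.log(p), np.log(1 - p)))

      fit = minimize_scalar(neg_log_lik, bounds=(0.05, 1.5), method="bounded")
      print(fit.x)                                     # recovered Weber fraction, ~0.25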

  4. Everything, everywhere, all the time: quantifying the information gained from intensive hydrochemical sampling

    NASA Astrophysics Data System (ADS)

    Kirchner, J. W.; Neal, C.

    2011-12-01

    Catchment hydrochemical studies have suffered from a stark mismatch of measurement timescales: water fluxes are typically measured sub-hourly, but their chemical signatures are typically sampled only weekly or monthly. At the Plynlimon catchment in mid-Wales, however, precipitation and streamflow have now been sampled every seven hours for nearly two years, and analyzed for deuterium, oxygen-18, and more than 40 chemical species. This high-frequency sampling reveals temporal patterns that would be invisible in typical weekly monitoring samples. Furthermore, recent technological developments are now leading to systems that can provide measurements of rainfall and streamflow chemistry at hourly or sub-hourly intervals, similar to the time scales at which hydrometric data have long been available - and to provide these measurements for long spans of time, not just for intensive field campaigns associated with individual storms. But at what point will higher-frequency measurements become pointless, as additional measurements simply "connect the dots" between lower-frequency data points? Information Theory, dating back to the original work of Shannon and colleagues in the 1940s, provides mathematical tools for rigorously quantifying the information content of a time series. The key input data for such an analysis are the power spectrum of the measured data, and the power spectrum of the measurement noise. Here we apply these techniques to the high-frequency Plynlimon data set. The results show that, at least up to 7-hourly sampling frequency, the information content of the time series increases nearly linearly with the frequency of sampling. These results rigorously quantify what inspection of the time series visually suggests: these high-frequency data do not simply "connect the dots" between lower-frequency measurements, but instead contain a richly textured signature of dynamic behavior in catchment hydrochemistry.
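
    The information-theoretic calculation described here needs only the two spectra named in the abstract. A minimal sketch is below, using a Shannon-style rate: the integral of log2(1 + signal/noise) over the resolved frequencies. The 1/f signal spectrum and flat noise floor are assumptions for illustration, not the Plynlimon results.

        import numpy as np

        def information_rate(f, signal_psd, noise_psd):
            """Bits per unit time: integral of log2(1 + S/N) over frequency (uniform grid)."""
            df = f[1] - f[0]
            return np.sum(np.log2(1.0 + signal_psd / noise_psd)) * df

        # Frequencies resolved by sampling at interval dt reach the Nyquist limit 1/(2*dt).
        for dt_hours in (7, 24, 168):              # 7-hourly, daily, weekly sampling
            f = np.linspace(1e-5, 0.5 / dt_hours, 2000)
            signal_psd = 1.0 / f                   # assumed 1/f "pink" spectrum
            noise_psd = np.full_like(f, 0.05)      # assumed flat measurement noise
            print(dt_hours, round(information_rate(f, signal_psd, noise_psd), 1))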

  5. Quantifying the impact of metamorphic reactions on strain localization in the mantle

    NASA Astrophysics Data System (ADS)

    Huet, Benjamin; Yamato, Philippe

    2014-05-01

    Metamorphic reactions are most often considered as a passive record of changes in pressure, temperature and fluid conditions that rocks experience. In that way, they provide key constraints on the tectonic evolution of the crust and the mantle. However, natural examples show that metamorphism can also modify the strength of rocks and affect strain localization in ductile shear zones. Hence, metamorphic reactions play an active role in tectonics by inducing softening and/or hardening, depending on the reactions involved. Quantifying the mechanical effect of such metamorphic reactions is therefore a crucial task for determining both the strength distribution in the lithosphere and its evolution. However, estimating the effective strength of such polyphase rocks remains an open issue. Experimentally determined flow laws already exist for monophase aggregates and polyphase rocks of rheologically important materials. They provide good constraints on lithology-controlled lithospheric strength variations. Unfortunately, since the whole range of mineralogical and chemical rock compositions cannot be experimentally tested, the strength variations due to metamorphic reactions cannot be systematically and fully characterized. In order to tackle this issue, we here present the results of a study coupling thermodynamic and mechanical modeling that allows us to predict the mechanical impact of metamorphic reactions on the strength of the mantle. Thermodynamic modeling (using Theriak-Domino) is used to calculate the mineralogical composition of a typical peridotite as a function of pressure, temperature and water content. The calculated modes and flow-law parameters for monophase aggregates are then used as input to the Minimized Power Geometric model to predict the polyphase aggregate strength. Our results are then used to quantify the strength evolution of the mantle as a function of pressure, temperature and water content in two characteristic tectonic contexts, following the P-T evolutions undergone by the lithospheric mantle in subduction zones and rifts. The mechanical consequences of metamorphic reactions at convergent and divergent plate boundaries are finally discussed.
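
    The monophase flow laws mentioned above take a standard power-law creep form, strain rate = A * sigma^n * exp(-Q/RT), which can be inverted for the stress a given mineral sustains. The sketch below shows that inversion with generic olivine-like placeholder parameters; the values are not those used in the study, and the study's polyphase averaging (the Minimized Power Geometric model) is not reproduced here.

        import numpy as np

        R = 8.314                      # gas constant, J/mol/K
        A, n, Q = 1.1e5, 3.5, 530e3    # placeholder pre-factor (MPa^-n s^-1), stress exponent, activation energy (J/mol)

        def creep_strength(strain_rate, T):
            """Differential stress (MPa) sustaining a strain rate (1/s) at temperature T (K)."""
            return (strain_rate / (A * np.exp(-Q / (R * T)))) ** (1.0 / n)

        for T in (900, 1100, 1300):    # colder mantle is far stronger
            print(T, round(creep_strength(1e-14, T), 1))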

  6. Quantifying the Behavior of Stock Correlations Under Market Stress

    PubMed Central

    Preis, Tobias; Kenett, Dror Y.; Stanley, H. Eugene; Helbing, Dirk; Ben-Jacob, Eshel

    2012-01-01

    Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios. PMID:23082242
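
    The analysis described above reduces to two ingredients: the mean pairwise correlation of the constituent stocks in a time window, and the normalized index return of that window. A minimal sketch follows; the window length, binning and column layout are illustrative assumptions, not the authors' exact procedure.

        import numpy as np
        import pandas as pd

        def mean_pairwise_corr(returns: pd.DataFrame) -> float:
            """Average off-diagonal entry of the return correlation matrix."""
            c = returns.corr().to_numpy()
            n = c.shape[0]
            return (c.sum() - n) / (n * (n - 1))

        def stress_vs_correlation(prices: pd.DataFrame, index: pd.Series, window: int = 10) -> pd.DataFrame:
            """Mean pairwise correlation per window against normalized index returns."""
            rets = np.log(prices).diff().dropna()
            idx = np.log(index).diff().dropna()
            rows = [(idx.iloc[s:s + window].mean(), mean_pairwise_corr(rets.iloc[s:s + window]))
                    for s in range(0, len(rets) - window, window)]
            out = pd.DataFrame(rows, columns=["index_return", "mean_correlation"])
            # normalize index returns to zero mean, unit variance: the "market stress" axis
            out["stress"] = (out["index_return"] - out["index_return"].mean()) / out["index_return"].std()
            return out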

  7. Muscle ultrasound quantifies segmental neuromuscular outcome in pediatric myelomeningocele.

    PubMed

    Verbeek, Renate J; Hoving, Eelco W; Maurits, Natalia M; Brouwer, Oebele F; van der Hoeven, Johannes H; Sival, Deborah A

    2014-01-01

    In pediatric spina bifida aperta (SBA), non-invasive assessment of neuromuscular integrity by muscle ultrasound density (MUD) could provide important information about the clinical condition. We therefore aimed to determine the association between pediatric SBA MUD and segmental neurologic function. We included 23 children (age range: 1-18 y) with SBA with L4-5 lesions, and we associated SBA MUD with control values and segmental neuromuscular function. Results revealed that MUD outcomes in the lower extremities: (i) are independent of age, (ii) exceed control values, (iii) differ intra-individually (i.e., between the left and right sides in the same individual) in association with segmental neuromuscular function. We concluded that SBA leg MUD can quantify the segmental neuromuscular condition throughout childhood. PMID:24210858

  8. Quantifying light exposure patterns in young adult students

    PubMed Central

    Alvarez, Amanda A.; Wildsoet, Christine F.

    2014-01-01

    Exposure to bright light appears to be protective against myopia in both animals (chicks, monkeys) and children, but quantitative data on human light exposure are limited. In this study, we report on a technique for quantifying light exposure using wearable sensors. Twenty-seven young adult subjects wore a light sensor continuously for two weeks during one of three seasons, and also completed questionnaires about their visual activities. Light data were analyzed with respect to refractive error and season, and the objective sensor data were compared with subjects’ estimates of time spent indoors and outdoors. Subjects’ estimates of time spent indoors and outdoors were in poor agreement with durations reported by the sensor data. The results of questionnaire-based studies of light exposure should thus be interpreted with caution. The role of light in refractive error development should be investigated using multiple methods such as sensors to complement questionnaires. PMID:25342873

  9. Quantifying the Relationship Between Financial News and the Stock Market

    NASA Astrophysics Data System (ADS)

    Alanyali, Merve; Moat, Helen Susannah; Preis, Tobias

    2013-12-01

    The complex behavior of financial markets emerges from decisions made by many traders. Here, we exploit a large corpus of daily print issues of the Financial Times from 2nd January 2007 until 31st December 2012 to quantify the relationship between decisions taken in financial markets and developments in financial news. We find a positive correlation between the daily number of mentions of a company in the Financial Times and the daily transaction volume of a company's stock both on the day before the news is released, and on the same day as the news is released. Our results provide quantitative support for the suggestion that movements in financial markets and movements in financial news are intrinsically interlinked.
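
    The reported relationship is a pair of simple correlations between a daily mentions count and daily trading volume at two alignments. A minimal sketch, assuming the two series are already built and indexed by trading day:

        import pandas as pd
        from scipy.stats import pearsonr

        def news_volume_correlations(mentions: pd.Series, volume: pd.Series):
            """Pearson correlations: volume on the same day, and on the day before the news."""
            same = pd.concat([mentions, volume], axis=1, keys=["m", "v"]).dropna()
            prev = pd.concat([mentions, volume.shift(1)], axis=1, keys=["m", "v"]).dropna()
            return pearsonr(same["m"], same["v"]), pearsonr(prev["m"], prev["v"])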

  10. Quantifying bushfire penetration into urban areas in Australia

    NASA Astrophysics Data System (ADS)

    Chen, Keping; McAneney, John

    2004-06-01

    The extent and trajectory of bushfire penetration at the bushland-urban interface are quantified using data from major historical fires in Australia. We find that the maximum distance at which homes are destroyed is typically less than 700 m. The probability of home destruction emerges as a simple linear and decreasing function of distance from the bushland-urban boundary but with a variable slope that presumably depends upon fire regime and human intervention. The collective data suggest that the probability of home destruction at the forest edge is around 60%. Spatial patterns of destroyed homes display significant neighbourhood clustering. Our results provide revealing spatial evidence for estimating fire risk to properties and suggest an ember-attack model.
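
    The findings above amount to a one-line model: destruction probability of roughly 60% at the forest edge, decaying linearly to zero by about 700 m. The sketch below encodes exactly that; the single fixed slope is an illustrative choice, since the paper reports that the slope varies with fire regime and human intervention.

        def p_destruction(distance_m: float, p_edge: float = 0.6, max_reach_m: float = 700.0) -> float:
            """Probability a home is destroyed, by distance from the bushland-urban boundary."""
            return max(0.0, p_edge * (1.0 - distance_m / max_reach_m))

        print([round(p_destruction(d), 2) for d in (0, 175, 350, 700)])  # [0.6, 0.45, 0.3, 0.0]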

  11. An organotypic spinal cord slice culture model to quantify neurodegeneration.

    PubMed

    Ravikumar, Madhumitha; Jain, Seema; Miller, Robert H; Capadona, Jeffrey R; Selkirk, Stephen M

    2012-11-15

    Activated microglia cells have been implicated in the neurodegenerative process of Alzheimer's disease, Parkinson's disease, Huntington's disease, amyotrophic lateral sclerosis, and multiple sclerosis; however, the precise roles of microglia in disease progression are unclear. Despite these diseases having been described for more than a century, current FDA-approved therapeutics are symptomatic in nature, with little evidence supporting a neuroprotective effect. Furthermore, identifying novel therapeutics remains challenging due to undetermined etiology, a variable disease course, and the paucity of validated targets. Here, we describe the use of a novel ex vivo spinal cord culture system that offers the ability to screen potential neuroprotective agents while maintaining the complexity of the in vivo environment. To this end, we treated spinal cord slice cultures with lipopolysaccharide and quantified neuron viability in culture using measurements of axon length and FluoroJadeC intensity. To simulate a microglia-mediated response to cellular debris, antigens, or implanted materials/devices, we supplemented the culture media with increasing densities of microspheres, facilitating microglia-mediated phagocytosis of the particles, which demonstrated a direct correlation between the phagocytic activity of microglia and neuronal health. To validate our model's capacity to accurately depict neuroprotection, cultures were treated with resveratrol, which demonstrated enhanced neuronal health. Our results demonstrate the use of this model to reproducibly quantify the extent of neurodegeneration through the measurement of axon length and FluoroJadeC intensity, and we suggest this model will allow for accurate, high-throughput screening, which could expedite the translation of therapeutic agents to clinical trials. PMID:22975474

  12. Quantifying the Nonlinear, Anisotropic Material Response of Spinal Ligaments

    NASA Astrophysics Data System (ADS)

    Robertson, Daniel J.

    Spinal ligaments may be a significant source of chronic back pain, yet they are often disregarded by the clinical community due to a lack of information with regard to their material response and innervation characteristics. The purpose of this dissertation was to characterize the material response of spinal ligaments and to review their innervation characteristics. Review of relevant literature revealed that all of the major spinal ligaments are innervated. They cause painful sensations when irritated and provide reflexive control of the deep spinal musculature. As such, including the neurologic implications of iatrogenic ligament damage in the evaluation of surgical procedures aimed at relieving back pain will likely result in more effective long-term solutions. The material response of spinal ligaments has not previously been fully quantified due to limitations associated with standard soft tissue testing techniques. The present work presents and validates a novel testing methodology capable of overcoming these limitations. In particular, the anisotropic, inhomogeneous material constitutive properties of the human supraspinous ligament are quantified, and methods for determining the response of the other spinal ligaments are presented. In addition, a method for determining the anisotropic, inhomogeneous pre-strain distribution of the spinal ligaments is presented. The multi-axial pre-strain distributions of the human anterior longitudinal ligament, ligamentum flavum and supraspinous ligament were determined using this methodology. Results from this work clearly demonstrate that spinal ligaments are not uniaxial structures, and that finite element models which account for pre-strain and incorporate the ligaments' complex material properties may provide increased fidelity to the in vivo condition.

  13. Quantifying realized inbreeding in wild and captive animal populations.

    PubMed

    Knief, U; Hemmrich-Stanisak, G; Wittig, M; Franke, A; Griffith, S C; Kempenaers, B; Forstmeier, W

    2015-04-01

    Most molecular measures of inbreeding do not measure inbreeding at the scale that is most relevant for understanding inbreeding depression - namely the proportion of the genome that is identical-by-descent (IBD). The inbreeding coefficient FPed obtained from pedigrees is a valuable estimator of IBD, but pedigrees are not always available, and cannot capture inbreeding loops that reach back in time further than the pedigree. We here propose a molecular approach to quantify the realized proportion of the genome that is IBD (propIBD), and we apply this method to a wild and a captive population of zebra finches (Taeniopygia guttata). In each of 948 wild and 1057 captive individuals we analyzed available single-nucleotide polymorphism (SNP) data (260 SNPs) spread over four different genomic regions in each population. This allowed us to determine whether any of these four regions was completely homozygous within an individual, which indicates IBD with high confidence. In the highly nomadic wild population, we did not find a single case of IBD, implying that inbreeding must be extremely rare (propIBD=0-0.00094, 95% CI). In the captive population, a five-generation pedigree strongly underestimated the average amount of realized inbreeding (FPed=0.013). We propose this molecular approach for quantifying inbreeding at the individual or population level, and we show analytically that it can capture inbreeding loops that reach back up to a few hundred generations. PMID:25585923

  14. Quantifying the impacts of global disasters

    NASA Astrophysics Data System (ADS)

    Jones, L. M.; Ross, S.; Wilson, R. I.; Borrero, J. C.; Brosnan, D.; Bwarie, J. T.; Geist, E. L.; Hansen, R. A.; Johnson, L. A.; Kirby, S. H.; Long, K.; Lynett, P. J.; Miller, K. M.; Mortensen, C. E.; Perry, S. C.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Thio, H. K.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2012-12-01

    The US Geological Survey, National Oceanic and Atmospheric Administration, California Geological Survey, and other entities are developing a Tsunami Scenario, depicting a realistic outcome of a hypothetical but plausible large tsunami originating in the eastern Aleutian Arc, affecting the west coast of the United States, including Alaska and Hawaii. The scenario includes earth-science effects, damage and restoration of the built environment, and social and economic impacts. Like the earlier ShakeOut and ARkStorm disaster scenarios, the purpose of the Tsunami Scenario is to apply science to quantify the impacts of natural disasters in a way that can be used by decision makers in the affected sectors to reduce the potential for loss. Most natural disasters are local. A major hurricane can destroy a city or damage a long swath of coastline while mostly sparing inland areas. The largest earthquake on record caused strong shaking along 1500 km of Chile, but left the capital relatively unscathed. Previous scenarios have used the local nature of disasters to focus interaction with the user community. However, the capacity for global disasters is growing with the interdependency of the global economy. Earthquakes have disrupted global computer chip manufacturing and caused stock market downturns. Tsunamis, however, can be global in their extent and direct impact. Moreover, the vulnerability of seaports to tsunami damage can increase the global consequences. The Tsunami Scenario is trying to capture the widespread effects while maintaining the close interaction with users that has been one of the most successful features of the previous scenarios. The scenario tsunami occurs in the eastern Aleutians with a source similar to the 2011 Tohoku event. Geologic similarities support the argument that a Tohoku-like source is plausible in Alaska. It creates a major nearfield tsunami in the Aleutian arc and peninsula, a moderate tsunami in the US Pacific Northwest, large but not the maximum in Hawaii, and the largest plausible tsunami in southern California. To support the analysis of global impacts, we begin with the Ports of Los Angeles and Long Beach which account for >40% of the imports to the United States. We expand from there throughout California for the first level economic analysis. We are looking to work with Alaska and Hawaii, especially on similar economic issues in ports, over the next year and to expand the analysis to consideration of economic interactions between the regions.

  15. Quantifying Permafrost Characteristics with DCR-ERT

    NASA Astrophysics Data System (ADS)

    Schnabel, W.; Trochim, E.; Munk, J.; Kanevskiy, M. Z.; Shur, Y.; Fortier, R.

    2012-12-01

    Geophysical methods are efficient tools for quantifying permafrost characteristics for Arctic road design and engineering. In the Alaskan Arctic, construction and maintenance of roads requires integration of permafrost: ground that remains below 0 degrees C for two or more years. Features such as ice content and temperature are critical for understanding current and future ground conditions for planning, design and evaluation of engineering applications. This study focused on the proposed Foothills West Transportation Access project corridor, where the purpose is to construct a new all-season road connecting the Dalton Highway to Umiat. Four major areas were chosen that represented a range of conditions, including gravel bars, alluvial plains, tussock tundra (both unburned and burned conditions), high- and low-centered ice-wedge polygons and an active thermokarst feature. Direct-current resistivity using galvanic contact (DCR-ERT) was applied over transects. In conjunction, complementary site data were obtained, including boreholes, active layer depths, vegetation descriptions and site photographs. The boreholes provided information on soil morphology, ice texture and gravimetric moisture content. Horizontal and vertical resolutions in the DCR-ERT were varied to determine the presence or absence of ground ice, subsurface heterogeneity, and the depth to groundwater (if present). The four main DCR-ERT configurations used were: 84 electrodes with 2 m spacing; 42 electrodes with 0.5 m spacing; 42 electrodes with 2 m spacing; and 84 electrodes with 1 m spacing. In terms of identifying ground ice characteristics, the higher horizontal resolution DCR-ERT transects with either 42 or 84 electrodes and 0.5 or 1 m spacing were best able to differentiate wedge ice. This evaluation is based on a combination of both borehole stratigraphy and surface characteristics. Simulated apparent resistivity values for permafrost areas varied from a low of 4582 Ω·m to a high of 10034 Ω·m. Previous studies found permafrost conditions with corresponding resistivity values as low as 5000 Ω·m. This work emphasizes the necessity of tailoring the DCR-ERT survey to verified ground ice characteristics.

  16. Quantifying missing heritability at known GWAS loci.

    PubMed

    Gusev, Alexander; Bhatia, Gaurav; Zaitlen, Noah; Vilhjalmsson, Bjarni J; Diogo, Dorothée; Stahl, Eli A; Gregersen, Peter K; Worthington, Jane; Klareskog, Lars; Raychaudhuri, Soumya; Plenge, Robert M; Pasaniuc, Bogdan; Price, Alkes L

    2013-01-01

    Recent work has shown that much of the missing heritability of complex traits can be resolved by estimates of heritability explained by all genotyped SNPs. However, it is currently unknown how much heritability is missing due to poor tagging or additional causal variants at known GWAS loci. Here, we use variance components to quantify the heritability explained by all SNPs at known GWAS loci in nine diseases from WTCCC1 and WTCCC2. After accounting for expectation, we observed all SNPs at known GWAS loci to explain 1.29× more heritability than GWAS-associated SNPs on average (P = 3.3 × 10??). For some diseases, this increase was individually significant: 2.07× for Multiple Sclerosis (MS) (P = 6.5 × 10??) and 1.48× for Crohn's Disease (CD) (P = 1.3 × 10⁻³); all analyses of autoimmune diseases excluded the well-studied MHC region. Additionally, we found that GWAS loci from other related traits also explained significant heritability. The union of all autoimmune disease loci explained 7.15× more MS heritability than known MS SNPs (P < 1.0 × 10⁻¹?) and 2.20× more CD heritability than known CD SNPs (P = 6.1 × 10??), with an analogous increase for all autoimmune diseases analyzed. We also observed significant increases in an analysis of >20,000 Rheumatoid Arthritis (RA) samples typed on ImmunoChip, with 2.37× more heritability from all SNPs at GWAS loci (P = 2.3 × 10??) and 5.33× more heritability from all autoimmune disease loci (P < 1 × 10⁻¹?) compared to known RA SNPs (including those identified in this cohort). Our methods adjust for LD between SNPs, which can bias standard estimates of heritability from SNPs even if all causal variants are typed. By comparing adjusted estimates, we hypothesize that the genome-wide distribution of causal variants is enriched for low-frequency alleles, but that causal variants at known GWAS loci are skewed towards common alleles. These findings have important ramifications for fine-mapping study design and our understanding of complex disease architecture. PMID:24385918

  17. Tetrahydrobiopterin shows chaperone activity for tyrosine hydroxylase.

    PubMed

    Thöny, Beat; Calvo, Ana C; Scherer, Tanja; Svebak, Randi M; Haavik, Jan; Blau, Nenad; Martinez, Aurora

    2008-07-01

    Tyrosine hydroxylase (TH) is the rate-limiting enzyme in the synthesis of catecholamine neurotransmitters. Primary inherited defects in TH have been associated with l-DOPA responsive and non-responsive dystonia and infantile parkinsonism. In this study, we show that both the cofactor (6R)-l-erythro-5,6,7,8-tetrahydrobiopterin (BH(4)) and the feedback inhibitor and catecholamine product dopamine increase the kinetic stability of human TH isoform 1 in vitro. Activity measurements and synthesis of the enzyme by in vitro transcription-translation revealed a complex regulation by the cofactor, including both enzyme inactivation and conformational stabilization. Oral BH(4) supplementation to mice increased TH activity and protein levels in brain extracts, while the Th-mRNA level was not affected. Altogether, our results indicate that the molecular mechanisms for the stabilization are a primary folding-aid effect of BH(4) and a secondary effect through increased synthesis and binding of catecholamine ligands. Our results also establish that orally administered BH(4) crosses the blood-brain barrier, and therapeutic regimes based on BH(4) supplementation should thus consider the effect on TH. Furthermore, BH(4) supplementation arises as a putative therapeutic agent in the treatment of brain disorders associated with TH misfolding, such as the human TH isoform 1 mutation L205P. PMID:18419768

  18. Toward quantifying uncertainty in travel time tomography using the null-space shuttle

    E-print Network

    Utrecht, Universiteit

    Toward quantifying uncertainty in travel time tomography using the null-space shuttle. R. W. L. de …, J. Geophys. Res., 117, B03301, doi:10.1029/2011JB… The approach exploits the null-space of the forward operator. We show that with the null-space shuttle it is possible to assess

  19. Quantifying the degree of self-nestedness of trees.

    E-print Network

    Paris-Sud XI, Université de

    Quantifying the degree of self-nestedness of trees: application to the structural … This paper addresses the problem of approximating trees by trees with a particular self-nested structure. Self-nested trees are such that all their subtrees of a given height are isomorphic. We show that these trees present remarkable
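
    The definition in this fragment is concrete enough to encode directly: a tree is self-nested when all of its subtrees of the same height are pairwise isomorphic. A minimal sketch, representing unordered rooted trees as nested tuples (the canonical-form trick is a standard device, not necessarily the paper's algorithm):

        def canonical(tree):
            """Canonical form of an unordered rooted tree: sorted canonical children."""
            return tuple(sorted(canonical(c) for c in tree))

        def height(tree):
            return 1 + max((height(c) for c in tree), default=0)

        def subtrees(tree):
            yield tree
            for c in tree:
                yield from subtrees(c)

        def is_self_nested(tree):
            forms_by_height = {}
            for s in subtrees(tree):
                forms_by_height.setdefault(height(s), set()).add(canonical(s))
            return all(len(forms) == 1 for forms in forms_by_height.values())

        leaf = ()
        print(is_self_nested((leaf, (leaf,))))          # True: one subtree shape per height
        print(is_self_nested(((leaf, leaf), (leaf,))))  # False: two different height-2 subtrees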

  20. gene encoding enhanced green fluorescent protein to the repressor gene, and quantify

    E-print Network

    Weeks, Eric R.

    gene encoding enhanced green fluorescent protein to the repressor gene, and quantify … gene expression in the feedback network, compared with the control networks. They also show … concentrations of anhydrotetracycline, a chemical inhibitor of TetR. In past theoretical studies of gene

  1. A numerical analysis on the applicability of the water level fluctuation method for quantifying groundwater recharge

    NASA Astrophysics Data System (ADS)

    Koo, M.; Lee, D.

    2002-12-01

    The water table fluctuation (WTF) method is a conventional method for quantifying groundwater recharge by multiplying the specific yield by the water level rise. Based on the van Genuchten model, an analytical relationship between groundwater recharge and the water level rise is derived. The equation is used to analyze the effects of the depth to the water level and the soil properties on the recharge estimate obtained with the WTF method. The results show that the WTF method is reliable when applied to aquifers of fluvial sand provided the water table is below 1 m depth. However, if it is applied to silt loam with the water table depth ranging from 4 to 10 m, the recharge is overestimated by 30~80%, and the error increases drastically as the water table becomes shallower. A 2-D unconfined flow model with a time series of the recharge rate is developed. It is used to elucidate the errors of the WTF method, which is implicitly based on the tank model, in which horizontal flow in the saturated zone is ignored. Simulations show that the recharge estimated by the WTF method is underestimated for observation wells near the discharge boundary. This is because the hydraulic stress resulting from the recharge dissipates rapidly through horizontal flow near the discharge boundary. Simulations also reveal that the recharge is significantly underestimated with increasing hydraulic conductivity and recharge duration, and with decreasing specific yield.
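
    The WTF estimator evaluated here is itself one line: recharge equals the specific yield times the water-level rise, summed over rise events. A minimal sketch, with an intentionally simple rise-detection rule (any positive increment) and made-up readings:

        import numpy as np

        def wtf_recharge(levels_m, specific_yield):
            """Total recharge (m of water) from a water-table elevation series."""
            rises = np.diff(levels_m)
            return specific_yield * rises[rises > 0].sum()

        levels = np.array([10.00, 10.02, 10.15, 10.10, 10.30, 10.25])  # m above datum
        print(wtf_recharge(levels, specific_yield=0.15))               # 0.0525 m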

  2. Quantified energy dissipation rates in the terrestrial bow shock: 1. Analysis techniques and methodology

    NASA Astrophysics Data System (ADS)

    Wilson, L. B.; Sibeck, D. G.; Breneman, A. W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.

    2014-08-01

    We present a detailed outline and discussion of the analysis techniques used to compare the relevance of different energy dissipation mechanisms at collisionless shock waves. We show that the low-frequency, quasi-static fields contribute less to ohmic energy dissipation, (-j·E), than their high-frequency counterparts. In fact, we found that high-frequency, large-amplitude (>100 mV/m and/or >1 nT) waves are ubiquitous in the transition region of collisionless shocks. We quantitatively show that their fields, through wave-particle interactions, cause enough energy dissipation to regulate the global structure of collisionless shocks. The purpose of this paper, part one of two, is to outline and describe in detail the background, analysis techniques, and theoretical motivation for our new results presented in the companion paper. The companion paper presents the results of our quantitative energy dissipation rate estimates and discusses the implications. Together, the two manuscripts present the first study quantifying the contribution that high-frequency waves provide, through wave-particle interactions, to the total energy dissipation budget of collisionless shock waves.

  3. A new time quantifiable Monte Carlo method in simulating magnetization reversal process

    E-print Network

    X. Z. Cheng; M. B. A. Jalil; H. K. Lee; Y. Okabe

    2005-04-14

    We propose a new time quantifiable Monte Carlo (MC) method to simulate the thermally induced magnetization reversal for an isolated single domain particle system. The MC method involves the determination of density of states, and the use of Master equation for time evolution. We derive an analytical factor to convert MC steps into real time intervals. Unlike a previous time quantified MC method, our method is readily scalable to arbitrarily long time scales, and can be repeated for different temperatures with minimal computational effort. Based on the conversion factor, we are able to make a direct comparison between the results obtained from MC and Langevin dynamics methods, and find excellent agreement between them. An analytical formula for the magnetization reversal time is also derived, which agrees very well with both numerical Langevin and time-quantified MC results, over a large temperature range and for parallel and oblique easy axis orientations.

  4. Quantifying post-fire recovery of forest canopy structure and its environmental drivers using satellite image time-series

    NASA Astrophysics Data System (ADS)

    Khanal, Shiva; Duursma, Remko; Boer, Matthias

    2014-05-01

    Fire is a recurring disturbance in most of Australia's forests. Depending on fire severity, impacts on forest canopies vary from light scorching to complete defoliation, with related variation in the magnitude and duration of post-fire gas exchange by that canopy. Estimates of fire impacts on forest canopy structure and carbon uptake for south-eastern Australia's forests do not exist. Here, we use 8-day composite measurements of the fraction of Absorbed Photosynthetically Active Radiation (FPAR), as recorded by the Moderate-resolution Imaging Spectroradiometer (MODIS), to characterise forest canopies before and after fire and to compare burnt and unburnt sites. FPAR is a key biophysical canopy variable and a primary input for estimating Gross Primary Productivity (GPP). Post-fire FPAR loss was quantified for all forest areas burnt between 2001 and 2010, showing good agreement with independent assessments of fire severity patterns for the 2009 Black Saturday fires. A new method was developed to determine the duration of post-fire recovery from MODIS-FPAR time series. The method involves a spatial-mode principal component analysis on the full FPAR time series, followed by K-means clustering to group pixels based on similarity in temporal patterns. Using fire history data, time series of FPAR for burnt and unburnt pixels in each cluster were then compared to quantify the duration of the post-fire recovery period, which ranged from less than 1 to 8 years. The results show that time series of MODIS FPAR are well suited to detect and quantify disturbances of forest canopy structure and function in large areas of highly variable climate and phenology. Finally, the role of post-fire climate conditions and previous fire history in the duration of post-fire canopy recovery was examined using generalized additive models.
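
    The recovery-clustering step described above is a spatial-mode PCA of per-pixel FPAR time series followed by K-means on the leading components. A minimal sketch with random stand-in data; the array shapes, component count and cluster count are illustrative assumptions.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        n_pixels, n_steps = 5000, 460                  # ~10 years of 8-day composites
        fpar = rng.random((n_pixels, n_steps))         # rows: pixels; columns: FPAR in time

        scores = PCA(n_components=10).fit_transform(fpar)     # spatial-mode PCA
        labels = KMeans(n_clusters=8, n_init=10).fit_predict(scores)

        # Pixels sharing a label have similar temporal FPAR patterns; comparing mean
        # series of burnt vs. unburnt pixels within a cluster isolates recovery duration.
        print(np.bincount(labels))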

  5. Quantifying invasion pathways: fish introductions from the aquarium trade

    E-print Network

    Leung, Brian

    Quantifying invasion pathways: fish introductions from the aquarium trade. Erin Gertzen, Oriana … to identify invasion risk. We focused on fishes introduced via the aquarium trade. We quantified this pathway by (i) identifying and quantifying aquarium fishes sold, (ii) determining fish owner behavior and dis…

  6. Quantifying population genetic differentiation from Next-Generation Sequencing data

    E-print Network

    Nielsen, Rasmus

    Quantifying population genetic differentiation from Next-Generation Sequencing data. Matteo … Based on this idea, we propose a novel method for quantifying population genetic differentiation from next-generation sequencing data of populations sampled at low coverage. Introduction: Determining the level of genetic variation within

  7. Quantifying Non-Functional Requirements: A Process Oriented Approach

    Microsoft Academic Search

    Raquel L. Hill; Jun Wang; Klara Nahrstedt

    2004-01-01

    Abstract: In this work, we propose a framework for quantifying non-functional requirements (NFRs). This framework uses quality characteristics of the execution domain, application domain and component architectures to refine qualitative requirements into quantifiable ones. Conflicts are resolved during the refinement process, and more concrete non-functional requirements are produced.

  8. Quantifying Peptide Signal in MALDI-TOF Mass Spectrometry Data

    Microsoft Academic Search

    Timothy W. Randolph; Bree L. Mitchell; Dale F. McLerran; Paul D. Lampe; Ziding Feng

    2005-01-01

    This study addressed the question of which properties in MALDI-TOF spectra are relevant to the task of identifying the mass and abundance of a peptide species in human serum. Data of this type are common to biomarker studies, but significant within- and between-spectrum variabilities make quantifying biologically induced features difficult. We investigated this signal content and quantified the existence, or

  9. Quantifying Meteorite Impact Craters Individual Volume Data Sheet

    E-print Network

    Polly, David

    Quantifying Meteorite Impact Craters: Individual Volume Data Sheet. Experiment One (Volume) records, for each trial, the Drop Height (cm), Crater Diameter (cm), Crater Depth (cm), and observations concerning crater shape; the Large Sphere rows list Trials 1-3, each from a drop height of 150 cm. A companion Individual Speed Data Sheet follows.

  10. Visual Attention and Quantifier-Spreading in Heritage Russian Bilinguals

    ERIC Educational Resources Information Center

    Sekerina, Irina A.; Sauermann, Antje

    2015-01-01

    It is well established in language acquisition research that monolingual children and adult second language learners misinterpret sentences with the universal quantifier "every" and make quantifier-spreading errors that are attributed to a preference for a match in number between two sets of objects. The present Visual World eye-tracking…

  11. Shortcuts to Quantifier Interpretation in Children and Adults

    ERIC Educational Resources Information Center

    Brooks, Patricia J.; Sekerina, Irina

    2006-01-01

    Errors involving universal quantification are common in contexts depicting sets of individuals in partial, one-to-one correspondence. In this article, we explore whether quantifier-spreading errors are more common with distributive quantifiers each and every than with all. In Experiments 1 and 2, 96 children (5- to 9-year-olds) viewed pairs of…

  12. Quantifier elimination for real closed fields by cylindrical algebraic decomposition

    Microsoft Academic Search

    George E. Collins

    1975-01-01

    Tarski, in 1948 [18], published a quantifier elimination method for the elementary theory of real closed fields (which he had discovered in 1930). As noted by Tarski, any quantifier elimination method for this theory also provides a decision method, which enables one to decide whether any sentence of the theory is true or false. Since many important and difficult mathematical

  13. Randomization tests for quantifying species importance to ecosystem function

    E-print Network

    Gotelli, Nicholas J.

    … of species interactions and the recognition that certain 'keystone species' (sensu Paine 1969) or 'ecosystem …' Summary: 1. Quantifying the contribution of different species to ecosystem function is an important

  14. Quantifier elimination for formulas constrained by quadratic equations

    Microsoft Academic Search

    Hoon Hong

    1993-01-01

    An algorithm is given for constructing a quantifier-free formula (a boolean expression of polynomial equations and inequalities) equivalent to a given formula of the form (∃z ∈ R)[a₂z² + a₁z + a₀ = 0 ∧ F], where F is a quantifier-free formula in z₁, …, zₙ, z, and a₂, a₁, a₀ are polynomials in z₁, …, zₙ

  15. Quantifying the Extent of IPv6 Deployment Elliott Karpilovsky1

    E-print Network

    Singh, Jaswinder Pal

    Quantifying the Extent of IPv6 Deployment. Elliott Karpilovsky, Alexandre Gerber, Dan Pei … of IPv6 deployment is surprisingly limited. In fact, it is not even clear how we should quantify IPv6 deployment. In this paper, we collect and analyze a variety of data to characterize the penetration of IPv6

  16. Children's Knowledge of the Quantifier "Dou" in Mandarin Chinese

    ERIC Educational Resources Information Center

    Zhou, Peng; Crain, Stephen

    2011-01-01

    The quantifier "dou" (roughly corresponding to English "all") in Mandarin Chinese has been the topic of much discussion in the theoretical literature. This study investigated children's knowledge of this quantifier using a new methodological technique, which we dubbed the Question-Statement Task. Three questions were addressed: (i) whether young…

  17. Some Dichotomy Theorems on Constantfree Quantified Boolean Formulas

    E-print Network

    Dalmau, Victor

    Some Dichotomy Theorems on Constant-free Quantified Boolean Formulas. Víctor Dalmau. In this paper we study the satisfiability of constant-free quantified boolean formulas. We consider the following classes of quantified boolean formulas. Fix a finite set of basic boolean logical functions. Take

  18. Quantifying Digit Force Vector Coordination during Precision Pinch

    PubMed Central

    Marquardt, Tamara L.; Li, Zong-Ming

    2013-01-01

    A methodology was established to investigate the contact mechanics of the thumb and the index finger at the digit-object interface during precision pinch. Two force/torque transducers were incorporated into an apparatus designed to overcome the thickness of each transducer and provide a flexible pinch span for digit placement and force application. To demonstrate the utility of the device, five subjects completed a pinch task with the pulps of their thumb and index finger. Inter-digit force vector coordination was quantified by examining the 1) force vector component magnitudes, 2) resultant force vector magnitudes, 3) coordination angle – the angle formed by the resultant vectors of each digit, 4) direction angles – the angle formed by each vector and the coordinate axes, and 5) center of pressure locations. It was shown that the resultant force magnitude of the index finger exceeded that of the thumb by 0.8 ± 0.3 N and that the coordination angle between the digit resultant force vectors was 160.2 ± 4.6°. The experimental apparatus and analysis methods provide a valuable tool for the quantitative examination of biomechanics and motor control during dexterous manipulation. PMID:24443624
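
    The coordination angle defined above is the angle between two 3-D resultant force vectors, which follows directly from the dot product. A minimal sketch with made-up transducer readings:

        import numpy as np

        def coordination_angle(f_thumb, f_index):
            """Angle (degrees) between the thumb and index resultant force vectors."""
            cosang = np.dot(f_thumb, f_index) / (np.linalg.norm(f_thumb) * np.linalg.norm(f_index))
            return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

        thumb = np.array([0.5, -0.2, 3.1])    # N, illustrative values
        index = np.array([-0.4, 0.3, -3.6])   # N, roughly opposing the thumb
        print(round(coordination_angle(thumb, index), 1))  # near 180 deg for an ideal pinch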

  19. Quantifying higher-order correlations in a neuronal pool

    NASA Astrophysics Data System (ADS)

    Montangie, Lisandro; Montani, Fernando

    2015-03-01

    Recent experiments involving a relatively large population of neurons have shown a very significant amount of higher-order correlations. However, little is known of how these affect the integration and firing behavior of a population of neurons beyond the second order statistics. To investigate how higher-order inputs statistics can shape beyond pairwise spike correlations and affect information coding in the brain, we consider a neuronal pool where each neuron fires stochastically. We develop a simple mathematically tractable model that makes it feasible to account for higher-order spike correlations in a neuronal pool with highly interconnected common inputs beyond second order statistics. In our model, correlations between neurons appear from q-Gaussian inputs into threshold neurons. The approach constitutes the natural extension of the Dichotomized Gaussian model, where the inputs to the model are just Gaussian distributed and therefore have no input interactions beyond second order. We obtain an exact analytical expression for the joint distribution of firing, quantifying the degree of higher-order spike correlations, truly emphasizing the functional aspects of higher-order statistics, as we account for beyond second order inputs correlations seen by each neuron within the pool. We determine how higher-order correlations depend on the interaction structure of the input, showing that the joint distribution of firing is skewed as the parameter q increases inducing larger excursions of synchronized spikes. We show how input nonlinearities can shape higher-order correlations and enhance coding performance by neural populations.
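
    The Dichotomized Gaussian model that this work generalizes is easy to simulate: correlated Gaussian inputs thresholded into binary spikes. The sketch below samples the equicorrelated Gaussian baseline (the paper's contribution, replacing the inputs with q-Gaussians, is not reproduced); all parameters are illustrative.

        import numpy as np

        rng = np.random.default_rng(2)
        n_neurons, n_trials, rho, theta = 50, 100_000, 0.3, 1.0

        # Equicorrelated Gaussian inputs built from one shared and one private term.
        common = rng.standard_normal(n_trials)
        private = rng.standard_normal((n_trials, n_neurons))
        inputs = np.sqrt(rho) * common[:, None] + np.sqrt(1 - rho) * private

        spikes = (inputs > theta).astype(int)
        counts = spikes.sum(axis=1)              # population spike count per trial

        # A count variance far above the matched binomial variance signals
        # correlated, synchronized firing beyond independent neurons.
        p = spikes.mean()
        print(counts.var(), n_neurons * p * (1 - p))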

  20. 47 CFR 90.505 - Showing required.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...CONTINUED) SAFETY AND SPECIAL RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Developmental Operation § 90.505 Showing...showing that: (1) The applicant has an organized plan of development leading to a specific objective; (2) The actual...

  1. Planetarium Shows for K-12 School Groups

    E-print Network

    Berdichevsky, Victor

    Planetarium Shows for K-12 School Groups. The Wayne State University Planetarium offers instructional planetarium shows that typically consist of three parts: interactive demonstrations, current night sky … planetarium.wayne.edu

  2. New Hampshire Guide 4-H Dog Shows

    E-print Network

    New Hampshire, University of

    New Hampshire Guide to 4-H Dog Shows. UNH Cooperative Extension 4-H Youth Development, Moiles House … cooperating. Table of Contents: Introduction; Purpose of the 4-H Dog Project

  3. Quantifying Qualitative Data Using Cognitive Maps

    ERIC Educational Resources Information Center

    Scherp, Hans-Ake

    2013-01-01

    The aim of the article is to show how substantial qualitative material consisting of graphic cognitive maps can be analysed by using digital CmapTools, Excel and SPSS. Evidence is provided of how qualitative and quantitative methods can be combined in educational research by transforming qualitative data into quantitative data to facilitate…

  4. Quantifying Phycocyanin Concentration in Cyanobacterial Algal Blooms from Remote Sensing Reflectance-A Quasi Analytical Approach

    NASA Astrophysics Data System (ADS)

    Mishra, S.; Mishra, D. R.; Tucker, C.

    2011-12-01

    Cyanobacterial harmful algal blooms (CHABs) are notorious for depleting dissolved oxygen levels, producing various toxins, threatening aquatic life, and altering the food-web dynamics and overall ecosystem functioning of inland lakes, estuaries, and coastal waters. Many of these toxins can damage cells and tissues and even cause mortality of living organisms. Frequent monitoring of water quality at a synoptic scale has become possible by virtue of remote sensing techniques. In this research, we present a novel technique to monitor CHABs using remote sensing reflectance products. We have modified a multi-band quasi-analytical algorithm that determines phytoplankton absorption coefficients from above-surface remote sensing reflectance measurements using an inversion method. In situ hyperspectral remote sensing reflectance data were collected from several highly turbid and productive aquaculture ponds. A novel technique was developed to further decompose the phytoplankton absorption coefficient at 620 nm and obtain the phycocyanin absorption coefficient at the same wavelength. An empirical relationship was established between phycocyanin absorption coefficients at 620 nm and measured phycocyanin concentrations. Model calibration showed a strong relationship between phycocyanin absorption coefficients and phycocyanin pigment concentration (r2=0.94). Validation of the model on a separate dataset produced a root mean squared error of 167 mg m-3 (phycocyanin range: 26-1012 mg m-3). Results demonstrate that the new approach will be suitable for quantifying phycocyanin concentration in cyanobacteria-dominated turbid productive waters. The band architecture of the model matches the band configuration of the Medium Resolution Imaging Spectrometer (MERIS), ensuring that MERIS reflectance products can be used to quantify phycocyanin in cyanobacterial harmful algal blooms in optically complex waters.

  5. Quantifying Forearm Muscle Activity during Wrist and Finger Movements by Means of Multi-Channel Electromyography

    PubMed Central

    Gazzoni, Marco; Celadon, Nicolò; Mastrapasqua, Davide; Paleari, Marco; Margaria, Valentina; Ariano, Paolo

    2014-01-01

    The study of hand and finger movement is an important topic with applications in prosthetics, rehabilitation, and ergonomics. Surface electromyography (sEMG) is the gold standard for the analysis of muscle activation. Previous studies investigated the optimal electrode number and positioning on the forearm to obtain information representative of muscle activation and robust to movements. However, the sEMG spatial distribution on the forearm during hand and finger movements and its changes due to different hand positions has never been quantified. The aim of this work is to quantify 1) the spatial localization of surface EMG activity of distinct forearm muscles during dynamic free movements of wrist and single fingers and 2) the effect of hand position on sEMG activity distribution. The subjects performed cyclic dynamic tasks involving the wrist and the fingers. The wrist tasks and the hand opening/closing task were performed with the hand in prone and neutral positions. A sensorized glove was used for kinematics recording. sEMG signals were acquired from the forearm muscles using a grid of 112 electrodes integrated into a stretchable textile sleeve. The areas of sEMG activity have been identified by a segmentation technique after a data dimensionality reduction step based on Non Negative Matrix Factorization applied to the EMG envelopes. The results show that 1) it is possible to identify distinct areas of sEMG activity on the forearm for different fingers; 2) hand position influences sEMG activity level and spatial distribution. This work gives new quantitative information about sEMG activity distribution on the forearm in healthy subjects and provides a basis for future works on the identification of optimal electrode configuration for sEMG based control of prostheses, exoskeletons, or orthoses. An example of use of this information for the optimization of the detection system for the estimation of joint kinematics from sEMG is reported. PMID:25289669
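
    The dimensionality-reduction step named above (NMF on the EMG envelopes) can be sketched directly. The grid size matches the 112-electrode sleeve; everything else (envelope length, component count, init choice) is an illustrative assumption.

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(4)
        n_channels, n_samples = 112, 2000              # electrode grid x envelope samples
        envelopes = rng.random((n_channels, n_samples))

        nmf = NMF(n_components=6, init="nndsvda", max_iter=500)
        spatial_maps = nmf.fit_transform(envelopes)    # (112, 6): one map per activation area
        activations = nmf.components_                  # (6, 2000): time course of each area

        # Each spatial map can then be segmented to localize the forearm region
        # active during a given wrist or finger movement.
        print(spatial_maps.shape, activations.shape)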

  6. Utilizing novel diversity estimators to quantify multiple dimensions of microbial biodiversity across domains

    PubMed Central

    2013-01-01

    Background: Microbial ecologists often employ methods from classical community ecology to analyze microbial community diversity. However, these methods have limitations because microbial communities differ from macro-organismal communities in key ways. This study sought to quantify microbial diversity using methods that are better suited for data spanning multiple domains of life and dimensions of diversity. Diversity profiles are one novel, promising way to analyze microbial datasets. Diversity profiles encompass many other indices, provide effective numbers of diversity (mathematical generalizations of previous indices that better convey the magnitude of differences in diversity), and can incorporate taxa similarity information. To explore whether these profiles change interpretations of microbial datasets, diversity profiles were calculated for four microbial datasets from different environments spanning all domains of life as well as viruses. Both similarity-based profiles that incorporated phylogenetic relatedness and naïve (not similarity-based) profiles were calculated. Simulated datasets were used to examine the robustness of diversity profiles to varying phylogenetic topology and community composition. Results: Diversity profiles provided insights into microbial datasets that were not detectable with classical univariate diversity metrics. For all datasets analyzed, there were key distinctions between calculations that incorporated phylogenetic diversity as a measure of taxa similarity and naïve calculations. The profiles also provided information about the effects of rare species on diversity calculations. Additionally, diversity profiles were used to examine thousands of simulated microbial communities, showing that similarity-based and naïve diversity profiles only agreed approximately 50% of the time in their classification of which sample was most diverse. This is a strong argument for incorporating similarity information and calculating diversity with a range of emphases on rare and abundant species when quantifying microbial community diversity. Conclusions: For many datasets, diversity profiles provided a different view of microbial community diversity compared to analyses that did not take into account taxa similarity information, effective diversity, or multiple diversity metrics. These findings are a valuable contribution to data analysis methodology in microbial ecology. PMID:24238386
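
    The "effective numbers" behind diversity profiles are Hill numbers, D_q = (sum_i p_i^q)^(1/(1-q)), swept over the order q. A minimal sketch of the naïve (similarity-free) profile follows; incorporating taxa similarity, as the paper advocates, would require a phylogenetic distance matrix on top of this.

        import numpy as np

        def hill_number(p, q):
            """Effective number of taxa of order q for a relative-abundance vector p."""
            p = p[p > 0]
            if np.isclose(q, 1.0):                 # limit case: exp(Shannon entropy)
                return np.exp(-np.sum(p * np.log(p)))
            return np.sum(p ** q) ** (1.0 / (1.0 - q))

        abundances = np.array([50, 30, 10, 5, 3, 1, 1], dtype=float)
        p = abundances / abundances.sum()
        for q in (0, 0.5, 1, 2, 4):                # q=0 counts taxa; larger q discounts rare taxa
            print(q, round(hill_number(p, q), 2))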

  7. Time and frequency domain methods for quantifying common modulation of motor unit firing patterns.

    PubMed

    Myers, Lance J; Erim, Zeynep; Lowery, Madeleine M

    2004-10-14

    BACKGROUND: In investigations of the human motor system, two approaches are generally employed toward the identification of common modulating drives from motor unit recordings. One is a frequency domain method and uses the coherence function to determine the degree of linear correlation between each frequency component of the signals. The other is a time domain method that has been developed to determine the strength of low frequency common modulations between motor unit spike trains, often referred to in the literature as 'common drive'. METHODS: The relationships between these methods are systematically explored using both mathematical and experimental procedures. A mathematical derivation is presented that shows the theoretical relationship between both time and frequency domain techniques. Multiple recordings from concurrent activities of pairs of motor units are studied and linear regressions are performed between time and frequency domain estimates (for different time domain window sizes) to assess their equivalence. RESULTS: Analytically, it may be demonstrated that under the theoretical condition of a narrowband point frequency, the two relations are equivalent. However practical situations deviate from this ideal condition. The correlation between the two techniques varies with time domain moving average window length and for window lengths of 200 ms, 400 ms and 800 ms, the r2 regression statistics (p < 0.05) are 0.56, 0.81 and 0.80 respectively. CONCLUSIONS: Although theoretically equivalent and experimentally well correlated there are a number of minor discrepancies between the two techniques that are explored. The time domain technique is preferred for short data segments and is better able to quantify the strength of a broad band drive into a single index. The frequency domain measures are more encompassing, providing a complete description of all oscillatory inputs and are better suited to quantifying narrow ranges of descending input into a single index. In general the physiological question at hand should dictate which technique is best suited. PMID:15679910
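
    The frequency-domain measure discussed here is magnitude-squared coherence between two smoothed firing-rate signals. A minimal sketch with synthetic rates sharing a 1 Hz common drive; the sampling rate, noise level and segment length are illustrative assumptions rather than the authors' settings.

        import numpy as np
        from scipy.signal import coherence

        fs = 100.0                                   # Hz, firing-rate sampling frequency
        t = np.arange(0, 60, 1 / fs)
        rng = np.random.default_rng(3)

        drive = np.sin(2 * np.pi * 1.0 * t)          # shared low-frequency drive
        rate1 = 10 + drive + 0.5 * rng.standard_normal(t.size)
        rate2 = 12 + drive + 0.5 * rng.standard_normal(t.size)

        f, cxy = coherence(rate1, rate2, fs=fs, nperseg=1024)
        print(f[np.argmax(cxy)])                     # peak coherence near the 1 Hz drive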

  8. Complexity and approximability of quantified and stochastic constraint satisfaction problems

    SciTech Connect

    Hunt, H. B. (Harry B.); Stearns, R. L.; Marathe, M. V. (Madhav V.)

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S be an arbitrary finite set of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SAT_c(S)). Here, we study simultaneously the complexity of, and the existence of efficient approximation algorithms for, a number of variants of the problems SAT(S) and SAT_c(S), for many different D, C, and S. These problem variants include decision and optimization problems, for formulas, quantified formulas and stochastically-quantified formulas. We denote these problems by Q-SAT(S), MAX-Q-SAT(S), S-SAT(S), MAX-S-SAT(S), MAX-NSF-Q-SAT(S) and MAX-NSF-S-SAT(S). The main contribution is the development of a unified predictive theory for characterizing the complexity of these problems. Our unified approach is based on two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Let k ≥ 2, and let S be a finite set of finite-arity relations on Σ_k with the following condition: all finite-arity relations on Σ_k can be represented as finite existentially-quantified conjunctions of relations in S applied to variables (to variables and constant symbols in C). Then we prove the following new results: (1) The problems SAT(S) and SAT_c(S) are both NQL-complete and ≤_logn^bw-complete for NP. (2) The problems Q-SAT(S) and Q-SAT_c(S) are PSPACE-complete. Letting k = 2, the problems S-SAT(S) and S-SAT_c(S) are PSPACE-complete. (3) There exists ε > 0 for which approximating the problem MAX-Q-SAT(S) within ε times optimum is PSPACE-hard. Letting k = 2, there exists ε > 0 for which approximating the problem MAX-S-SAT(S) within ε times optimum is PSPACE-hard. (4) For all ε > 0, the problems MAX-NSF-Q-SAT(S) and MAX-NSF-S-SAT(S) are PSPACE-hard to approximate within a factor of n^ε times optimum. These results significantly extend earlier results of (i) Papadimitriou [Pa85] on the complexity of stochastic satisfiability and (ii) Condon, Feigenbaum, Lund and Shor [CF+93, CF+94], by identifying natural classes of PSPACE-hard optimization problems with provably PSPACE-hard ε-approximation problems. Moreover, most of our results hold not just for Boolean relations; most previous results were obtained only in the context of Boolean domains. The results also constitute a significant step towards obtaining dichotomy theorems for the problems MAX-S-SAT(S) and MAX-Q-SAT(S): a research area of recent interest [CF+93, CF+94, Cr95, KSW97, LMP99].

  9. Inside Gun Shows What Goes On

    E-print Network

    Leistikow, Bruce N.

    Inside Gun Shows: What Goes On When Everybody Thinks Nobody's Watching. Epilogue. Garen Wintemute, MD, MPH, Violence Prevention … In February 2010, I attended a Crossroads of the West gun show at the Arizona State Fairgrounds

  10. Daytime television talk shows: Guests, content and interactions

    Microsoft Academic Search

    Bradley S. Greenberg; John L. Sherry; Rick W. Busselle; Lynn Rampoldi Hnilo; Sandi W. Smith

    1997-01-01

    Strident controversy over the often-bizarre subject matter of daytime television talk shows motivated this content analysis of 11 shows with the highest Nielsen ratings in 1994–95. A sample of 10 episodes of each of the shows was videotaped and subjected to a systematic analysis of the shows’ guests, topics of discussion and interactions. Results indicate that this genre of program

  11. Using a biokinetic model to quantify and optimize cortisol measurements for acute and chronic environmental stress exposure during pregnancy.

    PubMed

    Smith, Marissa N; Griffith, William C; Beresford, Shirley A A; Vredevoogd, Melinda; Vigoren, Eric M; Faustman, Elaine M

    2014-01-01

    To fully understand the potentially harmful effects of prenatal stress exposure, it is necessary to quantify long-term and episodic stress exposure during pregnancy. There is a strong body of research relating psychological stress to elevated cortisol levels in biomarkers. Recently, maternal hair has been used to measure cortisol levels, providing a unique opportunity to assess stress exposure throughout gestation. However, an understanding of how cortisol in hair is related to more common biomarkers, such as blood, saliva and urine, is currently lacking. Therefore, we developed a biokinetic model to quantify the relationships between hair, blood, saliva and urine cortisol concentrations using published literature values. Hair concentrations were used to retrospectively predict peaks in blood and saliva concentrations over days and months. Simulations showed realistic values in all compartments when results were compared with published literature. We also showed that the significant variability of cortisol in blood leads to a weak relationship between long-term and episodic measurements of stress. To our knowledge, this is the first integrative biokinetic cortisol model for blood, urine, hair and saliva. As such, it makes an important contribution to our understanding of cortisol as a biomarker and will be useful for future epidemiological studies. PMID:24301353
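
    The compartment idea behind such a model can be sketched in a few lines. Below is a minimal, hypothetical one-compartment illustration, not the authors' published model: blood cortisol is cleared first-order and hair integrates blood levels over time, which is why hair acts as a long-term record while blood captures episodes. All parameter values are invented for demonstration.

        import numpy as np

        # Minimal one-compartment sketch: blood cortisol with first-order clearance
        # and episodic secretion pulses; hair accumulates in proportion to blood level.
        # All parameters are illustrative assumptions, not values from the paper.
        half_life_min = 80.0                 # assumed plasma half-life of cortisol
        k_elim = np.log(2) / half_life_min   # first-order elimination rate (1/min)
        k_hair = 1e-4                        # assumed blood-to-hair transfer constant

        dt = 1.0                             # time step (min)
        t = np.arange(0, 24 * 60, dt)        # one day
        blood = np.zeros_like(t)
        hair = np.zeros_like(t)
        pulses = {6 * 60: 15.0, 12 * 60: 8.0}  # secretion pulses (minute -> bolus, ug/dL)

        for i in range(1, len(t)):
            secretion = pulses.get(int(t[i]), 0.0)
            blood[i] = blood[i-1] + secretion - k_elim * blood[i-1] * dt
            hair[i] = hair[i-1] + k_hair * blood[i-1] * dt  # integrates exposure

        print(f"peak blood: {blood.max():.1f} ug/dL, hair total: {hair[-1]:.4f}")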

  12. Quantifying human response capabilities towards tsunami threats at community level

    NASA Astrophysics Data System (ADS)

    Post, J.; Mück, M.; Zosseder, K.; Wegscheider, S.; Taubenböck, H.; Strunz, G.; Muhari, A.; Anwar, H. Z.; Birkmann, J.; Gebert, N.

    2009-04-01

    Decision makers at the community level need detailed information on tsunami risks in their area. Knowledge of potential hazard impact, exposed elements such as people, critical facilities and lifelines, people's coping capacity and recovery potential is crucial to plan precautionary measures for adaptation and to mitigate potential impacts of tsunamis on society and the environment. A crucial point within a people-centred tsunami risk assessment is to quantify the human response capabilities towards tsunami threats. Based on this quantification and its spatial representation in maps, tsunami-affected and safe areas, difficult-to-evacuate areas, evacuation target points and evacuation routes can be assigned and used as an important contribution to, e.g., community-level evacuation planning. The major component in the quantification of human response capabilities towards tsunami impacts is the factor time. The human response capabilities depend on the estimated time of arrival (ETA) of a tsunami, the time until technical or natural warning signs (ToNW) can be received, the reaction time (RT) of the population (human understanding of a tsunami warning and the decision to take appropriate action), the evacuation time (ET, the time people need to reach a safe area) and the actual available response time (RsT = ETA - ToNW - RT). If RsT is larger than ET, people in the respective areas are able to reach a safe area and rescue themselves. Critical areas possess RsT values equal to or even smaller than ET, and hence people within these areas will be directly affected by a tsunami. Quantifying the factor time is challenging, and an attempt is presented here. The ETA can be derived by analyzing pre-computed tsunami scenarios for a respective area. For ToNW we assume that the early warning center is able to fulfil the Indonesian presidential decree to issue a warning within 5 minutes. RT is difficult to estimate, as human intrinsic factors such as educational level, beliefs, and tsunami knowledge and experience, among others, play a role; an attempt to quantify this variable under high uncertainty is also presented. Quantifying ET is based on GIS modelling using a Cost Weighted Distance approach. The basic principle is to define the best evacuation path from a given point to the next safe area (shelter location), i.e., the fastest path from that point to the shelter location. Thereby the impacts of land cover, slope, population density, and population age and gender distribution are taken into account, as literature studies prove these factors to be highly important. Knowing the fastest path and the distance to the next safe area, together with a spatially distributed pattern of evacuation speed, delivers the time needed from each location to a safe area. By considering the obtained time value for RsT, the coverage area of an evacuation target point (safe area) can be assigned, and by incorporating knowledge of the people capacity of an evacuation target point, the respective coverage area is refined. Hence areas with weak, moderate and good human response capabilities can be detected. This allows calculation of the potential number of people affected (dead or injured) and the number of people dislocated. First results for Kuta (Bali) for a worst-case tsunami event deliver approximately 25 000 people affected when RT = 0 minutes (direct evacuation upon receiving a tsunami warning) and up to 120 000 when RT > ETA (no evacuation action until the tsunami hits the land). Additionally, the fastest evacuation routes to the evacuation target points can be assigned. Areas with weak response capabilities can be designated as priority areas, e.g., to install additional evacuation target points or to increase tsunami knowledge and awareness so as to promote a faster reaction time. In particular, analyzing the underlying socio-economic properties that cause deficiencies in responding to a tsunami threat can yield valuable information and directly inform the planning of adaptation measures. Keywords: Community level, Risk and vulnerability assessment, Early warning, Disaster management, Tsunami, Indonesia
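
    The time budget at the core of this framework lends itself to a simple worked illustration. The sketch below, a minimal illustration rather than the authors' GIS implementation, classifies grid cells by comparing the available response time RsT = ETA - ToNW - RT against the evacuation time ET; all numbers are invented for demonstration.

        import numpy as np

        # Available response time per grid cell: RsT = ETA - ToNW - RT.
        # A cell is evacuable if RsT > ET. All values are illustrative assumptions.
        eta  = np.array([[30.0, 25.0], [20.0, 15.0]])  # tsunami arrival time (min)
        et   = np.array([[12.0, 18.0], [10.0, 14.0]])  # evacuation time to shelter (min)
        tonw = 5.0    # warning issued within 5 min (Indonesian presidential decree)
        rt   = 7.0    # assumed mean reaction time (min)

        rst = eta - tonw - rt                 # actual available response time
        evacuable = rst > et                  # True where people can reach safety

        for (i, j), ok in np.ndenumerate(evacuable):
            status = "safe reachable" if ok else "critical"
            print(f"cell ({i},{j}): RsT = {rst[i, j]:4.1f} min, ET = {et[i, j]:4.1f} min -> {status}")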

  13. QUANTIFYING DIFFUSIVE MASS TRANSFER IN FRACTURED SHALE BEDROCK

    EPA Science Inventory

    A significant limitation in defining remediation needs at contaminated sites often results from an insufficient understanding of the transport processes that control contaminant migration. The objectives of this research were to help resolve this dilemma by providing an improved ...

  14. Comparison between a nucleic acid sequence-based amplification and branched DNA test for quantifying HIV RNA load in blood plasma.

    PubMed

    Berndt, C; Müller, U; Bergmann, F; Schmitt, U; Kaiser, R; Müller, C

    2000-09-01

    HIV RNA was quantified in blood plasma from 209 patients and in control specimens, comparing the NucliSens HIV-1 QT test (Organon Teknika), which is based on the nucleic acid sequence amplification procedure, and the Quantiplex 3.0 test (Bayer), which uses hybridization signal enhancement by branched DNA (bDNA) probes. A highly significant correlation (P=0.01) was found between the two methods, with 88% of the samples showing similar results. In cases of discrepant findings, higher virus load was observed with either test (14x NASBA > bDNA; 12x bDNA > NASBA). Differences could be related neither to clinical features nor to divergent virus subtypes. Standard preparations containing 35000 and 222000 copies were quantified with intra-assay coefficients of variation of <20% using both methods. A preparation of 192 copies was measured with lower precision by both tests, yet was detected more reliably by the bDNA method. PMID:10996651

  15. Low-Order Non-Spatial Effects Dominate Second-Order Spatial Effects in the Texture Quantifier Analysis of 18F-FDG-PET Images

    PubMed Central

    Brooks, Frank J.; Grigsby, Perry W.

    2015-01-01

    Background: There is increasing interest in applying image texture quantifiers to assess the intra-tumor heterogeneity observed in FDG-PET images of various cancers. Use of these quantifiers as prognostic indicators of disease outcome and/or treatment response has yielded inconsistent results. We study the general applicability of some well-established texture quantifiers to the image data unique to FDG-PET. Methods: We first created computer-simulated test images with statistical properties consistent with clinical image data for cancers of the uterine cervix. We specifically isolated second-order statistical effects from low-order effects and analyzed the resulting variation in common texture quantifiers in response to contrived image variations. We then analyzed the quantifiers computed for FIGO IIb cervical cancers via receiver operating characteristic (ROC) curves and via contingency table analysis of detrended quantifier values. Results: We found that image texture quantifiers depend strongly on low-order effects such as tumor volume and SUV distribution. When low-order effects were controlled, the image texture quantifiers tested were not able to discern the second-order effects alone. Furthermore, the results of clinical tumor heterogeneity studies might be tunable via the choice of patient population analyzed. Conclusion: Some image texture quantifiers are strongly affected by factors distinct from the second-order effects researchers ostensibly seek to assess via those quantifiers. PMID:25714472
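
    To make the low-order versus second-order distinction concrete, the sketch below computes a common second-order quantifier, gray-level co-occurrence matrix (GLCM) contrast, next to two low-order quantities (volume above a threshold and histogram spread) on a synthetic image. It is a generic illustration of the quantifier families discussed, not the study's pipeline; all values are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        img = rng.integers(0, 8, size=(64, 64))       # synthetic 8-level "SUV" image

        # Low-order (first-order) quantities: ignore spatial arrangement entirely.
        volume = int((img > 2).sum())                 # voxels above an uptake threshold
        spread = float(img.std())                     # histogram spread

        # Second-order quantity: GLCM for horizontal neighbor pairs, then contrast.
        levels = 8
        glcm = np.zeros((levels, levels))
        for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
            glcm[a, b] += 1
        glcm /= glcm.sum()                            # joint probability of level pairs
        i, j = np.indices(glcm.shape)
        contrast = float(((i - j) ** 2 * glcm).sum()) # weights dissimilar neighbor pairs

        print(f"volume={volume}, spread={spread:.2f}, GLCM contrast={contrast:.2f}")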

  16. Solar System Odyssey - Fulldome Digital Planetarium Show

    NSDL National Science Digital Library

    This is a Fulldome Digital Planetarium Show. Learners go on a futuristic journey through our Solar System. They explore the inner and outer planets, then the moons Titan, Europa, and Callisto as possible places to establish a human colony. A full-length preview of the show is available on the website; scroll down about three-quarters of the page to the section on children's shows (no direct link is available).

  17. Incorporating both physical and kinetic limitations in quantifying dissolved oxygen flux to aquatic sediments

    USGS Publications Warehouse

    O'Connor, B.L.; Hondzo, M.; Harvey, J.W.

    2009-01-01

    Traditionally, dissolved oxygen (DO) fluxes have been calculated using the thin-film theory with DO microstructure data in systems characterized by fine sediments and low velocities. However, recent experimental evidence of fluctuating DO concentrations near the sediment-water interface suggests that turbulence and coherent motions control the mass transfer, and the surface renewal theory gives a more mechanistic model for quantifying fluxes. Both models involve quantifying the mass transfer coefficient (k) and the relevant concentration difference (ΔC). This study compared several empirical models for quantifying k based on both thin-film and surface renewal theories, and presents a new method for quantifying ΔC (dynamic approach) that is consistent with the observed DO concentration fluctuations near the interface. Data were used from a series of flume experiments that include both physical and kinetic uptake limitations of the flux. Results indicated that methods for quantifying k and ΔC using the surface renewal theory better estimated the DO flux across a range of fluid-flow conditions. © 2009 ASCE.
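
    Both theories reduce to a flux of the form J = k·ΔC and differ only in how k is closed. Below is a minimal sketch contrasting the two classic closures, k = D/δ (thin film, with sublayer thickness δ) and k = sqrt(D·s) (surface renewal, with renewal rate s); the parameter values are illustrative assumptions, not the flume measurements.

        import numpy as np

        # Dissolved-oxygen flux J = k * dC under two mass-transfer closures.
        # Parameter values are illustrative assumptions, not the flume data.
        D  = 2.1e-9      # molecular diffusivity of O2 in water (m^2/s)
        dC = 6.0         # concentration difference across the interface (g/m^3)

        # Thin-film theory: k = D / delta, with diffusive sublayer thickness delta.
        delta = 500e-6   # assumed sublayer thickness (m), fine sediment / low velocity
        k_film = D / delta

        # Surface renewal theory: k = sqrt(D * s), with eddy renewal rate s.
        s = 0.05         # assumed renewal frequency (1/s), set by near-bed turbulence
        k_renewal = np.sqrt(D * s)

        for name, k in [("thin film", k_film), ("surface renewal", k_renewal)]:
            J = k * dC   # flux (g O2 / m^2 / s)
            print(f"{name:15s}: k = {k:.2e} m/s, J = {J * 86400:.2f} g/m^2/day")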

  18. Quantifying Energy Savings by Improving Boiler Operation

    E-print Network

    Carpenter, K.; Kissock, J. K.

    2005-01-01

    ... and Actual UA Values. The firing rate of a fire-tube boiler with a rated heat input of 5.23 mmBtu/hr, producing 50 psig saturated steam, was modulated from about 30% of full fire to 100% of full fire. Figure 3 shows measured boiler efficiency ... is the product of the mass flow rate of the product gasses, the specific heat of the gasses, Cpg (about 0.26 Btu/lbm-F), and the temperature rise of the gasses through the combustion chamber. If maximum heat input to the burner, Qmax, and fraction...
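
    The product rule in the excerpt is the standard sensible-heat relation Q = mdot · Cp · ΔT. The sketch below applies it with invented numbers to estimate the heat picked up by the product gasses at one firing rate; it is a back-of-the-envelope illustration, not the paper's calibrated UA model.

        # Sensible heat picked up by product gasses: Q = m_dot * Cp * dT.
        # All inputs are illustrative assumptions in US customary units.
        m_dot = 4200.0    # mass flow rate of product gasses (lbm/hr) at this firing rate
        cp_g  = 0.26      # specific heat of product gasses (Btu/lbm-F), per the excerpt
        t_in  = 70.0      # combustion air temperature entering (F)
        t_out = 600.0     # gas temperature leaving the combustion chamber (F)

        q = m_dot * cp_g * (t_out - t_in)   # Btu/hr transferred to the gasses
        print(f"Q = {q:,.0f} Btu/hr = {q / 1e6:.2f} mmBtu/hr")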

  19. Analyzing quantum simulators efficiently: Scalable state tomography and quantifying entanglement with routine measurements

    NASA Astrophysics Data System (ADS)

    Cramer, Marcus; Baumgratz, Tillmann; Marty, Oliver; Gross, David; Plenio, Martin

    2013-03-01

    Conventional full state tomography reaches its limit already for a few qubits, and hence novel methods for the verification and benchmarking of quantum devices are called for. We show how the complete reconstruction of density matrices is possible even if one relies only on local information about the state. This results in an experimental effort that is linear in the number of qubits and in efficient post-processing -- in stark contrast to the exponential scaling of standard tomography. Whenever full tomography is not needed and less information is required instead, one would expect that even fewer measurements suffice. Taking the entanglement content of solid-state samples and bosons in lattices as an example, we show how it may be quantified unconditionally using only routinely performed measurements. References: Scalable reconstruction of density matrices, T. Baumgratz, D. Gross, M. Cramer, and M.B. Plenio, arXiv:1207.0358; Efficient quantum state tomography, M. Cramer, M.B. Plenio, S.T. Flammia, R. Somma, D. Gross, S.D. Bartlett, O. Landon-Cardinal, D. Poulin, and Y.-K. Liu, Nat. Commun. 1, 149 (2010); Measuring entanglement in condensed matter systems, M. Cramer, M.B. Plenio, and H. Wunderlich, Phys. Rev. Lett. 106, 020401 (2011).

  20. Quantifying Spatial Genetic Structuring in Mesophotic Populations of the Precious Coral Corallium rubrum

    PubMed Central

    Costantini, Federica; Carlesi, Lorenzo; Abbiati, Marco

    2013-01-01

    While shallow water red coral populations have been overharvested in the past, nowadays commercial harvesting has shifted its pressure onto mesophotic organisms. An understanding of red coral population structure, particularly larval dispersal patterns and connectivity among harvested populations, is paramount to the viability of the species. In order to determine patterns of genetic spatial structuring of deep water Corallium rubrum populations, colonies found at 58–118 m depth within the Tyrrhenian Sea were, for the first time, collected and analyzed. Ten microsatellite loci and two regions of mitochondrial DNA (mtMSH and mtC) were used to quantify patterns of genetic diversity within populations and to define population structuring at spatial scales from tens of metres to hundreds of kilometres. Microsatellites showed heterozygote deficiencies in all populations. Significant levels of genetic differentiation were observed at all investigated spatial scales, suggesting that populations are likely to be isolated. This differentiation may be the result of biological interactions occurring at small spatial scales and/or abiotic factors acting at larger scales. Mitochondrial markers revealed significant genetic structuring at spatial scales greater than 100 km, showing the occurrence of a barrier to gene flow between northern and southern Tyrrhenian populations. These findings provide support for the establishment of marine protected areas in the deep sea and off-shore reefs, in order to effectively maintain the genetic diversity of mesophotic red coral populations. PMID:23646109

  1. Quantifying the value of redundant measurements at GCOS Reference Upper-Air Network sites

    NASA Astrophysics Data System (ADS)

    Madonna, F.; Rosoldi, M.; Güldner, J.; Haefele, A.; Kivi, R.; Cadeddu, M. P.; Sisterson, D.; Pappalardo, G.

    2014-11-01

    The potential for measurement redundancy to reduce uncertainty in atmospheric variables has not been investigated comprehensively for climate observations. We evaluated the usefulness of entropy and mutual correlation concepts, as defined in information theory, for quantifying random uncertainty and redundancy in time series of the integrated water vapour (IWV) and water vapour mixing ratio profiles provided by five highly instrumented GRUAN (GCOS, Global Climate Observing System, Reference Upper-Air Network) stations in 2010-2012. Results show that the random uncertainties on the IWV measured with radiosondes, global positioning system, microwave and infrared radiometers, and Raman lidar measurements differed by less than 8%. Comparisons of time series of IWV content from ground-based remote sensing instruments with in situ soundings showed that microwave radiometers have the highest redundancy with the IWV time series measured by radiosondes and therefore the highest potential to reduce the random uncertainty of the radiosonde time series. Moreover, the random uncertainty of a time series from one instrument can be reduced by ~60% by constraining the measurements with those from another instrument. The best reduction of random uncertainty is achieved by conditioning Raman lidar measurements with microwave radiometer measurements. Specific instruments are recommended for atmospheric water vapour measurements at GRUAN sites. This approach can be applied to the study of redundant measurements for other climate variables.
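
    The information-theoretic machinery used here can be illustrated on synthetic data. The sketch below estimates, via simple histogram binning, the mutual information between two noisy measurements of a common IWV-like signal, and the fraction of one series' entropy removed by conditioning on the other; the data, bin count and noise levels are assumptions of this illustration, not GRUAN processing.

        import numpy as np

        rng = np.random.default_rng(1)
        truth = np.cumsum(rng.normal(0, 0.3, 2000))      # common IWV-like signal
        x = truth + rng.normal(0, 0.5, truth.size)       # e.g. radiosonde series
        y = truth + rng.normal(0, 0.5, truth.size)       # e.g. microwave radiometer

        def entropy(p):
            p = p[p > 0]
            return -(p * np.log2(p)).sum()

        # Histogram-based estimates (the bin count is an assumption of this sketch).
        bins = 24
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy /= pxy.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)

        h_x = entropy(px)
        mi = entropy(px) + entropy(py) - entropy(pxy.ravel())
        print(f"H(x) = {h_x:.2f} bits, I(x;y) = {mi:.2f} bits")
        print(f"conditioning on y removes {100 * mi / h_x:.0f}% of the entropy of x")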

  2. Identifying and quantifying the stromal fibrosis in muscularis propria of colorectal carcinoma by multiphoton microscopy

    NASA Astrophysics Data System (ADS)

    Chen, Sijia; Yang, Yinghong; Jiang, Weizhong; Feng, Changyin; Chen, Zhifen; Zhuo, Shuangmu; Zhu, Xiaoqin; Guan, Guoxian; Chen, Jianxin

    2014-10-01

    The examination of stromal fibrosis within colorectal cancer is often overlooked, not only because routine pathological examinations focus more on tumour staging and precise surgical margins, but also because of the lack of efficient diagnostic methods. Multiphoton microscopy (MPM) can be used to study the muscularis stroma of normal and colorectal carcinoma tissue at the molecular level. In this work, we demonstrate the feasibility of MPM for discerning the microstructure of the normal human rectal muscle layer and of fibrotic colorectal carcinoma tissue. Three types of muscularis propria stromal fibrosis beneath the colorectal cancer infiltration were observed for the first time through the MPM imaging system, which provides intercellular microstructural details in fresh, unstained tissue samples. Our approach is also capable of quantifying the extent of stromal fibrosis from both the amount and the orientation of collagen, which may further characterize the severity of fibrosis. By comparison with pathology analysis, these results show that MPM has the potential to become a histological tool for detecting stromal fibrosis and collecting prognostic evidence, which may guide subsequent therapy for patients toward a good prognosis.

  3. Quantifying Security Threats and Their Impact

    SciTech Connect

    Aissa, Anis Ben [University of Tunis, Belvedere, Tunisia]; Abercrombie, Robert K [ORNL]; Sheldon, Frederick T [ORNL]; Mili, Ali [New Jersey Institute of Technology]

    2009-01-01

    In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper we illustrate this infrastructure by means of a worked example involving an e-commerce application.
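
    A sketch of the kind of stakeholder-loss arithmetic such an infrastructure performs, assuming a simple linear stakes-times-probability decomposition; the stakeholders, events and all matrix values are invented for illustration and are not taken from the paper.

        import numpy as np

        # Expected loss per stakeholder = stakes matrix @ breakdown probabilities.
        # Rows: stakeholders; columns: security breakdown events. Invented numbers.
        stakeholders = ["merchant", "customer", "payment processor"]
        events = ["data breach", "service outage", "fraud"]

        stakes = np.array([          # $ lost by each stakeholder if the event occurs
            [50_000, 20_000, 10_000],
            [ 5_000,    500, 15_000],
            [30_000, 10_000, 40_000],
        ])
        p_event = np.array([0.02, 0.10, 0.05])   # probability per operating period

        expected_loss = stakes @ p_event
        for name, loss in zip(stakeholders, expected_loss):
            print(f"{name:18s}: expected loss ${loss:,.0f} per period")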

  4. Quantifying Effects Of Water Stress On Sunflowers

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This poster presentation describes the data collection and analysis procedures and results for 2009 from a research grant funded by the National Sunflower Association. The primary objective was to evaluate the use of crop canopy temperature measured with infrared temperature sensors, as a more time ...

  5. Quantifying MCMC Exploration of Phylogenetic Tree Space

    PubMed Central

    Whidden, Chris; Matsen, Frederick A.

    2015-01-01

    In order to gain an understanding of the effectiveness of phylogenetic Markov chain Monte Carlo (MCMC), it is important to understand how quickly the empirical distribution of the MCMC converges to the posterior distribution. In this article, we investigate this problem on phylogenetic tree topologies with a metric that is especially well suited to the task: the subtree prune-and-regraft (SPR) metric. This metric directly corresponds to the minimum number of MCMC rearrangements required to move between trees in common phylogenetic MCMC implementations. We develop a novel graph-based approach to analyze tree posteriors and find that the SPR metric is much more informative than simpler metrics that are unrelated to MCMC moves. In doing so, we show conclusively that topological peaks do occur in Bayesian phylogenetic posteriors from real data sets as sampled with standard MCMC approaches, investigate the efficiency of Metropolis-coupled MCMC (MCMCMC) in traversing the valleys between peaks, and show that conditional clade distribution (CCD) can have systematic problems when there are multiple peaks. PMID:25631175

  6. Quantifying MCMC Exploration of Phylogenetic Tree Space.

    PubMed

    Whidden, Chris; Matsen, Frederick A

    2015-05-01

    In order to gain an understanding of the effectiveness of phylogenetic Markov chain Monte Carlo (MCMC), it is important to understand how quickly the empirical distribution of the MCMC converges to the posterior distribution. In this article, we investigate this problem on phylogenetic tree topologies with a metric that is especially well suited to the task: the subtree prune-and-regraft (SPR) metric. This metric directly corresponds to the minimum number of MCMC rearrangements required to move between trees in common phylogenetic MCMC implementations. We develop a novel graph-based approach to analyze tree posteriors and find that the SPR metric is much more informative than simpler metrics that are unrelated to MCMC moves. In doing so, we show conclusively that topological peaks do occur in Bayesian phylogenetic posteriors from real data sets as sampled with standard MCMC approaches, investigate the efficiency of Metropolis-coupled MCMC (MCMCMC) in traversing the valleys between peaks, and show that conditional clade distribution (CCD) can have systematic problems when there are multiple peaks. PMID:25631175

  7. Quantifying tissue mechanical properties using photoplethysmography

    PubMed Central

    Akl, Tony J.; Wilson, Mark A.; Ericson, M. Nance; Coté, Gerard L.

    2014-01-01

    Photoplethysmography (PPG) is a non-invasive optical method that can be used to detect blood volume changes in the microvascular bed of tissue. The PPG signal comprises two components: a pulsatile waveform (AC) attributed to changes in the interrogated blood volume with each heartbeat, and a slowly varying baseline (DC) combining low-frequency fluctuations mainly due to respiration and sympathetic nervous system activity. In this report, we investigate the AC pulsatile waveform of the PPG pulse for ultimate use in extracting information regarding the biomechanical properties of tissue and vasculature. By analyzing the rise time of the pulse in the diastolic period, we show that PPG is capable of measuring changes in the Young's modulus of tissue-mimicking phantoms with a resolution of 4 kPa in the range of 12 to 61 kPa. In addition, the shape of the pulse can potentially be used to diagnose vascular complications by differentiating upstream from downstream complications. A Windkessel model was used to model changes in the biomechanical properties of the circulation and to test the proposed concept. The modeling data confirmed the response seen in vitro and showed the same trends in the PPG rise and fall times with changes in compliance and vascular resistance. PMID:25071970
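
    The modeling step can be made concrete with a minimal two-element Windkessel sketch: compliance C and peripheral resistance R drive C·dP/dt = Q_in(t) - P/R, and lower compliance (stiffer tissue) shortens the pressure rise time, qualitatively matching the reported trend. This is a generic illustration with invented parameters, not the paper's model.

        import numpy as np

        # Two-element Windkessel: C * dP/dt = Q_in(t) - P / R.
        # Invented parameters; rise time shortens as compliance C decreases.
        def rise_time(C, R=1.0, dt=1e-3, t_end=1.0):
            t = np.arange(0, t_end, dt)
            q_in = np.where(t < 0.3, 1.0, 0.0)        # simple systolic inflow pulse
            p = np.zeros_like(t)
            for i in range(1, t.size):
                dpdt = (q_in[i-1] - p[i-1] / R) / C
                p[i] = p[i-1] + dpdt * dt
            peak = p.max()
            i10 = np.argmax(p >= 0.1 * peak)          # 10%-to-90% rise time
            i90 = np.argmax(p >= 0.9 * peak)
            return (i90 - i10) * dt

        for C in (0.5, 0.2, 0.05):                    # decreasing compliance = stiffer
            print(f"C = {C:4.2f}: rise time = {rise_time(C) * 1e3:5.1f} ms")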

  8. Quantifying tissue mechanical properties using photoplethysmography

    SciTech Connect

    Akl, Tony [Texas A&M University]; Wilson, Mark A. [University of Pittsburgh School of Medicine, Pittsburgh PA]; Ericson, Milton Nance [ORNL]; Cote, Gerard L. [Texas A&M University]

    2014-01-01

    Photoplethysmography (PPG) is a non-invasive optical method that can be used to detect blood volume changes in the microvascular bed of tissue. The PPG signal comprises two components: a pulsatile waveform (AC) attributed to changes in the interrogated blood volume with each heartbeat, and a slowly varying baseline (DC) combining low-frequency fluctuations mainly due to respiration and sympathetic nervous system activity. In this report, we investigate the AC pulsatile waveform of the PPG pulse for ultimate use in extracting information regarding the biomechanical properties of tissue and vasculature. By analyzing the rise time of the pulse in the diastolic period, we show that PPG is capable of measuring changes in the Young's modulus of tissue-mimicking phantoms with a resolution of 4 kPa in the range of 12 to 61 kPa. In addition, the shape of the pulse can potentially be used to diagnose vascular complications by differentiating upstream from downstream complications. A Windkessel model was used to model changes in the biomechanical properties of the circulation and to test the proposed concept. The modeling data confirmed the response seen in vitro and showed the same trends in the PPG rise and fall times with changes in compliance and vascular resistance.

  9. Power spectrum scale invariance quantifies limbic dysregulation in trait anxious adults using fMRI: adapting methods optimized for characterizing autonomic dysregulation to neural dynamic timeseries.

    PubMed Central

    Tolkunov, Denis; Rubin, Denis; Mujica-Parodi, LR

    2010-01-01

    In a well-regulated control system, excitatory and inhibitory components work closely together with minimum lag; in response to inputs of finite duration, outputs should show rapid rise and, following the input's termination, immediate return to baseline. The efficiency of this response can be quantified using the power spectrum density's scaling parameter β, a measure of self-similarity, applied to the first derivative of the raw signal. In this study, we adapted power spectrum density methods, previously used to quantify autonomic dysregulation (heart rate variability), to neural time series obtained via functional MRI. The negative feedback loop we investigated was the limbic system, using affect-valent faces as stimuli. We hypothesized that trait anxiety would be related to the efficiency of regulation of limbic responses, as quantified by power law scaling of fMRI time series. Our results supported this hypothesis, showing moderate to strong correlations with β (r = 0.4–0.54) for the amygdala, orbitofrontal cortex, hippocampus, superior temporal gyrus, posterior insula, and anterior cingulate. Strong anticorrelations were also found between the amygdala's β and wake heart rate variability (r = −0.61), suggesting a robust relationship between dysregulated limbic outputs and their autonomic consequences. PMID:20025979
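
    The core computation can be sketched as follows: take the first derivative of the signal, estimate its power spectral density, and fit S(f) ~ f^(-β) on log-log axes. The pure-numpy illustration below uses a plain periodogram and synthetic signals; windowing and band choices in the actual study may differ.

        import numpy as np

        rng = np.random.default_rng(2)

        def psd_scaling_exponent(signal, fs=1.0):
            """Fit S(f) ~ f**(-beta) to the periodogram of the first derivative."""
            d = np.diff(signal)                        # first-derivative series
            d = d - d.mean()
            psd = np.abs(np.fft.rfft(d)) ** 2 / d.size
            f = np.fft.rfftfreq(d.size, d=1.0 / fs)
            keep = f > 0                               # drop the zero-frequency bin
            slope, _ = np.polyfit(np.log10(f[keep]), np.log10(psd[keep]), 1)
            return -slope                              # beta of the derivative's PSD

        white = rng.normal(size=4096)
        walk = np.cumsum(white)                        # output with slow recovery

        # For the random walk the derivative is white, so beta is near 0;
        # for white noise the differenced series has a rising spectrum (beta < 0).
        for name, sig in [("white noise", white), ("random walk", walk)]:
            print(f"beta for {name}: {psd_scaling_exponent(sig):.2f}")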

  10. The Language of Show Biz: A Dictionary.

    ERIC Educational Resources Information Center

    Sergel, Sherman Louis, Ed.

    This dictionary of the language of show biz provides the layman with definitions and essays on terms and expressions often used in show business. The overall pattern of selection was intended to be more rather than less inclusive, though radio, television, and film terms were deliberately omitted. Lengthy explanations are sometimes used to express…

  11. The Physics of Equestrian Show Jumping

    ERIC Educational Resources Information Center

    Stinner, Art

    2014-01-01

    This article discusses the kinematics and dynamics of equestrian show jumping. For some time I have attended a series of show jumping events at Spruce Meadows, an international equestrian center near Calgary, Alberta, often referred to as the "Wimbledon of equestrian jumping." I have always had a desire to write an article such as this…

  12. Salton Sea Satellite Image Showing Fault Slip

    USGS Multimedia Gallery

    Landsat satellite image (LE70390372003084EDC00) showing location of surface slip triggered along faults in the greater Salton Trough area. Red bars show the generalized location of 2010 surface slip along faults in the central Salton Trough and many additional faults in the southwestern section of t...

  13. Talk shows’ representations of interpersonal conflicts

    Microsoft Academic Search

    Susan L. Brinson; J. Emmett Winn

    1997-01-01

    In the past ten years, daytime talk shows became very popular among television programmers and viewers alike. Given the large audiences to whom talk shows communicate, it is important to analyze the messages contained in the programs. Remarkably little academic attention has been paid to this phenomenon, however. The present study focuses on the presentation of interpersonal conflicts, particularly regarding

  14. "The Stars Tonight" LIVE Planetarium Show

    E-print Network

    Mathis, Wayne N.

    "The Stars Tonight" LIVE Planetarium Show Theme: The Stars Tonight Program is built around Planetarium is particularly well-suited to host The Stars Tonight. Its large dome and Zeiss Mark 6A projector. The Einstein Planetarium at NASM shows two other programs on astronomical topics: · Infinity Express · Cosmic

  15. Inside Gun Shows What Goes On

    E-print Network

    Nguyen, Danh

    Inside Gun Shows: What Goes On When Everybody Thinks Nobody's Watching. Executive Summary. Garen Wintemute, MD, MPH, Violence ... Systems of Gun Commerce: Modern gun commerce operates under the terms of the Gun Control Act of 1968. Those ...

  16. Inside Gun Shows What Goes On

    E-print Network

    Leistikow, Bruce N.

    Inside Gun Shows: What Goes On When Everybody Thinks Nobody's Watching. Preface. Garen Wintemute, MD, MPH, Violence ... for that reason, are an important source of guns used in criminal violence. The intent of this report ...

  17. International Plowing Match & Farm Machinery Show

    NSDL National Science Digital Library

    The 1995 International Plowing Match & Farm Machinery Show in Ontario, Canada has a site on the Web. The IPM is a non-profit organization of volunteers which annually organizes Canada's largest farm machinery show. The event is both commercial and educational. Thousands of school children and educators attend and participate in organized educational activities.

  18. Quantifying nonstationary radioactivity concentration fluctuations near Chernobyl: A complete statistical description

    E-print Network

    Stanley, H. Eugene

    Quantifying nonstationary radioactivity concentration fluctuations near Chernobyl: a complete statistical description. We analyze radioactivity concentration fluctuations measured near Chernobyl after the 1986 disaster and find three new results: (i) the histogram of fluctuations is well ... PACS numbers: 89.60.-x, 02.50.Fz, 05.45.Tp, 87.66.Na. I. INTRODUCTION: Chernobyl's No. 4 reactor was completely destroyed

  19. Quantifying nonstationary radioactivity concentration fluctuations near Chernobyl: A complete statistical description

    E-print Network

    Shlyakhter, Ilya

    Quantifying nonstationary radioactivity concentration fluctuations near Chernobyl: a complete statistical description. We analyze radioactivity concentration fluctuations measured near Chernobyl after the 1986 disaster and find three new results: (i) the histogram ... patterns. PACS numbers: 89.60.-x, 02.50.Fz, 05.45.Tp, 87.66.Na. I. INTRODUCTION: Chernobyl's No. 4 reactor

  20. COMPARISON OF MEASUREMENT TECHNIQUES FOR QUANTIFYING SELECTED ORGANIC EMISSIONS FROM KEROSENE SPACE HEATERS

    EPA Science Inventory

    The report gives results of (1) a comparison of the hood and chamber techniques for quantifying pollutant emission rates from unvented combustion appliances, and (2) an assessment of the semivolatile and nonvolatile organic-compound emissions from unvented kerosene space heaters. In ...

  1. QUANTIFYING THE POTENTIAL IMPACTS OF ATMS ON AIR QUALITY Bruce Hellinga

    E-print Network

    Hellinga, Bruce

    QUANTIFYING THE POTENTIAL IMPACTS OF ATMS ON AIR QUALITY. Bruce Hellinga, Department of Civil ... to implement traffic control strategies that satisfy legislated air quality standards. Unfortunately, the relationships between various traffic management options and the resulting air quality impacts are generally

  2. Quantifying the mechanical and hydrologic effects of riparian vegetation on streambank stability

    Microsoft Academic Search

    Andrew Simon; Andrew J. C. Collison

    2002-01-01

    Riparian vegetation strips are widely used by river managers to increase streambank stability, among other purposes. However, though the effects of vegetation on bank stability are widely discussed, they are rarely quantified, and discussions generally underemphasize the importance of hydrologic processes, some of which may be detrimental. This paper presents results from an experiment in which the hydrologic and mechanical effects

  3. Quantifying self-organization with optimal predictors.

    PubMed

    Shalizi, Cosma Rohilla; Shalizi, Kristina Lisa; Haslinger, Robert

    2004-09-10

    Despite broad interest in self-organizing systems, there are few quantitative, experimentally applicable criteria for self-organization. The existing criteria all give counter-intuitive results for important cases. In this Letter, we propose a new criterion, namely, an internally generated increase in the statistical complexity, the amount of information required for optimal prediction of the system's dynamics. We precisely define this complexity for spatially extended dynamical systems, using the probabilistic ideas of mutual information and minimal sufficient statistics. This leads to a general method for predicting such systems and a simple algorithm for estimating statistical complexity. The results of applying this algorithm to a class of models of excitable media (cyclic cellular automata) strongly support our proposal. PMID:15447385

  4. Quantifying the direct use value of Condor seamount

    NASA Astrophysics Data System (ADS)

    Ressurreição, Adriana; Giacomello, Eva

    2013-12-01

    Seamounts often satisfy numerous uses and interests. Multiple uses can generate multiple benefits but also conflicts and impacts, calling, therefore, for integrated and sustainable management. To assist in developing comprehensive management strategies, policymakers recognise the need to include measures of socioeconomic analysis alongside ecological data so that practical compromises can be made. This study assessed the direct output impact (DOI) of the relevant marine activities operating at Condor seamount (Azores, central northeast Atlantic) as proxies of the direct use values provided by the resource system. Results demonstrated that Condor seamount supported a wide range of uses yielding distinct economic outputs. Demersal fisheries, scientific research and shark diving were the top-three activities generating the highest revenues, while tuna fisheries, whale watching and scuba-diving had marginal economic significance. Results also indicated that the economic importance of non-extractive uses of Condor is considerable, highlighting the importance of these uses as alternative income-generating opportunities for local communities. It is hoped that quantifying the direct use values provided by Condor seamount will contribute to the decision making process towards its long-term conservation and sustainable use.

  5. Quantifying Repetitive Speech in Autism Spectrum Disorders and Language Impairment

    PubMed Central

    van Santen, Jan P. H.; Sproat, Richard W.; Hill, Alison Presmanes

    2013-01-01

    We report on an automatic technique for quantifying two types of repetitive speech: repetitions of what the child says him/herself (self-repeats) and of what is uttered by an interlocutor (echolalia). We apply this technique to a sample of 111 children between the ages of four and eight: 42 typically developing children (TD), 19 children with specific language impairment (SLI), 25 children with autism spectrum disorders (ASD) plus language impairment (ALI), and 25 children with ASD with normal, non-impaired language (ALN). The results indicate robust differences in echolalia between the TD and ASD groups as a whole (ALN + ALI), and between TD and ALN children. There were no significant differences between ALI and SLI children for echolalia or self-repetitions. The results confirm previous findings that children with ASD repeat the language of others more than other populations of children. On the other hand, self-repetition does not appear to be significantly more frequent in ASD, nor does it matter whether the child’s echolalia occurred within one (immediate) or two turns (near-immediate) of the adult’s original utterance. Furthermore, non-significant differences between ALN and SLI, between TD and SLI, and between ALI and TD are suggestive that echolalia may not be specific to ALN or to ASD in general. One important innovation of this work is an objective fully automatic technique for assessing the amount of repetition in a transcript of a child’s utterances. PMID:23661504

  6. Quantifying colloid retention in partially saturated porous media

    NASA Astrophysics Data System (ADS)

    Zevi, Yuniati; Dathe, Annette; Gao, Bin; Richards, Brian K.; Steenhuis, Tammo S.

    2006-12-01

    The transport of colloid-contaminant complexes and colloid-sized pathogens through soil to groundwater is of concern. Visualization and quantification of pore-scale colloid behavior will enable better description and simulation of retention mechanisms at individual surfaces, in contrast to breakthrough curves, which only provide an integrated signal. We tested two procedures for quantifying colloid movement and retention as observed in pore-scale image sequences. After initial testing with static images, three series of images of synthetic microbead suspensions passing through unsaturated sand were examined. The region procedure (implemented in ImageJ) and the Boolean procedure (implemented in KS400) yielded nearly identical results for the initial test images and for total colloid-covered areas in the three image series. Because of electronic noise resulting in pixel-level brightness fluctuations, the Boolean procedure tended to underestimate attached colloid counts and, conversely, to overestimate mobile colloid counts; the region procedure had a smaller overestimation error for attached colloids. Reliable quantification of colloid retention at the pore scale can be used to improve current understanding of the transport mechanisms of colloids in unsaturated porous media. For example, attachment counts at individual air/water meniscus/solid interfaces were well described by Langmuir isotherms.

  7. Quantifying dose to the reconstructed breast: Can we adequately treat?

    SciTech Connect

    Chung, Eugene; Marsh, Robin B.; Griffith, Kent A.; Moran, Jean M. [Department of Radiation Oncology, University of Michigan, Ann Arbor, MI (United States); Pierce, Lori J., E-mail: ljpierce@umich.edu [Department of Radiation Oncology, University of Michigan, Ann Arbor, MI (United States)

    2013-04-01

    To evaluate how immediate reconstruction (IR) impacts postmastectomy radiotherapy (PMRT) dose distributions to the reconstructed breast (RB), internal mammary nodes (IMN), heart, and lungs using quantifiable dosimetric end points. 3D conformal plans were developed for 20 IR patients, 10 autologous reconstruction (AR), and 10 expander-implant (EI) reconstruction. For each reconstruction type, 5 right- and 5 left-sided reconstructions were selected. Two plans were created for each patient, 1 with RB coverage alone and 1 with RB + IMN coverage. Left-sided EI plans without IMN coverage had higher heart Dmean than left-sided AR plans (2.97 and 0.84 Gy, p = 0.03). Otherwise, results did not vary by reconstruction type and all remaining metrics were evaluated using a combined AR and EI dataset. RB coverage was adequate regardless of laterality or IMN coverage (Dmean 50.61 Gy, D95 45.76 Gy). When included, IMN Dmean and D95 were 49.57 and 40.96 Gy, respectively. Mean heart doses increased with left-sided treatment plans and IMN inclusion. Right-sided treatment plans and IMN inclusion increased mean lung V{sub 20}. Using standard field arrangements and 3D planning, we observed excellent coverage of the RB and IMN, regardless of laterality or reconstruction type. Our results demonstrate that adequate doses can be delivered to the RB with or without IMN coverage.

  8. Quantifying uncertainty in LCA-modelling of waste management systems.

    PubMed

    Clavreul, Julie; Guyonnet, Dominique; Christensen, Thomas H

    2012-12-01

    Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining the selected methods: (Step 1) a sensitivity analysis evaluating the sensitivities of the results with respect to the input uncertainties, (Step 2) an uncertainty propagation providing appropriate tools for representing uncertainties and calculating the overall uncertainty of the model results, (Step 3) an uncertainty contribution analysis quantifying the contribution of each parameter uncertainty to the final uncertainty and (Step 4) as a new approach, a combined sensitivity analysis providing a visualisation of the shift in the ranking of different options due to variations of selected key parameters. This tiered approach optimises the resources available to LCA practitioners by only propagating the most influential uncertainties. PMID:22863069
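
    Step 2, uncertainty propagation, is the most mechanical of the four steps and is easy to sketch. The toy example below pushes three uncertain waste-LCA inputs through an invented net-impact formula by Monte Carlo sampling and summarizes the spread of the result; the distributions and the formula are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000

        # Invented uncertain inputs for a toy waste-incineration impact model.
        lhv = rng.normal(10.5, 0.8, n)            # lower heating value (MJ/kg waste)
        eff = rng.uniform(0.20, 0.30, n)          # electricity recovery efficiency
        grid_co2 = rng.normal(0.45, 0.05, n)      # displaced grid burden (kg CO2/kWh)

        direct = 0.35                             # direct emissions (kg CO2/kg waste)
        kwh_per_kg = lhv * eff / 3.6              # 3.6 MJ per kWh
        net_impact = direct - kwh_per_kg * grid_co2   # kg CO2-eq per kg waste

        lo, mid, hi = np.percentile(net_impact, [2.5, 50, 97.5])
        print(f"net impact: median {mid:.3f}, 95% interval [{lo:.3f}, {hi:.3f}] kg CO2-eq/kg")
        print(f"P(net impact < 0) = {(net_impact < 0).mean():.2f}")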

  9. Method for quantifying optical properties of the human lens

    DOEpatents

    Loree, T.R.; Bigio, I.J.; Zuclich, J.A.; Shimada, Tsutomu; Strobl, K.

    1999-04-13

    A method is disclosed for quantifying optical properties of the human lens. The present invention includes the application of fiberoptic, OMA-based instrumentation as an in vivo diagnostic tool for the human ocular lens. Rapid, noninvasive and comprehensive assessment of the optical characteristics of a lens using very modest levels of exciting light are described. Typically, the backscatter and fluorescence spectra (from about 300- to 900-nm) elicited by each of several exciting wavelengths (from about 300- to 600-nm) are collected within a few seconds. The resulting optical signature of individual lenses is then used to assess the overall optical quality of the lens by comparing the results with a database of similar measurements obtained from a reference set of normal human lenses having various ages. Several metrics have been identified which gauge the optical quality of a given lens relative to the norm for the subject's chronological age. These metrics may also serve to document accelerated optical aging and/or as early indicators of cataract or other disease processes. 8 figs.

  10. Quantifying space-time dynamics of flood event types

    NASA Astrophysics Data System (ADS)

    Viglione, Alberto; Chirico, Giovanni Battista; Komma, Jürgen; Woods, Ross; Borga, Marco; Blöschl, Günter

    2010-11-01

    Summary: A generalised framework of space-time variability in flood response is used to characterise five flood events of different type in the Kamp area in Austria: one long-rain event, two short-rain events, one rain-on-snow event and one snowmelt event. Specifically, the framework quantifies the contributions of the space-time variability of rainfall/snowmelt, runoff coefficient, hillslope and channel routing to the flood runoff volume and the delay and spread of the resulting hydrograph. The results indicate that the components obtained by the framework clearly reflect the individual processes which characterise the event types. For the short-rain events, temporal, spatial and movement components can all be important in runoff generation and routing, which would be expected because of their local nature in time and, particularly, in space. For the long-rain event, the temporal components tend to be more important for runoff generation, because of the more uniform spatial coverage of rainfall, while for routing the spatial distribution of the produced runoff, which is not uniform, is also important. For the rain-on-snow and snowmelt events, the spatio-temporal variability terms typically do not play much role in runoff generation and the spread of the hydrograph is mainly due to the duration of the event. As an outcome of the framework, a dimensionless response number is proposed that represents the joint effect of runoff coefficient and hydrograph peakedness and captures the absolute magnitudes of the observed flood peaks.

  11. Method for quantifying optical properties of the human lens

    DOEpatents

    Loree, deceased, Thomas R. (late of Albuquerque, NM); Bigio, Irving J. (Los Alamos, NM); Zuclich, Joseph A. (San Antonio, TX); Shimada, Tsutomu (Los Alamos, NM); Strobl, Karlheinz (Fiskdale, MA)

    1999-01-01

    Method for quantifying optical properties of the human lens. The present invention includes the application of fiberoptic, OMA-based instrumentation as an in vivo diagnostic tool for the human ocular lens. Rapid, noninvasive and comprehensive assessment of the optical characteristics of a lens using very modest levels of exciting light are described. Typically, the backscatter and fluorescence spectra (from about 300- to 900-nm) elicited by each of several exciting wavelengths (from about 300- to 600-nm) are collected within a few seconds. The resulting optical signature of individual lenses is then used to assess the overall optical quality of the lens by comparing the results with a database of similar measurements obtained from a reference set of normal human lenses having various ages. Several metrics have been identified which gauge the optical quality of a given lens relative to the norm for the subject's chronological age. These metrics may also serve to document accelerated optical aging and/or as early indicators of cataract or other disease processes.

  12. Quantifying Russian wheat aphid pest intensity across the Great Plains.

    PubMed

    Merrill, Scott C; Peairs, Frank B

    2012-12-01

    Wheat, the most important cereal crop in the Northern Hemisphere, is at risk of an approximately 10% reduction in worldwide production because of animal pests. The potential economic impact of cereal crop pests has resulted in substantial research efforts into the understanding of pest agroecosystems and the development of pest management strategy. Management strategy is frequently informed by models that describe the population dynamics of important crop pests, and because of the economic impact of these pests, many models have been developed. Yet limited effort has been made to compare and contrast models for their strategic applicability and quality. One of the most damaging pests of wheat in North America is the Russian wheat aphid, Diuraphis noxia (Kurdjumov). Eighteen D. noxia population dynamic models were developed from the literature to describe pest intensity. The strongest models quantified the negative effects of fall and spring precipitation on aphid intensity, and the positive effects associated with alternate food source availability. The population dynamic models were transformed into spatially explicit models and combined to form a spatially explicit, model-averaged result. Our findings were used to delineate pest intensity on winter wheat across much of the Great Plains and will help improve D. noxia management strategy. PMID:23321099

  13. QUANTIFYING THE EVOLVING MAGNETIC STRUCTURE OF ACTIVE REGIONS

    SciTech Connect

    Conlon, Paul A.; McAteer, R.T. James; Gallagher, Peter T.; Fennell, Linda, E-mail: mcateer@nmsu.ed [School of Physics, Trinity College Dublin, Dublin 2 (Ireland)

    2010-10-10

    The topical and controversial issue of parameterizing the magnetic structure of solar active regions has vital implications in the understanding of how these structures form, evolve, produce solar flares, and decay. This interdisciplinary and ill-constrained problem of quantifying complexity is addressed by using a two-dimensional wavelet transform modulus maxima (WTMM) method to study the multifractal properties of active region photospheric magnetic fields. The WTMM method provides an adaptive space-scale partition of a fractal distribution, from which one can extract the multifractal spectra. The use of a novel segmentation procedure allows us to remove the quiet Sun component and reliably study the evolution of active region multifractal parameters. It is shown that prior to the onset of solar flares, the magnetic field undergoes restructuring as Dirac-like features (with a Hoelder exponent, h = -1) coalesce to form step functions (where h = 0). The resulting configuration has a higher concentration of gradients along neutral line features. We propose that when sufficient flux is present in an active region for a period of time, it must be structured with a fractal dimension greater than 1.2, and a Hoelder exponent greater than -0.7, in order to produce M- and X-class flares. This result has immediate applications in the study of the underlying physics of active region evolution and space weather forecasting.

  14. Quantifying protein diffusion and capture on filaments.

    PubMed

    Reithmann, Emanuel; Reese, Louis; Frey, Erwin

    2015-02-17

    The functional relevance of regulating proteins is often limited to specific binding sites such as the ends of microtubules or actin-filaments. A localization of proteins on these functional sites is of great importance. We present a quantitative theory for a diffusion and capture process, where proteins diffuse on a filament and stop diffusing when reaching the filament's end. It is found that end-association after one-dimensional diffusion is the main source for tip-localization of such proteins. As a consequence, diffusion and capture is highly efficient in enhancing the reaction velocity of enzymatic reactions, where proteins and filament ends are to each other as enzyme and substrate. We show that the reaction velocity can effectively be described within a Michaelis-Menten framework. Together, one-dimensional diffusion and capture beats the (three-dimensional) Smoluchowski diffusion limit for the rate of protein association to filament ends. PMID:25692582
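
    The Michaelis-Menten reduction invoked here is easy to state numerically: the reaction velocity follows v = Vmax·[S]/(Km + [S]), saturating once filament ends are abundant. The sketch below evaluates this form with arbitrary illustrative constants, not values from the paper.

        import numpy as np

        # Michaelis-Menten form for enzyme-like kinetics of proteins (enzyme)
        # acting on filament ends (substrate). Constants are illustrative.
        v_max = 2.0     # saturating reaction velocity (events/s)
        k_m   = 0.5     # Michaelis constant (uM): substrate level at half v_max

        s = np.array([0.1, 0.5, 1.0, 5.0, 20.0])   # filament-end concentration (uM)
        v = v_max * s / (k_m + s)

        for si, vi in zip(s, v):
            print(f"[S] = {si:5.1f} uM -> v = {vi:.2f} events/s")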

  15. Quantifying protein diffusion and capture on filaments

    E-print Network

    Reithmann, Emanuel; Frey, Erwin

    2015-01-01

    The functional relevance of regulating proteins is often limited to specific binding sites such as the ends of microtubules or actin-filaments. A localization of proteins on these functional sites is of great importance. We present a quantitative theory for a diffusion and capture process, where proteins diffuse on a filament and stop diffusing when reaching the filament's end. It is found that end-association after one-dimensional diffusion is the main source for tip-localization of such proteins. As a consequence, diffusion and capture is highly efficient in enhancing the reaction velocity of enzymatic reactions, where proteins and filament ends are to each other as enzyme and substrate. We show that the reaction velocity can effectively be described within a Michaelis-Menten framework. Together one-dimensional diffusion and capture beats the (three-dimensional) Smoluchowski diffusion limit for the rate of protein association to filament ends.

  16. Quantifying protein diffusion and capture on filaments

    E-print Network

    Emanuel Reithmann; Louis Reese; Erwin Frey

    2015-03-03

    The functional relevance of regulating proteins is often limited to specific binding sites such as the ends of microtubules or actin-filaments. A localization of proteins on these functional sites is of great importance. We present a quantitative theory for a diffusion and capture process, where proteins diffuse on a filament and stop diffusing when reaching the filament's end. It is found that end-association after one-dimensional diffusion is the main source for tip-localization of such proteins. As a consequence, diffusion and capture is highly efficient in enhancing the reaction velocity of enzymatic reactions, where proteins and filament ends are to each other as enzyme and substrate. We show that the reaction velocity can effectively be described within a Michaelis-Menten framework. Together one-dimensional diffusion and capture beats the (three-dimensional) Smoluchowski diffusion limit for the rate of protein association to filament ends.

  17. Quantifying the benefits of vehicle pooling with shareability networks

    PubMed Central

    Santi, Paolo; Resta, Giovanni; Szell, Michael; Sobolevsky, Stanislav; Strogatz, Steven H.; Ratti, Carlo

    2014-01-01

    Taxi services are a vital part of urban transportation, and a considerable contributor to traffic congestion and air pollution causing substantial adverse effects on human health. Sharing taxi trips is a possible way of reducing the negative impact of taxi services on cities, but this comes at the expense of passenger discomfort quantifiable in terms of a longer travel time. Due to computational challenges, taxi sharing has traditionally been approached on small scales, such as within airport perimeters, or with dynamical ad hoc heuristics. However, a mathematical framework for the systematic understanding of the tradeoff between collective benefits of sharing and individual passenger discomfort is lacking. Here we introduce the notion of shareability network, which allows us to model the collective benefits of sharing as a function of passenger inconvenience, and to efficiently compute optimal sharing strategies on massive datasets. We apply this framework to a dataset of millions of taxi trips taken in New York City, showing that with increasing but still relatively low passenger discomfort, cumulative trip length can be cut by 40% or more. This benefit comes with reductions in service cost, emissions, and with split fares, hinting toward a wide passenger acceptance of such a shared service. Simulation of a realistic online system demonstrates the feasibility of a shareable taxi service in New York City. Shareability as a function of trip density saturates fast, suggesting effectiveness of the taxi sharing system also in cities with much sparser taxi fleets or when willingness to share is low. PMID:25197046
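
    The notion admits a compact illustration. In the sketch below, trips are nodes, an edge links two trips whose shared route would add at most a fixed discomfort budget, and a greedy matching heuristic pairs trips along cheap edges. The trip set, pairwise detour minutes and budget are invented stand-ins for the paper's routing computations, and the greedy pass is a simplification of optimal matching.

        # Shareability network sketch: nodes are trips, edges are shareable pairs.
        # All trip names, detour costs and the budget are invented for illustration.
        trips = ["A", "B", "C", "D"]
        detour = {                      # minutes of added travel if the pair shares
            ("A", "B"): 4, ("A", "C"): 12, ("A", "D"): 7,
            ("B", "C"): 3, ("B", "D"): 15, ("C", "D"): 5,
        }
        budget = 8                      # max passenger discomfort tolerated (minutes)

        # Keep only pairs within the discomfort budget, cheapest first.
        edges = sorted((d, pair) for pair, d in detour.items() if d <= budget)

        # Greedy matching: repeatedly take the cheapest edge with both trips unpaired.
        paired = set()
        matching = []
        for d, (u, v) in edges:
            if u not in paired and v not in paired:
                matching.append((u, v, d))
                paired.update((u, v))

        for u, v, d in matching:
            print(f"share {u}+{v}: {d} min added")
        print(f"{len(paired)} of {len(trips)} trips shared")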

  18. Quantifying the benefits of vehicle pooling with shareability networks.

    PubMed

    Santi, Paolo; Resta, Giovanni; Szell, Michael; Sobolevsky, Stanislav; Strogatz, Steven H; Ratti, Carlo

    2014-09-16

    Taxi services are a vital part of urban transportation, and a considerable contributor to traffic congestion and air pollution causing substantial adverse effects on human health. Sharing taxi trips is a possible way of reducing the negative impact of taxi services on cities, but this comes at the expense of passenger discomfort quantifiable in terms of a longer travel time. Due to computational challenges, taxi sharing has traditionally been approached on small scales, such as within airport perimeters, or with dynamical ad hoc heuristics. However, a mathematical framework for the systematic understanding of the tradeoff between collective benefits of sharing and individual passenger discomfort is lacking. Here we introduce the notion of shareability network, which allows us to model the collective benefits of sharing as a function of passenger inconvenience, and to efficiently compute optimal sharing strategies on massive datasets. We apply this framework to a dataset of millions of taxi trips taken in New York City, showing that with increasing but still relatively low passenger discomfort, cumulative trip length can be cut by 40% or more. This benefit comes with reductions in service cost, emissions, and with split fares, hinting toward a wide passenger acceptance of such a shared service. Simulation of a realistic online system demonstrates the feasibility of a shareable taxi service in New York City. Shareability as a function of trip density saturates fast, suggesting effectiveness of the taxi sharing system also in cities with much sparser taxi fleets or when willingness to share is low. PMID:25197046

  19. Quantifying photometric observing conditions on Paranal using an IR camera

    NASA Astrophysics Data System (ADS)

    Kerber, Florian; Querel, Richard R.; Hanuschik, Reinhard

    2014-08-01

    A Low Humidity and Temperature Profiling (LHATPRO) microwave radiometer, manufactured by Radiometer Physics GmbH (RPG), is used to monitor sky conditions over ESO's Paranal observatory in support of VLT science operations. In addition to measuring precipitable water vapour (PWV), the instrument also contains an IR camera measuring sky brightness temperature at 10.5 µm. Due to its extended operating range down to -100 °C, it is capable of detecting very cold and very thin, even sub-visual, cirrus clouds. We present a set of instrument flux calibration values compared with a detrended fluctuation analysis (DFA) of the IR camera's zenith-looking sky brightness data measured above Paranal over the past two years. We show that it is possible to quantify photometric observing conditions and that the method is highly sensitive to the presence of even very thin clouds, but robust against variations of sky brightness caused by effects other than clouds, such as variations of precipitable water vapour. Hence it can be used to determine photometric conditions for science operations. About 60% of nights are free of clouds on Paranal. More work will be required to classify the clouds using this technique. In the future this approach might become part of VLT science operations for evaluating nightly sky conditions.
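
    The DFA mentioned above can be sketched in a few lines: integrate the series, detrend it in windows of size n, and track how the RMS fluctuation F(n) grows with n. The synthetic input below stands in for the camera's sky brightness record; it is not the instrument's actual data pipeline.

```python
# Minimal detrended fluctuation analysis (DFA) sketch.
import numpy as np

def dfa(x, scales):
    """RMS fluctuation F(n) of the integrated, window-detrended
    profile for each window size n in `scales`."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for n in scales:
        n_win = len(y) // n
        rms = []
        for k in range(n_win):
            seg = y[k * n:(k + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # linear detrend
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        F.append(np.mean(rms))
    return np.array(F)

rng = np.random.default_rng(0)
sky = rng.normal(size=4096)                # stand-in for sky brightness data
scales = np.array([16, 32, 64, 128, 256])
F = dfa(sky, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print("DFA exponent alpha ~", round(alpha, 2))  # ~0.5 for white noise
```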

  20. Parkinson's Law quantified: three investigations on bureaucratic inefficiency

    NASA Astrophysics Data System (ADS)

    Klimek, Peter; Hanel, Rudolf; Thurner, Stefan

    2009-03-01

    We formulate three famous, descriptive essays of Parkinson on bureaucratic inefficiency in a quantifiable and dynamical socio-physical framework. In the first model we show how the use of recent opinion formation models for small groups can be used to understand Parkinson's observation that decision-making bodies such as cabinets or boards become highly inefficient once their size exceeds a critical 'Coefficient of Inefficiency', typically around 20. A second observation of Parkinson—which is sometimes referred to as Parkinson's Law—is that the growth of bureaucratic or administrative bodies usually goes hand in hand with a drastic decrease of its overall efficiency. In our second model we view a bureaucratic body as a system of a flow of workers, who enter, become promoted to various internal levels within the system over time, and leave the system after having served for a certain time. Promotion usually is associated with an increase of subordinates. Within the proposed model it becomes possible to work out the phase diagram under which conditions of bureaucratic growth can be confined. In our last model we assign individual efficiency curves to workers throughout their life in administration, and compute the optimum time to give them the old age pension, in order to ensure a maximum of efficiency within the body—in Parkinson's words we compute the 'Pension Point'.

  1. Quantifying cross-correlations using local and global detrending approaches

    NASA Astrophysics Data System (ADS)

    Podobnik, B.; Grosse, I.; Horvatić, D.; Ilic, S.; Ivanov, P. Ch.; Stanley, H. E.

    2009-09-01

    In order to quantify the long-range cross-correlations between two time series quantitatively, we introduce a new cross-correlation test QCC(m), where m is the number of degrees of freedom. If there are no cross-correlations between two time series, the cross-correlation test agrees well with the χ2(m) distribution. If the cross-correlation test exceeds the critical value of the χ2(m) distribution, then we say that the cross-correlations are significant. We show that if a Fourier phase-randomization procedure is carried out on a power-law cross-correlated time series, the cross-correlation test is substantially reduced compared to the case before Fourier phase randomization. We also study the effect of periodic trends on systems with power-law cross-correlations. We find that periodic trends can severely affect the quantitative analysis of long-range correlations, leading to crossovers and other spurious deviations from power laws, implying that both local and global detrending approaches should be applied to properly uncover long-range power-law auto-correlations and cross-correlations in the random part of the underlying stochastic process.
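
    A sketch of the test as described in the abstract, assuming a Ljung-Box-like form QCC(m) = N^2 * sum_i C_i^2 / (N - i) over sample cross-correlations C_i at lags 1..m; the paper's exact normalization may differ.

```python
# Chi-square style cross-correlation test (form assumed from the abstract).
import numpy as np
from scipy.stats import chi2

def qcc(x, y, m):
    """Cross-correlation test statistic over lags 1..m."""
    N = len(x)
    x = x - x.mean()
    y = y - y.mean()
    denom = np.sqrt(np.sum(x**2) * np.sum(y**2))
    q = 0.0
    for i in range(1, m + 1):
        ci = np.sum(x[i:] * y[:-i]) / denom    # cross-correlation at lag i
        q += ci**2 / (N - i)
    return N**2 * q

rng = np.random.default_rng(1)
x, y = rng.normal(size=2000), rng.normal(size=2000)
m = 10
stat = qcc(x, y, m)
crit = chi2.ppf(0.95, df=m)                    # chi2(m) critical value
print(f"QCC({m}) = {stat:.1f}, chi2 95% critical value = {crit:.1f}")
# stat > crit would indicate significant cross-correlations.
```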

  2. Quantifying contributions to the recent temperature variability in the tropical tropopause layer

    NASA Astrophysics Data System (ADS)

    Wang, W.; Matthes, K.; Schmidt, T.

    2014-08-01

    The recently observed variability in the tropical tropopause layer, which features an unexpected warming of 1.1 K over the past decade (2001-2011), is investigated with a number of sensitivity experiments from simulations with NCAR's CESM-WACCM chemistry climate model. The experiments have been designed to specifically quantify the contributions from natural as well as anthropogenic factors, such as solar variability (Solar), sea surface temperatures (SSTs), the Quasi-Biennial Oscillation (QBO), stratospheric aerosols (Aerosol), greenhouse gases (GHGs), as well as the dependence on the vertical resolution in the model. The results show that, in the TTL: a cooling in tropical SSTs leads to a weakening of tropical upwelling around the tropical tropopause and hence relative downwelling and adiabatic warming of 0.3 K decade-1; an increased QBO amplitude results in a 0.3 K decade-1 warming; increasing aerosols in the lower stratosphere lead to a 0.4 K decade-1 warming; a prolonged solar minimum and increased GHGs contribute about 0.2 and 0.1 K decade-1 to a cooling, respectively. Two simulations with different vertical resolution show that the vertical resolution can strongly influence the response of the TTL temperature to changes such as SSTs. With higher vertical resolution, an extra 0.6 K decade-1 warming can be simulated through the last decade, compared with results from the "standard" low vertical resolution simulation. Considering all the factors mentioned above, we compute a net 1.3 K decade-1 warming, which is in very good agreement with the observed 1.1 K decade-1 warming over the past decade in the TTL. The model results indicate that the recent warming in the TTL is mainly due to internal variability, i.e. the QBO and tropical SSTs.
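
    The budget quoted in this abstract is a simple signed sum of the decadal contributions; a worked check:

```python
# The abstract's decadal TTL temperature budget, summed term by term
# (values in K per decade, signs as described above).
contributions = {
    "tropical SST cooling (weakened upwelling)": +0.3,
    "increased QBO amplitude":                   +0.3,
    "stratospheric aerosols":                    +0.4,
    "prolonged solar minimum":                   -0.2,
    "increasing GHGs":                           -0.1,
    "higher vertical resolution":                +0.6,
}
net = sum(contributions.values())
print(f"net TTL trend: {net:+.1f} K per decade")  # +1.3, vs. observed +1.1
```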

  3. Quantifying the Restorable Water Volume of California's Sierra Nevada Meadows

    NASA Astrophysics Data System (ADS)

    Emmons, J. D.; Yarnell, S. M.; Fryjoff-Hung, A.; Viers, J.

    2013-12-01

    The Sierra Nevada is estimated to provide over 66% of California's water supply, which is largely derived from snowmelt. Global climate warming is expected to result in a decrease in snow pack and an increase in melting rate, making the attenuation of snowmelt, by any means, an important ecosystem service for ensuring water availability. Montane meadows are dispersed throughout the mountain range and can act like natural reservoirs, and also provide wildlife habitat, water filtration, and water storage. Despite the important role of meadows in the Sierra Nevada, a large proportion is degraded from stream incision, which increases volume outflows and reduces overbank flooding, thus reducing infiltration and potential water storage. Restoration of meadow stream channels would therefore improve hydrological functioning, including increased water storage. The potential water holding capacity of restored meadows has yet to be quantified, thus this research seeks to address this knowledge gap by estimating the restorable water volume due to stream incision. More than 17,000 meadows were analyzed by categorizing their erosion potential using channel slope and soil texture, ultimately resulting in six general erodibility types. Field measurements of over 100 meadows, stratified by latitude, elevation, and geologic substrate, were then taken and analyzed for each erodibility type to determine average depth of incision. Restorable water volume was then quantified as a function of water holding capacity of the soil, meadow area, and incised depth. Total restorable water volume was found to be 120 x 10^6 m3, or approximately 97,000 acre-feet. Using 95% confidence intervals for incised depth, the upper and lower bounds of the total restorable water volume were found to be 107 - 140 x 10^6 m3. Though this estimate of restorable water volume is small in regards to the storage capacity of typical California reservoirs, restoration of Sierra Nevada meadows remains an important objective. Storage of water in meadows benefits California wildlife, potentially attenuates floods, and elevates base flows, which can ease the effects on the spring recession curve of the expected decline in Sierran snowpack with atmospheric warming.
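
    The stated calculation, restorable volume as a function of soil water holding capacity, meadow area, and incised depth, reduces to a product-sum over erodibility types. All numbers below are illustrative stand-ins, not the study's measurements.

```python
# Back-of-envelope restorable-volume calculation:
# volume = meadow area x incised depth x soil water-holding fraction,
# summed over erodibility types. Values are hypothetical.
meadow_types = [
    # (total area m^2, mean incised depth m, water-holding fraction)
    (40e6, 0.8, 0.35),
    (25e6, 1.2, 0.30),
    (30e6, 0.5, 0.40),
]
total_m3 = sum(a * d * w for a, d, w in meadow_types)
acre_feet = total_m3 / 1233.48          # 1 acre-foot = 1233.48 m^3
print(f"restorable volume: {total_m3/1e6:.1f} x 10^6 m^3 "
      f"(~{acre_feet:,.0f} acre-feet)")
```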

  4. Quantifiable effectiveness of experimental scaling of river- and delta morphodynamics and stratigraphy

    NASA Astrophysics Data System (ADS)

    Kleinhans, Maarten G.; van Dijk, Wout M.; van de Lageweg, Wietse I.; Hoyal, David C. J. D.; Markies, Henk; van Maarseveen, Marcel; Roosendaal, Chris; van Weesep, Wendell; van Breemen, Dimitri; Hoendervoogt, Remko; Cheshier, Nathan

    2014-06-01

    Laboratory experiments to simulate landscapes and stratigraphy often suffer from scale effects, because reducing length and time scales leads to different behaviour of water and sediment. Classically, scaling proceeded from dimensional analysis of the equations of motion and sediment transport, and minor concessions, such as vertical length scale distortion, led to acceptable results. In the past decade many experiments were done that seriously violated these scaling rules, but nevertheless produced significant and insightful results that resemble the real world in quantifiable ways. Here we focus on self-formed fluvial channels and channel patterns in experiments. The objectives of this paper are 1) to identify what aspects of scaling considerations are most important for experiments that simulate morphodynamics and stratigraphy of rivers and deltas, and 2) to establish a design strategy for experiments based on a combination of relaxed classical scale rules, theory of bars and meanders, and small-scale experiments focussed on specific processes. We present a number of small laboratory setups and protocols that we use to rapidly quantify erosional and depositional types of forms and dynamics that develop in the landscape experiments as a function of detailed properties, such as effective material strength, and to assess potential scale effects. Most importantly, the width-to-depth ratio of channels determines the bar pattern and meandering tendency. The strength of floodplain material determines these channel dimensions, and theory predicts that laboratory rivers should have 1.5 times larger width-to-depth ratios for the same bar pattern. We show how floodplain formation can be controlled by adding silt-sized silica flour, bentonite, Medicago sativa (alfalfa) or Partially Hydrolyzed PolyAcrylamide (a synthetic polymer) to poorly sorted sediment. The experiments demonstrate that there is a narrow range of conditions between no mobility of bed or banks, and too much mobility. The density of vegetation and the volume proportion of silt allow well-controllable channel dimensions, whereas the polymer proved difficult to control. The theory, detailed methods of quantification, and experimental setups presented here show that the rivers and deltas created in the laboratory seem to behave as natural rivers when the experimental conditions adhere to the relaxed scaling rules identified herein, and that required types of fluvio-deltaic morphodynamics can be reproduced based on conditions and sediments selected on the basis of a series of small-scale experiments.

  5. Quantifying oil filtration effects on bearing life

    NASA Technical Reports Server (NTRS)

    Needelman, William M.; Zaretsky, Erwin V.

    1991-01-01

    Rolling-element bearing life is influenced by the number, size, and material properties of particles entering the Hertzian contact of the rolling element and raceway. In general, rolling-element bearing life increases with increasing level of oil filtration. Based upon test results, two equations are presented which allow for the adjustment of bearing L10 or catalog life based upon oil filter rating. It is recommended that where no oil filtration is used, catalog life be reduced by 50 percent.

  6. Quantifying the Impact of Dust on Heterogeneous Ice Generation in Midlevel Supercooled Stratiform Clouds

    SciTech Connect

    Zhang, Damao; Wang, Zhien; Heymsfield, Andrew J.; Fan, Jiwen; Liu, Dong; Zhao, Ming

    2012-09-26

    Dust aerosols have been regarded as effective ice nuclei (IN), but large uncertainties regarding their efficiencies remain. Here, four years of collocated CALIPSO and CloudSat measurements are used to quantify the impact of dust on heterogeneous ice generation in midlevel supercooled stratiform clouds (MSSCs) over the ‘dust belt’. The results show that the dusty MSSCs have an up to 20% higher mixed-phase cloud occurrence, up to 8 dBZ higher mean maximum Ze (Ze_max), and up to 11.5 g/m2 higher ice water path (IWP) than similar MSSCs under background aerosol conditions. Assuming similar ice growth and fallout history in similar MSSCs, the significant differences in Ze_max between dusty and non-dusty MSSCs reflect ice particle number concentration differences. Therefore, observed Ze_max differences indicate that dust could enhance ice particle concentration in MSSCs by a factor of 2 to 6 at temperatures colder than -12°C. The enhancements are strongly dependent on the cloud top temperature, large dust particle concentration and chemical compositions. These results imply an important role of dust particles in modifying mixed-phase cloud properties globally.
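
    The step from Ze_max differences to concentration enhancements rests on reflectivity scaling with ice number concentration when size distributions are similar, so a difference in dB maps to a linear factor of 10^(dB/10). A quick check of the numbers quoted above:

```python
# Convert reflectivity differences in dBZ to linear factors; under the
# abstract's similar-size-distribution assumption, the factor indexes the
# ice number concentration enhancement.
for dbz_diff in (3, 6, 8):
    factor = 10 ** (dbz_diff / 10)
    print(f"{dbz_diff} dBZ difference -> factor {factor:.1f} in linear Ze")
# 8 dBZ -> ~6.3x, consistent with the factor of 2 to 6 quoted above.
```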

  7. ZebIAT, an image analysis tool for registering zebrafish embryos and quantifying cancer metastasis

    PubMed Central

    2013-01-01

    Background Zebrafish embryos have recently been established as a xenotransplantation model of the metastatic behaviour of primary human tumours. Current tools for automated data extraction from the microscope images are restrictive concerning the developmental stage of the embryos, usually require laborious manual image preprocessing, and, in general, cannot characterize the metastasis as a function of the internal organs. Methods We present a tool, ZebIAT, that allows both automatic and semi-automatic registration of the outer contour and inner organs of zebrafish embryos. ZebIAT provides a registration at different stages of development and an automatic analysis of cancer metastasis per organ, thus allowing the study of cancer progression. The semi-automation relies on a graphical user interface. Results We quantified the performance of the registration method and found it to be accurate, except in some of the smallest organs. Our results show that the accuracy of registering small organs can be improved by introducing a few manual corrections. We also demonstrate the applicability of the tool to studies of cancer progression. Conclusions ZebIAT offers a major improvement relative to previous tools by allowing an analysis on a per-organ or region basis. It should be of use in high-throughput studies of cancer metastasis in zebrafish embryos. PMID:24267347

  8. Quantifying fluvial sediment flux on a monsoonal mega-river: the Mekong

    NASA Astrophysics Data System (ADS)

    Parsons, D. R.; Darby, S. E.; Hackney, C. R.; Best, J.; Aalto, R. E.; Nicholas, A. P.; Leyland, J.

    2013-12-01

    Quantifying sediment fluxes, and distinguishing between bed-load and suspended-load transport, on large rivers remains a significant challenge. It is increasingly apparent that prediction of large river morphodynamics in response to environmental change requires a robust quantification of sediment fluxes across a range of discharges. Such quantification becomes even more problematic for monsoonal rivers, where large non-linearities in hydrological-sediment relations exist. This paper, as part of the NERC-funded STELAR-S2S project (www.stelar-s2s.org), presents a series of repeat multibeam sonar bed surveys and acoustic calibrations that allow simultaneous quantification of bed-load transport and suspended-load fluxes in the lower Mekong River. Results show how multibeam sonar can be used to map bedform evolution across a range of time scales and produce robust estimates of bed-load, whilst acoustic backscatter calibration to suspended sediment load can be used in combination with Doppler flow velocity estimates to recover full sediment fluxes at the reach scale. The methods, estimates of error, and implications of the results for the function of large river systems will be discussed.
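
    One standard way (not necessarily the authors') to turn repeat multibeam surveys into a bed-load estimate is the bedform-tracking relation q_b = (1 - p) * c * H / 2, with dune height H, migration rate c, and bed porosity p. The values below are illustrative.

```python
# Bedform-tracking bed-load estimate from repeat bathymetry (illustrative).
porosity = 0.35      # sand-bed porosity (assumed)
dune_height = 2.0    # m, from differencing repeat multibeam surveys
migration = 15.0     # m/day, from cross-correlating successive surveys

# Unit bed-load flux in m^3 per metre of channel width per day.
qb = (1 - porosity) * migration * dune_height / 2
print(f"unit bed-load flux ~ {qb:.1f} m^3/m/day")
```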

  9. Quantifying the sensitivity of North Atlantic cyclone development to atmospheric precursor fields

    NASA Astrophysics Data System (ADS)

    Gray, S. L.; Dacre, H. F.

    2012-04-01

    North Atlantic cyclones can develop due to a wide variety of mechanisms. This makes it difficult to determine typical evolution characteristics and hence to quantify the relative importance of factors contributing to their development. In this study cyclones identified in the ERA-Interim, 6-hourly, reanalysis fields have been clustered according to their genesis location (west or east Atlantic) and their time to maximum intensity. These relatively homogeneous clusters allow relevant mean precursor fields (such as upper-level potential vorticity) to be created, whilst differences among cyclones in each cluster provide diversity. Using a sensitivity analysis technique, the linear relationships between maximum cyclone intensity and various atmospheric precursors for each cyclone cluster have been calculated. Confidence in the linear relationship is calculated and the spatial variability of the precursor fields taken into account. This standardisation of the sensitivity results allows quantitative comparison among different precursor fields to be performed. Results from this quantitative representative sensitivity analysis show that the maximum intensity of cyclones originating in the east Atlantic is more sensitive to diabatic processes than for those originating in the west Atlantic. This suggests that east Atlantic cyclones may be particularly sensitive to climate warming.

  10. Technical Note: Mesocosm approach to quantify dissolved inorganic carbon percolation fluxes

    NASA Astrophysics Data System (ADS)

    Thaysen, E. M.; Jessen, S.; Ambus, P.; Beier, C.; Postma, D.; Jakobsen, I.

    2014-02-01

    Dissolved inorganic carbon (DIC) fluxes across the vadose zone are influenced by a complex interplay of biological, chemical and physical factors. A novel soil mesocosm system was evaluated as a tool for providing information on the mechanisms behind DIC percolation to the groundwater from unplanted soil. Carbon dioxide partial pressure (pCO2), alkalinity, soil moisture and temperature were measured with depth and time, and DIC in the percolate was quantified using a sodium hydroxide trap. Results showed good reproducibility between two replicate mesocosms. The pCO2 varied between 0.2 and 1.1%, and the alkalinity was 0.1-0.6 meq L-1. The measured cumulative effluent DIC flux over the 78-day experimental period was 185-196 mg L-1 m-2, in the same range as estimates derived from pCO2 and alkalinity in samples extracted from the side of the mesocosm column and the drainage flux. Our results indicate that the mesocosm system is a promising tool for studying DIC percolation fluxes and other biogeochemical transport processes in unsaturated environments.

  11. Map showing predicted habitat potential for tortoise

    USGS Multimedia Gallery

    This map shows the spatial representation of the predicted habitat potential index values for desert tortoise in the Mojave and parts of the Sonoran Deserts of California, Nevada, Utah, and Arizona. Map: USGS. ...

  12. Incident Response Planning for Selected Livestock Shows

    E-print Network

    Tomascik, Chelsea Roxanne

    2012-02-14

    Incidents affecting the livestock industry are unavoidable in today's society. These incidents can happen at livestock shows across the country putting thousands of exhibitors, visitors, employees and livestock in danger. The purpose of this study...

  13. More Dangerous Ebola Strain Unlikely, Study Shows

    MedlinePLUS

    Researchers compared virus samples ... Thursday, March 26, 2015 (HealthDay News) -- Ebola likely ... Related MedlinePlus page: Ebola.

  14. Ebola Drug Shows Promise in Monkey Trial

    MedlinePLUS

    Experimental medicine ... Mozes, Tuesday, February 10, 2015 (HealthDay News) -- An investigational ... Related MedlinePlus page: Ebola.

  15. Quantifying the effects of melittin on liposomes.

    PubMed

    Popplewell, J F; Swann, M J; Freeman, N J; McDonnell, C; Ford, R C

    2007-01-01

    Melittin, the soluble peptide of bee venom, has been demonstrated to induce lysis of phospholipid liposomes. We have investigated the dependence of the lytic activity of melittin on lipid composition. The lysis of liposomes, measured by following their mass and dimensions when immobilised on a solid substrate, was close to zero when the negatively charged lipids phosphatidyl glycerol or phosphatidyl serine were used as the phospholipid component of the liposome. Whilst there was significant binding of melittin to the liposomes, there was little net change in their diameter, with melittin binding reversed upon salt injection. For the zwitterionic phosphatidyl choline the lytic ability of melittin is dependent on the degree of acyl chain unsaturation, with melittin able to induce lysis of liposomes in the liquid crystalline state, whilst those in the gel state showed strong resistance to lysis. By directly measuring the dimensions and mass changes of liposomes on exposure to melittin using Dual Polarisation Interferometry, rather than following the fluorescence of entrapped dyes, we attained further information about the initial stages of melittin binding to liposomes. PMID:17092481

  16. A NEW METHOD TO QUANTIFY CORE TEMPERATURE INSTABILITY IN RODENTS.

    EPA Science Inventory

    Methods to quantify instability of autonomic systems such as temperature regulation should be important in toxicant and drug safety studies. Stability of core temperature (Tc) in laboratory rodents is susceptible to a variety of stimuli. Calculating the temperature differential o...

  17. Quantifying emissions reductions from New England offshore wind energy resources

    E-print Network

    Berlinski, Michael Peter

    2006-01-01

    Access to straightforward yet robust tools to quantify the impact of renewable energy resources on air emissions from fossil fuel power plants is important to governments aiming to improve air quality and reduce greenhouse ...

  18. Quantifying the Performability of Cluster-Based Services

    E-print Network

    Kiran Nagaraja, Gustavo Gama, Ricardo Bianchini, Richard P. Martin, Wagner Meira Jr., and Thu ...

  19. Evolutionary modification of development in mammalian teeth: Quantifying gene expression patterns

    E-print Network

    Jernvall, Jukka

    Quantifying gene expression patterns using Geographic Information Systems. We investigated how genetic markers for epithelial signaling centers ... usually involve little initial modification of morphology. One system that offers promise for linking ...

  20. Quantifying the evidence for biodiversity effects on ecosystem functioning and services

    E-print Network

    Chave, Jérôme

    Slide fragments on biodiversity-ecosystem service relationships (productivity; biodiversity; multitrophic). Quantifying the evidence for biodiversity effects on ecosystem functioning and services. Balvanera ...

  1. Quantifying environmental limiting factors on tree cover using geospatial data.

    PubMed

    Greenberg, Jonathan A; Santos, Maria J; Dobrowski, Solomon Z; Vanderbilt, Vern C; Ustin, Susan L

    2015-01-01

    Environmental limiting factors (ELFs) are the thresholds that determine the maximum or minimum biological response for a given suite of environmental conditions. We asked the following questions: 1) Can we detect ELFs on percent tree cover across the eastern slopes of the Lake Tahoe Basin, NV? 2) How are the ELFs distributed spatially? 3) To what extent are unmeasured environmental factors limiting tree cover? ELFs are difficult to quantify as they require significant sample sizes. We addressed this by using geospatial data over a relatively large spatial extent, where the wall-to-wall sampling ensures the inclusion of rare data points which define the minimum or maximum response to environmental factors. We tested mean temperature, minimum temperature, potential evapotranspiration (PET) and PET minus precipitation (PET-P) as potential limiting factors on percent tree cover. We found that the study area showed system-wide limitations on tree cover, and each of the factors showed evidence of being limiting on tree cover. However, only 1.2% of the total area appeared to be limited by the four (4) environmental factors, suggesting other unmeasured factors are limiting much of the tree cover in the study area. Where sites were near their theoretical maximum, non-forest sites (tree cover < 25%) were primarily limited by cold mean temperatures, open-canopy forest sites (tree cover between 25% and 60%) were primarily limited by evaporative demand, and closed-canopy forests were not limited by any particular environmental factor. The detection of ELFs is necessary in order to fully understand the width of limitations that species experience within their geographic range. PMID:25692604

  2. Quantifying Environmental Limiting Factors on Tree Cover Using Geospatial Data

    PubMed Central

    Greenberg, Jonathan A.; Santos, Maria J.; Dobrowski, Solomon Z.; Vanderbilt, Vern C.; Ustin, Susan L.

    2015-01-01

    Environmental limiting factors (ELFs) are the thresholds that determine the maximum or minimum biological response for a given suite of environmental conditions. We asked the following questions: 1) Can we detect ELFs on percent tree cover across the eastern slopes of the Lake Tahoe Basin, NV? 2) How are the ELFs distributed spatially? 3) To what extent are unmeasured environmental factors limiting tree cover? ELFs are difficult to quantify as they require significant sample sizes. We addressed this by using geospatial data over a relatively large spatial extent, where the wall-to-wall sampling ensures the inclusion of rare data points which define the minimum or maximum response to environmental factors. We tested mean temperature, minimum temperature, potential evapotranspiration (PET) and PET minus precipitation (PET-P) as potential limiting factors on percent tree cover. We found that the study area showed system-wide limitations on tree cover, and each of the factors showed evidence of being limiting on tree cover. However, only 1.2% of the total area appeared to be limited by the four (4) environmental factors, suggesting other unmeasured factors are limiting much of the tree cover in the study area. Where sites were near their theoretical maximum, non-forest sites (tree cover < 25%) were primarily limited by cold mean temperatures, open-canopy forest sites (tree cover between 25% and 60%) were primarily limited by evaporative demand, and closed-canopy forests were not limited by any particular environmental factor. The detection of ELFs is necessary in order to fully understand the width of limitations that species experience within their geographic range. PMID:25692604

  3. Educational Outreach: The Space Science Road Show

    NASA Astrophysics Data System (ADS)

    Cox, N. L. J.

    2002-01-01

    The poster presented will give an overview of a study towards a "Space Road Show". The topic of this show is space science. The target group is adolescents, aged 12 to 15, at Dutch high schools. The show and its accompanying experiments would be supported with suitable educational material. Science teachers at schools can decide for themselves if they want to use this material in advance, afterwards or not at all. The aims of this outreach effort are: to motivate students for space science and engineering, to help them understand the importance of (space) research, to give them a positive feeling about the possibilities offered by space and in the process give them useful knowledge on space basics. The show revolves around three main themes: applications, science and society. First the students will get some historical background on the importance of space/astronomy to civilization. Secondly they will learn more about novel uses of space. On the one hand they will learn of "Views on Earth" involving technologies like Remote Sensing (or Spying), Communication, Broadcasting, GPS and Telemedicine. On the other hand they will experience "Views on Space" illustrated by past, present and future space research missions, like the space exploration missions (Cassini/Huygens, Mars Express and Rosetta) and the astronomy missions (Soho and XMM). Meanwhile, the students will learn more about the technology of launchers and satellites needed to accomplish these space missions. Throughout the show and especially towards the end attention will be paid to the third theme "Why go to space"? Other reasons for people to get into space will be explored. An important question in this is the commercial (manned) exploration of space. Thus, the questions of benefit of space to society are integrated in the entire show. It raises some fundamental questions about the effects of space travel on our environment, poverty and other moral issues. The show attempts to connect scientific with community thought. The difficulty with a show this elaborate and intricate is communicating on a level understandable for teenagers, whilst not treating them like children. Professional space scientists know how easy it is to lose oneself in technical specifics. This would, of course, only confuse young people. The author would like to discuss the ideas for this show with a knowledgeable audience and hopefully get some (constructive) feedback.

  4. Quantifying the power of multiple event interpretations

    NASA Astrophysics Data System (ADS)

    Chien, Yang-Ting; Farhi, David; Krohn, David; Marantan, Andrew; Mateos, David Lopez; Schwartz, Matthew

    2014-12-01

    A number of methods have been proposed recently which exploit multiple highly-correlated interpretations of events, or of jets within an event. For example, Qjets reclusters a jet multiple times and telescoping jets uses multiple cone sizes. Previous work has employed these methods in pseudo-experimental analyses and found that, with a simplified statistical treatment, they give sizable improvements over traditional methods. In this paper, the improvement gained from multiple event interpretations is explored with methods much closer to those used in real experiments. To this end, we derive and study a generalized extended maximum likelihood procedure, and find that using multiple jet radii can provide substantial benefit over a single radius in fitting procedures. Another major concern we address is that multiple event interpretations might be exploiting similar information to that already present in the standard kinematic variables. We perform multivariate analyses (boosted decision trees) on a set of standard kinematic variables, a single observable computed with several different cone sizes, and both sets combined. We find that using multiple radii is still helpful even on top of standard kinematic variables (providing a 12% improvement at low pT and 20% at high pT). These results suggest that including multiple event interpretations in a realistic search for Higgs decays to bottom quark pairs would give additional sensitivity over traditional approaches.

  5. Quantifying the Intercellular Forces during Drosophila Morphogenesis

    NASA Astrophysics Data System (ADS)

    Ma, Xiaoyan; Hutson, M. Shane

    2006-03-01

    In many models of morphogenesis, cellular movements are driven by differences in interfacial tension along cell-cell boundaries. We have developed a microsurgical method to determine these tensions in living fruit fly (Drosophila) embryos. Cell edges in these embryos are labeled with green fluorescent protein chimeras; and line scan images that intersect several cell edges are recorded with a laser-scanning confocal microscope at a time resolution of 2 ms. While recording these scans, a Q-switched Nd:YAG laser is used to cut a single cell edge. The recoil of adjacent cell edges is evident in the line scans and the time-dependent cell edge positions are extracted using custom ImageJ plugins based on the Lucas-Kanade algorithm. The post-incision recoil velocities of cell edges are determined by fitting the cell edge positions to a double exponential function. In addition, a power spectrum analysis of cell-edge position fluctuations is used to determine the viscous damping constant. In the regime of low Reynolds number, the tension along a cell-cell boundary is well-approximated by the product of the viscous damping constant and the initial recoil velocity of adjacent cell edges. We will present initial results from two stages of Drosophila development -- germ band retraction and early dorsal closure.

  6. Quantifying seismic survey reverberation off the Alaskan North Slope.

    PubMed

    Guerra, Melania; Thode, Aaron M; Blackwell, Susanna B; Michael Macrander, A

    2011-11-01

    Shallow-water airgun survey activities off the North Slope of Alaska generate impulsive sounds that are the focus of much regulatory attention. Reverberation from repetitive airgun shots, however, can also increase background noise levels, which can decrease the detection range of nearby passive acoustic monitoring (PAM) systems. Typical acoustic metrics for impulsive signals provide no quantitative information about reverberation or its relative effect on the ambient acoustic environment. Here, two conservative metrics are defined for quantifying reverberation: a minimum level metric measures reverberation levels that exist between airgun pulse arrivals, while a reverberation metric estimates the relative magnitude of reverberation vs expected ambient levels in the hypothetical absence of airgun activity, using satellite-measured wind data. The metrics are applied to acoustic data measured by autonomous recorders in the Alaskan Beaufort Sea in 2008 and demonstrate how seismic surveys can increase the background noise over natural ambient levels by 30-45 dB within 1 km of the activity, by 10-25 dB within 15 km of the activity, and by a few dB at 128 km range. These results suggest that shallow-water reverberation would reduce the performance of nearby PAM systems when monitoring for marine mammals within a few kilometers of shallow-water seismic surveys. PMID:22087932
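
    A toy version of the minimum-level metric defined above: take the quietest received level in each inter-pulse window. The sampling rate, pulse schedule, and decay model below are invented for illustration; real use would operate on calibrated sound pressure levels from the recorders.

```python
# Minimum-level reverberation metric on a synthetic received-level record.
import numpy as np

fs = 100                                   # samples per second (assumed)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(2)
level = 90 + 5 * rng.normal(size=t.size)   # dB, stand-in ambient levels
shot_times = np.arange(0, 60, 10)          # one airgun pulse every 10 s
for st in shot_times:                      # add a decaying reverberant tail
    mask = t >= st
    level[mask] += 40 * np.exp(-(t[mask] - st) / 2.0)

min_levels = []
for s0, s1 in zip(shot_times[:-1], shot_times[1:]):
    window = level[(t > s0 + 1) & (t < s1)]   # skip the pulse itself
    min_levels.append(window.min())           # quietest inter-pulse level
print("minimum inter-pulse levels (dB):", np.round(min_levels, 1))
```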

  7. Quantifying uncertainty in brain network measures using Bayesian connectomics.

    PubMed

    Janssen, Ronald J; Hinne, Max; Heskes, Tom; van Gerven, Marcel A J

    2014-01-01

    The wiring diagram of the human brain can be described in terms of graph measures that characterize structural regularities. These measures require an estimate of whole-brain structural connectivity for which one may resort to deterministic or thresholded probabilistic streamlining procedures. While these procedures have provided important insights about the characteristics of human brain networks, they ultimately rely on unwarranted assumptions such as those of noise-free data or the use of an arbitrary threshold. Therefore, resulting structural connectivity estimates as well as derived graph measures fail to fully take into account the inherent uncertainty in the structural estimate. In this paper, we illustrate an easy way of obtaining posterior distributions over graph metrics using Bayesian inference. It is shown that this posterior distribution can be used to quantify uncertainty about graph-theoretical measures at the single subject level, thereby providing a more nuanced view of the graph-theoretical properties of human brain connectivity. We refer to this model-based approach to connectivity analysis as Bayesian connectomics. PMID:25339896

  8. Quantifying uncertainty in brain network measures using Bayesian connectomics

    PubMed Central

    Janssen, Ronald J.; Hinne, Max; Heskes, Tom; van Gerven, Marcel A. J.

    2014-01-01

    The wiring diagram of the human brain can be described in terms of graph measures that characterize structural regularities. These measures require an estimate of whole-brain structural connectivity for which one may resort to deterministic or thresholded probabilistic streamlining procedures. While these procedures have provided important insights about the characteristics of human brain networks, they ultimately rely on unwarranted assumptions such as those of noise-free data or the use of an arbitrary threshold. Therefore, resulting structural connectivity estimates as well as derived graph measures fail to fully take into account the inherent uncertainty in the structural estimate. In this paper, we illustrate an easy way of obtaining posterior distributions over graph metrics using Bayesian inference. It is shown that this posterior distribution can be used to quantify uncertainty about graph-theoretical measures at the single subject level, thereby providing a more nuanced view of the graph-theoretical properties of human brain connectivity. We refer to this model-based approach to connectivity analysis as Bayesian connectomics. PMID:25339896
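
    The approach above propagates posterior uncertainty in the connectome into graph measures by recomputing the measure on each posterior sample. The sketch below replaces the paper's actual inference with a stand-in Bernoulli edge model, purely to show the mechanics of posterior distributions over a graph metric.

```python
# Posterior distribution over a graph measure from sampled connectomes.
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
n = 20
p_edge = rng.uniform(0.1, 0.9, size=(n, n))   # stand-in posterior edge probs
p_edge = np.triu(p_edge, 1)                   # upper triangle only

clustering = []
for _ in range(200):                          # posterior predictive draws
    A = rng.random((n, n)) < p_edge           # sample an adjacency matrix
    G = nx.from_numpy_array((A | A.T).astype(int))
    clustering.append(nx.average_clustering(G))

lo, hi = np.percentile(clustering, [2.5, 97.5])
print(f"average clustering: 95% interval [{lo:.3f}, {hi:.3f}]")
```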

  9. Quantifying the Rheological and Hemodynamic Characteristics of Sickle Cell Anemia

    PubMed Central

    Lei, Huan; Karniadakis, George Em

    2012-01-01

    Sickle erythrocytes exhibit abnormal morphology and membrane mechanics under deoxygenated conditions due to the polymerization of hemoglobin S. We employed dissipative particle dynamics to extend a validated multiscale model of red blood cells (RBCs) to represent different sickle cell morphologies based on a simulated annealing procedure and experimental observations. We quantified cell distortion using asphericity and elliptical shape factors, and the results were consistent with a medical image analysis. We then studied the rheology and dynamics of sickle RBC suspensions under constant shear and in a tube. In shear flow, the transition from shear-thinning to shear-independent flow revealed a profound effect of cell membrane stiffening during deoxygenation, with granular RBC shapes leading to the greatest viscosity. In tube flow, the increase of flow resistance by granular RBCs was also greater than the resistance of blood flow with sickle-shape RBCs. However, no occlusion was observed in a straight tube under any conditions unless an adhesive dynamics model was explicitly incorporated into simulations that partially trapped sickle RBCs, which led to full occlusion in some cases. PMID:22339854
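
    Asphericity, one of the shape factors used above, can be computed from the gyration-tensor eigenvalues of a cell's surface points. The normalization below is one common convention, not necessarily the paper's exact definition.

```python
# Asphericity from the gyration tensor of a point cloud.
import numpy as np

def asphericity(points):
    """[(l1-l2)^2 + (l2-l3)^2 + (l3-l1)^2] / (2 * (l1+l2+l3)^2):
    0 for a sphere, approaching 1 for a rod; l_i are eigenvalues
    of the gyration tensor."""
    r = points - points.mean(axis=0)
    S = r.T @ r / len(points)            # gyration tensor
    l1, l2, l3 = np.linalg.eigvalsh(S)
    num = (l1 - l2)**2 + (l2 - l3)**2 + (l3 - l1)**2
    return num / (2 * (l1 + l2 + l3)**2)

rng = np.random.default_rng(4)
sphere = rng.normal(size=(2000, 3))                  # isotropic cloud
rod = sphere * np.array([5.0, 0.2, 0.2])             # elongated cloud
print("sphere:", round(asphericity(sphere), 3))      # ~0
print("rod:   ", round(asphericity(rod), 3))         # close to 1
```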

  10. Measuring the gap: quantifying and comparing local health inequalities.

    PubMed

    Low, Anne; Low, Allan

    2004-12-01

    Primary Care Trusts (PCTs) and Local Strategic Partnerships (LSPs) are being asked to assess local health inequalities in order to prioritize local action, to set local targets for reducing levels of health inequality locally and to demonstrate measurable progress. Despite this, little guidance has been provided on how to quantify health inequalities within PCTs and LSPs. This paper advocates the use of a metric, the slope index of inequality, which provides a consistent measure of health inequalities across local populations. The metric can be presented as a relative gap, which is easily understood and enables levels of inequality to be compared between health conditions, lifestyles and rates of service provision at any one time, or across different time periods. The metric is applied to Sunderland Teaching PCT, using routine data sources. Examples of the results and their uses are presented. It is suggested that more widespread use of the metric could enable levels of health inequalities to be compared across PCTs and lead to the development of local health inequality and inequity benchmarks. PMID:15598860
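
    A sketch of the slope index of inequality: order population groups by deprivation, place each at the midpoint of its cumulative population share, and take the population-weighted regression slope of the outcome on that rank. All figures below are made up for illustration.

```python
# Slope index of inequality (SII) via weighted least squares.
import numpy as np

pop = np.array([90, 130, 100, 150, 120])     # group populations, ordered
rate = np.array([200, 240, 260, 290, 310])   # least to most deprived;
                                             # e.g. deaths per 100,000

share = pop / pop.sum()
midpoint = np.cumsum(share) - share / 2      # relative rank in [0, 1]

# Population-weighted regression slope of rate on midpoint rank:
w = share
xbar = np.sum(w * midpoint)
ybar = np.sum(w * rate)
sii = (np.sum(w * (midpoint - xbar) * (rate - ybar))
       / np.sum(w * (midpoint - xbar) ** 2))
print(f"SII = {sii:.0f} per 100,000; relative gap = {sii / ybar:.1%}")
```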

  11. Quantifying the abnormal hemodynamics of sickle cell anemia

    NASA Astrophysics Data System (ADS)

    Lei, Huan; Karniadakis, George

    2012-02-01

    Sickle red blood cells (SS-RBC) exhibit heterogeneous morphologies and abnormal hemodynamics in deoxygenated states. A multi-scale model for SS-RBC is developed based on the Dissipative Particle Dynamics (DPD) method. Different cell morphologies (sickle, granular, elongated shapes) typically observed in deoxygenated states are constructed and quantified by the asphericity and elliptical shape factors. The hemodynamics of SS-RBC suspensions is studied in both shear and pipe flow systems. The flow resistance obtained from both systems exhibits a larger value than that of healthy blood flow due to the abnormal cell properties. Moreover, SS-RBCs exhibit abnormal adhesive interactions with both the vessel endothelium cells and the leukocytes. The effect of the abnormal adhesive interactions on the hemodynamics of sickle blood is investigated using the current model. It is found that both the SS-RBC-endothelium and the SS-RBC-leukocyte interactions can potentially trigger the vicious "sickling and entrapment" cycles, resulting in the vaso-occlusion phenomena widely observed in micro-circulation experiments.

  12. Quantum Process Tomography Quantifies Coherence Transfer Dynamics in Vibrational Exciton

    PubMed Central

    Chuntonov, Lev; Ma, Jianqiang

    2013-01-01

    Quantum coherence has been a subject of great interest in many scientific disciplines. However, detailed characterization of the quantum coherence in molecular systems, especially its transfer and relaxation mechanisms, still remains a major challenge. The difficulties arise in part because the spectroscopic signatures of the coherence transfer are typically overwhelmed by other excitation relaxation processes. We use quantum process tomography (QPT) via two-dimensional infrared spectroscopy to quantify the rate of the elusive coherence transfer between two vibrational exciton states. QPT retrieves the dynamics of the dissipative quantum system directly from the experimental observables. It thus serves as an experimental alternative to theoretical models of the system-bath interaction, and can be used to validate these theories. Our results for coupled carbonyl groups of a diketone molecule in chloroform, used as a benchmark system, reveal the non-secular nature of the interaction between the exciton and the Markovian bath and open the door for systematic, detailed studies of the dynamics of dissipative quantum systems. PMID:24079417

  13. Quantifying interictal metabolic activity in human temporal lobe epilepsy

    SciTech Connect

    Henry, T.R.; Mazziotta, J.C.; Engel, J. Jr.; Christenson, P.D.; Zhang, J.X.; Phelps, M.E.; Kuhl, D.E. (Univ. of California, Los Angeles (USA))

    1990-09-01

    The majority of patients with complex partial seizures of unilateral temporal lobe origin have interictal temporal hypometabolism on (18F)fluorodeoxyglucose positron emission tomography (FDG PET) studies. Often, this hypometabolism extends to ipsilateral extratemporal sites. The use of accurately quantified metabolic data has been limited by the absence of an equally reliable method of anatomical analysis of PET images. We developed a standardized method for visual placement of anatomically configured regions of interest on FDG PET studies, which is particularly adapted to the widespread, asymmetric, and often severe interictal metabolic alterations of temporal lobe epilepsy. This method was applied by a single investigator, who was blind to the identity of subjects, to 10 normal control and 25 interictal temporal lobe epilepsy studies. All subjects had normal brain anatomical volumes on structural neuroimaging studies. The results demonstrate ipsilateral thalamic and temporal lobe involvement in the interictal hypometabolism of unilateral temporal lobe epilepsy. Ipsilateral frontal, parietal, and basal ganglial metabolism is also reduced, although not as markedly as is temporal and thalamic metabolism.

  14. Cardiovascular regulation during sleep quantified by symbolic coupling traces

    NASA Astrophysics Data System (ADS)

    Suhrbier, A.; Riedl, M.; Malberg, H.; Penzel, T.; Bretthauer, G.; Kurths, J.; Wessel, N.

    2010-12-01

    Sleep is a complex regulated process with short periods of wakefulness and different sleep stages. These sleep stages modulate autonomous functions such as blood pressure and heart rate. The method of symbolic coupling traces (SCT) is used to analyze and quantify time-delayed coupling of these measurements during different sleep stages. The symbolic coupling traces, defined as the symmetric and diametric traces of the bivariate word distribution matrix, allow the quantification of time-delayed coupling. In this paper, the method is applied to heart rate and systolic blood pressure time series during different sleep stages for healthy controls as well as for normotensive and hypertensive patients with sleep apneas. Using the SCT, significant different cardiovascular mechanisms not only between the deep sleep and the other sleep stages but also between healthy subjects and patients can be revealed. The SCT method is applied to model systems, compared with established methods, such as cross correlation, mutual information, and cross recurrence analysis and demonstrates its advantages especially for nonstationary physiological data. As a result, SCT proves to be more specific in detecting delays of directional interactions than standard coupling analysis methods and yields additional information which cannot be measured by standard parameters of heart rate and blood pressure variability. The proposed method may help to indicate the pathological changes in cardiovascular regulation and also the effects of continuous positive airway pressure therapy on the cardiovascular system.

  15. Rapidly quantifying the relative distention of a human bladder

    NASA Technical Reports Server (NTRS)

    Companion, John A. (inventor); Heyman, Joseph S. (inventor); Mineo, Beth A. (inventor); Cavalier, Albert R. (inventor); Blalock, Travis N. (inventor)

    1989-01-01

    A device and method of rapidly quantifying the relative distention of the bladder in a human subject are disclosed. The ultrasonic transducer, which is positioned on the subject in proximity to the bladder, is excited by a pulser under the command of a microprocessor to launch an acoustic wave into the patient. This wave interacts with the bladder walls and is reflected back to the ultrasonic transducer, where it is received, amplified, and processed by the receiver. The resulting signal is digitized by an analog-to-digital converter under the command of the microprocessor and is stored in the data memory. The software in the microprocessor determines the relative distention of the bladder as a function of the propagated ultrasonic energy and, based on programmed scientific measurements and the individual, anatomical, and behavioral characteristics of the specific subject as contained in the program memory, sends out a signal to turn on any or all of the audible alarm, the visible alarm, the tactile alarm, and the remote wireless alarm.

  16. A diagnostic for quantifying heat flux from a thermite spray

    SciTech Connect

    E. P. Nixon; M. L. Pantoya; D. J. Prentice; E. D. Steffler; M. A. Daniels; S. P. D'Arche

    2010-02-01

    Characterizing the combustion behaviors of energetic materials requires diagnostic tools that are often not readily or commercially available. For example, a jet of thermite spray provides a high temperature and pressure reaction that can also be highly corrosive and promote undesirable conditions for the survivability of any sensor. Developing a diagnostic to quantify heat flux from a thermite spray is the objective of this study. Quick response sensors such as thin film heat flux sensors cannot survive the harsh conditions of the spray, but more rugged sensors lack the response time for the resolution desired. A sensor that will allow for adequate response time while surviving the entire test duration was constructed. The sensor outputs interior temperatures of the probes at known locations and utilizes an inverse heat conduction code to calculate heat flux values. The details of this device are discussed and illustrated. Temperature and heat flux measurements of various thermite sprays are reported. Results indicate that this newly designed heat flux sensor provides quantitative data with good repeatability suitable for characterizing energetic material combustion.

  17. Quantifying protein sequences with reference to the genetic code.

    PubMed

    Hannon Bozorgmehr, Joseph E

    2015-05-01

    Although the analysis of protein molecules is extensive, their primary sequences have yet to be quantified like their mass or size. The composition and particular arrangement of amino acids in proteins confers the distinct biochemical functionality, but it remains unclear why only a tiny proportion of possible character combinations are potentially functional. Here, I offer a simple but effective technique, utilizing the assignment of codons in the genetic code, that permits the quantification of polypeptide sequences and establishes statistical parameters through which they can now be numerically compared. Two main tests were conducted, one analyzing the composition and the other the specific order of the amino acids within the primary sequence. The results confirm that natural proteins are significantly different from random heteropolymers of equivalent size, although this is much more marginal in the case of the arrangement than it is for the composition. Moreover, they reveal that there are key patterns that have hitherto not been identified, relevant to the study of the evolution of proteins, and which raise doubts about the plausibility of some purported cases of the de novo origination of protein-coding genes from intergenic DNA. Despite the fact that the applicability of quantification to the design of novel proteins is probably limited, it nonetheless provides a useful guideline that could complement much more precise methods. PMID:25728786

  18. Source Inversion Validation: Quantifying Uncertainties in Earthquake Source Inversions

    NASA Astrophysics Data System (ADS)

    Mai, P. M.; Page, M. T.; Schorlemmer, D.

    2010-12-01

    Earthquake source inversions image the spatio-temporal rupture evolution on one or more fault planes using seismic and/or geodetic data. Source inversion methods thus represent an important research tool in seismology to unravel the complexity of earthquake ruptures. Subsequently, source-inversion results are used to study earthquake mechanics, to develop spontaneous dynamic rupture models, to build models for generating rupture realizations for ground-motion simulations, and to perform Coulomb-stress modeling. In all these applications, the underlying finite-source rupture models are treated as “data” (input information), but the uncertainties in these data (i.e. source models obtained from solving an inherently ill-posed inverse problem) are hardly known, and almost always neglected. The Source Inversion Validation (SIV) project attempts to better understand the intra-event variability of earthquake rupture models. We plan to build a long-standing and rigorous testing platform to examine the current state-of-the-art in earthquake source inversion that also facilitates the development of robust approaches to quantifying rupture-model uncertainties. Our contribution reviews the current status of the SIV project, recent forward-modeling tests for point and extended sources in layered media, and discusses the strategy of the SIV project for the coming years.

  19. Quantifying self-absorption losses in luminescent solar concentrators.

    PubMed

    Ten Kate, Otmar M; Hooning, Koen M; van der Kolk, Erik

    2014-08-10

    Analytical equations quantifying self-absorption losses in circular luminescent solar concentrators (LSCs) are presented that can easily be solved numerically by commercial math software packages. With the quantum efficiency, the absorption and emission spectra of a luminescent material, the LSC dimensions, and the refractive index as the only input parameters, the model gives an accurate account of the decrease of LSC efficiency due to self-absorption as a function of LSC radius, thickness, and luminescence quantum efficiency. Results give insight into how many times light is reabsorbed and reemitted, the red shift of the emission spectrum, and how multiple reabsorptions and reemissions are distributed over the LSC. As an example case, the equations were solved for a circular LSC containing a Lumogen F Red 305 dye with 80% luminescence quantum efficiency, and it follows that for an LSC with a 50 cm radius the self-absorption reduces the number of photons reaching the LSC edge by a factor of four compared to the case when there would be no self-absorption. The equations can just as well be solved for any material for which the optical properties are known, like type I and type II quantum dots. PMID:25320934

  20. Quantifying hurricane-induced coastal changes using topographic lidar

    USGS Publications Warehouse

    Sallenger, Asbury H., Jr.; Krabill, William; Swift, Robert; Brock, John

    2001-01-01

    USGS and NASA are investigating the impacts of hurricanes on the United States East and Gulf of Mexico coasts with the ultimate objective of improving predictive capabilities. The cornerstone of our effort is to use topographic lidar to acquire pre- and post-storm topography to quantify changes to beaches and dunes. With its rapidity of acquisition and very high density, lidar is revolutionizing the quantification of storm-induced coastal change. Lidar surveys have been acquired for the East and Gulf coasts to serve as pre-storm baselines. Within a few days of a hurricane landfall anywhere within the study area, the impacted area will be resurveyed to detect changes. For example, during 1999, Hurricane Dennis impacted the northern North Carolina coast. Along a 70-km length of coast between Cape Hatteras and Oregon Inlet, there was large variability in the types of impacts including overwash, dune erosion, dune stability, and even accretion at the base of dunes. These types of impacts were arranged in coherent patterns that repeated along the coast over scales of tens of kilometers. Preliminary results suggest the variability is related to the influence of offshore shoals that induce longshore gradients in wave energy by wave refraction.
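
    The underlying change detection is a grid difference of post- and pre-storm elevation models. A toy cross-shore profile with synthetic dune topography illustrates the idea; the shapes and numbers are invented.

```python
# Pre/post-storm elevation differencing on a synthetic dune profile.
import numpy as np

x = np.linspace(0, 700, 71)                 # cross-shore distance (m)
pre = 4 * np.exp(-((x - 350) / 60) ** 2)    # pre-storm dune, 4 m high
post = 2.5 * np.exp(-((x - 370) / 80) ** 2) # post-storm: lower, shifted

dz = post - pre                             # elevation change (m)
eroded = np.trapz(np.clip(-dz, 0, None), x) # eroded volume per m of coast
print(f"peak lowering: {dz.min():.2f} m; eroded volume ~ {eroded:.0f} m^3/m")
```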

  1. Quantifying Potential Error in Painting Breast Excision Specimens

    PubMed Central

    Godden, Amy

    2013-01-01

    Aim. When excision margins are close or involved following breast conserving surgery, many surgeons will attempt to reexcise the corresponding cavity margin. Margins are ascribed to breast specimens such that six faces are identifiable to the pathologist, a process that may be prone to error at several stages. Methods. An experimental model was designed according to stated criteria in order to answer the research question. Computer software was used to measure the surface areas of experimental surfaces to compare human-painted surfaces with experimental controls. Results. The variability of the hand-painted surfaces was considerable. Thirty percent of hand-painted surfaces were 20% larger or smaller than controls. The mean area of the last surface painted was significantly larger than controls (mean 58996 pixels versus 50096 pixels, CI 1477–16324, P = 0.014). By chance, each of the six volunteers chose to paint the deep surface last. Conclusion. This study is the first to attempt to quantify the extent of human error in marking imaginary boundaries on a breast excision model and suggests that humans do not make these judgements well, raising questions about the safety of targeting single margins at reexcision. PMID:23762569

  2. Graphical methods for quantifying macromolecules through bright field imaging

    PubMed Central

    Chang, Hang; DeFilippis, Rosa Anna; Tlsty, Thea D.; Parvin, Bahram

    2009-01-01

    Bright field imaging of biological samples stained with antibodies and/or special stains provides a rapid protocol for visualizing various macromolecules. However, this method of sample staining and imaging is rarely employed for direct quantitative analysis due to variations in sample fixation, ambiguities introduced by color composition, and the limited dynamic range of imaging instruments. We demonstrate that, through the decomposition of color signals, staining can be scored on a cell-by-cell basis. We have applied our method to fibroblasts grown from histologically normal breast tissue biopsies obtained from two distinct populations. Initially, nuclear regions are segmented through conversion of color images into gray scale and detection of dark elliptic features. Subsequently, the strength of staining is quantified by a color decomposition model that is optimized by a graph cut algorithm. In rare cases where the nuclear signal is significantly altered as a result of sample preparation, nuclear segmentation can be validated and corrected. Finally, segmented stained patterns are associated with each nuclear region following region-based tessellation. Compared to classical non-negative matrix factorization, the proposed method (i) improves color decomposition, (ii) has better noise immunity, (iii) is more robust to initial conditions, and (iv) has superior computing performance. Contact: hchang@lbl.gov PMID:18703588
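
    The underlying color-decomposition step can be illustrated separately from the paper's graph-cut optimization. The Python sketch below performs a non-negative decomposition of one pixel into stain contributions under a Beer-Lambert assumption; the stain vectors are hypothetical placeholders, not the calibrated vectors a real pipeline would estimate from control slides.

        import numpy as np
        from scipy.optimize import nnls

        # Hypothetical stain color vectors in optical-density (OD) space
        stains = np.array([[0.65, 0.70, 0.29],     # hematoxylin-like (assumed)
                           [0.27, 0.57, 0.78]]).T  # DAB-like (assumed)

        def optical_density(rgb, background=255.0):
            """Beer-Lambert conversion of RGB intensities to optical density."""
            rgb = np.clip(np.asarray(rgb, float), 1.0, background)
            return -np.log(rgb / background)

        def stain_strengths(pixel_rgb):
            """Non-negative least-squares decomposition of one pixel."""
            coeffs, _ = nnls(stains, optical_density(pixel_rgb))
            return coeffs

        print(stain_strengths([120, 80, 160]))  # per-stain strengths for one pixel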

  3. Quantifying determinants of cash crop expansion and their relative effects using logistic regression modeling and variance partitioning

    NASA Astrophysics Data System (ADS)

    Xiao, Rui; Su, Shiliang; Mai, Gengchen; Zhang, Zhonghao; Yang, Chenxue

    2015-02-01

    Cash crop expansion has been a major land use change in tropical and subtropical regions worldwide. Quantifying the determinants of cash crop expansion should provide deeper spatial insights into the dynamics and ecological consequences of cash crop expansion. This paper investigated the process of cash crop expansion in the Hangzhou region (China) from 1985 to 2009 using remotely sensed data. The corresponding determinants (neighborhood, physical, and proximity) and their relative effects during three periods (1985-1994, 1994-2003, and 2003-2009) were quantified by logistic regression modeling and variance partitioning. Results showed that the total area of cash crops increased from 58,874.1 ha in 1985 to 90,375.1 ha in 2009, a net growth of 53.5%. Cash crops were more likely to grow in loam soils. Steep areas at higher elevation were less likely to experience cash crop expansion. A consistently higher probability of cash crop expansion was found in places with abundant farmland and forest cover in all three periods. In addition, distance to river and lake, distance to county center, and distance to provincial road were decisive determinants of farmers' choice of cash crop plantation. Different categories of determinants and their combinations exerted different influences on cash crop expansion. The joint effects of neighborhood and proximity determinants were the strongest, and the unique effect of physical determinants decreased with time. Our study contributes to the understanding of the proximate drivers of cash crop expansion in subtropical regions.
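
    The partitioning logic can be sketched by fitting logistic models on nested groups of predictors and attributing differences in explained variation to each group. The Python below uses synthetic data and McFadden's pseudo-R^2 as the variation measure, which is one common choice rather than necessarily the paper's exact procedure; all variable names are illustrative.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import log_loss

        rng = np.random.default_rng(0)
        n = 1000
        X_neigh = rng.normal(size=(n, 2))  # stand-in: farmland/forest shares nearby
        X_phys = rng.normal(size=(n, 2))   # stand-in: elevation, slope
        X_prox = rng.normal(size=(n, 2))   # stand-in: distances to road, river
        y = (X_neigh[:, 0] + X_prox[:, 0] + rng.normal(size=n) > 0).astype(int)

        def mcfadden_r2(X, y):
            """McFadden pseudo-R^2 of a logistic regression model."""
            model = LogisticRegression(max_iter=1000).fit(X, y)
            ll_model = -log_loss(y, model.predict_proba(X), normalize=False)
            p = y.mean()
            ll_null = len(y) * (p * np.log(p) + (1 - p) * np.log(1 - p))
            return 1.0 - ll_model / ll_null

        r2_full = mcfadden_r2(np.hstack([X_neigh, X_phys, X_prox]), y)
        r2_without_phys = mcfadden_r2(np.hstack([X_neigh, X_prox]), y)
        # unique effect of the physical group = loss in fit when it is dropped
        print("unique effect of physical determinants:", r2_full - r2_without_phys)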

  4. Effective rates of heavy metal release from alkaline wastes--quantified by column outflow experiments and inverse simulations.

    PubMed

    Wehrer, Markus; Totsche, Kai Uwe

    2008-10-23

    Column outflow experiments operated at steady state flow conditions do not allow the identification of rate-limited release processes. This requires an alternative experimental methodology. In this study, the aim was to apply such a methodology in order to identify and quantify effective release rates of heavy metals from granular wastes. Column experiments were conducted with demolition waste and municipal solid waste incineration (MSWI) bottom ash using different flow velocities and multiple flow interruptions. The effluent was analyzed for heavy metals, DOC, electrical conductivity, and pH. The breakthrough curves were inversely modeled with a numerical code based on the advection-dispersion equation with first-order mass transfer and nonlinear interaction terms. Chromium, copper, nickel, and arsenic are usually released under non-equilibrium conditions. DOC might play a role as a carrier for these trace metals. Inverse simulations generally yield good model fits. Although some parameters are correlated and some model deficiencies can be revealed, we are able to deduce physically reasonable release mass-transfer time scales. Applying forward simulations, the parameter space with equifinal parameter sets was delineated. The results demonstrate that the presented experimental design is capable of identifying and quantifying non-equilibrium conditions. They also show that the possibility of rate-limited release must not be neglected in release and transport studies involving inorganic contaminants. PMID:18757112
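
    The diagnostic value of a flow interruption can be conveyed with a toy model: if release is rate-limited, effluent concentration rebounds during the pause. The Python sketch below idealizes the column as a mixed reactor with first-order mass transfer from the solid phase; all rate constants are illustrative assumptions, not fitted values from the study.

        import numpy as np
        from scipy.integrate import solve_ivp

        alpha, c_eq, q = 0.05, 1.0, 0.2  # transfer rate (1/h), equilib. conc., flushing rate (1/h)

        def rhs(t, c, flowing):
            transfer = alpha * (c_eq - c)        # rate-limited release from the solid
            flushing = q * c if flowing else 0.0 # advective removal only while flowing
            return transfer - flushing

        flow1 = solve_ivp(rhs, (0, 40), [0.2], args=(True,))
        pause = solve_ivp(rhs, (40, 64), flow1.y[:, -1], args=(False,))  # interruption
        flow2 = solve_ivp(rhs, (64, 100), pause.y[:, -1], args=(True,))
        print("conc. before vs. after interruption:",
              round(flow1.y[0, -1], 3), round(flow2.y[0, 0], 3))

    The concentration jump after the pause is the non-equilibrium signature the experiments exploit; under equilibrium release there would be no rebound.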

  5. Analysing, quantifying and modelling soil erosion on steep hillslopes in different climatic areas using LiDAR and SFM DEMs

    NASA Astrophysics Data System (ADS)

    Neugirg, Fabian; Haas, Florian; Kaiser, Andreas; Schmidt, Jürgen; Becht, Michael

    2014-05-01

    Soil erosion is a well-known problem worldwide and has therefore been the subject of various scientific studies, especially on agricultural areas. However, soil erosion on steep hillslopes in mountainous drainage basins can be a threat to human infrastructure, as it supplies material, e.g. for debris flows, to torrents. The study presented here aims to analyse, quantify and model soil erosion on (very) steep hillslopes free of vegetation in different climatic areas ranging from South Germany to Central Italy. Multitemporal digital elevation models were acquired with terrestrial laser scanning and from terrestrial and aerial structure-from-motion-based imagery. Analysis of erosion is mainly based on slope wash and rill erosion during summer months as well as erosion through freezing and melting processes during winter months in catchments of the Bavarian Alps. Erosional processes in the Mediterranean are mainly controlled by different precipitation regimes throughout the year. Annual erosion and accumulation rates are quantified and used for modelling purposes. First results of the presented project show that the amount of material eroded is mainly controlled by the size of the sediment contributing area. However, there are also other controlling factors, such as slope angle, slope length and vegetation cover, which are investigated within this project.
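
    Quantification from multitemporal DEMs typically reduces to a DEM of difference. The Python sketch below shows the arithmetic on synthetic grids; the cell size, level of detection, and elevation fields are illustrative assumptions, not survey values from the project.

        import numpy as np

        rng = np.random.default_rng(1)
        cell = 0.5                                     # grid resolution, m (assumed)
        dem_t0 = rng.normal(100.0, 2.0, (200, 200))    # synthetic first survey
        dem_t1 = dem_t0 + rng.normal(-0.02, 0.05, (200, 200))  # second survey, net lowering

        dz = dem_t1 - dem_t0                           # elevation change per cell
        lod = 0.03                                     # level of detection, m (assumed)
        dz[np.abs(dz) < lod] = 0.0                     # suppress changes below survey noise

        erosion = -dz[dz < 0].sum() * cell**2          # m^3 removed
        accumulation = dz[dz > 0].sum() * cell**2      # m^3 deposited
        print(f"erosion {erosion:.1f} m^3, accumulation {accumulation:.1f} m^3")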

  6. Comparing 3D Gyrification Index and area-independent curvature-based measures in quantifying neonatal brain folding

    NASA Astrophysics Data System (ADS)

    Rodriguez-Carranza, Claudia E.; Mukherjee, P.; Vigneron, Daniel; Barkovich, James; Studholme, Colin

    2007-03-01

    In this work we compare the 3D Gyrification Index and our recently proposed area-independent curvature-based surface measures [26] for the in-vivo quantification of brain surface folding in clinically acquired neonatal MR image data. A meaningful comparison of gyrification across brains of different sizes and their subregions will only be possible through the quantification of folding with measures that are independent of the area of the region of analysis. This work uses a 3D implementation of the classical Gyrification Index, a 2D measure that quantifies folding based on the ratio of the inner and outer contours of the brain and which has been used to study gyral patterns in adults with schizophrenia, among other conditions. The new surface curvature-based measures and the 3D Gyrification Index were calculated on twelve premature infants (age 28-37 weeks), from whom surfaces of the cerebrospinal fluid/gray matter (CSF/GM) interface and the gray matter/white matter (GM/WM) interface were extracted. Experimental results show that our measures quantify folding on the CSF/GM interface better than the Gyrification Index, and perform similarly on the GM/WM interface.
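
    The classical index's inner/outer contour ratio is easy to illustrate in 2D. The Python sketch below computes a contour-to-convex-hull length ratio for a synthetic folded outline; using the convex hull as the outer contour is a simplifying assumption for illustration, not the measure used in the paper.

        import numpy as np
        from scipy.spatial import ConvexHull

        theta = np.linspace(0, 2 * np.pi, 2000, endpoint=False)
        r = 1.0 + 0.15 * np.sin(12 * theta)          # a folded, brain-like outline
        pts = np.c_[r * np.cos(theta), r * np.sin(theta)]

        def contour_length(p):
            """Perimeter of a closed polygon given as ordered vertices."""
            d = np.diff(np.vstack([p, p[:1]]), axis=0)
            return np.hypot(d[:, 0], d[:, 1]).sum()

        hull = ConvexHull(pts)                        # outer ("unfolded") contour
        gi = contour_length(pts) / contour_length(pts[hull.vertices])
        print(f"gyrification index ~ {gi:.2f}")       # > 1 for folded contours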

  7. Effective rates of heavy metal release from alkaline wastes — Quantified by column outflow experiments and inverse simulations

    NASA Astrophysics Data System (ADS)

    Wehrer, Markus; Totsche, Kai Uwe

    2008-10-01

    Column outflow experiments operated at steady state flow conditions do not allow the identification of rate-limited release processes. This requires an alternative experimental methodology. In this study, the aim was to apply such a methodology in order to identify and quantify effective release rates of heavy metals from granular wastes. Column experiments were conducted with demolition waste and municipal solid waste incineration (MSWI) bottom ash using different flow velocities and multiple flow interruptions. The effluent was analyzed for heavy metals, DOC, electrical conductivity, and pH. The breakthrough curves were inversely modeled with a numerical code based on the advection-dispersion equation with first-order mass transfer and nonlinear interaction terms. Chromium, copper, nickel, and arsenic are usually released under non-equilibrium conditions. DOC might play a role as a carrier for these trace metals. Inverse simulations generally yield good model fits. Although some parameters are correlated and some model deficiencies can be revealed, we are able to deduce physically reasonable release mass-transfer time scales. Applying forward simulations, the parameter space with equifinal parameter sets was delineated. The results demonstrate that the presented experimental design is capable of identifying and quantifying non-equilibrium conditions. They also show that the possibility of rate-limited release must not be neglected in release and transport studies involving inorganic contaminants.

  8. Adhesion of subsets of human blood mononuclear cells to endothelial cells in vitro, as quantified by flow cytometry.

    PubMed

    Benschop, R J; de Smet, M B; Bloem, A C; Ballieux, R E

    1992-12-01

    Binding of leucocytes to endothelial cells (EC) is essential as an initial step in inflammatory responses. We present a rapid, non-radioactive method to measure adhesion of human lymphoid cells to EC using flow cytometry. Freshly isolated peripheral blood mononuclear cells (PBMC) were allowed to adhere to EC grown in 24-well plates. Non-adhering cells were removed, after which adhering cells and EC were dissociated using trypsin/EDTA. These samples were subsequently analysed by flow cytometry, using scatter properties to distinguish between adhering cells and EC. The ratio of the number of adhering leucocytes to the number of EC was calculated to quantify adhesion. Results of the flow cytometric adhesion assay were comparable to those obtained with a conventional adhesion assay using chromium-labelled cells. We additionally show that, using the flow cytometric adhesion assay, adhesion of the lymphocytes and monocytes present within the adhering PBMC can be quantified simultaneously. As a model, the contribution of LFA-1 (CD11a/CD18) and ICAM-1 (CD54) to adhesion of PBMC to EC was studied. It was found that adhesion of lymphocytes and monocytes is regulated differently by phorbol ester and that the relative contribution of LFA-1 and ICAM-1 differs for the two cell types. PMID:1361077

  9. Quantifying water content using GPR: impact of petrophysical variability

    NASA Astrophysics Data System (ADS)

    West, L. J.; Endres, A. L.

    2006-05-01

    Electromagnetic signal velocity measurements are commonly used to quantify liquid water contents of near-surface geomaterials. Typically, a single-valued function, such as Topp's equation, is used to convert dielectric permittivity (K) into water content (θ). Several factors contribute to error in such water content estimates, including use of incorrect petrophysical relationships, dependence of dielectric permittivity on pore-scale geometry, frequency-dispersive behavior (for example, caused by the presence of swelling clay minerals), and macroscopic heterogeneity that leads to incorrect estimates of dielectric permittivity from field data. We use field and laboratory measurements and synthetic examples to investigate the relative importance of these sources of error. Co-axial cell measurements on clean sand samples suggest that even where the K versus θ relationship is well characterized, heterogeneity in the distribution of water at the pore scale, for example caused by wetting-drying hysteresis, can lead to moisture content errors of ±2% volumetric water content, although more typically errors of <±1% can be expected. Measurements on sandstone samples suggest that larger errors, of up to ±5%, can arise at GPR frequencies (i.e., 100 MHz), resulting from the presence of swelling clay, although the influence of clay is less important at higher frequencies (i.e., >300 MHz). Similar-sized errors can result from variations in pore-scale geometry. However, experience suggests that the largest errors in water content measurements arise from assumptions concerning radar ray path geometry. GPR estimates of water content in macroscopically layered media made by assuming that the first radar wave arrivals are direct rays, whereas in fact these are critically refracted rays, can result in water content estimates that are inaccurate by up to 20%. Synthetic modeling to investigate the dependence of the magnitude of such errors on the geometric characteristics of heterogeneity will be presented.
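
    Topp's equation itself is a simple cubic in K, and K follows from the measured signal velocity. A minimal Python sketch of the two-step conversion; the velocity value is an illustrative example, not data from the study.

        C = 0.2998  # speed of light, m/ns

        def permittivity_from_velocity(v_m_per_ns):
            """Relative dielectric permittivity from EM signal velocity."""
            return (C / v_m_per_ns) ** 2

        def topp_theta(K):
            """Topp's (1980) empirical cubic: volumetric water content from K."""
            return -5.3e-2 + 2.92e-2 * K - 5.5e-4 * K**2 + 4.3e-6 * K**3

        v = 0.06  # m/ns, a typical wet-soil GPR velocity (assumed)
        K = permittivity_from_velocity(v)
        print(f"K = {K:.1f}, theta = {topp_theta(K):.3f}")  # ~0.40 for K ~ 25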

  10. Isotopes in Urban Cheatgrass Quantify Atmospheric Pollution

    NASA Astrophysics Data System (ADS)

    Kammerdiener, S. A.; Ehleringer, J. R.

    2006-12-01

    This study presents evidence that the nitrogen and carbon stable isotope values of vegetation can be used as integrators of ephemeral atmospheric pollution signals. Leaves and stems of Bromus tectorum and soil samples were collected in the urban Salt Lake Valley and in the rural Skull Valley of Utah. These samples were used to develop a map of the spatial distribution of δ13C and δ15N values of leaves and stems of Bromus tectorum and soils around each valley. The spatial distributions of δ15N values of leaves and stems of Bromus tectorum and associated soils were significantly correlated. The average δ15N value for Salt Lake Valley Bromus tectorum leaves and stems was 2.37‰, while the average value for Skull Valley Bromus tectorum leaves and stems was 4.76‰. It is possible that the higher concentration of atmospheric nitrogen pollutants measured in the Salt Lake Valley provided the δ15N-depleted nitrogen source for uptake by plants and deposition on soils, though the δ15N value of source nitrogen was not measured directly. The presence of a seasonal difference in δ15N values of leaves and stems of Bromus tectorum sampled in Salt Lake Valley but not in Skull Valley further supports this idea. Leaves and stems of Bromus tectorum sampled in the Salt Lake Valley in April 2003 had a statistically more positive average δ15N value of 2.4‰ than samples collected in August 2003, which had an average δ15N value of 0.90‰. The carbon isotope values of leaves and stems of Bromus tectorum and air samples collected in Salt Lake Valley were more negative than values measured in Skull Valley samples (Salt Lake δ13Cplant = -28.50‰ and δ13Cair = -9.32‰; Skull Valley δ13Cplant = -27.58‰ and δ13Cair = -8.52‰). This supports the idea that differences in stable isotope values of source air are correlated with differences in stable isotope values of exposed vegetation. Overall, the results of this study suggest that the carbon and nitrogen stable isotope values measured in vegetation are useful indicators of differences in atmospheric pollutant concentration in urban and rural areas.

  11. Global climate change: the quantifiable sustainability challenge.

    PubMed

    Princiotta, Frank T; Loughlin, Daniel H

    2014-09-01

    Population growth and the pressures spawned by increasing demands for energy and resource-intensive goods, foods, and services are driving unsustainable growth in greenhouse gas (GHG) emissions. Recent GHG emission trends are consistent with worst-case scenarios of the previous decade. Dramatic and near-term emission reductions likely will be needed to ameliorate the potential deleterious impacts of climate change. To achieve such reductions, fundamental changes are required in the way that energy is generated and used. New technologies must be developed and deployed at a rapid rate. Advances in carbon capture and storage, renewable, nuclear, and transportation technologies are particularly important; however, global research and development efforts related to these technologies currently appear to fall short relative to needs. Even with a proactive and international mitigation effort, humanity will need to adapt to climate change, but the adaptation needs and damages will be far greater if mitigation activities are not pursued in earnest. In this review, research is highlighted that indicates increasing global and regional temperatures and ties climate changes to increasing GHG emissions. GHG mitigation targets necessary for limiting future global temperature increases are discussed, including how factors such as population growth and the growing energy intensity of the developing world will make these reduction targets more challenging. Potential technological pathways for meeting emission reduction targets are examined, barriers are discussed, and global and U.S. modeling results are presented that suggest that the necessary pathways will require radically transformed electric and mobile sectors. While geoengineering options have been proposed to allow more time for serious emission reductions, these measures are at the conceptual stage with many unanswered cost, environmental, and political issues. Implications: This paper lays out the case that mitigating the potential for catastrophic climate change will be a monumental challenge, requiring the global community to transform its energy system in an aggressive, coordinated, and timely manner. If this challenge is to be met, new technologies will have to be developed and deployed at a rapid rate. Advances in carbon capture and storage, renewable, nuclear, and transportation technologies are particularly important. Even with an aggressive international mitigation effort, humanity will still need to adapt to significant climate change. PMID:25282995

  12. Quantifying mixing using magnetic resonance imaging.

    PubMed

    Tozzi, Emilio J; McCarthy, Kathryn L; Bacca, Lori A; Hartt, William H; McCarthy, Michael J

    2012-01-01

    Mixing is a unit operation that combines two or more components into a homogeneous mixture. This work involves mixing two viscous liquid streams using an in-line static mixer. The mixer is a split-and-recombine design that employs shear and extensional flow to increase the interfacial contact between the components. A prototype split-and-recombine (SAR) mixer was constructed by aligning a series of thin laser-cut poly(methyl methacrylate) (PMMA) plates held in place in a PVC pipe. Mixing in this device is illustrated in the photograph in Fig. 1. Red dye was added to a portion of the test fluid and used as the minor component being mixed into the major (undyed) component. At the inlet of the mixer, the injected layer of tracer fluid is split into two layers as it flows through the mixing section. In each subsequent mixing section, the number of horizontal layers is doubled. Ultimately, the single stream of dye is uniformly dispersed throughout the cross section of the device. Using a non-Newtonian test fluid of 0.2% Carbopol and a doped tracer fluid of similar composition, mixing in the unit is visualized using magnetic resonance imaging (MRI). MRI is a very powerful experimental probe of molecular chemical and physical environment as well as sample structure on length scales from microns to centimeters. This sensitivity has resulted in broad application of these techniques to characterize physical, chemical and/or biological properties of materials ranging from humans to foods to porous media (1, 2). The equipment and conditions used here are suitable for imaging liquids containing substantial amounts of NMR-mobile (1)H, such as ordinary water and organic liquids including oils. Traditionally, MRI has utilized superconducting magnets, which are not suitable for industrial environments and are not portable within a laboratory (Fig. 2). Recent advances in magnet technology have permitted the construction of large-volume, industrially compatible magnets suitable for imaging process flows. Here, MRI provides spatially resolved component concentrations at different axial locations during the mixing process. This work documents real-time mixing of highly viscous fluids via distributive mixing with an application to personal care products. PMID:22314707
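
    A common way to reduce such spatially resolved concentration maps to a single mixing measure is the coefficient of variation (CoV) across the cross section, which falls toward zero as the mixture homogenizes. A minimal Python sketch on synthetic maps; the fields and their statistics are illustrative, not MRI data from the study.

        import numpy as np

        rng = np.random.default_rng(3)
        conc_inlet = np.clip(rng.normal(0.5, 0.40, (64, 64)), 0, 1)   # segregated
        conc_outlet = np.clip(rng.normal(0.5, 0.05, (64, 64)), 0, 1)  # well mixed

        def cov_index(c):
            """Coefficient of variation; -> 0 as the mixture approaches homogeneity."""
            return c.std() / c.mean()

        print(f"CoV inlet {cov_index(conc_inlet):.2f}, outlet {cov_index(conc_outlet):.2f}")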

  13. Quantifying Regional Measurement Requirements for ASCENDS

    NASA Astrophysics Data System (ADS)

    Mountain, M. E.; Eluszkiewicz, J.; Nehrkorn, T.; Hegarty, J. D.; Aschbrenner, R.; Henderson, J.; Zaccheo, S.

    2011-12-01

    Quantification of greenhouse gas fluxes at regional and local scales is required by the Kyoto protocol and potential follow-up agreements, and their accompanying implementation mechanisms (e.g., cap-and-trade schemes and treaty verification protocols). Dedicated satellite observations, such as those provided by the Greenhouse gases Observing Satellite (GOSAT), the upcoming Orbiting Carbon Observatory (OCO-2), and future active missions, particularly Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) and Advanced Space Carbon and Climate Observation of Planet Earth (A-SCOPE), are poised to play a central role in this endeavor. In order to prepare for the ASCENDS mission, we are applying the Stochastic Time-Inverted Lagrangian Transport (STILT) model driven by meteorological fields from a customized version of the Weather Research and Forecasting (WRF) model to generate surface influence functions for ASCENDS observations. These "footprints" (or adjoint) express the sensitivity of observations to surface fluxes in the upwind source regions and thus enable the computation of a posteriori flux error reductions resulting from the inclusion of satellite observations (taking into account the vertical sensitivity and error characteristics of the latter). The overarching objective of this project is the specification of the measurement requirements for the ASCENDS mission, with a focus on policy-relevant regional scales. Several features make WRF-STILT an attractive tool for regional analysis of satellite observations: 1) WRF meteorology is available at higher resolution than for global models and is thus more realistic, 2) The Lagrangian approach minimizes numerical diffusion present in Eulerian models, 3) The WRF-STILT coupling has been specifically designed to achieve good mass conservation characteristics, and 4) The receptor-oriented approach offers a relatively straightforward way to compute the adjoint of the transport model. These aspects allow the model to compute surface influences for satellite observations at high spatiotemporal resolution and to generate realistic flux error and flux estimates at policy-relevant scales. The main drawbacks of the Lagrangian approach to satellite simulations are inefficiency and storage requirements, but these obstacles can be overcome by taking advantage of modern computing resources (the current runs are being performed on the NASA Pleiades supercomputer). We gratefully acknowledge funding by the NASA Atmospheric CO2 Observations from Space Program (grant NNX10AT87G).
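
    The error-reduction computation that the footprints feed into can be summarized with the standard linear Bayesian update, P_post = (H^T R^-1 H + P_prior^-1)^-1, where H collects the footprints (sensitivities of observations to fluxes). The Python sketch below uses illustrative dimensions and error magnitudes, not the project's actual configuration.

        import numpy as np

        rng = np.random.default_rng(4)
        n_obs, n_flux = 50, 10
        H = rng.normal(size=(n_obs, n_flux))    # footprint (sensitivity) matrix
        R_inv = np.eye(n_obs) / 0.5**2          # obs error covariance inverse, (0.5 ppm)^2
        P_prior = np.eye(n_flux) * 1.0**2       # prior flux error covariance

        P_post = np.linalg.inv(H.T @ R_inv @ H + np.linalg.inv(P_prior))
        reduction = 1 - np.sqrt(np.diag(P_post)) / np.sqrt(np.diag(P_prior))
        print("mean a posteriori error reduction:", reduction.mean())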

  14. Children's Art Show: An Educational Family Experience

    ERIC Educational Resources Information Center

    Bakerlis, Julienne

    2007-01-01

    In a time of seemingly rampant budget cuts in the arts in school systems throughout the country, a children's art show reaps many rewards. It can strengthen family-school relationships and community ties and stimulate questions and comments about the benefits of art and its significance in the development of young children. In this photo essay of…

  15. ShowMeTheSign Accessibility App

    E-print Network

    Painter, Kevin

    ShowMeTheSign Accessibility App. Dimitrios Papastogiannidis, supervised by Elaine Farrow. The report acknowledges the supervisor's support, guidance, feedback, and vision for the app, and describes an evaluation in which the app is given to users for testing and compared with another similar application.

  16. What Pain Asymbolia Really Shows

    E-print Network

    Klein, Colin

    Abstract. Pain asymbolics feel pain, but act as if they are indifferent to it. Nikola Grahek argues that such patients present a clear counterexample to motivationalism about pain. I argue that Grahek has mischaracterised

  17. No Show Student Survey, Schoolcraft College.

    ERIC Educational Resources Information Center

    Schoolcraft Coll., Livonia, MI.

    In winter semester 1995, Schoolcraft College, in Michigan, experienced a 7% drop in enrollment and an 8% drop in credit hours, including 461 students who applied but did not enroll. To determine the no-shows' reasons for not enrolling and how many planned to enroll in the future, demographic data were collected from application…

  18. Mitotic metaphase chromosomes show sister chromatids

    E-print Network

    Villefranche sur mer

    Meiosis I bivalents, like mitotic chromosomes, show sister-chromatid centromere and arm cohesion. Cells release arm cohesion during meiosis I, and then release centromere cohesion during meiosis II (for review see Moore and Orr-Weaver, 1998). Consequently, this sequential loss of cohesion during meiosis might be precisely

  19. A Talk Show from the Past.

    ERIC Educational Resources Information Center

    Gallagher, Arlene F.

    1991-01-01

    Describes a two-day activity in which elementary students examine voting rights, the right to assemble, and women's suffrage. Explains the game, "Assemble, Reassemble," and a student-produced talk show with five students playing the roles of leaders of the women's suffrage movement. Profiles Elizabeth Cady Stanton, Lucretia Mott, Susan B. Anthony,…

  20. Show Them You Really Want the Job

    ERIC Educational Resources Information Center

    Perlmutter, David D.

    2012-01-01

    Showing that one really "wants" the job entails more than just really wanting the job. An interview is part Broadway casting call, part intellectual dating game, part personality test, and part, well, job interview. When there are 300 applicants for a position, many of them will "fit" the required (and even the preferred) skills listed in the job…

  1. Type VII secretion — mycobacteria show the way

    Microsoft Academic Search

    Abdallah M. Abdallah; Nicolaas C. Gey van Pittius; Patricia A. DiGiuseppe Champion; Jeffery Cox; Joen Luirink; Christina M. J. E. Vandenbroucke-Grauls; Ben J. Appelmelk; Wilbert Bitter

    2007-01-01

    Recent evidence shows that mycobacteria have developed novel and specialized secretion systems for the transport of extracellular proteins across their hydrophobic, and highly impermeable, cell wall. Strikingly, mycobacterial genomes encode up to five of these transport systems. Two of these systems, ESX-1 and ESX-5, are involved in virulence — they both affect the cell-to-cell migration of pathogenic mycobacteria. Here, we

  2. George Arcement Shows Locations of USGS Streamgages

    USGS Multimedia Gallery

    USGS Louisiana Water Science Center Director George Arcement shows the locations of USGS' streamgage network to WAFB Meteorologist Jay Grymes.  USGS maintains more than 30 real-time streamgages throughout the area affected by the 2011 Flood. In addition, more than 50 non-real-time gages were...

  3. Quantifying Arctic Terrestrial Environment Behaviors Using Geophysical, Point-Scale and Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Dafflon, B.; Hubbard, S. S.; Ulrich, C.; Peterson, J. E.; Wu, Y.; Wainwright, H. M.; Gangodagamage, C.; Kholodov, A. L.; Kneafsey, T. J.

    2013-12-01

    Improvement in parameterizing Arctic process-rich terrestrial models to simulate feedbacks to a changing climate requires advances in estimating the spatiotemporal variations in active layer and permafrost properties, in sufficiently high resolution yet over modeling-relevant scales. As part of the DOE Next-Generation Ecosystem Experiments (NGEE-Arctic), we are developing advanced strategies for imaging the subsurface and for investigating land and subsurface co-variability and dynamics. Our studies include acquisition and integration of various measurements, including point-based, surface-based geophysical, and remote sensing datasets. These data have been collected during a series of campaigns at the NGEE Barrow, AK site along transects that traverse a range of hydrological and geomorphological conditions, including low- to high-centered polygons and drained thaw lake basins. In this study, we describe the use of galvanically coupled electrical resistance tomography (ERT), capacitively coupled resistivity (CCR), permafrost cores, above-ground orthophotography, and a digital elevation model (DEM) to (1) explore the complementary nature of, and trade-offs between, the characterization resolution, spatial extent, and accuracy of the different datasets; (2) develop inversion approaches to quantify permafrost characteristics (such as ice content, ice wedge frequency, and the presence of an unfrozen deep layer); and (3) identify correspondences between permafrost and land surface properties (such as water inundation, topography, and vegetation). In terms of methods, we developed a 1D-based direct search approach to estimate the electrical conductivity distribution while allowing exploration of multiple solutions and prior information in a flexible way. Application of the method to the Barrow datasets reveals the relative information content of each dataset for characterizing permafrost properties, which show variability from length scales below one meter to trends extending over more than a kilometer. Further, we used pole- and kite-based low-altitude aerial photography with inferred DEMs, as well as a DEM from a LiDAR dataset, to quantify land surface properties and their co-variability with the subsurface properties. Comparison of the above- and below-ground characterization information indicates that while some permafrost characteristics correspond with changes in hydrogeomorphological expressions, other features show more complex linkages with landscape properties. Overall, our results indicate that remote sensing data, point-scale measurements and surface geophysical measurements enable the identification of regional zones having similar relations between subsurface and land surface properties. Identification of such zonation and associated permafrost-land surface properties can be used to guide investigations of carbon cycling processes and for model parameterization.
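
    The flavor of a 1D direct-search inversion can be conveyed with a toy two-layer problem: sweep a model grid, score every candidate against the observations, and keep all models within a misfit tolerance rather than a single optimum. The forward model and all values in this Python sketch are illustrative assumptions, not the study's ERT physics.

        import numpy as np

        rng = np.random.default_rng(5)

        def forward(sigma1, sigma2, weights):
            # toy forward model: depth-weighted mixing of two layer conductivities
            return weights * sigma1 + (1 - weights) * sigma2

        weights = np.linspace(0.9, 0.2, 8)           # shallow-to-deep sensitivity (assumed)
        obs = forward(0.02, 0.005, weights) + rng.normal(0, 5e-4, 8)  # synthetic data

        grid = np.linspace(0.001, 0.05, 60)          # S/m, search range
        models = [(s1, s2) for s1 in grid for s2 in grid]
        rms = np.array([np.sqrt(np.mean((forward(s1, s2, weights) - obs) ** 2))
                        for s1, s2 in models])
        acceptable = [m for m, r in zip(models, rms) if r < 1.5 * rms.min()]
        print(f"{len(acceptable)} acceptable models; best: {models[int(rms.argmin())]}")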

  4. Can satellite-derived aerosol optical depth quantify the surface aerosol radiative forcing?

    NASA Astrophysics Data System (ADS)

    Xu, Hui; Ceamanos, Xavier; Roujean, Jean-Louis; Carrer, Dominique; Xue, Yong

    2014-12-01

    Aerosols play an important role in the climate of the Earth through aerosol radiative forcing (ARF). Nowadays, aerosol particles are detected, quantified and monitored by remote sensing techniques using low Earth orbit (LEO) and geostationary (GEO) satellites. In the present article, the use of satellite-derived AOD (aerosol optical depth) products is investigated in order to quantify, on a daily basis, the ARF at the surface level (SARF). By daily basis we mean that an average SARF value is computed every day based upon the available AOD satellite measurements for each station. In the first part of the study, the performance of four different state-of-the-art AOD products (MODIS-DT, MODIS-DB, MISR, and SEVIRI) is assessed through comparison against ground-based AOD measurements from 24 AERONET stations located in Europe and Africa during a 6-month period. While all AOD products are found to be comparable in terms of measured value (RMSE of 0.1 for low and average AOD values), a higher number of AOD estimates is made available by GEO satellites due to their enhanced frequency of scan. Experiments show a generally lower agreement of AOD estimates over the African sites (RMSE of 0.2), which show the highest aerosol concentrations along with the occurrence of dust aerosols, coarse particles, and bright surfaces. In the second part of this study, the lessons learned about the confidence in aerosol burden derived from satellites are used to estimate SARF under clear-sky conditions. While the use of AOD products issued from GEO observations like SEVIRI brings improvement in the SARF estimates with regard to LEO-based AOD products, the resulting absolute bias (13 W/m2 on average when AERONET AOD is used as reference) is judged to be still high in comparison with the average values of SARF found in this study (from -25 W/m2 to -43 W/m2) and also in the literature (from -10 W/m2 to -47 W/m2).

  5. Quantifying the Carbon Intensity of Biomass Energy

    NASA Astrophysics Data System (ADS)

    Hodson, E. L.; Wise, M.; Clarke, L.; McJeon, H.; Mignone, B.

    2012-12-01

    Regulatory agencies at the national and regional level have recognized the importance of quantitative information about greenhouse gas emissions from biomass used in transportation fuels or in electricity generation. For example, in the recently enacted California Low-Carbon Fuel Standard, the California Air Resources Board conducted a comprehensive study to determine an appropriate methodology for setting carbon intensities for biomass-derived transportation fuels. Furthermore, the U.S. Environmental Protection Agency is currently conducting a multi-year review to develop a methodology for estimating biogenic carbon dioxide (CO2) emissions from stationary sources. Our study develops and explores a methodology to compute carbon emission intensities (CIs) per unit of biomass energy, which is a metric that could be used to inform future policy development exercises. To compute CIs for biomass, we use the Global Change Assessment Model (GCAM), which is an integrated assessment model that represents global energy, agriculture, land and physical climate systems with regional, sectoral, and technological detail. The GCAM land use and land cover component includes both managed and unmanaged land cover categories such as food crop production, forest products, and various non-commercial land uses, and it is subdivided into 151 global land regions (wiki.umd.edu/gcam), ten of which are located in the U.S. To illustrate a range of values for different biomass resources, we use GCAM to compute CIs for a variety of biomass crops grown in different land regions of the U.S. We investigate differences in emissions for biomass crops such as switchgrass, miscanthus and willow. Specifically, we use GCAM to compute global carbon emissions from the land use change caused by a marginal increase in the amount of biomass crop grown in a specific model region. Thus, we are able to explore how land use change emissions vary by the type and location of biomass crop grown in the U.S. Direct emissions occur when biomass production used for energy displaces land used for food crops, forest products, pasture, or other arable land in the same region. Indirect emissions occur when increased food crop production, compensating for displaced food crop production in the biomass production region, displaces land in regions outside of the region of biomass production. Initial results from this study suggest that indirect land use emissions, mainly from converting unmanaged forest land, are likely to be as important as direct land use emissions in determining the carbon intensity of biomass energy. Finally, we value the emissions of a marginal unit of biomass production for a given carbon price path and a range of assumed social discount rates. We also compare the cost of bioenergy emissions as valued by a hypothetical private actor to the relevant cost of emissions from conventional fossil fuels, such as coal or natural gas.
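
    The final valuation step reduces to discounted cash-flow arithmetic over an emissions trajectory. A minimal Python sketch with illustrative numbers: an upfront land-use-change pulse plus small recurring emissions, a rising carbon price, and two social discount rates; none of these values come from the study.

        import numpy as np

        years = np.arange(0, 30)
        emissions = np.where(years == 0, 5.0, 0.1)  # tCO2: LUC pulse, then annual (assumed)
        price = 20.0 * 1.03 ** years                # carbon price path, $/tCO2 (assumed)

        for r in (0.025, 0.05):                     # social discount rates (assumed)
            pv = np.sum(emissions * price / (1 + r) ** years)
            print(f"discount rate {r:.1%}: present value of emissions ${pv:.0f}")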

  6. A new methodology for quantifying the impact of water repellency on the filtering function of soils

    NASA Astrophysics Data System (ADS)

    Müller, Karin; Deurer, Markus; Kawamoto, Ken; Hiradate, Syuntaro; Komatsu, Toshiko; Clothier, Brent

    2014-05-01

    Soils deliver a range of ecosystem services, and some of the most valuable relate to the regulating services resulting from the buffering and filtering of solutes by soil. However, it is commonly accepted that soil water repellency (SWR) can lead to finger flow and preferential flow. Yet, there have been few attempts to quantify the impact of such flow phenomena on the buffering and filtering of solutes. No method is available to quantify directly how SWR affects the transport of reactive solutes. We have closed this gap and developed a new method for quantifying solute transport by novel experiments with water-repellent soils. It involves sequentially applying two liquids, one water, and the other a reference fully wetting liquid, namely, aqueous ethanol, to the same intact soil core with air-drying between the application of the two liquids. Our results highlight that sorption experiments are necessary to complement our new method to ascertain directly the impact of SWR on the filtering of a solute. We conducted transport and sorption experiments, by applying our new method, with the herbicide 2,4-Dichlorophenoxyacetic acid and two Andosol top-soils; one from Japan and the other one from New Zealand. Breakthrough curves from the water experiments were characterized by preferential flow with high initial concentrations, tailing and a long prevalence of solutes remaining in the soil. Our results clearly demonstrate and quantify the impact of SWR on the leaching of this herbicide. This technique for quantifying the reduction of the soil's filtering efficiency by SWR enables assessment of the increased risk of groundwater contamination by solutes exogenously applied to water-repellent soils.

  7. A New Method to Quantify the Effect After Subcutaneous Injection of Lipolytic Substances

    Microsoft Academic Search

    S. M. Klein; L. Prantl; A. Berner; S. Schreml; T. Schubert; J. Rennert; C. Fellner; A. Stopfer; P. Angele; A. G. Schreyer; C. I. Schreyer; S. Feuerbach; E. M. Jung

    2008-01-01

    Background  Increasing numbers of patients request lipolytic injection therapy for aesthetic indications. However, only the clinical results of these therapies have been published to date. In most cases, pre- and postprocedure photographs and measurements have been presented. As with every other medical procedure, it is necessary to ensure that the results of lipolytic injections are quantified on an objective and scientific

  8. New Drug Shows Promise for MS

    MedlinePLUS

    TUESDAY, April 14, 2015 (HealthDay News) -- An experimental drug appears to repair nerve damage seen in multiple sclerosis (MS) patients, results of an early trial suggest. ...

  9. Worldwide trends show oropharyngeal cancer rate increasing

    Cancer.gov

    DCEG scientists report that the incidence of oropharyngeal cancer significantly increased in economically developed countries during the period 1983-2002. The results of this study appeared online in the Journal of Clinical Oncology on November 18, 2013.

  10. Robots at NPE 2006, the Plastics Show

    Microsoft Academic Search

    Richard Bloss

    2007-01-01

    Purpose – This paper aims to present a review of the NPE 2006, Plastics Show held in Chicago, IL with emphasis on robots, their application in the plastics industry and end-of-arm-tooling. Design/methodology/approach – In-depth interviews with suppliers of robots, injection molding machines, system integration of robots into plastic processing applications, control suppliers and end-of-arm-tooling. Findings – The plastic injection molding

  11. Learning helicopter control through “teaching by showing”

    Microsoft Academic Search

    James F. Montgomery; George A. Bekey

    1998-01-01

    A model-free “teaching by showing” methodology is developed to train a fuzzy-neural controller for an autonomous robot helicopter. The controller is generated and tuned using training data gathered while a teacher operates the helicopter. A hierarchical behavior-based control architecture is used, with each behavior implemented as a hybrid fuzzy logic controller (FLC) and general regression neural network controller (GRNNC). The

  12. Software for portable laser light show system

    Microsoft Academic Search

    Dmitrey J. Buruchin; Aleksandr F. Leonov

    1995-01-01

    Portable laser light show system LS-3500-10M is connected to the parallel port of an IBM PC/AT compatible computer. The computer outputs digital control data describing images. A specially designed control device converts the digital data coming from the parallel port into the analog signal driving the scanner. Even the capabilities of an inexpensive 286 computer are quite sufficient for laser graphics control.

  13. Mercury's Core Molten, Radar Study Shows

    NASA Astrophysics Data System (ADS)

    2007-05-01

Scientists using a high-precision planetary radar technique for the first time have discovered that the innermost planet Mercury probably has a molten core, resolving a mystery of more than three decades. The discovery, which used the National Science Foundation's Robert C. Byrd Green Bank Telescope in West Virginia and Arecibo Observatory in Puerto Rico, and NASA/Jet Propulsion Laboratory antennas in California, is an important step toward a better understanding of how planets form and evolve. [Image: high-precision planetary radar technique sent a signal to Mercury and received its reflection. Credit: Bill Saxton, NRAO/AUI/NSF] "For a long time it was thought we'd have to land spacecraft on Mercury to learn if its core is solid or molten. Now we've answered that question using ground-based telescopes," said Jean-Luc Margot, of Cornell University, leader of the research team, which published its results in the May 4 issue of the journal Science. Mercury is one of the least-understood of the planets in our Solar System. Its distance from the Sun is just over one-third that of the Earth, and it contains a mass just 5½ percent that of Earth. Only about half of Mercury's surface has been photographed by a spacecraft, Mariner 10, back in 1974. Mariner 10 also discovered that Mercury has a weak magnetic field, about one percent as strong as Earth's. That discovery spurred a scientific debate about the planet's core. Scientists normally expect a rocky planet's magnetic field to be caused by an electromagnetic dynamo in a molten core. However, Mercury is so small that most scientists expected its core to have cooled and solidified long ago. Those scientists speculated that the magnetic field seen today may have been "frozen" into the planet when the core cooled. "Whether the core is molten or solid today depends greatly on the chemical composition of the core. That chemical composition can provide important clues about the processes involved in planet formation," Margot said. To answer the question, the scientists implemented an ingenious, high-precision technique in which they sent a powerful beam of radio waves to bounce off Mercury, then received and analyzed the reflected signal using pairs of ground-based radio telescopes. While similar radar systems have been used in the past to map planetary surfaces, this technique instead measured the rate at which Mercury spins on its axis, and did so with an unprecedented precision of one part in 100,000. By making 21 separate observations, the research team was able to measure minute variations in the planet's spin rate. This was the key to learning whether Mercury's core is solid or molten. Using an understanding of the Sun's gravitational effect on the planet, they realized that the tiny variations in its spin rate would be twice as large if the core were liquid as they would be if Mercury had a solid core. "The variations in Mercury's spin rate that we measured are best explained by a core that is at least partially molten. We have a 95 percent confidence level in this conclusion," Margot said. For most of their observations, carried out from 2002-2006, the scientists transmitted a powerful radar beam from the NASA/JPL 70-meter antenna at Goldstone, California, and received the reflected signal with the Green Bank Telescope and the Goldstone antenna. For some observations, they transmitted from the Arecibo Observatory in Puerto Rico and received at Arecibo and two Goldstone antennas.
They used radar signals at frequencies of 8.5 and 2.4 GHz. To make the precision measurements of Mercury's spin rate, the geometry between the planet and the receiving antennas had to match a specific alignment. Such an alignment only occurs for about 20 seconds a day. In addition to measuring Mercury's spin rate, their technique also made the best measurement ever of the alignment of the planet's axis of rotation. "We improved the accuracy of this measurement by 100 times, and showed that Mercury's spin axis

  14. Quantifying light scattering with single-mode fiber-optic confocal microscopy

    PubMed Central

    2009-01-01

    Background Confocal microscopy has become an important option for examining tissues in vivo as a diagnostic tool and a quality control tool for tissue-engineered constructs. Collagen is one of the primary determinants of biomechanical stability. Since collagen is also the primary scattering element in skin and other soft tissues, we hypothesized that laser-optical imaging methods, particularly confocal scattered-light scanning, would allow us to quantify scattering intensity and determine collagen content in biological layers. Methods We built a fully automated confocal scattered-light scanner to examine how light scatters in Intralipid, a common tissue phantom, and three-dimensional collagen gels. Intralipid at 0.5%, 1.0%, 1.5%, and 2.0% concentration was filled between precisely spaced glass coverslips. Collagen gels at collagen concentrations from 0.30 mg/mL to 3.30 mg/mL were prepared, and all samples underwent A-mode scanning with multiple averaged scans. In Intralipid samples, light reflected from the upper fluid-glass interface was measured. In collagen gels, the average scattering intensity inside the gel was measured. In both cases, intensity was correlated with concentration. Results By measuring light attenuation at interface reflections for various thicknesses using our device, we determined the scattering coefficient of Intralipid in water at 660 nm to be 39 cm-1 for each percent increase in Intralipid concentration. We were also able to directly measure the amount of scattering in collagen gels of various concentrations using backscattered light. The results show a highly linear relationship, with an increase of 8.2 arbitrary units in backscattering intensity for every 1 mg increase of collagen within a 1 mL gel volume. Conclusion The confocal scattered-light scanner allows scattering in Intralipid and collagen gels to be quantified accurately. Furthermore, a linear relationship between collagen concentration and intensity was found. Confocal scattered-light scanning therefore promises to allow imaging of collagen content in soft tissue layers. PMID:19925674
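
    The reported concentration-intensity relationship is an ordinary linear fit. The Python sketch below regenerates such a fit on synthetic points built around the reported slope of 8.2 units per mg; the intercept and noise level are invented for illustration.

        import numpy as np
        from scipy.stats import linregress

        collagen = np.array([0.30, 0.90, 1.50, 2.10, 2.70, 3.30])  # mg/mL
        intensity = (8.2 * collagen + 12.0                          # assumed intercept
                     + np.random.default_rng(6).normal(0, 0.5, 6))  # assumed noise

        fit = linregress(collagen, intensity)
        print(f"slope {fit.slope:.2f} units per mg/mL, r^2 {fit.rvalue**2:.3f}")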

  15. Quantifying the entropy of binding for water molecules in protein cavities by computing correlations.

    PubMed

    Huggins, David J

    2015-02-17

    Protein structural analysis demonstrates that water molecules are commonly found in the internal cavities of proteins. Analysis of experimental data on the entropies of inorganic crystals suggests that the entropic cost of transferring such a water molecule to a protein cavity will not typically be greater than 7.0 cal/mol/K per water molecule, corresponding to a contribution of approximately +2.0 kcal/mol to the free energy. In this study, we employ the statistical mechanical method of inhomogeneous fluid solvation theory to quantify the enthalpic and entropic contributions of individual water molecules in 19 protein cavities across five different proteins. We utilize information theory to develop a rigorous estimate of the total two-particle entropy, yielding a complete framework to calculate hydration free energies. We show that predictions from inhomogeneous fluid solvation theory are in excellent agreement with predictions from free energy perturbation (FEP) and that these predictions are consistent with experimental estimates. However, the results suggest that water molecules in protein cavities containing charged residues may be subject to entropy changes that contribute more than +2.0 kcal/mol to the free energy. In all cases, these unfavorable entropy changes are predicted to be dominated by highly favorable enthalpy changes. These findings are relevant to the study of bridging water molecules at protein-protein interfaces as well as in complexes with cognate ligands and small-molecule inhibitors. PMID:25692597
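
    The quoted free-energy contribution follows directly from the stated entropy bound; a back-of-the-envelope check in LaTeX, taking T = 298 K (an assumed reference temperature) and treating the 7.0 cal/mol/K as the entropy lost on transfer:

        \Delta G \approx -T\,\Delta S
                 = -(298\ \mathrm{K}) \times \left(-7.0\ \mathrm{cal\,mol^{-1}\,K^{-1}}\right)
                 \approx +2.1\ \mathrm{kcal\,mol^{-1}}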

  16. A versatile new tool to quantify abasic sites in DNA and inhibit base excision repair.

    PubMed

    Wei, Shanqiao; Shalhout, Sophia; Ahn, Young-Hoon; Bhagwat, Ashok S

    2015-03-01

    A number of endogenous and exogenous agents and cellular processes create abasic (AP) sites in DNA. If unrepaired, AP sites cause mutations, strand breaks and cell death. The aldehyde-reactive agent methoxyamine reacts with AP sites and blocks their repair. Another alkoxyamine, ARP, tags AP sites with a biotin and is used to quantify these sites. We have combined both these abilities into one alkoxyamine, AA3, which reacts with AP sites with a better pH profile and reactivity than ARP. Additionally, AA3 contains an alkyne functionality for bioorthogonal click chemistry that can be used to link a wide variety of biochemical tags to AP sites. We used click chemistry to tag AP sites with biotin and a fluorescent molecule without the use of proteins or enzymes. AA3 has a better reactivity profile than ARP and gives much higher product yields at physiological pH. It is simpler to use than ARP, and its use results in lower background and greater sensitivity for AP site detection. We also show that AA3 inhibits the first enzyme in the repair of abasic sites, APE-1, to about the same extent as methoxyamine. Furthermore, AA3 enhances the ability of an alkylating agent, methylmethane sulfonate, to kill human cells and is more effective in such combination chemotherapy than methoxyamine. PMID:25616257

  17. Revisiting pyramid compression to quantify flexoelectricity: A three-dimensional simulation study

    NASA Astrophysics Data System (ADS)

    Abdollahi, Amir; Millán, Daniel; Peco, Christian; Arroyo, Marino; Arias, Irene

    2015-03-01

    Flexoelectricity is a universal property of all dielectrics by which they generate a voltage in response to an inhomogeneous deformation. One of the controversial issues in this field concerns the magnitude of flexoelectric coefficients measured experimentally, which greatly exceed theoretical estimates. Furthermore, there is a broad scatter amongst experimental measurements. The truncated-pyramid compression method is one of the common setups used to quantify flexoelectricity, the interpretation of which relies on simplified analytical equations to estimate strain gradients. However, the deformation fields in three-dimensional pyramid configurations are highly complex, particularly around the edges. In the present work, using three-dimensional self-consistent simulations of flexoelectricity, we show that the simplified analytical estimates of strain gradients in compressed pyramids significantly overestimate flexoelectric coefficients, thus providing a possible explanation that reconciles the different estimates. In fact, the interpretation of pyramid compression experiments is highly nontrivial. We systematically characterize the magnitude of this overestimation, which can exceed one order of magnitude, as a function of the truncated-pyramid configuration. These results are important to properly characterize flexoelectricity, and they provide design guidelines for effective electromechanical transducers exploiting flexoelectricity.

  18. Quantifying the radiation belt seed population in the 17 March 2013 electron acceleration event

    NASA Astrophysics Data System (ADS)

    Boyd, A. J.; Spence, H. E.; Claudepierre, S. G.; Fennell, J. F.; Blake, J. B.; Baker, D. N.; Reeves, G. D.; Turner, D. L.

    2014-04-01

    We present phase space density (PSD) observations using data from the Magnetic Electron Ion Spectrometer instrument on the Van Allen Probes for the 17 March 2013 electron acceleration event. We confirm previous results and quantify how PSD gradients depend on the first adiabatic invariant. We find a systematic difference between the lower-energy electrons, whose PSD gradients are consistent with an external source, and electrons >1 MeV, whose gradients point to a source region within the radiation belts. Our observations show that the source process begins with enhancements to the 10s-100s keV energy seed population, followed by enhancements to the >1 MeV population and eventually leading to enhancements in the multi-MeV electron population. These observations provide the clearest evidence to date of the timing and nature of the radial transport of a 100s keV electron seed population into the heart of the outer belt and the subsequent local acceleration of those electrons to higher radiation belt energies.

  19. Quantifying sub-pixel urban impervious surface through fusion of optical and inSAR imagery

    USGS Publications Warehouse

    Yang, L.; Jiang, L.; Lin, H.; Liao, M.

    2009-01-01

    In this study, we explored the potential to improve urban impervious surface modeling and mapping with the synergistic use of optical and Interferometric Synthetic Aperture Radar (InSAR) imagery. We used a Classification and Regression Tree (CART)-based approach to test the feasibility and accuracy of quantifying Impervious Surface Percentage (ISP) using four spectral bands of SPOT 5 high-resolution geometric (HRG) imagery and three parameters derived from the European Remote Sensing (ERS)-2 Single Look Complex (SLC) SAR image pair. Validated by an independent ISP reference dataset derived from the 33 cm-resolution digital aerial photographs, results show that the addition of InSAR data reduced the ISP modeling error rate from 15.5% to 12.9% and increased the correlation coefficient from 0.71 to 0.77. Spatially, the improvement is especially noted in areas of vacant land and bare ground, which were incorrectly mapped as urban impervious surfaces when using the optical remote sensing data. In addition, the accuracy of ISP prediction using InSAR images alone is only marginally less than that obtained by using SPOT imagery. The finding indicates the potential of using InSAR data for frequent monitoring of urban settings located in cloud-prone areas.
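
    The CART regression step can be sketched as follows; the seven features are synthetic stand-ins for the four SPOT 5 bands plus three InSAR-derived parameters, and the tree settings are illustrative assumptions, not those of the study.

        import numpy as np
        from sklearn.tree import DecisionTreeRegressor

        rng = np.random.default_rng(7)
        X = rng.uniform(0, 1, (2000, 7))   # 4 optical bands + 3 InSAR parameters (synthetic)
        isp = np.clip(60 * X[:, 0] - 30 * X[:, 4] + rng.normal(0, 5, 2000), 0, 100)

        # CART regression of impervious surface percentage (ISP) per pixel
        tree = DecisionTreeRegressor(max_depth=8, min_samples_leaf=20).fit(X, isp)
        print("training R^2:", round(tree.score(X, isp), 3))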

  20. Quantifying terrestrial ecosystem carbon dynamics in the Jinsha watershed, upper Yangtze, China from 1975 to 2000.

    PubMed

    Zhao, Shuqing; Liu, Shuguang; Yin, Runsheng; Li, Zhengpeng; Deng, Yulin; Tan, Kun; Deng, Xiangzheng; Rothstein, David; Qi, Jiaguo

    2010-03-01

    Quantifying the spatial and temporal dynamics of carbon stocks in terrestrial ecosystems and carbon fluxes between the terrestrial biosphere and the atmosphere is critical to our understanding of regional patterns of carbon budgets. Here we use the General Ensemble biogeochemical Modeling System to simulate the terrestrial ecosystem carbon dynamics in the Jinsha watershed of China's upper Yangtze basin from 1975 to 2000, based on unique combinations of spatial and temporal dynamics of major driving forces, such as climate, soil properties, nitrogen deposition, and land use and land cover changes. Our analysis demonstrates that the Jinsha watershed ecosystems acted as a carbon sink during the period of 1975-2000, with an average rate of 0.36 Mg/ha/yr, primarily resulting from regional climate variation and local land use and land cover change. Vegetation biomass accumulation accounted for 90.6% of the sink, while soil organic carbon loss before 1992 led to a lower net gain of carbon in the watershed, and after that soils became a small sink. Ecosystem carbon sink/source patterns showed a high degree of spatial heterogeneity. Carbon sinks were associated with forest areas without disturbances, whereas carbon sources were primarily caused by stand-replacing disturbances. It is critical to adequately represent the detailed fast-changing dynamics of land use activities in regional biogeochemical models to determine the spatial and temporal evolution of regional carbon sink/source patterns. PMID:19296154

  1. Quantifying variable erosion rates to understand the coupling of surface processes in the Teton Range, Wyoming

    NASA Astrophysics Data System (ADS)

    Tranel, Lisa M.; Spotila, James A.; Binnie, Steven A.; Freeman, Stewart P. H. T.

    2015-01-01

    Short-term geomorphic processes (fluvial, glacial, and hillslope erosion) and long-term exhumation control transient alpine landscapes. Long-term measurements of exhumation are not sufficient to capture the processes driving transient responses associated with short-term climatic oscillations, because of high variability of individual processes across space and time. This study compares the efficacy of different erosional agents to assess the importance of variability in tectonically active landscapes responding to fluctuations in Quaternary climate. We focus on the Teton Range, where erosional mechanisms include hillslope, glacial, and fluvial processes. Erosion rates were quantified using sediment accumulation and cosmogenic dating (bedrock and stream sediments). Results show that rates of erosion are highly variable, with average short-term rockfall rates (0.8 mm/y) occurring faster than either apparent basin-averaged rates (0.2 mm/y) or long-term ridge erosion rates (0.02 mm/y). Examining erosion rates separately also demonstrates the coupling between glacial, fluvial, and hillslope processes. Apparent basin-averaged erosion rates amalgamate valley wall and ridge erosion with stream and glacial rates. Climate oscillations drive the short-term response of a single erosional process (e.g., rockfalls or other mass wasting) that may enhance or limit the erosional efficiency of other processes (glacial or fluvial). While the Teton landscape may approach long-term equilibrium, stochastic processes and rapid response to short-term climate change actively perpetuate the transient ruggedness of the topography.
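
    Cosmogenic erosion rates of the kind quoted above rest on a standard steady-state relation between nuclide concentration and erosion, N = P / (λ + ρε/Λ). A minimal worked sketch with illustrative 10Be numbers follows; the values are assumptions for demonstration, not the study's data.

      # Steady-state cosmogenic erosion rate: N = P / (lambda + rho*eps/Lambda),
      # solved for eps. All input values are illustrative assumptions.
      P = 10.0          # 10Be production rate, atoms g^-1 yr^-1 (assumed)
      N = 2.0e5         # measured 10Be concentration, atoms g^-1 (assumed)
      decay = 4.99e-7   # 10Be decay constant, yr^-1
      Lam = 160.0       # attenuation length, g cm^-2
      rho = 2.7         # rock density, g cm^-3

      eps_cm_per_yr = (Lam / rho) * (P / N - decay)
      print(f"erosion rate ~ {eps_cm_per_yr * 10:.3f} mm/yr")   # ~0.03 mm/yr here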

  2. Quantifying the impacts of dust on the Caspian Sea using a regional climate model

    NASA Astrophysics Data System (ADS)

    Elguindi, N.; Solmon, F.; Turuncoglu, U.

    2013-12-01

    The Karakum desert and the surrounding areas of the Caspian Sea (CS) are a significant source of dust to the region. Local dust events can have a substantial impact on sea surface temperatures (SSTs) and evaporation from the sea through direct radiative effects. Given the high interest in projected changes in the Caspian Sea Level (CSL), it is critical that we understand what these effects are in order to accurately model net sea evaporation, a major component of the CS hydrological budget. In this study, we employ a regional climate model (RegCM4) coupled to the 1D Hostetler lake model to explore the impact of dust on the CS. Dust is simulated in RegCM4 through an interactive dust emission transport model coupled to the radiation scheme, as well as a representation of anthropogenic aerosols. The first part of this study focuses on an evaluation of the ability of RegCM4 to simulate dust in the region by comparing 1) seasonal climatologies of modelled aerosol optical depth (AOD) to a range of satellite sources, and 2) a climatology of dust events, as well as decadal variability, to observations derived from visibility measurements. The second part of this study attempts to quantify the impact of dust on the Caspian SSTs, evaporation and heat flux components. The results of this study show that simulating the effects of dust on the CS is necessary for accurately modeling the Sea's hydrological budget.

  3. Quantifying dispersal and establishment limitation in a population of an epiphytic lichen.

    PubMed

    Werth, Silke; Wagner, Helene H; Gugerli, Felix; Holderegger, Rolf; Csencsics, Daniela; Kalwij, Jesse M; Scheidegger, Christoph

    2006-08-01

    Dispersal is a process critical for the dynamics and persistence of metapopulations, but it is difficult to quantify. It has been suggested that the old-forest lichen Lobaria pulmonaria is limited by insufficient dispersal ability. We analyzed 240 DNA extracts derived from snow samples by a L. pulmonaria-specific real-time PCR (polymerase chain reaction) assay of the ITS (internal transcribed spacer) region allowing for the discrimination among propagules originating from a single, isolated source tree or propagules originating from other locations. Samples that were detected as positives by real-time PCR were additionally genotyped for five L. pulmonaria microsatellite loci. Both molecular approaches demonstrated substantial dispersal from other than local sources. In a landscape approach, we additionally analyzed 240 snow samples with real-time PCR of ITS and detected propagules not only in forests where L. pulmonaria was present, but also in large unforested pasture areas and in forest patches where L. pulmonaria was not found. Monitoring of soredia of L. pulmonaria transplanted to maple bark after two vegetation periods showed high variance in growth among forest stands, but no significant differences among different transplantation treatments. Hence, it is probably not dispersal limitation that hinders colonization in the old-forest lichen L. pulmonaria, but ecological constraints at the stand level that can result in establishment limitation. Our study exemplifies that care has to be taken to adequately separate the effects of dispersal limitation from a limitation of establishment. PMID:16937643

  4. Quantifying solar spectral irradiance in aquatic habitats for the assessment of photoenhanced toxicity

    USGS Publications Warehouse

    Barron, M.G.; Little, E.E.; Calfee, R.; Diamond, S.

    2000-01-01

    The spectra and intensity of solar radiation (solar spectral irradiance [SSI]) were quantified in selected aquatic habitats in the vicinity of an oil field on the California coast. Solar spectral irradiance measurements consisted of spectral scans (280-700 nm) and radiometric measurements of ultraviolet (UV): UVB (280-320 nm) and UVA (320-400 nm). Solar spectral irradiance measurements were taken at the surface and at various depths in two marsh ponds, a shallow wetland, an estuary lagoon, and the intertidal area of a high-energy sandy beach. Daily fluctuation in SSI showed a general parabolic relationship with time; maximum SSI was observed at approximate solar noon. Solar spectral irradiance measurements taken at 10-cm depth at approximate solar noon in multiple aquatic habitats exhibited only a twofold variation in visible light and UVA and a 4.5-fold variation in UVB. Visible light ranged from 11,000 to 19,000 µW/cm2, UVA ranged from 460 to 1,100 µW/cm2, and UVB ranged from 8.4 to 38 µW/cm2. In each habitat, the attenuation of light intensity with increasing water depth was differentially affected over specific wavelengths of SSI. The study results allowed the development of environmentally realistic light regimes necessary for photoenhanced toxicity studies.
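
    The depth attenuation described above is conventionally summarized by a diffuse attenuation coefficient Kd under a Beer-Lambert assumption, E(z) = E(0)exp(-Kd z). A minimal worked sketch follows; the irradiance values are hypothetical, not the study's measurements.

      # Diffuse attenuation coefficient Kd from irradiance at two depths,
      # assuming exponential (Beer-Lambert) decay with depth.
      import math

      E_surface = 30.0   # hypothetical UVB just below surface, uW/cm^2
      E_10cm = 20.0      # hypothetical UVB at 10 cm depth, uW/cm^2
      z = 0.10           # depth difference, m

      Kd = math.log(E_surface / E_10cm) / z    # per metre
      print(f"Kd = {Kd:.1f} m^-1")             # ~4.1 m^-1 for these numbers

      # Predicted irradiance at 50 cm under the same assumption:
      print(f"E(0.5 m) = {E_surface * math.exp(-Kd * 0.5):.1f} uW/cm^2")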

  5. Quantifying ammonia emissions from a cattle feedlot using a dispersion model.

    PubMed

    McGinn, S M; Flesch, T K; Crenna, B P; Beauchemin, K A; Coates, T

    2007-01-01

    Livestock manure is a significant source of ammonia (NH3) emissions. In the atmosphere, NH3 is a precursor to the formation of fine aerosols that contribute to poor air quality and associated human health effects. Other environmental issues result when NH3 is deposited to land and water. Our study documented the quantity of NH3 emitted from a feedlot housing growing beef cattle. The study was conducted between June and October 2006 at a feedlot with a one-time capacity of 22,500 cattle located in southern Alberta, Canada. A backward Lagrangian stochastic (bLS) inverse-dispersion technique was used to calculate NH3 emissions, based on measurements of NH3 concentration (open-path laser) and wind (sonic anemometer) taken above the interior of the feedlot. There was an average of 3146 kg NH3 d(-1) lost from the entire feedlot, equivalent to 84 microg NH3 m(-2) s(-1) or 140 g NH3 head(-1) d(-1). The NH3 emissions correlated with sensible heat flux (r2 = 0.84) and to a lesser extent the wind speed (r2 = 0.56). There was also evidence that rain suppressed the NH3 emission. Quantifying NH3 emission and dispersion from farms is essential to show the impact of farm management on reducing NH3-related environmental issues. PMID:17940257
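
    The bLS inverse-dispersion technique recovers the emission rate by dividing the background-corrected concentration by a model-simulated concentration-to-emission ratio. The sketch below uses a hypothetical simulated ratio, then adds a unit check against the totals reported in the abstract.

      # Inverse-dispersion emission estimate: Q = (C - C_bg) / (C/Q)_sim,
      # where (C/Q)_sim comes from a backward Lagrangian stochastic model.
      C_measured = 120.0   # hypothetical line-averaged NH3, ug m^-3
      C_background = 5.0   # hypothetical upwind background, ug m^-3
      ratio_sim = 1.37     # hypothetical (C/Q)_sim, (ug m^-3)/(ug m^-2 s^-1)

      Q = (C_measured - C_background) / ratio_sim
      print(f"Q = {Q:.0f} ug NH3 m^-2 s^-1")   # ~84 with these illustrative numbers

      # Consistency check on the reported figures:
      per_head = 140.0 / 1000 * 22500          # 140 g head^-1 d^-1 * 22,500 head
      print(f"{per_head:.0f} kg NH3 d^-1")     # ~3150, matching the ~3146 kg d^-1 total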

  6. Quantifying downstream impacts of impoundment on flow regime and channel planform, lower Trinity River, Texas

    NASA Astrophysics Data System (ADS)

    Wellmeyer, Jessica L.; Slattery, Michael C.; Phillips, Jonathan D.

    2005-07-01

    As human population worldwide has grown, so has interest in harnessing and manipulating the flow of water for the benefit of humans. The Trinity River of eastern Texas is one such watershed greatly impacted by engineering and urbanization. The basin, which drains the Dallas-Fort Worth metroplex, contains just under 30 reservoirs that regulate flow while storing public water supplies, supporting recreation, and providing flood control. Lake Livingston is the lowest, as well as largest, reservoir in the basin, a mere 95 km above the Trinity's outlet near Galveston Bay. This study seeks to describe and quantify channel activity and flow regime, identifying effects of the 1968 closure of Livingston dam. Using historic daily and peak discharge data from USGS gauging stations, flow duration curves are constructed, identifying pre- and post-dam flow conditions. A digital historic photo archive was also constructed using six sets of aerial photographs spanning from 1938 to 1995, and three measures of channel activity applied using a GIS. Results show no changes in high flow conditions following impoundment, while low flows are elevated. However, the entire post-dam period is characterized by significantly higher rainfall, which may be obscuring the full impact of flow regulation. Channel activity rates do not indicate a more stabilized planform following dam closure; rather they suggest that the Trinity River is adjusting itself to the stress of Livingston dam in a slow, gradual process that may not be apparent in a modern time scale.
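
    A flow duration curve of the kind used here is built by ranking daily discharges and assigning each an exceedance probability. A minimal sketch follows, with synthetic flows standing in for the USGS gauge records.

      # Flow duration curve: discharge vs. the percentage of time that
      # discharge is equaled or exceeded. Synthetic flows stand in for
      # the USGS daily gauge records.
      import numpy as np

      rng = np.random.default_rng(1)
      q_daily = rng.lognormal(mean=4.0, sigma=1.0, size=365 * 30)  # m^3/s

      q_sorted = np.sort(q_daily)[::-1]              # descending flows
      rank = np.arange(1, q_sorted.size + 1)
      exceedance = 100 * rank / (q_sorted.size + 1)  # Weibull plotting position, %

      for p in (5, 50, 95):                          # high-, median-, low-flow points
          print(f"Q{p} = {np.interp(p, exceedance, q_sorted):.1f} m^3/s")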

  7. Using nonlinear methods to quantify changes in infant limb movements and vocalizations

    PubMed Central

    Abney, Drew H.; Warlaumont, Anne S.; Haussman, Anna; Ross, Jessica M.; Wallot, Sebastian

    2014-01-01

    The pairing of dynamical systems theory and complexity science brings novel concepts and methods to the study of infant motor development. Accordingly, this longitudinal case study presents a new approach to characterizing the dynamics of infant limb and vocalization behaviors. A single infant's vocalizations and limb movements were recorded from 51 days to 305 days of age. On each recording day, accelerometers were placed on all four of the infant's limbs and an audio recorder was worn on the child's chest. Using nonlinear time series analysis methods, such as recurrence quantification analysis and Allan factor, we quantified changes in the stability and multiscale properties of the infant's behaviors across age as well as how these dynamics relate across modalities and effectors. We observed that particular changes in these dynamics preceded or coincided with the onset of various developmental milestones. For example, the largest changes in vocalization dynamics preceded the onset of canonical babbling. The results show that nonlinear analyses can help to understand the functional co-development of different aspects of infant behavior. PMID:25161629
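
    Of the measures named above, the Allan factor has a compact definition: for event counts N_i in consecutive windows of length T, AF(T) = E[(N_{i+1} - N_i)^2] / (2 E[N_i]), which is ~1 for a Poisson process and grows with clustering across scales. A minimal sketch on synthetic event times follows (an illustration, not the study's pipeline).

      # Allan factor AF(T): variance of successive count differences over
      # twice the mean count, evaluated across window sizes T.
      import numpy as np

      rng = np.random.default_rng(2)
      event_times = np.cumsum(rng.exponential(0.5, size=2000))  # synthetic, s

      def allan_factor(times, T):
          n_windows = int(times.max() // T)
          counts, _ = np.histogram(times, bins=n_windows, range=(0, n_windows * T))
          d = np.diff(counts)
          return np.mean(d**2) / (2 * np.mean(counts))

      for T in (0.5, 2.0, 8.0):
          print(f"AF(T={T}) = {allan_factor(event_times, T):.2f}")  # ~1 for Poisson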

  8. Quantifying terrestrial ecosystem carbon dynamics in the Jinsha watershed, Upper Yangtze, China from 1975 to 2000

    USGS Publications Warehouse

    Zhao, Shuqing

    2010-01-01

    Quantifying the spatial and temporal dynamics of carbon stocks in terrestrial ecosystems and carbon fluxes between the terrestrial biosphere and the atmosphere is critical to our understanding of regional patterns of carbon budgets. Here we use the General Ensemble biogeochemical Modeling System to simulate the terrestrial ecosystem carbon dynamics in the Jinsha watershed of China’s upper Yangtze basin from 1975 to 2000, based on unique combinations of spatial and temporal dynamics of major driving forces, such as climate, soil properties, nitrogen deposition, and land use and land cover changes. Our analysis demonstrates that the Jinsha watershed ecosystems acted as a carbon sink during the period of 1975–2000, with an average rate of 0.36 Mg/ha/yr, primarily resulting from regional climate variation and local land use and land cover change. Vegetation biomass accumulation accounted for 90.6% of the sink, while soil organic carbon loss before 1992 led to a lower net gain of carbon in the watershed, and after that soils became a small sink. Ecosystem carbon sink/source patterns showed a high degree of spatial heterogeneity. Carbon sinks were associated with forest areas without disturbances, whereas carbon sources were primarily caused by stand-replacing disturbances. It is critical to adequately represent the detailed fast-changing dynamics of land use activities in regional biogeochemical models to determine the spatial and temporal evolution of regional carbon sink/source patterns.

  9. Quantifying tsunami risk at the Pisco, Peru LNG terminal project

    NASA Astrophysics Data System (ADS)

    Synolakis, C. E.; Okal, E. A.; Borrero, J. C.

    2004-12-01

    We examine and quantify the tsunami risk near Pisco, Peru, where a major Liquefied Natural Gas facility is planned at Playa Loberia. We re-assess the historical record of tsunami damage along the coast of Central and Southern Peru, from 9 deg. S (Chimbote) to 19 deg. S (Arica), building seismic models of the events involved, and conducting numerical simulations of the run-up at Pisco that such models predict. We then evaluate possible return periods for the main seismic events under consideration, from a combination of historical datasets and plate tectonics arguments. We classify tsunami hazard according to the amplitude of run-up on the coast: decimetric tsunamis (0.1 to 1 m) do not carry a specific hazard over and beyond that presented by storm waves. Metric tsunamis (a few meters) can inflict severe damage to coastal and harbor communities, and result in inundation distances of up to 1 or 2 km. Finally, dekametric tsunamis (10 m and above) are catastrophic events leading to total destruction. We estimate that a scenario of metric run-up, which could substantially damage port facilities and lead to a number of fatalities, may have a repeat time at Pisco of about 60 years. A catastrophic tsunami of dekametric amplitude, capable of totally destroying harbor infrastructures, may have a repeat time of about 110 years. This result is also consistent with the "back-of-the-envelope" observation that the city was destroyed four times over the past 400 years. The last such tsunami took place 136 years ago.
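
    Treating the estimated repeat times as Poisson rates makes their practical meaning concrete; the 50-year exposure window in the sketch below is an arbitrary illustration, not a figure from the study.

      # With a Poisson recurrence model, P(at least one event in W years)
      # = 1 - exp(-W / repeat_time).
      import math

      for repeat_time, label in ((60, "metric run-up"), (110, "dekametric run-up")):
          p50 = 1 - math.exp(-50 / repeat_time)
          print(f"{label}: P(>=1 event in 50 yr) = {p50:.0%}")   # ~57% and ~37%

      # Sanity check against the historical note: four destructions in ~400
      # years suggests a ~100-yr repeat time, close to the ~110-yr estimate.
      print(f"400 yr / 4 events = {400 / 4:.0f} yr per event")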

  10. Quantifying community dynamics of nitrifiers in functionally stable reactors.

    PubMed

    Wittebolle, Lieven; Vervaeren, Han; Verstraete, Willy; Boon, Nico

    2008-01-01

    A sequential batch reactor (SBR) and a membrane bioreactor (MBR) were inoculated with the same sludge from a municipal wastewater treatment plant, supplemented with ammonium, and operated in parallel for 84 days. It was investigated whether the functional stability of the nitrification process corresponded with a static ammonia-oxidizing bacterial (AOB) community. The SBR provided complete nitrification during nearly the whole experimental run, whereas the MBR showed a buildup of 0 to 2 mg nitrite-N liter(-1) from day 45 until day 84. Based on the denaturing gradient gel electrophoresis profiles, two novel approaches were introduced to characterize and quantify the community dynamics and interspecies abundance ratios: (i) the rate of change [Δt(week)] parameter and (ii) the Pareto-Lorenz curve distribution pattern. During the whole sampling period, it was observed that neither of the reactor types maintained a static microbial community and that the SBR evolved more gradually than the MBR, particularly with respect to AOB (i.e., average weekly community changes of 12.6% +/- 5.2% for the SBR and 24.6% +/- 14.3% for the MBR). Based on the Pareto-Lorenz curves, it was observed that only a small group of AOB species played a numerically dominant role in the nitritation of both reactors, and this was true especially for the MBR. The remaining less dominant species were speculated to constitute a reserve of AOB which can proliferate to replace the dominant species. The value of these parameters in terms of tools to assist the operation of activated-sludge systems is discussed. PMID:17981943
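
    The Pareto-Lorenz construction ranks species by abundance and plots cumulative abundance against the cumulative fraction of species; a common summary is the share of total abundance captured by the top 20% of species. A minimal sketch on a hypothetical band-intensity profile follows (not DGGE data from the study).

      # Pareto-Lorenz evenness curve from a community profile: species ranked
      # by abundance, cumulative species fraction vs. cumulative abundance.
      import numpy as np

      band_intensity = np.array([40.0, 25.0, 15.0, 8.0, 5.0, 3.0, 2.0, 1.0, 0.5, 0.5])
      abundance = np.sort(band_intensity)[::-1] / band_intensity.sum()

      cum_species = np.arange(1, abundance.size + 1) / abundance.size
      cum_abundance = np.cumsum(abundance)

      # Share of abundance in the top 20% of species: 0.2 means perfect
      # evenness, values near 1.0 mean strong dominance by a few species.
      top20 = np.interp(0.2, cum_species, cum_abundance)
      print(f"Top 20% of species account for {top20:.0%} of total abundance")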

  11. Quantifying Reconnection in Fragmented 3D Current Layers

    NASA Astrophysics Data System (ADS)

    Fraser Wyper, Peter; Hesse, Michael

    2015-04-01

    There is growing evidence that when magnetic reconnection occurs in high Lundquist number plasmas such as in the Solar Corona or the Earth's Magnetosphere it does so within a fragmented, rather than a smooth current layer. Within the extent of these fragmented current regions the associated magnetic flux transfer and energy release occurs simultaneously in many different places. This simultaneous energy release and flux transfer has been postulated as a possible resolution to the problem of obtaining “fast” reconnection rates in such high conductivity plasmas. But how does one measure the reconnection rate in such fragmented current layers? In 2D the reconnection rate is simply given by the electric field at the dominant X-point, typically then normalized by the product of the upstream magnetic field strength and Alfven speed. However, the continuous nature of connection change in 3D makes measuring the reconnection rate much more challenging. Building on the analytical work of previous investigations (e.g. Hesse & Schindler 1988, Hesse & Birn 1993, Hesse et al. 2005) we present recently derived expressions providing, for the first time, a quantitative measure of reconnection rate in fragmented 3D current layers. We show that in 3D two measures actually characterize the rate of flux transfer; a total rate which measures the true rate at which new connections are formed and a net rate which measures the net change of connection associated with the largest value of ∫E∥dl through all of the non-ideal regions. Some simple examples will be used to illustrate how each expression may be applied and what it quantifies. This work was supported by an appointment to the NASA Postdoctoral Program and by NASA’s Magnetospheric Multiscale mission.
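
    In the notation of that body of work, connection change in a non-ideal region is measured by field-line integrals of the parallel electric field; the following is a paraphrase of the two measures described above (our notation, not the authors' exact expressions):

      \[
        \left.\frac{d\Phi}{dt}\right|_{\mathrm{net}}
          = \max_i \Big[ \max_{\text{field lines}} \int_{R_i} E_{\parallel}\, dl \Big],
        \qquad
        \left.\frac{d\Phi}{dt}\right|_{\mathrm{total}}
          = \sum_i \max_{\text{field lines}} \int_{R_i} E_{\parallel}\, dl ,
      \]

    where the inner maximum is taken over field lines threading each non-ideal region R_i. The net rate reduces to the usual X-point electric field in the 2D limit, while the total rate also counts flux transferred simultaneously across all fragments of the layer.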

  12. Quantifying Community Dynamics of Nitrifiers in Functionally Stable Reactors

    PubMed Central

    Wittebolle, Lieven; Vervaeren, Han; Verstraete, Willy; Boon, Nico

    2008-01-01

    A sequential batch reactor (SBR) and a membrane bioreactor (MBR) were inoculated with the same sludge from a municipal wastewater treatment plant, supplemented with ammonium, and operated in parallel for 84 days. It was investigated whether the functional stability of the nitrification process corresponded with a static ammonia-oxidizing bacterial (AOB) community. The SBR provided complete nitrification during nearly the whole experimental run, whereas the MBR showed a buildup of 0 to 2 mg nitrite-N liter(-1) from day 45 until day 84. Based on the denaturing gradient gel electrophoresis profiles, two novel approaches were introduced to characterize and quantify the community dynamics and interspecies abundance ratios: (i) the rate of change [Δt(week)] parameter and (ii) the Pareto-Lorenz curve distribution pattern. During the whole sampling period, it was observed that neither of the reactor types maintained a static microbial community and that the SBR evolved more gradually than the MBR, particularly with respect to AOB (i.e., average weekly community changes of 12.6% ± 5.2% for the SBR and 24.6% ± 14.3% for the MBR). Based on the Pareto-Lorenz curves, it was observed that only a small group of AOB species played a numerically dominant role in the nitritation of both reactors, and this was true especially for the MBR. The remaining less dominant species were speculated to constitute a reserve of AOB which can proliferate to replace the dominant species. The value of these parameters in terms of tools to assist the operation of activated-sludge systems is discussed. PMID:17981943

  13. Quantified trends in the history of verbal behavior research

    PubMed Central

    Eshleman, John W.

    1991-01-01

    The history of scientific research about verbal behavior research, especially that based on Verbal Behavior (Skinner, 1957), can be assessed on the basis of a frequency and celeration analysis of the published and presented literature. In order to discover these quantified trends, a comprehensive bibliographical database was developed. Based on several literature searches, the bibliographic database included papers pertaining to verbal behavior that were published in the Journal of the Experimental Analysis of Behavior, the Journal of Applied Behavior Analysis, Behaviorism, The Behavior Analyst, and The Analysis of Verbal Behavior. A nonbehavioral journal, the Journal of Verbal Learning and Verbal Behavior was assessed as a nonexample comparison. The bibliographic database also included a listing of verbal behavior papers presented at the meetings of the Association for Behavior Analysis. Papers were added to the database if they (a) were about verbal behavior, (b) referenced B.F. Skinner's (1957) book Verbal Behavior, or (c) did both. Because the references indicated the year of publication or presentation, a count per year of them was measured. These yearly frequencies were plotted on Standard Celeration Charts. Once plotted, various celeration trends in the literature became visible, not the least of which was the greater quantity of verbal behavior research than is generally acknowledged. The data clearly show an acceleration of research across the past decade. The data also question the notion that a “paucity” of research based on Verbal Behavior currently exists. Explanations of the acceleration of verbal behavior research are suggested, and plausible reasons are offered as to why a relative lack of verbal behavior research extended from the mid-1960s through the late 1970s. PMID:22477630

  14. Quantifying sensitivity to droughts - an experimental modeling approach

    NASA Astrophysics Data System (ADS)

    Staudinger, M.; Weiler, M.; Seibert, J.

    2015-03-01

    Meteorological droughts like those in summer 2003 or spring 2011 in Europe are expected to become more frequent in the future. Although the spatial extent of these drought events was large, not all regions were affected in the same way. Many catchments reacted strongly to the meteorological droughts showing low levels of streamflow and groundwater, while others hardly reacted. Also, the extent of the hydrological drought for specific catchments was different between these two historical events due to different initial conditions and drought propagation processes. This leads to the important question of how to detect and quantify the sensitivity of a catchment to meteorological droughts. To address this question we designed hydrological model experiments using a conceptual rainfall-runoff model. Two drought scenarios were constructed by selecting precipitation and temperature observations based on certain criteria: one scenario was a modest but constant progression of drying based on sorting the years of observations according to annual precipitation amounts. The other scenario was a more extreme progression of drying based on selecting months from different years, forming a year with the wettest months through to a year with the driest months. Both scenarios retained the observed intra-annual seasonality for the region. We evaluated the sensitivity of 24 Swiss catchments to these scenarios by analyzing the simulated discharge time series and modeled storage. Mean catchment elevation, slope and area were the main controls on the sensitivity of catchment discharge to precipitation. Generally, catchments at higher elevation and with steeper slopes appeared less sensitive to meteorological droughts than catchments at lower elevations with less steep slopes.
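
    Both scenarios are simple reorderings of the observed record; a minimal sketch of their construction from a monthly precipitation matrix follows (synthetic values, not the Swiss observations).

      # Construct the two drought scenarios described above:
      # (a) whole years reordered from wettest to driest annual totals;
      # (b) composite years assembled month-by-month, wettest to driest.
      import numpy as np

      rng = np.random.default_rng(3)
      n_years = 30
      precip = rng.gamma(4.0, 25.0, size=(n_years, 12))   # monthly precip, mm

      # Scenario 1: sort whole years by annual total, descending (gradual drying).
      scenario1 = precip[np.argsort(precip.sum(axis=1))[::-1]]

      # Scenario 2: sort each calendar month across years, descending; year k
      # then pairs the k-th wettest January with the k-th wettest February,
      # etc. (more extreme drying, seasonality retained).
      scenario2 = np.sort(precip, axis=0)[::-1]

      print(scenario1.sum(axis=1)[[0, -1]])   # wettest and driest reordered years
      print(scenario2.sum(axis=1)[[0, -1]])   # wettest and driest composite years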

  15. Evaluation of airborne topographic lidar for quantifying beach changes

    USGS Publications Warehouse

    Sallenger, A.H., Jr.; Krabill, W.B.; Swift, R.N.; Brock, J.; List, J.; Hansen, M.; Holman, R.A.; Manizade, S.; Sontag, J.; Meredith, A.; Morgan, K.; Yunkel, J.K.; Frederick, E.B.; Stockdon, H.

    2003-01-01

    A scanning airborne topographic lidar was evaluated for its ability to quantify beach topography and changes during the Sandy Duck experiment in 1997 along the North Carolina coast. Elevation estimates, acquired with NASA's Airborne Topographic Mapper (ATM), were compared to elevations measured with three types of ground-based measurements - 1) differential GPS equipped all-terrain vehicle (ATV) that surveyed a 3-km reach of beach from the shoreline to the dune, 2) GPS antenna mounted on a stadia rod used to intensely survey a different 100 m reach of beach, and 3) a second GPS-equipped ATV that surveyed a 70-km-long transect along the coast. Over 40,000 individual intercomparisons between ATM and ground surveys were calculated. RMS vertical differences associated with the ATM when compared to ground measurements ranged from 13 to 19 cm. Considering all of the intercomparisons together, RMS ≈ 15 cm. This RMS error represents a total error for individual elevation estimates including uncertainties associated with random and mean errors. The latter was the largest source of error and was attributed to drift in differential GPS. The ≈15 cm vertical accuracy of the ATM is adequate to resolve beach-change signals typical of the impact of storms. For example, ATM surveys of Assateague Island (spanning the border of MD and VA) prior to and immediately following a severe northeaster showed vertical beach changes in places greater than 2 m, much greater than expected errors associated with the ATM. A major asset of airborne lidar is the high spatial data density. Measurements of elevation are acquired every few m2 over regional scales of hundreds of kilometers. Hence, many scales of beach morphology and change can be resolved, from beach cusps tens of meters in wavelength to entire coastal cells comprising tens to hundreds of kilometers of coast. Topographic lidars similar to the ATM are becoming increasingly available from commercial vendors and should, in the future, be widely used in beach surveying.
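
    The RMS figures come from point-to-point differencing of matched lidar and ground elevations; a minimal sketch that separates the mean (bias) and random components follows, on synthetic elevations rather than the survey data.

      # RMS vertical difference between matched lidar and ground-survey
      # elevations, decomposed into bias and random parts.
      import numpy as np

      rng = np.random.default_rng(4)
      n = 40000                              # comparable to the >40,000 intercomparisons
      z_ground = rng.uniform(0, 5, n)        # synthetic ground-survey elevations, m
      bias = 0.08                            # synthetic mean offset (e.g., GPS drift), m
      z_lidar = z_ground + bias + rng.normal(0, 0.12, n)

      diff = z_lidar - z_ground
      rms = np.sqrt(np.mean(diff**2))        # RMS^2 = bias^2 + random^2
      print(f"bias = {diff.mean():.3f} m, random = {diff.std():.3f} m, "
            f"RMS = {rms:.3f} m")            # ~0.14 m for these inputs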

  16. Quantifying cortical EEG responses to TMS in (un)consciousness.

    PubMed

    Sarasso, Simone; Rosanova, Mario; Casali, Adenauer G; Casarotto, Silvia; Fecchio, Matteo; Boly, Melanie; Gosseries, Olivia; Tononi, Giulio; Laureys, Steven; Massimini, Marcello

    2014-01-01

    We normally assess another individual's level of consciousness based on her or his ability to interact with the surrounding environment and communicate. Usually, if we observe purposeful behavior, appropriate responses to sensory inputs, and, above all, appropriate answers to questions, we can be reasonably sure that the person is conscious. However, we know that consciousness can be entirely within the brain, even in the absence of any interaction with the external world; this happens almost every night, while we dream. Yet, to this day, we lack an objective, dependable measure of the level of consciousness that is independent of processing sensory inputs and producing appropriate motor outputs. Theoretically, consciousness is thought to require the joint presence of functional integration and functional differentiation, otherwise defined as brain complexity. Here we review a series of recent studies in which Transcranial Magnetic Stimulation combined with electroencephalography (TMS/EEG) has been employed to quantify brain complexity in wakefulness and during physiological (sleep), pharmacological (anesthesia) and pathological (brain injury) loss of consciousness. These studies invariably show that the complexity of the cortical response to TMS collapses when consciousness is lost during deep sleep, anesthesia and vegetative state following severe brain injury, while it recovers when consciousness resurges in wakefulness, during dreaming, in the minimally conscious state or locked-in syndrome. The present paper will also focus on how this approach may contribute to unveiling the pathophysiology of disorders of consciousness affecting brain-injured patients. Finally, we will underline some crucial methodological aspects concerning TMS/EEG measurements of brain complexity. PMID:24403317
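
    The complexity measure behind several of the reviewed studies rests on the Lempel-Ziv compressibility of the binarized TMS-evoked response (the core of the perturbational complexity index). A minimal sketch of that step on synthetic binary sequences follows; it is an illustration, not the published pipeline.

      # Lempel-Ziv (LZ76) complexity of a binarized response: the number of
      # distinct phrases encountered while scanning the sequence.
      import numpy as np

      def lz_complexity(s):
          """Count LZ76 phrases in a binary string."""
          i, c, n = 0, 0, len(s)
          while i < n:
              l = 1
              # extend the current phrase while it already appears earlier
              while i + l <= n and s[i:i + l] in s[:i + l - 1]:
                  l += 1
              c += 1
              i += l
          return c

      rng = np.random.default_rng(5)
      rich = "".join(rng.choice(["0", "1"], 512))   # differentiated, wake-like response
      stereotyped = "01" * 256                      # stereotyped, sleep-like response
      print(lz_complexity(rich), lz_complexity(stereotyped))  # high vs. very low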

  17. A FRAMEWORK FOR QUANTIFYING THE DEGENERACIES OF EXOPLANET INTERIOR COMPOSITIONS

    SciTech Connect

    Rogers, L. A.; Seager, S. [Department of Physics, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States)

    2010-04-01

    Several transiting super-Earths are expected to be discovered in the coming few years. While tools to model the interior structure of transiting planets exist, inferences about the composition are fraught with ambiguities. We present a framework to quantify how much we can robustly infer about super-Earth and Neptune-size exoplanet interiors from radius and mass measurements. We introduce quaternary diagrams to illustrate the range of possible interior compositions for planets with four layers (iron core, silicate mantles, water layers, and H/He envelopes). We apply our model to CoRoT-7b, GJ 436b, and HAT-P-11b. Interpretation of planets with H/He envelopes is limited by the model uncertainty in the interior temperature, while for CoRoT-7b observational uncertainties dominate. We further find that our planet interior model sharpens the observational constraints on CoRoT-7b's mass and radius, assuming the planet does not contain significant amounts of water or gas. We show that the strength of the limits that can be placed on a super-Earth's composition depends on the planet's density; for similar observational uncertainties, high-density super-Mercuries allow the tightest composition constraints. Finally, we describe how techniques from Bayesian statistics can be used to take into account in a formal way the combined contributions of both theoretical and observational uncertainties to ambiguities in a planet's interior composition. On the whole, with only a mass and radius measurement an exact interior composition cannot be inferred for an exoplanet because the problem is highly underconstrained. Detailed quantitative ranges of plausible compositions, however, can be found.
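
    The underlying degeneracy can be seen even in a toy model with incompressible layers: at fixed mass, modest changes in radius swing the inferred iron core mass fraction widely. The sketch below assumes fixed mean layer densities (chosen crudely to mimic compression) and is emphatically not the paper's equation-of-state model.

      # Toy two-layer planet (iron core + silicate mantle) with fixed mean
      # layer densities: solve the core mass fraction from the bulk density.
      import math

      RHO_FE, RHO_SI = 12000.0, 5500.0   # assumed compressed mean densities, kg/m^3

      def core_mass_fraction(mass_kg, radius_m):
          volume = 4.0 / 3.0 * math.pi * radius_m**3
          rho_mean = mass_kg / volume
          # mass balance: 1/rho_mean = x/RHO_FE + (1 - x)/RHO_SI
          x = (1 / rho_mean - 1 / RHO_SI) / (1 / RHO_FE - 1 / RHO_SI)
          return min(max(x, 0.0), 1.0)

      M_E, R_E = 5.972e24, 6.371e6       # Earth mass (kg) and radius (m)
      for r in (1.4, 1.5, 1.6):          # radii in Earth radii, fixed 5 Earth masses
          print(r, round(core_mass_fraction(5 * M_E, r * R_E), 2))
      # ~0.84, 0.60, 0.34: a ~15% radius change spans very different interiors.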

  18. Quantified energy dissipation rates in the terrestrial bow shock: 2. Waves and dissipation

    NASA Astrophysics Data System (ADS)

    Wilson, L. B.; Sibeck, D. G.; Breneman, A. W.; Contel, O. Le; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.

    2014-08-01

    We present the first quantified measure of the energy dissipation rates, due to wave-particle interactions, in the transition region of the Earth's collisionless bow shock using data from the Time History of Events and Macroscale Interactions during Substorms spacecraft. Our results show that wave-particle interactions can regulate the global structure and dominate the energy dissipation of collisionless shocks. In every bow shock crossing examined, we observed both low-frequency (<10 Hz) and high-frequency (≥10 Hz) electromagnetic waves throughout the entire transition region and into the magnetosheath. The low-frequency waves were consistent with magnetosonic-whistler waves. The high-frequency waves were combinations of ion-acoustic waves, electron cyclotron drift instability driven waves, electrostatic solitary waves, and whistler mode waves. The high-frequency waves had the following: (1) peak amplitudes exceeding δB ~ 10 nT and δE ~ 300 mV/m, though more typical values were δB ~ 0.1-1.0 nT and δE ~ 10-50 mV/m; (2) Poynting fluxes in excess of 2000 µW m-2 (typical values were ~1-10 µW m-2); (3) resistivities > 9000 Ω m; and (4) associated energy dissipation rates > 10 µW m-3. The dissipation rates due to wave-particle interactions exceeded rates necessary to explain the increase in entropy across the shock ramps for ~90% of the wave burst durations. For ~22% of these times, the wave-particle interactions needed to be only ~0.1% efficient to balance the nonlinear wave steepening that produced the shock waves. These results show that wave-particle interactions have the capacity to regulate the global structure and dominate the energy dissipation of collisionless shocks.
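
    The quoted peak Poynting fluxes follow directly from the peak amplitudes under a plane-wave estimate, |S| ~ δE δB / µ0; a short worked check:

      # Plane-wave estimate of peak Poynting flux from the quoted amplitudes.
      import math

      MU0 = 4e-7 * math.pi   # vacuum permeability, H/m
      dE = 300e-3            # peak electric field, V/m (300 mV/m)
      dB = 10e-9             # peak magnetic field, T (10 nT)

      S = dE * dB / MU0      # W/m^2
      print(f"S ~ {S * 1e6:.0f} uW m^-2")   # ~2400, consistent with >2000 uW m^-2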

  19. Populations of Monarch butterflies with different migratory behaviors show divergence in wing morphology.

    PubMed

    Altizer, Sonia; Davis, Andrew K

    2010-04-01

    The demands of long-distance flight represent an important evolutionary force operating on the traits of migratory species. Monarchs are widespread butterflies known for their annual migrations in North America. We examined divergence in wing morphology among migratory monarchs from eastern and western N. America, and nonmigratory monarchs in S. Florida, Puerto Rico, Costa Rica, and Hawaii. For the three N. American populations, we also examined monarchs reared in four common environment experiments. We used image analysis to measure multiple traits including forewing area and aspect ratio; for laboratory-reared monarchs we also quantified body area and wing loading. Results showed wild monarchs from all nonmigratory populations were smaller than those from migratory populations. Wild and captive-reared eastern monarchs had the largest and most elongated forewings, whereas monarchs from Puerto Rico and Costa Rica had the smallest and roundest forewings. Eastern monarchs also had the largest bodies and high measures of wing loading, whereas western and S. Florida monarchs had less elongated forewings and smaller bodies. Among captive-reared butterflies, family-level effects provided evidence that genetic factors contributed to variation in wing traits. Collectively, these results support evolutionary responses to long-distance flight in monarchs, with implications for the conservation of phenotypically distinct wild populations. PMID:20067519

  20. Software for portable laser light show system

    NASA Astrophysics Data System (ADS)

    Buruchin, Dmitrey J.; Leonov, Alexander F.

    1995-04-01

    The portable laser light show system LS-3500-10M connects to the parallel port of an IBM PC/AT-compatible computer, which outputs the digital control data describing the images. A specially designed control device converts the digital data coming from the parallel port into the analog signal that drives the scanner. The capabilities of even an inexpensive 286 computer are quite sufficient for laser graphics control. The scanning technology used in the LS-3500-10M differs essentially from the widespread systems based on galvanometers with a moving core or a moving magnet, which operate on the same principle as an electrically driven servomechanism. As the scanner we use an open-loop elastic system with hydraulically damped oscillations. For most laser graphics applications, such a system provides satisfactory scanning precision and speed. The LS-3500-10M software lets the user create laser graphics demonstrations on a PC and play them back. It can render recognizable text and pictures in different styles, as well as 3D and abstract animation, and all types of demonstrations can be mixed in a slide show with time synchronization. The software has the following features: (1) different types of text output, with a built-in text editor for typing and editing textual information; text can be displayed in different fonts, and the user can create custom fonts with a specially developed font editor; (2) an editor for 3D animation with a library of predefined shapes; (3) abstract animation provided by software routines; (4) support for common graphics file formats (PCX and DXF), including an original raster image tracing algorithm; and (5) a built-in slide-show editor.