Sample records for quantified results show

  1. Quantified security is a weak hypothesis: a critical survey of results and assumptions

    Microsoft Academic Search

    Vilhelm Verendel

    2009-01-01

    This paper critically surveys previous work on quantitative representation and analysis of security. Such quantified security has been presented as a general approach to precisely assess and control security. We classify a significant part of the work between 1981 and 2008 with respect to security perspective, target of quantification, underlying assumptions and type of validation. The result shows how the

  2. 14. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF INADEQUATE TAMPING. THE SIZE OF THE GRANITE AGGREGATE USED IN THE DAM'S CONCRETE IS CLEARLY SHOWN. - Hume Lake Dam, Sequoia National Forest, Hume, Fresno County, CA

  3. 13. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. DETAIL VIEW OF BUTTRESS 4 SHOWING THE RESULTS OF POOR CONSTRUCTION WORK. THOUGH NOT A SERIOUS STRUCTURAL DEFICIENCY, THE 'HONEYCOMB' TEXTURE OF THE CONCRETE SURFACE WAS THE RESULT OF INADEQUATE TAMPING AT THE TIME OF THE INITIAL 'POUR'. - Hume Lake Dam, Sequoia National Forest, Hume, Fresno County, CA

  4. The Paris MEGAPOLI campaign to better quantify organic aerosol formation in a large agglomeration: first results

    NASA Astrophysics Data System (ADS)

    Beekmann, Matthias; Baltensperger, Urs; Sciare, Jean; Gros, Valérie; Borbon, Agnes; Baklanov, Alexander; Lawrence, Mark; Pandis, Spyros

    2010-05-01

    Within the FP7 MEGAPOLI project, two intensive field campaigns have been conducted in the Greater Paris region during July 2009 and January/February 2010. The major aim was to quantify sources of primary and secondary aerosol, and the interaction with gaseous precursors, within a large agglomeration, and in its plume. Greater Paris has been chosen for such a campaign because it is a major and dense pollution source (more than 10 million inhabitants), surrounded by rural areas and relatively flat terrain. A particular focus is placed on organic carbon, for which both secondary formation and primary emissions are still not well quantified. Detailed aerosol and gaseous precursor measurements have been conducted at an urban and two sub-urban sites, from five mobile platforms and from the French ATR-42 research aircraft (for plume characterisation). State-of-the-art instrumentation has allowed determination of aerosol chemical composition, either with very high frequency (several minutes to half an hour) or with large chemical detail (several dozens of organic compounds from filter samples). In addition, the size distribution and the optical, hygroscopic, and mixing properties have been determined in order to relate the aerosol chemical composition to its potential radiative and climate impact in the urban region and its plume. Gas phase measurements have focussed especially on detailed VOC measurements in order to relate SOA build-up to gaseous precursor species abundance. A network of backscatter lidars at urban and rural sites and on a mobile platform gives access to the aerosol vertical distribution in the region and to variations of the boundary layer height at the urban/rural interface. Meteorological parameters and especially wind profile measurements allow interpretation of transport processes in the region. In this paper, the campaign set-up and objectives, meteorological and general pollution conditions observed during the field experiments, and a first overview of the measurement results will be given. Selected first results obtained during the campaign will be highlighted. For instance, from airborne primary pollutant measurements it appeared that the pollution plume was still well defined at more than one hundred kilometres downwind from the agglomeration. This will give a "safe" framework for evaluating secondary organic aerosol build-up in the plume. Significant new particle formation events were observed in the area during the whole month of the campaign. These events were assisted by the relatively low particulate matter concentration levels and resulting low surface area during most of July 2009. Preliminary attribution of organic aerosol (OA) from urban and peri-urban AMS (aerosol mass spectrometer) measurements during the summer campaign shows a large fraction of oxidised organic aerosol (OOA), comprising both chemically processed (oxidised) primary organic aerosol and classical secondary organic aerosol (from aromatic and biogenic VOC precursors), and a smaller fraction of unoxidised organic aerosol (HOA) of primary origin. Another aspect is water solubility of OA available from PILS-TOC measurements. At the urban LHVP site, about half of the OA is water soluble, probably corresponding to classical secondary organic aerosol; the other half is water insoluble, probably corresponding to primary and chemically processed primary OA. First attempts at source attribution of primary OA will also be presented.
Finally, the comprehensive data set obtained during the campaign will be used for a first evaluation of regional chemistry-transport model simulations.

  5. Results of using a wireless inertial measuring system to quantify gait motions in control subjects.

    PubMed

    Tien, Iris; Glaser, Steven D; Bajcsy, Ruzena; Goodin, Douglas S; Aminoff, Michael J

    2010-07-01

    Gait analysis is important for the diagnosis of many neurological diseases such as Parkinson's. The discovery and interpretation of minor gait abnormalities can aid in early diagnosis. We have used an inertial measuring system mounted on the subject's foot to provide numerical measures of a subject's gait (3-D displacements and rotations), thereby creating an automated tool intended to facilitate diagnosis and enable quantitative prognostication of various neurological disorders in which gait is disturbed. This paper describes the process used for ensuring that these inertial measurement units yield accurate and reliable displacement and rotation data, and for validating the precision and robustness of the gait-deconstruction algorithms. It also presents initial results from control subjects, focusing on understanding the data recorded by the shoe-mounted sensor to quantify relevant gait-related motions. PMID:19423449
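
    The abstract does not give the gait-deconstruction algorithm itself; below is a minimal sketch of the usual strapdown approach (double integration of foot-mounted accelerometer data with zero-velocity updates during stance). All names and the synthetic signal are hypothetical, for illustration only.

        import numpy as np

        def displacement_from_acceleration(acc, dt, stance_mask):
            """Double-integrate gravity-compensated foot acceleration (one axis).

            stance_mask flags samples where the foot is flat on the ground;
            zeroing velocity there (a ZUPT) limits integration drift.
            """
            vel = np.cumsum(acc) * dt          # first integration: velocity
            vel[stance_mask] = 0.0             # zero-velocity update during stance
            return np.cumsum(vel) * dt         # second integration: displacement

        # Toy example: 2 s of synthetic swing-phase acceleration sampled at 100 Hz
        dt = 0.01
        t = np.arange(0.0, 2.0, dt)
        acc = np.sin(2 * np.pi * t)            # stand-in for a real recording
        stance = t < 0.5                       # pretend the first 0.5 s is stance
        print(displacement_from_acceleration(acc, dt, stance)[-1])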

  6. Quantifying Uncertainty in Model Predictions for the Pliocene (Plio-QUMP): Initial results

    USGS Publications Warehouse

    Pope, J.O.; Collins, M.; Haywood, A.M.; Dowsett, H.J.; Hunter, S.J.; Lunt, D.J.; Pickering, S.J.; Pound, M.J.

    2011-01-01

    Examination of the mid-Pliocene Warm Period (mPWP; ~3.3 to 3.0 Ma BP) provides an excellent opportunity to test the ability of climate models to reproduce warm climate states, thereby assessing our confidence in model predictions. To do this it is necessary to relate the uncertainty in model simulations of mPWP climate to uncertainties in projections of future climate change. The uncertainties introduced by the model can be estimated through the use of a Perturbed Physics Ensemble (PPE). Developing on the UK Met Office Quantifying Uncertainty in Model Predictions (QUMP) Project, this paper presents the results from an initial investigation using the end members of a PPE in a fully coupled atmosphere-ocean model (HadCM3) running with appropriate mPWP boundary conditions. Prior work has shown that the unperturbed version of HadCM3 may underestimate mPWP sea surface temperatures at higher latitudes. Initial results indicate that neither the low sensitivity nor the high sensitivity simulations produce unequivocally improved mPWP climatology relative to the standard. Whilst the high sensitivity simulation was able to reconcile up to 6 °C of the data/model mismatch in sea surface temperatures in the high latitudes of the Northern Hemisphere (relative to the standard simulation), it did not produce a better prediction of global vegetation than the standard simulation. Overall the low sensitivity simulation was degraded compared to the standard and high sensitivity simulations in all aspects of the data/model comparison. The results have shown that a PPE has the potential to explore weaknesses in mPWP modelling simulations which have been identified by geological proxies, but that a 'best fit' simulation will more likely come from a full ensemble in which simulations that contain the strengths of the two end member simulations shown here are combined. © 2011 Elsevier B.V.

  7. Astronomy Diagnostic Test Results Reflect Course Goals and Show Room for Improvement

    NASA Astrophysics Data System (ADS)

    Lopresto, Michael C.

    The results of administering the Astronomy Diagnostic Test (ADT) to introductory astronomy students at Henry Ford Community College over three years have shown gains comparable with national averages. Results have also accurately corresponded to course goals, showing greater gains in topics covered in more detail, and lower gains in topics covered in less detail. Also evident in the results were topics for which improvement of instruction is needed. These factors and the ease with which the ADT can be administered constitute evidence of the usefulness of the ADT as an assessment instrument for introductory astronomy.

  8. Nanotribology Results Show that DNA Forms a Mechanically Resistant 2D Network in Metaphase Chromatin Plates

    PubMed Central

    Gállego, Isaac; Oncins, Gerard; Sisquella, Xavier; Fernàndez-Busquets, Xavier; Daban, Joan-Ramon

    2010-01-01

    In a previous study, we found that metaphase chromosomes are formed by thin plates, and here we have applied atomic force microscopy (AFM) and friction force measurements at the nanoscale (nanotribology) to analyze the properties of these planar structures in aqueous media at room temperature. Our results show that high concentrations of NaCl and EDTA and extensive digestion with protease and nuclease enzymes cause plate denaturation. Nanotribology studies show that native plates under structuring conditions (5 mM Mg2+) have a relatively high friction coefficient (μ ≈ 0.3), which is markedly reduced when high concentrations of NaCl or EDTA are added (μ ≈ 0.1). This lubricant effect can be interpreted considering the electrostatic repulsion between DNA phosphate groups and the AFM tip. Protease digestion increases the friction coefficient (μ ≈ 0.5), but the highest friction is observed when DNA is cleaved by micrococcal nuclease (μ ≈ 0.9), indicating that DNA is the main structural element of plates. Whereas nuclease-digested plates are irreversibly damaged after the friction measurement, native plates can absorb kinetic energy from the AFM tip without suffering any damage. These results suggest that plates are formed by a flexible and mechanically resistant two-dimensional network which allows the safe storage of DNA during mitosis. PMID:21156137
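
    The friction coefficients quoted above are ratios of lateral (friction) force to applied normal load. A minimal sketch of how μ is commonly extracted from AFM friction-versus-load data, as the slope of a linear fit, follows; the numbers are hypothetical.

        import numpy as np

        # Hypothetical friction-vs-load data from AFM friction loops (nN)
        normal_load    = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
        friction_force = np.array([1.6,  3.1,  4.4,  6.2,  7.4])

        # The friction coefficient is the slope of friction force vs normal load
        mu, offset = np.polyfit(normal_load, friction_force, 1)
        print(f"friction coefficient mu ~ {mu:.2f}")   # ~0.3 here, the native-plate range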

  9. Results of inpatient survey show the case for safe staffing is undeniable.

    PubMed

    Osborne, Susan

    2015-06-01

    When are we going to learn lessons from the national NHS inpatient survey, published at the end of last month by the Care Quality Commission, which shows standards are sub-optimal in our cherished NHS? PMID:26036398

  10. Long-Term Trial Results Show No Mortality Benefit from Annual Prostate Cancer Screening

    Cancer.gov

    Thirteen-year follow-up data from the Prostate, Lung, Colorectal and Ovarian (PLCO) cancer screening trial show higher incidence but similar mortality among men screened annually with the prostate-specific antigen (PSA) test and digital rectal examination (DRE).

  11. What Gödel's Incompleteness Result Does and Does Not Show

    E-print Network

    Gaifman, Haim

    cannot fully grasp how it works. Gödel's result also points out a certain essential limitation of self-reflection. How this is related to self-reflection will become clear at the end of this comment. I should add that the full

  12. Hector's dolphin risk assessments: old and new analyses show consistent results

    Microsoft Academic Search

    E Slooten; N Davies

    2012-01-01

    We review results of previous research and present new estimates of Hector's dolphin (Cephalorhynchus hectori) bycatch. Before 2008, an estimated total of 110–150 individuals were caught annually, with 35–46 caught off the east coast South Island (ECSI). We estimate that 23 Hector's dolphins were caught off ECSI during 1 May 2009–30 April 2010 (CV 0.21) based on fisheries observer data.

  13. Hector's dolphin risk assessments: old and new analyses show consistent results

    Microsoft Academic Search

    E Slooten; N Davies

    2011-01-01

    We review results of previous research and present new estimates of Hector's dolphin (Cephalorhynchus hectori) bycatch. Before 2008, an estimated total of 110–150 individuals were caught annually, with 35–46 caught off the east coast South Island (ECSI). We estimate that 23 Hector's dolphins were caught off ECSI during 1 May 2009–30 April 2010 (CV 0.21) based on fisheries observer data.

  14. Results From Mars Show Electrostatic Charging of the Mars Pathfinder Sojourner Rover

    NASA Technical Reports Server (NTRS)

    Kolecki, Joseph C.; Siebert, Mark W.

    1998-01-01

    Indirect evidence (dust accumulation) has been obtained indicating that the Mars Pathfinder rover, Sojourner, experienced electrostatic charging on Mars. Lander camera images of the Sojourner rover provide distinctive evidence of dust accumulation on rover wheels during traverses, turns, and crabbing maneuvers. The sol 22 (22nd Martian "day" after Pathfinder landed) end-of-day image clearly shows fine red dust concentrated around the wheel edges with additional accumulation in the wheel hubs. A sol 41 image of the rover near the rock "Wedge" shows a more uniform coating of dust on the wheel drive surfaces with accumulation in the hubs similar to that in the previous image. In the sol 41 image, note particularly the loss of black-white contrast on the Wheel Abrasion Experiment strips (center wheel). This loss of contrast was also seen when dust accumulated on test wheels in the laboratory. We believe that this accumulation occurred because the Martian surface dust consists of clay-sized particles, similar to those detected by Viking, which have become electrically charged. By adhering to the wheels, the charged dust carries a net nonzero charge to the rover, raising its electrical potential relative to its surroundings. Similar charging behavior was routinely observed in an experimental facility at the NASA Lewis Research Center, where a Sojourner wheel was driven in a simulated Martian surface environment. There, as the wheel moved and accumulated dust, electrical potentials in excess of 100 V (relative to the chamber ground) were detected by a capacitively coupled electrostatic probe located 4 mm from the wheel surface. The measured wheel capacitance was approximately 80 picofarads (pF), and the calculated charge, 8 × 10^-9 coulombs (C). Voltage differences of 100 V and greater are believed sufficient to produce Paschen electrical discharge in the Martian atmosphere. With an accumulated net charge of 8 × 10^-9 C, and average arc time of 1 msec, arcs can also occur with estimated arc currents approaching 10 milliamperes (mA). Discharges of this magnitude could interfere with the operation of sensitive electrical or electronic elements and logic circuits. Before launch, we believed that the dust would become triboelectrically charged as it was moved about and compacted by the rover wheels. In all cases observed in the laboratory, the test wheel charged positively, and the wheel tracks charged negatively. Dust samples removed from the laboratory wheel averaged a few to tens of micrometers in size (clay size). Coarser grains were left behind in the wheel track. On Mars, grain size estimates of 2 to 10 µm were derived for the Martian surface materials from the Viking Gas Exchange Experiment. These size estimates approximately match the laboratory samples. Our tentative conclusion for the Sojourner observations is that fine clay-sized particles acquired an electrostatic charge during rover traverses and adhered to the rover wheels, carrying electrical charge to the rover. Since the Sojourner rover carried no instruments to measure this mission's onboard electrical charge, confirmatory measurements from future rover missions on Mars are desirable so that the physical and electrical properties of the Martian surface dust can be characterized. Sojourner was protected by discharge points, and Faraday cages were placed around sensitive electronics.
    But larger systems than Sojourner are being contemplated for missions to the Martian surface in the foreseeable future. The design of such systems will require a detailed knowledge of how they will interact with their environment. Validated environmental interaction models and guidelines for the Martian surface must be developed so that design engineers can test new ideas prior to cutting hardware. These models and guidelines cannot be validated without actual flight data. Electrical charging of vehicles and, one day, astronauts moving across the Martian surface
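
    The reported charge follows directly from the measured capacitance and the detected potential via Q = CV; a quick check:

        C = 80e-12   # measured wheel capacitance: ~80 pF
        V = 100.0    # detected potential: ~100 V relative to chamber ground
        Q = C * V
        print(Q)     # 8e-09 C, matching the quoted 8 x 10^-9 coulombs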

  15. Aortic emboli show surprising size dependent predilection for cerebral arteries: Results from computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Carr, Ian; Schwartz, Robert; Shadden, Shawn

    2012-11-01

    Cardiac emboli can have devastating consequences if they enter the cerebral circulation, and are the most common cause of embolic stroke. Little is known about the relationships of embolic origin/density/size to cerebral events, as these relationships are difficult to observe. To better understand stroke risk from cardiac and aortic emboli, we developed a computational model to track emboli from the heart to the brain. Patient-specific models of the human aorta and arteries to the brain were derived from CT angiography of 10 MHIF patients. Blood flow was modeled by the Navier-Stokes equations using pulsatile inflow at the aortic valve, and physiologic Windkessel models at the outlets. Particulate was injected at the aortic valve and tracked using modified Maxey-Riley equations with a wall collision model. Results demonstrate that aortic emboli that entered the cerebral circulation through the carotid or vertebral arteries were localized to specific locations of the proximal aorta. The percentage of released particles embolic to the brain markedly increased with particle size from 0 to ~1-1.5 mm in all patients. Larger particulate became less likely to traverse the cerebral vessels. These findings are consistent with sparse literature based on transesophageal echo measurements. This work was supported in part by the National Science Foundation, award number 1157041.
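
    The study integrates modified Maxey-Riley equations with a wall-collision model; the sketch below keeps only a Stokes-drag term to illustrate how inertial particles are stepped through a velocity field. The flow field, response time, and numbers are hypothetical.

        import numpy as np

        def track_particle(x0, v0, fluid_vel, tau, dt, steps):
            """Drag-only toy version of inertial particle tracking:
            dv/dt = (u(x) - v) / tau,  dx/dt = v,
            where tau is the particle response time (grows with particle size).
            """
            x, v = np.array(x0, float), np.array(v0, float)
            path = [x.copy()]
            for _ in range(steps):
                u = fluid_vel(x)                # local fluid velocity
                v = v + dt * (u - v) / tau      # relax toward the fluid velocity
                x = x + dt * v
                path.append(x.copy())
            return np.array(path)

        # Toy field: uniform flow plus a transverse shear component
        fluid = lambda x: np.array([1.0, 0.1 * x[0]])
        print(track_particle([0, 0], [0, 0], fluid, tau=0.05, dt=0.01, steps=100)[-1])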

  16. Animation shows promise in initiating timely cardiopulmonary resuscitation: results of a pilot study.

    PubMed

    Attin, Mina; Winslow, Katheryn; Smith, Tyler

    2014-04-01

    Delayed responses during cardiac arrest are common. Timely interventions during cardiac arrest have a direct impact on patient survival. Integration of technology in nursing education is crucial to enhance teaching effectiveness. The goal of this study was to investigate the effect of animation on nursing students' response time to cardiac arrest, including initiation of timely chest compression. Nursing students were randomized into experimental and control groups prior to practicing in a high-fidelity simulation laboratory. The experimental group was educated, by discussion and animation, about the importance of starting cardiopulmonary resuscitation upon recognizing an unresponsive patient. Afterward, a discussion session allowed students in the experimental group to gain more in-depth knowledge about the most recent changes in the cardiac resuscitation guidelines from the American Heart Association. A linear mixed model was run to investigate differences in time of response between the experimental and control groups while controlling for differences in those with additional degrees, prior code experience, and basic life support certification. The experimental group had a faster response time compared with the control group and initiated timely cardiopulmonary resuscitation upon recognition of deteriorating conditions (P < .0001). The results demonstrated the efficacy of combined teaching modalities for timely cardiopulmonary resuscitation. Providing opportunities for repetitious practice when a patient's condition is deteriorating is crucial for teaching safe practice. PMID:24473120
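
    As a hedged sketch of the analysis described (a linear mixed model of response time controlling for additional degrees, prior code experience, and basic life support certification), the statsmodels call might look like the following; the data file and column names are hypothetical.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical data: one row per student, with a session identifier
        df = pd.read_csv("cpr_response_times.csv")

        model = smf.mixedlm(
            "response_time ~ group + extra_degree + code_experience + bls_cert",
            data=df,
            groups=df["session"],   # random intercept per simulation session
        )
        print(model.fit().summary())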

  17. Quantifying geological processes on Mars-Results of the high resolution stereo camera (HRSC) on Mars express

    NASA Astrophysics Data System (ADS)

    Jaumann, R.; Tirsch, D.; Hauber, E.; Ansan, V.; Di Achille, G.; Erkeling, G.; Fueten, F.; Head, J.; Kleinhans, M. G.; Mangold, N.; Michael, G. G.; Neukum, G.; Pacifici, A.; Platz, T.; Pondrelli, M.; Raack, J.; Reiss, D.; Williams, D. A.; Adeli, S.; Baratoux, D.; de Villiers, G.; Foing, B.; Gupta, S.; Gwinner, K.; Hiesinger, H.; Hoffmann, H.; Deit, L. Le; Marinangeli, L.; Matz, K.-D.; Mertens, V.; Muller, J. P.; Pasckert, J. H.; Roatsch, T.; Rossi, A. P.; Scholten, F.; Sowe, M.; Voigt, J.; Warner, N.

    2015-07-01

    This review summarizes the use of High Resolution Stereo Camera (HRSC) data as an instrumental tool and its application in the analysis of geological processes and landforms on Mars during the last 10 years of operation. High-resolution digital elevation models on a local to regional scale are the unique strength of the HRSC instrument. The analysis of these data products enabled quantifying geological processes such as effusion rates of lava flows, tectonic deformation, discharge of water in channels, formation timescales of deltas, and geometry of sedimentary deposits, as well as estimating the age of geological units by crater size-frequency distribution measurements. Both the quantification of geological processes and the age determination allow constraining the evolution of Martian geologic activity in space and time. A second major contribution of HRSC is the discovery of episodicity in the intensity of geological processes on Mars. This has been revealed by comparative age dating of volcanic, fluvial, glacial, and lacustrine deposits. Volcanic processes on Mars have been active over more than 4 Gyr, with peak phases in all three geologic epochs, generally ceasing towards the Amazonian. Fluvial and lacustrine activity phases span the time from the Noachian until the Amazonian, but detailed studies show that they have been interrupted by multiple and long-lasting phases of quiescence. Glacial activity also shows discrete phases of enhanced intensity that may correlate with periods of increased spin-axis obliquity. The episodicity of geological processes like volcanism, erosion, and glaciation on Mars reflects a close correlation between surface processes and endogenic activity, as well as orbit variations and changing climate conditions.

  18. Quantifying microwear on experimental Mistassini quartzite scrapers: preliminary results of exploratory research using LSCM and scale-sensitive fractal analysis.

    PubMed

    Stemp, W James; Lerner, Harry J; Kristant, Elaine H

    2013-01-01

    Although previous use-wear studies involving quartz and quartzite have been undertaken by archaeologists, these are comparatively few in number. Moreover, there has been relatively little effort to quantify use-wear on stone tools made from quartzite. The purpose of this article is to determine the effectiveness of a measurement system, laser scanning confocal microscopy (LSCM), to document the surface roughness or texture of experimental Mistassini quartzite scrapers used on two different contact materials (fresh and dry deer hide). As in previous studies using LSCM on chert, flint, and obsidian, this exploratory study incorporates a mathematical algorithm that permits the discrimination of surface roughness based on comparisons at multiple scales. Specifically, we employ measures of relative area (RelA) coupled with the F-test to discriminate used from unused stone tool surfaces, as well as surfaces of quartzite scrapers used on dry and fresh deer hide. Our results further demonstrate the effect of raw material variation on use-wear formation and its documentation using LSCM and RelA. PMID:22688593

  19. News Note: Long-term Results from Study of Tamoxifen and Raloxifene Shows Lower Toxicities of Raloxifene

    Cancer.gov

    Initial results in 2006 of the NCI-sponsored Study of Tamoxifen and Raloxifene (STAR) showed that a common osteoporosis drug, raloxifene, prevented breast cancer to the same degree as, but with fewer serious side effects than, tamoxifen, a drug that had been in use for many years for breast cancer prevention as well as treatment. The longer-term results show that raloxifene retained 76 percent of the effectiveness of tamoxifen in preventing invasive disease and grew closer to tamoxifen in preventing noninvasive disease, while remaining far less toxic; in particular, there was significantly less endometrial cancer with raloxifene use.

  20. Quantifying dust input to the Subarctic North Pacific - Results from surface sediments and sea water thorium isotope measurements

    NASA Astrophysics Data System (ADS)

    Winckler, G.; Serno, S.; Hayes, C.; Anderson, R. F.; Gersonde, R.; Haug, G. H.

    2012-12-01

    The Subarctic North Pacific is one of the three primary high-nutrient, low-chlorophyll regions of the modern ocean, where the biological pump is relatively inefficient at transferring carbon from the atmosphere to the deep sea. The system is thought to be iron-limited. Aeolian dust is a significant source of iron and other nutrients that are essential for the health of marine ecosystems and potentially a controlling factor of the high-nutrient, low-chlorophyll status of the Subarctic North Pacific. However, constraining the size of the dust flux to the surface ocean remains difficult. Here we apply two different approaches, based on surface sediment and water column samples, respectively, obtained during the SO202/INOPEX research cruise to the Subarctic North Pacific in 2009. We map the spatial patterns of Th/U isotopes, helium isotopes, and rare earth elements across 37 multi-corer core-top sediments from the Subarctic North Pacific. In order to deconvolve the detrital endmembers in regions of the North Pacific affected by volcanic material, IRD, and hemipelagic input, we use a combination of trace elements with distinct characteristics in the different endmembers. This approach allows us to calculate the relative aeolian fraction and, in combination with Thorium-230-normalized mass flux data, to quantify the dust supply. Second, we present an innovative approach using paired Thorium-232 and Thorium-230 concentrations of upper-ocean seawater at 7 stations along the INOPEX track. Thorium-232 in the upper water column is dominantly derived from dissolution of aeolian dust, whereas Thorium-230 data provide a measure of the thorium removal from the surface waters and thus allow us to derive Thorium-232 fluxes. Combined with a mean Thorium-232 concentration in dust and an estimate of the thorium solubility, the Thorium-232 flux can be translated into a dust flux to the surface ocean. Dust flux estimates for the Subarctic North Pacific will be compared to results from the model simulations of Mahowald et al. (2006).
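
    The conversion in the last step amounts to dividing the dissolved Thorium-232 flux by the product of the Thorium-232 concentration in dust and its fractional solubility. A worked sketch with purely illustrative numbers:

        # Hypothetical values, for illustration only
        th232_flux    = 2.0e-6   # dissolved Th-232 flux (g m^-2 yr^-1)
        th232_in_dust = 10e-6    # mean Th-232 concentration in dust (g/g, ~10 ppm)
        solubility    = 0.2      # fraction of dust Th-232 that dissolves

        # Invert: th232_flux = dust_flux * th232_in_dust * solubility
        dust_flux = th232_flux / (th232_in_dust * solubility)
        print(f"dust flux ~ {dust_flux:.1f} g m^-2 yr^-1")   # ~1.0 here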

  1. Genetic correlations between field test results of Swedish Warmblood Riding Horses as 4-year-olds and lifetime performance results in dressage and show jumping

    Microsoft Academic Search

    Lena Wallin; Erling Strandberg; Jan Philipsson

    2003-01-01

    The main objective of this study was to estimate genetic correlations between traits of young sport horses (4 years old) evaluated in the Swedish Riding Horse Quality Test (RHQT) and later competition results in dressage and show jumping. The data comprised 3708 Warmblood horses born between 1968 and 1982 that had participated in the RHQT as 4-year-olds and 25,605 horses

  2. Development and application of methods to quantify spatial and temporal hyperpolarized 3He MRI ventilation dynamics: preliminary results in chronic obstructive pulmonary disease

    NASA Astrophysics Data System (ADS)

    Kirby, Miranda; Wheatley, Andrew; McCormack, David G.; Parraga, Grace

    2010-03-01

    Hyperpolarized helium-3 (3He) magnetic resonance imaging (MRI) has emerged as a non-invasive research method for quantifying lung structural and functional changes, enabling direct visualization in vivo at high spatial and temporal resolution. Here we describe the development of methods for quantifying ventilation dynamics in response to salbutamol in Chronic Obstructive Pulmonary Disease (COPD). A whole-body 3.0 Tesla Excite 12.0 MRI system was used to obtain multi-slice coronal images acquired immediately after subjects inhaled hyperpolarized 3He gas. Ventilated volume (VV), ventilation defect volume (VDV) and thoracic cavity volume (TCV) were recorded following segmentation of 3He and 1H images respectively, and used to calculate percent ventilated volume (PVV) and ventilation defect percent (VDP). Manual segmentation and Otsu thresholding were significantly correlated for VV (r=.82, p=.001), VDV (r=.87, p=.0002), PVV (r=.85, p=.0005), and VDP (r=.85, p=.0005). The level of agreement between these segmentation methods was also evaluated using Bland-Altman analysis, which showed that manual segmentation was consistently higher for VV (Mean=.22 L, SD=.05) and consistently lower for VDV (Mean=-.13, SD=.05) measurements than Otsu thresholding. To automate the quantification of newly ventilated pixels (NVp) post-bronchodilator, we used translation, rotation, and scaling transformations to register pre- and post-salbutamol images. There was a significant correlation between NVp and VDV (r=-.94, p=.005) and between percent newly ventilated pixels (PNVp) and VDP (r=-.89, p=.02), but not for VV or PVV. Evaluation of 3He MRI ventilation dynamics using Otsu thresholding and landmark-based image registration provides a way to regionally quantify functional changes in COPD subjects after treatment with beta-agonist bronchodilators, a common COPD and asthma therapy.
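
    Assuming PVV and VDP are the stated volume ratios expressed as percentages of the thoracic cavity volume (the abstract implies but does not spell out the formulas), the computation reduces to:

        def ventilation_metrics(vv_l, vdv_l, tcv_l):
            """PVV = 100 * VV / TCV and VDP = 100 * VDV / TCV (assumed
            definitions), with VV, VDV, TCV in litres from the segmented
            3He and 1H images."""
            return 100.0 * vv_l / tcv_l, 100.0 * vdv_l / tcv_l

        # Hypothetical volumes in litres
        pvv, vdp = ventilation_metrics(vv_l=4.1, vdv_l=0.9, tcv_l=5.0)
        print(f"PVV = {pvv:.0f}%, VDP = {vdp:.0f}%")   # PVV = 82%, VDP = 18%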

  3. Low-frequency ac electroporation shows strong frequency dependence and yields comparable transfection results to dc electroporation

    E-print Network

    Lu, Chang

    Low-frequency ac electroporation shows strong frequency dependence and yields comparable transfection results to dc electroporation in the frequency range of 10 kHz–1 MHz. Based on the Schwan equation, it was thought that with low ac frequencies

  4. QUANTIFYING SPICULES

    SciTech Connect

    Pereira, Tiago M. D. [NASA Ames Research Center, Moffett Field, CA 94035 (United States); De Pontieu, Bart [Lockheed Martin Solar and Astrophysics Laboratory, Org. A021S, Building 252, 3251 Hanover Street, Palo Alto, CA 94304 (United States); Carlsson, Mats, E-mail: tiago.pereira@nasa.gov [Institute of Theoretical Astrophysics, P.O. Box 1029, Blindern, NO-0315 Oslo (Norway)

    2012-11-01

    Understanding the dynamic solar chromosphere is fundamental in solar physics. Spicules are an important feature of the chromosphere, connecting the photosphere to the corona, potentially mediating the transfer of energy and mass. The aim of this work is to study the properties of spicules over different regions of the Sun. Our goal is to investigate if there is more than one type of spicule, and how spicules behave in the quiet Sun, coronal holes, and active regions. We make use of high cadence and high spatial resolution Ca II H observations taken by Hinode/Solar Optical Telescope. Making use of a semi-automated detection algorithm, we self-consistently track and measure the properties of 519 spicules over different regions. We find clear evidence of two types of spicules. Type I spicules show a rise and fall and have typical lifetimes of 150-400 s and maximum ascending velocities of 15-40 km s^-1, while type II spicules have shorter lifetimes of 50-150 s, faster velocities of 30-110 km s^-1, and are not seen to fall down, but rather fade at around their maximum length. Type II spicules are the most common, seen in the quiet Sun and coronal holes. Type I spicules are seen mostly in active regions. There are regional differences between quiet-Sun and coronal hole spicules, likely attributable to the different field configurations. The properties of type II spicules are consistent with published results of rapid blueshifted events (RBEs), supporting the hypothesis that RBEs are their disk counterparts. For type I spicules we find the relations between their properties to be consistent with a magnetoacoustic shock wave driver, and with dynamic fibrils as their disk counterpart. The driver of type II spicules remains unclear from limb observations.

  5. A collaborative accountable care model in three practices showed promising early results on costs and quality of care.

    PubMed

    Salmon, Richard B; Sanderson, Mark I; Walters, Barbara A; Kennedy, Karen; Flores, Robert C; Muney, Alan M

    2012-11-01

    Cigna's Collaborative Accountable Care initiative provides financial incentives to physician groups and integrated delivery systems to improve the quality and efficiency of care for patients in commercial open-access benefit plans. Registered nurses who serve as care coordinators employed by participating practices are a central feature of the initiative. They use patient-specific reports and practice performance reports provided by Cigna to improve care coordination, identify and close care gaps, and address other opportunities for quality improvement. We report interim quality and cost results for three geographically and structurally diverse provider practices in Arizona, New Hampshire, and Texas. Although not statistically significant, these early results revealed favorable trends in total medical costs and quality of care, suggesting that a shared-savings accountable care model and collaborative support from the payer can enable practices to take meaningful steps toward full accountability for care quality and efficiency. PMID:23129667

  6. QUANTIFYING FOREST ABOVEGROUND CARBON POOLS AND FLUXES USING MULTI-TEMPORAL LIDAR: A report on field monitoring, remote sensing MMV, GIS integration, and modeling results for forestry field validation test to quantify aboveground tree biomass and carbon

    SciTech Connect

    Lee Spangler; Lee A. Vierling; Eva K. Stand; Andrew T. Hudak; Jan U.H. Eitel; Sebastian Martinuzzi

    2012-04-01

    Sound policy recommendations relating to the role of forest management in mitigating atmospheric carbon dioxide (CO2) depend upon establishing accurate methodologies for quantifying forest carbon pools for large tracts of land that can be dynamically updated over time. Light Detection and Ranging (LiDAR) remote sensing is a promising technology for achieving accurate estimates of aboveground biomass and thereby carbon pools; however, not much is known about the accuracy of estimating biomass change and carbon flux from repeat LiDAR acquisitions containing different data sampling characteristics. In this study, discrete return airborne LiDAR data was collected in 2003 and 2009 across ~20,000 hectares (ha) of an actively managed, mixed conifer forest landscape in northern Idaho, USA. Forest inventory plots, located via a random stratified sampling design, were established and sampled in 2003 and 2009. The Random Forest machine learning algorithm was used to establish statistical relationships between inventory data and forest structural metrics derived from the LiDAR acquisitions. Aboveground biomass maps were created for the study area based on statistical relationships developed at the plot level. Over this 6-year period, we found that the mean increase in biomass due to forest growth across the non-harvested portions of the study area was 4.8 metric tons per hectare (Mg/ha). In these non-harvested areas, we found a significant difference in biomass increase among forest successional stages, with a higher biomass increase in mature and old forest compared to stand initiation and young forest. Approximately 20% of the landscape had been disturbed by harvest activities during the six-year time period, representing a biomass loss of >70 Mg/ha in these areas. During the study period, these harvest activities outweighed growth at the landscape scale, resulting in an overall loss in aboveground carbon at this site. The 30-fold increase in sampling density between the 2003 and 2009 acquisitions did not affect the biomass estimates. Overall, LiDAR data coupled with field reference data offer a powerful method for calculating pools and changes in aboveground carbon in forested systems. The results of our study suggest that multitemporal LiDAR-based approaches are likely to be useful for high quality estimates of aboveground carbon change in conifer forest systems.
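
    A minimal sketch of the plot-level workflow described (Random Forest regression from LiDAR structural metrics to field-measured biomass, then wall-to-wall prediction), using scikit-learn as a stand-in; the data are synthetic and the metric names hypothetical.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)

        # Synthetic training data: LiDAR metrics per inventory plot (height
        # percentiles, canopy cover, ...) vs field-measured biomass (Mg/ha)
        X = rng.normal(size=(120, 6))
        y = 80 + 15 * X[:, 0] + 5 * X[:, 1] + rng.normal(scale=10, size=120)

        rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
        rf.fit(X, y)
        print(f"OOB R^2: {rf.oob_score_:.2f}")

        # Apply the fitted model to gridded metrics for each acquisition year;
        # differencing the 2003 and 2009 maps gives the biomass change.
        biomass_map = rf.predict(rng.normal(size=(1000, 6)))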

  7. Presentation Showing Results of a Hydrogeochemical Investigation of the Standard Mine Vicinity, Upper Elk Creek Basin, Colorado

    USGS Publications Warehouse

    Manning, Andrew H.; Verplanck, Philip L.; Mast, M. Alisa; Wanty, Richard B.

    2008-01-01

    PREFACE This Open-File Report consists of a presentation given in Crested Butte, Colorado on December 13, 2007 to the Standard Mine Advisory Group. The presentation was paired with another presentation given by the Colorado Division of Reclamation, Mining, and Safety on the physical features and geology of the Standard Mine. The presentation in this Open-File Report summarizes the results and conclusions of a hydrogeochemical investigation of the Standard Mine performed by the U.S. Geological Survey (Manning and others, in press). The purpose of the investigation was to aid the U.S. Environmental Protection Agency in evaluating remediation options for the Standard Mine site. Additional details and supporting data related to the information in this presentation can be found in Manning and others (in press).

  8. Uncertainty quantified trait predictions

    NASA Astrophysics Data System (ADS)

    Fazayeli, Farideh; Kattge, Jens; Banerjee, Arindam; Schrodt, Franziska; Reich, Peter

    2015-04-01

    Functional traits of organisms are key to understanding and predicting biodiversity and ecological change, which motivates continuous collection of traits and their integration into global databases. Such composite trait matrices are inherently sparse, severely limiting their usefulness for further analyses. On the other hand, traits are characterized by the phylogenetic trait signal, trait-trait correlations, and environmental constraints, all of which provide information that could be used to statistically fill gaps. We propose the application of probabilistic models which, for the first time, utilize all three characteristics to fill gaps in trait databases and predict trait values at larger spatial scales. For this purpose we introduce BHPMF, a hierarchical Bayesian extension of Probabilistic Matrix Factorization (PMF). PMF is a machine learning technique which exploits the correlation structure of sparse matrices to impute missing entries. BHPMF additionally utilizes the taxonomic hierarchy for trait prediction. Implemented in the context of a Gibbs sampler MCMC approach, BHPMF provides uncertainty estimates for each trait prediction. We present comprehensive experimental results on the problem of plant trait prediction using the largest database of plant traits, where BHPMF shows strong empirical performance in uncertainty quantified trait prediction, outperforming the state-of-the-art based on point estimates. Further, we show that BHPMF is more accurate when it is confident, whereas the error is high when the uncertainty is high.
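
    BHPMF layers a taxonomic hierarchy and Gibbs-sampled uncertainties on top of plain PMF; the sketch below shows only the PMF core, fitting low-rank factors to the observed entries of a sparse trait matrix by stochastic gradient descent. It is a toy illustration, not the authors' implementation.

        import numpy as np

        def pmf_fill(T, rank=2, lr=0.02, reg=0.1, epochs=3000, seed=0):
            """Fill np.nan gaps in a (species x traits) matrix T with plain PMF."""
            rng = np.random.default_rng(seed)
            n, m = T.shape
            U = 0.1 * rng.normal(size=(n, rank))   # species factors
            V = 0.1 * rng.normal(size=(m, rank))   # trait factors
            rows, cols = np.where(~np.isnan(T))
            for _ in range(epochs):
                for i, j in zip(rows, cols):       # SGD on observed entries only
                    err = T[i, j] - U[i] @ V[j]
                    ui = U[i].copy()
                    U[i] += lr * (err * V[j] - reg * U[i])
                    V[j] += lr * (err * ui - reg * V[j])
            return U @ V.T                          # dense reconstruction fills gaps

        T = np.array([[1.0, 2.0, np.nan],
                      [1.1, np.nan, 3.0],
                      [np.nan, 2.1, 2.9]])
        print(pmf_fill(T).round(2))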

  9. Quantifying Transmission Reliability Margin

    Microsoft Academic Search

    Jianfeng Zhang; Ian Dobson; Fernando L. Alvarado

    2002-01-01

    In bulk electric power transfer capability computations, the transmission reliability margin accounts for uncertainties related to the transmission system conditions, contingencies, and parameter values. We propose a formula which quantifies transmission reliability margin based on transfer capability sensitivities and a probabilistic characterization of the various uncertainties. The formula is verified by comparison with results from two systems small
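
    The snippet does not reproduce the formula itself; the sketch below implements the standard first-order propagation such a formula describes, combining transfer-capability sensitivities with parameter standard deviations and a confidence factor. Treat it as a hedged reconstruction, with made-up numbers.

        import numpy as np

        def transmission_reliability_margin(sensitivities, sigmas, k=1.65):
            """First-order TRM estimate: k * sqrt(sum_i (dU/dp_i * sigma_i)^2),
            where U is the transfer capability, p_i the uncertain parameters,
            and k a one-sided confidence factor (~1.65 for 95%)."""
            s = np.asarray(sensitivities, float)
            sig = np.asarray(sigmas, float)
            return k * np.sqrt(np.sum((s * sig) ** 2))

        # Three hypothetical uncertain parameters (sensitivities in MW per unit)
        print(transmission_reliability_margin([12.0, -8.0, 3.5], [0.1, 0.2, 0.5]))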

  10. "The Show"

    ERIC Educational Resources Information Center

    Gehring, John

    2004-01-01

    For the past 16 years, the blue-collar city of Huntington, West Virginia, has rolled out the red carpet to welcome young wrestlers and their families as old friends. They have come to town chasing the same dream for a spot in what many of them call "The Show". For three days, under the lights of an arena packed with 5,000 fans, the state's best…

  11. Quantifying robustness of biochemical network models

    PubMed Central

    2002-01-01

    Background Robustness of mathematical models of biochemical networks is important for validation purposes and can be used as a means of selecting between different competing models. Tools for quantifying parametric robustness are needed. Results Two techniques for describing quantitatively the robustness of an oscillatory model were presented and contrasted. Single-parameter bifurcation analysis was used to evaluate the stability robustness of the limit cycle oscillation as well as the frequency and amplitude of oscillations. A tool from control engineering, the structural singular value (SSV), was used to quantify robust stability of the limit cycle. Using SSV analysis, we find very poor robustness when the model's parameters are allowed to vary. Conclusion The results show the usefulness of incorporating SSV analysis alongside single-parameter sensitivity analysis to quantify robustness. PMID:12482327

  12. Sci Show

    NSDL National Science Digital Library

    The Sci Show, an entertaining series of quirky YouTube videos, tackles topics ranging from "How Do Polarized Sunglasses Work" to "Strong Interaction: The Four Fundamental Forces of Physics." Most episodes are less than five minutes long, but they pack a wallop of handy science info. Anyone short on time but long on big questions will benefit from the series. Episodes will be helpful to teachers and parents looking to spark enthusiasm in young minds. Viewers may want to start with recent episodes like "Today's Mass Extinction," "World's First See-Through Animal," and "How Do Animals Change Color?" before digging into the archives for gems like "The Truth About Gingers" and "The Science of Lying."

  13. Quantifiable Lateral Flow Assay Test Strips

    NASA Technical Reports Server (NTRS)

    2003-01-01

    As easy to read as a home pregnancy test, three Quantifiable Lateral Flow Assay (QLFA) strips used to test water for E. coli show different results. The brightly glowing control line on the far right of each strip indicates that all three tests ran successfully. But the glowing test lines on the middle and bottom strips reveal that their samples were contaminated with E. coli bacteria at two different concentrations. The color intensity correlates with concentration of contamination.

  14. Simple instruments used in monitoring ionospheric perturbations and some observational results showing the ionospheric responses to the perturbations mainly from the lower atmosphere

    NASA Astrophysics Data System (ADS)

    Xiao, Zuo; Hao, Yongqiang; Zhang, Donghe; Xiao, Sai-Guan; Huang, Weiquan

    Ionospheric disturbances such as SIDs and acoustic gravity waves on different scales are well known and commonly discussed topics. Some simple ground equipment was designed and used to continuously monitor the effects of these disturbances, especially SWF and SFD. Besides SIDs, the records also clearly reflect acoustic gravity waves on different scales and Spread-F, and these data are an important supplement to traditional ionosonde records. This is of significance for understanding the physical essentials of ionospheric disturbances and for applications in SID warning. In this paper, the design of the instruments is given and results are discussed in detail. Some case studies are introduced as examples, showing very clearly not only the immediate effects of solar flares, but also ionospheric responses to large-scale gravity waves from the lower atmosphere, such as those from typhoons, great earthquakes, and volcano eruptions. In particular, results showed that acoustic gravity waves play a significant role in seeding ionospheric Spread-F. These examples give evidence that lower atmospheric activities strongly influence the ionosphere.

  15. Visualizing and quantifying the suppressive effects of glucocorticoids on the tadpole immune system in vivo

    NSDL National Science Digital Library

    Alexander Schreiber (St. Lawrence University)

    2011-12-01

    This article presents a laboratory module developed to show students how glucocorticoid receptor activity can be pharmacologically modulated in Xenopus laevis tadpoles and the resulting effects on thymus gland size visualized and quantified in vivo.

  16. Quantifying Proportional Variability

    PubMed Central

    Heath, Joel P.; Borowski, Peter

    2013-01-01

    Real quantities can undergo such a wide variety of dynamics that the mean is often a meaningless reference point for measuring variability. Despite their widespread application, techniques like the Coefficient of Variation are not truly proportional and exhibit pathological properties. The non-parametric measure Proportional Variability (PV) [1] resolves these issues and provides a robust way to summarize and compare variation in quantities exhibiting diverse dynamical behaviour. Instead of being based on deviation from an average value, variation is simply quantified by comparing the numbers to each other, requiring no assumptions about central tendency or underlying statistical distributions. While PV has been introduced before and has already been applied in various contexts to population dynamics, here we present a deeper analysis of this new measure, derive analytical expressions for the PV of several general distributions and present new comparisons with the Coefficient of Variation, demonstrating cases in which PV is the more favorable measure. We show that PV provides an easily interpretable approach for measuring and comparing variation that can be generally applied throughout the sciences, from contexts ranging from stock market stability to climate variation. PMID:24386334
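
    As a sketch of the measure (following the published definition as I read it: the mean, over all pairs of values, of 1 - min/max), PV can be computed and contrasted with the Coefficient of Variation in a few lines; positive-valued data are assumed.

        import numpy as np
        from itertools import combinations

        def proportional_variability(z):
            """PV: mean over all pairs of 1 - min/max; no reference to the mean
            or to any distributional assumption, unlike the CV."""
            d = [1.0 - min(a, b) / max(a, b) for a, b in combinations(z, 2)]
            return float(np.mean(d))

        x = np.array([2.0, 4.0, 8.0, 4.0, 2.0])
        cv = x.std(ddof=1) / x.mean()              # Coefficient of Variation
        print(f"PV = {proportional_variability(x):.3f}, CV = {cv:.3f}")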

  17. Quantifying proportional variability.

    PubMed

    Heath, Joel P; Borowski, Peter

    2013-01-01

    Real quantities can undergo such a wide variety of dynamics that the mean is often a meaningless reference point for measuring variability. Despite their widespread application, techniques like the Coefficient of Variation are not truly proportional and exhibit pathological properties. The non-parametric measure Proportional Variability (PV) [1] resolves these issues and provides a robust way to summarize and compare variation in quantities exhibiting diverse dynamical behaviour. Instead of being based on deviation from an average value, variation is simply quantified by comparing the numbers to each other, requiring no assumptions about central tendency or underlying statistical distributions. While PV has been introduced before and has already been applied in various contexts to population dynamics, here we present a deeper analysis of this new measure, derive analytical expressions for the PV of several general distributions and present new comparisons with the Coefficient of Variation, demonstrating cases in which PV is the more favorable measure. We show that PV provides an easily interpretable approach for measuring and comparing variation that can be generally applied throughout the sciences, from contexts ranging from stock market stability to climate variation. PMID:24386334

  18. Kinemetry: quantifying kinematic maps

    E-print Network

    Y. Copin; R. Bacon; M. Bureau; R. L. Davies; E. Emsellem; H. Kuntschner; B. Miller; R. Peletier; E. K. Verolme; P. T. de Zeeuw

    2001-09-06

    We describe a new technique, kinemetry, to quantify kinematic maps of early-type galaxies in an efficient way. We present the first applications to velocity fields obtained with the integral-field spectrograph SAURON.

  19. Wireless quantified reflex device

    NASA Astrophysics Data System (ADS)

    Lemoyne, Robert Charles

    The deep tendon reflex is a fundamental aspect of a neurological examination. The two major parameters of the tendon reflex are response and latency, which are presently evaluated qualitatively during a neurological examination. The reflex loop is capable of providing insight into the status and therapy response of both upper and lower motor neuron syndromes. Attempts have been made to ascertain reflex response and latency; however, these systems are relatively complex and resource intensive, with issues of consistent and reliable accuracy. The solution presented is a wireless quantified reflex device using tandem three-dimensional wireless accelerometers to obtain response, based on acceleration waveform amplitude, and latency, derived from temporal acceleration waveform disparity. Three specific aims have been established for the proposed wireless quantified reflex device: 1. Demonstrate the wireless quantified reflex device is reliably capable of ascertaining quantified reflex response and latency using a quantified input. 2. Evaluate the precision of the device using an artificial reflex system. 3. Conduct a longitudinal study of subjects with healthy patellar tendon reflexes, using the wireless quantified reflex evaluation device to obtain quantified reflex response and latency. Aim 1 has led to the steady evolution of the wireless quantified reflex device from a single two-dimensional wireless accelerometer capable of measuring reflex response to a tandem three-dimensional wireless accelerometer system capable of reliably measuring reflex response and latency. The hypothesis for aim 1 is that a reflex quantification device can be established for reliably measuring reflex response and latency for the patellar tendon reflex, comprised of an integrated system of wireless three-dimensional MEMS accelerometers. Aim 2 further emphasized the reliability of the wireless quantified reflex device by evaluating an artificial reflex system. The hypothesis for aim 2 is that the wireless quantified reflex device can obtain reliable reflex parameters (response and latency) from an artificial reflex device. Aim 3 synthesizes the findings relevant to aims 1 and 2, while applying the wireless accelerometer reflex quantification device to a longitudinal study of healthy patellar tendon reflexes. The hypothesis for aim 3 is that during a longitudinal evaluation of the deep tendon reflex the parameters for reflex response and latency can be measured with a considerable degree of accuracy, reliability, and reproducibility. Enclosed is a detailed description of a wireless quantified reflex device with research findings and potential utility of the system, inclusive of a comprehensive description of tendon reflexes, prior reflex quantification systems, and correlated applications.
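
    One plausible realization of the two parameters (response from waveform amplitude, latency from the temporal disparity between the tandem accelerometer signals, here located by cross-correlation) is sketched below with synthetic waveforms; the actual device's algorithm may differ.

        import numpy as np

        def reflex_parameters(input_acc, response_acc, fs):
            """Response = peak magnitude of the response waveform; latency =
            lag (s) maximizing the cross-correlation between the input (tap)
            and response (leg swing) accelerometer signals."""
            response = np.max(np.abs(response_acc))
            xcorr = np.correlate(response_acc, input_acc, mode="full")
            lag = np.argmax(xcorr) - (len(input_acc) - 1)
            return response, lag / fs

        fs = 1000.0                                      # 1 kHz sampling (hypothetical)
        t = np.arange(0.0, 0.5, 1.0 / fs)
        tap = np.exp(-((t - 0.05) / 0.005) ** 2)         # synthetic hammer strike
        leg = 0.8 * np.exp(-((t - 0.12) / 0.02) ** 2)    # synthetic delayed response
        resp, lat = reflex_parameters(tap, leg, fs)
        print(f"response = {resp:.2f}, latency = {lat * 1000:.0f} ms")  # ~70 ms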

  20. Modeling upward brine migration through faults as a result of CO2 storage in the Northeast German Basin shows negligible salinization in shallow aquifers

    NASA Astrophysics Data System (ADS)

    Kuehn, M.; Tillner, E.; Kempka, T.; Nakaten, B.

    2012-12-01

    The geological storage of CO2 in deep saline formations may cause salinization of shallower freshwater resources by upward flow of displaced brine from the storage formation into potable groundwater. In this regard, permeable faults or fractures can serve as potential leakage pathways for upward brine migration. The present study uses a regional-scale 3D model based on real structural data of a prospective CO2 storage site in Northeastern Germany to determine the impact of compartmentalization and fault permeability on upward brine migration as a result of pressure elevation by CO2 injection. To evaluate the degree of salinization in the shallower aquifers, different fault leakage scenarios were carried out using a newly developed workflow in which the model grid from the software package Petrel applied for pre-processing is transferred to the reservoir simulator TOUGH2-MP/ECO2N. A discrete fault description is achieved by using virtual elements. A static 3D geological model of the CO2 storage site with an areal size of 40 km x 40 km and a thickness of 766 m was implemented. Subsequently, large-scale numerical multi-phase multi-component (CO2, NaCl, H2O) flow simulations were carried out on a high performance computing system. The prospective storage site, located in the Northeast German Basin, is part of an anticline structure characterized by a saline multi-layer aquifer system. The NE and SW boundaries of the study area are confined by the Fuerstenwalde Gubener and the Lausitzer Abbruch fault zones, represented by four discrete faults in the model. Two formations of the Middle Bunter were chosen to assess brine migration through faults triggered by an annual injection rate of 1.7 Mt CO2 into the lowermost formation over a time span of 20 years. In addition to varying fault permeabilities, different boundary conditions were applied to evaluate the effects of reservoir compartmentalization. Simulation results show that the highest pressurization within the storage formation, with a relative pressure increase of up to 150% after 20 years of injection, is caused by strong compartmentalization effects if closed boundaries and closed faults are assumed. The CO2 plume is considerably smaller compared to those that develop when laterally open boundaries are applied. Laterally open boundaries and highly permeable faults lead to the strongest pressure dissipation and cause the CO2 plume to come up almost 3 km closer to the fault. Closed model boundaries in the lower aquifers and four highly permeable faults (> 1,000 mD) lead to the highest salinities in the uppermost Stuttgart formation, with an average salinity increase of 0.24% (407 mg/l) after 20 years of injection. Smaller salinity changes in the uppermost aquifers are observed with closed boundaries in the lower aquifers and only one major fault open for brine flow. Here, fault permeability unexpectedly does not significantly influence salinization in the uppermost Stuttgart formation: salinity increases by 0.04% (75 mg/l) for a fault permeability of 1,000 mD and by at least 0.06% (96 mg/l) for a fault permeability of 10,000 mD by the end of injection. Taking the modeling results into account, shallow aquifer salinization is not expected to be of concern for the investigated study area in the Northeastern German Basin.

  1. timid, or easily manipulated. This is not compassion. A marine drill sergeant may be demanding and results-driven, but can show compassion

    E-print Network

    Kim, Duck O.

    drug/alcohol-related crisis resulted or someone got killed? not rely upon the opinion of your employee"). The result is cover-up and protection of the drug user. Most people understand enabling as protecting am sure there are employees in our work organization using illicit substances. I understand

  2. Storytelling Slide Shows to Improve Diabetes and High Blood Pressure Knowledge and Self-Efficacy: Three-Year Results among Community Dwelling Older African Americans

    ERIC Educational Resources Information Center

    Bertera, Elizabeth M.

    2014-01-01

    This study combined the African American tradition of oral storytelling with the Hispanic medium of "Fotonovelas." A staggered pretest posttest control group design was used to evaluate four Storytelling Slide Shows on health that featured community members. A total of 212 participants were recruited for the intervention and 217 for the…

  3. Covariation and quantifier polarity: What determines causal attribution in vignettes?

    Microsoft Academic Search

    Asifa Majid; Anthony J. Sanford; Martin J. Pickering

    2006-01-01

    Tests of causal attribution often use verbal vignettes, with covariation information provided through statements quantified with natural language expressions. The effect of covariation information has typically been taken to show that set size information affects attribution. However, recent research shows that quantifiers provide information about discourse focus as well as covariation information. In the attribution literature, quantifiers are used to

  4. The Relevance of External Quality Assessment for Molecular Testing for ALK Positive Non-Small Cell Lung Cancer: Results from Two Pilot Rounds Show Room for Optimization

    PubMed Central

    Tembuyser, Lien; Tack, Véronique; Zwaenepoel, Karen; Pauwels, Patrick; Miller, Keith; Bubendorf, Lukas; Kerr, Keith; Schuuring, Ed; Thunnissen, Erik; Dequeker, Elisabeth M. C.

    2014-01-01

    Background and Purpose Molecular profiling should be performed on all advanced non-small cell lung cancer with non-squamous histology to allow treatment selection. Currently, this should include EGFR mutation testing and testing for ALK rearrangements. ROS1 is another emerging target. ALK rearrangement status is a critical biomarker to predict response to tyrosine kinase inhibitors such as crizotinib. To promote high quality testing in non-small cell lung cancer, the European Society of Pathology has introduced an external quality assessment scheme. This article summarizes the results of the first two pilot rounds organized in 2012–2013. Materials and Methods Tissue microarray slides consisting of cell-lines and resection specimens were distributed with the request for routine ALK testing using IHC or FISH. Participation in ALK FISH testing included the interpretation of four digital FISH images. Results Data from 173 different laboratories was obtained. Results demonstrate decreased error rates in the second round for both ALK FISH and ALK IHC, although the error rates were still high and the need for external quality assessment in laboratories performing ALK testing is evident. Error rates obtained by FISH were lower than by IHC. The lowest error rates were observed for the interpretation of digital FISH images. Conclusion There was a large variety in FISH enumeration practices. Based on the results from this study, recommendations for the methodology, analysis, interpretation and result reporting were issued. External quality assessment is a crucial element to improve the quality of molecular testing. PMID:25386659

  5. Map Showing Earthquake Shaking and Tsunami Hazard in Guadeloupe and Dominica, as a Result of an M8.0 Earthquake on the Lesser Antilles Megathrust

    USGS Multimedia Gallery

    Earthquake shaking (onland) and tsunami (ocean) hazard in Guadeloupe and Dominica, as a result of an M8.0 earthquake on the Lesser Antilles megathrust adjacent to Guadeloupe. Colors on land represent scenario earthquake shaking intensities calculated in USGS ShakeMap software (Wald et al. 20...

  6. Quantifying and Reducing the Uncertainties in Future Projections of Droughts and Heat Waves for North America that Result from the Diversity of Models in CMIP5

    NASA Astrophysics Data System (ADS)

    Herrera-Estrada, J. E.; Sheffield, J.

    2014-12-01

    There are many sources of uncertainty regarding the future projections of our climate, including the multiple possible Representative Concentration Pathways (RCPs), the variety of climate models used, and the initial and boundary conditions with which they are run. Moreover, it has been shown that the internal variability of the climate system can sometimes be of the same order of magnitude as the climate change signal or even larger for some variables. Nonetheless, in order to help inform stakeholders in water resources and agriculture in North America when developing adaptation strategies, particularly for extreme events such as droughts and heat waves, it is necessary to study the plausible range of changes that the region might experience during the 21st century. We aim to understand and reduce the uncertainties associated with this range of possible scenarios by focusing on the diversity of climate models involved in the Coupled Model Intercomparison Project Phase 5 (CMIP5). Data output from various CMIP5 models is compared against near surface climate and land-surface hydrological data from the North American Land Data Assimilation System (NLDAS)-2 to evaluate how well each climate model represents the land-surface processes associated with droughts and heat waves during the overlapping historical period (1979-2005). These processes include the representation of precipitation and radiation and their partitioning at the land surface, land-atmosphere interactions, and the propagation of signals of these extreme events through the land surface. The ability of the CMIP5 models to reproduce these important physical processes for regions of North America is used to inform a multi-model ensemble in which models that represent the processes relevant to droughts and heat waves better are given more importance. Furthermore, the future projections are clustered to identify possible dependencies in behavior across models. The results indicate a wide range in performance for the historical runs with some models hampered by poor interannual variability in summer precipitation and near surface air temperature, whilst others partition too much precipitation into evapotranspiration with implications for drought and heat wave development.
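
    The performance-based weighting step described above can be illustrated with a minimal sketch: models with lower historical RMSE against an NLDAS-2-style reference receive larger weights in the ensemble projection. All numbers, the inverse-square weighting rule, and the variable names below are assumptions for illustration, not values or methods from the study.

        import numpy as np

        # Hypothetical per-model RMSE against an NLDAS-2-style reference for the
        # historical period, and each model's projected change in some drought
        # metric. All numbers are invented for illustration.
        rmse = np.array([0.8, 1.1, 2.5, 1.4, 0.9])
        projection = np.array([-12.0, -8.0, 3.0, -5.0, -10.0])   # % change

        # Skill-based weights: models that reproduce the reference better
        # (lower RMSE) count for more. Inverse-square weighting is one choice.
        w = 1.0 / rmse**2
        w /= w.sum()

        print(f"weighted mean: {np.sum(w * projection):+.1f}%")
        print(f"equal weights: {projection.mean():+.1f}%")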

  7. Quantifying the Uruguay Round

    Microsoft Academic Search

    Thomas F. Rutherford; David G. Tarr

    1997-01-01

    The effects of the Uruguay Round are quantified using a numerical general equilibrium model which incorporates increasing returns to scale, twenty-four regions, twenty-two commodities, and steady state growth effects. The authors conclude that the aggregate welfare gains from the Round are in the order of $96 billion per year in the short run, but could be as high as $171

  8. Quantifying the Wave Driving of the Stratosphere

    NASA Technical Reports Server (NTRS)

    Newman, Paul A.; Nash, Eric R.

    1999-01-01

    The zonal mean eddy heat flux is directly proportional to the wave activity that propagates from the troposphere into the stratosphere. This quantity is a simple eddy diagnostic which is easily calculated from conventional meteorological analyses. Because this "wave driving" of the stratosphere has a strong impact on the stratospheric temperature, it is necessary to compare the impact of the flux with respect to stratospheric radiative changes caused by greenhouse gas changes. Hence, we must understand the precision and accuracy of the heat flux derived from our global meteorological analyses. Herein, we quantify the stratospheric heat flux using five different meteorological analyses, and show that there are 30% differences between these analyses during the disturbed conditions of the northern hemisphere winter. Such large differences result from the planetary differences in the stationary temperature and meridional wind fields. In contrast, planetary transient waves show excellent agreement amongst these five analyses, and this transient heat flux appears to have a long term downward trend.
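
    The diagnostic itself is simple to compute from gridded analyses. Below is a minimal NumPy sketch of the zonal-mean eddy heat flux, i.e. the zonal average of v*T*, where the starred quantities are deviations from the zonal mean; the array shapes and synthetic fields are illustrative assumptions.

        import numpy as np

        def zonal_mean_eddy_heat_flux(v, T):
            """Zonal-mean eddy heat flux [v*T*] on a (time, lat, lon) grid.

            v: meridional wind (m/s); T: temperature (K). The eddy (starred)
            quantities are deviations from the zonal mean; the returned array
            (time, lat) is the zonal average of their product.
            """
            v_star = v - v.mean(axis=-1, keepdims=True)
            T_star = T - T.mean(axis=-1, keepdims=True)
            return (v_star * T_star).mean(axis=-1)

        # Synthetic fields: 10 time steps, 73 latitudes, 144 longitudes
        rng = np.random.default_rng(0)
        v = rng.normal(size=(10, 73, 144))
        T = 250.0 + rng.normal(size=(10, 73, 144))
        print(zonal_mean_eddy_heat_flux(v, T).shape)   # (10, 73)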

  9. Value of Fused 18F-Choline-PET/MRI to Evaluate Prostate Cancer Relapse in Patients Showing Biochemical Recurrence after EBRT: Preliminary Results

    PubMed Central

    Piccardo, Arnoldo; Paparo, Francesco; Picazzo, Riccardo; Naseri, Mehrdad; Ricci, Paolo; Marziano, Andrea; Bacigalupo, Lorenzo; Biscaldi, Ennio; Rollandi, Gian Andrea; Grillo-Ruggieri, Filippo; Farsad, Mohsen

    2014-01-01

    Purpose. We compared the accuracy of 18F-Choline-PET/MRI with that of multiparametric MRI (mMRI), 18F-Choline-PET/CT, 18F-Fluoride-PET/CT, and contrast-enhanced CT (CeCT) in detecting relapse in patients with suspected relapse of prostate cancer (PC) after external beam radiotherapy (EBRT). We assessed the association between standard uptake value (SUV) and apparent diffusion coefficient (ADC). Methods. We evaluated 21 patients with biochemical relapse after EBRT. Patients underwent 18F-Choline-PET/contrast-enhanced (Ce)CT, 18F-Fluoride-PET/CT, and mMRI. Imaging coregistration of PET and mMRI was performed. Results. 18F-Choline-PET/MRI was positive in 18/21 patients, with a detection rate (DR) of 86%. DRs of 18F-Choline-PET/CT, CeCT, and mMRI were 76%, 43%, and 81%, respectively. In terms of DR the only significant difference was between 18F-Choline-PET/MRI and CeCT. On lesion-based analysis, the accuracy of 18F-Choline-PET/MRI, 18F-Choline-PET/CT, CeCT, and mMRI was 99%, 95%, 70%, and 85%, respectively. Accuracy, sensitivity, and NPV of 18F-Choline-PET/MRI were significantly higher than those of both mMRI and CeCT. On whole-body assessment of bone metastases, the sensitivity of 18F-Choline-PET/CT and 18F-Fluoride-PET/CT was significantly higher than that of CeCT. Regarding local and lymph node relapse, we found a significant inverse correlation between ADC and SUV-max. Conclusion. 18F-Choline-PET/MRI is a promising technique in detecting PC relapse. PMID:24877053

  10. A new index quantifying the precipitation extremes

    NASA Astrophysics Data System (ADS)

    Busuioc, Aristita; Baciu, Madalina; Stoica, Cerasela

    2015-04-01

    Events of extreme precipitation have a great impact on society. They are associated with flooding, erosion and landslides. Various indices have been proposed to quantify these extreme events, and they are mainly related to daily precipitation amounts, which are usually available for long periods in many places over the world. The climate signal related to changes in the characteristics of precipitation extremes differs over various regions and depends on the season and the index used to quantify the precipitation extremes. Climate model simulations and empirical evidence suggest that warmer climates, due to increased water vapour, lead to more intense precipitation events, even when the total annual precipitation is slightly reduced. It has been suggested that there is a shift in the nature of precipitation events towards more intense and less frequent rains, and that increases in heavy rains are expected to occur in most places, even when the mean precipitation is not increasing. This conclusion was also proved for the Romanian territory in a recent study, which showed a significant increasing trend in rain shower frequency in the warm season over the entire country, despite no significant changes in the seasonal amount and the daily extremes. The shower events counted in that paper refer to all convective rains, including torrential ones giving high rainfall amounts in a very short time. The problem is to find an appropriate index to quantify such events in terms of their highest intensity in order to extract the maximum climate signal. In the present paper, a new index is proposed to quantify the maximum precipitation intensity in an extreme precipitation event, which can be directly related to the torrential rain intensity. This index is tested at nine Romanian stations (representing various physical-geographical conditions) and is based on the continuous rainfall records derived from the graphical registrations (pluviograms) available at the National Meteorological Administration in Romania. These records contain the rainfall intensity (mm/minute) over the various intervals for which it remains constant. The maximum intensity of each continuous rain over the May-August interval has been calculated for each year. The corresponding time series over the 1951-2008 period have been analysed in terms of their long-term trends and shifts in the mean; the results have been compared to those resulting from other rainfall indices based on daily and hourly data computed over the same interval, such as: total rainfall amount, maximum daily amount, contribution of total hourly amounts exceeding 10 mm/day, contribution of daily amounts exceeding the 90th percentile, and the 90th, 99th and 99.9th percentiles of 1-hour data. The results show that the proposed index exhibits a coherent and stronger climate signal (significant increase) at all analysed stations compared to the other indices associated with precipitation extremes, which show either no significant change or a weaker signal. This finding shows that the proposed index is the most appropriate for quantifying the climate change signal of the precipitation extremes. We consider that this index is more naturally connected to the maximum intensity of a real rainfall event.
    The work presented in this study was funded by the Executive Agency for Higher Education, Research, Development and Innovation Funding (UEFISCDI) through the research project CLIMHYDEX, "Changes in climate extremes and associated impact in hydrological events in Romania", code PNII-ID-2011-2-0073 (http://climhydex.meteoromania.ro)
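
    A minimal sketch of how such an index could be computed from pluviogram-style records, assuming each continuous rain event is stored as piecewise-constant (duration, intensity) segments; the data layout, function name and numbers are invented for illustration.

        def max_event_intensity(events):
            """Annual value of the proposed index: the largest instantaneous
            intensity (mm/min) reached by any continuous rain event.

            `events` is a list of rain events; each event is a list of
            (minutes, intensity_mm_per_min) segments over which the recorded
            intensity stays constant, as read off a pluviogram.
            """
            return max(max(inten for _, inten in event) for event in events)

        # Two illustrative events from one warm season
        season = [
            [(12, 0.05), (3, 0.40), (20, 0.10)],   # shower with a 0.40 mm/min burst
            [(45, 0.02)],                          # long, weak stratiform rain
        ]
        print(max_event_intensity(season))  # 0.4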

  11. Time to Quantify Falsifiability

    E-print Network

    Nemenman, Ilya

    2015-01-01

    Here we argue that the notion of falsifiability, a key concept in defining a valid scientific theory, can be quantified using Bayesian Model Selection, which is a standard tool in modern statistics. This relates falsifiability to the quantitative version of Occam's razor, and allows transforming some long-running arguments about the validity of certain scientific theories from philosophical discussions into mathematical calculations. This is a Letter to the editor.

  12. Quantifying light pollution

    NASA Astrophysics Data System (ADS)

    Cinzano, P.; Falchi, F.

    2014-05-01

    In this paper we review new available indicators useful to quantify and monitor light pollution, defined as the alteration of the natural quantity of light in the night environment due to introduction of manmade light. With the introduction of recent radiative transfer methods for the computation of light pollution propagation, several new indicators become available. These indicators represent a primary step in light pollution quantification, beyond the bare evaluation of the night sky brightness, which is an observational effect integrated along the line of sight and thus lacking the three-dimensional information.

  13. Terahertz spectroscopy for quantifying refined oil mixtures.

    PubMed

    Li, Yi-nan; Li, Jian; Zeng, Zhou-mo; Li, Jie; Tian, Zhen; Wang, Wei-kui

    2012-08-20

    In this paper, the absorption coefficient spectra of samples prepared as mixtures of gasoline and diesel in different proportions are obtained by terahertz time-domain spectroscopy. To quantify the components of refined oil mixtures, a method is proposed to evaluate the best frequency band for regression analysis. With the data in this frequency band, dualistic linear regression fitting is used to determine the volume fractions of gasoline and diesel in the mixture based on the Beer-Lambert law. The minimum R-squared of the regression fits is 0.99967, and the mean error of the fitted volume fraction of 97# gasoline is 4.3%. Results show that refined oil mixtures can be quantitatively analyzed through absorption coefficient spectra at terahertz frequencies, which has promising application prospects in the storage and transportation field for refined oil. PMID:22907017
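
    The regression step rests on Beer-Lambert additivity: the mixture's absorption coefficient spectrum is a volume-fraction-weighted sum of the pure-component spectra, so the fractions fall out of a two-variable linear least-squares fit. A sketch with synthetic stand-in spectra (the real spectra and band selection come from the measurements; everything below is illustrative):

        import numpy as np

        # Synthetic pure-component absorption spectra over a chosen band
        freqs = np.linspace(0.2, 1.2, 100)            # THz, illustrative band
        alpha_gasoline = 5 + 10 * freqs               # stand-in spectra
        alpha_diesel = 8 + 6 * freqs

        # Beer-Lambert additivity: alpha_mix = f_g * alpha_g + f_d * alpha_d
        f_true = np.array([0.3, 0.7])
        alpha_mix = f_true @ np.vstack([alpha_gasoline, alpha_diesel])
        alpha_mix += np.random.default_rng(1).normal(scale=0.05, size=alpha_mix.shape)

        # Dualistic (two-variable) linear regression for the volume fractions
        A = np.column_stack([alpha_gasoline, alpha_diesel])
        f_fit, *_ = np.linalg.lstsq(A, alpha_mix, rcond=None)
        print(f_fit)   # close to [0.3, 0.7]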

  14. Quantifying actin wave modulation on periodic topography

    NASA Astrophysics Data System (ADS)

    Guven, Can; Driscoll, Meghan; Sun, Xiaoyu; Parker, Joshua; Fourkas, John; Carlsson, Anders; Losert, Wolfgang

    2014-03-01

    Actin is the essential builder of the cell cytoskeleton, whose dynamics are responsible for generating the necessary forces for the formation of protrusions. By exposing amoeboid cells to periodic topographical cues, we show that actin can be directionally guided via inducing preferential polymerization waves. To quantify the dynamics of these actin waves and their interaction with the substrate, we modify a technique from computer vision called "optical flow." We obtain vectors that represent the apparent actin flow and cluster these vectors to obtain patches of newly polymerized actin, which represent actin waves. Using this technique, we compare experimental results, including the speed distribution of waves and the distance from the wave centroid to the closest ridge, with actin polymerization simulations. We hypothesize that modulation of the activity of nucleation promotion factors on ridges (elevated regions of the surface) is a potential mechanism for the wave-substrate coupling. Funded by NIH grant R01GM085574.
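
    The optical-flow step can be prototyped with an off-the-shelf dense flow routine. The sketch below uses OpenCV's Farneback method on two synthetic frames and thresholds the flow magnitude to pick out candidate wave patches; this is only the general idea, not the authors' modified algorithm, and the frames, parameters, and threshold are invented.

        import numpy as np
        import cv2  # OpenCV

        # Two consecutive grayscale "fluorescence" frames; random stand-ins here.
        rng = np.random.default_rng(6)
        prev = rng.uniform(0, 255, (128, 128)).astype(np.uint8)
        curr = np.roll(prev, shift=2, axis=1)          # fake rightward wave motion

        # Dense optical flow (Farneback): per-pixel apparent motion vectors
        flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag = np.linalg.norm(flow, axis=-1)

        # Pixels with strong coherent motion = candidate actin-wave patches
        wave_mask = mag > np.percentile(mag, 95)
        print(wave_mask.sum(), "wave pixels; mean speed",
              round(mag[wave_mask].mean(), 2), "px/frame")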

  15. Quantifier Comprehension in Corticobasal Degeneration

    ERIC Educational Resources Information Center

    McMillan, Corey T.; Clark, Robin; Moore, Peachie; Grossman, Murray

    2006-01-01

    In this study, we investigated patients with focal neurodegenerative diseases to examine a formal linguistic distinction between classes of generalized quantifiers, like "some X" and "less than half of X." Our model of quantifier comprehension proposes that number knowledge is required to understand both first-order and higher-order quantifiers

  16. On quantifying insect movements

    SciTech Connect

    Wiens, J.A.; Crist, T.O. (Colorado State Univ., Fort Collins (United States)); Milne, B.T. (Univ. of New Mexico, Albuquerque (United States))

    1993-08-01

    We elaborate on methods described by Turchin, Odendaal and Rausher for quantifying insect movement pathways. We note the need to scale measurement resolution to the insects under study and the questions being asked, and we discuss the use of surveying instrumentation for recording sequential positions of individuals along pathways. We itemize several measures that may be used to characterize movement pathways and illustrate these by comparisons among several Eleodes beetles occurring in shortgrass steppe. The fractal dimension of pathways may provide insights not available from absolute measures of pathway configuration. Finally, we describe a renormalization procedure that may be used to remove sequential interdependence among locations of moving individuals while preserving the basic attributes of the pathway.

  17. What Do Blood Tests Show?

    MedlinePLUS

    What Do Blood Tests Show? Blood tests show whether the levels ... changes may work best. Result Ranges for Common Blood Tests This section presents the result ranges for ...

  18. Quantifying Loopy Network Architectures

    PubMed Central

    Katifori, Eleni; Magnasco, Marcelo O.

    2012-01-01

    Biology presents many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture containing closed loops at many different levels. Although a number of approaches have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework, the hierarchical loop decomposition, that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer generated graphs, such as artificial models and optimal distribution networks, as well as natural graphs extracted from digitized images of dicotyledonous leaves and vasculature of rat cerebral neocortex. We calculate various metrics based on the asymmetry, the cumulative size distribution and the Strahler bifurcation ratios of the corresponding trees and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information (exact location of edges and nodes) from the metric topology (connectivity and edge weight) and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs. PMID:22701593

  19. Quantifying T Lymphocyte Turnover

    PubMed Central

    De Boer, Rob J.; Perelson, Alan S.

    2013-01-01

    Peripheral T cell populations are maintained by production of naive T cells in the thymus, clonal expansion of activated cells, cellular self-renewal (or homeostatic proliferation), and density dependent cell life spans. A variety of experimental techniques have been employed to quantify the relative contributions of these processes. In modern studies lymphocytes are typically labeled with 5-bromo-2′-deoxyuridine (BrdU), deuterium, or the fluorescent dye carboxy-fluorescein diacetate succinimidyl ester (CFSE), their division history has been studied by monitoring telomere shortening and the dilution of T cell receptor excision circles (TRECs) or the dye CFSE, and clonal expansion has been documented by recording changes in the population densities of antigen specific cells. Proper interpretation of such data in terms of the underlying rates of T cell production, division, and death has proven to be notoriously difficult and involves mathematical modeling. We review the various models that have been developed for each of these techniques, discuss which models seem most appropriate for what type of data, reveal open problems that require better models, and pinpoint how the assumptions underlying a mathematical model may influence the interpretation of data. Elaborating various successful cases where modeling has delivered new insights in T cell population dynamics, this review provides quantitative estimates of several processes involved in the maintenance of naive and memory, CD4+ and CD8+ T cell pools in mice and men. PMID:23313150

  20. Quantifying traffic exposure.

    PubMed

    Pratt, Gregory C; Parson, Kris; Shinoda, Naomi; Lindgren, Paula; Dunlap, Sara; Yawn, Barbara; Wollan, Peter; Johnson, Jean

    2014-01-01

    Living near traffic adversely affects health outcomes. Traffic exposure metrics include distance to high-traffic roads, traffic volume on nearby roads, traffic within buffer distances, measured pollutant concentrations, land-use regression estimates of pollution concentrations, and others. We used Geographic Information System software to explore a new approach using traffic count data and a kernel density calculation to generate a traffic density surface with a resolution of 50 m. The density value in each cell reflects all the traffic on all the roads within the distance specified in the kernel density algorithm. The effect of a given roadway on the raster cell value depends on the amount of traffic on the road segment, its distance from the raster cell, and the form of the algorithm. We used a Gaussian algorithm in which traffic influence became insignificant beyond 300 m. This metric integrates the deleterious effects of traffic rather than focusing on one pollutant. The density surface can be used to impute exposure at any point, and it can be used to quantify integrated exposure along a global positioning system route. The traffic density calculation compares favorably with other metrics for assessing traffic exposure and can be used in a variety of applications. PMID:24045427
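
    A minimal sketch of the kernel-density construction, assuming a Gaussian kernel whose influence is negligible beyond roughly 300 m and a 50 m raster; the road segments, counts, bandwidth, and coordinates are invented for illustration.

        import numpy as np

        cell = 50.0        # raster resolution (m)
        sigma = 100.0      # Gaussian bandwidth; weight is negligible beyond ~300 m

        # Hypothetical road segments: (x, y) midpoint in metres, daily traffic count
        segments = [(200.0, 300.0, 15000), (800.0, 650.0, 4000)]

        x = np.arange(0, 1000, cell)
        y = np.arange(0, 1000, cell)
        X, Y = np.meshgrid(x, y)

        density = np.zeros_like(X, dtype=float)
        for sx, sy, count in segments:
            d2 = (X - sx) ** 2 + (Y - sy) ** 2
            density += count * np.exp(-d2 / (2 * sigma ** 2))  # Gaussian falloff

        # Exposure imputed at a point of interest, e.g. a residence at (240, 330)
        i, j = int(330 // cell), int(240 // cell)
        print(density[i, j])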

  1. Quantifying Order in Semiconducting Polymers

    NASA Astrophysics Data System (ADS)

    Snyder, Chad

    2015-03-01

    Semiconducting polymers form the basis for the burgeoning flexible electronics industry. However, quantifying their order can be challenging due to the nanophase separation induced by the side chains which are used to impart solubility, their propensity to form mesophases, and their often high levels of paracrystalline disorder. Recent successes in our laboratory in understanding these materials and quantifying their order will be presented.

  2. Quantifying robustness of biochemical network models

    Microsoft Academic Search

    Lan Ma; Pablo A. Iglesias

    2002-01-01

    Background: Robustness of mathematical models of biochemical networks is important for validation purposes and can be used as a means of selecting between different competing models. Tools for quantifying parametric robustness are needed. Results: Two techniques for describing quantitatively the robustness of an oscillatory model were presented and contrasted. Single-parameter bifurcation analysis was used to evaluate the stability robustness of

  3. Quantifying decoherence in continuous variable systems

    Microsoft Academic Search

    A Serafini; M G A Paris; F. Illuminati; S. De Siena

    2005-01-01

    We present a detailed report on the decoherence of quantum states of continuous variable systems under the action of a quantum optical master equation resulting from the interaction with general Gaussian uncorrelated environments. The rate of decoherence is quantified by relating it to the decay rates of various, complementary measures of the quantum nature of a state, such as the

  4. FPGA Logic Synthesis using Quantified Boolean Satisfiability

    E-print Network

    Brown, Stephen Dean

    FPGA Logic Synthesis using Quantified Boolean Satisfiability. Andrew Ling, Deshanand P. Singh. … a novel Field Programmable Gate Array (FPGA) logic synthesis technique which determines if a logic … The applications demonstrated in this paper include FPGA technology mapping and resynthesis, where their results

  5. 9 Validation of Results Using TCP/Dual and TCP/Vegas In this chapter, we show that the DCA algorithms associated with TCP/Vegas and TCP/Dual are not able to

    E-print Network

    Martin, Jim

    In this chapter, we show that the DCA algorithms associated with TCP/Vegas and TCP/Dual are not able to reliably avoid packet loss and consequently … by examining the performance of a TCP/Vegas and TCP/Dual model. The performance metrics of interest

  6. A flow cytometric approach to quantify biofilms.

    PubMed

    Kerstens, Monique; Boulet, Gaëlle; Van Kerckhoven, Marian; Clais, Sofie; Lanckacker, Ellen; Delputte, Peter; Maes, Louis; Cos, Paul

    2015-07-01

    Since biofilms are important in many clinical, industrial, and environmental settings, reliable methods to quantify these sessile microbial populations are crucial. Most of the currently available techniques do not allow the enumeration of the viable cell fraction within the biofilm and are often time consuming. This paper proposes flow cytometry (FCM) using the single-stain viability dye TO-PRO®-3 iodide as a fast and precise alternative. Mature biofilms of Candida albicans and Escherichia coli were used to optimize biofilm removal and dissociation, as a single-cell suspension is needed for accurate FCM enumeration. To assess the feasibility of FCM quantification of biofilms, E. coli and C. albicans biofilms were analyzed using FCM and crystal violet staining at different time points. A combination of scraping and rinsing proved to be the most efficient technique for biofilm removal. Sonicating for 10 min eliminated the remaining aggregates, resulting in a single-cell suspension. Repeated FCM measurements of biofilm samples revealed a good intraday precision of approximately 5 %. FCM quantification and the crystal violet assay yielded similar biofilm growth curves for both microorganisms, confirming the applicability of our technique. These results show that FCM using TO-PRO®-3 iodide as a single-stain viability dye is a valid fast alternative for the quantification of viable cells in a biofilm. PMID:25948317

  7. Developing accurate quantified speckle shearing data

    NASA Astrophysics Data System (ADS)

    Wan Abdullah, W. S.; Petzing, Jon N.; Tyrer, John R.

    1999-08-01

    Electronic Speckle Pattern Shearing Interferometry (ESPSI) is becoming a common tool for the qualitative analysis of material defects in the aerospace and marine industries. A current trend in the development of this optical metrology nondestructive testing (NDT) technique is the introduction of quantitative analysis, which attempts to detail the defects examined and identified by the ESPSI systems. Commercial systems use divergent laser illumination, this being a design feature imposed by the typically large sizes of objects being examined, which negates the use of collimated optics. Furthermore, commercial systems are being applied to complex surfaces, which distorts the understanding of the instrumentation results. The growing commercial demand for quantitative out-of-plane and in-plane ESPSI for NDT is determining the quality of optical and analysis instrumentation. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources which are a function of ESPSI interferometers. This paper presents work which has been carried out on the measurement accuracy due to the divergence of the illumination wavefront and associated with the magnitude of the lateral shearing function. The error is measured by comparing measurements using divergent (curvature) illumination with respect to collimated illumination. Results show that the error increases approximately as a power function as the distance from the illumination source to the object surface decreases.

  8. Quantifying periodicity in omics data.

    PubMed

    Amariei, Cornelia; Tomita, Masaru; Murray, Douglas B

    2014-01-01

    Oscillations play a significant role in biological systems, with many examples in the fast, ultradian, circadian, circalunar, and yearly time domains. However, determining periodicity in such data can be problematic. There are a number of computational methods to identify the periodic components in large datasets, such as signal-to-noise based Fourier decomposition, Fisher's g-test and autocorrelation. However, the available methods assume a sinusoidal model and do not attempt to quantify the waveform shape and the presence of multiple periodicities, which provide vital clues in determining the underlying dynamics. Here, we developed a Fourier based measure that generates a de-noised waveform from multiple significant frequencies. This waveform is then correlated with the raw data from the respiratory oscillation found in yeast, to provide oscillation statistics including waveform metrics and multi-periods. The method is compared and contrasted to commonly used statistics. Moreover, we show the utility of the program in the analysis of noisy datasets and other high-throughput analyses, such as metabolomics and flow cytometry, respectively. PMID:25364747
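
    A toy version of the approach, assuming "significant" frequencies are those whose amplitude stands well above the median (a stand-in for a proper significance test, which the paper would define more carefully): rebuild a de-noised waveform from those components and correlate it with the raw series. Names and numbers are illustrative.

        import numpy as np

        def denoised_waveform(x, k=5.0):
            """Rebuild x from its 'significant' Fourier components only.

            A component is kept if its amplitude exceeds k times the median
            amplitude -- a crude stand-in for a significance criterion.
            """
            X = np.fft.rfft(x - x.mean())
            keep = np.abs(X) > k * np.median(np.abs(X))
            return np.fft.irfft(X * keep, n=len(x))

        rng = np.random.default_rng(2)
        t = np.arange(600)
        raw = np.sin(2 * np.pi * t / 40) + 0.5 * np.sin(2 * np.pi * t / 20) \
              + rng.normal(scale=0.7, size=t.size)
        wave = denoised_waveform(raw)
        r = np.corrcoef(raw - raw.mean(), wave)[0, 1]   # oscillation statistic
        print(round(r, 2))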

  9. Autumn shows roundup

    Microsoft Academic Search

    Richard Bloss

    2002-01-01

    A report on three major American automation shows where innovative products and automated assembly technologies were the focus. Products reviewed include grippers, actuators, assembly modules, dispensing controllers and pneumatic components from a number of suppliers.

  10. Do Elephants Show Empathy?

    Microsoft Academic Search

    Lucy A. Bates; Phyllis C. Lee; Norah Njiraini; Joyce H. Poole; Katito Sayialel; Soila Sayialel; Cynthia J. Moss; Richard W. Byrne

    2008-01-01

    Elephants show a rich social organization and display a number of unusual traits. In this paper, we analyse reports collected over a thirty-five year period, describing behaviour that has the potential to reveal signs of empathic understanding. These include coalition formation, the offering of protection and comfort to others, retrieving and 'babysitting' calves, aiding individuals that would otherwise have difficulty

  11. Show Me the Way

    ERIC Educational Resources Information Center

    Dicks, Matthew J.

    2005-01-01

    Because today's students have grown up steeped in video games and the Internet, most of them expect feedback, and usually gratification, very soon after they expend effort on a task. Teachers can get quick feedback to students by showing them videotapes of their learning performances. The author, a 3rd grade teacher, describes how the seemingly…

  12. Demonstration Road Show

    NSDL National Science Digital Library

    2009-04-06

    The Idaho State University Department of Physics conducts science demonstration shows at S. E. Idaho schools. Four different presentations are currently available; "Forces and Motion", "States of Matter", "Electricity and Magnetism", and "Sound and Waves". Information provided includes descriptions of the material and links to other resources.

  13. Earthquake Damage Slide Show

    NSDL National Science Digital Library

    This slide show presents examples of various types of damage caused by earthquakes. Photos include structural failures in bridges and buildings, landshifts, landslides, liquefaction, fires, tsunamis, and human impacts. Supplemental notes are provided to aid instructors about the photos presented on each slide.

  14. Honored Teacher Shows Commitment.

    ERIC Educational Resources Information Center

    Ratte, Kathy

    1987-01-01

    Part of the acceptance speech of the 1985 National Council for the Social Studies Teacher of the Year, this article describes the censorship experience of this honored social studies teacher. The incident involved the showing of a videotape version of the feature film entitled "The Seduction of Joe Tynan." (JDH)

  15. What Do Maps Show?

    ERIC Educational Resources Information Center

    Geological Survey (Dept. of Interior), Reston, VA.

    This curriculum packet, appropriate for grades 4-8, features a teaching poster which shows different types of maps (different views of Salt Lake City, Utah), as well as three reproducible maps and reproducible activity sheets which complement the maps. The poster provides teacher background, including step-by-step lesson plans for four geography…

  16. Show-Me Center

    NSDL National Science Digital Library

    The Show-Me Center is a partnership of four NSF-sponsored middle grades mathematics curriculum development Satellite Centers (University of Wisconsin, Michigan State University, University of Montana, and the Educational Development Center). The group's website provides "information and resources needed to support selection and implementation of standards-based middle grades mathematics curricula." The Video Showcase includes segments on Number, Algebra, Geometry, Measure, and Data Analysis, with information on ways to obtain the complete video set. The Curricula Showcase provides general information, unit goals, sample lessons and teacher pages spanning four projects: the Connected Mathematics Project (CMP), Mathematics in Context (MiC), MathScape: Seeing and Thinking Mathematically, and Middle Grades Math Thematics. The website also posts Show-Me Center newsletters, information on upcoming conferences and workshops, and links to resources including published articles and unpublished commentary on mathematics school reform.

  17. Quantifying uncertainty in stable isotope mixing models

    NASA Astrophysics Data System (ADS)

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-01

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: Stable Isotope Analysis in R (SIAR), a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing-fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated mixing fractions.
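
    A toy version of the pure Monte Carlo (PMC) idea for a three-source problem: draw candidate mixing-fraction vectors from a flat Dirichlet prior and keep those whose predicted mixture matches the sample within a tolerance; the spread of the accepted fractions is the mixing-fraction uncertainty. The source signatures, sample values, and tolerance below are invented.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical source signatures: rows = sources, cols = (d15N, d18O), permil
        sources = np.array([[0.0, -5.0], [10.0, 2.0], [20.0, 15.0]])
        sample = np.array([9.0, 3.0])
        tol = 0.5                       # acceptance tolerance (permil)

        # Draw candidate fraction vectors on the simplex, keep the ones that fit
        f = rng.dirichlet(np.ones(len(sources)), size=200_000)
        pred = f @ sources                                  # mixture composition
        ok = np.all(np.abs(pred - sample) < tol, axis=1)

        print(ok.sum(), "accepted solutions")
        print(f[ok].mean(axis=0))   # mean feasible mixing fractions
        print(f[ok].std(axis=0))    # spread = mixing-fraction uncertainty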

  18. The Truman Show

    Microsoft Academic Search

    Rolf F. Nohr

    The Truman Show is hardly a film you would automatically speak about as a game. At first glance, it is tempting to interpret the story of Truman Burbank — his perpetual subjection to the artificial (televisual) world of Seahaven and its gargantuan reality TV project, his eventual escape from the "OmniCam Ecosphere" building and the paternalistic surveillance of director Christof

  19. American History Picture Show

    NSDL National Science Digital Library

    Ms. Bennion

    2009-11-23

    In class we read Katie's Picture Show, a book about a girl who discovers art first-hand one day at an art museum in London. She realizes she can climb into the paintings, explore her surroundings, and even solve problems for the subjects of the paintings. As part of our unit on American history, we are going to use art to further learn about some of the important events we have been discussing. Each of these works of art depicts an important event in American History. When you click on a picture, you will be able to see the name of the event as well as the artist who created it. You will be using all three pictures for this assignment.Use the websites ...

  20. Quantifying the geometric sensitivity of attractor basins: Power law dependence on parameter variations and noise

    NASA Astrophysics Data System (ADS)

    Siapas, Athanassios G.

    1994-10-01

    We show that for many physical systems the dependence of attractor basin geometry on parameter variations and noise can be characterized by power laws. We introduce new invariants, the basin immunities, that quantify this dependence, and we analyze their origin and properties. Results from extensive numerical experiments are presented; examples include the driven pendulum and the Hénon map. Potential applications of basin immunities include quantifying the effect of parameter uncertainties and noise on the behavior of nonlinear devices, as well as improving parameter estimation algorithms.
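
    In practice, such a power law would be estimated from measurements. A minimal sketch (all numbers invented) fits the exponent by linear regression in log-log space, which is one plausible way to extract the kind of invariant the abstract describes, not the paper's own procedure.

        import numpy as np

        # Invented measurements: perturbation strength eps versus the fraction of
        # basin volume whose attractor assignment changed. A power law appears as
        # a straight line in log-log coordinates, and its slope is the exponent.
        eps = np.array([1e-4, 3e-4, 1e-3, 3e-3, 1e-2])
        changed = np.array([2.1e-4, 6.5e-4, 2.0e-3, 6.3e-3, 2.2e-2])

        slope, _ = np.polyfit(np.log(eps), np.log(changed), 1)
        print(f"estimated power-law exponent: {slope:.2f}")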

  1. Gaussian intrinsic entanglement: An entanglement quantifier based on secret correlations

    NASA Astrophysics Data System (ADS)

    Mišta, Ladislav; Tatham, Richard

    2015-06-01

    Intrinsic entanglement (IE) is a quantity which aims at quantifying the bipartite entanglement carried by a quantum state as an optimal amount of the intrinsic information that can be extracted from the state by measurement. We investigate in detail the properties of a Gaussian version of IE, the so-called Gaussian intrinsic entanglement (GIE). We show explicitly how GIE simplifies to the mutual information of a distribution of outcomes of measurements on a conditional state obtained by a measurement on a purifying subsystem of the analyzed state, which is first minimized over all measurements on the purifying subsystem and then maximized over all measurements on the conditional state. By constructing, for any separable Gaussian state, a purification and a measurement on the purifying subsystem which projects the purification onto a product state, we prove that GIE vanishes on all separable Gaussian states. Via realization of quantum operations by teleportation, we further show that GIE is nonincreasing under Gaussian local trace-preserving operations and classical communication. For pure Gaussian states and a reduction of the continuous-variable GHZ state, we calculate GIE analytically and show that it is always equal to the Gaussian Rényi-2 entanglement. We also extend the analysis of IE to a non-Gaussian case by deriving an analytical lower bound on IE for a particular form of the non-Gaussian continuous-variable Werner state. Our results indicate that the mapping of entanglement onto intrinsic information is capable of transmitting quantitative properties of entanglement as well, and that this property can be used to introduce a quantifier of Gaussian entanglement which is a compromise between computable and physically meaningful entanglement quantifiers.

  2. Quantifying the value of IT-investments

    Microsoft Academic Search

    Chris Verhoef

    2005-01-01

    We described a method to quantify the value of investments in software systems. For that, we adopted the classical risk-adjusted discounted cash flow model and geared it towards the field of information technology. This resulted in a scenario-based approach incorporating two IT-specific risks that can substantially influence IT appraisals. They are requirements creep and time compression. To account for

  3. QUERY LANGUAGES WITH GENERALIZED QUANTIFIERS

    E-print Network

    Van Gucht, Dirk

    … queries with embedded sub-queries as well as sub-query comparison statements. It is often argued … the phenomenon of sub-query syntax in query languages and the theory of generalized quantifiers … statements and set-predicate statements over sub-queries. We also introduce the language QLGQ which

  4. Quantifying crystal-melt segregation in dykes

    NASA Astrophysics Data System (ADS)

    Yamato, Philippe; Duretz, Thibault; May, Dave A.; Tartèse, Romain

    2015-04-01

    The dynamics of magma flow is highly affected by the presence of a crystalline load. During magma ascent, it has been demonstrated that crystal-melt segregation constitutes a viable mechanism for magmatic differentiation. However, the influences of crystal volume fraction, geometry, size and density on crystal-melt segregation are still not well constrained. In order to address these issues, we performed a parametric study using 2D direct numerical simulations, which model the ascent of crystal-bearing magma in a vertical dyke. Using these models, we have characterised the amount of segregation as a function of different quantities including the crystal fraction (φ), the density contrast between crystals and melt (Δρ), the size of the crystals (Ac) and their aspect ratio (R). Results show that crystal aspect ratio does not affect the segregation if R is small enough (long axis smaller than ~1/6 of the dyke width, Wd). Inertia within the system was also found not to influence crystal-melt segregation. The degree of segregation was, however, found to be highly dependent upon other parameters. Segregation is highest when Δρ and Ac are large, and lowest for a large pressure gradient (Pd) and/or large values of Wd. These four parameters can be combined into a single one, the S number, which can be used to quantify the segregation. Based on systematic numerical modelling and dimensional analysis, we provide a first-order scaling law which allows quantification of the segregation for an arbitrary S number and φ, encompassing a wide range of typical parameters encountered in terrestrial magmatic systems.

  5. Quantify Simulation Verification and Validation

    Microsoft Academic Search

    Peng Shi; Fei Liu; Ming Yang

    2009-01-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence military application areas, such as the safety of nuclear weapons and the trials of some missiles. In this paper we discuss how to quantify the V&V process for a better description

  6. QUANTIFYING ASSAY VARIATION IN NUTRIENT ANALYSIS OF FEEDSTUFFS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Analytical results from different laboratories have greater variation than those from a single laboratory, and this variation differs by nutrient. Objectives of this presentation are to describe methods for quantifying the analytical reproducibility among and repeatability within laboratories, estim...

  7. Plurality, Negation, and Quantification: Towards Comprehensive Quantifier Scope Disambiguation

    E-print Network

    Gildea, Daniel

    Plurality, Negation, and Quantification: Towards Comprehensive Quantifier Scope Disambiguation. … quantifier scope disambiguation (QSD) has improved upon earlier work by scoping an arbitrary number and type … promising results for automatic QSD when handling both phenomena. We also present a general model

  8. Career and Technical Education: Show Us the Buck, We'll Show You the Bang!

    ERIC Educational Resources Information Center

    Whetstone, Ryan

    2011-01-01

    Adult and CTE programs in California have been cut by about 60 percent over the past three years. A number of school districts have summarily eliminated these programs to preserve funding for other educational endeavors. The author says part of the problem has been the community's inability to communicate quantifiable results. One of the hottest…

  9. Measuring political polarization: Twitter shows the two sides of Venezuela

    NASA Astrophysics Data System (ADS)

    Morales, A. J.; Borondo, J.; Losada, J. C.; Benito, R. M.

    2015-03-01

    We say that a population is perfectly polarized when divided in two groups of the same size and opposite opinions. In this paper, we propose a methodology to study and measure the emergence of polarization from social interactions. We begin by proposing a model to estimate opinions in which a minority of influential individuals propagate their opinions through a social network. The result of the model is an opinion probability density function. Next, we propose an index to quantify the extent to which the resulting distribution is polarized. Finally, we apply the proposed methodology to a Twitter conversation about the late Venezuelan president, Hugo Chávez, finding a good agreement between our results and offline data. Hence, we show that our methodology can detect different degrees of polarization, depending on the structure of the network.
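
    A sketch of one plausible reading of such an index, for opinions scored in [-1, 1]: multiply a balance term (equal mass on both sides) by the separation of the two sides' centers, so the score reaches 1 only for two equal-size groups at opposite extremes. This is a simplified stand-in, not the paper's exact formula; the function and examples are illustrative.

        import numpy as np

        def polarization_index(opinions):
            """Score in [0, 1]: 1 = two equal-size groups at opposite extremes.

            Opinions are values in [-1, 1]. The index multiplies a balance
            term (equal mass on each side) by the separation of the two
            sides' centers. A simplified reading of the paper's index.
            """
            x = np.asarray(opinions, dtype=float)
            pos, neg = x[x > 0], x[x <= 0]
            if len(pos) == 0 or len(neg) == 0:
                return 0.0            # everyone on one side: no polarization
            balance = 1.0 - abs(len(pos) - len(neg)) / len(x)
            separation = (pos.mean() - neg.mean()) / 2.0
            return balance * separation

        print(polarization_index([-1.0] * 50 + [1.0] * 50))  # 1.0, fully polarized
        print(polarization_index(np.linspace(-1, 1, 101)))   # ~0.5, uniform spread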

  10. Measuring Political Polarization: Twitter shows the two sides of Venezuela

    E-print Network

    Morales, A J; Losada, J C; Benito, R M

    2015-01-01

    We say that a population is perfectly polarized when divided in two groups of the same size and opposite opinions. In this paper, we propose a methodology to study and measure the emergence of polarization from social interactions. We begin by proposing a model to estimate opinions in which a minority of influential individuals propagate their opinions through a social network. The result of the model is an opinion probability density function. Next, we propose an index to quantify the extent to which the resulting distribution is polarized. Finally, we apply the proposed methodology to a Twitter conversation about the late Venezuelan president, Hugo Chávez, finding a good agreement between our results and offline data. Hence, we show that our methodology can detect different degrees of polarization, depending on the structure of the network.

  11. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    NASA Astrophysics Data System (ADS)

    Fang, T.; Verma, V.; Guo, H.; King, L. E.; Edgerton, E. S.; Weber, R. J.

    2014-07-01

    A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint on the routine use of the DTT assay in large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (coefficient of variation of 12% for standards, 4% for ambient samples) and a reasonably low limit of detection (0.31 nmol min-1). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution and Epidemiology (SCAPE) to generate an extensive data set on the DTT activity of ambient particles collected from contrasting environments (urban, road-side, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per-air-volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. However, the greater heterogeneity in the intrinsic DTT activity (per PM mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the instrument can also be used to determine oxidative potential with other acellular assays.

  12. Quantifying mixing using equilibrium reactions

    SciTech Connect

    Wheat, Philip M. [Department of Mechanical and Aerospace Engineering, Arizona State University, Tempe, Arizona 85287-6106 (United States); Posner, Jonathan D. [Department of Mechanical and Aerospace Engineering, Arizona State University, Tempe, Arizona 85287-6106 (United States); Department of Chemical Engineering, Arizona State University, Tempe, Arizona 85287-6106 (United States)

    2009-03-15

    A method of quantifying equilibrium reactions in a microchannel using a fluorometric reaction of Fluo-4 and Ca2+ ions is presented. Under the proper conditions, equilibrium reactions can be used to quantify fluid mixing without the challenges associated with constituent mixing measures such as limited imaging spatial resolution and viewing angle coupled with three-dimensional structure. Quantitative measurements of CaCl and calcium-indicating fluorescent dye Fluo-4 mixing are measured in Y-shaped microchannels. Reactant and product concentration distributions are modeled using Green's function solutions and a numerical solution to the advection-diffusion equation. Equilibrium reactions provide for an unambiguous, quantitative measure of mixing when the reactant concentrations are greater than 100 times their dissociation constant and the diffusivities are equal. At lower concentrations and for dissimilar diffusivities, the area-averaged fluorescence signal reaches a maximum before the species have interdiffused, suggesting that reactant concentrations and diffusivities must be carefully selected to provide unambiguous, quantitative mixing measures. Fluorometric equilibrium reactions work over a wide range of pH and background concentrations such that they can be used for a wide variety of fluid mixing measures including industrial or microscale flows.
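
    The working regime of the method can be checked with the 1:1 binding equilibrium: the bound-complex concentration solves a quadratic in the total concentrations and the dissociation constant Kd. A short sketch; the quoted Kd is the commonly cited literature value for Fluo-4, and the concentrations are illustrative.

        import numpy as np

        def complex_conc(ca_tot, dye_tot, kd):
            """Equilibrium concentration of the fluorescent Ca-dye complex.

            Solves C^2 - (A + B + Kd) C + A B = 0 for the bound complex C,
            with A = total Ca, B = total dye (1:1 binding assumed).
            """
            b = ca_tot + dye_tot + kd
            return (b - np.sqrt(b * b - 4 * ca_tot * dye_tot)) / 2

        # When concentrations >> Kd, nearly all of the limiting species is bound,
        # so fluorescence tracks mixing unambiguously (the regime the paper uses).
        kd = 0.345e-6                   # Fluo-4 Kd ~ 345 nM (literature value)
        print(complex_conc(100e-6, 50e-6, kd) / 50e-6)   # ~1: saturated binding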

  13. Quantifying Drosophila food intake: comparative analysis of current methodology

    PubMed Central

    Deshpande, Sonali A.; Carvalho, Gil B.; Amador, Ariadna; Phillips, Angela M.; Hoxha, Sany; Lizotte, Keith J.; Ja, William W.

    2014-01-01

    Food intake is a fundamental parameter in animal studies. Despite the prevalent use of Drosophila in laboratory research, precise measurements of food intake remain challenging in this model organism. Here, we compare several common Drosophila feeding assays: the Capillary Feeder (CAFE), food-labeling with a radioactive tracer or a colorimetric dye, and observations of proboscis extension (PE). We show that the CAFE and radioisotope-labeling provide the most consistent results, have the highest sensitivity, and can resolve differences in feeding that dye-labeling and PE fail to distinguish. We conclude that performing the radiolabeling and CAFE assays in parallel is currently the best approach for quantifying Drosophila food intake. Understanding the strengths and limitations of food intake methodology will greatly advance Drosophila studies of nutrition, behavior, and disease. PMID:24681694

  14. Introduction Quantifying high-gradient behavior

    E-print Network

    Kuhn, Matthew R.

    Slide-deck outline (recovered from the e-print's title slides): 1. Introduction; 2. Quantifying high-gradient behavior (DEM "bending" experiments; questions about granular behavior); Summary. Section titles also include "Continuum Models of Discrete Particles" and "Discrete Granular Models with Bending".

  15. A Study of Quantifiers in Mandarin Chinese.

    ERIC Educational Resources Information Center

    Lu, John H-T.

    1980-01-01

    Studies, using Mandarin Chinese as a test case: (1) the interaction of syntax and semantics when quantifiers and negatives co-occur; (2) the linear interpretation of quantifiers when the universal and existential quantifiers co-occur; (3) the logical relationship between them; and (4) the basic word order of existential sentences involving…

  16. Quantifying Global Uncertainties in a Simple Microwave Rainfall Algorithm

    NASA Technical Reports Server (NTRS)

    Kummerow, Christian; Berg, Wesley; Thomas-Stahle, Jody; Masunaga, Hirohiko

    2006-01-01

    While a large number of methods exist in the literature for retrieving rainfall from passive microwave brightness temperatures, little has been written about the quantitative assessment of the expected uncertainties in these rainfall products at various time and space scales. The latter is the result of two factors: sparse validation sites over most of the world's oceans, and algorithm sensitivities to rainfall regimes that cause inconsistencies against validation data collected at different locations. To make progress in this area, a simple probabilistic algorithm is developed. The algorithm uses an a priori database constructed from the Tropical Rainfall Measuring Mission (TRMM) radar data coupled with radiative transfer computations. Unlike efforts designed to improve rainfall products, this algorithm takes a step backward in order to focus on uncertainties. In addition to inversion uncertainties, the construction of the algorithm allows errors resulting from incorrect databases, incomplete databases, and time- and space-varying databases to be examined. These are quantified. Results show that the simple algorithm reduces errors introduced by imperfect knowledge of precipitation radar (PR) rain by a factor of 4 relative to an algorithm that is tuned to the PR rainfall. Database completeness does not introduce any additional uncertainty at the global scale, while climatologically distinct space/time domains add approximately 25% uncertainty that cannot be detected by a radiometer alone. Of this value, 20% is attributed to changes in cloud morphology and microphysics, while 5% is a result of changes in the rain/no-rain thresholds. All but 2%-3% of this variability can be accounted for by considering the implicit assumptions in the algorithm. Additional uncertainties introduced by the details of the algorithm formulation are not quantified in this study because of the need for independent measurements that are beyond the scope of this paper. A validation strategy for these errors is outlined.
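
    The database retrieval at the core of such an algorithm can be sketched in a few lines: weight every database profile by how well its simulated brightness temperatures (Tb) match the observation, then report the weighted mean rain rate and its spread as the retrieval and its uncertainty. The database below is a random stand-in with an invented linear Tb-rain relation, not the TRMM-derived database the abstract describes.

        import numpy as np

        rng = np.random.default_rng(4)

        # Hypothetical a priori database: each entry is a rain rate (mm/h) with
        # simulated brightness temperatures (K) in two channels.
        rain = rng.gamma(shape=1.5, scale=2.0, size=5000)
        tb = np.column_stack([260 - 4 * rain, 280 - 2 * rain]) \
             + rng.normal(scale=2.0, size=(5000, 2))

        def retrieve(tb_obs, sigma=2.0):
            """Bayesian-style retrieval: database average weighted by Tb match."""
            w = np.exp(-0.5 * np.sum((tb - tb_obs) ** 2, axis=1) / sigma ** 2)
            mean = np.sum(w * rain) / np.sum(w)
            spread = np.sqrt(np.sum(w * (rain - mean) ** 2) / np.sum(w))
            return mean, spread         # retrieval and its uncertainty

        print(retrieve(np.array([248.0, 274.0])))   # obs consistent with ~3 mm/h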

  17. Quantifying utricular stimulation during natural behavior

    PubMed Central

    Rivera, Angela R. V.; Davis, Julian; Grant, Wally; Blob, Richard W.; Peterson, Ellengene; Neiman, Alexander B.; Rowe, Michael

    2012-01-01

    The use of natural stimuli in neurophysiological studies has led to significant insights into the encoding strategies used by sensory neurons. To investigate these encoding strategies in vestibular receptors and neurons, we have developed a method for calculating the stimuli delivered to a vestibular organ, the utricle, during natural (unrestrained) behaviors, using the turtle as our experimental preparation. High-speed digital video sequences are used to calculate the dynamic gravito-inertial (GI) vector acting on the head during behavior. X-ray computed tomography (CT) scans are used to determine the orientation of the otoconial layer (OL) of the utricle within the head, and the calculated GI vectors are then rotated into the plane of the OL. Thus, the method allows us to quantify the spatio-temporal structure of stimuli to the OL during natural behaviors. In the future, these waveforms can be used as stimuli in neurophysiological experiments to understand how natural signals are encoded by vestibular receptors and neurons. We provide one example of the method which shows that turtle feeding behaviors can stimulate the utricle at frequencies higher than those typically used in vestibular studies. This method can be adapted to other species, to other vestibular end organs, and to other methods of quantifying head movements. PMID:22753360
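
    The core geometric step described above, rotating head-frame gravito-inertial vectors into the plane of the otoconial layer, can be sketched in a few lines. The following Python fragment is a hypothetical illustration (the array names and the CT-derived plane normal are assumed inputs, not the authors' code):

    ```python
    import numpy as np

    def gi_in_ol_plane(gi_head, ol_normal):
        """Express gravito-inertial vectors in the otoconial-layer (OL) plane.

        gi_head  : (n_frames, 3) GI vectors in head coordinates (from video)
        ol_normal: (3,) OL plane normal in head coordinates (from CT)
        Returns the two in-plane (shear) components and the normal component.
        """
        n = np.asarray(ol_normal, float)
        n /= np.linalg.norm(n)
        # build an orthonormal in-plane basis (u, v) perpendicular to n
        ref = np.array([1.0, 0.0, 0.0])
        if abs(n @ ref) > 0.9:            # avoid a degenerate cross product
            ref = np.array([0.0, 1.0, 0.0])
        u = np.cross(n, ref); u /= np.linalg.norm(u)
        v = np.cross(n, u)
        gi_head = np.asarray(gi_head, float)
        return gi_head @ u, gi_head @ v, gi_head @ n
    ```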

  18. Detecting and Quantifying Topography in Neural Maps

    PubMed Central

    Yarrow, Stuart; Razak, Khaleel A.; Seitz, Aaron R.; Seriès, Peggy

    2014-01-01

    Topographic maps are an often-encountered feature in the brains of many species, yet there are no standard, objective procedures for quantifying topography. Topographic maps are typically identified and described subjectively, but in cases where the scale of the map is close to the resolution limit of the measurement technique, identifying the presence of a topographic map can be a challenging subjective task. In such cases, an objective topography detection test would be advantageous. To address these issues, we assessed seven measures (Pearson distance correlation, Spearman distance correlation, Zrehen's measure, topographic product, topological correlation, path length and wiring length) by quantifying topography in three classes of cortical map model: linear, orientation-like, and clusters. We found that all but one of these measures were effective at detecting statistically significant topography even in weakly-ordered maps, based on simulated noisy measurements of neuronal selectivity and sparse sampling of the maps. We demonstrate the practical applicability of these measures by using them to examine the arrangement of spatial cue selectivity in pallid bat A1. This analysis shows that significantly topographic arrangements of interaural intensity difference and azimuth selectivity exist at the scale of individual binaural clusters. PMID:24505279
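
    Of the seven measures compared, the Pearson distance correlation is the simplest to state: correlate pairwise anatomical distances with pairwise stimulus-space distances, and assess significance by permutation. A hedged Python sketch, with function and variable names invented for illustration:

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.stats import pearsonr

    def topography_test(positions, selectivities, n_perm=1000, seed=0):
        """Pearson distance correlation with a permutation test.

        positions     : (n_cells, d) anatomical coordinates
        selectivities : (n_cells, k) stimulus-space coordinates
        Returns (r, p): distance correlation and permutation p-value.
        """
        rng = np.random.default_rng(seed)
        positions = np.asarray(positions, float)
        d_map = pdist(positions)
        d_stim = pdist(np.asarray(selectivities, float))
        r, _ = pearsonr(d_map, d_stim)
        null = np.empty(n_perm)
        for i in range(n_perm):
            perm = rng.permutation(len(positions))  # break correspondence
            null[i] = pearsonr(pdist(positions[perm]), d_stim)[0]
        p = (np.sum(null >= r) + 1) / (n_perm + 1)  # one-sided test
        return r, p
    ```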

  19. Quantifying entanglement with witness operators

    SciTech Connect

    Brandao, Fernando G.S.L. [Grupo de Informacao Quantica, Departamento de Fisica, Universidade Federal de Minas Gerais, Caixa Postal 702, Belo Horizonte, 30.123-970, MG (Brazil)]

    2005-08-15

    We present a unifying approach to the quantification of entanglement based on entanglement witnesses, which includes several already established entanglement measures such as the negativity, the concurrence, and the robustness of entanglement. We then introduce an infinite family of new entanglement quantifiers, having as its limits the best separable approximation measure and the generalized robustness. Gaussian states, states with symmetry, states constrained to super-selection rules, and states composed of indistinguishable particles are studied under the view of the witnessed entanglement. We derive new bounds to the fidelity of teleportation d_min, for the distillable entanglement E_D and for the entanglement of formation. A particular measure, the PPT-generalized robustness, stands out due to its easy calculability and provides sharper bounds to d_min and E_D than the negativity in most of the states. We illustrate our approach studying thermodynamical properties of entanglement in the Heisenberg XXX and dimerized models.

  20. Quantifying strain variability in modeling growth of Listeria monocytogenes.

    PubMed

    Aryani, D C; den Besten, H M W; Hazeleger, W C; Zwietering, M H

    2015-09-01

    Prediction of microbial growth kinetics can differ from the actual behavior of the target microorganisms. In the present study, the impact of strain variability on maximum specific growth rate (μmax) (h^-1) was quantified using twenty Listeria monocytogenes strains. The μmax was determined as a function of four different variables, namely pH, water activity (aw)/NaCl concentration [NaCl], undissociated lactic acid concentration ([HA]), and temperature (T). The strain variability was compared to biological and experimental variabilities to determine their importance. The experiment was done in duplicate at the same time to quantify experimental variability and reproduced at least twice on different experimental days to quantify biological (reproduction) variability. For all variables, experimental variability was clearly lower than biological variability and strain variability; and remarkably, biological variability was similar to strain variability. Strain variability in cardinal growth parameters, namely pHmin, [NaCl]max, [HA]max, and Tmin, was further investigated by fitting secondary growth models to the μmax data, including a modified secondary pH model. The fitting results showed that L. monocytogenes had an average pHmin of 4.5 (5-95% prediction interval (PI) 4.4-4.7), [NaCl]max of 2.0 M (PI 1.8-2.1), [HA]max of 5.1 mM (PI 4.2-5.9), and Tmin of -2.2°C (PI (-3.3)-(-1.1)). The strain variability in cardinal growth parameters was benchmarked against available literature data, showing that the effect of strain variability explained around 1/3 or less of the variability found in literature. The cardinal growth parameters and their prediction intervals were used as input to illustrate the effect of strain variability on the growth of L. monocytogenes in food products with various characteristics, resulting in a 2-4 log CFU/ml(g) difference in growth prediction between the most and least robust strains, depending on the type of food product. This underlines the importance of obtaining quantitative knowledge on variability factors to realistically predict microbial growth kinetics. PMID:26011600
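
    As one concrete example of a secondary growth model of the kind fitted above, a Ratkowsky-type square-root model relates μmax to temperature through the cardinal parameter Tmin. The sketch below uses the Tmin values reported in the abstract; the slope b is a made-up constant, so the outputs are purely illustrative:

    ```python
    import numpy as np

    def mu_max(T, b=0.03, T_min=-2.2):
        """Square-root (Ratkowsky-type) secondary model:
        sqrt(mu_max) = b * (T - T_min), mu_max in 1/h, T in deg C.
        T_min = -2.2 C is the average value reported above; the slope
        b is an invented illustrative constant, not from the study."""
        T = np.asarray(T, dtype=float)
        root = np.clip(b * (T - T_min), 0.0, None)   # no growth below T_min
        return root ** 2

    # growth-rate ratio between a robust and a sensitive strain at 7 C,
    # letting T_min span the reported 5-95% prediction interval
    print(mu_max(7.0, T_min=-3.3) / mu_max(7.0, T_min=-1.1))   # ~1.6x
    ```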

  1. Plan Showing Cross Bracing Under Upper Stringers, Typical Section Showing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Plan Showing Cross Bracing Under Upper Stringers, Typical Section Showing End Framing, Plan Showing Cross Bracing Under Lower Stringers, End Elevation - Covered Bridge, Spanning Contoocook River, Hopkinton, Merrimack County, NH

  2. A Generalizable Methodology for Quantifying User Satisfaction

    NASA Astrophysics Data System (ADS)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
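
    As a minimal illustration of the survival-analysis machinery underlying this methodology, the nonparametric Kaplan-Meier estimator turns (possibly censored) session times into a survivor curve. The sketch below is generic Python, not the authors' implementation:

    ```python
    import numpy as np

    def kaplan_meier(durations, observed):
        """Kaplan-Meier survivor function for session times.

        durations: session lengths; observed: 1 if the user actually left
        (event), 0 if the session was censored (e.g., measurement ended).
        Returns (times, S): probability that a session exceeds each time.
        """
        durations = np.asarray(durations, float)
        observed = np.asarray(observed, int)
        times = np.unique(durations[observed == 1])
        S, s = [], 1.0
        for t in times:
            at_risk = np.sum(durations >= t)
            events = np.sum((durations == t) & (observed == 1))
            s *= 1.0 - events / at_risk     # product-limit update
            S.append(s)
        return times, np.array(S)
    ```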

  3. Message passing for quantified Boolean formulas

    E-print Network

    Zhang, Pan; Zdeborová, Lenka; Zecchina, Riccardo

    2012-01-01

    We introduce two types of message passing algorithms for quantified Boolean formulas (QBF). The first type is a message passing based heuristics that can prove unsatisfiability of the QBF by assigning the universal variables in such a way that the remaining formula is unsatisfiable. In the second type, we use message passing to guide branching heuristics of a Davis-Putnam Logemann-Loveland (DPLL) complete solver. Numerical experiments show that on random QBFs our branching heuristics gives robust exponential efficiency gain with respect to the state-of-art solvers. We also manage to solve some previously unsolved benchmarks from the QBFLIB library. Apart from this our study sheds light on using message passing in small systems and as subroutines in complete solvers.

  4. Quantifying Einstein-Podolsky-Rosen steering.

    PubMed

    Skrzypczyk, Paul; Navascués, Miguel; Cavalcanti, Daniel

    2014-05-01

    Einstein-Podolsky-Rosen steering is a form of bipartite quantum correlation that is intermediate between entanglement and Bell nonlocality. It allows for entanglement certification when the measurements performed by one of the parties are not characterized (or are untrusted) and has applications in quantum key distribution. Despite its foundational and applied importance, Einstein-Podolsky-Rosen steering lacks a quantitative assessment. Here we propose a way of quantifying this phenomenon and use it to study the steerability of several quantum states. In particular, we show that every pure entangled state is maximally steerable and the projector onto the antisymmetric subspace is maximally steerable for all dimensions; we provide a new example of one-way steering and give strong support that states with positive-partial transposition are not steerable. PMID:24856679

  5. Stimfit: quantifying electrophysiological data with Python

    PubMed Central

    Guzman, Segundo J.; Schlögl, Alois; Schmidt-Hieber, Christoph

    2013-01-01

    Intracellular electrophysiological recordings provide crucial insights into elementary neuronal signals such as action potentials and synaptic currents. Analyzing and interpreting these signals is essential for a quantitative understanding of neuronal information processing, and requires both fast data visualization and ready access to complex analysis routines. To achieve this goal, we have developed Stimfit, a free software package for cellular neurophysiology with a Python scripting interface and a built-in Python shell. The program supports most standard file formats for cellular neurophysiology and other biomedical signals through the Biosig library. To quantify and interpret the activity of single neurons and communication between neurons, the program includes algorithms to characterize the kinetics of presynaptic action potentials and postsynaptic currents, estimate latencies between pre- and postsynaptic events, and detect spontaneously occurring events. We validate and benchmark these algorithms, give estimation errors, and provide sample use cases, showing that Stimfit represents an efficient, accessible and extensible way to accurately analyze and interpret neuronal signals. PMID:24600389

  6. Stimfit: quantifying electrophysiological data with Python.

    PubMed

    Guzman, Segundo J; Schlögl, Alois; Schmidt-Hieber, Christoph

    2014-01-01

    Intracellular electrophysiological recordings provide crucial insights into elementary neuronal signals such as action potentials and synaptic currents. Analyzing and interpreting these signals is essential for a quantitative understanding of neuronal information processing, and requires both fast data visualization and ready access to complex analysis routines. To achieve this goal, we have developed Stimfit, a free software package for cellular neurophysiology with a Python scripting interface and a built-in Python shell. The program supports most standard file formats for cellular neurophysiology and other biomedical signals through the Biosig library. To quantify and interpret the activity of single neurons and communication between neurons, the program includes algorithms to characterize the kinetics of presynaptic action potentials and postsynaptic currents, estimate latencies between pre- and postsynaptic events, and detect spontaneously occurring events. We validate and benchmark these algorithms, give estimation errors, and provide sample use cases, showing that Stimfit represents an efficient, accessible and extensible way to accurately analyze and interpret neuronal signals. PMID:24600389
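
    As an example of the kind of kinetics quantification described above (though not Stimfit's actual API), a mono-exponential decay can be fitted to a postsynaptic current with a generic SciPy least-squares fit; the trace below is synthetic:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def fit_decay(t, i_syn):
        """Fit a mono-exponential decay to a postsynaptic current.

        t: time [ms]; i_syn: current trace after the peak [pA].
        Returns (amplitude, tau_ms, baseline).
        """
        def model(t, a, tau, c):
            return a * np.exp(-t / tau) + c
        p0 = (i_syn[0] - i_syn[-1], (t[-1] - t[0]) / 5.0, i_syn[-1])
        popt, _ = curve_fit(model, t - t[0], i_syn, p0=p0)
        return popt

    # synthetic EPSC decay: 50 pA amplitude, tau = 5 ms, 1 pA noise
    t = np.linspace(0, 40, 400)
    trace = 50 * np.exp(-t / 5.0) + np.random.default_rng(0).normal(0, 1, t.size)
    print(fit_decay(t, trace))   # ~ (50, 5, 0)
    ```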

  7. Quantifying tetrodotoxin levels in the California newt using a non-destructive sampling method.

    PubMed

    Bucciarelli, Gary M; Li, Amy; Kats, Lee B; Green, David B

    2014-03-01

    Toxic or noxious substances often serve as a means of chemical defense for numerous taxa. However, such compounds may also facilitate ecological or evolutionary processes. The neurotoxin, tetrodotoxin (TTX), which is found in newts of the genus Taricha, acts as a selection pressure upon predatory garter snakes, is a chemical cue to conspecific larvae, which elicits antipredator behavior, and may also affect macroinvertebrate foraging behavior. To understand selection patterns and how potential variation might affect ecological and evolutionary processes, it is necessary to quantify TTX levels within individuals and populations. To do so has often required that animals be destructively sampled or removed from breeding habitats and brought into the laboratory. Here we demonstrate a non-destructive method of sampling adult Taricha that obviates the need to capture and collect individuals. We also show that embryos from oviposited California newt (Taricha torosa) egg masses can be individually sampled and TTX quantified from embryos. We employed three different extraction techniques to isolate TTX. Using a custom fabricated high performance liquid chromatography (HPLC) system we quantified recovery of TTX. We found that a newly developed micro-extraction technique significantly improved recovery compared to previously used methods. Results also indicate our improvements to the HPLC method have high repeatability and increased sensitivity, with a detection limit of 48 pg (0.15 pmol) TTX. The quantified amounts of TTX in adult newts suggest fine geographic variation in toxin levels between sampling localities isolated by as little as 3 km. PMID:24467994
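
    The quoted detection limit is internally consistent: converting 0.15 pmol to mass using the molar mass of tetrodotoxin (~319.27 g/mol for C11H17N3O8, a textbook value not stated above) recovers the 48 pg figure:

    ```python
    # Consistency check of the reported HPLC detection limit
    mw_ttx = 319.27             # g/mol, anhydrous TTX (assumed)
    limit_pmol = 0.15           # reported detection limit [pmol]
    print(limit_pmol * mw_ttx)  # ~47.9 pg, matching the quoted 48 pg
    ```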

  8. Quantifying the vitamin D economy.

    PubMed

    Heaney, Robert P; Armas, Laura A G

    2015-01-01

    Vitamin D enters the body through multiple routes and in a variety of chemical forms. Utilization varies with input, demand, and genetics. Vitamin D and its metabolites are carried in the blood on a Gc protein that has three principal alleles with differing binding affinities and ethnic prevalences. Three major metabolites are produced, which act via two routes, endocrine and autocrine/paracrine, and in two compartments, extracellular and intracellular. Metabolic consumption is influenced by physiological controls, noxious stimuli, and tissue demand. When administered as a supplement, varying dosing schedules produce major differences in serum metabolite profiles. To understand vitamin D's role in human physiology, it is necessary both to identify the foregoing entities, mechanisms, and pathways and, specifically, to quantify them. This review was performed to delineate the principal entities and transitions involved in the vitamin D economy, summarize the status of present knowledge of the applicable rates and masses, draw inferences about functions that are implicit in these quantifications, and point out implications for the determination of adequacy. PMID:26024057

  9. Quantifying Uncertainty in Epidemiological Models

    SciTech Connect

    Ramanathan, Arvind [ORNL]; Jha, Sumit Kumar [University of Central Florida]

    2012-01-01

    Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs) to describe the dynamics of how a disease may spread within a population and enable the rational design of strategies for intervention that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in trusting the outcomes/predictions solely based on such models. Hence, there is a need to develop approaches such that mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions, when such retrospective analysis cannot be performed. In this paper, we outline an approach to develop uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.
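
    The essence of statistical model checking here is Monte Carlo: simulate the stochastic model many times and estimate the probability that a formally specified property holds. A toy Python sketch, with a binomial-chain SIR model standing in for the epidemiological model and a simple peak query standing in for the logic property (all parameter values are illustrative):

    ```python
    import numpy as np

    def p_property(n_runs=2000, n=1000, i0=5, beta=0.3, gamma=0.1,
                   days=200, peak_threshold=0.2, seed=0):
        """Estimate P(peak infected fraction > threshold) by simulation."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(n_runs):
            s, i = n - i0, i0
            peak = i
            for _ in range(days):
                new_inf = rng.binomial(s, 1 - np.exp(-beta * i / n))
                new_rec = rng.binomial(i, 1 - np.exp(-gamma))
                s -= new_inf
                i += new_inf - new_rec
                peak = max(peak, i)
            hits += peak / n > peak_threshold
        return hits / n_runs   # Monte Carlo estimate of the property

    print(p_property())
    ```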

  10. Roanoke Area Junior Livestock Show

    E-print Network

    Liskiewicz, Maciej

    & Computations Tom Stanley, Chairman Tyler Painter Beth Hawse Katherine Carter Ribbons & Record Keeping Carolyn Supper at Arena 6:00 PM Hog Show: Showmanship, Market Hog Show and Breeding Gilt Show 7:00 pm Check

  11. In favour of the definition "adolescents with idiopathic scoliosis": juvenile and adolescent idiopathic scoliosis braced after ten years of age, do not show different end results. SOSORT award winner 2014

    PubMed Central

    2014-01-01

    Background The most important factor discriminating juvenile (JIS) from adolescent idiopathic scoliosis (AIS) is the risk of deformity progression. Brace treatment can change natural history, even when risk of progression is high. The aim of this study was to compare the end of growth results of JIS subjects, treated after 10 years of age, with final results of AIS. Methods Design: prospective observational controlled cohort study nested in a prospective database. Setting: outpatient tertiary referral clinic specialized in conservative treatment of spinal deformities. Inclusion criteria: idiopathic scoliosis; European Risser 0–2; 25 degrees to 45 degrees Cobb; start treatment age: 10 years or more, never treated before. Exclusion criteria: secondary scoliosis, neurological etiology, prior treatment for scoliosis (brace or surgery). Groups: 27 patients met the inclusion criteria for the AJIS, (Juvenile Idiopathic Scoliosis treated in adolescence), demonstrated by an x-ray before 10 year of age, and treatment start after 10 years of age. AIS group included 45 adolescents with a diagnostic x-ray made after the threshold of age 10 years. Results at the end of growth were analysed; the threshold of 5 Cobb degree to define worsened, improved and stabilized curves was considered. Statistics: Mean and SD were used for descriptive statistics of clinical and radiographic changes. Relative Risk of failure (RR), Chi-square and T-test of all data was calculated to find differences among the two groups. 95% Confidence Interval (CI) , and of radiographic changes have been calculated. Results We did not find any Cobb angle significant differences among groups at baseline and at the end of treatment. The only difference was in the number of patients progressed above 45 degrees, found in the JIS group. The RR of progression of AJIS was, 1.35 (IC95% 0.57-3.17) versus AIS, and it wasn't statistically significant in the AJIS group, in respect to AIS group (p = 0.5338). Conclusion There are no significant differences in the final results of AIS and JIS, treated with total respect of the SRS and SOSORT criteria, in adolescence. Brace efficacy can neutralize the risk of progression. PMID:25031608

  12. 6. VIEW SHOWING DOWNSTREAM FACE OF DAM, SHOWING SEEPAGE CONTROL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VIEW SHOWING DOWNSTREAM FACE OF DAM, SHOWING SEEPAGE CONTROL REINFORCEMENT, LOOKING SOUTHWEST - High Mountain Dams in Upalco Unit, East Timothy Lake Dam, Ashley National Forest, 8.4 miles North of Swift Creek Campground, Mountain Home, Duchesne County, UT

  13. 10. INTERIOR VIEW SHOWING MOUNTINGS FROM TUNING DEVICE. VIEW SHOWS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. INTERIOR VIEW SHOWING MOUNTINGS FROM TUNING DEVICE. VIEW SHOWS COPPER SHEETING ON WALLS. - Chollas Heights Naval Radio Transmitting Facility, Helix House, 6410 Zero Road, San Diego, San Diego County, CA

  14. Quantifying uncertainty from material inhomogeneity.

    SciTech Connect

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

    Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 μm in diameter) were machined into brass tensile specimens using a femtosecond laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the deformation behavior was found to depend strongly on the character of the nearby microstructure.

  15. The "Life Potential": a new complex algorithm to assess "Heart Rate Variability" from Holter records for cognitive and diagnostic aims. Preliminary experimental results showing its dependence on age, gender and health conditions

    E-print Network

    Barra, Orazio A

    2013-01-01

    Although HRV (Heart Rate Variability) analyses have been carried out for several decades, several limiting factors still make these analyses useless from a clinical point of view. The present paper aims at overcoming some of these limits by introducing the "Life Potential" (BMP), a new mathematical algorithm which seems to exhibit surprising cognitive and predictive capabilities. BMP is defined as a linear combination of five HRV Non-Linear Variables, in turn derived from the thermodynamic formalism of chaotic dynamic systems. The paper presents experimental measurements of BMP (Average Values and Standard Deviations) derived from 1048 Holter tests, matched in age and gender, including a control group of 356 healthy subjects. The main results are: (a) BMP always decreases when the age increases, and its dependence on age and gender is well established; (b) the shape of the age dependence within "healthy people" is different from that found in the general group: this behavior provides evidence of possible illn...

  16. Robust performance and structured singular value computation by quantifier elimination

    Microsoft Academic Search

    Bassam Bamieh

    1997-01-01

    We consider the problem of computing the robust performance norm in H∞ or equivalently, the complex structured singular value. This problem is equivalent to that of computing the H∞ norm of a function of several complex variables. We show how one can use quantifier elimination techniques to provide a test for whether the H∞ norm is less than one. These

  17. Quantifying Inductive Bias: AI Learning Algorithms and Valiant's Learning Framework

    Microsoft Academic Search

    David Haussler

    1988-01-01

    We show that the notion of inductive bias in concept learning can be quantified in a way that directly relates to learning performance in the framework recently introduced by Valiant. Our measure of bias is based on the growth function introduced by Vapnik and Chervonenkis, and on the Vapnik-Chervonenkis dimension. We measure some common language biases, including restriction to conjunctive
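
    A standard worked example of such quantified bias, in the usual PAC phrasing of Valiant's framework (not necessarily the paper's exact bound): a consistent learner over a finite hypothesis class H needs about (1/ε)(ln|H| + ln(1/δ)) examples, and for conjunctions over n Boolean variables |H| = 3^n, since each variable appears positive, negated, or not at all:

    ```python
    import math

    def pac_sample_bound(n_vars, eps=0.1, delta=0.05):
        """Valiant-style bound m >= (1/eps)(ln|H| + ln(1/delta)) for
        consistent learning of conjunctions; |H| = 3^n_vars."""
        ln_H = n_vars * math.log(3.0)
        return math.ceil((ln_H + math.log(1.0 / delta)) / eps)

    print(pac_sample_bound(20))   # ~250 examples for eps=0.1, delta=0.05
    ```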

  18. Results of the HepZero study comparing heparin-grafted membrane and standard care show that heparin-grafted dialyzer is safe and easy to use for heparin-free dialysis.

    PubMed

    Laville, Maurice; Dorval, Marc; Fort Ros, Joan; Fay, Renaud; Cridlig, Joëlle; Nortier, Joëlle L; Juillard, Laurent; Dębska-Ślizień, Alicja; Fernández Lorente, Loreto; Thibaudin, Damien; Franssen, Casper; Schulz, Michael; Moureau, Frédérique; Loughraieb, Nathalie; Rossignol, Patrick

    2014-12-01

    Heparin is used to prevent clotting during hemodialysis, but heparin-free hemodialysis is sometimes needed to decrease the risk of bleeding. The HepZero study is a randomized, multicenter international controlled open-label trial comparing no-heparin hemodialysis strategies designed to assess non-inferiority of a heparin grafted dialyzer (NCT01318486). A total of 251 maintenance hemodialysis patients at increased risk of hemorrhage were randomly allocated for up to three heparin-free hemodialysis sessions using a heparin-grafted dialyzer or the center standard-of-care consisting of regular saline flushes or pre-dilution. The first heparin-free hemodialysis session was considered successful when there was neither complete occlusion of air traps or dialyzer, nor additional saline flushes, changes of dialyzer or bloodlines, or premature termination. The current standard-of-care resulted in high failure rates (50%). The success rate in the heparin-grafted membrane arm was significantly higher than in the control group (68.5% versus 50.4%), which was consistent for both standard-of-care modalities. The absolute difference between the heparin-grafted membrane and the controls was 18.2%, with a lower bound of the 90% confidence interval equal to plus 7.9%. The hypothesis of the non-inferiority at the minus 15% level was accepted, although superiority at the plus 15% level was not reached. Thus, use of a heparin-grafted membrane is a safe, helpful, and easy-to-use method for heparin-free hemodialysis in patients at increased risk of hemorrhage. PMID:25007166
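
    The reported non-inferiority arithmetic can be reproduced approximately from the summary figures, assuming roughly equal arm sizes of ~125 each (the exact split is not given above):

    ```python
    import math

    p1, n1 = 0.685, 125   # heparin-grafted membrane success rate (assumed n)
    p2, n2 = 0.504, 126   # standard-of-care success rate (assumed n)
    diff = p1 - p2                                   # ~0.181 (quoted 18.2%)
    se = math.sqrt(p1*(1-p1)/n1 + p2*(1-p2)/n2)      # normal-approx SE
    lower_90 = diff - 1.645 * se                     # ~+0.08 (quoted +7.9%)
    print(diff, lower_90, lower_90 > -0.15)          # non-inferior at -15%
    ```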

  19. Asia: Showing the Changing Seasons

    NSDL National Science Digital Library

    Jesse Allen

    1998-09-09

    SeaWiFS false color data showing seasonal change in the oceans and on land for Asia. The data is seasonally averaged, and shows the sequence: fall, winter, spring, summer, fall, winter, spring (for the Northern Hemisphere).

  20. Quantifying climatological ranges and anomalies for Pacific coral reef ecosystems.

    PubMed

    Gove, Jamison M; Williams, Gareth J; McManus, Margaret A; Heron, Scott F; Sandin, Stuart A; Vetter, Oliver J; Foley, David G

    2013-01-01

    Coral reef ecosystems are exposed to a range of environmental forcings that vary on daily to decadal time scales and across spatial scales spanning from reefs to archipelagos. Environmental variability is a major determinant of reef ecosystem structure and function, including coral reef extent and growth rates, and the abundance, diversity, and morphology of reef organisms. Proper characterization of environmental forcings on coral reef ecosystems is critical if we are to understand the dynamics and implications of abiotic-biotic interactions on reef ecosystems. This study combines high-resolution bathymetric information with remotely sensed sea surface temperature, chlorophyll-a and irradiance data, and modeled wave data to quantify environmental forcings on coral reefs. We present a methodological approach to develop spatially constrained, island- and atoll-scale metrics that quantify climatological range limits and anomalous environmental forcings across U.S. Pacific coral reef ecosystems. Our results indicate considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls, with emergent spatial patterns specific to each environmental forcing. For example, wave energy was greatest at northern latitudes and generally decreased with latitude. In contrast, chlorophyll-a was greatest at reef ecosystems proximate to the equator and northern-most locations, showing little synchrony with latitude. In addition, we find that the reef ecosystems with the highest chlorophyll-a concentrations; Jarvis, Howland, Baker, Palmyra and Kingman are each uninhabited and are characterized by high hard coral cover and large numbers of predatory fishes. Finally, we find that scaling environmental data to the spatial footprint of individual islands and atolls is more likely to capture local environmental forcings, as chlorophyll-a concentrations decreased at relatively short distances (>7 km) from 85% of our study locations. These metrics will help identify reef ecosystems most exposed to environmental stress as well as systems that may be more resistant or resilient to future climate change. PMID:23637939

  1. Quantifying temporal ventriloquism in audiovisual synchrony perception.

    PubMed

    Kuling, Irene A; Kohlrausch, Armin; Juola, James F

    2013-10-01

    The integration of visual and auditory inputs in the human brain works properly only if the components are perceived in close temporal proximity. In the present study, we quantified cross-modal interactions in the human brain for audiovisual stimuli with temporal asynchronies, using a paradigm from rhythm perception. In this method, participants had to align the temporal position of a target in a rhythmic sequence of four markers. In the first experiment, target and markers consisted of a visual flash or an auditory noise burst, and all four combinations of target and marker modalities were tested. In the same-modality conditions, no temporal biases and a high precision of the adjusted temporal position of the target were observed. In the different-modality conditions, we found a systematic temporal bias of 25-30 ms. In the second part of the first and in a second experiment, we tested conditions in which audiovisual markers with different stimulus onset asynchronies (SOAs) between the two components and a visual target were used to quantify temporal ventriloquism. The adjusted target positions varied by up to about 50 ms and depended in a systematic way on the SOA and its proximity to the point of subjective synchrony. These data allowed testing different quantitative models. The most satisfying model, based on work by Maij, Brenner, and Smeets (Journal of Neurophysiology 102, 490-495, 2009), linked temporal ventriloquism and the percept of synchrony and was capable of adequately describing the results from the present study, as well as those of some earlier experiments. PMID:23868564

  2. Processing queries with quantifiers a horticultural approach

    Microsoft Academic Search

    Umeshwar Dayal

    1983-01-01

    Most research on query processing has focussed on quantifier-free conjunctive queries. Existing techniques for processing queries with quantifiers either compile the query into a nested loop program or use variants of Codd's reduction from the Relational Calculus to the Relational Algebra. In this paper we propose an alternative technique that uses an algebra of graft and prune operations on trees.

  3. Backward Analysis for Inferring Quantified Preconditions

    Microsoft Academic Search

    Tal Lev-Ami; Mooly Sagiv; Sumit Gulwani

    2007-01-01

    This paper presents a method to infer preconditions that contain quantifiers. Such invariants can quantify over an unbounded number of storage locations and elements of arrays, and allow shape properties of programs that manipulate pointers and dynamically allocated data structures to be described concisely. The preconditions that are found ensure that any assertions that exist in the code

  4. Scalar Quantifiers: Logic, Acquisition, and Processing

    ERIC Educational Resources Information Center

    Geurts, Bart; Katsos, Napoleon; Cummins, Chris; Moons, Jonas; Noordman, Leo

    2010-01-01

    Superlative quantifiers ("at least 3", "at most 3") and comparative quantifiers ("more than 2", "fewer than 4") are traditionally taken to be interdefinable: the received view is that "at least n" and "at most n" are equivalent to "more than n-1" and "fewer than n+1", respectively. Notwithstanding the prima facie plausibility of this claim, Geurts…

  5. IR action spectroscopy shows competitive oxazolone and diketopiperazine formation in

    E-print Network

    Wysocki, Vicki H.

    IR action spectroscopy shows competitive oxazolone and diketopiperazine formation in peptides depends on peptide length and identity of terminal residue in the departing fragment L. J. Morrison,a J-quantified using gas-phase hydrogen–deuterium exchange. The formation of the oxazolone and diketopiperazine has

  6. 1. Show the synthesis of prontosil. Show the starting

    E-print Network

    Gates, Kent. S.

    how the three analogs shown below can be prepared. Draw an arrow-pushing mechanism for each step is not active in an in vitro assay, but shows good activity in animal models and human patients. Explain: what

  7. Quantifying diet for nutrigenomic studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The field of nutrigenomics shows tremendous promise for improved understanding of the effects of dietary intake on health. The knowledge that metabolic pathways may be altered in individuals with genetic variants in the presence of certain dietary exposures offers great potential for personalized nu...

  8. Quantifying immersion in virtual reality

    Microsoft Academic Search

    Randy F. Pausch; Dennis Proffitt; George H. Williams

    1997-01-01

    Virtual Reality (VR) has generated much excitement but little formal proof that it is useful. Because VR interfaces are difficult and expensive to build, the computer graphics community needs to be able to predict which applications will benefit from VR. In this paper, we show that users with a VR interface complete a search task faster than users with

  9. Quantifying Barrier Island Recovery Following a Hurricane

    NASA Astrophysics Data System (ADS)

    Hammond, B.; Houser, C.

    2014-12-01

    Barrier islands are dynamic landscapes that are believed to minimize storm impact to mainland communities and also provide important ecological services in the coastal environment. The protection afforded by the island and the services it provides, however, depend on island resiliency in the face of accelerated sea level rise, which is in turn dependent on the rate of island recovery following storm events that may also change in both frequency and magnitude in the future. These changes in frequency may affect even large dunes and their resiliency, resulting in the island transitioning from a high to a low elevation. Previous research has shown that the condition of the foredune depends on the recovery of the nearshore and beach profile and the ability of vegetation to capture aeolian-transported sediment. An inability of the foredune to recover may result in mainland susceptibility to storm energy, inability for ecosystems to recover and thrive, and sediment budget instability. In this study, LiDAR data is used to quantify the rates of dune recovery at Fire Island, NY, the Outer Banks, NC, Santa Rosa Island, FL, and Matagorda Island, TX. Preliminary results indicate foredune recovery varies significantly both alongshore and in the cross-shore, suggesting that barrier island response and recovery to storm events cannot be considered from a strictly two-dimensional (cross-shore) perspective.

  10. Quantifying Uncertainties in Land Surface Microwave Emissivity Retrievals

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko

    2012-01-01

    Uncertainties in the retrievals of microwave land surface emissivities were quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including SSM/I, TMI and AMSR-E, were studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1%-4% (3-12 K) over desert and 1%-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5%-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.

  11. Quantified Energy Dissipation Rates in the Terrestrial Bow Shock

    NASA Astrophysics Data System (ADS)

    Wilson, L. B., III; Sibeck, D. G.; Breneman, A. W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.

    2013-12-01

    We present the first observationally quantified measure of the energy dissipation rate due to wave-particle interactions in the transition region of the Earth's collisionless bow shock using data from the THEMIS spacecraft. Each of more than 11 bow shock crossings examined with available wave burst data showed both low frequency (<10 Hz) magnetosonic-whistler waves and high frequency (≥10 Hz) electromagnetic and electrostatic waves throughout the entire transition region and into the magnetosheath. The high frequency waves were identified as combinations of ion-acoustic waves, electron cyclotron drift instability driven waves, electrostatic solitary waves, and electromagnetic whistler mode waves. These waves were found to have: (1) amplitudes capable of exceeding δB ~ 10 nT and δE ~ 300 mV/m, though more typical values were δB ~ 0.1-1.0 nT and δE ~ 10-50 mV/m; (2) energy fluxes in excess of 2000 μW m^-2; (3) resistivities > 9000 Ω m; and (4) energy dissipation rates > 3 μW m^-3. The high frequency (>10 Hz) electromagnetic waves produce such excessive energy dissipation that they need only be, at times, < 0.01% efficient to produce the observed increase in entropy across the shocks necessary to balance the nonlinear wave steepening that produces the shocks. These results show that wave-particle interactions have the capacity to regulate the global structure and dominate the energy dissipation of collisionless shocks.
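
    The quoted energy-flux figure is consistent with a simple Poynting-flux estimate, |S| ~ δE·δB/μ0, using the peak amplitudes above; a quick back-of-envelope check:

    ```python
    # Order-of-magnitude check of the quoted wave energy flux
    import math
    mu0 = 4e-7 * math.pi       # vacuum permeability [H/m]
    dE = 300e-3                # 300 mV/m -> V/m (peak amplitude above)
    dB = 10e-9                 # 10 nT -> T (peak amplitude above)
    S = dE * dB / mu0          # ~2.4e-3 W/m^2
    print(S * 1e6)             # ~2400, consistent with ">2000 uW m^-2"
    ```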

  12. Quantifying Uncertainties in Land-Surface Microwave Emissivity Retrievals

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko

    2013-01-01

    Uncertainties in the retrievals of microwave land-surface emissivities are quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including the Special Sensor Microwave Imager, the Tropical Rainfall Measuring Mission Microwave Imager, and the Advanced Microwave Scanning Radiometer for Earth Observing System, are studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land-surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally, these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1%-4% (3-12 K) over desert and 1%-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5%-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.
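
    For a single channel, such emissivity retrievals typically invert a flat-surface, clear-sky radiative transfer equation. The sketch below shows that standard inversion; the atmospheric terms and the sample numbers are illustrative assumptions, not values from the study:

    ```python
    def emissivity(tb, t_skin, t_up, t_down, trans):
        """Invert the clear-sky radiative transfer equation for one channel:

            TB = T_up + trans * (e*T_skin + (1 - e)*T_down)

        tb    : observed brightness temperature [K]
        t_skin: surface skin temperature [K]
        t_up, t_down: up-/downwelling atmospheric brightness temps [K]
        trans : atmospheric transmissivity (0-1)
        """
        return (tb - t_up - trans * t_down) / (trans * (t_skin - t_down))

    # a plausible low-frequency desert case (illustrative numbers only)
    print(emissivity(tb=283.0, t_skin=310.0, t_up=8.0, t_down=10.0,
                     trans=0.95))   # ~0.93
    ```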

  13. Planning a Successful Tech Show

    ERIC Educational Resources Information Center

    Nikirk, Martin

    2011-01-01

    Tech shows are a great way to introduce prospective students, parents, and local business and industry to a technology and engineering or career and technical education program. In addition to showcasing instructional programs, a tech show allows students to demonstrate their professionalism and skills, practice public presentations, and interact…

  14. An automated homogeneous method for quantifying polysorbate using fluorescence polarization

    Microsoft Academic Search

    Marc D. Wenger; Amy M. Bowman; Marc V. Thorsteinsson; Kristine K. Little; Leslie Wang; Jinglin Zhong; Ann L. Lee; Peter DePhillips

    2005-01-01

    An automated fluorescence polarization (FP) assay has been developed for the quantitation of polysorbate in bioprocess samples. Using the lipophilic probe 5-dodecanoylaminofluorescein (DAF), polysorbate concentrations above the critical micelle concentration can be quantified by the FP increase that results when DAF inserts into the detergent micelles. The specificity, accuracy, and precision of this assay were defined for samples obtained from

  15. Quantifying Annual Aboveground Net Primary Production in the Intermountain West

    Technology Transfer Automated Retrieval System (TEKTRAN)

    As part of a larger project, methods were developed to quantify current year growth on grasses, forbs, and shrubs. Annual aboveground net primary production (ANPP) data are needed for this project to calibrate results from computer simulation models and remote-sensing data. Measuring annual ANPP of ...

  16. Quantifying capital goods for biological treatment of organic waste.

    PubMed

    Brogaard, Line K; Petersen, Per H; Nielsen, Peter D; Christensen, Thomas H

    2015-02-01

    Materials and energy used for construction of anaerobic digestion (AD) and windrow composting plants were quantified in detail. The two technologies were quantified in collaboration with consultants and producers of the parts used to construct the plants. The composting plants were quantified based on the different sizes for the three different types of waste (garden and park waste, food waste and sludge from wastewater treatment) in amounts of 10,000 or 50,000 tonnes per year. The AD plant was quantified for a capacity of 80,000 tonnes per year. Concrete and steel for the tanks were the main materials for the AD plant. For the composting plants, gravel and concrete slabs for the pavement were used in large amounts. To put the quantification into perspective, environmental impact assessment (EIA) showed that the steel used for tanks at the AD plant and the concrete slabs at the composting plants made the highest contributions to Global Warming. Compared with the operational impacts reported in the literature for the AD plant, the capital goods made an insignificant contribution of 1-2% to the total impact on Global Warming. For the composting plants, the capital goods accounted for 10-22% of the total impact on Global Warming from composting. PMID:25595291

  17. Interpreting quantifier scope ambiguity: evidence of heuristic first, algorithmic second processing.

    PubMed

    Dwivedi, Veena D

    2013-01-01

    The present work suggests that sentence processing requires both heuristic and algorithmic processing streams, where the heuristic processing strategy precedes the algorithmic phase. This conclusion is based on three self-paced reading experiments in which the processing of two-sentence discourses was investigated, where context sentences exhibited quantifier scope ambiguity. Experiment 1 demonstrates that such sentences are processed in a shallow manner. Experiment 2 uses the same stimuli as Experiment 1 but adds questions to ensure deeper processing. Results indicate that reading times are consistent with a lexical-pragmatic interpretation of number associated with context sentences, but responses to questions are consistent with the algorithmic computation of quantifier scope. Experiment 3 shows the same pattern of results as Experiment 2, despite using stimuli with different lexical-pragmatic biases. These effects suggest that language processing can be superficial, and that deeper processing, which is sensitive to structure, only occurs if required. Implications for recent studies of quantifier scope ambiguity are discussed. PMID:24278439

  18. Quantifying safety benefit of winter road maintenance: accident frequency modeling.

    PubMed

    Usman, Taimur; Fu, Liping; Miranda-Moreno, Luis F

    2010-11-01

    This research presents a modeling approach to investigate the association of accident frequency during a snow storm event with road surface conditions, visibility and other influencing factors, controlling for traffic exposure. The results can be applied to evaluate different maintenance strategies using safety as a performance measure. As part of this approach, this research introduces a road surface condition index as a surrogate for the commonly used friction measure to capture different road surface conditions. Data from various sources, such as weather, road condition observations, traffic counts and accidents, are integrated and used to test three event-based models: the Negative Binomial (NB) model, the generalized NB model and the zero-inflated NB model. These models are compared for their capability to explain differences in accident frequencies between individual snow storms. It was found that the generalized NB model best fits the data, and is most capable of capturing heterogeneity other than excess zeros. Among the main results, it was found that the road surface condition index had a statistically significant influence on accident occurrence. This research is the first to show the empirical relationship between safety and road surface conditions at a disaggregate (event-based) level, making it feasible to quantify the safety benefits of alternative maintenance goals and methods. PMID:20728638
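
    For readers wanting a concrete starting point, an event-based negative binomial regression with a traffic-exposure offset can be fitted with statsmodels. The dataset below is simulated and the covariate effects are invented, so this only sketches the model form, not the study's results:

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Toy event-level dataset: one row per snow-storm event (all simulated)
    rng = np.random.default_rng(1)
    n = 200
    rsi = rng.uniform(0, 1, n)              # road surface condition index
    vis = rng.uniform(0.1, 10, n)           # visibility [km]
    exposure = rng.uniform(1e3, 1e5, n)     # vehicle-km during the storm
    mu = exposure * np.exp(-9 + 1.5 * rsi - 0.05 * vis)
    y = rng.poisson(mu * rng.gamma(2.0, 0.5, n))   # overdispersed counts

    # Negative binomial GLM with log link and an exposure offset
    X = sm.add_constant(np.column_stack([rsi, vis]))
    fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0),
                 offset=np.log(exposure)).fit()
    print(fit.params)   # ~(-9, 1.5, -0.05) up to sampling noise
    ```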

  19. Quantifying and Predicting Reactive Transport

    SciTech Connect

    Peter C. Burns, Department of Civil Engineering and Geological Sciences, University of Notre Dame

    2009-12-04

    This project was led by Dr. Jiamin Wan at Lawrence Berkeley National Laboratory. Peter Burns provided expertise in uranium mineralogy and in identification of uranium minerals in test materials. Dr. Wan conducted column tests regarding uranium transport at LBNL, and samples of the resulting columns were sent to Dr. Burns for analysis. Samples were analyzed for uranium mineralogy by X-ray powder diffraction and by scanning electron microscopy, and results were provided to Dr. Wan for inclusion in the modeling effort. Full details of the project can be found in Dr. Wan's final reports for the associated effort at LBNL.

  20. Diarrheal Disease in Show Swine

    E-print Network

    Lawhorn, D. Bruce

    2007-02-27

    Diarrhea, an important problem in show pigs, can be caused by poor nutrition, infectious diseases, internal parasites or a combination of factors. This publication explains how the cause is diagnosed and the illness treated....

  1. Polymer microlenses for quantifying cell sheet mechanics

    PubMed Central

    Miquelard-Garnier, Guillaume; Zimberlin, Jessica A.; Sikora, Christian B.; Wadsworth, Patricia

    2010-01-01

    Mechanical interactions between individual cells and their substrate have been studied extensively over the past decade; however, understanding how these interactions change as cells interact with neighboring cells in the development of a cell sheet, or early stage tissue, is less developed. We use a recently developed experimental technique for quantifying the mechanics of confluent cell sheets. Living cells are cultured on a thin film of polystyrene [PS], which is attached to a patterned substrate of crosslinked poly(dimethyl siloxane) [PDMS] microwells. As cells attach to the substrate and begin to form a sheet, they apply sufficient contractile force to buckle the PS film over individual microwells to form a microlens array. The curvature for each microlens is measured by confocal microscopy and can be related to the strain and stress applied by the cell sheet using simple mechanical analysis for the buckling of thin films. We demonstrate that this technique can provide insight into the important materials properties and length scales that govern cell sheet responses, especially the role of stiffness of the substrate. We show that intercellular forces can lead to significantly different behaviors than the ones observed for individual cells, where focal adhesion is the relevant parameter. PMID:20445765
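
    The "simple mechanical analysis" step, turning a measured microlens profile into a strain, can be sketched from geometry alone: treat the buckled film as a shallow spherical cap and compute the fractional excess arc length. A hypothetical Python illustration (the well radius and cap height are made-up numbers, and converting strain to stress would further require the film's plane-strain modulus):

    ```python
    import numpy as np

    def sheet_strain_from_lens(h, a):
        """Contractile strain inferred from a buckled-film microlens.

        Models the buckled PS film over a microwell of radius a as a
        spherical cap of height h; the in-plane strain is the fractional
        excess arc length, ~(2/3)(h/a)^2 for shallow caps.
        """
        R = (a**2 + h**2) / (2 * h)      # radius of curvature of the cap
        theta = np.arcsin(a / R)         # half-angle subtended by the well
        return (R * theta - a) / a       # exact excess arc length / radius

    print(sheet_strain_from_lens(h=2.0, a=25.0))   # ~0.4% for a 50 um well
    ```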

  2. Asteroid Geophysics and Quantifying the Impact Hazard

    NASA Technical Reports Server (NTRS)

    Sears, D.; Wooden, D. H.; Korycansky, D. G.

    2015-01-01

    Probably the major challenge in understanding, quantifying, and mitigating the effects of an impact on Earth is understanding the nature of the impactor. Of the roughly 25 meteorite craters on the Earth that have associated meteorites, all but one was produced by an iron meteorite and only one was produced by a stony meteorite. Equally important, even meteorites of a given chemical class produce a wide variety of behavior in the atmosphere. This is because they show considerable diversity in their mechanical properties which have a profound influence on the behavior of meteorites during atmospheric passage. Some stony meteorites are weak and do not reach the surface or reach the surface as thousands of relatively harmless pieces. Some stony meteorites roll into a maximum drag configuration and are strong enough to remain intact so a large single object reaches the surface. Others have high concentrations of water that may facilitate disruption. However, while meteorite falls and meteorites provide invaluable information on the physical nature of the objects entering the atmosphere, there are many unknowns concerning size and scale that can only be determined by from the pre-atmospheric properties of the asteroids. Their internal structure, their thermal properties, their internal strength and composition, will all play a role in determining the behavior of the object as it passes through the atmosphere, whether it produces an airblast and at what height, and the nature of the impact and amount and distribution of ejecta.

  3. The missing metric: quantifying contributions of reviewers

    PubMed Central

    Cantor, Maurício; Gero, Shane

    2015-01-01

    The number of contributing reviewers often outnumbers the authors of publications. This has led to apathy towards reviewing and the conclusion that the peer-review system is broken. Given the trade-offs between submitting and reviewing manuscripts, reviewers and authors naturally want visibility for their efforts. While study after study has called for revolutionizing publication practices, the current paradigm does not recognize reviewers' time and expertise. We propose the R-index as a simple way to quantify scientists' contributions as reviewers. We modelled its performance using simulations based on real data to show that early–mid career scientists, who complete high-quality reviews of longer manuscripts within their field, can perform as well as leading scientists reviewing only for high-impact journals. By giving citeable academic recognition for reviewing, R-index will encourage more participation with better reviews, regardless of the career stage. Moreover, the R-index will allow editors to exploit scores to manage and improve their review team, and for journals to promote high average scores as signals of a practical and efficient service to authors. Peer-review is a pervasive necessity across disciplines and the simple utility of this missing metric will credit a valuable aspect of academic productivity without having to revolutionize the current peer-review system.

  4. Quantifying information and uncertainty of rock property estimation from seismic data

    NASA Astrophysics Data System (ADS)

    Takahashi, Isao

    Geophysical prospecting consists of making a quantitative inference about subsurface properties from geophysical measurements. Due to many ineluctable difficulties, observed data are almost always insufficient to uniquely specify the rock properties of interest. Hence, inevitable uncertainty remains after the estimation. The sources of the uncertainty arise from many factors: inconsistency in data acquisition conditions, insufficient available data as compared to the subsurface complexities, limited resolution, imperfect dependence between observed data and target rock properties, and our limited physical knowledge. While the uncertainty has been identified for a long time, quantitative framework to discuss the uncertainty has not been well established. The objective of this dissertation is to quantify uncertainty of rock property estimation and to reduce it by using multiple seismic observables. Using existing laboratory data and rock physics model parameters, we establish the general relationships between rock properties and pairs of seismic attributes. We show how optimal selections of seismic attributes allow us to better distinguish different rock property effects. One of the novel innovations in this work is to combine statistical formulations---information theory and Bayes decision theory---with rock physics models to quantitatively describe the dependence of seismic attributes on several important rock properties. Various sources of uncertainty about rock property estimation are quantified using the developed formulations. Furthermore, We propose a method of combining stochastic simulations and Bayes inversion to quantify the uncertainty about the dependence between seismic observables and target rock properties, caused by ignorance of other rock properties. We apply this method to explore scale effects on sand/shale ratio estimation from seismic reflectivity. One of the new results of this investigation is to show from the full probability density function that the effective medium average tends to overestimate the sand/shale ratio when the reservoir is randomly layered. The proposed framework of quantifying information given by seismic data will serve as a decision making guideline in various exploration stages.
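
    The information-theoretic part of this framework can be illustrated with a histogram estimate of the mutual information between a rock property and a seismic attribute; the rock-physics relation and noise levels below are invented for the example:

    ```python
    import numpy as np

    def mutual_information(x, y, bins=32):
        """Histogram estimate of I(X;Y) in bits between a rock property x
        (e.g., porosity) and a seismic attribute y (e.g., P-impedance)."""
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy /= pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        mask = pxy > 0
        return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

    # toy check: a noisier attribute carries less information about porosity
    rng = np.random.default_rng(0)
    phi = rng.uniform(0.05, 0.35, 20000)                    # porosity
    ip_clean = 9000 - 15000 * phi + rng.normal(0, 100, phi.size)
    ip_noisy = 9000 - 15000 * phi + rng.normal(0, 800, phi.size)
    print(mutual_information(phi, ip_clean), mutual_information(phi, ip_noisy))
    ```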

  5. Quantifying rat pulmonary intravascular mononuclear phagocytes.

    PubMed

    Niehaus, G D; Mehendale, S R

    1998-12-01

    Cells of the mononuclear phagocyte system (MPS) protect the host by clearing effete and foreign particulates from the circulation. The current study was designed to identify, quantify, harvest, and provide a partial functional characterization of the systemic host-defense cell located in the pulmonary microvasculature of the rat. Critical colloid doses of test particulates (monastral blue B [MBB] or polystyrene beads) were infused intra-arterially into anesthetized rats so that phagocytically active pulmonary intravascular phagocytes could be identified. Morphologic characterization of in situ phagocytes was performed using electron microscopy. The number of active phagocytes was then determined using tissue samples processed for light microscopy. Finally, sequential perfusion of the pulmonary vasculature with buffer, chelating agent, and collagenase allowed elution and preliminary functional characterization of the pulmonary intravascular mononuclear phagocyte (PIMP). Electron microscopy demonstrated that both mononuclear phagocytes and neutrophils contributed to pulmonary sequestration of circulating particulates. Light microscopy showed that the microvasculature of each alveolus contained 0.50 ± 0.19 active mononuclear phagocytes and 0.14 ± 0.12 active neutrophils. A chelation/collagenase elution technique was then used to harvest the PIMP. Histologic evaluation of the postperfusion lungs indicated that 80% of the active phagocytes were removed by the technique. In total, the elution fluids contained 2.63 ± 1.04 × 10^7 cells, with 1.60 ± 0.78 × 10^7, 0.49 ± 0.17 × 10^7, and 0.54 ± 0.26 × 10^7 of those cells being mononuclear phagocytes, neutrophils, and lymphocytes, respectively. Functionally, the mononuclear phagocyte population exhibited a spectrum of phagocytic activities, with 51.5 ± 19.5% of the cells being inactive, 33.9 ± 13.4% exhibiting moderate phagocytic activity, and 14.6 ± 9.8% demonstrating intensive phagocytic capacity. The current study provides the first quantified demonstration that mononuclear phagocytes are primarily responsible for sequestering blood-borne foreign particulates in the pulmonary circulation of the rat. Approximately 2 × 10^7 PIMP existed in the lungs of 300-gram rats. The functionally heterogeneous mononuclear phagocytes exhibited phagocytic capacities ranging from avidly phagocytic (14.6 ± 9.8%) through moderately active (33.9 ± 13.4%) to inactive. The lung microvasculature's large pool of inactive mononuclear phagocytes may provide a recruitable mechanism to allow significant increases in clearance of circulating particulates. A resident pool of activatable mononuclear phagocytes might explain previous clinical observations of increased particulate localization in the lung microvasculature of septic patients. PMID:9845213

  6. Comprehension of Indefinite Pronouns and Quantifiers by Hearing-Impaired Students.

    ERIC Educational Resources Information Center

    Wilbur, Ronnie B.; Goodhart, Wendy C.

    1985-01-01

    Deaf students' recognition of indefinite pronouns and quantifiers was tested using written materials in the form of comic strips. The subjects were 187 profoundly hearing-impaired students, aged 7 to 23 years. Findings showed significant developmental trends for both forms. Quantifiers were found to be significantly more difficult than indefinite…

  7. Quantifying avoided emissions from renewable generation

    E-print Network

    Gomez, Gabriel R. (Gabriel Rodriguez)

    2009-01-01

    Quantifying the reduced emissions due to renewable power integration and providing increasingly accurate emissions analysis has become more important for policy makers in the age of renewable portfolio standards (RPS) and ...

  8. Quantifying the parameters of successful agricultural producers 

    E-print Network

    Kaase, Gregory Herman

    2006-08-16

    The primary purpose of the study was to quantify the parameters of successful agricultural producers. Through the use of the Financial and Risk Management (FARM) Assistance database, this study evaluated economic measures ...

  9. Diarrheal Disease in Show Swine 

    E-print Network

    Lawhorn, D. Bruce

    2007-02-27

    of gain. Lawsonia intracellularis is the causative bacterium. Though it is rarely the cause of diarrheal disease in show swine, veterinarians consider this organism as a potential cause when making a differential diagnosis. L. intracellularis does... TGE does not cause human disease. Clinical Diagnosis The stool of the normal pig should be firm and well-formed. When a normal show pig is on free-choice feed and water, the stool tends to loosen to the consistency of a cow patty as feed consumption...

  10. Quantifiable diagnosis of muscular dystrophies and neurogenic atrophies through network analysis

    PubMed Central

    2013-01-01

    Background The diagnosis of neuromuscular diseases is strongly based on the histological characterization of muscle biopsies. However, this morphological analysis is mostly a subjective process and difficult to quantify. We have tested if network science can provide a novel framework to extract useful information from muscle biopsies, developing a novel method that analyzes muscle samples in an objective, automated, fast and precise manner. Methods Our database consisted of 102 muscle biopsy images from 70 individuals (including controls, patients with neurogenic atrophies and patients with muscular dystrophies). We used this to develop a new method, Neuromuscular DIseases Computerized Image Analysis (NDICIA), that uses network science analysis to capture the defining signature of muscle biopsy images. NDICIA characterizes muscle tissues by representing each image as a network, with fibers serving as nodes and fiber contacts as links. Results After a ‘training’ phase with control and pathological biopsies, NDICIA was able to quantify the degree of pathology of each sample. We validated our method by comparing NDICIA quantification of the severity of muscular dystrophies with a pathologist’s evaluation of the degree of pathology, resulting in a strong correlation (R = 0.900, P < 0.00001). Importantly, our approach can be used to quantify new images without the need for prior ‘training’. Therefore, we show that network science analysis captures the useful information contained in muscle biopsies, helping the diagnosis of muscular dystrophies and neurogenic atrophies. Conclusions Our novel network analysis approach will serve as a valuable tool for assessing the etiology of muscular dystrophies or neurogenic atrophies, and has the potential to quantify treatment outcomes in preclinical and clinical trials. PMID:23514382
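
    To make the representation concrete, here is a minimal sketch (the contact list is synthetic and the feature set is illustrative, not the published NDICIA features): fibers become nodes, fiber-fiber contacts become links, and simple graph statistics become inputs for classification.

      # Muscle biopsy as a graph: fibers = nodes, contacts = edges.
      import networkx as nx

      fiber_contacts = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 5), (2, 4)]
      G = nx.Graph(fiber_contacts)

      features = {
          "mean_degree": sum(d for _, d in G.degree()) / G.number_of_nodes(),
          "clustering": nx.average_clustering(G),
          "components": nx.number_connected_components(G),
      }
      print(features)   # feature vector for a downstream classifier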

  11. Sensitivity of Edge Detection Methods for Quantifying Cell Migration Assays

    PubMed Central

    Treloar, Katrina K.; Simpson, Matthew J.

    2013-01-01

    Quantitative imaging methods to analyze cell migration assays are not standardized. Here we present a suite of two-dimensional barrier assays describing the collective spreading of an initially-confined population of 3T3 fibroblast cells. To quantify the motility rate we apply two different automatic image detection methods to locate the position of the leading edge of the spreading population after , and hours. These results are compared with a manual edge detection method where we systematically vary the detection threshold. Our results indicate that the observed spreading rates are very sensitive to the choice of image analysis tools and we show that a standard measure of cell migration can vary by as much as 25% for the same experimental images depending on the details of the image analysis tools. Our results imply that it is very difficult, if not impossible, to meaningfully compare previously published measures of cell migration since previous results have been obtained using different image analysis techniques and the details of these techniques are not always reported. Using a mathematical model, we provide a physical interpretation of our edge detection results. The physical interpretation is important since edge detection algorithms alone do not specify any physical measure, or physical definition, of the leading edge of the spreading population. Our modeling indicates that variations in the image threshold parameter correspond to a consistent variation in the local cell density. This means that varying the threshold parameter is equivalent to varying the location of the leading edge in the range of approximately 1–5% of the maximum cell density. PMID:23826283
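
    A small sketch of the threshold sensitivity (the one-dimensional density profile and every number are synthetic assumptions): sliding the detection threshold moves the reported leading-edge position by hundreds of micrometres, echoing the ~25% variation noted above.

      # Leading-edge position versus detection threshold.
      import numpy as np

      x = np.linspace(0, 2000, 400)                       # position (um)
      density = 1.0 / (1.0 + np.exp((x - 1200) / 150.0))  # smooth spreading front

      def leading_edge(profile, threshold):
          """Right-most position where the profile still exceeds threshold."""
          above = np.where(profile >= threshold)[0]
          return x[above[-1]] if above.size else float("nan")

      for thr in (0.01, 0.05, 0.10):
          print(f"threshold={thr:.2f} -> edge at {leading_edge(density, thr):.0f} um")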

  12. The quantified patient: a patient participatory culture.

    PubMed

    Appelboom, Geoff; LoPresti, Melissa; Reginster, Jean-Yves; Sander Connolly, E; Dumont, Emmanuel P L

    2014-12-01

    The Quantified Self Movement, which aims to improve various aspects of life and health through recording and reviewing daily activities and biometrics, is a new and upcoming practice of self-monitoring that holds much promise. Now, the most underutilized resource in ambulatory health care, the patient, can participate like never before, and the patient's Quantified Self can be directly monitored and remotely accessed by health care professionals. PMID:25118077

  13. Pembrolizumab Shows Promise for NSCLC.

    PubMed

    2015-06-01

    Data from the KEYNOTE-001 trial show that pembrolizumab improves clinical outcomes for patients with advanced non-small cell lung cancer, and is well tolerated. PD-L1 expression in at least 50% of tumor cells correlated with improved efficacy. PMID:25895920

  14. The OOPSLA trivia show (TOOTS)

    Microsoft Academic Search

    Jeff Gray; Douglas C. Schmidt

    2009-01-01

    OOPSLA has a longstanding tradition of being a forum for discussing the cutting edge of technology in a fun and participatory environment. The type of events sponsored by OOPSLA sometimes border on the unconventional. This event represents an atypical panel that conforms to the concept of a game show that is focused on questions and answers related to OOPSLA themes.

  15. Managing Beef Cattle for Show

    E-print Network

    Herd, Dennis B.; Boleman, Chris; Boleman, Larry L.

    2001-11-16

    in show diets because of its rapid digestion and tendency to cause acidosis (see the section on health). Oats are excellent for growth and development of steers or heifers. A mixture similar in nutrient content to oats can be formulated with a high...

  16. Quantifying hybridization in realistic time.

    PubMed

    Collins, Joshua; Linz, Simone; Semple, Charles

    2011-10-01

    Recently, numerous practical and theoretical studies in evolutionary biology have aimed at calculating the extent to which reticulation-for example, horizontal gene transfer, hybridization, or recombination-has influenced the evolutionary history of a set of present-day species. It has been shown that inferring the minimum number of hybridization events needed to simultaneously explain the evolutionary history of a set of trees is an NP-hard but fixed-parameter tractable problem. In this article, we give a new fixed-parameter algorithm for computing the minimum number of hybridization events when two rooted binary phylogenetic trees are given. This newly developed algorithm is based on interleaving-a technique using repeated kernelization steps that are applied throughout the exhaustive-search part of a fixed-parameter algorithm. To show that our algorithm runs efficiently enough to be applicable to a wide range of practical problem instances, we apply it to a grass data set and highlight the significant improvements in running times in comparison to a previously implemented algorithm. PMID:21210735

  17. Common ecology quantifies human insurgency.

    PubMed

    Bohorquez, Juan Camilo; Gourley, Sean; Dixon, Alexander R; Spagat, Michael; Johnson, Neil F

    2009-12-17

    Many collective human activities, including violence, have been shown to exhibit universal patterns. The size distributions of casualties both in whole wars from 1816 to 1980 and terrorist attacks have separately been shown to follow approximate power-law distributions. However, the possibility of universal patterns ranging across wars in the size distribution or timing of within-conflict events has barely been explored. Here we show that the sizes and timing of violent events within different insurgent conflicts exhibit remarkable similarities. We propose a unified model of human insurgency that reproduces these commonalities, and explains conflict-specific variations quantitatively in terms of underlying rules of engagement. Our model treats each insurgent population as an ecology of dynamically evolving, self-organized groups following common decision-making processes. Our model is consistent with several recent hypotheses about modern insurgency, is robust to many generalizations, and establishes a quantitative connection between human insurgency, global terrorism and ecology. Its similarity to financial market models provides a surprising link between violent and non-violent forms of human behaviour. PMID:20016600
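
    A minimal sketch of the size-distribution ingredient (event sizes are synthetic; real analyses also need xmin selection and goodness-of-fit testing): the continuous maximum-likelihood estimator recovers the power-law exponent.

      # MLE for a continuous power-law exponent (Hill-type estimator).
      import numpy as np

      rng = np.random.default_rng(1)
      xmin, alpha_true = 1.0, 2.5
      # inverse-transform sampling of a continuous power law
      sizes = xmin * (1 - rng.uniform(size=10_000)) ** (-1 / (alpha_true - 1))

      alpha_hat = 1 + sizes.size / np.log(sizes / xmin).sum()
      print(f"estimated exponent: {alpha_hat:.2f}")   # should be close to 2.5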

  18. Quantifying thermodynamics of collagen thermal denaturation by second harmonic generation imaging

    NASA Astrophysics Data System (ADS)

    Hovhannisyan, Vladimir A.; Su, Ping-Jung; Lin, Sung-Jan; Dong, Chen-Yuan

    2009-06-01

    Time-lapse second harmonic generation (SHG) microscopy was applied for the extraction of thermodynamic parameters of collagen thermal denaturation. We found that at sufficiently high temperatures, temporal dependence of SHG intensity from the isothermal treatment of chicken dermal collagen was single exponential and can be modeled by the Arrhenius equation. Activation energy and the frequency factor of chicken dermal collagen thermal denaturation were determined using temporal decays of SHG intensity at different temperatures. Our results show that time-lapse, high temperature SHG imaging can be used to quantify kinetic properties of collagen thermal denaturation within a microscopic volume of 1 nl.
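
    The kinetic analysis reduces to a short worked sketch (rate constants and temperatures are hypothetical): each isothermal SHG decay I(t) ~ exp(-k t) yields a rate constant k(T), and a linear fit of ln k against 1/T returns the activation energy from the Arrhenius equation ln k = ln A - Ea/(R T).

      # Arrhenius analysis of single-exponential decay rates.
      import numpy as np

      R = 8.314                          # gas constant (J mol^-1 K^-1)
      Ea_true, lnA_true = 4.0e5, 140.0   # hypothetical values

      temps = np.array([323.0, 328.0, 333.0, 338.0])   # isothermal runs (K)
      ks = np.exp(lnA_true - Ea_true / (R * temps))    # fitted decay rates

      # regression of ln k on 1/T: slope = -Ea/R, intercept = ln A
      slope, intercept = np.polyfit(1.0 / temps, np.log(ks), 1)
      print(f"Ea = {-slope * R / 1e3:.0f} kJ/mol, ln A = {intercept:.1f}")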

  19. Quantifying Urban Groundwater in Environmental Field Observatories

    NASA Astrophysics Data System (ADS)

    Welty, C.; Miller, A. J.; Belt, K.; Smith, J. A.; Band, L. E.; Groffman, P.; Scanlon, T.; Warner, J.; Ryan, R. J.; Yeskis, D.; McGuire, M. P.

    2006-12-01

    Despite the growing footprint of urban landscapes and their impacts on hydrologic and biogeochemical cycles, comprehensive field studies of urban water budgets are few. The cumulative effects of urban infrastructure (buildings, roads, culverts, storm drains, detention ponds, leaking water supply and wastewater pipe networks) on temporal and spatial patterns of groundwater stores, fluxes, and flowpaths are poorly understood. The goal of this project is to develop expertise and analytical tools for urban groundwater systems that will inform future environmental observatory planning and that can be shared with research teams working in urban environments elsewhere. The work plan for this project draws on a robust set of information resources in Maryland provided by ongoing monitoring efforts of the Baltimore Ecosystem Study (BES), USGS, and the U.S. Forest Service working together with university scientists and engineers from multiple institutions. A key concern is to bridge the gap between small-scale intensive field studies and larger-scale and longer-term hydrologic patterns using synoptic field surveys, remote sensing, numerical modeling, data mining and visualization tools. Using the urban water budget as a unifying theme, we are working toward estimating the various elements of the budget in order to quantify the influence of urban infrastructure on groundwater. Efforts include: (1) comparison of base flow behavior from stream gauges in a nested set of watersheds at four different spatial scales from 0.8 to 171 km2, with diverse patterns of impervious cover and urban infrastructure; (2) synoptic survey of well water levels to characterize the regional water table; (3) use of airborne thermal infrared imagery to identify locations of groundwater seepage into streams across a range of urban development patterns; (4) use of seepage transects and tracer tests to quantify the spatial pattern of groundwater fluxes to the drainage network in selected subwatersheds; (5) development of a mass balance for precipitation over a 170 km2 area on a 1x1 km2 grid using recording rain gages for bias correction of weather radar products; (6) calculation of urban evapotranspiration using the Penman-Monteith method compared with results from an eddy correlation station; (7) use of a numerical groundwater model in a screening mode to estimate the depth of groundwater contributing surface water flow; and (8) data mining of public agency records of potable water and wastewater flows to estimate leakage rates and flowpaths in relation to streamflow and groundwater fluxes.

  20. Quantifying global international migration flows.

    PubMed

    Abel, Guy J; Sander, Nikola

    2014-03-28

    Widely available data on the number of people living outside of their country of birth do not adequately capture contemporary intensities and patterns of global migration flows. We present data on bilateral flows between 196 countries from 1990 through 2010 that provide a comprehensive view of international migration flows. Our data suggest a stable intensity of global 5-year migration flows at ~0.6% of world population since 1995. In addition, the results aid the interpretation of trends and patterns of migration flows to and from individual countries by placing them in a regional or global context. We estimate the largest movements to occur between South and West Asia, from Latin to North America, and within Africa. PMID:24675962

  1. Quantifying the dynamics of financial correlations

    NASA Astrophysics Data System (ADS)

    Drożdż, S.; Kwapień, J.; Grümmer, F.; Ruf, F.; Speth, J.

    2001-10-01

    A novel application of the correlation matrix formalism to study the dynamics of financial evolution is presented. This formalism allows us to quantify memory effects as well as some potentially repeatable intraday structures in financial time series. The present study is based on high-frequency Deutsche Aktienindex (DAX) data over the time period between November 1997 and December 1999 and demonstrates the power of the method. In this way, two significant new aspects of the DAX evolution are identified: (i) the memory effects turn out to be sizably shorter than what the standard autocorrelation function analysis seems to indicate, and (ii) there exist short-term repeatable structures in fluctuations that are governed by a distinct dynamics. The former of these results may provide an argument in favour of market efficiency, while the latter may indicate the origin of the difficulty in reaching a Gaussian limit, expected from the central limit theorem, in the distribution of returns on longer time horizons.
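
    A minimal sketch of the formalism (returns are synthetic with one common factor, not DAX data): form the correlation matrix of normalized return series and inspect its eigenvalue spectrum; the largest eigenvalue carries the collective market mode.

      # Correlation-matrix spectrum of normalized return time series.
      import numpy as np

      rng = np.random.default_rng(2)
      n_assets, n_times = 30, 1000
      market = rng.normal(size=n_times)                 # common factor
      returns = 0.3 * market + rng.normal(size=(n_assets, n_times))
      returns = (returns - returns.mean(axis=1, keepdims=True)) / returns.std(axis=1, keepdims=True)

      C = returns @ returns.T / n_times                 # correlation matrix
      eigvals = np.linalg.eigvalsh(C)                   # ascending order
      print(f"largest eigenvalue: {eigvals[-1]:.1f} (collective market mode)")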

  2. Quantifying Ant Activity Using Vibration Measurements

    PubMed Central

    Oberst, Sebastian; Baro, Enrique Nava; Lai, Joseph C. S.; Evans, Theodore A.

    2014-01-01

    Ant behaviour is of great interest due to their sociality. Ant behaviour is typically observed visually, however there are many circumstances where visual observation is not possible. It may be possible to assess ant behaviour using vibration signals produced by their physical movement. We demonstrate through a series of bioassays with different stimuli that the level of activity of meat ants (Iridomyrmex purpureus) can be quantified using vibrations, corresponding to observations with video. We found that ants exposed to physical shaking produced the highest average vibration amplitudes followed by ants with stones to drag, then ants with neighbours, illuminated ants and ants in darkness. In addition, we devised a novel method based on wavelet decomposition to separate the vibration signal owing to the initial ant behaviour from the substrate response, which will allow signals recorded from different substrates to be compared directly. Our results indicate the potential to use vibration signals to classify some ant behaviours in situations where visual observation could be difficult. PMID:24658467
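
    A small sketch of the separation idea (synthetic trace; the wavelet family, decomposition level and amplitudes are assumptions, not the paper's choices): a discrete wavelet decomposition lets one zero the coarse approximation, treated here as the substrate response, and reconstruct only the fine-scale activity component.

      # Wavelet-based separation sketch using PyWavelets (pywt).
      import numpy as np
      import pywt

      rng = np.random.default_rng(3)
      t = np.linspace(0, 1, 2048)
      substrate = 0.5 * np.sin(2 * np.pi * 3 * t)   # slow substrate mode
      ant = 0.05 * rng.normal(size=t.size)          # broadband ant activity
      signal = substrate + ant

      coeffs = pywt.wavedec(signal, "db4", level=6)
      coeffs[0][:] = 0.0                            # discard the coarse approximation
      ant_estimate = pywt.waverec(coeffs, "db4")[: t.size]
      print(f"recovered activity RMS: {ant_estimate.std():.3f}")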

  3. Quantifying pressure variations from petrographic observations

    NASA Astrophysics Data System (ADS)

    Vrijmoed, Johannes C.; Podladchikov, Yuri Y.

    2015-04-01

    The existence of grain-scale pressure variations has been established over recent decades. Mineral reactions are often accompanied by volume and shape changes in a system with much heterogeneity in material properties. This gives rise to internal stresses and pressure variations during metamorphic reactions. The residual pressure in inclusions can be measured by Raman spectroscopy, but this is restricted to a narrow range of minerals that (potentially) have a well calibrated Raman shift with pressure. Several alternative methods to quantify pressure variations from petrographic observations are presented. We distinguish equilibrium and non-equilibrium methods. Equilibrium methods are based on a newly developed approach to predict phase equilibria and composition under a given pressure gradient. The pressure gradient can be found by iteratively matching predicted phase assemblages and compositions with petrographic observations. Non-equilibrium methods involve the estimation of pressure variations in the initial stages of reaction, in which the system may still be isochoric. This yields the potential pressure buildup for a given unreacted rock, for example in the initial stages of dehydration of serpentinite in subduction settings.

  4. Quantifying truncation errors in effective field theory

    E-print Network

    R. J. Furnstahl; N. Klco; D. R. Phillips; S. Wesolowski

    2015-06-03

    Bayesian procedures designed to quantify truncation errors in perturbative calculations of quantum chromodynamics observables are adapted to expansions in effective field theory (EFT). In the Bayesian approach, such truncation errors are derived from degree-of-belief (DOB) intervals for EFT predictions. Computation of these intervals requires specification of prior probability distributions ("priors") for the expansion coefficients. By encoding expectations about the naturalness of these coefficients, this framework provides a statistical interpretation of the standard EFT procedure where truncation errors are estimated using the order-by-order convergence of the expansion. It also permits exploration of the ways in which such error bars are, and are not, sensitive to assumptions about EFT-coefficient naturalness. We first demonstrate the calculation of Bayesian probability distributions for the EFT truncation error in some representative examples, and then focus on the application of chiral EFT to neutron-proton scattering. Epelbaum, Krebs, and Meißner recently articulated explicit rules for estimating truncation errors in such EFT calculations of few-nucleon-system properties. We find that their basic procedure emerges generically from one class of naturalness priors considered, and that all such priors result in consistent quantitative predictions for 68% DOB intervals. We then explore several methods by which the convergence properties of the EFT for a set of observables may be used to check the statistical consistency of the EFT expansion parameter.
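
    The central computation admits a compact sketch (Q, the order k and the naturalness scale cbar are all hypothetical numbers, and a Gaussian prior is only one of the prior classes discussed): draw the first omitted coefficient from the prior, scale it by Q**(k+1), and read the 68% degree-of-belief interval off the quantiles.

      # Monte Carlo sketch of a Bayesian truncation-error (DOB) interval.
      import numpy as np

      rng = np.random.default_rng(4)
      Q = 0.33      # hypothetical EFT expansion parameter
      k = 3         # last order retained in the expansion
      cbar = 1.0    # naturalness scale suggested by the known coefficients

      # first omitted term: c_{k+1} * Q**(k+1), with c_{k+1} ~ N(0, cbar^2)
      errors = rng.normal(0.0, cbar, 100_000) * Q ** (k + 1)
      lo, hi = np.percentile(errors, [16, 84])
      print(f"68% DOB interval for the truncation error: [{lo:.4f}, {hi:.4f}]")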

  5. Magic Carpet Shows Its Colors

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The upper left image in this display is from the panoramic camera on the Mars Exploration Rover Spirit, showing the 'Magic Carpet' region near the rover at Gusev Crater, Mars, on Sol 7, the seventh martian day of its journey (Jan. 10, 2004). The lower image, also from the panoramic camera, is a monochrome (single filter) image of a rock in the 'Magic Carpet' area. Note that colored portions of the rock correlate with extracted spectra shown in the plot to the side. Four different types of materials are shown: the rock itself, the soil in front of the rock, some brighter soil on top of the rock, and some dust that has collected in small recesses on the rock face ('spots'). Each color on the spectra matches a line on the graph, showing how the panoramic camera's different colored filters are used to broadly assess the varying mineral compositions of martian rocks and soils.

  6. Quantifying subsurface mixing of groundwater from lowland stream perspective.

    NASA Astrophysics Data System (ADS)

    van der Velde, Ype; Torfs, Paul; van der Zee, Sjoerd; Uijlenhoet, Remko

    2013-04-01

    The distribution of the time it takes water from the moment of precipitation to reach the catchment outlet is widely used as a characteristic for catchment discharge behaviour, catchment vulnerability to pollution spreading, and pollutant loads from catchments to downstream waters. However, this distribution tends to vary in time, driven by variability in precipitation and evapotranspiration. Subsurface mixing controls the extent to which dynamics in rainfall and evapotranspiration are translated into dynamics of travel time distributions. This insight into the hydrologic functioning of catchments requires new definitions and concepts that link the dynamics of catchment travel time distributions to the degree of subsurface mixing. In this presentation we propose the concept of STorage Outflow Probability (STOP) functions, which quantify the probability of water parcels stored in a catchment leaving this catchment by discharge or evapotranspiration. We will show how STOPs relate to the topography and the subsurface, and how they can be used for deriving time-varying travel time distributions of a catchment. The presented analyses combine a unique dataset of high-frequency discharge and nitrate concentration measurements with results of a spatially distributed groundwater model and conceptual models of water flow and solute transport. Remarkable findings are the large contrasts in discharge behaviour, expressed in travel time, between lowland and sloping catchments, and the strong relationship between evapotranspiration and stream water nutrient concentration dynamics.

  7. Identifying and quantifying radiation damage at the atomic level

    PubMed Central

    Gerstel, Markus; Deane, Charlotte M.; Garman, Elspeth F.

    2015-01-01

    Radiation damage impedes macromolecular diffraction experiments. Alongside the well known effects of global radiation damage, site-specific radiation damage affects data quality and the veracity of biological conclusions on protein mechanism and function. Site-specific radiation damage follows a relatively predetermined pattern, in that different structural motifs are affected at different dose regimes: in metal-free proteins, disulfide bonds tend to break first followed by the decarboxylation of aspartic and glutamic acids. Even within these damage motifs the decay does not progress uniformly at equal rates. Within the same protein, radiation-induced electron density decay of a particular chemical group is faster than for the same group elsewhere in the protein: an effect known as preferential specific damage. Here, B Damage, a new atomic metric, is defined and validated to recognize protein regions susceptible to specific damage and to quantify the damage at these sites. By applying B Damage to a large set of known protein structures in a statistical survey, correlations between the rates of damage and various physicochemical parameters were identified. Results indicate that specific radiation damage is independent of secondary protein structure. Different disulfide bond groups (spiral, hook, and staple) show dissimilar radiation damage susceptibility. There is a consistent positive correlation between specific damage and solvent accessibility. PMID:25723922
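
    Following the description above, the metric admits a compact sketch (packing densities, B factors and bin edges are synthetic assumptions): divide each atom's B factor by the mean B factor of atoms in a similar packing-density bin, so values well above 1 flag candidate damage sites.

      # BDamage-style normalization of atomic B factors.
      import numpy as np

      rng = np.random.default_rng(5)
      packing = rng.uniform(5, 40, 1000)                     # contacts per atom
      bfactor = 60.0 / packing * rng.lognormal(0, 0.2, 1000)

      idx = np.digitize(packing, np.linspace(5, 40, 8))      # similar-environment bins
      bdamage = np.empty_like(bfactor)
      for b in np.unique(idx):
          mask = idx == b
          bdamage[mask] = bfactor[mask] / bfactor[mask].mean()

      print(f"atoms with BDamage > 1.5: {int((bdamage > 1.5).sum())}")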

  8. Pig and the Poultry Show 

    E-print Network

    Sloan

    2009-01-01

    directed toward increasing digestibility of the grain. Fitch and Wolberg (21) found that 43% of the grain of Kansas Orange and 36% of the grain of Atlas Sorgo silage was voided intact in the animal's feces. Analysis of the voided grains showed... whole kernels, but the starchy endosperm had been digested. Davis and Waldern (14) observed that 7.1% of the kernels and 1.2% of the whole silage DM appeared in the feces as whole kernels. The proportion of in vitro digestible dry matter (IVDDM...

  9. Managing Beef Cattle for Show 

    E-print Network

    Herd, Dennis B.; Boleman, Chris; Boleman, Larry L.

    2001-11-16

    Protein supplements: Feeds such as cottonseed meal, soybean meal and linseed meal increase the protein content of the diet. Small amounts (less than 3 percent) of fish meal, dried blood meal, corn gluten meal, linseed meal and brewers or distillers grains... of nutritional ailments of acidosis, bloat and possibly founder. A big full middle on a steer can be more effectively controlled by limiting feed and water the last few weeks before show, not by eliminating hay from the diet. Hay should be free of mold, dust...

  10. The red planet shows off

    NASA Astrophysics Data System (ADS)

    Beish, J. D.; Parker, D. C.; Hernandez, C. E.

    1989-01-01

    Results from observations of Mars between November 1987 and September 1988 are reviewed. The observations were part of a program to provide continuous global coverage of Mars in the period surrounding its opposition on September 28, 1988. Observations of Martian clouds, dust storms, the planet's south pole, and the Martian surface are discussed.

  11. Quantifying Neural Coding of Event Timing

    PubMed Central

    Soteropoulos, Demetris S.; Baker, Stuart N.

    2009-01-01

    Single-neuron firing is often analyzed relative to an external event, such as successful task performance or the delivery of a stimulus. The perievent time histogram (PETH) examines how, on average, neural firing modulates before and after the alignment event. However, the PETH contains no information about the single-trial reliability of the neural response, which is important from the perspective of a target neuron. In this study, we propose the concept of using the neural activity to predict the timing of the occurrence of an event, as opposed to using the event to predict the neural response. We first estimate the likelihood of an observed spike train, under the assumption that it was generated by an inhomogeneous gamma process with rate profile similar to the PETH shifted by a small time. This is used to generate a probability distribution of the event occurrence, using Bayes’ rule. By an information theoretic approach, this method yields a single value (in bits) that quantifies the reduction in uncertainty regarding the time of an external event following observation of the spike train. We show that the approach is sensitive to the amplitude of a response, to the level of baseline firing, and to the consistency of a response between trials, all of which are factors that will influence a neuron's ability to code for the time of the event. The technique can provide a useful means not only of determining which of several behavioral events a cell encodes best, but also of permitting objective comparison of different cell populations. PMID:19019976
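
    A compact sketch of the decoding idea (synthetic spike train; the spike model is simplified from the paper's inhomogeneous gamma process to an inhomogeneous Poisson process, and all rates are invented): shift the PETH rate profile over candidate event times, score the observed spike train by its log likelihood, apply Bayes' rule, and report the reduction in uncertainty in bits.

      # Bayesian event-time decoding sketch (Poisson simplification).
      import numpy as np

      dt = 0.001                                          # 1 ms bins
      t = np.arange(-0.5, 0.5, dt)
      peth = 5.0 + 30.0 * np.exp(-t**2 / (2 * 0.05**2))   # rate profile (Hz)

      rng = np.random.default_rng(6)
      spikes = rng.random(t.size) < peth * dt             # one simulated trial

      shifts = np.arange(-50, 51)                         # candidate offsets (samples)
      loglik = np.array([
          (spikes * np.log(np.roll(peth, s) * dt)).sum() - (np.roll(peth, s) * dt).sum()
          for s in shifts
      ])
      post = np.exp(loglik - loglik.max())
      post /= post.sum()                                  # Bayes' rule, flat prior

      prior_bits = np.log2(shifts.size)
      post_bits = -(post * np.log2(post + 1e-12)).sum()
      print(f"information about event time: {prior_bits - post_bits:.2f} bits")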

  12. Quantifying Sentiment and Influence in Blogspaces

    SciTech Connect

    Hui, Peter SY; Gregory, Michelle L.

    2010-07-25

    The weblog, or blog, has become a popular form of social media, through which authors can write posts, which can in turn generate feedback in the form of user comments. When considered in totality, a collection of blogs can thus be viewed as a sort of informal collection of mass sentiment and opinion. An obvious topic of interest might be to mine this collection to obtain some gauge of public sentiment over the wide variety of topics contained therein. However, the sheer size of the so-called blogosphere, combined with the fact that the subjects of posts can vary over a practically limitless number of topics, poses some serious challenges when any meaningful analysis is attempted. Namely, the fact that largely anyone with access to the Internet can author their own blog raises the serious issue of credibility: should some blogs be considered to be more influential than others, and consequently, when gauging sentiment with respect to a topic, should some blogs be weighted more heavily than others? In addition, as new posts and comments can be made on almost a constant basis, any blog analysis algorithm must be able to handle such updates efficiently. In this paper, we give a formalization of the blog model. We give formal methods of quantifying sentiment and influence with respect to a hierarchy of topics, with the specific aim of facilitating the computation of a per-topic, influence-weighted sentiment measure. Finally, as efficiency is a specific end goal, we give upper bounds on the time required to update these values with new posts, showing that our analysis and algorithms are scalable.

  13. IMPAIRED VERBAL COMPREHENSION OF QUANTIFIERS IN CORTICOBASAL SYNDROME

    PubMed Central

    Troiani, Vanessa; Clark, Robin; Grossman, Murray

    2011-01-01

    Objective Patients with Corticobasal Syndrome (CBS) have atrophy in posterior parietal cortex. This region of atrophy has been previously linked with their quantifier comprehension difficulty, but previous studies used visual stimuli, making it difficult to account for potential visuospatial deficits in CBS patients. The current study evaluated comprehension of generalized quantifiers using strictly verbal materials. Method CBS patients, a brain-damaged control group (consisting of Alzheimer's Disease and frontotemporal dementia), and age-matched controls participated in this study. We assessed familiar temporal, spatial, and monetary domains of verbal knowledge comparatively. Judgment accuracy was only evaluated in statements for which patients demonstrated accurate factual knowledge about the target domain. Results We found that patients with CBS are significantly impaired in their ability to evaluate quantifiers compared to healthy seniors and a brain-damaged control group, even in this strictly verbal task. This impairment was seen in the vast majority of individual CBS patients. Conclusions These findings offer additional evidence of quantifier impairment in CBS patients and emphasize that this impairment cannot be attributed to potential spatial processing impairments in patients with parietal disease. PMID:21381823

  14. Phoenix Scoop Inverted Showing Rasp

    NASA Technical Reports Server (NTRS)

    2008-01-01

    This image taken by the Surface Stereo Imager on Sol 49, or the 49th Martian day of the mission (July 14, 2008), shows the silver colored rasp protruding from NASA's Phoenix Mars Lander's Robotic Arm scoop. The scoop is inverted and the rasp is pointing up.

    Shown with its forks pointing toward the ground is the thermal and electrical conductivity probe, at the lower right. The Robotic Arm Camera is pointed toward the ground.

    The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is led by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  15. Obtaining Laws Through Quantifying Experiments: Justifications of Pre-service Physics Teachers in the Case of Electric Current, Voltage and Resistance

    NASA Astrophysics Data System (ADS)

    Mäntylä, Terhi; Hämäläinen, Ari

    2015-07-01

    The language of physics is mathematics, and physics ideas, laws and models describing phenomena are usually represented in mathematical form. Therefore, an understanding of how to navigate between phenomena and the models representing them in mathematical form is important for a physics teacher so that the teacher can make physics understandable to students. Here, the focus is on the "experimental mathematization," how laws are established through quantifying experiments. A sequence from qualitative experiments to mathematical formulations through quantifying experiments on electric current, voltage and resistance in pre-service physics teachers' laboratory reports is examined. The way students reason and justify the mathematical formulation of the measurement results and how they combine the treatment and presentation of empirical data to their justifications is analyzed. The results show that pre-service physics teachers understand the basic idea of how quantifying experiments establish the quantities and laws but are not able to argue it in a justified manner.

  16. Casimir experiments showing saturation effects

    E-print Network

    Bo E. Sernelius

    2009-10-27

    We address several different Casimir experiments where theory and experiment disagree. First out is the classical Casimir force measurement between two metal half spaces; here both in the form of the torsion pendulum experiment by Lamoreaux and in the form of the Casimir pressure measurement between a gold sphere and a gold plate as performed by Decca et al.; theory predicts a large negative thermal correction, absent in the high precision experiments. The third experiment is the measurement of the Casimir force between a metal plate and a laser irradiated semiconductor membrane as performed by Chen et al.; the change in force with laser intensity is larger than predicted by theory. The fourth experiment is the measurement of the Casimir force between an atom and a wall in the form of the measurement by Obrecht et al. of the change in oscillation frequency of a 87 Rb Bose-Einstein condensate trapped to a fused silica wall; the change is smaller than predicted by theory. We show that saturation effects can explain the discrepancies between theory and experiment observed in all these cases.

  17. Casimir experiments showing saturation effects

    SciTech Connect

    Sernelius, Bo E. [Division of Theory and Modeling, Department of Physics, Chemistry and Biology, Linkoeping University, SE-581 83 Linkoeping (Sweden)

    2009-10-15

    We address several different Casimir experiments where theory and experiment disagree. First out is the classical Casimir force measurement between two metal half spaces; here both in the form of the torsion pendulum experiment by Lamoreaux and in the form of the Casimir pressure measurement between a gold sphere and a gold plate as performed by Decca et al.; theory predicts a large negative thermal correction, absent in the high precision experiments. The third experiment is the measurement of the Casimir force between a metal plate and a laser irradiated semiconductor membrane as performed by Chen et al.; the change in force with laser intensity is larger than predicted by theory. The fourth experiment is the measurement of the Casimir force between an atom and a wall in the form of the measurement by Obrecht et al. of the change in oscillation frequency of a 87Rb Bose-Einstein condensate trapped to a fused silica wall; the change is smaller than predicted by theory. We show that saturation effects can explain the discrepancies between theory and experiment observed in all these cases.

  18. A standardized method for quantifying unidirectional genetic introgression

    PubMed Central

    Karlsson, Sten; Diserud, Ola H; Moen, Thomas; Hindar, Kjetil

    2014-01-01

    Genetic introgression from domesticated to wild conspecifics is of great concern for the genetic integrity and viability of wild populations. Therefore, we need tools that can be used for monitoring unidirectional gene flow from domesticated to wild populations. A challenge in quantifying unidirectional gene flow is that both the donor and the recipient population may be genetically substructured and that the subpopulations are subject to genetic drift and may exchange migrants with one another. We develop a standardized method for quantifying and monitoring domesticated-to-wild gene flow and demonstrate its usefulness using farm and wild Atlantic salmon as a model species. The challenge of having several wild and farm populations was circumvented by generating in silico one analytical center point for farm and wild salmon, respectively. Distributions for the probability that an individual is wild were generated from individual-based analyses of observed wild and farm genotypes using STRUCTURE. We show that estimates of the proportion of the genome that is of domesticated origin in a particular wild population can be obtained without having a historical reference sample for the same population. The main advantages of the method presented are the standardized way in which genetic processes within and between populations are taken into account, and the individual-based analyses giving estimates for each individual independent of other individuals. The method makes use of established software, and as long as genetic markers showing generic genetic differences between domesticated and wild populations are available, it can be applied to all species with unidirectional gene flow. Results from our method are easy to interpret and understand, and will serve as a powerful tool for management, especially because there is no need for a specific historical wild reference sample. PMID:25473478

  19. Toward quantifying the deep Atlantic carbon storage increase during the last glaciation

    NASA Astrophysics Data System (ADS)

    Yu, J.; Menviel, L.; Jin, Z.

    2014-12-01

    Ice core records show that atmospheric CO2 concentrations during peak glacial time were ~30% lower than the levels during interglacial periods. The terrestrial biosphere carbon stock was likely reduced during glacials. Increased carbon storage in the deep ocean is thought to play an important role in lowering glacial atmospheric CO2. However, it has been challenging to quantify carbon storage changes in the deep ocean using existing proxy data. Here, we present deepwater carbonate ion reconstructions for a few locations in the deep Atlantic. These data allow us to estimate the minimum carbon storage increase in the deep Atlantic Ocean during the last glaciation. Our results show that, despite its relatively small volume, the deep Atlantic Ocean may contribute significantly to atmospheric CO2 variations at major climate transitions. Furthermore, our results suggest a strong coupling of ocean circulation and carbon cycle in the deep Atlantic during the last glaciation.

  20. A 15N tracing method to quantify N2O pathways from terrestrial ecosystems

    NASA Astrophysics Data System (ADS)

    Müller, Christoph; Laughlin, Ronnie; Spott, Oliver; Rütting, Tobias

    2014-05-01

    To quantify N2O production pathways from terrestrial ecosystems, a 15N tracing model was developed. The model is based on previous tracing models to quantify gross nitrogen (N) transformations, including soil nitrite (NO2-) dynamics. Four N2O pathways are considered in the model, each associated with a NO2- subpool: i) reduction of NO2- associated with nitrification (NO2-nit -> N2Onit), ii) reduction of NO2- associated with denitrification (NO2-den -> N2Oden), iii) reduction of NO2- associated with organic N (Norg) oxidation (NO2-org -> N2Oorg), and iv) codenitrification (N2Ocod), a hybrid reaction where one N atom in N2O originates from organic N and the other from NO2-den. The reaction kinetics and emission notations are based on first-order approaches. For all four N2O sub-pools, specific reduction rates to N2 were implemented. Parameters are optimized with the Metropolis algorithm (a Monte Carlo technique). A data set from an old grassland was used to test the 15N tracing tool. Results show that on average over a 12 day period N2Onit, N2Oden, N2Oorg and N2Ocod contributed 9%, 20%, 54% and 18% of the total N2O emission, respectively. Alternative techniques based on analytical approaches, which consider three N2O emission pathways, provide similar results. For the first time, four N2O emission pathways, including a hybrid reaction, can be quantified simultaneously. The analysis for the old grassland study showed that heterotrophic processes related to organic N turnover are the prevailing pathway for N2O production. The underlying NO2- and N2O reduction kinetics are in agreement with microbial measurements, and the calculated N2/N2O ratios are in the expected range. The model provides a framework for the development of more realistic representations of soil N cycling in ecosystem models.
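
    A minimal sketch of the model structure (rate constants, pool sizes and the Euler step are hypothetical; the real tool also tracks 15N enrichment and fits parameters with the Metropolis algorithm): first-order kinetics move N from the NO2- sub-pools into the four N2O sub-pools, with codenitrification drawing on NO2-den.

      # First-order four-pathway N2O sketch, integrated with explicit Euler.
      k = {"nit": 0.02, "den": 0.05, "org": 0.08, "cod": 0.03}   # rates (1/day)
      no2 = {"nit": 1.0, "den": 0.8, "org": 1.2}                 # NO2- sub-pools (ug N/g)
      n2o = {"nit": 0.0, "den": 0.0, "org": 0.0, "cod": 0.0}     # cumulative N2O

      dt, days = 0.1, 12.0
      for _ in range(int(days / dt)):
          flux = {
              "nit": k["nit"] * no2["nit"] * dt,
              "den": k["den"] * no2["den"] * dt,
              "org": k["org"] * no2["org"] * dt,
              "cod": k["cod"] * no2["den"] * dt,   # hybrid pathway draws on NO2-den
          }
          no2["nit"] -= flux["nit"]
          no2["den"] -= flux["den"] + flux["cod"]
          no2["org"] -= flux["org"]
          for p in n2o:
              n2o[p] += flux[p]

      total = sum(n2o.values())
      print({p: f"{100 * v / total:.0f}%" for p, v in n2o.items()})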

  1. Quantifying the sensitivity of simulated climate change to model configuration

    Microsoft Academic Search

    Barry H. Lynn; Richard Healy; Leonard M. Druyan

    2009-01-01

    This study used “factor separation” to quantify the sensitivity of simulated present and future surface temperatures and precipitation to alternative regional climate model physics components. The method enables a quantitative isolation of the effects of using each physical component as well as the combined effect of two or more components. Simulation results are presented from eight versions of the Mesoscale ...

  2. Quantifying Nitrogen Loss From Flooded Hawaiian Taro Fields

    NASA Astrophysics Data System (ADS)

    Deenik, J. L.; Penton, C. R.; Bruland, G. L.; Popp, B. N.; Engstrom, P.; Mueller, J. A.; Tiedje, J.

    2010-12-01

    In 2004 a field fertilization experiment showed that approximately 80% of the fertilizer nitrogen (N) added to flooded Hawaiian taro (Colocasia esculenta) fields could not be accounted for using classic N balance calculations. To quantify N loss through denitrification and anaerobic ammonium oxidation (anammox) pathways in these taro systems we utilized a slurry-based isotope pairing technique (IPT). Measured nitrification rates and porewater N profiles were also used to model ammonium and nitrate fluxes through the top 10 cm of soil. Quantitative PCR of nitrogen cycling functional genes was used to correlate porewater N dynamics with potential microbial activity. Rates of denitrification calculated using porewater profiles were compared to those obtained using the slurry method. Potential denitrification rates of surficial sediments obtained with the slurry method were found to drastically overestimate the calculated in-situ rates. The largest discrepancies were present in fields greater than one month after initial fertilization, reflecting a microbial community poised to denitrify the initial N pulse. Potential surficial nitrification rates varied between 1.3% of the slurry-measured denitrification potential in a heavily-fertilized site to 100% in an unfertilized site. Compared to the use of urea, fish bone meal fertilizer use resulted in decreased N loss through denitrification in the surface sediment, according to both porewater modeling and IPT measurements. In addition, sub-surface porewater profiles point to root-mediated coupled nitrification/denitrification as a potential N loss pathway that is not captured in surface-based incubations. Profile-based surface plus subsurface coupled nitrification/denitrification estimates were between 1.1 and 12.7 times denitrification estimates from the surface only. These results suggest that the use of a ‘classic’ isotope pairing technique that employs 15NO3- in fertilized agricultural systems can lead to a drastic overestimation of in-situ denitrification rates and that root-associated subsurface coupled nitrification/denitrification may be a major N loss pathway in these flooded agricultural systems.

  3. Quantifying dithiothreitol displacement of functional ligands from gold nanoparticles.

    PubMed

    Tsai, De-Hao; Shelton, Melanie P; DelRio, Frank W; Elzey, Sherrie; Guha, Suvajyoti; Zachariah, Michael R; Hackley, Vincent A

    2012-12-01

    Dithiothreitol (DTT)-based displacement is widely utilized for separating ligands from their gold nanoparticle (AuNP) conjugates, a critical step for differentiating and quantifying surface-bound functional ligands and therefore the effective surface density of these species on nanoparticle-based therapeutics and other functional constructs. The underlying assumption is that DTT is smaller and much more reactive toward gold compared with most ligands of interest, and as a result will reactively displace the ligands from surface sites, thereby enabling their quantification. In this study, we use complementary dimensional and spectroscopic methods to characterize the efficiency of DTT displacement. Thiolated methoxypolyethylene glycol (SH-PEG) and bovine serum albumin (BSA) were chosen as representative ligands. Results clearly show that (1) DTT does not completely displace bound SH-PEG or BSA from AuNPs, and (2) the displacement efficiency is dependent on the binding affinity between the ligands and the AuNP surface. Additionally, the displacement efficiency for conjugated SH-PEG is moderately dependent on the molecular mass (yielding efficiencies ranging from 60 to 80% measured by ATR-FTIR and ~90% by ES-DMA), indicating that the displacement efficiency for SH-PEG is predominantly determined by the S-Au bond. BSA is particularly difficult to displace with DTT (i.e., the displacement efficiency is nearly zero) when it is in the so-called normal form. The displacement efficiency for BSA improves to 80% when it undergoes a conformational change to the expanded form through a process of pH change or treatment with a surfactant. An analysis of the three-component system (SH-PEG + BSA + AuNP) indicates that the presence of SH-PEG decreases the displacement efficiency for BSA, whereas the displacement efficiency for SH-PEG is less impacted by the presence of BSA. PMID:23104310

  4. Determining a quantifiable pollution management model (QPM)

    Microsoft Academic Search

    Rhys Rowland-Jones; Malcolm Cresser

    2005-01-01

    Purpose – The aim of this research is to develop a model for environmental management from which a quantifiable indication of overall environmental performance for an organisation may be derived. Design/methodology/approach – The links between environmental performance and financial performance are considered. Several research methods are described which consider pollution performance. However, it is clear that no single method wholly reflects ...

  5. Event Calculus with Explicit Quantifiers

    E-print Network

    Franceschet, Massimo

    Iliano Cervesato, Massimo Franceschet, and Angelo Montanari (francesc@dimi.uniud.it; montana@dimi.uniud.it). Kowalski and Sergot's Event Calculus (EC) is a sim... -- TIME'98 (R. Morris, L. Khatib, editors), pp. 81-88, IEEE Computer Society Press, Sanibel Island, FL.

  6. An instrument to quantify dental calculus deposits

    Microsoft Academic Search

    R. L. Jeffcoat; I.-C. Wang; K. G. Palcanis; M. K. Jeffcoat

    1999-01-01

    An instrument to quantify subgingival dental calculus was developed and validated in human subjects. The device uses a miniature single-axis strain gauge accelerometer which detects and analyzes vibrations in a metal probe as its tip traverses the rough calculus deposits, and calculates a heuristic metric, the gated envelope energy ratio (GEER), intended to relate vibration signatures to the tactile sensation

  7. Transient Expression Assays for Quantifying Signaling Output

    E-print Network

    Sheen, Jen

    Yajie Niu and Jen Sheen. The protoplast transient expression system has become a powerful and popular tool for studying the output of various signaling pathways. Key words: Arabidopsis, Mesophyll protoplast, Transient expression

  8. Quantifying ‘humics’ in freshwaters: purpose and methods

    Microsoft Academic Search

    Montserrat Filella

    2010-01-01

    Natural organic matter (NOM) plays an important role in many environmentally relevant processes. NOM includes many different types of compounds, not all of which behave similarly. Much effort has gone into characterising some fractions of NOM (e.g. humic substances) in the different environmental compartments, in finding tracers to ascertain their origin, etc. However, few methods exist for quantifying the different

  9. Quantifying Ecosystem Controls and Their Contextual Interactions on Nutrient Export from Developing ...

    E-print Network

    Vermont, University of

    The complexity of natural ecosystems makes it difficult to compare the relative importance of abiotic and biotic factors and to assess the effects of their interactions on ecosystem development. To improve our ...

  10. Protocol comparison for quantifying in situ mineralization

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In situ mineralization methods are intended to quantify mineralization under realistic environmental conditions. This study was conducted to compare soil moisture and temperature in intake soil cores contained in cylinders to that in adjacent bulk soil, compare the effect of two resin bag techniques...

  11. A model to quantify wastewater odor strength

    Microsoft Academic Search

    Lawrence C. C. Koe; N. C. Tan

    1988-01-01

    A method of quantifying the odor strength of wastewater samples has been investigated. Wastewater samples from two locations of a wastewater treatment plant were collected and subjected to air stripping. The off-gas odor concentration was measured by a dynamic olfactometer at various time intervals. Applying a first order model to the decay of odorous substances in the wastewater under air

  12. Quantifying cellular alignment on anisotropic biomaterial platforms

    PubMed Central

    Nectow, Alexander R.; Kilmer, Misha E.; Kaplan, David L.

    2014-01-01

    How do we quantify cellular alignment? Cellular alignment is an important technique used to study and promote tissue regeneration in vitro and in vivo. Indeed, regenerative outcomes are often strongly correlated with the efficacy of alignment, making quantitative, automated assessment an important goal for the field of tissue engineering. There currently exist various classes of algorithms, which effectively address the problem of quantifying individual cellular alignments using Fourier methods, kernel methods, and elliptical approximation; however, these algorithms often yield population distributions and are limited by their inability to yield a scalar metric quantifying the efficacy of alignment. The current work builds on these classes of algorithms by adapting the signal processing methods previously used by our group to study the alignment of cellular processes. We use an automated, ellipse-fitting algorithm to approximate cell body alignment with respect to a silk biomaterial scaffold, followed by the application of the normalized cumulative periodogram criterion to produce a scalar value quantifying alignment. The proposed work offers a generalized method for assessing cellular alignment in complex, two-dimensional environments. This method may also offer a novel alternative for assessing the alignment of cell types with polarity, such as fibroblasts, endothelial cells, and mesenchymal stem cells, as well as nuclei. PMID:23520051
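
    As a concrete illustration (orientations are synthetic; the scalar used here is a nematic order parameter, a simpler stand-in for the normalized-cumulative-periodogram criterion named above): reduce each fitted ellipse to its orientation and summarize the population with a single value in [0, 1].

      # Scalar alignment readout from ellipse orientations.
      import numpy as np

      rng = np.random.default_rng(7)
      aligned = rng.normal(0.0, 0.15, 200)                 # radians, tight spread
      isotropic = rng.uniform(-np.pi / 2, np.pi / 2, 200)  # no preferred axis

      def nematic_order(theta):
          """|<exp(2i*theta)>|: 1 = perfectly aligned, 0 = isotropic."""
          return float(np.abs(np.exp(2j * np.asarray(theta)).mean()))

      print(f"aligned culture:   {nematic_order(aligned):.2f}")
      print(f"isotropic culture: {nematic_order(isotropic):.2f}")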

  13. Quantifying the Thermal Fatigue of CPV Modules

    SciTech Connect

    Bosco, N.; Kurtz, S.

    2011-02-01

    A method is presented to quantify thermal fatigue in the CPV die-attach from meteorological data. A comparative study between cities demonstrates a significant difference in the accumulated damage. These differences are most sensitive to the number of larger (ΔT) thermal cycles experienced at a location. High-frequency data (<1/min) may be required to most accurately employ this method.
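
    A toy version of the damage bookkeeping (the exponent, climate statistics and one-cycle-per-day assumption are all hypothetical; a full treatment would rainflow-count high-frequency temperature records): accumulate Coffin-Manson-style damage proportional to dT**m and compare locations.

      # Relative thermal-fatigue damage sketch for two hypothetical climates.
      import numpy as np

      rng = np.random.default_rng(8)
      daily_dT_a = rng.normal(25, 6, 365).clip(min=0)   # large daily swings
      daily_dT_b = rng.normal(12, 4, 365).clip(min=0)   # milder climate

      def accumulated_damage(dT, m=2.5):
          """Coffin-Manson-style damage, one thermal cycle per day."""
          return float((dT ** m).sum())

      ratio = accumulated_damage(daily_dT_a) / accumulated_damage(daily_dT_b)
      print(f"city A accumulates {ratio:.1f}x the die-attach fatigue of city B")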

  14. Comparison of Approaches to Quantify Arterial Damping

    E-print Network

    Chesler, Naomi C.

    Lian Tian (ltian22@wisc.edu), Zhijie Wang (zwang48@wisc.edu), and Naomi Chesler (chesler@engr.wisc.edu). Large conduit arteries are not purely elastic, but viscoelastic, which affects ...

  15. FIG. 2: Model results showing vertical and horizontal displacements due to the Hekla 2000 lava (disk, final relaxed response), compared with the Mogi model

    E-print Network

    Grapenthin, Ronni

    FIG. 2 (caption fragment): Model results showing vertical and horizontal displacements due to the Hekla 2000 lava and the results of the Mogi model, final relaxed response. Contact: ronni@gi.alaska.edu. AGU V53C-1421. Abstract: Modeling a circular lava flow

  16. COMPLEXITY & APPROXIMABILITY OF QUANTIFIED & STOCHASTIC CONSTRAINT SATISFACTION PROBLEMS

    SciTech Connect

    Hunt, H. B. (Harry B.); Marathe, M. V. (Madhav V.); Stearns, R. E. (Richard E.)

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S and T be arbitrary finite sets of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SATc(S)). Here, we study simultaneously the complexity of decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. We present simple yet general techniques to characterize, simultaneously, the complexity or efficient approximability of a number of versions/variants of the problems SAT(S), Q-SAT(S), S-SAT(S), MAX-Q-SAT(S) etc., for many different such D, C, S, T. Our unified approach is based on the following two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Some of the results extend earlier results in [Pa85,LMP99,CF+93,CF+94]. Our techniques and results reported here also provide significant steps towards obtaining dichotomy theorems for a number of the problems above, including the problems MAX-Q-SAT(S) and MAX-S-SAT(S). The discovery of such dichotomy theorems, for unquantified formulas, has received significant recent attention in the literature [CF+93,CF+94,Cr95,KSW97].
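
    For readers unfamiliar with quantified satisfiability, the Boolean special case can be illustrated with a toy recursive evaluator (this conveys only the flavor of Q-SAT; the paper's setting of arbitrary relations over arbitrary domains, and its stochastic quantifiers, are far more general):

        def eval_qbf(prefix, matrix, assignment=None):
            """Evaluate a quantified Boolean formula.
            prefix: list of ('forall' | 'exists', variable-name) pairs;
            matrix: a predicate taking a dict of variable assignments."""
            assignment = dict(assignment or {})
            if not prefix:
                return matrix(assignment)
            quantifier, var = prefix[0]
            branches = (eval_qbf(prefix[1:], matrix, {**assignment, var: value})
                        for value in (False, True))
            return all(branches) if quantifier == 'forall' else any(branches)

        # forall x exists y: x != y  -- true, since y can always be chosen as not-x
        print(eval_qbf([('forall', 'x'), ('exists', 'y')],
                       lambda a: a['x'] != a['y']))  # True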

  17. Entropy generation method to quantify thermal comfort

    NASA Technical Reports Server (NTRS)

    Boregowda, S. C.; Tiwari, S. N.; Chaturvedi, S. K.

    2001-01-01

    The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and quantify thermal comfort. The approach involves developing an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and entropy generation. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite-element-based KSU human thermal computer model is utilized as a "Computational Environmental Chamber" to conduct a series of simulations examining human thermal responses to different environmental conditions. The outputs from the simulation, which include human thermal responses, and the input data, consisting of environmental conditions, are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The Objective Thermal Comfort values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values; the PMV values are generated by inserting the corresponding air temperatures and vapor pressures used in the computer simulation into the regression equation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. However, an experimental study is needed in the future to fully establish the validity of the OTCI formula and the model. One practical application of this index is that it could be integrated into thermal control systems to develop human-centered environmental control systems for potential use in aircraft, mass transit vehicles, intelligent building systems, and space vehicles.
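
    The OTCI formula itself is not given in the abstract, but the underlying second-law bookkeeping is standard: heat Q flowing from the warmer body to the cooler environment generates entropy Q(1/T_env - 1/T_body). A toy illustration (the temperatures and heat flow below are hypothetical):

        def entropy_generation(q_watts, t_body_k=306.0, t_env_k=293.0):
            """Entropy generation rate (W/K) for heat flow q_watts from the
            body surface (t_body_k) to the environment (t_env_k).
            The second law guarantees the result is >= 0 when t_body_k > t_env_k."""
            return q_watts * (1.0 / t_env_k - 1.0 / t_body_k)

        # ~100 W of metabolic heat rejected from 33 C skin to a 20 C room
        print(f"S_gen = {entropy_generation(100.0):.4f} W/K")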

  18. Quantifying Lead-Time Bias in Risk-Factor Studies of Cancer through Simulation

    PubMed Central

    Jansen, Rick J.; Alexander, Bruce H.; Anderson, Kristin E.; Church, Timothy R.

    2013-01-01

    Purpose Lead-time is inherent in early detection and creates bias in observational studies of screening efficacy, but its potential to bias effect estimates in risk-factor studies is not always recognized. We describe a form of this bias that conventional analyses cannot address and develop a model to quantify it. Methods Surveillance Epidemiology and End Results (SEER) data form the basis for estimates of age-specific preclinical incidence and log-normal distributions describe the preclinical duration distribution. Simulations assume a joint null hypothesis of no effect of either the risk factor or screening on the preclinical incidence of cancer, and then quantify the bias as the risk-factor odds ratio (OR) from this null study. This bias can be used as a factor to adjust observed OR in the actual study. Results Results showed that for this particular study design, as average preclinical duration increased, the bias in the total-physical-activity OR monotonically increased from 1% to 22% above the null, but the smoking OR monotonically decreased from 1% above the null to 5% below the null. Conclusion The finding of nontrivial bias in fixed risk-factor effect estimates demonstrates the importance of quantitatively evaluating it in susceptible studies. PMID:23988688

  19. Quantifying electrode reliability during brain-computer interface operation.

    PubMed

    Sagha, Hesam; Perdikis, Serafeim; Millán, José del R; Chavarriaga, Ricardo

    2015-03-01

    One of the problems of noninvasive brain-computer interface (BCI) applications is the occurrence of anomalous (unexpected) signals that might degrade BCI performance. This situation might slip the operator's attention, since raw signals are not usually visualized and monitored continuously during BCI-actuated device operation. Anomalous data can, for instance, be the result of electrode misplacement, degrading impedance or loss of connectivity. Since this problem can develop at run time, there is a need for a systematic approach to evaluate electrode reliability during online BCI operation. In this paper, we propose two metrics that detect how much each channel deviates from its expected behavior. This quantifies electrode reliability at run time, and could be embedded into BCI data processing to increase performance. We assess the effectiveness of these metrics in quantifying signal degradation by conducting three experiments: electrode swap, electrode manipulation, and offline artificial degradation of P300 signals. PMID:25376032
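
    The abstract does not define the two metrics; a minimal stand-in for "deviation from expected behavior" is a per-channel z-score of a simple signal feature against a calibration baseline (the feature choice and all numbers below are assumptions):

        import numpy as np

        def channel_deviation(calib, online):
            """calib, online: arrays of shape (n_windows, n_channels) holding a
            per-window feature such as log band power. Returns |z| per channel:
            how far each channel's online mean drifts from its calibration mean."""
            mu = calib.mean(axis=0)
            sem = calib.std(axis=0, ddof=1) / np.sqrt(online.shape[0])
            return np.abs((online.mean(axis=0) - mu) / sem)

        rng = np.random.default_rng(1)
        calib = rng.normal(0.0, 1.0, (200, 8))
        online = rng.normal(0.0, 1.0, (50, 8))
        online[:, 3] += 2.5  # simulate a drifting electrode on channel 3
        print(channel_deviation(calib, online).round(1))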

  20. Quantifying variances in comparative RNA secondary structure prediction

    PubMed Central

    2013-01-01

    Background With the advancement of next-generation sequencing and transcriptomics technologies, regulatory effects involving RNA, in particular RNA structural changes, are being detected. These results often rely on RNA secondary structure predictions. However, current approaches to RNA secondary structure modelling produce predictions with a high variance in predictive accuracy, and we have little quantifiable knowledge about the reasons for these variances. Results In this paper we explore a number of factors which can contribute to poor RNA secondary structure prediction quality. We establish a quantified relationship between alignment quality and loss of accuracy. Furthermore, we define two new measures to quantify uncertainty in alignment-based structure predictions. One of the measures improves on the “reliability score” reported by PPfold, and considers alignment uncertainty as well as base-pair probabilities. The other measure considers the information entropy for SCFGs over a space of input alignments. Conclusions Our predictive accuracy improves on the PPfold reliability score. We can successfully characterize many of the underlying reasons for, and the variance in, poor prediction. However, there is still variability unaccounted for, which we therefore suggest comes from the RNA secondary structure predictive model itself. PMID:23634662

  1. Crisis of Japanese Vascular Flora Shown By Quantifying Extinction Risks for 1618 Taxa

    PubMed Central

    Kadoya, Taku; Takenaka, Akio; Ishihama, Fumiko; Fujita, Taku; Ogawa, Makoto; Katsuyama, Teruo; Kadono, Yasuro; Kawakubo, Nobumitsu; Serizawa, Shunsuke; Takahashi, Hideki; Takamiya, Masayuki; Fujii, Shinji; Matsuda, Hiroyuki; Muneda, Kazuo; Yokota, Masatsugu; Yonekura, Koji; Yahara, Tetsukazu

    2014-01-01

    Although many people have expressed alarm that we are witnessing a mass extinction, few projections have been quantified, owing to limited availability of time-series data on threatened organisms, especially plants. To quantify the risk of extinction, we need to monitor changes in population size over time for as many species as possible. Here, we present the world's first quantitative projection of plant species loss at a national level, with stochastic simulations based on the results of population censuses of 1618 threatened plant taxa in 3574 map cells of ca. 100 km2. More than 500 lay botanists helped monitor those taxa in 1994–1995 and in 2003–2004. We projected that between 370 and 561 vascular plant taxa will go extinct in Japan during the next century if past trends of population decline continue. This extinction rate is approximately two to three times the global rate. Using time-series data, we show that existing national protected areas (PAs) covering ca. 7% of Japan will not adequately prevent population declines: even core PAs can protect at best <60% of local populations from decline. Thus, the Aichi Biodiversity Target to expand PAs to 17% of land (and inland water) areas, as committed to by many national governments, is not enough: only 29.2% of currently threatened species will become non-threatened under the assumption that probability of protection success by PAs is 0.5, which our assessment shows is realistic. In countries where volunteers can be organized to monitor threatened taxa, censuses using our method should be able to quantify how fast we are losing species and to assess how effective current conservation measures such as PAs are in preventing species extinction. PMID:24922311
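
    The census-based projection can be caricatured with a small Monte Carlo model: each taxon is a set of local populations, each with a per-decade extirpation probability, and a taxon goes extinct when all of its populations are gone. Every number below is hypothetical; the paper's simulations are driven by the actual two-census decline data:

        import numpy as np

        rng = np.random.default_rng(42)
        n_taxa, horizon_decades = 1618, 10

        n_cells = rng.integers(1, 50, n_taxa)            # occupied map cells per taxon
        p_extirpation = rng.uniform(0.05, 0.4, n_taxa)   # per-decade, per-population

        def project_extinctions():
            surviving = n_cells.copy()
            for _ in range(horizon_decades):
                surviving = rng.binomial(surviving, 1.0 - p_extirpation)
            return int((surviving == 0).sum())           # taxa with no populations left

        runs = [project_extinctions() for _ in range(200)]
        print(np.percentile(runs, [2.5, 50, 97.5]))      # projected extinctions/century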

  2. Crisis of Japanese vascular flora shown by quantifying extinction risks for 1618 taxa.

    PubMed

    Kadoya, Taku; Takenaka, Akio; Ishihama, Fumiko; Fujita, Taku; Ogawa, Makoto; Katsuyama, Teruo; Kadono, Yasuro; Kawakubo, Nobumitsu; Serizawa, Shunsuke; Takahashi, Hideki; Takamiya, Masayuki; Fujii, Shinji; Matsuda, Hiroyuki; Muneda, Kazuo; Yokota, Masatsugu; Yonekura, Koji; Yahara, Tetsukazu

    2014-01-01

    Although many people have expressed alarm that we are witnessing a mass extinction, few projections have been quantified, owing to limited availability of time-series data on threatened organisms, especially plants. To quantify the risk of extinction, we need to monitor changes in population size over time for as many species as possible. Here, we present the world's first quantitative projection of plant species loss at a national level, with stochastic simulations based on the results of population censuses of 1618 threatened plant taxa in 3574 map cells of ca. 100 km2. More than 500 lay botanists helped monitor those taxa in 1994-1995 and in 2003-2004. We projected that between 370 and 561 vascular plant taxa will go extinct in Japan during the next century if past trends of population decline continue. This extinction rate is approximately two to three times the global rate. Using time-series data, we show that existing national protected areas (PAs) covering ca. 7% of Japan will not adequately prevent population declines: even core PAs can protect at best <60% of local populations from decline. Thus, the Aichi Biodiversity Target to expand PAs to 17% of land (and inland water) areas, as committed to by many national governments, is not enough: only 29.2% of currently threatened species will become non-threatened under the assumption that probability of protection success by PAs is 0.5, which our assessment shows is realistic. In countries where volunteers can be organized to monitor threatened taxa, censuses using our method should be able to quantify how fast we are losing species and to assess how effective current conservation measures such as PAs are in preventing species extinction. PMID:24922311

  3. Quantifying nonverbal communicative behavior in face-to-face human dialogues

    NASA Astrophysics Data System (ADS)

    Skhiri, Mustapha; Cerrato, Loredana

    2002-11-01

    The referred study is based on the assumption that understanding how humans use nonverbal behavior in dialogues can be very useful in the design of more natural-looking animated talking heads. The goal of the study is twofold: (1) to explore how people use specific facial expressions and head movements to serve important dialogue functions, and (2) to show evidence that it is possible to measure and quantify the extent of these movements with the Qualisys MacReflex motion tracking system. Naturally elicited dialogues between humans have been analyzed with the focus of attention on those nonverbal behaviors that serve the very relevant functions of regulating the conversational flux (i.e., turn taking) and producing information about the state of communication (i.e., feedback). The results show that eyebrow raising, head nods, and head shakes are typical signals involved during the exchange of speaking turns, as well as in the production and elicitation of feedback. These movements can be easily measured and quantified, and this measure can be implemented in animated talking heads.

  4. Quantifying the relative impact of climate and human activities on streamflow

    NASA Astrophysics Data System (ADS)

    Ahn, Kuk-Hyun; Merwade, Venkatesh

    2014-07-01

    The objective of this study is to quantify the role of climate and human impacts on streamflow conditions by using historical streamflow records, in conjunction with trend analysis and hydrologic modeling. Four U.S. states, Indiana, New York, Arizona and Georgia, are used to represent various levels of human activity, based on population change, and diverse climate conditions. The Mann-Kendall trend analysis is first used to examine the magnitude of changes in precipitation, streamflow and potential evapotranspiration for the four states. Four hydrologic modeling methods, including linear regression, hydrologic simulation, annual balance, and Budyko analysis, are then used to quantify the amount of climate and human impacts on streamflow. All four methods show that the human impact on streamflow is higher at most gauging stations in all four states compared to the climate impact. Among the four methods used, the linear regression approach produced the best hydrologic output in terms of a higher Nash-Sutcliffe coefficient. The methodology used in this study is also able to correctly highlight the areas with higher human impact, such as the modified channelized reaches in the northwestern part of Indiana. The results from this study show that population alone cannot capture all the changes caused by human activities in a region. However, this approach provides a starting point towards understanding the role of individual human activities on streamflow changes.
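
    The Mann-Kendall test used in the first step is compact enough to sketch directly (normal approximation, no tie correction; the streamflow series below is synthetic):

        import numpy as np
        from scipy.stats import norm

        def mann_kendall(x):
            """Mann-Kendall trend test: S statistic and two-sided p-value
            (normal approximation, no correction for ties)."""
            x = np.asarray(x, dtype=float)
            n = len(x)
            s = sum(np.sign(x[j] - x[i])
                    for i in range(n - 1) for j in range(i + 1, n))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0
            z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
            return s, 2 * norm.sf(abs(z))

        rng = np.random.default_rng(7)
        flow = 100 + 0.8 * np.arange(40) + rng.normal(0, 10, 40)  # rising trend
        print(mann_kendall(flow))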

  5. Quantifying the statistical complexity of low-frequency fluctuations in semiconductor lasers with optical feedback

    SciTech Connect

    Tiana-Alsina, J.; Torrent, M. C.; Masoller, C.; Garcia-Ojalvo, J. [Departament de Fisica i Enginyeria Nuclear, Universitat Politecnica de Catalunya, Campus de Terrassa, Edif. GAIA, Rambla de Sant Nebridi s/n, Terrassa E-08222 Barcelona (Spain); Rosso, O. A. [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627 Campus Pampulha, C.P. 702, 30123-970 Belo Horizonte, MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, 1428 Ciudad Universitaria, Buenos Aires (Argentina)

    2010-07-15

    Low-frequency fluctuations (LFFs) represent a dynamical instability that occurs in semiconductor lasers when they are operated near the lasing threshold and subject to moderate optical feedback. LFFs consist of sudden power dropouts followed by gradual, stepwise recoveries. We analyze experimental time series of intensity dropouts and quantify the complexity of the underlying dynamics employing two tools from information theory, namely, Shannon's entropy and the Martin, Plastino, and Rosso statistical complexity measure. These measures are computed using a method based on ordinal patterns, by which the relative length and ordering of consecutive interdropout intervals (i.e., the time intervals between consecutive intensity dropouts) are analyzed, disregarding the precise timing of the dropouts and the absolute durations of the interdropout intervals. We show that this methodology is suitable for quantifying subtle characteristics of the LFFs, and in particular the transition to fully developed chaos that takes place when the laser's pump current is increased. Our method shows that the statistical complexity of the laser does not increase continuously with the pump current, but levels off before reaching the coherence collapse regime. This behavior coincides with that of the first- and second-order correlations of the interdropout intervals, suggesting that these correlations, and not the chaotic behavior, are what determine the level of complexity of the laser's dynamics. These results hold for two different dynamical regimes, namely, sustained LFFs and coexistence between LFFs and steady-state emission.
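
    The ordinal-pattern machinery is easy to sketch: each window of consecutive interdropout intervals is mapped to its rank pattern, and Shannon entropy is computed over the pattern frequencies. Only the (normalized) entropy half of the MPR complexity measure is shown; the disequilibrium factor is omitted:

        import numpy as np
        from collections import Counter
        from math import factorial, log

        def permutation_entropy(series, order=3):
            """Normalized Shannon entropy of ordinal patterns of length `order`:
            0 = fully ordered series, 1 = fully random."""
            patterns = Counter(
                tuple(np.argsort(series[i:i + order]))
                for i in range(len(series) - order + 1))
            total = sum(patterns.values())
            h = -sum((c / total) * log(c / total) for c in patterns.values())
            return h / log(factorial(order))

        rng = np.random.default_rng(3)
        intervals = rng.exponential(1.0, 2000)  # surrogate interdropout intervals
        print(permutation_entropy(intervals, order=4))  # near 1 for uncorrelated data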

  6. A mass-balance model to separate and quantify colloidal and solute redistributions in soil

    USGS Publications Warehouse

    Bern, C.R.; Chadwick, O.A.; Hartshorn, A.S.; Khomo, L.M.; Chorover, J.

    2011-01-01

    Studies of weathering and pedogenesis have long used calculations based upon low solubility index elements to determine mass gains and losses in open systems. One of the questions currently unanswered in these settings is the degree to which mass is transferred in solution (solutes) versus suspension (colloids). Here we show that differential mobility of the low solubility, high field strength (HFS) elements Ti and Zr can trace colloidal redistribution, and we present a model for distinguishing between mass transfer in suspension and solution. The model is tested on a well-differentiated granitic catena located in Kruger National Park, South Africa. Ti and Zr ratios from parent material, soil and colloidal material are substituted into a mixing equation to quantify colloidal movement. The results show zones of both colloid removal and augmentation along the catena. Colloidal losses of 110 kg m-2 (-5% relative to parent material) are calculated for one eluviated soil profile. A downslope illuviated profile has gained 169 kg m-2 (10%) colloidal material. Elemental losses by mobilization in true solution are ubiquitous across the catena, even in zones of colloidal accumulation, and range from 1418 kg m-2 (-46%) for an eluviated profile to 195 kg m-2 (-23%) at the bottom of the catena. Quantification of simultaneous mass transfers in solution and suspension provides greater specificity on processes within soils and across hillslopes. Additionally, because colloids include both HFS and other elements, the ability to quantify their redistribution has implications for standard calculations of soil mass balances using such index elements. © 2011.

  7. Quantifying occupant energy behavior using pattern analysis techniques

    SciTech Connect

    Emery, A. [Univ. of Washington, Seattle, WA (United States). Dept. of Mechanical Engineering; Gartland, L. [Lawrence Berkeley National Lab., CA (United States). Energy and Environment Div.

    1996-08-01

    Occupant energy behavior is widely agreed upon to have a major influence over the amount of energy used in buildings. Few attempts have been made to quantify this energy behavior, even though vast amounts of end-use data containing useful information lie fallow. This paper describes analysis techniques developed to extract behavioral information from collected residential end-use data. Analysis of the averages, standard deviations and frequency distributions of hourly data can yield important behavioral information. Pattern analysis can be used to group similar daily energy patterns together for a particular end-use or set of end-uses. Resulting pattern groups can then be examined statistically using multinomial logit modeling to find their likelihood of occurrence for a given set of daily conditions. These techniques were tested successfully using end-use data for families living in four heavily instrumented residences. Energy behaviors were analyzed for individual families during each heating season of the study. These behaviors (indoor temperature, ventilation load, water heating, large appliance energy, and miscellaneous outlet energy) capture how occupants directly control the residence. The pattern analysis and multinomial logit model were able to match the occupant behavior correctly 40 to 70% of the time. The steadier behaviors of indoor temperature and ventilation were matched most successfully. Simple changes to capture more detail during pattern analysis can increase accuracy for the more variable behavior patterns. The methods developed here show promise for extracting meaningful and useful information about occupant energy behavior from the stores of existing end-use data.
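
    The grouping step can be approximated with any off-the-shelf clustering of normalized daily profiles; a sketch with k-means (scikit-learn assumed, the multinomial logit step omitted, and all data synthetic):

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(5)
        # Hypothetical end-use data: 365 days x 24 hourly readings (kWh)
        days = rng.normal(1.0, 0.2, (365, 24))
        days[180:] += 2.0 * np.sin(np.linspace(0, np.pi, 24))  # a second pattern

        # Normalize each day so clusters reflect the shape of use, not its size
        shapes = days / days.sum(axis=1, keepdims=True)
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(shapes)
        print(np.bincount(labels))  # days per daily-pattern group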

  8. Olaparib shows promise in multiple tumor types.

    PubMed

    2013-07-01

    A phase II study of the PARP inhibitor olaparib (AstraZeneca) for cancer patients with inherited BRCA1 and BRCA2 gene mutations confirmed earlier results showing clinical benefit for advanced breast and ovarian cancers, and demonstrated evidence of effectiveness against pancreatic and prostate cancers. PMID:23847380

  9. Quantifying spatial correlations of general quantum dynamics

    NASA Astrophysics Data System (ADS)

    Rivas, Ángel; Müller, Markus

    2015-06-01

    Understanding the role of correlations in quantum systems is both a fundamental challenge and of high practical relevance for the control of multi-particle quantum systems. Whereas a lot of research has been devoted to studying the various types of correlations that can be present in the states of quantum systems, in this work we introduce a general and rigorous method to quantify the amount of correlations in the dynamics of quantum systems. Using a resource-theoretical approach, we introduce a suitable quantifier and characterize the properties of correlated dynamics. Furthermore, we benchmark our method by applying it to the paradigmatic case of two atoms weakly coupled to the electromagnetic radiation field, and illustrate its potential use to detect and assess spatial noise correlations in quantum computing architectures.

  10. Quantifying Biogenic Bias in Screening Libraries

    PubMed Central

    Hert, Jérôme; Irwin, John J.; Laggner, Christian; Keiser, Michael J.; Shoichet, Brian K.

    2009-01-01

    In lead discovery, libraries of 10^6 molecules are screened for biological activity. Given the over 10^60 drug-like molecules thought possible, such screens might never succeed. That they do, even occasionally, implies a biased selection of library molecules. Here a method is developed to quantify the bias in screening libraries towards biogenic molecules. With this approach, we consider what is missing from screening libraries and how they can be optimized. PMID:19483698
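
    The abstract leaves the bias metric unspecified; one plausible proxy scores each library molecule by its maximum Tanimoto similarity to a set of biogenic reference molecules (RDKit is an assumed dependency, and the molecules below are arbitrary examples, not the paper's data):

        from rdkit import Chem, DataStructs
        from rdkit.Chem import AllChem

        def fingerprint(smiles):
            mol = Chem.MolFromSmiles(smiles)
            return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

        # Toy biogenic reference set: phenylalanine and glucose
        biogenic = [fingerprint(s) for s in
                    ("OC(=O)C(N)Cc1ccccc1", "OCC1OC(O)C(O)C(O)C1O")]
        library = {"aspirin": "CC(=O)Oc1ccccc1C(=O)O", "hexane": "CCCCCC"}

        for name, smi in library.items():
            fp = fingerprint(smi)
            score = max(DataStructs.TanimotoSimilarity(fp, ref) for ref in biogenic)
            print(f"{name}: max Tanimoto to biogenic set = {score:.2f}")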

  11. The development of a methodology to quantify the impacts of information management strategies on EPC projects 

    E-print Network

    Moreau, Karen Anne

    1997-01-01

    This research develops and demonstrates a methodology to quantify time and cost impacts on Engineering, Procurement, and Construction (EPC) projects resulting from information management driven process changes in design ...

  12. Quantifying Factors That Impact Riverbed Dynamic Permeability at a Riverbank Filtration Facility

    E-print Network

    Hubbard, Susan

    Quantifying Factors That Impact Riverbed Dynamic Permeability at a Riverbank Filtration Facility. Modeling studies of the Wohler riverbank filtration system on the Russian River, California, suggested ... riverbed permeability dynamics associated with riverbank filtration. The results are also expected

  13. Quantifying and managing the risk of information security breaches participants in a supply chain

    E-print Network

    Bellefeuille, Cynthia Lynn

    2005-01-01

    Technical integration between companies can result in an increased risk of information security breaches. This thesis proposes a methodology for quantifying information security risk to a supply chain participant. Given a ...

  14. Quantifying the surface chemistry of 3D matrices in situ

    NASA Astrophysics Data System (ADS)

    Tzeranis, Dimitrios S.; So, Peter T. C.; Yannas, Ioannis V.

    2014-03-01

    Despite the major role of the matrix (the insoluble environment around cells) in physiology and pathology, there are very few and limited methods that can quantify the surface chemistry of a 3D matrix such as a biomaterial or tissue ECM. This study describes a novel optical-based methodology that can quantify the surface chemistry (density of adhesion ligands for particular cell adhesion receptors) of a matrix in situ. The methodology utilizes fluorescent analogs (markers) of the receptor of interest and a series of binding assays, where the amount of bound markers on the matrix is quantified via spectral multi-photon imaging. The study provides preliminary results for the quantification of the ligands for the two major collagen-binding integrins (α1β1, α2β1) in porous collagen scaffolds that have been shown to be able to induce maximum regeneration in transected peripheral nerves. The developed methodology opens the way for quantitative descriptions of the insoluble microenvironment of cells in physiology and pathology, and for integrating the matrix in quantitative models of cell signaling.

  15. Quantifying the robustness of the English sibilant fricative contrast in children

    PubMed Central

    Holliday, Jeffrey J.; Reidy, Patrick F.; Beckman, Mary E.; Edwards, Jan

    2015-01-01

    Purpose Four measures of children’s developing robustness of phonological contrast were compared to see how they correlated with age, with vocabulary size, and with adult listeners’ “correctness” ratings. Method Word-initial sibilant fricative productions from 81 two- to five-year-old children and 20 adults were phonetically transcribed and acoustically analyzed. Four measures of robustness of contrast were calculated for each speaker based on the centroid frequency measured from each fricative token. Productions from different children that were transcribed as correct were then used as stimuli in a perception experiment in which adult listeners rated the goodness of each production. Results Results showed that the degree of category overlap, quantified as the percentage of a child’s productions whose category could be correctly predicted from the output of a mixed effects logistic regression model, was the measure that correlated best with listeners’ goodness judgments. Conclusions Even when children’s productions have been transcribed as “correct”, adult listeners are sensitive to within-category variation quantified by the child’s degree of category overlap. Further research is needed to explore the relationship between the age of a child and adults’ sensitivity to different types of within-category variation in children’s speech. PMID:25766040
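
    Stripped of the mixed-effects structure, the overlap measure reduces to: fit a classifier predicting the intended category from the acoustic measure, and report the percentage of tokens predicted correctly. A sketch with plain logistic regression (synthetic centroid values; scikit-learn assumed):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(11)
        # Hypothetical centroid frequencies (Hz) for one child's /s/ and /sh/ tokens
        s_tokens = rng.normal(6500, 600, 40)
        sh_tokens = rng.normal(4500, 600, 40)
        X = np.concatenate([s_tokens, sh_tokens]).reshape(-1, 1)
        y = np.array([1] * 40 + [0] * 40)  # 1 = /s/, 0 = /sh/

        model = LogisticRegression().fit(X, y)
        # High percent-predictable = robust contrast (little category overlap)
        print(f"{model.score(X, y) * 100:.0f}% of tokens predictable from centroid")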

  16. Quantifying alosine prey in the diets of marine piscivores in the Gulf of Maine.

    PubMed

    McDermott, S P; Bransome, N C; Sutton, S E; Smith, B E; Link, J S; Miller, T J

    2015-06-01

    The objectives of this work were to quantify the spatial and temporal distribution of the occurrence of anadromous fishes (alewife Alosa pseudoharengus, blueback herring Alosa aestivalis and American shad Alosa sapidissima) in the stomachs of demersal fishes in coastal waters of the north-west Atlantic Ocean. Results show that anadromous fishes were detectable and quantifiable in the diets of common marine piscivores for every season sampled. Even though anadromous fishes were not the most abundant prey, they accounted for c. 5-10% of the diet by mass for several marine piscivores. Statistical comparisons of these data with fish diet data from a broad-scale survey of the north-west Atlantic Ocean indicate that the frequency of this trophic interaction was significantly higher within spatially and temporally focused sampling areas of this study than in the broad-scale survey. Odds ratios of anadromous predation were as much as 460 times higher in the targeted sampling as compared with the broad-scale sampling. Analyses indicate that anadromous prey consumption was more concentrated in the near-coastal waters compared with consumption of a similar, but more widely distributed species, the Atlantic herring Clupea harengus. In the context of ecosystem-based fisheries management, the results suggest that even low-frequency feeding events may be locally important, and should be incorporated into ecosystem models. PMID:25943427

  17. Quantifying and predicting interpretational uncertainty in cross-sections

    NASA Astrophysics Data System (ADS)

    Randle, Charles; Bond, Clare; Monaghan, Alison; Lark, Murray

    2015-04-01

    Cross-sections are often constructed from data to create a visual impression of the geologist's interpretation of the sub-surface geology. However, as with all interpretations, this vision of the sub-surface geology is uncertain. We have designed and carried out an experiment with the aim of quantifying the uncertainty in geological cross-sections created by experts interpreting borehole data. By analysing different attributes of the data and interpretations we reflect on the main controls on uncertainty. A group of ten expert modellers at the British Geological Survey were asked to interpret an 11.4 km long cross-section from south-east Glasgow, UK. The data provided consisted of map and borehole data of the superficial deposits and shallow bedrock. Each modeller had a unique set of 11 boreholes removed from their dataset, to which their interpretations of the top of the bedrock were compared. This methodology allowed quantification of how far from the 'correct answer' each interpretation is at 11 points along each interpreted cross-section line, through comparison of the interpreted and actual bedrock elevations in the boreholes. This resulted in the collection of 110 measurements of the error to use in further analysis. To determine the potential controls on uncertainty, various attributes relating to the modeller, the interpretation and the data were recorded. Modellers were asked to fill out a questionnaire asking for information, such as how much 3D modelling experience they had and how long it took them to complete the interpretation. They were also asked to record their confidence in their interpretations graphically, in the form of a confidence level drawn onto the cross-section. Initial analysis showed that the majority of the experts' interpreted bedrock elevations were within 5 metres of those recorded in the withheld boreholes. Their distribution is peaked and symmetrical about a mean of zero, indicating that there was no tendency for the experts to either under- or overestimate the elevation of the bedrock. More complex analysis was completed in the form of linear mixed effects modelling. The modelling was used to determine if there were any correlations between the error and any other parameter recorded in the questionnaire, the section or the initial dataset. This has resulted in the determination of both data-based and interpreter-based controls on uncertainty, adding insight into how uncertainty can be predicted, as well as how interpretation workflows can be improved. Our results will inform further experiments across a wide variety of geological situations to build understanding and best practice workflows for cross-section interpretation to reduce uncertainty.

  18. Quantifying spore viability of the honey bee pathogen Nosema apis using flow cytometry.

    PubMed

    Peng, Yan; Lee-Pullen, Tracey F; Heel, Kathy; Millar, A Harvey; Baer, Boris

    2014-05-01

    Honey bees are hosts to more than 80 different parasites, some of them being highly virulent and responsible for substantial losses in managed honey bee populations. The study of honey bee pathogens and their interactions with the bees' immune system has therefore become a research area of major interest. Here we developed a fast, accurate and reliable method to quantify the viability of spores of the honey bee gut parasite Nosema apis. To verify this method, a dilution series with 0, 25, 50, 75, and 100% live N. apis was made and SYTO 16 and Propidium Iodide (n = 35) were used to distinguish dead from live spores. The viability of spores in each sample was determined by flow cytometry and compared with the current method based on fluorescence microscopy. Results show that N. apis viability counts using flow cytometry produced very similar results when compared with fluorescence microscopy. However, we found that fluorescence microscopy underestimates N. apis viability in samples with higher percentages of viable spores, the latter typically being what is found in biological samples. A series of experiments were conducted to confirm that flow cytometry allows the use of additional fluorescent dyes such as SYBR 14 and SYTOX Red (used in combination with SYTO 16 or Propidium Iodide) to distinguish dead from live spores. We also show that spore viability quantification with flow cytometry can be undertaken using substantially lower dye concentrations than fluorescence microscopy. In conclusion, our data show flow cytometry to be a fast, reliable method to quantify N. apis spore viabilities, which has a number of advantages compared with existing methods. PMID:24339267
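
    Once events are recorded, the viability computation reduces to simple gating on the two fluorescence channels; a toy version with synthetic intensities (all thresholds and distributions below are assumptions, not the study's settings):

        import numpy as np

        rng = np.random.default_rng(8)
        n = 10_000
        # SYTO 16 stains all spores; Propidium Iodide (PI) only enters
        # membrane-compromised (dead) spores. Intensities are synthetic.
        syto16 = rng.lognormal(3.0, 0.4, n)
        dead_truth = rng.random(n) < 0.3
        pi = np.where(dead_truth, rng.lognormal(3.5, 0.3, n),
                      rng.lognormal(1.0, 0.3, n))

        spores = syto16 > 10.0        # gate out debris on the SYTO 16 channel
        dead = spores & (pi > 10.0)   # PI-positive events within the spore gate
        print(f"viability = {100 * (1 - dead.sum() / spores.sum()):.1f}%")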

  19. Quantifying Subsidence in the 1999-2000 Arctic Winter Vortex

    NASA Technical Reports Server (NTRS)

    Greenblatt, Jeffery B.; Jost, Hans-juerg; Loewenstein, Max; Podolske, James R.; Bui, T. Paul; Elkins, James W.; Moore, Fred L.; Ray, Eric A.; Sen, Bhaswar; Margitan, James J.; Hipskind, R. Stephen (Technical Monitor)

    2000-01-01

    Quantifying the subsidence of the polar winter stratospheric vortex is essential to the analysis of ozone depletion, as chemical destruction often occurs against a large, altitude-dependent background ozone concentration. Using N2O measurements made during SOLVE on a variety of platforms (ER-2, in-situ balloon and remote balloon), the 1999-2000 Arctic winter subsidence is determined from N2O-potential temperature correlations along several N2O isopleths. The subsidence rates are compared to those determined in other winters, and comparison is also made with results from the SLIMCAT stratospheric chemical transport model.
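
    The isopleth bookkeeping amounts to interpolating potential temperature at a fixed N2O value for each profile and regressing that level against time; a sketch with made-up profiles (all values hypothetical):

        import numpy as np

        def theta_on_isopleth(n2o, theta, n2o_target):
            """Potential temperature (K) at a fixed N2O value (ppb);
            the profile must be ordered so that N2O is increasing."""
            return np.interp(n2o_target, n2o, theta)

        # Hypothetical flights on days 0/30/60: N2O (ppb) vs theta (K)
        flights = {
            0:  (np.array([120, 180, 240, 300]), np.array([560, 500, 450, 410])),
            30: (np.array([120, 180, 240, 300]), np.array([540, 485, 440, 405])),
            60: (np.array([120, 180, 240, 300]), np.array([525, 472, 432, 400])),
        }
        days = sorted(flights)
        levels = [theta_on_isopleth(*flights[d], n2o_target=200) for d in days]
        print(f"descent rate: {np.polyfit(days, levels, 1)[0]:.2f} K/day")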

  20. Quantifying the threat of extinction from Muller's ratchet in the diploid Amazon molly (Poecilia formosa)

    PubMed Central

    2008-01-01

    Background The Amazon molly (Poecilia formosa) is a small unisexual fish that has been suspected of being threatened by extinction from the stochastic accumulation of slightly deleterious mutations that is caused by Muller's ratchet in non-recombining populations. However, no detailed quantification of the extent of this threat is available. Results Here we quantify genomic decay in this fish by using a simple model of Muller's ratchet with the most realistic parameter combinations available employing the evolution@home global computing system. We also describe simple extensions of the standard model of Muller's ratchet that allow us to deal with selfing diploids, triploids and mitotic recombination. We show that Muller's ratchet creates a threat of extinction for the Amazon molly for many biologically realistic parameter combinations. In most cases, extinction is expected to occur within a time frame that is less than previous estimates of the age of the species, leading to a genomic decay paradox. Conclusion How then does the Amazon molly survive? Several biological processes could individually or in combination solve this genomic decay paradox, including paternal leakage of undamaged DNA from sexual sister species, compensatory mutations and many others. More research is needed to quantify the contribution of these potential solutions towards the survival of the Amazon molly and other (ancient) asexual species. PMID:18366680
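
    The standard (haploid, asexual) ratchet is straightforward to simulate: multiplicative selection, Poisson-distributed new deleterious mutations, and a "click" whenever the least-loaded class is lost. A minimal sketch (parameters arbitrary; the paper's diploid/triploid extensions and realistic parameter sweeps are not reproduced):

        import numpy as np

        rng = np.random.default_rng(2024)

        def ratchet_clicks(N=1000, U=0.5, s=0.02, generations=2000):
            """Count clicks of Muller's ratchet in a Wright-Fisher population:
            N individuals, U new deleterious mutations/genome/generation,
            multiplicative selection with coefficient s."""
            load = np.zeros(N, dtype=int)  # deleterious mutations per individual
            clicks = 0
            for _ in range(generations):
                fitness = (1.0 - s) ** load
                parents = rng.choice(N, size=N, p=fitness / fitness.sum())
                load = load[parents] + rng.poisson(U, N)
                best = int(load.min())
                if best > 0:          # least-loaded class irreversibly lost
                    clicks += best
                    load -= best      # re-baseline so the best class is 0 again
            return clicks

        print(ratchet_clicks())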

  1. Quantifying the Short-Term Costs of Conservation Interventions for Fishers at Lake Alaotra, Madagascar

    PubMed Central

    Wallace, Andrea P. C.; Milner-Gulland, E. J.; Jones, Julia P. G.; Bunnefeld, Nils; Young, Richard; Nicholson, Emily

    2015-01-01

    Artisanal fisheries are a key source of food and income for millions of people, but if poorly managed, fishing can have declining returns as well as impacts on biodiversity. Management interventions such as spatial and temporal closures can improve fishery sustainability and reduce environmental degradation, but may carry substantial short-term costs for fishers. The Lake Alaotra wetland in Madagascar supports a commercially important artisanal fishery and provides habitat for a Critically Endangered primate and other endemic wildlife of conservation importance. Using detailed data from more than 1,600 fisher catches, we used linear mixed effects models to explore and quantify relationships between catch weight, effort, and spatial and temporal restrictions to identify drivers of fisher behaviour and quantify the potential effect of fishing restrictions on catch. We found that restricted area interventions and fishery closures would generate direct short-term costs through reduced catch and income, and these costs vary between groups of fishers using different gear. Our results show that conservation interventions can have uneven impacts on local people with different fishing strategies. This information can be used to formulate management strategies that minimise the adverse impacts of interventions, increase local support and compliance, and therefore maximise conservation effectiveness. PMID:26107284

  2. Quantifying Proteinuria in Hypertensive Disorders of Pregnancy

    PubMed Central

    Amin, Sapna V.; Illipilla, Sireesha; Rai, Lavanya; Kumar, Pratap; Pai, Muralidhar V.

    2014-01-01

    Background. Progressive proteinuria indicates worsening of the condition in hypertensive disorders of pregnancy, and hence its quantification guides the clinician in decision making and treatment planning. Objective. To evaluate the efficacy of spot dipstick analysis and urinary protein-creatinine ratio (UPCR) in hypertensive disease of pregnancy for predicting 24-hour proteinuria. Subjects and Methods. A total of 102 patients qualifying the inclusion criteria were evaluated with a preadmission urine dipstick test and UPCR performed on a spot voided sample. After admission, the entire 24-hour urine sample was collected and analysed for daily protein excretion. Dipstick estimation and UPCR were compared to the 24-hour results. Results. Seventy-eight patients (76.5%) had significant proteinuria of more than 300 mg/24 h. The dipstick method showed 59% sensitivity and 67% specificity for prediction of significant proteinuria. The area under the curve for UPCR was 0.89 (95% CI: 0.83 to 0.95, P < 0.001), showing 82% sensitivity and a 12.5% false positive rate for a cutoff value of 0.45. Higher cutoff values (1.46 and 1.83) predicted heavy proteinuria (2 g and 3 g/24 h, resp.). Conclusion. This study suggests that the random urinary protein:creatinine ratio is a reliable investigation compared to the dipstick method to assess proteinuria in hypertensive pregnant women. However, clinical laboratories should standardize the reference values for their setup. PMID:25302114
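
    Sensitivity and AUC figures of this kind come from a straightforward ROC analysis; a sketch with synthetic UPCR values (scikit-learn assumed; the numbers below are illustrative, not the study's data):

        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        rng = np.random.default_rng(9)
        # 1 = significant proteinuria (>300 mg/24 h) on the 24-hour gold standard
        truth = np.array([1] * 78 + [0] * 24)
        upcr = np.concatenate([rng.lognormal(0.4, 0.8, 78),
                               rng.lognormal(-1.2, 0.7, 24)])

        print(f"AUC = {roc_auc_score(truth, upcr):.2f}")
        fpr, tpr, thresholds = roc_curve(truth, upcr)
        i = np.argmin(np.abs(thresholds - 0.45))  # operating point nearest UPCR 0.45
        print(f"at UPCR ~ 0.45: sensitivity {tpr[i]:.2f}, false-positive rate {fpr[i]:.2f}")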

  3. Quantifying Biofilm in Porous Media Using Rock Physics Models

    NASA Astrophysics Data System (ADS)

    Alhadhrami, F. M.; Jaiswal, P.; Atekwana, E. A.

    2012-12-01

    Biofilm formation and growth in porous rocks can change their material properties, such as porosity and permeability, which in turn impact fluid flow. Finding a non-intrusive method to quantify biofilms and their byproducts in rocks is key to understanding and modeling bioclogging in porous media. Previous geophysical investigations have documented that seismic techniques are sensitive to biofilm growth. These studies pointed to the fact that microbial growth and biofilm formation induce heterogeneity in the seismic properties. Currently there are no rock physics models to explain these observations and to provide quantitative interpretation of the seismic data. Our objective is to develop a new class of rock physics models that incorporate microbial processes and their effect on seismic properties. Using the assumption that biofilms can grow within pore spaces or as a layer coating the mineral grains, P-wave velocity (Vp) and S-wave velocity (Vs) models were constructed using travel-time and waveform tomography techniques. We used generic rock physics schematics to represent our rock system numerically. We simulated the arrival times as well as waveforms by treating biofilms either as fluid (filling pore spaces) or as part of the matrix (coating sand grains). The preliminary results showed a 1% change in Vp and a 3% change in Vs when biofilms are represented as discrete structures in pore spaces. On the other hand, a 30% change in Vp and a 100% change in Vs were observed when biofilm was represented as part of the matrix coating sand grains. Therefore, Vp and Vs changes are more rapid when biofilm grows as a grain-coating phase. The significant change in Vs associated with biofilms suggests that shear velocity can be used as a diagnostic tool for imaging zones of bioclogging in the subsurface. The results obtained from this study have significant implications for the study of the rheological properties of biofilms in geological media. Other applications include assessing biofilms used as barriers in CO2 sequestration studies as well as assisting in evaluating microbial enhanced oil recovery (MEOR) methods, where microorganisms are used to plug highly porous rocks for efficient oil production.

  4. Quantifying uncertainty in observational rainfall datasets

    NASA Astrophysics Data System (ADS)

    Lennard, Chris; Dosio, Alessandro; Nikulin, Grigory; Pinto, Izidine; Seid, Hussen

    2015-04-01

    The CO-ordinated Regional Downscaling Experiment (CORDEX) has to date seen the publication of at least ten journal papers that examine the African domain during 2012 and 2013. Five of these papers consider Africa generally (Nikulin et al. 2012, Kim et al. 2013, Hernández-Díaz et al. 2013, Laprise et al. 2013, Panitz et al. 2013) and five have regional foci: Tramblay et al. (2013) on Northern Africa, Mariotti et al. (2014) and Gbobaniyi et al. (2013) on West Africa, Endris et al. (2013) on East Africa and Kalognomou et al. (2013) on southern Africa. There are also a further three papers that the authors know to be under review. These papers all use observed rainfall and/or temperature data to evaluate/validate the regional model output and often proceed to assess projected changes in these variables due to climate change in the context of these observations. The most popular reference rainfall data used are the CRU, GPCP, GPCC, TRMM and UDEL datasets. However, as Kalognomou et al. (2013) point out, there are many other rainfall datasets available for consideration, for example, CMORPH, FEWS, TAMSAT & RIANNAA, TAMORA and the WATCH & WATCH-DEI data. They, with others (Nikulin et al. 2012, Sylla et al. 2012), show that the observed datasets can have a very wide spread at a particular space-time coordinate. As more ground-, space- and reanalysis-based rainfall products become available, all of which use different methods to produce precipitation data, the selection of reference data is becoming an important factor in model evaluation. A number of factors can contribute to uncertainty in terms of the reliability and validity of the datasets, such as radiance conversion algorithms, the quantity and quality of available station data, and the interpolation techniques and blending methods used to combine satellite and gauge-based products. However, to date no comprehensive study has been performed to evaluate the uncertainty in these observational datasets. We assess 18 gridded rainfall datasets available over Africa on monthly, daily and sub-daily time scales as appropriate to quantify spatial and temporal differences between the datasets. We find regional wet and dry biases between datasets (using the ensemble mean as a reference), with generally larger biases in reanalysis products. Rainfall intensity is poorly represented in some datasets, which demonstrates that some datasets should not be used for rainfall intensity analyses. Using 10 CORDEX models, we show in east Africa that the spread between observed datasets is often similar to the spread between models. We recommend that specific observational rainfall datasets be used for specific investigations, and also that where many datasets are applicable to an investigation, a probabilistic view be adopted for rainfall studies over Africa. Endris, H. S., P. Omondi, S. Jain, C. Lennard, B. Hewitson, L. Chang'a, J. L. Awange, A. Dosio, P. Ketiem, G. Nikulin, H-J. Panitz, M. Büchner, F. Stordal, and L. Tazalika (2013) Assessment of the Performance of CORDEX Regional Climate Models in Simulating East African Rainfall. J. Climate, 26, 8453-8475. DOI: 10.1175/JCLI-D-12-00708.1 Gbobaniyi, E., A. Sarr, M. B. Sylla, I. Diallo, C. Lennard, A. Dosio, A. Diedhiou, A. Kamga, N. A. B. Klutse, B. Hewitson, and B. Lamptey (2013) Climatology, annual cycle and interannual variability of precipitation and temperature in CORDEX simulations over West Africa. Int. J. Climatol., DOI: 10.1002/joc.3834 Hernández-Díaz, L., R. Laprise, L. Sushama, A. Martynov, K.
Winger, and B. Dugas (2013) Climate simulation over CORDEX Africa domain using the fifth-generation Canadian Regional Climate Model (CRCM5). Clim. Dyn. 40, 1415-1433. DOI: 10.1007/s00382-012-1387-z Kalognomou, E., C. Lennard, M. Shongwe, I. Pinto, A. Favre, M. Kent, B. Hewitson, A. Dosio, G. Nikulin, H. Panitz, and M. Büchner (2013) A diagnostic evaluation of precipitation in CORDEX models over southern Africa. Journal of Climate, 26, 9477-9506. DOI:10.1175/JCLI-D-12-00703.1 Kim, J., D. E. Waliser, C. A. Mattmann, C. E. Goodale, A. F. Hart

  5. Quantifying Dirac hydrogenic effects via complexity measures

    E-print Network

    P. A. Bouvrie; S. López-Rosa; J. S. Dehesa

    2014-08-29

    The primary dynamical Dirac relativistic effects can only be seen in hydrogenic systems, without the complications introduced by electron-electron interactions in many-electron systems. They are known to be the contraction towards the origin of the electronic charge in hydrogenic systems and the nodal disappearance (because of the raising of all the non-relativistic minima) in the electron density of the excited states of these systems. In addition we point out the (largely ignored) gradient reduction of the charge density near and far from the nucleus. In this work we quantify these effects by means of single (Fisher information) and composite (Fisher-Shannon complexity and plane, LMC complexity) information-theoretic measures. While the Fisher information measures the gradient content of the density, the (dimensionless) composite information-theoretic quantities grasp two-fold facets of the electronic distribution: the Fisher-Shannon complexity measures the combined balance of the gradient content and the total extent of the electronic charge, and the LMC complexity quantifies the disequilibrium jointly with the spreading of the density in configuration space. Opposite to other complexity notions (e.g., computational and algorithmic complexities), these two quantities describe intrinsic properties of the system because they do not depend on the context but are functionals of the electron density. Moreover, they are closely related to the intuitive notion of complexity because they are minimum for the two extreme (or least complex) distributions of perfect order and maximum disorder.

  6. Precise thermal NDE for quantifying structural damage

    SciTech Connect

    Del Grande, N.K.; Durbin, P.F.

    1995-09-18

    The authors demonstrated a fast, wide-area, precise thermal NDE imaging system to quantify aircraft corrosion damage, such as percent metal loss, above a threshold of 5% with 3% overall uncertainties. The DBIR precise thermal imaging and detection method has been used successfully to characterize defect types, and their respective depths, in aircraft skins, and multi-layered composite materials used for wing patches, doublers and stiffeners. This precise thermal NDE inspection tool has long-term potential benefits to evaluate the structural integrity of airframes, pipelines and waste containers. They proved the feasibility of the DBIR thermal NDE imaging system to inspect concrete and asphalt-concrete bridge decks. As a logical extension to the successful feasibility study, they plan to inspect a concrete bridge deck from a moving vehicle to quantify the volumetric damage within the deck and the percent of the deck which has subsurface delaminations. Potential near-term benefits are in-service monitoring from a moving vehicle to inspect the structural integrity of the bridge deck. This would help prioritize the repair schedule for a reported 200,000 bridge decks in the US which need substantive repairs. Potential long-term benefits are affordable, and reliable, rehabilitation for bridge decks.

  7. Quantifying chemical reactions by using mixing analysis.

    PubMed

    Jurado, Anna; Vázquez-Suñé, Enric; Carrera, Jesús; Tubau, Isabel; Pujades, Estanislao

    2015-01-01

    This work is motivated by the need for a sound understanding of the chemical processes that affect organic pollutants in an urban aquifer. We propose an approach to quantify such processes using mixing calculations. The methodology consists of the following steps: (1) identification of the recharge sources (end-members) and selection of the species (conservative and non-conservative) to be used, (2) identification of the chemical processes and (3) evaluation of mixing ratios including the chemical processes. This methodology has been applied in the Besòs River Delta (NE Barcelona, Spain), where the River Besòs is the main aquifer recharge source. A total of 51 groundwater samples were collected from July 2007 to May 2010 during four field campaigns. Three river end-members were necessary to explain the temporal variability of the River Besòs: one river end-member is from the wet periods (W1) and two are from the dry periods (D1 and D2). This methodology has proved to be useful not only to compute the mixing ratios but also to quantify processes such as calcite and magnesite dissolution, aerobic respiration and denitrification undergone at each observation point. PMID:25280248

  8. Using nitrate to quantify quick flow in a karst aquifer

    USGS Publications Warehouse

    Mahler, B.J.; Garner, B.D.

    2009-01-01

    In karst aquifers, contaminated recharge can degrade spring water quality, but quantifying the rapid recharge (quick flow) component of spring flow is challenging because of its temporal variability. Here, we investigate the use of nitrate in a two-endmember mixing model to quantify quick flow in Barton Springs, Austin, Texas. Historical nitrate data from recharging creeks and Barton Springs were evaluated to determine a representative nitrate concentration for the aquifer water endmember (1.5 mg/L) and the quick flow endmember (0.17 mg/L for nonstormflow conditions and 0.25 mg/L for stormflow conditions). Under nonstormflow conditions for 1990 to 2005, model results indicated that quick flow contributed from 0% to 55% of spring flow. The nitrate-based two-endmember model was applied to the response of Barton Springs to a storm and results compared to those produced using the same model with δ18O and specific conductance (SC) as tracers. Additionally, the mixing model was modified to allow endmember quick flow values to vary over time. Of the three tracers, nitrate appears to be the most advantageous because it is conservative and because the difference between the concentrations in the two endmembers is large relative to their variance. The δ18O-based model was very sensitive to variability within the quick flow endmember, and SC was not conservative over the timescale of the storm response. We conclude that a nitrate-based two-endmember mixing model might provide a useful approach for quantifying the temporally variable quick flow component of spring flow in some karst systems. © 2008 National Ground Water Association.
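
    The two-endmember calculation itself is one line of mass-balance algebra. A sketch using the endmember nitrate concentrations quoted above (the measured spring value below is hypothetical):

        def quick_flow_fraction(c_spring, c_aquifer=1.5, c_quick=0.17):
            """Fraction of spring flow contributed by quick flow, from a
            conservative-tracer mass balance:
            c_spring = f * c_quick + (1 - f) * c_aquifer."""
            return (c_spring - c_aquifer) / (c_quick - c_aquifer)

        # e.g. a measured spring nitrate of 1.0 mg/L under nonstormflow conditions
        print(f"quick-flow fraction = {quick_flow_fraction(1.0):.0%}")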

  9. Quantifying Volume of Groundwater in High Elevation Meadows

    NASA Astrophysics Data System (ADS)

    Ciruzzi, D.; Lowry, C.

    2013-12-01

    Assessing the current and future water needs of high elevation meadows is dependent on quantifying the volume of groundwater stored within the meadow sediment. As groundwater dependent ecosystems, these meadows rely on their ability to capture and store water in order to support ecologic function and base flow to streams. Previous research of these meadows simplified storage by assuming a homogenous reservoir of constant thickness. These previous storage models were able to close the water mass balance, but it is unclear if these assumptions will be successful under future anthropogenic impacts, such as increased air temperature resulting in dryer and longer growing seasons. Applying a geophysical approach, ground-penetrating radar was used at Tuolumne Meadows, CA to qualitatively and quantitatively identify the controls on volume of groundwater storage. From the geophysical results, a three-dimensional model of Tuolumne Meadows was created, which identified meadow thickness and bedrock geometry. This physical model was used in a suite of numerical models simulating high elevation meadows in order to quantify volume of groundwater stored with temporal and spatial variability. Modeling efforts tested both wet and dry water years in order to quantify the variability in the volume of groundwater storage for a range of aquifer properties. Each model was evaluated based on the seasonal depth to water in order to evaluate a particular scenario's ability to support ecological function and base flow. Depending on the simulated meadows ability or inability to support its ecosystem, each representative meadow was categorized as successful or unsuccessful. Restoration techniques to increase active storage volume were suggested at unsuccessful meadows.

  10. Quantifying the BICEP2-Planck tension over gravitational waves.

    PubMed

    Smith, Kendrick M; Dvorkin, Cora; Boyle, Latham; Turok, Neil; Halpern, Mark; Hinshaw, Gary; Gold, Ben

    2014-07-18

    The recent BICEP2 measurement of B-mode polarization in the cosmic microwave background (r = 0.2 +0.07/-0.05), a possible indication of primordial gravity waves, appears to be in tension with the upper limit from WMAP (r < 0.13 at 95% C.L.) and Planck (r < 0.11 at 95% C.L.). We carefully quantify the level of tension and show that it is very significant (around 0.1% unlikely) when the observed deficit of large-scale temperature power is taken into account. We show that measurements of TE and EE power spectra in the near future will discriminate between the hypotheses that this tension is either a statistical fluke or a sign of new physics. We also discuss extensions of the standard cosmological model that relieve the tension and some novel ways to constrain them. PMID:25083631

  11. Quantifying the BICEP2-Planck Tension over Gravitational Waves

    NASA Astrophysics Data System (ADS)

    Smith, Kendrick M.; Dvorkin, Cora; Boyle, Latham; Turok, Neil; Halpern, Mark; Hinshaw, Gary; Gold, Ben

    2014-07-01

    The recent BICEP2 measurement of B-mode polarization in the cosmic microwave background (r = 0.2 (+0.07/-0.05)), a possible indication of primordial gravity waves, appears to be in tension with the upper limit from WMAP (r < 0.13 at 95% C.L.) and Planck (r < 0.11 at 95% C.L.). We carefully quantify the level of tension and show that it is very significant (around 0.1% unlikely) when the observed deficit of large-scale temperature power is taken into account. We show that measurements of TE and EE power spectra in the near future will discriminate between the hypotheses that this tension is either a statistical fluke or a sign of new physics. We also discuss extensions of the standard cosmological model that relieve the tension and some novel ways to constrain them.

  12. Anatomy of Alternating Quantifier Satisfiability (Work in progress)

    E-print Network

    Paris-Sud XI, Université de

    Anatomy of Alternating Quantifier Satisfiability (work in progress), Anh-Dung Phan. The paper develops decision procedures for satisfiability of formulas with alternating quantifiers, and instantiates the generalization to projection functions based on virtual substitutions. Presented at the Workshop on Satisfiability Modulo Theories, Manchester, United Kingdom (2012).

  13. Quantifier Elimination for Real Algebra - the Quadratic Case and Beyond

    Microsoft Academic Search

    Volker Weispfenning

    1997-01-01

    We present a new, “elementary” quantifier elimination method for various special cases of the general quantifier elimination problem for the first-order theory of real numbers. These include the elimination of one existential quantifier ∃x in front of quantifier-free formulas restricted by a non-trivial quadratic equation in x (the case considered also in [7]), and more generally in front of arbitrary

  14. UV-vis spectra as an alternative to the Lowry method for quantifying hair damage induced by surfactants.

    PubMed

    Pires-Oliveira, Rafael; Joekes, Inés

    2014-11-01

    It is well known that long term use of shampoo causes damage to human hair. Although the Lowry method has been widely used to quantify hair damage, it is unsuitable in the presence of some surfactants, and no other method has been proposed in the literature. In this work, a different method is used to investigate and compare the hair damage induced by four types of surfactants (including three commercial-grade surfactants) and water. Hair samples were immersed in aqueous surfactant solutions under conditions that resemble a shower (38 °C, constant shaking). These solutions become colored with time of contact with hair, and their UV-vis spectra were recorded. For comparison, the amounts of protein extracted from hair by sodium dodecyl sulfate (SDS) and by water were estimated by the Lowry method. Additionally, non-pigmented vs. pigmented hair and also sepia melanin were used to understand the color of the washing solutions and their spectra. The results presented herein show that hair degradation is mostly caused by the extraction of proteins, cuticle fragments and melanin granules from the hair fiber. It was found that the intensity of the solution color varies with the charge density of the surfactants. Furthermore, the intensity of the solution color can be correlated with the amount of protein quantified by the Lowry method as well as with the degree of hair damage. The UV-vis spectrum of hair washing solutions is a simple and straightforward way to quantify and compare hair damage induced by different commercial surfactants. PMID:25277290

  15. Quantifier Inference Rules for SMT proofs David Deharbe

    E-print Network

    Paris-Sud XI, Université de

    In the typical architecture of an SMT-solver, the core automated reasoner is a propositional SAT-solver, and quantifiers are handled by independent modules [8]. This paper presents inference rules for handling quantifiers in the proof format of the SMT-solver veriT [4]. The quantifier-handling modules in veriT being fairly standard, the proposed rules should also benefit the wider SMT community.

  16. Quantifying compositional impacts of ambient aerosol on cloud droplet formation

    NASA Astrophysics Data System (ADS)

    Lance, Sara

    It has been historically assumed that most of the uncertainty associated with the aerosol indirect effect on climate can be attributed to the unpredictability of updrafts. In Chapter 1, we analyze the sensitivity of cloud droplet number density to realistic variations in aerosol chemical properties and to variable updraft velocities using a 1-dimensional cloud parcel model in three important environmental cases (continental, polluted and remote marine). The results suggest that aerosol chemical variability may be as important to the aerosol indirect effect as the effect of unresolved cloud dynamics, especially in polluted environments. We next used a continuous flow streamwise thermal gradient Cloud Condensation Nuclei counter (CCNc) to study the water-uptake properties of the ambient aerosol, by exposing an aerosol sample to a controlled water vapor supersaturation and counting the resulting number of droplets. In Chapter 2, we modeled and experimentally characterized the heat transfer properties and droplet growth within the CCNc. Chapter 3 describes results from the MIRAGE field campaign, in which the CCNc and a Hygroscopicity Tandem Differential Mobility Analyzer (HTDMA) were deployed at a ground-based site during March 2006. Size-resolved CCN activation spectra and growth factor distributions of the ambient aerosol in Mexico City were obtained, and an analytical technique was developed to quantify a probability distribution of solute volume fractions for the CCN, in addition to the aerosol mixing-state. The CCN were shown to be much less CCN-active than ammonium sulfate, with water uptake properties more consistent with low molecular weight organic compounds. The pollution outflow from Mexico City was shown to have CCN with an even lower fraction of soluble material. "Chemical closure" was attained for the CCN by comparing the inferred solute volume fraction with that from direct chemical measurements. A clear diurnal pattern was observed for the CCN solute volume fraction, showing that measurable aging of the aerosol population occurs during the day, on the timescale of a few hours. The mixing state of the aerosol, also showing a consistent diurnal pattern, clearly correlates with a chemical tracer for local combustion sources. Chapter 4 describes results from the GoMACCS field study, in which the CCNc was subsequently deployed on an airborne field campaign in Houston, Texas during August-September 2006. GoMACCS tested our ability to predict CCN for highly polluted conditions with limited chemical information. Assuming the particles were composed purely of ammonium sulfate, CCN closure was obtained with a 10% overprediction bias on average for CCN concentrations ranging from less than 100 cm⁻³ to over 10,000 cm⁻³, but with 50% variability on average. Assuming measured concentrations of organics to be internally mixed and insoluble tended to reduce the overprediction bias for less polluted conditions, but led to an underprediction bias in the most polluted conditions. A likely explanation is that the high organic concentrations in the polluted environments depress the surface tension of the droplets, thereby enabling activation at lower soluble fractions.

  17. Boron aluminum crippling strength shows improvement

    NASA Technical Reports Server (NTRS)

    Otto, O. R.; Bohlmann, R. E.

    1974-01-01

    Results are presented from an experimental program directed toward improving boron aluminum crippling strength. Laminate changes evaluated were larger filament diameter, improved processing, shape changes, adding steel-aluminum cross plies, reduced filament volume in corners, adding boron aluminum angle plies, and using titanium interleaves. Filament diameter and steel-aluminum cross plies have little effect on crippling. It is shown that better processing combined with appropriate shape changes improved crippling over 50 percent at both room temperature and 600 F. Tests also show that crippling improvements ranging from 20 to 40 percent are achieved using angle plies and titanium interleaves.

  18. Quantifying the Behavioural Relevance of Hippocampal Neurogenesis

    PubMed Central

    Lazic, Stanley E.; Fuss, Johannes; Gass, Peter

    2014-01-01

    Few studies that examine the neurogenesis–behaviour relationship formally establish covariation between neurogenesis and behaviour or rule out competing explanations. The behavioural relevance of neurogenesis might therefore be overestimated if other mechanisms account for some, or even all, of the experimental effects. A systematic review of the literature was conducted and the data reanalysed using causal mediation analysis, which can estimate the behavioural contribution of new hippocampal neurons separately from other mechanisms that might be operating. Results from eleven eligible individual studies were then combined in a meta-analysis to increase precision (representing data from 215 animals) and showed that neurogenesis made a negligible contribution to behaviour (standardised effect = 0.15; 95% CI = -0.04 to 0.34; p = 0.128); other mechanisms accounted for the majority of experimental effects (standardised effect = 1.06; 95% CI = 0.74 to 1.38; p = 1.7×10⁻¹¹). PMID:25426717

  19. Quantifying the length-scale dependence of surf zone advection

    NASA Astrophysics Data System (ADS)

    Wilson, Greg W.; Özkan-Haller, H. Tuba; Holman, Robert A.

    2013-05-01

    We investigate the momentum balance in the surf zone, in a setting which is weakly varying in the alongshore direction. Our focus is on the role of nonlinear advective terms. Using numerical experiments, we find that advection tends to counteract alongshore variations in momentum flux, resulting in more uniform kinematics. Additionally, advection causes a shifting of the kinematic response in the direction of flow. These effects are strongest at short alongshore length scales, and/or strong alongshore-mean velocity. The length-scale dependence is investigated using spectral analysis, where the effect of advective terms is treated as a transfer function applied to the solution to the linear (advection-free) equations of motion. The transfer function is then shown to be governed by a nondimensional parameter which quantifies the relative scales of advection and bottom stress, analogous to a Reynolds Number. Hence, this parameter can be used to quantify the length scales at which advective terms, and the resulting effects described above, are important. We also introduce an approximate functional form for the transfer function, which is valid asymptotically within a restricted range of length scales.
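
    The spectral treatment described here amounts to multiplying the Fourier transform of the linear (advection-free) solution by a wavenumber-dependent transfer function. The sketch below uses a made-up first-order form, H(k) = 1/(1 + ik·L), chosen only because it reproduces the two qualitative effects named in the abstract (damping of short alongshore scales and a shift of the response in the flow direction); the paper's actual transfer function is derived from the momentum balance:

      import numpy as np

      def apply_advection_transfer(v_linear, dy, L_adv):
          """Filter the linear solution with a toy transfer function
          H(k) = 1/(1 + i*k*L_adv): damps short scales, shifts downstream."""
          k = 2 * np.pi * np.fft.fftfreq(len(v_linear), d=dy)
          H = 1.0 / (1.0 + 1j * k * L_adv)
          return np.real(np.fft.ifft(H * np.fft.fft(v_linear)))

      y = np.linspace(0, 2000, 512)                # alongshore coordinate (m)
      v_lin = np.exp(-((y - 1000) / 150) ** 2)     # hypothetical localized response
      v_adv = apply_advection_transfer(v_lin, y[1] - y[0], L_adv=200.0)
      print(np.argmax(v_lin), np.argmax(v_adv))    # peak shifts in the flow direction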

  20. Quantifiers More or Less Quantify On-Line: ERP Evidence for Partial Incremental Interpretation

    ERIC Educational Resources Information Center

    Urbach, Thomas P.; Kutas, Marta

    2010-01-01

    Event-related brain potentials were recorded during RSVP reading to test the hypothesis that quantifier expressions are incrementally interpreted fully and immediately. In sentences tapping general knowledge ("Farmers grow crops/worms as their primary source of income"), Experiment 1 found larger N400s for atypical ("worms") than typical objects…

  1. Quantifying limits to detection of early warning for critical transitions

    PubMed Central

    Boettiger, Carl; Hastings, Alan

    2012-01-01

    Catastrophic regime shifts in complex natural systems may be averted through advanced detection. Recent work has provided a proof-of-principle that many systems approaching a catastrophic transition may be identified through the lens of early warning indicators such as rising variance or increased return times. Despite widespread appreciation of the difficulties and uncertainty involved in such forecasts, proposed methods hardly ever characterize their expected error rates. Without the benefits of replicates, controls or hindsight, applications of these approaches must quantify how reliable different indicators are in avoiding false alarms, and how sensitive they are to missing subtle warning signs. We propose a model-based approach to quantify this trade-off between reliability and sensitivity and allow comparisons between different indicators. We show these error rates can be quite severe for common indicators even under favourable assumptions, and also illustrate how a model-based indicator can improve this performance. We demonstrate how the performance of an early warning indicator varies in different datasets, and suggest that uncertainty quantification become a more central part of early warning predictions. PMID:22593100
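
    The reliability-sensitivity trade-off can be made concrete with the simplest summary-statistic indicator, a rolling variance: a threshold calibrated on a stationary null fixes the false alarm rate, and sensitivity is then the fraction of pre-transition windows exceeding that threshold. A toy sketch with synthetic AR(1) series (not the authors' model-based indicator):

      import numpy as np

      rng = np.random.default_rng(1)

      def rolling_var(x, w=50):
          return np.array([x[i:i + w].var() for i in range(len(x) - w)])

      def ar1(n, phi):
          # AR(1) series; phi may ramp toward 1 to mimic critical slowing down
          x = np.zeros(n)
          for i in range(1, n):
              x[i] = phi[i] * x[i - 1] + rng.normal()
          return x

      n = 500
      null_ind = rolling_var(ar1(n, np.full(n, 0.5)))          # stationary null
      alt_ind = rolling_var(ar1(n, np.linspace(0.5, 0.99, n)))  # approaching transition

      threshold = np.quantile(null_ind, 0.95)            # alarm if variance exceeds this
      false_alarm = (null_ind > threshold).mean()        # ~5% by construction
      sensitivity = (alt_ind[-100:] > threshold).mean()  # detection near the transition
      print(f"false alarm rate {false_alarm:.2f}, sensitivity {sensitivity:.2f}")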

  2. A framework for quantifying net benefits of alternative prognostic models

    PubMed Central

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd. PMID:21905066

  3. How to quantify conduits in wood?

    PubMed Central

    Scholz, Alexander; Klepsch, Matthias; Karimi, Zohreh; Jansen, Steven

    2013-01-01

    Vessels and tracheids represent the most important xylem cells with respect to long distance water transport in plants. Wood anatomical studies frequently provide several quantitative details of these cells, such as vessel diameter, vessel density, vessel element length, and tracheid length, while important information on the three dimensional structure of the hydraulic network is not considered. This paper aims to provide an overview of various techniques, although there is no standard protocol to quantify conduits due to high anatomical variation and a wide range of techniques available. Despite recent progress in image analysis programs and automated methods for measuring cell dimensions, density, and spatial distribution, various characters remain time-consuming and tedious. Quantification of vessels and tracheids is not only important to better understand functional adaptations of tracheary elements to environment parameters, but will also be essential for linking wood anatomy with other fields such as wood development, xylem physiology, palaeobotany, and dendrochronology. PMID:23507674

  4. Quantifying truncation errors in effective field theory

    E-print Network

    Furnstahl, R J; Phillips, D R; Wesolowski, S

    2015-01-01

    Bayesian procedures designed to quantify truncation errors in perturbative calculations of quantum chromodynamics observables are adapted to expansions in effective field theory (EFT). In the Bayesian approach, such truncation errors are derived from degree-of-belief (DOB) intervals for EFT predictions. Computation of these intervals requires specification of prior probability distributions ("priors") for the expansion coefficients. By encoding expectations about the naturalness of these coefficients, this framework provides a statistical interpretation of the standard EFT procedure where truncation errors are estimated using the order-by-order convergence of the expansion. It also permits exploration of the ways in which such error bars are, and are not, sensitive to assumptions about EFT-coefficient naturalness. We first demonstrate the calculation of Bayesian probability distributions for the EFT truncation error in some representative examples, and then focus on the application of chiral EFT to neutron-pr...
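
    The heart of the approach can be sketched compactly: treat the known dimensionless coefficients c_0..c_k as draws sharing a common Gaussian scale c̄, infer c̄, and propagate it to the first omitted term c_{k+1}·Q^{k+1}. A toy importance-sampling version with a log-uniform prior on c̄ (all numerical inputs hypothetical; the paper explores several prior choices):

      import numpy as np

      rng = np.random.default_rng(2)

      def dob_interval(coeffs, Q, order, level=0.68, ndraw=100_000):
          """Toy DOB interval for the first omitted term of an EFT expansion."""
          # draw the scale cbar from a log-uniform prior; weight each draw by the
          # Gaussian likelihood of the observed coefficients
          cbar = np.exp(rng.uniform(np.log(0.1), np.log(10.0), ndraw))
          loglike = (-0.5 * np.sum((np.asarray(coeffs)[:, None] / cbar) ** 2, axis=0)
                     - len(coeffs) * np.log(cbar))
          w = np.exp(loglike - loglike.max())
          # posterior predictive for the omitted term c_{k+1} * Q**(k+1)
          delta = np.abs(rng.normal(0.0, cbar)) * Q ** (order + 1)
          idx = np.argsort(delta)
          cdf = np.cumsum(w[idx]) / np.sum(w)
          return delta[idx][np.searchsorted(cdf, level)]

      print(dob_interval(coeffs=[1.0, 0.8, -0.5], Q=0.3, order=2))  # hypothetical inputs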

  5. Quantifying International Travel Flows Using Flickr

    PubMed Central

    Barchiesi, Daniele; Moat, Helen Susannah; Alis, Christian; Bishop, Steven; Preis, Tobias

    2015-01-01

    Online social media platforms are opening up new opportunities to analyse human behaviour on an unprecedented scale. In some cases, the fast, cheap measurements of human behaviour gained from these platforms may offer an alternative to gathering such measurements using traditional, time consuming and expensive surveys. Here, we use geotagged photographs uploaded to the photo-sharing website Flickr to quantify international travel flows, by extracting the location of users and inferring trajectories to track their movement across time. We find that Flickr based estimates of the number of visitors to the United Kingdom significantly correlate with the official estimates released by the UK Office for National Statistics, for 28 countries for which official estimates are calculated. Our findings underline the potential for indicators of key aspects of human behaviour, such as mobility, to be generated from data attached to the vast volumes of photographs posted online. PMID:26147500

  6. Quantifying the Reversible Association of Thermosensitive Nanoparticles

    E-print Network

    Alessio Zaccone; Jerome J. Crassous; Benjamin Béri; Matthias Ballauff

    2011-10-12

    Under many conditions, biomolecules and nanoparticles associate by means of attractive bonds, due to hydrophobic attraction. Extracting the microscopic association or dissociation rates from experimental data is complicated by the dissociation events and by the sensitivity of the binding force to temperature (T). Here we introduce a theoretical model that, combined with light-scattering experiments, allows us to quantify these rates and the reversible binding energy as a function of T. We apply this method to the reversible aggregation of thermoresponsive polystyrene/poly(N-isopropylacrylamide) core-shell nanoparticles, as a model system for biomolecules. We find that the binding energy changes sharply with T, and relate this remarkable switchable behavior to the hydrophobic-hydrophilic transition of the thermosensitive nanoparticles.

  7. Animal biometrics: quantifying and detecting phenotypic appearance.

    PubMed

    Kühl, Hjalmar S; Burghardt, Tilo

    2013-07-01

    Animal biometrics is an emerging field that develops quantified approaches for representing and detecting the phenotypic appearance of species, individuals, behaviors, and morphological traits. It operates at the intersection between pattern recognition, ecology, and information sciences, producing computerized systems for phenotypic measurement and interpretation. Animal biometrics can benefit a wide range of disciplines, including biogeography, population ecology, and behavioral research. Currently, real-world applications are gaining momentum, augmenting the quantity and quality of ecological data collection and processing. However, to advance animal biometrics will require integration of methodologies among the scientific disciplines involved. Such efforts will be worthwhile because the great potential of this approach rests with the formal abstraction of phenomics, to create tractable interfaces between different organizational levels of life. PMID:23537688

  8. World Health Organization: Quantifying environmental health impacts

    NSDL National Science Digital Library

    The World Health Organization works in a number of public health areas, and their work on quantifying environmental health impacts has been receiving praise from many quarters. This site provides materials on their work in this area and visitors with a penchant for international health relief efforts and policy analysis will find the site invaluable. Along the left-hand side of the site, visitors will find topical sections that include "Methods", "Assessment at national level", "Global estimates", and "Publications". In the "Methods" area, visitors will learn about how the World Health Organization's methodology for studying environmental health impacts has been developed and they can also read a detailed report on the subject. The "Global Estimates" area is worth a look as well, and users can look at their complete report, "Preventing Disease Through Healthy Environments: Towards An Estimate Of the Global Burden Of Disease".

  9. Quantifying Flow Resistance of Mountain Streams Using the HHT Approach

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Fu, X.

    2014-12-01

    This study quantifies the flow resistance of mountain streams with gravel beds and pronounced bed forms. The motivation is to follow the earlier idea (Robert, A. 1990) that the bed surface can be divided into micro-scale and macro-scale roughness. We processed field data of longitudinal bed profiles of the Longxi River, Sichuan Province, China, using the Hilbert-Huang Transform (HHT). Each longitudinal profile was decomposed into a set of curves with different frequencies of spatial fluctuation, and the spectrogram was accordingly obtained. We assumed that certain high- and low-frequency curves correspond to the micro- and macro-scale roughness of the stream bed, respectively. From the spectrogram we specified a characteristic height and length representing the macro-scale bed form, and estimated the bed form roughness as proportional to the ratio of the height to the length, multiplied by the height (Yang et al., 2005). We also defined the parameter Sp, the sinuosity of the highest-frequency curve, as the measure of micro-scale roughness, and took into account the effect of bed material size through the product of d50/R and Sp, where d50 is the sediment median size and R is the hydraulic radius. The macro- and micro-scale roughness parameters were merged nonlinearly to evaluate the flow resistance caused by the interplay of friction and form drag. Validation results show that the coefficient of determination can reach 0.84 in the case of the Longxi River. Future studies will focus on verification against more field data as well as the combination of skin friction and form drag. Key words: flow resistance; roughness; HHT; spectrogram; form drag. Robert, A. (1990), Boundary roughness in coarse-grained channels, Prog. Phys. Geogr., 14(1), 42-69. Yang, S.-Q., S.-K. Tan, and S.-Y. Lim (2005), Flow resistance and bed form geometry in a wide alluvial channel, Water Resour. Res., 41, W09419.

  10. Quantifying methane flux from lake sediments using multibeam sonar

    NASA Astrophysics Data System (ADS)

    Scandella, B.; Urban, P.; Delwiche, K.; Greinert, J.; Hemond, H.; Ruppel, C. D.; Juanes, R.

    2013-12-01

    Methane is a potent greenhouse gas, and the production and emission of methane from sediments in wetlands, lakes and rivers both contributes to and may be exacerbated by climate change. In some of these shallow-water settings, methane fluxes may be largely controlled by episodic venting that can be triggered by drops in hydrostatic pressure. Even with better constraints on the mechanisms for gas release, quantifying these fluxes has remained a challenge due to rapid spatiotemporal changes in the patterns of bubble emissions from the sediments. The research presented here uses a fixed-location Imagenex DeltaT 837B multibeam sonar to estimate methane-venting fluxes from organic-rich lake sediments over a large area (~400 m²) and over a multi-season deployment period with unprecedented spatial and temporal resolution. Simpler, single-beam sonar systems have been used in the past to estimate bubble fluxes in a variety of settings. Here we extend this methodology to a multibeam system by means of: (1) detailed calibration of the sonar signal against imposed bubble streams, and (2) validation against an in situ independent record of gas flux captured by overlying bubble traps. The calibrated sonar signals then yield estimates of the methane flux with high spatial resolution (~1 m) and temporal frequency (6 Hz) from a portion of the deepwater basin of Upper Mystic Lake, MA, USA, a temperate eutrophic kettle lake. These results in turn inform mathematical models of methane transport and release from the sediments, which reproduce with high fidelity the ebullitive response to hydrostatic pressure variations. In addition, the detailed information about spatial variability of methane flux derived from sonar records is used to estimate the uncertainty associated with upscaling flux measurements from bubble traps to the scale of the sonar observation area. Taken together, these multibeam sonar measurements and analysis provide a novel quantitative approach for the assessment of methane fluxes from shallow-water bodies. (Figure caption: time series showing how the uncalibrated, sonar-detected flux estimate varies inversely with hydrostatic pressure at 5-minute resolution during April 2012; overlain is the scaled gas flux from a mechanistic numerical model forced by the same hydrostatic pressure signal.)
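
    At its simplest, the calibration chain described here (sonar backscatter calibrated against imposed bubble streams, validated against bubble traps) is a regression of trap-measured flux on integrated backscatter. A toy sketch with hypothetical co-located measurements, fit in log space on the assumption that log flux is roughly linear in backscatter expressed in dB:

      import numpy as np

      # hypothetical co-located measurements
      backscatter = np.array([-62.0, -58.5, -55.0, -51.2, -48.3])  # integrated dB
      trap_flux = np.array([0.8, 1.9, 4.1, 8.7, 16.5])  # mL gas m^-2 min^-1 (traps)

      # calibrate: log(flux) as a linear function of backscatter
      coef = np.polyfit(backscatter, np.log(trap_flux), 1)

      def sonar_flux(bs_db):
          """Map integrated backscatter to gas flux via the trap calibration."""
          return np.exp(np.polyval(coef, bs_db))

      print(sonar_flux(-53.0))  # flux estimate for a new sonar observation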

  11. Land cover change and remote sensing: Examples of quantifying spatiotemporal dynamics in tropical forests

    SciTech Connect

    Krummel, J.R.; Su, Haiping [Argonne National Lab., IL (United States); Fox, J. [East-West Center, Honolulu, HI (United States); Yarnasan, S.; Ekasingh, M. [Chiang Mai Univ. (Thailand)

    1995-06-01

    Research on human impacts or natural processes that operate over broad geographic areas must explicitly address issues of scale and spatial heterogeneity. While the tropical forests of Southeast Asia and Mexico have been occupied and used to meet human needs for thousands of years, traditional forest management systems are currently being transformed by rapid and far-reaching demographic, political, economic, and environmental changes. The dynamics of population growth, migration into the remaining frontiers, and responses to national and international market forces result in a demand for land to produce food and fiber. These results illustrate some of the mechanisms that drive current land use changes, especially in the tropical forest frontiers. By linking the outcome of individual land use decisions with measures of landscape fragmentation and change, the aggregated results show the hierarchy of temporal and spatial events that in summation result in global changes to the most complex and sensitive biome -- tropical forests. By quantifying the spatial and temporal patterns of tropical forest change, researchers can assist policy makers by showing how landscape systems in these tropical forests are controlled by physical, biological, social, and economic parameters.

  12. Adults with Autism Show Increased Sensitivity to Outcomes at Low Error Rates during Decision-Making

    ERIC Educational Resources Information Center

    Minassian, Arpi; Paulus, Martin; Lincoln, Alan; Perry, William

    2007-01-01

    Decision-making is an important function that can be quantified using a two-choice prediction task. Individuals with Autistic Disorder (AD) often show highly restricted and repetitive behavior that may interfere with adaptive decision-making. We assessed whether AD adults showed repetitive behavior on the choice task that was unaffected by…

  13. Use of short half-life cosmogenic isotopes to quantify sediment mixing and transport in karst conduits

    NASA Astrophysics Data System (ADS)

    Paylor, R.

    2011-12-01

    Particulate inorganic carbon (PIC) transport and flux in karst aquifers is poorly understood. Methods to quantify PIC flux are needed in order to account for total inorganic carbon removal (chemical plus mechanical) from karst settings. Quantifying PIC flux will allow more accurate calculations of landscape denudation and global carbon sink processes. The study concentrates on the critical processes of the suspended sediment component of mass flux - surface soil/stored sediment mixing, transport rates and distance, and sediment storage times. The primary objective of the study is to describe transport and mixing with the resolution of single storm-flow events. To quantify the transport processes, short half-life cosmogenic isotopes are utilized. The isotopes ⁷Be (t½ = 53 d) and ²¹⁰Pb (t½ = 22 y) are the primary isotopes measured, and other potential isotopes such as ¹³⁷Cs and ²⁴¹Am are investigated. The study location is at Mammoth Cave National Park within the Logsdon River watershed. The Logsdon River conduit is continuously traversable underground for two kilometers. Background levels and input concentrations of isotopes are determined from soil samples taken at random locations in the catchment area, and from suspended sediment collected from the primary sinking stream during a storm event. Suspended sediment was also collected from the downstream end of the conduit during the storm event. After the storm flow receded, fine sediment samples were taken from the cave stream at regular intervals to determine transport distances and mixing ratios along the conduit. Samples were analyzed with a Canberra Industries gamma ray spectrometer, counted for 24 hours to increase detection of low radionuclide activities. The measured activity levels of radionuclides in the samples were adjusted for decay from the time of sampling using standard decay curves. The results of the study show that surface sediment mixing, transport, and storage in karst conduits are dynamic but potentially quantifiable processes at the storm-event scale.
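
    The decay correction mentioned at the end is a direct application of the decay law A(t) = A₀·e^(−λt) with λ = ln 2 / t½: a measured activity is multiplied back up by e^(+λΔt) to recover the activity at collection time. A minimal sketch (activity and delay hypothetical; half-lives approximate):

      import numpy as np

      HALF_LIFE_DAYS = {"Be-7": 53.0, "Pb-210": 22.3 * 365.25}  # approximate values

      def decay_correct(activity_measured, isotope, days_since_sampling):
          """Back-correct a measured activity to the time of sample collection."""
          lam = np.log(2) / HALF_LIFE_DAYS[isotope]
          return activity_measured * np.exp(lam * days_since_sampling)

      # hypothetical: 7Be activity measured 30 days after collection
      print(decay_correct(12.5, "Be-7", 30))  # activity at collection time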

  14. Methods for quantifying uncertainty in fast reactor analyses.

    SciTech Connect

    Fanning, T. H.; Fischer, P. F.

    2008-04-07

    Liquid-metal-cooled fast reactors in the form of sodium-cooled fast reactors have been successfully built and tested in the U.S. and throughout the world. However, no fast reactor has operated in the U.S. for nearly fourteen years. More importantly, the U.S. has not constructed a fast reactor in nearly 30 years. In addition to reestablishing the necessary industrial infrastructure, the development, testing, and licensing of a new, advanced fast reactor concept will likely require a significant base technology program that will rely more heavily on modeling and simulation than has been done in the past. The ability to quantify uncertainty in modeling and simulations will be an important part of any experimental program and can provide added confidence that established design limits and safety margins are appropriate. In addition, there is an increasing demand from the nuclear industry for best-estimate analysis methods to provide confidence bounds along with their results. The ability to quantify uncertainty will be an important component of modeling that is used to support design, testing, and experimental programs. Three avenues of UQ investigation are proposed. Two relatively new approaches are described which can be directly coupled to simulation codes currently being developed under the Advanced Simulation and Modeling program within the Reactor Campaign. A third approach, based on robust Monte Carlo methods, can be used in conjunction with existing reactor analysis codes as a means of verification and validation of the more detailed approaches.
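
    Of the proposed avenues, the robust Monte Carlo route is the easiest to sketch: sample the uncertain inputs from their distributions, run the analysis code on each draw, and report percentile bounds on the output. The toy below substitutes a cheap stand-in function for a reactor analysis code; all input distributions are hypothetical:

      import numpy as np

      rng = np.random.default_rng(3)

      def model(k_eff_bias, coolant_temp, flow_rate):
          """Cheap stand-in for an expensive reactor analysis code."""
          return 600.0 + 150.0 * k_eff_bias + 0.4 * (coolant_temp - 400.0) - 20.0 * flow_rate

      n = 10_000
      samples = model(
          k_eff_bias=rng.normal(0.0, 0.005, n),    # hypothetical input uncertainties
          coolant_temp=rng.normal(400.0, 5.0, n),
          flow_rate=rng.normal(1.0, 0.05, n),
      )
      lo, med, hi = np.percentile(samples, [2.5, 50, 97.5])
      print(f"peak temperature: {med:.1f} (95% interval {lo:.1f}-{hi:.1f})")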

  15. Risk-Quantified Decision-Making at Rocky Flats

    SciTech Connect

    Myers, Jeffrey C. [Washington Safety Management Solutions, Aiken, South Carolina (United States)

    2008-01-15

    Surface soils in the 903 Pad Lip Area of the Rocky Flats Environmental Technology Site (RFETS) were contaminated with ²³⁹/²⁴⁰Pu by site operations. To meet remediation goals, areas where ²³⁹/²⁴⁰Pu activity exceeded the threshold level of 50 pCi/g needed to be accurately delineated from those below it. In addition, the confidence in remedial decisions needed to be quantified and displayed visually. Remedial objectives needed to achieve a 90 percent certainty that unremediated soils had less than a 10 percent chance of ²³⁹/²⁴⁰Pu activity exceeding 50 pCi/g. Removing areas where the chance of exceedance is greater than 10 percent creates a 90 percent confidence in the remedial effort results. To achieve the stipulated goals, the geostatistical approach of probability kriging (Myers 1997) was implemented. Lessons learnt: Geostatistical techniques provided a risk-quantified approach to remedial decision-making and provided visualizations of the excavation area. Error analysis demonstrated compliance and confirmed that more than sufficient soils were removed. Error analysis also illustrated that any soils above the threshold that were not removed would be of nominal activity. These quantitative approaches were useful from a regulatory, engineering, and stakeholder satisfaction perspective.

  16. Quantifying complexity in translational research: an integrated approach

    PubMed Central

    Munoz, David A.; Nembhard, Harriet Black; Kraschnewski, Jennifer L.

    2014-01-01

    Purpose This article quantifies complexity in translational research. The impact of major operational steps and technical requirements (TR) is calculated with respect to their ability to accelerate moving new discoveries into clinical practice. Design/Methodology/Approach A three-phase integrated Quality Function Deployment (QFD) and Analytic Hierarchy Process (AHP) method was used to quantify complexity in translational research. A case study in obesity was used to demonstrate usability. Findings Generally, the evidence generated was valuable for understanding various components in translational research. Particularly, we found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. Research limitations/implications As the method is mainly based on subjective opinion, some argue that the results may be biased. However, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. Practical implications The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Originality/value Current conceptual models in translational research provide little or no clue to assess complexity. The proposed method aimed to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research. PMID:25417380
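
    The consistency ratio mentioned under the limitations is a standard AHP quantity: for an n×n pairwise comparison matrix, CI = (λmax − n)/(n − 1), and CR = CI/RI with RI a tabulated random index. A minimal sketch (the comparison values are hypothetical):

      import numpy as np

      RI = {3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random indices for small matrices

      def ahp_weights_and_cr(A):
          """Priority weights and consistency ratio for a pairwise comparison matrix."""
          w, v = np.linalg.eig(A)
          k = np.argmax(w.real)
          weights = np.abs(v[:, k].real)
          weights /= weights.sum()
          n = A.shape[0]
          ci = (w[k].real - n) / (n - 1)
          return weights, ci / RI[n]

      # hypothetical 3x3 comparison of translational-research criteria
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])
      weights, cr = ahp_weights_and_cr(A)
      print(weights, f"CR = {cr:.3f}")  # CR < 0.1 is conventionally acceptable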

  17. Quantifying variability on thermal resistance of Listeria monocytogenes.

    PubMed

    Aryani, D C; den Besten, H M W; Hazeleger, W C; Zwietering, M H

    2015-01-16

    Knowledge of the impact of strain variability and growth history on thermal resistance is needed to provide realistic predictions and an adequate design of thermal treatments. In the present study, in addition to quantifying the effect of strain variability on the thermal resistance of Listeria monocytogenes, biological variability and experimental variability were also determined to prioritize their importance. Experimental variability was defined as the repeatability of parallel experimental replicates and biological variability was defined as the reproducibility of biologically independent reproductions. Furthermore, the effect of growth history was quantified. The thermal inactivation curves of 20 L. monocytogenes strains were fitted using the modified Weibull model, resulting in a total of 360 D-value estimates. The D-value ranged from 9 to 30 min at 55 °C; from 0.6 to 4 min at 60 °C; and from 0.08 to 0.6 min at 65 °C. The estimated z-values of all strains ranged from 4.4 to 5.7 °C. The strain variability was ten times higher than the experimental variability and four times higher than the biological variability. Furthermore, the effect of growth history on thermal resistance variability was not significantly different from that of strain variability and was mainly determined by the growth phase. PMID:25462932
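
    D- and z-values are tied together by the classical log-linear model log10 D = a − T/z, so z can be recovered as the negative reciprocal slope of log10 D against temperature. A minimal sketch using illustrative D-values placed inside the ranges the abstract reports:

      import numpy as np

      temps = np.array([55.0, 60.0, 65.0])    # treatment temperatures (degC)
      d_values = np.array([20.0, 2.0, 0.3])   # illustrative D-values (min), mid-range

      # log10(D) = a - T/z  =>  slope of log10(D) vs T is -1/z
      slope, intercept = np.polyfit(temps, np.log10(d_values), 1)
      z_value = -1.0 / slope
      print(f"z-value ~ {z_value:.1f} degC")  # abstract reports 4.4-5.7 degC across strains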

  18. Quantifying Relative Diver Effects in Underwater Visual Censuses

    PubMed Central

    Dickens, Luke C.; Goatley, Christopher H. R.; Tanner, Jennifer K.; Bellwood, David R.

    2011-01-01

    Diver-based Underwater Visual Censuses (UVCs), particularly transect-based surveys, are key tools in the study of coral reef fish ecology. These techniques, however, have inherent problems that make it difficult to collect accurate numerical data. One of these problems is the diver effect (defined as the reaction of fish to a diver). Although widely recognised, its effects have yet to be quantified and the extent of taxonomic variation remains to be determined. We therefore examined relative diver effects on a reef fish assemblage on the Great Barrier Reef. Using common UVC methods, the recorded abundances of seven reef fish groups were significantly affected by the ongoing presence of SCUBA divers. Overall, the diver effect resulted in a 52% decrease in the mean number of individuals recorded, with declines of up to 70% in individual families. Although the diver effect appears to be a significant problem, UVCs remain a useful approach for quantifying spatial and temporal variation in relative fish abundances, especially if using methods that minimise the exposure of fishes to divers. Fixed distance transects using tapes or lines deployed by a second diver (or GPS-calibrated timed swims) would appear to maximise fish counts and minimise diver effects. PMID:21533039

  19. Quantifying Genome Editing Outcomes at Endogenous Loci using SMRT Sequencing

    PubMed Central

    Clark, Joseph; Punjya, Niraj; Sebastiano, Vittorio; Bao, Gang; Porteus, Matthew H

    2014-01-01

    Targeted genome editing with engineered nucleases has transformed the ability to introduce precise sequence modifications at almost any site within the genome. A major obstacle to probing the efficiency and consequences of genome editing is that no existing method enables the frequency of different editing events to be simultaneously measured across a cell population at any endogenous genomic locus. We have developed a novel method for quantifying individual genome editing outcomes at any site of interest using single molecule real time (SMRT) DNA sequencing. We show that this approach can be applied at various loci, using multiple engineered nuclease platforms including TALENs, RNA guided endonucleases (CRISPR/Cas9), and ZFNs, and in different cell lines to identify conditions and strategies in which the desired engineering outcome has occurred. This approach facilitates the evaluation of new gene editing technologies and permits sensitive quantification of editing outcomes in almost every experimental system used. PMID:24685129

  20. Live Cell Interferometry Quantifies Dynamics of Biomass Partitioning during Cytokinesis

    PubMed Central

    Zangle, Thomas A.; Teitell, Michael A.; Reed, Jason

    2014-01-01

    The equal partitioning of cell mass between daughters is the usual and expected outcome of cytokinesis for self-renewing cells. However, most studies of partitioning during cell division have focused on daughter cell shape symmetry or segregation of chromosomes. Here, we use live cell interferometry (LCI) to quantify the partitioning of daughter cell mass during and following cytokinesis. We use adherent and non-adherent mouse fibroblast and mouse and human lymphocyte cell lines as models and show that, on average, mass asymmetries present at the time of cleavage furrow formation persist through cytokinesis. The addition of multiple cytoskeleton-disrupting agents leads to increased asymmetry in mass partitioning which suggests the absence of active mass partitioning mechanisms after cleavage furrow positioning. PMID:25531652

  1. Image processing techniques to quantify microprojections on outer corneal epithelial cells

    PubMed Central

    Julio, Gemma; Merindano, Ma Dolores; Canals, Marc; Ralló, Miquel

    2008-01-01

    It is widely accepted that cellular microprojections (microvilli and/or microplicae) of the corneal surface are essential to maintain the functionality of the tissue. To date, the characterization of these vital structures has been made by analysing scanning or transmission electron microscopy images of the cornea by methods that are intrinsically subjective and imprecise (qualitative or semiquantitative methods). In the present study, numerical data concerning three microprojection features were obtained by an automated method and analysed to establish which of them showed less variability. We propose that the most stable microprojection characteristic would be a useful sign in early detection of epithelial damage or disease. With this aim, the scanning electron microscopy images of 220 corneal epithelial cells of nine rabbits were subjected to several image processing techniques to quantify microprojection density, microprojection average size and surface covered by microprojections (SCM). We then assessed the reliability of the methods used and performed a statistical analysis of the data. Our results show that the thresholding process, the basis of all image processing techniques used in this work, is highly reliable in separating microprojections from the rest of the cell membrane. Assessment of histogram information from thresholded images is a good method to quantify SCM. Amongst the three studied variables, SCM was the most stable (with a coefficient of variation of 15.24%), as 89.09% of the sample cells had SCM values ≥ 40%. We also found that the variability of SCM was mainly due to intercellular differences (the cell factor contribution represented 88.78% of the total variation in the analysed cell areas). Further studies are required to elucidate how healthy corneas maintain high SCM values. PMID:18510513
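
    Computationally, SCM is a thresholded pixel fraction: binarize the cell image and report the percentage of pixels classified as microprojection. A minimal sketch using scikit-image's Otsu threshold on a stand-in image (the study's exact thresholding procedure may differ):

      import numpy as np
      from skimage.filters import threshold_otsu

      def surface_covered_by_microprojections(cell_img):
          """Fraction of pixels above an automatic threshold (SCM, in percent)."""
          t = threshold_otsu(cell_img)
          return 100.0 * np.mean(cell_img > t)

      # hypothetical 8-bit SEM crop of one epithelial cell
      rng = np.random.default_rng(4)
      cell = rng.integers(0, 256, size=(512, 512)).astype(np.uint8)
      print(f"SCM = {surface_covered_by_microprojections(cell):.1f}%")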

  2. Quantifying and Generalizing Hydrologic Responses to Dam Regulation using a Statistical Modeling Approach

    SciTech Connect

    McManamay, Ryan A [ORNL

    2014-01-01

    Despite the ubiquitous existence of dams within riverscapes, much of our knowledge about dams and their environmental effects remains context-specific. Hydrology, more than any other environmental variable, has been studied in great detail with regard to dam regulation. While much progress has been made in generalizing the hydrologic effects of regulation by large dams, many aspects of hydrology show site-specific fidelity to dam operations, small dams (including diversions), and regional hydrologic regimes. A statistical modeling framework is presented to quantify and generalize hydrologic responses to varying degrees of dam regulation. Specifically, the objectives were to 1) compare the effects of local versus cumulative dam regulation, 2) determine the importance of different regional hydrologic regimes in influencing hydrologic responses to dams, and 3) evaluate how different regulation contexts lead to error in predicting hydrologic responses to dams. Overall, model performance was poor in quantifying the magnitude of hydrologic responses, but performance was sufficient in classifying hydrologic responses as negative or positive. Responses of some hydrologic indices to dam regulation were highly dependent upon hydrologic class membership and the purpose of the dam. The opposing coefficients between local and cumulative-dam predictors suggested that hydrologic responses to cumulative dam regulation are complex, and that predicting the hydrology downstream of individual dams, as opposed to multiple dams, may be more easily accomplished using statistical approaches. Results also suggested that particular contexts, including multipurpose dams, high cumulative regulation by multiple dams, diversions, close proximity to dams, and certain hydrologic classes are all sources of increased error when predicting hydrologic responses to dams. Statistical models, such as the ones presented herein, show promise in their ability to model the effects of dam regulation at large spatial scales and to generalize the directionality of hydrologic responses.

  3. A method to quantify organic functional groups and inorganic compounds in ambient aerosols using attenuated total reflectance FTIR spectroscopy and multivariate chemometric techniques

    NASA Astrophysics Data System (ADS)

    Coury, Charity; Dillner, Ann M.

    An attenuated total reflectance-Fourier transform infrared (ATR-FTIR) spectroscopic technique and a multivariate calibration method were developed to quantify ambient aerosol organic functional groups and inorganic compounds. These methods were applied to size-resolved particulate matter samples collected in winter and summer of 2004 at three sites: a downtown Phoenix, Arizona location, a rural site near Phoenix, and an urban fringe site between the urban and rural site. Ten organic compound classes, including four classes which contain a carbonyl functional group, and three inorganic species were identified in the ambient samples. A partial least squares calibration was developed and applied to the ambient spectra, and 13 functional groups related to organic compounds (aliphatic and aromatic CH, methylene, methyl, alkene, aldehydes/ketones, carboxylic acids, esters/lactones, acid anhydrides, carbohydrate hydroxyl and ethers, amino acids, and amines) as well as ammonium sulfate and ammonium nitrate were quantified. Comparison of the sum of the mass measured by the ATR-FTIR technique and gravimetric mass indicates that this method can quantify nearly all of the aerosol mass on sub-micrometer size-segregated samples. Analysis of sample results shows that differences in organic functional group and inorganic compound concentrations at the three sampling sites can be measured with these methods. Future work will analyze the quantified data from these three sites in detail.
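
    Partial least squares calibration of this kind is well supported by standard libraries: regress known component masses on training spectra, then predict the components of new samples. A minimal sketch with scikit-learn on synthetic spectra (dimensions, component count, and noise levels are all hypothetical):

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(5)

      # synthetic training set: 40 calibration spectra (500 wavenumbers),
      # 3 target quantities (e.g. ammonium sulfate, carboxylic acids, aliphatic CH)
      n, p, q = 40, 500, 3
      loadings = rng.normal(size=(q, p))
      Y_train = rng.uniform(0, 10, size=(n, q))                 # known masses (ug)
      X_train = Y_train @ loadings + rng.normal(0, 0.5, size=(n, p))

      pls = PLSRegression(n_components=8).fit(X_train, Y_train)

      # new "ambient" samples with known truth, to check prediction errors
      Y_true = rng.uniform(0, 10, size=(5, q))
      X_new = Y_true @ loadings + rng.normal(0, 0.5, size=(5, p))
      print(np.round(pls.predict(X_new) - Y_true, 2))  # residuals should be small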

  4. Quantifying Acute Myocardial Injury Using Ratiometric Fluorometry

    PubMed Central

    Ranji, Mahsa; Matsubara, Muneaki; Leshnower, Bradley G.; Hinmon, Robin H.; Jaggard, Dwight L.; Chance, Britton; Gorman, Robert C.

    2011-01-01

    Early reperfusion is the best therapy for myocardial infarction (MI). Effectiveness, however, varies significantly between patients and has implications for long-term prognosis and treatment. A technique to assess the extent of myocardial salvage after reperfusion therapy would allow high-risk patients to be identified in the early post-MI period. Mitochondrial dysfunction is associated with cell death following myocardial reperfusion and can be quantified by fluorometry. Therefore, we hypothesized that variations in the fluorescence of mitochondrial nicotinamide adenine dinucleotide (NADH) and flavoprotein (FP) can be used acutely to predict the degree of myocardial injury. Thirteen rabbits underwent coronary occlusion for 30 min followed by 3 h of reperfusion. To produce a spectrum of infarct sizes, six animals were infused with cyclosporine A prior to ischemia. Using a specially designed fluorometric probe, NADH and FP fluorescence were measured in the ischemic area. Changes in NADH and FP fluorescence, as early as 15 min after reperfusion, correlated with postmortem assessment of infarct size (r = 0.695, p < 0.01). This correlation strengthened with time (r = 0.827, p < 0.001 after 180 min). Clinical application of catheter-based myocardial fluorometry may provide a minimally invasive technique for assessing the early response to reperfusion therapy. PMID:19272908

  5. Index to quantify thermal comfort in homes

    SciTech Connect

    Carroll, J.A.

    1980-01-01

    The discomfort index presented here quantifies the thermal discomfort associated with the design and operation of residential buildings during the heating season. Discomfort under steady-state conditions is estimated from the squared difference between the actual and a preferred temperature (which is assumed to vary with clothing and activity levels). The difference is found using a simple linear approximation to the ASHRAE ET that treats humidity effects only implicitly. A penalty is also assessed for transient discomfort effects. The index allows calibration for individual preferences. The proposed index is similar to the currently used Predicted Percentage of Dissatisfied, but is simpler and better suited to simulations of residential environments. Simulations can integrate the index over a heating season along with energy use to estimate the overall thermal performance of a building. These two complementary aspects of performance can be combined into one overall index by using thermostat settings as an indicator of the relative weights that people assign to comfort and energy use.
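
    The steady-state core of the index is simply the squared deviation of indoor temperature from the preferred temperature, averaged over the heating season; the transient penalty and the ET approximation are omitted in this minimal sketch (temperature profile hypothetical):

      import numpy as np

      def discomfort_index(temps, preferred=21.0):
          """Seasonal steady-state discomfort: mean squared deviation (degC^2)
          from the occupants' preferred temperature."""
          temps = np.asarray(temps)
          return np.mean((temps - preferred) ** 2)

      # hypothetical hourly indoor temperatures over one winter day
      hours = np.arange(24)
      indoor = 21.0 - 2.5 * np.exp(-((hours - 5) % 24) / 8.0)  # night setback recovery
      print(f"discomfort = {discomfort_index(indoor):.2f} degC^2")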

  6. Data Used in Quantified Reliability Models

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Kleinhammer, Roger K.; Kahn, C. J.

    2014-01-01

    Data is the crux of developing quantitative risk and reliability models; without data there is no quantification. The means to find and identify reliability data or failure numbers to quantify fault tree models during conceptual and design phases is often the quagmire that precludes early decision makers from considering potential risk drivers that will influence design. The analyst tasked with addressing system or product reliability depends on the availability of data. But where does that data come from, and what does it really apply to? Commercial industries, government agencies, and other international sources might have data similar to what you are looking for. In general, internal and external technical reports and data based on similar and dissimilar equipment are often the first and only places checked. A common philosophy is "I have a number - that is good enough". But is it? Have you ever considered the difference in reported data from various federal datasets and technical reports when compared to similar sources from national and/or international datasets? Just how well does your data compare? Understanding how the reported data was derived, and interpreting the information and details associated with the data, is as important as the data itself.

  7. Quantifying the evolutionary dynamics of language

    PubMed Central

    Lieberman, Erez; Michel, Jean-Baptiste; Jackson, Joe; Tang, Tina; Nowak, Martin A.

    2008-01-01

    Human language is based on grammatical rules [1-4]. Cultural evolution allows these rules to change over time [5]. Rules compete with each other: as new rules rise to prominence, old ones die away. To quantify the dynamics of language evolution, we studied the regularization of English verbs over the last 1200 years. Although an elaborate system of productive conjugations existed in English’s proto-Germanic ancestor, modern English uses the dental suffix, -ed, to signify past tense [6]. Here, we describe the emergence of this linguistic rule amidst the evolutionary decay of its exceptions, known to us as irregular verbs. We have generated a dataset of verbs whose conjugations have been evolving for over a millennium, tracking inflectional changes to 177 Old English irregulars. Of these irregulars, 145 remained irregular in Middle English and 98 are still irregular today. We study how the rate of regularization depends on the frequency of word usage. The half-life of an irregular verb scales as the square root of its usage frequency: a verb that is 100 times less frequent regularizes 10 times as fast. Our study provides a quantitative analysis of the regularization process by which ancestral forms gradually yield to an emerging linguistic rule. PMID:17928859
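
    The reported scaling can be written compactly: if f is a verb's usage frequency, the regularization half-life obeys

        t_{1/2} \propto \sqrt{f}

    so a verb used 100 times less often has a half-life shorter by a factor of \sqrt{100} = 10, i.e. it regularizes ten times as fast.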

  8. Fluorescence imaging to quantify crop residue cover

    NASA Technical Reports Server (NTRS)

    Daughtry, C. S. T.; Mcmurtrey, J. E., III; Chappelle, E. W.

    1994-01-01

    Crop residues, the portion of the crop left in the field after harvest, can be an important management factor in controlling soil erosion. Methods to quantify residue cover are needed that are rapid, accurate, and objective. Scenes with known amounts of crop residue were illuminated with long wave ultraviolet (UV) radiation and fluorescence images were recorded with an intensified video camera fitted with a 453 to 488 nm band pass filter. A light colored soil and a dark colored soil were used as background for the weathered soybean stems. Residue cover was determined by counting the proportion of the pixels in the image with fluorescence values greater than a threshold. Soil pixels had the lowest gray levels in the images. The values of the soybean residue pixels spanned nearly the full range of the 8-bit video data. Classification accuracies typically were within 3 (absolute units) of measured cover values. Video imaging can provide an intuitive understanding of the fraction of the soil covered by residue.

  9. Quantifying the Magnetic Advantage in Magnetotaxis

    PubMed Central

    Smith, M. J.; Sheehan, P. E.; Perry, L. L.; O'Connor, K.; Csonka, L. N.; Applegate, B. M.; Whitman, L. J.

    2006-01-01

    Magnetotactic bacteria are characterized by the production of magnetosomes, nanoscale particles of lipid bilayer encapsulated magnetite, that act to orient the bacteria in magnetic fields. These magnetosomes allow magneto-aerotaxis, which is the motion of the bacteria along a magnetic field and toward preferred concentrations of oxygen. Magneto-aerotaxis has been shown to direct the motion of these bacteria downward toward sediments and microaerobic environments favorable for growth. Herein, we compare the magneto-aerotaxis of wild-type, magnetic Magnetospirillum magneticum AMB-1 with a nonmagnetic mutant we have engineered. Using an applied magnetic field and an advancing oxygen gradient, we have quantified the magnetic advantage in magneto-aerotaxis as a more rapid migration to preferred oxygen levels. Magnetic, wild-type cells swimming in an applied magnetic field more quickly migrate away from the advancing oxygen than either wild-type cells in a zero field or the nonmagnetic cells in any field. We find that the responses of the magnetic and mutant strains are well described by a relatively simple analytical model, an analysis of which indicates that the key benefit of magnetotaxis is an enhancement of a bacterium's ability to detect oxygen, not an increase in its average speed moving away from high oxygen concentrations. PMID:16714352

  10. Smart velocity ranging quantifiable optical microangiography

    NASA Astrophysics Data System (ADS)

    Zhi, Zhongwei; Wang, Ruikang K.

    2011-03-01

    We introduce a new type of Optical Microangiography (OMAG), called Quantifiable Optical Microangiography (QOMAG), which is capable of performing quantitative flow imaging with smart velocity ranging. In order to extract multi-range velocities, two three-dimensional data sets need to be acquired over the same imaging area. One data set is densely scanned in the B-scan direction, with Doppler analysis performed on the basis of subsequent A-scans, while the other is densely scanned in the C-scan direction, with Doppler analysis performed on the basis of consecutive B-scans. Since the velocity ranging is determined by the time interval between consecutive measurements of the spectral fringes, a longer time interval gives higher sensitivity to slow velocities. By simultaneously acquiring data sets with different time intervals, we can perform smart velocity-ranging quantification of blood flow characterized by different velocity values. The feasibility of QOMAG for variable blood flow imaging is demonstrated by in vivo studies on cerebral blood flow in a mouse model. QOMAG can provide a multi-range, detailed blood flow map within the intracranial dura mater and cortex of the mouse brain.
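
    The velocity ranging rests on the phase-resolved Doppler relation v = λ₀·Δφ / (4π·n·Δt): the maximum unambiguous axial velocity (at |Δφ| = π) shrinks as the interval Δt between compared scans grows, which is why B-scan-interval analysis resolves slow flow and A-scan-interval analysis resolves fast flow. A minimal sketch with hypothetical system parameters:

      import numpy as np

      def doppler_velocity_range(wavelength_m, dt_s, n_tissue=1.35):
          """Maximum unambiguous axial velocity for a phase shift |dphi| <= pi."""
          return wavelength_m * np.pi / (4 * np.pi * n_tissue * dt_s)

      wavelength = 1310e-9     # hypothetical source center wavelength (m)
      dt_ascan = 1 / 47_000    # adjacent A-scans at a hypothetical 47 kHz line rate
      dt_bscan = 1 / 100       # adjacent B-scans at a hypothetical 100 frames/s

      print(f"A-scan ranging: up to {1e3 * doppler_velocity_range(wavelength, dt_ascan):.1f} mm/s")
      print(f"B-scan ranging: up to {1e3 * doppler_velocity_range(wavelength, dt_bscan):.3f} mm/s")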

  11. Next-Generation Genotyping by Digital PCR to Detect and Quantify the BRAF V600E Mutation in Melanoma Biopsies.

    PubMed

    Lamy, Pierre-Jean; Castan, Florence; Lozano, Nicolas; Montélion, Cécile; Audran, Patricia; Bibeau, Frédéric; Roques, Sylvie; Montels, Frédéric; Laberenne, Anne-Claire

    2015-07-01

    The detection of the BRAF V600E mutation in melanoma samples is used to select patients who should respond to BRAF inhibitors. Different techniques are routinely used to determine BRAF status in clinical samples. However, low tumor cellularity and tumor heterogeneity can affect the sensitivity of somatic mutation detection. Digital PCR (dPCR) is a next-generation genotyping method that clonally amplifies nucleic acids and allows the detection and quantification of rare mutations. Our aim was to evaluate the routine clinical performance of a new dPCR-based test to detect and quantify BRAF mutation load in 47 paraffin-embedded cutaneous melanoma biopsies. We compared the results obtained by dPCR with high-resolution melting curve analysis and pyrosequencing, or with one of the allele-specific PCR methods available on the market. dPCR showed the lowest limit of detection, and dPCR and allele-specific amplification detected the highest number of mutated samples. For BRAF mutation load quantification, both dPCR and pyrosequencing gave similar results, with strong disparities in allele frequencies across the 47 tumor samples under study (from 0.7% to 79% BRAF V600E mutations per sample). In conclusion, the four methods showed a high degree of concordance. dPCR was the most sensitive method for reliably and easily detecting mutations. Both pyrosequencing and dPCR could quantify the mutation load in heterogeneous tumor samples. PMID:25952101
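
    For concreteness, dPCR quantification rests on Poisson statistics of partition occupancy; below is a sketch of how a mutant allele fraction could be computed from droplet counts. The counts and function names are illustrative, not data from the study (Python):

      import math

      def copies_per_partition(positives, total):
          # Poisson correction: mean template copies per droplet inferred
          # from the fraction of positive droplets.
          return -math.log(1.0 - positives / total)

      def mutant_allele_fraction(mut_pos, wt_pos, total):
          # Mutation load as the fraction of mutant copies among all copies.
          lam_mut = copies_per_partition(mut_pos, total)
          lam_wt = copies_per_partition(wt_pos, total)
          return lam_mut / (lam_mut + lam_wt)

      # Illustrative droplet counts for mutant and wild-type probes.
      print(f"{mutant_allele_fraction(120, 9000, 20000):.2%}")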

  12. Everything, everywhere, all the time: quantifying the information gained from intensive hydrochemical sampling

    NASA Astrophysics Data System (ADS)

    Kirchner, J. W.; Neal, C.

    2011-12-01

    Catchment hydrochemical studies have suffered from a stark mismatch of measurement timescales: water fluxes are typically measured sub-hourly, but their chemical signatures are typically sampled only weekly or monthly. At the Plynlimon catchment in mid-Wales, however, precipitation and streamflow have now been sampled every seven hours for nearly two years and analyzed for deuterium, oxygen-18, and more than 40 chemical species. This high-frequency sampling reveals temporal patterns that would be invisible in typical weekly monitoring samples. Furthermore, recent technological developments are now leading to systems that can provide measurements of rainfall and streamflow chemistry at hourly or sub-hourly intervals, similar to the time scales at which hydrometric data have long been available, and can provide these measurements for long spans of time, not just for intensive field campaigns associated with individual storms. But at what point will higher-frequency measurements become pointless, as additional measurements simply "connect the dots" between lower-frequency data points? Information theory, dating back to the original work of Shannon and colleagues in the 1940s, provides mathematical tools for rigorously quantifying the information content of a time series. The key input data for such an analysis are the power spectrum of the measured data and the power spectrum of the measurement noise. Here we apply these techniques to the high-frequency Plynlimon data set. The results show that, at least up to the 7-hourly sampling frequency, the information content of the time series increases nearly linearly with the frequency of sampling. These results rigorously quantify what inspection of the time series visually suggests: these high-frequency data do not simply "connect the dots" between lower-frequency measurements, but instead contain a richly textured signature of dynamic behavior in catchment hydrochemistry.
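
    A sketch of the Shannon-style calculation described above: for a signal observed in additive Gaussian noise, the information rate follows from the two power spectra as the integral of log2(1 + S(f)/N(f)). The spectra below are synthetic placeholders, not the Plynlimon data (Python):

      import numpy as np

      def information_rate(freqs, signal_psd, noise_psd):
          # Trapezoidal integration of log2(1 + S(f)/N(f)) over frequency.
          integrand = np.log2(1.0 + signal_psd / noise_psd)
          return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(freqs))

      f = np.linspace(1e-3, 0.5, 1000)   # cycles per hour (assumed units)
      S = 1.0 / f**2                     # "red" 1/f^2 signal spectrum, synthetic
      N = np.full_like(f, 10.0)          # flat measurement noise, synthetic
      print(f"{information_rate(f, S, N):.1f} bits per hour")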

  13. A new metric for quantifying performance impairment on the psychomotor vigilance test.

    PubMed

    Rajaraman, Srinivasan; Ramakrishnan, Sridhar; Thorsley, David; Wesensten, Nancy J; Balkin, Thomas J; Reifman, Jaques

    2012-12-01

    We have developed a new psychomotor vigilance test (PVT) metric for quantifying the effects of sleep loss on performance impairment. The new metric quantifies performance impairment by estimating the probability density of response times (RTs) in a PVT session and then considering deviations of the density relative to that of a baseline-session density. Results from a controlled laboratory study involving 12 healthy adults subjected to 85 h of extended wakefulness, followed by 12 h of recovery sleep, revealed that the group performance variability based on the new metric remained relatively uniform throughout wakefulness. In contrast, the variability of PVT lapses, mean RT, median RT, and (to a lesser extent) mean speed showed strong time-of-day effects, with the PVT lapse variability changing with time of day depending on the selected threshold. Our analysis suggests that the new metric captures the homeostatic and circadian processes underlying sleep regulation more effectively than the other metrics, both directly in terms of larger effect sizes (4-61% larger) and indirectly through improved fits to the two-process model (9-67% larger coefficient of determination). Although the trend of the mean speed results followed that of the new metric, we found that mean speed yields significantly smaller (~50%) intersubject performance variance than the other metrics. Based on these findings, and because the new metric considers performance changes based on the entire set of responses relative to a baseline, we conclude that it provides a number of potential advantages over the traditional PVT metrics. PMID:22436093
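
    A rough sketch of the idea behind the metric: estimate a session's response-time density and measure its deviation from the baseline-session density. The Jensen-Shannon distance is used here only as an illustrative stand-in for the paper's deviation measure, and all data are synthetic (Python):

      import numpy as np
      from scipy.stats import gaussian_kde
      from scipy.spatial.distance import jensenshannon

      def impairment_score(baseline_rts, session_rts):
          # Kernel density estimates of RT distributions (ms), compared on a
          # common grid; a larger distance means a larger deviation from baseline.
          grid = np.linspace(100.0, 1500.0, 500)
          p = gaussian_kde(baseline_rts)(grid)
          q = gaussian_kde(session_rts)(grid)
          return jensenshannon(p, q)  # normalizes p and q internally

      rng = np.random.default_rng(1)
      rested = rng.gamma(9.0, 30.0, 400)          # RTs near ~270 ms, synthetic
      sleepy = rng.gamma(9.0, 40.0, 400) + 100.0  # slowed, heavier-tailed session
      print(impairment_score(rested, sleepy))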

  14. University of Florida Bee College Honey Show

    E-print Network

    Jawitz, James W.

    University of Florida Bee College Honey Show A REMINDER TO JUDGES 1. Judging will begin promptly or the Honey Show Manager only. 3. The Show will provide color grading glasses if required, towel, basin equipment (see judges checklist). For cake judging, knives will be provided by the Honey Show Manager. 4

  15. Quantifying human vitamin kinetics using AMS

    SciTech Connect

    Hillegonds, D; Dueker, S; Ognibene, T; Buchholz, B; Lin, Y; Vogel, J; Clifford, A

    2004-02-19

    Tracing vitamin kinetics at physiologic concentrations has been hampered by a lack of quantitative sensitivity for chemically equivalent tracers that could be used safely in healthy people. Instead, elderly or ill volunteers were sought for studies involving pharmacologic doses with radioisotopic labels. These studies fail to be relevant in two ways: vitamins are inherently micronutrients, whose biochemical paths are saturated and distorted by pharmacological doses; and while vitamins remain important for health in the elderly or ill, their greatest effects may be in preventing slow and cumulative diseases through proper consumption throughout youth and adulthood. Neither the target dose nor the target population is accessible for nutrient metabolic studies through decay counting of radioisotopes at high levels. Stable isotopic labels are quantified by isotope ratio mass spectrometry at levels that trace physiologic vitamin doses, but the natural background of stable isotopes severely limits the time span over which the tracer is distinguishable. Indeed, study periods have seldom spanned even a single biological mean life of the labeled nutrients, failing to provide data on the important final elimination phase of the compound. Kinetic data for the absorption phase are similarly rare in micronutrient research because the phase is rapid, requiring many consecutive plasma samples for accurate representation. However, repeated blood samples of sufficient volume for precise stable- or radio-isotope quantitation would consume an indefensible amount of the volunteer's blood over a short period. Thus, vitamin pharmacokinetics in humans has often relied on compartmental modeling based upon assumptions and tested only for the short period of maximal blood circulation, a period that poorly reflects absorption or final elimination kinetics except for the simplest models.

  16. Quantifying landscape resilience using vegetation indices

    NASA Astrophysics Data System (ADS)

    Eddy, I. M. S.; Gergel, S. E.

    2014-12-01

    Landscape resilience refers to the ability of systems to adapt to and recover from disturbance. In pastoral landscapes, degradation can be measured in terms of increased desertification and/or shrub encroachment. In many countries across Central Asia, the use and resilience of pastoral systems have changed markedly over the past 25 years, influenced by centralized Soviet governance, private property rights, and, recently, communal resource governance. In Kyrgyzstan, recent governance reforms were a response to the increasing degradation of pastures attributed to livestock overgrazing. Our goal is to examine and map the landscape-level factors that influence overgrazing throughout successive governance periods. Here, we map and examine some of the spatial factors influencing landscape resilience in agro-pastoral systems in the Kyrgyz Republic, where pastures occupy >50% of the country's area. We ask three questions: 1) Which mechanisms of pasture degradation (desertification vs. shrub encroachment) are detectable using remote-sensing vegetation indices? 2) Are degraded pastures associated with landscape features that influence herder mobility and accessibility (e.g., terrain, distance to other pastures)? 3) Have these patterns changed through successive governance periods? Using a chronosequence of Landsat imagery (1999-2014), NDVI and other vegetation indices were used to identify trends in pasture condition during the growing season. Least-cost path distances as well as graph-theoretic indices were derived from topographic factors to assess landscape connectivity (from villages to pastures and among pastures). Fieldwork was used to assess the feasibility and accuracy of this approach using the most recent imagery. Previous research concluded that low herder mobility hindered pasture use; we therefore expect the distance from pasture to village to be an important predictor of pasture condition. This research will quantify the magnitude of pastoral degradation and test assumptions regarding sustainable pastoral management. As grazing is the most extensive land use on Earth, understanding the broad-scale factors that influence the resilience of pastoral systems is an important issue globally.

  17. Quantifying the Behavior of Stock Correlations Under Market Stress

    PubMed Central

    Preis, Tobias; Kenett, Dror Y.; Stanley, H. Eugene; Helbing, Dirk; Ben-Jacob, Eshel

    2012-01-01

    Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios. PMID:23082242
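
    A minimal sketch of the rolling-correlation analysis described above, assuming daily closing prices are held in pandas objects; the function names, window length, and synthetic demo data are illustrative (Python):

      import numpy as np
      import pandas as pd

      def mean_correlation(returns):
          # Average off-diagonal pairwise correlation among the stocks.
          c = returns.corr().to_numpy()
          n = c.shape[0]
          return (c.sum() - n) / (n * (n - 1))

      def stress_vs_correlation(prices, index, window=60):
          # Pairs a rolling mean inter-stock correlation with normalized
          # index returns (the "market stress" variable of the paper).
          r = np.log(prices).diff().dropna()
          ri = np.log(index).diff().dropna()
          stress = (ri - ri.mean()) / ri.std()
          corr = [mean_correlation(r.iloc[i - window:i]) for i in range(window, len(r))]
          return stress.iloc[window:], pd.Series(corr, index=r.index[window:])

      # Synthetic demo; with IID noise no stress dependence is expected here,
      # whereas the paper reports correlation rising with normalized losses.
      rng = np.random.default_rng(0)
      prices = pd.DataFrame(rng.normal(0, 0.01, (500, 30))).cumsum() + 100
      s, c = stress_vs_correlation(prices, prices.mean(axis=1))
      print(np.corrcoef(np.abs(s), c)[0, 1])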

  18. Quantifying light exposure patterns in young adult students.

    PubMed

    Alvarez, Amanda A; Wildsoet, Christine F

    2013-10-01

    Exposure to bright light appears to be protective against myopia in both animals (chicks, monkeys) and children, but quantitative data on human light exposure are limited. In this study, we report on a technique for quantifying light exposure using wearable sensors. Twenty-seven young adult subjects wore a light sensor continuously for two weeks during one of three seasons, and also completed questionnaires about their visual activities. Light data were analyzed with respect to refractive error and season, and the objective sensor data were compared with subjects' estimates of time spent indoors and outdoors. Subjects' estimates of time spent indoors and outdoors were in poor agreement with durations reported by the sensor data. The results of questionnaire-based studies of light exposure should thus be interpreted with caution. The role of light in refractive error development should be investigated using multiple methods such as sensors to complement questionnaires. PMID:25342873

  19. Quantifying light exposure patterns in young adult students

    PubMed Central

    Alvarez, Amanda A.; Wildsoet, Christine F.

    2014-01-01

    Exposure to bright light appears to be protective against myopia in both animals (chicks, monkeys) and children, but quantitative data on human light exposure are limited. In this study, we report on a technique for quantifying light exposure using wearable sensors. Twenty-seven young adult subjects wore a light sensor continuously for two weeks during one of three seasons, and also completed questionnaires about their visual activities. Light data were analyzed with respect to refractive error and season, and the objective sensor data were compared with subjects’ estimates of time spent indoors and outdoors. Subjects’ estimates of time spent indoors and outdoors were in poor agreement with durations reported by the sensor data. The results of questionnaire-based studies of light exposure should thus be interpreted with caution. The role of light in refractive error development should be investigated using multiple methods such as sensors to complement questionnaires. PMID:25342873

  20. Potential dosemeter for quantifying biologically effective blue light exposures.

    PubMed

    Turnbull, David J; Parisi, Alfio V

    2012-04-01

    This paper reports on the development of a blue light (VIS(BL)) dosemeter. The VIS(BL) dosemeter is based on the combination of polysulfone and phenothiazine as a potential VIS(BL) dosemeter for population studies of exposures related to the blue light hazard. This research found that this combination of photosensitive chromophores reacts to both ultraviolet and visible wavelengths of the solar spectrum. Further to this, the majority of the ultraviolet wavelengths <380 nm can be filtered out with the use of a low-pass filter. It was found that a large change in optical absorbance at 437 nm occurred when the dosemeter was employed to quantify the solar blue light hazard exposures. Preliminary results indicate that this dosemeter saturates relatively slowly and is able to measure exposures equivalent to >1200 kJ m(-2) of blue light hazard weighted solar radiation. PMID:21712257

  1. Quantifying the Impact of Unavailability in Cyber-Physical Environments

    SciTech Connect

    Aissa, Anis Ben [Université de Tunis El Manar, Tunisia]; Abercrombie, Robert K [ORNL]; Sheldon, Frederick T. [University of Memphis]; Mili, Ali [New Jersey Institute of Technology]

    2014-01-01

    The Supervisory Control and Data Acquisition (SCADA) system discussed in this work manages a distributed control network for the Tunisian Electric & Gas Utility. The network is dispersed over a large geographic area that monitors and controls the flow of electricity/gas from both remote and centralized locations. The availability of the SCADA system in this context is critical to ensuring the uninterrupted delivery of energy, including safety, security, continuity of operations and revenue. Such SCADA systems are the backbone of national critical cyber-physical infrastructures. Herein, we propose adapting the Mean Failure Cost (MFC) metric for quantifying the cost of unavailability. This new metric combines the classic availability formulation with MFC. The resulting metric, so-called Econometric Availability (EA), offers a computational basis to evaluate a system in terms of the gain/loss ($/hour of operation) that affects each stakeholder due to unavailability.

  2. Unified index to quantifying heterogeneity of complex networks

    NASA Astrophysics Data System (ADS)

    Hu, Hai-Bo; Wang, Xiao-Fan

    2008-06-01

    Although recent studies have revealed that degree heterogeneity of a complex network has significant impact on the network performance and function, a unified definition of the heterogeneity of a network with any degree distribution is absent. In this paper, we define a heterogeneity index 0 ≤ H < 1 to quantify the degree heterogeneity of any given network. We analytically show the existence of an upper bound of H=0.5 for exponential networks, thus explaining why exponential networks are homogeneous. On the other hand, we also analytically show that the heterogeneity index of an infinite power law network is between 1 and 0.5 if and only if its degree exponent is between 2 and 2.5. We further show that for any power law network with a degree exponent greater than 2.5, there always exists an exponential network such that both networks have the same heterogeneity index. This may help to explain why 2.5 is a critical degree exponent for some dynamic behaviors on power law networks.

  3. Quantifying tetrodotoxin levels in the California newt using a non-destructive sampling method

    E-print Network

    Shaffer, H. Bradley

    Quantifying tetrodotoxin levels in the California newt using a non-destructive sampling method Gary processes. The neurotoxin, tetrodotoxin (TTX), which is found in newts of the genus Taricha, acts individuals. We also show that embryos from oviposited California newt (Taricha torosa) egg masses can

  4. gene encoding enhanced green fluorescent protein to the repressor gene, and quantify

    E-print Network

    Weeks, Eric R.

    gene encoding enhanced green fluorescent protein to the repressor gene, and quantify of gene expression in the feedback network, compared with the control networks. They also show concentrations of anhydrotetracycline -- a chemical inhibitor of TetR. In past theoretical studies of gene

  5. A novel three-dimensional model to quantify metastatic melanoma invasion

    E-print Network

    George, Steven C.

    A novel three-dimensional model to quantify metastatic melanoma invasion Cyrus M. Ghajar,1 Vinod. Culturing melanomas of different metastatic capacities within the system showed that each cell type (i.e., matrix components, interstitial cell presence) on planar and vertical melanoma invasion. We

  6. The Interpretation of Classically Quantified Sentences: A Set-Theoretic Approach

    ERIC Educational Resources Information Center

    Politzer, Guy; Van der Henst, Jean-Baptiste; Delle Luche, Claire; Noveck, Ira A.

    2006-01-01

    We present a set-theoretic model of the mental representation of classically quantified sentences (All P are Q, Some P are Q, Some P are not Q, and No P are Q). We take inclusion, exclusion, and their negations to be primitive concepts. We show that although these sentences are known to have a diagrammatic expression (in the form of the Gergonne…

  7. Quantifying the degree of self-nestedness of trees.

    E-print Network

    Paris-Sud XI, Université de

    Quantifying the degree of self-nestedness of trees. Application to the structural in the problem of approximating trees by trees with a particular self-nested structure. Self-nested trees are such that all their subtrees of a given height are isomorphic. We show that these trees present remarkable

  8. Quantifying Local Radiation-Induced Lung Damage From Computed Tomography

    SciTech Connect

    Ghobadi, Ghazaleh; Hogeweg, Laurens E. [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Faber, Hette [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Department of Cell Biology, Section of Radiation and Stress Cell Biology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Tukker, Wim G.J. [Department of Radiology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Schippers, Jacobus M. [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Accelerator Department, Paul Scherrer Institut, Villigen (Switzerland); Brandenburg, Sytze [Kernfysisch Versneller Instituut, Groningen (Netherlands); Langendijk, Johannes A. [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Coppes, Robert P. [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Department of Cell Biology, Section of Radiation and Stress Cell Biology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Luijk, Peter van, E-mail: p.van.luijk@rt.umcg.n [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands)

    2010-02-01

    Purpose: Optimal implementation of new radiotherapy techniques requires accurate predictive models for normal tissue complications. Since clinically used dose distributions are nonuniform, local tissue damage needs to be measured and related to local tissue dose. In lung, radiation-induced damage results in density changes that have been measured by computed tomography (CT) imaging noninvasively, but not yet on a localized scale. Therefore, the aim of the present study was to develop a method for quantification of local radiation-induced lung tissue damage using CT. Methods and Materials: CT images of the thorax were made 8 and 26 weeks after irradiation of 100%, 75%, 50%, and 25% lung volume of rats. Local lung tissue structure (S_L) was quantified from the local mean and local standard deviation of the CT density in Hounsfield units in 1-mm^3 subvolumes. The relation of changes in S_L (ΔS_L) to histologic changes and breathing rate was investigated. Feasibility for clinical application was tested by applying the method to CT images of a patient with non-small-cell lung carcinoma and investigating the local dose-effect relationship of ΔS_L. Results: In rats, a clear dose-response relationship of ΔS_L was observed at different time points after radiation. Furthermore, ΔS_L correlated strongly with histologic endpoints (infiltrates and inflammatory cells) and breathing rate. In the patient, progressive local dose-dependent increases in ΔS_L were observed. Conclusion: We developed a method to quantify local radiation-induced tissue damage in the lung using CT. This method can be used in the development of more accurate predictive models for normal tissue complications.
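
    A minimal sketch of the two ingredients of S_L, the local mean and local standard deviation of CT density computed in cubic subvolumes; the subvolume size in voxels, the synthetic data, and the function name are assumptions, and the paper's combination of the two into S_L is not reproduced here (Python):

      import numpy as np
      from scipy.ndimage import uniform_filter

      def local_mean_and_std(ct_hu, size_vox=4):
          # Moving-window mean and standard deviation of CT density (HU)
          # over cubic subvolumes of edge length size_vox voxels.
          mean = uniform_filter(ct_hu, size_vox)
          sq_mean = uniform_filter(ct_hu**2, size_vox)
          std = np.sqrt(np.maximum(sq_mean - mean**2, 0.0))
          return mean, std

      ct = np.random.default_rng(2).normal(-800.0, 60.0, (64, 64, 64))  # lung-like HU
      m, s = local_mean_and_std(ct)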

  9. Quantifying induced effects of subsurface renewable energy storage

    NASA Astrophysics Data System (ADS)

    Bauer, Sebastian; Beyer, Christof; Pfeiffer, Tilmann; Boockmeyer, Anke; Popp, Steffi; Delfs, Jens-Olaf; Wang, Bo; Li, Dedong; Dethlefsen, Frank; Dahmke, Andreas

    2015-04-01

    New methods and technologies for energy storage are required for the transition to renewable energy sources. Subsurface energy storage systems such as salt caverns or porous formations offer the possibility of hosting large amounts of energy or substances. When employing these systems, an adequate system and process understanding is required in order to assess the feasibility of the individual storage option at the respective site and to predict the complex, interacting effects induced by its use. This understanding is the basis for assessing the potential as well as the risks connected with sustainable usage of these storage options, especially when considering possible mutual influences. To achieve this aim, in this work synthetic scenarios for the use of the geological underground as an energy storage system are developed and parameterized. The scenarios are designed to represent typical conditions in North Germany. The types of subsurface use investigated here include gas storage and heat storage in porous formations. The scenarios are numerically simulated and interpreted with regard to risk analysis and effect forecasting. For this, the numerical simulators Eclipse and OpenGeoSys are used; the latter is enhanced to include the required coupled hydraulic, thermal, geomechanical, and geochemical processes. Using the simulated and interpreted scenarios, the induced effects are quantified individually, and monitoring concepts for observing these effects are derived. This presentation will detail the general investigation concept used and analyze the parameter availability for this type of model application. The process implementation and numerical methods required for simulating the induced effects of subsurface storage are then detailed and explained. Application examples illustrate the developed methods and quantify induced effects and storage sizes for the typical settings parameterized. This work is part of the ANGUS+ project, funded by the German Ministry of Education and Research (BMBF).

  10. Quantifying the thermodynamic entropy budget of the land surface: is this useful?

    NASA Astrophysics Data System (ADS)

    Brunsell, N. A.; Schymanski, S. J.; Kleidon, A.

    2011-06-01

    As a system is moved away from a state of thermodynamic equilibrium, spatial and temporal heterogeneity is induced. A possible methodology to assess these impacts is to examine the thermodynamic entropy budget and assess the role of entropy production and transfer between the surface and the atmosphere. Here, we adopted this thermodynamic framework to examine the implications of changing vegetation fractional cover for land surface energy exchange processes, using the NOAH land surface model and eddy covariance observations. Simulations that varied the relative fraction of vegetation were used to calculate the resultant entropy budget as a function of vegetation fraction. Results showed that increasing vegetation fraction increases entropy production by the land surface while decreasing the overall entropy budget (the rate of change in entropy at the surface). This is accomplished largely via a simultaneous increase in the entropy production associated with the absorption of solar radiation and a decline in the Bowen ratio (the ratio of sensible to latent heat flux), which increases the entropy export associated with the latent heat flux during daylight hours; nighttime hours are dominated by entropy transfer associated with the sensible and soil heat fluxes. Eddy covariance observations also show that entropy production has a consistent sensitivity to land cover, while the overall entropy budget appears most related to the net radiation at the surface, though with large variance. This implies that the thermodynamic entropy budget and entropy production are useful metrics for assessing biosphere-atmosphere-hydrosphere system interactions.

  11. Quantifying the thermodynamic entropy budget of the land surface: is this useful?

    NASA Astrophysics Data System (ADS)

    Brunsell, N. A.; Schymanski, S. J.; Kleidon, A.

    2011-01-01

    As a system is moved away from a state of thermodynamic equilibrium, spatial and temporal heterogeneity is induced. A possible methodology to assess these impacts is to examine the thermodynamic entropy budget and assess the role of entropy production and transfer between the surface and the atmosphere. Here, we adopted this thermodynamic framework to examine the implications of changing vegetation fractional cover for land surface energy exchange processes, using the NOAH land surface model and eddy covariance observations. Simulations that varied the relative fraction of vegetation were used to calculate the resultant entropy budget as a function of vegetation fraction. Results showed that increasing vegetation fraction increases entropy production by the land surface while decreasing the overall entropy budget (the rate of change in entropy at the surface). This is accomplished largely via a simultaneous increase in the entropy production associated with the absorption of solar radiation and a decline in the Bowen ratio (the ratio of sensible to latent heat flux), which increases the entropy export associated with the latent heat flux during daylight hours; nighttime hours are dominated by entropy transfer associated with the sensible and soil heat fluxes. Eddy covariance observations also show that entropy production has a consistent sensitivity to land cover, while the overall entropy budget appears most related to the net radiation at the surface. This implies that the thermodynamic entropy budget and entropy production are useful metrics for assessing biosphere-atmosphere-hydrosphere system interactions.

  12. A numerical analysis on the applicability of the water level fluctuation method for quantifying groundwater recharge

    NASA Astrophysics Data System (ADS)

    Koo, M.; Lee, D.

    2002-12-01

    The water-table fluctuation (WTF) method is a conventional method for quantifying groundwater recharge by multiplying the water-level rise by the specific yield. Based on the van Genuchten model, an analytical relationship between groundwater recharge and the water-level rise is derived. The equation is used to analyze the effects of the depth to the water level and the soil properties on the recharge estimate obtained with the WTF method. The results show that the WTF method is reliable when applied to aquifers of fluvial sand, provided the water table is below 1 m depth. However, when it is applied to silt loam with a water-table depth of 4-10 m, the recharge is overestimated by 30-80%, and the error increases drastically as the water table becomes shallower. A 2-D unconfined flow model with a time series of the recharge rate is developed. It is used to elucidate the errors of the WTF method, which is implicitly based on a tank model in which horizontal flow in the saturated zone is ignored. Simulations show that the recharge estimated by the WTF method is underestimated for observation wells near the discharge boundary, because the hydraulic stress resulting from recharge dissipates rapidly through horizontal flow near the discharge boundary. Simulations also reveal that the recharge is significantly underestimated with increasing hydraulic conductivity and recharge duration, and with decreasing specific yield.
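
    The estimator itself is a single multiplication: recharge equals the event water-level rise times the specific yield. A sketch with illustrative values (Python):

      def wtf_recharge(specific_yield, rises_m):
          # Water-table fluctuation method: recharge = Sy * sum of event rises.
          return specific_yield * sum(rises_m)

      # Illustrative: Sy = 0.15 and three recharge-event rises in meters.
      print(wtf_recharge(0.15, [0.12, 0.30, 0.07]), "m of recharge")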

  13. New Hampshire Guide 4-H Dog Shows

    E-print Network

    New Hampshire, University of

    New Hampshire Guide to 4-H Dog Shows UNH Cooperative Extension 4-H Youth Development Moiles House cooperating. NH Guide to 4-H Dog Shows Table of Contents INTRODUCTION Purpose of the 4-H Dog Project

  14. Flat Globe: Showing the Changing Seasons

    NSDL National Science Digital Library

    Jesse Allen

    1998-09-09

    SeaWiFS false color data showing seasonal change in the oceans and on land for the entire globe. The data is seasonally averaged, and shows the sequence: fall, winter, spring, summer, fall, winter, spring (for the Northern Hemisphere).

  15. Quantifying post-fire recovery of forest canopy structure and its environmental drivers using satellite image time-series

    NASA Astrophysics Data System (ADS)

    Khanal, Shiva; Duursma, Remko; Boer, Matthias

    2014-05-01

    Fire is a recurring disturbance in most of Australia's forests. Depending on fire severity, impacts on forest canopies vary from light scorching to complete defoliation, with related variation in the magnitude and duration of post-fire gas exchange by that canopy. Estimates of fire impacts on forest canopy structure and carbon uptake for south-eastern Australia's forests do not exist. Here, we use 8-day composite measurements of the fraction of absorbed photosynthetically active radiation (FPAR), as recorded by the Moderate Resolution Imaging Spectroradiometer (MODIS), to characterise forest canopies before and after fire and to compare burnt and unburnt sites. FPAR is a key biophysical canopy variable and a primary input for estimating Gross Primary Productivity (GPP). Post-fire FPAR loss was quantified for all forest areas burnt between 2001 and 2010, showing good agreement with independent assessments of fire severity patterns of the 2009 Black Saturday fires. A new method was developed to determine the duration of post-fire recovery from MODIS FPAR time series. The method involves a spatial-mode principal component analysis on the full FPAR time series, followed by K-means clustering to group pixels based on similarity in temporal patterns. Using fire history data, time series of FPAR for burnt and unburnt pixels in each cluster were then compared to quantify the duration of the post-fire recovery period, which ranged from less than 1 to 8 years. The results show that time series of MODIS FPAR are well suited to detecting and quantifying disturbances of forest canopy structure and function over large areas of highly variable climate and phenology. Finally, the roles of post-fire climate conditions and previous fire history in the duration of post-fire canopy recovery were examined using generalized additive models.
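
    A minimal sketch of the pixel-grouping step described above: PCA scores of per-pixel FPAR histories followed by K-means clustering. The component and cluster counts and the synthetic data are illustrative (Python):

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      def cluster_fpar_histories(fpar, n_components=6, n_clusters=8, seed=0):
          # fpar: (n_pixels, n_timesteps) array of 8-day FPAR values.
          # PCA reduces each pixel's history to a few scores; K-means then
          # groups pixels with similar temporal patterns.
          scores = PCA(n_components=n_components).fit_transform(fpar)
          km = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10)
          return km.fit_predict(scores)

      fpar = np.random.default_rng(3).random((1000, 460))  # synthetic stand-in
      labels = cluster_fpar_histories(fpar)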

  16. Inside Gun Shows What Goes On

    E-print Network

    Leistikow, Bruce N.

    Inside Gun Shows: What Goes On When Everybody Thinks Nobody's Watching. Epilogue. In February 2010, I attended a Crossroads of the West gun show at the Arizona State Fairgrounds here an update on each of the Phoenix observations made in the photo-essay portion of Inside Gun

  17. Inside Gun Shows What Goes On

    E-print Network

    Leistikow, Bruce N.

    Preface. Inside Gun Shows: What Goes On When Everybody Thinks Nobody's Watching. Gun-Violence Effort. She put gun shows on my radar and is an ace straw-purchase spotter. Thanks also to Barbara Claire a great public institution. He was right. Contents: Preface; Executive Summary; Gun Shows in Context; How

  18. Inside Gun Shows What Goes On

    E-print Network

    Nguyen, Danh

    Inside Gun Shows: What Goes On When Everybody Thinks Nobody's Watching. Executive Summary. Garen Wintemute, MD, MPH. Gun shows are surrounded by controversy. On the one hand, they are important economic

  19. Plant species descriptions show signs of disease.

    PubMed Central

    Hood, Michael E; Antonovics, Janis

    2003-01-01

    It is well known that diseases can greatly influence the morphology of plants, but often the incidence of disease is either too rare or the symptoms too obvious for the 'abnormalities' to cause confusion in systematics. However, we have recently come across several misinterpretations of disease-induced traits that may have been perpetuated into modern species inventories. Anther-smut disease (caused by the fungus Microbotryum violaceum) is common in many members of the Caryophyllaceae and related plant families. This disease causes anthers of infected plants to be filled with dark-violet fungal spores rather than pollen. Otherwise, their vegetative morphology is within the normal range of healthy plants. Here, we present the results of a herbarium survey showing that a number of type specimens (on which the species name and original description are based) in the genus Silene from Asia are diseased with anther smut. The primary visible disease symptom, namely the dark-violet anthers, is incorporated into the original species descriptions and some of these descriptions have persisted unchanged into modern floras. This raises the question of whether diseased type specimens have erroneously been given unique species names. PMID:14667368

  20. Quantifying Permafrost Characteristics with DCR-ERT

    NASA Astrophysics Data System (ADS)

    Schnabel, W.; Trochim, E.; Munk, J.; Kanevskiy, M. Z.; Shur, Y.; Fortier, R.

    2012-12-01

    Geophysical methods are an efficient means of quantifying permafrost characteristics for Arctic road design and engineering. In the Alaskan Arctic, construction and maintenance of roads requires accounting for permafrost: ground that remains below 0 degrees C for two or more years. Features such as ice content and temperature are critical for understanding current and future ground conditions for planning, design, and evaluation of engineering applications. This study focused on the proposed Foothills West Transportation Access project corridor, where the purpose is to construct a new all-season road connecting the Dalton Highway to Umiat. Four major areas were chosen that represented a range of conditions, including gravel bars, alluvial plains, tussock tundra (both unburned and burned conditions), high- and low-centered ice-wedge polygons, and an active thermokarst feature. Direct-current resistivity using galvanic contact (DCR-ERT) was applied over transects. Complementary site data, including boreholes, active-layer depths, vegetation descriptions, and site photographs, were obtained in conjunction. The boreholes provided information on soil morphology, ice texture, and gravimetric moisture content. Horizontal and vertical resolutions in the DCR-ERT were varied to determine the presence or absence of ground ice, subsurface heterogeneity, and the depth to groundwater (if present). The four main DCR-ERT methods used were: 84 electrodes with 2 m spacing; 42 electrodes with 0.5 m spacing; 42 electrodes with 2 m spacing; and 84 electrodes with 1 m spacing. In terms of identifying ground ice characteristics, the higher-horizontal-resolution DCR-ERT transects, with either 42 or 84 electrodes and 0.5 or 1 m spacing, were best able to differentiate wedge ice. This evaluation is based on a combination of borehole stratigraphy and surface characteristics. Simulated apparent resistivity values for permafrost areas varied from a low of 4582 Ω m to a high of 10034 Ω m. Previous studies found permafrost conditions with corresponding resistivity values as low as 5000 Ω m. This work emphasizes the necessity of tailoring the DCR-ERT survey to verified ground ice characteristics.

  1. Quantifying metal ions binding onto dissolved organic matter using log-transformed absorbance spectra.

    PubMed

    Yan, Mingquan; Wang, Dongsheng; Korshin, Gregory V; Benedetti, Marc F

    2013-05-01

    This study introduces the concept of consistent examination of changes of log-transformed absorbance spectra of dissolved organic matter (DOM) at incrementally increasing concentrations of heavy-metal cations such as copper, cadmium, and aluminum at environmentally relevant concentrations. The approach is designed to highlight contributions of low-intensity absorbance features that appear to be especially sensitive to DOM reactions. In accord with this approach, log-transformed absorbance spectra of fractions of DOM from the Suwannee River were acquired at varying pHs and concentrations of copper, cadmium, and aluminum. These log-transformed spectra were processed using the differential approach and used to examine the nature of the observed changes of DOM absorbance and correlate them with the extent of Me-DOM complexation. Two alternative parameters, namely the change of the spectral slope in the range of wavelengths 325-375 nm (ΔSlope325-375) and the differential logarithm of DOM absorbance at 350 nm (ΔLnA350), were introduced to quantify Cu(II), Cd(II), and Al(III) binding onto DOM. The ΔLnA350 and ΔSlope325-375 datasets were compared with the amount of DOM-bound Cu(II), Cd(II), and Al(III) estimated based on NICA-Donnan model calculations. This examination showed that ΔLnA350 and ΔSlope325-375 values acquired at various pH values, metal ion concentrations, and DOM types were strongly and unambiguously correlated with the concentration of DOM-bound metal ions. The experimental results and their interpretation indicate that the introduced ΔSlope325-375 and ΔLnA350 parameters are predictive of, and can be used to quantify, in situ metal ion interactions with DOM. The presented approach can be used to gain more information about DOM-metal interactions and for further optimization of existing formal models of metal-DOM complexation. PMID:23490103
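
    A sketch of how the two spectral parameters can be computed from paired spectra (reference vs. metal-added); the wavelength grid and spectra below are synthetic stand-ins (Python):

      import numpy as np

      def spectral_slope(wl_nm, absorbance, lo=325.0, hi=375.0):
          # Slope of ln(absorbance) vs. wavelength over the 325-375 nm window.
          m = (wl_nm >= lo) & (wl_nm <= hi)
          return np.polyfit(wl_nm[m], np.log(absorbance[m]), 1)[0]

      def delta_slope(wl_nm, abs_metal, abs_reference):
          # Change of the 325-375 nm spectral slope upon metal addition.
          return spectral_slope(wl_nm, abs_metal) - spectral_slope(wl_nm, abs_reference)

      def delta_ln_a350(wl_nm, abs_metal, abs_reference):
          # Differential logarithm of absorbance at 350 nm.
          i = np.argmin(np.abs(wl_nm - 350.0))
          return np.log(abs_metal[i]) - np.log(abs_reference[i])

      wl = np.arange(250.0, 451.0)
      ref = np.exp(-0.015 * wl)          # exponential-like DOM spectrum, synthetic
      met = 0.9 * np.exp(-0.014 * wl)    # spectrum after metal addition, synthetic
      print(delta_slope(wl, met, ref), delta_ln_a350(wl, met, ref))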

  2. Quantifying forearm muscle activity during wrist and finger movements by means of multi-channel electromyography.

    PubMed

    Gazzoni, Marco; Celadon, Nicolò; Mastrapasqua, Davide; Paleari, Marco; Margaria, Valentina; Ariano, Paolo

    2014-01-01

    The study of hand and finger movement is an important topic with applications in prosthetics, rehabilitation, and ergonomics. Surface electromyography (sEMG) is the gold standard for the analysis of muscle activation. Previous studies investigated the optimal electrode number and positioning on the forearm to obtain information representative of muscle activation and robust to movements. However, the sEMG spatial distribution on the forearm during hand and finger movements, and its changes due to different hand positions, has never been quantified. The aim of this work is to quantify 1) the spatial localization of surface EMG activity of distinct forearm muscles during dynamic free movements of the wrist and single fingers and 2) the effect of hand position on sEMG activity distribution. The subjects performed cyclic dynamic tasks involving the wrist and the fingers. The wrist tasks and the hand opening/closing task were performed with the hand in prone and neutral positions. A sensorized glove was used for kinematics recording. sEMG signals were acquired from the forearm muscles using a grid of 112 electrodes integrated into a stretchable textile sleeve. The areas of sEMG activity were identified by a segmentation technique after a dimensionality-reduction step based on Non-Negative Matrix Factorization applied to the EMG envelopes. The results show that 1) it is possible to identify distinct areas of sEMG activity on the forearm for different fingers, and 2) hand position influences sEMG activity level and spatial distribution. This work gives new quantitative information about sEMG activity distribution on the forearm in healthy subjects and provides a basis for future work on the identification of optimal electrode configurations for sEMG-based control of prostheses, exoskeletons, or orthoses. An example of the use of this information for optimizing the detection system for the estimation of joint kinematics from sEMG is reported. PMID:25289669
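
    A minimal sketch of the dimensionality-reduction step described above, applying non-negative matrix factorization to multi-channel envelope data. The component count and data are illustrative, and the subsequent segmentation of activity areas from the spatial maps is only indicated (Python):

      import numpy as np
      from sklearn.decomposition import NMF

      def emg_activity_maps(envelopes, n_components=8, seed=0):
          # envelopes: (n_samples, n_channels) rectified/low-passed sEMG from
          # the electrode grid (112 channels in the study). Returns temporal
          # activations W and spatial maps H with envelopes ~= W @ H; areas
          # of activity can then be segmented from each row of H, e.g. by
          # thresholding.
          model = NMF(n_components=n_components, init="nndsvda",
                      random_state=seed, max_iter=500)
          W = model.fit_transform(np.maximum(envelopes, 0.0))
          return W, model.components_

      env = np.abs(np.random.default_rng(4).normal(size=(5000, 112)))  # synthetic
      W, H = emg_activity_maps(env)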

  3. Utilizing novel diversity estimators to quantify multiple dimensions of microbial biodiversity across domains

    PubMed Central

    2013-01-01

    Background Microbial ecologists often employ methods from classical community ecology to analyze microbial community diversity. However, these methods have limitations because microbial communities differ from macro-organismal communities in key ways. This study sought to quantify microbial diversity using methods that are better suited for data spanning multiple domains of life and dimensions of diversity. Diversity profiles are one novel, promising way to analyze microbial datasets. Diversity profiles encompass many other indices, provide effective numbers of diversity (mathematical generalizations of previous indices that better convey the magnitude of differences in diversity), and can incorporate taxa similarity information. To explore whether these profiles change interpretations of microbial datasets, diversity profiles were calculated for four microbial datasets from different environments spanning all domains of life as well as viruses. Both similarity-based profiles that incorporated phylogenetic relatedness and naïve (not similarity-based) profiles were calculated. Simulated datasets were used to examine the robustness of diversity profiles to varying phylogenetic topology and community composition. Results Diversity profiles provided insights into microbial datasets that were not detectable with classical univariate diversity metrics. For all datasets analyzed, there were key distinctions between calculations that incorporated phylogenetic diversity as a measure of taxa similarity and naïve calculations. The profiles also provided information about the effects of rare species on diversity calculations. Additionally, diversity profiles were used to examine thousands of simulated microbial communities, showing that similarity-based and naïve diversity profiles only agreed approximately 50% of the time in their classification of which sample was most diverse. This is a strong argument for incorporating similarity information and calculating diversity with a range of emphases on rare and abundant species when quantifying microbial community diversity. Conclusions For many datasets, diversity profiles provided a different view of microbial community diversity compared to analyses that did not take into account taxa similarity information, effective diversity, or multiple diversity metrics. These findings are a valuable contribution to data analysis methodology in microbial ecology. PMID:24238386
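
    The "effective number" diversity profile mentioned above has a standard closed form (the Hill numbers); a naive, similarity-free version can be sketched as follows, with illustrative abundances. The similarity-based profiles in the study additionally weight taxa by phylogenetic relatedness, which is not reproduced here (Python):

      import numpy as np

      def naive_diversity_profile(abundances, q_values):
          # Hill numbers D_q = (sum_i p_i^q)^(1/(1-q)); D_1 is the limiting
          # value exp(Shannon entropy). Low q emphasizes rare taxa, high q
          # emphasizes abundant taxa.
          p = np.asarray(abundances, dtype=float)
          p = p[p > 0] / p.sum()
          out = []
          for q in q_values:
              if np.isclose(q, 1.0):
                  out.append(np.exp(-np.sum(p * np.log(p))))
              else:
                  out.append(np.sum(p**q) ** (1.0 / (1.0 - q)))
          return np.array(out)

      print(naive_diversity_profile([50, 30, 10, 5, 5], q_values=[0, 0.5, 1, 2, 4]))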

  4. Diagnosis of Osteoarthritis by Cartilage Surface Smoothness Quantified Automatically from Knee MRI

    PubMed Central

    Tummala, Sudhakar; Bay-Jensen, Anne-Christine; Karsdal, Morten A.; Dam, Erik B.

    2011-01-01

    Objective: We investigated whether surface smoothness of articular cartilage in the medial tibiofemoral compartment, quantified from magnetic resonance imaging (MRI), could be appropriate as a diagnostic marker of osteoarthritis (OA). Method: At baseline, 159 community-based subjects aged 21 to 81 with normal or OA-affected knees were recruited to provide a broad range of OA states. Smoothness was quantified using an automatic framework from low-field MRI in the tibial, femoral, and femoral subcompartments. Diagnostic ability of smoothness was evaluated by comparison with conventional OA markers, specifically cartilage volume from MRI, joint space width (JSW) from radiographs, and pain scores. Results: A total of 140 subjects concluded the 21-month study. Cartilage smoothness provided diagnostic ability in all compartments (P < 0.0001). The diagnostic smoothness markers performed at least as well as JSW and were superior to volume markers (e.g., the AUC for femoral smoothness of 0.80 was higher than the 0.57 for volume, P < 0.0001, and marginally higher than 0.73 for JSW, P = 0.25). The smoothness markers allowed diagnostic detection of pain presence (P < 0.05) and showed some correlation with pain severity (e.g., r = -0.32). The longitudinal change in smoothness was correlated with cartilage loss (r up to 0.60, P < 0.0001 in all compartments). Conclusions: This study demonstrated the potential of cartilage smoothness markers for diagnosis of moderate radiographic OA. Furthermore, correlations between smoothness and pain values, and between smoothness loss and cartilage loss, supported a link to progression of OA. Smoothness markers may thereby allow detection and monitoring of OA, supplementing currently accepted markers.

  5. Locally-calibrated light transmission visualization methods to quantify nonaqueous phase liquid mass in porous media

    NASA Astrophysics Data System (ADS)

    Wang, Huaguo; Chen, Xiaosong; Jawitz, James W.

    2008-11-01

    Five locally-calibrated light transmission visualization (LTV) methods were tested to quantify nonaqueous phase liquid (NAPL) mass and mass reduction in porous media. Tetrachloroethylene (PCE) was released into a two-dimensional laboratory flow chamber packed with water-saturated sand, which was then flushed with a surfactant solution (2% Tween 80) until all of the PCE had been dissolved. In all the LTV methods employed here, the water phase was dyed, rather than the more common approach of dyeing the NAPL phase, so that the light absorption characteristics of the NAPL did not change as dissolution progressed. None of the methods used here required external calibration chambers. The five visualization approaches evaluated included three methods developed from previously published models, a binary method, and a novel multiple-wavelength method that has the advantage of not requiring any assumptions about the intra-pore interface structure between the various phases (sand/water/NAPL). The new multiple-wavelength method is also expected to be applicable to any translucent porous medium containing two immiscible fluids (e.g., water-air, water-NAPL). Results from the sand-water-PCE system evaluated here showed that the model that assumes wetting media of uniform pore size (Model C of Niemet and Selker, 2001) and the multiple-wavelength model with no interface structure assumptions were able to accurately quantify PCE mass reduction during surfactant flushing. The average mass recoveries from these two imaging methods were greater than 95% for domain-average NAPL saturations of approximately 2.6 × 10^-2, and were approximately 90% during seven cycles of surfactant flushing that sequentially reduced the average NAPL saturation to 7.5 × 10^-4.

  6. Understanding and Quantifying Controls of Arsenic Mobility during Deepwell Re-injection of CSG Waters

    NASA Astrophysics Data System (ADS)

    Davis, J. A.; Rathi, B.; Prommer, H.; Donn, M.; Siade, A. J.; Berg, M.

    2014-12-01

    In Australia, the injection of reverse-osmosis-treated production water from coal seams into the surrounding deep aquifers may provide the most viable method for disposing of large quantities of production water. The geochemical disequilibrium between the injectant water composition and the target aquifer can potentially drive a range of water-sediment interactions that must be clearly understood and quantified in order to anticipate and manage future water quality changes at both the local and regional scale. In this study, we use a multi-scale geochemical characterisation of a proposed reinjection site, in combination with geochemical/reactive transport modeling, to understand and predict the long-term fate of arsenic and to explore means of suitably mitigating an undesired increase in naturally occurring arsenic concentrations. We use a series of arsenic sorption experiments with aquifer material from an injection trial site in Queensland, Australia, to quantify As sorption/desorption from mineral surfaces in response to changes in site-specific geochemical conditions. Batch experiments with arsenite were performed under anoxic conditions to replicate the highly reducing in-situ conditions. The results showed significant arsenic mobility at pH >8. Competitive sorption effects with phosphate and the impact of varying temperatures were also tested in batch mode. A site-specific general composite (GC) surface complexation model (SCM) was derived through inverse geochemical modeling, i.e., selection of appropriate surface complexation reactions and optimization of sorption constants. The SCM was subsequently tested and further improved during the interpretation of data from column flow-through experiments and from a field injection trial. Finally, the uncertainty associated with estimates of the sorption constants was addressed, and the effects of this uncertainty on field-scale model predictions were analyzed.

  7. Quantifying fluvial topography using UAS imagery and SfM photogrammetry

    NASA Astrophysics Data System (ADS)

    Woodget, Amy; Carbonneau, Patrice; Visser, Fleur; Maddock, Ian; Habit, Evelyn

    2014-05-01

    The measurement and monitoring of fluvial topography at high spatial and temporal resolutions is in increasing demand for a range of river science and management applications, including change detection, hydraulic models, habitat assessments, river restorations, and sediment budgets. Existing approaches are yet to provide a single technique for rapidly quantifying fluvial topography in both exposed and submerged areas with high spatial resolution, reach-scale continuous coverage, high accuracy, and reasonable cost. In this paper, we explore the potential of using imagery acquired from a small unmanned aerial system (UAS) and processed using Structure-from-Motion (SfM) photogrammetry for filling this gap. We use a rotary-winged hexacopter known as the Draganflyer X6, a consumer-grade digital camera (Panasonic Lumix DMC-LX3), and the commercially available PhotoScan Pro SfM software (Agisoft LLC). We test the approach on three contrasting river systems: a shallow margin of the San Pedro River in the Valdivia region of south-central Chile, the lowland River Arrow in Warwickshire, UK, and the upland Coledale Beck in Cumbria, UK. Digital elevation models (DEMs) and orthophotos of hyperspatial resolution (0.01-0.02 m) are produced. Mean elevation errors are found to vary somewhat between sites, dependent on vegetation coverage and the spatial arrangement of ground control points (GCPs) used to georeference the data. Mean errors are in the range 4-44 mm for exposed areas and 17-89 mm for submerged areas. Errors in submerged areas can be reduced to 4-56 mm with the application of a simple refraction correction procedure. Multiple surveys of the River Arrow site show consistently high-quality results, indicating the repeatability of the approach. This work therefore demonstrates the potential of a UAS-SfM approach for quantifying fluvial topography.
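
    A sketch of a simple refraction correction of the kind mentioned above, under the common small-angle assumption that apparent depth is scaled by the refractive index of clear water (about 1.34); the index value and function name are assumptions, not taken from the paper (Python):

      REFRACTIVE_INDEX_WATER = 1.34  # clear water, small-angle assumption

      def correct_submerged_elevation(apparent_bed_m, water_surface_m):
          # True depth = apparent depth * refractive index; the corrected bed
          # elevation is the water surface minus the true depth.
          apparent_depth = water_surface_m - apparent_bed_m
          return water_surface_m - apparent_depth * REFRACTIVE_INDEX_WATER

      # Illustrative: an apparent bed 0.30 m below a 100.00 m water surface.
      print(correct_submerged_elevation(99.70, 100.00))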

  8. Quantifying the Spatial Dimension of Dengue Virus Epidemic Spread within a Tropical Urban Environment

    PubMed Central

    Vazquez-Prokopec, Gonzalo M.; Kitron, Uriel; Montgomery, Brian; Horne, Peter; Ritchie, Scott A.

    2010-01-01

    Background Dengue infection spread in naive populations occurs in an explosive and widespread fashion primarily due to the absence of population herd immunity, the population dynamics and dispersal of Ae. aegypti, and the movement of individuals within the urban space. Knowledge on the relative contribution of such factors to the spatial dimension of dengue virus spread has been limited. In the present study we analyzed the spatio-temporal pattern of a large dengue virus-2 (DENV-2) outbreak that affected the Australian city of Cairns (north Queensland) in 2003, quantified the relationship between dengue transmission and distance to the epidemic's index case (IC), evaluated the effects of indoor residual spraying (IRS) on the odds of dengue infection, and generated recommendations for city-wide dengue surveillance and control. Methods and Findings We retrospectively analyzed data from 383 DENV-2 confirmed cases and 1,163 IRS applications performed during the 25-week epidemic period. Spatial (local k-function, angular wavelets) and space-time (Knox test) analyses quantified the intensity and directionality of clustering of dengue cases, whereas a semi-parametric Bayesian space-time regression assessed the impact of IRS and spatial autocorrelation in the odds of weekly dengue infection. About 63% of the cases clustered up to 800 m around the IC's house. Most cases were distributed in the NW-SE axis as a consequence of the spatial arrangement of blocks within the city and, possibly, the prevailing winds. Space-time analysis showed that DENV-2 infection spread rapidly, generating 18 clusters (comprising 65% of all cases), and that these clusters varied in extent as a function of their distance to the IC's residence. IRS applications had a significant protective effect in the further occurrence of dengue cases, but only when they reached coverage of 60% or more of the neighboring premises of a house. Conclusion By applying sound statistical analysis to a very detailed dataset from one of the largest outbreaks that affected the city of Cairns in recent times, we not only described the spread of dengue virus with high detail but also quantified the spatio-temporal dimension of dengue virus transmission within this complex urban environment. In areas susceptible to non-periodic dengue epidemics, effective disease prevention and control would depend on the prompt response to introduced cases. We foresee that some of the results and recommendations derived from our study may also be applicable to other areas currently affected or potentially subject to dengue epidemics. PMID:21200419
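
    For concreteness, the Knox space-time test applied above counts case pairs that are close in both space and time and assesses significance by permuting onset times over locations; a sketch with synthetic coordinates and illustrative thresholds (Python):

      import numpy as np

      def knox_test(xy, days, d_max, t_max, n_perm=999, seed=0):
          # Observed statistic: case pairs within d_max meters and t_max days.
          rng = np.random.default_rng(seed)
          diff = xy[:, None, :] - xy[None, :, :]
          near_space = np.triu(np.hypot(diff[..., 0], diff[..., 1]) < d_max, k=1)

          def stat(t):
              dt = np.abs(t[:, None] - t[None, :])
              return int(np.sum(near_space & (dt < t_max)))

          observed = stat(days)
          perms = [stat(rng.permutation(days)) for _ in range(n_perm)]
          p = (1 + sum(s >= observed for s in perms)) / (n_perm + 1)
          return observed, p

      rng = np.random.default_rng(1)
      xy = rng.uniform(0, 5000, (100, 2))   # case coordinates (m), synthetic
      days = rng.integers(0, 175, 100)      # onset day over 25 weeks, synthetic
      print(knox_test(xy, days, d_max=800.0, t_max=14))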

  9. Low-Order Non-Spatial Effects Dominate Second-Order Spatial Effects in the Texture Quantifier Analysis of 18F-FDG-PET Images

    PubMed Central

    Brooks, Frank J.; Grigsby, Perry W.

    2015-01-01

    Background: There is increasing interest in applying image texture quantifiers to assess the intra-tumor heterogeneity observed in FDG-PET images of various cancers. Use of these quantifiers as prognostic indicators of disease outcome and/or treatment response has yielded inconsistent results. We study the general applicability of some well-established texture quantifiers to the image data unique to FDG-PET. Methods: We first created computer-simulated test images with statistical properties consistent with clinical image data for cancers of the uterine cervix. We specifically isolated second-order statistical effects from low-order effects and analyzed the resulting variation in common texture quantifiers in response to contrived image variations. We then analyzed the quantifiers computed for FIGO IIb cervical cancers via receiver operating characteristic (ROC) curves and via contingency table analysis of detrended quantifier values. Results: We found that image texture quantifiers depend strongly on low-order effects such as tumor volume and SUV distribution. When low-order effects are controlled, the texture quantifiers tested were unable to discern the second-order effects alone. Furthermore, the results of clinical tumor heterogeneity studies might be tunable via choice of the patient population analyzed. Conclusion: Some image texture quantifiers are strongly affected by factors distinct from the second-order effects researchers ostensibly seek to assess via those quantifiers. PMID:25714472
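
    For context, second-order texture quantifiers of the kind examined here are typically computed from a gray-level co-occurrence matrix (GLCM). The sketch below is a hedged illustration using scikit-image (version 0.19 or later for the "gray" spelling); the specific features, quantization level and offsets are assumptions, not the paper's settings. Note how requantizing against the image's own minimum and maximum ties the features to the SUV distribution, a low-order property.

      # Sketch of computing common second-order (GLCM) texture quantifiers.
      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      def texture_quantifiers(img, levels=32):
          """GLCM-based texture features for a 2-D image (e.g. a resampled
          SUV map), after requantizing intensities into `levels` bins."""
          img = np.asarray(img, dtype=float)
          binned = np.floor((img - img.min()) / (np.ptp(img) + 1e-12) * (levels - 1))
          glcm = graycomatrix(binned.astype(np.uint8), distances=[1],
                              angles=[0, np.pi / 2], levels=levels,
                              symmetric=True, normed=True)
          return {prop: float(graycoprops(glcm, prop).mean())
                  for prop in ("contrast", "homogeneity", "energy", "correlation")}

      # Toy usage on a random "tumor" image.
      print(texture_quantifiers(np.random.default_rng(1).random((64, 64))))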

  10. Quantifying Drawdown in Complex Geologic Terrain with Theis Transforms

    NASA Astrophysics Data System (ADS)

    Garcia, C.; Halford, K. J.; Fenelon, J. M.

    2011-12-01

    Bulk hydraulic properties of aquifers and fault structures are most accurately quantified with multi-well aquifer tests. Detecting and quantifying pumping-induced drawdown at observation wells distant (>1 km) from the pumping well greatly expands the aquifer volume being investigated. Drawdown analyses at these greater distances, however, are often confounded because environmental water-level fluctuations typically mask the pumping signal. Environmental fluctuations (e.g. barometric and tidal potential) can be simulated and separated from the pumping signal with analytical water-level models if the period of pre-pumping data exceeds the period of drawdown and recovery to be analyzed. These circumstances occur infrequently, however, as a result of incomplete datasets and/or pervasive pumping or climatic trends. Pumping-induced changes can be differentiated reliably from environmental fluctuations in pumping-affected water-level records by simultaneously simulating pumping and environmental effects with analytical water-level models. Pumping effects are simulated with Theis transforms, which translate pumping schedules into water-level change by superimposing multiple Theis solutions. Environmental fluctuations are simulated by summing multiple time-series of barometric pressure change, tidal potential, and background water levels when available. Differences between simulated and measured water levels are minimized by adjusting the transmissivity and storage coefficient of pumping components and the amplitude and phase of non-pumping components. Water levels simulated with the relatively simple Theis-transform method are capable of matching pumping signals generated with complex, three-dimensional numerical models. Pumping-induced changes simulated with Theis transforms and with a numerical model agreed (standard deviations as low as 0.003 m) in cases where pumping signals crossed confining units and fault structures and traveled distances of more than 1 km.
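
    A Theis transform of the kind described can be sketched compactly: each change in pumping rate superimposes one Theis solution, with the well function W(u) available as scipy's exponential integral. The parameter values below are illustrative only.

      # Sketch of a "Theis transform": superposing Theis solutions to turn a
      # stepped pumping schedule into simulated drawdown at an observation well.
      import numpy as np
      from scipy.special import exp1  # Theis well function W(u) = exp1(u)

      def theis_drawdown(t, r, T, S, schedule):
          """Drawdown (m) at radius r (m) and times t (days) for a stepped
          pumping schedule [(start_time_d, rate_m3_per_d), ...]; each rate
          change superimposes one Theis solution."""
          t = np.asarray(t, float)
          s = np.zeros_like(t)
          prev_q = 0.0
          for t0, q in schedule:
              dq, prev_q = q - prev_q, q        # rate increment at time t0
              active = t > t0
              u = r**2 * S / (4.0 * T * (t[active] - t0))
              s[active] += dq / (4.0 * np.pi * T) * exp1(u)
          return s

      # One week of pumping at 500 m^3/d, then recovery, observed 1 km away.
      t = np.linspace(0.01, 30.0, 300)                  # days
      s = theis_drawdown(t, r=1000.0, T=100.0, S=1e-4,  # T in m^2/d
                         schedule=[(0.0, 500.0), (7.0, 0.0)])
      print(f"peak drawdown ~ {s.max():.2f} m")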

  11. Monitoring microemboli during cardiopulmonary bypass with the EDAC quantifier.

    PubMed

    Lynch, John E; Wells, Christopher; Akers, Tom; Frantz, Paul; Garrett, Donna; Scott, M Lance; Williamson, Lisa; Agnew, Barbara; Lynch, John K

    2010-09-01

    Gaseous emboli may be introduced into the bypass circuit both from the surgical field and during perfusionist interventions. While circuits provide good protection against massive air embolism, they do not remove gaseous microemboli (GME) from the bypass circuit. The purpose of this preliminary study is to assess the incidence of GME during bypass surgery and determine if increased GME counts were associated with specific events during bypass surgery. In 30 cases divided between 15 coronary artery bypass grafts and 15 valve repairs, GME were counted and sized at three locations on the bypass circuit using the EDAC Quantifier (Luna Innovations, Roanoke, VA). A mean of 45,276 GME were detected after the arterial line filter during these 30 cases, with significantly more detected (p = .04) post filter during valve cases (mean = 72,137 +/- 22,113) than coronary artery bypass graft cases (mean = 18,416 +/- 7831). GME detected post filter were significantly correlated in time with counts detected in the venous line (p < .001). Specific events associated with high counts included the initiation of cardiopulmonary bypass, heart manipulations, insertion and removal of clamps, and the administration of drugs. Global factors associated with increased counts post filter included higher venous line counts and higher post reservoir/bubble trap counts. The mean number of microemboli detected during bypass surgery was much higher than reported in other studies of emboli incidence, most likely due to the increased sensitivity of the EDAC Quantifier compared to other detection modalities. The results furthermore suggest the need for further study of the clinical significance of these microemboli and what practices may be used to reduce GME incidence. Increased in vitro testing of the air handling capability of different circuit designs, along with more clinical studies assessing best clinical practices for reducing GME activity, is recommended. PMID:21114224

  12. Quantifying terpenes in rumen fluid, serum, and plasma from sheep

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Determining the fate of terpenes consumed by browsing ruminants requires methods to quantify their presence in blood and rumen fluid. Our objective was to modify an existing procedure for plasma terpenes to quantify 25 structurally diverse mono- and sesquiterpenes in serum, plasma, and rumen fluid fr...

  13. Identifying and Quantifying Landscape Patterns in Space and Time

    Microsoft Academic Search

    Janine Bolliger; Helene H. Wagner; Monica G. Turner

    In landscape ecology, approaches to identify and quantify landscape patterns are well developed for discrete landscape representations. Discretisation is often seen as a form of generalisation and simplification. Landscape patterns however are shaped by complex dynamic processes acting at various spatial and temporal scales. Thus, standard landscape metrics that quantify static, discrete overall landscape pattern or individual patch properties may

  14. Project No.: 003893 (GOCE) Quantifying the Climate Impact of

    E-print Network

    Haak, Hein

    of the new emission numbers from Activity 1 when these are available. For non-transport emissions the SRES A1B scenario will be applied, subtracting transport emissions. (Figure 2: Emission estimates to be used.) Project No.: 003893 (GOCE), QUANTIFY: Quantifying the Climate Impact of Global and European Transport

  15. Anatomy of Alternating Quantifier Satisfiability (Work in progress)

    E-print Network

    Monniaux, David

    Anatomy of Alternating Quantifier Satisfiability (work in progress), Anh-Dung Phan, N. Bjørner, D. Monniaux, pp. 118-128. We instantiate the generalization to projection functions based on virtual substitutions.

  16. Quantifying present and projected future atmospheric moisture transports onto land

    E-print Network

    Allan, Richard P.

    Quantifying present and projected future atmospheric moisture transports onto land, Matthias Zahn ... not considered for many of the regions. Citation: Zahn, M., and R. P. Allan (2013), Quantifying present and projected future atmospheric moisture transports onto land [Zahn and Allan, 2013]. Changes of the properties of such transports affect

  17. QUANTIFYING DEMOCRACY OF WAVELET BASES IN LORENTZ SPACES

    E-print Network

    Hernández, Eugenio

    QUANTIFYING DEMOCRACY OF WAVELET BASES IN LORENTZ SPACES. EUGENIO HERNÁNDEZ, JOSÉ MARÍA MARTELL. It is interesting to ask how far wavelet bases are from being democratic in L^{p,q}(R^d), p ≠ q. To quantify democracy

  18. Quantifier elimination for formulas constrained by quadratic equations

    Microsoft Academic Search

    Hoon Hong

    1993-01-01

    An algorithm is given for constructing a quantifier-free formula (a boolean expression of polynomial equations and inequalities) equivalent to a given formula of the form (∃z ∈ R)[a2 z^2 + a1 z + a0 = 0 ∧ F], where F is a quantifier-free formula in z1, ..., zn, z, and a2, a1, a0 are polynomials in z1, ..., zn

  19. Shortcuts to Quantifier Interpretation in Children and Adults

    ERIC Educational Resources Information Center

    Brooks, Patricia J.; Sekerina, Irina

    2006-01-01

    Errors involving universal quantification are common in contexts depicting sets of individuals in partial, one-to-one correspondence. In this article, we explore whether quantifier-spreading errors are more common with distributive quantifiers each and every than with all. In Experiments 1 and 2, 96 children (5- to 9-year-olds) viewed pairs of…

  20. Quantifier elimination for real closed fields by cylindrical algebraic decomposition

    Microsoft Academic Search

    George E. Collins

    1975-01-01

    Tarski in 1948 [18] published a quantifier elimination method for the elementary theory of real closed fields (which he had discovered in 1930). As noted by Tarski, any quantifier elimination method for this theory also provides a decision method, which enables one to decide whether any sentence of the theory is true or false. Since many important and difficult mathematical

  1. Quantifier Elimination on Real Closed Fields and Differential Equations

    Microsoft Academic Search

    Andreas Weber

    This paper surveys some recent applications of quantifier elimination on real closed fields in the context of differential equations. Although polynomial vector fields give rise to solutions involving the exponential and other transcendental functions in general, many questions can be settled within the real closed field without referring to the real exponential field. The technique of quantifier elimination on real

  2. Quantifying Maintenance Requirements From the Steady-State Operation

    E-print Network

    Daugulis, Andrew J.

    Quantifying Maintenance Requirements From the Steady-State Operation of a Two-Phase Partitioning Bioscrubber ... explicitly quantifying the maintenance energy requirements of pure cultures growing on volatile organic compound (VOC) substrates in a two-phase partitioning bioscrubber is described. Direct evidence

  3. Quantifying state-policy incentives for the renewable energy investor

    Microsoft Academic Search

    Sreenivas R. Sukumar; Mallikarjun Shankar; Mohammed M Olama; Stanton W Hadley; Vladimir A Protopopescu; Sergey Malinchik; Barry Ives

    2010-01-01

    In this paper, we describe an approach to quantify state-level renewable energy policies for a decision maker/investor. We describe the construction of a computational module - a rule-based system - to evaluate state incentives and their impacts on renewable energy investment. We aim to quantify the policy bias of states towards renewable technologies and identify profitable markets for investment, both

  4. New methods to quantify the cracking performance of cementitious systems made with internal curing

    NASA Astrophysics Data System (ADS)

    Schlitter, John L.

    The use of high performance concretes that utilize low water-cement ratios has been promoted for use in infrastructure based on their potential to increase durability and service life because they are stronger and less porous. Unfortunately, these benefits are not always realized due to the susceptibility of high performance concrete to undergo early age cracking caused by shrinkage. This problem is widespread and affects federal, state, and local budgets that must maintain or replace infrastructure deteriorated by cracking. As a result, methods to reduce or eliminate early age shrinkage cracking have been investigated. Internal curing is one such method, in which a prewetted lightweight sand is incorporated into the concrete mixture to provide internal water as the concrete cures. This action can significantly reduce or eliminate shrinkage and in some cases causes a beneficial early age expansion. Standard laboratory tests have been developed to quantify the shrinkage cracking potential of concrete. Unfortunately, many of these tests may not be appropriate for use with internally cured mixtures and only provide limited amounts of information. Most standard tests are not designed to capture the expansive behavior of internally cured mixtures. This thesis describes the design and implementation of two new testing devices that overcome the limitations of current standards. The first device discussed in this thesis is called the dual ring. The dual ring is a testing device that quantifies the early age restrained shrinkage performance of cementitious mixtures. The design of the dual ring is based on the current ASTM C 1581-04 standard test, which utilizes one steel ring to restrain a cementitious specimen. The dual ring overcomes two important limitations of the standard test. First, the standard single ring test cannot restrain the expansion that takes place at early ages, which is not representative of field conditions. The dual ring incorporates a second restraining ring which is located outside of the sample to provide restraint against expansion. Second, the standard ring test is a passive test that relies only on the autogenous and drying shrinkage of the mixture to induce cracking. The dual ring test can be an active test because it has the ability to vary the temperature of the specimen in order to induce thermal stress and produce cracking. This ability enables the study of the restrained cracking capacity as the mixture ages in order to quantify crack-sensitive periods of time. Measurements made with the dual ring quantify the benefits from using larger amounts of internal curing. Mixtures that resupplied internal curing water to match that of chemical shrinkage could sustain three times the magnitude of thermal change before cracking. The second device discussed in this thesis is a large scale slab testing device. This device tests the cracking potential of 15' long by 4" thick by 24" wide slab specimens in an environmentally controlled chamber. The current standard testing devices can be considered small scale and encounter problems when linking their results to the field due to size effects. Therefore, the large scale slab testing device was developed in order to calibrate the results of smaller scale tests to real world field conditions such as a pavement or bridge deck. Measurements made with the large scale testing device showed that the cracking propensity of the internally cured mixtures was reduced and that a significant benefit could be realized.

  5. Quantifying interactions between propranolol and dissolved organic matter (DOM) from different sources using fluorescence spectroscopy.

    PubMed

    Peng, Na; Wang, Kaifeng; Liu, Guoguang; Li, Fuhua; Yao, Kun; Lv, Wenying

    2014-04-01

    Beta blockers are widely used pharmaceuticals that have been detected in the environment. Interactions between beta blockers and dissolved organic matter (DOM) may mutually alter their environmental behaviors. To assess this potential, propranolol (PRO) was used as a model beta blocker to quantify the complexation with DOM from different sources using the fluorescence quenching titration method. The sources of the studied DOM samples were identified by excitation-emission matrix spectroscopy (EEMs) combined with fluorescence regional integration analysis. The results show that PRO intrinsic fluorescence was statically quenched by DOM addition. The resulting binding constants (log Koc) ranged from 3.90 to 5.20, with the surface-water-filtered DOM samples having the lowest log Koc and HA having the highest log Koc. Log Koc is negatively correlated with the fluorescence index, the biological index, and the percent fluorescence response (Pi,n) of the protein-like region (PI,n) and of the microbial byproduct-like region (PII,n) of DOM EEMs, while it is correlated positively with the humification index and the Pi,n of the UVC humic-like region (PIII,n). These results indicate that DOM samples from allochthonous materials rich in aromatic and humic-like components would strongly bind PRO in aquatic systems, and autochthonous DOM containing high protein-like components would bind PRO more weakly. PMID:24390196
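
    As a rough illustration of how a binding constant is extracted from a quenching titration, the sketch below fits a simple Stern-Volmer model for static quenching (F0/F = 1 + K[DOM]); the paper's exact fitting model is not specified here, so this form is an assumption for illustration.

      # Hedged sketch: binding constant from a fluorescence quenching titration,
      # assuming a simple Stern-Volmer treatment of static quenching.
      import numpy as np

      def stern_volmer_K(dom_conc, F):
          """Least-squares slope of (F0/F - 1) vs quencher concentration.
          dom_conc : DOM concentrations (kg C / L); first point must be 0 (F0)
          F        : fluorescence intensities at each concentration"""
          dom_conc, F = np.asarray(dom_conc, float), np.asarray(F, float)
          y = F[0] / F - 1.0
          return np.sum(dom_conc * y) / np.sum(dom_conc**2)   # fit through origin

      # Synthetic titration generated with log K ~ 4.3.
      c = np.array([0.0, 1e-5, 2e-5, 4e-5, 8e-5])             # kg C / L
      F = 100.0 / (1.0 + 2e4 * c)
      print(np.log10(stern_volmer_K(c, F)))                   # ~4.3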

  6. Complexity and approximability of quantified and stochastic constraint satisfaction problems

    SciTech Connect

    Hunt, H. B. (Harry B.); Stearns, R. L.; Marathe, M. V. (Madhav V.)

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S be an arbitrary finite set of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SAT_C(S)). Here, we study simultaneously the complexity of, and the existence of efficient approximation algorithms for, a number of variants of the problems SAT(S) and SAT_C(S), for many different D, C, and S. These problem variants include decision and optimization problems for formulas, quantified formulas, and stochastically-quantified formulas. We denote these problems by Q-SAT(S), MAX-Q-SAT(S), S-SAT(S), MAX-S-SAT(S), MAX-NSF-Q-SAT(S) and MAX-NSF-S-SAT(S). The main contribution is the development of a unified predictive theory for characterizing the complexity of these problems. Our unified approach is based on two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Let k ≥ 2. Let S be a finite set of finite-arity relations on Σ_k with the following condition on S: all finite-arity relations on Σ_k can be represented as finite existentially-quantified conjunctions of relations in S applied to variables (to variables and constant symbols in C). Then we prove the following new results: (1) The problems SAT(S) and SAT_C(S) are both NQL-complete and ≤_logn^bw-complete for NP. (2) The problems Q-SAT(S) and Q-SAT_C(S) are PSPACE-complete. Letting k = 2, the problems S-SAT(S) and S-SAT_C(S) are PSPACE-complete. (3) There exists ε > 0 for which approximating the problem MAX-Q-SAT(S) within ε times optimum is PSPACE-hard. Letting k = 2, there exists ε > 0 for which approximating the problem MAX-S-SAT(S) within ε times optimum is PSPACE-hard. (4) For all ε > 0, the problems MAX-NSF-Q-SAT(S) and MAX-NSF-S-SAT(S) are PSPACE-hard to approximate within a factor of n^ε times optimum. These results significantly extend the earlier results by (i) Papadimitriou [Pa851] on the complexity of stochastic satisfiability and (ii) Condon, Feigenbaum, Lund and Shor [CF+93, CF+94], by identifying natural classes of PSPACE-hard optimization problems with provably PSPACE-hard ε-approximation problems. Moreover, most of our results hold not just for Boolean relations: most previous results were done only in the context of Boolean domains. The results also constitute a significant step towards obtaining dichotomy theorems for the problems MAX-S-SAT(S) and MAX-Q-SAT(S): a research area of recent interest [CF+93, CF+94, Cr95, KSW97, LMP99].

  7. Quantifying human response capabilities towards tsunami threats at community level

    NASA Astrophysics Data System (ADS)

    Post, J.; Mück, M.; Zosseder, K.; Wegscheider, S.; Taubenböck, H.; Strunz, G.; Muhari, A.; Anwar, H. Z.; Birkmann, J.; Gebert, N.

    2009-04-01

    Decision makers at the community level need detailed information on tsunami risks in their area. Knowledge of potential hazard impact, exposed elements such as people, critical facilities and lifelines, people's coping capacity and recovery potential is crucial to plan precautionary measures for adaptation and to mitigate potential impacts of tsunamis on society and the environment. A crucial point within a people-centred tsunami risk assessment is to quantify the human response capabilities towards tsunami threats. Based on this quantification and its spatial representation in maps, tsunami-affected and safe areas, difficult-to-evacuate areas, evacuation target points and evacuation routes can be assigned and used as an important contribution to, e.g., community-level evacuation planning. A major component in the quantification of human response capabilities towards tsunami impacts is the factor time. The human response capabilities depend on the estimated time of arrival (ETA) of a tsunami, the time until technical or natural warning signs (ToNW) can be received, the reaction time (RT) of the population (human understanding of a tsunami warning and the decision to take appropriate action), the evacuation time (ET, time people need to reach a safe area) and the actual available response time (RsT = ETA - ToNW - RT). If RsT is larger than ET, people in the respective areas are able to reach a safe area and rescue themselves. Critical areas possess RsT values equal to or even smaller than ET, and hence people within these areas will be directly affected by a tsunami. Quantifying the factor time is challenging, and an attempt at this is presented here. The ETA can be derived by analyzing pre-computed tsunami scenarios for a respective area. For ToNW we assume that the early warning center is able to fulfil the Indonesian presidential decree to issue a warning within 5 minutes. RT is difficult, as here human intrinsic factors such as educational level, beliefs, tsunami knowledge and experience, among others, play a role. An attempt to quantify this variable under high uncertainty is also presented. Quantifying ET is based on GIS modelling using a Cost Weighted Distance approach. The basic principle is to define the best evacuation path from a given point to the next safe area (shelter location). Here the fastest path from that point to the shelter location has to be found. Thereby the impact of land cover, slope, population density, and population age and gender distribution are taken into account, as literature studies prove these factors to be highly important. Knowing the fastest path and the distance to the next safe area, together with a spatially distributed pattern of evacuation speed, delivers the time needed from each location to a safe area. By considering the obtained time value for RsT, the coverage area of an evacuation target point (safe area) can be assigned. Incorporating knowledge on the people capacity of an evacuation target point, the respective coverage area is refined. Hence areas with weak, moderate and good human response capabilities can be detected. This allows calculation of the potential number of people affected (dead or injured) and the number of people dislocated. First results for Kuta (Bali) for a worst case tsunami event deliver approx. 25 000 people affected when RT = 0 minutes (direct evacuation when receiving a tsunami warning) and up to 120 000 when RT > ETA (no evacuation action until the tsunami hits the land). Additionally, fastest evacuation routes to the evacuation target points can be assigned.
    Areas with weak response capabilities can be designated as priority areas in which to install, e.g., additional evacuation target points or to increase tsunami knowledge and awareness and so promote faster reaction times. In particular, analyzing the underlying socio-economic properties that cause deficiencies in responding to a tsunami threat can yield valuable information and directly inform the planning of adaptation measures. Keywords: Community level, Risk and vulnerability assessment, Early warning, Disaster management, Tsunami, Indonesia
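
    The time balance at the heart of this assessment is simple to state in code. The sketch below applies RsT = ETA - ToNW - RT and compares it with ET, exactly as defined in the abstract; the numbers in the example are illustrative.

      # Sketch of the abstract's time balance for evacuation capability.

      def response_capability(eta_min, tonw_min, rt_min, et_min):
          """Return the available response time and whether evacuation succeeds.

          eta_min  : estimated time of tsunami arrival, ETA (min)
          tonw_min : time until a technical/natural warning is received, ToNW (min)
          rt_min   : human reaction time, RT (min)
          et_min   : evacuation time to the nearest safe area, ET (min)
          """
          rst = eta_min - tonw_min - rt_min      # actual available response time
          return rst, rst > et_min               # True -> people reach safety

      # Example: 35 min ETA, warning within the 5 min presidential decree,
      # 10 min reaction time, 15 min to the nearest shelter.
      rst, ok = response_capability(eta_min=35, tonw_min=5, rt_min=10, et_min=15)
      print(rst, ok)   # 20 True; RsT <= ET would instead mark the area critical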

  8. Quantifying Resistance to the Sybil Attack

    Microsoft Academic Search

    N. Boris Margolin; Brian Neil Levine

    2008-01-01

    Sybil attacks have been shown to be unpreventable except under the protection of a vigilant central authority. We use an economic analysis to show quantitatively that some applications and protocols are more robust against the attack than others. In our approach, for each distributed application and an attacker objective, there is a critical value that determines the cost-effectiveness

  9. Quantifying NOx for Industrial Combustion Processes

    Microsoft Academic Search

    C. E. Baukal; P. B. Eleazer

    1998-01-01

    The objectives of this paper are to (1) identify the problems with many of the units that are used to report and regulate NOx, (2) show how to properly correct NOx measurements for oxygen-enhanced combustion, and (3) recommend a preferred type of NOx unit. The current variety of NOx units makes comparisons difficult and can cause considerable confusion. NOx may
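
    For reference, the conventional air-fired correction of a NOx reading to a reference O2 level takes the form sketched below; the paper's point is that such air-based corrections need care under oxygen-enhanced combustion. A minimal sketch, with an assumed 3 % O2 reference.

      # Conventional O2 correction of a dry-basis NOx reading (air as oxidizer).

      O2_IN_AIR = 20.9  # vol % O2 in dry air

      def nox_corrected_ppm(nox_meas_ppm, o2_meas_pct, o2_ref_pct=3.0):
          """Correct a measured NOx concentration (ppm, dry) from the measured
          flue-gas O2 level to a reference O2 level (default 3 %)."""
          return nox_meas_ppm * (O2_IN_AIR - o2_ref_pct) / (O2_IN_AIR - o2_meas_pct)

      # 80 ppm NOx measured at 6 % O2 reads ~96 ppm corrected to 3 % O2.
      print(round(nox_corrected_ppm(80.0, 6.0), 1))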

  10. The Language of Show Biz: A Dictionary.

    ERIC Educational Resources Information Center

    Sergel, Sherman Louis, Ed.

    This dictionary of the language of show biz provides the layman with definitions and essays on terms and expressions often used in show business. The overall pattern of selection was intended to be more rather than less inclusive, though radio, television, and film terms were deliberately omitted. Lengthy explanations are sometimes used to express…

  11. International Plowing Match & Farm Machinery Show

    NSDL National Science Digital Library

    The 1995 International Plowing Match & Farm Machinery Show in Ontario, Canada has a site on the Web. The IPM is a non-profit organization of volunteers which annually organizes Canada's largest farm machinery show. The event is commercial and educational. Thousands of school children and educators attend and participate in organized educational activities.

  12. Collegiate Cattle Growers 2012 Jackpot Show

    E-print Network

    Guerriero, Vince

    Collegiate Cattle Growers 2012 Jackpot Show, February 18-19, at the U of A Ag Center, Campbell Ave; the evening before the show from 6 to 7 pm. Make payment out to: Collegiate Cattle Growers Association. Mail to: University of Arizona, Animal Sciences, Attn: Collegiate Cattle Growers, P.O. Box 210038, Tucson, AZ 85721

  13. BVA at the London Vet Show.

    PubMed

    2015-07-01

    The London Vet Show will take place at Olympia in London on November 19 and 20. A significant event in the veterinary calendar, the show hosts BVA Congress and several BVA programmed streams. Tim Keen, BVA marketing manager, takes a look at what's on offer. PMID:26139680

  14. End-of-Semester Barbecue Talent Show

    E-print Network

    Pilyugin, Sergei S.

    Highlights: End-of-Semester Barbecue, Talent Show, Scholarship Winners, St. Francis Food Drive. ... eating, have fun tubing or canoeing, playing soccer, volleyball, and other games. This is the last ... Ceremony. Details will be in next week's Weekly. Talent Show Dress Rehearsal: All acts MUST

  15. Salton Sea Satellite Image Showing Fault Slip

    USGS Multimedia Gallery

    Landsat satellite image (LE70390372003084EDC00) showing location of surface slip triggered along faults in the greater Salton Trough area. Red bars show the generalized location of 2010 surface slip along faults in the central Salton Trough and many additional faults in the southwestern section of t...

  16. SWINE PROSPECT SHOW Sunday, Dec. 7, 2014

    E-print Network

    Watson, Craig A.

    BARN WARS SWINE PROSPECT SHOW, Sunday, Dec. 7, 2014, Indian River County Fairgrounds, 7955 58th Ave. Checks need to be made out to: Indian River County 4-H. Mail to: Indian River County 4-H, Attn: Prospect ... to all of the rules and regulations set forth by the Barn Wars Prospect Show. Exhibitor's Signature: Date

  17. Identifying and quantifying the stromal fibrosis in muscularis propria of colorectal carcinoma by multiphoton microscopy

    NASA Astrophysics Data System (ADS)

    Chen, Sijia; Yang, Yinghong; Jiang, Weizhong; Feng, Changyin; Chen, Zhifen; Zhuo, Shuangmu; Zhu, Xiaoqin; Guan, Guoxian; Chen, Jianxin

    2014-10-01

    The examination of stromal fibrosis within colorectal cancer is often overlooked, not only because routine pathological examinations focus more on tumour staging and precise surgical margins, but also because of the lack of efficient diagnostic methods. Multiphoton microscopy (MPM) can be used to study the muscularis stroma of normal and colorectal carcinoma tissue at the molecular level. In this work, we attempt to show the practical feasibility of MPM for discerning the microstructure of the normal human rectal muscle layer and of fibrotic colorectal carcinoma tissue. Three types of muscularis propria stromal fibrosis beneath the colorectal cancer infiltration were first observed through the MPM imaging system, which provides intercellular microstructural detail in fresh, unstained tissue samples. Our approach also presents the capability of quantifying the extent of stromal fibrosis from both the amount and the orientation of collagen, which may further characterize the severity of fibrosis. By comparison with the pathology analysis, these results show that MPM has potential advantages as a histological tool for detecting stromal fibrosis and collecting prognostic evidence, which may guide subsequent therapy and improve patient prognosis.

  18. Quantifying and mitigating bias in inference on gravitational wave source populations

    E-print Network

    Jonathan R. Gair; Christopher J. Moore

    2015-05-29

    When using incorrect or inaccurate signal models to perform parameter estimation on a gravitational wave signal, biased parameter estimates will in general be obtained. For a single event this bias may be consistent with the posterior, but when considering a population of events this bias becomes evident as a sag below the expected diagonal line of the P-P plot showing the fraction of signals found within a certain significance level versus that significance level. It would be hoped that recently proposed techniques for accounting for model uncertainties in parameter estimation would, to some extent, alleviate this problem. Here we demonstrate that this is indeed the case. We derive an analytic approximation to the P-P plot obtained when using an incorrect signal model to perform parameter estimation. This approximation is valid in the limit of high signal-to-noise ratio and nearly correct waveform models. We show how the P-P plot changes if a Gaussian process likelihood that allows for model errors is used to analyse the data. We demonstrate analytically and using numerical simulations that the bias is always reduced in this way. These results provide a way to quantify bias in inference on populations and demonstrate the importance of utilising methods to mitigate this bias.
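
    The sag described above can be reproduced with a toy model: Gaussian posteriors whose means carry a systematic offset. The sketch below is illustrative only and not the authors' analysis.

      # Toy P-P plot demonstration: unbiased models track the diagonal, while a
      # systematic posterior offset produces the sag described in the abstract.
      import numpy as np
      from scipy.special import erf

      def pp_curve(bias, n_events=2000, sigma=1.0, seed=None):
          """Empirical fraction of events whose true parameter lies within each
          symmetric credible level, for Gaussian toy posteriors whose means are
          offset from the truth by `bias` posterior sigmas."""
          rng = np.random.default_rng(seed)
          theta_true = rng.normal(0.0, 5.0, n_events)
          mu_post = theta_true + rng.normal(0.0, sigma, n_events) + bias * sigma
          z = (theta_true - mu_post) / sigma
          cred = np.abs(erf(z / np.sqrt(2.0)))   # credible level containing truth
          grid = np.linspace(0.0, 1.0, 101)
          return grid, np.searchsorted(np.sort(cred), grid, side="right") / n_events

      grid, unbiased = pp_curve(bias=0.0, seed=1)
      _, biased = pp_curve(bias=1.0, seed=1)      # systematic 1-sigma bias
      print(unbiased[50], biased[50])  # ~0.5 on the diagonal vs a clear sag below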

  19. Quantifying and mitigating bias in inference on gravitational wave source populations

    NASA Astrophysics Data System (ADS)

    Gair, Jonathan R.; Moore, Christopher J.

    2015-06-01

    When using incorrect or inaccurate signal models to perform parameter estimation on a gravitational wave signal, biased parameter estimates will in general be obtained. For a single event this bias may be consistent with the posterior, but when considering a population of events this bias becomes evident as a sag below the expected diagonal line of the P -P plot showing the fraction of signals found within a certain significance level versus that significance level. It would be hoped that recently proposed techniques for accounting for model uncertainties in parameter estimation would, to some extent, alleviate this problem. Here we demonstrate that this is indeed the case. We derive an analytic approximation to the P -P plot obtained when using an incorrect signal model to perform parameter estimation. This approximation is valid in the limit of high signal-to-noise ratio and nearly correct waveform models. We show how the P -P plot changes if a Gaussian process likelihood that allows for model errors is used to analyze the data. We demonstrate analytically and using numerical simulations that the bias is always reduced in this way. These results provide a way to quantify bias in inference on populations and demonstrate the importance of utilizing methods to mitigate this bias.

  20. Quantifying Vegetation Recovery on Santa Rosa Island 

    E-print Network

    Rentschlar, Elizabeth

    2014-12-09

    reveals that vegetation growth rates (r) are higher in the overwashed transects. This is most likely the result of variations in the plant species found in an overwashed transect. The transects in which vegetation spread to a greater portion...

  1. Application of stereo laser tracking methods for quantifying flight dynamics-II

    Microsoft Academic Search

    Timothy J. Miller; Edward F. Romero; Hubert W. Schreier; Michael T. Valley

    2008-01-01

    Conventional tracking systems measure time-space-position data and collect imagery to quantify the flight dynamics of tracked targets. A major obstacle that severely impacts the accuracy of target characterization is atmospheric-turbulence-induced distortion of the tracking laser beam and degradation of the imagery. Tracking occurs in a continuously changing atmosphere, resulting in rapid variations in the tracking laser beam and distorted

  2. Method for quantifying the sample collected by an Andersen Cascade Impactor using total organic carbon analysis

    Microsoft Academic Search

    Lia G. Rebits; David J. Bennett; Pradnya A. Bhagwat; Andrea Morin; Robert E. Sievers

    2007-01-01

    A sensitive and precise method is proposed in which total organic carbon (TOC) analysis is used to quantify aerosol powder samples collected on the stages of an Andersen Cascade Impactor (ACI). In order to demonstrate the versatility of the method, size distributions of powders of varying composition and dose are presented. The results of ACI collections analyzed by both the

  3. Development of a Rapid Assessment Method for Quantifying Carbon Sequestration on Reclaimed Coal Mine Sites

    Microsoft Academic Search

    S. Maharaj; C. D. Barton; A. D. Karathanasis

    2005-01-01

    Projected climate change resulting from elevated atmospheric carbon dioxide has given rise to various strategies designed to sequester carbon in various terrestrial ecosystems. Reclaimed coal mine soils present one such potential carbon sink where traditional reclamation objectives can complement carbon sequestration. However, quantifying new carbon (carbon that has been added to soil through recent biological processes) on reclaimed mine soils

  4. Quantifying residues from postharvest fumigation of almonds and walnuts with propylene oxide

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A novel analytical approach, involving solvent extraction with methyl tert-butyl ether (MTBE) followed by gas chromatography (GC), was developed to quantify residues that result from the postharvest fumigation of almonds and walnuts with propylene oxide (PPO). Verification and quantification of PPO,...

  5. COMPARISON OF MEASUREMENT TECHNIQUES FOR QUANTIFYING SELECTED ORGANIC EMISSIONS FROM KEROSENE SPACE HEATERS

    EPA Science Inventory

    The report gives the results of (1) a comparison of the hood and chamber techniques for quantifying pollutant emission rates from unvented combustion appliances, and (2) an assessment of the semivolatile and nonvolatile organic-compound emissions from unvented kerosene space heaters. In ...

  6. Image Processing to quantify the Trajectory of a Visualized Air Jet

    Microsoft Academic Search

    A. Van Brecht; K. Janssens; D. Berckmans; E. Vranken

    2000-01-01

    In a ventilated space, the incoming air jet and the resulting airflow pattern play key roles in the removal or supply of heat, moisture, and harmful gases from or to living organisms (man, animal and plant). In this research, an image processing method was developed to visualize and quantify the two-dimensional trajectory and the deflection angle of an air jet

  7. Abstract.-Monte Carlo simula-tion is used to quantify the uncer-

    E-print Network

    Abstract. Monte Carlo simulation is used to quantify the uncertainty in the results or perceived uncertainty in the inputs to the assessment model. Monte Carlo simulation is then used ... proscribed limits while keeping the catch quota stable. We illustrate the use of the Monte Carlo approach

  8. QUANTIFYING ACCELERATED SOIL EROSION THROUGH ECOLOGICAL SITE-BASED ASSESSMENTS OF WIND AND WATER EROSION

    E-print Network

    QUANTIFYING ACCELERATED SOIL EROSION THROUGH ECOLOGICAL SITE-BASED ASSESSMENTS OF WIND AND WATER EROSION. Contact: Nicholas Webb, phone: 575-646-3584, email: nwebb@nmsu.edu. ... change and intensification have resulted in accelerated rates of soil erosion in many areas of the world

  9. Quantifying commuter exposures to volatile organic compounds

    NASA Astrophysics Data System (ADS)

    Kayne, Ashleigh

    Motor-vehicles can be a predominant source of air pollution in cities. Traffic-related air pollution is often unavoidable for people who live in populous areas. Commuters may have high exposures to traffic-related air pollution as they are close to vehicle tailpipes. Volatile organic compounds (VOCs) are one class of air pollutants of concern because exposure to VOCs carries risk for adverse health effects. Specific VOCs of interest for this work include benzene, toluene, ethylbenzene, and xylenes (BTEX), which are often found in gasoline and combustion products. Although methods exist to measure time-integrated personal exposures to BTEX, there are few practical methods to measure a commuter's time-resolved BTEX exposure which could identify peak exposures that could be concealed with a time-integrated measurement. This study evaluated the ability of a photoionization detector (PID) to measure commuters' exposure to BTEX using Tenax TA samples as a reference and quantified the difference in BTEX exposure between cyclists and drivers with windows open and closed. To determine the suitability of two measurement methods (PID and Tenax TA) for use in this study, the precision, linearity, and limits of detection (LODs) for both the PID and Tenax TA measurement methods were determined in the laboratory with standard BTEX calibration gases. Volunteers commuted from their homes to their work places by cycling or driving while wearing a personal exposure backpack containing a collocated PID and Tenax TA sampler. Volunteers completed a survey and indicated if the windows in their vehicle were open or closed. Comparing pairs of exposure data from the Tenax TA and PID sampling methods determined the suitability of the PID to measure the BTEX exposures of commuters. The difference between BTEX exposures of cyclists and drivers with windows open and closed in Fort Collins was determined. Both the PID and Tenax TA measurement methods were precise and linear when evaluated in the laboratory using standard BTEX gases. The LODs for the Tenax TA sampling tubes (determined with a sample volume of 1,000 standard cubic centimeters which is close to the approximate commuter sample volumes collected) were orders of magnitude lower (0.04 to 0.7 parts per billion (ppb) for individual compounds of BTEX) compared to the PIDs' LODs (9.3 to 15 ppb of a BTEX mixture), which makes the Tenax TA sampling method more suitable to measure BTEX concentrations in the sub-parts per billion (ppb) range. PID and Tenax TA data for commuter exposures were inversely related. The concentrations of VOCs measured by the PID were substantially higher than BTEX concentrations measured by collocated Tenax TA samplers. The inverse trend and the large difference in magnitude between PID responses and Tenax TA BTEX measurements indicates the two methods may have been measuring different air pollutants that are negatively correlated. Drivers in Fort Collins, Colorado with closed windows experienced greater time-weighted average BTEX exposures than cyclists (p: 0.04). Commuter BTEX exposures measured in Fort Collins were lower than commuter exposures measured in prior studies that occurred in larger cities (Boston and Copenhagen). Although route and intake may affect a commuter's BTEX dose, these variables are outside of the scope of this study. 
Within the limitations of this study (including: small sample size, small representative area of Fort Collins, and respiration rates not taken into account), it appears health risks associated with traffic-induced BTEX exposures may be reduced by commuting via cycling instead of driving with windows closed and living in a less populous area that has less vehicle traffic. Although the PID did not reliably measure low-level commuter BTEX exposures, the Tenax TA sampling method did. The PID measured BTEX concentrations reliably in a controlled environment, at high concentrations (300-800 ppb), and in the absence of other air pollutants. In environments where there could be multiple chemicals present that may produce a PID signal (such a

  10. Sweet Concepts Inc.: Trade Show Marketing

    Microsoft Academic Search

    Mark Parry; Melanie Jones

    Brooks West of Sweet Concepts has recently adopted a new brand strategy for Butterfields, the company's line of hard candy. West must decide how the new branding strategy should affect his trade show marketing program.

  11. Do dogs (Canis familiaris) show contagious yawning?

    PubMed

    Harr, Aimee L; Gilbert, Valerie R; Phillips, Kimberley A

    2009-11-01

    We report an experimental investigation into whether domesticated dogs display contagious yawning. Fifteen dogs were shown video clips of (1) humans and (2) dogs displaying yawns and open-mouth expressions (not yawns) to investigate whether dogs showed contagious yawning to either of these social stimuli. Only one dog performed significantly more yawns during or shortly after viewing yawning videos than the open-mouth videos, and most of these yawns occurred to the human videos. No dogs showed significantly more yawning to the open-mouth videos (human or dog). The percentage of dogs showing contagious yawning was lower than that reported for chimpanzees and humans, and considerably lower than in a recently published report investigating this behavior in dogs (Joly-Mascheroni et al. in Biol Lett 4:446-448, 2008). PMID:19452178

  12. Nutrition and Feeding of Show Poultry 

    E-print Network

    Cartwright, A. Lee

    2003-11-03

    The championship potential of a chicken or turkey is determined by genetics, but proper nutrition can help an animal achieve that genetic potential. This publication outlines four principles critical to developing a nutrition program for show...

  13. Incident Response Planning for Selected Livestock Shows 

    E-print Network

    Tomascik, Chelsea Roxanne

    2012-02-14

    was to determine local officials' perceptions and awareness of incident planning and response pertaining to selected livestock shows. Little research has been completed in this area; therefore, this foundational study was needed. The objectives of this study...

  14. System Generator Tips Show sample time colors

    E-print Network

    System Generator Tips · Show sample time colors (Format → Port/Signal Displays → Sample Time Colors) · Start/stop the simulation · Give subsystems and their ports meaningful names · Mask subsystems you'll use again · Using too

  15. New Drug Shows Promise Against Psoriasis

    MedlinePLUS

    ... nlm.nih.gov/medlineplus/news/fullstory_153105.html New Drug Shows Promise Against Psoriasis Ixekizumab appeared to ... the disease clearing up, but people on the new drug also reporting a marked improvement in their ...

  16. The Moscow Show of Dissident Art

    ERIC Educational Resources Information Center

    Millet, Stephen M.

    1975-01-01

    Author described a show of dissident art held in Moscow on September 29, 1974, and contrasted the government's efforts to control artistic freedom with the determination of Russian artists to resist such imposition. (RK)

  17. World's Population Is Getting Sicker, Study Shows

    MedlinePLUS

    ... nlm.nih.gov/medlineplus/news/fullstory_152968.html World's Population Is Getting Sicker, Study Shows People lose ... largest analysis of trends in health around the world for the years 1990 to 2013, the journal ...

  18. Early Intervention Shows Promise in Treating Schizophrenia

    MedlinePLUS

    ... 153483.html Early Intervention Shows Promise in Treating Schizophrenia Programs that emphasize resiliency, education and job support ... health of patients in the early stages of schizophrenia, new research reveals. The finding, reported in the ...

  19. Ebola Treatment Shows Promise in Monkey Study

    MedlinePLUS

    Ebola Treatment Shows Promise in Monkey Study Antiviral drug cured animals with advanced infections, researchers say To ... HealthDay News) -- An experimental drug being tested on Ebola victims in Sierra Leone has proven effective in ...

  20. Quantifying heart rate dynamics using different approaches of symbolic dynamics

    NASA Astrophysics Data System (ADS)

    Cysarz, D.; Porta, A.; Montano, N.; Leeuwen, P. V.; Kurths, J.; Wessel, N.

    2013-06-01

    The analysis of symbolic dynamics applied to physiological time series is able to retrieve information about dynamical properties of the underlying system that cannot be gained with standard methods such as spectral analysis. Different approaches for the transformation of the original time series to the symbolic time series have been proposed. Yet the differences between the approaches are unknown. In this study three different transformation methods are investigated: (1) symbolization according to the deviation from the average time series, (2) symbolization according to several equidistant levels between the minimum and maximum of the time series, (3) binary symbolization of the first derivative of the time series. Furthermore, permutation entropy was used to quantify the symbolic series. Each method was applied to the cardiac interbeat interval series RRi and its difference series ΔRRi of 17 healthy subjects obtained during head-up tilt testing. The symbolic dynamics of each method is analyzed by means of the occurrence of short sequences ("words") of length 3. The occurrence of words is grouped according to words without variations of the symbols (0V%), words with one variation (1V%), two like variations (2LV%) and two unlike variations (2UV%). Linear regression analysis showed that for method 1, 0V%, 1V%, 2LV% and 2UV% changed with increasing tilt angle. For method 2, 0V%, 2LV% and 2UV% changed with increasing tilt angle, and method 3 showed changes for 0V% and 1V%. Furthermore, the permutation entropy decreased with increasing tilt angle. In conclusion, all methods are capable of reflecting changes of the cardiac autonomic nervous system during head-up tilt. All methods show that even the analysis of very short symbolic sequences is capable of tracking changes of the cardiac autonomic regulation during head-up tilt testing.
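
    Symbolization method (2) and the word statistics are straightforward to implement. The sketch below assumes six equidistant levels (a common choice in this literature, not necessarily the paper's setting) and classifies length-3 words into the 0V/1V/2LV/2UV families.

      # Sketch of equidistant-level symbolization and 0V/1V/2LV/2UV word counts.
      import numpy as np

      def symbolize_equidistant(x, n_levels=6):
          """Map each sample to one of n_levels equidistant bins over [min, max]."""
          x = np.asarray(x, float)
          edges = np.linspace(x.min(), x.max(), n_levels + 1)
          return np.clip(np.digitize(x, edges[1:-1]), 0, n_levels - 1)

      def word_fractions(symbols):
          """Percentages of length-3 words with 0, 1, 2-like, 2-unlike variations."""
          counts = {"0V%": 0, "1V%": 0, "2LV%": 0, "2UV%": 0}
          words = np.lib.stride_tricks.sliding_window_view(symbols, 3)
          for a, b, c in words:
              if a == b == c:
                  counts["0V%"] += 1
              elif a == b or b == c:
                  counts["1V%"] += 1
              elif (b - a) * (c - b) > 0:
                  counts["2LV%"] += 1    # two variations in the same direction
              else:
                  counts["2UV%"] += 1    # peak or valley pattern
          n = len(words)
          return {k: 100.0 * v / n for k, v in counts.items()}

      # Toy RR interval series (ms).
      rng = np.random.default_rng(2)
      rr = 800 + 50 * np.sin(np.linspace(0, 20, 300)) + rng.normal(0, 10, 300)
      print(word_fractions(symbolize_equidistant(rr)))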

  1. Quantifying the Impact of Dust on Heterogeneous Ice Generation in Midlevel Supercooled Stratiform Clouds

    SciTech Connect

    Zhang, Damao; Wang, Zhien; Heymsfield, Andrew J.; Fan, Jiwen; Liu, Dong; Zhao, Ming

    2012-09-26

    Dust aerosols have been regarded as effective ice nuclei (IN), but large uncertainties regarding their efficiencies remain. Here, four years of collocated CALIPSO and CloudSat measurements are used to quantify the impact of dust on heterogeneous ice generation in midlevel supercooled stratiform clouds (MSSCs) over the ‘dust belt’. The results show that the dusty MSSCs have an up to 20% higher mixed-phase cloud occurrence, up to 8 dBZ higher mean maximum Ze (Ze_max), and up to 11.5 g/m2 higher ice water path (IWP) than similar MSSCs under background aerosol conditions. Assuming similar ice growth and fallout history in similar MSSCs, the significant differences in Ze_max between dusty and non-dusty MSSCs reflect ice particle number concentration differences. Therefore, observed Ze_max differences indicate that dust could enhance ice particle concentration in MSSCs by a factor of 2 to 6 at temperatures colder than −12 °C. The enhancements are strongly dependent on the cloud top temperature, large dust particle concentration and chemical compositions. These results imply an important role of dust particles in modifying mixed-phase cloud properties globally.

  2. A Supervised Approach to Quantifying Sentence Similarity: With Application to Evidence Based Medicine

    PubMed Central

    Hassanzadeh, Hamed; Groza, Tudor; Nguyen, Anthony; Hunter, Jane

    2015-01-01

    Following the Evidence Based Medicine (EBM) practice, practitioners make use of the existing evidence to make therapeutic decisions. This evidence, in the form of scientific statements, is usually found in scholarly publications such as randomised control trials and systematic reviews. However, finding such information in the overwhelming amount of published material is particularly challenging. Approaches have been proposed to automatically extract scientific artefacts in EBM using standardised schemas. Our work takes this stream a step forward and looks into consolidating extracted artefacts—i.e., quantifying their degree of similarity based on the assumption that they carry the same rhetorical role. By semantically connecting key statements in the literature of EBM, practitioners are not only able to find available evidence more easily, but also can track the effects of different treatments/outcomes in a number of related studies. We devise a regression model based on a varied set of features and evaluate it both on a general English corpus (the SICK corpus), as well as on an EBM corpus (the NICTA-PIBOSO corpus). Experimental results show that our approach performs on par with the state of the art on the general English and achieves encouraging results on the biomedical text when compared against human judgement. PMID:26039310
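
    A minimal version of such a supervised set-up can be sketched as follows; the two features used here (token overlap and length ratio) are illustrative stand-ins for the paper's richer feature set, and the training scores are invented toy data.

      # Hedged sketch of a feature-based regression over sentence pairs.
      from sklearn.linear_model import Ridge

      def pair_features(s1, s2):
          """Tiny illustrative feature vector for a sentence pair."""
          t1, t2 = set(s1.lower().split()), set(s2.lower().split())
          overlap = len(t1 & t2) / max(len(t1 | t2), 1)        # Jaccard overlap
          len_ratio = min(len(t1), len(t2)) / max(len(t1), len(t2), 1)
          return [overlap, len_ratio]

      train_pairs = [   # toy (sentence, sentence, similarity score) triples
          ("the drug reduced mortality", "mortality was reduced by the drug", 4.5),
          ("the drug reduced mortality", "the trial recruited 200 patients", 1.0),
          ("patients received a placebo", "a placebo was given to patients", 4.8),
      ]
      X = [pair_features(a, b) for a, b, _ in train_pairs]
      y = [score for _, _, score in train_pairs]

      model = Ridge(alpha=1.0).fit(X, y)
      print(model.predict([pair_features("the drug reduced mortality",
                                         "the drug lowered death rates")]))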

  3. Quantifying component parts of indirect and direct voice therapy related to different voice disorders.

    PubMed

    Gartner-Schmidt, Jackie L; Roth, Douglas F; Zullo, Thomas G; Rosen, Clark A

    2013-03-01

    Voice therapy changes how people use and care for their voices. Speech-language pathologists (SLPs) have a multitude of choices from which to modify patient's vocal behaviors. Six SLPs performed 1461 voice therapy sessions and quantified the percentage of time spent in eight component parts of indirect and four component parts of direct voice therapy across five common voice disorders. Voice therapy data collection forms were prospectively completed immediately following each therapy visit. The SLPs were free to choose the component parts of voice therapy best suited for their respective patients. Results showed that direct voice therapy represented more than 75% of the treatment time across all voice therapy sessions. In the components of direct voice therapy, there was no statistical difference between percentages of time spent in resonant voice and flow phonation across all voice disorders. However, a significant difference was found for the time spent addressing transfer to conversational speech for muscle tension dysphonia, lesions, and scar than for vocal immobility and atrophy. Interestingly, while SLPs used a more common approach to direct voice therapy across voice disorders, they tended to vary the use of indirect components of therapy across voice disorders with certain components being addressed in greater length for specific voice disorders. Collectively, these results indicate that although SLPs may individualize their approach to indirect voice therapy, when it comes to direct voice therapy, SLPs have a common approach to voice therapy regardless of voice disorder. PMID:23352061

  4. A supervised approach to quantifying sentence similarity: with application to evidence based medicine.

    PubMed

    Hassanzadeh, Hamed; Groza, Tudor; Nguyen, Anthony; Hunter, Jane

    2015-01-01

    Following the Evidence Based Medicine (EBM) practice, practitioners make use of the existing evidence to make therapeutic decisions. This evidence, in the form of scientific statements, is usually found in scholarly publications such as randomised control trials and systematic reviews. However, finding such information in the overwhelming amount of published material is particularly challenging. Approaches have been proposed to automatically extract scientific artefacts in EBM using standardised schemas. Our work takes this stream a step forward and looks into consolidating extracted artefacts-i.e., quantifying their degree of similarity based on the assumption that they carry the same rhetorical role. By semantically connecting key statements in the literature of EBM, practitioners are not only able to find available evidence more easily, but also can track the effects of different treatments/outcomes in a number of related studies. We devise a regression model based on a varied set of features and evaluate it both on a general English corpus (the SICK corpus), as well as on an EBM corpus (the NICTA-PIBOSO corpus). Experimental results show that our approach performs on par with the state of the art on the general English and achieves encouraging results on the biomedical text when compared against human judgement. PMID:26039310

  5. Quantifying fluvial sediment flux on a monsoonal mega-river: the Mekong

    NASA Astrophysics Data System (ADS)

    Parsons, D. R.; Darby, S. E.; Hackney, C. R.; Best, J.; Aalto, R. E.; Nicholas, A. P.; Leyland, J.

    2013-12-01

    Quantifying sediment fluxes and distinguishing between bed-load and suspended-load (bed-load and suspended bed-load) transport of large rivers remains a significant challenge. It is increasingly apparent that prediction of large river morphodynamics in response to environmental change requires a robust quantification of sediment fluxes across a range of discharges. Such quantification becomes even more problematic for monsoonal rivers, where large non-linearities in hydrological-sediment relations exist. This paper, as part of the NERC-funded STELAR-S2S project (www.stelar-s2s.org), presents a series of repeat multibeam sonar bed surveys and acoustic calibrations that allow simultaneous quantification of bed-load transport and suspended-load fluxes in the lower Mekong River. Results show how multibeam sonar can be used to map bedform evolution across a range of time scales and produce robust estimates of bed-load, whilst acoustic backscatter calibration against suspended sediment load can be used in combination with Doppler flow velocity estimates to recover full sediment fluxes at the reach scale. The methods, estimates of error and implications of the results for the functioning of large river systems will be discussed.
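
    One standard way to turn repeat multibeam surveys into bed-load estimates is dune tracking, where transport scales with dune height and migration celerity, q_b = (1 - p) * beta * H * V_c. Whether this is the authors' exact method is not stated, so the sketch below is a hedged illustration with assumed porosity and shape factor.

      # Hedged sketch: bed-load flux from dune tracking between repeat surveys.

      def bedload_flux(dune_height_m, migration_m, interval_s,
                       porosity=0.4, shape_factor=0.5):
          """Volumetric bed-load transport rate per unit width (m^2/s).
          shape_factor ~0.5 assumes roughly triangular dunes (assumption)."""
          celerity = migration_m / interval_s        # dune celerity V_c (m/s)
          return (1.0 - porosity) * shape_factor * dune_height_m * celerity

      # 2 m high dunes migrating 15 m between surveys 24 h apart:
      q_b = bedload_flux(dune_height_m=2.0, migration_m=15.0, interval_s=86400.0)
      print(f"q_b ~ {q_b:.2e} m^2/s per metre width")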

  6. Method for quantifying optical properties of the human lens

    DOEpatents

    Loree, T.R.; Bigio, I.J.; Zuclich, J.A.; Shimada, Tsutomu; Strobl, K.

    1999-04-13

    A method is disclosed for quantifying optical properties of the human lens. The present invention includes the application of fiberoptic, OMA-based instrumentation as an in vivo diagnostic tool for the human ocular lens. Rapid, noninvasive and comprehensive assessment of the optical characteristics of a lens using very modest levels of exciting light is described. Typically, the backscatter and fluorescence spectra (from about 300- to 900-nm) elicited by each of several exciting wavelengths (from about 300- to 600-nm) are collected within a few seconds. The resulting optical signature of individual lenses is then used to assess the overall optical quality of the lens by comparing the results with a database of similar measurements obtained from a reference set of normal human lenses of various ages. Several metrics have been identified which gauge the optical quality of a given lens relative to the norm for the subject's chronological age. These metrics may also serve to document accelerated optical aging and/or as early indicators of cataract or other disease processes. 8 figs.

  7. QUANTIFYING THE EVOLVING MAGNETIC STRUCTURE OF ACTIVE REGIONS

    SciTech Connect

    Conlon, Paul A.; McAteer, R.T. James; Gallagher, Peter T.; Fennell, Linda, E-mail: mcateer@nmsu.ed [School of Physics, Trinity College Dublin, Dublin 2 (Ireland)

    2010-10-10

    The topical and controversial issue of parameterizing the magnetic structure of solar active regions has vital implications in the understanding of how these structures form, evolve, produce solar flares, and decay. This interdisciplinary and ill-constrained problem of quantifying complexity is addressed by using a two-dimensional wavelet transform modulus maxima (WTMM) method to study the multifractal properties of active region photospheric magnetic fields. The WTMM method provides an adaptive space-scale partition of a fractal distribution, from which one can extract the multifractal spectra. The use of a novel segmentation procedure allows us to remove the quiet Sun component and reliably study the evolution of active region multifractal parameters. It is shown that prior to the onset of solar flares, the magnetic field undergoes restructuring as Dirac-like features (with a Hölder exponent, h = -1) coalesce to form step functions (where h = 0). The resulting configuration has a higher concentration of gradients along neutral line features. We propose that when sufficient flux is present in an active region for a period of time, it must be structured with a fractal dimension greater than 1.2, and a Hölder exponent greater than -0.7, in order to produce M- and X-class flares. This result has immediate applications in the study of the underlying physics of active region evolution and space weather forecasting.
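
    The WTMM quantities referred to above follow standard definitions, sketched here for reference (notation assumed):

    ```latex
    % Partition function over the wavelet-transform modulus-maxima lines L(a):
    Z(q, a) = \sum_{l \in L(a)} \Big( \sup_{(\mathbf{x}, a') \in l,\; a' \le a}
              |T_\psi[f](\mathbf{x}, a')| \Big)^{q} \sim a^{\tau(q)}
    % Singularity (multifractal) spectrum via the Legendre transform:
    D(h) = \min_{q} \big( q h - \tau(q) \big), \qquad h = \frac{d\tau}{dq}
    % Dirac-like features correspond to h = -1; step functions to h = 0.
    ```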

  8. AN INDUSTRIAL HYGIENE SAMPLING STRATEGY TO QUANTIFY EMPLOYEE EXPOSURE

    SciTech Connect

    Thompson, Aaron L.; Hylko, James M.

    2003-02-27

    Depending on the invasive nature of waste management activities, excessive concentrations of mists, vapors, gases, dusts or fumes may be present, creating hazards to the employee from either inhalation into the lungs or absorption through the skin. To address these hazards, similar exposure groups and an exposure profile result consisting of: (1) a hazard index (concentration); (2) an exposure rating (monitoring results or exposure probabilities); and (3) a frequency rating (hours of potential exposure per week) are used to assign an exposure risk rating (ERR). The ERR determines whether the potential hazards pose significant risks to employees, linking potential exposure to breathing zone (BZ) monitoring requirements. Three case studies, consisting of: (1) a hazard-task approach; (2) a hazard-job classification-task approach; and (3) a hazard approach, demonstrate how to conduct exposure assessments using this methodology. Environment, safety and health professionals can then categorize levels of risk and evaluate the need for BZ monitoring, thereby quantifying employee exposure levels accurately.
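
    A minimal sketch of the rating logic described above; the abstract names the three inputs but not the combination rule, so the multiplicative ERR and the monitoring threshold below are illustrative assumptions only.

    ```python
    # Hedged sketch of an exposure-risk-rating (ERR) calculation; the
    # multiplication of ordinal ratings and the threshold are assumptions.
    from dataclasses import dataclass

    @dataclass
    class ExposureProfile:
        hazard_index: int      # 1 (low concentration hazard) .. 4 (high)
        exposure_rating: int   # 1 (exposure unlikely) .. 4 (measured >= limit)
        frequency_rating: int  # 1 (<8 h/week potential exposure) .. 4 (>30 h/week)

        def err(self) -> int:
            return self.hazard_index * self.exposure_rating * self.frequency_rating

        def needs_bz_monitoring(self, threshold: int = 16) -> bool:
            # Breathing-zone monitoring triggered above an assumed ERR threshold.
            return self.err() >= threshold

    profile = ExposureProfile(hazard_index=3, exposure_rating=2, frequency_rating=3)
    print(profile.err(), profile.needs_bz_monitoring())  # 18 True
    ```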

  9. Quantifying the Evolution of Soil Fabric Under Different Stress Paths

    NASA Astrophysics Data System (ADS)

    Barreto, D.; O'Sullivan, C.; Zdravkovic, L.

    2009-06-01

    It is well recognized that the macro-scale response of soils is anisotropic in terms of strength, stiffness, permeability, etc. The source of this anisotropy is thought to be an anisotropy of the material itself, which can be quantified using statistical methods if DEM numerical simulations or advanced experimental techniques are used. The anisotropic response of soil has been analyzed by many researchers in terms of the fabric tensor, which provides a measure of the orientation of the contacts between particles. Although many approaches for quantifying the evolution of soil fabric have been used, they have not previously been compared to assess their effectiveness in describing fabric changes. A direct comparison of different methods of fabric quantification is presented in this paper, based on the results of DEM simulations under different stress paths, and the suitability of each method is discussed. The results highlight the need for more accurate methods and/or approaches to describe the evolution of fabric anisotropy in granular materials.
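
    For reference, the contact fabric tensor mentioned above has a standard definition that is straightforward to compute from DEM contact normals; this sketch uses an isotropic synthetic packing.

    ```python
    # Standard contact-based fabric tensor used in DEM studies
    # (variable names are illustrative).
    import numpy as np

    def fabric_tensor(contact_normals: np.ndarray) -> np.ndarray:
        """Phi_ij = (1/Nc) * sum_c n_i^c n_j^c over Nc contact unit normals."""
        n = contact_normals / np.linalg.norm(contact_normals, axis=1, keepdims=True)
        return n.T @ n / len(n)

    rng = np.random.default_rng(0)
    normals = rng.normal(size=(500, 3))           # isotropic sample of contacts
    phi = fabric_tensor(normals)

    # Deviatoric fabric: a scalar anisotropy measure (zero for isotropy).
    dev = phi - np.trace(phi) / 3 * np.eye(3)
    print(np.sqrt(1.5 * np.sum(dev**2)))          # ~0 for an isotropic packing
    ```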

  10. Quantifying the benefits of vehicle pooling with shareability networks.

    PubMed

    Santi, Paolo; Resta, Giovanni; Szell, Michael; Sobolevsky, Stanislav; Strogatz, Steven H; Ratti, Carlo

    2014-09-16

    Taxi services are a vital part of urban transportation, and a considerable contributor to traffic congestion and air pollution causing substantial adverse effects on human health. Sharing taxi trips is a possible way of reducing the negative impact of taxi services on cities, but this comes at the expense of passenger discomfort quantifiable in terms of a longer travel time. Due to computational challenges, taxi sharing has traditionally been approached on small scales, such as within airport perimeters, or with dynamical ad hoc heuristics. However, a mathematical framework for the systematic understanding of the tradeoff between collective benefits of sharing and individual passenger discomfort is lacking. Here we introduce the notion of shareability network, which allows us to model the collective benefits of sharing as a function of passenger inconvenience, and to efficiently compute optimal sharing strategies on massive datasets. We apply this framework to a dataset of millions of taxi trips taken in New York City, showing that with increasing but still relatively low passenger discomfort, cumulative trip length can be cut by 40% or more. This benefit comes with reductions in service cost, emissions, and with split fares, hinting toward a wide passenger acceptance of such a shared service. Simulation of a realistic online system demonstrates the feasibility of a shareable taxi service in New York City. Shareability as a function of trip density saturates fast, suggesting effectiveness of the taxi sharing system also in cities with much sparser taxi fleets or when willingness to share is low. PMID:25197046
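
    A minimal sketch of the shareability-network construction: trips become nodes, edges join trips combinable within a delay budget, and a maximum matching selects disjoint shared pairs. The one-dimensional geometry and delay heuristic are toy assumptions; the paper works with real street-network travel times.

    ```python
    # Toy shareability network: nodes are trips, edges connect trips whose
    # combined service stays within a passenger-delay budget.
    import itertools
    import networkx as nx

    # (pickup_position, dropoff_position, departure_time) on a 1-D road, speed = 1
    trips = [(0, 10, 0.0), (1, 9, 0.2), (5, 2, 0.1), (6, 1, 0.3)]
    DELAY_BUDGET = 3.0   # max extra travel time a passenger tolerates (assumed)

    def shared_delay(t1, t2):
        """Crude extra time if one vehicle serves both trips."""
        return abs(t1[0] - t2[0]) + abs(t1[1] - t2[1]) + abs(t1[2] - t2[2])

    G = nx.Graph()
    G.add_nodes_from(range(len(trips)))
    for i, j in itertools.combinations(range(len(trips)), 2):
        d = shared_delay(trips[i], trips[j])
        if d <= DELAY_BUDGET:
            G.add_edge(i, j, weight=DELAY_BUDGET - d)   # prefer low-delay pairs

    # Maximum-weight matching picks disjoint trip pairs to share.
    matching = nx.max_weight_matching(G, maxcardinality=True)
    print("shared trip pairs:", matching)   # pairs such as (0, 1) and (2, 3)
    ```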

  11. Quantifying food losses and the potential for reduction in Switzerland.

    PubMed

    Beretta, Claudio; Stoessel, Franziska; Baier, Urs; Hellweg, Stefanie

    2013-03-01

    A key element in making our food systems more efficient is the reduction of food losses across the entire food value chain. Nevertheless, food losses are often neglected. This paper quantifies food losses in Switzerland at the various stages of the food value chain (agricultural production, postharvest handling and trade, processing, food service industry, retail, and households), identifies hotspots and analyses the reasons for losses. Twenty-two food categories are modelled separately in a mass and energy flow analysis, based on data from 31 companies within the food value chain, and from public institutions, associations, and from the literature. The energy balance shows that 48% of the total calories produced (edible crop yields at harvest time and animal products, including slaughter waste) is lost across the whole food value chain. Half of these losses would be avoidable given appropriate mitigation measures. Most avoidable food losses occur at the household, processing, and agricultural production stage of the food value chain. Households are responsible for almost half of the total avoidable losses (in terms of calorific content). PMID:23270687

  12. Quantifying the benefits of vehicle pooling with shareability networks

    PubMed Central

    Santi, Paolo; Resta, Giovanni; Szell, Michael; Sobolevsky, Stanislav; Strogatz, Steven H.; Ratti, Carlo

    2014-01-01

    Taxi services are a vital part of urban transportation, and a considerable contributor to traffic congestion and air pollution causing substantial adverse effects on human health. Sharing taxi trips is a possible way of reducing the negative impact of taxi services on cities, but this comes at the expense of passenger discomfort quantifiable in terms of a longer travel time. Due to computational challenges, taxi sharing has traditionally been approached on small scales, such as within airport perimeters, or with dynamical ad hoc heuristics. However, a mathematical framework for the systematic understanding of the tradeoff between collective benefits of sharing and individual passenger discomfort is lacking. Here we introduce the notion of shareability network, which allows us to model the collective benefits of sharing as a function of passenger inconvenience, and to efficiently compute optimal sharing strategies on massive datasets. We apply this framework to a dataset of millions of taxi trips taken in New York City, showing that with increasing but still relatively low passenger discomfort, cumulative trip length can be cut by 40% or more. This benefit comes with reductions in service cost, emissions, and with split fares, hinting toward a wide passenger acceptance of such a shared service. Simulation of a realistic online system demonstrates the feasibility of a shareable taxi service in New York City. Shareability as a function of trip density saturates fast, suggesting effectiveness of the taxi sharing system also in cities with much sparser taxi fleets or when willingness to share is low. PMID:25197046

  13. Parkinson's Law quantified: three investigations on bureaucratic inefficiency

    NASA Astrophysics Data System (ADS)

    Klimek, Peter; Hanel, Rudolf; Thurner, Stefan

    2009-03-01

    We formulate three famous, descriptive essays of Parkinson on bureaucratic inefficiency in a quantifiable and dynamical socio-physical framework. In the first model we show how recent opinion formation models for small groups can be used to understand Parkinson's observation that decision-making bodies such as cabinets or boards become highly inefficient once their size exceeds a critical 'Coefficient of Inefficiency', typically around 20. A second observation of Parkinson—which is sometimes referred to as Parkinson's Law—is that the growth of bureaucratic or administrative bodies usually goes hand in hand with a drastic decrease of their overall efficiency. In our second model we view a bureaucratic body as a system of a flow of workers, who enter, become promoted to various internal levels within the system over time, and leave the system after having served for a certain time. Promotion usually is associated with an increase of subordinates. Within the proposed model it becomes possible to work out the phase diagram of the conditions under which bureaucratic growth can be confined. In our last model we assign individual efficiency curves to workers throughout their life in administration, and compute the optimum time to grant them the old-age pension, in order to ensure a maximum of efficiency within the body—in Parkinson's words, we compute the 'Pension Point'.

  14. Quantifying Stochastic Effects in Biochemical Reaction Networks using Partitioned Leaping

    E-print Network

    Leonard A. Harris; Aaron M. Piccirilli; Emily R. Majusiak; Paulette Clancy

    2009-07-06

    "Leaping" methods show great promise for significantly accelerating stochastic simulations of complex biochemical reaction networks. However, few practical applications of leaping have appeared in the literature to date. Here, we address this issue using the "partitioned leaping algorithm" (PLA) [L.A. Harris and P. Clancy, J. Chem. Phys. 125, 144107 (2006)], a recently-introduced multiscale leaping approach. We use the PLA to investigate stochastic effects in two model biochemical reaction networks. The networks that we consider are simple enough so as to be accessible to our intuition but sufficiently complex so as to be generally representative of real biological systems. We demonstrate how the PLA allows us to quantify subtle effects of stochasticity in these systems that would be difficult to ascertain otherwise as well as not-so-subtle behaviors that would strain commonly-used "exact" stochastic methods. We also illustrate bottlenecks that can hinder the approach and exemplify and discuss possible strategies for overcoming them. Overall, our aim is to aid and motivate future applications of leaping by providing stark illustrations of the benefits of the method while at the same time elucidating obstacles that are often encountered in practice.

  15. Quantifying Uncertainties in Earthquake Recurrence Rate

    Microsoft Academic Search

    K. Coppersmith

    2003-01-01

    Probabilistic seismic hazard analysis (PSHA) requires expressions of the location, rate, and size of future seismicity. By definition, a PSHA is forward-looking in time -- we commonly use the results to design or evaluate engineered structures over their future lifetimes -- but relies on our perceptions and knowledge of the past.

  16. Quantifying MCMC exploration of phylogenetic tree space.

    PubMed

    Whidden, Chris; Matsen, Frederick A

    2015-05-01

    In order to gain an understanding of the effectiveness of phylogenetic Markov chain Monte Carlo (MCMC), it is important to understand how quickly the empirical distribution of the MCMC converges to the posterior distribution. In this article, we investigate this problem on phylogenetic tree topologies with a metric that is especially well suited to the task: the subtree prune-and-regraft (SPR) metric. This metric directly corresponds to the minimum number of MCMC rearrangements required to move between trees in common phylogenetic MCMC implementations. We develop a novel graph-based approach to analyze tree posteriors and find that the SPR metric is much more informative than simpler metrics that are unrelated to MCMC moves. In doing so, we show conclusively that topological peaks do occur in Bayesian phylogenetic posteriors from real data sets as sampled with standard MCMC approaches, investigate the efficiency of Metropolis-coupled MCMC (MCMCMC) in traversing the valleys between peaks, and show that conditional clade distribution (CCD) can have systematic problems when there are multiple peaks. PMID:25631175

  17. Educational Outreach: The Space Science Road Show

    NASA Astrophysics Data System (ADS)

    Cox, N. L. J.

    2002-01-01

    The poster presented will give an overview of a study towards a "Space Road Show". The topic of this show is space science. The target group is adolescents, aged 12 to 15, at Dutch high schools. The show and its accompanying experiments would be supported with suitable educational material. Science teachers at schools can decide for themselves if they want to use this material in advance, afterwards or not at all. The aims of this outreach effort are: to motivate students for space science and engineering, to help them understand the importance of (space) research, to give them a positive feeling about the possibilities offered by space and in the process give them useful knowledge on space basics. The show revolves around three main themes: applications, science and society. First the students will get some historical background on the importance of space/astronomy to civilization. Secondly they will learn more about novel uses of space. On the one hand they will learn of "Views on Earth" involving technologies like Remote Sensing (or Spying), Communication, Broadcasting, GPS and Telemedicine. On the other hand they will experience "Views on Space" illustrated by past, present and future space research missions, like the space exploration missions (Cassini/Huygens, Mars Express and Rosetta) and the astronomy missions (Soho and XMM). Meanwhile, the students will learn more about the technology of launchers and satellites needed to accomplish these space missions. Throughout the show, and especially towards the end, attention will be paid to the third theme: "Why go to space?" Other reasons for people to get into space will be explored. An important question in this is the commercial (manned) exploration of space. Thus, the questions of the benefit of space to society are integrated in the entire show. It raises some fundamental questions about the effects of space travel on our environment, poverty and other moral issues. The show attempts to connect scientific with community thought. The difficulty with a show this elaborate and intricate is communicating at a level understandable to teenagers, whilst not treating them like children. Professional space scientists know how easy it is to lose oneself in technical specifics. This would, of course, only confuse young people. The author would like to discuss the ideas for this show with a knowledgeable audience and hopefully get some (constructive) feedback.

  18. Quantifying the Climate-Scale Accuracy of Satellite Cloud Retrievals

    NASA Astrophysics Data System (ADS)

    Roberts, Y.; Wielicki, B. A.; Sun-Mack, S.; Minnis, P.; Liang, L.; Di Girolamo, L.

    2014-12-01

    Instrument calibration and cloud retrieval algorithms have been developed to minimize retrieval errors on small scales. However, measurement uncertainties and assumptions within retrieval algorithms at the pixel level may alias into decadal-scale trends of cloud properties. We first, therefore, quantify how instrument calibration changes could alias into cloud property trends. For a perfect observing system the climate trend accuracy is limited only by the natural variability of the climate variable. Alternatively, for an actual observing system, the climate trend accuracy is additionally limited by the measurement uncertainty. Drifts in calibration over time may therefore be disguised as a true climate trend. We impose absolute calibration changes to MODIS spectral reflectance used as input to the CERES Cloud Property Retrieval System (CPRS) and run the modified MODIS reflectance through the CPRS to determine the sensitivity of cloud properties to calibration changes. We then use these changes to determine the impact of instrument calibration changes on trend uncertainty in reflected solar cloud properties. Secondly, we quantify how much cloud retrieval algorithm assumptions alias into cloud optical retrieval trends by starting with the largest of these biases: the plane-parallel assumption in cloud optical thickness (τC) retrievals. First, we collect liquid water cloud fields obtained from Multi-angle Imaging Spectroradiometer (MISR) measurements to construct realistic probability distribution functions (PDFs) of 3D cloud anisotropy (a measure of the degree to which clouds depart from plane-parallel) for different ISCCP cloud types. Next, we will conduct a theoretical study with dynamically simulated cloud fields and a 3D radiative transfer model to determine the relationship between 3D cloud anisotropy and 3D τC bias for each cloud type. Combining these results provides distributions of 3D τC bias by cloud type. Finally, we will estimate the change in frequency of occurrence of cloud types between two decades and will have the information needed to calculate the total change in 3D optical thickness bias between two decades. If we uncover aliases in this study, the results will motivate the development and rigorous testing of climate-specific cloud retrieval algorithms.

  19. Quantifying Uncertainties in Tephra Thickness and Volume Estimates

    NASA Astrophysics Data System (ADS)

    Engwell, S. L.; Aspinall, W.; Sparks, R. S.

    2013-12-01

    Characterization of explosive volcanic eruptive processes from interpretations of deposits is a key to assessing long-term volcanic hazards and risks, particularly for large explosive eruptions which occur relatively infrequently and others whose deposits, particularly distal deposits, are transient in the geological record. Whilst eruption size - determined by measurement and interpretation of tephra fall deposits - is of particular importance, uncertainties for such measurements and volume estimates are rarely presented. In this study, we quantify the main sources of variance in determining tephra volume from thickness measurements and isopachs in terms of number and spatial distribution of such measurements, using the Fogo A deposit, São Miguel, Azores as an example. Analysis of Fogo A fall deposits shows measurement uncertainties are approximately 9% of measured thickness, while uncertainty associated with natural deposit variability ranges between 10% and 40% of average thickness, with an average variation of 30%. Correlations between measurement uncertainties and natural deposit variability are complex and depend on a unit's thickness, position within a succession, distance from source and local topography. The degree to which thickness measurement errors impact on volume uncertainty depends on the number of measurements in a given dataset and their associated individual uncertainties. For Fogo A, the consequent uncertainty in volume associated with thickness measurement uncertainty is 1.3%, equivalent to a volume uncertainty of about 0.02 km3 on the estimated 1.5 km3 volume. Uncertainties also arise in producing isopach maps: the spatial relationships between source location and different deposit thicknesses are described by contours subjectively drawn to encompass measurements of a given thickness, generally by eye. Recent advances in volume estimation techniques involve the application of mathematical models directly to tephra thickness data. Here, uncertainties in tephra volumes derived from isopach maps were investigated by modelling raw thickness data as bicubic splines under tension. In this way, isopachs are objectively determined in relation to the original data. This enables limitations in volume estimates to be identified in previously published maps where a mathematically formal fitting procedure was not used. Eruption volumes derived using these spline isopachs are in general smaller than published traditional estimates. Using the bicubic spline method, volume uncertainties are correlated with the number of data points and decrease from as much as 40% relative to the mean estimate for a case with 30 measurements to 10% when 120 measurements or more are available. Thus the accuracy of volume estimation using this method depends on the number of data points, their spatial distribution and their associated measurement uncertainties, and the estimate reliability can be fully quantified on these terms; comprehensive uncertainty assessment is not feasible for most conventional tephra volume estimates determined using hand-drawn isopachs.
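
    A hedged sketch of the spline-based volume estimate described above, using a generic smoothing bivariate spline (not the authors' splines under tension) on synthetic thickness data:

    ```python
    # Fit a smooth surface to scattered tephra thickness measurements and
    # integrate it numerically; synthetic data stand in for field measurements.
    import numpy as np
    from scipy.interpolate import SmoothBivariateSpline

    rng = np.random.default_rng(42)
    x, y = rng.uniform(-20, 20, 60), rng.uniform(-20, 20, 60)   # site coords, km
    thickness = 5.0 * np.exp(-np.hypot(x, y) / 8.0)             # metres, ideal
    thickness *= rng.normal(1.0, 0.09, 60)                      # ~9% measurement noise

    # Smoothing factor chosen near the expected squared misfit (assumption).
    spline = SmoothBivariateSpline(x, y, thickness, kx=3, ky=3, s=2.0)

    # Integrate thickness over the mapped area on a fine grid, clipping the
    # small negative values a smoothing spline can produce where data are sparse.
    gx = np.linspace(-20, 20, 400)
    gy = np.linspace(-20, 20, 400)
    T = np.clip(spline(gx, gy), 0.0, None)                      # metres
    cell = (gx[1] - gx[0]) * (gy[1] - gy[0])                    # km^2
    volume_km3 = T.sum() * cell * 1e-3                          # m * km^2 -> km^3
    print(f"volume ~ {volume_km3:.2f} km^3")
    ```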

  20. Liquid Crystal Research Shows Deformation By Drying

    NASA Technical Reports Server (NTRS)

    2003-01-01

    These images, from David Weitz's liquid crystal research, show ordered uniform sized droplets (upper left) before they are dried from their solution. After the droplets are dried (upper right), they are viewed with crossed polarizers that show the deformation caused by drying, a process that orients the bipolar structure of the liquid crystal within the droplets. When an electric field is applied to the dried droplets (lower left), and then increased (lower right), the liquid crystal within the droplets switches its alignment, thereby reducing the amount of light that can be scattered by the droplets when a beam is shone through them.

  1. Quantifying momenta through the Fourier transform

    E-print Network

    Rodríguez-Lara, B. M.

    2011-01-01

    Integral transforms arising from the separable solutions to the Helmholtz differential equation are presented. Pairs of these integral transforms are related via the Plancherel theorem and, ultimately, any of these integral transforms may be calculated using only Fourier transforms. This result is used to evaluate the mean value of momenta associated with the symmetries of the reduced wave equation. As an explicit example, the orbital angular momenta of plane and elliptic-cylindrical waves are presented.

  2. Quantifying oil filtration effects on bearing life

    Microsoft Academic Search

    W. M. Needelman; E. V. Zaretsky

    1991-01-01

    Rolling-element bearing life is influenced by the number, size, and material properties of particles entering the Hertzian contact of the rolling element and raceway. In general, rolling-element bearing life increases with increasing level of oil filtration. Based upon test results, two equations are presented which allow for the adjustment of bearing L10 or catalog life based upon oil filter rating.

  3. Quantifying protein diffusion and capture on filaments

    E-print Network

    Emanuel Reithmann; Louis Reese; Erwin Frey

    2015-03-03

    The functional relevance of regulating proteins is often limited to specific binding sites such as the ends of microtubules or actin-filaments. A localization of proteins on these functional sites is of great importance. We present a quantitative theory for a diffusion and capture process, where proteins diffuse on a filament and stop diffusing when reaching the filament's end. It is found that end-association after one-dimensional diffusion is the main source for tip-localization of such proteins. As a consequence, diffusion and capture is highly efficient in enhancing the reaction velocity of enzymatic reactions, where proteins and filament ends are to each other as enzyme and substrate. We show that the reaction velocity can effectively be described within a Michaelis-Menten framework. Together one-dimensional diffusion and capture beats the (three-dimensional) Smoluchowski diffusion limit for the rate of protein association to filament ends.
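
    The Michaelis-Menten framework invoked above takes the standard form (notation assumed):

    ```latex
    % Reaction velocity as a function of free-protein ("substrate") density \rho,
    % with filament ends playing the role of the enzyme:
    v = \frac{v_{\max}\, \rho}{K_M + \rho}
    % Diffusion and capture raises the effective association rate beyond the
    % 3D Smoluchowski limit k_S = 4 \pi D a
    % (D: diffusion constant, a: target size).
    ```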

  4. Comparing 3D Gyrification Index and area-independent curvature-based measures in quantifying neonatal brain folding

    NASA Astrophysics Data System (ADS)

    Rodriguez-Carranza, Claudia E.; Mukherjee, P.; Vigneron, Daniel; Barkovich, James; Studholme, Colin

    2007-03-01

    In this work we compare the 3D Gyrification Index and our recently proposed area-independent curvature-based surface measures [26] for the in-vivo quantification of brain surface folding in clinically acquired neonatal MR image data. A meaningful comparison of gyrification across brains of different sizes and their subregions will only be possible through the quantification of folding with measures that are independent of the area of the region of analysis. This work uses a 3D implementation of the classical Gyrification Index, a 2D measure that quantifies folding based on the ratio of the inner and outer contours of the brain and which has been used to study gyral patterns in adults with schizophrenia, among other conditions. The new surface curvature-based measures and the 3D Gyrification Index were calculated on twelve premature infants (age 28-37 weeks) from which surfaces of the cerebrospinal fluid/gray matter (CSF/GM) interface and gray matter/white matter (GM/WM) interface were extracted. Experimental results show that our measures better quantify folding on the CSF/GM interface than the Gyrification Index, and perform similarly on the GM/WM interface.
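
    For reference, the classical Gyrification Index and its 3D analogue follow standard definitions (notation assumed):

    ```latex
    % Classical (2D) Gyrification Index on a coronal section, and the
    % surface-based 3D analogue:
    GI_{2D} = \frac{L_{\text{inner (pial) contour}}}{L_{\text{outer hull contour}}},
    \qquad
    GI_{3D} = \frac{A_{\text{cortical surface}}}{A_{\text{outer hull surface}}}
    % Curvature-based alternatives instead integrate functions of the principal
    % curvatures k_1, k_2 over the surface, normalised to be area-independent.
    ```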

  5. Effective rates of heavy metal release from alkaline wastes — Quantified by column outflow experiments and inverse simulations

    NASA Astrophysics Data System (ADS)

    Wehrer, Markus; Totsche, Kai Uwe

    2008-10-01

    Column outflow experiments operated at steady-state flow conditions do not allow the identification of rate-limited release processes. This requires an alternative experimental methodology. In this study, the aim was to apply such a methodology in order to identify and quantify effective release rates of heavy metals from granular wastes. Column experiments were conducted with demolition waste and municipal solid waste incineration (MSWI) bottom ash using different flow velocities and multiple flow interruptions. The effluent was analyzed for heavy metals, DOC, electrical conductivity and pH. The breakthrough curves were inversely modeled with a numerical code based on the advection-dispersion equation with first-order mass transfer and nonlinear interaction terms. Chromium, copper, nickel and arsenic are usually released under non-equilibrium conditions. DOC might play a role as a carrier for those trace metals. By inverse simulations, generally good model fits are derived. Although some parameters are correlated and some model deficiencies can be revealed, we are able to deduce physically reasonable release-mass-transfer time scales. Applying forward simulations, the parameter space with equifinal parameter sets was delineated. The results demonstrate that the presented experimental design is capable of identifying and quantifying non-equilibrium conditions. They also show that the possibility of rate-limited release must not be neglected in release and transport studies involving inorganic contaminants.
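
    The transport model described above can be sketched in its generic form; notation is assumed, and the study's actual model adds nonlinear interaction terms to this skeleton.

    ```latex
    % 1-D advection-dispersion equation with first-order (rate-limited) release:
    \frac{\partial C}{\partial t}
      = D \frac{\partial^{2} C}{\partial x^{2}}
      - v \frac{\partial C}{\partial x}
      + \alpha \left( C_{s} - C \right)
    % C: solute concentration in the mobile water, v: pore-water velocity,
    % D: dispersion coefficient, \alpha: first-order mass-transfer rate,
    % C_s: concentration in (possibly nonlinear) equilibrium with the solid.
    ```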

  6. INTRODUCTION Mitotic metaphase chromosomes show sister chromatids

    E-print Network

    Villefranche sur mer

    Meiosis I bivalents, like mitotic chromosomes, show sister-chromatid centromere and arm cohesion; cells release arm cohesion during meiosis I, and then release centromere cohesion during meiosis II (for review see Moore and Orr-Weaver, 1998). Consequently, this sequential loss of cohesion during meiosis might be precisely...

  7. The object-oriented trivia show (TOOTS)

    Microsoft Academic Search

    Jeff Gray; Jules White

    2010-01-01

    OOPSLA has a longstanding tradition of being a forum for discussing the cutting edge of technology in a fun and participatory environment. The type of events sponsored by OOPSLA sometimes border on the unconventional. This event represents an atypical panel that conforms to the concept of a game show that is focused on questions and answers related to SPLASH, OOPSLA,

  8. George Arcement Shows Locations of USGS Streamgages

    USGS Multimedia Gallery

    USGS Louisiana Water Science Center Director George Arcement shows the locations of USGS' streamgage network to WAFB Meteorologist Jay Grymes.  USGS maintains more than 30 real-time streamgages throughout the area affected by the 2011 Flood. In addition, more than 50 non-real-time gages were...

  9. 2014 NORTHWEST MICHIGAN ORCHARD & VINEYARD SHOW

    E-print Network

    Grand Traverse Resort, January 14-15. Program excerpts: Spotted Wing Drosophila and Other Invasive Vineyard Bugs: Should We Be Concerned? (Dr. Rufus Isaacs, MSU Dept. of Entomology); 9:40-10:00: Use of Compost Tea in the Vineyard (Dr. Annemiek Schilder, MSU Dept. of Plant...)

  10. Children's Art Show: An Educational Family Experience

    ERIC Educational Resources Information Center

    Bakerlis, Julienne

    2007-01-01

    In a time of seemingly rampant budget cuts in the arts in school systems throughout the country, a children's art show reaps many rewards. It can strengthen family-school relationships and community ties and stimulate questions and comments about the benefits of art and its significance in the development of young children. In this photo essay of…

  11. Show Them You Really Want the Job

    ERIC Educational Resources Information Center

    Perlmutter, David D.

    2012-01-01

    Showing that one really "wants" the job entails more than just really wanting the job. An interview is part Broadway casting call, part intellectual dating game, part personality test, and part, well, job interview. When there are 300 applicants for a position, many of them will "fit" the required (and even the preferred) skills listed in the job…

  12. Press Release End of Year Show 2013

    E-print Network

    Stell, John

    Press Release, End of Year Show 2013. Published 4th July. Record numbers of guests attended our End of Year Show, which attracted press coverage, featuring live on BBC1's Look North on the Private View evening. Alex Dodgson, BA (Hons) Fine Art third-year student, performed live, using his body as a canvas and encouraging visitors...

  13. SHOW YOUR ARTWORK THIS SPRING: COLOR AND DESIGN

    E-print Network

    Portman, Douglas

    Students are warmly encouraged to submit up to three pieces for consideration. Entries for the show are due on Saturday, March 10, with the submission sheet filled out. Early submissions are encouraged! Contact 276-8959, mag.rochester.edu, magcw@mag.rochester.edu. Please fill out this form and return it by March 10, 2012.

  14. More Dangerous Ebola Strain Unlikely, Study Shows

    MedlinePLUS

    More Dangerous Ebola Strain Unlikely, Study Shows. Researchers compared virus samples taken 9 months apart and found a normal mutation rate. March 26, 2015 (HealthDay News) -- Ebola likely won't mutate into a strain that ...

  15. Quantifying morphology changes in time series data with skew

    E-print Network

    Sung, Phil

    This paper examines strategies to quantify differences in the morphology of time series while accounting for time skew in the observed data. We adapt four measures originally designed for signal shape comparison: Dynamic ...
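
    The truncated list above presumably begins with dynamic time warping (DTW); for illustration, a minimal DTW distance between two time-skewed series:

    ```python
    # Classic O(n*m) dynamic-programming DTW with unit steps.
    import numpy as np

    def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    t = np.linspace(0, 2 * np.pi, 50)
    print(dtw_distance(np.sin(t), np.sin(t + 0.4)))  # small despite the skew
    ```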

  16. Walljet Electrochemistry: Quantifying Molecular Transport through Metallopolymeric and Zirconium

    E-print Network

    ... precision for the two methods. We apply this technique to a system consisting of zirconium phosphonate supramolecular square building blocks (Keefe et al., Adv. Mater. 2003, 15, 1936). The zirconium phosphate...

  17. A NEW METHOD TO QUANTIFY CORE TEMPERATURE INSTABILITY IN RODENTS.

    EPA Science Inventory

    Methods to quantify instability of autonomic systems such as temperature regulation should be important in toxicant and drug safety studies. Stability of core temperature (Tc) in laboratory rodents is susceptible to a variety of stimuli. Calculating the temperature differential o...

  18. Methodology to quantify leaks in aerosol sampling system components

    E-print Network

    Vijayaraghavan, Vishnu Karthik

    2004-11-15

    Thesis committee: ...Farland (Chair of Committee), Dennis O'Neal (Member), Yassin A. Hassan (Member); Dennis O'Neal (Head of Department). August 2003. Major Subject: Mechanical Engineering. ABSTRACT: Methodology to Quantify Leaks in Aerosol Sampling...

  19. Quantifying the evidence for biodiversity effects on ecosystem

    E-print Network

    Chave, Jérôme

    Quantifying the evidence for biodiversity effects on ecosystem functioning and services (Balvanera et al., meta-analysis). 1. What are the most commonly addressed relationships between biodiversity and ecosystem properties? 2. How do the experimental...

  20. Quantifying environmental limiting factors on tree cover using geospatial data.

    PubMed

    Greenberg, Jonathan A; Santos, Maria J; Dobrowski, Solomon Z; Vanderbilt, Vern C; Ustin, Susan L

    2015-01-01

    Environmental limiting factors (ELFs) are the thresholds that determine the maximum or minimum biological response for a given suite of environmental conditions. We asked the following questions: 1) Can we detect ELFs on percent tree cover across the eastern slopes of the Lake Tahoe Basin, NV? 2) How are the ELFs distributed spatially? 3) To what extent are unmeasured environmental factors limiting tree cover? ELFs are difficult to quantify as they require significant sample sizes. We addressed this by using geospatial data over a relatively large spatial extent, where the wall-to-wall sampling ensures the inclusion of rare data points which define the minimum or maximum response to environmental factors. We tested mean temperature, minimum temperature, potential evapotranspiration (PET) and PET minus precipitation (PET-P) as potential limiting factors on percent tree cover. We found that the study area showed system-wide limitations on tree cover, and each of the factors showed evidence of being limiting on tree cover. However, only 1.2% of the total area appeared to be limited by the four (4) environmental factors, suggesting other unmeasured factors are limiting much of the tree cover in the study area. Where sites were near their theoretical maximum, non-forest sites (tree cover < 25%) were primarily limited by cold mean temperatures, open-canopy forest sites (tree cover between 25% and 60%) were primarily limited by evaporative demand, and closed-canopy forests were not limited by any particular environmental factor. The detection of ELFs is necessary in order to fully understand the width of limitations that species experience within their geographic range. PMID:25692604

  1. Quantifying oil filtration effects on bearing life

    NASA Technical Reports Server (NTRS)

    Needelman, William M.; Zaretsky, Erwin V.

    1991-01-01

    Rolling-element bearing life is influenced by the number, size, and material properties of particles entering the Hertzian contact of the rolling element and raceway. In general, rolling-element bearing life increases with increasing level of oil filtration. Based upon test results, two equations are presented which allow for the adjustment of bearing L10 or catalog life based upon oil filter rating. It is recommended that where no oil filtration is used catalog life be reduced by 50 percent.

  2. Improved methods for quantifying potential nutrient interception by riparian buffers

    Microsoft Academic Search

    Matthew E. Baker; Donald E. Weller; Thomas E. Jordan

    2006-01-01

    Efforts to quantify the effects of riparian buffers on watershed nutrient discharges have been confounded by a commonly used analysis, which estimates buffer potential as the percentage of forest or wetland within a fixed distance of streams. Effective landscape metrics must instead be developed based on a clear conceptual model and quantified at the appropriate spatial scale. We develop new...

  3. Quantifiable Software Architecture for Dependable Systems of Systems

    Microsoft Academic Search

    Sheldon X. Liang; Joseph F. Puett; Luqi

    2003-01-01

    Software architecture is a critical aspect in the successful development and evolution of dependable systems of systems (DSoS), because it provides artifactual loci around which engineers can reason, construct, and evolve the software design to provide robustness and resilience. Quantifiably architecting DSoS involves establishing a consensus of attributes of dependability (from different stakeholders’ perspectives) and translating them into quantifiable constraints.

  4. Quantifying the natural history of breast cancer

    PubMed Central

    Tan, K H X; Simonella, L; Wee, H L; Roellin, A; Lim, Y-W; Lim, W-Y; Chia, K S; Hartman, M; Cook, A R

    2013-01-01

    Background: Natural history models of breast cancer progression provide an opportunity to evaluate and identify optimal screening scenarios. This paper describes a detailed Markov model characterising breast cancer tumour progression. Methods: Breast cancer is modelled by a 13-state continuous-time Markov model. The model differentiates between indolent and aggressive ductal carcinomas in situ tumours, and aggressive tumours of different sizes. We compared such aggressive cancers, that is, those which are non-indolent, to those which are non-growing and regressing. Model input parameters and structure were informed by the 1978–1984 Ostergotland county breast screening randomised controlled trial. Overlaid on the natural history model is the effect of screening on diagnosis. Parameters were estimated using Bayesian methods. Markov chain Monte Carlo integration was used to sample the resulting posterior distribution. Results: The breast cancer incidence rate in the Ostergotland population was 21 (95% CI: 17–25) per 10 000 woman-years. Accounting for length-biased sampling, an estimated 91% (95% CI: 85–97%) of breast cancers were aggressive. Larger tumours, 21–50 mm, had an average sojourn of 6 years (95% CI: 3–16 years), whereas aggressive ductal carcinomas in situ took around half a month (95% CI: 0–1 month) to progress to the invasive ≤10 mm state. Conclusion: These tumour progression rate estimates may facilitate future work analysing cost-effectiveness and quality-adjusted life years for various screening strategies. PMID:24084766

  5. Quantifying data worth toward reducing predictive uncertainty

    USGS Publications Warehouse

    Dausman, A.M.; Doherty, J.; Langevin, C.D.; Sukop, M.C.

    2010-01-01

    The present study demonstrates a methodology for optimization of environmental data acquisition. Based on the premise that the worth of data increases in proportion to its ability to reduce the uncertainty of key model predictions, the methodology can be used to compare the worth of different data types, gathered at different locations within study areas of arbitrary complexity. The method is applied to a hypothetical nonlinear, variable density numerical model of salt and heat transport. The relative utilities of temperature and concentration measurements at different locations within the model domain are assessed in terms of their ability to reduce the uncertainty associated with predictions of movement of the salt water interface in response to a decrease in fresh water recharge. In order to test the sensitivity of the method to nonlinear model behavior, analyses were repeated for multiple realizations of system properties. Rankings of observation worth were similar for all realizations, indicating robust performance of the methodology when employed in conjunction with a highly nonlinear model. The analysis showed that while concentration and temperature measurements can both aid in the prediction of interface movement, concentration measurements, especially when taken in proximity to the interface at locations where the interface is expected to move, are of greater worth than temperature measurements. Nevertheless, it was also demonstrated that pairs of temperature measurements, taken in strategic locations with respect to the interface, can also lead to more precise predictions of interface movement. Journal compilation © 2010 National Ground Water Association.

  6. Stretching DNA to quantify nonspecific protein binding

    NASA Astrophysics Data System (ADS)

    Goyal, Sachin; Fountain, Chandler; Dunlap, David; Family, Fereydoon; Finzi, Laura

    2012-07-01

    Nonspecific binding of regulatory proteins to DNA can be an important mechanism for target search and storage. This seems to be the case for the lambda repressor protein (CI), which maintains lysogeny after infection of E. coli. CI binds specifically at two distant regions along the viral genome and induces the formation of a repressive DNA loop. However, single-molecule imaging as well as thermodynamic and kinetic measurements of CI-mediated looping show that CI also binds to DNA nonspecifically and that this mode of binding may play an important role in maintaining lysogeny. This paper presents a robust phenomenological approach using a recently developed method based on the partition function, which allows calculation of the number of proteins bound nonspecifically to DNA from measurements of the DNA extension as a function of applied force. This approach was used to analyze several cycles of extension and relaxation of λ DNA performed at several CI concentrations to measure the dissociation constant for nonspecific binding of CI (~100 nM), and to obtain a measurement of the induced DNA compaction (~10%) by CI.

  7. Quantifying Antigenic Relationships among the Lyssaviruses?

    PubMed Central

    Horton, D. L.; McElhinney, L. M.; Marston, D. A.; Wood, J. L. N.; Russell, C. A.; Lewis, N.; Kuzmin, I. V.; Fouchier, R. A. M.; Osterhaus, A. D. M. E.; Fooks, A. R.; Smith, D. J.

    2010-01-01

    All lyssaviruses cause fatal encephalitis in mammals. There is sufficient antigenic variation within the genus to cause variable vaccine efficacy, but this variation is difficult to characterize quantitatively: sequence analysis cannot yet provide detailed antigenic information, and antigenic neutralization data have been refractory to high-resolution robust interpretation. Here, we address these issues by using state-of-the-art antigenic analyses to generate a high-resolution antigenic map of a global panel of 25 lyssaviruses. We compared the calculated antigenic distances with viral glycoprotein ectodomain sequence data. Although 67% of antigenic variation was predictable from the glycoprotein amino acid sequence, there are in some cases substantial differences between genetic and antigenic distances, thus highlighting the risk of inferring antigenic relationships solely from sequence data at this time. These differences included epidemiologically important antigenic differences between vaccine strains and wild-type rabies viruses. Further, we quantitatively assessed the antigenic relationships measured by using rabbit, mouse, and human sera, validating the use of nonhuman experimental animals as a model for determining antigenic variation in humans. The use of passive immune globulin is a crucial component of rabies postexposure prophylaxis, and here we also show that it is possible to predict the reactivity of immune globulin against divergent lyssaviruses. PMID:20826698

  8. Quantifying NOx for industrial combustion processes.

    PubMed

    Baukal, C E; Eleazer, P B

    1998-01-01

    The objectives of this paper are to (1) identify the problems with many of the units that are used to report and regulate NOx, (2) show how to properly correct NOx measurements for oxygen-enhanced combustion, and (3) recommend a preferred type of NOx unit. The current variety of NOx units makes comparisons difficult and can cause considerable confusion. NOx may be measured on a wet or dry basis, but it is commonly reported on a dry basis. The reported NOx may differ from the actual measurements, which may be converted to a specific O2 basis level. Nearly all of the measured NOx from industrial combustion systems is in the form of NO, which is converted to NO2 in the atmosphere. However, when given on a mass basis, the measured NO is commonly reported as NO2 for regulatory purposes, but may be reported as NO, NO2, or simply NOx in technical papers. Some existing regulations may penalize combustion technologies with higher efficiencies and lower flue gas volumes, such as oxygen-enhanced combustion. Confusion may occur when applying some of the "conventional" NOx units to oxygen-enhanced processes. A better unit is the mass of NOx generated per unit of production, which also incorporates the overall process efficiency into the emissions. That unit does not penalize more efficient processes that may generate more NOx on a volume basis, but less NOx on a production basis. PMID:15655998
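
    The O2 correction discussed above is conventionally the dilution formula below (dry basis, 20.9% ambient O2); whether this simple form remains meaningful under strong oxygen enrichment is precisely the paper's concern.

    ```python
    # Conventional correction of a measured NOx concentration to a reference
    # O2 level (dry-basis percentages); standard air-combustion practice.
    def nox_at_reference_o2(nox_ppm: float, o2_meas_pct: float,
                            o2_ref_pct: float = 3.0) -> float:
        return nox_ppm * (20.9 - o2_ref_pct) / (20.9 - o2_meas_pct)

    # 80 ppm measured at 6% O2, reported at the common 3% O2 reference:
    print(round(nox_at_reference_o2(80.0, 6.0), 1))  # 96.1 ppm
    ```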

  9. Populations of Monarch butterflies with different migratory behaviors show divergence in wing morphology.

    PubMed

    Altizer, Sonia; Davis, Andrew K

    2010-04-01

    The demands of long-distance flight represent an important evolutionary force operating on the traits of migratory species. Monarchs are widespread butterflies known for their annual migrations in North America. We examined divergence in wing morphology among migratory monarchs from eastern and western N. America, and nonmigratory monarchs in S. Florida, Puerto Rico, Costa Rica, and Hawaii. For the three N. American populations, we also examined monarchs reared in four common environment experiments. We used image analysis to measure multiple traits including forewing area and aspect ratio; for laboratory-reared monarchs we also quantified body area and wing loading. Results showed wild monarchs from all nonmigratory populations were smaller than those from migratory populations. Wild and captive-reared eastern monarchs had the largest and most elongated forewings, whereas monarchs from Puerto Rico and Costa Rica had the smallest and roundest forewings. Eastern monarchs also had the largest bodies and high measures of wing loading, whereas western and S. Florida monarchs had less elongated forewings and smaller bodies. Among captive-reared butterflies, family-level effects provided evidence that genetic factors contributed to variation in wing traits. Collectively, these results support evolutionary responses to long-distance flight in monarchs, with implications for the conservation of phenotypically distinct wild populations. PMID:20067519

  10. New Drug Shows Promise for MS

    MedlinePLUS

    ... drug appears to repair nerve damage seen in multiple sclerosis (MS) patients, results of an early trial suggest. ... those with optic neuritis go on to develop multiple sclerosis. Patients who received the experimental antibody had greater ...

  11. Quantifying Pollutant Emissions from Office Equipment Phase IReport

    SciTech Connect

    Maddalena, R.L.; Destaillats, H.; Hodgson, A.T.; McKone, T.E.; Perino, C.

    2006-12-01

    Although office equipment has been a focal point for governmental efforts to promote energy efficiency through programs such as Energy Star, little is known about the relationship between office equipment use and indoor air quality. This report provides results of the first phase (Phase I) of a study in which the primary objective is to measure emissions of organic pollutants and particulate matter from a selected set of office equipment typically used in residential and office environments. The specific aims of the overall research effort are: (1) use screening-level measurements to identify and quantify the concentrations of air pollutants of interest emitted by major categories of distributed office equipment in a controlled environment; (2) quantify the emissions of air pollutants from generally representative, individual machines within each of the major categories in a controlled chamber environment using well defined protocols; (3) characterize the effects of ageing and use on emissions for individual machines spanning several categories; (4) evaluate the importance of operational factors that can be manipulated to reduce pollutant emissions from office machines; and (5) explore the potential relationship between energy consumption and pollutant emissions for machines performing equivalent tasks. The study includes desktop computers (CPU units), computer monitors, and three categories of desktop printing devices. The printer categories are: (1) printers and multipurpose devices using color inkjet technology; (2) low- to medium output printers and multipurpose devices employing monochrome or color laser technology; and (3) high-output monochrome and color laser printers. The literature review and screening level experiments in Phase 1 were designed to identify substances of toxicological significance for more detailed study. In addition, these screening level measurements indicate the potential relative importance of different categories of office equipment with respect to human exposures. The more detailed studies of the next phase of research (Phase II) are meant to characterize changes in emissions with time and may identify factors that can be modified to reduce emissions. These measurements may identify 'win-win' situations in which low energy consumption machines have lower pollutant emissions. This information will be used to compare machines to determine if some are substantially better than their peers with respect to their emissions of pollutants.

  12. Can satellite-derived aerosol optical depth quantify the surface aerosol radiative forcing?

    NASA Astrophysics Data System (ADS)

    Xu, Hui; Ceamanos, Xavier; Roujean, Jean-Louis; Carrer, Dominique; Xue, Yong

    2014-12-01

    Aerosols play an important role in the climate of the Earth through aerosol radiative forcing (ARF). Nowadays, aerosol particles are detected, quantified and monitored by remote sensing techniques using low Earth orbit (LEO) and geostationary (GEO) satellites. In the present article, the use of satellite-derived AOD (aerosol optical depth) products is investigated in order to quantify on a daily basis the ARF at the surface level (SARF). By daily basis we mean that an average SARF value is computed every day based upon the available AOD satellite measurements for each station. In the first part of the study, the performance of four state-of-the-art AOD products (MODIS-DT, MODIS-DB, MISR, and SEVIRI) is assessed through comparison against ground-based AOD measurements from 24 AERONET stations located in Europe and Africa during a 6-month period. While all AOD products are found to be comparable in terms of measured value (RMSE of 0.1 for low and average AOD values), a higher number of AOD estimates is made available by GEO satellites due to their enhanced frequency of scan. Experiments show a generally lower agreement of AOD estimates over the African sites (RMSE of 0.2), which show the highest aerosol concentrations along with the occurrence of dust aerosols, coarse particles, and bright surfaces. In the second part of this study, the lessons learned about the confidence in aerosol burden derived from satellites are used to estimate SARF under clear-sky conditions. While the use of AOD products issued from GEO observations like SEVIRI brings improvement in the SARF estimates with regard to LEO-based AOD products, the resulting absolute bias (13 W/m2 on average when AERONET AOD is used as reference) is judged to be still high in comparison with the average values of SARF found in this study (from -25 W/m2 to -43 W/m2) and also in the literature (from -10 W/m2 to -47 W/m2).

  13. Virtual environment to quantify the influence of colour stimuli on the performance of tasks requiring attention

    PubMed Central

    2011-01-01

    Background: Recent studies indicate that blue-yellow colour discrimination is impaired in ADHD individuals. However, the relationship between colour and performance has not been investigated. This paper describes the development and testing of a virtual environment capable of quantifying the influence of red-green versus blue-yellow colour stimuli on the performance of people in a fun and interactive way, appropriate for the target audience. Methods: An interactive computer game based on virtual reality was developed to evaluate the performance of the players. The game's storyline follows an old pirate who runs across islands and dangerous seas in search of a lost treasure. Within the game, the player must find and interpret the hints scattered in different scenarios. Two versions of this game were implemented. In the first, hints and information boards were painted using red and green colours. In the second version, these objects were painted using blue and yellow colours. The three-dimensional computer graphics tool Blender 3D was used for modelling, texturing, and animating virtual characters and objects. The textures were created with the GIMP editor to provide visual effects, increasing the realism and immersion of the players. The games were tested on 20 non-ADHD volunteers, divided into two subgroups (A1 and A2), and 20 volunteers with ADHD, divided into subgroups B1 and B2. Subgroups A1 and B1 used the first version of the game with the hints painted in green-red colours, and subgroups A2 and B2 the second version with the same hints painted in blue-yellow. The time spent to complete each task of the game was measured. Results: Data analyzed with a two-way ANOVA and post hoc Tukey LSD tests showed that the use of blue/yellow instead of green/red colours decreased the game performance of all participants. However, a greater decrease in performance was observed in ADHD participants, where tasks that require attention were most affected. Conclusions: The game proved to be a user-friendly tool capable of detecting and quantifying the influence of colour on the performance of people executing tasks that require attention, and proved attractive for people with ADHD. PMID:21854630

  14. Do dogs ( Canis familiaris ) show contagious yawning?

    Microsoft Academic Search

    Aimee L. Harr; Valerie R. Gilbert; Kimberley A. Phillips

    2009-01-01

    We report an experimental investigation into whether domesticated dogs display contagious yawning. Fifteen dogs were shown video clips of (1) humans and (2) dogs displaying yawns and open-mouth expressions (not yawns) to investigate whether dogs showed contagious yawning to either of these social stimuli. Only one dog performed significantly more yawns during or shortly after viewing yawning videos than to...

  15. Learning helicopter control through “teaching by showing”

    Microsoft Academic Search

    James F. Montgomery; George A. Bekey

    1998-01-01

    A model-free “teaching by showing” methodology is developed to train a fuzzy-neural controller for an autonomous robot helicopter. The controller is generated and tuned using training data gathered while a teacher operates the helicopter. A hierarchical behavior-based control architecture is used, with each behavior implemented as a hybrid fuzzy logic controller (FLC) and general regression neural network controller (GRNNC). The

  16. Map showing depth to bedrock, Anchorage, Alaska

    USGS Publications Warehouse

    Glass, R.L.

    1988-01-01

    Knowledge of the physical and hydrologic characteristics of geologic materials is useful in determining the availability of groundwater for public and domestic supply and the suitability of areas for on-site septic systems. A generalized map of the Anchorage area shows the approximate distance from land surface to the top of the bedrock surface. Four depth zones are shown. The depths were determined from lithologic data contained in drillers' logs. (USGS)

  17. Quantifying protein–protein interactions in high throughput using protein domain microarrays

    PubMed Central

    Kaushansky, Alexis; Allen, John E; Gordus, Andrew; Stiffler, Michael A; Karp, Ethan S; Chang, Bryan H; MacBeath, Gavin

    2011-01-01

    Protein microarrays provide an efficient way to identify and quantify protein–protein interactions in high throughput. One drawback of this technique is that proteins show a broad range of physicochemical properties and are often difficult to produce recombinantly. To circumvent these problems, we have focused on families of protein interaction domains. Here we provide protocols for constructing microarrays of protein interaction domains in individual wells of 96-well microtiter plates, and for quantifying domain–peptide interactions in high throughput using fluorescently labeled synthetic peptides. As specific examples, we will describe the construction of microarrays of virtually every human Src homology 2 (SH2) and phosphotyrosine binding (PTB) domain, as well as microarrays of mouse PDZ domains, all produced recombinantly in Escherichia coli. For domains that mediate high-affinity interactions, such as SH2 and PTB domains, equilibrium dissociation constants (KDs) for their peptide ligands can be measured directly on arrays by obtaining saturation binding curves. For weaker binding domains, such as PDZ domains, arrays are best used to identify candidate interactions, which are then retested and quantified by fluorescence polarization. Overall, protein domain microarrays provide the ability to rapidly identify and quantify protein–ligand interactions with minimal sample consumption. Because entire domain families can be interrogated simultaneously, they provide a powerful way to assess binding selectivity on a proteome-wide scale and provide an unbiased perspective on the connectivity of protein–protein interaction networks. PMID:20360771
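
    A minimal sketch of the saturation-binding fit implied above, F = Fmax·[L]/(KD + [L]), assuming scipy; the peptide concentrations and fluorescence readings are illustrative, not array data.

    ```python
    # Fit a one-site saturation binding curve to recover KD.
    import numpy as np
    from scipy.optimize import curve_fit

    def saturation(conc, f_max, kd):
        return f_max * conc / (kd + conc)

    conc = np.array([0.01, 0.05, 0.1, 0.5, 1.0, 5.0])       # peptide, uM (made up)
    signal = np.array([90, 380, 640, 1700, 2100, 2600.0])   # fluorescence, a.u.

    (f_max, kd), _ = curve_fit(saturation, conc, signal, p0=[3000.0, 0.5])
    print(f"KD ~ {kd:.2f} uM, Fmax ~ {f_max:.0f}")
    ```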

  18. Quantifying the Consistency of Scientific Databases

    PubMed Central

    Šubelj, Lovro; Bajec, Marko; Mileva Boshkoska, Biljana; Kastrin, Andrej; Levnajić, Zoran

    2015-01-01

    Science is a social process with far-reaching impact on our modern society. In recent years, for the first time, we are able to study science itself scientifically. This is enabled by the massive amounts of data on scientific publications that are increasingly becoming available. The data are contained in several databases, such as Web of Science or PubMed, maintained by various public and private entities. Unfortunately, these databases are not always consistent, which considerably hinders such study. Relying on the powerful framework of complex networks, we conduct a systematic analysis of the consistency among six major scientific databases. We found that identifying a single "best" database is far from easy. Nevertheless, our results indicate appreciable differences in the mutual consistency of different databases, which we interpret as recipes for future bibliometric studies. PMID:25984946

  19. Quantifying uncertainties in wind energy assessment

    NASA Astrophysics Data System (ADS)

    Patlakas, Platon; Galanis, George; Kallos, George

    2015-04-01

    The constant rise of wind energy production and its penetration into global energy markets during recent decades have led to the selection of new sites that present various types of problems. Such problems arise from the variability and uncertainty of wind speed. Studying the lower and upper tails of the wind speed distribution can support the quantification of these uncertainties. Approaches focused on extreme wind conditions, or on periods below the energy production threshold, are necessary for better management of operations. Towards this direction, different methodologies are presented for the credible evaluation of potentially non-frequent/extreme values of these environmental conditions. The approaches used take into consideration the structural design of the wind turbines according to their lifespan, turbine failures, the time needed for repairs, and the energy production distribution. In this work, a multi-parametric approach for studying extreme wind speed values is discussed, based on tools of Extreme Value Theory. In particular, the study focuses on extreme wind speed return periods and the persistence of no energy production, based on a 10-year hindcast dataset from a weather modeling system. More specifically, two methods (Annual Maxima and Peaks Over Threshold) were used for the estimation of extreme wind speeds and their recurrence intervals. Additionally, two methodologies (intensity given duration and duration given intensity, both based on the Annual Maxima method) were applied to calculate the duration of extreme events, combined with their intensity as well as the event frequency. The results show that the proposed approaches converge, at least in their main findings, for each case. It is also remarkable that, despite the moderate wind speed climate of the area, several consecutive days of no energy production are observed.
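
    A hedged sketch of the Annual Maxima step: fit a generalized extreme value distribution to yearly wind-speed maxima and read off return levels. Assumes scipy; the synthetic maxima below merely stand in for the hindcast dataset.

    ```python
    # Annual Maxima method: GEV fit + return levels for chosen return periods.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(0)
    annual_max = 18 + 3 * rng.gumbel(size=30)   # 30 years of synthetic maxima (m/s)

    shape, loc, scale = genextreme.fit(annual_max)
    for T in (10, 50, 100):                     # return periods in years
        level = genextreme.isf(1.0 / T, shape, loc=loc, scale=scale)
        print(f"{T:>3}-yr return level: {level:.1f} m/s")
    ```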

  20. Quantifying uncertainty in brain network measures using Bayesian connectomics

    PubMed Central

    Janssen, Ronald J.; Hinne, Max; Heskes, Tom; van Gerven, Marcel A. J.

    2014-01-01

    The wiring diagram of the human brain can be described in terms of graph measures that characterize structural regularities. These measures require an estimate of whole-brain structural connectivity for which one may resort to deterministic or thresholded probabilistic streamlining procedures. While these procedures have provided important insights about the characteristics of human brain networks, they ultimately rely on unwarranted assumptions such as those of noise-free data or the use of an arbitrary threshold. Therefore, resulting structural connectivity estimates as well as derived graph measures fail to fully take into account the inherent uncertainty in the structural estimate. In this paper, we illustrate an easy way of obtaining posterior distributions over graph metrics using Bayesian inference. It is shown that this posterior distribution can be used to quantify uncertainty about graph-theoretical measures at the single subject level, thereby providing a more nuanced view of the graph-theoretical properties of human brain connectivity. We refer to this model-based approach to connectivity analysis as Bayesian connectomics. PMID:25339896
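
    One way to read "posterior distributions over graph metrics": each posterior sample of the connectivity matrix yields one value of the metric, so the metric acquires a distribution. The sketch below substitutes Bernoulli edge probabilities for the paper's diffusion-MRI posterior, so it illustrates the bookkeeping only.

    ```python
    # Distribution of a graph metric over sampled networks.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(8)
    n = 30
    edge_prob = np.triu(rng.beta(1, 4, size=(n, n)), k=1)  # stand-in edge posteriors

    samples = []
    for _ in range(200):                        # one graph per posterior draw
        adj = (rng.random((n, n)) < edge_prob).astype(int)
        g = nx.from_numpy_array(adj)
        samples.append(nx.average_clustering(g))

    print(f"posterior clustering: {np.mean(samples):.3f} +/- {np.std(samples):.3f}")
    ```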

  1. A diagnostic for quantifying heat flux from a thermite spray

    SciTech Connect

    E. P. Nixon; M. L. Pantoya; D. J. Prentice; E. D. Steffler; M. A. Daniels; S. P. D'Arche

    2010-02-01

    Characterizing the combustion behaviors of energetic materials requires diagnostic tools that are often not readily or commercially available. For example, a jet of thermite spray provides a high temperature and pressure reaction that can also be highly corrosive and promote undesirable conditions for the survivability of any sensor. Developing a diagnostic to quantify heat flux from a thermite spray is the objective of this study. Quick response sensors such as thin film heat flux sensors cannot survive the harsh conditions of the spray, but more rugged sensors lack the response time for the resolution desired. A sensor that will allow for adequate response time while surviving the entire test duration was constructed. The sensor outputs interior temperatures of the probes at known locations and utilizes an inverse heat conduction code to calculate heat flux values. The details of this device are discussed and illustrated. Temperature and heat flux measurements of various thermite sprays are reported. Results indicate that this newly designed heat flux sensor provides quantitative data with good repeatability suitable for characterizing energetic material combustion.
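
    The probes report interior temperatures that feed an inverse heat conduction code. As a crude steady-state stand-in (not the paper's method), Fourier's law between two thermocouples at known depths gives a flux estimate; all values here are illustrative.

    ```python
    # Steady-state flux from two interior temperature readings (Fourier's law).
    k = 16.0                  # W/(m K), probe-body conductivity (assumed)
    x1, x2 = 0.002, 0.006     # thermocouple depths below the exposed face (m)
    t1, t2 = 850.0, 410.0     # measured temperatures (K) at x1 and x2

    q = k * (t1 - t2) / (x2 - x1)   # W/m^2, flux conducted into the probe
    print(f"heat flux ~ {q / 1e6:.2f} MW/m^2")
    ```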

  2. Quantifying Potential Error in Painting Breast Excision Specimens

    PubMed Central

    Godden, Amy

    2013-01-01

    Aim. When excision margins are close or involved following breast conserving surgery, many surgeons will attempt to reexcise the corresponding cavity margin. Margins are ascribed to breast specimens such that six faces are identifiable to the pathologist, a process that may be prone to error at several stages. Methods. An experimental model was designed according to stated criteria in order to answer the research question. Computer software was used to measure the surface areas of experimental surfaces to compare human-painted surfaces with experimental controls. Results. The variability of the hand-painted surfaces was considerable. Thirty percent of hand-painted surfaces were 20% larger or smaller than controls. The mean area of the last surface painted was significantly larger than controls (mean 58996 pixels versus 50096 pixels, CI 1477–16324, P = 0.014). By chance, each of the six volunteers chose to paint the deep surface last. Conclusion. This study is the first to attempt to quantify the extent of human error in marking imaginary boundaries on a breast excision model and suggests that humans do not make these judgements well, raising questions about the safety of targeting single margins at reexcision. PMID:23762569

  3. Quantifying molecular oxygen isotope variations during a Heinrich Stadial

    NASA Astrophysics Data System (ADS)

    Reutenauer, C.; Landais, A.; Blunier, T.; Bréant, C.; Kageyama, M.; Woillez, M.-N.; Risi, C.; Mariotti, V.; Braconnot, P.

    2015-06-01

    δ18O of atmospheric oxygen (δ18Oatm) undergoes millennial-scale variations during the last glacial period, and systematically increases during Heinrich Stadials (HS). Changes in δ18Oatm combine variations in biospheric and water cycle processes, so the identification of the main driver of the millennial variability of δ18Oatm is not straightforward. Here, we quantify the response of δ18Oatm to such millennial events using a freshwater hosing simulation (HS_exp) performed under glacial boundary conditions. Our global approach takes into account the latest estimates of isotope fractionation factors for respiratory and photosynthetic processes and makes use of atmospheric water isotopes and vegetation changes. Our modeling approach reproduces the main observed features of a HS in terms of climatic conditions, vegetation distribution and δ18O of precipitation. We use it to decipher the relative importance of the different processes behind the observed changes in δ18Oatm. The results highlight the dominant role of hydrology on δ18Oatm and confirm that δ18Oatm can be seen as a global integrator of hydrological changes over vegetated areas.

  4. Quantifying the rheological and hemodynamic characteristics of sickle cell anemia.

    PubMed

    Lei, Huan; Karniadakis, George Em

    2012-01-18

    Sickle erythrocytes exhibit abnormal morphology and membrane mechanics under deoxygenated conditions due to the polymerization of hemoglobin S. We employed dissipative particle dynamics to extend a validated multiscale model of red blood cells (RBCs) to represent different sickle cell morphologies based on a simulated annealing procedure and experimental observations. We quantified cell distortion using asphericity and elliptical shape factors, and the results were consistent with a medical image analysis. We then studied the rheology and dynamics of sickle RBC suspensions under constant shear and in a tube. In shear flow, the transition from shear-thinning to shear-independent flow revealed a profound effect of cell membrane stiffening during deoxygenation, with granular RBC shapes leading to the greatest viscosity. In tube flow, the increase of flow resistance by granular RBCs was also greater than the resistance of blood flow with sickle-shape RBCs. However, no occlusion was observed in a straight tube under any conditions unless an adhesive dynamics model was explicitly incorporated into simulations that partially trapped sickle RBCs, which led to full occlusion in some cases. PMID:22339854

  5. Quantifying the Rheological and Hemodynamic Characteristics of Sickle Cell Anemia

    PubMed Central

    Lei, Huan; Karniadakis, George Em

    2012-01-01

    Sickle erythrocytes exhibit abnormal morphology and membrane mechanics under deoxygenated conditions due to the polymerization of hemoglobin S. We employed dissipative particle dynamics to extend a validated multiscale model of red blood cells (RBCs) to represent different sickle cell morphologies based on a simulated annealing procedure and experimental observations. We quantified cell distortion using asphericity and elliptical shape factors, and the results were consistent with a medical image analysis. We then studied the rheology and dynamics of sickle RBC suspensions under constant shear and in a tube. In shear flow, the transition from shear-thinning to shear-independent flow revealed a profound effect of cell membrane stiffening during deoxygenation, with granular RBC shapes leading to the greatest viscosity. In tube flow, the increase of flow resistance by granular RBCs was also greater than the resistance of blood flow with sickle-shape RBCs. However, no occlusion was observed in a straight tube under any conditions unless an adhesive dynamics model was explicitly incorporated into simulations that partially trapped sickle RBCs, which led to full occlusion in some cases. PMID:22339854

  6. Quantifying interictal metabolic activity in human temporal lobe epilepsy

    SciTech Connect

    Henry, T.R.; Mazziotta, J.C.; Engel, J. Jr.; Christenson, P.D.; Zhang, J.X.; Phelps, M.E.; Kuhl, D.E. (Univ. of California, Los Angeles (USA))

    1990-09-01

    The majority of patients with complex partial seizures of unilateral temporal lobe origin have interictal temporal hypometabolism on (18F)fluorodeoxyglucose positron emission tomography (FDG PET) studies. Often, this hypometabolism extends to ipsilateral extratemporal sites. The use of accurately quantified metabolic data has been limited by the absence of an equally reliable method of anatomical analysis of PET images. We developed a standardized method for visual placement of anatomically configured regions of interest on FDG PET studies, which is particularly adapted to the widespread, asymmetric, and often severe interictal metabolic alterations of temporal lobe epilepsy. This method was applied by a single investigator, who was blind to the identity of subjects, to 10 normal control and 25 interictal temporal lobe epilepsy studies. All subjects had normal brain anatomical volumes on structural neuroimaging studies. The results demonstrate ipsilateral thalamic and temporal lobe involvement in the interictal hypometabolism of unilateral temporal lobe epilepsy. Ipsilateral frontal, parietal, and basal ganglial metabolism is also reduced, although not as markedly as is temporal and thalamic metabolism.

  7. Quantifying residual hydrogen adsorption in low-temperature STMs

    NASA Astrophysics Data System (ADS)

    Natterer, F. D.; Patthey, F.; Brune, H.

    2013-09-01

    We report on low-temperature scanning tunneling microscopy observations demonstrating that individual Ti atoms on hexagonal boron nitride dissociate and adsorb hydrogen without measurable reaction barrier. The clean and hydrogenated states of the adatoms are clearly discerned by their apparent height and their differential conductance revealing the Kondo effect upon hydrogenation. Measurements at 50 K and 5 × 10^-11 mbar indicate a sizable hydrogenation within only 1 h originating from the residual gas pressure, whereas measurements at 4.7 K can be carried out for days without H2 contamination problems. However, heating up a low-T STM to operate it at variable temperature results in very sudden hydrogenation at around 17 K that correlates with a sharp peak in the total chamber pressure. From a quantitative analysis we derive the desorption energies of H2 on the cryostat walls. We find evidence for hydrogen contamination also during Ti evaporation and propose a strategy for dosing transition metal atoms in the cleanest fashion. The present contribution raises awareness of hydrogenation under seemingly ideal ultra-high vacuum conditions; it quantifies the H2 uptake by isolated transition metal atoms and its thermal desorption from the gold-plated cryostat walls.

  8. Rapidly quantifying the relative distention of a human bladder

    NASA Technical Reports Server (NTRS)

    Companion, John A. (inventor); Heyman, Joseph S. (inventor); Mineo, Beth A. (inventor); Cavalier, Albert R. (inventor); Blalock, Travis N. (inventor)

    1991-01-01

    A device and method were developed to rapidly quantify the relative distention of the bladder of a human subject. An ultrasonic transducer is positioned on the subject near the bladder. A microprocessor-controlled pulser excites the transducer, sending an acoustic wave into the subject. This wave interacts with the bladder walls and is reflected back to the ultrasonic transducer, where it is received, amplified, and processed by the receiver. The resulting signal is digitized by an analog-to-digital converter, also under microprocessor control, and is stored in data memory. The software in the microprocessor determines the relative distention of the bladder as a function of the propagated ultrasonic energy. Based on programmed scientific measurements and the subject's past history as contained in program memory, the microprocessor sends out a signal to turn on any or all of the available alarms. The alarm system includes an audible alarm, a visible alarm, a tactile alarm, and a remote wireless alarm.

  9. Rapidly quantifying the relative distention of a human bladder

    NASA Technical Reports Server (NTRS)

    Companion, John A. (inventor); Heyman, Joseph S. (inventor); Mineo, Beth A. (inventor); Cavalier, Albert R. (inventor); Blalock, Travis N. (inventor)

    1989-01-01

    A device and method of rapidly quantifying the relative distention of the bladder in a human subject are disclosed. The ultrasonic transducer, positioned on the subject in proximity to the bladder, is excited by a pulser under the command of a microprocessor to launch an acoustic wave into the patient. This wave interacts with the bladder walls and is reflected back to the ultrasonic transducer, where it is received, amplified, and processed by the receiver. The resulting signal is digitized by an analog-to-digital converter under the command of the microprocessor and is stored in the data memory. The software in the microprocessor determines the relative distention of the bladder as a function of the propagated ultrasonic energy and, based on programmed scientific measurements and the individual, anatomical, and behavioral characteristics of the specific subject as contained in the program memory, sends out a signal to turn on any or all of the audible alarm, the visible alarm, the tactile alarm, and the remote wireless alarm.
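
    A rough sketch of the pulse-echo geometry underlying such a device: round-trip echo times convert to wall depths through the speed of sound in tissue. The echo times, and the simple two-echo picture, are hypothetical; the patent bases its estimate on the propagated ultrasonic energy rather than on this timing alone.

    ```python
    # Pulse-echo ranging: depth = c * (round-trip time) / 2.
    C_TISSUE = 1540.0                  # m/s, typical soft-tissue sound speed
    t_front, t_back = 52e-6, 118e-6    # hypothetical echo round-trip times (s)

    d_front = C_TISSUE * t_front / 2   # depth of near bladder wall (m)
    d_back = C_TISSUE * t_back / 2     # depth of far bladder wall (m)
    print(f"wall separation ~ {(d_back - d_front) * 100:.1f} cm")
    ```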

  10. Quantum Process Tomography Quantifies Coherence Transfer Dynamics in Vibrational Exciton

    PubMed Central

    Chuntonov, Lev; Ma, Jianqiang

    2013-01-01

    Quantum coherence has been a subject of great interest in many scientific disciplines. However, detailed characterization of the quantum coherence in molecular systems, especially its transfer and relaxation mechanisms, still remains a major challenge. The difficulties arise in part because the spectroscopic signatures of the coherence transfer are typically overwhelmed by other excitation relaxation processes. We use quantum process tomography (QPT) via two-dimensional infrared spectroscopy to quantify the rate of the elusive coherence transfer between two vibrational exciton states. QPT retrieves the dynamics of the dissipative quantum system directly from the experimental observables. It thus serves as an experimental alternative to theoretical models of the system-bath interaction, and can be used to validate these theories. Our results for coupled carbonyl groups of a diketone molecule in chloroform, used as a benchmark system, reveal the non-secular nature of the interaction between the exciton and the Markovian bath and open the door for the systematic studies of the dissipative quantum systems dynamics in detail. PMID:24079417

  11. Quantifying the stratigraphic completeness of delta shoreline trajectories

    NASA Astrophysics Data System (ADS)

    Mahon, Robert C.; Shaw, John B.; Barnhart, Katherine R.; Hobley, Daniel E. J.; McElroy, Brandon

    2015-05-01

    Understanding the incomplete nature of the stratigraphic record is fundamental for interpreting stratigraphic sequences. Methods for quantifying stratigraphic completeness for one-dimensional stratigraphic columns, defined as the proportion of time intervals of some length that contain stratigraphy, are commonplace; however, quantitative assessments of completeness in higher dimensions are lacking. Here we present a metric for defining stratigraphic completeness of two-dimensional shoreline trajectories using topset-foreset rollover positions in dip-parallel sections and describe the preservation potential of a shoreline trajectory derived from the geometry of the delta surface profile and the kinematics of the geomorphic shoreline trajectory. Two end-member forward models are required to fully constrain the preservation potential of the shoreline dependent on whether or not a topset is eroded during base level fall. A laboratory fan-delta was constructed under nonsteady boundary conditions, and one-dimensional stratigraphic column and two-dimensional shoreline completeness curves were calculated. Results are consistent with the hypothesis derived from conservation of sediment mass that completeness over all timescales should increase given increasing dimensions of analysis. Stratigraphic trajectories and completeness curves determined from forward models using experimental geomorphic trajectories compare well to values from transects when subsampled to the equivalent stratigraphic resolution as observed in the actual preserved sequence. The concept of stratigraphic completeness applied to two-dimensional trajectory analysis and the end-member forward models presented here provide novel tools for a conceptual understanding of the nature of stratigraphic preservation at basin scales.
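
    A minimal sketch of one-dimensional completeness as defined above: the fraction of time windows of a given length that contain preserved stratigraphy. The deposit ages are made up; note how completeness grows with window length, consistent with the paper's timescale dependence.

    ```python
    # Completeness at a given timescale = fraction of windows containing a deposit.
    import numpy as np

    def completeness(deposit_times, t0, t1, window):
        """Fraction of [t0, t1) windows of length `window` containing a deposit."""
        edges = np.arange(t0, t1, window)
        deposit_times = np.asarray(deposit_times)
        hits = [np.any((deposit_times >= e) & (deposit_times < e + window))
                for e in edges]
        return float(np.mean(hits))

    times = [0.4, 0.5, 2.2, 7.9, 8.0, 8.1, 15.0]   # preserved-deposit ages (kyr)
    for w in (1.0, 2.0, 5.0):
        print(f"window {w:>3} kyr: completeness = {completeness(times, 0, 20, w):.2f}")
    ```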

  12. Quantifying the relationship between visual salience and visual importance

    NASA Astrophysics Data System (ADS)

    Wang, Junle; Chandler, Damon M.; Le Callet, Patrick

    2010-02-01

    This paper presents the results of two psychophysical experiments and an associated computational analysis designed to quantify the relationship between visual salience and visual importance. In the first experiment, importance maps were collected by asking human subjects to rate the relative visual importance of each object within a database of hand-segmented images. In the second experiment, experimental saliency maps were computed from visual gaze patterns measured for these same images by using an eye-tracker and task-free viewing. By comparing the importance maps with the saliency maps, we found that the maps are related, but perhaps less than one might expect. When coupled with the segmentation information, the saliency maps were shown to be effective at predicting the main subjects. However, the saliency maps were less effective at predicting the objects of secondary importance and the unimportant objects. We also found that the vast majority of early gaze position samples (0-2000 ms) were made on the main subjects, suggesting that a possible strategy of early visual coding might be to quickly locate the main subject(s) in the scene.

  13. Quantifying the abnormal hemodynamics of sickle cell anemia

    NASA Astrophysics Data System (ADS)

    Lei, Huan; Karniadakis, George

    2012-02-01

    Sickle red blood cells (SS-RBC) exhibit heterogeneous morphologies and abnormal hemodynamics in deoxygenated states. A multi-scale model for SS-RBC is developed based on the Dissipative Particle Dynamics (DPD) method. Different cell morphologies (sickle, granular, elongated shapes) typically observed in deoxygenated states are constructed and quantified by the Asphericity and Elliptical shape factors. The hemodynamics of SS-RBC suspensions is studied in both shear and pipe flow systems. The flow resistance obtained from both systems exhibits a larger value than the healthy blood flow due to the abnormal cell properties. Moreover, SS-RBCs exhibit abnormal adhesive interactions with both the vessel endothelium cells and the leukocytes. The effect of the abnormal adhesive interactions on the hemodynamics of sickle blood is investigated using the current model. It is found that both the SS-RBC - endothelium and the SS-RBC - leukocyte interactions can potentially trigger the vicious "sickling and entrapment" cycles, resulting in the vaso-occlusion phenomena widely observed in micro-circulation experiments.

  14. Quantifying Snow Volume Uncertainty from Repeat Terrestrial Laser Scanning Observations

    NASA Astrophysics Data System (ADS)

    Gadomski, P. J.; Hartzell, P. J.; Finnegan, D. C.; Glennie, C. L.; Deems, J. S.

    2014-12-01

    Terrestrial laser scanning (TLS) systems are capable of providing rapid, high density, 3D topographic measurements of snow surfaces from increasing standoff distances. By differencing snow surface with snow free measurements within a common scene, snow depths and volumes can be estimated. These data can support operational water management decision-making when combined with measured or modeled snow densities to estimate basin water content, evaluate in-situ data, or drive operational hydrologic models. In addition, change maps from differential TLS scans can also be used to support avalanche control operations to quantify loading patterns for both pre-control planning and post-control assessment. However, while methods for computing volume from TLS point cloud data are well documented, a rigorous quantification of the volumetric uncertainty has yet to be presented. Using repeat TLS data collected at the Arapahoe Basin Ski Area in Summit County, Colorado, we demonstrate the propagation of TLS point measurement and cloud registration uncertainties into 3D covariance matrices at the point level. The point covariances are then propagated through a volume computation to arrive at a single volume uncertainty value. Results from two volume computation methods are compared and the influence of data voids produced by occlusions examined.
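
    A simplified sketch of the final propagation step: per-cell height variances of a gridded snow-depth difference surface summed into a single volume sigma. Treating cells as independent is an assumption made here for brevity; the paper carries full 3D point and registration covariances through the computation.

    ```python
    # Propagate per-cell height uncertainty into a volume uncertainty.
    import numpy as np

    cell_area = 0.25                                   # m^2 per grid cell
    depth = np.array([[0.8, 1.1], [0.9, 1.4]])         # snow depth change (m)
    sigma_h = np.array([[0.02, 0.03], [0.02, 0.05]])   # per-cell 1-sigma (m)

    volume = cell_area * depth.sum()
    sigma_v = cell_area * np.sqrt((sigma_h ** 2).sum())  # independent-cell assumption
    print(f"V = {volume:.2f} m^3 +/- {sigma_v:.3f} m^3")
    ```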

  15. Choosing among techniques for quantifying single-case intervention effectiveness.

    PubMed

    Manolov, Rumen; Solanas, Antonio; Sierra, Vicenta; Evans, Jonathan J

    2011-09-01

    If single-case experimental designs are to be used to establish guidelines for evidence-based interventions in clinical and educational settings, numerical values that reflect treatment effect sizes are required. The present study compares four recently developed procedures for quantifying the magnitude of intervention effect using data with known characteristics. Monte Carlo methods were used to generate AB designs data with potential confounding variables (serial dependence, linear and curvilinear trend, and heteroscedasticity between phases) and two types of treatment effect (level and slope change). The results suggest that data features are important for choosing the appropriate procedure and, thus, inspecting the graphed data visually is a necessary initial stage. In the presence of serial dependence or a change in data variability, the nonoverlap of all pairs (NAP) and the slope and level change (SLC) were the only techniques of the four examined that performed adequately. Introducing a data correction step in NAP renders it unaffected by linear trend, as is also the case for the percentage of nonoverlapping corrected data and SLC. The performance of these techniques indicates that professionals' judgments concerning treatment effectiveness can be readily complemented by both visual and statistical analyses. A flowchart to guide selection of techniques according to the data characteristics identified by visual inspection is provided. PMID:21658534
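
    A minimal sketch of the nonoverlap of all pairs (NAP) statistic named above: the proportion of all (baseline, intervention) pairs in which the intervention point improves on the baseline point, with ties counted as half. The phase scores are illustrative.

    ```python
    # NAP = (improving pairs + 0.5 * ties) / all pairs.
    from itertools import product

    phase_a = [3, 4, 4, 5, 3]        # baseline observations
    phase_b = [6, 5, 7, 8, 7, 9]     # intervention observations

    pairs = list(product(phase_a, phase_b))
    nap = sum(1.0 if b > a else 0.5 if b == a else 0.0 for a, b in pairs) / len(pairs)
    print(f"NAP = {nap:.2f}")
    ```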

  16. Cardiovascular regulation during sleep quantified by symbolic coupling traces

    NASA Astrophysics Data System (ADS)

    Suhrbier, A.; Riedl, M.; Malberg, H.; Penzel, T.; Bretthauer, G.; Kurths, J.; Wessel, N.

    2010-12-01

    Sleep is a complex regulated process with short periods of wakefulness and different sleep stages. These sleep stages modulate autonomous functions such as blood pressure and heart rate. The method of symbolic coupling traces (SCT) is used to analyze and quantify time-delayed coupling of these measurements during different sleep stages. The symbolic coupling traces, defined as the symmetric and diametric traces of the bivariate word distribution matrix, allow the quantification of time-delayed coupling. In this paper, the method is applied to heart rate and systolic blood pressure time series during different sleep stages for healthy controls as well as for normotensive and hypertensive patients with sleep apneas. Using the SCT, significant different cardiovascular mechanisms not only between the deep sleep and the other sleep stages but also between healthy subjects and patients can be revealed. The SCT method is applied to model systems, compared with established methods, such as cross correlation, mutual information, and cross recurrence analysis and demonstrates its advantages especially for nonstationary physiological data. As a result, SCT proves to be more specific in detecting delays of directional interactions than standard coupling analysis methods and yields additional information which cannot be measured by standard parameters of heart rate and blood pressure variability. The proposed method may help to indicate the pathological changes in cardiovascular regulation and also the effects of continuous positive airway pressure therapy on the cardiovascular system.

  17. QUANTIFYING KINEMATIC SUBSTRUCTURE IN THE MILKY WAY'S STELLAR HALO

    SciTech Connect

    Xue Xiangxiang; Zhao Gang; Luo Ali [Key Lab of Optical Astronomy, National Astronomical Observatories, CAS, 20A Datun Road, Chaoyang District, 100012 Beijing (China); Rix, Hans-Walter; Bell, Eric F.; Koposov, Sergey E.; Kang, Xi; Liu, Chao [Max-Planck-Institute for Astronomy, Koenigstuhl 17, D-69117 Heidelberg (Germany); Yanny, Brian [Fermi National Accelerator Laboratory, P.O. Box 500, Batavia, IL 60510-5011 (United States); Beers, Timothy C.; Lee, Young Sun [Department of Physics and Astronomy and JINA: Joint Institute for Nuclear Astrophysics, Michigan State University, East Lansing, MI 48824 (United States); Bullock, James S. [Center for Cosmology, Department of Physics and Astronomy, University of California, Irvine, CA 92697 (United States); Johnston, Kathryn V. [Astronomy Department, Columbia University, New York, NY 10027 (United States); Morrison, Heather [Department of Astronomy, Case Western Reserve University, Cleveland, OH 44106 (United States); Rockosi, Constance [Lick Observatory/University of California, Santa Cruz, CA 95060 (United States); Weaver, Benjamin A. [Center for Cosmology and Particle Physics, New York University, New York, NY 10003 (United States)

    2011-09-01

    We present and analyze the positions, distances, and radial velocities for over 4000 blue horizontal-branch (BHB) stars in the Milky Way's halo, drawn from SDSS DR8. We search for position-velocity substructure in these data, a signature of the hierarchical assembly of the stellar halo. Using a cumulative 'close pair distribution' as a statistic in the four-dimensional space of sky position, distance, and velocity, we quantify the presence of position-velocity substructure at high statistical significance among the BHB stars: pairs of BHB stars that are close in position on the sky tend to have more similar distances and radial velocities compared to a random sampling of these overall distributions. We make analogous mock observations of 11 numerical halo formation simulations, in which the stellar halo is entirely composed of disrupted satellite debris, and find a level of substructure comparable to that seen in the actually observed BHB star sample. This result quantitatively confirms the hierarchical build-up of the stellar halo through a signature in phase (position-velocity) space. In detail, the structure present in the BHB stars is somewhat less prominent than that seen in most simulated halos, quite possibly because BHB stars represent an older sub-population. BHB stars located beyond 20 kpc from the Galactic center exhibit stronger substructure than at r_gc < 20 kpc.
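
    A toy sketch of the close-pair statistic: count star pairs that are close in sky position, distance, and radial velocity, then compare against a velocity-shuffled catalogue that erases phase-space structure. The thresholds and the random catalogue are illustrative, not the paper's values.

    ```python
    # Close pairs in (sky, distance, velocity) vs a shuffled null.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 500
    ra, dec = rng.uniform(0, 360, n), rng.uniform(-10, 60, n)
    dist, rv = rng.uniform(5, 60, n), rng.normal(0, 120, n)

    def close_pairs(ra, dec, dist, rv, sky=2.0, dr=5.0, dv=30.0):
        i, j = np.triu_indices(len(ra), k=1)
        near = ((np.abs(ra[i] - ra[j]) < sky) & (np.abs(dec[i] - dec[j]) < sky)
                & (np.abs(dist[i] - dist[j]) < dr) & (np.abs(rv[i] - rv[j]) < dv))
        return int(near.sum())

    observed = close_pairs(ra, dec, dist, rv)
    shuffled = close_pairs(ra, dec, dist, rng.permutation(rv))
    print(observed, "vs", shuffled, "close pairs (substructure if observed >> shuffled)")
    ```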

  18. Quantifying seismic survey reverberation off the Alaskan North Slope.

    PubMed

    Guerra, Melania; Thode, Aaron M; Blackwell, Susanna B; Michael Macrander, A

    2011-11-01

    Shallow-water airgun survey activities off the North Slope of Alaska generate impulsive sounds that are the focus of much regulatory attention. Reverberation from repetitive airgun shots, however, can also increase background noise levels, which can decrease the detection range of nearby passive acoustic monitoring (PAM) systems. Typical acoustic metrics for impulsive signals provide no quantitative information about reverberation or its relative effect on the ambient acoustic environment. Here, two conservative metrics are defined for quantifying reverberation: a minimum level metric measures reverberation levels that exist between airgun pulse arrivals, while a reverberation metric estimates the relative magnitude of reverberation vs expected ambient levels in the hypothetical absence of airgun activity, using satellite-measured wind data. The metrics are applied to acoustic data measured by autonomous recorders in the Alaskan Beaufort Sea in 2008 and demonstrate how seismic surveys can increase the background noise over natural ambient levels by 30-45 dB within 1 km of the activity, by 10-25 dB within 15 km of the activity, and by a few dB at 128 km range. These results suggest that shallow-water reverberation would reduce the performance of nearby PAM systems when monitoring for marine mammals within a few kilometers of shallow-water seismic surveys. PMID:22087932
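
    A rough sketch of the minimum-level metric: within each inter-shot interval, the quietest short-window RMS level is taken as a conservative estimate of the reverberation floor. The synthetic trace, the 100 ms window, and the 2 s shot period are assumptions for illustration.

    ```python
    # Minimum short-window RMS level between repetitive pulses.
    import numpy as np

    fs = 1000                                    # Hz, synthetic pressure record
    rng = np.random.default_rng(2)
    x = 0.01 * rng.standard_normal(10 * fs)      # ambient-like noise
    x[::2 * fs] += 5.0                           # an airgun "shot" every 2 s

    win = fs // 10                               # 100 ms RMS windows
    rms = np.sqrt(np.convolve(x ** 2, np.ones(win) / win, mode="valid"))
    period = 2 * fs
    floors = [rms[k:k + period].min() for k in range(0, rms.size, period)]
    print(["%.1f dB" % (20 * np.log10(v)) for v in floors])  # dB re unit pressure
    ```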

  19. Rat Ultrasonic Vocalization Shows Features of a Modular Behavior

    PubMed Central

    2014-01-01

    Small units of production, or modules, can be effective building blocks of more complex motor behaviors. Recording underlying movements of vocal production in awake and spontaneously behaving male Sprague Dawley rats interacting with a female, I tested whether the underlying movements of ultrasonic calls can be described by modules. Movements were quantified by laryngeal muscle EMG activity and subglottal pressure changes. A module was defined by uniformity in both larynx movement and pressure pattern that resulted in a specific spectrographic feature. Modules are produced either singly (single module call type) or in combination with a different module (composite call type). Distinct modules were shown to be linearly (re)combined. Additionally, I found that modules produced during the same expiratory phase can be linked with or without a pause in laryngeal activity, the latter creating the spectrographic appearance of two separate calls. Results suggest that combining discrete modules facilitates generation of higher-order patterns, thereby increasing overall complexity of the vocal repertoire. With additional study, modularity and flexible laryngeal–respiratory coordination may prove to be a basal feature of mammalian vocal motor control. PMID:24828641

  20. Maps showing seismic landslide hazards in Anchorage, Alaska

    USGS Publications Warehouse

    Jibson, Randall W.

    2014-01-01

    The devastating landslides that accompanied the great 1964 Alaska earthquake showed that seismically triggered landslides are one of the greatest geologic hazards in Anchorage. Maps quantifying seismic landslide hazards are therefore important for planning, zoning, and emergency-response preparation. The accompanying maps portray seismic landslide hazards for the following conditions: (1) deep, translational landslides, which occur only during great subduction-zone earthquakes that have return periods of ≈300-900 yr; (2) shallow landslides for a peak ground acceleration (PGA) of 0.69 g, which has a return period of 2,475 yr, or a 2 percent probability of exceedance in 50 yr; and (3) shallow landslides for a PGA of 0.43 g, which has a return period of 475 yr, or a 10 percent probability of exceedance in 50 yr. Deep, translational landslide hazards were delineated based on previous studies of such landslides, with some modifications based on field observations of locations of deep landslides. Shallow-landslide hazards were delineated using a Newmark-type displacement analysis for the two probabilistic ground motions modeled.

  1. Maps Showing Seismic Landslide Hazards in Anchorage, Alaska

    USGS Publications Warehouse

    Jibson, Randall W.; Michael, John A.

    2009-01-01

    The devastating landslides that accompanied the great 1964 Alaska earthquake showed that seismically triggered landslides are one of the greatest geologic hazards in Anchorage. Maps quantifying seismic landslide hazards are therefore important for planning, zoning, and emergency-response preparation. The accompanying maps portray seismic landslide hazards for the following conditions: (1) deep, translational landslides, which occur only during great subduction-zone earthquakes that have return periods of ≈300-900 yr; (2) shallow landslides for a peak ground acceleration (PGA) of 0.69 g, which has a return period of 2,475 yr, or a 2 percent probability of exceedance in 50 yr; and (3) shallow landslides for a PGA of 0.43 g, which has a return period of 475 yr, or a 10 percent probability of exceedance in 50 yr. Deep, translational landslide hazard zones were delineated based on previous studies of such landslides, with some modifications based on field observations of locations of deep landslides. Shallow-landslide hazards were delineated using a Newmark-type displacement analysis for the two probabilistic ground motions modeled.
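
    A hedged sketch of a Newmark-type sliding-block analysis: integrate the block's relative velocity whenever ground acceleration exceeds its critical acceleration, and keep sliding until the relative velocity returns to zero. The synthetic accelerogram and the critical acceleration are illustrative, not values from the maps.

    ```python
    # One-directional Newmark sliding-block displacement.
    import numpy as np

    dt = 0.01                                   # s, time step
    rng = np.random.default_rng(3)
    t = np.arange(0, 20, dt)
    accel = 0.3 * 9.81 * np.exp(-((t - 8) / 4) ** 2) * rng.standard_normal(t.size)

    a_crit = 0.05 * 9.81                        # critical (yield) acceleration, assumed
    vel = disp = 0.0
    for a in accel:
        if vel > 0.0 or a > a_crit:             # slides while driven or still moving
            vel = max(vel + (a - a_crit) * dt, 0.0)
            disp += vel * dt
    print(f"Newmark displacement ~ {disp * 100:.1f} cm")
    ```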

  2. Quantifying the power of multiple event interpretations

    NASA Astrophysics Data System (ADS)

    Chien, Yang-Ting; Farhi, David; Krohn, David; Marantan, Andrew; Mateos, David Lopez; Schwartz, Matthew

    2014-12-01

    A number of methods have been proposed recently which exploit multiple highly-correlated interpretations of events, or of jets within an event. For example, Qjets reclusters a jet multiple times and telescoping jets uses multiple cone sizes. Previous work has employed these methods in pseudo-experimental analyses and found that, with a simplified statistical treatment, they give sizable improvements over traditional methods. In this paper, the improvement gain from multiple event interpretations is explored with methods much closer to those used in real experiments. To this end, we derive and study a generalized extended maximum likelihood procedure, and find that using multiple jet radii can provide substantial benefit over a single radius in fitting procedures. Another major concern we address is that multiple event interpretations might be exploiting similar information to that already present in the standard kinematic variables. We perform multivariate analyses (boosted decision trees) on a set of standard kinematic variables, a single observable computed with several different cone sizes, and both sets combined. We find that using multiple radii is still helpful even on top of standard kinematic variables (providing a 12% improvement at low pT and 20% at high pT). These results suggest that including multiple event interpretations in a realistic search for Higgs to bb would give additional sensitivity over traditional approaches.

  3. Quantifying the power of multiple event interpretations

    E-print Network

    Yang-Ting Chien; David Farhi; David Krohn; Andrew Marantan; David Lopez Mateos; Matthew Schwartz

    2014-07-10

    A number of methods have been proposed recently which exploit multiple highly-correlated interpretations of events, or of jets within an event. For example, Qjets reclusters a jet multiple times and telescoping jets uses multiple cone sizes. Previous work has employed these methods in pseudo-experimental analyses and found that, with a simplified statistical treatment, they give sizable improvements over traditional methods. In this paper, the improvement gain from multiple event interpretations is explored with methods much closer to those used in real experiments. To this end, we derive a generalized extended maximum likelihood procedure. We study the significance improvement in Higgs to bb with both this method and the simplified method from previous analysis. With either method, we find that using multiple jet radii can provide substantial benefit over a single radius. Another concern we address is that multiple event interpretations might be exploiting similar information to that already present in the standard kinematic variables. By examining correlations between kinematic variables commonly used in LHC analyses and invariant masses obtained with multiple jet reconstructions, we find that using multiple radii is still helpful even on top of standard kinematic variables when combined with boosted decision trees. These results suggest that including multiple event interpretations in a realistic search for Higgs to bb would give additional sensitivity over traditional approaches.

  4. Quantifying Distributions of Lyman Continuum Escape Fraction

    E-print Network

    Cen, Renyue

    2015-01-01

    Simulations have indicated that most of the escaped Lyman continuum photons escape through a minority of solid angles with near complete transparency, with the remaining majority of the solid angles largely opaque, resulting in a very broad and skewed probability distribution function (PDF) of the escape fraction when viewed at different angles. Thus, the escape fraction of Lyman continuum photons of a galaxy observed along a line of sight merely represents the properties of the interstellar medium along that line of sight, which may be an ill-representation of true escape fraction of the galaxy averaged over its full sky. Here we study how Lyman continuum photons escape from galaxies at $z=4-6$, utilizing high-resolution large-scale cosmological radiation-hydrodynamic simulations. We compute the PDF of the mean escape fraction ($\langle f_{\rm esc} \rangle$) averaged over mock observational samples, as a function of the sample size, compared to the true mean (had you an infinite sample size). We find that, when the sample size is...
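
    A toy version of the sampling experiment described above: draw sight-line escape fractions from a skewed stand-in PDF and watch the spread of the sample mean shrink toward the true mean as the sample size grows. The beta-distributed sight lines are an assumption, not the simulation output.

    ```python
    # PDF of the sample-mean escape fraction as a function of sample size.
    import numpy as np

    rng = np.random.default_rng(9)
    sightlines = rng.beta(0.2, 2.0, size=100_000)   # skewed toy f_esc distribution

    for n in (1, 10, 100):                          # lines of sight per sample
        means = rng.choice(sightlines, size=(2000, n)).mean(axis=1)
        print(f"n={n:>3}: mean={means.mean():.3f}, spread={means.std():.3f}")
    ```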

  5. Quantifying ammonia emissions from a cattle feedlot using a dispersion model.

    PubMed

    McGinn, S M; Flesch, T K; Crenna, B P; Beauchemin, K A; Coates, T

    2007-01-01

    Livestock manure is a significant source of ammonia (NH3) emissions. In the atmosphere, NH3 is a precursor to the formation of fine aerosols that contribute to poor air quality associated with human health. Other environmental issues result when NH3 is deposited to land and water. Our study documented the quantity of NH3 emitted from a feedlot housing growing beef cattle. The study was conducted between June and October 2006 at a feedlot with a one-time capacity of 22,500 cattle located in southern Alberta, Canada. A backward Lagrangian stochastic (bLS) inverse-dispersion technique was used to calculate NH3 emissions, based on measurements of NH3 concentration (open-path laser) and wind (sonic anemometer) taken above the interior of the feedlot. There was an average of 3146 kg NH3 d^-1 lost from the entire feedlot, equivalent to 84 µg NH3 m^-2 s^-1 or 140 g NH3 head^-1 d^-1. The NH3 emissions correlated with sensible heat flux (r^2 = 0.84) and to a lesser extent the wind speed (r^2 = 0.56). There was also evidence that rain suppressed the NH3 emission. Quantifying NH3 emission and dispersion from farms is essential to show the impact of farm management on reducing NH3-related environmental issues. PMID:17940257
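
    A sketch of the inverse-dispersion bookkeeping: the bLS model supplies a simulated concentration-to-emission ratio for the sensor geometry, and the emission rate follows from the measured enhancement over background. All numbers are illustrative, chosen only to land near the reported ~84 µg m^-2 s^-1.

    ```python
    # Emission rate from a measured enhancement and a bLS-simulated (C/Q) ratio.
    c_meas = 145.0   # ug/m^3, line-averaged NH3 over the feedlot (illustrative)
    c_bg = 5.0       # ug/m^3, background concentration (illustrative)
    cq_sim = 1.7     # (ug/m^3) per (ug m^-2 s^-1), bLS-simulated ratio (assumed)

    q = (c_meas - c_bg) / cq_sim
    print(f"Q ~ {q:.0f} ug NH3 m^-2 s^-1")
    ```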

  6. Assessment of machine learning reliability methods for quantifying the applicability domain of QSAR regression models.

    PubMed

    Toplak, Marko; Močnik, Rok; Polajnar, Matija; Bosnić, Zoran; Carlsson, Lars; Hasselgren, Catrin; Demšar, Janez; Boyer, Scott; Zupan, Blaž; Stålring, Jonna

    2014-02-24

    The vastness of chemical space and the relatively small coverage by experimental data recording molecular properties require us to identify subspaces, or domains, for which we can confidently apply QSAR models. The prediction of QSAR models in these domains is reliable, and potential subsequent investigations of such compounds would find that the predictions closely match the experimental values. Standard approaches in QSAR assume that predictions are more reliable for compounds that are "similar" to those in subspaces with denser experimental data. Here, we report on a study of an alternative set of techniques recently proposed in the machine learning community. These methods quantify prediction confidence through estimation of the prediction error at the point of interest. Our study includes 20 public QSAR data sets with continuous response and assesses the quality of 10 reliability scoring methods by observing their correlation with prediction error. We show that these new alternative approaches can outperform standard reliability scores that rely only on similarity to compounds in the training set. The results also indicate that the quality of reliability scoring methods is sensitive to data set characteristics and to the regression method used in QSAR. We demonstrate that at the cost of increased computational complexity these dependencies can be leveraged by integration of scores from various reliability estimation approaches. The reliability estimation techniques described in this paper have been implemented in an open source add-on package ( https://bitbucket.org/biolab/orange-reliability ) to the Orange data mining suite. PMID:24490838
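
    A minimal sketch of the evaluation criterion used above: the quality of a reliability score is judged by how strongly it correlates with the observed absolute prediction error across test compounds. The values below are illustrative.

    ```python
    # Correlate a reliability score with the absolute prediction error.
    import numpy as np

    score = np.array([0.9, 0.4, 0.7, 0.2, 0.6, 0.8])          # higher = less confident
    abs_err = np.array([0.35, 0.10, 0.22, 0.05, 0.18, 0.30])  # |predicted - observed|

    r = np.corrcoef(score, abs_err)[0, 1]
    print(f"Pearson r between reliability score and |error|: {r:.2f}")
    ```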

  7. Using nonlinear methods to quantify changes in infant limb movements and vocalizations

    PubMed Central

    Abney, Drew H.; Warlaumont, Anne S.; Haussman, Anna; Ross, Jessica M.; Wallot, Sebastian

    2014-01-01

    The pairing of dynamical systems theory and complexity science brings novel concepts and methods to the study of infant motor development. Accordingly, this longitudinal case study presents a new approach to characterizing the dynamics of infant limb and vocalization behaviors. A single infant's vocalizations and limb movements were recorded from 51 days to 305 days of age. On each recording day, accelerometers were placed on all four of the infant's limbs and an audio recorder was worn on the child's chest. Using nonlinear time series analysis methods, such as recurrence quantification analysis and Allan factor, we quantified changes in the stability and multiscale properties of the infant's behaviors across age as well as how these dynamics relate across modalities and effectors. We observed that particular changes in these dynamics preceded or coincided with the onset of various developmental milestones. For example, the largest changes in vocalization dynamics preceded the onset of canonical babbling. The results show that nonlinear analyses can help to understand the functional co-development of different aspects of infant behavior. PMID:25161629
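
    A minimal sketch of the Allan factor, one of the multiscale measures named above: AF(T) = E[(N_{k+1} - N_k)^2] / (2·E[N_k]), where N_k counts events in consecutive windows of length T. The event times here are synthetic; a Poisson-like stream gives AF near 1 at all scales, while clustering inflates it at long scales.

    ```python
    # Allan factor of an event stream at several window lengths.
    import numpy as np

    def allan_factor(event_times, window):
        edges = np.arange(0, max(event_times) + window, window)
        counts, _ = np.histogram(event_times, bins=edges)
        diffs = np.diff(counts)
        return np.mean(diffs ** 2) / (2 * np.mean(counts))

    events = np.sort(np.random.default_rng(4).uniform(0, 600, 300))  # toy onsets (s)
    for w in (1, 5, 20):
        print(f"T = {w:>2} s: AF = {allan_factor(events, w):.2f}")
    ```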

  8. Quantifying solar spectral irradiance in aquatic habitats for the assessment of photoenhanced toxicity

    USGS Publications Warehouse

    Barron, M.G.; Little, E.E.; Calfee, R.; Diamond, S.

    2000-01-01

    The spectra and intensity of solar radiation (solar spectral irradiance [SSI]) were quantified in selected aquatic habitats in the vicinity of an oil field on the California coast. Solar spectral irradiance measurements consisted of spectral scans (280-700 nm) and radiometric measurements of ultraviolet (UV): UVB (280-320 nm) and UVA (320-400 nm). Solar spectral irradiance measurements were taken at the surface and at various depths in two marsh ponds, a shallow wetland, an estuary lagoon, and the intertidal area of a high-energy sandy beach. Daily fluctuation in SSI showed a general parabolic relationship with time; maximum SSI was observed at approximate solar noon. Solar spectral irradiance measurements taken at 10-cm depth at approximate solar noon in multiple aquatic habitats exhibited only a twofold variation in visible light and UVA and a 4.5-fold variation in UVB. Visible light ranged from 11,000 to 19,000 µW/cm2, UVA ranged from 460 to 1,100 µW/cm2, and UVB ranged from 8.4 to 38 µW/cm2. In each habitat, the attenuation of light intensity with increasing water depth differentially affected specific wavelengths of SSI. The study results allowed the development of environmentally realistic light regimes necessary for photoenhanced toxicity studies.
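
    A worked example of how such depth profiles yield an attenuation coefficient, assuming the Beer-Lambert relation E(z2) = E(z1)·exp(-Kd·(z2 - z1)); the irradiance pair below is illustrative, not from the study.

    ```python
    # Diffuse attenuation coefficient from irradiance at two depths.
    import math

    e_z1, e_z2 = 38.0, 21.0    # UVB irradiance (uW/cm^2) at depths z1 and z2
    z1, z2 = 0.10, 0.50        # depths (m)

    kd = math.log(e_z1 / e_z2) / (z2 - z1)
    print(f"Kd(UVB) ~ {kd:.2f} m^-1")
    ```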

  9. Quantifying dispersal and establishment limitation in a population of an epiphytic lichen.

    PubMed

    Werth, Silke; Wagner, Helene H; Gugerli, Felix; Holderegger, Rolf; Csencsics, Daniela; Kalwij, Jesse M; Scheidegger, Christoph

    2006-08-01

    Dispersal is a process critical for the dynamics and persistence of metapopulations, but it is difficult to quantify. It has been suggested that the old-forest lichen Lobaria pulmonaria is limited by insufficient dispersal ability. We analyzed 240 DNA extracts derived from snow samples by a L. pulmonaria-specific real-time PCR (polymerase chain reaction) assay of the ITS (internal transcribed spacer) region allowing for the discrimination among propagules originating from a single, isolated source tree or propagules originating from other locations. Samples that were detected as positives by real-time PCR were additionally genotyped for five L. pulmonaria microsatellite loci. Both molecular approaches demonstrated substantial dispersal from other than local sources. In a landscape approach, we additionally analyzed 240 snow samples with real-time PCR of ITS and detected propagules not only in forests where L. pulmonaria was present, but also in large unforested pasture areas and in forest patches where L. pulmonaria was not found. Monitoring of soredia of L. pulmonaria transplanted to maple bark after two vegetation periods showed high variance in growth among forest stands, but no significant differences among different transplantation treatments. Hence, it is probably not dispersal limitation that hinders colonization in the old-forest lichen L. pulmonaria, but ecological constraints at the stand level that can result in establishment limitation. Our study exemplifies that care has to be taken to adequately separate the effects of dispersal limitation from a limitation of establishment. PMID:16937643

  10. Quantifying terrestrial ecosystem carbon dynamics in the Jinsha watershed, Upper Yangtze, China from 1975 to 2000

    USGS Publications Warehouse

    Zhao, Shuqing

    2010-01-01

    Quantifying the spatial and temporal dynamics of carbon stocks in terrestrial ecosystems and carbon fluxes between the terrestrial biosphere and the atmosphere is critical to our understanding of regional patterns of carbon budgets. Here we use the General Ensemble biogeochemical Modeling System to simulate the terrestrial ecosystem carbon dynamics in the Jinsha watershed of China’s upper Yangtze basin from 1975 to 2000, based on unique combinations of spatial and temporal dynamics of major driving forces, such as climate, soil properties, nitrogen deposition, and land use and land cover changes. Our analysis demonstrates that the Jinsha watershed ecosystems acted as a carbon sink during the period of 1975–2000, with an average rate of 0.36 Mg/ha/yr, primarily resulting from regional climate variation and local land use and land cover change. Vegetation biomass accumulation accounted for 90.6% of the sink, while soil organic carbon loss before 1992 led to a lower net gain of carbon in the watershed, and after that soils became a small sink. Ecosystem carbon sink/source patterns showed a high degree of spatial heterogeneity. Carbon sinks were associated with forest areas without disturbances, whereas carbon sources were primarily caused by stand-replacing disturbances. It is critical to adequately represent the detailed fast-changing dynamics of land use activities in regional biogeochemical models to determine the spatial and temporal evolution of regional carbon sink/source patterns.

  11. Quantifying sub-pixel urban impervious surface through fusion of optical and inSAR imagery

    USGS Publications Warehouse

    Yang, L.; Jiang, L.; Lin, H.; Liao, M.

    2009-01-01

    In this study, we explored the potential to improve urban impervious surface modeling and mapping with the synergistic use of optical and Interferometric Synthetic Aperture Radar (InSAR) imagery. We used a Classification and Regression Tree (CART)-based approach to test the feasibility and accuracy of quantifying Impervious Surface Percentage (ISP) using four spectral bands of SPOT 5 high-resolution geometric (HRG) imagery and three parameters derived from the European Remote Sensing (ERS)-2 Single Look Complex (SLC) SAR image pair. Validated by an independent ISP reference dataset derived from the 33 cm-resolution digital aerial photographs, results show that the addition of InSAR data reduced the ISP modeling error rate from 15.5% to 12.9% and increased the correlation coefficient from 0.71 to 0.77. Spatially, the improvement is especially noted in areas of vacant land and bare ground, which were incorrectly mapped as urban impervious surfaces when using the optical remote sensing data. In addition, the accuracy of ISP prediction using InSAR images alone is only marginally less than that obtained by using SPOT imagery. The finding indicates the potential of using InSAR data for frequent monitoring of urban settings located in cloud-prone areas.
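
    A minimal sketch of the CART regression setup described above, assuming scikit-learn; the seven features (standing in for four SPOT bands plus three InSAR parameters) and the synthetic ISP targets are hypothetical stand-ins for the real imagery.

    ```python
    # CART regression of per-pixel impervious-surface percentage (ISP).
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(5)
    X = rng.uniform(0, 1, size=(200, 7))        # 4 "SPOT bands" + 3 "InSAR" features
    isp = np.clip(60 * X[:, 0] - 30 * X[:, 5]
                  + 20 * rng.standard_normal(200) + 40, 0, 100)

    model = DecisionTreeRegressor(max_depth=6, min_samples_leaf=5).fit(X, isp)
    print("predicted ISP (%):", model.predict(X[:3]).round(1))
    ```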

  12. Quantifying downstream impacts of impoundment on flow regime and channel planform, lower Trinity River, Texas

    NASA Astrophysics Data System (ADS)

    Wellmeyer, Jessica L.; Slattery, Michael C.; Phillips, Jonathan D.

    2005-07-01

    As human population worldwide has grown, so has interest in harnessing and manipulating the flow of water for the benefit of humans. The Trinity River of eastern Texas is one such watershed greatly impacted by engineering and urbanization. Draining the Dallas-Fort Worth metroplex, just under 30 reservoirs are in operation in the basin, regulating flow while containing public supplies, supporting recreation, and providing flood control. Lake Livingston is the lowest, as well as largest, reservoir in the basin, a mere 95 km above the Trinity's outlet near Galveston Bay. This study seeks to describe and quantify channel activity and flow regime, identifying effects of the 1968 closure of Livingston dam. Using historic daily and peak discharge data from USGS gauging stations, flow duration curves are constructed, identifying pre- and post-dam flow conditions. A digital historic photo archive was also constructed using six sets of aerial photographs spanning from 1938 to 1995, and three measures of channel activity applied using a GIS. Results show no changes in high flow conditions following impoundment, while low flows are elevated. However, the entire post-dam period is characterized by significantly higher rainfall, which may be obscuring the full impact of flow regulation. Channel activity rates do not indicate a more stabilized planform following dam closure; rather they suggest that the Trinity River is adjusting itself to the stress of Livingston dam in a slow, gradual process that may not be apparent in a modern time scale.
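
    A minimal sketch of a flow duration curve: sort daily discharges and pair each with its exceedance probability; comparing pre- and post-1968 curves would isolate the reported low-flow shift. The discharge record here is synthetic.

    ```python
    # Flow duration curve: discharge vs exceedance probability.
    import numpy as np

    q = np.random.default_rng(6).lognormal(mean=4.0, sigma=1.0, size=3650)  # m^3/s
    q_sorted = np.sort(q)[::-1]                      # highest flow first
    exceedance = np.arange(1, q_sorted.size + 1) / (q_sorted.size + 1)

    for p in (0.1, 0.5, 0.9):                        # Q10 (high), Q50, Q90 (low)
        print(f"Q{int(p * 100)} = {q_sorted[int(p * q_sorted.size)]:.0f} m^3/s")
    ```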

  13. Quantifying the spoilage and shelf-life of yoghurt with fruits.

    PubMed

    Mataragas, M; Dimitriou, V; Skandamis, P N; Drosinos, E H

    2011-05-01

    The aim of the present study was to develop a predictive model to quantify the spoilage of yoghurt with fruits. Product samples were stored at various temperatures (5-20 °C) and subjected to microbiological (total viable counts, lactic acid bacteria (LAB), yeasts and moulds) and physico-chemical analyses (pH, titratable acidity and sugars). LAB were the dominant microflora. Yeast populations increased at all temperatures, but a delay was observed during the first days of storage. Titratable acidity and pH remained almost constant at low temperatures (5 and 10 °C). However, at higher temperatures (>10 °C), an increase in titratable acidity and a reduction in pH were observed. Sugar concentrations (fructose, lactose and glucose) decreased during storage. A mathematical model was developed for shelf-life determination of the product and was successfully validated at a temperature (17 °C) not used during model development. The results showed that the shelf-life of this product could not be established from microbiological data alone; other parameters, such as sensory and/or physico-chemical analyses, are also required. Shelf-life determination by spoilage tests is time-consuming, raising the need for new, rapid techniques. The developed model could help dairy industries establish shelf-life predictions for yoghurt with fruits stored under constant temperature conditions. PMID:21356472
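
    The abstract does not state the form of the shelf-life model, so the following sketch is purely illustrative: it fits an Arrhenius-type temperature dependence to hypothetical spoilage rates and predicts shelf life at the 17 °C validation temperature. All numbers are placeholders, not the study's data.

```python
# Illustrative Arrhenius-type shelf-life sketch (hypothetical rates, not the study's model).
import numpy as np

temps_c = np.array([5.0, 10.0, 15.0, 20.0])          # storage temperatures, deg C
rates = np.array([0.05, 0.11, 0.24, 0.50])           # hypothetical spoilage rates, 1/day

inv_T = 1.0 / (temps_c + 273.15)                     # 1/K
slope, intercept = np.polyfit(inv_T, np.log(rates), 1)
Ea = -slope * 8.314 / 1000.0                         # apparent activation energy, kJ/mol

def shelf_life_days(temp_c, q_limit=1.0):
    """Days until a (hypothetical) spoilage index reaches q_limit."""
    k = np.exp(intercept + slope / (temp_c + 273.15))
    return q_limit / k

print(f"Ea ~ {Ea:.0f} kJ/mol; predicted shelf life at 17 C: "
      f"{shelf_life_days(17.0):.1f} days")           # 17 C: the validation temperature
```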

  14. Research and development of a portable device to quantify muscle tone in patients with Parkinson's disease.

    PubMed

    Wright, David; Nakamura, Kazuhiro; Maeda, Tetsuya; Kutsuzawa, Keiichi; Miyawaki, Kazuhito; Nagata, Ken

    2008-01-01

    Parkinson's disease (PD) is a progressive, degenerative condition that is characterised by tremor, bradykinesia, cogwheel rigidity and postural instability. Currently, only subjective tests are employed, and we aim to develop a portable device to quantify muscle tone in patients with movement disorders, in particular Parkinson's disease. A servo-motor robotic arm was developed to rhythmically flex and extend the subject's arm, and the response was measured using continuous electromyography (EMG) sampled at 4 kHz. Surface electrodes were attached over the biceps of the more severely affected arm, and a reference electrode was attached to the back of the contralateral hand. EMG data were normalized using the resting mean EMG level and then segmented into epochs according to the position data. EMG recorded from normal subjects remained flat throughout the trial, with only very small fluctuations in amplitude about the mean. In contrast, biceps activity in parkinsonian patients tended to increase with flexion and, as expected, there were large variations in amplitude about the baseline activity. A fast Fourier transform was also performed, and spectra obtained from a typical parkinsonian patient showed two peaks at approximately 6 Hz and 8 Hz, consistent with previously published data. In conclusion, the results clearly differentiate between normal and parkinsonian cases and, to some extent, severity. Quantification of disease severity might be possible given a broader range and greater number of patients. PMID:19163293
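
    The spectral step is easy to sketch. The code below builds a synthetic "parkinsonian" EMG epoch sampled at 4 kHz, as in the study, and locates the dominant tremor-band peak with a fast Fourier transform; the signal itself and the band limits are illustrative assumptions.

```python
# A minimal sketch of tremor-peak detection in an EMG epoch via FFT.
import numpy as np

fs = 4000                                             # sampling rate, Hz (as in the study)
t = np.arange(0, 10.0, 1.0 / fs)                      # one 10 s epoch
# Synthetic "parkinsonian" EMG: tremor components plus broadband noise.
emg = (0.8 * np.sin(2 * np.pi * 6.0 * t)
       + 0.5 * np.sin(2 * np.pi * 8.0 * t)
       + 0.3 * np.random.default_rng(2).standard_normal(t.size))

emg = emg - emg.mean()                                # remove DC before the FFT
spectrum = np.abs(np.fft.rfft(emg)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

band = (freqs > 3) & (freqs < 12)                     # tremor band of interest
peak = freqs[band][np.argmax(spectrum[band])]
print(f"Dominant tremor-band peak: {peak:.1f} Hz")
```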

  15. Quantifying the impacts of dust on the Caspian Sea using a regional climate model

    NASA Astrophysics Data System (ADS)

    Elguindi, N.; Solmon, F.; Turuncoglu, U.

    2013-12-01

    The Karakum desert and the surrounding areas adjacent to the Caspian Sea (CS) provide a significant source of dust to the region. Local dust events can have a substantial impact on SSTs and evaporation from the Sea through direct radiative effects. Given the high interest in projected changes in the Caspian Sea Level (CSL), it is critical that we understand these effects in order to accurately model net sea evaporation, a major component of the CS hydrological budget. In this study, we employ a regional climate model (RegCM4) coupled to the 1D Hostetler lake model to explore the impact of dust on the CS. Dust is simulated in RegCM4 through an interactive dust emission and transport model coupled to the radiation scheme, together with a representation of anthropogenic aerosols. The first part of this study focuses on evaluating the ability of RegCM4 to simulate dust in the region by comparing (1) seasonal climatologies of modelled aerosol optical depth (AOD) to a range of satellite sources, and (2) a climatology of dust events, as well as decadal variability, to observations derived from visibility measurements. The second part of this study attempts to quantify the impact of dust on Caspian SSTs, evaporation and heat flux components. The results show that simulating the effects of dust on the CS is necessary for accurately modeling the Sea's hydrological budget.

  16. Quantifying spatial and temporal trends in beach-dune volumetric changes using spatial statistics

    NASA Astrophysics Data System (ADS)

    Eamer, Jordan B. R.; Walker, Ian J.

    2013-06-01

    Spatial statistics are generally underutilized in coastal geomorphology, despite offering great potential for identifying and quantifying spatial-temporal trends in landscape morphodynamics. In particular, local Moran's Ii provides a statistical framework for detecting clusters of significant change in an attribute (e.g., surface erosion or deposition) and quantifying how this changes over space and time. This study analyzes and interprets spatial-temporal patterns in sediment volume changes in a beach-foredune-transgressive dune complex following removal of invasive marram grass (Ammophila spp.). Results are obtained by detecting significant changes in post-removal repeat DEMs derived from topographic surveys and airborne LiDAR. The study site was separated into discrete, linked geomorphic units (beach, foredune, transgressive dune complex) to facilitate sub-landscape-scale analysis of volumetric change and sediment budget responses. Difference surfaces derived from a pixel-subtraction algorithm between interval DEMs and the LiDAR baseline DEM were filtered using the local Moran's Ii method and two different spatial weights (1.5 and 5 m) to detect statistically significant change. Moran's Ii results were compared with those derived from a more spatially uniform statistical method that uses a simpler Student's t distribution threshold for change detection. Morphodynamic patterns and volumetric estimates were similar between the uniform geostatistical method and Moran's Ii at a spatial weight of 5 m, while the smaller spatial weight (1.5 m) consistently indicated volumetric changes of lesser magnitude. The larger 5 m spatial weight was most representative of broader site morphodynamics and spatial patterns, while the smaller spatial weight provided volumetric changes consistent with field observations. All methods showed foredune deflation immediately following removal, with increased sediment volumes into the spring via deposition at the crest and on lobes in the lee, despite erosion on the stoss slope and dune toe. Generally, the foredune became wider by landward extension, and the seaward slope recovered from erosion to a height and form similar to that of pre-restoration, despite remaining essentially free of vegetation.
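
    A minimal numpy sketch of the local Moran's Ii filter, applied to a synthetic DEM-difference surface, is shown below. The grid size, the 1.5 m distance-band weights, and the clustering cutoff are illustrative; the study additionally used a 5 m weight and formal significance testing.

```python
# A minimal local Moran's Ii sketch on an erosion/deposition surface.
import numpy as np

rng = np.random.default_rng(3)
ny, nx, cell = 40, 40, 1.0                            # 40 x 40 grid of 1 m cells
dz = rng.normal(0.0, 0.05, (ny, nx))                  # elevation change surface (m)
dz[10:20, 10:20] += 0.4                               # synthetic deposition cluster

yy, xx = np.mgrid[0:ny, 0:nx]
coords = np.column_stack([xx.ravel() * cell, yy.ravel() * cell])
z = dz.ravel() - dz.mean()

# Binary distance-band weights (here 1.5 m), row-standardized.
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
W = ((d > 0) & (d <= 1.5)).astype(float)
W /= W.sum(axis=1, keepdims=True)

m2 = (z ** 2).mean()
local_i = z * (W @ z) / m2                            # local Moran's Ii per cell

print(f"cells with strong positive clustering: {(local_i > 1.0).sum()}")
```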

  17. Latest European coelacanth shows Gondwanan affinities.

    PubMed

    Cavin, Lionel; Forey, Peter L; Buffetaut, Eric; Tong, Haiyan

    2005-06-22

    The last European fossil occurrence of a coelacanth is from the Mid-Cretaceous of the English Chalk (Turonian, 90 million years ago). Here, we report the discovery of a coelacanth from Late Cretaceous non-marine rocks in southern France. It consists of a left angular bone showing structures that imply close phylogenetic affinities with some extinct Mawsoniidae. The closest relatives are otherwise known from Cretaceous continental deposits of southern continents and suggest that the dispersal of freshwater organisms from Africa to Europe occurred in the Late Cretaceous. PMID:17148159

  18. Quantified energy dissipation rates in the terrestrial bow shock: 2. Waves and dissipation

    NASA Astrophysics Data System (ADS)

    Wilson, L. B.; Sibeck, D. G.; Breneman, A. W.; Contel, O. Le; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.

    2014-08-01

    We present the first quantified measure of the energy dissipation rates, due to wave-particle interactions, in the transition region of the Earth's collisionless bow shock using data from the Time History of Events and Macroscale Interactions during Substorms spacecraft. Our results show that wave-particle interactions can regulate the global structure and dominate the energy dissipation of collisionless shocks. In every bow shock crossing examined, we observed both low-frequency (<10 Hz) and high-frequency (≥10 Hz) electromagnetic waves throughout the entire transition region and into the magnetosheath. The low-frequency waves were consistent with magnetosonic-whistler waves. The high-frequency waves were combinations of ion-acoustic waves, electron cyclotron drift instability driven waves, electrostatic solitary waves, and whistler mode waves. The high-frequency waves had the following: (1) peak amplitudes exceeding δB ~ 10 nT and δE ~ 300 mV/m, though more typical values were δB ~ 0.1-1.0 nT and δE ~ 10-50 mV/m; (2) Poynting fluxes in excess of 2000 μW m⁻² (typical values were ~1-10 μW m⁻²); (3) resistivities > 9000 Ω m; and (4) associated energy dissipation rates > 10 μW m⁻³. The dissipation rates due to wave-particle interactions exceeded the rates necessary to explain the increase in entropy across the shock ramps for ~90% of the wave burst durations. For ~22% of these times, the wave-particle interactions needed to be only ~0.1% efficient to balance the nonlinear wave steepening that produced the shock waves. These results show that wave-particle interactions have the capacity to regulate the global structure and dominate the energy dissipation of collisionless shocks.
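
    The Poynting-flux figures quoted above can be sanity-checked from the field amplitudes with |S| = |δE × δB| / μ₀. The sketch below uses the "typical" amplitudes from the abstract and recovers a flux in the quoted ~1-10 μW m⁻² range.

```python
# A minimal Poynting-flux estimate from wave field perturbations.
import numpy as np

mu0 = 4e-7 * np.pi                                    # vacuum permeability, H/m
dE = np.array([30e-3, 0.0, 0.0])                      # 30 mV/m -> V/m
dB = np.array([0.0, 0.5e-9, 0.0])                     # 0.5 nT  -> T

S = np.cross(dE, dB) / mu0                            # Poynting vector, W/m^2
print(f"|S| ~ {np.linalg.norm(S) * 1e6:.1f} uW/m^2")  # ~12 uW/m^2, in the typical range
```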

  19. Infrared imaging to quantify the effects of nicotine-induced vasoconstriction in humans

    NASA Astrophysics Data System (ADS)

    Brunner, Siegfried; Kargel, Christian

    2009-05-01

    Smoking is the most significant source of preventable morbidity and premature mortality worldwide (WHO, 2008). One of the many effects of nicotine is vasoconstriction, which is triggered by the autonomic nervous system. The constriction of blood vessels, e.g., of the skin's vascular bed, reduces the supply of oxygen and nutrients and lowers the skin temperature. We used infrared imaging to quantify the temperature decreases caused by cigarette smoking in the extremities of smokers, and also monitored heart rate and blood pressure. The results - including thermograms showing "temporary amputations" of the fingertips due to a significant temperature drop - can help increase awareness of the dangers of smoking and the success of withdrawal programs. Surprisingly, in our control persons (3 brave non-smoking volunteers who smoked a cigarette) we also found temperature increases, suggesting that vasodilation (widening of blood vessels) was provoked by cigarettes. To verify this unexpected finding and eliminate effects from the approximately 4000 chemical compounds in the smoke, we repeated the experiment following a stringent protocol ruling out physiological and psychological influences, with 9 habitual smokers and 17 non-smokers, all of whom chewed gum containing 2 mg of nicotine. Task-optimized digital image processing techniques (target detection, image registration and segmentation) were applied to the acquired infrared image sequences to automatically yield temperature plots of the fingers and palm. In this paper we present the results of our study in detail and show that smokers and non-smokers respond differently to the administration of nicotine.

  20. Show Me the Invisible: Visualizing Hidden Content

    PubMed Central

    Geymayer, Thomas; Steinberger, Markus; Lex, Alexander; Streit, Marc; Schmalstieg, Dieter

    2014-01-01

    Content on computer screens is often inaccessible to users because it is hidden, e.g., occluded by other windows, outside the viewport, or overlooked. In search tasks, the efficient retrieval of sought content is important. Current software, however, only provides limited support to visualize hidden occurrences and rarely supports search synchronization crossing application boundaries. To remedy this situation, we introduce two novel visualization methods to guide users to hidden content. Our first method generates awareness for occluded or out-of-viewport content using see-through visualization. For content that is either outside the screen’s viewport or for data sources not opened at all, our second method shows off-screen indicators and an on-demand smart preview. To reduce the chances of overlooking content, we use visual links, i.e., visible edges, to connect the visible content or the visible representations of the hidden content. We show the validity of our methods in a user study, which demonstrates that our technique enables a faster localization of hidden content compared to traditional search functionality and thereby assists users in information retrieval tasks. PMID:25325078

  1. Conservation Project Shows Substantial Reduction in Home Water Use

    ERIC Educational Resources Information Center

    Sharpe, William E.; Smith, Donald

    1978-01-01

    Describes a water use study-conservation project conducted by the Washington Suburban Sanitary Commission in Maryland. Results show a significant decrease in the amount of water used by home customers over a ten-year period. (Author/MA)

  2. Isotopes in Urban Cheatgrass Quantify Atmospheric Pollution

    NASA Astrophysics Data System (ADS)

    Kammerdiener, S. A.; Ehleringer, J. R.

    2006-12-01

    This study presents evidence that the nitrogen and carbon stable isotope values of vegetation can be used as integrators of ephemeral atmospheric pollution signals. Leaves and stems of Bromus tectorum and soil samples were collected in the urban Salt Lake Valley and in the rural Skull Valley of Utah. These samples were used to develop a map of the spatial distribution of δ13C and δ15N values of leaves and stems of Bromus tectorum and soils around each valley. The spatial distributions of δ15N values of leaves and stems of Bromus tectorum and associated soils were significantly correlated. The average δ15N value for Salt Lake Valley Bromus tectorum leaves and stems was 2.37‰, while the average value for Skull Valley Bromus tectorum leaves and stems was 4.76‰. It is possible that the higher concentration of atmospheric nitrogen pollutants measured in the Salt Lake Valley provided the 15N-depleted nitrogen source for uptake by plants and deposition on soils, though the δ15N value of the source nitrogen was not measured directly. The presence of a seasonal difference in δ15N values of leaves and stems of Bromus tectorum sampled in Salt Lake Valley, but not in Skull Valley, further supports this idea. Leaves and stems of Bromus tectorum sampled in the Salt Lake Valley in April 2003 had a statistically more positive average δ15N value of 2.4‰ than samples collected in August 2003, which had an average δ15N value of 0.90‰. The carbon isotope values of leaves and stems of Bromus tectorum and air samples collected in Salt Lake Valley were more negative than values measured in Skull Valley samples (Salt Lake δ13Cplant = -28.50‰ and δ13Cair = -9.32‰; Skull Valley δ13Cplant = -27.58‰ and δ13Cair = -8.52‰). This supports the idea that differences in stable isotope values of source air are correlated with differences in stable isotope values of exposed vegetation. Overall, the results of this study suggest that the carbon and nitrogen stable isotope values measured in vegetation are useful indicators of differences in atmospheric pollutant concentration in urban and rural areas.

  3. Quantifying retro-foreland evolution in the Eastern Pyrenees.

    NASA Astrophysics Data System (ADS)

    Grool, Arjan R.; Ford, Mary; Huismans, Ritske S.

    2015-04-01

    The northern Pyrenees form the retro-foreland of the Pyrenean orogen. Modelling studies show that retro-forelands have several contrasting characteristics compared to pro-forelands: they tend to show constant tectonic subsidence during the growth phase of an orogen, and no tectonic subsidence during the steady-state phase. Retro-forelands are also not displaced into the core of the orogen once the steady-state phase is achieved. This means they tend to preserve the subsidence history from the growth phase of the orogen, but little or no history from the steady-state phase. The northeastern Pyrenees (Carcassonne high) are a good location to test these characteristics against real-world data, because syn-orogenic sediments are preserved and the lack of post-rift thermal subsidence and Triassic salt reduces complicating factors. In order to test the model, quantification of the following parameters is needed: timing, amount and distribution of deformation, subsidence and sedimentation. We use subsurface, field, map and literature data to construct two balanced and restored cross sections through the eastern north Pyrenean foreland, stretching from the Montagne Noire in the north to the Axial Zone in the south. We will link this to published thermochronology data to further constrain the evolution of the retro-foreland and investigate the link with the Axial Zone towards the south. We will quantify subsidence, deformation and sedimentation and link them to exhumation phases in the North Pyrenean Zone (NPZ) and the Axial Zone. The north Pyrenean retro-foreland is divided into two parts: the external foreland basin (Aquitaine basin) to the north and the North Pyrenean Zone to the south, separated by the North Pyrenean Frontal Thrust (NPFT). South of the NPZ lies the Axial Zone, separated from the retro-foreland by the North Pyrenean Fault, which is believed to be the suture between Iberia and Europe. The NPFT was the breakaway fault on the European continent during the Apto-Albian rifting phase and was strongly inverted during the Pyrenean orogeny. South of the NPFT we find Lower Cretaceous and older sediments, including Triassic salt. These sediments are completely absent north of the NPFT (on the Carcassonne high), indicating its significance during the extensional phase. The retro-foreland is deformed by fault-propagation folds above basement-involving thrusts. A slow northward propagation of deformation and sedimentation is clearly visible. The preserved thickness of Upper Cretaceous sediments corresponds with the retro-foreland model's prediction that early subsidence records are preserved. Two distinct deformation phases are recognized, but not the latest Oligocene phase that is found in the pro-foreland (southern Pyrenees). This could indicate a steady state during the late Oligocene. We quantify and constrain the evolution of the eastern Pyrenean retro-foreland basin, investigate the link with the Axial Zone, and investigate the pre-orogenic configuration of the region that currently constitutes the eastern Pyrenean retro-foreland.

  4. Nature's Late-Night Light Shows

    NASA Astrophysics Data System (ADS)

    Peterson, Carolyn Collins

    2002-09-01

    In addition to stars and planets, there are other interesting lights to be seen in the night sky. The northern and southern lights, called the aurora borealis and aurora australis, are created by charged particles from the Sun reacting in Earth's magnetic field. Night-shining clouds or noctilucent clouds appear at evening twilight as a result of water vapor in the polar mesosphere. Zodiacal light can be seen stretching up from the horizon after sunset or before sunrise.

  5. Color Voyager 2 Image Showing Crescent Uranus

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This image shows a crescent Uranus, a view that Earthlings never witnessed until Voyager 2 flew near and then beyond Uranus on January 24, 1986. This planet's natural blue-green color is due to the absorption of redder wavelengths in the atmosphere by traces of methane gas. Uranus' diameter is 32,500 miles, a little over four times that of Earth. The hazy blue-green atmosphere probably extends to a depth of around 5,400 miles, where it rests above what is believed to be an icy or liquid mixture (an 'ocean') of water, ammonia, methane, and other volatiles, which in turn surrounds a rocky core perhaps a little smaller than Earth.

  6. Microbiological and environmental issues in show caves.

    PubMed

    Saiz-Jimenez, Cesareo

    2012-07-01

    Cultural tourism expanded in the last half of the twentieth century, and the interest of visitors has come to include caves containing archaeological remains. Some show caves attracted mass tourism, and economic interests prevailed over conservation, which led to a deterioration of the subterranean environment and the rock art. The presence and role of microorganisms in caves is a topic that is often ignored in cave management. Knowledge of these microorganisms' colonisation patterns and dispersion mechanisms, and of their effects on human health and, when present, on rock art paintings, is of the utmost importance. In this review the most recent advances in the study of microorganisms in caves are presented, together with the environmental implications of the findings. PMID:22806150

  7. Survey shows successes, failures of horizontal wells

    SciTech Connect

    Deskins, W.G.; McDonald, W.J. [Maurer Engineering Inc., Houston, TX (United States); Reid, T.B. [Dept. of Energy, Bartlesville, OK (United States)

    1995-06-19

    Industry's experience now shows that horizontal well technology must be applied thoughtfully and be site-specific to attain technical and economic success. This article, based on a comprehensive study done by Maurer Engineering for the US Department of Energy (DOE), addresses the success of horizontal wells in less-publicized formations, that is, other than the Austin chalk. Early excitement within the industry about the new technology reached a fever pitch at times, leaving some with the impression that horizontal drilling is a panacea for all drilling environments. This work gauges the overall success of horizontal technology in US and Canadian oil and gas fields, defines the applications where horizontal technology is most appropriate, and assesses its impact on oil recovery and reserves.

  8. Surveys show support for green 'activities'.

    PubMed

    Baillie, Jonathan

    2012-03-01

    Two independently conducted surveys on sustainability - one into the 'views and values' of NHS 'leaders', and the other questioning the public about the importance of the 'green agenda' in the NHS, and their opinions on how the service might most effectively reduce its carbon footprint, form the basis of Sustainability in the NHS: Health Check 2012, a new NHS Sustainable Development Unit (NHS SDU) publication. As HEJ editor Jonathan Baillie reports, the new document also presents updated data on the 'size' of the carbon footprint of the NHS in England, showing that, although good work by a number of Trusts in the past two years has seen healthcare-generated carbon emissions start to 'level off', the biggest contributors have been the current health service spending review, and the increased national availability of renewable energy. PMID:22515017

  9. Mesenchymal stem cells show radioresistance in vivo

    PubMed Central

    Singh, Sarvpreet; Kloss, Frank R; Brunauer, Regina; Schimke, Magdalena; Jamnig, Angelika; Greiderer-Kleinlercher, Brigitte; Klima, Günter; Rentenberger, Julia; Auberger, Thomas; Hächl, Oliver; Rasse, Michael; Gassner, Robert; Lepperdinger, Günter

    2012-01-01

    Abstract Irradiation impacts the viability and differentiation capacity of tissue-borne mesenchymal stem cells (MSC), which play a pivotal role in bone regeneration. As a consequence of radiotherapy, bones may develop osteoradionecrosis. When human bone-derived MSC were irradiated in vitro with increasing doses, the cells' self-renewal capabilities were greatly reduced. Mitotically stalled cells were still capable of differentiating into osteoblasts and pre-adipocytes. As a large animal model comparable to the clinical situation, pig mandibles were subjected to fractionated irradiation of 2 × 9 Gy within 1 week. This treatment mimics a standardized clinical treatment regimen for head and neck cancer patients irradiated with 30 × 2 Gy. In the pig model, fractures which had been irradiated showed delayed osseous healing. When isolating MSC at different time points post-irradiation, no significant changes regarding proliferation capacity and osteogenic differentiation potential became apparent. Therefore, pig mandibles were irradiated with a single dose of either 9 or 18 Gy in vivo, and MSC were isolated immediately afterwards. No significant differences between the untreated and 9 Gy irradiated bone with respect to proliferation and osteogenic differentiation were unveiled. Yet, cells isolated from 18 Gy irradiated specimens exhibited a reduced osteogenic differentiation capacity, and during the first 2 weeks proliferation rates were greatly diminished. Thereafter, cells recovered and showed normal proliferation behaviour. These findings imply that MSC can effectively cope with irradiation up to high doses in vivo. This finding should thus be implemented in future therapeutic concepts to protect regenerating tissue from radiation consequences. PMID:21762375

  10. Quantifying Cumulative Effects of Multiple Rock Quarries on Aquifers

    NASA Astrophysics Data System (ADS)

    West, A. C.; Marentette, K. A.; Oxtobee, J. P.

    2009-05-01

    The determination of the cumulative effect of more than one quarry on groundwater resources is a task that is being requested by regulatory agencies with increasing frequency, generally as part of a license or permit application advanced by a single quarry operator. One key complicating factor is that data from potentially competing adjacent quarry operators must be collected to assist the applicant in assessing the cumulative effect of multiple quarries in the same general area. A second complicating factor is that the cumulative effect is strongly dependent on the timing of the development of the quarries under consideration, which is difficult to predict and/or determine from all parties. Notwithstanding these difficulties, the cumulative effects (principally groundwater level drawdown and its effect on wells and surface water features) of a cluster of three existing quarries and one proposed quarry were assessed in support of a quarry license application for the proposed quarry. The method involved the development of regional-scale groundwater flow models of existing and future conditions. The model of existing conditions was based on a structural geological model validated by on-site core logging and geophysical logging and off-site geological mapping. The hydrostratigraphy was aligned with the structural model, with thicknesses and hydraulic conductivities inferred from the results of packer testing, single well response testing, pump testing, and from calibration of the simulated hydraulic heads to those measured in numerous on-site multi-level monitoring wells and those inferred from static water levels recorded in off-site water wells. The models of future conditions were based on the adjustment of the model of existing conditions to reflect an assumed "worst case" configuration of quarries, with and without the proposed quarry. The modelling results were used to quantify the cumulative effects assuming the successful licensure of the proposed quarry, and the proportion of these effects that would be attributable to the proposed quarry.

  11. A FRAMEWORK FOR QUANTIFYING THE DEGENERACIES OF EXOPLANET INTERIOR COMPOSITIONS

    SciTech Connect

    Rogers, L. A.; Seager, S. [Department of Physics, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States)

    2010-04-01

    Several transiting super-Earths are expected to be discovered in the coming few years. While tools to model the interior structure of transiting planets exist, inferences about the composition are fraught with ambiguities. We present a framework to quantify how much we can robustly infer about super-Earth and Neptune-size exoplanet interiors from radius and mass measurements. We introduce quaternary diagrams to illustrate the range of possible interior compositions for planets with four layers (iron core, silicate mantles, water layers, and H/He envelopes). We apply our model to CoRoT-7b, GJ 436b, and HAT-P-11b. Interpretation of planets with H/He envelopes is limited by the model uncertainty in the interior temperature, while for CoRoT-7b observational uncertainties dominate. We further find that our planet interior model sharpens the observational constraints on CoRoT-7b's mass and radius, assuming the planet does not contain significant amounts of water or gas. We show that the strength of the limits that can be placed on a super-Earth's composition depends on the planet's density; for similar observational uncertainties, high-density super-Mercuries allow the tightest composition constraints. Finally, we describe how techniques from Bayesian statistics can be used to take into account in a formal way the combined contributions of both theoretical and observational uncertainties to ambiguities in a planet's interior composition. On the whole, with only a mass and radius measurement an exact interior composition cannot be inferred for an exoplanet because the problem is highly underconstrained. Detailed quantitative ranges of plausible compositions, however, can be found.

  12. Quantifying cortical EEG responses to TMS in (un)consciousness.

    PubMed

    Sarasso, Simone; Rosanova, Mario; Casali, Adenauer G; Casarotto, Silvia; Fecchio, Matteo; Boly, Melanie; Gosseries, Olivia; Tononi, Giulio; Laureys, Steven; Massimini, Marcello

    2014-01-01

    We normally assess another individual's level of consciousness based on her or his ability to interact with the surrounding environment and communicate. Usually, if we observe purposeful behavior, appropriate responses to sensory inputs, and, above all, appropriate answers to questions, we can be reasonably sure that the person is conscious. However, we know that consciousness can be entirely within the brain, even in the absence of any interaction with the external world; this happens almost every night, while we dream. Yet, to this day, we lack an objective, dependable measure of the level of consciousness that is independent of processing sensory inputs and producing appropriate motor outputs. Theoretically, consciousness is thought to require the joint presence of functional integration and functional differentiation, otherwise defined as brain complexity. Here we review a series of recent studies in which Transcranial Magnetic Stimulation combined with electroencephalography (TMS/EEG) has been employed to quantify brain complexity in wakefulness and during physiological (sleep), pharmacological (anesthesia) and pathological (brain injury) loss of consciousness. These studies invariably show that the complexity of the cortical response to TMS collapses when consciousness is lost during deep sleep, anesthesia and vegetative state following severe brain injury, while it recovers when consciousness resurges in wakefulness, during dreaming, in the minimally conscious state or locked-in syndrome. The present paper will also focus on how this approach may contribute to unveiling the pathophysiology of disorders of consciousness affecting brain-injured patients. Finally, we will underline some crucial methodological aspects concerning TMS/EEG measurements of brain complexity. PMID:24403317
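
    One widely used ingredient of such complexity measures is the Lempel-Ziv compressibility of the binarized evoked response; the perturbational complexity index is built on this idea. The sketch below is a simplified stand-in for the published pipeline: the binarization, normalization, and synthetic "wake-like" and "sleep-like" signals are all illustrative assumptions.

```python
# A simplified Lempel-Ziv complexity sketch for binarized evoked responses.
import numpy as np

def lempel_ziv(binary_string):
    """Count phrases in a simple LZ76-style parsing of a binary string."""
    i, c = 0, 0
    n = len(binary_string)
    while i < n:
        j = i + 1
        # extend the phrase until it has not been seen earlier in the sequence
        while j <= n and binary_string[i:j] in binary_string[:j - 1]:
            j += 1
        c += 1
        i = j
    return c

rng = np.random.default_rng(4)
wake = rng.integers(0, 2, 2000)                       # rich, differentiated response
sleep = np.repeat(rng.integers(0, 2, 40), 50)         # stereotyped, low-complexity response

for label, sig in [("wake-like", wake), ("sleep-like", sleep)]:
    s = "".join(map(str, sig))
    norm = lempel_ziv(s) * np.log2(len(s)) / len(s)   # standard LZ normalization
    print(f"{label}: normalized complexity = {norm:.2f}")
```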

  13. Schizophrenia shows a unique metabolomics signature in plasma.

    PubMed

    He, Y; Yu, Z; Giegling, I; Xie, L; Hartmann, A M; Prehn, C; Adamski, J; Kahn, R; Li, Y; Illig, T; Wang-Sattler, R; Rujescu, D

    2012-01-01

    Schizophrenia is a severe complex mental disorder affecting 0.5-1% of the world population. To date, diagnosis of the disease is mainly based on personal and thus subjective interviews. The underlying molecular mechanism of schizophrenia is poorly understood. Using targeted metabolomics we quantified and compared 103 metabolites in plasma samples from 216 healthy controls and 265 schizophrenic patients, including 52 cases not taking antipsychotic medication. Compared with healthy controls, levels of five metabolites were found significantly altered in schizophrenic patients (P-values ranging from 2.9 × 10⁻⁸ to 2.5 × 10⁻⁴) and in neuroleptics-free probands (P-values ranging between 0.006 and 0.03), respectively. These metabolites include four amino acids (arginine, glutamine, histidine and ornithine) and one lipid (PC ae C38:6) and are suggested as candidate biomarkers for schizophrenia. To explore the genetic susceptibility of the associated metabolic pathways, we constructed a molecular network connecting these five aberrant metabolites with 13 schizophrenia risk genes. Our results implicate aberrations in biosynthetic pathways linked to glutamine and arginine metabolism, and in associated signaling pathways, as genetic risk factors that may contribute to the patho-mechanisms and memory deficits associated with schizophrenia. This study illustrates that the metabolic deviations detected in plasma may serve as potential biomarkers to aid the diagnosis of schizophrenia. PMID:22892715

  14. A fluorescence-based hydrolytic enzyme activity assay for quantifying toxic effects of Roundup® to Daphnia magna.

    PubMed

    Ørsted, Michael; Roslev, Peter

    2015-08-01

    Daphnia magna is a widely used model organism for aquatic toxicity testing. In the present study, the authors investigated the hydrolytic enzyme activity of D. magna after exposure to toxicant stress. In vivo enzyme activity was quantified using 15 fluorogenic enzyme probes based on 4-methylumbelliferyl or 7-amino-4-methylcoumarin. Probing D. magna enzyme activity was evaluated using short-term exposure (24-48 h) to the reference chemical K₂Cr₂O₇ or the herbicide formulation Roundup®. Toxicant-induced changes in hydrolytic enzyme activity were compared with changes in mobility (International Organization for Standardization standard 6341). The results showed that hydrolytic enzyme activity was quantifiable as a combination of whole-body fluorescence of D. magna and the fluorescence of the surrounding water. Exposure of D. magna to lethal and sublethal concentrations of Roundup resulted in loss of whole-body enzyme activity and release of cell constituents, including enzymes and DNA. Roundup caused comparable inhibition of mobility and alkaline phosphatase activity, with median effective concentration values at 20 °C of 8.7 mg active ingredient (a.i.)/L to 11.7 mg a.i./L. Inhibition of alkaline phosphatase activity by Roundup was lowest at 14 °C and greater at 20 °C and 26 °C. The results suggest that the fluorescence-based hydrolytic enzyme activity assay (FLEA assay) can be used as an index of D. magna stress. Combining enzyme activity with fluorescence measurements may be applied as a simple and quantitative supplement for toxicity testing with D. magna. Environ Toxicol Chem 2015;34:1841-1850. © 2015 SETAC. PMID:25809520
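
    Deriving a median effective concentration of the kind reported above typically involves fitting a dose-response curve. The sketch below fits a two-parameter log-logistic model to synthetic immobility data; the concentrations, responses, and model form are illustrative assumptions, not the study's measurements.

```python
# A minimal log-logistic dose-response fit yielding an EC50 estimate.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(c, ec50, slope):
    """Fraction of response remaining at concentration c."""
    return 1.0 / (1.0 + (c / ec50) ** slope)

conc = np.array([1.0, 2.5, 5.0, 10.0, 20.0, 40.0])         # mg a.i./L (hypothetical)
response = np.array([0.98, 0.95, 0.80, 0.45, 0.15, 0.03])  # fraction unaffected

(ec50, slope), _ = curve_fit(log_logistic, conc, response, p0=[10.0, 2.0])
print(f"EC50 ~ {ec50:.1f} mg a.i./L (slope {slope:.1f})")
```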

  15. Identifying Methane Sources in Groundwater; Quantifying Changes in Compositional and Stable Isotope Values during Multiphase Transport

    NASA Astrophysics Data System (ADS)

    Larson, T.; Sathaye, K.

    2014-12-01

    A dramatic expansion of hydraulic fracturing and horizontal drilling for natural gas in unconventional reserves is underway. This expansion is fueling considerable public concern that extracted natural gas, reservoir brines and associated fracking fluids may infiltrate and contaminate shallower (<500 m depth) groundwater reservoirs, thereby posing a health threat. Attributing methane found in shallow groundwater to either deep thermogenic 'fracking' operations or locally derived shallow microbial sources relies on geochemical methods including alkane wetness and the stable carbon and hydrogen isotope ratios of short-chain (C1-C5) hydrocarbons. Compared with shallow microbial gas, thermogenic gas is wetter and falls within a different range of δ13C and δD values. What is not clear, however, is how the transport of natural gas through water-saturated geological media may affect its compositional and stable isotope values. What is needed is a means to differentiate potential flow paths of natural gas, including 'fast paths' along preexisting fractures and drill casings versus 'slow paths' through low-permeability rocks. In this study we attempt to quantify transport-related effects using 1-dimensional two-phase column experiments and analytical solutions to multi-phase gas injection equations. Two-phase experimental results for an injection of natural gas into a water-saturated column packed with crushed illite show that the natural gas becomes enriched in methane compared to ethane and propane during transport. Carbon isotope measurements are ongoing. Results from the multi-phase gas injection equations that include methane isotopologue solubility and diffusion effects predict the development of a 'bank' of methane depleted in 13C relative to 12C at the front of a plume of fugitive natural gas. These results, therefore, suggest that transport of natural gas through water-saturated geological media may complicate the attribution methods needed to distinguish thermogenic and microbial methane.

  16. Quantifying the effect of the air/water interface in marine active source EM

    NASA Astrophysics Data System (ADS)

    Wright, David

    2015-07-01

    The marine controlled source EM surveying method has become an accepted tool for deep-water exploration for oil and gas reserves. In shallow water (<500 m), the data are complicated by signal that interacts with the water-air interface and can dominate the response at the receiver. By decomposing the 1-D response to an impulsive current dipole source in the time domain and frequency domain, I separate the response into: (1) an earth response, (2) a direct arrival, (3) a coupled airwave which travels through the air and (4) a surface coupling term which travels through the earth. The last two terms are coupled to the sea surface as well as to the earth resistivity structure, but one travels through the air between source and receiver and the other only through the earth. Using a range of simple models, I quantify the effect of these four terms in the time domain and the frequency domain. The results show that in shallow water the total response is significantly larger than in very deep water and that a large part of this extra energy comes from surface coupling, which is reflected at the sea surface and does not propagate through the air but through the earth. As a result, this term is highly sensitive to the resistivity of the earth. This means that the sea surface in shallow water not only significantly increases the signal strength of CSEM data but also enhances the sensitivity to subsurface resistivity structure. Compared with the surface coupling term, the coupled part of the airwave contains very little information about the earth, limited to the near surface. Time domain separation of the airwave from the surface coupling response results in greater sensitivity to a deep resistive target than frequency domain separation, although there is also reasonable sensitivity in the frequency domain.

  17. Lockheed Electra - animation showing air turbulence detection

    NASA Technical Reports Server (NTRS)

    1999-01-01

    On Mar. 24, 1998, an L-188 Electra aircraft owned by the National Science Foundation, Arlington, Virginia, and operated by the National Center for Atmospheric Research, Boulder, Colorado, flew near Boulder with an Airborne Coherent LiDAR (Light Detection and Ranging) for Advanced In-flight Measurement. This aircraft was on its first flight to test its ability to detect previously invisible forms of clear air turbulence. Coherent Technologies Inc., Lafayette, Colorado, built the LiDAR device for the NASA Dryden Flight Research Center, Edwards, California. NASA Dryden participated in the effort as part of the NASA Aviation Safety Program, for which the lead center was Langley Research Center, Hampton, Virginia. Results of the test indicated that the device did successfully detect the clear air turbulence. The computer animation depicts the clear air turbulence (CAT) detection system, known as the 'Airborne Coherent LiDAR for Advanced In-flight Measurement', that was tested aboard the National Science Foundation L-188 Lockheed Electra.

  18. HUBBLE SHOWS EXPANSION OF ETA CARINAE DEBRIS

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The furious expansion of a huge, billowing pair of gas and dust clouds is captured in this NASA Hubble Space Telescope comparison image of the supermassive star Eta Carinae. To create the picture, astronomers aligned and subtracted two images of Eta Carinae taken 17 months apart (April 1994, September 1995). Black represents where the material was located in the older image, and white represents the more recent location. (The light and dark streaks that make an 'X' pattern are instrumental artifacts caused by the extreme brightness of the central star. The bright white region at the center of the image results from the star and its immediate surroundings being 'saturated' in one of the images.) Photo Credit: Jon Morse (University of Colorado), Kris Davidson (University of Minnesota), and NASA. Image files in GIF and JPEG format and captions may be accessed on the Internet via anonymous ftp from oposite.stsci.edu in /pubinfo.

  19. Quantifying and minimizing entropy generation in AMTEC cells

    SciTech Connect

    Hendricks, T.J.; Huang, C. [Advanced Modular Power Systems, Inc., Ann Arbor, MI (United States)

    1997-12-31

    Entropy generation in an AMTEC cell represents inherent power loss to the AMTEC cell. Minimizing cell entropy generation directly maximizes cell power generation and efficiency. An internal project is on-going at AMPS to identify, quantify and minimize entropy generation mechanisms within an AMTEC cell, with the goal of determining cost-effective design approaches for maximizing AMTEC cell power generation. Various entropy generation mechanisms have been identified and quantified. The project has investigated several cell design techniques in a solar-driven AMTEC system to minimize cell entropy generation and produce maximum power cell designs. In many cases, various sources of entropy generation are interrelated such that minimizing entropy generation requires cell and system design optimization. Some of the tradeoffs between various entropy generation mechanisms are quantified and explained and their implications on cell design are discussed. The relationship between AMTEC cell power and efficiency and entropy generation is presented and discussed.

  20. Quantifying the centre of rotation pattern in a multi-body model of the lumbar spine.

    PubMed

    Abouhossein, Alireza; Weisse, Bernhard; Ferguson, Stephen J

    2013-01-01

    Understanding the kinematics of the spine provides paramount knowledge for many aspects of the clinical analysis of back pain. More specifically, visualisation of the instantaneous centre of rotation (ICR) enables clinicians to quantify joint laxity in the segments, avoiding a dependence on more inconclusive measurements based on the range of motion and excessive translations, which vary in every individual. It also provides designers of motion-preserving prostheses with insight into where the physiological ICR of a prosthesis can be situated in order to restore proper load distribution across the passive and active elements of the lumbar region. Before an unconstrained dynamic musculoskeletal model system, based on multi-body models capable of transient analysis, is used to estimate segmental loads, the model must be kinematically evaluated for sensitivity to ligament properties and the initial locus of the intervertebral disc (IVD). A previously calibrated osseoligamentous model of the lumbar spine was used to evaluate the changes in ICR under variation of the ligament stiffness and initial locus of the IVD, when subjected to pure moments from 0 to 15 Nm. The ICR was quantified based on the closed-form solution of the unit quaternion, which improves accuracy and prevents the coordinate singularities often observed in Euler-based methods and least squares principles. The calculation of the ICR during flexion/extension revealed complexity and intrinsic nonlinearity between flexion and extension. This study revealed that, to achieve good agreement between in vitro data and the multi-body model predictions, more laxity is required in flexion than in extension. The results showed that the ICR location is concentrated in the posterior region of the disc, in agreement with previous experimental studies. However, the current multi-body model demonstrates a sensitivity to the initial definition of the ICR, which should be recognised as a limitation of the method. Nevertheless, the current simulations suggest that, due to the constantly evolving path of the ICR across the IVD during flexion-extension, a movable ICR is a necessary condition in multi-body modelling of the spine, in the context of whole body simulation, to accurately capture segmental kinematics and kinetics. PMID:22439815
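
    The centre-of-rotation computation can be illustrated in the plane: for a rigid motion x' = Rx + t with nonzero rotation, the ICR is the transform's fixed point. The study works in 3-D with unit quaternions; the 2-D sketch below, with hypothetical numbers, only illustrates the underlying idea.

```python
# A minimal planar sketch of locating a centre of rotation from a rigid transform.
import numpy as np

theta = np.deg2rad(5.0)                               # hypothetical segmental rotation
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
true_icr = np.array([0.01, -0.02])                    # metres, posterior disc region
t = true_icr - R @ true_icr                           # translation consistent with that ICR

# The ICR is the fixed point of x' = Rx + t, i.e. the solution of (I - R) c = t.
c = np.linalg.solve(np.eye(2) - R, t)
print(f"recovered ICR: {c}")                          # matches true_icr
```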

  1. Quantifying light-dependent circadian disruption in humans and animal models.

    PubMed

    Rea, Mark S; Figueiro, Mariana G

    2014-12-01

    Although circadian disruption is an accepted term, little has been done to develop methods to quantify the degree of disruption or entrainment individual organisms actually exhibit in the field. A variety of behavioral, physiological and hormonal responses vary in amplitude over a 24-h period, and the degree to which these circadian rhythms are synchronized to the daily light-dark cycle can be quantified with a technique known as phasor analysis. Several studies have been carried out using phasor analysis in an attempt to measure circadian disruption exhibited by animals and by humans. To perform these studies, species-specific light measurement and light delivery technologies had to be developed, based upon a fundamental understanding of circadian phototransduction mechanisms in the different species. When nocturnal rodents and diurnal humans experienced different species-specific light-dark shift schedules, they showed, based upon phasor analysis of the light-dark and activity-rest patterns, similar levels of light-dependent circadian disruption. Indeed, both rodents and humans show monotonically increasing and quantitatively similar levels of light-dependent circadian disruption with increasing shift-nights per week. Thus, phasor analysis provides a method for quantifying circadian disruption in the field and in the laboratory, as well as a bridge between ecological measurements of circadian entrainment in humans and parametric studies of circadian disruption in animal models, including nocturnal rodents. PMID:25229212
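
    The phasor itself can be sketched as the 24 h component of the cross-spectrum between the light-dark and activity-rest signals: its magnitude reflects circadian coupling and its angle the phase displacement. The synthetic hourly data and normalization below are illustrative assumptions, not the published procedure.

```python
# A minimal sketch of a 24 h phasor from light-dark and activity-rest signals.
import numpy as np

samples_per_day, days = 24, 7                         # hourly data for one week
n = samples_per_day * days
t = np.arange(n)

light = (np.sin(2 * np.pi * t / samples_per_day) > 0).astype(float)
activity = np.roll(light, 3) + 0.2 * np.random.default_rng(5).random(n)  # 3 h lag

f_light = np.fft.fft(light - light.mean())
f_act = np.fft.fft(activity - activity.mean())
phasor = f_light[days] * np.conj(f_act[days]) / n**2  # cross-spectrum at 1 cycle/day

# Positive phase here corresponds to activity lagging the light-dark cycle.
mag, ang = np.abs(phasor), np.degrees(np.angle(phasor))
print(f"24 h phasor: magnitude {mag:.3f}, phase {ang:.0f} deg (~{ang / 15:.1f} h)")
```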

  2. Quantifying the Uncertainty in Land Loss Estimates in Coastal Louisiana

    NASA Astrophysics Data System (ADS)

    Wales, P. M.; Kuszmaul, J. S.; Roberts, C.

    2005-05-01

    For the past twenty-five years the land loss along the Louisiana coast has been recognized as a growing problem. One of the clearest indicators of this land loss is that in 2000 smooth cordgrass (Spartina alterniflora) was turning brown well before its normal dormancy period. In 2001, data were collected along low-altitude helicopter transects of the coast, yielding 8,400 data points. The surveys recorded characteristics of the marsh, including latitude, longitude, marsh condition, marsh color, percent vegetated, and marsh die-back. The 2001 data were compared with previously collected data from 1997. Over 100,000 acres of marsh were affected by the 2000 browning. Satellite imagery can be used to monitor changes in coastlines, vegetation health, and conversion of land to open water. An unsupervised classification was applied to 1997 Landsat TM imagery from the Louisiana coast. Based on the classification, polygons were delineated surrounding areas of water. Using the Kappa Classification Statistical Analysis extension in ArcView, kappa statistics were calculated to quantify the agreement between the unsupervised classification and field-checked data while correcting for agreement due to chance. Numerical results reveal that a straightforward unsupervised classification does a reasonable job of approximating the actual field-checked data. Kappa values of 0.57 and higher have been obtained, which is considered fair to good agreement. This agreement adds credibility to imagery-based estimates of coastal land loss, which affords the opportunity for significant savings of time, labor, and cost compared to field-based monitoring. Refined classifications and use of higher resolution imagery are expected to yield improved coastal land loss estimates.
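
    The kappa computation itself is compact. The sketch below scores a synthetic land/water classification against field-checked labels with scikit-learn's cohen_kappa_score, standing in for the ArcView Kappa extension used in the study; the label arrays and error rate are placeholders.

```python
# A minimal Cohen's kappa sketch: classification vs. field-checked labels.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(6)
field = rng.integers(0, 2, 1000)                      # 0 = water, 1 = marsh (field-checked)
classified = field.copy()
flip = rng.random(1000) < 0.2                         # ~20% disagreement
classified[flip] = 1 - classified[flip]

kappa = cohen_kappa_score(field, classified)
print(f"kappa = {kappa:.2f}")                         # ~0.6: fair-to-good agreement
```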

  3. Quantifying conformational dynamics using solid-state R1ρ experiments.

    PubMed

    Quinn, Caitlin M; McDermott, Ann E

    2012-09-01

    We demonstrate the determination of quantitative rates of molecular reorientation in the solid state with rotating frame (R1ρ) relaxation measurements. Reorientation of the carbon chemical shift anisotropy (CSA) tensor was used to probe site-specific conformational exchange in a model system, d6-dimethyl sulfone (d6-DMS). The CSA as a probe of exchange has the advantage that it can still be utilized when there is no dipolar mechanism (i.e., no protons attached to the site of interest). Other works have presented R1ρ measurements as a general indicator of dynamics, but this study extracts quantitative rates of molecular reorientation from the R1ρ values. Some challenges of this technique include precise knowledge of the sample temperature and determination of the R2(0) contribution to the observed relaxation rate from interactions other than molecular reorientation, such as residual dipolar couplings or fast-timescale dynamics; determination of this term is necessary in order to quantify the exchange rate, due to covariance between the two terms. Low-temperature experiments measured an R2(0) value of 1.8 ± 0.2 s⁻¹. Allowing for an additional relaxation term (R2(0)), modeled as either temperature-dependent or temperature-independent, rates of molecular reorientation were extracted from field strength-dependent R1ρ measurements at four different temperatures, and the activation energy was determined from these exchange rates. The activation energies determined were 74.7 ± 4.3 kJ/mol and 71.7 ± 2.9 kJ/mol for the temperature-independent and temperature-dependent R2(0) models, respectively, in excellent agreement with literature values. The results of this study suggest important methodological considerations for the application of the method to more complicated systems, such as proteins, including the importance of deuterating samples and the need to make assumptions regarding the R2(0) contribution to relaxation. PMID:22820004
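
    The final Arrhenius step can be sketched directly: exchange rates extracted at several temperatures are fit to ln k versus 1/T, and the slope gives the activation energy. The rates below are hypothetical placeholders consistent with a barrier near the reported ~72 kJ/mol, not measured values.

```python
# A minimal Arrhenius fit of exchange rates k(T) to extract an activation energy.
import numpy as np

R = 8.314                                             # gas constant, J/(mol K)
T = np.array([255.0, 265.0, 275.0, 285.0])            # sample temperatures, K
k = 1e12 * np.exp(-72e3 / (R * T))                    # hypothetical exchange rates, 1/s

slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)  # ln k = ln A - Ea/(R T)
Ea = -slope * R / 1000.0                              # kJ/mol
print(f"Ea = {Ea:.1f} kJ/mol")                        # recovers ~72 kJ/mol
```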

  4. QUANTIFYING THE BIASES OF SPECTROSCOPICALLY SELECTED GRAVITATIONAL LENSES

    SciTech Connect

    Arneson, Ryan A.; Brownstein, Joel R.; Bolton, Adam S., E-mail: arnesonr@uci.edu, E-mail: joelbrownstein@astro.utah.edu, E-mail: bolton@astro.utah.edu [Department of Physics and Astronomy, University of Utah, Salt Lake City, UT 84112 (United States)

    2012-07-01

    Spectroscopic selection has been the most productive technique for the selection of galaxy-scale strong gravitational lens systems with known redshifts. Statistically significant samples of strong lenses provide a powerful method for measuring the mass-density parameters of the lensing population, but results can only be generalized to the parent population if the lensing selection biases are sufficiently understood. We perform controlled Monte Carlo simulations of spectroscopic lens surveys in order to quantify the bias of lenses relative to parent galaxies in velocity dispersion, mass axis ratio, and mass-density profile. For parameters typical of the SLACS and BELLS surveys, we find (1) no significant mass axis ratio detection bias of lenses relative to parent galaxies; (2) a very small detection bias toward shallow mass-density profiles, which is likely negligible compared to other sources of uncertainty in this parameter; (3) a detection bias toward smaller Einstein radius for systems drawn from parent populations with group- and cluster-scale lensing masses; and (4) a lens-modeling bias toward larger velocity dispersions for systems drawn from parent samples with sub-arcsecond mean Einstein radii. This last finding indicates that the incorporation of velocity-dispersion upper limits of non-lenses is an important ingredient for unbiased analyses of spectroscopically selected lens samples. In general, we find that the completeness of spectroscopic lens surveys in the plane of Einstein radius and mass-density profile power-law index is quite uniform, up to a sharp drop in the region of large Einstein radius and steep mass-density profile, and hence that such surveys are ideally suited to the study of massive field galaxies.

  5. Quantifying mesoscale soil moisture with the cosmic-ray rover

    NASA Astrophysics Data System (ADS)

    Chrisman, B.; Zreda, M.

    2013-06-01

    Soil moisture governs the surface fluxes of mass and energy and is a major influence on floods and drought. Existing techniques measure soil moisture either at a point or over a large area many kilometers across. To bridge these two scales we used the cosmic-ray rover, an instrument similar to the recently developed COSMOS probe, but bigger and mobile. This paper explores the challenges and opportunities for mapping soil moisture over large areas using the cosmic-ray rover. In 2012, soil moisture was mapped 22 times in a 25 km × 40 km survey area of the Tucson Basin at 1 km² resolution, i.e., a survey area extent comparable to that of a pixel for the Soil Moisture and Ocean Salinity (SMOS) satellite mission. The soil moisture distribution is dominated by climatic variations, notably by the North American monsoon, which results in a systematic increase in the standard deviation, observed up to 0.022 m³ m⁻³, as a function of the mean, between 0.06 and 0.14 m³ m⁻³. Two techniques are explored to use the cosmic-ray rover data for hydrologic applications: (1) interpolation of the 22 surveys into a daily soil moisture product by defining an approach to utilize and quantify the observed temporal stability, producing an average correlation coefficient of 0.82 for the soil moisture distributions that were surveyed, and (2) estimation of soil moisture profiles by combining surface moisture from satellite microwave sensors with deeper measurements from the cosmic-ray rover. The interpolated soil moisture and soil moisture profile estimates allow for basin-wide mass balance calculation of evapotranspiration, totaling 241 mm for the year 2012. Generating soil moisture maps with the cosmic-ray rover at this intermediate scale may help in the calibration and validation of satellite campaigns and may also aid in various large-scale hydrologic studies.

  6. Quantifying reactivity for electrophilic aromatic substitution reactions with Hirshfeld charge.

    PubMed

    Liu, Shubin

    2015-03-26

    An electrophilic aromatic substitution is a process in which one atom or group on an aromatic ring is replaced by an incoming electrophile. The reactivity and regioselectivity of this category of reactions are significantly impacted by the group already attached to the aromatic ring. Groups promoting substitution at the ortho/para and meta positions are called ortho/para and meta directing groups, respectively. Earlier, we showed that the regioselectivity of the electrophilic aromatic substitution is dictated by the nucleophilicity of the substituted aromatic ring, which is proportional to the Hirshfeld charge on the regioselective site. Ortho/para directing groups have the largest negative charge values at the ortho/para positions, whereas meta directing groups often have the largest negative charge value at the meta position. The electron donation or acceptance feature of a substitution group is irrelevant to the regioselectivity. In this contribution, we extend our previous study by quantifying the reactivity of this class of reactions. To that end, we examine the transition-state structure and activation energy of an identity reaction for a series of monosubstituted benzene molecules reacting with hydrogen fluoride, using BF3 as the catalyst in the gas phase. A total of 18 substitution groups are considered, nine of which are ortho/para directing and nine meta directing. We found that the barrier height of these reactions strongly correlates with the Hirshfeld charge on the regioselective site for both ortho/para and meta directing groups, with correlation coefficients R² better than 0.96 in both cases. We also discovered a less accurate correlation between the barrier height and the HOMO energy. These results reconfirm the validity and effectiveness of employing the Hirshfeld charge as a reliable descriptor of both reactivity and regioselectivity for this vastly important category of chemical transformations. PMID:25723372

  7. Quantifying Evaporation and Salt Accumulation in Fractured Rocks

    NASA Astrophysics Data System (ADS)

    Komorowski, T.; Weisbrod, N.; Dragila, M.

    2005-05-01

    Evaporation has traditionally been considered the loss of water vapor from the land surface and water reservoirs to the atmosphere. Direct loss of water vapor from cracks, fractures and other discontinuities was usually neglected. A few papers published in the early 1970s, together with recent models and field measurements, suggest that under typical arid conditions a significant amount of water vapor can be transported directly from surface-exposed fractures to the atmosphere. Subsequently, salt accumulation along the fracture surfaces is likely to occur, as the pore water solution in the upper vadose zone is typically saline. The rare but intense rain events that occur in deserts could dissolve this accumulated salt and flush it to the underlying aquifers. This process could be of great importance for salinization, especially in low-permeability rocks, where without this mechanism salts are likely to accumulate in the upper vadose zone and never reach groundwater. The main objectives of this work are to experimentally quantify the amount of water vapor lost from fractures under controlled conditions and to measure the amount and distribution of salts precipitated on the fracture walls. A customized Climate Control Room (CCR) was designed and constructed to mimic the extreme night-time and day-time temperature conditions typical of deserts. Within the CCR, two fractured blocks of chalk were installed. The rocks and fractures were instrumented so that the temperature at the bottom of the rock is held constant. Humidity and temperature within the fracture aperture and within the rock are continuously monitored. A feeding container is attached on each side of the block to supply the rock with pore water solution under constant tension. The inflow of water from the feeding containers into the rocks is continuously monitored and recorded, as are the overall changes in water content within the block. Preliminary results indicate a measurable water vapor loss from the fracture surfaces to the atmosphere and subsequent salt precipitation on the fracture walls.

  8. Quantifying reinforced concrete bridge deck deterioration using ground penetrating radar

    NASA Astrophysics Data System (ADS)

    Martino, Nicole Marie

    Bridge decks are deteriorating at an alarming rate due to corrosion of the reinforcing steel, requiring billions of dollars to repair and replace them. Furthermore, the techniques used to assess the decks do not provide enough quantitative information. In recent years, ground penetrating radar (GPR) has been used to quantify deterioration by comparing rebar reflection amplitudes against technologies serving as ground truth, because no amplitude threshold is available to distinguish healthy from corroded areas using GPR alone. The goal of this research is to understand the relationship between GPR and deck deterioration, and to develop a model to determine deterioration quantities with GPR alone. This research first establishes that the relationship between GPR and rebar corrosion is not only stronger than the relationship between GPR and delaminations, but that the two are highly correlated (90.2% and 86.6%). Next, multiple bridge decks were assessed with GPR and half-cell potential (HCP). Statistical parameters such as the mean and skewness were computed for the GPR amplitudes of each deck and coupled with actual corrosion quantities based on the HCP measurements to form a bridge deck model that can be used to assess future decks with GPR alone. Finally, in order to understand exactly which component of rebar corrosion (rust, cracking or chloride) attenuates the GPR data, computational modeling was carried out to isolate each variable. The results indicate that chloride is the major contributor to the rebar reflection attenuation, and that computational modeling can be used to accurately simulate GPR attenuation due to chloride.
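
    The deck-level model couples summary statistics of each deck's GPR amplitude distribution with HCP-derived corrosion quantities; a minimal sketch of that feature-extraction step (function name and amplitude values hypothetical):

    ```python
    import numpy as np
    from scipy.stats import skew

    def deck_features(amplitudes_db):
        """Summary statistics of rebar reflection amplitudes for one deck."""
        a = np.asarray(amplitudes_db, dtype=float)
        return {"mean": a.mean(), "std": a.std(), "skewness": float(skew(a))}

    # Hypothetical amplitudes (dB) picked from one deck's GPR B-scans
    print(deck_features([-8.1, -7.4, -9.6, -12.3, -6.8, -11.0]))
    ```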

  9. Quantified MRI and cognition in TBI with diffuse and focal damage?

    PubMed Central

    Levine, Brian; Kovacevic, Natasa; Nica, Elena Irina; Schwartz, Michael L.; Gao, Fuqiang; Black, Sandra E.

    2013-01-01

    In patients with chronic-phase traumatic brain injury (TBI), structural MRI is readily attainable and provides rich anatomical information, yet the relationship between whole-brain structural MRI measures and neurocognitive outcome is relatively unexplored and can be complicated by the presence of combined focal and diffuse injury. In this study, sixty-three patients spanning the full range of TBI severity received high-resolution structural MRI concurrent with neuropsychological testing. Multivariate statistical analysis assessed covariance patterns between volumes of grey matter, white matter, and sulcal/subdural and ventricular CSF across 38 brain regions and neuropsychological test performance. Patients with diffuse and diffuse + focal injury were analyzed both separately and together. Tests of speeded attention, working memory, and verbal learning and memory robustly covaried with a distributed pattern of volume loss over temporal, ventromedial prefrontal, right parietal, and cingulate regions. This pattern was modulated by the presence of large focal lesions, but held even when analyses were restricted to those with diffuse injury. Effects were most consistently observed within grey matter. Relative to regional brain volumetric data, clinically defined injury severity (depth of coma at time of injury) showed only weak relation to neuropsychological outcome. The results showed that neuropsychological test performance in patients with TBI is related to a distributed pattern of volume loss in regions mediating mnemonic and attentional processing. This relationship holds for patients with and without focal lesions, indicating that diffuse injury alone is sufficient to cause significant neuropsychological disability in relation to regional volume loss. Quantified structural brain imaging data provides a highly sensitive index of brain integrity that is related to cognitive functioning in chronic-phase TBI. PMID:24049744
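
    The multivariate covariance analysis described above is in the spirit of partial least squares (PLS); the sketch below is a generic PLS regression between regional volumes and test scores using scikit-learn, with random stand-in arrays rather than the authors' data or exact pipeline.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    volumes = rng.normal(size=(63, 38))  # stand-in: 63 patients x 38 regional volumes
    scores = rng.normal(size=(63, 5))    # stand-in: 5 neuropsychological test scores

    pls = PLSRegression(n_components=2)
    pls.fit(volumes, scores)
    # x_loadings_ indicates which regions covary with test performance
    print(pls.x_loadings_.shape)         # (38, 2)
    ```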

  11. Quantifying archaeal community autotrophy in the mesopelagic ocean using natural radiocarbon.

    PubMed

    Ingalls, Anitra E; Shah, Sunita R; Hansman, Roberta L; Aluwihare, Lihini I; Santos, Guaciara M; Druffel, Ellen R M; Pearson, Ann

    2006-04-25

    An ammonia-oxidizing, carbon-fixing archaeon, Candidatus "Nitrosopumilus maritimus," was recently isolated from a salt-water aquarium, definitively confirming that chemoautotrophy exists among the marine archaea. However, in other incubation studies, pelagic archaea were also capable of using organic carbon. It has remained unknown what fraction of the total marine archaeal community is autotrophic in situ. If archaea live primarily as autotrophs in the natural environment, a large ammonia-oxidizing population would play a significant role in marine nitrification. Here we use the natural distribution of radiocarbon in archaeal membrane lipids to quantify the bulk carbon metabolism of archaea at two depths in the subtropical North Pacific gyre. Our compound-specific radiocarbon data show that the archaea in surface waters incorporate modern carbon into their membrane lipids, and archaea at 670 m incorporate carbon that is slightly more isotopically enriched than inorganic carbon at the same depth. An isotopic mass balance model shows that the dominant metabolism at depth is indeed autotrophy (83%), whereas heterotrophic consumption of modern organic carbon accounts for the remainder of archaeal biomass. These results reflect the in situ production of the total community that produces tetraether lipids and are not subject to biases associated with incubation and/or culture experiments. The data suggest either that the marine archaeal community includes both autotrophs and heterotrophs or that it is a single population with a uniformly mixotrophic metabolism. The metabolic and phylogenetic diversity of the marine archaea warrants further exploration; these organisms may play a major role in the marine cycles of nitrogen and carbon. PMID:16614070
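
    The isotopic mass balance is a two-endmember mixing calculation between inorganic carbon (DIC) and organic carbon (OC) sources; a minimal sketch with illustrative Delta-14C values chosen only to reproduce a fraction near the reported 83%, not the paper's measurements.

    ```python
    # Two-endmember isotopic mass balance: fraction of archaeal lipid carbon
    # fixed autotrophically from DIC versus consumed heterotrophically from OC.
    def autotrophic_fraction(d14c_lipid, d14c_dic, d14c_oc):
        return (d14c_lipid - d14c_oc) / (d14c_dic - d14c_oc)

    # Illustrative Delta-14C values (per mil), not the measured ones
    f = autotrophic_fraction(d14c_lipid=-200.0, d14c_dic=-210.0, d14c_oc=-150.0)
    print(f"autotrophic fraction: {f:.2f}")   # ~0.83 with these values
    ```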

  13. Quantifying the hydrological responses to climate change in an intact forested small watershed in southern China

    USGS Publications Warehouse

    Zhou, Guoyi; Wei, Xiaohua; Wu, Yiping; Liu, Shuguang; Huang, Yuhui; Yan, Junhua; Zhang, Deqiang; Zhang, Qianmei; Liu, Juxiu; Meng, Ze; Wang, Chunlin; Chu, Guowei; Liu, Shizhong; Tang, Xuli; Liu, Xiaodong

    2011-01-01

    Responses of hydrological processes to climate change are key components in the Intergovernmental Panel on Climate Change (IPCC) assessment. Understanding these responses is critical for developing appropriate mitigation and adaptation strategies for sustainable water resources management and protection of public safety. However, these responses are not well understood and little long-term evidence exists. Herein, we show how climate change, specifically increased air temperature and storm intensity, can affect soil moisture dynamics and hydrological variables, based on both long-term observation and model simulations using the Soil and Water Assessment Tool (SWAT) in an intact forested watershed (the Dinghushan Biosphere Reserve) in Southern China. Our results show that, although total annual precipitation changed little from 1950 to 2009, soil moisture decreased significantly. A significant decline was also found in the monthly 7-day low flow from 2000 to 2009. However, the maximum daily streamflow in the wet season and unconfined groundwater tables have significantly increased during the same 10-year period. The significant decreasing trends in soil moisture and low-flow variables suggest that the study watershed is moving towards a drought-like condition. Our analysis indicates that the intensification of rainfall storms and the increasing number of annual no-rain days were responsible for the increasing chance of both droughts and floods. We conclude that climate change has indeed induced more extreme hydrological events (e.g. droughts and floods) in this watershed and perhaps other areas of Southern China. This study also demonstrates the usefulness of our research methodology and its possible application to quantifying the impacts of climate change on hydrology in other watersheds where long-term data are available and human disturbance is negligible.
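
    A decline such as the one reported for the monthly 7-day low flow is commonly tested with the nonparametric Mann-Kendall trend test; the sketch below is a minimal version of that test (no tie correction) applied to a hypothetical low-flow series, not the study's data.

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(series):
        """Two-sided Mann-Kendall trend test (no tie correction)."""
        x = np.asarray(series, dtype=float)
        n = len(x)
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = 0.0 if s == 0 else (s - np.sign(s)) / np.sqrt(var_s)
        p = 2.0 * (1.0 - norm.cdf(abs(z)))
        return z, p

    # Hypothetical annual 7-day low-flow values (m3/s), 2000-2009
    z, p = mann_kendall([3.1, 2.9, 2.8, 2.7, 2.6, 2.2, 2.3, 2.0, 1.9, 1.7])
    print(f"z = {z:.2f}, p = {p:.4f}")   # negative z indicates a declining trend
    ```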

  14. Quantifying and generalizing hydrologic responses to dam regulation using a statistical modeling approach

    NASA Astrophysics Data System (ADS)

    McManamay, Ryan A.

    2014-11-01

    Despite the ubiquitous existence of dams within riverscapes, much of our knowledge about dams and their environmental effects remains context-specific. Hydrology, more than any other environmental variable, has been studied in great detail with regard to dam regulation. While much progress has been made in generalizing the hydrologic effects of regulation by large dams, many aspects of hydrology show site-specific fidelity to dam operations, small dams (including diversions), and regional hydrologic regimes. A statistical modeling framework is presented as a predictive tool to quantify and generalize hydrologic responses to varying degrees of dam regulation at large spatial scales. In addition, the approach provides a method to expand sample sizes beyond those of traditional dam-hydrologic-effect analyses. Model performance was relatively poor, with models explaining 10-31% of the variation in hydrologic responses. However, the models had relatively high accuracies (61-89%) in classifying the direction of hydrologic responses as negative or positive. Responses of many hydrologic indices to dam regulation were highly dependent upon regional hydrology, the purpose of the dam, and the presence of diversion dams. In addition, the models revealed opposite effects of dam regulation in systems regulated by individual dams versus many upstream dams, suggesting that the effects of dams may be countered by other dams in basins experiencing intensified cumulative disturbance. Results also suggested that particular contexts, including multipurpose dams, high cumulative regulation, diversions, and regions of unpredictable hydrology, are all sources of increased error when predicting hydrologic responses to dams. Statistical models such as the ones presented herein show promise in their ability to generalize the directionality of hydrologic responses to dam regulation and provide parameter coefficients to inform future site-specific modeling efforts.
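
    The abstract does not name the classifier used to predict the direction of hydrologic responses; the sketch below shows one generic way such a direction classifier could be set up and cross-validated (a random forest over hypothetical dam and regional predictors, purely as an assumption).

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    # Hypothetical predictors: degree of regulation, dam purpose code,
    # presence of diversions, regional hydrology class (all stand-ins)
    X = rng.normal(size=(300, 4))
    # Hypothetical target: direction of a hydrologic response (+1 / -1)
    y = np.sign(X[:, 0] - 0.5 * X[:, 2] + rng.normal(scale=0.5, size=300))

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"cross-validated direction accuracy: {acc:.2f}")
    ```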

  15. Influence of gold nanoparticles on collagen fibril morphology quantified using transmission electron microscopy and image analysis

    PubMed Central

    Haidekker, Mark A; Boettcher, Lisa W; Suter, Jonathan D; Rone, Rebecca; Grant, Sheila A

    2006-01-01

    Background: Development of implantable biosensors for disease detection is challenging because of the poor biocompatibility of synthetic materials. A possible solution involves engineering interface materials that promote self-assembly and adhesion of autologous cells on sensor surfaces. Crosslinked type-I collagen is an acceptable material for developing engineered basement membranes. In this study, we used functionalized gold nanoparticles as the crosslinking agent. Functionalized nanoparticles provide sites for crosslinking collagen as well as sites to deliver signaling compounds that direct self-assembly and reduce inflammation. The goal of this study was to obtain a quantitative parameter to objectively determine the presence of crosslinks. Methods: We analyzed TEM images of collagen fibrils by two methods: run length analysis and topology analysis after a medial axis transform. Results: Run length analysis showed a significant reduction of the interfibril spaces in the presence of nanoparticles (change of 40%, P < 0.05), whereas the fibril thickness remained unchanged. In the topological network, the number of elements, number of branches and number of sides increased significantly in the presence of nanoparticles (P < 0.05). Other parameters, especially the number of loops, showed only a minimal and nonsignificant change. We chose a ratiometric parameter, the number of branches normalized by the number of loops, to achieve independence from gross fibril density. This parameter is lower by a factor of 2.8 in the presence of nanoparticles (P < 0.05). Conclusion: The numerical parameters presented herein allow not only quantification of fibril mesh complexity and crosslinking, but also quantitative comparison, in further studies, of cell growth and adhesion on collagen matrices with different degrees of crosslinking. PMID:16737541
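
    The topology analysis counts branches and loops on the skeletonized fibril network; the sketch below is one plausible implementation of the branches-per-loops parameter using a medial axis transform, not necessarily the authors' exact pipeline (input image and thresholds assumed).

    ```python
    import numpy as np
    from scipy.ndimage import convolve
    from skimage.measure import euler_number, label
    from skimage.morphology import medial_axis

    def branch_loop_ratio(binary_img):
        """Branches-per-loop of a fibril network from a binary TEM image."""
        skeleton = medial_axis(binary_img)          # 1-pixel-wide network
        kernel = np.ones((3, 3), dtype=int)
        neighbors = convolve(skeleton.astype(int), kernel, mode="constant") - skeleton
        branch_points = int(np.sum(skeleton & (neighbors >= 3)))
        # In 2D, Euler number = components - holes, so loops (holes) follow from:
        loops = int(label(skeleton).max()) - euler_number(skeleton, connectivity=2)
        return branch_points / max(loops, 1)        # guard against zero loops
    ```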

  16. [Pain patients show a higher hindsight bias].

    PubMed

    Ruoss, M

    1997-01-01

    Research on pain-related cognitions has up to now relied predominantly upon introspective questionnaire data. Experimental cognitive psychology offers an alternative way of access to the cognitive aspects of chronic pain. Building on the assumption that information processing is in part uncontrolled, automatic and pre-attentive, similar processes are expected to be relevant for pain-related cognitions and to be involved in the health-related convictions and coping strategies that can be assessed with questionnaires. Cognitive-psychological research has established the "hindsight bias" as a robust phenomenon that occurs in an uncontrolled, automatic fashion in diverse contexts when a prior judgment or prediction is assimilated to information received later on. The hindsight bias may be regarded as a manifestation of a universal cognitive mechanism, meaning that information (including information about emotional states) available at a given time will change the memory of prior judgments or of predictions of future events and results of behavior. Cognitive biases similar to the hindsight effect have been demonstrated in chronic pain patients. The present work elaborates the hypothesis that pain patients differ from other groups in the size and composition of the hindsight bias, and outlines how it can contribute to the chronification of pain. Data from a hindsight-bias experiment comparing pain patients, psychiatric patients and students are analyzed using both a traditional global hindsight-bias score ("Hell-Index") and a multinomial modelling approach. The hindsight effect was observed to the usual extent in the student control group, but was significantly greater in the pain group and absent in the psychiatric sample. In addition to this global finding, multinomial modelling revealed group differences in specific model parameters. This method of analysis thus proved promising for the assessment of cognitive aspects of clinical disorders. PMID:9577225

  17. A directional quantifying Doppler system for measurement of transport velocity of blood.

    PubMed

    De Jong, D A; Megens, P H; De Vlieger, M; Thön, H; Holland, W P

    1975-05-01

    A transcutaneous Doppler device has been developed that primarily measures the directional transport velocity of blood, averaged over the vessel diameter, irrespective of flow in adjacent vessels. Directional information is obtained by high- or low-pass filtering of frequency-converted versions of the received Doppler signals, applying low-cost, sharp filters in a superheterodyne system. Upper- and lower-channel signals are quantified separately to yield the average directional velocity. In vitro measurements yielded linear results. PMID:1138476

  18. Entropy Measures Quantify Global Splicing Disorders in Cancer

    E-print Network

    Paris-Sud XI, Université de

    Ritchie, William; Granjeaud, Samuel

    [Abstract fragment] ... providing the fine tuning of gene expression required for cell differentiation and tissue ... the identification of transcript isoforms is now considered an important avenue in cancer diagnosis and therapy ...
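
    Entropy measures of splicing disorder of the kind named in this title are typically computed as the Shannon entropy of a gene's isoform-abundance distribution; the sketch below is a generic version of that calculation with hypothetical isoform fractions, not the paper's exact formulation.

    ```python
    import numpy as np

    def splicing_entropy(isoform_fractions):
        """Shannon entropy (bits) of a gene's transcript isoform distribution."""
        p = np.asarray(isoform_fractions, dtype=float)
        p = p[p > 0] / p[p > 0].sum()        # drop unexpressed isoforms, renormalize
        return float(-(p * np.log2(p)).sum())

    print(splicing_entropy([0.9, 0.1]))      # low entropy: one dominant isoform
    print(splicing_entropy([0.25] * 4))      # maximal entropy for 4 isoforms: 2.0
    ```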

  19. Quantifying Gender Differences in Physical Performance: A Developmental Perspective

    Microsoft Academic Search

    Frank L. Smoll; Robert W. Schutz

    1990-01-01

    The purpose was to quantify the contribution of anthropometric variables to gender differences in performance during childhood and adolescence. Measures of height, percentage body fat, and fat-free body weight were obtained for 2,142 students in Grades 3, 7, and 11 (ages 9, 13, and 17 years), and the subjects were tested on 6 motor tasks. Multivariate analysis of variance indicated

  20. Methods for quantifying phagocytosis and bacterial killing by human neutrophils

    Microsoft Academic Search

    Mark B. Hampton; Christine C. Winterbourn

    1999-01-01

    This paper reviews a variety of methods available for quantifying phagocytosis and bacterial killing by neutrophils. We outline the advantages and disadvantages of each technique, with the selection of a technique for research or analytical purposes being dependent on the information required and the resources available. A detailed protocol is provided for a comprehensive microbiological technique that measures both phagocytosis