These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

Quantifying causal emergence shows that macro can beat micro  

PubMed Central

Causal interactions within complex systems can be analyzed at multiple spatial and temporal scales. For example, the brain can be analyzed at the level of neurons, neuronal groups, and areas, over tens, hundreds, or thousands of milliseconds. It is widely assumed that, once a micro level is fixed, macro levels are fixed too, a relation called supervenience. It is also assumed that, although macro descriptions may be convenient, only the micro level is causally complete, because it includes every detail, thus leaving no room for causation at the macro level. However, this assumption can only be evaluated under a proper measure of causation. Here, we use a measure [effective information (EI)] that depends on both the effectiveness of a system’s mechanisms and the size of its state space: EI is higher the more the mechanisms constrain the system’s possible past and future states. By measuring EI at micro and macro levels in simple systems whose micro mechanisms are fixed, we show that for certain causal architectures EI can peak at a macro level in space and/or time. This happens when coarse-grained macro mechanisms are more effective (more deterministic and/or less degenerate) than the underlying micro mechanisms, to an extent that overcomes the smaller state space. Thus, although the macro level supervenes upon the micro, it can supersede it causally, leading to genuine causal emergence—the gain in EI when moving from a micro to a macro level of analysis. PMID:24248356
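The EI measure lends itself to a small worked example. Below is a minimal Python sketch (not the authors' code; the function name and the example transition matrix are illustrative) that computes EI for a discrete system as the mutual information between a maximum-entropy intervention distribution over current states and the resulting next-state distribution:

    import numpy as np

    def effective_information(tpm):
        # tpm[i, j] = P(next state = j | current state = i).
        # EI = mutual information I(X; Y) when the cause X is forced
        # to the uniform (maximum-entropy) distribution over states.
        n = tpm.shape[0]
        p_x = np.full(n, 1.0 / n)   # uniform intervention on causes
        p_y = p_x @ tpm             # resulting effect distribution
        ei = 0.0
        for i in range(n):
            for j in range(n):
                if tpm[i, j] > 0:
                    ei += p_x[i] * tpm[i, j] * np.log2(tpm[i, j] / p_y[j])
        return ei

    # A fully deterministic, non-degenerate 2-state macro mechanism
    # attains the maximum EI of log2(2) = 1 bit:
    print(effective_information(np.array([[0.0, 1.0], [1.0, 0.0]])))

Coarse-graining a noisy micro transition matrix and comparing EI at the two levels reproduces the paper's notion of causal emergence as a positive EI difference between macro and micro.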

Hoel, Erik P.; Albantakis, Larissa; Tononi, Giulio

2013-01-01

2

Using Turbulence Model Results to Quantify Oxygen Reaeration in an  

E-print Network

Presentation fragments (ECM 10, Newport, RI, 2007): Using Turbulence Model Results to Quantify Oxygen Reaeration in an Estuary Dissolved Oxygen Model ... Dissolved Oxygen (DO) Model Applications · One of the most common (including North Carolina) ... A Steady-State Box Model of Dissolved Oxygen in an Estuary · BOD Loading

Bowen, James D.

3

Quantified security is a weak hypothesis: a critical survey of results and assumptions  

Microsoft Academic Search

This paper critically surveys previous work on quantitative representation and analysis of security. Such quantified security has been presented as a general approach to precisely assess and control security. We classify a significant part of the work between 1981 and 2008 with respect to security perspective, target of quantification, underlying assumptions and type of validation. The result shows how the

Vilhelm Verendel

2009-01-01

4

BEST IN SHOW RESULTS AND FAIR PHOTOS INSIDE  

E-print Network

Newsletter fragments: a club reporter sends in an article about your club happenings. See you at Awards Night! Sincerely ... Contents: Bulletin 4-7 · Around The County 8 · Awards Night 11 · Fair Photos 13-15 · Fair Best in Show Winners 17-20 ... window for a week. Show your 4-H spirit and wear your 4-H Club or Morris County 4-H t-shirt to school. Be sure ...

Goodman, Robert M.

5

Astronomy Diagnostic Test Results Reflect Course Goals and Show Room for Improvement  

ERIC Educational Resources Information Center

The results of administering the Astronomy Diagnostic Test (ADT) to introductory astronomy students at Henry Ford Community College over three years have shown gains comparable with national averages. Results have also accurately corresponded to course goals, showing greater gains in topics covered in more detail, and lower gains in topics covered…

LoPresto, Michael C.

2007-01-01

6

Preliminary Results In Quantifying The Climatic Impact Forcing Factors Around 3 Ma Ago  

NASA Astrophysics Data System (ADS)

What exactly is the control of climate changes on the development of the Hominids? Is it possible to quantify such changes? And which are the forcing factors that create these changes? We use here a General Circulation Model to investigate the climate sensitivity to 3 different forcing factors: the uplift of the East African Rift, the extent (more than twenty times the present-day surface) of the Chad Lake, and ultimately, with a coupled ocean-atmosphere GCM, the effect of Indonesian throughflow changes. To achieve these goals, we need a multidisciplinary group to assess the evolution of the Rift and the extent of the Lake. We prescribe these different boundary conditions to the GCM and use a biome model to assess the vegetation changes. In this presentation we will only focus on the Rift uplift and the Chad Lake impacts on atmospheric circulation, monsoon, and their environmental consequences in terms of vegetation changes.

Fluteau, F.; Ramstein, G.; Duringer, P.; Schuster, M.; Tiercelin, J. J.

7

Quantifying the effects of root reinforcing on slope stability: results of the first tests with a new shearing device

NASA Astrophysics Data System (ADS)

The role of vegetation in preventing shallow soil mass movements such as shallow landslides and soil erosion is generally well recognized and, correspondingly, soil bioengineering on steep slopes has been widely used in practice. However, the precise effectiveness of vegetation regarding slope stability is still difficult to determine. A recently designed inclinable shearing device for large scale vegetated soil samples allows quantitative evaluation of the additional shear strength provided by roots of specific plant species. In the following we describe the results of a first series of shear strength experiments with this apparatus focusing on root reinforcement of White Alder (Alnus incana) and Silver Birch (Betula pendula) in large soil block samples (500 x 500 x 400 mm). The specimens with partly saturated soil of a maximum grain size of 10 mm were slowly sheared at an inclination of 35° with low normal stresses of 3.2 kPa accounting for natural conditions on a typical slope prone to mass movements. Measurements during the experiments involved shear stress, shear displacement and normal displacement, all recorded with high accuracy. In addition, dry weights of sprout and roots were measured to quantify plant growth of the planted specimens. The results with the new apparatus indicate a considerable reinforcement of the soil due to plant roots, i.e. maximum shear stress of the vegetated specimens was substantially higher compared to non-vegetated soil and the additional strength was a function of species and growth. Soil samples with seedlings planted five months prior to the test yielded an important increase in maximum shear stress of 250% for White Alder and 240% for Silver Birch compared to non-vegetated soil. The results of a second test series with 12 month old plants showed even clearer enhancements in maximum shear stress (390% for Alder and 230% for Birch). Overall the results of this first series of shear strength experiments with the new apparatus using planted and unplanted soil specimens confirm the importance of plants in soil stabilisation. Furthermore, they demonstrate the suitability of the apparatus to quantify the additional strength of specific vegetation as a function of species and growth under clearly defined conditions in the laboratory.

Rickli, Christian; Graf, Frank

2013-04-01

8

Nanotribology results show that DNA forms a mechanically resistant 2D network in metaphase chromatin plates.  

PubMed

In a previous study, we found that metaphase chromosomes are formed by thin plates, and here we have applied atomic force microscopy (AFM) and friction force measurements at the nanoscale (nanotribology) to analyze the properties of these planar structures in aqueous media at room temperature. Our results show that high concentrations of NaCl and EDTA and extensive digestion with protease and nuclease enzymes cause plate denaturation. Nanotribology studies show that native plates under structuring conditions (5 mM Mg²⁺) have a relatively high friction coefficient (μ ≈ 0.3), which is markedly reduced when high concentrations of NaCl or EDTA are added (μ ≈ 0.1). This lubricant effect can be interpreted considering the electrostatic repulsion between DNA phosphate groups and the AFM tip. Protease digestion increases the friction coefficient (μ ≈ 0.5), but the highest friction is observed when DNA is cleaved by micrococcal nuclease (μ ≈ 0.9), indicating that DNA is the main structural element of plates. Whereas nuclease-digested plates are irreversibly damaged after the friction measurement, native plates can absorb kinetic energy from the AFM tip without suffering any damage. These results suggest that plates are formed by a flexible and mechanically resistant two-dimensional network which allows the safe storage of DNA during mitosis. PMID:21156137

Gállego, Isaac; Oncins, Gerard; Sisquella, Xavier; Fernàndez-Busquets, Xavier; Daban, Joan-Ramon

2010-12-15

9

An analysis of semiclassical radiation from single particle quantum currents shows surprising results  

E-print Network

Classical electromagnetic radiation from quantum currents and densities is calculated. For the free Schrödinger equation with no external force it is found that the classical radiation is zero to all orders of the multipole expansion. This is true of mixed or pure states for the charged particle. It is a non-trivial and surprising result. A similar result is found for the Klein-Gordon currents when the wave function consists of only positive energy solutions. For the Dirac equation it is found that radiation is suppressed at lower frequencies but is not zero at all frequencies. Implications of these results for the interpretation of quantum mechanics are discussed.

Mark P. Davidson

2002-09-10

10

Trial results show high remission rate in leukemia following immune cell therapy  

Cancer.gov

Children and young adults (age 1 to age 30) with chemotherapy-resistant B-cell acute lymphoblastic leukemia (ALL) experienced high remission rates following treatment with an experimental immunotherapy. Results demonstrated that the immunotherapy treatment had anti-leukemia effects in patients and that the treatment was feasible and safe.

11

NIH trial shows promising results in treating a lymphoma in young people  

Cancer.gov

Patients with a type of cancer known as primary mediastinal B-cell lymphoma who received infusions of chemotherapy, but who did not have radiation therapy to an area of the thorax known as the mediastinum, had excellent outcomes, according to clinical trial results.

12

Mitochondrial DNA transmitted from sperm in the blue mussel Mytilus galloprovincialis showing doubly uniparental inheritance of mitochondria, quantified by real-time PCR.  

PubMed

Doubly uniparental inheritance (DUI) of mitochondrial DNA transmission to progeny has been reported in the mussel, Mytilus. In DUI, males have both paternally (M type) and maternally (F type) transmitted mitochondrial DNA (mtDNA), but females have only the F type. To estimate how much M type mtDNA enters the egg with sperm in the DUI system, ratios of M type to F type mtDNA were measured before and after fertilization. M type mtDNA content in eggs increased markedly after fertilization. Similar patterns in M type content changes after fertilization were observed in crosses using the same males. To compare mtDNA quantities, we subsequently measured the ratios of mtDNA to the 28S ribosomal RNA gene (an endogenous control sequence) in sperm or unfertilized eggs using a real-time polymerase chain reaction (PCR) assay. F type content in unfertilized eggs was greater than the M type in sperm by about 1000-fold on average. M type content in spermatozoa was greater than in unfertilized eggs, but their distributions overlapped. These results may explain the post-fertilization changes in zygotic M type content. We previously demonstrated that paternal and maternal M type mtDNAs are transmitted to offspring, and hypothesized that the paternal M type contributed to M type transmission to the next generation more than the maternal type did. These quantitative data on M and F type mtDNA in sperm and eggs provide further support for that hypothesis. PMID:20608851
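Ratios like these are conventionally obtained from real-time PCR threshold cycles. A minimal sketch, assuming the standard ΔCt model with equal amplification efficiencies for target and control (the paper's exact quantification model is not given in the abstract; the numbers are hypothetical):

    def mtdna_to_28s_ratio(ct_mtdna, ct_28s, efficiency=2.0):
        # Each PCR cycle multiplies the template by `efficiency`
        # (2.0 = perfect doubling), so the abundance ratio is
        # E ** (Ct_reference - Ct_target).
        return efficiency ** (ct_28s - ct_mtdna)

    # Hypothetical numbers: mtDNA crossing the threshold 10 cycles
    # before the 28S control implies ~1024-fold more mtDNA copies.
    print(mtdna_to_28s_ratio(ct_mtdna=18.0, ct_28s=28.0))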

Sano, Natsumi; Obata, Mayu; Komaru, Akira

2010-07-01

13

Lung cancer trial results show mortality benefit with low-dose CT

Cancer.gov

The NCI has released initial results from a large-scale test of screening methods to reduce deaths from lung cancer by detecting cancers at relatively early stages. The National Lung Screening Trial, a randomized national trial involving more than 53,000 current and former heavy smokers ages 55 to 74, compared the effects of two screening procedures for lung cancer -- low-dose helical computed tomography (CT) and standard chest X-ray -- on lung cancer mortality and found 20 percent fewer lung cancer deaths among trial participants screened with low-dose helical CT.

14

Results From Mars Show Electrostatic Charging of the Mars Pathfinder Sojourner Rover  

NASA Technical Reports Server (NTRS)

Indirect evidence (dust accumulation) has been obtained indicating that the Mars Pathfinder rover, Sojourner, experienced electrostatic charging on Mars. Lander camera images of the Sojourner rover provide distinctive evidence of dust accumulation on rover wheels during traverses, turns, and crabbing maneuvers. The sol 22 (22nd Martian "day" after Pathfinder landed) end-of-day image clearly shows fine red dust concentrated around the wheel edges with additional accumulation in the wheel hubs. A sol 41 image of the rover near the rock "Wedge" (see the next image) shows a more uniform coating of dust on the wheel drive surfaces with accumulation in the hubs similar to that in the previous image. In the sol 41 image, note particularly the loss of black-white contrast on the Wheel Abrasion Experiment strips (center wheel). This loss of contrast was also seen when dust accumulated on test wheels in the laboratory. We believe that this accumulation occurred because the Martian surface dust consists of clay-sized particles, similar to those detected by Viking, which have become electrically charged. By adhering to the wheels, the charged dust carries a net nonzero charge to the rover, raising its electrical potential relative to its surroundings. Similar charging behavior was routinely observed in an experimental facility at the NASA Lewis Research Center, where a Sojourner wheel was driven in a simulated Martian surface environment. There, as the wheel moved and accumulated dust (see the following image), electrical potentials in excess of 100 V (relative to the chamber ground) were detected by a capacitively coupled electrostatic probe located 4 mm from the wheel surface. The measured wheel capacitance was approximately 80 picofarads (pF), and the calculated charge, 8 x 10^-9 coulombs (C). Voltage differences of 100 V and greater are believed sufficient to produce Paschen electrical discharge in the Martian atmosphere. With an accumulated net charge of 8 x 10^-9 C, and average arc time of 1 msec, arcs can also occur with estimated arc currents approaching 10 milliamperes (mA). Discharges of this magnitude could interfere with the operation of sensitive electrical or electronic elements and logic circuits. Sojourner rover wheel tested in laboratory before launch to Mars. Before launch, we believed that the dust would become triboelectrically charged as it was moved about and compacted by the rover wheels. In all cases observed in the laboratory, the test wheel charged positively, and the wheel tracks charged negatively. Dust samples removed from the laboratory wheel averaged a few to tens of micrometers in size (clay size). Coarser grains were left behind in the wheel track. On Mars, grain size estimates of 2 to 10 micrometers were derived for the Martian surface materials from the Viking Gas Exchange Experiment. These size estimates approximately match the laboratory samples. Our tentative conclusion for the Sojourner observations is that fine clay-sized particles acquired an electrostatic charge during rover traverses and adhered to the rover wheels, carrying electrical charge to the rover. Since the Sojourner rover carried no instruments to measure this mission's onboard electrical charge, confirmatory measurements from future rover missions on Mars are desirable so that the physical and electrical properties of the Martian surface dust can be characterized. Sojourner was protected by discharge points, and Faraday cages were placed around sensitive electronics.
But larger systems than Sojourner are being contemplated for missions to the Martian surface in the foreseeable future. The design of such systems will require a detailed knowledge of how they will interact with their environment. Validated environmental interaction models and guidelines for the Martian surface must be developed so that design engineers can test new ideas prior to cutting hardware. These models and guidelines cannot be validated without actual flight data. Electrical charging of vehicles and, one day, astronauts moving across the Martian surface must therefore be considered.
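The charge figure quoted above follows directly from the measured capacitance and potential; a two-line check in Python (values from the text):

    C = 80e-12    # measured wheel capacitance, farads (80 pF)
    V = 100.0     # detected potential relative to chamber ground, volts
    print(C * V)  # 8e-09 C, the 8 x 10^-9 C accumulated charge cited above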

Kolecki, Joseph C.; Siebert, Mark W.

1998-01-01

15

Updated clinical results show experimental agent ibrutinib as highly active in CLL patients  

Cancer.gov

Updated results from a Phase Ib/II clinical trial led by the Ohio State University Comprehensive Cancer Center – Arthur G. James Cancer Hospital and Richard J. Solove Research Institute indicate that a novel therapeutic agent for chronic lymphocytic leukemia (CLL) is highly active and well tolerated in patients who have relapsed and are resistant to other therapy. The agent, ibrutinib (PCI-32765), is the first drug designed to target Bruton's tyrosine kinase (BTK), a protein essential for CLL-cell survival and proliferation. CLL is the most common form of leukemia, with about 15,000 new cases annually in the U.S. About 4,400 Americans die of the disease each year.

16

International gene therapy trial for 'bubble boy' disease shows promising early results  

Cancer.gov

Researchers reported promising outcomes data for the first group of boys with X-linked severe combined immunodeficiency syndrome (SCID-X1), a fatal genetic immunodeficiency also known as "bubble boy" disease, who were treated as part of an international clinical study of a new form of gene therapy. The mechanism used to deliver the gene therapy is designed to prevent the serious complication of leukemia that arose a decade ago in a similar trial in Europe, when one-quarter of boys treated developed the blood cancer. Researchers from the Dana-Farber Cancer Institute presented the study results at the annual meeting of the American Society of Hematology, on behalf of the Transatlantic Gene Therapy Consortium.

17

Assessing the use of Geoscience Laser Altimeter System data to quantify forest structure change resultant from large-scale forest disturbance events- Case Study Hurricane Katrina  

NASA Astrophysics Data System (ADS)

The biodiversity, structure, and functioning of forest systems in most areas are strongly influenced by disturbances. Forest structure can both influence and help indicate forest functions such as the storage and transfer of carbon between the land surface and the atmosphere. A 2007 report published by the National Research Council states that ‘Quantifying changes in the size of the [vegetation biomass] pool, its horizontal distribution, and its vertical structure resulting from natural and human-induced perturbations, such as deforestation and fire, and the recovery processes is critical for measuring ecosystem change.’ This study assessed the use of the Geoscience Laser Altimeter System (GLAS) to detect and quantify changes in forest structure caused by Hurricane Katrina. Data from GLAS campaigns for the year preceding and following Katrina were compared to wind speed, forest cover, and damage maps to analyze sensor sampling, and forest structure change over a range of spatial scales. Results showed a significant decrease in mean canopy height of 4.0 m in forested areas experiencing category two winds, a 2.2 meter decrease in forests experiencing category one winds, and a 0.6 meter change in forests hit by tropical storm winds. Changes in structure were converted into carbon estimates using the Ecosystem Demography (ED) model to yield above ground carbon storage losses of ~30 Tg over the domain. Although the greatest height loss was observed in areas hit by category two winds, these areas only contributed a fraction (~3 Tg) of the estimated above ground carbon storage losses resulting from Katrina, highlighting that small disturbance spread over a large area can account for as much as or more damage than intense disturbance over smaller areas. This finding stresses the importance of detecting and measuring the full extent of storm damage. While results highlighted the potential use of space-borne Lidar in damage detection and quantification, they also emphasize limitations on the scope and scale at which current data can quantify hurricane related changes. Season of data acquisition was shown to influence calculations of mean canopy height and change. Limited sampling hindered our ability to make reliable estimates of height change and standing biomass loss at one degree resolution and smaller across the domain. These results have implications for sampling requirements in upcoming missions, such as DESDynI, that will improve our ability to detect and quantify forest structure changes from disturbance events.

Dolan, K. A.; Hurtt, G. C.; Chambers, J. Q.; Dubayah, R.; Frolking, S. E.; Masek, J.

2009-12-01

18

Quantifying microwear on experimental Mistassini quartzite scrapers: preliminary results of exploratory research using LSCM and scale-sensitive fractal analysis.  

PubMed

Although previous use-wear studies involving quartz and quartzite have been undertaken by archaeologists, these are comparatively few in number. Moreover, there has been relatively little effort to quantify use-wear on stone tools made from quartzite. The purpose of this article is to determine the effectiveness of a measurement system, laser scanning confocal microscopy (LSCM), to document the surface roughness or texture of experimental Mistassini quartzite scrapers used on two different contact materials (fresh and dry deer hide). As in previous studies using LSCM on chert, flint, and obsidian, this exploratory study incorporates a mathematical algorithm that permits the discrimination of surface roughness based on comparisons at multiple scales. Specifically, we employ measures of relative area (RelA) coupled with the F-test to discriminate used from unused stone tool surfaces, as well as surfaces of quartzite scrapers used on dry and fresh deer hide. Our results further demonstrate the effect of raw material variation on use-wear formation and its documentation using LSCM and RelA. PMID:22688593
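Relative area is conceptually simple: the measured surface area at a given lateral scale divided by the projected area, so rougher surfaces score higher at fine scales. Below is a minimal Python sketch of that idea for a gridded LSCM height map; it is illustrative only, not the dedicated scale-sensitive fractal analysis software such studies typically use:

    import numpy as np

    def relative_area(z, step):
        # Triangulate every `step`-th sample of height map `z` and
        # divide the summed 3-D facet area by the projected area.
        zs = z[::step, ::step].astype(float)
        n, m = zs.shape
        surf = 0.0
        for i in range(n - 1):
            for j in range(m - 1):
                a = np.array([step, 0, zs[i, j + 1] - zs[i, j]])
                b = np.array([0, step, zs[i + 1, j] - zs[i, j]])
                c = np.array([step, 0, zs[i + 1, j + 1] - zs[i + 1, j]])
                d = np.array([0, step, zs[i + 1, j + 1] - zs[i, j + 1]])
                surf += 0.5 * np.linalg.norm(np.cross(a, b))
                surf += 0.5 * np.linalg.norm(np.cross(c, d))
        return surf / ((n - 1) * (m - 1) * step * step)

    # RelA approaches 1 at scales coarser than the roughness; comparing
    # RelA of used vs. unused surfaces across scales (e.g. with an
    # F-test) is the discrimination strategy described above.
    z = np.random.default_rng(1).normal(0.0, 1.0, (64, 64))
    print(relative_area(z, 1), relative_area(z, 4))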

Stemp, W James; Lerner, Harry J; Kristant, Elaine H

2013-01-01

19

Quantifying dust input to the Subarctic North Pacific - Results from surface sediments and sea water thorium isotope measurements  

NASA Astrophysics Data System (ADS)

The Subarctic North Pacific is one of the three primary high-nutrient-low-chlorophyll regions of the modern ocean, where the biological pump is relatively inefficient at transferring carbon from the atmosphere to the deep sea. The system is thought to be iron-limited. Aeolian dust is a significant source of iron and other nutrients that are essential for the health of marine ecosystems and potentially a controlling factor of the high-nutrient-low-chlorophyll status of the Subarctic North Pacific. However, constraining the size of the dust flux to the surface ocean remains difficult. Here we apply two different approaches, based on surface sediment and water column samples, respectively, obtained during the SO202/INOPEX research cruise to the Subarctic North Pacific in 2009. We map the spatial patterns of Th/U isotopes, helium isotopes and rare earth elements across surface sediments from 37 multi-core core-top sediments across the Subarctic North Pacific. In order to deconvolve the detrital endmembers in regions of the North Pacific affected by volcanic material, IRD and hemipelagic input, we use a combination of trace elements with distinct characteristics in the different endmembers. This approach allows us to calculate the relative aeolian fraction and, in combination with Thorium-230-normalized mass flux data, to quantify the dust supply. Secondly, we present an innovative approach using paired Thorium-232 and Thorium-230 concentrations of upper-ocean seawater at 7 stations along the INOPEX track. Thorium-232 in the upper water column is dominantly derived from dissolution of aeolian dust, whereas Thorium-230 data provide a measure of the thorium removal from the surface waters and thus allow us to derive Thorium-232 fluxes. Combined with a mean Thorium-232 concentration in dust and an estimate of the thorium solubility, the Thorium-232 flux can be translated into a dust flux to the surface ocean. Dust flux estimates for the Subarctic North Pacific will be compared to results from model simulations of Mahowald et al. (2006).
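The final step of the seawater approach reduces to a single division. A minimal sketch with illustrative numbers only: the ~10.7 ppm crustal Thorium-232 content and ~20% solubility are common literature assumptions, not values from this abstract:

    def dust_flux_from_th232(th232_flux, th232_conc_dust, solubility):
        # th232_flux:      dissolved 232Th flux to surface water, g m^-2 yr^-1
        # th232_conc_dust: mean 232Th content of dust, g 232Th per g dust
        # solubility:      fraction of 232Th released when dust dissolves
        return th232_flux / (th232_conc_dust * solubility)

    # ~0.93 g of dust per m^2 per year for these assumed inputs:
    print(dust_flux_from_th232(2.0e-6, 10.7e-6, 0.2))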

Winckler, G.; Serno, S.; Hayes, C.; Anderson, R. F.; Gersonde, R.; Haug, G. H.

2012-12-01

20

News Note: Long-term Results from Study of Tamoxifen and Raloxifene Shows Lower Toxicities of Raloxifene  

Cancer.gov

Initial results in 2006 of the NCI-sponsored Study of Tamoxifen and Raloxifene (STAR) showed that a common osteoporosis drug, raloxifene, prevented breast cancer to the same degree as the drug tamoxifen, which had been in use for many years for breast cancer prevention as well as treatment, but with fewer serious side effects. The longer-term results show that raloxifene retained 76 percent of the effectiveness of tamoxifen in preventing invasive disease and grew closer to tamoxifen in preventing noninvasive disease, while remaining far less toxic – in particular, there was significantly less endometrial cancer with raloxifene use.

21

maximal steatosis as early as week 2 (168 and 292 mg/10⁹ cells, respectively). Our results showed

E-print Network

Abstract fragments: maximal steatosis as early as week 2 (168 and 292 mg/10⁹ cells, respectively). Our results showed ... and bovine adipose tissue cultured over 7 days (Y Faulconnier, L Guillon, R Lefaivre; ...Champanelle, France). The effect of insulin (2 mU/ml) on glucose utilization was studied on adipose tissue (AT ...

Boyer, Edmond

22

Quantifying saltmarsh vegetation and its effect on wave height dissipation: Results from a UK East coast saltmarsh  

NASA Astrophysics Data System (ADS)

The degree to which incident wind waves are attenuated over intertidal surfaces is critical to the development of coastal wetlands, which are, amongst other processes, affected by the delivery, erosion, and/or resuspension of sediment due to wave action. Knowledge on wave attenuation over saltmarsh surfaces is also essential for accurate assessments of their natural sea-defence value to be made and incorporated into sea defence and management schemes. The aim of this paper is to evaluate the use of a digital photographic method for the quantification of marsh vegetation density and then to investigate the relative roles played by hydrodynamic controls and vegetation density/type in causing the attenuation of incident waves over a macro-tidal saltmarsh. Results show that a significant statistical relationship exists between the density of vegetation measured in side-on photographs and the dry biomass of the photographed vegetation determined through direct harvesting. The potential of the digital photographic method for the spatial and temporal comparison of marsh surface vegetation biomass, density, and canopy structure is highlighted and the method was applied to assess spatial and seasonal differences in vegetation density and their effect on wave attenuation at three locations on a macro-tidal saltmarsh on the Dengie Peninsula, Essex, UK. In this environmental setting, vegetation density/type did not have a significant direct effect on wave attenuation but modified the process of wave transformation under different hydrodynamic conditions. At the two locations characterised by a relatively tall canopy (15-26 cm) with biomass values of 430-500 g m⁻², dominated by Spartina spp. (>70% of total dry biomass), relative incident wave height (wave height/water depth) is identified as a statistically significant dominant positive control on wave attenuation up to a threshold value of 0.55, beyond which wave attenuation showed no significant further increase. At the third location, characterised by only slightly less biomass (398 g m⁻²) but a shorter (6 cm) canopy of the annual Salicornia spp., no significant relationship existed between wave attenuation and relative wave height. Seasonally (between September and December), a significant temporal increase/decrease in vegetation density occurred in one of the Spartina canopies and in the Salicornia canopy, respectively, and led to an expected (but not statistically significant) increase/decrease in wave attenuation. The wider implications of these findings in the context of form-process interactions on saltmarshes and their effect on marsh evolution are also discussed.
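For reference, wave attenuation in such studies is typically reported as the proportional height reduction per metre of marsh, with the relative wave height computed alongside it. A minimal sketch with made-up numbers, not data from this study:

    def attenuation_per_metre(h_in, h_out, distance_m):
        # Proportional wave-height loss per metre of transect.
        return (h_in - h_out) / (h_in * distance_m)

    def relative_wave_height(h, water_depth):
        # The control variable with the 0.55 threshold reported above.
        return h / water_depth

    print(attenuation_per_metre(0.30, 0.18, 10.0))  # 0.04 -> 4% per metre
    print(relative_wave_height(0.30, 0.60))         # 0.5, below the threshold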

Möller, I.

2006-09-01

23

Recombinant PNPLA3 protein shows triglyceride hydrolase activity and its I148M mutation results in loss of function.  

PubMed

The patatin-like phospholipase domain containing 3 (PNPLA3, also called adiponutrin, ADPN) is a membrane-bound protein highly expressed in the liver. The genetic variant I148M (rs738409) was found to be associated with progression of chronic liver disease. We aimed to establish a protein purification protocol in a yeast system (Pichia pastoris) and to examine the human PNPLA3 enzymatic activity, substrate specificity and the I148M mutation effect. hPNPLA3 148I wild type and 148M mutant cDNA were cloned into P. pastoris expression vectors. Yeast cells were grown in 3L fermentors. PNPLA3 protein was purified from membrane fractions by Ni-affinity chromatography. Enzymatic activity was assessed using radiolabeled substrates. Both 148I wild type and 148M mutant proteins are localized to the membrane. The wild type protein shows a predominant lipase activity with mild lysophosphatidic acid acyl transferase activity (LPAAT) and the I148M mutation results in a loss of function of both these activities. Our data show that PNPLA3 has a predominant lipase activity and I148M mutation results in a loss of function. PMID:24369119

Pingitore, Piero; Pirazzi, Carlo; Mancina, Rosellina M; Motta, Benedetta M; Indiveri, Cesare; Pujia, Arturo; Montalcini, Tiziana; Hedfalk, Kristina; Romeo, Stefano

2014-04-01

24

Transgenic plants expressing HC-Pro show enhanced virus sensitivity while silencing of the transgene results in resistance.  

PubMed

Nicotiana benthamiana plants were engineered to express sequences of the helper component-proteinase (HC-Pro) of Cowpea aphid-borne mosaic potyvirus (CABMV). The sensitivity of the transgenic plants to infection with parental and heterologous viruses was studied. The lines expressing HC-Pro showed enhanced symptoms after infection with the parental CABMV isolate and also after infection with a heterologous potyvirus, Potato virus Y (PVY) and a comovirus, Cowpea mosaic virus (CPMV). On the other hand, transgenic lines expressing nontranslatable HC-Pro or translatable HC-Pro with a deletion of the central domain showed wild type symptoms after infection with the parental CABMV isolate and heterologous viruses. These results showed that CABMV HC-Pro is a pathogenicity determinant that conditions enhanced sensitivity to virus infection in plants, and that the central domain of the protein is essential for this. The severe symptoms in CABMV-infected HC-Pro expressing lines were remarkably followed by brief recovery and subsequent re-establishment of infection, possibly indicating counteracting effects of HC-Pro expression and a host defense response. One of the HC-Pro expressing lines (h48) was found to contain low levels of transgenic HC-Pro RNA and to be resistant to CABMV and to recombinant CPMV expressing HC-Pro. This indicated that h48 was (partially) posttranscriptionally silenced for the HC-Pro transgene in spite of the established role of HC-Pro as a suppressor of posttranscriptional gene silencing. Line h48 was not resistant to PVY, but instead showed enhanced symptoms compared to nontransgenic plants. This may be due to relief of silencing of the HC-Pro transgene by HC-Pro expressed by PVY. PMID:12206307

Mlotshwa, Sizolwenkosi; Verver, Jan; Sithole-Niang, Idah; Prins, Marcel; Van Kammen, A B; Wellink, Joan

2002-01-01

25

Not all Surface Waters show a Strong Relation between DOC and Hg Species: Results from an Adirondack Mountain Watershed  

NASA Astrophysics Data System (ADS)

Several recent papers have highlighted the strong statistical correlation between dissolved organic carbon (DOC) concentrations and total dissolved mercury (THgd) and/or dissolved methyl Hg (MeHgd). These relations of organic carbon with Hg species are often even stronger when a measurement that reflects some fraction of the DOC is used such as UV absorbance at 254 nm or the hydrophobic acid fraction. These strong relations are not surprising given the pivotal role DOC plays in binding and transporting Hg, which is otherwise relatively insoluble in dilute waters. In this study, we show data collected monthly and during some storms and snowmelt over 2.5 years from the 65 km² Fishing Brook watershed in the Adirondack Mountains of New York. This dataset is noteworthy because of a weak and statistically non-significant (p > 0.05) relationship between DOC and either of THgd or MeHgd over the entire study period. We believe that the lack of a strong DOC-Hg relation in Fishing Brook reflects the combined effects of the heterogeneous land cover and the presence of three ponds within the watershed. The watershed is dominantly (89.3%) hardwood and coniferous forest with 8% wetland area, and 2.7% open water. Despite the lack of a strong relation between DOC and Hg species across the annual hydrograph, the dataset shows strong within-season correlations that have different y-intercepts and slopes between the growing season (May 1 - Sept. 30) and dormant season (Oct. 1 - April 30), as well as strong, but seasonally varying DOC-Hg correlations at smaller spatial scales in data collected on several occasions in 10 sub-watersheds of Fishing Brook. We hypothesize that a combination of several factors can account for these annually weak, but seasonally and spatially strong DOC-Hg correlations: (1) seasonal variations in runoff generation processes from upland and wetland areas that may yield DOC with varying Hg-binding characteristics, (2) photo-induced losses of Hg species and DOC in ponded areas, and (3) the effects of the widely varying seasonal temperature and snow cover on the rates of microbial processes such as the decomposition of soil organic matter and methylation of Hg. These results emphasize that not all watersheds show simple linear relations between DOC and Hg species on an annual basis, and provide a caution that measurements such as the optical properties of waters are not always a strong surrogate for Hg.

Burns, D. A.; Schelker, J.; Murray, K. R.; Brigham, M. E.; Aiken, G.

2009-12-01

26

Acute Myocardial Infarction and Pulmonary Diseases Result in Two Different Degradation Profiles of Elastin as Quantified by Two Novel ELISAs  

PubMed Central

Background Elastin is a signature protein of the arteries and lungs, thus it was hypothesized that elastin is subject to enzymatic degradation during cardiovascular and pulmonary diseases. The aim was to investigate if different fragments of the same protein entail different information associated to two different diseases and if these fragments have the potential of being diagnostic biomarkers. Methods Monoclonal antibodies were raised against an identified fragment (the ELM-2 neoepitope) generated at amino acid position 552 in elastin by matrix metalloproteinase (MMP)-9/-12. A newly identified ELM neoepitope was generated by the same proteases but at amino acid position 441. The distribution of ELM-2 and ELM in human arterial plaques and fibrotic lung tissues was investigated by immunohistochemistry. A competitive ELISA for ELM-2 was developed. The clinical relevance of the ELM and ELM-2 ELISAs was evaluated in patients with acute myocardial infarction (AMI), no AMI, high coronary calcium, or low coronary calcium. The serological release of ELM-2 in patients with chronic obstructive pulmonary disease (COPD) or idiopathic pulmonary fibrosis (IPF) was compared to controls. Results ELM and ELM-2 neoepitopes were both localized in diseased carotid arteries and fibrotic lungs. In the cardiovascular cohort, ELM-2 levels were 66% higher in serum from AMI patients compared to patients with no AMI (p<0.01). Levels of ELM were not significantly increased in these patients and no correlation was observed between ELM-2 and ELM. ELM-2 was not elevated in the COPD and IPF patients and was not correlated to ELM. ELM was shown to be correlated with smoking habits (p<0.01). Conclusions The ELM-2 neoepitope was related to AMI whereas the ELM neoepitope was related to pulmonary diseases. These results indicate that elastin neoepitopes generated by the same proteases but at different amino acid sites provide different tissue-related information depending on the disease in question. PMID:23805173

Skjøt-Arkil, Helene; Clausen, Rikke E.; Rasmussen, Lars M.; Wang, Wanchun; Wang, Yaguo; Zheng, Qinlong; Mickley, Hans; Saaby, Lotte; Diederichsen, Axel C. P.; Lambrechtsen, Jess; Martinez, Fernando J.; Hogaboam, Cory M.; Han, MeiLan; Larsen, Martin R.; Nawrocki, Arkadiusz; Vainer, Ben; Krustrup, Dorrit; Bjørling-Poulsen, Marina; Karsdal, Morten A.; Leeming, Diana J.

2013-01-01

27

Genomic and Enzymatic Results Show Bacillus cellulosilyticus Uses a Novel Set of LPXTA Carbohydrases to Hydrolyze Polysaccharides  

PubMed Central

Background Alkaliphilic Bacillus species are intrinsically interesting due to the bioenergetic problems posed by growth at high pH and high salt. Three alkaline cellulases have been cloned, sequenced and expressed from Bacillus cellulosilyticus N-4 (Bcell) making it an excellent target for genomic sequencing and mining of biomass-degrading enzymes. Methodology/Principal Findings The genome of Bcell is a single chromosome of 4.7 Mb with no plasmids present and three large phage insertions. The most unusual feature of the genome is the presence of 23 LPXTA membrane anchor proteins; 17 of these are annotated as involved in polysaccharide degradation. These two values are significantly higher than seen in any other Bacillus species. This high number of membrane anchor proteins is seen only in pathogenic Gram-positive organisms such as Listeria monocytogenes or Staphylococcus aureus. Bcell also possesses four sortase D subfamily 4 enzymes that incorporate LPXTA-bearing proteins into the cell wall; three of these are closely related to each other and unique to Bcell. Cell fractionation and enzymatic assay of Bcell cultures show that the majority of polysaccharide degradation is associated with the cell wall LPXTA-enzymes, an unusual feature in Gram-positive aerobes. Genomic analysis and growth studies both strongly argue against Bcell being a truly cellulolytic organism, in spite of its name. Preliminary results suggest that fungal mycelia may be the natural substrate for this organism. Conclusions/Significance Bacillus cellulosilyticus N-4, in spite of its name, does not possess any of the genes necessary for crystalline cellulose degradation, demonstrating the risk of classifying microorganisms without the benefit of genomic analysis. Bcell is the first Gram-positive aerobic organism shown to use predominantly cell-bound, non-cellulosomal enzymes for polysaccharide degradation. The LPXTA-sortase system utilized by Bcell may have applications both in anchoring cellulases and other biomass-degrading enzymes to Bcell itself and in anchoring proteins in other Gram-positive organisms. PMID:23593409

Mead, David; Drinkwater, Colleen; Brumm, Phillip J.

2013-01-01

28

Low-frequency ac electroporation shows strong frequency dependence and yields comparable transfection results to dc electroporation  

E-print Network

Abstract fragments: transfection results to dc electroporation (Yihong Zhan, Zhenning Cao, Ning Bao, Jianbo Li, Jun ...). In contrast, transfection efficiency with DNA reaches its maximum at medium frequencies (100-1000 Hz) in the range ... We postulate that the relationship between the transfection efficiency and the ac frequency ...

Lu, Chang

29

QUANTIFYING FOREST ABOVEGROUND CARBON POOLS AND FLUXES USING MULTI-TEMPORAL LIDAR A report on field monitoring, remote sensing MMV, GIS integration, and modeling results for forestry field validation test to quantify aboveground tree biomass and carbon  

SciTech Connect

Sound policy recommendations relating to the role of forest management in mitigating atmospheric carbon dioxide (CO₂) depend upon establishing accurate methodologies for quantifying forest carbon pools for large tracts of land that can be dynamically updated over time. Light Detection and Ranging (LiDAR) remote sensing is a promising technology for achieving accurate estimates of aboveground biomass and thereby carbon pools; however, not much is known about the accuracy of estimating biomass change and carbon flux from repeat LiDAR acquisitions containing different data sampling characteristics. In this study, discrete return airborne LiDAR data was collected in 2003 and 2009 across ~20,000 hectares (ha) of an actively managed, mixed conifer forest landscape in northern Idaho, USA. Forest inventory plots were established via a random stratified sampling design and sampled in 2003 and 2009. The Random Forest machine learning algorithm was used to establish statistical relationships between inventory data and forest structural metrics derived from the LiDAR acquisitions. Aboveground biomass maps were created for the study area based on statistical relationships developed at the plot level. Over this 6-year period, we found that the mean increase in biomass due to forest growth across the non-harvested portions of the study area was 4.8 metric tons per hectare (Mg/ha). In these non-harvested areas, we found a significant difference in biomass increase among forest successional stages, with a higher biomass increase in mature and old forest compared to stand initiation and young forest. Approximately 20% of the landscape had been disturbed by harvest activities during the six-year time period, representing a biomass loss of >70 Mg/ha in these areas. During the study period, these harvest activities outweighed growth at the landscape scale, resulting in an overall loss in aboveground carbon at this site. The 30-fold increase in sampling density between the 2003 and 2009 acquisitions did not affect the biomass estimates. Overall, LiDAR data coupled with field reference data offer a powerful method for calculating pools and changes in aboveground carbon in forested systems. The results of our study suggest that multitemporal LiDAR-based approaches are likely to be useful for high quality estimates of aboveground carbon change in conifer forest systems.
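The plot-level modelling step described above can be sketched with scikit-learn's RandomForestRegressor. Everything here (the choice of metrics and the synthetic training data) is illustrative, not the study's actual pipeline:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    # Hypothetical plot metrics: mean LiDAR height (m), 95th-percentile
    # height (m), canopy cover fraction -> field biomass (Mg/ha).
    X = rng.uniform([2.0, 5.0, 0.2], [30.0, 45.0, 0.95], size=(60, 3))
    y = 8.0 * X[:, 1] + 50.0 * X[:, 2] + rng.normal(0.0, 10.0, 60)

    model = RandomForestRegressor(n_estimators=500, oob_score=True,
                                  random_state=0)
    model.fit(X, y)
    print(model.oob_score_)  # out-of-bag R^2 of the biomass model

Fitting one model per acquisition and differencing the predicted 2003 and 2009 biomass maps then yields the landscape-scale change estimates discussed above.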

Lee Spangler; Lee A. Vierling; Eva K. Strand; Andrew T. Hudak; Jan U.H. Eitel; Sebastian Martinuzzi

2012-04-01

30

QUANTIFYING SPICULES  

SciTech Connect

Understanding the dynamic solar chromosphere is fundamental in solar physics. Spicules are an important feature of the chromosphere, connecting the photosphere to the corona, potentially mediating the transfer of energy and mass. The aim of this work is to study the properties of spicules over different regions of the Sun. Our goal is to investigate if there is more than one type of spicule, and how spicules behave in the quiet Sun, coronal holes, and active regions. We make use of high cadence and high spatial resolution Ca II H observations taken by Hinode/Solar Optical Telescope. Making use of a semi-automated detection algorithm, we self-consistently track and measure the properties of 519 spicules over different regions. We find clear evidence of two types of spicules. Type I spicules show a rise and fall and have typical lifetimes of 150-400 s and maximum ascending velocities of 15-40 km s⁻¹, while type II spicules have shorter lifetimes of 50-150 s, faster velocities of 30-110 km s⁻¹, and are not seen to fall down, but rather fade at around their maximum length. Type II spicules are the most common, seen in the quiet Sun and coronal holes. Type I spicules are seen mostly in active regions. There are regional differences between quiet-Sun and coronal hole spicules, likely attributable to the different field configurations. The properties of type II spicules are consistent with published results of rapid blueshifted events (RBEs), supporting the hypothesis that RBEs are their disk counterparts. For type I spicules we find the relations between their properties to be consistent with a magnetoacoustic shock wave driver, and with dynamic fibrils as their disk counterpart. The driver of type II spicules remains unclear from limb observations.

Pereira, Tiago M. D. [NASA Ames Research Center, Moffett Field, CA 94035 (United States); De Pontieu, Bart [Lockheed Martin Solar and Astrophysics Laboratory, Org. A021S, Building 252, 3251 Hanover Street, Palo Alto, CA 94304 (United States); Carlsson, Mats, E-mail: tiago.pereira@nasa.gov [Institute of Theoretical Astrophysics, P.O. Box 1029, Blindern, NO-0315 Oslo (Norway)

2012-11-01

31

Quantifiers induced by subjective expected value of sample information.  

PubMed

The ordered weighted averaging (OWA) operator provides a unified framework for multiattribute decision making (MADM) under uncertainty. In this paper, we attempt to tackle some issues arising from the quantifier guided aggregation using OWA operators. This allows us to consider a more general case involving the generation of a quantifier targeted at the specified decision maker (DM) by using sample information. In order to do that, we first develop a repeatable interactive procedure in which, with the given sample values and the expected values the DM provides according to personal preference, we build nonlinear optimization models to extract from the DM information about his/her decision attitude in an OWA weighting vector form. After that, with the obtained attitudinal weighting vectors we suggest a suitable quantifier just for this DM by means of piecewise linear interpolation. This obtained quantifier is totally derived from the behavior of the DM involved and thus inherently characterized by his/her own attitudinal character. Owing to the nature of this type of quantifier, we call it the subjective expected value of sample information-induced quantifier. We show some properties of the developed quantifier. We also prove the consistency of OWA aggregation guided by this type of quantifier. In contrast with parameterized quantifiers, our developed quantifiers are oriented toward the specified DMs with proper consideration of their decision attitudes or behavior characteristics, thus bringing about more intuitively appealing and convincing results in the quantifier guided OWA aggregation. PMID:25222722
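To make quantifier-guided aggregation concrete, here is a minimal Python sketch of an OWA operator with weights induced by a regular increasing monotone quantifier Q via w_i = Q(i/n) - Q((i-1)/n). The quadratic Q is just a stand-in example, not the paper's sample-derived, piecewise-linear quantifier:

    import numpy as np

    def owa(values, weights):
        # OWA: weights attach to rank positions of the arguments
        # sorted in descending order, not to particular attributes.
        return float(np.dot(np.sort(np.asarray(values))[::-1], weights))

    def quantifier_weights(q, n):
        # w_i = Q(i/n) - Q((i-1)/n), i = 1..n; sums to Q(1) - Q(0) = 1.
        return np.array([q(i / n) - q((i - 1) / n) for i in range(1, n + 1)])

    w = quantifier_weights(lambda r: r ** 2, 4)  # an and-like attitude
    print(w)                                     # [0.0625 0.1875 0.3125 0.4375]
    print(owa([0.9, 0.4, 0.7, 0.6], w))          # 0.55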

Guo, Kaihong

2014-10-01

32

"The Show"  

ERIC Educational Resources Information Center

For the past 16 years, the blue-collar city of Huntington, West Virginia, has rolled out the red carpet to welcome young wrestlers and their families as old friends. They have come to town chasing the same dream for a spot in what many of them call "The Show". For three days, under the lights of an arena packed with 5,000 fans, the state's best…

Gehring, John

2004-01-01

33

timid, or easily manipulated. This is not compassion. A marine drill sergeant may be demanding and results-driven, but can show compassion

E-print Network

timid, or easily manipulated. This is not compassion. A marine drill sergeant may be demanding and results-driven, but can show compassion when a recruit requires bereavement leave to attend a family

Kim, Duck O.

34

Sci Show  

NSDL National Science Digital Library

The Sci Show, an entertaining series of quirky YouTube videos, tackles topics ranging from "How Do Polarized Sunglasses Work" to "Strong Interaction: The Four Fundamental Forces of Physics." Most episodes are less than five minutes long, but they pack a wallop of handy science info. Anyone short on time but long on big questions will benefit from the series. Episodes will be helpful to teachers and parents looking to spark enthusiasm in young minds. Viewers may want to start with recent episodes like "Today's Mass Extinction," "World's First See-Through Animal" and "How Do Animals Change Color?" before digging into the archives for gems like "The Truth About Gingers" and "The Science of Lying."

35

Quantified Interference for a While Language  

Microsoft Academic Search

We show how an information theoretic approach can quantify interference in a simple imperative language that includes a looping construct. In this paper we focus on a particular case of this definition of interference: leakage of information from private variables to public ones in While language programs. The major result of the paper is a quantitative analysis for this language
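For a deterministic program with a uniformly distributed secret, the leakage in this style of analysis equals the Shannon entropy of the public output. A minimal Python sketch of that special case (the example programs are hypothetical, not from the paper):

    import math
    from collections import Counter

    def leakage_bits(program, secrets):
        # H(public output) when the secret is uniform over `secrets`
        # and `program` is a deterministic secret -> output function.
        outs = Counter(program(s) for s in secrets)
        n = len(secrets)
        return -sum((c / n) * math.log2(c / n) for c in outs.values())

    # A 4-bit secret observed through a threshold test leaks at most
    # 1 bit, and less when the branches are unbalanced:
    print(leakage_bits(lambda s: s >= 8, range(16)))   # 1.0
    print(leakage_bits(lambda s: s >= 12, range(16)))  # ~0.811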

David Clark; Sebastian Hunt; Pasquale Malacaria

2005-01-01

36

Discussion and Future Work Our results for the rainy season (Nov-Mar) show that 30% of the moisture flux leaving the Amazon

E-print Network

Abstract fragments: The ratio of ET to total moisture flux shows a maximum of about 40% over Bolivia with a secondary maximum ... resolution preliminary results indicate that 40% of the precipitation over Bolivia and 30% of that over ... we have considered that in a quasi-equilibrium regime the vertical profiles of total water vapor ...

Barbosa, Henrique

37

Quantifying Clinical Relevance  

PubMed Central

Communicating clinical trial results should include measures of effect size in order to quantify the clinical relevance of the results. P-values are not informative regarding the size of treatment effects. Cohen’s d and its variants are often used but are not easy to understand in terms of applicability to day-to-day clinical practice. Number needed to treat and number needed to harm can provide additional information about effect size that clinicians may find useful in clinical decision making, and although number needed to treat and number needed to harm are limited to dichotomous outcomes, it is recommended that they be considered for inclusion when describing clinical trial results. Examples are given using the fictional antipsychotic medications miracledone and fantastapine for the treatment of acute schizophrenia. PMID:25152844
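The number-needed-to-treat arithmetic is worth spelling out: NNT is the reciprocal of the absolute risk reduction (and NNH the reciprocal of the absolute risk increase). A minimal sketch with made-up event rates, in the spirit of the article's fictional drugs:

    import math

    def nnt(control_event_rate, treated_event_rate):
        # Number needed to treat = 1 / absolute risk reduction,
        # conventionally rounded up to a whole patient.
        return math.ceil(1.0 / (control_event_rate - treated_event_rate))

    # If 40% of placebo patients relapse vs. 25% on treatment,
    # ARR = 0.15, so ~7 patients must be treated to prevent one relapse.
    print(nnt(0.40, 0.25))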

2014-01-01

38

Quantifiable Lateral Flow Assay Test Strips  

NASA Technical Reports Server (NTRS)

As easy to read as a home pregnancy test, three Quantifiable Lateral Flow Assay (QLFA) strips used to test water for E. coli show different results. The brightly glowing control line on the far right of each strip indicates that all three tests ran successfully. But the glowing test line on the middle left and bottom strips reveal their samples were contaminated with E. coli bacteria at two different concentrations. The color intensity correlates with concentration of contamination.

2003-01-01

39

How often do German children and adolescents show signs of common mental health problems? Results from different methodological approaches - a cross-sectional study  

PubMed Central

Background Child and adolescent mental health problems are ubiquitous and burdensome. Their impact on functional disability, the high rates of accompanying medical illnesses and the potential to last until adulthood make them a major public health issue. While methodological factors cause variability of the results from epidemiological studies, there is a lack of prevalence rates of mental health problems in children and adolescents according to ICD-10 criteria from nationally representative samples. International findings suggest only a small proportion of children with function impairing mental health problems receive treatment, but information about the health care situation of children and adolescents is scarce. The aim of this epidemiological study was a) to classify symptoms of common mental health problems according to ICD-10 criteria in order to compare the statistical and clinical case definition strategies using a single set of data and b) to report ICD-10 codes from health insurance claims data. Methods a) Based on a clinical expert rating, questionnaire items were mapped on ICD-10 criteria; data from the Mental Health Module (BELLA study) were analyzed for relevant ICD-10 and cut-off criteria; b) Claims data were analyzed for relevant ICD-10 codes. Results According to parent report 7.5% (n = 208) met the ICD-10 criteria of a mild depressive episode and 11% (n = 305) showed symptoms of depression according to cut-off score; anxiety is reported in 5.6% (n = 156) and 11.6% (n = 323), conduct disorder in 15.2% (n = 373) and 14.6% (n = 357). Self-reported symptoms in 11- to 17-year-olds resulted in 15% (n = 279) reporting signs of a mild depression according to ICD-10 criteria (vs. 16.7% (n = 307) based on cut-off) and 10.9% (n = 201) reported symptoms of anxiety (vs. 15.4% (n = 283)). Results from routine data identify 0.9% (n = 1,196) with a depression diagnosis, 3.1% (n = 6,729) with anxiety and 1.4% (n = 3,100) with conduct disorder in outpatient health care. Conclusions Statistical and clinical case definition strategies show moderate concordance in depression and conduct disorder in a German national sample. Comparatively lower rates of children and adolescents with diagnosed mental health problems in the outpatient health care setting support the assumption that only a small number of children and adolescents in need of treatment receive it. PMID:24597565

2014-01-01

40

A high-density wireless underground sensor network (WUSN) to quantify hydro-ecological interactions for a UK floodplain; project background and initial results  

NASA Astrophysics Data System (ADS)

Floodplain meadows support some of the most diverse vegetation in the UK, and also perform key ecosystem services, such as flood storage and sediment retention. However, the UK now has less than 1500 ha of this unique habitat remaining. In order to conserve and better exploit the services provided by this grassland, an improved understanding of its functioning is essential. Vegetation functioning and species composition are known to be tightly correlated to the hydrological regime, and related temperature and nutrient regime, but the mechanisms controlling these relationships are not well established. The FUSE* project aims to investigate the spatiotemporal variability in vegetation functioning (e.g. photosynthesis and transpiration) and plant community composition in a floodplain meadow near Oxford, UK (Yarnton Mead), and their relationship to key soil physical variables (soil temperature and moisture content), soil nutrient levels and the water- and energy-balance. A distributed high-density Wireless Underground Sensor Network (WUSN) is in the process of being established on Yarnton Mead. The majority, or ideally all, of the sensing and transmitting components will be installed below ground, because Yarnton Mead is a SSSI (Site of Special Scientific Interest, due to its unique plant community) and because sheep or cattle occasionally graze on it and could damage the nodes. This prerequisite has implications for the maximum spacing between underground nodes and their communications technologies, in terms of signal strength, path losses and requirements for battery life. The success of underground wireless communication is highly dependent on the soil type and water content. This floodplain environment is particularly challenging in this context because the soil contains a large amount of clay near the surface and is therefore less favourable to EM wave propagation than sandy soils. Furthermore, due to high relative saturation levels (as a result of high groundwater levels and occasional overland flooding) considerable path losses are expected. Finally, the long-term below-ground installation of the nodes means that batteries cannot be replaced easily, therefore energy conservation schemes are required to be deployed on the nodes. We present a brief overview of the project and initial findings of the approach we have adopted to address these wireless communication issues. This involves tests covering a range of transmission frequencies, antennae types, and node placements. (*FUSE, Floodplain Underground SEnsors, funded by the UK Natural Environment Research Council, NE/I007288/1; start date 1 March 2011.)
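The soil path-loss problem sketched above is commonly handled with a modified Friis model in the WUSN literature (e.g. Vuran and Akyildiz), where the attenuation constant α and phase constant β come from the soil's complex permittivity and rise steeply with volumetric water content and clay fraction. A minimal sketch, with α and β as illustrative inputs rather than values from this project:

    import math

    def underground_path_loss_db(d_m, alpha_np_per_m, beta_rad_per_m):
        # L(dB) = 6.4 + 20 log10(d) + 20 log10(beta) + 8.69 * alpha * d:
        # spreading loss plus exponential absorption in the soil.
        return (6.4 + 20.0 * math.log10(d_m)
                + 20.0 * math.log10(beta_rad_per_m)
                + 8.69 * alpha_np_per_m * d_m)

    # Wet, clay-rich soil attenuates strongly, so buried nodes must be
    # closely spaced; ~118 dB over just 3 m for these assumed inputs:
    print(underground_path_loss_db(3.0, alpha_np_per_m=3.0,
                                   beta_rad_per_m=15.0))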

Verhoef, A.; Choudhary, B.; Morris, P. J.; McCann, J.

2012-04-01

41

Expansion for Universal Quantifiers  

E-print Network

Expansion is an operation on typings (i.e., pairs of typing environments and result types) defined originally in type systems for the lambda-calculus with intersection types in order to obtain principal (i.e., most informative, strongest) typings. In a type inference scenario, expansion allows postponing choices for whether and how to use non-syntax-driven typing rules (e.g., intersection introduction) until enough information has been gathered to make the right decision. Furthermore, these choices can be equivalent to inserting uses of such typing rules at deeply nested positions in a typing derivation, without needing to actually inspect or modify (or even have) the typing derivation. Expansion has in recent years become simpler due to the use of expansion variables (e.g., in System E). This paper extends expansion and expansion variables to systems with forall-quantifiers. We present System Fs, an extension of System F with expansion, and prove its main properties. This system turns type inference into a c...

Lenglet, Sergueï

2012-01-01

42

Storytelling Slide Shows to Improve Diabetes and High Blood Pressure Knowledge and Self-Efficacy: Three-Year Results among Community Dwelling Older African Americans  

ERIC Educational Resources Information Center

This study combined the African American tradition of oral storytelling with the Hispanic medium of "Fotonovelas." A staggered pretest posttest control group design was used to evaluate four Storytelling Slide Shows on health that featured community members. A total of 212 participants were recruited for the intervention and 217 for the…

Bertera, Elizabeth M.

2014-01-01

43

The Relevance of External Quality Assessment for Molecular Testing for ALK Positive Non-Small Cell Lung Cancer: Results from Two Pilot Rounds Show Room for Optimization  

PubMed Central

Background and Purpose Molecular profiling should be performed on all advanced non-small cell lung cancer with non-squamous histology to allow treatment selection. Currently, this should include EGFR mutation testing and testing for ALK rearrangements. ROS1 is another emerging target. ALK rearrangement status is a critical biomarker to predict response to tyrosine kinase inhibitors such as crizotinib. To promote high quality testing in non-small cell lung cancer, the European Society of Pathology has introduced an external quality assessment scheme. This article summarizes the results of the first two pilot rounds organized in 2012–2013. Materials and Methods Tissue microarray slides consisting of cell lines and resection specimens were distributed with the request for routine ALK testing using IHC or FISH. Participation in ALK FISH testing included the interpretation of four digital FISH images. Results Data from 173 different laboratories were obtained. Results demonstrate decreased error rates in the second round for both ALK FISH and ALK IHC, although the error rates were still high and the need for external quality assessment in laboratories performing ALK testing is evident. Error rates obtained by FISH were lower than by IHC. The lowest error rates were observed for the interpretation of digital FISH images. Conclusion There was a large variety in FISH enumeration practices. Based on the results from this study, recommendations for the methodology, analysis, interpretation and result reporting were issued. External quality assessment is a crucial element to improve the quality of molecular testing. PMID:25386659

Tembuyser, Lien; Tack, Véronique; Zwaenepoel, Karen; Pauwels, Patrick; Miller, Keith; Bubendorf, Lukas; Kerr, Keith; Schuuring, Ed; Thunnissen, Erik; Dequeker, Elisabeth M. C.

2014-01-01

44

This in-depth survey of 30 companies reveals actual goings-on in software production. Results show that, while practice is 10 years behind

E-print Network

in the late 1960's to describe ways to develop, manage, and maintain software so that resulting products science community has created a need to assess the impact that numerous advances have had on actual of different program development environments in industry to determine the state of the art in software

Basili, Victor R.

45

RESULTS FROM TWO PHARMACOTHERAPY TRIALS SHOW ALCOHOLIC SMOKERS WERE MORE SEVERELY ALCOHOL DEPENDENT BUT LESS PRONE TO RELAPSE THAN ALCOHOLIC NON-SMOKERS  

E-print Network

Abstract — Aims: To assess the role of smoking on treatment outcome in quitting alcoholics on the background of the priming or coping hypothesis (Rohsenow et al., 1997). Methods: Data sets of placebo-treated patients of the German phase III trial of naltrexone (Gastpar et al., 2002) and of acamprosate-treated patients of a German phase IV trial (Soyka et al., 2002) were reanalyzed. Differences between smoking and non-smoking alcoholics were evaluated using χ²-, t- or ANOVA-tests, relapse rates using survival techniques with Cox regression. Results: Smoking alcoholics differed significantly from non-smoking alcoholics regarding sociodemographic variables (e.g. more males, more often living alone) and severity indicators of alcoholism (e.g. quantity, onset, related problems). In the naltrexone study time to first relapse was significantly longer for smoking alcoholics compared to non-smoking alcoholics (hazard ratio = 2.26; P = 0.036). The same effect was seen in the acamprosate study (hazard ratio = 1.34; P = 0.015); estimated abstinence rates after 24 weeks were 38 % for smoking alcoholics compared to 28 % for non-smoking alcoholics (P alcoholics included in two pharmacotherapy trials. Although the underlying mechanisms remain unclear, our findings are in favour of the coping hypothesis. The results challenge the validity of the dependence syndrome.

Lutz G. Schmidt; Michael N. Smolka

46

Quantifying mesoscale soil moisture

E-print Network

…to that of a pixel for the Soil Moisture and Ocean Salinity (SMOS) satellite mission. … 0.82 for the soil moisture distributions that were surveyed and (2) estimation of soil moisture profiles by combin…

Zreda, Marek

47

Quantifying IT estimation risks  

Microsoft Academic Search

A statistical method is proposed to quantify the impact of factors that influence the quality of the estimation of costs for IT-enabled business projects. We call these factors risk drivers as they influence the risk of the misestimation of project costs. The method can effortlessly be transposed for usage on other important IT key performance indicators (KPIs), such as

G. P. Kulk; R. J. Peters; Chris Verhoef

2009-01-01

48

Quantifying Faculty Workloads.  

ERIC Educational Resources Information Center

Teaching load depends on many variables; however, most colleges define it strictly in terms of contact or credit hours. The failure to give weight to variables such as the number of preparations, the number of students served, and committee and other noninstructional assignments is usually due to the lack of a formula that will quantify the effects of these…

Archer, J. Andrew

49

Value of Fused 18F-Choline-PET/MRI to Evaluate Prostate Cancer Relapse in Patients Showing Biochemical Recurrence after EBRT: Preliminary Results  

PubMed Central

Purpose. We compared the accuracy of 18F-Choline-PET/MRI with that of multiparametric MRI (mMRI), 18F-Choline-PET/CT, 18F-Fluoride-PET/CT, and contrast-enhanced CT (CeCT) in detecting relapse in patients with suspected relapse of prostate cancer (PC) after external beam radiotherapy (EBRT). We assessed the association between standard uptake value (SUV) and apparent diffusion coefficient (ADC). Methods. We evaluated 21 patients with biochemical relapse after EBRT. Patients underwent 18F-Choline-PET/contrast-enhanced (Ce)CT, 18F-Fluoride-PET/CT, and mMRI. Imaging coregistration of PET and mMRI was performed. Results. 18F-Choline-PET/MRI was positive in 18/21 patients, with a detection rate (DR) of 86%. DRs of 18F-Choline-PET/CT, CeCT, and mMRI were 76%, 43%, and 81%, respectively. In terms of DR the only significant difference was between 18F-Choline-PET/MRI and CeCT. On lesion-based analysis, the accuracy of 18F-Choline-PET/MRI, 18F-Choline-PET/CT, CeCT, and mMRI was 99%, 95%, 70%, and 85%, respectively. Accuracy, sensitivity, and NPV of 18F-Choline-PET/MRI were significantly higher than those of both mMRI and CeCT. On whole-body assessment of bone metastases, the sensitivity of 18F-Choline-PET/CT and 18F-Fluoride-PET/CT was significantly higher than that of CeCT. Regarding local and lymph node relapse, we found a significant inverse correlation between ADC and SUV-max. Conclusion. 18F-Choline-PET/MRI is a promising technique in detecting PC relapse. PMID:24877053

Piccardo, Arnoldo; Paparo, Francesco; Picazzo, Riccardo; Naseri, Mehrdad; Ricci, Paolo; Marziano, Andrea; Bacigalupo, Lorenzo; Biscaldi, Ennio; Rollandi, Gian Andrea; Grillo-Ruggieri, Filippo; Farsad, Mohsen

2014-01-01

50

Quantifying tumour heterogeneity with CT  

PubMed Central

Abstract Heterogeneity is a key feature of malignancy associated with adverse tumour biology. Quantifying heterogeneity could provide a useful non-invasive imaging biomarker. Heterogeneity on computed tomography (CT) can be quantified using texture analysis which extracts spatial information from CT images (unenhanced, contrast-enhanced and derived images such as CT perfusion) that may not be perceptible to the naked eye. The main components of texture analysis can be categorized into image transformation and quantification. Image transformation filters the conventional image into its basic components (spatial, frequency, etc.) to produce derived subimages. Texture quantification techniques include structural-, model- (fractal dimensions), statistical- and frequency-based methods. The underlying tumour biology that CT texture analysis may reflect includes (but is not limited to) tumour hypoxia and angiogenesis. Emerging studies show that CT texture analysis has the potential to be a useful adjunct in clinical oncologic imaging, providing important information about tumour characterization, prognosis and treatment prediction and response. PMID:23545171
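The transformation-then-quantification pipeline described above is straightforward to prototype. The following is a minimal sketch, not the paper's implementation: a synthetic 2-D array stands in for a real CT slice, a Laplacian-of-Gaussian filter (scipy) serves as the image-transformation step, and first-order histogram statistics serve as the quantification step.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

# Synthetic stand-in for a CT slice (values loosely Hounsfield-like).
rng = np.random.default_rng(0)
image = rng.normal(40.0, 10.0, size=(128, 128))

# Image transformation: Laplacian-of-Gaussian band-pass filter; sigma sets
# the spatial scale (in pixels) of the texture being probed.
filtered = gaussian_laplace(image, sigma=2.0)

# Quantification: first-order statistics of the filtered intensity histogram.
def texture_features(img, bins=64):
    hist, _ = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return {
        "mean": float(img.mean()),
        "std": float(img.std()),
        "entropy": float(-(p * np.log2(p)).sum()),
        "uniformity": float((p ** 2).sum()),
    }

print(texture_features(filtered))
```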

Miles, Kenneth A.

2013-01-01

51

Meditations on Quantified Constraint Satisfaction  

E-print Network

The quantified constraint satisfaction problem (QCSP) is the problem of deciding, given a structure and a first-order prenex sentence whose quantifier-free part is the conjunction of atoms, whether or not the sentence holds on the structure. One obtains a family of problems by defining, for each structure B, the problem QCSP(B) to be the QCSP where the structure is fixed to be B. In this article, we offer a viewpoint on the research program of understanding the complexity of the problems QCSP(B) on finite structures. In particular, we propose and discuss a group of conjectures; throughout, we attempt to place the conjectures in relation to existing results and to emphasize open issues and potential research directions.
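To make the problem statement concrete, here is a brute-force QCSP evaluator over a finite structure. It is illustrative only (worst-case exponential time), and the encoding of the structure as a domain list plus named relations is an assumption of this sketch, not anything from the article.

```python
from itertools import product  # not needed below, but handy for enumerating tuples

# prefix: list of ('forall' | 'exists', var); atoms: list of (relation_name, vars).
def qcsp(domain, relations, prefix, atoms, assignment=None):
    assignment = assignment or {}
    if not prefix:
        # Quantifier-free part: conjunction of atoms under the full assignment.
        return all(tuple(assignment[v] for v in vs) in relations[r]
                   for r, vs in atoms)
    q, var = prefix[0]
    branches = (qcsp(domain, relations, prefix[1:], atoms, {**assignment, var: d})
                for d in domain)
    return all(branches) if q == 'forall' else any(branches)

# Example: on the 2-element structure with NEQ = {(0,1), (1,0)},
# the sentence  forall x exists y . NEQ(x, y)  holds.
B = {'NEQ': {(0, 1), (1, 0)}}
print(qcsp([0, 1], B, [('forall', 'x'), ('exists', 'y')],
           [('NEQ', ('x', 'y'))]))  # True
```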

Chen, Hubie

2012-01-01

52

Visualizing and quantifying the suppressive effects of glucocorticoids on the tadpole immune system in vivo  

NSDL National Science Digital Library

This article presents a laboratory module developed to show students how glucocorticoid receptor activity can be pharmacologically modulated in Xenopus laevis tadpoles and the resulting effects on thymus gland size visualized and quantified in vivo.

Alexander Schreiber (St. Lawrence University)

2011-12-01

53

Quantifying water diffusion in secondary organic material  

NASA Astrophysics Data System (ADS)

Recent research suggests that some secondary organic aerosol (SOA) is highly viscous under certain atmospheric conditions. This may have important consequences for equilibration timescales, SOA growth, heterogeneous chemistry and ice nucleation. In order to quantify these effects, knowledge of the diffusion coefficients of relevant gas species within aerosol particles is vital. In this work, a Raman isotope tracer method is used to quantify water diffusion coefficients over a range of atmospherically relevant humidity and temperature conditions. D2O is observed as it diffuses from the gas phase into a disk of aqueous solution, without the disk changing in size or viscosity. An analytical solution of Fick's second law is then used with a fitting procedure to determine water diffusion coefficients in reference materials for method validation. The technique is then extended to compounds of atmospheric relevance and α-pinene secondary organic material. We produce water diffusion coefficients from 20 to 80% RH at 23.5° C for sucrose, levoglucosan, M5AS and MgSO4. For levoglucosan we show that under conditions where a particle bounces, water diffusion in aqueous solutions can be fast (a fraction of a second for a 100 nm radius). For sucrose solutions, we also show that the Stokes-Einstein relation breaks down at high viscosity and cannot be used to predict water diffusion timescales with accuracy. In addition, we also quantify water diffusion coefficients in α-pinene SOM from 20 to 80% RH and over temperatures from 6 to -30° C. Our results suggest that, at 6° C, water diffusion in α-pinene SOA is not kinetically limited on the second timescale, even at 20% RH. As temperatures decrease, however, diffusion slows and may become an increasingly limiting factor for atmospheric processes. A parameterization for the diffusion coefficient of water in α-pinene secondary organic material, as a function of relative humidity and temperature, is presented. The implications for atmospheric processes such as ice nucleation and heterogeneous chemistry in the mid- and upper-troposphere will be discussed.
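The fitting step can be illustrated with a toy version of the procedure. This sketch assumes a 1-D semi-infinite geometry with constant surface concentration (the actual disk geometry and Raman read-out differ), and all numbers, including `true_D`, are invented for illustration.

```python
import numpy as np
from scipy.special import erfc
from scipy.optimize import curve_fit

# Analytical solution of Fick's second law for a semi-infinite medium with
# constant surface concentration: C(x, t) = c0 * erfc(x / (2 sqrt(D t))).
def profile(x, D, t=100.0, c0=1.0):
    return c0 * erfc(x / (2.0 * np.sqrt(D * t)))

x = np.linspace(0.0, 200e-6, 50)       # depth into the sample (m)
true_D = 1e-12                          # m^2/s, plausible for a viscous material
rng = np.random.default_rng(1)
data = profile(x, true_D) + rng.normal(0.0, 0.01, x.size)  # "measured" profile

# Fit D by least squares against the analytical solution.
(D_fit,), _ = curve_fit(lambda x, D: profile(x, D), x, data, p0=[1e-11])
print(f"fitted D = {D_fit:.2e} m^2/s")
```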

Price, Hannah; Murray, Benjamin; Mattsson, Johan; O'Sullivan, Daniel; Wilson, Theodore; Zhang, Yue; Martin, Scot

2014-05-01

54

On quantifying insect movements  

SciTech Connect

We elaborate on methods described by Turchin, Odendaal, and Rausher for quantifying insect movement pathways. We note the need to scale measurement resolution to the study insects and the questions being asked, and we discuss the use of surveying instrumentation for recording sequential positions of individuals on pathways. We itemize several measures that may be used to characterize movement pathways and illustrate these by comparisons among several Eleodes beetles occurring in shortgrass steppe. The fractal dimension of pathways may provide insights not available from absolute measures of pathway configuration. Finally, we describe a renormalization procedure that may be used to remove sequential interdependence among locations of moving individuals while preserving the basic attributes of the pathway.
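The fractal dimension of a pathway can be estimated with the divider ("compass") method: measured path length L scales with ruler size d as L ∝ d^(1−D), so D is one minus the slope of log L against log d. A minimal sketch on a synthetic random walk (not the Eleodes data) follows.

```python
import numpy as np

# Approximate divider length: walk along the path with a fixed ruler,
# jumping to the first recorded point at least `ruler` away each time.
def divider_length(path, ruler):
    pos, total, i = path[0], 0.0, 0
    while i < len(path) - 1:
        j = i + 1
        while j < len(path) and np.linalg.norm(path[j] - pos) < ruler:
            j += 1
        if j == len(path):
            break
        pos, total, i = path[j], total + ruler, j
    return total

rng = np.random.default_rng(2)
path = np.cumsum(rng.normal(size=(5000, 2)), axis=0)   # synthetic pathway

rulers = np.array([2.0, 4.0, 8.0, 16.0, 32.0])
lengths = np.array([divider_length(path, r) for r in rulers])
slope = np.polyfit(np.log(rulers), np.log(lengths), 1)[0]
print(f"estimated fractal dimension D = {1 - slope:.2f}")  # ~2 for Brownian paths
```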

Wiens, J.A.; Crist, T.O. (Colorado State Univ., Fort Collins (United States)); Milne, B.T. (Univ. of New Mexico, Albuquerque (United States))

1993-08-01

55

Quantifying the Wave Driving of the Stratosphere  

NASA Technical Reports Server (NTRS)

The zonal mean eddy heat flux is directly proportional to the wave activity that propagates from the troposphere into the stratosphere. This quantity is a simple eddy diagnostic which is easily calculated from conventional meteorological analyses. Because this "wave driving" of the stratosphere has a strong impact on the stratospheric temperature, it is necessary to compare the impact of the flux with respect to stratospheric radiative changes caused by greenhouse gas changes. Hence, we must understand the precision and accuracy of the heat flux derived from our global meteorological analyses. Herein, we quantify the stratospheric heat flux using five different meteorological analyses, and show that there are 30% differences between these analyses during the disturbed conditions of the northern hemisphere winter. Such large differences result from the planetary differences in the stationary temperature and meridional wind fields. In contrast, planetary transient waves show excellent agreement amongst these five analyses, and this transient heat flux appears to have a long term downward trend.
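The diagnostic itself is simple to compute from gridded analyses: deviations of v and T from their zonal means are multiplied and zonally averaged. A sketch with synthetic fields follows; a real calculation would read v and T from a meteorological analysis at a chosen pressure level instead.

```python
import numpy as np

# Synthetic (lat, lon) fields standing in for a meteorological analysis:
# a wave-1 pattern in both meridional wind v and temperature T.
nlat, nlon = 73, 144
lon = np.linspace(0.0, 2.0 * np.pi, nlon, endpoint=False)
rng = np.random.default_rng(3)
v = 10.0 * np.sin(lon)[None, :] + rng.normal(0.0, 1.0, (nlat, nlon))  # m/s
T = 240.0 + 5.0 * np.sin(lon)[None, :]                                # K

# Eddy components: deviations from the zonal mean along the longitude axis.
v_star = v - v.mean(axis=1, keepdims=True)
T_star = T - T.mean(axis=1, keepdims=True)

# Zonal-mean eddy heat flux [v*T*]: one value per latitude circle.
eddy_heat_flux = (v_star * T_star).mean(axis=1)
print(eddy_heat_flux[:5], "K m/s")
```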

Newman, Paul A.; Nash, Eric R.

1999-01-01

56

Quantifying the Nonclassicality of Operations  

E-print Network

Deep insight can be gained into the nature of nonclassical correlations by studying the quantum operations that create them. Motivated by this we propose a measure of nonclassicality of a quantum operation utilizing the relative entropy to quantify its commutativity with the completely dephasing operation. We show that our measure of nonclassicality is a sum of two independent contributions, the generating power -- its ability to produce nonclassical states out of classical ones, and the distinguishing power -- its usefulness to a classical observer for distinguishing between classical and nonclassical states. Each of these effects can be exploited individually in quantum protocols. We further show that our measure leads to an interpretation of quantum discord as the difference in superdense coding capacities between a quantum state and the best classical state when both are produced at a source that makes a classical error during transmission.

Sebastian Meznaric; Stephen R. Clark; Animesh Datta

2012-07-30

57

Quantifying the Nonclassicality of Operations  

NASA Astrophysics Data System (ADS)

Deep insight can be gained into the nature of nonclassical correlations by studying the quantum operations that create them. Motivated by this we propose a measure of nonclassicality of a quantum operation utilizing the relative entropy to quantify its commutativity with the completely dephasing operation. We show that our measure of nonclassicality is a sum of two independent contributions, the generating power—its ability to produce nonclassical states out of classical ones, and the distinguishing power—its usefulness to a classical observer for distinguishing between classical and nonclassical states. Each of these effects can be exploited individually in quantum protocols. We further show that our measure leads to an interpretation of quantum discord as the difference in superdense coding capacities between a quantum state and the best classical state when both are produced at a source that makes a classical error during transmission.

Meznaric, Sebastian; Clark, Stephen R.; Datta, Animesh

2013-02-01

58

Quantifier Comprehension in Corticobasal Degeneration  

ERIC Educational Resources Information Center

In this study, we investigated patients with focal neurodegenerative diseases to examine a formal linguistic distinction between classes of generalized quantifiers, like "some X" and "less than half of X." Our model of quantifier comprehension proposes that number knowledge is required to understand both first-order and higher-order quantifiers.…

McMillan, Corey T.; Clark, Robin; Moore, Peachie; Grossman, Murray

2006-01-01

59

Quantifying social group evolution.  

PubMed

The rich set of interactions between individuals in society results in complex community structure, capturing highly connected circles of friends, families or professional cliques in a social network. Thanks to frequent changes in the activity and communication patterns of individuals, the associated social and communication network is subject to constant evolution. Our knowledge of the mechanisms governing the underlying community dynamics is limited, but is essential for a deeper understanding of the development and self-optimization of society as a whole. We have developed an algorithm based on clique percolation that allows us to investigate the time dependence of overlapping communities on a large scale, and thus uncover basic relationships characterizing community evolution. Our focus is on networks capturing the collaboration between scientists and the calls between mobile phone users. We find that large groups persist for longer if they are capable of dynamically altering their membership, suggesting that an ability to change the group composition results in better adaptability. The behaviour of small groups displays the opposite tendency-the condition for stability is that their composition remains unchanged. We also show that knowledge of the time commitment of members to a given community can be used for estimating the community's lifetime. These findings offer insight into the fundamental differences between the dynamics of small groups and large institutions. PMID:17410175
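The clique-percolation idea can be sketched with networkx's generic `k_clique_communities` (a generic implementation, not the authors' algorithm): communities are extracted at two snapshots of an evolving graph and matched across snapshots by their overlap.

```python
import networkx as nx
from networkx.algorithms.community import k_clique_communities

def communities(g, k=3):
    # Communities = unions of adjacent k-cliques (sharing k-1 nodes).
    return [frozenset(c) for c in k_clique_communities(g, k)]

def jaccard(a, b):
    return len(a & b) / len(a | b)

# Snapshot 1: two triangles sharing a single node -> two communities.
g1 = nx.Graph([(1, 2), (2, 3), (1, 3), (3, 4), (4, 5), (3, 5)])
# Snapshot 2: the second group grows by recruiting node 6.
g2 = g1.copy()
g2.add_edges_from([(5, 6), (4, 6)])

# Match communities between snapshots by overlap (crudest possible matching).
for c1 in communities(g1):
    for c2 in communities(g2):
        if jaccard(c1, c2) > 0:
            print(sorted(c1), "->", sorted(c2), f"overlap={jaccard(c1, c2):.2f}")
```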

Palla, Gergely; Barabási, Albert-László; Vicsek, Tamás

2007-04-01

60

Quantifying social group evolution  

NASA Astrophysics Data System (ADS)

The rich set of interactions between individuals in society results in complex community structure, capturing highly connected circles of friends, families or professional cliques in a social network. Thanks to frequent changes in the activity and communication patterns of individuals, the associated social and communication network is subject to constant evolution. Our knowledge of the mechanisms governing the underlying community dynamics is limited, but is essential for a deeper understanding of the development and self-optimization of society as a whole. We have developed an algorithm based on clique percolation that allows us to investigate the time dependence of overlapping communities on a large scale, and thus uncover basic relationships characterizing community evolution. Our focus is on networks capturing the collaboration between scientists and the calls between mobile phone users. We find that large groups persist for longer if they are capable of dynamically altering their membership, suggesting that an ability to change the group composition results in better adaptability. The behaviour of small groups displays the opposite tendency-the condition for stability is that their composition remains unchanged. We also show that knowledge of the time commitment of members to a given community can be used for estimating the community's lifetime. These findings offer insight into the fundamental differences between the dynamics of small groups and large institutions.

Palla, Gergely; Barabási, Albert-László; Vicsek, Tamás

2007-04-01

61

Quantifying traffic exposure.  

PubMed

Living near traffic adversely affects health outcomes. Traffic exposure metrics include distance to high-traffic roads, traffic volume on nearby roads, traffic within buffer distances, measured pollutant concentrations, land-use regression estimates of pollution concentrations, and others. We used Geographic Information System software to explore a new approach using traffic count data and a kernel density calculation to generate a traffic density surface with a resolution of 50 m. The density value in each cell reflects all the traffic on all the roads within the distance specified in the kernel density algorithm. The effect of a given roadway on the raster cell value depends on the amount of traffic on the road segment, its distance from the raster cell, and the form of the algorithm. We used a Gaussian algorithm in which traffic influence became insignificant beyond 300 m. This metric integrates the deleterious effects of traffic rather than focusing on one pollutant. The density surface can be used to impute exposure at any point, and it can be used to quantify integrated exposure along a global positioning system route. The traffic density calculation compares favorably with other metrics for assessing traffic exposure and can be used in a variety of applications. PMID:24045427
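A minimal sketch of such a density surface, assuming hypothetical segment coordinates and traffic volumes: counts are rasterized onto a 50 m grid and smoothed with scipy's Gaussian filter, with sigma chosen so the influence becomes negligible beyond roughly 300 m (about three standard deviations).

```python
import numpy as np
from scipy.ndimage import gaussian_filter

cell = 50.0                                   # raster resolution in metres
grid = np.zeros((200, 200))                   # 10 km x 10 km study area

# Hypothetical road segments: (x, y, average daily traffic volume).
segments = [(2500.0, 5000.0, 20000.0), (7400.0, 3100.0, 8000.0)]
for x, y, volume in segments:
    grid[int(y // cell), int(x // cell)] += volume

# Gaussian kernel: sigma = 100 m, so influence is ~negligible beyond 300 m.
density = gaussian_filter(grid, sigma=100.0 / cell)

# Impute exposure at any point, e.g. a residence at (2600 m, 5100 m).
print(density[int(5100 // cell), int(2600 // cell)])
```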

Pratt, Gregory C; Parson, Kris; Shinoda, Naomi; Lindgren, Paula; Dunlap, Sara; Yawn, Barbara; Wollan, Peter; Johnson, Jean

2014-01-01

62

Quantifying the nonlinearity of a quantum oscillator  

E-print Network

We address the quantification of nonlinearity for quantum oscillators and introduce two measures based on the properties of the ground state rather than on the form of the potential itself. The first measure is a fidelity-based one, and corresponds to the renormalized Bures distance between the ground state of the considered oscillator and the ground state of a reference harmonic oscillator. Then, in order to avoid the introduction of this auxiliary oscillator, we introduce a different measure based on the non-Gaussianity (nG) of the ground state. The two measures are evaluated for a sample of significant nonlinear potentials and their properties are discussed in some detail. We show that the two measures are monotone functions of each other in most cases, and this suggests that the nG-based measure is a suitable choice to capture the anharmonic nature of a quantum oscillator, and to quantify its nonlinearity independently on the specific features of the potential. We also provide examples of potentials where the Bures measure cannot be defined, due to the lack of a proper reference harmonic potential, while the nG-based measure properly quantify their nonlinear features. Our results may have implications in experimental applications where access to the effective potential is limited, e.g., in quantum control, and protocols rely on information about the ground or thermal state.

Matteo G. A. Paris; Marco G. Genoni; Nathan Shammah; Berihu Teklu

2014-05-05

63

Quantifying Anderson's fault types  

NASA Astrophysics Data System (ADS)

Anderson [1905] explained three basic types of faulting (normal, strike-slip, and reverse) in terms of the shape of the causative stress tensor and its orientation relative to the Earth's surface. Quantitative parameters can be defined which contain information about both shape and orientation [Célérier, 1995], thereby offering a way to distinguish fault-type domains on plots of regional stress fields and to quantify, for example, the degree of normal-faulting tendencies within strike-slip domains. This paper offers a geometrically motivated generalization of Angelier's [1979, 1984, 1990] shape parameters Φ and Ψ to new quantities named AΦ and AΨ. In their simple forms, AΦ varies from 0 to 1 for normal, 1 to 2 for strike-slip, and 2 to 3 for reverse faulting, and AΨ ranges from 0° to 60°, 60° to 120°, and 120° to 180°, respectively. After scaling, AΦ and AΨ agree to within 2% (or 1°), a difference of little practical significance, although AΦ has smoother analytical properties. A formulation distinguishing horizontal axes as well as the vertical axis is also possible, yielding an AΦ ranging from -3 to +3 and AΨ from -180° to +180°. The geometrically motivated derivation in three-dimensional stress space presented here may aid intuition and offers a natural link with traditional ways of plotting yield and failure criteria. Examples are given, based on models of Bird [1996] and Bird and Kong [1994], of the use of Anderson fault parameters AΦ and AΨ for visualizing tectonic regimes defined by regional stress fields.
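A worked example, assuming the commonly cited form of the simple parameter, AΦ = (n + 0.5) + (−1)^n (Φ − 0.5) with stress shape ratio Φ = (σ2 − σ3)/(σ1 − σ3) and n = 0, 1, 2 for normal, strike-slip and reverse regimes (i.e., which principal stress is vertical); consult the paper for the exact definitions and the scaled variants.

```python
# Sketch of the generalized Anderson fault parameter (assumed formulation,
# see lead-in): maps regime + stress shape onto a single number in [0, 3].
def a_phi(s1, s2, s3, regime):
    n = {"normal": 0, "strike-slip": 1, "reverse": 2}[regime]
    phi = (s2 - s3) / (s1 - s3)      # stress shape ratio, 0 <= phi <= 1
    return (n + 0.5) + (-1) ** n * (phi - 0.5)

# Principal stresses (illustrative, MPa): s1 >= s2 >= s3.
print(a_phi(100.0, 60.0, 20.0, "normal"))       # 0..1: normal faulting
print(a_phi(100.0, 60.0, 20.0, "strike-slip"))  # 1..2: strike-slip
print(a_phi(100.0, 60.0, 20.0, "reverse"))      # 2..3: reverse
```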

Simpson, Robert W.

1997-08-01

64

Quantifying errors in trace species transport modeling.  

PubMed

One expectation when computationally solving an Earth system model is that a correct answer exists, that with adequate physical approximations and numerical methods our solutions will converge to that single answer. With such hubris, we performed a controlled numerical test of the atmospheric transport of CO(2) using 2 models known for accurate transport of trace species. Resulting differences were unexpectedly large, indicating that in some cases, scientific conclusions may err because of lack of knowledge of the numerical errors in tracer transport models. By doubling the resolution, thereby reducing numerical error, both models show some convergence to the same answer. Now, under realistic conditions, we identify a practical approach for finding the correct answer and thus quantifying the advection error. PMID:19066224

Prather, Michael J; Zhu, Xin; Strahan, Susan E; Steenrod, Stephen D; Rodriguez, Jose M

2008-12-16

65

Quantifying errors in trace species transport modeling  

PubMed Central

One expectation when computationally solving an Earth system model is that a correct answer exists, that with adequate physical approximations and numerical methods our solutions will converge to that single answer. With such hubris, we performed a controlled numerical test of the atmospheric transport of CO2 using 2 models known for accurate transport of trace species. Resulting differences were unexpectedly large, indicating that in some cases, scientific conclusions may err because of lack of knowledge of the numerical errors in tracer transport models. By doubling the resolution, thereby reducing numerical error, both models show some convergence to the same answer. Now, under realistic conditions, we identify a practical approach for finding the correct answer and thus quantifying the advection error. PMID:19066224

Prather, Michael J.; Zhu, Xin; Strahan, Susan E.; Steenrod, Stephen D.; Rodriguez, Jose M.

2008-01-01

66

Terahertz spectroscopy for quantifying refined oil mixtures.  

PubMed

In this paper, the absorption coefficient spectra of samples prepared as mixtures of gasoline and diesel in different proportions are obtained by terahertz time-domain spectroscopy. To quantify the components of refined oil mixtures, a method is proposed to evaluate the best frequency band for regression analysis. With the data in this frequency band, dualistic linear regression fitting is used to determine the volume fractions of gasoline and diesel in the mixture based on the Beer-Lambert law. The minimum regression R-squared is 0.99967, and the mean error of the fitted volume fraction of 97# gasoline is 4.3%. The results show that refined oil mixtures can be quantitatively analyzed through absorption coefficient spectra at terahertz frequencies, which has bright application prospects in the storage and transportation of refined oil. PMID:22907017
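The regression step amounts to modelling the mixture's absorption coefficient as a volume-fraction-weighted sum of the pure-component spectra and solving by linear least squares. A sketch with synthetic spectra follows; the real spectra and the band-selection step are not reproduced here.

```python
import numpy as np

# Synthetic stand-ins for the pure-component absorption spectra in the
# chosen THz band (real spectra would be measured by THz-TDS).
freqs = np.linspace(0.2, 1.2, 100)               # THz
alpha_gas = 5.0 + 2.0 * freqs                    # "gasoline" spectrum
alpha_diesel = 4.0 + 6.0 * freqs ** 2            # "diesel" spectrum

# Beer-Lambert mixture: weighted sum of the pure spectra plus noise.
true_fraction = 0.3                               # 30 % gasoline by volume
mix = true_fraction * alpha_gas + (1.0 - true_fraction) * alpha_diesel
mix += np.random.default_rng(4).normal(0.0, 0.05, freqs.size)

# Dualistic (two-regressor) linear regression for the volume fractions.
A = np.column_stack([alpha_gas, alpha_diesel])
(f_gas, f_diesel), *_ = np.linalg.lstsq(A, mix, rcond=None)
print(f"gasoline: {f_gas:.3f}, diesel: {f_diesel:.3f}")   # ~0.30 / ~0.70
```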

Li, Yi-nan; Li, Jian; Zeng, Zhou-mo; Li, Jie; Tian, Zhen; Wang, Wei-kui

2012-08-20

67

Quantifying decoherence in continuous variable systems  

E-print Network

We present a detailed report on the decoherence of quantum states of continuous variable systems under the action of a quantum optical master equation resulting from the interaction with general Gaussian uncorrelated environments. The rate of decoherence is quantified by relating it to the decay rates of various, complementary measures of the quantum nature of a state, such as the purity, some nonclassicality indicators in phase space and, for two-mode states, entanglement measures and total correlations between the modes. Different sets of physically relevant initial configurations are considered, including one- and two-mode Gaussian states, number states, and coherent superpositions. Our analysis shows that, generally, the use of initially squeezed configurations does not help to preserve the coherence of Gaussian states, whereas it can be effective in protecting coherent superpositions of both number states and Gaussian wave packets.

Serafini, A; Illuminati, F; De Siena, S

2005-01-01

68

Quantifying decoherence in continuous variable systems  

E-print Network

We present a detailed report on the decoherence of quantum states of continuous variable systems under the action of a quantum optical master equation resulting from the interaction with general Gaussian uncorrelated environments. The rate of decoherence is quantified by relating it to the decay rates of various, complementary measures of the quantum nature of a state, such as the purity, some nonclassicality indicators in phase space and, for two-mode states, entanglement measures and total correlations between the modes. Different sets of physically relevant initial configurations are considered, including one- and two-mode Gaussian states, number states, and coherent superpositions. Our analysis shows that, generally, the use of initially squeezed configurations does not help to preserve the coherence of Gaussian states, whereas it can be effective in protecting coherent superpositions of both number states and Gaussian wave packets.

A. Serafini; M. G. A. Paris; F. Illuminati; S. De Siena

2005-01-28

69

A stochastic approach for quantifying immigrant integration: the Spanish test case  

E-print Network

We apply stochastic process theory to the analysis of immigrant integration. Using a unique and detailed data set from Spain, we study the relationship between local immigrant density and two social and two economic immigration quantifiers for the period 1999-2010. As opposed to the classic time-series approach, by letting immigrant density play the role of "time" and the quantifier the role of "space", it becomes possible to analyze the behavior of the quantifiers by means of continuous time random walks. Two classes of results are obtained. First, we show that social integration quantifiers evolve following a pure diffusion law, while the evolution of economic quantifiers exhibits ballistic dynamics. Second, we make predictions of best and worst case scenarios taking into account large local fluctuations. Our stochastic process approach to integration lends itself to interesting forecasting scenarios which, in the hands of policy makers, have the potential to improve political responses to integration problems. F...

Agliari, Elena; Contucci, Pierluigi; Sandell, Rickard; Vernia, Cecilia

2014-01-01

70

Quantifying air pollution removal by green roofs in Chicago

E-print Network

The level of air pollution removal by green roofs in Chicago was quantified using a dry deposition model. The result showed that a total of 1675 kg of air pollutants was removed by 19.8 ha of green roofs in one year

Yu, Qian

71

Quantifier Elimination via Functional Composition  

Microsoft Academic Search

This paper poses the following basic question: Given a quantified Boolean formula ∃x.φ, what should a function/formula f be such that substituting f for x in φ yields a logically equivalent quantifier-free formula? Its answer leads to a solution to quantifier elimination in the Boolean domain, alternative to the conventional approach based on formula expansion. Such a composite function
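One classical answer in the Boolean domain is self-substitution: taking f = φ[x := 1] gives ∃x.φ ≡ φ[x := φ[x := 1]]. The sketch below verifies this identity by exhaustive truth-table check for one example formula; whether this matches the paper's particular choice of composite function is not claimed here.

```python
from itertools import product

# Example formula phi(x, y, z); any Boolean function of x and parameters works.
phi = lambda x, y, z: (x and not y) or (y and z)

def exists_x(y, z):
    # Direct expansion: exists x . phi  ==  phi[x:=1] or phi[x:=0].
    return phi(True, y, z) or phi(False, y, z)

def composed(y, z):
    f = phi(True, y, z)          # f = phi[x := 1]
    return phi(f, y, z)          # phi[x := f], quantifier-free in x

for y, z in product([False, True], repeat=2):
    assert exists_x(y, z) == composed(y, z)
print("exists x. phi  ==  phi[x := phi[x := 1]] on all assignments")
```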

Jie-hong R. Jiang

2009-01-01

72

Quantifying Cosmic Superstructures  

E-print Network

The Large Scale Structure (LSS) found in galaxy redshift surveys and in computer simulations of cosmic structure formation shows a very complex network of galaxy clusters, filaments, and sheets around large voids. Here, we introduce a new algorithm, based on a Minimal Spanning Tree, to find basic structural elements of this network and their properties. We demonstrate how the algorithm works using simple test cases and then apply it to haloes from the Millennium Run simulation (Springel et al. 2005). We show that about 70% of the total halo mass is contained in a structure composed of more than 74,000 individual elements, the vast majority of which are filamentary, with lengths of up to 15 Mpc/h preferred. Spatially more extended structures do exist, as do examples of what appear to be sheet-like configurations of matter. What is more, LSS appears to be composed of a fixed set of basic building blocks. The LSS formed by mass selected subsamples of haloes shows a clear correlation between the threshold mass and the mean extent of major branches, with cluster-size haloes forming structures whose branches can extend to almost 200 Mpc/h - the backbone of LSS to which smaller branches consisting of smaller haloes are attached.
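The first step of such an algorithm, building the minimal spanning tree of halo positions, can be sketched with scipy; the structure-finding logic that follows in the paper (pruning and classifying branches into filaments and sheets) is specific to the authors' method and not reproduced here.

```python
import numpy as np
from scipy.spatial import distance_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

# Synthetic halo positions standing in for a simulation catalogue (Mpc/h).
rng = np.random.default_rng(5)
halos = rng.uniform(0.0, 100.0, size=(300, 3))

# Pairwise distances, then the minimal spanning tree over the point set.
dist = distance_matrix(halos, halos)
mst = minimum_spanning_tree(dist)            # sparse matrix of tree edges

edges = np.transpose(mst.nonzero())          # (i, j) index pairs, N-1 edges
lengths = np.array([mst[i, j] for i, j in edges])
print(f"{len(edges)} edges, mean edge length {lengths.mean():.2f} Mpc/h")
```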

J. M. Colberg

2006-11-20

73

Quantifying nonisothermal subsurface soil water evaporation  

NASA Astrophysics Data System (ADS)

Accurate quantification of energy and mass transfer during soil water evaporation is critical for improving understanding of the hydrologic cycle and for many environmental, agricultural, and engineering applications. Drying of soil under radiation boundary conditions results in formation of a dry surface layer (DSL), which is accompanied by a shift in the position of the latent heat sink from the surface to the subsurface. Detailed investigation of evaporative dynamics within this active near-surface zone has mostly been limited to modeling, with few measurements available to test models. Soil column studies were conducted to quantify nonisothermal subsurface evaporation profiles using a sensible heat balance (SHB) approach. Eleven-needle heat pulse probes were used to measure soil temperature and thermal property distributions at the millimeter scale in the near-surface soil. Depth-integrated SHB evaporation rates were compared with mass balance evaporation estimates under controlled laboratory conditions. The results show that the SHB method effectively measured total subsurface evaporation rates with only 0.01-0.03 mm h-1 difference from mass balance estimates. The SHB approach also quantified millimeter-scale nonisothermal subsurface evaporation profiles over a drying event, which has not been previously possible. Thickness of the DSL was also examined using measured soil thermal conductivity distributions near the drying surface. Estimates of the DSL thickness were consistent with observed evaporation profile distributions from SHB. Estimated thickness of the DSL was further used to compute diffusive vapor flux. The diffusive vapor flux also closely matched both mass balance evaporation rates and subsurface evaporation rates estimated from SHB.
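The balance itself is easy to state: the latent heat consumed within a soil layer is the residual of conducted heat in, conducted heat out, and the change in stored sensible heat. A sketch with illustrative numbers follows; all values are invented, whereas the paper derives gradients and conductivities from heat-pulse probe measurements.

```python
# Sensible heat balance for one thin soil layer (all values illustrative).
lam_top, lam_bot = 1.2, 1.4      # W/(m K): thermal conductivity at layer faces
dT_top, dz_top = 1.6, 0.006      # K across 6 mm: gradient above the layer
dT_bot, dz_bot = 0.9, 0.006      # K across 6 mm: gradient below the layer
C_v, dz = 2.0e6, 0.006           # J/(m^3 K) volumetric heat capacity; thickness
dT_dt = 2.0 / 3600.0             # K/s: warming rate of the layer
L_v = 2.45e6                     # J/kg: latent heat of vaporization

H_in = lam_top * dT_top / dz_top    # conduction into the layer (W/m^2)
H_out = lam_bot * dT_bot / dz_bot   # conduction out of the layer (W/m^2)
dS = C_v * dz * dT_dt               # change in stored sensible heat (W/m^2)

E = (H_in - H_out - dS) / L_v       # evaporation rate, kg/(m^2 s)
print(f"{E * 3600:.3f} mm/h")       # 1 kg/m^2 of water = 1 mm depth
```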

Deol, Pukhraj; Heitman, Josh; Amoozegar, Aziz; Ren, Tusheng; Horton, Robert

2012-11-01

74

Solar Light Show  

NSDL National Science Digital Library

Over the last few days, the Earth has been buffeted by a geomagnetic storm caused by a major solar flare. In addition to disruptions in radio, telecommunications, and electric service, the flare may also produce a dramatic light show as it peaks tonight. Weather permitting, the aurora borealis, or northern lights, may be visible as far south as Washington, D.C. The best viewing time will be local midnight. The sun is currently at the peak of its eleven-year solar cycle, spawning flares and "coronal mass ejections" (CME), violent outbursts of gas from the sun's corona that can carry up to 10 billion tons of electrified gas traveling at speeds as high as 2000 km/s. Geomagnetic storms result when solar winds compress the magnetosphere, sometimes interfering with electric power transmission and satellites, but also creating beautiful aurorae, as many stargazers hope will occur tonight.

De Nie, Michael W.

75

Automata and quantifier hierarchies  

Microsoft Academic Search

The paper discusses results on ω-languages in a recursion theoretic framework which is adapted to the treatment of formal languages. We consider variants of the arithmetical hierarchy which are not based on the recursive sets but on sets defined in terms of finite automata. In particular, it is shown how the theorems of Büchi and McNaughton on regular ω-languages can

Wolfgang Thomas (RWTH Aachen, Lehrstuhl für Informatik)

1988-01-01

76

Results.  

ERIC Educational Resources Information Center

Describes the Collegiate Results Instrument (CRI), which measures a range of collegiate outcomes for alumni 6 years after graduation. The CRI was designed to target alumni from institutions across market segments and assess their values, abilities, work skills, occupations, and pursuit of lifelong learning. (EV)

Zemsky, Robert; Shaman, Susan; Shapiro, Daniel B.

2001-01-01

77

What Do Blood Tests Show?  

MedlinePLUS

Blood tests show whether the levels ... changes may work best. Result Ranges for Common Blood Tests This section presents the result ranges for ...

78

Quantifying the extinction vortex  

Microsoft Academic Search

We developed a database of 10 wild vertebrate populations whose declines to extinction were monitored over at least 12 years. We quantitatively characterized the final declines of these well-monitored populations and tested key theoretical predictions about the process of extinction, obtaining two primary results. First, we found evidence of logarithmic scaling of time-to-extinction as a function of population size for

William F. Fagan

2006-01-01

79

The Great Cometary Show  

NASA Astrophysics Data System (ADS)

The ESO Very Large Telescope Interferometer, which allows astronomers to scrutinise objects with a precision equivalent to that of a 130-m telescope, is proving itself an unequalled success every day. One of the latest instruments installed, AMBER, has led to a flurry of scientific results, an anthology of which is being published this week as special features in the research journal Astronomy & Astrophysics. ESO PR Photo 06a/07: The AMBER Instrument. "With its unique capabilities, the VLT Interferometer (VLTI) has created itself a niche in which it provides answers to many astronomical questions, from the shape of stars, to discs around stars, to the surroundings of the supermassive black holes in active galaxies," says Jorge Melnick (ESO), the VLT Project Scientist. The VLTI has led to 55 scientific papers already and is in fact producing more than half of the interferometric results worldwide. "With the capability of AMBER to combine up to three of the 8.2-m VLT Unit Telescopes, we can really achieve what nobody else can do," added Fabien Malbet, from the LAOG (France) and the AMBER Project Scientist. Eleven articles will appear this week in Astronomy & Astrophysics' special AMBER section. Three of them describe the unique instrument, while the other eight reveal completely new results about the early and late stages in the life of stars. ESO PR Photo 06b/07: The Inner Winds of Eta Carinae. The first results presented in this issue cover various fields of stellar and circumstellar physics. Two papers deal with very young solar-like stars, offering new information about the geometry of the surrounding discs and associated outflowing winds. Other articles are devoted to the study of hot active stars of particular interest: Alpha Arae, Kappa Canis Majoris, and CPD -57°2874. They provide new, precise information about their rotating gas envelopes. An important new result concerns the enigmatic object Eta Carinae. Using AMBER with its high spatial and spectral resolution, it was possible to zoom into the very heart of this very massive star. In this innermost region, the observations are dominated by the extremely dense stellar wind that totally obscures the underlying central star. The AMBER observations show that this dense stellar wind is not spherically symmetric, but exhibits a clearly elongated structure. Overall, the AMBER observations confirm that the extremely high mass loss of Eta Carinae's massive central star is non-spherical and much stronger along the poles than in the equatorial plane. This is in agreement with theoretical models that predict such an enhanced polar mass loss in the case of rapidly rotating stars. ESO PR Photo 06c/07: RS Ophiuchi in Outburst. Several papers from this special feature focus on the later stages in a star's life. One looks at the binary system Gamma 2 Velorum, which contains the closest example of a star known as a Wolf-Rayet. A single AMBER observation allowed the astronomers to separate the spectra of the two components, offering new insights into the modeling of Wolf-Rayet stars, but also made it possible to measure the separation between the two stars. This led to a new determination of the distance of the system, showing that previous estimates were incorrect. The observations also revealed information on the region where the winds from the two stars collide.
The famous binary system RS Ophiuchi, an example of a recurrent nova, was observed just 5 days after it was discovered to be in outburst on 12 February 2006, an event that had been expected for 21 years. AMBER was able to detect the extension of the expanding nova emission. These observations show a complex geometry and kinematics, far from the simple interpretation of a spherically expanding fireball. AMBER has detected a high-velocity jet, probably perpendicular to the orbital plane of the binary system, and allowed a precise and careful study of the wind and the shockwave coming from the nova. The stream of results from the VLTI and AMBER

2007-01-01

80

Quantifying periodicity in omics data  

PubMed Central

Oscillations play a significant role in biological systems, with many examples in the fast, ultradian, circadian, circalunar, and yearly time domains. However, determining periodicity in such data can be problematic. There are a number of computational methods to identify the periodic components in large datasets, such as signal-to-noise based Fourier decomposition, Fisher's g-test and autocorrelation. However, the available methods assume a sinusoidal model and do not attempt to quantify the waveform shape and the presence of multiple periodicities, which provide vital clues in determining the underlying dynamics. Here, we developed a Fourier based measure that generates a de-noised waveform from multiple significant frequencies. This waveform is then correlated with the raw data from the respiratory oscillation found in yeast, to provide oscillation statistics including waveform metrics and multi-periods. The method is compared and contrasted to commonly used statistics. Moreover, we show the utility of the program in the analysis of noisy datasets and other high-throughput analyses, such as metabolomics and flow cytometry, respectively. PMID:25364747

Amariei, Cornelia; Tomita, Masaru; Murray, Douglas B.

2014-01-01

81

Quantifying the seismicity on Taiwan  

NASA Astrophysics Data System (ADS)

We quantify the seismicity on the island of Taiwan using the frequency-magnitude statistics of earthquakes since 1900. A break in Gutenberg-Richter scaling for large earthquakes in global seismicity has been observed; this break is also observed in our Taiwan study. The seismic data from the Central Weather Bureau Seismic Network are in good agreement with the Gutenberg-Richter relation taking b ≈ 1 when M < 7. For large earthquakes, M ≥ 7, the seismic data fit Gutenberg-Richter scaling with b ≈ 1.5. If the Gutenberg-Richter scaling for M < 7 earthquakes is extrapolated to larger earthquakes, we would expect an M > 8 earthquake in the study region about every 25 yr. However, our analysis shows a lower frequency of occurrence of large earthquakes, so that the expected recurrence interval of M > 8 earthquakes is about 200 yr. The level of seismicity for smaller earthquakes on Taiwan is about 12 times greater than in Southern California, and the possibility of an M ≥ 9 earthquake north or south of Taiwan cannot be ruled out. In light of the Fukushima, Japan nuclear disaster, we also discuss the implications of our study for the three operating nuclear power plants on the coast of Taiwan.
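A sketch of the underlying frequency-magnitude bookkeeping, using Aki's maximum-likelihood b-value estimator and a synthetic catalogue; the real analysis uses the Central Weather Bureau catalogue and treats the scaling break for M ≥ 7 explicitly.

```python
import numpy as np

# Synthetic catalogue above completeness magnitude mc, with true b = 1:
# Gutenberg-Richter implies exponentially distributed magnitude excesses.
rng = np.random.default_rng(7)
mc, b_true, years = 4.0, 1.0, 100.0
mags = mc + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=20000)

# Aki's maximum-likelihood estimator for the b-value.
b = np.log10(np.e) / (mags.mean() - mc)

# Annual-rate a-value from log10 N(>=M) = a - b*M.
a = np.log10(mags.size / years) + b * mc

# Extrapolated recurrence of large events under unbroken scaling.
for M in (7.0, 8.0):
    rate = 10 ** (a - b * M)                  # expected events per year
    print(f"M >= {M}: one event per {1 / rate:.0f} yr (b = {b:.2f})")
```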

Wu, Yi-Hsuan; Chen, Chien-Chih; Turcotte, Donald L.; Rundle, John B.

2013-07-01

82

The Art Show  

ERIC Educational Resources Information Center

This article describes what once was thought to be impossible--a formal art show extravaganza at an elementary school with 1,000 students, a Department of Defense Dependent School (DODDS) located overseas, on RAF Lakenheath, England. The dream of this event involved the transformation of the school cafeteria into an elegant art show…

Scolarici, Alicia

2004-01-01

83

Quantifying decoherence in continuous variable systems  

Microsoft Academic Search

We present a detailed report on the decoherence of quantum states of continuous variable systems under the action of a quantum optical master equation resulting from the interaction with general Gaussian uncorrelated environments. The rate of decoherence is quantified by relating it to the decay rates of various, complementary measures of the quantum nature of a state, such as the

A Serafini; M G A Paris; F. Illuminati; S. De Siena

2005-01-01

84

Quantifying Design Quality Through Design Experiments  

Microsoft Academic Search

The authors present a model for design quality metrics, discuss its relevance, and give some examples of use. Design experiments demonstrate error data extraction and analysis. Using a model of the design process for electronic products that emphasizes the resulting quality of the design, the authors demonstrated that they can quantify design quality. They can best express the probability of

Einar J. Aas; Tore Steen; Karl Klingsheim

1994-01-01

85

Quantifying Intraplate Tectonics Parameters for GESS  

Microsoft Academic Search

Intraplate tectonics needs to be better determined at the continental and regional spatial scales. By quantifying intraplate activity with global geodynamic datasets, we will be better able to parameterize tectonics within the continental craton. The utilization of a Geographical Information System (GIS) for tectonic activity has resulted in the preferred data-system platform. The tectonic data system includes seismic epicenters co-registered

J. Yates; P. D. Lowman

2002-01-01

86

The Diane Rehm Show  

NSDL National Science Digital Library

The Diane Rehm Show has its origins in a mid-day program at WAMU in Washington, D.C. Diane Rehm came on to host the program in 1979, and in 1984 it was renamed "The Diane Rehm Show". Over the past several decades, Rehm has played host to hundreds of guests, including Archbishop Desmond Tutu, Julie Andrews, and President Bill Clinton. This website contains an archive of her past programs, and visitors can use the interactive calendar to look through past shows. Those visitors looking for specific topics can use the "Topics" list on the left-hand side of the page, or also take advantage of the search engine. The show has a number of social networking links, including a Facebook page and a Twitter feed.

87

The Ozone Show.  

ERIC Educational Resources Information Center

Uses a talk show activity as a final assessment tool for students to debate the ozone hole. Students are assessed in five areas: (1) cooperative learning; (2) the written component; (3) content; (4) self-evaluation; and (5) peer evaluation. (SAH)

Mathieu, Aaron

2000-01-01

88

Showing What They Know  

ERIC Educational Resources Information Center

Having students show their skills in three dimensions, known as performance-based assessment, dates back at least to Socrates. Individual schools such as Barrington High School--located just outside of Providence--have been requiring students to actively demonstrate their knowledge for years. Rhode Island's high school graduating class became…

Cech, Scott J.

2008-01-01

89

Chemistry Game Shows  

Microsoft Academic Search

We present a technological improvement to the use of game shows to help students review for tests. Our approach uses HTML files interpreted with a browser on a computer attached to an LCD projector. The HTML files can be easily modified for use of the game in a variety of courses.

Susan Campbell; Jennifer Muzyka

2002-01-01

90

Obesity in show cats.  

PubMed

Obesity is an important disease with a high prevalence in cats. Because obesity is related to several other diseases, it is important to identify the population at risk. Several risk factors for obesity have been described in the literature. A higher incidence of obesity in certain cat breeds has been suggested. The aim of this study was to determine whether obesity occurs more often in certain breeds. The second aim was to relate the increased prevalence of obesity in certain breeds to the official standards of that breed. To this end, 268 cats of 22 different breeds were investigated at two different cat shows by determining their body condition score (BCS) on a nine-point scale by inspection and palpation. Overall, 45.5% of the show cats had a BCS > 5, and 4.5% of the show cats had a BCS > 7. There were significant differences between breeds, which could be related to the breed standards. Most overweight and obese cats were in the neutered group. This warrants firm discussions with breeders and cat show judges to arrive at different interpretations of the standards, in order to prevent overweight conditions in certain breeds from being the standard of beauty. Neutering predisposes cats to obesity and requires early nutritional intervention to prevent obese conditions. PMID:24612018

Corbee, R J

2014-12-01

91

Show-Me Center  

NSDL National Science Digital Library

The Show-Me Center is a partnership of four NSF-sponsored middle grades mathematics curriculum development Satellite Centers (University of Wisconsin, Michigan State University, University of Montana, and the Educational Development Center). The group's website provides "information and resources needed to support selection and implementation of standards-based middle grades mathematics curricula." The Video Showcase includes segments on Number, Algebra, Geometry, Measure, and Data Analysis, with information on ways to obtain the complete video set. The Curricula Showcase provides general information, unit goals, sample lessons and teacher pages spanning four projects: the Connected Mathematics Project (CMP), Mathematics in Context (MiC), MathScape: Seeing and Thinking Mathematically, and Middle Grades Math Thematics. The website also posts Show-Me Center newsletters, information on upcoming conferences and workshops, and links to resources including published articles and unpublished commentary on mathematics school reform.

92

Show-Me Magazine  

NSDL National Science Digital Library

Come along as the folks at the University of Missouri show you the history of their college days through the Show Me magazine. It's a wonderful collection of college humor published from 1946 to 1963. First-time visitors would do well to read about the magazine's colorful past, courtesy of Jerry Smith. A good place to start is the November 1920 issue (easily found when you browse by date), which contains a number of parody advertisements along with some doggerels poking good natured fun at the football team and an assortment of deans. Also, it's worth noting that visitors can scroll through issues and save them to an online "bookbag" for later use.

2008-01-01

93

The Truman Show  

Microsoft Academic Search

The Truman Show is hardly a film you would automatically speak about as a game. At first glance, it is tempting to interpret the story of Truman Burbank — his perpetual subjection to the artificial (televisual) world of Seahaven and its gargantuan reality TV project, his eventual escape from the "OmniCam Ecosphere" building and the paternalistic surveillance of director Christof

Rolf F. Nohr

94

The Graphing Game Show  

NSDL National Science Digital Library

This lesson plan assesses student interpretation of graphs, utilizing cooperative learning to further students' understanding. Types of graphs used are horizontal and vertical bar graphs, picture graphs, and pictographs. In the lesson, students play a game called the Graphing Game Show, in which they must work as a team to answer questions about specific graphs. The lesson includes four student resource worksheets and suggestions for extension and differentiation.

2011-01-01

95

Quantifiers for spatio-temporal bifurcations in coupled map lattices  

Microsoft Academic Search

The bifurcation behaviour of spatially extended systems shows interesting features which are as yet poorly understood. We analyse spatio-temporal bifurcations in coupled map lattices which may be classified as purely spatial or spatio-temporal in nature. We construct quantifiers which can detect all types of bifurcation behaviour. We demonstrate the utility of our quantifiers in the context of spatially or temporally periodic

Nandini Chatterjee; Neelima Gupte

1997-01-01

96

Quantifying cognitive decrements caused by cranial radiotherapy.  

PubMed

With the exception of survival, cognitive impairment stemming from the clinical management of cancer is a major factor dictating therapeutic outcome. For many patients afflicted with CNS and non-CNS malignancies, radiotherapy and chemotherapy offer the best options for disease control. These treatments, however, come at a cost, and nearly all cancer survivors (~11 million in the US alone as of 2006) incur some risk for developing cognitive dysfunction, with the most severe cases found in patients subjected to cranial radiotherapy (~200,000/yr) for the control of primary and metastatic brain tumors. Particularly problematic are pediatric cases, whose long-term survival, plagued with marked cognitive decrements, results in significant socioeconomic burdens. To date, there are still no satisfactory solutions to this significant clinical problem. We have addressed this serious health concern using transplanted stem cells to combat radiation-induced cognitive decline in athymic rats subjected to cranial irradiation. Details of the stereotaxic irradiation and the in vitro culturing and transplantation of human neural stem cells (hNSCs) can be found in our companion paper (Acharya et al., JoVE reference). Following irradiation and transplantation surgery, rats are then assessed for changes in cognition, grafted cell survival and expression of differentiation-specific markers 1 and 4 months after irradiation. To critically evaluate the success or failure of any potential intervention designed to ameliorate radiation-induced cognitive sequelae, a rigorous series of quantitative cognitive tasks must be performed. To accomplish this, we subject our animals to a suite of cognitive testing paradigms including novel place recognition, water maze, elevated plus maze and fear conditioning, in order to quantify hippocampal and non-hippocampal learning and memory. We have demonstrated the utility of these tests for quantifying specific types of cognitive decrements in irradiated animals, and used them to show that animals engrafted with hNSCs exhibit significant improvements in cognitive function. The cognitive benefits derived from engrafted human stem cells suggest that similar strategies may one day provide much needed clinical recourse to cancer survivors suffering from impaired cognition. Accordingly, we have provided written and visual documentation of the critical steps used in our cognitive testing paradigms to facilitate the translation of our promising results into the clinic. PMID:22042060

Christie, Lori-Ann; Acharya, Munjal M; Limoli, Charles L

2011-01-01

97

Quantifier processing can be dissociated from numerical processing: evidence from semantic dementia patients.  

PubMed

Quantifiers such as frequency adverbs (e.g., "always", "never") and quantity pronouns (e.g., "many", "none") convey quantity information. Whether quantifiers are processed as numbers or as general semantics has been a matter of much debate. Some neuropsychological and fMRI studies have found that the processing of quantifiers depends on the numerical magnitude comprehension system, but others have found that quantifier processing is associated with semantic representation. The selective impairment of language in semantic dementia patients provides a way to examine the above controversy. We administered a series of neuropsychological tests (i.e., language processing, numerical processing and semantic distance judgment) to two patients with different levels of severity in semantic dementia (mild vs. severe). The results showed that the two patients had intact numerical knowledge, but impairments in semantic processing. Moreover, the patient with severe/late semantic dementia showed more impairment in quantifier and semantic processing than the patient with mild/early semantic dementia. We concluded that quantifier processing is associated with general semantic processing, not with numerical processing. PMID:23867350

Cheng, Dazhi; Zhou, Aihong; Yu, Xing; Chen, Chuansheng; Jia, Jianping; Zhou, Xinlin

2013-09-01

98

Obesity in show dogs.  

PubMed

Obesity is an important disease with a growing incidence. Because obesity is related to several other diseases, and decreases life span, it is important to identify the population at risk. Several risk factors for obesity have been described in the literature. A higher incidence of obesity in certain breeds is often suggested. The aim of this study was to determine whether obesity occurs more often in certain breeds. The second aim was to relate the increased prevalence of obesity in certain breeds to the official standards of that breed. To this end, we investigated 1379 dogs of 128 different breeds by determining their body condition score (BCS). Overall, 18.6% of the show dogs had a BCS > 5, and 1.1% had a BCS > 7. There were significant differences between breeds, which could be correlated to the breed standards. This warrants firm discussions with breeders and judges in order to arrive at interpretations of the standards that prevent overweight conditions from becoming the standard of beauty. PMID:22882163

Corbee, R J

2012-08-11

99

Factors influencing quantified surface EMGs  

Microsoft Academic Search

Isometric flexions of the elbow are studied in man. For the three joint angles studied, the torque, the surface EMG of the biceps brachii muscle, and the quantified EMG are recorded. The EMGs are picked up by means of bipolar electrodes located in such a way as to vary the interelectrode orientation, the interelectrode distance and the position on the

B. Vigreux; J. C. Cnockaert; E. Pertuzon

1979-01-01

100

NPR: The Picture Show  

NSDL National Science Digital Library

National Public Radio's "The Picture Show" photo blog is a great way to avoid culling through the thousands of less interesting and engaging photographs on the web. With a dedicated team of professionals, this blog brings together different posts that profile various sets of photographs that cover 19th century war in Afghanistan, visual memories of WWII, unpublished photographs of JFK's presidential campaign, and abandoned buildings on the islands in Boston Harbor. Visitors can search through previous posts, use social media features to share the photo features with friends, and also sign up to receive new materials via their RSS feed. There's quite a nice mix of material here, and visitors can also comment on the photos and recommend the collection to friends and others.

101

Not a "reality" show.  

PubMed

The authors of the preceding articles raise legitimate questions about patient and staff rights and the unintended consequences of allowing ABC News to film inside teaching hospitals. We explain why we regard their fears as baseless and not supported by what we heard from individuals portrayed in the filming, our decade-long experience making medical documentaries, and the full un-aired context of the scenes shown in the broadcast. The authors don't and can't know what conversations we had, what documents we reviewed, and what protections we put in place in each televised scene. Finally, we hope to correct several misleading examples cited by the authors as well as their offhand mischaracterization of our program as a "reality" show. PMID:23631336

Wrong, Terence; Baumgart, Erica

2013-01-01

102

HLA-DPB1 mismatching results in the generation of a full repertoire of HLA-DPB1-specific CD4+ T cell responses showing immunogenicity of all HLA-DPB1 alleles.  

PubMed

Clinical studies have indicated that HLA-DPB1 functions as a classical transplantation antigen in allogeneic stem cell transplantation. Mismatching for HLA-DPB1 was associated with an increased risk of graft-versus-host disease (GVHD), but also a decreased risk of disease relapse. However, specific HLA-DPB1 mismatches were associated with poor clinical outcome. It was suggested that this unfavorable effect was caused by a difference in immunogenicity between HLA-DPB1 alleles. To analyze whether immunogenicity of HLA-DPB1 mismatches could be predicted based on the presence or absence of specific amino acid sequences we developed a model to generate allo-HLA-DPB1 responses in vitro. We tested in total 48 different stimulator/responder combinations by stimulating CD4(+) T cells from 5 HLA-DPB1 homozygous individuals with the same antigen-presenting cells transduced with different allo-HLA-DPB1 molecules. HLA-DPB1 molecules used for stimulation comprised 76% to 99% of HLA-DPB1 molecules present in different ethnic populations. We show that all HLA-DPB1 mismatches as defined by allele typing resulted in high-frequency immune responses. Furthermore, we show that crossrecognition of different HLA-DPB1 molecules is a broadly observed phenomenon. We confirm previously described patterns in crossrecognition, and demonstrate that a high degree in similarity between HLA-DPB1 molecules is predictive for crossrecognition, but not for immunogenicity. PMID:20350610

Rutten, Caroline E; van Luxemburg-Heijs, Simone A P; van der Meijden, Edith D; Griffioen, Marieke; Oudshoorn, Machteld; Willemze, Roel; Falkenburg, J H Frederik

2010-09-01

103

Animal behavior: The Truman Show for ants.  

PubMed

A new tracking setup allows researchers to monitor the behavior of individual ants inside a colony. The first results demonstrate a link between age, spatial organization and division of labor, and quantify the dynamics of the colony's social network. PMID:23845245

Saragosti, Jonathan; Kronauer, Daniel J C

2013-07-01

104

Quantifying reliability uncertainty : a proof of concept.  

SciTech Connect

This paper develops Classical and Bayesian methods for quantifying the uncertainty in reliability for a system of mixed series and parallel components for which both go/no-go and variables data are available. Classical methods focus on uncertainty due to sampling error. Bayesian methods can explore both sampling error and other knowledge-based uncertainties. To date, the reliability community has focused on qualitative statements about uncertainty because there was no consensus on how to quantify them. This paper provides a proof of concept that workable, meaningful quantification methods can be constructed. In addition, the application of the methods demonstrated that the results from the two fundamentally different approaches can be quite comparable. In both approaches, results are sensitive to the details of how one handles components for which no failures have been seen in relatively few tests.
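
To make the Bayesian half of this idea concrete, here is a minimal sketch (not the authors' code; the data, prior, and system layout are invented for illustration) of propagating go/no-go component data through a mixed series/parallel system by Monte Carlo sampling of beta posteriors:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical go/no-go data: (successes, trials) per component.
# A and B are in series with a parallel pair (C1, C2).
data = {"A": (49, 50), "B": (30, 30), "C1": (18, 20), "C2": (19, 20)}

def posterior_samples(successes, trials, n=100_000):
    # Jeffreys prior Beta(0.5, 0.5) gives posterior Beta(s + 0.5, f + 0.5);
    # it stays proper even when no failures have been observed.
    failures = trials - successes
    return rng.beta(successes + 0.5, failures + 0.5, size=n)

r = {name: posterior_samples(*counts) for name, counts in data.items()}

# System reliability: series(A, B, parallel(C1, C2)).
r_parallel = 1 - (1 - r["C1"]) * (1 - r["C2"])
r_system = r["A"] * r["B"] * r_parallel

lo, med, hi = np.percentile(r_system, [5, 50, 95])
print(f"median reliability {med:.3f}, 90% interval ({lo:.3f}, {hi:.3f})")
```

Note that component B above has zero observed failures; the choice of prior then dominates its posterior, which mirrors the sensitivity to zero-failure components flagged in the abstract.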

Diegert, Kathleen V.; Dvorack, Michael A.; Ringland, James T.; Mundt, Michael Joseph; Huzurbazar, Aparna (Los Alamos National Laboratory, Los Alamos, NM); Lorio, John F.; Fatherley, Quinn (Los Alamos National Laboratory, Los Alamos, NM); Anderson-Cook, Christine (Los Alamos National Laboratory, Los Alamos, NM); Wilson, Alyson G. (Los Alamos National Laboratory, Los Alamos, NM); Zurn, Rena M.

2009-10-01

105

Quantifying the net slab pull force as a driving mechanism for plate tectonics  

Microsoft Academic Search

It has remained unclear how much of the negative buoyancy force of the slab (F_B) is used to pull the trailing plate at the surface into the mantle. Here I present three-dimensional laboratory experiments to quantify the net slab pull force (F_NSP) with respect to F_B during subduction. Results show that F_NSP increases with increasing slab length and dip up

W. P. Schellart

2004-01-01

106

Tracking and Quantifying Objects and Non-Cohesive Substances  

ERIC Educational Resources Information Center

The present study tested infants' ability to assess and compare quantities of a food substance. Contrary to previous findings, the results suggest that by 10 months of age infants can quantify non-cohesive substances, and that this ability is different in important ways from their ability to quantify discrete objects: (1) In contrast to even much…

van Marle, Kristy; Wynn, Karen

2011-01-01

107

Quantifying magnetite magnetofossil contributions to sedimentary magnetizations  

NASA Astrophysics Data System (ADS)

Under suitable conditions, magnetofossils (the inorganic remains of magnetotactic bacteria) can contribute to the natural remanent magnetization (NRM) of sediments. In recent years, magnetofossils have been shown to be preserved commonly in marine sediments, which makes it essential to quantify their importance in palaeomagnetic recording. In this study, we examine a deep-sea sediment core from offshore of northwestern Western Australia. The magnetic mineral assemblage is dominated by continental detritus and magnetite magnetofossils. By separating magnetofossil and detrital components based on their different demagnetization characteristics, it is possible to quantify their respective contributions to the sedimentary NRM throughout the Brunhes chron. In the studied core, the contribution of magnetofossils to the NRM is controlled by large-scale climate changes, with their relative importance increasing during glacial periods when detrital inputs were low. Our results demonstrate that magnetite magnetofossils can dominate sedimentary NRMs in settings where they are preserved in significant abundances.

Heslop, David; Roberts, Andrew P.; Chang, Liao; Davies, Maureen; Abrajevitch, Alexandra; De Deckker, Patrick

2013-11-01

108

Quantifying pulsed laser induced damage to graphene  

SciTech Connect

As an emerging optical material, graphene's ultrafast dynamics are often probed using pulsed lasers, yet the region in which optical damage takes place is largely uncharted. Here, femtosecond laser pulses induced localized damage in single-layer graphene on sapphire. Raman spatial mapping, SEM, and AFM microscopy quantified the damage. The resulting size of the damaged area has a linear correlation with the optical fluence. These results demonstrate local modification of sp²-carbon bonding structures with optical pulse fluences as low as 14 mJ/cm², an order-of-magnitude lower than measured and theoretical ablation thresholds.

Currie, Marc; Caldwell, Joshua D.; Bezares, Francisco J.; Robinson, Jeremy; Anderson, Travis; Chun, Hayden; Tadjer, Marko [Optical Sciences Division and Electronics Science and Technology Division, Naval Research Laboratory, Washington DC 20375 (United States)

2011-11-21

109

Quantifying torso deformity in scoliosis  

NASA Astrophysics Data System (ADS)

Scoliosis affects the alignment of the spine and the shape of the torso. Most scoliosis patients and their families are more concerned about the effect of scoliosis on the torso than its effect on the spine. There is a need to develop robust techniques for quantifying torso deformity based on full torso scans. In this paper, deformation indices obtained from orthogonal maps of full torso scans are used to quantify torso deformity in scoliosis. 'Orthogonal maps' are obtained by applying orthogonal transforms to 3D surface maps. (An 'orthogonal transform' maps a cylindrical coordinate system to a Cartesian coordinate system.) The technique was tested on 361 deformed computer models of the human torso and on 22 scans of volunteers (8 normal and 14 scoliosis). Deformation indices from the orthogonal maps correctly classified up to 95% of the volunteers with a specificity of 1.00 and a sensitivity of 0.91. In addition to classifying scoliosis, the system gives a visual representation of the entire torso in one view and is viable for use in a clinical environment for managing scoliosis.

Ajemba, Peter O.; Kumar, Anish; Durdle, Nelson G.; Raso, V. James

2006-03-01

110

Use of the Concept of Equivalent Biologically Effective Dose (BED) to Quantify the Contribution of Hyperthermia to Local Tumor Control in Radiohyperthermia Cervical Cancer Trials, and Comparison With Radiochemotherapy Results  

SciTech Connect

Purpose: To express the magnitude of contribution of hyperthermia to local tumor control in radiohyperthermia (RT/HT) cervical cancer trials, in terms of the radiation-equivalent biologically effective dose (BED) and to explore the potential of the combined modalities in the treatment of this neoplasm. Materials and Methods: Local control rates of both arms of each study (RT vs. RT+HT) reported from randomized controlled trials (RCT) on concurrent RT/HT for cervical cancer were reviewed. By comparing the two tumor control probabilities (TCPs) from each study, we calculated the HT-related log cell-kill and then expressed it in terms of the number of 2 Gy fraction equivalents, for a range of tumor volumes and radiosensitivities. We have compared the contribution of each modality and made some exploratory calculations on the TCPs that might be expected from a combined trimodality treatment (RT+CT+HT). Results: The HT-equivalent number of 2-Gy fractions ranges from 0.6 to 4.8 depending on radiosensitivity. Opportunities for clinically detectable improvement by the addition of HT are only available in tumors with an alpha value in the approximate range of 0.22-0.28 Gy⁻¹. A combined treatment (RT+CT+HT) is not expected to improve prognosis in radioresistant tumors. Conclusion: The most significant improvements in TCP, which may result from the combination of RT/CT/HT for locally advanced cervical carcinomas, are likely to be limited only to those patients with tumors of relatively low-intermediate radiosensitivity.
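
For readers unfamiliar with the bookkeeping used above, the standard linear-quadratic expressions behind BED and TCP are (textbook definitions, not equations quoted from this paper):

\[
\mathrm{BED} = nd\left(1 + \frac{d}{\alpha/\beta}\right), \qquad
\mathrm{TCP} = \exp\left(-N_0\, e^{-\alpha\,\mathrm{BED}}\right),
\]

and an added hyperthermia cell-kill of \(L\) logs (base 10) converts to a radiation-equivalent number of 2 Gy fractions

\[
n_{2\,\mathrm{Gy}} = \frac{L \ln 10}{2\alpha\left(1 + 2/(\alpha/\beta)\right)}.
\]

As a worked check under assumed values of \(\alpha = 0.25\ \mathrm{Gy}^{-1}\) and \(\alpha/\beta = 10\ \mathrm{Gy}\), one log of extra cell-kill corresponds to \(2.303/0.6 \approx 3.8\) equivalent 2 Gy fractions, consistent with the 0.6-4.8 range reported above.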

Plataniotis, George A. [Department of Oncology, Aberdeen Royal Infirmary, Aberdeen (United Kingdom)], E-mail: george.plataniotis@nhs.net; Dale, Roger G. [Imperial College Healthcare NHS Trust, London (United Kingdom)

2009-04-01

111

Children's knowledge of hierarchical phrase structure: quantifier floating in Japanese.  

PubMed

The interpretation of floating quantifiers in Japanese requires knowledge of hierarchical phrase structure. However, the input to children is insufficient or even misleading, as our analysis indicates. This presents an intriguing question on learnability: do children interpret floating quantifiers based on a structure-dependent rule which is not obvious in the input or do they employ a sentence comprehension strategy based on the available input? Two experiments examined four- to six-year-old Japanese-speaking children for their interpretations of floating quantifiers in SOV and OSV sentences. The results revealed that no child employed a comprehension strategy in terms of the linear ordering of constituents, and most five- and six-year-olds correctly interpreted floating quantifiers when word-order difficulty was reduced. These facts indicate that children's interpretation of floating quantifiers is structurally dependent on hierarchical phrase structure, suggesting that this knowledge is a part of children's grammar despite the insufficient input available to them. PMID:22850618

Suzuki, Takaaki; Yoshinaga, Naoko

2013-06-01

112

Examples show system-performance upgrade  

SciTech Connect

This paper reviews several techniques to quantify the performance of a turboexpander refrigeration system's expander and compressor. It proposes that improvements in cycle efficiency can best be implemented through a combined effort of those responsible for the operation of the gas processing plant and the design engineers familiar with the performance of the specialized components within the system. It points out that the manufacturers of the turbomachinery should be able to recommend specific changes to improve performance and quantify the net benefits these changes have on plant efficiency. It presents graphs showing the effect of heat exchanger fouling, performance loss, and the cost of lost performance. It shows how to operate a compressor at higher efficiency along its efficiency vs. Q/N (flow rate/shaft speed) performance curve and explains that operating curves can be prepared from field-measured values.

McIntire, R.

1982-07-12

113

Quantifying forested stands with the pulsed airborne laser profiler  

E-print Network

This study evaluated the pulsed airborne laser profiler's portrayal of total tree height (height intercept) and crown diameter (crown intercept). Distribution models for total tree height, crown diameter, and diameter at breast height (dbh) were developed to quantify the relationship between simulation laser... measurements and actual field measurements. Comparisons of predicted distributions from distribution models to observed distributions from field measurements showed measurement discrepancy trends which should be quantified with further testing. A...

Whatley, Michael Craig

2012-06-07

114

Low-frequency magnetic field technology: quantifying spinal range of motion.  

PubMed

Reliability of a sensitive, noninvasive technique for quantifying spinal range of motion was assessed by investigators who measured lumbar mobility of 19 subjects with no history of low-back pain or spinal abnormalities. The measurement method used low-frequency, quasistatic magnetic dipole field and sensors which disturb this field in a precisely quantifiable manner. Sensors, affixed to the skin over T12 to L1 interspace and over the sacrum at the level of S1, were directly interfaced with an IBM PC-AT microcomputer, which was used for error-free collection and storage of data. Measurement results compared favorably to those obtained from the biplanar radiographic technique. Statistical analysis showed an extremely high degree of intraobserver and interobserver reliability. Additional advantages included simplicity and noninvasiveness. Overall, the magnetic field technique proved a significant advancement in clinically quantifying spinal mobility, allowing precise determination of impairment under American Medical Association guidelines. PMID:2730309

Cohn, M L; Machado, A F; Cohn, S J

1989-06-01

115

A stochastic approach for quantifying immigrant integration: the Spanish test case  

NASA Astrophysics Data System (ADS)

We apply stochastic process theory to the analysis of immigrant integration. Using a unique and detailed data set from Spain, we study the relationship between local immigrant density and two social and two economic immigration quantifiers for the period 1999–2010. As opposed to the classic time-series approach, by letting immigrant density play the role of ‘time’ and the quantifier the role of ‘space,’ it becomes possible to analyse the behavior of the quantifiers by means of continuous time random walks. Two classes of results are then obtained. First, we show that social integration quantifiers evolve following diffusion law, while the evolution of economic quantifiers exhibits ballistic dynamics. Second, we make predictions of best- and worst-case scenarios taking into account large local fluctuations. Our stochastic process approach to integration lends itself to interesting forecasting scenarios which, in the hands of policy makers, have the potential to improve political responses to integration problems. For instance, estimating the standard first-passage time and maximum-span walk reveals local differences in integration performance for different immigration scenarios. Thus, by recognizing the importance of local fluctuations around national means, this research constitutes an important tool to assess the impact of immigration phenomena on municipal budgets and to set up solid multi-ethnic plans at the municipal level as immigration pressures build.
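
A toy illustration of the paper's "density as time" idea (purely schematic; the step model, drift, noise scale, and threshold below are assumptions, not the authors' fitted values): treat an integration quantifier as a walk driven by immigrant density and estimate first-passage times to a target level, contrasting a diffusive (driftless) quantifier with a ballistic (drifting) one:

```python
import numpy as np

rng = np.random.default_rng(0)

def first_passage(threshold, drift, sigma=0.05, n_walks=10_000, n_steps=2_000):
    """Sample first-passage 'times' (here, immigrant-density increments)
    for a quantifier evolving as x_{k+1} = x_k + drift + sigma * noise."""
    steps = drift + sigma * rng.standard_normal((n_walks, n_steps))
    paths = np.cumsum(steps, axis=1)
    hit = paths >= threshold
    t = np.where(hit.any(axis=1), hit.argmax(axis=1) + 1, np.nan)
    return t

# Diffusive (social) quantifier: no drift; ballistic (economic): drift > 0.
for label, drift in [("diffusive", 0.0), ("ballistic", 0.01)]:
    t = first_passage(threshold=0.5, drift=drift)
    print(label, "median first passage:", np.nanmedian(t),
          "fraction reaching target:", np.mean(~np.isnan(t)))
```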

Agliari, Elena; Barra, Adriano; Contucci, Pierluigi; Sandell, Richard; Vernia, Cecilia

2014-10-01

116

Detecting and Quantifying Topography in Neural Maps  

PubMed Central

Topographic maps are an often-encountered feature in the brains of many species, yet there are no standard, objective procedures for quantifying topography. Topographic maps are typically identified and described subjectively, but in cases where the scale of the map is close to the resolution limit of the measurement technique, identifying the presence of a topographic map can be a challenging subjective task. In such cases, an objective topography detection test would be advantageous. To address these issues, we assessed seven measures (Pearson distance correlation, Spearman distance correlation, Zrehen's measure, topographic product, topological correlation, path length and wiring length) by quantifying topography in three classes of cortical map model: linear, orientation-like, and clusters. We found that all but one of these measures were effective at detecting statistically significant topography even in weakly-ordered maps, based on simulated noisy measurements of neuronal selectivity and sparse sampling of the maps. We demonstrate the practical applicability of these measures by using them to examine the arrangement of spatial cue selectivity in pallid bat A1. This analysis shows that significantly topographic arrangements of interaural intensity difference and azimuth selectivity exist at the scale of individual binaural clusters. PMID:24505279
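
As an illustration of the simplest of the seven measures, the sketch below correlates pairwise map distances with pairwise differences in preferred stimulus value and attaches a permutation test for significance (a plausible reading of the "Pearson distance correlation" measure; the toy data and test details are assumptions, not the authors' exact procedure):

```python
import numpy as np

rng = np.random.default_rng(1)

def topography_score(positions, preferences, n_perm=1_000):
    """Pearson correlation between pairwise cortical distances and
    pairwise preference differences, with a permutation null."""
    n = len(preferences)
    iu = np.triu_indices(n, k=1)
    d_space = np.linalg.norm(positions[:, None, :] - positions[None, :, :],
                             axis=-1)[iu]
    d_pref = np.abs(preferences[:, None] - preferences[None, :])[iu]
    r_obs = np.corrcoef(d_space, d_pref)[0, 1]
    null = np.empty(n_perm)
    for i in range(n_perm):
        p = rng.permutation(preferences)
        null[i] = np.corrcoef(d_space,
                              np.abs(p[:, None] - p[None, :])[iu])[0, 1]
    p_val = np.mean(null >= r_obs)  # one-sided: topographic if r_obs large
    return r_obs, p_val

# Weakly ordered toy map: preference follows x-position plus noise.
pos = rng.uniform(0, 1, size=(40, 2))
pref = pos[:, 0] + 0.3 * rng.standard_normal(40)
print(topography_score(pos, pref))
```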

Yarrow, Stuart; Razak, Khaleel A.; Seitz, Aaron R.; Series, Peggy

2014-01-01

117

Quantifying utricular stimulation during natural behavior.  

PubMed

The use of natural stimuli in neurophysiological studies has led to significant insights into the encoding strategies used by sensory neurons. To investigate these encoding strategies in vestibular receptors and neurons, we have developed a method for calculating the stimuli delivered to a vestibular organ, the utricle, during natural (unrestrained) behaviors, using the turtle as our experimental preparation. High-speed digital video sequences are used to calculate the dynamic gravito-inertial (GI) vector acting on the head during behavior. X-ray computed tomography (CT) scans are used to determine the orientation of the otoconial layer (OL) of the utricle within the head, and the calculated GI vectors are then rotated into the plane of the OL. Thus, the method allows us to quantify the spatio-temporal structure of stimuli to the OL during natural behaviors. In the future, these waveforms can be used as stimuli in neurophysiological experiments to understand how natural signals are encoded by vestibular receptors and neurons. We provide one example of the method, which shows that turtle feeding behaviors can stimulate the utricle at frequencies higher than those typically used in vestibular studies. This method can be adapted to other species, to other vestibular end organs, and to other methods of quantifying head movements. PMID:22753360

Rivera, Angela R V; Davis, Julian; Grant, Wally; Blob, Richard W; Peterson, Ellengene; Neiman, Alexander B; Rowe, Michael

2012-12-01

118

Quantifying of bactericide properties of medicinal plants.  

PubMed

Extended research has been carried out to clarify the ecological role of plant secondary metabolites (SMs). Although their primary ecological function is self-defence, bioactive compounds have long been used in alternative medicine or in biological control of pests. Several members of the family Labiatae are known to have strong antimicrobial capacity. For testing and quantifying antibacterial activity, most often standard microbial protocols are used, assessing inhibitory activity on a selected strain. In this study the applicability of a microbial ecotoxtest was evaluated to quantify the aggregate bactericide capacity of Labiatae species, based on the bioluminescence inhibition of the bacterium Vibrio fischeri. Striking differences were found amongst herbs, reaching even 10-fold toxicity. Glechoma hederacea L. proved to be the most toxic, with the EC50 of 0.4073 g dried plant/l. LC50 values generated by the standard bioassay seem to be a good indicator of the bactericide property of herbs. Traditional use of the selected herbs shows a good correlation with bioactivity expressed as bioluminescence inhibition, leading to the conclusion that the Vibrio fischeri bioassay can be a good indicator of the overall antibacterial capacity of herbs, at least on a screening level. PMID:21502819
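
For readers unfamiliar with how an EC50 like the 0.4073 g/l above is typically obtained, here is a generic dose-response fit (illustrative only; the concentrations, inhibition values, and two-parameter log-logistic model are assumptions, not data from this study):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data: bioluminescence inhibition (%) of Vibrio fischeri
# versus extract concentration (g dried plant / l).
conc = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6])
inhibition = np.array([5.0, 12.0, 30.0, 52.0, 75.0, 90.0])

def hill(c, ec50, slope):
    # Two-parameter log-logistic model, response scaled 0-100%.
    return 100.0 / (1.0 + (ec50 / c) ** slope)

(ec50, slope), _ = curve_fit(hill, conc, inhibition, p0=[0.4, 1.0])
print(f"EC50 = {ec50:.3f} g/l, Hill slope = {slope:.2f}")
```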

Kováts, Nora; Ács, András; Gölöncsér, Flóra; Barabás, Anikó

2011-06-01

119

Quantifying of bactericide properties of medicinal plants  

PubMed Central

Extended research has been carried out to clarify the ecological role of plant secondary metabolites (SMs). Although their primary ecological function is self-defense, bioactive compounds have long been used in alternative medicine or in biological control of pests. Several members of the family Labiatae are known to have strong antimicrobial capacity. For testing and quantifying antibacterial activity, most often standard microbial protocols are used, assessing inhibitory activity on a selected strain. In this study, the applicability of a microbial ecotoxtest was evaluated to quantify the aggregate bactericide capacity of Labiatae species, based on the bioluminescence inhibition of the bacterium Vibrio fischeri. Striking differences were found amongst herbs, reaching even 10-fold toxicity. Glechoma hederacea L. proved to be the most toxic, with the EC50 of 0.4073 g dried plant/l. LC50 values generated by the standard bioassay seem to be a good indicator of the bactericide property of herbs. Traditional use of the selected herbs shows a good correlation with bioactivity expressed as bioluminescence inhibition, leading to the conclusion that the Vibrio fischeri bioassay can be a good indicator of the overall antibacterial capacity of herbs, at least on a screening level. PMID:21502819

Ács, András; Gölöncsér, Flóra; Barabás, Anikó

2011-01-01

120

Quantifying utricular stimulation during natural behavior  

PubMed Central

The use of natural stimuli in neurophysiological studies has led to significant insights into the encoding strategies used by sensory neurons. To investigate these encoding strategies in vestibular receptors and neurons, we have developed a method for calculating the stimuli delivered to a vestibular organ, the utricle, during natural (unrestrained) behaviors, using the turtle as our experimental preparation. High-speed digital video sequences are used to calculate the dynamic gravito-inertial (GI) vector acting on the head during behavior. X-ray computed tomography (CT) scans are used to determine the orientation of the otoconial layer (OL) of the utricle within the head, and the calculated GI vectors are then rotated into the plane of the OL. Thus, the method allows us to quantify the spatio-temporal structure of stimuli to the OL during natural behaviors. In the future, these waveforms can be used as stimuli in neurophysiological experiments to understand how natural signals are encoded by vestibular receptors and neurons. We provide one example of the method which shows that turtle feeding behaviors can stimulate the utricle at frequencies higher than those typically used in vestibular studies. This method can be adapted to other species, to other vestibular end organs, and to other methods of quantifying head movements. PMID:22753360

Rivera, Angela R. V.; Davis, Julian; Grant, Wally; Blob, Richard W.; Peterson, Ellengene; Neiman, Alexander B.; Rowe, Michael

2012-01-01

121

The S locus-linked Primula homeotic mutant sepaloid shows characteristics of a B-function mutant but does not result from mutation in a B-function gene.  

PubMed

Floral homeotic and flower development mutants of Primula, including double, Hose in Hose, Jack in the Green and Split Perianth, have been cultivated since the late 1500s as ornamental plants but until recently have attracted limited scientific attention. Here we describe the characterization of a new mutant phenotype, sepaloid, that produces flowers comprising only sepals and carpels. The sepaloid mutation is recessive, and is linked to the S locus that controls floral heteromorphy. The phenotype shows developmental variability, with flowers containing three whorls of sepals surrounding fertile carpels, two whorls of sepals with a diminished third whorl of sepals surrounding a fourth whorl of carpels, or three whorls of sepals surrounding abnormal carpels. In some respects, these phenotypes resemble the Arabidopsis and Antirrhinum homeotic B-function mutants apetala3/deficiens (ap3/def) and pistillata/globosa (pi/glo). We have isolated the Primula vulgaris B-function genes PvDEFICIENS (PvDEF) and PvGLOBOSA (PvGLO), expression of both of which is affected in the sepaloid mutant. PvGLO, like sepaloid, is linked to the S locus, whereas PvDEF is not. However, our analyses reveal that sepaloid and PvGLO represent different genes. We conclude that SEPALOID is an S-linked independent regulator of floral organ identity genes including PvDEF and PvGLO. PMID:18564384

Li, Jinhong; Webster, Margaret; Dudas, Brigitta; Cook, Holly; Manfield, Iain; Davies, Brendan; Gilmartin, Philip M

2008-10-01

122

The "Life Potential": a new complex algorithm to assess "Heart Rate Variability" from Holter records for cognitive and diagnostic aims. Preliminary experimental results showing its dependence on age, gender and health conditions  

E-print Network

Although HRV (Heart Rate Variability) analyses have been carried out for several decades, a number of limiting factors still make these analyses useless from a clinical point of view. The present paper aims to overcome some of these limits by introducing the "Life Potential" (BMP), a new mathematical algorithm which seems to exhibit surprising cognitive and predictive capabilities. BMP is defined as a linear combination of five HRV Non-Linear Variables, in turn derived from the thermodynamic formalism of chaotic dynamic systems. The paper presents experimental measurements of BMP (Average Values and Standard Deviations) derived from 1048 Holter tests, matched in age and gender, including a control group of 356 healthy subjects. The main results are: (a) BMP always decreases as age increases, and its dependence on age and gender is well established; (b) the shape of the age dependence within "healthy people" is different from that found in the general group: this behavior provides evidence of possible illn...

Barra, Orazio A

2013-01-01

123

Quantifying lateral femoral condyle ellipticalness in chimpanzees, gorillas, and humans.  

PubMed

Articular surfaces of limb bones provide information for understanding animal locomotion because their size and shape are a reflection of habitual postures and movements. Here we present a novel method for quantifying the ellipticalness (i.e., departure from perfectly circular) of the lateral femoral condyle (LFC), applying this technique to hominid femora. Three-dimensional surface models were created for 49 Homo sapiens, 34 Pan troglodytes and 25 Gorilla gorilla femora. Software was developed that fit separate cylinders to each of the femoral condyles. These cylinders were constrained to have a single axis, but could have different radii. The cylinder fit to the LFC was allowed to assume an elliptical cross-section, while the cylinder fit to the medial condyle was constrained to remain circular. The shape of the elliptical cylinder (ratio of the major and minor axes of the ellipse) was recorded, and the orientation of the elliptical cylinder quantified as angles between the major axis of the ellipse and the anatomical and mechanical axes of the femur. Species were compared using analysis of variance and post hoc multiple comparisons tests. Confirming qualitative descriptions, human LFCs are more elliptical than those of chimpanzees and gorillas. Human femora exhibit a narrow range for the angle between the major axis of the elliptical cylinder and femoral axes. Conversely, the chimpanzee sample is bimodal for these angles, exhibiting two ellipse orientations, while Gorilla shows no preferred angle. Our results suggest that like modern human femora, chimpanzee femoral condyles have preferentially used regions. PMID:23042636
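
The paper's full method fits coaxial circular and elliptical cylinders to the condyles; the sketch below is a much simpler stand-in for the key quantity, the ellipse axis ratio, obtained by projecting condylar surface points onto the plane perpendicular to a given condylar axis and comparing principal-axis spreads (a hypothetical simplification, not the authors' fitting procedure):

```python
import numpy as np

def ellipticalness(points, cyl_axis):
    """Simplified proxy for condylar ellipticalness: project surface
    points onto the plane perpendicular to a condylar axis and take
    the ratio of the principal-axis spreads of the cross-section."""
    cyl_axis = cyl_axis / np.linalg.norm(cyl_axis)
    proj = points - np.outer(points @ cyl_axis, cyl_axis)  # drop axial part
    proj = proj - proj.mean(axis=0)
    evals = np.sort(np.linalg.eigvalsh(np.cov(proj.T)))[::-1]
    evals = evals[evals > 1e-12][:2]      # keep the two in-plane spreads
    return np.sqrt(evals[0] / evals[1])   # 1.0 means perfectly circular

# Toy condyle: elliptical cylinder with semi-axes 12 mm and 9 mm.
rng = np.random.default_rng(2)
t = rng.uniform(0, 2 * np.pi, 500)
z = rng.uniform(0, 20, 500)
pts = np.c_[12 * np.cos(t), 9 * np.sin(t), z]
print(ellipticalness(pts, cyl_axis=np.array([0.0, 0.0, 1.0])))  # ~12/9
```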

Sylvester, Adam D; Pfisterer, Theresa

2012-11-01

124

Quantifying a cellular automata simulation of electric vehicles  

NASA Astrophysics Data System (ADS)

In this work the Nagel-Schreckenberg (NS) cellular automaton is used to simulate a basic cyclic road network. Results from SwitchEV, a real-world electric vehicle trial which has collected more than two years of detailed electric vehicle data, are used to quantify the results of the NS automaton, demonstrating similar power consumption behavior to that observed in the experimental results. In particular, the efficiency of the electric vehicles reduces as the vehicle density increases, due in part to the reduced efficiency of EVs at low speeds, but also due to the energy consumption inherent in changing speeds. Further work shows the results of introducing a spatially restricted speed limit. In general it can be seen that induced congestion from spatially transient events propagates back through the road network and alters the energy and efficiency profile of the simulated vehicles, both before and after the speed restriction. Vehicles upstream from the restriction show a reduced energy usage and an increased efficiency, and vehicles downstream show an initial large increase in energy usage as they accelerate away from the speed restriction.
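
For reference, the NS automaton itself is compact. The sketch below implements its four canonical update rules (accelerate, brake to the gap, random slowdown, move) on a ring road and attaches a crude EV energy proxy that charges for speed and for speed changes; the proxy and its coefficients are invented for illustration and are not calibrated to the SwitchEV data:

```python
import numpy as np

rng = np.random.default_rng(0)

def ns_step(pos, vel, road_len, v_max=5, p_slow=0.3):
    """One Nagel-Schreckenberg update on a ring road. `pos` lists cars
    in driving order; since cars never pass, that order is preserved."""
    gaps = (np.roll(pos, -1) - pos - 1) % road_len          # empty cells ahead
    vel = np.minimum(vel + 1, v_max)                        # 1. accelerate
    vel = np.minimum(vel, gaps)                             # 2. brake to leader
    vel = np.maximum(vel - (rng.random(len(vel)) < p_slow), 0)  # 3. dawdle
    return (pos + vel) % road_len, vel                      # 4. move

road_len, n_cars = 200, 50
pos = np.sort(rng.choice(road_len, n_cars, replace=False))
vel = np.zeros(n_cars, dtype=int)
energy = 0.0
for _ in range(500):
    v_old = vel
    pos, vel = ns_step(pos, vel, road_len)
    # Hypothetical EV energy proxy: cost for speed and for speed changes.
    energy += np.sum(0.02 * vel**2 + 0.1 * np.abs(vel - v_old))
print("mean speed:", vel.mean(), "energy proxy per step:", energy / 500)
```

Raising `n_cars` reproduces the qualitative effect described above: mean speed falls and the per-step energy proxy is increasingly dominated by stop-and-go speed changes.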

Hill, Graeme; Bell, Margaret; Blythe, Phil

2014-12-01

125

Quantifying uncertainty from material inhomogeneity.  

SciTech Connect

Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femto-second laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the deformation behavior was found to depend strongly on the character of the nearby microstructure.

Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

2009-09-01

126

Quantifying the isotopic ‘continental effect’  

NASA Astrophysics Data System (ADS)

Since the establishment of the IAEA-WMO precipitation-monitoring network in 1961, it has been observed that isotope ratios in precipitation (δ²H and δ¹⁸O) generally decrease from coastal to inland locations, an observation described as the ‘continental effect.’ While discussed frequently in the literature, there have been few attempts to quantify the variables controlling this effect despite the fact that isotopic gradients over continents can vary by orders of magnitude. In a number of studies, traditional Rayleigh fractionation has proven inadequate in describing the global variability of isotopic gradients due to its simplified treatment of moisture transport and its lack of moisture recycling processes. In this study, we use a one-dimensional idealized model of water vapor transport along a storm track to investigate the dominant variables controlling isotopic gradients in precipitation across terrestrial environments. We find that the sensitivity of these gradients to progressive rainout is controlled by a combination of the amount of evapotranspiration and the ratio of transport by advection to transport by eddy diffusion, with these variables becoming increasingly important with decreasing length scales of specific humidity. A comparison of modeled gradients with global precipitation isotope data indicates that these variables can account for the majority of variability in observed isotopic gradients between coastal and inland locations. Furthermore, the dependence of the ‘continental effect’ on moisture recycling allows for the quantification of evapotranspiration fluxes from measured isotopic gradients, with implications for both paleoclimate reconstructions and large-scale monitoring efforts in the context of global warming and a changing hydrologic cycle.
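
For context on the Rayleigh fractionation that the abstract finds inadequate, the classic closed-form relation (a textbook expression, not one taken from this paper) for the isotopic composition of the remaining vapor is

\[
R = R_0\, f^{\,\alpha - 1}
\quad\Longleftrightarrow\quad
\delta = (\delta_0 + 1000)\, f^{\,\alpha - 1} - 1000 \ (\text{per mil}),
\]

where \(f\) is the fraction of vapor remaining along the trajectory and \(\alpha\) is the equilibrium fractionation factor. Under pure Rayleigh rainout the isotopic gradient is set by \(f\) alone, which is why models that add advection, eddy diffusion, and evapotranspiration recycling, as in this study, are needed to reproduce the order-of-magnitude spread in observed continental gradients.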

Winnick, Matthew J.; Chamberlain, C. Page; Caves, Jeremy K.; Welker, Jeffrey M.

2014-11-01

127

Computed tomography to quantify tooth abrasion  

NASA Astrophysics Data System (ADS)

Cone-beam computed tomography, also termed digital volume tomography, has become a standard technique in dentistry, allowing for fast 3D jaw imaging including denture at moderate spatial resolution. More detailed X-ray images of restricted volumes for post-mortem studies in dental anthropology are obtained by means of micro computed tomography. The present study evaluates the impact of the pipe smoking wear on teeth morphology comparing the abraded tooth with its contra-lateral counterpart. A set of 60 teeth, loose or anchored in the jaw, from 12 dentitions have been analyzed. After the two contra-lateral teeth were scanned, one dataset has been mirrored before the two datasets were registered using affine and rigid registration algorithms. Rigid registration provides three translational and three rotational parameters to maximize the overlap of two rigid bodies. For the affine registration, three scaling factors are incorporated. Within the present investigation, affine and rigid registrations yield comparable values. The restriction to the six parameters of the rigid registration is not a limitation. The differences in size and shape between the tooth and its contra-lateral counterpart generally exhibit only a few percent in the non-abraded volume, validating that the contralateral tooth is a reasonable approximation to quantify, for example, the volume loss as the result of long-term clay pipe smoking. Therefore, this approach allows quantifying the impact of the pipe abrasion on the internal tooth morphology including root canal, dentin, and enamel volumes.
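
The mirror-then-register step is standard point-set registration. Below is a minimal sketch of the rigid (six-parameter) case using the Kabsch/Procrustes construction, with corresponding points assumed given; the paper registers full CT datasets with affine and rigid algorithms, so this is a simplified stand-in:

```python
import numpy as np

def rigid_register(src, dst):
    """Kabsch/Procrustes: find rotation R and translation t minimizing
    ||R @ src_i + t - dst_i|| over corresponding 3D point sets (N x 3).
    Rigid only: three rotations plus three translations."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Toy check: recover a known rotation and translation.
rng = np.random.default_rng(4)
src = rng.normal(size=(200, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
dst = src @ R_true.T + np.array([1.0, -2.0, 0.5])
R, t = rigid_register(src, dst)
print(np.allclose(R, R_true, atol=1e-6), np.round(t, 3))
```

In the tooth application, `src` would be points from the mirrored contralateral tooth and `dst` points from the abraded tooth; the registered difference volume then estimates material lost to abrasion.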

Kofmehl, Lukas; Schulz, Georg; Deyhle, Hans; Filippi, Andreas; Hotz, Gerhard; Berndt-Dagassan, Dorothea; Kramis, Simon; Beckmann, Felix; Müller, Bert

2010-09-01

128

Towards Quantifying Complexity with Quantum Mechanics  

E-print Network

While we have intuitive notions of structure and complexity, the formalization of this intuition is non-trivial. The statistical complexity is a popular candidate. It is based on the idea that the complexity of a process can be quantified by the complexity of its simplest mathematical model - the model that requires the least past information for optimal future prediction. Here we review how such models, known as $\\epsilon$-machines can be further simplified through quantum logic, and explore the resulting consequences for understanding complexity. In particular, we propose a new measure of complexity based on quantum $\\epsilon$-machines. We apply this to a simple system undergoing constant thermalization. The resulting quantum measure of complexity aligns more closely with our intuition of how complexity should behave.

Ryan Tan; Daniel R. Terno; Jayne Thompson; Vlatko Vedral; Mile Gu

2014-04-24

129

Examples show system-performance upgrade  

SciTech Connect

Described are ways of improving the productivity of the turboexpander refrigeration system's expander and compressor through systematic review of component performance. Improvements in cycle efficiency can best be implemented through a combined effort of those responsible for the operation of the gas processing plant and design engineers familiar with performance of specialized components. Manufacturers of turbomachinery should be able to recommend specific changes to improve performance and to quantify the net benefits these changes have on overall plant efficiency. Presented are graphs showing the effect of heat exchanger fouling, performance loss, and the cost of lost performance. It shows how, by modifying the compressor wheel and expander nozzle and by repositioning the compressor diffusers, the expander (with modified nozzles) was able to pass the higher flow, while compressor efficiency increased by 10% at the higher flow rate. Redesigning aerodynamic components from scratch was not necessary, since existing hardware was modified to the new conditions at a cost saving.

McIntire, R.

1982-07-12

130

Quantifying Drosophila food intake: comparative analysis of current methodology  

PubMed Central

Food intake is a fundamental parameter in animal studies. Despite the prevalent use of Drosophila in laboratory research, precise measurements of food intake remain challenging in this model organism. Here, we compare several common Drosophila feeding assays: the Capillary Feeder (CAFE), food-labeling with a radioactive tracer or a colorimetric dye, and observations of proboscis extension (PE). We show that the CAFE and radioisotope-labeling provide the most consistent results, have the highest sensitivity, and can resolve differences in feeding that dye-labeling and PE fail to distinguish. We conclude that performing the radiolabeling and CAFE assays in parallel is currently the best approach for quantifying Drosophila food intake. Understanding the strengths and limitations of food intake methodology will greatly advance Drosophila studies of nutrition, behavior, and disease. PMID:24681694

Deshpande, Sonali A.; Carvalho, Gil B.; Amador, Ariadna; Phillips, Angela M.; Hoxha, Sany; Lizotte, Keith J.; Ja, William W.

2014-01-01

131

Quantifying Global Uncertainties in a Simple Microwave Rainfall Algorithm  

NASA Technical Reports Server (NTRS)

While a large number of methods exist in the literature for retrieving rainfall from passive microwave brightness temperatures, little has been written about the quantitative assessment of the expected uncertainties in these rainfall products at various time and space scales. The latter is the result of two factors: sparse validation sites over most of the world's oceans, and algorithm sensitivities to rainfall regimes that cause inconsistencies against validation data collected at different locations. To make progress in this area, a simple probabilistic algorithm is developed. The algorithm uses an a priori database constructed from the Tropical Rainfall Measuring Mission (TRMM) radar data coupled with radiative transfer computations. Unlike efforts designed to improve rainfall products, this algorithm takes a step backward in order to focus on uncertainties. In addition to inversion uncertainties, the construction of the algorithm allows errors resulting from incorrect databases, incomplete databases, and time- and space-varying databases to be examined. These are quantified. Results show that the simple algorithm reduces errors introduced by imperfect knowledge of precipitation radar (PR) rain by a factor of 4 relative to an algorithm that is tuned to the PR rainfall. Database completeness does not introduce any additional uncertainty at the global scale, while climatologically distinct space/time domains add approximately 25% uncertainty that cannot be detected by a radiometer alone. Of this value, 20% is attributed to changes in cloud morphology and microphysics, while 5% is a result of changes in the rain/no-rain thresholds. All but 2%-3% of this variability can be accounted for by considering the implicit assumptions in the algorithm. Additional uncertainties introduced by the details of the algorithm formulation are not quantified in this study because of the need for independent measurements that are beyond the scope of this paper. A validation strategy for these errors is outlined.
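
The core of such an a priori database algorithm can be stated in a few lines: weight every database profile by the likelihood of its brightness temperatures given the observation, then report the posterior mean and spread. The sketch below is a generic Bayesian retrieval of this shape (the synthetic database, two-channel model, and noise value are assumptions for illustration, not the paper's TRMM-derived database):

```python
import numpy as np

rng = np.random.default_rng(5)

def bayesian_retrieval(tb_obs, db_tb, db_rain, sigma=2.0):
    """Weight each database profile by the Gaussian likelihood of its
    brightness temperatures given the observation; return the
    posterior-mean rain rate and its spread. sigma (K) is an assumed
    channel noise, not a value from the paper."""
    resid = db_tb - tb_obs                    # (N, n_channels)
    logw = -0.5 * np.sum((resid / sigma) ** 2, axis=1)
    w = np.exp(logw - logw.max())             # stabilized weights
    w /= w.sum()
    mean = np.sum(w * db_rain)
    std = np.sqrt(np.sum(w * (db_rain - mean) ** 2))
    return mean, std

# Synthetic database: rain rate drives Tb in two channels, plus noise.
rain = rng.gamma(shape=1.5, scale=2.0, size=50_000)
db_tb = np.c_[250 - 4.0 * rain, 220 + 2.5 * rain] \
        + rng.normal(0, 2, (50_000, 2))
obs = np.array([250 - 4.0 * 5.0, 220 + 2.5 * 5.0])  # "truth" = 5 mm/h
print(bayesian_retrieval(obs, db_tb, rain))          # mean near 5, plus spread
```

The posterior spread returned here is exactly the kind of per-retrieval uncertainty the abstract argues should be reported alongside the rain estimate; database errors can be probed by retrieving against a deliberately perturbed database.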

Kummerow, Christian; Berg, Wesley; Thomas-Stahle, Jody; Masunaga, Hirohiko

2006-01-01

132

Showing results, 3 Energy technology and energy planning  

E-print Network


133

Midterm Picnic ELI Talent Show  

E-print Network

Highlights Midterm Picnic ELI Talent Show Notes from the Office Birthdays Manners Grammar The will be on the Activities Board and in next week's Weekly. ELI Talent Show Do you have a talent? Show it to us at the ELI Talent Show! The talent show is open to ELI students, faculty, staff, and LAs. Acts can include

Pilyugin, Sergei S.

134

Quantifying anatomical shape variations in neurological disorders.  

PubMed

We develop a multivariate analysis of brain anatomy to identify the relevant shape deformation patterns and quantify the shape changes that explain corresponding variations in clinical neuropsychological measures. We use kernel Partial Least Squares (PLS) and formulate a regression model in the tangent space of the manifold of diffeomorphisms characterized by deformation momenta. The scalar deformation momenta completely encode the diffeomorphic changes in anatomical shape. In this model, the clinical measures are the response variables, while the anatomical variability is treated as the independent variable. To better understand the "shape-clinical response" relationship, we also control for demographic confounders, such as age, gender, and years of education in our regression model. We evaluate the proposed methodology on the Alzheimer's Disease Neuroimaging Initiative (ADNI) database using baseline structural MR imaging data and neuropsychological evaluation test scores. We demonstrate the ability of our model to quantify the anatomical deformations in units of clinical response. Our results also demonstrate that the proposed method is generic and generates reliable shape deformations both in terms of the extracted patterns and the amount of shape changes. We found that while the hippocampus and amygdala emerge as mainly responsible for changes in test scores for global measures of dementia and memory function, they are not a determinant factor for executive function. Another critical finding was the appearance of thalamus and putamen as most important regions that relate to executive function. These resulting anatomical regions were consistent with very high confidence irrespective of the size of the population used in the study. This data-driven global analysis of brain anatomy was able to reach similar conclusions as other studies in Alzheimer's disease based on predefined ROIs, together with the identification of other new patterns of deformation. The proposed methodology thus holds promise for discovering new patterns of shape changes in the human brain that could add to our understanding of disease progression in neurological disorders. PMID:24667299
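
A compact stand-in for the regression machinery described here is kernel PLS implemented via an explicit, centered Gram matrix fed to a linear PLS solver (a common simplification; the RBF kernel choice, toy data, and use of scikit-learn are assumptions, and the real model operates on deformation momenta with demographic confounders controlled):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics.pairwise import rbf_kernel

def kernel_pls_fit(X, y, n_components=2, gamma=None):
    """Kernel PLS via explicit kernel features: build an RBF Gram
    matrix over subjects, double-center it, and run linear PLS on it.
    X rows would be per-subject deformation momenta; y a clinical
    score (confounders regressed out beforehand)."""
    K = rbf_kernel(X, X, gamma=gamma)
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                        # center the Gram matrix
    pls = PLSRegression(n_components=n_components)
    pls.fit(Kc, y)
    return pls, Kc

# Toy data standing in for momenta (n subjects x p features) and scores.
rng = np.random.default_rng(6)
X = rng.normal(size=(80, 200))
y = X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=80)
pls, Kc = kernel_pls_fit(X, y)
print("in-sample R^2:", pls.score(Kc, y))
```

Projecting the fitted PLS weights back through the kernel onto the momenta is what yields the "shape change per unit of clinical response" deformations the abstract describes.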

Singh, Nikhil; Fletcher, P Thomas; Preston, J Samuel; King, Richard D; Marron, J S; Weiner, Michael W; Joshi, Sarang

2014-04-01

135

Quantifying immersion in virtual reality  

Microsoft Academic Search

Virtual Reality (VR) has generated much excitement but little for- mal proof that it is useful. Because VR interfaces are difficult and expensive to build, the computer graphics community needs to be able to predict which applications will benefit from VR. In this paper, we show that users with a VR interface complete a search task faster than users with

Randy F. Pausch; Dennis Proffitt; George H. Williams

1997-01-01

136

Soccer Tournament ELI Talent Show  

E-print Network

Highlights Soccer Tournament ELI Talent Show Notes from your Teachers Notes from the Office very quickly! ELI Talent Show As you probably already know, the ELI is going to have its second annual Talent Show. The talent show is open to ELI students, faculty, staff, and LAs. Acts can include

Pilyugin, Sergei S.

137

Existential Quantifiers in Abstract Data Types  

Microsoft Academic Search

Hierarchies of abstract data types are specified by axioms which are positive formulas consisting of universally and existentially quantified disjunctions and conjunctions of equations. Necessary and sufficient conditions for the existence of terminal algebras are investigated. Furthermore, some advantages of disjunctions and existential quantifiers within the laws are discussed and the usefulness of terminal algebras is demonstrated by a few

Manfred Broy; Walter Dosch; Helmuth Partsch; Peter Pepper; Martin Wirsing

1979-01-01

138

Math 512: Finite Model Theory Generalized Quantifiers  

E-print Network

Quantifiers, 0-1 Laws (John Baldwin): ...isomorphism, the graph quantifier Q_GH. If φ(x, y, z̄) is a formula, so is Q_GH φ(x, y, z̄), which has z̄ free, and (A, E) ⊨ Q_GH φ(x, y, ā) if and only if {⟨g1, g2⟩ : (A, E) ⊨ φ(g1, g2, ā) ∨ φ(g1

Baldwin, John T.

139

Quantifying temporal ventriloquism in audiovisual synchrony perception.  

PubMed

The integration of visual and auditory inputs in the human brain works properly only if the components are perceived in close temporal proximity. In the present study, we quantified cross-modal interactions in the human brain for audiovisual stimuli with temporal asynchronies, using a paradigm from rhythm perception. In this method, participants had to align the temporal position of a target in a rhythmic sequence of four markers. In the first experiment, target and markers consisted of a visual flash or an auditory noise burst, and all four combinations of target and marker modalities were tested. In the same-modality conditions, no temporal biases and a high precision of the adjusted temporal position of the target were observed. In the different-modality conditions, we found a systematic temporal bias of 25-30 ms. In the second part of the first and in a second experiment, we tested conditions in which audiovisual markers with different stimulus onset asynchronies (SOAs) between the two components and a visual target were used to quantify temporal ventriloquism. The adjusted target positions varied by up to about 50 ms and depended in a systematic way on the SOA and its proximity to the point of subjective synchrony. These data allowed testing different quantitative models. The most satisfying model, based on work by Maij, Brenner, and Smeets (Journal of Neurophysiology 102, 490-495, 2009), linked temporal ventriloquism and the percept of synchrony and was capable of adequately describing the results from the present study, as well as those of some earlier experiments. PMID:23868564

Kuling, Irene A; Kohlrausch, Armin; Juola, James F

2013-10-01

140

Fat Stigmatization in Television Shows and Movies: A Content Analysis  

Microsoft Academic Search

Objective: To examine the phenomenon of fat stigmatization messages presented in television shows and movies, a content analysis was used to quantify and categorize fat-specific commentary and humor.Research Methods and Procedures: Fat stigmatization vignettes were identified using a targeted sampling procedure, and 135 scenes were excised from movies and television shows. The material was coded by trained raters. Reliability indices

Susan M. Himes; J. Kevin Thompson

2007-01-01

141

Quantifying Inductive Bias: AI Learning Algorithms and Valiant's Learning Framework  

Microsoft Academic Search

We show that the notion of inductive bias in concept learning can be quantified in a way that directly relates to learning performance in the framework recently introduced by Valiant. Our measure of bias is based on the growth function introduced by Vapnik and Chervonenkis, and on the Vapnik-Chervonenkis dimension. We measure some common language biases, including restriction to conjunctive

David Haussler

1988-01-01

142

quantifying and Predicting Reactive Transport  

SciTech Connect

This project was led by Dr. Jiamin Wan at Lawrence Berkeley National Laboratory. Peter Burns provided expertise in uranium mineralogy and in identification of uranium minerals in test materials. Dr. Wan conducted column tests regarding uranium transport at LBNL, and samples of the resulting columns were sent to Dr. Burns for analysis. Samples were analyzed for uranium mineralogy by X-ray powder diffraction and by scanning electron microscopy, and results were provided to Dr. Wan for inclusion in the modeling effort. Full details of the project can be found in Dr. Wan's final reports for the associated effort at LBNL.

Peter C. Burns, Department of Civil Engineering and Geological Sciences, University of Notre Dame

2009-12-04

143

Is it Logical to Count on Quantifiers? Dissociable Neural Networks Underlying Numerical and Logical Quantifiers  

PubMed Central

The present study examined the neural substrate of two classes of quantifiers: Numerical quantifiers like “at least three” which require magnitude processing, and logical quantifiers like “some” which can be satisfied using a simple form of perceptual logic. We assessed these distinct classes of quantifiers with converging observations from two sources: functional imaging data from healthy adults, and behavioral and structural data from patients with corticobasal degeneration, who have acalculia. Our findings are consistent with the claim that numerical quantifier comprehension depends on a parietal-dorsolateral prefrontal network, but logical quantifier comprehension depends instead on a rostral medial prefrontal-posterior cingulate network. These observations emphasize the important contribution of abstract number knowledge to the meaning of numerical quantifiers in semantic memory and the potential role of a logic-based evaluation in the service of non-numerical quantifiers. PMID:18789346

Troiani, Vanessa; Peelle, Jonathan E.; Clark, Robin; Grossman, Murray

2009-01-01

144

quantifying and Predicting Reactive Transport  

Microsoft Academic Search

This project was led by Dr. Jiamin Wan at Lawrence Berkeley National Laboratory. Peter Burns provided expertise in uranium mineralogy and in identification of uranium minerals in test materials. Dr. Wan conducted column tests regarding uranium transport at LBNL, and samples of the resulting columns were sent to Dr. Burns for analysis. Samples were analyzed for uranium mineralogy by X-ray

Peter C. Burns

2009-01-01

145

Societal Scale Modeling: Quantifying the  

E-print Network

services from market planning to reliability design in support of new products & services Formed out of MIT Transforming the way buildings are designed, built and used Focus on energy Business perspective A world where economically to reduce energy demand and CO2 ... We expect a persuasive result. EEB Project "EEB

de Weck, Olivier L.

146

Quantifying Coral Reef Ecosystem Services  

EPA Science Inventory

Coral reefs have been declining during the last four decades as a result of both local and global anthropogenic stresses. Numerous research efforts to elucidate the nature, causes, magnitude, and potential remedies for the decline have led to the widely held belief that the recov...

147

Quantifying tetrodotoxin levels in the California newt using a non-destructive sampling method.  

PubMed

Toxic or noxious substances often serve as a means of chemical defense for numerous taxa. However, such compounds may also facilitate ecological or evolutionary processes. The neurotoxin, tetrodotoxin (TTX), which is found in newts of the genus Taricha, acts as a selection pressure upon predatory garter snakes, is a chemical cue to conspecific larvae, which elicits antipredator behavior, and may also affect macroinvertebrate foraging behavior. To understand selection patterns and how potential variation might affect ecological and evolutionary processes, it is necessary to quantify TTX levels within individuals and populations. To do so has often required that animals be destructively sampled or removed from breeding habitats and brought into the laboratory. Here we demonstrate a non-destructive method of sampling adult Taricha that obviates the need to capture and collect individuals. We also show that embryos from oviposited California newt (Taricha torosa) egg masses can be individually sampled and TTX quantified from embryos. We employed three different extraction techniques to isolate TTX. Using a custom fabricated high performance liquid chromatography (HPLC) system we quantified recovery of TTX. We found that a newly developed micro-extraction technique significantly improved recovery compared to previously used methods. Results also indicate our improvements to the HPLC method have high repeatability and increased sensitivity, with a detection limit of 48 pg (0.15 pmol) TTX. The quantified amounts of TTX in adult newts suggest fine geographic variation in toxin levels between sampling localities isolated by as little as 3 km. PMID:24467994

Bucciarelli, Gary M; Li, Amy; Kats, Lee B; Green, David B

2014-03-01

148

Managing Beef Cattle for Show  

E-print Network

This publication gives advice on raising beef cattle to exhibit at shows. Topics include animal selection, feeding, general health management, disease prevention, calf handling, and preparing for the show....

Herd, Dennis B.; Boleman, Chris; Boleman, Larry L.

2001-11-16

149

ELI Talent Show Final Exams  

E-print Network

Highlights: ELI Talent Show; Final Exams; Scholarship Nominees; Graduate Admissions Workshop; Reminders from the Office; Manners, Cultures, & Grammar. The ELI Weekly. ELI Talent Show: It's going to be a blast! Come one, come all! The 2nd Annual ELI Talent Show will be on Tuesday, April 15th

Pilyugin, Sergei S.

150

28. MAP SHOWING LOCATION OF ARVFS FACILITY AS BUILT. SHOWS ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

28. MAP SHOWING LOCATION OF ARVFS FACILITY AS BUILT. SHOWS LINCOLN BOULEVARD, BIG LOST RIVER, AND NAVAL REACTORS FACILITY. F.C. TORKELSON DRAWING NUMBER 842-ARVFS-101-2. DATED OCTOBER 12, 1965. INEL INDEX CODE NUMBER: 075 0101 851 151969. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

151

Quantifying the value of redundant measurements at GRUAN sites  

NASA Astrophysics Data System (ADS)

The potential for measurement redundancy to reduce uncertainty in atmospheric variables has not been investigated comprehensively for climate observations. We evaluated the usefulness of entropy and mutual correlation concepts, as defined in information theory, for quantifying random uncertainty and redundancy in time series of atmospheric water vapor provided by five highly instrumented GRUAN (GCOS [Global Climate Observing System] Reference Upper-Air Network) Stations in 2010-2012. Results show that the random uncertainties for radiosonde, frost-point hygrometer, Global Positioning System, microwave and infrared radiometers, and Raman lidar measurements differed by less than 8%. Comparisons of time series of the Integrated Water Vapor (IWV) content from ground-based remote sensing instruments with in situ soundings showed that microwave radiometers have the highest redundancy and therefore the highest potential to reduce random uncertainty of IWV time series estimated by radiosondes. Moreover, the random uncertainty of a time series from one instrument can be reduced by ~60% by constraining the measurements with those from another instrument. The best reduction of random uncertainty resulted from conditioning Raman lidar measurements on microwave radiometer measurements. Specific instruments are recommended for atmospheric water vapor measurements at GRUAN sites. This approach can be applied to the study of redundant measurements for other climate variables.
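The entropy and mutual-information bookkeeping behind such a redundancy analysis can be sketched on synthetic integrated water vapor series; the instrument names, noise levels, and bin count below are illustrative assumptions, not values from the study.

```python
import numpy as np

def entropy(x, bins=20):
    """Shannon entropy (bits) of a binned time series."""
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return -(p * np.log2(p)).sum()

def mutual_information(x, y, bins=20):
    """Mutual information (bits) between two binned time series."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return (pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])).sum()

rng = np.random.default_rng(0)
radiosonde = rng.normal(20, 5, 5000)               # IWV in kg/m^2 (synthetic)
radiometer = radiosonde + rng.normal(0, 1, 5000)   # redundant instrument

h, mi = entropy(radiosonde), mutual_information(radiosonde, radiometer)
print(f"H = {h:.2f} bits, I = {mi:.2f} bits, redundant fraction = {mi/h:.2f}")
```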

Madonna, F.; Rosoldi, M.; Güldner, J.; Haefele, A.; Kivi, R.; Cadeddu, M. P.; Sisterson, D.; Pappalardo, G.

2014-06-01

152

Quantifying Magnetic Stellar Wind Torques  

NASA Astrophysics Data System (ADS)

In order to be able to understand the evolution of stellar spin rates and differential rotation, it is necessary to have a rigorous theory for predicting angular momentum loss via magnetic stellar winds that is applicable over a wide range of conditions. Based upon the results of multidimensional, numerical simulations and semi-analytic calculations, we present an improved formulation for predicting the stellar wind torque, which is valid for varying degrees of magnetization in the wind, as well as for stellar spin rates that range from the slow- to the fast-magnetic-rotator regimes.

Matt, Sean; MacGregor, K. B.; Pinsonneault, M. H.; Greene, T. P.

2011-01-01

153

Quantifying Comparisons of Threshold Resummations  

E-print Network

We explore similarities and differences between widely-used threshold resummation formalisms, employing electroweak boson production as an instructive example. Resummations based on both full QCD and soft-collinear effective theory (SCET) share common underlying factorizations and resulting evolution equations. The formalisms differ primarily in their choices of boundary conditions for evolution, in moment space for many treatments based on full QCD, and in momentum space for treatments based on soft-collinear effective theory. At the level of factorized hadronic cross sections, these choices lead to quite different expressions. Nevertheless, we can identify a natural expansion for parton luminosity functions, in which SCET and full QCD resummations agree for the first term, and for which subsequent terms provide differences that are small in most cases. We also clarify the roles of the non-leading resummation constants in the two formalisms, and observe a relationship of the QCD resummation function $D(\alpha_s)$ to the web expansion.

George Sterman; Mao Zeng

2013-12-19

154

Quantifying energetics and dissipation in magnetohydrodynamic turbulence  

NASA Astrophysics Data System (ADS)

We perform a suite of two- and three-dimensional magnetohydrodynamic (MHD) simulations with the ATHENA code of the non-driven Kelvin-Helmholtz instability in the subsonic, weak magnetic field limit. Focusing the analysis on the non-linear turbulent regime, we quantify energy transfer on a scale-by-scale basis and identify the physical mechanisms responsible for energy exchange by developing the diagnostic known as spectral energy transfer function analysis. At late times when the fluid is in a state of MHD turbulence, magnetic tension mediates the dominant mode of energy injection into the magnetic reservoir, whereby turbulent fluid motions twist and stretch the magnetic field lines. This generated magnetic energy turbulently cascades to smaller scales, while being exchanged backwards and forwards with the kinetic energy reservoir, until finally being dissipated. Incorporating explicit dissipation pushes the dissipation scale to larger scales than if the dissipation were entirely numerical. For scales larger than the dissipation scale, we show that the physics of energy transfer in decaying MHD turbulence is robust to numerical effects.

Salvesen, Greg; Beckwith, Kris; Simon, Jacob B.; O'Neill, Sean M.; Begelman, Mitchell C.

2014-02-01

155

Quantifying Tsunami Impact on Structures  

NASA Astrophysics Data System (ADS)

Tsunami impact is usually assessed through inundation simulations and maps which provide estimates of coastal flooding zones based on "credible worst case" scenarios. Earlier maps relied on one-dimensional computations, but two-dimensional computations are now employed routinely. In some cases, the maps do not represent flooding from any particular scenario event, but present an inundation line that reflects the worst inundation at this particular location among a range of scenario events. Current practice in tsunami resistant design relies on estimates of tsunami impact forces derived from empirical relationships that have been borrowed from riverine flooding calculations, which involve only inundation elevations. We examine this practice critically. Recent computational advances allow for calculation of additional parameters from scenario events such as the detailed distributions of tsunami currents and fluid accelerations, and this suggests that alternative and more comprehensive expressions for calculating tsunami impact and tsunami forces should be examined. We do so, using model output for multiple inundation simulations of Seaside, Oregon, as part of a pilot project to develop probabilistic tsunami hazard assessment methodologies for incorporation into FEMA Flood Insurance Rate Maps. We consider three different methods, compare the results with existing methodology for estimating forces and impact, and discuss the implications of these methodologies for probabilistic tsunami hazard assessment.

Yalciner, A. C.; Kanoglu, U.; Titov, V.; Gonzalez, F.; Synolakis, C. E.

2004-12-01

156

Planning a Successful Tech Show  

ERIC Educational Resources Information Center

Tech shows are a great way to introduce prospective students, parents, and local business and industry to a technology and engineering or career and technical education program. In addition to showcasing instructional programs, a tech show allows students to demonstrate their professionalism and skills, practice public presentations, and interact…

Nikirk, Martin

2011-01-01

157

RISK AVERSION IN GAME SHOWS  

Microsoft Academic Search

We review the use of behavior from television game shows to infer risk attitudes. These shows provide evidence when contestants are making decisions over very large stakes, and in a replicated, structured way. Inferences are generally confounded by the subjective assessment of skill in some games, and the dynamic nature of the task in most games. We consider the game

Steffen Andersen; Glenn W. Harrison; Morten I. Lau; E. Elisabet Rutstrom

158

A Study of Quantifier Phrases in Thai  

E-print Network

The structures of quantifier phrases in Thai are studied in the X-bar Syntax framework (Jackendoff 1977). Syntactic and semantic arguments are provided to prove that this model remedies the deficiency of traditional and early ...

Deephuengton, Phawadee

1992-01-01

159

Construction of implicational quantifiers from fuzzy implications  

Microsoft Academic Search

Relations between two Boolean attributes derived from data can be quantified by [0,1]-valued functions defined on four-fold tables corresponding to pairs of the attributes. The most important of such quantifiers are implicational ones widely used in data-mining procedures. On the other hand, there are operators of fuzzy implication introduced and studied in fuzzy logic. In the paper, two methods of

Jirí Ivánek

2005-01-01

160

Quantifying the Risk of Blood Exposure in Optometric Clinical Education.  

ERIC Educational Resources Information Center

A study attempted to quantify risk of blood exposure in optometric clinical education by surveying optometric interns in their fourth year at the Southern California College of Optometry concerning their history of exposure or use of a needle. Results indicate blood exposure or needle use ranged from 0.95 to 18.71 per 10,000 patient encounters.…

Hoppe, Elizabeth

1997-01-01

161

Quantifying perception of nonlinear elastic tissue models using multidimensional scaling  

Microsoft Academic Search

Simplified soft tissue models used in surgical simulations cannot perfectly reproduce all material behaviors. In particular, many tissues exhibit the Poynting effect, which results in normal forces during shearing of tissue and is only observed in nonlinear elastic material models. In order to investigate and quantify the role of the Poynting effect on material discrimination, we performed a multi-dimensional scaling

Sarthak Misra; Philipp Fuernstahl; K. T. Ramesh; Allison M. Okamura; Matthias Harders

2009-01-01

162

Quantified Energy Dissipation Rates in the Terrestrial Bow Shock  

NASA Astrophysics Data System (ADS)

We present the first observationally quantified measure of the energy dissipation rate due to wave-particle interactions in the transition region of the Earth's collisionless bow shock using data from the THEMIS spacecraft. Each of more than 11 bow shock crossings examined with available wave burst data showed both low frequency (<10 Hz) magnetosonic-whistler waves and high frequency (≥10 Hz) electromagnetic and electrostatic waves throughout the entire transition region and into the magnetosheath. The high frequency waves were identified as combinations of ion-acoustic waves, electron cyclotron drift instability driven waves, electrostatic solitary waves, and electromagnetic whistler mode waves. These waves were found to have: (1) amplitudes capable of exceeding δB ~ 10 nT and δE ~ 300 mV/m, though more typical values were δB ~ 0.1-1.0 nT and δE ~ 10-50 mV/m; (2) energy fluxes in excess of 2000 μW m-2; (3) resistivities > 9000 Ω m; and (4) energy dissipation rates > 3 μW m-3. The high frequency (>10 Hz) electromagnetic waves produce such excessive energy dissipation that they need only be, at times, < 0.01% efficient to produce the observed increase in entropy across the shocks necessary to balance the nonlinear wave steepening that produces the shocks. These results show that wave-particle interactions have the capacity to regulate the global structure and dominate the energy dissipation of collisionless shocks.

Wilson, L. B., III; Sibeck, D. G.; Breneman, A. W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.

2013-12-01

163

Rectal Swabs Are Suitable for Quantifying the Carriage Load of KPC-Producing Carbapenem-Resistant Enterobacteriaceae  

PubMed Central

It is more convenient and practical to collect rectal swabs than stool specimens to study carriage of colon pathogens. In this study, we examined the ability to use rectal swabs rather than stool specimens to quantify Klebsiella pneumoniae carbapenemase (KPC)-producing carbapenem-resistant Enterobacteriaceae (CRE). We used a quantitative real-time PCR (qPCR) assay to determine the concentration of the blaKPC gene relative to the concentration of 16S rRNA genes and a quantitative culture-based method to quantify CRE relative to total aerobic bacteria. Our results demonstrated that rectal swabs are suitable for quantifying the concentration of KPC-producing CRE and that qPCR showed higher correlation between rectal swabs and stool specimens than the culture-based method. PMID:23295937
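For readers unfamiliar with relative qPCR quantification, the arithmetic reduces to a delta-Ct calculation under the usual assumption of near-100% amplification efficiency. A minimal sketch with hypothetical Ct values, not necessarily the authors' exact normalization:

```python
def relative_load(ct_target: float, ct_reference: float) -> float:
    """Abundance of a target gene (e.g., blaKPC) relative to a reference
    (e.g., 16S rRNA), assuming ~100% PCR efficiency: 2 ** -(delta Ct)."""
    return 2.0 ** -(ct_target - ct_reference)

# Hypothetical paired specimens from one patient:
swab = relative_load(ct_target=27.5, ct_reference=18.0)
stool = relative_load(ct_target=26.9, ct_reference=17.6)
print(f"swab: {swab:.2e}  stool: {stool:.2e}  swab/stool: {swab / stool:.2f}")
```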

Lerner, A.; Romano, J.; Chmelnitsky, I.; Navon-Venezia, S.; Edgar, R.

2013-01-01

164

Quantifying Wrinkle Features of Thin Membrane Structures  

NASA Technical Reports Server (NTRS)

For future micro-systems utilizing membrane based structures, quantified predictions of wrinkling behavior in terms of amplitude, angle and wavelength are needed to optimize the efficiency and integrity of such structures, as well as their associated control systems. For numerical analyses performed in the past, limitations on the accuracy of membrane distortion simulations have often been related to the assumptions made. This work demonstrates that critical assumptions include: effects of gravity, supposed initial or boundary conditions, and the type of element used to model the membrane. In this work, a 0.2 m x 0.2 m membrane is treated as a structural material with non-negligible bending stiffness. Finite element modeling is used to simulate wrinkling behavior due to a constant applied in-plane shear load. Membrane thickness, gravity effects, and initial imperfections with respect to flatness were varied in numerous nonlinear analysis cases. Significant findings include notable variations in wrinkle modes for thickness in the range of 50 microns to 1000 microns, which also depend on the presence of an applied gravity field. However, it is revealed that relationships between overall strain energy density and thickness for cases with differing initial conditions are independent of assumed initial conditions. In addition, analysis results indicate that the relationship between wrinkle amplitude scale (W/t) and structural scale (L/t) is independent of the nonlinear relationship between thickness and stiffness.

Jacobson, Mindy B.; Iwasa, Takashi; Naton, M. C.

2004-01-01

165

Quantifying ant activity using vibration measurements.  

PubMed

Ant behaviour is of great interest due to their sociality. Ant behaviour is typically observed visually, however there are many circumstances where visual observation is not possible. It may be possible to assess ant behaviour using vibration signals produced by their physical movement. We demonstrate through a series of bioassays with different stimuli that the level of activity of meat ants (Iridomyrmex purpureus) can be quantified using vibrations, corresponding to observations with video. We found that ants exposed to physical shaking produced the highest average vibration amplitudes followed by ants with stones to drag, then ants with neighbours, illuminated ants and ants in darkness. In addition, we devised a novel method based on wavelet decomposition to separate the vibration signal owing to the initial ant behaviour from the substrate response, which will allow signals recorded from different substrates to be compared directly. Our results indicate the potential to use vibration signals to classify some ant behaviours in situations where visual observation could be difficult. PMID:24658467
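A rough sketch of wavelet-based separation in this spirit is shown below, using PyWavelets on a synthetic recording. The wavelet family, decomposition depth, and the choice to discard the coarse approximation as the "substrate response" are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(1)
fs = 1000                                        # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
substrate = 0.05 * np.sin(2 * np.pi * 30 * t)            # slow substrate response
impacts = (rng.random(t.size) < 1e-3).astype(float)      # sparse ant impacts
recording = substrate + np.convolve(impacts, np.hanning(20), mode="same")

# Multi-level discrete wavelet decomposition: zero out the coarse
# approximation (substrate-dominated) and reconstruct the detail part.
coeffs = pywt.wavedec(recording, "db4", level=6)
coeffs[0][:] = 0
ant_component = pywt.waverec(coeffs, "db4")[: recording.size]

print(f"activity index (RMS): {np.sqrt(np.mean(ant_component ** 2)):.4f}")
```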

Oberst, Sebastian; Baro, Enrique Nava; Lai, Joseph C S; Evans, Theodore A

2014-01-01

166

Quantifying selection in immune receptor repertoires.  

PubMed

The efficient recognition of pathogens by the adaptive immune system relies on the diversity of receptors displayed at the surface of immune cells. T-cell receptor diversity results from an initial random DNA editing process, called VDJ recombination, followed by functional selection of cells according to the interaction of their surface receptors with self and foreign antigenic peptides. Using high-throughput sequence data from the β-chain of human T-cell receptors, we infer factors that quantify the overall effect of selection on the elements of receptor sequence composition: the V and J gene choice and the length and amino acid composition of the variable region. We find a significant correlation between biases induced by VDJ recombination and our inferred selection factors together with a reduction of diversity during selection. Both effects suggest that natural selection acting on the recombination process has anticipated the selection pressures experienced during somatic evolution. The inferred selection factors differ little between donors or between naive and memory repertoires. The number of sequences shared between donors is well-predicted by our model, indicating a stochastic origin of such public sequences. Our approach is based on a probabilistic maximum likelihood method, which is necessary to disentangle the effects of selection from biases inherent in the recombination process. PMID:24941953

Elhanati, Yuval; Murugan, Anand; Callan, Curtis G; Mora, Thierry; Walczak, Aleksandra M

2014-07-01

167

Quantifying long-range correlations in complex networks beyond nearest neighbors  

E-print Network

We propose a fluctuation analysis to quantify spatial correlations in complex networks. The approach considers the sequences of degrees along shortest paths in the networks and quantifies the fluctuations in analogy to time series. In this work, the Barabasi-Albert (BA) model, the Cayley tree at the percolation transition, a fractal network model, and examples of real-world networks are studied. While the fluctuation functions for the BA model show exponential decay, in the case of the Cayley tree and the fractal network model the fluctuation functions display a power-law behavior. The fractal network model comprises long-range anti-correlations. The results suggest that the fluctuation exponent provides complementary information to the fractal dimension.
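A simplified analogue of the approach can be sketched with networkx: treat the degree sequence along each shortest path as a short "time series" and measure the spread of its mean as a function of path length. This is a loose illustration, not the authors' exact fluctuation function.

```python
import networkx as nx
import numpy as np

def degree_fluctuation(G, d):
    """Std. dev. of the mean degree along shortest paths with d hops."""
    deg = dict(G.degree())
    means = []
    for source in G:
        for path in nx.single_source_shortest_path(G, source, cutoff=d).values():
            if len(path) == d + 1:                 # exactly d hops
                means.append(np.mean([deg[n] for n in path]))
    return float(np.std(means))

G = nx.barabasi_albert_graph(300, 3, seed=42)
print([round(degree_fluctuation(G, d), 2) for d in (1, 2, 3, 4)])
```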

Rybski, Diego; Kropp, Jürgen P

2010-01-01

168

Shakespeare and other English Renaissance authors as characterized by Information Theory complexity quantifiers  

NASA Astrophysics Data System (ADS)

We introduce novel Information Theory quantifiers in a computational linguistic study that involves a large corpus of English Renaissance literature. The 185 texts studied (136 plays and 49 poems in total), with first editions that range from 1580 to 1640, form a representative set of its period. Our data set includes 30 texts unquestionably attributed to Shakespeare; in addition we also included A Lover’s Complaint, a poem which generally appears in Shakespeare collected editions but whose authorship is currently in dispute. Our statistical complexity quantifiers combine the power of Jensen-Shannon’s divergence with the entropy variations as computed from a probability distribution function of the observed word use frequencies. Our results show, among other things, that for a given entropy poems display higher complexity than plays, that Shakespeare’s work falls into two distinct clusters in entropy, and that his work is remarkable for its homogeneity and for its closeness to overall means.
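The central ingredient, the Jensen-Shannon divergence between word-frequency distributions, is straightforward to compute; a self-contained sketch on toy word counts follows (the paper's full complexity quantifier, which combines this with entropy variations, is not reproduced here).

```python
import numpy as np
from collections import Counter

def js_divergence(p: Counter, q: Counter) -> float:
    """Jensen-Shannon divergence (bits) between two word-count vectors."""
    vocab = sorted(set(p) | set(q))
    P = np.array([p[w] for w in vocab], float); P /= P.sum()
    Q = np.array([q[w] for w in vocab], float); Q /= Q.sum()
    M = 0.5 * (P + Q)
    kl = lambda a, b: (a[a > 0] * np.log2(a[a > 0] / b[a > 0])).sum()
    return 0.5 * kl(P, M) + 0.5 * kl(Q, M)

play = Counter("the quick brown fox jumps over the lazy dog".split())
poem = Counter("shall i compare thee to a summers day the sun".split())
print(f"JSD = {js_divergence(play, poem):.3f} bits")
```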

Rosso, Osvaldo A.; Craig, Hugh; Moscato, Pablo

2009-03-01

169

2013 Goat Shows

E-print Network

[Schedule table listing 2013 Tennessee youth goat shows by date, with show names, entry deadlines, eligibility, weigh-in and show times, contacts and phone numbers, entry fees, and animal limits.]

Grissino-Mayer, Henri D.

170

Quantifying groundwater recharge from floods in semi-arid environments  

NASA Astrophysics Data System (ADS)

Floods represent an important aquifer recharge component in semi-arid environments. Changes in land use and the creation of artificial barriers to protect land from inundation can considerably influence the amount of aquifer recharge. Despite their importance, the mechanisms that control flood recharge are poorly understood. Moreover, groundwater flow models rarely incorporate these processes with an appropriate physics-based approach. In this study, we use a fully integrated surface-subsurface fluid flow model to quantify changes in flood recharge induced by changes in land use. First, the flow simulations are performed on a synthetic aquifer to understand first-order controls on flood recharge. Later, the simulations are extended to a real aquifer, the lower Namoi alluvial aquifer in New South Wales, Australia. Long-term groundwater monitoring hydrographs are used to calibrate the aquifer model. Satellite and aero-photographic surveys available both before and after changes in land use enable the comparison of flood extent to groundwater hydrograph response. The results show that the volume of water provided by the floods can represent a significant fraction of the aquifer water balance, and that changes in land use have a considerable effect on it. In addition, the results highlight the importance of treating flood recharge as a non-linear process.

Comunian, A.; Ajami, H.; Kelly, B. F.

2013-12-01

171

The OOPSLA trivia show (TOOTS)  

Microsoft Academic Search

OOPSLA has a longstanding tradition of being a forum for discussing the cutting edge of technology in a fun and participatory environment. The type of events sponsored by OOPSLA sometimes border on the unconventional. This event represents an atypical panel that conforms to the concept of a game show that is focused on questions and answers related to OOPSLA themes.

Jeff Gray; Douglas C. Schmidt

2009-01-01

172

Magic Carpet Shows Its Colors  

NASA Technical Reports Server (NTRS)

The upper left image in this display is from the panoramic camera on the Mars Exploration Rover Spirit, showing the 'Magic Carpet' region near the rover at Gusev Crater, Mars, on Sol 7, the seventh martian day of its journey (Jan. 10, 2004). The lower image, also from the panoramic camera, is a monochrome (single filter) image of a rock in the 'Magic Carpet' area. Note that colored portions of the rock correlate with extracted spectra shown in the plot to the side. Four different types of materials are shown: the rock itself, the soil in front of the rock, some brighter soil on top of the rock, and some dust that has collected in small recesses on the rock face ('spots'). Each color on the spectra matches a line on the graph, showing how the panoramic camera's different colored filters are used to broadly assess the varying mineral compositions of martian rocks and soils.

2004-01-01

173

Rocks and Minerals Slide Show  

NSDL National Science Digital Library

This interactive slide show of common rocks and minerals allows students to choose from two sets of minerals and click on a thumbnail to see a larger photograph with a full description of the mineral including color, streak, hardness, cleavage/fracture, and chemical composition. Also included are its use and where it is found. The rocks are divided into igneous, sedimentary, and metamorphic and can be accessed in the same manner. They are described on the basis of crystal size and mineral composition as well as use.

174

SANTA: quantifying the functional content of molecular networks  

E-print Network

Linking networks of molecular interactions to cellular functions and phenotypes is a key goal in systems biology. Here, we adapt concepts of spatial ... Our theory, simulations and applications show that SANTA provides a principled statistical way to quantify the association between molecular networks and cellular functions and phenotypes. SANTA is available from http ...

Cornish, Alex J.; Markowetz, Florian

2014-09-11

175

"Medicine show." Alice in Doctorland.  

PubMed

This is an excerpt from the script of a 1939 play provided to the Institute of Social Medicine and Community Health by the Library of Congress Federal Theater Project Collection at George Mason University Library, Fairfax, Virginia, pages 2-1-8 thru 2-1-14. The Federal Theatre Project (FTP) was part of the New Deal program for the arts 1935-1939. Funded by the Works Progress Administration (WPA) its goal was to employ theater professionals from the relief rolls. A number of FTP plays deal with aspects of medicine and public health. Pageants, puppet shows and documentary plays celebrated progress in medical science while examining social controversies in medical services and the public health movement. "Medicine Show" sharply contrasts technological wonders with social backwardness. The play was rehearsed by the FTP but never opened because funding ended. A revised version ran on Broadway in 1940. The preceding comments are adapted from an excellent, well-illustrated review of five of these plays by Barbara Melosh: "The New Deal's Federal Theatre Project," Medical Heritage, Vol. 2, No. 1 (Jan/Feb 1986), pp. 36-47. PMID:10301683

1987-01-01

176

Show off the corporate library  

Microsoft Academic Search

Some published evidence has suggested that corporate libraries are slowly becoming irrelevant in meeting organisational information needs due to old-fashioned models of service delivery. Elsewhere in the literature the intranet is heralded as a technological tool for corporate information management. This paper provides the results of a series of case studies completed in 1998. The purpose of the research was

Hazel Hall; Alyn M Jones

2000-01-01

177

Quantifying the Clinical Significance of Cannabis Withdrawal  

PubMed Central

Background and Aims: Questions over the clinical significance of cannabis withdrawal have hindered its inclusion as a discrete cannabis induced psychiatric condition in the Diagnostic and Statistical Manual of Mental Disorders (DSM IV). This study aims to quantify functional impairment to normal daily activities from cannabis withdrawal, and looks at the factors predicting functional impairment. In addition the study tests the influence of functional impairment from cannabis withdrawal on cannabis use during and after an abstinence attempt. Methods and Results: A volunteer sample of 49 non-treatment seeking cannabis users who met DSM-IV criteria for dependence provided daily withdrawal-related functional impairment scores during a one-week baseline phase and two weeks of monitored abstinence from cannabis with a one month follow up. Functional impairment from withdrawal symptoms was strongly associated with symptom severity (p = 0.0001). Participants with more severe cannabis dependence before the abstinence attempt reported greater functional impairment from cannabis withdrawal (p = 0.03). Relapse to cannabis use during the abstinence period was associated with greater functional impairment from a subset of withdrawal symptoms in high dependence users. Higher levels of functional impairment during the abstinence attempt predicted higher levels of cannabis use at one month follow up (p = 0.001). Conclusions: Cannabis withdrawal is clinically significant because it is associated with functional impairment to normal daily activities, as well as relapse to cannabis use. Sample size in the relapse group was small and the use of a non-treatment seeking population requires findings to be replicated in clinical samples. Tailoring treatments to target withdrawal symptoms contributing to functional impairment during a quit attempt may improve treatment outcomes. PMID:23049760

Allsop, David J.; Copeland, Jan; Norberg, Melissa M.; Fu, Shanlin; Molnar, Anna; Lewis, John; Budney, Alan J.

2012-01-01

178

Quantifying uncertainty sources in hydrological climate impact projections  

NASA Astrophysics Data System (ADS)

Impact modeling systems, consisting of an emission scenario, global and regional climate models, statistical post-processing methods and hydrological models, are commonly used to assess hydrological climate impacts. Uncertainties associated with the projected impacts arise from each element of the modeling chain. While propagating through the modeling chain, the uncertainties from various modeling steps might interact. Interactions mean that the uncertainty of projected climate impacts by an ensemble of, e.g., multiple hydrological models, depends on the preceding modeling steps. In order to quantify such interactions, one needs to generate an ensemble of projections that varies different elements of the impact modeling chain simultaneously. In this study, we conducted a modeling experiment in the Alpine Rhine catchment using an ensemble of 9 climate model chains (CMs) from the ENSEMBLES project (www.ensembles-eu.org), 2 statistical post-processing (PP) methods and 2 hydrological models (HMs). We address changes in the annual cycle of runoff and of different runoff quantiles for the period 2021-2050 relative to 1961-1990. Based on this database of 36 different modeling chains, we tried to answer the questions: (1) how large is the total uncertainty of the projections, and (2) how much do the three modeling chain elements (CMs, PP methods, HMs) and interactions between them contribute to the total uncertainty as estimated in (1). The results show that most of the projections agree on an increase of runoff in winter (+15.6 [range +5.5 to +40.7] %) and a decrease in summer (-13.8 [range -26.0 to +3.9] %). However, there is large uncertainty in the magnitude of the changes. We used an ANalysis Of VAriance (ANOVA) model to quantify the contributions of various uncertainty sources to the total uncertainty of the ensemble. We found that CMs are the most important source of uncertainty for changes in the annual cycle of runoff during most parts of the year, and over a large quantile range. We also found that interactions might be as important as CMs during winter and spring and for extreme runoff quantiles. This indicates that it is crucial to vary multiple impact modeling chain elements simultaneously in order to assess the full uncertainty of hydrological climate impacts. Concerning the design of future impact studies, our results indicate that one should invest more into having a balanced sampling of all possible uncertainty sources rather than increase the sample size of just one particular source. Furthermore, the employed ANOVA model for the decomposition of the total uncertainty is flexible and could be adapted to modeling experiments that include other uncertainty sources such as e.g. emission scenarios or land use changes.
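The ANOVA-style decomposition at the heart of the study can be sketched for a climate model x post-processing x hydrological model ensemble. The 9 x 2 x 2 layout mirrors the paper, but the numbers are synthetic, and with one run per chain the interaction term is simply taken as the residual.

```python
import numpy as np

rng = np.random.default_rng(0)
# Projected winter runoff change (%) for 9 CMs x 2 PP methods x 2 HMs.
x = rng.normal(15, 8, size=(9, 2, 2))

grand = x.mean()
cm = x.mean(axis=(1, 2)) - grand     # climate-model main effects
pp = x.mean(axis=(0, 2)) - grand     # post-processing main effects
hm = x.mean(axis=(0, 1)) - grand     # hydrological-model main effects

total = x.var()
parts = {"CM": (cm ** 2).mean(), "PP": (pp ** 2).mean(), "HM": (hm ** 2).mean()}
parts["interactions"] = total - sum(parts.values())  # balanced-design residual
for name, v in parts.items():
    print(f"{name:12s} {100 * v / total:5.1f}% of total variance")
```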

Bosshard, T.; Kotlarski, S.; Carambia, M.; Görgen, K.; Krahe, P.; Zappa, M.; Schär, C.

2012-04-01

179

"Show me" bioethics and politics.  

PubMed

Missouri, the "Show Me State," has become the epicenter of several important national public policy debates, including abortion rights, the right to choose and refuse medical treatment, and, most recently, early stem cell research. In this environment, the Center for Practical Bioethics (formerly, Midwest Bioethics Center) emerged and grew. The Center's role in these "cultural wars" is not to advocate for a particular position but to provide well researched and objective information, perspective, and advocacy for the ethical justification of policy positions; and to serve as a neutral convener and provider of a public forum for discussion. In this article, the Center's work on early stem cell research is a case study through which to argue that not only the Center, but also the field of bioethics has a critical role in the politics of public health policy. PMID:17926217

Christopher, Myra J

2007-10-01

180

Quantifying thiol-gold interactions towards the efficient strength control  

NASA Astrophysics Data System (ADS)

The strength of the thiol-gold interactions provides the basis to fabricate robust self-assembled monolayers for diverse applications. Investigation on the stability of thiol-gold interactions has thus become a hot topic. Here we use atomic force microscopy to quantify the stability of individual thiol-gold contacts formed both by isolated single thiols and in self-assembled monolayers on gold surface. Our results show that the oxidized gold surface can enhance greatly the stability of gold-thiol contacts. In addition, the shift of binding modes from a coordinate bond to a covalent bond with the change in environmental pH and interaction time has been observed experimentally. Furthermore, isolated thiol-gold contact is found to be more stable than that in self-assembled monolayers. Our findings revealed mechanisms to control the strength of thiol-gold contacts and will help guide the design of thiol-gold contacts for a variety of practical applications.

Xue, Yurui; Li, Xun; Li, Hongbin; Zhang, Wenke

2014-07-01

181

Statistical physics approach to quantifying differences in myelinated nerve fibers.  

PubMed

We present a new method to quantify differences in myelinated nerve fibers. These differences range from morphologic characteristics of individual fibers to differences in macroscopic properties of collections of fibers. Our method uses statistical physics tools to improve on traditional measures, such as fiber size and packing density. As a case study, we analyze cross-sectional electron micrographs from the fornix of young and old rhesus monkeys using a semi-automatic detection algorithm to identify and characterize myelinated axons. We then apply a feature selection approach to identify the features that best distinguish between the young and old age groups, achieving a maximum accuracy of 94% when assigning samples to their age groups. This analysis shows that the best discrimination is obtained using the combination of two features: the fraction of occupied axon area and the effective local density. The latter is a modified calculation of axon density, which reflects how closely axons are packed. Our feature analysis approach can be applied to characterize differences that result from biological processes such as aging, damage from trauma or disease or developmental differences, as well as differences between anatomical regions such as the fornix and the cingulum bundle or corpus callosum. PMID:24676146

Comin, César H; Santos, João R; Corradini, Dario; Morrison, Will; Curme, Chester; Rosene, Douglas L; Gabrielli, Andrea; Costa, Luciano da F; Stanley, H Eugene

2014-01-01

182

Quantifying the impact of ocean acidification on our future climate  

NASA Astrophysics Data System (ADS)

Ocean acidification (OA) is the consequence of rising atmospheric CO2 levels, and it is occurring in conjunction with global warming. Observational studies show that OA will impact ocean biogeochemical cycles. Here, we use an Earth system model under the RCP8.5 emission scenario to evaluate and quantify the first-order impacts of OA on marine biogeochemical cycles, and its potential feedback on our future climate. We find that OA impacts have only a small impact on the future atmospheric CO2 (less than 45 ppm) and global warming (less than 0.25 K) by 2100. While the climate change feedbacks are small, OA impacts may significantly alter the distribution of biological production and remineralisation, which would alter the dissolved oxygen distribution in the ocean interior. Our results demonstrate that the consequences of OA will not be through its impact on climate change, but on how it impacts the flow of energy in marine ecosystems, which may significantly impact their productivity, composition and diversity.

Matear, R. J.; Lenton, A.

2014-07-01

183

Quantifying the effect of intertrial dependence on perceptual decisions.  

PubMed

In the perceptual sciences, experimenters study the causal mechanisms of perceptual systems by probing observers with carefully constructed stimuli. It has long been known, however, that perceptual decisions are not only determined by the stimulus, but also by internal factors. Internal factors could lead to a statistical influence of previous stimuli and responses on the current trial, resulting in serial dependencies, which complicate the causal inference between stimulus and response. However, the majority of studies do not take serial dependencies into account, and it has been unclear how strongly they influence perceptual decisions. We hypothesize that one reason for this neglect is that there has been no reliable tool to quantify them and to correct for their effects. Here we develop a statistical method to detect, estimate, and correct for serial dependencies in behavioral data. We show that even trained psychophysical observers suffer from strong history dependence. A substantial fraction of the decision variance on difficult stimuli was independent of the stimulus but dependent on experimental history. We discuss the strong dependence of perceptual decisions on internal factors and its implications for correct data interpretation. PMID:24944238
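One standard way to detect such serial dependencies, broadly in the spirit of the paper, is to augment a psychometric model with history regressors; a sketch on simulated data follows, where the synthetic observer's history weight and the simple logistic model are illustrative assumptions rather than the authors' exact method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 2000
stim = rng.choice([-1.0, 1.0], n)           # signed stimulus strength
resp = np.zeros(n, dtype=int)
for t in range(1, n):                       # observer biased by last response
    drive = 1.5 * stim[t] + 0.8 * (2 * resp[t - 1] - 1)
    resp[t] = rng.random() < 1 / (1 + np.exp(-drive))

# Regressors: current stimulus, previous response, previous stimulus.
X = np.column_stack([stim[1:], 2 * resp[:-1] - 1, stim[:-1]])
fit = LogisticRegression().fit(X, resp[1:])
print("weights (stim, prev resp, prev stim):", fit.coef_.round(2))
```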

Fründ, Ingo; Wichmann, Felix A; Macke, Jakob H

2014-01-01

184

Quantifying metastatic inefficiency: rare genotypes versus rare dynamics  

NASA Astrophysics Data System (ADS)

We introduce and solve a ‘null model’ of stochastic metastatic colonization. The model is described by a single parameter θ: the ratio of the rate of cell division to the rate of cell death for a disseminated tumour cell in a given secondary tissue environment. We are primarily interested in the case in which colonizing cells are poorly adapted for proliferation in the local tissue environment, so that cell death is more likely than cell division, i.e. θ < 1. We quantify the rare event statistics for the successful establishment of a metastatic colony of size N. For N ≫ 1, we find that the probability of establishment is exponentially rare, as expected, and yet the mean time for such rare events is of the form ~ log(N)/(1−θ) while the standard deviation of colonization times is ~ 1/(1−θ). Thus, counter to naive expectation, for θ < 1, the average time for establishment of successful metastatic colonies decreases with decreasing cell fitness, and colonies seeded from lower fitness cells show less stochastic variation in their growth. These results indicate that metastatic growth from poorly adapted cells is rare, exponentially explosive and essentially deterministic. These statements are brought into sharper focus by the finding that the temporal statistics of the early stages of metastatic colonization from low-fitness cells (θ < 1) are statistically indistinguishable from those initiated from high-fitness cells (θ > 1), i.e. the statistics show a duality mapping (1−θ) → (θ−1). We conclude our analysis with a study of heterogeneity in the fitness of colonising cells, and describe a phase diagram delineating parameter regions in which metastatic colonization is dominated either by low or high fitness cells, showing that both are plausible given our current knowledge of physiological conditions in human cancer.
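Because the null model is a linear birth-death process, its rare-event statistics can be checked by direct simulation; a sketch with illustrative parameter values follows (θ, N, and the trial count are not taken from the paper).

```python
import numpy as np

def colonize(theta, N, rng, t_max=1e6):
    """One disseminated cell with division rate theta and death rate 1;
    returns (established, elapsed time), success meaning size N reached."""
    n, t = 1, 0.0
    while 0 < n < N and t < t_max:
        t += rng.exponential(1.0 / (n * (theta + 1.0)))   # Gillespie step
        n += 1 if rng.random() < theta / (theta + 1.0) else -1
    return n >= N, t

rng = np.random.default_rng(3)
theta, N, trials = 0.9, 30, 100_000
times = [t for ok, t in (colonize(theta, N, rng) for _ in range(trials)) if ok]
print(f"P(establish) ~ {len(times) / trials:.1e}, "
      f"mean time {np.mean(times):.1f}, sd {np.std(times):.1f}")
```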

Cisneros, Luis H.; Newman, Timothy J.

2014-08-01

185

Improved estimates show large circumpolar stocks of permafrost carbon while quantifying substantial uncertainty ranges and identifying remaining data gaps  

NASA Astrophysics Data System (ADS)

Soils and other unconsolidated deposits in the northern circumpolar permafrost region store large amounts of soil organic carbon (SOC). This SOC is potentially vulnerable to remobilization following soil warming and permafrost thaw, but stock estimates are poorly constrained and quantitative error estimates were lacking. This study presents revised estimates of the permafrost SOC pool, including quantitative uncertainty estimates, in the 0-3 m depth range in soils as well as for deeper sediments (>3 m) in deltaic deposits of major rivers and in the Yedoma region of Siberia and Alaska. The revised estimates are based on significantly larger databases compared to previous studies: the number of individual sites/pedons has increased by a factor of 8-11 for soils in the 1-3 m depth range, a factor of 8 for deltaic alluvium, and a factor of 5 for Yedoma region deposits. Upscaled based on regional soil maps, estimated permafrost region SOC stocks are 217 ± 15 and 472 ± 34 Pg for the 0-0.3 m and 0-1 m soil depths, respectively (±95% confidence intervals). Depending on the regional subdivision used to upscale 1-3 m soils (following physiography or continents), estimated 0-3 m SOC storage is 1034 ± 183 Pg or 1104 ± 133 Pg. Of this, 34 ± 16 Pg C is stored in thin soils of the High Arctic. Based on generalised calculations, storage of SOC in deep deltaic alluvium (>3 m to ≤60 m depth) of major Arctic rivers is estimated at 91 ± 39 Pg (of which 69 ± 34 Pg is in permafrost). In the Yedoma region, estimated >3 m SOC stocks are 178 +140/-146 Pg, of which 74 +54/-57 Pg is stored in intact, frozen Yedoma (late Pleistocene ice- and organic-rich silty sediments) with the remainder in refrozen thermokarst deposits (16th/84th percentiles of bootstrapped estimates). A total estimated mean storage for the permafrost region of ca. 1300-1370 Pg with an uncertainty range of 930-1690 Pg encompasses the combined revised estimates. Of this, ~819-836 Pg is perennially frozen. While some components of the revised SOC stocks are similar in magnitude to those previously reported for this region, there are also substantial differences in individual components. There is evidence of remaining regional data-gaps. Estimates remain particularly poorly constrained for soils in the High Arctic region and physiographic regions with thin sedimentary overburden (mountains, highlands and plateaus) as well as for >3 m depth deposits in deltas and the Yedoma region.

Hugelius, G.; Strauss, J.; Zubrzycki, S.; Harden, J. W.; Schuur, E. A. G.; Ping, C. L.; Schirrmeister, L.; Grosse, G.; Michaelson, G. J.; Koven, C. D.; O'Donnell, J. A.; Elberling, B.; Mishra, U.; Camill, P.; Yu, Z.; Palmtag, J.; Kuhry, P.

2014-03-01

186

Quantifying intrachromosomal GC heterogeneity in prokaryotic genomes  

Microsoft Academic Search

The sequencing of prokaryotic genomes covering a wide taxonomic range has sparked renewed interest in intrachromosomal compositional (GC) heterogeneity, largely in view of lateral transfers. We present here a brief overview of some methods for visualizing and quantifying GC variation in prokaryotes. We used these methods to examine heterogeneity levels in sequenced prokaryotes, for a range of scales or stringencies.

Pedro Bernaola-Galvan; Giorgio Bernardi

187

Quantifying the Reuse of Learning Objects  

ERIC Educational Resources Information Center

This paper reports the findings of one case study from a larger project, which aims to quantify the claimed efficiencies of reusing learning objects to develop e-learning resources. The case study describes how an online inquiry project "Diabetes: A waste of energy" was developed by searching for, evaluating, modifying and then integrating as many…

Elliott, Kristine; Sweeney, Kevin

2008-01-01

188

Quantifier Elimination for Finite and Infinite Trees.  

National Technical Information Service (NTIS)

Assuming E(sigma), Clark's equational theory for a set of function symbols sigma, and DCA, the domain closure axiom for sigma, the paper investigates the possibility of quantifier elimination for the formulae made up of function symbols in sigma, the ...

T. Sato

1989-01-01

189

Existentially Quantified Type Classes

E-print Network

... framework for existential types and type classes. In contrast to Läufer's original proposal our system ... We start by reviewing Läufer's and Odersky's [10] extension of ML with existential types. We use

Sulzmann, Martin

190

President announces departure

E-print Network

Newsletter cover fragments: quantifying the brain: the adaptable, sociable, physical. President announces departure: Charles W. Steger ... 'and fulfilling experience of my life. Years ago, I left a career in the private sector to pursue my passion ...' In mid-October, look for an in-depth exploration of Steger's legacy.

Beex, A. A. "Louis"

191

A method for quantifying rotational symmetry.  

PubMed

Here, a new approach for quantifying rotational symmetry based on vector analysis was described and compared with information obtained from a geometric morphometric analysis and a technique based on distance alone. A new method was developed that generates a polygon from the length and angle data of a structure and then quantifies the minimum change necessary to convert that polygon into a regular polygon. This technique yielded an asymmetry score (s) that can range from 0 (perfect symmetry) to 1 (complete asymmetry). Using digital images of Geranium robertianum flowers, this new method was compared with a technique based on lengths alone and with established geometric morphometric methods used to quantify shape variation. Asymmetry scores (s) more clearly described variation in symmetry and were more consistent with a visual assessment of the images than either comparative technique. This procedure is the first to quantify the asymmetry of radial structures accurately, uses easily obtainable measures to calculate the asymmetry score and allows comparisons among individuals and species, even when the comparisons involve structures with different patterns of symmetry. This technique enables the rigorous analysis of polysymmetric structures and provides a foundation for a better understanding of symmetry in nature. PMID:17688593
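A schematic version of such a score can be written down directly from arm lengths and inter-arm angles; the normalization below is a simplified stand-in for the authors' vector-based computation, chosen only so that 0 means perfect rotational symmetry and values grow toward 1 with distortion.

```python
import numpy as np

def asymmetry_score(lengths, angles_deg):
    """Asymmetry s in [0, 1] from deviations of arm lengths and inter-arm
    angles relative to a regular polygon (simplified illustrative formula)."""
    r = np.asarray(lengths, float)
    a = np.asarray(angles_deg, float)
    k = r.size
    length_dev = np.abs(r / r.mean() - 1).mean()
    angle_dev = np.abs(a - 360.0 / k).mean() / (360.0 / k)
    return min(1.0, 0.5 * (length_dev + angle_dev))

# Five-armed structure (e.g., a Geranium flower): regular vs. distorted.
print(asymmetry_score([10, 10, 10, 10, 10], [72, 72, 72, 72, 72]))  # ~0.0
print(asymmetry_score([12, 9, 10, 11, 8], [80, 65, 70, 75, 70]))    # > 0
```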

Frey, Frank M; Robertson, Aaron; Bukoski, Michael

2007-01-01

192

Quantifying Subsurface Drainage using the Variable Infiltration Capacity Model

E-print Network

Presentation slides (Sarah Rutkowski): Estimation of tile drainage at the watershed and regional scales will lead to improved best ... for large-scale modeling of tile drainage. Sections: Introduction; Objectives; Today's Presentation. Use

Cherkauer, Keith

193

Quantifying Ecosystem Controls and Their Contextual Interactions on Nutrient Export from Developing

E-print Network

The complexity of natural ecosystems makes it difficult to compare the relative importance of abiotic and biotic factors and to assess the effects of their interactions on ecosystem development. To improve our

Wang, Deane

194

Quantify Prostate Cancer by Automated Histomorphometry  

Microsoft Academic Search

A new method is presented to quantify malignant changes in histological sections of prostate tissue immunohistochemically stained for prostate-specific antigen (PSA) by means of image processing. The morphological analysis of the prostate tissue uses the solidity of PSA- positive prostate tissue segments to compute a quantitative measure that turns out highly correlated with scores obtained from routine diagnosis (Gleason, Dhom).

Ulf-Dietrich Braumann; Jens-Peer Kuska; Markus Löffler; Nicolas Wernert

2008-01-01

195

Quantifying precipitation suppression due to air pollution

E-print Network

First author: Amir Givati, The Hebrew University, January 2004. Urban and industrial air pollution has been shown qualitatively to suppress precipitation. The evidence suggests that air pollution aerosols that are incorporated in orographic clouds slow down cloud

Li, Zhanqing

196

Digital Optical Method to quantify the visual opacity of fugitive plumes  

NASA Astrophysics Data System (ADS)

Fugitive emissions of particulate matter (PM) raise public concerns due to their adverse impacts on human health and atmospheric visibility. Although the United States Environmental Protection Agency (USEPA) has not developed a standard method for quantifying the opacities of fugitive plumes, select states have developed human vision-based opacity methods for such applications. A digital photographic method, Digital Optical Method for fugitive plumes (DOMfugitive), is described herein for quantifying the opacities of fugitive plume emissions. Field campaigns were completed to evaluate this method by driving vehicles on unpaved roads to generate dust plumes. DOMfugitive was validated by performing simultaneous measurements using a co-located laser transmissometer. For 84% of the measurements, the individual absolute opacity difference values between the two methods were ≤15%. The average absolute opacity difference for all the measurements was 8.5%. The paired t-test showed no significant difference between the two methods at 99% confidence level. Comparisons of wavelength dependent opacities with grayscale opacities indicated that DOMfugitive was not sensitive to the wavelength in the visible spectrum evaluated during these field campaigns. These results encourage the development of a USEPA standard method for quantifying the opacities of fugitive PM plumes using digital photography, as an alternative to human-vision based approaches.
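Digital opacity methods of this family commonly estimate plume transmission from the contrast of the plume viewed against a bright and a dark background; the sketch below uses that generic two-background model with hypothetical 8-bit pixel values, and is not necessarily the exact DOMfugitive algorithm.

```python
def plume_opacity(pb: float, pd: float, b: float, d: float) -> float:
    """Opacity = 1 - transmission, with transmission estimated as the ratio
    of plume contrast to background contrast: t = (pb - pd) / (b - d).
    pb/pd: mean plume pixel values over bright/dark backgrounds;
    b/d: mean pixel values of the backgrounds themselves."""
    return 1.0 - (pb - pd) / (b - d)

print(f"opacity = {plume_opacity(pb=180, pd=60, b=230, d=20):.2f}")  # ~0.43
```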

Du, Ke; Shi, Peng; Rood, Mark J.; Wang, Kai; Wang, Yang; Varma, Ravi M.

2013-10-01

197

DOE: Quantifying the Value of Hydropower in the Electric Grid  

SciTech Connect

The report summarizes research to Quantify the Value of Hydropower in the Electric Grid. This 3-year DOE study focused on defining the value of hydropower assets in a changing electric grid. Methods are described for valuation and planning of pumped storage and conventional hydropower. The project team conducted plant case studies, electric system modeling, market analysis, cost data gathering, and evaluations of operating strategies and constraints. Five other reports detailing these research results are available at the project website, www.epri.com/hydrogrid. With increasing deployment of wind and solar renewable generation, many owners, operators, and developers of hydropower have recognized the opportunity to provide more flexibility and ancillary services to the electric grid. To quantify the value of services, this study focused on the Western Electric Coordinating Council region. A security-constrained, unit commitment and economic dispatch model was used to quantify the role of hydropower for several future energy scenarios up to 2020. This hourly production simulation considered transmission requirements to deliver energy, including future expansion plans. Both energy and ancillary service values were considered. Addressing specifically the quantification of pumped storage value, no single value stream dominated predicted plant contributions in the various energy futures. Modeling confirmed that service value depends greatly on location and on competition with other available grid support resources. In this summary, ten different value streams related to hydropower are described. These fall into three categories: operational improvements, new technologies, and electricity market opportunities. Of these ten, the study was able to quantify a monetary value for six by applying both present-day and future scenarios for operating the electric grid. This study confirmed that hydropower resources across the United States contribute significantly to operation of the grid in terms of energy, capacity, and ancillary services. Many potential improvements to existing hydropower plants were found to be cost-effective. Pumped storage is the most likely form of large new hydro asset expansion in the U.S.; however, justifying investments in new pumped storage plants remains very challenging with current electricity market economics. Even over a wide range of possible energy futures up to 2020, no energy future was found to bring quantifiable revenues sufficient to cover the estimated costs of plant construction. Value streams not quantified in this study may provide a different cost-benefit balance and an economic tipping point for hydro. Future studies are essential in the quest to quantify the full potential value. Additional research should consider the value of services provided by advanced storage hydropower and pumped storage at smaller time steps for integration of variable renewable resources, and should include all possible value streams, such as capacity value and portfolio benefits, i.e., reducing cycling of traditional generation.

None

2012-12-31

198

Entropy generation method to quantify thermal comfort  

NASA Technical Reports Server (NTRS)

The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and quantify thermal comfort. The approach involves development of an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and the entropy generation term. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite element based KSU human thermal computer model is utilized as a "Computational Environmental Chamber" to conduct a series of simulations examining the human thermal responses to different environmental conditions. The simulation output, consisting of human thermal responses, and the input data, consisting of environmental conditions, are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The Objective Thermal Comfort values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values; the PMV values are generated by inserting into the regression equation the air temperatures and vapor pressures used in the computer simulations. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. However, an experimental study is needed to fully establish the validity of the OTCI formula and the model. One practical application of this index is that it could be integrated into thermal control systems to develop human-centered environmental control systems for potential use in aircraft, mass transit vehicles, intelligent building systems, and space vehicles.

Boregowda, S. C.; Tiwari, S. N.; Chaturvedi, S. K.

2001-01-01

199

Quantifying Clinical Data Quality Using Relative Gold Standards  

PubMed Central

As the use of detailed clinical data expands for strategic planning, clinical quality measures, and research, the quality of the data contained in source systems, such as electronic medical records, becomes more critical. Methods to quantify and monitor clinical data quality in large operational databases involve a set of predefined data quality queries that attempt to detect data anomalies such as missing or unrealistic values based on meta-knowledge about a data domain. However, descriptive data elements, such as patient race, cannot be assessed using these methods. We present a novel approach leveraging existing intra-institutional databases with differing data quality for the same data element to quantify data quality for descriptive data. Using the concept of a relative gold standard, we show how this method can be used to assess data quality in enterprise clinical databases. PMID:21347000
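
The relative-gold-standard idea can be sketched in a few lines: for a descriptive field such as patient race, agreement with a higher-quality intra-institutional database serves as the quality measure for the lower-quality source. The field values and patient identifiers below are hypothetical.

```python
# Sketch: quality of a descriptive field measured as agreement with a
# higher-quality ("relative gold standard") database, matched by patient ID.
emr = {"p1": "White", "p2": "Black", "p3": "Unknown", "p4": "Asian"}
registry = {"p1": "White", "p2": "White", "p3": "Black", "p4": "Asian"}

shared = emr.keys() & registry.keys()
agree = sum(emr[p] == registry[p] for p in shared)
print(f"race-field agreement: {agree}/{len(shared)} = {agree/len(shared):.2f}")
```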

Kahn, Michael G.; Eliason, Brian B.; Bathurst, Janet

2010-01-01

200

A Photometric Method for Quantifying Asymmetries in Disk Galaxies  

E-print Network

A photometric method for quantifying deviations from axisymmetry in optical images of disk galaxies is applied to a sample of 32 face-on and nearly face-on spirals. The method involves comparing the relative fluxes contained within trapezoidal sectors arranged symmetrically about the galaxy center of light, excluding the bulge and/or barred regions. Such a method has several advantages over others, especially when quantifying asymmetry in flocculent galaxies. Specifically, the averaging of large regions improves the signal-to-noise in the measurements; the method is not strongly affected by the presence of spiral arms; and it identifies the kinds of asymmetry that are likely to be dynamically important. Application of this "method of sectors" to R-band images of 32 disk galaxies indicates that about 30% of spirals show deviations from axisymmetry at the 5-sigma level.
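
A rough sketch of such a sector-based asymmetry measure follows, using angular wedges within an annulus (excluding the bulge region) as a stand-in for the paper's trapezoidal sectors; the 5-sigma significance test and center-of-light determination are omitted.

```python
# Sketch: sum flux in angular wedges about the galaxy center and compare
# diametrically opposite wedges; wedges approximate trapezoidal sectors.
import numpy as np

def sector_asymmetry(image, center, n_sectors=8, r_min=5.0, r_max=50.0):
    ny, nx = image.shape
    y, x = np.mgrid[0:ny, 0:nx]
    dx, dy = x - center[0], y - center[1]
    r = np.hypot(dx, dy)
    theta = np.mod(np.arctan2(dy, dx), 2 * np.pi)
    annulus = (r >= r_min) & (r < r_max)      # excludes the bulge region
    edges = np.linspace(0, 2 * np.pi, n_sectors + 1)
    flux = np.array([image[annulus & (theta >= lo) & (theta < hi)].sum()
                     for lo, hi in zip(edges[:-1], edges[1:])])
    opposite = np.roll(flux, n_sectors // 2)
    return np.mean(np.abs(flux - opposite) / (flux + opposite))

rng = np.random.default_rng(0)
img = rng.poisson(10.0, size=(128, 128)).astype(float)  # symmetric noise field
print(f"asymmetry of a symmetric noise field: {sector_asymmetry(img, (64, 64)):.3f}")
```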

David A. Kornreich; Martha P. Haynes; Richard V. E. Lovelace

1998-07-29

201

Quantifiers more or less quantify online: ERP evidence for partial incremental interpretation  

PubMed Central

Event-related brain potentials were recorded during RSVP reading to test the hypothesis that quantifier expressions are incrementally interpreted fully and immediately. In sentences tapping general knowledge (Farmers grow crops/worms as their primary source of income), Experiment 1 found larger N400s for atypical (worms) than typical objects (crops). Experiment 2 crossed object typicality with non-logical subject-noun phrase quantifiers (most, few). Off-line plausibility ratings exhibited the crossover interaction predicted by full quantifier interpretation: Most farmers grow crops and Few farmers grow worms were rated more plausible than Most farmers grow worms and Few farmers grow crops. Object N400s, although modulated in the expected direction, did not reverse. Experiment 3 replicated these findings with adverbial quantifiers (Farmers often/rarely grow crops/worms). Interpretation of quantifier expressions thus is neither fully immediate nor fully delayed. Furthermore, object atypicality was associated with a frontal slow positivity in few-type/rarely quantifier contexts, suggesting systematic processing differences among quantifier types. PMID:20640044

Urbach, Thomas P.; Kutas, Marta

2010-01-01

202

The Arizona Sun Corridor: Quantifying climatic implications of megapolitan development  

NASA Astrophysics Data System (ADS)

The local and regional-scale hydro-climatic impacts of land use and land cover change (LULCC) that result from urbanization require attention in light of future urban growth projections and related concerns for environmental sustainability. This is an especially serious issue over the southwestern U.S. where mounting pressure on the area’s natural desert environment and increasingly limited resources (e.g. water) exists, and is likely to worsen, due to unrelenting sprawl and associated urbanization. While previous modeling results have shown the degree to which the built environment has contributed to the region’s warming summertime climate, we use projections of future landscape change over the rapidly urbanizing Arizona Sun Corridor - an anticipated stretch of urban expanse that includes current metro Phoenix and Tucson - as surface boundary conditions to conduct high-resolution (order of 1-km) numerical simulations, over the seasonal timescale, to quantify the climatic effect of this relentlessly growing and increasingly vulnerable region. We use the latest version of the WRF modeling system to take advantage of several new capabilities, including a newly implemented nesting method used to refine the vertical mesh, and a comprehensive multi-story urban canopy scheme. We quantify the impact of projected (circa 2050) Sun Corridor megapolitan area on further development of the urban heat island (UHI), assess changes in the surface energy budget, with important implications for the near surface temperature and stability, and discuss modeled impacts on regional rainfall. Lastly, simulated effects are compared with projected warming due to increasing greenhouse gases (the GCMs from which these results are obtained currently do not take into account effects of urbanizing regions) and quantify the degree to which LULCC over the Arizona Sun Corridor will exacerbate regional anthropogenic climate change. A number of potential mitigation strategies are discussed (including effects of renewable energy), the simulated impact on anthropogenic heat production is quantified, and the degree to which future warming may be offset is estimated.

Georgescu, M.; Moustaoui, M.; Mahalov, A.

2010-12-01

203

Quantifying Nitrogen Loss From Flooded Hawaiian Taro Fields  

NASA Astrophysics Data System (ADS)

In 2004 a field fertilization experiment showed that approximately 80% of the fertilizer nitrogen (N) added to flooded Hawaiian taro (Colocasia esculenta) fields could not be accounted for using classic N balance calculations. To quantify N loss through denitrification and anaerobic ammonium oxidation (anammox) pathways in these taro systems we utilized a slurry-based isotope pairing technique (IPT). Measured nitrification rates and porewater N profiles were also used to model ammonium and nitrate fluxes through the top 10 cm of soil. Quantitative PCR of nitrogen cycling functional genes was used to correlate porewater N dynamics with potential microbial activity. Rates of denitrification calculated using porewater profiles were compared to those obtained using the slurry method. Potential denitrification rates of surficial sediments obtained with the slurry method were found to drastically overestimate the calculated in-situ rates. The largest discrepancies were present in fields greater than one month after initial fertilization, reflecting a microbial community poised to denitrify the initial N pulse. Potential surficial nitrification rates varied between 1.3% of the slurry-measured denitrification potential in a heavily-fertilized site to 100% in an unfertilized site. Compared to the use of urea, fish bone meal fertilizer use resulted in decreased N loss through denitrification in the surface sediment, according to both porewater modeling and IPT measurements. In addition, sub-surface porewater profiles point to root-mediated coupled nitrification/denitrification as a potential N loss pathway that is not captured in surface-based incubations. Profile-based surface plus subsurface coupled nitrification/denitrification estimates were between 1.1 and 12.7 times denitrification estimates from the surface only. These results suggest that the use of a ‘classic’ isotope pairing technique that employs 15NO3- in fertilized agricultural systems can lead to a drastic overestimation of in-situ denitrification rates and that root-associated subsurface coupled nitrification/denitrification may be a major N loss pathway in these flooded agricultural systems.
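
The core isotope pairing calculation referred to above follows, in its classic form (after Nielsen 1992), from the measured production rates of the two labeled N2 species; whether the authors applied further corrections is not stated in the abstract, so this is the generic calculation only.

```python
# Classic isotope pairing calculation: denitrification of added 15NO3- (d15)
# and of ambient 14NO3- (d14) from production rates of 29N2 and 30N2.
# Units (e.g., umol N m-2 h-1) and the example rates are hypothetical.
def isotope_pairing(p29, p30):
    d15 = p29 + 2.0 * p30            # N atoms derived from the 15N label
    d14 = d15 * p29 / (2.0 * p30)    # ambient rate, assuming random pairing
    return d14, d15

d14, d15 = isotope_pairing(p29=1.8, p30=0.6)
print(f"D14 = {d14:.2f}, D15 = {d15:.2f}, total = {d14 + d15:.2f}")
```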

Deenik, J. L.; Penton, C. R.; Bruland, G. L.; Popp, B. N.; Engstrom, P.; Mueller, J. A.; Tiedje, J.

2010-12-01

204

Quantifying thermal modifications on laser welded skin tissue  

NASA Astrophysics Data System (ADS)

Laser tissue welding is a potential medical treatment method especially on closing cuts implemented during any kind of surgery. Photothermal effects of laser on tissue should be quantified in order to determine optimal dosimetry parameters. Polarized light and phase contrast techniques reveal information about extend of thermal change over tissue occurred during laser welding application. Change in collagen structure in skin tissue stained with hematoxilen and eosin samples can be detected. In this study, three different near infrared laser wavelengths (809 nm, 980 nm and 1070 nm) were compared for skin welding efficiency. 1 cm long cuts were treated spot by spot laser application on Wistar rats' dorsal skin, in vivo. In all laser applications, 0.5 W of optical power was delivered to the tissue, 5 s continuously, resulting in 79.61 J/cm2 energy density (15.92 W/cm2 power density) for each spot. The 1st, 4th, 7th, 14th, and 21st days of recovery period were determined as control days, and skin samples needed for histology were removed on these particular days. The stained samples were examined under a light microscope. Images were taken with a CCD camera and examined with imaging software. 809 Nm laser was found to be capable of creating strong full-thickness closure, but thermal damage was evident. The thermal damage from 980 nm laser welding was found to be more tolerable. The results showed that 1070 nm laser welding produced noticeably stronger bonds with minimal scar formation.

Tabakoglu, Hasim Ö.; Gülsoy, Murat

2011-02-01

205

Quantifying plasticity-independent creep compliance and relaxation of viscoelastoplastic materials under contact loading  

E-print Network

Here we quantify the time-dependent mechanical properties of a linear viscoelastoplastic material under contact loading. For contact load relaxation, we showed that the relaxation modulus can be measured independently of ...

Vandamme, Matthieu

206

Quantifying the underlying landscape and paths of cancer.  

PubMed

Cancer is a disease regulated by the underlying gene networks. The emergence of normal and cancer states, as well as the transformation between them, can be thought of as a result of the gene network interactions and associated changes. We developed a global potential landscape and path framework to quantify cancer and associated processes. We constructed a cancer gene regulatory network based on experimental evidence and uncovered the underlying landscape. The resulting tristable landscape characterizes important biological states: normal, cancer and apoptosis. The landscape topography, in terms of barrier heights between stable state attractors, quantifies the global stability of the cancer network system. We propose two mechanisms of cancerization: one is by changes of the landscape topography through changes in the regulation strengths of the gene networks. The other is by fluctuations that help the system go over the critical barrier at fixed landscape topography. The kinetic paths from the least action principle quantify the transition processes among the normal state, cancer state and apoptosis state. The kinetic rates provide the quantification of transition speeds among normal, cancer and apoptosis attractors. By global sensitivity analysis of the gene network parameters on the landscape topography, we uncovered some key gene regulations determining the transitions between cancer and normal states. This can be used to guide the design of new anti-cancer tactics, through a cocktail strategy of targeting multiple key regulation links simultaneously, for preventing cancer occurrence or transforming the early cancer state back to the normal state. PMID:25232051
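
The notion of barrier height can be made concrete with a toy one-dimensional landscape: two attractors separated by a saddle, with stability read off as the potential difference between each minimum and the saddle. The tilted double-well potential below is purely illustrative; in the paper the landscape is derived from the stochastic dynamics of the gene network, not assumed.

```python
# Toy landscape: two attractors ("normal", "cancer") in a 1-D potential;
# global stability is quantified by barrier height, U(saddle) - U(minimum).
import numpy as np

x = np.linspace(-2.0, 2.0, 4001)
U = x**4 - 2.0 * x**2 + 0.3 * x          # hypothetical tilted double well

# locate the two local minima and the saddle (maximum) between them
i_min = [i for i in range(1, len(x) - 1) if U[i] < U[i-1] and U[i] < U[i+1]]
i_lo, i_hi = i_min[0], i_min[-1]
i_saddle = i_lo + int(np.argmax(U[i_lo:i_hi]))

for name, i in (("normal", i_lo), ("cancer", i_hi)):   # labels illustrative
    print(f"{name:6s} state at x={x[i]:+.2f}, barrier = {U[i_saddle] - U[i]:.3f}")
```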

Li, Chunhe; Wang, Jin

2014-11-01

207

Quantifying uncertainty in state and parameter estimation  

NASA Astrophysics Data System (ADS)

Observability of state variables and parameters of a dynamical system from an observed time series is analyzed and quantified by means of the Jacobian matrix of the delay coordinates map. For each state variable and each parameter to be estimated, a measure of uncertainty is introduced depending on the current state and parameter values, which allows us to identify regions in state and parameter space where the specific unknown quantity can(not) be estimated from a given time series. The method is demonstrated using the Ikeda map and the Hindmarsh-Rose model.

Parlitz, Ulrich; Schumann-Bischoff, Jan; Luther, Stefan

2014-05-01

208

Effect of soil structure on the growth of bacteria in soil quantified using CARD-FISH  

NASA Astrophysics Data System (ADS)

It has been reported that compaction of soil due to the use of heavy machinery has resulted in the reduction of crop yield. Compaction affects the physical properties of soil such as bulk density, soil strength, and porosity. This causes an alteration in the soil structure which limits the mobility of nutrients, water and air infiltration, and root penetration in soil. Several studies have been conducted to explore the effect of soil compaction on plant growth and development. However, there is scant information on the effect of soil compaction on the microbial community and its activities in soil. Understanding the effect of soil compaction on the microbial community is essential, as microbial activities are very sensitive to abrupt environmental changes in soil. Therefore, the aim of this work was to investigate the effect of soil structure on the growth of bacteria in soil. The bulk density of soil was used as a soil physical parameter to quantify the effect of soil compaction. To detect and quantify bacteria in soil, the method of catalyzed reporter deposition-fluorescence in situ hybridization (CARD-FISH) was used. This technique results in high-intensity fluorescent signals which make it easy to quantify bacteria against the high levels of autofluorescence emitted by soil particles and organic matter. In this study, the bacterial strains Pseudomonas fluorescens SBW25 and Bacillus subtilis DSM10 were used. Soils of aggregate size 2-1 mm were packed at five different bulk densities in polyethylene rings (4.25 cm3). The soil rings were sampled on four different days. Results showed that the total number of bacteria counts was reduced significantly (P

Juyal, Archana; Eickhorst, Thilo; Falconer, Ruth; Otten, Wilfred

2014-05-01

209

Quantifying Position-Dependent Codon Usage Bias  

PubMed Central

Although the mapping of codon to amino acid is conserved across nearly all species, the frequency at which synonymous codons are used varies both between organisms and between genes from the same organism. This variation affects diverse cellular processes including protein expression, regulation, and folding. Here, we mathematically model an additional layer of complexity and show that individual codon usage biases follow a position-dependent exponential decay model with unique parameter fits for each codon. We use this methodology to perform an in-depth analysis on codon usage bias in the model organism Escherichia coli. Our methodology shows that lowly and highly expressed genes are more similar in their codon usage patterns in the 5′-gene regions, but that these preferences diverge at distal sites resulting in greater positional dependency (pD, which we mathematically define later) for highly expressed genes. We show that position-dependent codon usage bias is partially explained by the structural requirements of mRNAs that results in increased usage of A/T rich codons shortly after the gene start. However, we also show that the pD of 4- and 6-fold degenerate codons is partially related to the gene copy number of cognate-tRNAs supporting existing hypotheses that posit benefits to a region of slow translation in the beginning of coding sequences. Lastly, we demonstrate that viewing codon usage bias through a position-dependent framework has practical utility by improving accuracy of gene expression prediction when incorporating positional dependencies into the Codon Adaptation Index model. PMID:24710515
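
The position-dependent decay model can be sketched as a simple curve fit: for each codon, frequency as a function of codon position is fit to an exponential decay with codon-specific parameters. The data below are synthetic; in practice the frequencies would come from position-aligned open reading frames, and the exact parameterization used by the authors may differ.

```python
# Sketch: fit f(p) = a * exp(-p / tau) + c to a codon's usage frequency
# by position, with synthetic data standing in for aligned ORF counts.
import numpy as np
from scipy.optimize import curve_fit

def decay(p, a, tau, c):
    return a * np.exp(-p / tau) + c

pos = np.arange(1, 201)                  # codon position within the ORF
rng = np.random.default_rng(1)
freq = decay(pos, a=0.04, tau=25.0, c=0.02) + rng.normal(0, 0.002, pos.size)

(a, tau, c), _ = curve_fit(decay, pos, freq, p0=(0.05, 10.0, 0.01))
print(f"a={a:.3f}, tau={tau:.1f} codons, baseline c={c:.3f}")
```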

Hockenberry, Adam J.; Sirer, M. Irmak; Amaral, Luis A. Nunes; Jewett, Michael C.

2014-01-01

210

Quantifying Dirac hydrogenic effects via complexity measures  

NASA Astrophysics Data System (ADS)

The primary dynamical Dirac relativistic effects can only be seen in hydrogenic systems without the complications introduced by electron-electron interactions in many-electron systems. They are known to be the contraction towards the origin of the electronic charge in hydrogenic systems and the nodal disappearance (because of the raising of all the nonrelativistic minima) in the electron density of the excited states of these systems. In addition we point out the (largely ignored) gradient reduction of the charge density near and far from the nucleus. In this work we quantify these effects by means of single (Fisher information) and composite [Fisher-Shannon complexity and plane, López-Ruiz, Mancini, and Calbet (LMC) complexity] information-theoretic measures. While the Fisher information measures the gradient content of the density, the (dimensionless) composite information-theoretic quantities grasp twofold facets of the electronic distribution: The Fisher-Shannon complexity measures the combined balance of the gradient content and the total extent of the electronic charge, and the LMC complexity quantifies the disequilibrium jointly with the spreading of the density in the configuration space. Opposite to other complexity notions (e.g., computational and algorithmic complexities), these two quantities describe intrinsic properties of the system because they do not depend on the context but are functionals of the electron density. Moreover, they are closely related to the intuitive notion of complexity because they are minimum for the two extreme (or least complex) distributions of perfect order and maximum disorder.
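
For reference, the measures named above are commonly defined as follows for a position-space electron density ρ(r) normalized to unity; conventions differ by constant factors across the literature, so these should be read as the typical forms rather than the paper's exact normalization:

```latex
F[\rho] = \int \frac{|\nabla \rho(\mathbf{r})|^{2}}{\rho(\mathbf{r})}\, d\mathbf{r},
\qquad
S[\rho] = -\int \rho(\mathbf{r}) \ln \rho(\mathbf{r})\, d\mathbf{r},
\qquad
D[\rho] = \int \rho^{2}(\mathbf{r})\, d\mathbf{r},
```

with the composite quantities built from them:

```latex
C_{\mathrm{FS}}[\rho] = F[\rho]\cdot \frac{1}{2\pi e}\, e^{2S[\rho]/3},
\qquad
C_{\mathrm{LMC}}[\rho] = D[\rho]\cdot e^{S[\rho]}.
```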

Bouvrie, P. A.; López-Rosa, S.; Dehesa, J. S.

2012-07-01

211

Quantifying litter decomposition losses to dissolved organic carbon and respiration  

NASA Astrophysics Data System (ADS)

As litter decomposes its carbon is lost from the litter layer, largely through microbial processing. However, much of the carbon lost from the surface litter layer during decomposition is not truly lost from the ecosystem but gets transferred to the soil through fragmentation and leaching of dissolved organic carbon (DOC). This DOC in the soil acts as a stock of soil organic matter (SOM) to be utilized by soil microbes, stabilized in the soil, or leached further through the soil profile. The total amount of C that ends up leaching from litter to the soil, as well as its chemical composition, has important implications on the residence time of decomposing litter C in the soil and is not currently well parameterized in models. In this study we aim to quantify the proportional relationship between CO2 efflux and DOC partitioning during decomposition of fresh leaf litter with distinct structural and chemical composition. The results from this one-year laboratory incubation show a clear relationship between the lignin to cellulose ratios of litter and DOC to CO2 partitioning during four distinct phases of litter decomposition. For example, bluestem grass litter with a low lignin to cellulose ratio loses almost 50% of its C as DOC whereas pine needles with a high lignin to cellulose ratio loses only 10% of its C as DOC, indicating a potential ligno-cellulose complexation effect on carbon use efficiency during litter decomposition. DOC production also decreases with time during decomposition, correlating with increasing lignin to cellulose ratios as decomposition progresses. Initial DOC leaching can be predicted based on the amount of labile fraction in each litter type. Field data using stable isotope labeled bluestem grass show that about 18% of the surface litter C lost in 18 months of decomposition is stored in the soil, and that over 50% of this is recovered in mineral-associated heavy SOM fractions, not as litter fragments, confirming the relative importance of the DOC flux of C from the litter layer to the soil for stable SOM formation. These results are being used to parameterize a new litter decomposition sub-model to more accurately represent the movement of decomposing surface litter C to CO2 and the mineral soil. This surface litter sub-model can be used to strengthen our understanding of the litter C and microbial processes that feed into larger ecosystem models such as Daycent.

Soong, J.; Parton, W. J.; Calderon, F. J.; Guilbert, K.; Cotrufo, M.

2013-12-01

212

Crisis of Japanese Vascular Flora Shown By Quantifying Extinction Risks for 1618 Taxa  

PubMed Central

Although many people have expressed alarm that we are witnessing a mass extinction, few projections have been quantified, owing to limited availability of time-series data on threatened organisms, especially plants. To quantify the risk of extinction, we need to monitor changes in population size over time for as many species as possible. Here, we present the world's first quantitative projection of plant species loss at a national level, with stochastic simulations based on the results of population censuses of 1618 threatened plant taxa in 3574 map cells of ca. 100 km2. More than 500 lay botanists helped monitor those taxa in 1994–1995 and in 2003–2004. We projected that between 370 and 561 vascular plant taxa will go extinct in Japan during the next century if past trends of population decline continue. This extinction rate is approximately two to three times the global rate. Using time-series data, we show that existing national protected areas (PAs) covering ca. 7% of Japan will not adequately prevent population declines: even core PAs can protect at best <60% of local populations from decline. Thus, the Aichi Biodiversity Target to expand PAs to 17% of land (and inland water) areas, as committed to by many national governments, is not enough: only 29.2% of currently threatened species will become non-threatened under the assumption that probability of protection success by PAs is 0.5, which our assessment shows is realistic. In countries where volunteers can be organized to monitor threatened taxa, censuses using our method should be able to quantify how fast we are losing species and to assess how effective current conservation measures such as PAs are in preventing species extinction. PMID:24922311

Kadoya, Taku; Takenaka, Akio; Ishihama, Fumiko; Fujita, Taku; Ogawa, Makoto; Katsuyama, Teruo; Kadono, Yasuro; Kawakubo, Nobumitsu; Serizawa, Shunsuke; Takahashi, Hideki; Takamiya, Masayuki; Fujii, Shinji; Matsuda, Hiroyuki; Muneda, Kazuo; Yokota, Masatsugu; Yonekura, Koji; Yahara, Tetsukazu

2014-01-01

214

Quantifying Lead-Time Bias in Risk-Factor Studies of Cancer through Simulation  

PubMed Central

Purpose: Lead-time is inherent in early detection and creates bias in observational studies of screening efficacy, but its potential to bias effect estimates in risk-factor studies is not always recognized. We describe a form of this bias that conventional analyses cannot address and develop a model to quantify it. Methods: Surveillance Epidemiology and End Results (SEER) data form the basis for estimates of age-specific preclinical incidence, and log-normal distributions describe the preclinical duration distribution. Simulations assume a joint null hypothesis of no effect of either the risk factor or screening on the preclinical incidence of cancer, and then quantify the bias as the risk-factor odds ratio (OR) from this null study. This bias can be used as a factor to adjust the observed OR in the actual study. Results: For this particular study design, as average preclinical duration increased, the bias in the total-physical-activity OR monotonically increased from 1% to 22% above the null, while the smoking OR monotonically decreased from 1% above the null to 5% below the null. Conclusion: The finding of nontrivial bias in fixed risk-factor effect estimates demonstrates the importance of quantitatively evaluating it in susceptible studies. PMID:23988688
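
The mechanism can be demonstrated with a deliberately stripped-down toy simulation, not the SEER-calibrated model of the paper: under a joint null, a "risk factor" that merely correlates with screening uptake acquires a spurious odds ratio, because screening advances diagnosis by the log-normal preclinical duration and thereby pulls extra cases into a fixed follow-up window. All rates and parameters below are invented for illustration.

```python
# Toy lead-time bias simulation under a joint null: the risk factor does
# not affect onset, but it is correlated with screening, and screening
# removes the (log-normal) preclinical sojourn time before diagnosis.
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
exposed = rng.random(n) < 0.5
screened = rng.random(n) < np.where(exposed, 0.7, 0.3)   # correlation only

onset = rng.uniform(0.0, 40.0, n)                        # preclinical onset (yr)
sojourn = rng.lognormal(mean=1.0, sigma=0.5, size=n)     # preclinical duration
clinical_dx = onset + sojourn                            # diagnosis if unscreened
dx = np.where(screened, onset, clinical_dx)              # screening removes lead time

case = dx <= 30.0                                        # fixed follow-up window

def odds(mask):
    return np.sum(case & mask) / np.sum(~case & mask)

print(f"null OR inflated by lead time: {odds(exposed) / odds(~exposed):.3f}")
```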

Jansen, Rick J.; Alexander, Bruce H.; Anderson, Kristin E.; Church, Timothy R.

2013-01-01

215

Quantifying Proteinuria in Hypertensive Disorders of Pregnancy  

PubMed Central

Background. Progressive proteinuria indicates worsening of the condition in hypertensive disorders of pregnancy, and hence its quantification guides the clinician in decision making and treatment planning. Objective. To evaluate the efficacy of spot dipstick analysis and the urinary protein-creatinine ratio (UPCR) in hypertensive disease of pregnancy for predicting 24-hour proteinuria. Subjects and Methods. A total of 102 patients qualifying the inclusion criteria were evaluated with a preadmission urine dipstick test and UPCR performed on a spot voided sample. After admission, the entire 24-hour urine sample was collected and analysed for daily protein excretion. Dipstick estimation and UPCR were compared to the 24-hour results. Results. Seventy-eight patients (76.5%) had significant proteinuria of more than 300 mg/24 h. The dipstick method showed 59% sensitivity and 67% specificity for prediction of significant proteinuria. The area under the curve for UPCR was 0.89 (95% CI: 0.83 to 0.95, P < 0.001), showing 82% sensitivity and a 12.5% false positive rate for a cutoff value of 0.45. Higher cutoff values (1.46 and 1.83) predicted heavy proteinuria (2 g and 3 g/24 h, resp.). Conclusion. This study suggests that the random urinary protein:creatinine ratio is a reliable investigation compared to the dipstick method to assess proteinuria in hypertensive pregnant women. However, clinical laboratories should standardize the reference values for their setup. PMID:25302114

Amin, Sapna V.; Illipilla, Sireesha; Rai, Lavanya; Kumar, Pratap; Pai, Muralidhar V.

2014-01-01

216

Quantifying nonverbal communicative behavior in face-to-face human dialogues  

NASA Astrophysics Data System (ADS)

The study referred to here is based on the assumption that understanding how humans use nonverbal behavior in dialogues can be very useful in the design of more natural-looking animated talking heads. The goal of the study is twofold: (1) to explore how people use specific facial expressions and head movements to serve important dialogue functions, and (2) to show evidence that it is possible to measure and quantify the extent of these movements with the Qualisys MacReflex motion tracking system. Naturally elicited dialogues between humans have been analyzed, with attention focused on those nonverbal behaviors that serve the very relevant functions of regulating the conversational flux (i.e., turn taking) and producing information about the state of communication (i.e., feedback). The results show that eyebrow raising, head nods, and head shakes are typical signals involved during the exchange of speaking turns, as well as in the production and elicitation of feedback. These movements can be easily measured and quantified, and this measure can be implemented in animated talking heads.

Skhiri, Mustapha; Cerrato, Loredana

2002-11-01

217

3D Wind: Quantifying wind speed and turbulence intensity  

NASA Astrophysics Data System (ADS)

Integrating measurements and modeling of wind characteristics for wind resource assessment and wind farm control is increasingly challenging as the scale of wind farms increases. Even offshore or in relatively homogeneous landscapes, there are significant gradients of both wind speed and turbulence intensity on scales that typify large wind farms. Our project is, therefore, focused on (i) improving methods to integrate remote sensing and in situ measurements with model simulations to produce a 3-dimensional view of the flow field on wind farm scales and (ii) investigating important controls on the spatiotemporal variability of flow fields within the coastal zone. The instrument suite deployed during the field experiments includes 3-D sonic and cup anemometers deployed on meteorological masts and buoys, anemometers deployed on tethersondes and an Unmanned Aerial Vehicle, multiple vertically-pointing continuous-wave lidars, and scanning Doppler lidars. We also integrate data from satellite-borne instrumentation - specifically synthetic aperture radar and scatterometers - and output from the Weather Research and Forecasting (WRF) model. Spatial wind fields and vertical profiles of wind speed from WRF and from the full in situ observational suite exhibited excellent agreement in a proof-of-principle experiment conducted in northern Indiana, particularly during convective conditions, but showed some discrepancies during the breakdown of the nocturnal stable layer. Our second experiment, in May 2013, focused on triangulating a volume above an area of coastal water extending from the port in Cleveland out to an offshore water intake crib (about 5 km) and back to the coast, and includes extremely high resolution WRF simulations designed to characterize the coastal zone. Vertically pointing continuous-wave lidars were operated at each apex of the triangle, while the scanning Doppler lidar scanned out across the water over a 90-degree azimuth range. Preliminary results pertaining to objective (i) indicate good agreement between wind and turbulence profiles to 200 m as measured by the vertically pointing lidars, with the expected modification of the offshore profiles relative to the profiles at the coastline. However, these profiles do not always fully agree with wind speed and direction profiles measured by the scanning Doppler lidar. Further investigation is required to elucidate these results and to analyze whether these discrepancies occur during particular atmospheric conditions. Preliminary results regarding controls on flow in the coastal zone (i.e., objective ii) include clear evidence that the wind profile to 200 m was modified due to swell during unstable conditions, even under moderate to high wind speed conditions. The measurement campaigns will be described in detail, with a view to evaluating optimal strategies for offshore measurement campaigns and in the context of quantifying wind and turbulence in a 3D volume.
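
Turbulence intensity itself has a simple standard definition, TI = sigma_u / mean(u), conventionally computed over 10-minute averaging blocks. The sketch below applies that definition to a synthetic time series standing in for sonic-anemometer or lidar data.

```python
# Sketch: wind speed and turbulence intensity (TI = sigma_u / mean_u)
# from an anemometer time series, evaluated in 10-minute blocks.
import numpy as np

def turbulence_intensity(u, hz=20, block_min=10):
    n = hz * 60 * block_min
    blocks = u[: len(u) // n * n].reshape(-1, n)
    mean = blocks.mean(axis=1)
    return mean, blocks.std(axis=1) / mean

rng = np.random.default_rng(7)
u = 8.0 + 1.2 * rng.standard_normal(20 * 60 * 60)   # one hour at 20 Hz
for m, ti in zip(*turbulence_intensity(u)):
    print(f"U = {m:5.2f} m/s   TI = {ti:.3f}")
```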

Barthelmie, R. J.; Pryor, S. C.; Wang, H.; Crippa, P.

2013-12-01

218

Quantifying Factors That Impact Riverbed Dynamic Permeability at a Riverbank Filtration Facility  

E-print Network

Quantifying Factors That Impact Riverbed Dynamic Permeability at a Riverbank Filtration Facility ... riverbed permeability dynamics associated with riverbank filtration. The results are also expected ... Modeling studies of the Wohler riverbank filtration system on the Russian River, California suggested ...

Hubbard, Susan

219

World Health Organization: Quantifying environmental health impacts  

NSDL National Science Digital Library

The World Health Organization works in a number of public health areas, and their work on quantifying environmental health impacts has been receiving praise from many quarters. This site provides materials on their work in this area and visitors with a penchant for international health relief efforts and policy analysis will find the site invaluable. Along the left-hand side of the site, visitors will find topical sections that include "Methods", "Assessment at national level", "Global estimates", and "Publications". In the "Methods" area, visitors will learn about how the World Health Organization's methodology for studying environmental health impacts has been developed and they can also read a detailed report on the subject. The "Global Estimates" area is worth a look as well, and users can look at their complete report, "Preventing Disease Through Healthy Environments: Towards An Estimate Of the Global Burden Of Disease".

220

Quantifying fault recovery in multiprocessor systems  

NASA Technical Reports Server (NTRS)

Various aspects of reliable computing are formalized and quantified with emphasis on efficient fault recovery. The mathematical model which proves to be most appropriate is provided by the theory of graphs. New measures for fault recovery are developed, and the values of the elements of the fault recovery vector are observed to depend not only on the computation graph H and the architecture graph G, but also on the specific location of a fault. In the examples, a hypercube is chosen as a representative of parallel computer architecture, and a pipeline as a typical configuration for program execution. The dependability qualities of such a system are defined with or without a fault. These qualities are determined by the resiliency triple defined by three parameters: multiplicity, robustness, and configurability. Parameters for measuring the recovery effectiveness are also introduced in terms of distance, time, and the number of new, used, and moved nodes and edges.

Malek, Miroslaw; Harary, Frank

1990-01-01

221

Animal biometrics: quantifying and detecting phenotypic appearance.  

PubMed

Animal biometrics is an emerging field that develops quantified approaches for representing and detecting the phenotypic appearance of species, individuals, behaviors, and morphological traits. It operates at the intersection between pattern recognition, ecology, and information sciences, producing computerized systems for phenotypic measurement and interpretation. Animal biometrics can benefit a wide range of disciplines, including biogeography, population ecology, and behavioral research. Currently, real-world applications are gaining momentum, augmenting the quantity and quality of ecological data collection and processing. However, to advance animal biometrics will require integration of methodologies among the scientific disciplines involved. Such efforts will be worthwhile because the great potential of this approach rests with the formal abstraction of phenomics, to create tractable interfaces between different organizational levels of life. PMID:23537688

Kühl, Hjalmar S; Burghardt, Tilo

2013-07-01

222

Quantifying creativity: can measures span the spectrum?  

PubMed Central

Because cognitive neuroscientists have become increasingly interested in the phenomenon of creativity, the issue arises of how creativity is to be optimally measured. Unlike intelligence, which can be assessed across the full range of intellectual ability, creativity measures tend to concentrate on different sections of the overall spectrum. After first defining creativity in terms of the three criteria of novelty, usefulness, and surprise, this article provides an overview of the available measures. Not only do these instruments vary according to whether they focus on the creative process, person, or product, but they differ regarding whether they tap into "little-c" versus "Big-C" creativity; only productivity and eminence measures reach into genius-level manifestations of the phenomenon. The article closes by discussing whether various alternative assessment techniques can be integrated into a single measure that quantifies creativity across the full spectrum. PMID:22577309

Simonton, Dean Keith

2012-01-01

223

Cascading "Triclick" functionalization of poly(caprolactone) thin films quantified via a quartz crystal microbalance.  

PubMed

A series of mono- and multifunctionalized degradable polyesters bearing various "clickable" groups, including ketone, alkyne, azide, and methyl acrylate (MA), are reported. Using this approach, we demonstrate a cascade strategy to immobilize and quantitate three separate bioactive groups onto poly(caprolactone) (PCL) thin films. The materials are based on tunable copolymer compositions of ε-caprolactone and 2-oxepane-1,5-dione. A quartz crystal microbalance (QCM) was used to quantify the rate and extent of surface conjugation between RGD peptide and polymer thin films using "click" chemistry methods. The results show that alkyne-functionalized polymers have the highest conversion efficiency, followed by MA and azide polymers, while polymer films possessing keto groups are less amenable to surface functionalization. The successful conjugation was further confirmed by static contact angle measurements, with a smaller contact angle correlating directly with lower levels of surface peptide conjugation. QCM results quantify the sequential immobilization of peptides on the PCL thin films and indicate that Michael addition must occur first, followed by azide-alkyne Huisgen cycloadditions. PMID:23795681

Lin, Fei; Zheng, Jukuan; Yu, Jiayi; Zhou, Jinjun; Becker, Matthew L

2013-08-12

224

A time-domain hybrid analysis method for detecting and quantifying T-wave alternans.  

PubMed

T-wave alternans (TWA) in surface electrocardiograph (ECG) signals has been recognized as a marker of cardiac electrical instability and is hypothesized to be associated with increased risk for ventricular arrhythmias among patients. A novel time-domain TWA hybrid analysis method (HAM) utilizing the correlation method and the least squares regression technique is described in this paper. Simulated ECGs containing artificial TWA (cases of absence of TWA and presence of stationary, time-varying, or phase-reversal TWA) under different baseline wanderings are used to test the method, and the results show that HAM has a better ability to quantify TWA amplitude compared with the correlation method (CM) and the adaptive matched filter method (AMFM). The HAM is subsequently used to analyze clinical ECGs, and results produced by the HAM have, in general, demonstrated consistency with those produced by the CM and the AMFM, while the TWA amplitudes quantified by the HAM are universally higher than those produced by the other two methods. PMID:24803951
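
For orientation, the quantity being estimated can be illustrated with the simplest generic time-domain estimate, even/odd beat averaging; this is not the HAM described in the paper, and the alternans magnitude, noise level, and T-wave template below are synthetic.

```python
# Generic T-wave alternans estimate: mean absolute difference between the
# averaged T wave of even beats and of odd beats (not the paper's HAM).
import numpy as np

def twa_amplitude(t_waves):
    """t_waves: (n_beats, n_samples) array of aligned T waves (uV)."""
    even = t_waves[0::2].mean(axis=0)
    odd = t_waves[1::2].mean(axis=0)
    return np.abs(even - odd).mean()

rng = np.random.default_rng(3)
n_beats, n_samp = 128, 60
profile = np.sin(np.linspace(0.0, np.pi, n_samp))
base = 200.0 * profile                              # T-wave template (uV)
sign = np.where(np.arange(n_beats) % 2 == 0, 0.5, -0.5)[:, None]
beats = base + sign * 15.0 * profile                # ABAB alternation, 15 uV peak
beats += rng.normal(0.0, 5.0, beats.shape)          # measurement noise
# expected mean-abs amplitude: 15 * mean(sin) = 15 * 2/pi, about 9.5 uV
print(f"estimated TWA amplitude: {twa_amplitude(beats):.1f} uV")
```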

Wan, Xiangkui; Yan, Kanghui; Zhang, Linlin; Zeng, Yanjun

2014-01-01

225

ORIGINAL PAPER Evaluation of sampling methods to quantify abundance  

E-print Network

ORIGINAL PAPER Evaluation of sampling methods to quantify abundance of hardwoods and snags within was quite high and further research is needed to determine an optimal sampling method for quantifying hard

Paris-Sud XI, Université de

226

Quantifying Flexibility in the Operationally Responsive Space Paradigm - Lauren Viscito  

E-print Network

Quantifying Flexibility in the Operationally Responsive Space Paradigm, by Lauren Viscito. Submitted ... the system's behavior in many possible future contexts. Designing flexible systems will allow mitigation

de Weck, Olivier L.

227

Quantifying human health risks from virginiamycin used in chickens.  

PubMed

The streptogramin antimicrobial combination Quinupristin-Dalfopristin (QD) has been used in the United States since late 1999 to treat patients with vancomycin-resistant Enterococcus faecium (VREF) infections. Another streptogramin, virginiamycin (VM), is used as a growth promoter and therapeutic agent in farm animals in the United States and other countries. Many chickens test positive for QD-resistant E. faecium, raising concern that VM use in chickens might compromise QD effectiveness against VREF infections by promoting development of QD-resistant strains that can be transferred to human patients. Despite the potential importance of this threat to human health, quantifying the risk via traditional farm-to-fork modeling has proved extremely difficult. Enough key data (mainly on microbial loads at each stage) are lacking so that such modeling amounts to little more than choosing a set of assumptions to determine the answer. Yet, regulators cannot keep waiting for more data. Patients prescribed QD are typically severely ill, immunocompromised people for whom other treatment options have not readily been available. Thus, there is a pressing need for sound risk assessment methods to inform risk management decisions for VM/QD using currently available data. This article takes a new approach to the QD-VM risk modeling challenge. Recognizing that the usual farm-to-fork ("forward chaining") approach commonly used in antimicrobial risk assessment for food animals is unlikely to produce reliable results soon enough to be useful, we instead draw on ideas from traditional fault tree analysis ("backward chaining") to reverse the farm-to-fork process and start with readily available human data on VREF case loads and QD resistance rates. Combining these data with recent genogroup frequency data for humans, chickens, and other sources (Willems et al., 2000, 2001) allows us to quantify potential human health risks from VM in chickens in both the United States and Australia, two countries where regulatory action for VM is being considered. We present a risk simulation model, thoroughly grounded in data, that incorporates recent nosocomial transmission and genetic typing data. The model is used to estimate human QD treatment failures over the next five years with and without continued VM use in chickens. The quantitative estimates and probability distributions were implemented in a Monte Carlo simulation model for a five-year horizon beginning in the first quarter of 2002. In Australia, a Q1-2002 ban of virginiamycin would likely reduce average attributable treatment failures by 0.35 x 10^-3 cases, expected mortalities by 5.8 x 10^-5 deaths, and life years lost by 1.3 x 10^-3 for the entire population over five years. In the United States, where the number of cases of VRE is much higher, a Q1-2002 ban on VM is predicted to reduce average attributable treatment failures by 1.8 cases in the entire population over five years; expected mortalities by 0.29 cases; and life years lost by 6.3 over a five-year period. The model shows that the theoretical statistical human health benefits of a VM ban range from zero to less than one statistical life saved in both Australia and the United States over the next five years and are rapidly decreasing. Sensitivity analyses indicate that this conclusion is robust to key data gaps and uncertainties, e.g., about the extent of resistance transfer from chickens to people. PMID:15028017

Cox, Louis A; Popken, Douglas A

2004-02-01

228

Quantifying Volume of Groundwater in High Elevation Meadows  

NASA Astrophysics Data System (ADS)

Assessing the current and future water needs of high elevation meadows depends on quantifying the volume of groundwater stored within the meadow sediment. As groundwater dependent ecosystems, these meadows rely on their ability to capture and store water in order to support ecologic function and base flow to streams. Previous research on these meadows simplified storage by assuming a homogeneous reservoir of constant thickness. These previous storage models were able to close the water mass balance, but it is unclear whether these assumptions will hold under future anthropogenic impacts, such as increased air temperature resulting in drier and longer growing seasons. Applying a geophysical approach, ground-penetrating radar was used at Tuolumne Meadows, CA, to qualitatively and quantitatively identify the controls on the volume of groundwater storage. From the geophysical results, a three-dimensional model of Tuolumne Meadows was created, which identified meadow thickness and bedrock geometry. This physical model was used in a suite of numerical models simulating high elevation meadows in order to quantify the volume of groundwater stored with temporal and spatial variability. Modeling efforts tested both wet and dry water years in order to quantify the variability in the volume of groundwater storage for a range of aquifer properties. Each model was evaluated based on the seasonal depth to water in order to evaluate a particular scenario's ability to support ecological function and base flow. Depending on the simulated meadow's ability or inability to support its ecosystem, each representative meadow was categorized as successful or unsuccessful. Restoration techniques to increase active storage volume were suggested at unsuccessful meadows.
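
The storage calculation itself reduces to integrating saturated sediment thickness times specific yield over the meadow area. The sketch below uses a random thickness grid standing in for a GPR-derived model, and a uniform water-table depth and specific yield; all numbers are hypothetical.

```python
# Sketch: drainable groundwater volume from a gridded sediment-thickness
# model (e.g., GPR-derived), a water-table depth, and a specific yield.
import numpy as np

cell_area = 10.0 * 10.0                        # m^2 per grid cell
rng = np.random.default_rng(5)
thickness = rng.uniform(1.0, 9.0, (50, 80))    # sediment thickness (m)
wt_depth = 1.5                                 # depth to water table (m)
sy = 0.15                                      # specific yield (-)

saturated = np.clip(thickness - wt_depth, 0.0, None)
volume = (saturated * cell_area * sy).sum()
print(f"drainable groundwater in storage: {volume:,.0f} m^3")
```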

Ciruzzi, D.; Lowry, C.

2013-12-01

229

Quantifying Serum Antiplague Antibody with a Fiber-Optic Biosensor  

Microsoft Academic Search

The fiber-optic biosensor, originally developed to detect hazardous biological agents such as protein toxins or bacterial cells, has been utilized to quantify the concentration of serum antiplague antibodies. This biosensor has been used to detect and quantify the plague fraction 1 antigen in serum, plasma, and whole-blood samples, but its ability to quantify serum antibodies has not been demonstrated. By

GEORGE P. ANDERSON; KEELEY D. KING; LYNN K. CAO; MEAGAN JACOBY; FRANCES S. LIGLER; JOHN EZZELL; Fort Detrick

1998-01-01

230

Instantiation of Existentially Quantified Variables in Inductive Specification Proofs  

E-print Network

Instantiation of Existentially Quantified Variables in Inductive Specification Proofs ... inductive specification proofs. Our approach uses first-order meta-variables in place of existentially quantified variables ... which usually occur in the presence of existentially quantified variables. Moreover, we are able

Kreitz, Christoph

231

Existential quantifiers in the rule body Pedro Cabalar  

E-print Network

Existential quantifiers in the rule body. Pedro Cabalar, Department of Computer Science, Corunna ... of Answer Set Programming (ASP) for dealing with (nested) existential quantifiers and double negation ... first-order expressions under QEL by introducing existential quantifiers and double negations in the rule bodies

Cabalar, Pedro

232

Information theoretic approach to quantify causal neural interactions from EEG  

Microsoft Academic Search

In neurophysiology, it is important to quantify the causal neural interactions and infer the underlying complex networks from neurophysiological recordings such as the electroencephalogram (EEG). Existing methods such as Granger causality are model dependent and thus cannot quantify nonlinear dependencies. In this paper, directed information (DI) is used to quantify the causality of the interactions, and time-lagged directed information is proposed
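
Directed information, as introduced by Massey (1990), replaces the mutual information between two length-N sequences with a causally conditioned sum, which is what makes the measure directional:

```latex
I\left(X^{N} \to Y^{N}\right) \;=\; \sum_{n=1}^{N} I\!\left(X^{n};\, Y_{n} \,\middle|\, Y^{n-1}\right)
```

Whereas the mutual information I(X^N; Y^N) is symmetric in X and Y, I(X^N → Y^N) generally differs from I(Y^N → X^N), allowing asymmetric (causal) dependencies between EEG channels to be quantified.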

Ying Liu; Selin Aviyente

2010-01-01

233

Quantifying Meteorite Impact Craters Individual Volume Data Sheet  

E-print Network

Quantifying Meteorite Impact Craters: individual data sheets recording three trials each for the volume and speed experiments at drop heights of 150, 100, and 50 cm.

Polly, David

234

On the Correspondence between Classes of Implicational and Equivalence Quantifiers  

Microsoft Academic Search

Relations between two Boolean attributes derived from data can be quantified by truth functions defined on four-fold tables corresponding to pairs of the attributes. In the paper, several classes of such quantifiers (implicational, double implicational, equivalence ones) with truth values in the unit interval are investigated. The method of construction of the logically nearest double implicational and equivalence quantifiers to

Jirí Ivánek; W. Churchill

1999-01-01

235

Quantifying Speech Rhythm Abnormalities in the Dysarthrias  

PubMed Central

Purpose: In this study, the authors examined whether rhythm metrics capable of distinguishing languages with high and low temporal stress contrast also can distinguish among control and dysarthric speakers of American English with perceptually distinct rhythm patterns. Methods: Acoustic measures of vocalic and consonantal segment durations were obtained for speech samples from 55 speakers across 5 groups (hypokinetic, hyperkinetic, flaccid-spastic, ataxic dysarthrias, and controls). Segment durations were used to calculate standard and new rhythm metrics. Discriminant function analyses (DFAs) were used to determine which sets of predictor variables (rhythm metrics) best discriminated between groups (control vs. dysarthrias; and among the 4 dysarthrias). A cross-validation method was used to test the robustness of each original DFA. Results: The majority of classification functions were more than 80% successful in classifying speakers into their appropriate group. New metrics that combined successive vocalic and consonantal segments emerged as important predictor variables. DFAs pitting each dysarthria group against the combined others resulted in unique constellations of predictor variables that yielded high levels of classification accuracy. Conclusions: This study confirms the ability of rhythm metrics to distinguish control speech from dysarthrias and to discriminate dysarthria subtypes. Rhythm metrics show promise for use as a rational and objective clinical tool. PMID:19717656

Liss, Julie M.; White, Laurence; Mattys, Sven L.; Lansford, Kaitlin; Lotto, Andrew J.; Spitzer, Stephanie M.; Caviness, John N.

2013-01-01

236

Beyond immunity: quantifying the effects of host anti-parasite behavior on parasite transmission.  

PubMed

A host's first line of defense in response to the threat of parasitic infection is behavior, yet the efficacy of anti-parasite behaviors in reducing infection are rarely quantified relative to immunological defense mechanisms. Larval amphibians developing in aquatic habitats are at risk of infection from a diverse assemblage of pathogens, some of which cause substantial morbidity and mortality, suggesting that behavioral avoidance and resistance could be significant defensive strategies. To quantify the importance of anti-parasite behaviors in reducing infection, we exposed larval Pacific chorus frogs (Pseudacris regilla) to pathogenic trematodes (Ribeiroia and Echinostoma) in one of two experimental conditions: behaviorally active (unmanipulated) or behaviorally impaired (anesthetized). By quantifying both the number of successful and unsuccessful parasites, we show that host behavior reduces infection prevalence and intensity for both parasites. Anesthetized hosts were 20-39% more likely to become infected and, when infected, supported 2.8-fold more parasitic cysts. Echinostoma had a 60% lower infection success relative to the more deadly Ribeiroia and was also more vulnerable to behaviorally mediated reductions in transmission. For Ribeiroia, increases in host mass enhanced infection success, consistent with epidemiological theory, but this relationship was eroded among active hosts. Our results underscore the importance of host behavior in mitigating disease risk and suggest that, in some systems, anti-parasite behaviors can be as or more effective than immune-mediated defenses in reducing infection. Considering the severe pathologies induced by these and other pathogens of amphibians, we emphasize the value of a broader understanding of anti-parasite behaviors and how co-occurring stressors affect them. PMID:20857146

Daly, Elizabeth W; Johnson, Pieter T J

2011-04-01

237

The Physics of Equestrian Show Jumping  

NASA Astrophysics Data System (ADS)

This article discusses the kinematics and dynamics of equestrian show jumping. For some time I have attended a series of show jumping events at Spruce Meadows, an international equestrian center near Calgary, Alberta, often referred to as the "Wimbledon of equestrian jumping." I have always had a desire to write an article such as this one, but when I searched the Internet for information and looked at YouTube presentations, I could only find simplistic references to Newton's laws and the conservation of mechanical energy principle. Nowhere could I find detailed calculations. On the other hand, there were several biomechanical articles with empirical reports of the results of kinetic and dynamic investigations of show jumping using high-speed digital cameras and force plates. They summarize their results in tables that give information about the motion of a horse jumping over high fences (1.40 m) and the magnitudes of the forces encountered when landing. However, they do not describe the physics of these results.

Stinner, Art

2014-04-01

238

A framework for quantifying net benefits of alternative prognostic models  

PubMed Central

New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd. PMID:21905066

Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

2012-01-01

239

Using multiscale norms to quantify mixing and transport  

NASA Astrophysics Data System (ADS)

Mixing is relevant to many areas of science and engineering, including the pharmaceutical and food industries, oceanography, atmospheric sciences and civil engineering. In all these situations one goal is to quantify and often then to improve the degree of homogenization of a substance being stirred, referred to as a passive scalar or tracer. A classical measure of mixing is the variance of the concentration of the scalar, which is the L2 norm of a mean-zero concentration field. Recently, other norms have been used to quantify mixing, in particular the mix-norm as well as negative Sobolev norms. These norms have the advantage that unlike variance they decay even in the absence of diffusion, and their decay corresponds to the flow being mixing in the sense of ergodic theory. General Sobolev norms weigh scalar gradients differently, and are known as multiscale norms for mixing. We review the applications of such norms to mixing and transport, and show how they can be used to optimize the stirring and mixing of a decaying passive scalar. We then review recent work on the less-studied case of a continuously replenished scalar field—the source-sink problem. In that case the flows that optimally reduce the norms are associated with transport rather than mixing: they push sources onto sinks, and vice versa.
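
Concretely, for a mean-zero scalar θ with Fourier coefficients θ̂_k on a periodic domain, the variance and the negative Sobolev norms referred to above are

```latex
\mathrm{Var} = \|\theta\|_{L^{2}}^{2} = \sum_{\mathbf{k}} \left|\hat{\theta}_{\mathbf{k}}\right|^{2},
\qquad
\|\theta\|_{H^{-s}}^{2} = \sum_{\mathbf{k}} |\mathbf{k}|^{-2s} \left|\hat{\theta}_{\mathbf{k}}\right|^{2}, \quad s > 0,
```

with the mix-norm corresponding to s = 1/2. Because low wavenumbers are weighted more heavily, these norms decay as the scalar is stirred to small scales even in the absence of diffusion.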

Thiffeault, Jean-Luc

2012-02-01

240

SANTA: Quantifying the Functional Content of Molecular Networks  

PubMed Central

Linking networks of molecular interactions to cellular functions and phenotypes is a key goal in systems biology. Here, we adapt concepts of spatial statistics to assess the functional content of molecular networks. Based on the guilt-by-association principle, our approach (called SANTA) quantifies the strength of association between a gene set and a network, and functionally annotates molecular networks like other enrichment methods annotate lists of genes. As a general association measure, SANTA can (i) functionally annotate experimentally derived networks using a collection of curated gene sets and (ii) annotate experimentally derived gene sets using a collection of curated networks, as well as (iii) prioritize genes for follow-up analyses. We exemplify the efficacy of SANTA in several case studies using the S. cerevisiae genetic interaction network and genome-wide RNAi screens in cancer cell lines. Our theory, simulations, and applications show that SANTA provides a principled statistical way to quantify the association between molecular networks and cellular functions and phenotypes. SANTA is available from http://bioconductor.org/packages/release/bioc/html/SANTA.html. PMID:25210953

Cornish, Alex J.; Markowetz, Florian

2014-01-01

241

Quantifying VOC emissions for the strategic petroleum reserve.  

SciTech Connect

A very important aspect of the Department of Energy's (DOE's) Strategic Petroleum Reserve (SPR) program is regulatory compliance. One of the regulatory compliance issues deals with limiting the amount of volatile organic compounds (VOCs) that are emitted into the atmosphere from brine wastes when they are discharged to brine holding ponds. The US Environmental Protection Agency (USEPA) has set limits on the amount of VOCs that can be discharged to the atmosphere. Several attempts have been made to quantify the VOC emissions associated with the brine ponds going back to the late 1970s. There are potential issues associated with each of these quantification efforts. Two efforts were made to quantify VOC emissions by analyzing VOC content of brine samples obtained from wells. Efforts to measure air concentrations were mentioned in historical reports but no data have been located to confirm these assertions. A modeling effort was also performed to quantify the VOC emissions. More recently, in 2011-2013, additional brine sampling has been performed to update the VOC emissions estimate. An analysis of the statistical confidence in these results is presented here. Arguably, there are uncertainties associated with each of these efforts. The analysis herein indicates that the upper confidence limit in VOC emissions based on recent brine sampling is very close to the 0.42 ton/MMB limit used historically on the project. Refining this estimate would require considerable investment in additional sampling, analysis, and monitoring. An analysis of the VOC emissions at each site suggests that additional discharges could be made and stay within current regulatory limits.
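
The statistical-confidence statement can be reproduced in miniature as a one-sided upper confidence limit (UCL) on a mean emission factor. The sample values below are invented; only the procedure, a standard t-based UCL, mirrors the kind of analysis described:

    import numpy as np
    from scipy import stats

    # Hypothetical per-sample VOC emission estimates (ton/MMB), not project data.
    x = np.array([0.31, 0.45, 0.38, 0.50, 0.28, 0.41, 0.36, 0.44])

    # One-sided 95% upper confidence limit on the mean.
    ucl = x.mean() + stats.t.ppf(0.95, len(x) - 1) * x.std(ddof=1) / np.sqrt(len(x))
    print(round(ucl, 3), "ton/MMB vs. the historical 0.42 ton/MMB limit")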

Knowlton, Robert G.; Lord, David L.

2013-06-01

242

Choosing appropriate techniques for quantifying groundwater recharge  

USGS Publications Warehouse

Various techniques are available to quantify recharge; however, choosing appropriate techniques is often difficult. Important considerations in choosing a technique include space/time scales, range, and reliability of recharge estimates based on different techniques; other factors may limit the application of particular techniques. The goal of the recharge study is important because it may dictate the required space/time scales of the recharge estimates. Typical study goals include water-resource evaluation, which requires information on recharge over large spatial scales and on decadal time scales; and evaluation of aquifer vulnerability to contamination, which requires detailed information on spatial variability and preferential flow. The range of recharge rates that can be estimated using different approaches should be matched to expected recharge rates at a site. The reliability of recharge estimates using different techniques is variable. Techniques based on surface-water and unsaturated-zone data provide estimates of potential recharge, whereas those based on groundwater data generally provide estimates of actual recharge. Uncertainties in each approach to estimating recharge underscore the need for application of multiple techniques to increase reliability of recharge estimates.

Scanlon, B.R.; Healy, R.W.; Cook, P.G.

2002-01-01

243

Quantifying Climate Risks for Urban Environments  

NASA Astrophysics Data System (ADS)

High-density urban areas are both uniquely vulnerable and uniquely able to adapt to climate change. Enabling this potential requires identifying the vulnerabilities, however, and these depend strongly on location: the challenges climate change poses for a southern coastal city such as Miami, for example, have little in common with those facing a northern inland city such as Chicago. By combining local knowledge with climate science, risk assessment, engineering analysis, and adaptation planning, it is possible to develop relevant climate information that feeds directly into vulnerability assessment and long-term planning. Key steps include: developing climate projections tagged to long-term weather stations within the city itself that reflect local characteristics; mining local knowledge to identify existing vulnerabilities to, and costs of, weather and climate extremes; understanding how future projections can be integrated into the planning process; and identifying ways in which the city may adapt. Using examples from our work in the cities of Boston, Chicago, and Mobile we illustrate the practical application of this approach to quantify the impacts of climate change on these cities and identify robust adaptation options as diverse as reducing the urban heat island effect, protecting essential infrastructure, changing design standards and building codes, developing robust emergency management plans, and rebuilding storm sewer systems.

Hayhoe, K.; Stoner, A. K.; Dickson, L.

2013-12-01

244

Quantifying non-Markovianity via correlations  

NASA Astrophysics Data System (ADS)

In the study of open quantum systems, memory effects are usually ignored, and this leads to dynamical semigroups and Markovian dynamics. However, in practice, non-Markovian dynamics is the rule rather than the exception. With the recent emergence of quantum information theory, there has been a flurry of investigations of non-Markovian dynamics, and several significant measures of non-Markovianity have been introduced from various perspectives, such as deviation from divisibility, information exchange between a system and its environment, or entanglement with the environment. In this work, by exploiting the flow of correlations between a system and an arbitrary ancilla, we propose an intuitive measure of non-Markovianity that uses correlations as quantified by the quantum mutual information rather than entanglement. The fundamental properties, physical significance, and differences and relations with existing measures of non-Markovianity are elucidated. The measure captures quite directly and deeply the characteristics of non-Markovianity from the perspective of information. A simplified version, based on the Jamiołkowski-Choi isomorphism, which encodes operations via bipartite states and does not involve any optimization, is also proposed.
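
The correlation measure the proposal rests on, the quantum mutual information I(A:B) = S(rho_A) + S(rho_B) - S(rho_AB), is easy to evaluate numerically; the non-Markovianity measure then monitors how this quantity evolves under the system's dynamics. A two-qubit sketch (the noisy Bell family is just a convenient test state, not an example from the paper):

    import numpy as np

    def von_neumann(rho):
        w = np.linalg.eigvalsh(rho)
        w = w[w > 1e-12]
        return -np.sum(w * np.log2(w))

    def reduced(rho, keep):
        """Partial trace of a two-qubit state; keep=0 keeps qubit A."""
        rho4 = rho.reshape(2, 2, 2, 2)        # indices (a, b, a', b')
        return (np.trace(rho4, axis1=1, axis2=3) if keep == 0
                else np.trace(rho4, axis1=0, axis2=2))

    def mutual_information(rho):
        return (von_neumann(reduced(rho, 0)) + von_neumann(reduced(rho, 1))
                - von_neumann(rho))

    phi = np.zeros(4); phi[0] = phi[3] = 1 / np.sqrt(2)   # |Phi+> Bell state
    bell = np.outer(phi, phi)
    for p in (1.0, 0.5, 0.0):                 # increasing white noise
        rho = p * bell + (1 - p) * np.eye(4) / 4
        print(p, mutual_information(rho))     # 2.0 ... 0.0 bits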

Luo, Shunlong; Fu, Shuangshuang; Song, Hongting

2012-10-01

245

Quantifying Transmission of Campylobacter spp. among Broilers  

PubMed Central

Campylobacter species are frequently identified as a cause of human gastroenteritis, often from eating or mishandling contaminated poultry products. Quantitative knowledge of transmission of Campylobacter in broiler flocks is necessary, as this may help to determine the moment of introduction of Campylobacter in broiler flocks more precisely. The aim of this study was to determine the transmission rate parameter in broiler flocks. Four experiments were performed, each with four Campylobacter-inoculated chicks housed with 396 contact chicks per group. Colonization was monitored by regularly testing fecal samples for Campylobacter. A mathematical model was used to quantify the transmission rate, which was determined to be 1.04 new cases per colonized chick per day. This would imply that, for example, in a flock of 20,000 broilers, the prevalence of Campylobacter would increase from 5% to 95% within 6 days after Campylobacter introduction. The model and the estimated transmission rate parameter can be used to develop a suitable sampling scheme to determine transmission in commercial broiler flocks, to estimate whether control measures can reduce the transmission rate, or to estimate when Campylobacter was introduced into a colonized broiler flock on the basis of the time course of transmission in the flock. PMID:16204486
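
The 5%-to-95% figure follows directly from the estimated rate if within-flock spread obeys simple SI (logistic) dynamics, the usual assumption behind such transmission-rate estimates. A sketch of the closed form:

    import numpy as np

    beta = 1.04   # new cases per colonized chick per day (estimated above)

    def days_between_prevalences(p0, p1, beta):
        """Time for prevalence to grow from p0 to p1 under dp/dt = beta*p*(1-p)."""
        logit = lambda p: np.log(p / (1 - p))
        return (logit(p1) - logit(p0)) / beta

    print(days_between_prevalences(0.05, 0.95, beta))   # ~5.7 days: "within 6 days"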

Van Gerwe, T. J. W. M.; Bouma, A.; Jacobs-Reitsma, W. F.; van den Broek, J.; Klinkenberg, D.; Stegeman, J. A.; Heesterbeek, J. A. P.

2005-01-01

246

Automated Counting of Particles To Quantify Cleanliness  

NASA Technical Reports Server (NTRS)

A machine vision system, similar to systems used in microbiological laboratories to count cultured microbes, has been proposed for quantifying the cleanliness of nominally precisely cleaned hardware by counting residual contaminant particles. The system would include a microscope equipped with an electronic camera and circuitry to digitize the camera output, a personal computer programmed with machine-vision and interface software, and digital storage media. A filter pad, through which had been aspirated solvent from rinsing the hardware in question, would be placed on the microscope stage. A high-resolution image of the filter pad would be recorded. The computer would analyze the image and present a histogram of sizes of particles on the filter. On the basis of the histogram and a measure of the desired level of cleanliness, the hardware would be accepted or rejected. If the hardware were accepted, the image would be saved, along with other information, as a quality record. If the hardware were rejected, the histogram and ancillary information would be recorded for analysis of trends. The software would perceive particles that are too large or too numerous to meet a specified particle-distribution profile. Anomalous particles or fibrous material would be flagged for inspection.
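
The proposed image-analysis step (threshold, label connected components, histogram the sizes) takes only a few lines with standard tools. A sketch on a synthetic image, with placeholder threshold and bin edges:

    import numpy as np
    from scipy import ndimage

    def particle_areas(image, threshold):
        """Label connected above-threshold regions; return their pixel areas."""
        binary = image > threshold
        labels, n = ndimage.label(binary)
        return ndimage.sum(binary, labels, index=np.arange(1, n + 1))

    rng = np.random.default_rng(0)
    img = rng.normal(0.1, 0.05, (512, 512))    # background plus noise
    for _ in range(40):                        # sprinkle synthetic "particles"
        r, c = rng.integers(20, 492, size=2)
        img[r - 2:r + 3, c - 2:c + 3] += 1.0

    areas = particle_areas(img, threshold=0.5)
    counts, _ = np.histogram(areas, bins=[1, 10, 25, 50, np.inf])
    print(len(areas), "particles; size histogram:", counts)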

Rhode, James

2005-01-01

247

Quantifying Acute Myocardial Injury Using Ratiometric Fluorometry  

PubMed Central

Early reperfusion is the best therapy for myocardial infarction (MI). Effectiveness, however, varies significantly between patients and has implications for long-term prognosis and treatment. A technique to assess the extent of myocardial salvage after reperfusion therapy would allow for high-risk patients to be identified in the early post-MI period. Mitochondrial dysfunction is associated with cell death following myocardial reperfusion and can be quantified by fluorometry. Therefore, we hypothesized that variations in the fluorescence of mitochondrial nicotinamide adenine dinucleotide (NADH) and flavoprotein (FP) can be used acutely to predict the degree of myocardial injury. Thirteen rabbits had coronary occlusion for 30 min followed by 3 h of reperfusion. To produce a spectrum of infarct sizes, six animals were infused with cyclosporine A prior to ischemia. Using a specially designed fluorometric probe, NADH and FP fluorescence were measured in the ischemic area. Changes in NADH and FP fluorescence, as early as 15 min after reperfusion, correlated with postmortem assessment of infarct size (r = 0.695, p < 0.01). This correlation strengthened with time (r = 0.827, p < 0.001 after 180 min). Clinical application of catheter-based myocardial fluorometry may provide a minimally invasive technique for assessing the early response to reperfusion therapy. PMID:19272908

Ranji, Mahsa; Matsubara, Muneaki; Leshnower, Bradley G.; Hinmon, Robin H.; Jaggard, Dwight L.; Chance, Britton; Gorman, Robert C.

2011-01-01

248

Data Used in Quantified Reliability Models  

NASA Technical Reports Server (NTRS)

Data are the crux of developing quantitative risk and reliability models; without data there is no quantification. Finding and identifying reliability data or failure numbers to quantify fault tree models during conceptual and design phases is often the quagmire that precludes early decision makers' consideration of potential risk drivers that will influence design. The analyst tasked with addressing system or product reliability depends on the availability of data. But where does that data come from, and what does it really apply to? Commercial industries, government agencies, and other international sources might have data similar to what you are looking for. In general, internal and external technical reports and data based on similar and dissimilar equipment are often the first and only places checked. A common philosophy is "I have a number - that is good enough." But is it? Have you ever considered the difference in reported data from various federal datasets and technical reports when compared to similar sources from national and/or international datasets? Just how well does your data compare? Understanding how the reported data were derived, and interpreting the information and details associated with the data, is as important as the data itself.

DeMott, Diana; Kleinhammer, Roger K.; Kahn, C. J.

2014-01-01

249

Fluorescence imaging to quantify crop residue cover  

NASA Technical Reports Server (NTRS)

Crop residues, the portion of the crop left in the field after harvest, can be an important management factor in controlling soil erosion. Methods to quantify residue cover are needed that are rapid, accurate, and objective. Scenes with known amounts of crop residue were illuminated with long wave ultraviolet (UV) radiation and fluorescence images were recorded with an intensified video camera fitted with a 453 to 488 nm band pass filter. A light colored soil and a dark colored soil were used as background for the weathered soybean stems. Residue cover was determined by counting the proportion of the pixels in the image with fluorescence values greater than a threshold. Soil pixels had the lowest gray levels in the images. The values of the soybean residue pixels spanned nearly the full range of the 8-bit video data. Classification accuracies typically were within 3 (absolute units) of measured cover values. Video imaging can provide an intuitive understanding of the fraction of the soil covered by residue.

Daughtry, C. S. T.; Mcmurtrey, J. E., III; Chappelle, E. W.

1994-01-01

250

Quantifying spore viability of the honey bee pathogen Nosema apis using flow cytometry.  

PubMed

Honey bees are hosts to more than 80 different parasites, some of them being highly virulent and responsible for substantial losses in managed honey bee populations. The study of honey bee pathogens and their interactions with the bees' immune system has therefore become a research area of major interest. Here we developed a fast, accurate and reliable method to quantify the viability of spores of the honey bee gut parasite Nosema apis. To verify this method, a dilution series with 0, 25, 50, 75, and 100% live N. apis was made and SYTO 16 and Propidium Iodide (n = 35) were used to distinguish dead from live spores. The viability of spores in each sample was determined by flow cytometry and compared with the current method based on fluorescence microscopy. Results show that N. apis viability counts using flow cytometry produced very similar results when compared with fluorescence microscopy. However, we found that fluorescence microscopy underestimates N. apis viability in samples with higher percentages of viable spores, the latter typically being what is found in biological samples. A series of experiments were conducted to confirm that flow cytometry allows the use of additional fluorescent dyes such as SYBR 14 and SYTOX Red (used in combination with SYTO 16 or Propidium Iodide) to distinguish dead from live spores. We also show that spore viability quantification with flow cytometry can be undertaken using substantially lower dye concentrations than fluorescence microscopy. In conclusion, our data show flow cytometry to be a fast, reliable method to quantify N. apis spore viabilities, which has a number of advantages compared with existing methods. PMID:24339267

Peng, Yan; Lee-Pullen, Tracey F; Heel, Kathy; Millar, A Harvey; Baer, Boris

2014-05-01

251

Quantifying compositional impacts of ambient aerosol on cloud droplet formation  

NASA Astrophysics Data System (ADS)

It has been historically assumed that most of the uncertainty associated with the aerosol indirect effect on climate can be attributed to the unpredictability of updrafts. In Chapter 1, we analyze the sensitivity of cloud droplet number density to realistic variations in aerosol chemical properties and to variable updraft velocities using a 1-dimensional cloud parcel model in three important environmental cases (continental, polluted and remote marine). The results suggest that aerosol chemical variability may be as important to the aerosol indirect effect as the effect of unresolved cloud dynamics, especially in polluted environments. We next used a continuous-flow streamwise thermal-gradient Cloud Condensation Nuclei counter (CCNc) to study the water-uptake properties of the ambient aerosol, by exposing an aerosol sample to a controlled water vapor supersaturation and counting the resulting number of droplets. In Chapter 2, we modeled and experimentally characterized the heat transfer properties and droplet growth within the CCNc. Chapter 3 describes results from the MIRAGE field campaign, in which the CCNc and a Hygroscopicity Tandem Differential Mobility Analyzer (HTDMA) were deployed at a ground-based site during March 2006. Size-resolved CCN activation spectra and growth factor distributions of the ambient aerosol in Mexico City were obtained, and an analytical technique was developed to quantify a probability distribution of solute volume fractions for the CCN, in addition to the aerosol mixing-state. The CCN were shown to be much less CCN-active than ammonium sulfate, with water uptake properties more consistent with low molecular weight organic compounds. The pollution outflow from Mexico City was shown to have CCN with an even lower fraction of soluble material. "Chemical closure" was attained for the CCN, by comparing the inferred solute volume fraction with that from direct chemical measurements. A clear diurnal pattern was observed for the CCN solute volume fraction, showing that measurable aging of the aerosol population occurs during the day, on the timescale of a few hours. The mixing state of the aerosol, also showing a consistent diurnal pattern, clearly correlates with a chemical tracer for local combustion sources. Chapter 4 describes results from the GoMACCS field study, in which the CCNc was subsequently deployed on an airborne field campaign in Houston, Texas during August-September 2006. GoMACCS tested our ability to predict CCN for highly polluted conditions with limited chemical information. Assuming the particles were composed purely of ammonium sulfate, CCN closure was obtained with a 10% overprediction bias on average for CCN concentrations ranging from less than 100 cm-3 to over 10,000 cm-3, but with 50% variability on average. Assuming measured concentrations of organics to be internally mixed and insoluble tended to reduce the overprediction bias for less polluted conditions, but led to underprediction bias in the most polluted conditions. A likely explanation is that the high organic concentrations in the polluted environments depress the surface tension of the droplets, thereby enabling activation at lower soluble fractions.
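
The composition-to-activation link this work quantifies is commonly written in terms of kappa-Koehler theory (Petters and Kreidenweis, 2007), in which a single hygroscopicity parameter kappa sets the critical supersaturation of a dry particle. A sketch using typical literature kappa values (these are not results from the thesis):

    import numpy as np

    def critical_supersaturation(d_dry, kappa, T=298.15):
        """Critical supersaturation (%) from kappa-Koehler theory."""
        Mw, rho_w, R, sigma = 0.018015, 997.0, 8.314, 0.072   # SI units
        A = 4 * sigma * Mw / (R * T * rho_w)                  # Kelvin term (m)
        return (np.exp(np.sqrt(4 * A**3 / (27 * kappa * d_dry**3))) - 1) * 100

    for kappa, label in [(0.61, "ammonium sulfate"),
                         (0.10, "organic-dominated aerosol")]:
        print(label, critical_supersaturation(100e-9, kappa))  # ~0.15% vs ~0.37%

The factor-of-two difference in critical supersaturation for the same 100 nm dry size illustrates the kind of compositional effect on droplet formation the thesis quantifies.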

Lance, Sara

252

Quantifying methane flux from lake sediments using multibeam sonar  

NASA Astrophysics Data System (ADS)

Methane is a potent greenhouse gas, and the production and emission of methane from sediments in wetlands, lakes and rivers both contributes to and may be exacerbated by climate change. In some of these shallow-water settings, methane fluxes may be largely controlled by episodic venting that can be triggered by drops in hydrostatic pressure. Even with better constraints on the mechanisms for gas release, quantifying these fluxes has remained a challenge due to rapid spatiotemporal changes in the patterns of bubble emissions from the sediments. The research presented here uses a fixed-location Imagenex DeltaT 837B multibeam sonar to estimate methane-venting fluxes from organic-rich lake sediments over a large area (~400 m2) and over a multi-season deployment period with unprecedented spatial and temporal resolution. Simpler, single-beam sonar systems have been used in the past to estimate bubble fluxes in a variety of settings. Here we extend this methodology to a multibeam system by means of: (1) detailed calibration of the sonar signal against imposed bubble streams, and (2) validation against an in situ independent record of gas flux captured by overlying bubble traps. The calibrated sonar signals then yield estimates of the methane flux with high spatial resolution (~1 m) and temporal frequency (6 Hz) from a portion of the deepwater basin of Upper Mystic Lake, MA, USA, a temperate eutrophic kettle lake. These results in turn inform mathematical models of methane transport and release from the sediments, which reproduce with high fidelity the ebullitive response to hydrostatic pressure variations. In addition, the detailed information about spatial variability of methane flux derived from sonar records is used to estimate the uncertainty associated with upscaling flux measurements from bubble traps to the scale of the sonar observation area. Taken together, these multibeam sonar measurements and analysis provide a novel quantitative approach for the assessment of methane fluxes from shallow-water bodies.
Figure caption: Time series showing how the uncalibrated, sonar-detected flux estimate (black) varies inversely with the hydrostatic pressure (meters of water, blue) at 5-minute resolution during April 2012. Overlain is the time series of scaled gas flux from a mechanistic numerical model forced by the same hydrostatic pressure signal (orange).

Scandella, B.; Urban, P.; Delwiche, K.; Greinert, J.; Hemond, H.; Ruppel, C. D.; Juanes, R.

2013-12-01

253

Quantifying the effect of environment stability on the transcription factor repertoire of marine microbes  

PubMed Central

Background: DNA-binding transcription factors (TFs) regulate cellular functions in prokaryotes, often in response to environmental stimuli. Thus, the environment exerts constant selective pressure on the TF gene content of microbial communities. Recently a study on marine Synechococcus strains detected differences in their genomic TF content related to environmental adaptation, but so far the effect of environmental parameters on the content of TFs in bacterial communities has not been systematically investigated. Results: We quantified the effect of environment stability on the transcription factor repertoire of marine pelagic microbes from the Global Ocean Sampling (GOS) metagenome using interpolated physico-chemical parameters and multivariate statistics. Thirty-five percent of the difference in relative TF abundances between samples could be explained by environment stability. Six percent was attributable to spatial distance but none to a combination of both spatial distance and stability. Some individual TFs showed a stronger relationship to environment stability and space than the total TF pool. Conclusions: Environmental stability appears to have a clearly detectable effect on TF gene content in bacterioplanktonic communities described by the GOS metagenome. Interpolated environmental parameters were shown to compare well to in situ measurements and were essential for quantifying the effect of the environment on the TF content. It is demonstrated that comprehensive and well-structured contextual data will strongly enhance our ability to interpret the functional potential of microbes from metagenomic data. PMID:22587903

2011-01-01

254

Comparative study of two chromatographic methods for quantifying 2,4,6-trichloroanisole in wines.

PubMed

Here we present the validation and comparative study of two chromatographic methods for quantifying 2,4,6-trichloroanisole (TCA) in wines (red, rosé and white). The first method involves headspace solid-phase microextraction and gas chromatography with electron-capture detection (ECD). The evaluation of the performance parameters shows a limit of detection of 0.3 ng l(-1), a limit of quantification of 1.0 ng l(-1), recoveries around 100% and repeatability of 10%. The second involves headspace solid-phase microextraction and gas chromatography with mass spectrometric detection. The performance parameters of this second method are a limit of detection of 0.2 ng l(-1), a limit of quantification of 0.8 ng l(-1) and repeatability of 10.1%. From the comparative study we can state that both methods provide similar results; the differences between them are the better sensitivity of the GC-ECD method and the much shorter chromatogram run time of the GC-MS method. Both methods are able to quantify TCA below the sensory threshold in red, rosé and white wines using just a calibration graph, so they could be a very good tool for quality control in wineries. PMID:17109869
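
Performance parameters of this kind are conventionally derived from a calibration line, for example via the IUPAC-style conventions LOD ~ 3.3*sigma/slope and LOQ ~ 10*sigma/slope. A sketch with invented calibration data (not the paper's measurements):

    import numpy as np

    # Hypothetical TCA calibration: concentration (ng/L) vs. detector response.
    conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
    resp = np.array([0.9, 2.1, 4.2, 10.3, 20.8, 41.5])

    slope, intercept = np.polyfit(conc, resp, 1)
    sigma = (resp - (slope * conc + intercept)).std(ddof=2)   # residual SD

    print("LOD ~", 3.3 * sigma / slope, "ng/L")
    print("LOQ ~", 10.0 * sigma / slope, "ng/L")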

Riu, M; Mestres, M; Busto, O; Guasch, J

2007-01-01

255

A Synthetic Phased Array Surface Acoustic Wave Sensor for Quantifying Bolt Tension  

PubMed Central

In this paper, we report our findings on implementing a synthetic phased array surface acoustic wave sensor to quantify bolt tension. Maintaining proper bolt tension is important in many fields, such as for ensuring safe operation of civil infrastructures. Significant advantages of this relatively simple methodology are its ability to assess bolt tension without any contact with the bolt (enabling measurement at inaccessible locations), its capability to measure multiple bolts at a time, and the fact that it requires neither data collection during installation nor calibration. We performed detailed experiments on a custom-built flexible bench-top experimental setup consisting of a 1018 steel plate of 12.7 mm (½ in) thickness, a 6.4 mm (¼ in) grade 8 bolt and a stainless steel washer with 19 mm (¾ in) external diameter. Our results indicate that this method is not only capable of clearly distinguishing properly bolted joints from loosened joints but is also capable of quantifying how loose the bolt actually is. We also conducted a detailed signal-to-noise (SNR) analysis and showed that the SNR value for the entire bolt tension range was sufficient for image reconstruction.

Martinez, Jairo; Sisman, Alper; Onen, Onursal; Velasquez, Dean; Guldiken, Rasim

2012-01-01

256

Quantifying fluvial bedrock erosion using repeat terrestrial Lidar  

NASA Astrophysics Data System (ADS)

The Da'an River Gorge in western Taiwan provides a unique opportunity to observe the formation and evolution of a natural bedrock gorge. The 1.2 km long and up to 20 m deep gorge has formed since 1999 in response to uplift of the riverbed during the Chi-Chi earthquake. The extremely rapid pace of erosion enables us to observe both downcutting and channel widening over short time periods. We have monitored the evolution of the gorge since 2009 using repeat RTK GPS surveys and terrestrial Lidar scans. GPS surveys of the channel profile are conducted frequently, with 24 surveys to date, while Lidar scans are conducted after major floods, or after 5-9 months without a flood, for a total of 8 scans to date. The Lidar data are most useful for recording erosion of channel walls, which is quite episodic and highly variable along the channel. By quantifying the distribution of wall erosion in space and time, we can improve our understanding of channel widening processes and of the development of the channel planform, particularly the growth of bends. During the summer of 2012, the Da'an catchment experienced two large storm events, a meiyu (plum rain) event on June 10-13 that brought 800 mm of rain and a typhoon on August 1-3 that brought 650 mm of rain. The resulting floods had significant geomorphic effects on the Da'an gorge, including up to tens of meters of erosion in some sections of the gorge walls. We quantify these changes using Lidar surveys conducted on June 7, July 3, and August 30. Channel wall collapses also occur in the absence of large floods, and we use scans from August 23, 2011 and June 7, 2012 to quantify erosion during a period that included a number of small floods, but no large ones. This allows us to compare the impact of 9 months of normal conditions to the impact of short-duration extreme events. The observed variability of erosion in space and time highlights the need for 3D techniques such as terrestrial Lidar to properly quantify erosion in this setting.

Cook, Kristen

2013-04-01

257

Quantifying Riverscape Connectivity with Graph Theory  

NASA Astrophysics Data System (ADS)

Fluvial catchments convey fluxes of water, sediment, nutrients and aquatic biota. At continental scales, crustal topography defines the overall path of channels whilst at local scales depositional and/or erosional features generally determine the exact path of a channel. Furthermore, constructions such as dams, for either water abstraction or hydropower, often have a significant impact on channel networks. The concept of 'connectivity' is commonly invoked when conceptualising the structure of a river network. This concept is easy to grasp but there have been uneven efforts across the environmental sciences to actually quantify connectivity. Currently there have only been a few studies reporting quantitative indices of connectivity in river sciences, notably, in the study of avulsion processes. However, the majority of current work describing some form of environmental connectivity in a quantitative manner is in the field of landscape ecology. Driven by the need to quantify habitat fragmentation, landscape ecologists have returned to graph theory. Within this formal setting, landscape ecologists have successfully developed a range of indices which can model connectivity loss. Such formal connectivity metrics are currently needed for a range of applications in fluvial sciences. One of the most urgent needs relates to dam construction. In the developed world, hydropower development has generally slowed and in many countries, dams are actually being removed. However, this is not the case in the developing world where hydropower is seen as a key element of low-emissions power-security. For example, several dam projects are envisaged in Himalayan catchments in the next 2 decades. This region is already under severe pressure from climate change and urbanisation, and a better understanding of the network fragmentation which can be expected in this system is urgently needed. In this paper, we apply and adapt connectivity metrics from landscape ecology. We then examine the connectivity structure of the Gangetic riverscape with fluvial remote sensing. Our study reach extends from the heavily dammed headwaters of the Bhagirathi, Mandakini and Alaknanda rivers which form the source of the Ganga to Allahabad ~900 km downstream on the main stem. We use Landsat-8 imagery as the baseline dataset. Channel width along the Ganga (i.e. Ganges) is often several kilometres. Therefore, the pan-sharpened 15 m pixels of Landsat-8 are in fact capable of resolving inner channel features for over 80% of the channel length, thus allowing a riverscape approach to be adopted. We examine the following connectivity metrics: size distribution of connected components, betweenness centrality and the integrated index of connectivity. A geographic perspective is added by mapping local (25 km-scale) values for these metrics in order to examine spatial patterns of connectivity. This approach allows us to map impacts of dam construction and has the potential to inform policy decisions in the area as well as open up new avenues of investigation.
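
Of the metrics named above, the integrated index of connectivity transfers most directly from landscape ecology: IIC = sum_i sum_j a_i*a_j / (1 + nl_ij) / A^2, where a_i are habitat (here, reach) areas, nl_ij is the number of links on the shortest path between i and j, and A is the total area. A sketch on a toy river network, in which the graph and areas are invented and a removed edge plays the role of a dam:

    import networkx as nx

    def iic(G, attr="area"):
        """Integrated index of connectivity (Pascual-Hortal and Saura, 2006)."""
        a = nx.get_node_attributes(G, attr)
        total = 0.0
        for i in G:
            nl = nx.single_source_shortest_path_length(G, i)
            for j in G:
                if j in nl:                  # unreachable pairs contribute 0
                    total += a[i] * a[j] / (1 + nl[j])
        return total / sum(a.values()) ** 2

    G = nx.path_graph(6)                     # six reaches in a chain
    nx.set_node_attributes(G, {i: 1.0 for i in G}, "area")
    print("intact:    ", iic(G))
    G.remove_edge(2, 3)                      # a "dam" fragments the network
    print("fragmented:", iic(G))             # the index drops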

Carbonneau, P.; Milledge, D.; Sinha, R.; Tandon, S. K.

2013-12-01

258

A new model for quantifying climate episodes  

NASA Astrophysics Data System (ADS)

When long records of climate (precipitation, temperature, stream runoff, etc.) are available, either from instrumental observations or from proxy records, the objective evaluation and comparison of climatic episodes becomes necessary. Such episodes can be quantified in terms of duration (the number of time intervals, e.g. years, the process remains continuously above or below a reference level) and magnitude (the sum of all series values for a given duration). The joint distribution of duration and magnitude is represented here by a stochastic model called BEG, for bivariate distribution with exponential and geometric marginals. The model is based on the theory of random sums, and its mathematical derivation confirms and extends previous empirical findings. Probability statements that can be obtained from the model are illustrated by applying it to a 2300-year dendroclimatic reconstruction of water-year precipitation for the eastern Sierra Nevada-western Great Basin. Using the Dust Bowl drought period as an example, the chance of a longer or greater drought is 8%. Conditional probabilities are much higher, i.e. a drought of that magnitude has a 62% chance of lasting for 11 years or longer, and a drought that lasts 11 years has a 46% chance of having an equal or greater magnitude. In addition, because of the bivariate model, we can estimate a 6% chance of witnessing a drought that is both longer and greater. Additional examples of model application are also provided. This type of information provides a way to place any climatic episode in a temporal perspective, and such numerical statements help with reaching science-based management and policy decisions.
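
Because the model is built on random sums, a geometric number N of years each contributing an exponential deficit, joint statements like those above can be checked by direct simulation; a sum of N independent exponentials is Gamma(N)-distributed, which vectorizes neatly. The parameters below are illustrative, not the fitted Sierra Nevada values:

    import numpy as np

    rng = np.random.default_rng(0)
    p, scale, n_sim = 0.3, 1.0, 500_000    # illustrative, not fitted values

    N = rng.geometric(p, n_sim)            # episode durations (years)
    M = rng.gamma(shape=N, scale=scale)    # magnitudes: sums of N exponentials

    d_ref, m_ref = 11, 15.0                # a hypothetical reference episode
    print("P(longer or greater): ", np.mean((N >= d_ref) | (M >= m_ref)))
    print("P(longer and greater):", np.mean((N >= d_ref) & (M >= m_ref)))
    sel = N == d_ref
    print("P(greater | 11-year episode):", np.mean(M[sel] >= m_ref))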

Biondi, Franco; Kozubowski, Tomasz J.; Panorska, Anna K.

2005-07-01

259

Quantifying human vitamin kinetics using AMS  

SciTech Connect

Tracing vitamin kinetics at physiologic concentrations has been hampered by a lack of quantitative sensitivity for chemically equivalent tracers that could be used safely in healthy people. Instead, elderly or ill volunteers were sought for studies involving pharmacologic doses with radioisotopic labels. These studies fail to be relevant in two ways: vitamins are inherently micronutrients, whose biochemical paths are saturated and distorted by pharmacological doses; and while vitamins remain important for health in the elderly or ill, their greatest effects may be in preventing slow and cumulative diseases by proper consumption throughout youth and adulthood. Neither the target dose nor the target population are available for nutrient metabolic studies through decay counting of radioisotopes at high levels. Stable isotopic labels are quantified by isotope ratio mass spectrometry at levels that trace physiologic vitamin doses, but the natural background of stable isotopes severely limits the time span over which the tracer is distinguishable. Indeed, study periods seldom ranged over a single biological mean life of the labeled nutrients, failing to provide data on the important final elimination phase of the compound. Kinetic data for the absorption phase is similarly rare in micronutrient research because the phase is rapid, requiring many consecutive plasma samples for accurate representation. However, repeated blood samples of sufficient volume for precise stable or radio-isotope quantitations consume an indefensible amount of the volunteer's blood over a short period. Thus, vitamin pharmacokinetics in humans has often relied on compartmental modeling based upon assumptions and tested only for the short period of maximal blood circulation, a period that poorly reflects absorption or final elimination kinetics except for the most simple models.

Hillegonds, D; Dueker, S; Ognibene, T; Buchholz, B; Lin, Y; Vogel, J; Clifford, A

2004-02-19

260

Quantifying collective attention from tweet stream.  

PubMed

Online social media are increasingly facilitating our social interactions, thereby making available a massive "digital fossil" of human behavior. Discovering and quantifying distinct patterns using these data is important for studying social behavior, although the rapid time-variant nature and large volumes of these data make this task difficult and challenging. In this study, we focused on the emergence of "collective attention" on Twitter, a popular social networking service. We propose a simple method for detecting and measuring the collective attention evoked by various types of events. This method exploits the fact that tweeting activity exhibits a burst-like increase and an irregular oscillation when a particular real-world event occurs; otherwise, it follows regular circadian rhythms. The difference between regular and irregular states in the tweet stream was measured using the Jensen-Shannon divergence, which corresponds to the intensity of collective attention. We then associated irregular incidents with their corresponding events that attracted the attention and elicited responses from large numbers of people, based on the popularity and the enhancement of key terms in posted messages or "tweets." Next, we demonstrate the effectiveness of this method using a large dataset that contained approximately 490 million Japanese tweets by over 400,000 users, in which we identified 60 cases of collective attentions, including one related to the Tohoku-oki earthquake. "Retweet" networks were also investigated to understand collective attention in terms of social interactions. This simple method provides a retrospective summary of collective attention, thereby contributing to the fundamental understanding of social behavior in the digital era. PMID:23637913
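
The core computation, the Jensen-Shannon divergence between a regular (circadian) activity profile and the observed one, is compact; a sketch with synthetic hourly tweet counts standing in for the real stream:

    import numpy as np

    def jensen_shannon(p, q):
        """Jensen-Shannon divergence (bits) between two discrete distributions."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        p, q = p / p.sum(), q / q.sum()
        m = 0.5 * (p + q)
        def kl(a, b):
            nz = a > 0
            return np.sum(a[nz] * np.log2(a[nz] / b[nz]))
        return 0.5 * kl(p, m) + 0.5 * kl(q, m)

    # Synthetic hourly tweet counts: a regular day vs. an event-driven burst.
    regular = np.array([30, 20, 15, 10, 10, 15, 40, 80, 100, 90, 85, 80,
                        75, 70, 70, 75, 80, 90, 110, 120, 100, 80, 60, 40])
    burst = regular.copy()
    burst[14:18] *= 6                        # sudden spike after an "event"
    print(jensen_shannon(regular, regular))  # 0.0: no collective attention
    print(jensen_shannon(regular, burst))    # > 0: intensity of the attention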

Sasahara, Kazutoshi; Hirata, Yoshito; Toyoda, Masashi; Kitsuregawa, Masaru; Aihara, Kazuyuki

2013-01-01

261

Boron aluminum crippling strength shows improvement  

NASA Technical Reports Server (NTRS)

Results are presented from an experimental program directed toward improving boron aluminum crippling strength. Laminate changes evaluated were larger filament diameter, improved processing, shape changes, adding steel-aluminum cross plies, reduced filament volume in corners, adding boron aluminum angle plies, and using titanium interleaves. Filament diameter and steel-aluminum cross plies have little effect on crippling. It is shown that better processing combined with appropriate shape changes improved crippling over 50 percent at both room temperature and 600 F. Tests also show that crippling improvements ranging from 20 to 40 percent are achieved using angle plies and titanium interleaves.

Otto, O. R.; Bohlmann, R. E.

1974-01-01

262

Quantifying Light, Medium, and Heavy Crude Oil Distribution in Homogeneous Porous Media  

NASA Astrophysics Data System (ADS)

Crude oil recovery is highly dependent upon the physical heterogeneity of media and the resulting distribution of the oil phase within the pore spaces. Factors such as capillary force, the geometry of the pore spaces, and interfacial tension between the oil blobs and water-wet porous media will ultimately control the recovery process. Pore scale studies were conducted to study the distribution and the morphology of various fractions of crude oil in increasingly heterogeneous porous media. In addition, experiments were also carried out to characterize the temporal changes in distribution and morphology of the oil phase after a series of surfactant flooding events. Specifically, columns were packed with three different porous media with increasing heterogeneity and distributed with three different fractions (light, medium, and heavy) of crude oil. The columns were imaged using synchrotron X-ray microtomography before and after a series of surfactant floods to quantify the resulting crude oil distributions over time. Preliminary results show that the light crude oil was more heterogeneously distributed than the medium fraction crude oil within the same porous media type, both before and throughout the series of surfactant floods. It was also observed that approximately 95% of the medium fraction crude oil blob-size distribution was smaller (<0.0008 cu mm) than that of the light crude oil, encompassing a significant number of blob singlets. The lighter crude oil fraction had a median blob diameter approximately 20 times greater than that of the medium crude oil fraction. These results further reveal that oil extraction and recovery is highly dependent upon the oil fraction, the presence of small-sized blob singlets, and the resulting distributions present within porous media before and during surfactant flooding. This research will not only be helpful in understanding the factors controlling crude oil mobilization at the pore scale but will also test the utility of synchrotron X-ray microtomography to quantify pore scale distribution at high resolution.

Ghosh, J.; Tick, G. R.

2008-12-01

263

Molecular Marker Approach on Characterizing and Quantifying Charcoal in Environmental Media  

NASA Astrophysics Data System (ADS)

Black carbon (BC) is widely distributed in natural environments including soils, sediments, freshwater, seawater and the atmosphere. It is produced mostly from the incomplete combustion of fossil fuels and vegetation. In recent years, increasing attention has been given to BC due to its potential influence in many biogeochemical processes. In the environment, BC exists as a continuum ranging from partly charred plant materials and charcoal residues to highly condensed soot and graphite particles. The heterogeneous nature of black carbon means that BC is always operationally defined, highlighting the need for standard methods that support data comparisons. Unlike soot and graphite, which can be quantified with well-established methods, it is difficult to directly quantify charcoal in geologic media due to its chemical and physical heterogeneity. Most of the available charcoal quantification methods detect unknown fractions of the BC continuum. To specifically identify and quantify charcoal in soils and sediments, we adopted and validated an innovative molecular marker approach that quantifies levoglucosan, a pyrogenic derivative of cellulose, as a proxy of charcoal. Levoglucosan is source-specific, stable and able to be detected at low concentrations using a gas chromatograph-mass spectrometer (GC-MS). In the present study, two different plant species, honey mesquite and cordgrass, were selected as the raw materials to synthesize charcoals. The lab-synthesized charcoals were made under controlled conditions to eliminate the high heterogeneity often found in natural charcoals. The effects of two major combustion factors, temperature and duration, on the yield of levoglucosan were characterized in the lab-synthesized charcoals. Our results showed that significant levoglucosan production in the two types of charcoal was restricted to relatively low combustion temperatures (150-350 °C). The combustion duration did not cause significant differences in the yield of levoglucosan in the two charcoals. Interestingly, the low temperature charcoals are undetectable by the acid dichromate oxidation method, a popular soot/charcoal analytical approach. Our study demonstrates that levoglucosan can serve as a proxy of low temperature charcoals that are undetectable using other BC methods. Moreover, our study highlights the limitations of the common BC quantification methods to characterize the entire BC continuum.

Kuo, L.; Herbert, B. E.; Louchouarn, P.

2006-12-01

264

Quantifiable outcomes from corporate and higher education learning collaborations  

NASA Astrophysics Data System (ADS)

The study investigated the existence of measurable learning outcomes that emerged out of the shared strengths of collaborating sponsors. The study identified quantifiable learning outcomes that confirm corporate, academic and learner participation in learning collaborations. Each of the three hypotheses and the synergy indicator quantitatively and qualitatively confirmed learning outcomes benefiting participants. The academic indicator quantitatively confirmed that learning outcomes attract learners to the institution. The corporate indicator confirmed that learning outcomes include knowledge exchange and enhanced workforce talents for careers in the energy-utility industry. The learner indicator confirmed that learning outcomes provide professional development opportunities for employment. The synergy indicator confirmed that best learning practices in learning collaborations emanate out of the sponsors' shared strengths, and that partnerships can be elevated to strategic alliances, going beyond response to the desires of sponsors to create learner-centered cultures. The synergy indicator also confirmed the value of organizational processes that elevate sponsors' interactions to shared strength, creating a learner-centered culture. The study's series of qualitative questions confirmed prior success factors, while verifying the hypothesis results and providing insight not available from quantitative data. The direct beneficiaries of the study are the energy-utility learning-collaboration participants: the corporation, academic institutions, and learners of the collaboration. The indirect beneficiaries are the stakeholders of future learning collaborations, through improved knowledge of the existence or absence of quantifiable learning outcomes.

Devine, Thomas G.

265

Quantified PIRT and Uncertainty Quantification for Computer Code Validation  

NASA Astrophysics Data System (ADS)

This study is intended to investigate and propose a systematic method for uncertainty quantification for the computer code validation application. Uncertainty quantification has gained more and more attention in recent years. The U.S. Nuclear Regulatory Commission (NRC) requires the use of realistic best estimate (BE) computer codes to follow the rigorous Code Scaling, Application and Uncertainty (CSAU) methodology. In CSAU, the Phenomena Identification and Ranking Table (PIRT) was developed to identify important code uncertainty contributors. To support and examine the traditional PIRT with quantified judgments, this study proposes a novel approach, the Quantified PIRT (QPIRT), to identify important code models and parameters for uncertainty quantification. Dimensionless analysis of the code field equations to generate dimensionless (pi) groups, using code simulation results, serves as the foundation for QPIRT. Uncertainty quantification using the DAKOTA code is proposed in this study based on a sampling approach. Nonparametric statistical theory identifies the fixed number of code runs needed to assure 95 percent probability at 95 percent confidence in the code uncertainty intervals.
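
The "fixed number of code runs" for a 95/95 statement is conventionally obtained from Wilks' nonparametric tolerance-limit formula; the abstract does not name the formula, but it is the standard basis for such claims: the smallest n with 1 - coverage^n >= confidence for a one-sided bound.

    import math

    def wilks_runs(coverage=0.95, confidence=0.95):
        """Smallest n with 1 - coverage**n >= confidence (one-sided, 1st order)."""
        return math.ceil(math.log(1 - confidence) / math.log(coverage))

    print(wilks_runs())   # 59 code runs for a one-sided 95/95 tolerance bound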

Luo, Hu

266

Methods for quantifying uncertainty in fast reactor analyses.  

SciTech Connect

Liquid-metal-cooled fast reactors in the form of sodium-cooled fast reactors have been successfully built and tested in the U.S. and throughout the world. However, no fast reactor has operated in the U.S. for nearly fourteen years. More importantly, the U.S. has not constructed a fast reactor in nearly 30 years. In addition to reestablishing the necessary industrial infrastructure, the development, testing, and licensing of a new, advanced fast reactor concept will likely require a significant base technology program that will rely more heavily on modeling and simulation than has been done in the past. The ability to quantify uncertainty in modeling and simulations will be an important part of any experimental program and can provide added confidence that established design limits and safety margins are appropriate. In addition, there is an increasing demand from the nuclear industry for best-estimate analysis methods to provide confidence bounds along with their results. The ability to quantify uncertainty will be an important component of modeling that is used to support design, testing, and experimental programs. Three avenues of UQ investigation are proposed. Two relatively new approaches are described which can be directly coupled to simulation codes currently being developed under the Advanced Simulation and Modeling program within the Reactor Campaign. A third approach, based on robust Monte Carlo methods, can be used in conjunction with existing reactor analysis codes as a means of verification and validation of the more detailed approaches.

Fanning, T. H.; Fischer, P. F.

2008-04-07

267

Quantifying Relative Diver Effects in Underwater Visual Censuses  

PubMed Central

Diver-based Underwater Visual Censuses (UVCs), particularly transect-based surveys, are key tools in the study of coral reef fish ecology. These techniques, however, have inherent problems that make it difficult to collect accurate numerical data. One of these problems is the diver effect (defined as the reaction of fish to a diver). Although widely recognised, its effects have yet to be quantified and the extent of taxonomic variation remains to be determined. We therefore examined relative diver effects on a reef fish assemblage on the Great Barrier Reef. Using common UVC methods, the recorded abundance of seven reef fish groups were significantly affected by the ongoing presence of SCUBA divers. Overall, the diver effect resulted in a 52% decrease in the mean number of individuals recorded, with declines of up to 70% in individual families. Although the diver effect appears to be a significant problem, UVCs remain a useful approach for quantifying spatial and temporal variation in relative fish abundances, especially if using methods that minimise the exposure of fishes to divers. Fixed distance transects using tapes or lines deployed by a second diver (or GPS-calibrated timed swims) would appear to maximise fish counts and minimise diver effects. PMID:21533039

Dickens, Luke C.; Goatley, Christopher H. R.; Tanner, Jennifer K.; Bellwood, David R.

2011-01-01

268

Quantifying Disorder through Conditional Entropy: An Application to Fluid Mixing  

PubMed Central

In this paper, we present a method to quantify the extent of disorder in a system by using conditional entropies. Our approach is especially useful when other global, or mean field, measures of disorder fail. The method is equally suited for both continuum and lattice models, and it can be made rigorous for the latter. We apply it to mixing and demixing in multicomponent fluid membranes, and show that it has advantages over previous measures based on Shannon entropies, such as a much diminished dependence on binning and the ability to capture local correlations. Further potential applications are very diverse, and could include the study of local and global order in fluid mixtures, liquid crystals, magnetic materials, and particularly biomolecular systems. PMID:23762401
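
The idea can be illustrated on a binary lattice "membrane": estimate the joint distribution of a site and a neighbouring site and use H(X|Y) = H(X,Y) - H(Y). In a well-mixed state the neighbour is uninformative (conditional entropy near its maximum); in a demixed state it nearly determines the site. A minimal sketch, not the authors' implementation:

    import numpy as np

    def entropy(counts):
        p = counts / counts.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def conditional_entropy(lattice, n_labels=2):
        """H(site | right neighbour) on a periodic 2D lattice of labels."""
        x = lattice.ravel()
        y = np.roll(lattice, -1, axis=1).ravel()
        joint = np.zeros((n_labels, n_labels))
        np.add.at(joint, (x, y), 1)
        return entropy(joint) - entropy(joint.sum(axis=0))

    rng = np.random.default_rng(1)
    mixed = rng.integers(0, 2, (64, 64))                    # well-mixed labels
    demixed = np.zeros((64, 64), int); demixed[:, 32:] = 1  # phase-separated
    print(conditional_entropy(mixed))     # ~1 bit: neighbour carries no information
    print(conditional_entropy(demixed))   # ~0.2 bit: neighbour nearly decides the site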

Brandani, Giovanni B.; Schor, Marieke; MacPhee, Cait E.; Grubmüller, Helmut; Zachariae, Ulrich; Marenduzzo, Davide

2013-01-01

269

Quantifying the Topology of Large-Scale Structure  

E-print Network

We propose and investigate a new algorithm for quantifying the topological properties of cosmological density fluctuations. We first motivate this algorithm by drawing a formal distinction between two definitions of relevant topological characteristics, based on concepts, on the one hand, from differential topology and, on the other, from integral geometry. The former approach leads one to concentrate on properties of the contour surfaces which, in turn, leads to the algorithms CONTOUR2D and CONTOUR3D familiar to cosmologists. The other approach, which we adopt here, actually leads to much simpler algorithms in both two and three dimensions. (The 2D algorithm has already been introduced to the astronomical literature.) We discuss the 3D case in some detail and compare results obtained with it to analogous results using the CONTOUR3D algorithm.
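
On a pixelized map the integral-geometry route amounts to counting the vertices, edges and faces of the excursion set and forming the alternating sum. A 2D analogue of the 3D algorithm (chi = V - E + F for pixels above a density threshold), written as a hypothetical illustration rather than the authors' code:

    import numpy as np

    def euler_characteristic_2d(mask):
        """chi = V - E + F for the union of 'on' pixels, shared cells counted once."""
        f = mask.astype(bool)
        rows, cols = f.shape
        e_h = np.zeros((rows + 1, cols), bool)    # horizontal edges (tops/bottoms)
        e_h[:-1] |= f; e_h[1:] |= f
        e_v = np.zeros((rows, cols + 1), bool)    # vertical edges (sides)
        e_v[:, :-1] |= f; e_v[:, 1:] |= f
        v = np.zeros((rows + 1, cols + 1), bool)  # vertices (corners)
        v[:-1, :-1] |= f; v[:-1, 1:] |= f; v[1:, :-1] |= f; v[1:, 1:] |= f
        return int(v.sum()) - int(e_h.sum() + e_v.sum()) + int(f.sum())

    rng = np.random.default_rng(0)
    field = rng.normal(size=(128, 128))       # stand-in for a density field
    for nu in (-1.0, 0.0, 1.0):               # excursion sets at three thresholds
        print(nu, euler_characteristic_2d(field > nu))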

Peter Coles; Andrew G. Davies; Russell C. Pearson

1996-03-27

270

QUANTIFYING ATYPICALITY IN AFFECTIVE FACIAL EXPRESSIONS OF CHILDREN WITH AUTISM SPECTRUM DISORDERS  

PubMed Central

We focus on the analysis, quantification and visualization of atypicality in affective facial expressions of children with High Functioning Autism (HFA). We examine facial Motion Capture data from typically developing (TD) children and children with HFA, using various statistical methods, including Functional Data Analysis, in order to quantify atypical expression characteristics and uncover patterns of expression evolution in the two populations. Our results show that children with HFA display higher asynchrony of motion between facial regions, rougher facial and head motion, and a larger range of facial region motion. Overall, subjects with HFA consistently display a wider variability in the expressive facial gestures that they employ. Our analysis demonstrates the utility of computational approaches for understanding behavioral data and brings new insights into the autism domain regarding the atypicality that is often associated with facial expressions of subjects with HFA.

Metallinou, Angeliki; Grossman, Ruth B.; Narayanan, Shrikanth

2013-01-01

271

Quantifying Permafrost Characteristics with DCR-ERT  

NASA Astrophysics Data System (ADS)

Geophysical methods are an efficient means of quantifying permafrost characteristics for Arctic road design and engineering. In the Alaskan Arctic, construction and maintenance of roads requires integration of permafrost: ground that is below 0 degrees C for two or more years. Features such as ice content and temperature are critical for understanding current and future ground conditions for planning, design and evaluation of engineering applications. This study focused on the proposed Foothills West Transportation Access project corridor, where the purpose is to construct a new all-season road connecting the Dalton Highway to Umiat. Four major areas were chosen that represented a range of conditions including gravel bars, alluvial plains, tussock tundra (both unburned and burned conditions), high and low centered ice-wedge polygons and an active thermokarst feature. Direct-current resistivity using galvanic contact (DCR-ERT) was applied over transects. In conjunction, complementary site data including boreholes, active layer depths, vegetation descriptions and site photographs were obtained. The boreholes provided information on soil morphology, ice texture and gravimetric moisture content. Horizontal and vertical resolutions in the DCR-ERT were varied to determine the presence or absence of ground ice; subsurface heterogeneity; and the depth to groundwater (if present). The four main DCR-ERT methods used were: 84 electrodes with 2 m spacing; 42 electrodes with 0.5 m spacing; 42 electrodes with 2 m spacing; and 84 electrodes with 1 m spacing. In terms of identifying the ground ice characteristics, the higher horizontal resolution DCR-ERT transects with either 42 or 84 electrodes and 0.5 or 1 m spacing were best able to differentiate wedge-ice. This evaluation is based on a combination of both borehole stratigraphy and surface characteristics. Simulated apparent resistivity values for permafrost areas varied from a low of 4582 Ω m to a high of 10034 Ω m. Previous studies found permafrost conditions with corresponding resistivity values as low as 5000 Ω m. This work emphasizes the necessity of tailoring the DCR-ERT survey to verified ground ice characteristics.

Schnabel, W.; Trochim, E.; Munk, J.; Kanevskiy, M. Z.; Shur, Y.; Fortier, R.

2012-12-01

272

Quantifying missing heritability at known GWAS loci.  

PubMed

Recent work has shown that much of the missing heritability of complex traits can be resolved by estimates of heritability explained by all genotyped SNPs. However, it is currently unknown how much heritability is missing due to poor tagging or additional causal variants at known GWAS loci. Here, we use variance components to quantify the heritability explained by all SNPs at known GWAS loci in nine diseases from WTCCC1 and WTCCC2. After accounting for expectation, we observed all SNPs at known GWAS loci to explain 1.29× more heritability than GWAS-associated SNPs on average (P = 3.3 × 10⁻?). For some diseases, this increase was individually significant: 2.07× for Multiple Sclerosis (MS) (P = 6.5 × 10⁻?) and 1.48× for Crohn's Disease (CD) (P = 1.3 × 10⁻³); all analyses of autoimmune diseases excluded the well-studied MHC region. Additionally, we found that GWAS loci from other related traits also explained significant heritability. The union of all autoimmune disease loci explained 7.15× more MS heritability than known MS SNPs (P < 1.0 × 10⁻¹?) and 2.20× more CD heritability than known CD SNPs (P = 6.1 × 10⁻?), with an analogous increase for all autoimmune diseases analyzed. We also observed significant increases in an analysis of > 20,000 Rheumatoid Arthritis (RA) samples typed on ImmunoChip, with 2.37× more heritability from all SNPs at GWAS loci (P = 2.3 × 10⁻?) and 5.33× more heritability from all autoimmune disease loci (P < 1 × 10⁻¹?) compared to known RA SNPs (including those identified in this cohort). Our methods adjust for LD between SNPs, which can bias standard estimates of heritability from SNPs even if all causal variants are typed. By comparing adjusted estimates, we hypothesize that the genome-wide distribution of causal variants is enriched for low-frequency alleles, but that causal variants at known GWAS loci are skewed towards common alleles. These findings have important ramifications for fine-mapping study design and our understanding of complex disease architecture. PMID:24385918

Gusev, Alexander; Bhatia, Gaurav; Zaitlen, Noah; Vilhjalmsson, Bjarni J; Diogo, Dorothée; Stahl, Eli A; Gregersen, Peter K; Worthington, Jane; Klareskog, Lars; Raychaudhuri, Soumya; Plenge, Robert M; Pasaniuc, Bogdan; Price, Alkes L

2013-01-01

273

Quantifying lithological variability in the mantle  

NASA Astrophysics Data System (ADS)

We present a method that can be used to estimate the amount of recycled material present in the source region of mid-ocean ridge basalts by combining three key constraints: (1) the melting behaviour of the lithologies identified to be present in a mantle source, (2) the overall volume of melt production, and (3) the proportion of melt production attributable to melting of each lithology. These constraints are unified in a three-lithology melting model containing lherzolite, pyroxenite and harzburgite, representative products of mantle differentiation, to quantify their abundance in igneous source regions. As a case study we apply this method to Iceland, a location with sufficient geochemical and geophysical data to meet the required observational constraints. We find that to generate the 20 km of igneous crustal thickness at Iceland's coasts, with 30±10% of the crust produced from melting a pyroxenitic lithology, requires an excess mantle potential temperature (ΔTp) of ~130 °C (Tp ≈ 1460 °C) and a source consisting of at least 5% recycled basalt. Therefore, the mantle beneath Iceland requires a significant excess temperature to match geophysical and geochemical observations: lithological variation alone cannot account for the high crustal thickness. Determining a unique source solution is only possible if mantle potential temperature is known precisely and independently; otherwise a family of possible lithology mixtures is obtained across the range of viable ΔTp. For Iceland this uncertainty in ΔTp means that the mantle could be >20% harzburgitic if ΔTp > 150 °C (Tp > 1480 °C). The consequences of lithological heterogeneity for plume dynamics in various geological contexts are also explored through thermodynamic modelling of the densities of lherzolite, basalt, and harzburgite mixtures in the mantle. All lithology solutions for Iceland are buoyant in the shallow mantle at the ΔTp for which they are valid; however, only lithology mixtures incorporating a significant harzburgite component are able to reproduce recent estimates of the Iceland plume's volume flux. Using the literature estimates of the amount of recycled basalt in the sources of Hawaiian and Siberian volcanism, we find that they are negatively buoyant in the upper mantle, even at the extremes of their expected ΔTp. One solution to this problem is that low density refractory harzburgite is a more ubiquitous component in mantle plumes than previously acknowledged.
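
A simple way to see the buoyancy argument is to compare the density of a lherzolite–basalt–harzburgite mixture, thermally expanded by the excess potential temperature, against ambient mantle. The densities and expansivity below are illustrative assumptions, not the paper's thermodynamically modeled values.

    # Illustrative shallow-mantle densities (kg/m^3); assumed, not the paper's values
    rho = {"lherzolite": 3330.0, "basalt": 3450.0, "harzburgite": 3310.0}
    alpha = 3.0e-5    # thermal expansivity (1/K); assumed
    dTp = 130.0       # excess potential temperature (K), as inferred for Iceland

    def mixture_density(fractions, dT):
        # Linear mixing of lithology densities, expanded by the excess temperature dT
        rho_mix = sum(f * rho[lith] for lith, f in fractions.items())
        return rho_mix * (1.0 - alpha * dT)

    ambient = rho["lherzolite"]   # ambient mantle approximated as unheated lherzolite
    source = {"lherzolite": 0.75, "basalt": 0.05, "harzburgite": 0.20}
    print("buoyant:", mixture_density(source, dTp) < ambient)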

Shorttle, Oliver; Maclennan, John; Lambart, Sarah

2014-06-01

274

Quantifying the impacts of global disasters  

NASA Astrophysics Data System (ADS)

The US Geological Survey, National Oceanic and Atmospheric Administration, California Geological Survey, and other entities are developing a Tsunami Scenario, depicting a realistic outcome of a hypothetical but plausible large tsunami originating in the eastern Aleutian Arc and affecting the west coast of the United States, including Alaska and Hawaii. The scenario includes earth-science effects, damage and restoration of the built environment, and social and economic impacts. Like the earlier ShakeOut and ARkStorm disaster scenarios, the purpose of the Tsunami Scenario is to apply science to quantify the impacts of natural disasters in a way that can be used by decision makers in the affected sectors to reduce the potential for loss. Most natural disasters are local. A major hurricane can destroy a city or damage a long swath of coastline while mostly sparing inland areas. The largest earthquake on record caused strong shaking along 1500 km of Chile, but left the capital relatively unscathed. Previous scenarios have used the local nature of disasters to focus interaction with the user community. However, the capacity for global disasters is growing with the interdependency of the global economy. Earthquakes have disrupted global computer chip manufacturing and caused stock market downturns. Tsunamis, however, can be global in their extent and direct impact. Moreover, the vulnerability of seaports to tsunami damage can increase the global consequences. The Tsunami Scenario is trying to capture these widespread effects while maintaining the close interaction with users that has been one of the most successful features of the previous scenarios. The scenario tsunami occurs in the eastern Aleutians with a source similar to the 2011 Tohoku event. Geologic similarities support the argument that a Tohoku-like source is plausible in Alaska. It creates a major near-field tsunami in the Aleutian arc and peninsula, a moderate tsunami in the US Pacific Northwest, a large but not maximal tsunami in Hawaii, and the largest plausible tsunami in southern California. To support the analysis of global impacts, we begin with the Ports of Los Angeles and Long Beach, which account for >40% of the imports to the United States. We expand from there throughout California for the first level of economic analysis. We are looking to work with Alaska and Hawaii over the next year, especially on similar economic issues in ports, and to expand the analysis to consider economic interactions between the regions.

Jones, L. M.; Ross, S.; Wilson, R. I.; Borrero, J. C.; Brosnan, D.; Bwarie, J. T.; Geist, E. L.; Hansen, R. A.; Johnson, L. A.; Kirby, S. H.; Long, K.; Lynett, P. J.; Miller, K. M.; Mortensen, C. E.; Perry, S. C.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Thio, H. K.; Wein, A. M.; Whitmore, P.; Wood, N. J.

2012-12-01

275

Quantifying active tectonic processes in orogenic belts from river analysis of the Sierra Nevada (Spain)  

NASA Astrophysics Data System (ADS)

The landscape of active orogenic belts is the result of the interaction between tectonic and surface processes. This interaction can be quantified by analysing bedrock river profiles, as they are the first features to respond to active tectonics. Sierra Nevada is an E-W oriented, ~80 km long, ~40 km wide young mountain chain in southern Spain which is seismically active. The western and southern margins of the chain are defined by normal faults with some strike-slip components, which have been active at least since the Pleistocene. Seismicity is less pronounced in the north, where sedimentary basins now in exhumation are exposed. The geographical location of the Sierra Nevada in the southern Mediterranean, its E-W orientation, and its pronounced elevation (highest peak, Mulhacen, 3,480 m) result in a climate contrast between the higher western flank and the eastern flanks of the mountain. Precipitation occurs mainly as snowfall during winter; in the west, where peaks have elevations of more than 3,000 m, snow is present for up to 4 months per year. The difference in tectonic activity and climate is reflected in the river profiles. Analysing these, we have divided the Sierra Nevada into three main areas: the northern flank, where rivers show predominantly concave profiles; the western flank, where rivers show tendencies towards non-equilibrium; and the southern flank, where rivers are not in equilibrium and often show convex profiles. The contribution climate makes in shaping these profiles is, however, unknown. We have analysed 16 longitudinal river profiles along the Sierra Nevada, quantified their morphological parameters, and identified the presence of knickpoints. These results show a strong correlation between river disequilibrium and active seismicity on the southern flank; this correlation is not as clearly defined in the western area, where, despite the high tectonic activity, rivers are closer to equilibrium. The northern rivers are in equilibrium. We used stream power maps to model the effects of tectonics and climate on river profiles to provide insight into the erosion and exhumation of the Sierra Nevada. We suggest that the presence of snow and, probably, water condensation increases the erosional power of the rivers on the western flank; hence we observe river profiles closer to equilibrium in a highly active tectonic area.

Carracedo, A.; Beucher, R.; Persano, C.; Jansen, J.; Codilean, A.; Hoey, T.; Bishop, P.

2012-04-01

276

Quantifying the role of forest soil and bedrock in the acid neutralization of surface water in steep hillslopes.  

PubMed

The role of soil and bedrock in acid neutralizing processes has been difficult to quantify because of hydrological and biogeochemical uncertainties. To quantify those roles, hydrochemical observations were conducted at two hydrologically well-defined, steep granitic hillslopes in the Tanakami Mountains of Japan. These paired hillslopes are similar except for their soils; Fudoji is leached of base cations (base saturation <6%), while Rachidani is covered with fresh soil (base saturation >30%), because the erosion rate is 100-1000 times greater. The results showed that (1) soil solution pH at the soil-bedrock interface at Fudoji (4.3) was significantly lower than that of Rachidani (5.5), (2) the hillslope discharge pH in both hillslopes was similar (6.7-6.8), and (3) at Fudoji, 60% of the base cations leaching from the hillslope were derived from bedrock, whereas only 20% were derived from bedrock in Rachidani. Further, previously published results showed that the stream pH could not be predicted from the acid deposition rate and soil base saturation status. These results demonstrate that bedrock plays an especially important role when the overlying soil has been leached of base cations. These results indicate that while the status of soil acidification is a first-order control on vulnerability to surface water acidification, in some cases such as at Fudoji, subsurface interaction with the bedrock determines the sensitivity of surface water to acidic deposition. PMID:15519722

Asano, Yuko; Uchida, Taro

2005-02-01

277

Correlating two methods of quantifying fungal activity: Heat production by isothermal calorimetry and ergosterol amount by gas chromatography–tandem mass spectrometry  

Microsoft Academic Search

Two methods of quantifying fungal activity have been compared and correlated: isothermal calorimetry for measuring heat production and gas chromatography–tandem mass spectrometry (GC–MS/MS) for measuring ergosterol, a proxy for biomass. The measurements were made on four different fungi: Penicillium roqueforti, Cladosporium cladosporioides, Neopetromyces muricatus and the dry rot fungus Serpula lacrymans. The results showed linear correlations between ergosterol production and heat production.

Yujing Li; Lars Wadsö; Lennart Larsson; Jonny Bjurman

2007-01-01

278

Quantifying anthropogenic and natural contributions to thermosteric sea level rise  

NASA Astrophysics Data System (ADS)

Changes in thermosteric sea level at decadal and longer time scales respond to anthropogenic forcing and natural variability of the climate system. Disentangling these contributions is essential to quantify the impact of human activity in the past and to anticipate thermosteric sea level rise under global warming. Climate models, fed with radiative forcing, display a large spread of outputs with limited correspondence with the observationally based estimates of thermosteric sea level during the last decades of the twentieth century. Here we extract the common signal of climate models from Coupled Model Intercomparison Project Phase 5 using a signal-to-noise maximizing empirical orthogonal function technique for the period 1950-2005. Our results match the observed trends, improving on the widely used approach of multimodel ensemble averaging. We then compute the fraction of the observed thermosteric sea level rise of anthropogenic origin and conclude that 87% of the observed trend in the upper 700 m since 1970 is induced by human activity.

Marcos, Marta; Amores, Angel

2014-04-01

279

Quantifying the behavior of stock correlations under market stress.  

PubMed

Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios. PMID:23082242
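
The core computation here, the mean pairwise correlation in a time window set against a market-stress proxy, can be sketched as follows on synthetic returns. The factor model, window length, and stress proxy are assumptions; the printed number only demonstrates the calculation, not the empirical finding.

    import numpy as np

    rng = np.random.default_rng(1)
    T, N, window = 2000, 30, 22     # trading days, stocks, ~one month per window

    # Synthetic daily returns with a common market factor (stand-in for DJIA stocks)
    market = 0.01 * rng.standard_normal(T)
    returns = 0.8 * market[:, None] + 0.01 * rng.standard_normal((T, N))

    stress, avg_corr = [], []
    for t in range(0, T - window, window):
        r = returns[t:t + window]
        c = np.corrcoef(r.T)                              # N x N correlation matrix
        avg_corr.append(c[np.triu_indices(N, k=1)].mean())
        stress.append(abs(market[t:t + window].mean()))   # crude market-stress proxy

    print("corr(stress, mean pairwise correlation):",
          round(float(np.corrcoef(stress, avg_corr)[0, 1]), 2))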

Preis, Tobias; Kenett, Dror Y; Stanley, H Eugene; Helbing, Dirk; Ben-Jacob, Eshel

2012-01-01

280

Quantifying the Behavior of Stock Correlations Under Market Stress  

PubMed Central

Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios. PMID:23082242

Preis, Tobias; Kenett, Dror Y.; Stanley, H. Eugene; Helbing, Dirk; Ben-Jacob, Eshel

2012-01-01

281

Quantifying the Behavior of Stock Correlations Under Market Stress  

NASA Astrophysics Data System (ADS)

Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios.

Preis, Tobias; Kenett, Dror Y.; Stanley, H. Eugene; Helbing, Dirk; Ben-Jacob, Eshel

2012-10-01

282

Quantifying the relationship between financial news and the stock market.  

PubMed

The complex behavior of financial markets emerges from decisions made by many traders. Here, we exploit a large corpus of daily print issues of the Financial Times from 2nd January 2007 until 31st December 2012 to quantify the relationship between decisions taken in financial markets and developments in financial news. We find a positive correlation between the daily number of mentions of a company in the Financial Times and the daily transaction volume of a company's stock both on the day before the news is released, and on the same day as the news is released. Our results provide quantitative support for the suggestion that movements in financial markets and movements in financial news are intrinsically interlinked. PMID:24356666
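
The key measurement is a lagged correlation between daily mentions and daily transaction volume. A hedged sketch on synthetic series (all values and effect sizes are invented):

    import numpy as np

    rng = np.random.default_rng(2)
    days = 1500

    # Synthetic stand-ins for daily FT mentions and daily transaction volume
    mentions = rng.poisson(3, days).astype(float)
    volume = 1e6 * (1.0 + 0.2 * mentions               # same-day news effect
                    + 0.2 * np.roll(mentions, -1)      # volume ahead of next-day news
                    + 0.5 * rng.standard_normal(days))

    def corr_at_lag(x, y, lag):
        # Pearson correlation between x[t] and y[t + lag]
        if lag > 0:
            return np.corrcoef(x[:-lag], y[lag:])[0, 1]
        return np.corrcoef(x, y)[0, 1]

    print("same day:                 ", round(corr_at_lag(mentions, volume, 0), 2))
    print("volume on day before news:", round(corr_at_lag(volume, mentions, 1), 2))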

Alanyali, Merve; Moat, Helen Susannah; Preis, Tobias

2013-01-01

283

Quantifying the Relationship Between Financial News and the Stock Market  

NASA Astrophysics Data System (ADS)

The complex behavior of financial markets emerges from decisions made by many traders. Here, we exploit a large corpus of daily print issues of the Financial Times from 2nd January 2007 until 31st December 2012 to quantify the relationship between decisions taken in financial markets and developments in financial news. We find a positive correlation between the daily number of mentions of a company in the Financial Times and the daily transaction volume of a company's stock both on the day before the news is released, and on the same day as the news is released. Our results provide quantitative support for the suggestion that movements in financial markets and movements in financial news are intrinsically interlinked.

Alanyali, Merve; Moat, Helen Susannah; Preis, Tobias

2013-12-01

284

Quantifying light exposure patterns in young adult students  

PubMed Central

Exposure to bright light appears to be protective against myopia in both animals (chicks, monkeys) and children, but quantitative data on human light exposure are limited. In this study, we report on a technique for quantifying light exposure using wearable sensors. Twenty-seven young adult subjects wore a light sensor continuously for two weeks during one of three seasons, and also completed questionnaires about their visual activities. Light data were analyzed with respect to refractive error and season, and the objective sensor data were compared with subjects’ estimates of time spent indoors and outdoors. Subjects’ estimates of time spent indoors and outdoors were in poor agreement with durations reported by the sensor data. The results of questionnaire-based studies of light exposure should thus be interpreted with caution. The role of light in refractive error development should be investigated using multiple methods such as sensors to complement questionnaires. PMID:25342873

Alvarez, Amanda A.; Wildsoet, Christine F.

2014-01-01

285

Quantifying the Relationship Between Financial News and the Stock Market  

PubMed Central

The complex behavior of financial markets emerges from decisions made by many traders. Here, we exploit a large corpus of daily print issues of the Financial Times from 2nd January 2007 until 31st December 2012 to quantify the relationship between decisions taken in financial markets and developments in financial news. We find a positive correlation between the daily number of mentions of a company in the Financial Times and the daily transaction volume of a company's stock both on the day before the news is released, and on the same day as the news is released. Our results provide quantitative support for the suggestion that movements in financial markets and movements in financial news are intrinsically interlinked. PMID:24356666

Alanyali, Merve; Moat, Helen Susannah; Preis, Tobias

2013-01-01

286

Land cover change and remote sensing: Examples of quantifying spatiotemporal dynamics in tropical forests  

SciTech Connect

Research on human impacts or natural processes that operate over broad geographic areas must explicitly address issues of scale and spatial heterogeneity. While the tropical forests of Southeast Asia and Mexico have been occupied and used to meet human needs for thousands of years, traditional forest management systems are currently being transformed by rapid and far-reaching demographic, political, economic, and environmental changes. The dynamics of population growth, migration into the remaining frontiers, and responses to national and international market forces result in a demand for land to produce food and fiber. These results illustrate some of the mechanisms that drive current land use changes, especially in the tropical forest frontiers. By linking the outcome of individual land use decisions to measures of landscape fragmentation and change, the aggregated results show the hierarchy of temporal and spatial events that in summation result in global changes to the most complex and sensitive biome: tropical forests. By quantifying the spatial and temporal patterns of tropical forest change, researchers can assist policy makers by showing how landscape systems in these tropical forests are controlled by physical, biological, social, and economic parameters.

Krummel, J.R.; Su, Haiping [Argonne National Lab., IL (United States); Fox, J. [East-West Center, Honolulu, HI (United States); Yarnasan, S.; Ekasingh, M. [Chiang Mai Univ. (Thailand)

1995-06-01

287

Sodium borohydride/chloranil-based assay for quantifying total flavonoids.  

PubMed

A novel sodium borohydride/chloranil-based (SBC) assay for quantifying total flavonoids, including flavones, flavonols, flavanones, flavanonols, isoflavonoids, flavanols, and anthocyanins, has been developed. Flavonoids with a 4-carbonyl group were reduced to flavan-4-ols using sodium borohydride catalyzed with aluminum chloride. The flavan-4-ols were then oxidized to anthocyanins by chloranil in an acetic acid solution. The anthocyanins were reacted with vanillin in concentrated hydrochloric acid and then quantified spectrophotometrically at 490 nm. A representative of each common flavonoid class, including flavones (baicalein), flavonols (quercetin), flavanones (hesperetin), flavanonols (silibinin), isoflavonoids (biochanin A), and flavanols (catechin), showed excellent linear dose-responses in the general range of 0.1-10.0 mM. For most flavonoids, the detection limit was about 0.1 mM in this assay. The recoveries of quercetin from spiked samples of apples and red peppers were 96.5 +/- 1.4% (CV = 1.4%, n = 4) and 99.0 +/- 4.2% (CV = 4.2%, n = 4), respectively. The recovery of catechin from spiked samples of cranberry extracts was 97.9 +/- 2.0% (CV = 2.0%, n = 4). The total flavonoids of selected common fruits and vegetables were measured using this assay. Among the samples tested, blueberry had the highest total flavonoid content (689.5 +/- 10.7 mg of catechin equiv per 100 g of sample), followed by cranberry, apple, broccoli, and red pepper. This novel SBC total flavonoid assay can be widely used to measure the total flavonoid content of fruits, vegetables, whole grains, herbal products, dietary supplements, and nutraceutical products. PMID:18798633

He, Xiangjiu; Liu, Dong; Liu, Rui Hai

2008-10-22

288

The Local Dimension: a method to quantify the Cosmic Web  

E-print Network

It is now well accepted that the galaxies are distributed in filaments, sheets and clusters, all of which form an interconnected network known as the Cosmic Web. It is a big challenge to quantify the shapes of the interconnected structural elements that form this network. Tools like the Minkowski functionals which use global properties, though well suited for an isolated object like a single sheet or filament, are not suited for an interconnected network of such objects. We consider the Local Dimension D, defined through N(R) = A R^D, where N(R) is the galaxy number count within a sphere of comoving radius R centered on a particular galaxy, as a tool to locally quantify the shape in the neighbourhood of different galaxies along the Cosmic Web. We expect D ~ 1, 2 and 3 for a galaxy located in a filament, sheet and cluster, respectively. Using ΛCDM N-body simulations we find that it is possible to determine D through a power law fit to N(R) across the length-scales 2 to 10 Mpc for ~33% of the galaxies. We have visually identified the filaments and sheets corresponding to many of the galaxies with D ~ 1 and 2, respectively. In several other situations the structure responsible for the D value could not be visually identified, either due to its being tenuous or due to other dominating structures in the vicinity. We also show that the global distribution of the D values can be used to visualize and interpret how the different structural elements are woven into the Cosmic Web.
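
The power-law fit for D is straightforward to reproduce: count neighbours within growing radii and fit a line in log-log space. The sketch below, on synthetic point sets rather than simulation catalogues, recovers D ≈ 1 for a filament-like distribution and D ≈ 3 for a homogeneous cloud.

    import numpy as np

    def local_dimension(positions, center, radii):
        # Fit N(R) = A * R^D by least squares in log-log space; return D
        d = np.linalg.norm(positions - center, axis=1)
        counts = np.array([(d <= R).sum() for R in radii])
        D, _ = np.polyfit(np.log(radii), np.log(counts), 1)
        return D

    rng = np.random.default_rng(3)
    radii = np.linspace(2.0, 10.0, 9)   # comoving Mpc, the range used in the abstract

    # Synthetic comparison: a 1D filament versus a homogeneous 3D cloud
    filament = np.c_[rng.uniform(-50, 50, 20000), np.zeros((20000, 2))]
    cloud = rng.uniform(-50, 50, (500000, 3))
    print("filament D ~", round(local_dimension(filament, np.zeros(3), radii), 2))
    print("cloud    D ~", round(local_dimension(cloud, np.zeros(3), radii), 2))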

Prakash Sarkar; Somnath Bharadwaj

2008-12-09

289

Quantifying total suspended sediment export from the Burdekin River catchment using the loads regression estimator tool  

NASA Astrophysics Data System (ADS)

The loads regression estimator (LRE) was introduced by Wang et al. (2011) as an improved approach for quantifying the export of loads and the corresponding uncertainty from river systems, where data are limited. We extend this methodology and show how LRE can be used to analyze a 24-year record of total suspended sediment concentrations for the Burdekin River. For large catchments with highly variable discharge such as that of the Burdekin River, it is important to quantify loads and their uncertainties accurately to determine the current load and to monitor the effect of changes in catchment management. The extended methodology incorporates (1) multiple discounted flow terms to represent the effect of flow history on concentration, (2) a term that captures sediment trapping and spatial sources of flow in terms of the ratio of flow from above the Burdekin Falls Dam, and (3) catchment vegetation cover. Furthermore, we validated model structure and performance in relation to the application tested. We also considered errors in gauged flow rates of 10%, consistent with the literature. The results for the Burdekin site indicate substantial variability in loads across years. The inclusion of vegetation cover as a predictor had a significant impact on total suspended sediment (TSS) concentration, with TSS concentrations up to 2.1% lower per percentage-point increase in vegetation cover. TSS concentration was up to 38% lower in years with greater proportions of flow from above the dam. The extended LRE methodology resulted in improved model performance. The results suggest that management of vegetation cover in dry years can reduce TSS loads from the Burdekin catchment, and this is the focus of future work.
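
Structurally, this kind of model regresses log concentration on log flow, discounted-flow (flow-history) terms, and covariates such as vegetation cover, then sums concentration × flow to obtain a load. The sketch below imitates that structure on synthetic data; it is not the published LRE tool, and the log-normal bias correction assumes a known residual variance.

    import numpy as np

    rng = np.random.default_rng(4)
    days = 3000

    flow = np.exp(rng.normal(2.0, 1.0, days))            # synthetic daily discharge
    cover = np.clip(rng.normal(0.5, 0.1, days), 0, 1)    # vegetation cover fraction

    def discounted(x, rate):
        # Exponentially discounted flow history, one LRE-style regression term
        out = np.zeros_like(x)
        for t in range(1, len(x)):
            out[t] = rate * out[t - 1] + (1.0 - rate) * x[t - 1]
        return out

    X = np.c_[np.ones(days), np.log(flow), np.log1p(discounted(flow, 0.95)), cover]
    true_beta = np.array([1.0, 0.6, 0.2, -0.8])          # invented coefficients
    sigma = 0.3
    log_conc = X @ true_beta + sigma * rng.standard_normal(days)

    beta_hat = np.linalg.lstsq(X, log_conc, rcond=None)[0]
    # Load = sum of concentration * flow, with a log-normal bias correction
    load = np.sum(np.exp(X @ beta_hat + sigma**2 / 2.0) * flow)
    print("fitted coefficients:", beta_hat.round(2))
    print(f"estimated total load (arbitrary units): {load:.3e}")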

Kuhnert, Petra M.; Henderson, Brent L.; Lewis, Stephen E.; Bainbridge, Zoe T.; Wilkinson, Scott N.; Brodie, Jon E.

2012-04-01

290

Quantifying the Bioporosity of Recent Po Delta Sediments  

NASA Astrophysics Data System (ADS)

As part of project EuroSTRATAFORM, a series of shallow box-core samples were collected near the Po Delta in Italy. They reveal the presence of intense biogenic structures that are believed to have a strong influence on the physical properties of the sediments, as they introduce another class of porosity: bioporosity (volume of biopores/total volume of the sample). These structures therefore needed to be evaluated and quantified. Two of these cores were selected for a detailed analysis. This was done primarily using 3D CATSCAN imagery (tomographic intensity) and direct physico-chemical measurements. The tomographic intensity is a complex value controlled by many factors such as the grain size, mineralogy, consolidation, water content and porosity. Two methods were used to quantify the bioporosity: an absolute and a relative bioporosity measurement, both based on the use of the tomographic intensity. The relative method takes into account the variability of the sediment densities along the core, whereas the absolute method fixes the tomographic intensity based on the mean density of sediment for the whole core. Because of the evolution of the geometry of the biogenic structures, it became clear that the relative method was much better. Results have shown that the bioporosity could reach values as high as 40% and could account for more than half of the total porosity. These results suggest that biopores can significantly bias water content measurements of the matrix, thereby influencing the estimation of physical properties such as the plastic and liquid limits and the liquidity index.
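
A relative bioporosity estimate of the kind described, thresholding each depth slice of a CT volume against its own statistics so that down-core density changes do not bias the result, might look like the following. The volume, threshold, and pore geometry are invented for illustration, and this is only our reading of the relative method.

    import numpy as np

    rng = np.random.default_rng(5)

    # Synthetic stand-in for a box-core CT volume (tomographic intensity)
    core = rng.normal(1000.0, 30.0, (60, 60, 120))
    # Carve a low-intensity "biopore" tube along the core axis
    core[28:32, 28:32, :] = rng.normal(850.0, 10.0, (4, 4, 120))

    # Relative thresholding: compare each depth slice to its own mean and spread
    slice_mean = core.mean(axis=(0, 1), keepdims=True)
    slice_std = core.std(axis=(0, 1), keepdims=True)
    biopores = core < (slice_mean - 3.0 * slice_std)

    print(f"bioporosity: {100.0 * biopores.sum() / core.size:.2f}% of total volume")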

Locat, J.; Levesque, M.; Lee, H.; Leroueil, S.

2004-12-01

291

Quantifying Local Radiation-Induced Lung Damage From Computed Tomography  

SciTech Connect

Purpose: Optimal implementation of new radiotherapy techniques requires accurate predictive models for normal tissue complications. Since clinically used dose distributions are nonuniform, local tissue damage needs to be measured and related to local tissue dose. In lung, radiation-induced damage results in density changes that have been measured by computed tomography (CT) imaging noninvasively, but not yet on a localized scale. Therefore, the aim of the present study was to develop a method for quantification of local radiation-induced lung tissue damage using CT. Methods and Materials: CT images of the thorax were made 8 and 26 weeks after irradiation of 100%, 75%, 50%, and 25% lung volume of rats. Local lung tissue structure (S_L) was quantified from local mean and local standard deviation of the CT density in Hounsfield units in 1-mm³ subvolumes. The relation of changes in S_L (ΔS_L) to histologic changes and breathing rate was investigated. Feasibility for clinical application was tested by applying the method to CT images of a patient with non-small-cell lung carcinoma and investigating the local dose-effect relationship of ΔS_L. Results: In rats, a clear dose-response relationship of ΔS_L was observed at different time points after radiation. Furthermore, ΔS_L correlated strongly to histologic endpoints (infiltrates and inflammatory cells) and breathing rate. In the patient, progressive local dose-dependent increases in ΔS_L were observed. Conclusion: We developed a method to quantify local radiation-induced tissue damage in the lung using CT. This method can be used in the development of more accurate predictive models for normal tissue complications.
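
Computing a local mean and local standard deviation of CT density over small subvolumes is the measurable core of S_L. A sketch with moving-window filters follows; the abstract does not give the exact combination of mean and SD used for S_L, so both are simply returned, and the synthetic volume is invented.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def local_structure(ct_hu, size=5):
        # Local mean and local standard deviation of CT density (HU)
        # in cubic moving windows
        mean = uniform_filter(ct_hu, size)
        sq_mean = uniform_filter(ct_hu ** 2, size)
        std = np.sqrt(np.maximum(sq_mean - mean ** 2, 0.0))
        return mean, std

    rng = np.random.default_rng(6)
    lung = rng.normal(-800.0, 40.0, (40, 40, 40))   # aerated lung, around -800 HU
    lung[20:, :, :] += 300.0                        # denser, "damaged" half
    m, s = local_structure(lung)
    print("local mean HU, healthy vs damaged:",
          round(float(m[:20].mean())), round(float(m[20:].mean())))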

Ghobadi, Ghazaleh; Hogeweg, Laurens E. [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Faber, Hette [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Department of Cell Biology, Section of Radiation and Stress Cell Biology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Tukker, Wim G.J. [Department of Radiology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Schippers, Jacobus M. [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Accelerator Department, Paul Scherrer Institut, Villigen (Switzerland); Brandenburg, Sytze [Kernfysisch Versneller Instituut, Groningen (Netherlands); Langendijk, Johannes A. [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Coppes, Robert P. [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Department of Cell Biology, Section of Radiation and Stress Cell Biology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands); Luijk, Peter van, E-mail: p.van.luijk@rt.umcg.n [Department of Radiation Oncology, University Medical Center Groningen/University of Groningen, Groningen (Netherlands)

2010-02-01

292

Talent Show Notes from the Office  

E-print Network

Highlights: Paintball; Talent Show; Notes from the Office; Spring B Places of Origin; Birthdays. Talent Show – Tryouts: We're so excited about the Talent Show! We have a long list of students, March 24. This is also the last day to sign up to be in the Talent Show. We also need a Master …

Pilyugin, Sergei S.

293

Quantifying and Generalizing Hydrologic Responses to Dam Regulation using a Statistical Modeling Approach  

SciTech Connect

Despite the ubiquitous existence of dams within riverscapes, much of our knowledge about dams and their environmental effects remains context-specific. Hydrology, more than any other environmental variable, has been studied in great detail with regard to dam regulation. While much progress has been made in generalizing the hydrologic effects of regulation by large dams, many aspects of hydrology show site-specific fidelity to dam operations, small dams (including diversions), and regional hydrologic regimes. A statistical modeling framework is presented to quantify and generalize hydrologic responses to varying degrees of dam regulation. Specifically, the objectives were to (1) compare the effects of local versus cumulative dam regulation, (2) determine the importance of different regional hydrologic regimes in influencing hydrologic responses to dams, and (3) evaluate how different regulation contexts lead to error in predicting hydrologic responses to dams. Overall, model performance was poor in quantifying the magnitude of hydrologic responses, but sufficient in classifying responses as negative or positive. Responses of some hydrologic indices to dam regulation were highly dependent upon hydrologic class membership and the purpose of the dam. The opposing coefficients between local and cumulative-dam predictors suggested that hydrologic responses to cumulative dam regulation are complex, and that predicting the hydrology downstream of individual dams, as opposed to multiple dams, may be more easily accomplished using statistical approaches. Results also suggested that particular contexts, including multipurpose dams, high cumulative regulation by multiple dams, diversions, close proximity to dams, and certain hydrologic classes, are all sources of increased error when predicting hydrologic responses to dams. Statistical models, such as the ones presented herein, show promise in their ability to model the effects of dam regulation at large spatial scales and to generalize the directionality of hydrologic responses.
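
Classifying the direction (negative or positive) of a hydrologic response from regulation predictors can be sketched as a logistic regression. The predictors, effect sizes, and data below are all synthetic assumptions, not the study's fitted model.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(7)
    n = 400

    # Synthetic predictors: local and cumulative degrees of regulation, dam purpose
    local_reg = rng.uniform(0, 1, n)
    cumulative_reg = rng.uniform(0, 2, n)
    hydropower = rng.integers(0, 2, n)

    # Synthetic direction (sign) of a hydrologic response to regulation
    logit = 1.5 * local_reg - 0.8 * cumulative_reg + 0.6 * hydropower - 0.3
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    X = np.c_[local_reg, cumulative_reg, hydropower]
    model = LogisticRegression().fit(X, y)
    print("coefficients:", model.coef_.round(2),
          "accuracy:", round(model.score(X, y), 2))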

McManamay, Ryan A [ORNL

2014-01-01

294

Quantifying invasion resistance: the use of recruitment functions to control for propagule pressure.  

PubMed

Invasive species distributions tend to be biased towards some habitats compared to others due to the combined effects of habitat-specific resistance to invasion and non-uniform propagule pressure. These two factors may also interact, with habitat resistance varying as a function of propagule supply rate. Recruitment experiments, in which the number of individuals recruiting into a population is measured under different propagule supply rates, can help us understand these interactions and quantify habitat resistance to invasion while controlling for variation in propagule supply rate. Here, we constructed recruitment functions for the invasive herb Hieracium lepidulum by sowing seeds at five different densities into six different habitat types in New Zealand's Southern Alps, repeated over two successive years, and monitored seedling recruitment and survival over a four-year period. We fitted recruitment functions that allowed us to estimate the total number of safe sites available for plants to occupy, which we used as a measure of invasion resistance, and tested several hypotheses concerning how invasion resistance differed among habitats and over time. We found significant differences in levels of H. lepidulum recruitment among habitats, which did not match the species' current distribution in the landscape. Local biotic and abiotic characteristics helped explain some of the between-habitat variation, with vascular plant species richness, vascular plant cover, and light availability all positively correlated with the number of safe sites for recruitment. Resistance also varied over time, however, with cohorts sown in successive years showing different levels of recruitment in some habitats but not others. These results show that recruitment functions can be used to quantify habitat resistance to invasion and to identify potential mechanisms of invasion resistance. PMID:24933811
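
A common recruitment-function form with a finite number of safe sites is N(seeds) = S(1 − exp(−a·seeds/S)), which saturates at S. The paper's exact functional form is not given in the abstract, so the sketch below fits this assumed form to invented sowing-density data.

    import numpy as np
    from scipy.optimize import curve_fit

    def recruitment(seeds, S, a):
        # Safe-site recruitment curve: recruits saturate at S available safe sites
        return S * (1.0 - np.exp(-a * seeds / S))

    # Invented sowing densities (seeds per plot) and mean recruit counts
    seeds = np.array([50.0, 100.0, 500.0, 1000.0, 5000.0])
    recruits = np.array([4.0, 8.5, 30.0, 44.0, 61.0])

    (S_hat, a_hat), _ = curve_fit(recruitment, seeds, recruits, p0=[60.0, 0.1])
    print(f"estimated safe sites per plot: {S_hat:.0f}; per-seed efficacy: {a_hat:.3f}")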

Miller, Alice L; Diez, Jeffrey M; Sullivan, Jon J; Wangen, Steven R; Wiser, Susan K; Meffin, Ross; Duncan, Richard P

2014-04-01

295

Empowering Women? The Oprah Winfrey Show  

Microsoft Academic Search

The Oprah Winfrey Show, the most-watched US daytime talk show, aims to empower women. This article examines the show's representations of gender and how images of `race', sexuality and class cross-cut them. It considers the show's status as television psychology. It explores the show's translation of aspects of black feminism to television, and discusses the social implications of its `super-real'

Corinne Squire

1994-01-01

296

Quantifying commuter exposures to volatile organic compounds  

NASA Astrophysics Data System (ADS)

Motor vehicles can be a predominant source of air pollution in cities. Traffic-related air pollution is often unavoidable for people who live in populous areas. Commuters may have high exposures to traffic-related air pollution as they are close to vehicle tailpipes. Volatile organic compounds (VOCs) are one class of air pollutants of concern because exposure to VOCs carries risk for adverse health effects. Specific VOCs of interest for this work include benzene, toluene, ethylbenzene, and xylenes (BTEX), which are often found in gasoline and combustion products. Although methods exist to measure time-integrated personal exposures to BTEX, there are few practical methods to measure a commuter's time-resolved BTEX exposure, which could identify peak exposures that would be concealed in a time-integrated measurement. This study evaluated the ability of a photoionization detector (PID) to measure commuters' exposure to BTEX using Tenax TA samples as a reference, and quantified the difference in BTEX exposure between cyclists and drivers with windows open and closed. To determine the suitability of the two measurement methods (PID and Tenax TA) for use in this study, the precision, linearity, and limits of detection (LODs) for both methods were determined in the laboratory with standard BTEX calibration gases. Volunteers commuted from their homes to their work places by cycling or driving while wearing a personal exposure backpack containing a collocated PID and Tenax TA sampler. Volunteers completed a survey and indicated whether the windows in their vehicle were open or closed. Comparing pairs of exposure data from the Tenax TA and PID sampling methods determined the suitability of the PID to measure the BTEX exposures of commuters. The difference between BTEX exposures of cyclists and drivers with windows open and closed in Fort Collins was determined. Both the PID and Tenax TA measurement methods were precise and linear when evaluated in the laboratory using standard BTEX gases. The LODs for the Tenax TA sampling tubes (determined with a sample volume of 1,000 standard cubic centimeters, close to the approximate commuter sample volumes collected) were orders of magnitude lower (0.04 to 0.7 parts per billion (ppb) for individual compounds of BTEX) than the PIDs' LODs (9.3 to 15 ppb of a BTEX mixture), which makes the Tenax TA sampling method more suitable for measuring BTEX concentrations in the sub-ppb range. PID and Tenax TA data for commuter exposures were inversely related. The concentrations of VOCs measured by the PID were substantially higher than BTEX concentrations measured by collocated Tenax TA samplers. The inverse trend and the large difference in magnitude between PID responses and Tenax TA BTEX measurements indicate the two methods may have been measuring different air pollutants that are negatively correlated. Drivers in Fort Collins, Colorado with closed windows experienced greater time-weighted average BTEX exposures than cyclists (p = 0.04). Commuter BTEX exposures measured in Fort Collins were lower than commuter exposures measured in prior studies that occurred in larger cities (Boston and Copenhagen). Although route and intake may affect a commuter's BTEX dose, these variables are outside the scope of this study. Within the limitations of this study (including small sample size, the small representative area of Fort Collins, and respiration rates not being taken into account), it appears health risks associated with traffic-induced BTEX exposures may be reduced by commuting via cycling instead of driving with windows closed, and by living in a less populous area that has less vehicle traffic. Although the PID did not reliably measure low-level commuter BTEX exposures, the Tenax TA sampling method did. The PID measured BTEX concentrations reliably in a controlled environment, at high concentrations (300-800 ppb), and in the absence of other air pollutants. In environments where there could be multiple chemicals present that may produce a PID signal (such a
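
The limit-of-detection determination mentioned above commonly follows the 3-sigma convention, LOD = 3·SD(blank) / slope of the calibration line. A sketch with invented calibration points and blank variability:

    import numpy as np

    # Hypothetical calibration: instrument response vs BTEX standard (ppb)
    conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0, 800.0])
    resp = np.array([2.1, 55.0, 110.0, 212.0, 431.0, 845.0])

    slope, intercept = np.polyfit(conc, resp, 1)
    sd_blank = 3.2                     # SD of repeated blank readings; assumed
    lod = 3.0 * sd_blank / slope       # common 3-sigma detection-limit definition
    print(f"calibration slope: {slope:.2f}; LOD ~ {lod:.1f} ppb")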

Kayne, Ashleigh

297

Quantifier elimination for real closed fields by cylindrical algebraic decomposition  

Microsoft Academic Search

Tarski in 1948 [18] published a quantifier elimination method for the elementary theory of real closed fields (which he had discovered in 1930). As noted by Tarski, any quantifier elimination method for this theory also provides a decision method, which enables one to decide whether any sentence of the theory is true or false. Since many important and difficult mathematical …

George E. Collins

1975-01-01

298

Quantifying population genetic differentiation from Next-Generation Sequencing data  

E-print Network

Quantifying population genetic differentiation from Next-Generation Sequencing data. We propose a novel method for quantifying population genetic differentiation from next-generation sequencing data of populations sampled at low coverage.

Nielsen, Rasmus

299

The Skolemization of existential quantifiers in intuitionistic logic  

E-print Network

The Skolemization of existential quantifiers in intuitionistic logic (Matthias Baaz and Rosalie Iemhoff). The method makes use of an existence predicate first introduced by Dana Scott. Skolemization can either be considered in the context of derivability …

Iemhoff, Rosalie

300

Dynamic generalised quantifiers and hypothetical contexts Robin Cooper  

E-print Network

Dynamic generalised quantifiers and hypothetical contexts (Robin Cooper). Abstract: We shall consider dynamic generalised quantifiers using type theory with records (TTR), as discussed in Cooper (2003, forthcoming). TTR follows closely the development of record types in Martin-Löf or constructive type theory but differs in …

Cooper, Robin

301

Quantifying biodiversity and asymptotics for a sequence of random strings.  

PubMed

We present a methodology for quantifying biodiversity at the sequence level by developing the probability theory on a set of strings. Further, we apply our methodology to the problem of quantifying the population diversity of microorganisms in several extreme environments and digestive organs and reveal the relation between microbial diversity and various environmental parameters. PMID:20866445

Koyano, Hitoshi; Kishino, Hirohisa

2010-06-01

302

Quantifying Climatological Ranges and Anomalies for Pacific Coral Reef Ecosystems  

E-print Network

Quantifying Climatological Ranges and Anomalies for Pacific Coral Reef Ecosystems (Jamison M. Gove et al.). The work develops spatially constrained, island- and atoll-scale metrics that quantify climatological range limits, and documents considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls.

303

Children with Autism Show Reduced Somatosensory Response: An MEG Study  

PubMed Central

Lay Abstract: Autism spectrum disorders are reported to affect nearly one out of every one hundred children, with over 90% of these children showing behavioral disturbances related to the processing of basic sensory information. Behavioral sensitivity to light touch, such as profound discomfort with clothing tags and physical contact, is a ubiquitous finding in children on the autism spectrum. In this study, we investigate the strength and timing of brain activity in response to simple, light taps to the fingertip. Our results suggest that children with autism show a diminished early response in the primary somatosensory cortex (S1). This finding is most evident in the left hemisphere. In exploratory analysis, we also show that tactile sensory behavior, as measured by the Sensory Profile, may be a better predictor of the intensity and timing of brain activity related to touch than a clinical autism diagnosis. We report that children with atypical tactile behavior have significantly lower amplitude somatosensory cortical responses in both hemispheres. Thus sensory behavioral phenotype appears to be a more powerful strategy for investigating neural activity in this cohort. This study provides evidence for atypical brain activity during sensory processing in autistic children and suggests that our sensory-behavior-based methodology may be an important approach to investigating brain activity in people with autism and neurodevelopmental disorders.

Scientific Abstract: The neural underpinnings of sensory processing differences in autism remain poorly understood. This prospective magnetoencephalography (MEG) study investigates whether children with autism show atypical cortical activity in the primary somatosensory cortex (S1) in comparison to matched controls. Tactile stimuli were clearly detectable, painless taps applied to the distal phalanx of the second (D2) and third (D3) fingers of the right and left hands. Three tactile paradigms were administered: an oddball paradigm (standard taps to D3 at an inter-stimulus interval (ISI) of 0.33 s and deviant taps to D2 with ISI ranging from 1.32–1.64 s); a slow-rate paradigm (D2) with an ISI matching the deviant taps in the oddball paradigm; and a fast-rate paradigm (D2) with an ISI matching the standard taps in the oddball. Study subjects were boys (age 7–11 years) with and without autism disorder. Sensory behavior was quantified using the Sensory Profile questionnaire. Boys with autism exhibited smaller amplitude left hemisphere S1 response to slow and deviant stimuli during the right hand paradigms. In post-hoc analysis, tactile behavior directly correlated with the amplitude of cortical response. Consequently, the children were re-categorized by degree of parent-report tactile sensitivity. This regrouping created a more robust distinction between the groups, with amplitude diminution in the left and right hemispheres and latency prolongation in the right hemisphere in the deviant and slow-rate paradigms for the affected children. This study suggests that children with autism have early differences in somatosensory processing, which likely influence later stages of cortical activity from integration to motor response. PMID:22933354

Marco, Elysa J.; Khatibi, Kasra; Hill, Susanna S.; Siegel, Bryna; Arroyo, Monica S.; Dowling, Anne F.; Neuhaus, John M.; Sherr, Elliott H.; Hinkley, Leighton N. B.; Nagarajan, Srikantan S.

2012-01-01

304

Everything, everywhere, all the time: quantifying the information gained from intensive hydrochemical sampling  

NASA Astrophysics Data System (ADS)

Catchment hydrochemical studies have suffered from a stark mismatch of measurement timescales: water fluxes are typically measured sub-hourly, but their chemical signatures are typically sampled only weekly or monthly. At the Plynlimon catchment in mid-Wales, however, precipitation and streamflow have now been sampled every seven hours for nearly two years, and analyzed for deuterium, oxygen-18, and more than 40 chemical species. This high-frequency sampling reveals temporal patterns that would be invisible in typical weekly monitoring samples. Furthermore, recent technological developments are now leading to systems that can provide measurements of rainfall and streamflow chemistry at hourly or sub-hourly intervals, similar to the time scales at which hydrometric data have long been available – and to provide these measurements for long spans of time, not just for intensive field campaigns associated with individual storms. But at what point will higher-frequency measurements become pointless, as additional measurements simply "connect the dots" between lower-frequency data points? Information theory, dating back to the original work of Shannon and colleagues in the 1940s, provides mathematical tools for rigorously quantifying the information content of a time series. The key input data for such an analysis are the power spectrum of the measured data, and the power spectrum of the measurement noise. Here we apply these techniques to the high-frequency Plynlimon data set. The results show that, at least up to 7-hourly sampling frequency, the information content of the time series increases nearly linearly with the frequency of sampling. These results rigorously quantify what inspection of the time series visually suggests: these high-frequency data do not simply "connect the dots" between lower-frequency measurements, but instead contain a richly textured signature of dynamic behavior in catchment hydrochemistry.
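
One way to implement the calculation alluded to here is to integrate a Shannon–Hartley-type term over the band resolvable at a given sampling rate. The spectra below (a 1/f signal against a flat noise floor) are assumptions, not the Plynlimon spectra; with them, the recoverable information indeed grows nearly linearly with sampling frequency, mirroring the finding above.

    import numpy as np

    def information_rate(f_nyquist, n=20000):
        # Shannon-Hartley-style integral: bits per unit time recoverable in the
        # band below the Nyquist frequency of the sampling scheme
        f = np.linspace(1e-5, f_nyquist, n)
        signal = 1.0 / f            # assumed 1/f spectrum for stream chemistry
        noise = 0.05                # assumed flat measurement-noise floor
        integrand = np.log2(1.0 + signal / noise)
        return integrand.sum() * (f[1] - f[0])

    for hours in (7 * 24, 24, 7):   # weekly, daily, and 7-hourly sampling
        f_nyq = 0.5 / hours         # cycles per hour
        print(f"sampling every {hours:3d} h -> ~{information_rate(f_nyq):.3f} bits/h")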

Kirchner, J. W.; Neal, C.

2011-12-01

305

47 CFR 90.505 - Showing required.  

Code of Federal Regulations, 2010 CFR

...CONTINUED) SAFETY AND SPECIAL RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Developmental Operation § 90.505 Showing...showing that: (1) The applicant has an organized plan of development leading to a specific objective; (2) The actual...

2010-10-01

306

47 CFR 90.505 - Showing required.  

Code of Federal Regulations, 2011 CFR

...CONTINUED) SAFETY AND SPECIAL RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Developmental Operation § 90.505 Showing...showing that: (1) The applicant has an organized plan of development leading to a specific objective; (2) The actual...

2011-10-01

307

New Hampshire Guide to 4-H Dog Shows  

E-print Network

New Hampshire Guide to 4-H Dog Shows. UNH Cooperative Extension 4-H Youth Development, Moiles House. Table of contents: Introduction; Purpose of the 4-H Dog Project …

New Hampshire, University of

308

Quantifying covalency and metallicity in correlated compounds undergoing metal-insulator transitions  

NASA Astrophysics Data System (ADS)

The tunability of bonding character in transition-metal compounds controls phase transitions and their fascinating properties such as high-temperature superconductivity, colossal magnetoresistance, spin-charge ordering, etc. However, separating out and quantifying the roles of covalency and metallicity derived from the same set of transition-metal d and ligand p electrons remains a fundamental challenge. In this study, we use bulk-sensitive photoelectron spectroscopy and configuration-interaction calculations for quantifying the covalency and metallicity in correlated compounds. The method is applied to study the first-order temperature- (T-) dependent metal-insulator transitions (MITs) in the cubic pyrochlore ruthenates Tl2Ru2O7 and Hg2Ru2O7. Core-level spectroscopy shows drastic T-dependent modifications which are well explained by including ligand-screening and metallic-screening channels. The core-level metallic-origin features get quenched upon gap formation in valence band spectra, while ionic and covalent components remain intact across the MIT. The results establish temperature-driven Mott-Hubbard MITs in three-dimensional ruthenates and reveal three energy scales: (a) 4d electronic changes occur on the largest (~eV) energy scale, (b) the band-gap energies/charge gaps (Eg ~ 160-200 meV) are intermediate, and (c) the lowest-energy scale corresponds to the transition temperature TMIT (~10 meV), which is also the spin gap energy of Tl2Ru2O7 and the magnetic-ordering temperature of Hg2Ru2O7. The method is general for doping- and T-induced transitions and is valid for V2O3, CrN, La1-xSrxMnO3, La2-xSrxCuO4, etc. The obtained transition-metal-ligand (d-p) bonding energies (V ~ 45-90 kcal/mol) are consistent with thermochemical data, and with energies of typical heteronuclear covalent bonds such as C-H, C-O, C-N, etc. In contrast, the metallic-screening energies of correlated compounds form a weaker class (V* ~ 10-40 kcal/mol) but are still stronger than van der Waals and hydrogen bonding. The results identify and quantify the roles of covalency and metallicity in 3d and 4d correlated compounds undergoing metal-insulator transitions.

Chainani, Ashish; Yamamoto, Ayako; Matsunami, Masaharu; Eguchi, Ritsuko; Taguchi, Munetaka; Takata, Yasutaka; Takagi, Hidenori; Shin, Shik; Nishino, Yoshinori; Yabashi, Makina; Tamasaku, Kenji; Ishikawa, Tetsuya

2013-01-01

309

End-of-Semester Barbecue Talent Show  

E-print Network

Highlights: End-of-Semester Barbecue; Talent Show; Scholarship Winners; St. Francis Food Drive; Ceremony. Details will be in next week's Weekly. Talent Show Dress Rehearsal: All acts MUST come to the dress rehearsal before the Talent Show but don't miss class! Bring money for pizza. When …

Pilyugin, Sergei S.

310

Inside Gun Shows What Goes On  

E-print Network

Preface. Inside Gun Shows: What Goes On When Everybody Thinks Nobody's Watching. … Gun-Violence Effort. She put gun shows on my radar and is an ace straw-purchase spotter. Thanks also to Barbara Claire … a great public institution. He was right. Contents: Preface; Executive Summary; Gun Shows in Context; How …

Leistikow, Bruce N.

311

Inside Gun Shows What Goes On  

E-print Network

Inside Gun Shows: What Goes On When Everybody Thinks Nobody's Watching. Epilogue. In February 2010, I attended a Crossroads of the West gun show at the Arizona State Fairgrounds; here, an update on each of the Phoenix observations made in the photo-essay portion of Inside Gun …

Leistikow, Bruce N.

312

Inside Gun Shows What Goes On  

E-print Network

Inside Gun Shows: What Goes On When Everybody Thinks Nobody's Watching. Garen Wintemute, MD, MPH. Executive Summary: Gun shows are surrounded by controversy. On the one hand, they are important economic …

Nguyen, Danh

313

Toward quantifying uncertainty in travel time tomography using the null-space shuttle  

E-print Network

Toward quantifying uncertainty in travel time tomography using the null-space shuttle (R. W. L. de …, J. Geophys. Res., 117, B03301, doi:10.1029/2011JB…). The method makes use of the null-space of the forward operator; we show that with the null-space shuttle it is possible to assess …

Utrecht, Universiteit

314

5.8 QUANTIFYING PRECIPITATION REDUCTION DUE TO AIR POLLUTION DOWNWIND OF MAJOR URBAN AREAS  

Microsoft Academic Search

Previous studies had shown qualitatively that urban and industrial air pollution suppresses cloud drop coalescence and so reduces precipitation from polluted clouds (Rosenfeld, 2000). Here we present the first study that attempts to quantify these effects based on time series of rain gauge data during the last century, on comparison with air pollution emissions records, and on …

Amir Givati; Daniel Rosenfeld

315

Quantifying Digit Force Vector Coordination during Precision Pinch  

PubMed Central

A methodology was established to investigate the contact mechanics of the thumb and the index finger at the digit-object interface during precision pinch. Two force/torque transducers were incorporated into an apparatus designed to overcome the thickness of each transducer and provide a flexible pinch span for digit placement and force application. To demonstrate the utility of the device, five subjects completed a pinch task with the pulps of their thumb and index finger. Inter-digit force vector coordination was quantified by examining (1) force vector component magnitudes, (2) resultant force vector magnitudes, (3) the coordination angle – the angle formed by the resultant vectors of the two digits, (4) direction angles – the angles formed by each vector and the coordinate axes, and (5) center of pressure locations. It was shown that the resultant force magnitude of the index finger exceeded that of the thumb by 0.8 ± 0.3 N and that the coordination angle between the digit resultant force vectors was 160.2 ± 4.6°. The experimental apparatus and analysis methods provide a valuable tool for the quantitative examination of biomechanics and motor control during dexterous manipulation. PMID:24443624
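
The coordination and direction angles described here reduce to standard vector geometry: an arccosine of a normalized dot product. A sketch with hypothetical force vectors (the values are invented, not the study's measurements):

    import numpy as np

    def coordination_angle(f1, f2):
        # Angle (degrees) between the two digits' resultant force vectors
        c = np.dot(f1, f2) / (np.linalg.norm(f1) * np.linalg.norm(f2))
        return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

    def direction_angles(f):
        # Angles (degrees) between a force vector and the x, y, z axes
        return np.degrees(np.arccos(f / np.linalg.norm(f)))

    thumb = np.array([3.1, 0.4, 0.9])     # hypothetical resultant forces (N)
    index = np.array([-3.6, -0.2, 0.8])
    print(f"coordination angle: {coordination_angle(thumb, index):.1f} deg")
    print("index direction angles:", direction_angles(index).round(1))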

Marquardt, Tamara L.; Li, Zong-Ming

2013-01-01

316

Ancient bacteria show evidence of DNA repair  

PubMed Central

Recent claims of cultivable ancient bacteria within sealed environments highlight our limited understanding of the mechanisms behind long-term cell survival. It remains unclear how dormancy, a favored explanation for extended cellular persistence, can cope with spontaneous genomic decay over geological timescales. There has been no direct evidence in ancient microbes for the most likely mechanism, active DNA repair, or for the metabolic activity necessary to sustain it. In this paper, we couple PCR and enzymatic treatment of DNA with direct respiration measurements to investigate long-term survival of bacteria sealed in frozen conditions for up to one million years. Our results show evidence of bacterial survival in samples up to half a million years in age, making this the oldest independently authenticated DNA to date obtained from viable cells. Additionally, we find strong evidence that this long-term survival is closely tied to cellular metabolic activity and DNA repair that over time proves to be superior to dormancy as a mechanism in sustaining bacteria viability. PMID:17728401

Johnson, Sarah Stewart; Hebsgaard, Martin B.; Christensen, Torben R.; Mastepanov, Mikhail; Nielsen, Rasmus; Munch, Kasper; Brand, Tina; Gilbert, M. Thomas P.; Zuber, Maria T.; Bunce, Michael; Rønn, Regin; Gilichinsky, David; Froese, Duane; Willerslev, Eske

2007-01-01

317

Quantifying tissue mechanical properties using photoplethysmography  

PubMed Central

Photoplethysmography (PPG) is a non-invasive optical method that can be used to detect blood volume changes in the microvascular bed of tissue. The PPG signal comprises two components: a pulsatile waveform (AC) attributed to changes in the interrogated blood volume with each heartbeat, and a slowly varying baseline (DC) combining low frequency fluctuations mainly due to respiration and sympathetic nervous system activity. In this report, we investigate the AC pulsatile waveform of the PPG pulse for ultimate use in extracting information regarding the biomechanical properties of tissue and vasculature. By analyzing the rise time of the pulse in the diastole period, we show that PPG is capable of measuring changes in the Young's modulus of tissue-mimicking phantoms with a resolution of 4 kPa in the range of 12 to 61 kPa. In addition, the shape of the pulse can potentially be used to diagnose vascular complications by differentiating upstream from downstream complications. A Windkessel model was used to model changes in the biomechanical properties of the circulation and to test the proposed concept. The modeling data confirmed the response seen in vitro and showed the same trends in the PPG rise and fall times with changes in compliance and vascular resistance. PMID:25071970
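
A minimal sketch of the two-element Windkessel idea invoked above, with a half-sine systolic inflow and arbitrary units; all parameter values are illustrative, not those of the study.

    import numpy as np

    def windkessel_pulse(R=1.0, C=1.0, t_max=1.0, dt=1e-3):
        """Two-element Windkessel: C*dP/dt = Q_in(t) - P/R (forward Euler)."""
        t = np.arange(0.0, t_max, dt)
        q_in = np.where(t < 0.3, np.sin(np.pi * t / 0.3), 0.0)  # systolic inflow
        p = np.zeros_like(t)
        for i in range(1, t.size):
            p[i] = p[i - 1] + dt * (q_in[i - 1] - p[i - 1] / R) / C
        return t, p

    def rise_time(t, p, lo=0.1, hi=0.9):
        """Time for the pulse to climb from 10% to 90% of its peak."""
        peak = p.max()
        return t[np.argmax(p >= hi * peak)] - t[np.argmax(p >= lo * peak)]

    for C in (0.5, 1.0, 2.0):  # stiffer vessels -> lower compliance
        t, p = windkessel_pulse(C=C)
        print(f"C={C}: rise time = {rise_time(t, p) * 1e3:.0f} ms")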

Akl, Tony J.; Wilson, Mark A.; Ericson, M. Nance; Cote, Gerard L.

2014-01-01

318

Quantifying the Ease of Scientific Discovery.  

PubMed

It has long been known that scientific output follows an exponential increase or, more properly, a logistic growth curve. The interplay between effort and discovery is clear, and the nature of the functional form has been thought to be due to many changes in the scientific process over time. Here I show a quantitative method for examining the ease of scientific progress, another necessary component in understanding scientific discovery. Using examples from three different scientific disciplines - mammalian species, chemical elements, and minor planets - I find the ease of discovery to conform to an exponential decay. In addition, I show how the pace of scientific discovery can be best understood as the outcome of both scientific output and ease of discovery. A quantitative study of the ease of scientific discovery in the aggregate, such as done here, has the potential to provide a great deal of insight into both the nature of future discoveries and the technical processes behind discoveries in science. PMID:22328796
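
A sketch of how such an exponential decay in the ease of discovery might be fit; the series here is synthetic and the constants are invented for illustration.

    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(0)
    years = np.arange(1800, 2000, 10)
    # Hypothetical "ease of discovery" series (e.g., discoveries per unit effort)
    ease = 5.0 * np.exp(-0.015 * (years - years[0])) + rng.normal(0, 0.05, years.size)

    decay = lambda t, a, k: a * np.exp(-k * t)
    (a, k), _ = curve_fit(decay, years - years[0], ease, p0=(1.0, 0.01))
    print(f"fitted decay constant k = {k:.4f} per year; half-life {np.log(2)/k:.0f} yr")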

Arbesman, Samuel

2011-02-01

319

Quantifying tissue mechanical properties using photoplethysmography  

SciTech Connect

Photoplethysmography (PPG) is a non-invasive optical method that can be used to detect blood volume changes in the microvascular bed of tissue. The PPG signal comprises two components: a pulsatile waveform (AC) attributed to changes in the interrogated blood volume with each heartbeat, and a slowly varying baseline (DC) combining low frequency fluctuations mainly due to respiration and sympathetic nervous system activity. In this report, we investigate the AC pulsatile waveform of the PPG pulse for ultimate use in extracting information regarding the biomechanical properties of tissue and vasculature. By analyzing the rise time of the pulse in the diastole period, we show that PPG is capable of measuring changes in the Young's modulus of tissue-mimicking phantoms with a resolution of 4 kPa in the range of 12 to 61 kPa. In addition, the shape of the pulse can potentially be used to diagnose vascular complications by differentiating upstream from downstream complications. A Windkessel model was used to model changes in the biomechanical properties of the circulation and to test the proposed concept. The modeling data confirmed the response seen in vitro and showed the same trends in the PPG rise and fall times with changes in compliance and vascular resistance.

Akl, Tony [Texas A&M University]; Wilson, Mark A. [University of Pittsburgh School of Medicine, Pittsburgh, PA]; Ericson, Milton Nance [ORNL]; Cote, Gerard L. [Texas A&M University]

2014-01-01

320

Quantifying human response capabilities towards tsunami threats at community level  

NASA Astrophysics Data System (ADS)

Decision makers at the community level need detailed information on tsunami risks in their area. Knowledge of potential hazard impact, exposed elements such as people, critical facilities and lifelines, people's coping capacity and recovery potential is crucial to plan precautionary measures for adaptation and to mitigate potential impacts of tsunamis on society and the environment. A crucial point within a people-centred tsunami risk assessment is to quantify the human response capabilities towards tsunami threats. Based on this quantification and its spatial representation in maps, tsunami-affected and safe areas, difficult-to-evacuate areas, evacuation target points and evacuation routes can be assigned and used as an important contribution to, e.g., community-level evacuation planning. A major component in the quantification of human response capabilities towards tsunami impacts is time. The human response capabilities depend on the estimated time of arrival (ETA) of a tsunami, the time until technical or natural warning signs (ToNW) can be received, the reaction time (RT) of the population (human understanding of a tsunami warning and the decision to take appropriate action), the evacuation time (ET, the time people need to reach a safe area) and the actually available response time (RsT = ETA - ToNW - RT). If RsT is larger than ET, people in the respective areas are able to reach a safe area and rescue themselves. Critical areas possess RsT values equal to or smaller than ET, and hence people within these areas will be directly affected by a tsunami. Quantifying these time components is challenging, and an attempt is presented here. The ETA can be derived by analyzing pre-computed tsunami scenarios for a respective area. For ToNW we assume that the early warning center is able to fulfil the Indonesian presidential decree to issue a warning within 5 minutes. RT is difficult to estimate, as intrinsic human factors such as educational level, beliefs, tsunami knowledge and experience, among others, play a role. An attempt to quantify this variable under high uncertainty is also presented. Quantifying ET is based on GIS modelling using a cost-weighted distance approach. The basic principle is to define the best evacuation path from a given point to the next safe area (shelter location), i.e., the fastest path from that point to the shelter location. The impacts of land cover, slope, population density, and population age and gender distribution are taken into account, as literature studies show these factors to be highly important. Knowing the fastest path and the distance to the next safe area, together with a spatially distributed pattern of evacuation speed, delivers the time needed from each location to a safe area. Comparing this time with the obtained value of RsT, the coverage area of an evacuation target point (safe area) can be assigned, and by incorporating knowledge of the capacity of an evacuation target point the respective coverage area is refined. Hence areas with weak, moderate and good human response capabilities can be detected. This allows calculation of the potential number of people affected (dead or injured) and the number of people displaced. First results for Kuta (Bali) for a worst-case tsunami event yield approximately 25 000 people affected when RT = 0 minutes (immediate evacuation upon receiving a tsunami warning) and up to 120 000 when RT > ETA (no evacuation action before the tsunami reaches land). Additionally, the fastest evacuation routes to the evacuation target points can be assigned.
Areas with weak response capabilities can be designated as priority areas in which to install, e.g., additional evacuation target points, or in which to increase tsunami knowledge and awareness to promote a faster reaction time. In particular, analyzing the underlying socio-economic properties causing deficiencies in responding to a tsunami threat can yield valuable information and guide the planning of adaptation measures. Keywords: Community level, Risk and vulnerability assessment, Early warning, Disaster management, Tsunami, Indonesia
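
The time budget above reduces to simple per-location arithmetic; a minimal sketch with hypothetical values (all in minutes):

    import numpy as np

    ETA  = np.array([30.0, 25.0, 40.0])  # estimated tsunami arrival time per cell
    ToNW = 5.0                           # warning issued within 5 min (decree)
    RT   = np.array([10.0, 15.0, 10.0])  # assumed reaction time per cell
    ET   = np.array([12.0, 8.0, 30.0])   # modelled evacuation time to shelter

    RsT = ETA - ToNW - RT                # actually available response time
    print(RsT >= ET)                     # True -> a safe area is reachable in time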

Post, J.; Mück, M.; Zosseder, K.; Wegscheider, S.; Taubenböck, H.; Strunz, G.; Muhari, A.; Anwar, H. Z.; Birkmann, J.; Gebert, N.

2009-04-01

321

A graph-theoretic method to quantify the airline route authority  

NASA Technical Reports Server (NTRS)

The paper introduces a graph-theoretic method to quantify the legal statements in a route certificate, which specifies an airline's routing restrictions. All the authorized nonstop and multistop routes, including the shortest-time routes, can be obtained, and the method suggests profitable route-structure alternatives to airline analysts. This method of quantifying the C.A.B. route authority was programmed in a software package, Route Improvement Synthesis and Evaluation, and demonstrated in a case study with a commercial airline. The study showed the utility of this technique in suggesting route alternatives and the possibility of improvements in the U.S. route system.

Chan, Y.

1979-01-01

322

Quantifying Influenza Vaccine Efficacy and Antigenic Distance  

Microsoft Academic Search

We introduce a new measure of antigenic distance between influenza A vaccine and circulating strains. The measure correlates well with efficacies of the H3N2 influenza A component of the annual vaccine between 1971 and 2004, as do results of a theory of the immune response to influenza following vaccination. This new measure of antigenic distance is correlated with vaccine efficacy

Vishal Gupta; David J. Earl; Michael W. Deem

2005-01-01

323

Decomposing Quantified Conjunctive (or Disjunctive) Formulas  

E-print Network

of the sentence, and where two variables are linked by an edge if they occur together in a common atomic formula. Abstract: Model checking, deciding if a logical sentence holds ... of result ... Model checking, the problem of deciding if a logical sentence holds on a structure

Dalmau, Victor

324

Quantifying Irregularity in Pulsating Red Giants  

NASA Astrophysics Data System (ADS)

Hundreds of red giant variable stars are classified as “type L,” which the General Catalogue of Variable Stars (GCVS) defines as “slow irregular variables of late spectral type...which show no evidence of periodicity, or any periodicity present is very poorly defined....” Self-correlation (Percy and Muhammed 2004) is a simple form of time-series analysis which determines the cycle-to-cycle behavior of a star, averaged over all the available data. It is well suited for analyzing stars which are not strictly periodic. Even for non-periodic stars, it provides a “profile” of the variability, including the average “characteristic time” of variability. We have applied this method to twenty-three L-type variables which have been measured extensively by AAVSO visual observers. We find a continuous spectrum of behavior, from irregular to semiregular.

Percy, J. R.; Esteves, S.; Lin, A.; Menezes, C.; Wu, S.

2009-12-01

325

Using automated comparisons to quantify handwriting individuality.  

PubMed

The proposition that writing profiles are unique is considered a key premise underlying forensic handwriting comparisons. An empirical study cannot validate this proposition because of the impossibility of observing sample documents written by every individual. The goal of this paper is to illustrate what can be stated about the individuality of writing profiles using a database of handwriting samples and an automated comparison procedure. In this paper, we provide a strategy for bounding the probability of observing two writers with indistinguishable writing profiles (regardless of the comparison methodology used) with a random match probability that can be estimated statistically. We illustrate computation of this bound using a convenience sample of documents and an automated comparison procedure based on Pearson's chi-squared statistic applied to frequency distributions of letter shapes extracted from handwriting samples. We also show how this bound can be used when designing an empirical study of individuality. PMID:21391999
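
As a sketch of this kind of comparison, here is a Pearson chi-squared test applied to two letter-shape frequency distributions; the counts are hypothetical, and this is not the authors' exact pipeline.

    import numpy as np
    from scipy.stats import chi2_contingency

    # Hypothetical letter-shape frequency counts from two handwriting samples
    doc_a = np.array([120, 45, 33, 80, 22])
    doc_b = np.array([110, 52, 30, 95, 18])

    chi2, p, dof, _ = chi2_contingency(np.vstack([doc_a, doc_b]))
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")  # small p -> distinguishable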

Saunders, Christopher P; Davis, Linda J; Buscaglia, JoAnn

2011-05-01

326

Quantifying non-Gaussianity for quantum information  

NASA Astrophysics Data System (ADS)

We address the quantification of non-Gaussianity (nG) of states and operations in continuous-variable systems and its use in quantum information. We start by illustrating in detail the properties and the relationships of two recently proposed measures of nG based on the Hilbert-Schmidt distance and the quantum relative entropy (QRE) between the state under examination and a reference Gaussian state. We then evaluate the non-Gaussianities of several families of non-Gaussian quantum states and show that the two measures have the same basic properties and also share the same qualitative behavior in most of the examples taken into account. However, we also show that they introduce a different relation of order; that is, they are not strictly monotone to each other. We exploit the nG measures for states in order to introduce a measure of nG for quantum operations, to assess Gaussification and de-Gaussification protocols, and to investigate in detail the role played by nG in entanglement-distillation protocols. Besides, we exploit the QRE-based nG measure to provide different insight on the extremality of Gaussian states for some entropic quantities such as conditional entropy, mutual information, and the Holevo bound. We also deal with parameter estimation and present a theorem connecting the QRE nG to the quantum Fisher information. Finally, since evaluation of the QRE nG measure requires the knowledge of the full density matrix, we derive some experimentally friendly lower bounds to nG for some classes of states and by considering the possibility of performing on the states only certain efficient or inefficient measurements.

Genoni, Marco G.; Paris, Matteo G. A.

2010-11-01

327

Dynamics of geometric and entropic quantifiers of correlations in open quantum systems  

E-print Network

We extend the Hilbert-Schmidt (square norm) distance, previously used to define the geometric quantum discord, to define also geometric quantifiers of total and classical correlations. We then compare the dynamics of geometric and entropic quantifiers of the different kinds of correlations in a non-Markovian open two-qubit system under local dephasing. We find that qualitative differences occur only for quantum discords. This is taken to imply that geometric and entropic discords are not, in general, equivalent in describing the dynamics of quantum correlations. We then show that also geometric and entropic quantifiers of total correlations present qualitative disagreements in the state space. This aspect indicates that the differences found for quantum discord are not attributable to a different separation, introduced by each measure, between the quantum and classical parts of correlations. Finally, we find that the Hilbert-Schmidt distance formally coincides with a symmetrized form of linear relative entropy.
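
For concreteness, the Hilbert-Schmidt (square norm) distance between two density matrices can be computed as below; the snippet illustrates only the distance itself, not the paper's full correlation quantifiers.

    import numpy as np

    def hs_distance_sq(rho, sigma):
        """Squared Hilbert-Schmidt distance Tr[(rho - sigma)^2]."""
        d = rho - sigma
        return np.real(np.trace(d @ d.conj().T))

    # Example: a two-qubit Bell state vs. the maximally mixed state
    bell = np.zeros((4, 4), dtype=complex)
    bell[0, 0] = bell[0, 3] = bell[3, 0] = bell[3, 3] = 0.5
    mixed = np.eye(4) / 4.0
    print(hs_distance_sq(bell, mixed))  # -> 0.75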

Bruno Bellomo; Rosario Lo Franco; Giuseppe Compagno

2011-04-20

328

Dynamics of geometric and entropic quantifiers of correlations in open quantum systems  

NASA Astrophysics Data System (ADS)

We extend the Hilbert-Schmidt (square norm) distance, previously used to define the geometric quantum discord, to define also geometric quantifiers of total and classical correlations. We then compare the dynamics of geometric and entropic quantifiers of the different kinds of correlations in a non-Markovian open two-qubit system under local dephasing. We find that qualitative differences occur only for quantum discords. This is taken to imply that geometric and entropic discords are not, in general, equivalent in describing the dynamics of quantum correlations. We then show that geometric and entropic quantifiers of total correlations also present qualitative disagreements in the state space. This aspect indicates that the differences found for quantum discord are not attributable to a different separation, introduced by each measure, between the quantum and classical parts of correlations. Finally, we find that the Hilbert-Schmidt distance formally coincides with a symmetrized form of linear relative entropy.

Bellomo, B.; Lo Franco, R.; Compagno, G.

2012-07-01

329

Thermoplasmonics: quantifying plasmonic heating in single nanowires.  

PubMed

Plasmonic absorption of light can lead to significant local heating in metallic nanostructures, an effect that defines the subfield of thermoplasmonics and has been leveraged in diverse applications from biomedical technology to optoelectronics. Quantitatively characterizing the resulting local temperature increase can be very challenging in isolated nanostructures. By measuring the optically induced change in resistance of metal nanowires with a transverse plasmon mode, we quantitatively determine the temperature increase in single nanostructures with the dependence on incident polarization clearly revealing the plasmonic heating mechanism. Computational modeling explains the resonant and nonresonant contributions to the optical heating and the dominant pathways for thermal transport. These results, obtained by combining electronic and optical measurements, place a bound on the role of optical heating in prior experiments and suggest design guidelines for engineered structures meant to leverage such effects. PMID:24382140

Herzog, Joseph B; Knight, Mark W; Natelson, Douglas

2014-02-12

330

Quantifying momenta through the Fourier transform  

E-print Network

Integral transforms arising from the separable solutions to the Helmholtz differential equation are presented. Pairs of these integral transforms are related via the Plancherel theorem and, ultimately, any of these integral transforms may be calculated using only Fourier transforms. This result is used to evaluate the mean value of momenta associated with the symmetries of the reduced wave equation. As an explicit example, the orbital angular momenta of plane and elliptic-cylindrical waves are presented.

Rodríguez-Lara, B. M.

2011-01-01

331

Quantifying changes in groundwater level and chemistry in Shahrood, northeastern Iran  

NASA Astrophysics Data System (ADS)

Temporal changes in the quantity and chemical status of groundwater resources must be accurately quantified to aid sustainable management of aquifers. Monitoring data show that the groundwater level in the Shahrood alluvial aquifer, northeastern Iran, continuously declined from 1993 to 2009, falling 11.4 m in 16 years. This constitutes a loss of 216 million m3 from the aquifer's stored groundwater reserve. Overexploitation and reduction in rainfall intensified the declining trend. In contrast, the reduced abstraction rate, the result of reduced borehole productivity (related to the reduction in saturated-zone thickness over time), slowed down the declining trend. Groundwater salinity varied substantially, showing a minor rising trend. For the same 16-year period, increases were recorded of the order of 24% for electrical conductivity, 12.4% for major ions, and 9.9% for pH. This research shows that the declining trend in groundwater level was not interrupted by fluctuations in rainfall and does not necessarily lead to water-quality deterioration. The water-level drop is greater near the aquifer's recharging boundary, while greater rates of salinity rise occur around the end of groundwater flow lines. Also, fresher groundwater experiences a greater rate of salinity increase. These findings are of significance for predicting the groundwater level and salinity of exhausted aquifers.

Ajdary, Khalil; Kazemi, Gholam A.

2014-03-01

332

Quantifying the biodiversity value of tropical primary, secondary, and plantation forests.  

PubMed

Biodiversity loss from deforestation may be partly offset by the expansion of secondary forests and plantation forestry in the tropics. However, our current knowledge of the value of these habitats for biodiversity conservation is limited to very few taxa, and many studies are severely confounded by methodological shortcomings. We examined the conservation value of tropical primary, secondary, and plantation forests for 15 taxonomic groups using a robust and replicated sample design that minimized edge effects. Different taxa varied markedly in their response to patterns of land use in terms of species richness and the percentage of species restricted to primary forest (varying from 5% to 57%), yet almost all between-forest comparisons showed marked differences in community structure and composition. Cross-taxon congruence in response patterns was very weak when evaluated using abundance or species richness data, but much stronger when using metrics based upon community similarity. Our results show that, whereas the biodiversity indicator group concept may hold some validity for several taxa that are frequently sampled (such as birds and fruit-feeding butterflies), it fails for those exhibiting highly idiosyncratic responses to tropical land-use change (including highly vagile species groups such as bats and orchid bees), highlighting the problems associated with quantifying the biodiversity value of anthropogenic habitats. Finally, although we show that areas of native regeneration and exotic tree plantations can provide complementary conservation services, we also provide clear empirical evidence demonstrating the irreplaceable value of primary forests. PMID:18003934

Barlow, J; Gardner, T A; Araujo, I S; Avila-Pires, T C; Bonaldo, A B; Costa, J E; Esposito, M C; Ferreira, L V; Hawes, J; Hernandez, M I M; Hoogmoed, M S; Leite, R N; Lo-Man-Hung, N F; Malcolm, J R; Martins, M B; Mestre, L A M; Miranda-Santos, R; Nunes-Gutjahr, A L; Overal, W L; Parry, L; Peters, S L; Ribeiro-Junior, M A; da Silva, M N F; da Silva Motta, C; Peres, C A

2007-11-20

333

Quantifying the biodiversity value of tropical primary, secondary, and plantation forests  

PubMed Central

Biodiversity loss from deforestation may be partly offset by the expansion of secondary forests and plantation forestry in the tropics. However, our current knowledge of the value of these habitats for biodiversity conservation is limited to very few taxa, and many studies are severely confounded by methodological shortcomings. We examined the conservation value of tropical primary, secondary, and plantation forests for 15 taxonomic groups using a robust and replicated sample design that minimized edge effects. Different taxa varied markedly in their response to patterns of land use in terms of species richness and the percentage of species restricted to primary forest (varying from 5% to 57%), yet almost all between-forest comparisons showed marked differences in community structure and composition. Cross-taxon congruence in response patterns was very weak when evaluated using abundance or species richness data, but much stronger when using metrics based upon community similarity. Our results show that, whereas the biodiversity indicator group concept may hold some validity for several taxa that are frequently sampled (such as birds and fruit-feeding butterflies), it fails for those exhibiting highly idiosyncratic responses to tropical land-use change (including highly vagile species groups such as bats and orchid bees), highlighting the problems associated with quantifying the biodiversity value of anthropogenic habitats. Finally, although we show that areas of native regeneration and exotic tree plantations can provide complementary conservation services, we also provide clear empirical evidence demonstrating the irreplaceable value of primary forests. PMID:18003934

Barlow, J.; Gardner, T. A.; Araujo, I. S.; Avila-Pires, T. C.; Bonaldo, A. B.; Costa, J. E.; Esposito, M. C.; Ferreira, L. V.; Hawes, J.; Hernandez, M. I. M.; Hoogmoed, M. S.; Leite, R. N.; Lo-Man-Hung, N. F.; Malcolm, J. R.; Martins, M. B.; Mestre, L. A. M.; Miranda-Santos, R.; Nunes-Gutjahr, A. L.; Overal, W. L.; Parry, L.; Peters, S. L.; Ribeiro-Junior, M. A.; da Silva, M. N. F.; da Silva Motta, C.; Peres, C. A.

2007-01-01

334

Solar System Odyssey - Fulldome Digital Planetarium Show  

NSDL National Science Digital Library

This is a Fulldome Digital Planetarium Show. Learners go on a futuristic journey through our Solar System. They explore the inner and outer planets, then the moons Titan, Europa, and Callisto as possible places to establish a human colony. A full-length preview of the show is available on the website; scroll down about three-quarters of the page to the section on children's shows (no direct link is available).

335

Quantifying Spatial Variability of Selected Soil Trace Elements and Their Scaling Relationships Using Multifractal Techniques  

PubMed Central

Multifractal techniques were utilized to quantify the spatial variability of selected soil trace elements and their scaling relationships in a 10.24-ha agricultural field in northeast China. 1024 soil samples were collected from the field and available Fe, Mn, Cu and Zn were measured in each sample. Descriptive results showed that Mn deficiencies were widespread throughout the field while Fe and Zn deficiencies tended to occur in patches. By estimating single multifractal spectra, we found that available Fe, Cu and Zn in the study soils exhibited high spatial variability and the existence of anomalies ([α(q)max − α(q)min] ≈ 0.54), whereas available Mn had a relatively uniform distribution ([α(q)max − α(q)min] ≈ 0.10). The joint multifractal spectra revealed that the strong positive relationships (r ≥ 0.86, P < 0.001) among available Fe, Cu and Zn were all valid across a wider range of scales and over the full range of data values, whereas available Mn was weakly related to available Fe and Zn (r ≤ 0.18, P < 0.01) but not related to available Cu (r = −0.03, P = 0.40). These results show that the variability and singularities of selected soil trace elements as well as their scaling relationships can be characterized by single and joint multifractal parameters. The findings presented in this study could be extended to predict selected soil trace elements at larger regional scales with the aid of geographic information systems. PMID:23874944
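
A generic sketch of the moment (partition-function) step behind such multifractal spectra, applied to a synthetic 1-D measure; the data, box sizes, and function name are illustrative, not from the study.

    import numpy as np

    def mass_exponents(measure, qs, scales=(1, 2, 4, 8, 16)):
        """Estimate mass exponents tau(q) from box sums chi(q, eps) ~ eps^tau(q);
        generalized dimensions follow as D(q) = tau(q) / (q - 1)."""
        m = np.asarray(measure, float)
        m = m / m.sum()
        taus = []
        for q in qs:
            log_chi, log_eps = [], []
            for s in scales:  # box size in samples
                boxes = m[: m.size // s * s].reshape(-1, s).sum(axis=1)
                boxes = boxes[boxes > 0]
                log_chi.append(np.log((boxes ** q).sum()))
                log_eps.append(np.log(s / m.size))
            taus.append(np.polyfit(log_eps, log_chi, 1)[0])  # slope = tau(q)
        return np.array(taus)

    field = np.random.default_rng(0).lognormal(size=1024)  # stand-in for a gridded map
    print(mass_exponents(field, qs=[-2, 0, 2]))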

Zhang, Fasheng; Yin, Guanghua; Wang, Zhenying; McLaughlin, Neil; Geng, Xiaoyuan; Liu, Zuoxin

2013-01-01

336

Quantifying radial diffusion coefficients of radiation belt electrons based on global MHD simulation and spacecraft measurements  

NASA Astrophysics Data System (ADS)

Radial diffusion is one of the most important acceleration mechanisms for radiation belt electrons, and it can be enhanced by drift-resonant interactions with large-scale fluctuations of the magnetosphere's magnetic and electric fields (the Pc5 range of ULF waves). In order to physically quantify the radial diffusion coefficient, DLL, we run global Lyon-Fedder-Mobarry (LFM) MHD simulations to obtain the mode structure and power spectrum of the ULF waves and validate the simulation results with available satellite measurements. The diffusion coefficients calculated directly from the MHD fields over a Corotating Interaction Region (CIR) storm in March 2008 are generally higher when solar wind dynamic pressure is enhanced or the AE index is high. Contrary to the conventional understanding, our results show that inside geosynchronous orbit the total diffusion coefficient from MHD fields is dominated by the contribution from electric field perturbations rather than magnetic field perturbations. The calculated diffusion coefficient has a physical dependence on μ (or electron energy) and L, which is missing in the empirical diffusion coefficient DLLKp, a function of the Kp index alone; DLLKp is generally greater than our calculated DLL during the storm event. Validation of the MHD ULF waves against spacecraft field data shows that for this event the LFM code reproduces reasonably well the Bz wave power observed by GOES and THEMIS satellites, while the Eφ power observed by THEMIS probes is generally underestimated by the LFM fields, on average by about a factor of ten.
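
For comparison, the Kp-driven empirical coefficient mentioned above is commonly taken in the Brautigam and Albert (2000) form; the constants below follow that widely cited parameterization but should be treated as an assumption here.

    def dll_kp(L, Kp):
        """Empirical electromagnetic radial diffusion coefficient (1/day),
        assumed here in the commonly cited Brautigam & Albert (2000) form."""
        return 10.0 ** (0.506 * Kp - 9.325) * L ** 10

    for Kp in (1, 3, 6):
        print(f"Kp={Kp}: DLL(L=6) = {dll_kp(6.0, Kp):.3e} per day")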

Tu, Weichao; Elkington, Scot R.; Li, Xinlin; Liu, Wenlong; Bonnell, J.

2012-10-01

337

Quantified energy dissipation rates in the terrestrial bow shock: 1. Analysis techniques and methodology  

NASA Astrophysics Data System (ADS)

We present a detailed outline and discussion of the analysis techniques used to compare the relevance of different energy dissipation mechanisms at collisionless shock waves. We show that the low-frequency, quasi-static fields contribute less to ohmic energy dissipation, (-j·E), than their high-frequency counterparts. In fact, we found that high-frequency, large-amplitude (>100 mV/m and/or >1 nT) waves are ubiquitous in the transition region of collisionless shocks. We quantitatively show that their fields, through wave-particle interactions, cause enough energy dissipation to regulate the global structure of collisionless shocks. The purpose of this paper, part one of two, is to outline and describe in detail the background, analysis techniques, and theoretical motivation for our new results presented in the companion paper. The companion paper presents the results of our quantitative energy dissipation rate estimates and discusses the implications. Together, the two manuscripts present the first study quantifying the contribution that high-frequency waves provide, through wave-particle interactions, to the total energy dissipation budget of collisionless shock waves.

Wilson, L. B.; Sibeck, D. G.; Breneman, A. W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.

2014-08-01

338

Quantifying entanglement of overlapping indistinguishable particles  

NASA Astrophysics Data System (ADS)

This thesis develops the quantitative study of quantum entanglement in systems of identical particles. Understanding this topic is essential for the construction of quantum information processing devices involving identical particles. A brief overview of necessary concepts and methods, such as the density matrix, the entanglement in pure and mixed states of distinguishable particles, and some common applications of entanglement, is given in the introduction. Some competing methods of calculating the entanglement in bipartite pure states of indistinguishable particles are examined. It is shown that only the 'site entropy' measure introduced by Zanardi satisfies all the criteria for a correct entanglement measure. A teleportation protocol which utilizes all the entanglement carried (in both the spin and space degrees of freedom) in a doubly-occupied molecular bonding orbital is presented. The output from an interferometer in a thought experiment described by Omar et al. is studied as an example to see whether entanglement can be separated into space-only, spin-only, and space-spin components. A similar exercise is performed for a doubly-occupied molecular bonding orbital. The relationship between these results and the application of superselection rules (SSRs) to the quantification of useful entanglement is discussed. A numerical method for estimating the entanglement of formation of a mixed state of arbitrary dimension by a conjugate gradient algorithm is described. The results of applying an implementation of the algorithm to both random and isotropic states of 2 qutrits (i.e. two three-dimensional systems) are described. Existing work on calculating entanglement between two sites in various spin systems is outlined. New methods for calculating the entanglement between two sites in various types of degenerate quantum gas - a Fermi gas, a Bose condensate, and a BCS superconductor - are described. The results of numerical studies of the entanglement in a normal metal and a BCS superconductor are reported, both with and without the application of superselection rules for local particle number conservation.

Gittings, Joseph R.

339

Quantifying oil filtration effects on bearing life  

NASA Technical Reports Server (NTRS)

Rolling-element bearing life is influenced by the number, size, and material properties of particles entering the Hertzian contact of the rolling element and raceway. In general, rolling-element bearing life increases with increasing level of oil filtration. Based upon test results, two equations are presented which allow for the adjustment of bearing L10 or catalog life based upon oil filter rating. It is recommended that where no oil filtration is used, catalog life be reduced by 50 percent.

Needelman, William M.; Zaretsky, Erwin V.

1991-01-01

340

Quantifying the Fate of Stabilised Criegee Intermediates under Atmospheric Conditions  

NASA Astrophysics Data System (ADS)

The products of alkene ozonolysis have been shown in field experiments to convert SO2 to H2SO4. One fate of H2SO4 formed in the atmosphere is the formation of sulphate aerosol. This has been reported to contribute -0.4 W m^-2 to anthropogenic radiative forcing via the direct aerosol effect and can also contribute to the indirect aerosol effect, currently one of the greatest uncertainties in climate modelling. The observed SO2 oxidation has been proposed to arise from reactions of the carbonyl oxide, or Criegee Intermediate (CI), formed during alkene ozonolysis reactions, with SO2. Direct laboratory experiments have confirmed that stabilised CIs (SCIs) react more quickly with SO2 (k > 10^-11 cm^3 s^-1) than was previously thought. The major sink for SCIs in the troposphere is reaction with water vapour. The importance of the SO2 + SCI reaction in H2SO4 formation has been shown in modelling work to be critically dependent on the ratio of the rate constants for the reaction of the SCI with SO2 and with H2O. Such modelling work has suggested that the SCI + SO2 reaction is only likely to be important in regions with high alkene emissions, e.g. forests. Here we present results from a series of ozonolysis experiments performed at the EUPHORE atmospheric simulation chamber, Valencia. These experiments measure the loss of SO2, in the presence of an alkene (ethene, cis-but-2-ene and 2,3-dimethyl butene), as a function of water vapour. From these experiments we quantify the relative rates of reaction of the three smallest SCIs with water and SO2, and their decomposition rates. In addition, the results appear to suggest that the conversion of SO2 to H2SO4 during alkene ozonolysis may be inconsistent with the SCI + SO2 mechanism alone, particularly at high relative humidities. The results suggest that SCIs are likely to provide at least an equivalent sink for SO2 to that of OH in the troposphere, in agreement with field observations. This work highlights the importance of alkene ozonolysis not only as a non-photolytic source of HOx but also as a source of other important atmospheric oxidants, and moves towards quantifying some of the important sinks of SCIs in the atmosphere.

Newland, Mike; Rickard, Andrew; Alam, Mohammed; Vereecken, Luc; Muñoz, Amalia; Ródenas, Milagros; Bloss, William

2014-05-01

341

Quantifying the natural history of breast cancer  

PubMed Central

Background: Natural history models of breast cancer progression provide an opportunity to evaluate and identify optimal screening scenarios. This paper describes a detailed Markov model characterising breast cancer tumour progression. Methods: Breast cancer is modelled by a 13-state continuous-time Markov model. The model differentiates between indolent and aggressive ductal carcinomas in situ tumours, and aggressive tumours of different sizes. We compared such aggressive (that is, non-indolent) cancers to those which are non-growing and regressing. Model input parameters and structure were informed by the 1978–1984 Ostergotland county breast screening randomised controlled trial. Overlaid on the natural history model is the effect of screening on diagnosis. Parameters were estimated using Bayesian methods. Markov chain Monte Carlo integration was used to sample the resulting posterior distribution. Results: The breast cancer incidence rate in the Ostergotland population was 21 (95% CI: 17–25) per 10 000 woman-years. Accounting for length-biased sampling, an estimated 91% (95% CI: 85–97%) of breast cancers were aggressive. Larger tumours, 21–50 mm, had an average sojourn of 6 years (95% CI: 3–16 years), whereas aggressive ductal carcinomas in situ took around half a month (95% CI: 0–1 month) to progress to the invasive ≤10 mm state. Conclusion: These tumour progression rate estimates may facilitate future work analysing cost-effectiveness and quality-adjusted life years for various screening strategies. PMID:24084766
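
The flavor of such a continuous-time Markov model can be sketched with a toy generator matrix; the states and rates below are invented for illustration and are not the paper's estimates.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy 3-state model (rates per year, hypothetical):
    # 0 = aggressive DCIS, 1 = invasive <=10 mm, 2 = invasive 11-20 mm (absorbing)
    Q = np.array([[-24.0, 24.0, 0.0],   # DCIS -> invasive in ~half a month
                  [  0.0, -0.5, 0.5],   # small invasive grows in ~2 years
                  [  0.0,  0.0, 0.0]])

    def simulate(state=0, t_max=20.0):
        t, path = 0.0, [(0.0, 0)]
        while -Q[state, state] > 0 and t < t_max:
            t += rng.exponential(1.0 / -Q[state, state])  # sojourn time in state
            probs = np.clip(Q[state], 0, None)
            probs[state] = 0.0
            state = rng.choice(len(probs), p=probs / probs.sum())
            path.append((t, state))
        return path

    print(simulate())  # [(time, state), ...] for one simulated tumour history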

Tan, K H X; Simonella, L; Wee, H L; Roellin, A; Lim, Y-W; Lim, W-Y; Chia, K S; Hartman, M; Cook, A R

2013-01-01

342

Quantifying the Spatial Dimension of Dengue Virus Epidemic Spread within a Tropical Urban Environment  

PubMed Central

Background: Dengue infection spread in naive populations occurs in an explosive and widespread fashion primarily due to the absence of population herd immunity, the population dynamics and dispersal of Ae. aegypti, and the movement of individuals within the urban space. Knowledge on the relative contribution of such factors to the spatial dimension of dengue virus spread has been limited. In the present study we analyzed the spatio-temporal pattern of a large dengue virus-2 (DENV-2) outbreak that affected the Australian city of Cairns (north Queensland) in 2003, quantified the relationship between dengue transmission and distance to the epidemic's index case (IC), evaluated the effects of indoor residual spraying (IRS) on the odds of dengue infection, and generated recommendations for city-wide dengue surveillance and control. Methods and Findings: We retrospectively analyzed data from 383 DENV-2 confirmed cases and 1,163 IRS applications performed during the 25-week epidemic period. Spatial (local k-function, angular wavelets) and space-time (Knox test) analyses quantified the intensity and directionality of clustering of dengue cases, whereas a semi-parametric Bayesian space-time regression assessed the impact of IRS and spatial autocorrelation on the odds of weekly dengue infection. About 63% of the cases clustered up to 800 m around the IC's house. Most cases were distributed along the NW-SE axis as a consequence of the spatial arrangement of blocks within the city and, possibly, the prevailing winds. Space-time analysis showed that DENV-2 infection spread rapidly, generating 18 clusters (comprising 65% of all cases), and that these clusters varied in extent as a function of their distance to the IC's residence. IRS applications had a significant protective effect on the further occurrence of dengue cases, but only when they reached coverage of 60% or more of the neighboring premises of a house. Conclusion: By applying sound statistical analysis to a very detailed dataset from one of the largest outbreaks that affected the city of Cairns in recent times, we not only described the spread of dengue virus in high detail but also quantified the spatio-temporal dimension of dengue virus transmission within this complex urban environment. In areas susceptible to non-periodic dengue epidemics, effective disease prevention and control would depend on the prompt response to introduced cases. We foresee that some of the results and recommendations derived from our study may also be applicable to other areas currently affected or potentially subject to dengue epidemics. PMID:21200419

Vazquez-Prokopec, Gonzalo M.; Kitron, Uriel; Montgomery, Brian; Horne, Peter; Ritchie, Scott A.

2010-01-01

343

Quantifying Forearm Muscle Activity during Wrist and Finger Movements by Means of Multi-Channel Electromyography  

PubMed Central

The study of hand and finger movement is an important topic with applications in prosthetics, rehabilitation, and ergonomics. Surface electromyography (sEMG) is the gold standard for the analysis of muscle activation. Previous studies investigated the optimal electrode number and positioning on the forearm to obtain information representative of muscle activation and robust to movements. However, the sEMG spatial distribution on the forearm during hand and finger movements and its changes due to different hand positions has never been quantified. The aim of this work is to quantify 1) the spatial localization of surface EMG activity of distinct forearm muscles during dynamic free movements of wrist and single fingers and 2) the effect of hand position on sEMG activity distribution. The subjects performed cyclic dynamic tasks involving the wrist and the fingers. The wrist tasks and the hand opening/closing task were performed with the hand in prone and neutral positions. A sensorized glove was used for kinematics recording. sEMG signals were acquired from the forearm muscles using a grid of 112 electrodes integrated into a stretchable textile sleeve. The areas of sEMG activity have been identified by a segmentation technique after a data dimensionality reduction step based on Non Negative Matrix Factorization applied to the EMG envelopes. The results show that 1) it is possible to identify distinct areas of sEMG activity on the forearm for different fingers; 2) hand position influences sEMG activity level and spatial distribution. This work gives new quantitative information about sEMG activity distribution on the forearm in healthy subjects and provides a basis for future works on the identification of optimal electrode configuration for sEMG based control of prostheses, exoskeletons, or orthoses. An example of use of this information for the optimization of the detection system for the estimation of joint kinematics from sEMG is reported. PMID:25289669

Gazzoni, Marco; Celadon, Nicolo; Mastrapasqua, Davide; Paleari, Marco; Margaria, Valentina; Ariano, Paolo

2014-01-01

344

Quantifying Phycocyanin Concentration in Cyanobacterial Algal Blooms from Remote Sensing Reflectance-A Quasi Analytical Approach  

NASA Astrophysics Data System (ADS)

Cyanobacterial harmful algal blooms (CHABs) are notorious for depleting the dissolved oxygen level, producing various toxins, threatening aquatic life, and altering food-web dynamics and overall ecosystem functioning in inland lakes, estuaries, and coastal waters. Many of these blooms produce toxins that can damage cells and tissues and even cause mortality of living organisms. Frequent monitoring of water quality at a synoptic scale has been made possible by virtue of remote sensing techniques. In this research, we present a novel technique to monitor CHABs using remote sensing reflectance products. We have modified a multi-band quasi-analytical algorithm that determines phytoplankton absorption coefficients from above-surface remote sensing reflectance measurements using an inversion method. In situ hyperspectral remote sensing reflectance data were collected from several highly turbid and productive aquaculture ponds. A novel technique was developed to further decompose the phytoplankton absorption coefficient at 620 nm and obtain the phycocyanin absorption coefficient at the same wavelength. An empirical relationship was established between phycocyanin absorption coefficients at 620 nm and measured phycocyanin concentrations. Model calibration showed a strong relationship between phycocyanin absorption coefficients and phycocyanin pigment concentration (r^2 = 0.94). Validation of the model on a separate dataset produced a root mean squared error of 167 mg m^-3 (phycocyanin range: 26-1012 mg m^-3). The results demonstrate that the new approach is suitable for quantifying phycocyanin concentration in cyanobacteria-dominated turbid productive waters. The band architecture of the model matches the band configuration of the Medium Resolution Imaging Spectrometer (MERIS), ensuring that MERIS reflectance products can be used to quantify phycocyanin in cyanobacterial harmful algal blooms in optically complex waters.

Mishra, S.; Mishra, D. R.; Tucker, C.

2011-12-01

345

Utilizing novel diversity estimators to quantify multiple dimensions of microbial biodiversity across domains  

PubMed Central

Background: Microbial ecologists often employ methods from classical community ecology to analyze microbial community diversity. However, these methods have limitations because microbial communities differ from macro-organismal communities in key ways. This study sought to quantify microbial diversity using methods that are better suited for data spanning multiple domains of life and dimensions of diversity. Diversity profiles are one novel, promising way to analyze microbial datasets. Diversity profiles encompass many other indices, provide effective numbers of diversity (mathematical generalizations of previous indices that better convey the magnitude of differences in diversity), and can incorporate taxa similarity information. To explore whether these profiles change interpretations of microbial datasets, diversity profiles were calculated for four microbial datasets from different environments spanning all domains of life as well as viruses. Both similarity-based profiles that incorporated phylogenetic relatedness and naïve (not similarity-based) profiles were calculated. Simulated datasets were used to examine the robustness of diversity profiles to varying phylogenetic topology and community composition. Results: Diversity profiles provided insights into microbial datasets that were not detectable with classical univariate diversity metrics. For all datasets analyzed, there were key distinctions between calculations that incorporated phylogenetic diversity as a measure of taxa similarity and naïve calculations. The profiles also provided information about the effects of rare species on diversity calculations. Additionally, diversity profiles were used to examine thousands of simulated microbial communities, showing that similarity-based and naïve diversity profiles only agreed approximately 50% of the time in their classification of which sample was most diverse. This is a strong argument for incorporating similarity information and calculating diversity with a range of emphases on rare and abundant species when quantifying microbial community diversity. Conclusions: For many datasets, diversity profiles provided a different view of microbial community diversity compared to analyses that did not take into account taxa similarity information, effective diversity, or multiple diversity metrics. These findings are a valuable contribution to data analysis methodology in microbial ecology. PMID:24238386
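
The "effective numbers" underlying a naive diversity profile are the Hill numbers; a minimal sketch follows (OTU counts hypothetical, and without the taxa-similarity weighting the study also uses).

    import numpy as np

    def hill_numbers(counts, qs):
        """Effective number of taxa ^qD for each order q:
        q=0 richness, q->1 exp(Shannon entropy), q=2 inverse Simpson."""
        p = np.asarray(counts, float)
        p = p[p > 0] / p.sum()
        out = []
        for q in qs:
            if np.isclose(q, 1.0):
                out.append(np.exp(-(p * np.log(p)).sum()))
            else:
                out.append((p ** q).sum() ** (1.0 / (1.0 - q)))
        return np.array(out)

    # Hypothetical OTU counts for one sample
    print(hill_numbers([50, 30, 10, 5, 3, 1, 1], qs=[0, 1, 2]))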

2013-01-01

346

Quantifying heart rate dynamics using different approaches of symbolic dynamics  

NASA Astrophysics Data System (ADS)

The analysis of symbolic dynamics applied to physiological time series is able to retrieve information about dynamical properties of the underlying system that cannot be gained with standard methods such as spectral analysis. Different approaches for the transformation of the original time series to the symbolic time series have been proposed, yet the differences between the approaches are unknown. In this study three different transformation methods are investigated: (1) symbolization according to the deviation from the average time series, (2) symbolization according to several equidistant levels between the minimum and maximum of the time series, (3) binary symbolization of the first derivative of the time series. Furthermore, permutation entropy was used to quantify the symbolic series. Each method was applied to the cardiac interbeat interval series RRi and its difference series ΔRRi of 17 healthy subjects obtained during head-up tilt testing. The symbolic dynamics of each method is analyzed by means of the occurrence of short sequences ("words") of length 3. The occurrence of words is grouped according to words without variations of the symbols (0V%), words with one variation (1V%), two like variations (2LV%) and two unlike variations (2UV%). Linear regression analysis showed that for method 1, 0V%, 1V%, 2LV% and 2UV% changed with increasing tilt angle. For method 2, 0V%, 2LV% and 2UV% changed with increasing tilt angle, and method 3 showed changes for 0V% and 1V%. Furthermore, the permutation entropy also decreased with increasing tilt angle. In conclusion, all methods are capable of reflecting changes of the cardiac autonomic nervous system during head-up tilt. All methods show that even the analysis of very short symbolic sequences is capable of tracking changes of the cardiac autonomic regulation during head-up tilt testing.
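
A sketch of method 2 and the word classification described above, on a synthetic RR series; the number of levels is an assumed choice.

    import numpy as np

    def word_distribution(rr, n_levels=6):
        """Symbolize on equidistant levels between min and max, then classify
        all length-3 words as 0V / 1V / 2LV / 2UV and return percentages."""
        edges = np.linspace(rr.min(), rr.max(), n_levels + 1)
        sym = np.clip(np.digitize(rr, edges) - 1, 0, n_levels - 1)
        counts = {"0V": 0, "1V": 0, "2LV": 0, "2UV": 0}
        for a, b, c in zip(sym, sym[1:], sym[2:]):
            d1, d2 = b - a, c - b
            if d1 == 0 and d2 == 0:
                counts["0V"] += 1       # no variation
            elif d1 == 0 or d2 == 0:
                counts["1V"] += 1       # one variation
            elif d1 * d2 > 0:
                counts["2LV"] += 1      # two like variations
            else:
                counts["2UV"] += 1      # two unlike variations
        total = sum(counts.values())
        return {k: 100.0 * v / total for k, v in counts.items()}

    rr = np.random.default_rng(2).normal(800, 50, 300)  # synthetic RR series (ms)
    print(word_distribution(rr))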

Cysarz, D.; Porta, A.; Montano, N.; Leeuwen, P. V.; Kurths, J.; Wessel, N.

2013-06-01

347

The Language of Show Biz: A Dictionary.  

ERIC Educational Resources Information Center

This dictionary of the language of show biz provides the layman with definitions and essays on terms and expressions often used in show business. The overall pattern of selection was intended to be more rather than less inclusive, though radio, television, and film terms were deliberately omitted. Lengthy explanations are sometimes used to express…

Sergel, Sherman Louis, Ed.

348

Acculturation, Cultivation, and Daytime TV Talk Shows  

Microsoft Academic Search

This study explored the cultivation phenomenon among international college students in the United States by examining the connection between levels of acculturation, daytime TV talk show viewing, and beliefs about social reality. It was expected that international students who were heavy viewers of daytime TV talk shows and who scored low on a measure of acculturation about the United States

Hyung-Jin Woo; Joseph R. Dominick

2003-01-01

349

2015 4-H State Food Show Guidelines  

E-print Network

2015 4-H State Food Show Guidelines. Bringing Texas to the Table. Educational programs of the Texas A&M AgriLife Extension Service are open to all people without regard to race, color, sex, religion ... Shawnte Clawson, MS. Subject: 2015 4-H State Food Show Guidelines. Being transmitted to you this year via e

350

Serving Up Activities for TV Cooking Shows.  

ERIC Educational Resources Information Center

This paper documents a presentation given on the use of English-language television cooking shows in English-as-a-Second-Language (ESL) and English-as-a-Foreign-Language (EFL) classrooms in Taiwan. Such shows can be ideal for classroom use, since they have a predictable structure consisting of short segments, are of interest to most students,…

Katchen, Johanna E.

351

Exploring visitor experiences at trade shows  

Microsoft Academic Search

Purpose – The purpose of this paper is to investigate business visitor behaviour at trade shows and to propose a complementary view based on the experiential perspective in marketing. Design/methodology/approach – The paper reports an ethnographic study conducted in the context of ten international trade shows in the textile-apparel industry in Europe. Findings – The study sheds light on the

Diego Rinallo; Stefania Borghini; Francesca Golfetto

2010-01-01

352

The Physics of Equestrian Show Jumping  

ERIC Educational Resources Information Center

This article discusses the kinematics and dynamics of equestrian show jumping. For some time I have attended a series of show jumping events at Spruce Meadows, an international equestrian center near Calgary, Alberta, often referred to as the "Wimbledon of equestrian jumping." I have always had a desire to write an article such as this…

Stinner, Art

2014-01-01

353

Virtual game show host — Dr. Chestr  

Microsoft Academic Search

This paper describes the design, implementation and evaluation of an interactive virtual human Dr. Chestr: Computerized Host Encouraging Students to Review. Game show hosts exert a unique personality that becomes the trademark of their respective game shows. Our aim is to create virtual humans that can interact naturally and spontaneously using speech, emotions and gesture. Dr. Chestr is our virtual

Raghavi Sakpal; Dale-Marie Wilson

2011-01-01

354

International Plowing Match & Farm Machinery Show  

NSDL National Science Digital Library

The 1995 International Plowing Match & Farm Machinery Show in Ontario, Canada, has a site on the Web. The IPM is a non-profit organization of volunteers which annually organizes Canada's largest farm machinery show. The event is commercial and educational. Thousands of school children and educators attend and participate in organized educational activities.

1995-01-01

355

Quantifying selective linear erosion in Antarctica  

NASA Astrophysics Data System (ADS)

David Sugden (1978) coined the term 'selective linear erosion' to describe landscapes, characteristic of high-latitude glaciated areas, that are distinguished by deep glacially excavated troughs separated by low-relief upland surfaces that show no evidence of glacial erosion. Sugden (and later researchers) proposed that this landscape form owed its existence to the thermal distribution within polar ice sheets: ice at high elevations is thin, frozen to its bed, and therefore protects rather than erodes the landscape; thicker ice in topographic depressions can sustain basal melting with consequent erosion by hydraulic and thermodynamic processes. This contrast in basal thermal regime implies an extreme contrast in erosion rates, which amplifies preexisting relief and gives rise to landscapes of selective linear erosion. These landscapes are currently exposed in formerly glaciated high-latitude regions of the northern continents. They also exist beneath the Antarctic ice sheets, where presumably the processes responsible for their formation are currently active. Here we argue that understanding how and when these landscapes form is important to understanding how ice sheets mediate climate-landscape interactions. However, the facts that: i) the processes in question occur beneath the modern Antarctic ice sheet, and ii) currently unglaciated portions of glacier troughs in Arctic and Antarctic landscapes are nearly universally submerged, present several challenges to attaining this understanding. Here we summarize geochemical and geochronological means of addressing these challenges. These include: first, cosmogenic-nuclide measurements that establish the Plio-Pleistocene erosion history of high-elevation plateau surfaces; second, thermochronometric observations on debris shed by glaciers occupying major troughs that provide information about when and how fast these troughs formed.

Balco, G.; Shuster, D. L.

2012-12-01

356

Quantifying hydrate formation and kinetic inhibition  

SciTech Connect

In the Prausnitz tradition, molecular and macroscopic evidence of hydrate formation and kinetic inhibition is presented. On the microscopic level, the first Raman spectra are presented for the formation of both uninhibited and inhibited methane hydrates with time. This method has the potential to provide a microscopic-based kinetics model. Three macroscopic aspects of natural gas hydrate kinetic inhibition are also reported: (1) The effect of hydrate dissociation residual structures was measured, which has application in decreasing the time required for subsequent formation. (2) The performance of a kinetic inhibitor (poly(N-vinylcaprolactam) or PVCap) was measured and correlated as a function of PVCap molecular weight and concentrations of PVCap, methanol, and salt in the aqueous phase. (3) Long-duration test results indicated that the use of PVCap can prevent pipeline blockage for a time exceeding the aqueous phase residence time in some gas pipelines.

Sloan, E.D.; Subramanian, S.; Matthews, P.N.; Lederhos, J.P.; Khokhar, A.A. [Colorado School of Mines, Golden, CO (United States). Center for Hydrate Research]

1998-08-01

357

Using sociometers to quantify social interaction patterns  

E-print Network

Research on human social interactions has traditionally relied on self-reports. Despite their widespread use, self-reported accounts of behaviour are prone to biases and necessarily reduce the range of behaviours, and the number of subjects, that may be studied simultaneously. The development of ever smaller sensors makes it possible to study group-level human behaviour in naturalistic settings outside research laboratories. We used such sensors, sociometers, to examine gender, talkativeness and interaction style in two different contexts. Here, we find that in the collaborative context, women were much more likely to be physically proximate to other women and were also significantly more talkative than men, especially in small groups. In contrast, there were no gender-based differences in the non-collaborative setting. Our results highlight the importance of objective measurement in the study of human behaviour, here enabling us to discern context specific, gender-based differences in interaction style.

Onnela, Jukka-Pekka; Pentland, Alex; Schnorf, Sebastian; Lazer, David

2014-01-01

358

Quantifying photometric observing conditions on Paranal using an IR camera  

E-print Network

A Low Humidity and Temperature Profiling (LHATPRO) microwave radiometer, manufactured by Radiometer Physics GmbH (RPG), is used to monitor sky conditions over ESO's Paranal observatory in support of VLT science operations. In addition to measuring precipitable water vapour (PWV) the instrument also contains an IR camera measuring sky brightness temperature at 10.5 µm. Due to its extended operating range down to -100 °C it is capable of detecting very cold and very thin, even sub-visual, cirrus clouds. We present a set of instrument flux calibration values as compared with a detrended fluctuation analysis (DFA) of the IR camera zenith-looking sky brightness data measured above Paranal taken over the past two years. We show that it is possible to quantify photometric observing conditions and that the method is highly sensitive to the presence of even very thin clouds but robust against variations of sky brightness caused by effects other than clouds such as variations of precipitable water vapour. Hence it can be used to determine photometric conditions for science operations.

Kerber, Florian; Hanuschik, Reinhard

2014-01-01

359

Quantifying neurotransmission reliability through metrics-based information analysis.  

PubMed

We set forth an information-theoretical measure to quantify neurotransmission reliability while taking into full account the metrical properties of the spike train space. This parametric information analysis relies on similarity measures induced by the metrical relations between neural responses as spikes flow in. Thus, in order to assess the entropy, the conditional entropy, and the overall information transfer, this method does not require any a priori decoding algorithm to partition the space into equivalence classes. It therefore allows the optimal parameters of a class of distances to be determined with respect to information transmission. To validate the proposed information-theoretical approach, we study precise temporal decoding of human somatosensory signals recorded using microneurography experiments. For this analysis, we employ a similarity measure based on the Victor-Purpura spike train metrics. We show that with appropriate parameters of this distance, the relative spike times of the mechanoreceptors' responses convey enough information to perform optimal discrimination--defined as maximum metrical information and zero conditional entropy--of 81 distinct stimuli within 40 ms of the first afferent spike. The proposed information-theoretical measure proves to be a suitable generalization of Shannon mutual information in order to consider the metrics of temporal codes explicitly. It allows neurotransmission reliability to be assessed in the presence of large spike train spaces (e.g., neural population codes) with high temporal precision. PMID:21222522
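
The Victor-Purpura metric named above has a standard dynamic-programming formulation; the sketch below is a minimal illustration of that metric, not the authors' analysis code (the example spike times and the cost parameter q are invented):

```python
import numpy as np

def victor_purpura(s1, s2, q):
    """Victor-Purpura spike train distance with cost parameter q (1/s).

    Moving a spike by dt costs q*|dt|; inserting or deleting a spike costs 1.
    s1, s2: sorted sequences of spike times in seconds.
    """
    n, m = len(s1), len(s2)
    G = np.zeros((n + 1, m + 1))
    G[:, 0] = np.arange(n + 1)   # delete all spikes of s1
    G[0, :] = np.arange(m + 1)   # insert all spikes of s2
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            G[i, j] = min(G[i - 1, j] + 1,                                   # delete
                          G[i, j - 1] + 1,                                   # insert
                          G[i - 1, j - 1] + q * abs(s1[i - 1] - s2[j - 1]))  # shift
    return G[n, m]

# Example: two trains differing by a 5 ms jitter on the second spike
print(victor_purpura([0.010, 0.040], [0.010, 0.045], q=50.0))  # 0.25
```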

Brasselet, Romain; Johansson, Roland S; Arleo, Angelo

2011-04-01

360

Quantifying the benefits of vehicle pooling with shareability networks  

PubMed Central

Taxi services are a vital part of urban transportation, and a considerable contributor to traffic congestion and air pollution causing substantial adverse effects on human health. Sharing taxi trips is a possible way of reducing the negative impact of taxi services on cities, but this comes at the expense of passenger discomfort quantifiable in terms of a longer travel time. Due to computational challenges, taxi sharing has traditionally been approached on small scales, such as within airport perimeters, or with dynamical ad hoc heuristics. However, a mathematical framework for the systematic understanding of the tradeoff between collective benefits of sharing and individual passenger discomfort is lacking. Here we introduce the notion of shareability network, which allows us to model the collective benefits of sharing as a function of passenger inconvenience, and to efficiently compute optimal sharing strategies on massive datasets. We apply this framework to a dataset of millions of taxi trips taken in New York City, showing that with increasing but still relatively low passenger discomfort, cumulative trip length can be cut by 40% or more. This benefit comes with reductions in service cost, emissions, and with split fares, hinting toward a wide passenger acceptance of such a shared service. Simulation of a realistic online system demonstrates the feasibility of a shareable taxi service in New York City. Shareability as a function of trip density saturates fast, suggesting effectiveness of the taxi sharing system also in cities with much sparser taxi fleets or when willingness to share is low. PMID:25197046
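
Computing the optimal pairwise sharing strategy on a shareability network reduces to maximum-weight matching; a minimal sketch under that reading (the graph, trip names, and savings weights are hypothetical, and the paper's full pipeline is more elaborate):

```python
import networkx as nx

# Hypothetical shareability network: nodes are trips; an edge links two trips
# that can be combined within the allowed passenger delay, weighted by the
# trip length saved if they share a taxi.
G = nx.Graph()
G.add_weighted_edges_from([
    ("trip_A", "trip_B", 3.2),   # km saved if A and B share
    ("trip_A", "trip_C", 1.1),
    ("trip_B", "trip_D", 2.7),
])

# A maximum-weight matching pairs trips so total saved distance is maximal;
# each trip is shared at most once (pairwise sharing).
matching = nx.max_weight_matching(G)
saved = sum(G[u][v]["weight"] for u, v in matching)
print(matching, saved)
```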

Santi, Paolo; Resta, Giovanni; Szell, Michael; Sobolevsky, Stanislav; Strogatz, Steven H.; Ratti, Carlo

2014-01-01

361

Quantifying photometric observing conditions on Paranal using an IR camera  

NASA Astrophysics Data System (ADS)

A Low Humidity and Temperature Profiling (LHATPRO) microwave radiometer, manufactured by Radiometer Physics GmbH (RPG), is used to monitor sky conditions over ESO's Paranal observatory in support of VLT science operations. In addition to measuring precipitable water vapour (PWV) the instrument also contains an IR camera measuring sky brightness temperature at 10.5 ?m. Due to its extended operating range down to -100 °C it is capable of detecting very cold and very thin, even sub-visual, cirrus clouds. We present a set of instrument flux calibration values as compared with a detrended fluctuation analysis (DFA) of the IR camera zenith-looking sky brightness data measured above Paranal taken over the past two years. We show that it is possible to quantify photometric observing conditions and that the method is highly sensitive to the presence of even very thin clouds but robust against variations of sky brightness caused by effects other than clouds such as variations of precipitable water vapour. Hence it can be used to determine photometric conditions for science operations. About 60 % of nights are free of clouds on Paranal. More work will be required to classify the clouds using this technique. For the future this approach might become part of VLT science operations for evaluating nightly sky conditions.
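
The DFA step mentioned here has a compact standard form; a minimal numpy sketch (illustrative only, with synthetic data standing in for the sky-brightness series; the window sizes and first-order detrending are assumptions):

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis of a 1-D series x.

    Returns the fluctuation function F(n) for each window size n in `scales`;
    a straight line of slope alpha in log F vs. log n indicates scaling.
    """
    y = np.cumsum(x - np.mean(x))           # integrated profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        segs = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        f2 = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)     # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(f2)))
    return np.array(F)

# White noise should give alpha close to 0.5
x = np.random.randn(4096)
scales = [16, 32, 64, 128, 256]
F = dfa(x, scales)
print(np.polyfit(np.log(scales), np.log(F), 1)[0])
```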

Kerber, Florian; Querel, Richard R.; Hanuschik, Reinhard

2014-08-01

362

FIG. 2: Model results showing vertical and horizontal displacements due to the Hekla 2000 lava (disk, final relaxed response) (a,d) and the Mogi model (b,e). Tickmarks in c are Lambert coordinates and describe the extent of the modeled area in meters.

E-print Network

... by Pinel et al. [2007]. We assume 5 km as the elastic thickness of the lithosphere and a Young's modulus of 40 GPa ... decrease in the magma chamber only; both processes show a very similar deformation pattern ... provide an additional source of deformation. The initial elastic response due to a load on the surface ...

Grapenthin, Ronni

363

Comparison of Weather Shows in Eastern Europe  

NASA Astrophysics Data System (ADS)

Comparison of Weather Shows in Eastern Europe. Television weather shows in Eastern Europe are in most cases of a high graphical standard. There are, however, vast differences in the duration and information content of the shows. A few signs and regularities reveal the character of a weather show. The main differences are caused by the income structure of the TV station: either it is a fully privately funded station relying on commercial income, or it is a public service station funded mainly by the national budget or a fixed fee/tax. These differences affect the duration and even the graphical presentation of the weather. The next important aspect is the supplier and/or processor of the weather information. In short, when the show is produced by the national met office, it contains more scientific terms, synoptic maps, satellite imagery, etc. If the supplier is a private meteorological company, the show is more user-friendly and lay-oriented, with fewer scientific terms. We are experiencing a massive shift in public weather knowledge and demand for information. In the past, weather shows consisted only of maps with weather icons. In today's world, even lay weather shows consist partly of numerical weather model outputs, designed of course to be understandable and graphically attractive. Outputs of numerical weather models used to be part of the daily life only of professional meteorologists; today they are a common part of the lives of ordinary people. Video samples are a part of this presentation.

Najman, M.

2009-09-01

364

QUANTIFYING THE EVOLVING MAGNETIC STRUCTURE OF ACTIVE REGIONS  

SciTech Connect

The topical and controversial issue of parameterizing the magnetic structure of solar active regions has vital implications in the understanding of how these structures form, evolve, produce solar flares, and decay. This interdisciplinary and ill-constrained problem of quantifying complexity is addressed by using a two-dimensional wavelet transform modulus maxima (WTMM) method to study the multifractal properties of active region photospheric magnetic fields. The WTMM method provides an adaptive space-scale partition of a fractal distribution, from which one can extract the multifractal spectra. The use of a novel segmentation procedure allows us to remove the quiet Sun component and reliably study the evolution of active region multifractal parameters. It is shown that prior to the onset of solar flares, the magnetic field undergoes restructuring as Dirac-like features (with a Hölder exponent, h = -1) coalesce to form step functions (where h = 0). The resulting configuration has a higher concentration of gradients along neutral line features. We propose that when sufficient flux is present in an active region for a period of time, it must be structured with a fractal dimension greater than 1.2, and a Hölder exponent greater than -0.7, in order to produce M- and X-class flares. This result has immediate applications in the study of the underlying physics of active region evolution and space weather forecasting.
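
The paper's multifractal analysis uses the WTMM method; as a much simpler illustration of the fractal-dimension threshold quoted above (dimension greater than 1.2), here is a box-counting estimate on a thresholded binary magnetogram mask (the toy mask and box sizes are assumptions, and box counting is a deliberately cruder technique than WTMM):

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting (fractal) dimension of a binary 2-D mask."""
    counts = []
    for s in sizes:
        # count boxes of side s containing at least one masked pixel
        n0, n1 = mask.shape[0] // s, mask.shape[1] // s
        boxes = mask[:n0 * s, :n1 * s].reshape(n0, s, n1, s).any(axis=(1, 3))
        counts.append(boxes.sum())
    # dimension = -slope of log N(s) versus log s
    return -np.polyfit(np.log(sizes), np.log(counts), 1)[0]

# Toy "magnetogram": strong-gradient pixels along a diagonal neutral line
mask = np.eye(256, dtype=bool)
print(box_counting_dimension(mask))   # close to 1 for a line
```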

Conlon, Paul A.; McAteer, R.T. James; Gallagher, Peter T.; Fennell, Linda, E-mail: mcateer@nmsu.ed [School of Physics, Trinity College Dublin, Dublin 2 (Ireland)

2010-10-10

365

Quantifying the Evolving Magnetic Structure of Active Regions  

NASA Astrophysics Data System (ADS)

The topical and controversial issue of parameterizing the magnetic structure of solar active regions has vital implications in the understanding of how these structures form, evolve, produce solar flares, and decay. This interdisciplinary and ill-constrained problem of quantifying complexity is addressed by using a two-dimensional wavelet transform modulus maxima (WTMM) method to study the multifractal properties of active region photospheric magnetic fields. The WTMM method provides an adaptive space-scale partition of a fractal distribution, from which one can extract the multifractal spectra. The use of a novel segmentation procedure allows us to remove the quiet Sun component and reliably study the evolution of active region multifractal parameters. It is shown that prior to the onset of solar flares, the magnetic field undergoes restructuring as Dirac-like features (with a Hölder exponent, h = -1) coalesce to form step functions (where h = 0). The resulting configuration has a higher concentration of gradients along neutral line features. We propose that when sufficient flux is present in an active region for a period of time, it must be structured with a fractal dimension greater than 1.2, and a Hölder exponent greater than -0.7, in order to produce M- and X-class flares. This result has immediate applications in the study of the underlying physics of active region evolution and space weather forecasting.

Conlon, Paul A.; McAteer, R. T. James; Gallagher, Peter T.; Fennell, Linda

2010-10-01

366

Quantifying urban heat island intensity in Hong Kong SAR, China.  

PubMed

This paper addresses the methodological concerns in quantifying urban heat island (UHI) intensity in Hong Kong SAR, China. Although the urban heat island in Hong Kong has been widely investigated, there is no consensus on the most appropriate fixed point meteorological sites to be used to calculate heat island intensity. This study utilized the Local Climate Zones landscape classification system to classify 17 weather stations from the Hong Kong Observatory's extensive fixed point meteorological observation network. According to the classification results, the meteorological site located at the Hong Kong Observatory Headquarters is the representative urban weather station in Hong Kong, whereas sites located at Tsak Yue Wu and Ta Kwu Ling are appropriate rural or nonurbanized counterparts. These choices were validated and supported quantitatively through comparison of long-term annual and diurnal UHI intensities with rural stations used in previous studies. Results indicate that the rural stations used in previous studies are not representative, and thus, the past UHI intensities calculated for Hong Kong may have been underestimated. PMID:23007798
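
The UHI intensity itself is simply the urban-minus-rural temperature difference between the selected stations; a minimal sketch (the temperature values are invented, while the station pairing follows the abstract):

```python
import pandas as pd

# Hypothetical hourly air temperatures (deg C) at the urban and rural stations
idx = pd.date_range("2012-01-01", periods=4, freq="h")
urban = pd.Series([18.2, 17.9, 17.5, 17.3], index=idx)   # e.g. HKO Headquarters
rural = pd.Series([15.1, 14.6, 14.2, 14.0], index=idx)   # e.g. Ta Kwu Ling

uhi = urban - rural        # instantaneous UHI intensity
print(uhi.mean())          # mean intensity over the period
```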

Siu, Leong Wai; Hart, Melissa A

2013-05-01

367

Quantifying dose to the reconstructed breast: Can we adequately treat?  

SciTech Connect

To evaluate how immediate reconstruction (IR) impacts postmastectomy radiotherapy (PMRT) dose distributions to the reconstructed breast (RB), internal mammary nodes (IMN), heart, and lungs using quantifiable dosimetric end points. 3D conformal plans were developed for 20 IR patients, 10 autologous reconstruction (AR), and 10 expander-implant (EI) reconstruction. For each reconstruction type, 5 right- and 5 left-sided reconstructions were selected. Two plans were created for each patient, 1 with RB coverage alone and 1 with RB + IMN coverage. Left-sided EI plans without IMN coverage had higher heart Dmean than left-sided AR plans (2.97 and 0.84 Gy, p = 0.03). Otherwise, results did not vary by reconstruction type and all remaining metrics were evaluated using a combined AR and EI dataset. RB coverage was adequate regardless of laterality or IMN coverage (Dmean 50.61 Gy, D95 45.76 Gy). When included, IMN Dmean and D95 were 49.57 and 40.96 Gy, respectively. Mean heart doses increased with left-sided treatment plans and IMN inclusion. Right-sided treatment plans and IMN inclusion increased mean lung V20. Using standard field arrangements and 3D planning, we observed excellent coverage of the RB and IMN, regardless of laterality or reconstruction type. Our results demonstrate that adequate doses can be delivered to the RB with or without IMN coverage.
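
The dose-volume metrics quoted above have simple definitions over the voxel doses of a contoured structure; a minimal sketch, assuming a flattened array of voxel doses (the synthetic dose distribution is illustrative, and V20 would in practice be evaluated on the lung contour):

```python
import numpy as np

# Hypothetical voxel doses (Gy) inside a contoured structure
dose = np.random.normal(50.0, 2.0, size=100_000)

d_mean = dose.mean()                # Dmean: mean structure dose
d95 = np.percentile(dose, 5)        # D95: dose covering 95% of the volume
v20 = (dose >= 20.0).mean() * 100   # V20: % of volume receiving >= 20 Gy
print(d_mean, d95, v20)
```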

Chung, Eugene; Marsh, Robin B.; Griffith, Kent A.; Moran, Jean M. [Department of Radiation Oncology, University of Michigan, Ann Arbor, MI (United States); Pierce, Lori J., E-mail: ljpierce@umich.edu [Department of Radiation Oncology, University of Michigan, Ann Arbor, MI (United States)

2013-04-01

368

Quantifying the direct use value of Condor seamount  

NASA Astrophysics Data System (ADS)

Seamounts often satisfy numerous uses and interests. Multiple uses can generate multiple benefits but also conflicts and impacts, calling, therefore, for integrated and sustainable management. To assist in developing comprehensive management strategies, policymakers recognise the need to include measures of socioeconomic analysis alongside ecological data so that practical compromises can be made. This study assessed the direct output impact (DOI) of the relevant marine activities operating at Condor seamount (Azores, central northeast Atlantic) as proxies of the direct use values provided by the resource system. Results demonstrated that Condor seamount supported a wide range of uses yielding distinct economic outputs. Demersal fisheries, scientific research and shark diving were the top-three activities generating the highest revenues, while tuna fisheries, whale watching and scuba-diving had marginal economic significance. Results also indicated that the economic importance of non-extractive uses of Condor is considerable, highlighting the importance of these uses as alternative income-generating opportunities for local communities. It is hoped that quantifying the direct use values provided by Condor seamount will contribute to the decision making process towards its long-term conservation and sustainable use.

Ressurreição, Adriana; Giacomello, Eva

2013-12-01

369

Quantifying Representative Hydraulic Conductivity for Three-Dimensional Fractured Formations  

NASA Astrophysics Data System (ADS)

The fractures and pores in rock formations are the fundamental units for flow and contaminant transport simulations. Due to technical and logistical limitations, it is difficult in practice to account for such small units when modeling flow and transport in large-scale problems. The concept of continuum representations of fractured rocks is then used as an alternative to solve for flow and transport in complex fractured formations. For these types of approaches, the determination of representative parameters such as hydraulic conductivity and the dispersion coefficient plays an important role in controlling the accuracy of simulation results for large-scale problems. The objective of this study is to develop a discrete fracture network (DFN) model and the associated unstructured mesh generation system to characterize the continuum hydraulic conductivity for fractured rocks on different scales. In this study a coupled three-dimensional model of water flow, thermal transport, solute transport, and geochemical kinetic/equilibrium reactions in saturated/unsaturated porous media (HYDROGEOCHEM) is employed as the flow simulator to analyze flow behaviors in fractured formations. The fracture network model and the corresponding continuum model are simulated for problems of the same scale. Based on the concept of mass conservation in flow, the correlations between statistics of fracture structure and the representative continuum parameters are quantified for a variety of fracture distribution scenarios and scales. The results of this study are expected to provide general insight into the procedures and the associated techniques for analyzing flow in complex large-scale fractured rock systems.
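
The mass-conservation step described above amounts to backing a Darcy-law conductivity out of the DFN flow solution; a minimal sketch (the block geometry and simulated discharge are hypothetical):

```python
def equivalent_conductivity(Q, area, dh, length):
    """Back out a continuum (Darcy) conductivity from a DFN flow solution.

    Q      total steady flow through the block face (m^3/s), e.g. summed
           from the discrete fracture network simulation
    area   face area (m^2); dh: head drop (m); length: block length (m)
    """
    return Q * length / (area * dh)       # K = Q L / (A dh)

# Hypothetical block: 10 m cube, 1 m head drop, 2e-6 m^3/s simulated outflow
print(equivalent_conductivity(Q=2e-6, area=100.0, dh=1.0, length=10.0))  # 2e-7 m/s
```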

Lee, I.; Ni, C.

2013-12-01

370

Method for quantifying optical properties of the human lens  

DOEpatents

Method for quantifying optical properties of the human lens. The present invention includes the application of fiberoptic, OMA-based instrumentation as an in vivo diagnostic tool for the human ocular lens. Rapid, noninvasive and comprehensive assessment of the optical characteristics of a lens using very modest levels of exciting light are described. Typically, the backscatter and fluorescence spectra (from about 300- to 900-nm) elicited by each of several exciting wavelengths (from about 300- to 600-nm) are collected within a few seconds. The resulting optical signature of individual lenses is then used to assess the overall optical quality of the lens by comparing the results with a database of similar measurements obtained from a reference set of normal human lenses having various ages. Several metrics have been identified which gauge the optical quality of a given lens relative to the norm for the subject's chronological age. These metrics may also serve to document accelerated optical aging and/or as early indicators of cataract or other disease processes.

Loree, deceased, Thomas R. (late of Albuquerque, NM); Bigio, Irving J. (Los Alamos, NM); Zuclich, Joseph A. (San Antonio, TX); Shimada, Tsutomu (Los Alamos, NM); Strobl, Karlheinz (Fiskdale, MA)

1999-01-01

371

Quantified MS analysis applied to combinatorial heterogeneous catalyst libraries.  

PubMed

A high-throughput screening system for secondary catalyst libraries has been developed by incorporation of an 80-pass reactor and a quantified multistream mass spectrometer screening (MSMSS) technique. With a low-melting alloy as the heating medium, a uniform reaction temperature could be obtained in the multistream reactor (maximum temperature differences are less than 1 K at 673 K). Quantification of the results was realized by combining a gas chromatograph with the MSMSS, which could provide the product selectivities of each catalyst in a heterogeneous catalyst library. Because the catalyst loading of each reaction tube is comparable to that of a conventional microreaction system and because the parallel reactions could be operated under identical conditions (homogeneous temperature, same pressure and WHSV), the reaction results of a promising catalyst selected from the library could be reasonably applied to further scale-up of the system. The aldol condensation of acetone, with obvious differences in product distribution over different kinds of catalysts, was selected as a model reaction to validate the screening system. PMID:14606808

Wang, Hua; Liu, Zhongmin; Shen, Jianghan

2003-01-01

372

Quantifying the Restorable Water Volume of California's Sierra Nevada Meadows  

NASA Astrophysics Data System (ADS)

The Sierra Nevada is estimated to provide over 66% of California's water supply, which is largely derived from snowmelt. Global climate warming is expected to result in a decrease in snowpack and an increase in melting rate, making the attenuation of snowmelt by any means an important ecosystem service for ensuring water availability. Montane meadows are dispersed throughout the mountain range and can act like natural reservoirs, while also providing wildlife habitat, water filtration, and water storage. Despite the important role of meadows in the Sierra Nevada, a large proportion is degraded by stream incision, which increases volume outflows and reduces overbank flooding, thus reducing infiltration and potential water storage. Restoration of meadow stream channels would therefore improve hydrological functioning, including increased water storage. The potential water holding capacity of restored meadows has yet to be quantified, thus this research seeks to address this knowledge gap by estimating the restorable water volume due to stream incision. More than 17,000 meadows were analyzed by categorizing their erosion potential using channel slope and soil texture, ultimately resulting in six general erodibility types. Field measurements of over 100 meadows, stratified by latitude, elevation, and geologic substrate, were then taken and analyzed for each erodibility type to determine the average depth of incision. Restorable water volume was then quantified as a function of the water holding capacity of the soil, meadow area, and incised depth. Total restorable water volume was found to be 120 x 10^6 m3, or approximately 97,000 acre-feet. Using 95% confidence intervals for incised depth, the upper and lower bounds of the total restorable water volume were found to be 107 - 140 x 10^6 m3. Though this estimate of restorable water volume is small relative to the storage capacity of typical California reservoirs, restoration of Sierra Nevada meadows remains an important objective. Storage of water in meadows benefits California wildlife, potentially attenuates floods, and elevates base flows, which can ease effects on the spring recession curve from the expected decline in Sierran snowpack with atmospheric warming.
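
The restorable-volume definition in this abstract is a product of meadow area, incised depth, and soil water-holding capacity; a worked sketch with invented values for a single meadow (the conversion 1 acre-foot = 1233.48 m^3 also reproduces the abstract's 120 x 10^6 m3 to ~97,000 acre-feet figure):

```python
# Illustrative values for one meadow, not from the study
area_m2 = 250_000          # 25 ha meadow
incised_depth_m = 1.2      # average channel incision
whc = 0.35                 # volumetric water-holding capacity of meadow soil

volume_m3 = area_m2 * incised_depth_m * whc   # restorable water volume
acre_feet = volume_m3 / 1233.48               # 1 acre-foot = 1233.48 m^3
print(volume_m3, acre_feet)                   # 105,000 m^3, about 85 acre-feet
```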

Emmons, J. D.; Yarnell, S. M.; Fryjoff-Hung, A.; Viers, J.

2013-12-01

373

Incorporating both physical and kinetic limitations in quantifying dissolved oxygen flux to aquatic sediments  

USGS Publications Warehouse

Traditionally, dissolved oxygen (DO) fluxes have been calculated using the thin-film theory with DO microstructure data in systems characterized by fine sediments and low velocities. However, recent experimental evidence of fluctuating DO concentrations near the sediment-water interface suggests that turbulence and coherent motions control the mass transfer, and the surface renewal theory gives a more mechanistic model for quantifying fluxes. Both models involve quantifying the mass transfer coefficient (k) and the relevant concentration difference (ΔC). This study compared several empirical models for quantifying k based on both thin-film and surface renewal theories, as well as presents a new method for quantifying ΔC (dynamic approach) that is consistent with the observed DO concentration fluctuations near the interface. Data were used from a series of flume experiments that includes both physical and kinetic uptake limitations of the flux. Results indicated that methods for quantifying k and ΔC using the surface renewal theory better estimated the DO flux across a range of fluid-flow conditions. © 2009 ASCE.
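
Both theories estimate the flux as J = k·ΔC and differ only in the model for k; a minimal sketch of the two textbook forms (the film thickness and renewal rate values are illustrative, not from the flume experiments):

```python
import numpy as np

def k_thin_film(D, delta):
    """Thin-film theory: k = D / delta, a diffusive film of thickness delta."""
    return D / delta

def k_surface_renewal(D, s):
    """Danckwerts surface renewal: k = sqrt(D * s), s = renewal rate (1/s)."""
    return np.sqrt(D * s)

D = 2.1e-9      # DO diffusivity in water, m^2/s
dC = 3.0        # concentration difference, g/m^3 (i.e. mg/L)

print(k_thin_film(D, delta=1e-4) * dC)        # flux J = k * dC, g m^-2 s^-1
print(k_surface_renewal(D, s=0.05) * dC)
```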

O'connor, B. L.; Hondzo, M.; Harvey, J. W.

2009-01-01

374

Alcoholism Damages Brain's White Matter, Scans Show  

MedlinePLUS

... Brain's White Matter, Scans Show Areas tied to decision-making, such as how much to drink, seem most ... part of the brain mediates inhibitory control and decision-making, so tragically, it appears that some of the ...

375

GOES Satellite Data Shows Tornado Development  

NASA Video Gallery

This animation of NOAA's GOES-East satellite data shows the development and movement of the weather system that spawned tornadoes affecting the southern and eastern U.S. states on April 27-29, 2014...

376

TRMM Satellite Shows Heavy Rainfall in Cristina  

NASA Video Gallery

NASA's TRMM satellite rainfall data was overlaid on an enhanced visible/infrared image from NOAA's GOES-East satellite showing cloud and rainfall extent. Green areas indicate rainfall at over 20 mm...

377

Removal of Quantifiers by Elimination of Boundary Points  

E-print Network

We consider the problem of elimination of existential quantifiers from a Boolean CNF formula. Our approach is based on the following observation. One can get rid of dependency on a set of variables of a quantified CNF formula F by adding resolvent clauses of F eliminating boundary points. This approach is similar to the method of quantifier elimination described in [9]. The difference of the method described in the present paper is twofold: (i) branching is performed only on quantified variables, and (ii) an explicit search for boundary points is performed by calls to a SAT-solver. Although we published the paper [9] before this one, chronologically the method of the present report was developed first. Preliminary presentations of this method were made in [10], [11]. We postponed a publication of this method due to preparation of a patent application [8].
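
The core operation in this approach, adding resolvents of F, is ordinary clause resolution; a minimal sketch of that operation (the encoding of clauses as sets of signed integer literals is an assumption, and the boundary-point search itself is not shown):

```python
def resolvent(c1, c2, var):
    """Resolve two CNF clauses on variable `var`.

    Clauses are frozensets of integer literals (positive = var, negative = NOT var).
    Returns the resolvent clause, or None if it is tautological.
    """
    assert var in c1 and -var in c2
    res = (c1 - {var}) | (c2 - {-var})
    if any(-lit in res for lit in res):   # tautology: contains x and NOT x
        return None
    return res

# Example: (x1 OR x2) and (NOT x1 OR x3) resolve on x1 to (x2 OR x3)
print(resolvent(frozenset({1, 2}), frozenset({-1, 3}), 1))
```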

Goldberg, Eugene

2012-01-01

378

Research Article Quantifying septic nitrogen loadings to receiving waters  

E-print Network

... limited (Ryther and Dunstan 1971, Howarth 1988). Coastal waters are increasingly at risk of eutrophication ... Waquoit Bay ... symptoms of eutrophication, largely attributed to septic nitrogen inputs. This study assessed septic nitrogen ...

Moritz, Max A.

379

Quantifying economic and environmental tradeoffs of walnut arthropod pest management  

E-print Network

Keywords: tradeoff; walnut; pesticide. Many arthropod pesticides used by California walnut growers have ...

Zhang, Minghua

380

Quantifying emissions reductions from New England offshore wind energy resources  

E-print Network

Access to straightforward yet robust tools to quantify the impact of renewable energy resources on air emissions from fossil fuel power plants is important to governments aiming to improve air quality and reduce greenhouse ...

Berlinski, Michael Peter

2006-01-01

381

Quantifying Particle Coatings Using High-Precision Mass Measurements  

E-print Network

We present a general method to quantify coatings on microparticle surfaces based on the additional mass. Particle buoyant mass is determined in a solution with a density that is nearly equivalent to that of the core particle, ...

Knudsen, Scott Michael

382

Evolutionary modification of development in mammalian teeth: Quantifying gene  

E-print Network

... Geographic Information Systems. We investigated how genetic markers for epithelial signaling centers ... usually involve little initial modification of morphology. One system that offers promise for linking ...

Jernvall, Jukka

383

Quantifying Regional Measurement Requirements for ASCENDS  

NASA Astrophysics Data System (ADS)

Quantification of greenhouse gas fluxes at regional and local scales is required by the Kyoto protocol and potential follow-up agreements, and their accompanying implementation mechanisms (e.g., cap-and-trade schemes and treaty verification protocols). Dedicated satellite observations, such as those provided by the Greenhouse gases Observing Satellite (GOSAT), the upcoming Orbiting Carbon Observatory (OCO-2), and future active missions, particularly Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) and Advanced Space Carbon and Climate Observation of Planet Earth (A-SCOPE), are poised to play a central role in this endeavor. In order to prepare for the ASCENDS mission, we are applying the Stochastic Time-Inverted Lagrangian Transport (STILT) model driven by meteorological fields from a customized version of the Weather Research and Forecasting (WRF) model to generate surface influence functions for ASCENDS observations. These "footprints" (or adjoint) express the sensitivity of observations to surface fluxes in the upwind source regions and thus enable the computation of a posteriori flux error reductions resulting from the inclusion of satellite observations (taking into account the vertical sensitivity and error characteristics of the latter). The overarching objective of this project is the specification of the measurement requirements for the ASCENDS mission, with a focus on policy-relevant regional scales. Several features make WRF-STILT an attractive tool for regional analysis of satellite observations: 1) WRF meteorology is available at higher resolution than for global models and is thus more realistic, 2) The Lagrangian approach minimizes numerical diffusion present in Eulerian models, 3) The WRF-STILT coupling has been specifically designed to achieve good mass conservation characteristics, and 4) The receptor-oriented approach offers a relatively straightforward way to compute the adjoint of the transport model. These aspects allow the model to compute surface influences for satellite observations at high spatiotemporal resolution and to generate realistic flux error and flux estimates at policy-relevant scales. The main drawbacks of the Lagrangian approach to satellite simulations are inefficiency and storage requirements, but these obstacles can be overcome by taking advantage of modern computing resources (the current runs are being performed on the NASA Pleiades supercomputer). We gratefully acknowledge funding by the NASA Atmospheric CO2 Observations from Space Program (grant NNX10AT87G).
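
The a posteriori flux error reduction mentioned here is conventionally computed with the linear-Gaussian inversion formulas; a general textbook expression follows (the project's actual covariance choices are not specified in this abstract):

```latex
\hat{S}_{\mathrm{post}} = \left( H^{\top} S_{\varepsilon}^{-1} H + S_{\mathrm{prior}}^{-1} \right)^{-1},
\qquad
r_i = 1 - \sqrt{ \frac{ \big(\hat{S}_{\mathrm{post}}\big)_{ii} }{ \big(S_{\mathrm{prior}}\big)_{ii} } }
```

Here H is the footprint (adjoint) sensitivity matrix of the kind WRF-STILT produces, S_ε is the observation-error covariance, S_prior is the prior flux-error covariance, and r_i is the fractional error reduction for flux element i.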

Mountain, M. E.; Eluszkiewicz, J.; Nehrkorn, T.; Hegarty, J. D.; Aschbrenner, R.; Henderson, J.; Zaccheo, S.

2011-12-01

384

Global climate change: the quantifiable sustainability challenge.  

PubMed

Population growth and the pressures spawned by increasing demands for energy and resource-intensive goods, foods, and services are driving unsustainable growth in greenhouse gas (GHG) emissions. Recent GHG emission trends are consistent with worst-case scenarios of the previous decade. Dramatic and near-term emission reductions likely will be needed to ameliorate the potential deleterious impacts of climate change. To achieve such reductions, fundamental changes are required in the way that energy is generated and used. New technologies must be developed and deployed at a rapid rate. Advances in carbon capture and storage, renewable, nuclear and transportation technologies are particularly important; however, global research and development efforts related to these technologies currently appear to fall short relative to needs. Even with a proactive and international mitigation effort, humanity will need to adapt to climate change, but the adaptation needs and damages will be far greater if mitigation activities are not pursued in earnest. In this review, research is highlighted that indicates increasing global and regional temperatures and ties climate changes to increasing GHG emissions. GHG mitigation targets necessary for limiting future global temperature increases are discussed, including how factors such as population growth and the growing energy intensity of the developing world will make these reduction targets more challenging. Potential technological pathways for meeting emission reduction targets are examined, barriers are discussed, and global and U.S. modeling results are presented that suggest that the necessary pathways will require radically transformed electric and mobile sectors. While geoengineering options have been proposed to allow more time for serious emission reductions, these measures are at the conceptual stage with many unanswered cost, environmental, and political issues. Implications: This paper lays out the case that mitigating the potential for catastrophic climate change will be a monumental challenge, requiring the global community to transform its energy system in an aggressive, coordinated, and timely manner. If this challenge is to be met, new technologies will have to be developed and deployed at a rapid rate. Advances in carbon capture and storage, renewable, nuclear, and transportation technologies are particularly important. Even with an aggressive international mitigation effort, humanity will still need to adapt to significant climate change. PMID:25282995

Princiotta, Frank T; Loughlin, Daniel H

2014-09-01

385

Quantifying Mixing using Magnetic Resonance Imaging  

PubMed Central

Mixing is a unit operation that combines two or more components into a homogeneous mixture. This work involves mixing two viscous liquid streams using an in-line static mixer. The mixer is a split-and-recombine design that employs shear and extensional flow to increase the interfacial contact between the components. A prototype split-and-recombine (SAR) mixer was constructed by aligning a series of thin laser-cut Poly (methyl methacrylate) (PMMA) plates held in place in a PVC pipe. Mixing in this device is illustrated in the photograph in Fig. 1. Red dye was added to a portion of the test fluid and used as the minor component being mixed into the major (undyed) component. At the inlet of the mixer, the injected layer of tracer fluid is split into two layers as it flows through the mixing section. On each subsequent mixing section, the number of horizontal layers is duplicated. Ultimately, the single stream of dye is uniformly dispersed throughout the cross section of the device. Using a non-Newtonian test fluid of 0.2% Carbopol and a doped tracer fluid of similar composition, mixing in the unit is visualized using magnetic resonance imaging (MRI). MRI is a very powerful experimental probe of molecular chemical and physical environment as well as sample structure on the length scales from microns to centimeters. This sensitivity has resulted in broad application of these techniques to characterize physical, chemical and/or biological properties of materials ranging from humans to foods to porous media 1, 2. The equipment and conditions used here are suitable for imaging liquids containing substantial amounts of NMR mobile 1H such as ordinary water and organic liquids including oils. Traditionally MRI has utilized super conducting magnets which are not suitable for industrial environments and not portable within a laboratory (Fig. 2). Recent advances in magnet technology have permitted the construction of large volume industrially compatible magnets suitable for imaging process flows. Here, MRI provides spatially resolved component concentrations at different axial locations during the mixing process. This work documents real-time mixing of highly viscous fluids via distributive mixing with an application to personal care products. PMID:22314707

Tozzi, Emilio J.; McCarthy, Kathryn L.; Bacca, Lori A.; Hartt, William H.; McCarthy, Michael J.

2012-01-01

386

Quantifying mixing using magnetic resonance imaging.  

PubMed

Mixing is a unit operation that combines two or more components into a homogeneous mixture. This work involves mixing two viscous liquid streams using an in-line static mixer. The mixer is a split-and-recombine design that employs shear and extensional flow to increase the interfacial contact between the components. A prototype split-and-recombine (SAR) mixer was constructed by aligning a series of thin laser-cut Poly (methyl methacrylate) (PMMA) plates held in place in a PVC pipe. Mixing in this device is illustrated in the photograph in Fig. 1. Red dye was added to a portion of the test fluid and used as the minor component being mixed into the major (undyed) component. At the inlet of the mixer, the injected layer of tracer fluid is split into two layers as it flows through the mixing section. On each subsequent mixing section, the number of horizontal layers is duplicated. Ultimately, the single stream of dye is uniformly dispersed throughout the cross section of the device. Using a non-Newtonian test fluid of 0.2% Carbopol and a doped tracer fluid of similar composition, mixing in the unit is visualized using magnetic resonance imaging (MRI). MRI is a very powerful experimental probe of molecular chemical and physical environment as well as sample structure on the length scales from microns to centimeters. This sensitivity has resulted in broad application of these techniques to characterize physical, chemical and/or biological properties of materials ranging from humans to foods to porous media (1, 2). The equipment and conditions used here are suitable for imaging liquids containing substantial amounts of NMR mobile (1)H such as ordinary water and organic liquids including oils. Traditionally MRI has utilized super conducting magnets which are not suitable for industrial environments and not portable within a laboratory (Fig. 2). Recent advances in magnet technology have permitted the construction of large volume industrially compatible magnets suitable for imaging process flows. Here, MRI provides spatially resolved component concentrations at different axial locations during the mixing process. This work documents real-time mixing of highly viscous fluids via distributive mixing with an application to personal care products. PMID:22314707

Tozzi, Emilio J; McCarthy, Kathryn L; Bacca, Lori A; Hartt, William H; McCarthy, Michael J

2012-01-01

387

Methodology to quantify leaks in aerosol sampling system components  

E-print Network

A thesis by Vishnu Karthik Vijayaraghavan, submitted to the Office of Graduate Studies of Texas A&M University in partial fulfillment of the requirements for the degree of Master of Science, August 2003. Major subject: Mechanical Engineering.

Vijayaraghavan, Vishnu Karthik

2004-11-15

388

Quantifying interactions between propranolol and dissolved organic matter (DOM) from different sources using fluorescence spectroscopy.  

PubMed

Beta blockers are widely used pharmaceuticals that have been detected in the environment. Interactions between beta blockers and dissolved organic matter (DOM) may mutually alter their environmental behaviors. To assess this potential, propranolol (PRO) was used as a model beta blocker to quantify the complexation with DOM from different sources using the fluorescence quenching titration method. The sources of studied DOM samples were identified by excitation-emission matrix spectroscopy (EEMs) combined with fluorescence regional integration analysis. The results show that PRO intrinsic fluorescence was statically quenched by DOM addition. The resulting binding constants (log Koc) ranged from 3.90 to 5.20, with the surface-water-filtered DOM samples having the lowest log Koc values and HA the highest. Log Koc is negatively correlated with the fluorescence index, biological index, and the percent fluorescence response (P i,n) of the protein-like region (P I,n) and of the microbial byproduct-like region (P II,n) of the DOM EEMs, while it correlates positively with the humification index and the P i,n of the UVC humic-like region (P III,n). These results indicate that DOM samples from allochthonous materials rich in aromatic and humic-like components would strongly bind PRO in aquatic systems, whereas autochthonous DOM containing high protein-like components would bind PRO more weakly. PMID:24390196
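
Binding constants from fluorescence quenching titrations are commonly extracted with a Stern-Volmer-type model; a minimal sketch under that assumption (the titration points are invented, and the paper's exact normalization to organic carbon is not reproduced):

```python
import numpy as np
from scipy.optimize import curve_fit

def stern_volmer(dom, k_sv):
    """Static quenching model: F0/F = 1 + Ksv * [DOM]."""
    return 1.0 + k_sv * dom

# Hypothetical titration: DOM concentration vs. measured F0/F ratio
dom = np.array([0.0, 1e-4, 2e-4, 4e-4, 8e-4])
f0_f = np.array([1.00, 1.05, 1.11, 1.22, 1.43])

(k_sv,), _ = curve_fit(stern_volmer, dom, f0_f)
print(np.log10(k_sv))   # log of the binding constant, cf. the log Koc values above
```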

Peng, Na; Wang, Kaifeng; Liu, Guoguang; Li, Fuhua; Yao, Kun; Lv, Wenying

2014-04-01

389

Educational Outreach: The Space Science Road Show  

NASA Astrophysics Data System (ADS)

The poster presented will give an overview of a study towards a "Space Road Show". The topic of this show is space science. The target group is adolescents, aged 12 to 15, at Dutch high schools. The show and its accompanying experiments would be supported with suitable educational material. Science teachers at schools can decide for themselves if they want to use this material in advance, afterwards or not at all. The aims of this outreach effort are: to motivate students for space science and engineering, to help them understand the importance of (space) research, to give them a positive feeling about the possibilities offered by space and in the process give them useful knowledge on space basics. The show revolves around three main themes: applications, science and society. First the students will get some historical background on the importance of space/astronomy to civilization. Secondly they will learn more about novel uses of space. On the one hand they will learn of "Views on Earth" involving technologies like Remote Sensing (or Spying), Communication, Broadcasting, GPS and Telemedicine. On the other hand they will experience "Views on Space" illustrated by past, present and future space research missions, like the space exploration missions (Cassini/Huygens, Mars Express and Rosetta) and the astronomy missions (Soho and XMM). Meanwhile, the students will learn more about the technology of launchers and satellites needed to accomplish these space missions. Throughout the show and especially towards the end attention will be paid to the third theme "Why go to space"? Other reasons for people to get into space will be explored. An important question in this is the commercial (manned) exploration of space. Thus, the questions of benefit of space to society are integrated in the entire show. It raises some fundamental questions about the effects of space travel on our environment, poverty and other moral issues. The show attempts to connect scientific with community thought. The difficulty with a show this elaborate and intricate is communicating on a level understandable for teenagers, whilst not treating them like children. Professional space scientists know how easy it is to lose oneself in technical specifics. This would, of course, only confuse young people. The author would like to discuss the ideas for this show with a knowledgeable audience and hopefully get some (constructive) feedback.

Cox, N. L. J.

2002-01-01

390

The Patient Acuity Rating: Quantifying clinical judgment regarding inpatient stability  

PubMed Central

Background New resident work-hour restrictions are expected to result in further increases in the number of handoffs between inpatient care providers, a known risk factor for poor outcomes. Strategies for improving the accuracy and efficiency of provider sign-outs are needed. Objective To develop and test a judgment-based scale for conveying the risk of clinical deterioration. Design Prospective observational study. Setting University teaching hospital. Subjects Internal medicine clinicians and patients. Measurement The Patient Acuity Rating (PAR), a 7-point Likert score representing the likelihood of a patient experiencing a cardiac arrest or ICU transfer within the next 24 hours, was obtained from physicians and midlevel practitioners at the time of sign-out. Cross-covering physicians were blinded to the results, which were subsequently correlated with outcomes. Results Forty eligible clinicians consented to participate, providing 6034 individual scores on 3419 patient-days. Seventy-four patient-days resulted in cardiac arrest or ICU transfer within 24 hours. The average PAR was 3±1 and yielded an area under the receiver operating characteristic curve (AUROC) of 0.82. Provider-specific AUROC values ranged from 0.69 for residents to 0.85 for attendings (p=0.01). Interns and midlevels did not differ significantly from the other groups. A PAR of 4 or higher corresponded to a sensitivity of 82% and a specificity of 68% for predicting cardiac arrest or ICU transfer in the next 24 hours. Conclusions Clinical judgment regarding patient stability can be reliably quantified in a simple score with the potential for efficiently conveying complex assessments of at-risk patients during handoffs between healthcare members. PMID:21853529
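
The discrimination statistics reported above are straightforward to compute from paired scores and outcomes; a minimal sketch with invented data (the PAR >= 4 cutoff mirrors the abstract; scikit-learn's roc_auc_score does the AUROC computation):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical sign-out data: PAR scores (1-7) and whether the patient
# arrested or was transferred to the ICU within 24 h
par = np.array([2, 3, 5, 4, 6, 1, 3, 7, 2, 5])
event = np.array([0, 0, 1, 0, 1, 0, 0, 1, 0, 0])

print(roc_auc_score(event, par))   # discrimination, cf. the AUROC of 0.82

# Operating point used above: PAR >= 4 flags a patient as high risk
pred = par >= 4
sens = (pred & (event == 1)).sum() / (event == 1).sum()
spec = (~pred & (event == 0)).sum() / (event == 0).sum()
print(sens, spec)
```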

Edelson, Dana P.; Retzer, Elizabeth; Weidman, Elizabeth K.; Walsh, Deborah; Woodruff, James; Cua, Jefferson L.; Schmitz, Amanda; Davis, Andrew M.; Minsky, Bruce D.; Meadow, William; Vanden Hoek, Terry L.; Meltzer, David O.

2012-01-01

391

QUANTIFYING THE POTENTIAL IMPACTS OF ATMS ON AIR QUALITY Bruce Hellinga  

E-print Network

Presented at the Canadian Society of Mechanical Engineers Forum 98, Symposium on Recent Advances ... the relationships between various traffic management options and the resulting air quality impacts are generally ...

Hellinga, Bruce

392

Quantifying nonstationary radioactivity concentration fluctuations near Chernobyl: A complete statistical description  

E-print Network

... fluctuations measured near Chernobyl after the 1986 disaster and find three new results: (i) the histogram of fluctuations is well ... PACS numbers: 89.60.+x, 02.50.Fz, 05.45.Tp, 87.66.Na. I. INTRODUCTION: Chernobyl's No. 4 reactor was completely destroyed ...

Stanley, H. Eugene

393

Quantifying nonstationary radioactivity concentration fluctuations near Chernobyl: A complete statistical description  

E-print Network

... fluctuations measured near Chernobyl after the 1986 disaster and find three new results: (i) the histogram ... patterns. PACS numbers: 89.60.+x, 02.50.Fz, 05.45.Tp, 87.66.Na. I. INTRODUCTION: Chernobyl's No. 4 reactor ...

Shlyakhter, Ilya

394

A method for quantifying dynamic muscle dysfunction in children and young adults with cerebral palsy  

Microsoft Academic Search

Cerebral palsy (CP) is caused by a lesion to the brain resulting in adaptations to the structure and function of the muscles and compromised mobility. Spastic cerebral palsy is commonly assessed by the limb kinematics and kinetics measured in a gait laboratory. However, these measures do not directly quantify the patterns of muscle dysfunction that occur during movements. Recent studies

James Wakeling; Roisin Delaney; Israel Dudkiewicz

2007-01-01

395

Quantifying Utility and Trustworthiness for Advice Shared on Online Social Media  

E-print Network

The growing popularity of social media in recent years has resulted in the creation of an enormous amount ... and utility for social media content. We identify the necessity and challenges for their assessment ...

Liu, Huan

396

Quantifying the error in estimated transfer functions with application to model order selection  

Microsoft Academic Search

Previous results on estimating errors or error bounds on identified transfer functions have relied on prior assumptions about the noise and the unmodeled dynamics. This prior information took the form of parameterized bounding functions or parameterized probability density functions, in the time or frequency domain with known parameters. It is shown that the parameters that quantify this prior information can

Graham C. Goodwin; Michel Gevers; Brett Ninness

1992-01-01

397

Quantifying randomness of clinician mobility and interaction in emergency department using entropy  

Microsoft Academic Search

Entropy is a fundamental measure of randomness in a time series of data. In this paper, we use entropy to quantify the randomness of events in the workflow in an emergency department (ED). We collect data using Radio Identification (RID) sensor system and compute the entropy of mobility and interaction events generated from behaviors of each tagged clinician. The result
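
Shannon entropy over a clinician's event-type distribution is the measure being described; a minimal sketch (the event stream is invented):

```python
import numpy as np
from collections import Counter

def event_entropy(events):
    """Shannon entropy (bits) of an event-type distribution."""
    counts = np.array(list(Counter(events).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Hypothetical sensor event stream for one tagged clinician
events = ["desk", "bed3", "desk", "nurse", "bed3", "desk", "lab"]
print(event_entropy(events))   # higher values = more random mobility/interaction
```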

Min Zhang; Zhe Li; Xiaohui Kong; Jiajie Zhang; Vimla Patel

2010-01-01

398

PREDICTING PERFORMANCE AND QUANTIFYING CORPORATE GOVERNANCE RISK FOR LATIN AMERICAN ADRS AND BANKS  

E-print Network

... corporate governance risk in the case of Latin American markets. We compare our results using Adaboost with logistic regression ... on a sample of Latin American American Depository Receipts (ADRs), and on another sample of Latin American banks. We ...

Freund, Yoav

399

Quantifying Aircraft Hidden Corrosion by Using Multi-Modal NDI Techniques  

Microsoft Academic Search

Conventional eddy current and pulsed eddy current techniques were employed to detect hidden corrosion in aircraft structures. To quantify the inspection results, a function that relates the nondestructive inspection (NDI) measurements to the material thickness is required. Conventionally, a calibration curve is derived from the calibration procedure for each inspection method. To take advantage of the multiple NDI inspections, in

Zheng Liu; D. S. Forsyth; B. A. Lepine; S. Safizadeh; A. Fahr

2004-01-01

400

Quantifying and comparing a P2 program's benefits: pollution prevention technical assistance in Nebraska  

Microsoft Academic Search

Past clients who were provided pollution prevention technical assistance were reassessed to quantify program impact in terms of implementation, money saved and waste reduced. Although the most complex and in-depth projects resulted in the largest savings and waste reduction on a per client basis, small business clients realized similar monetary and solid waste savings as larger industrial clients when normalized

Doug J. Youngblood; Bruce I. Dvorak; Wayne E. Woldt; Stacey A. Hawkey; Jan R. Hygnstrom

2008-01-01

401

COMPARISON OF MEASUREMENT TECHNIQUES FOR QUANTIFYING SELECTED ORGANIC EMISSIONS FROM KEROSENE SPACE HEATERS  

EPA Science Inventory

The report gives results of (1) a comparison of the hood and chamber techniques for quantifying pollutant emission rates from unvented combustion appliances, and (2) an assessment of the semivolatile and nonvolatile organic-compound emissions from unvented kerosene space heaters. ...

402

Quantifying uncertainties in the assessment of sediment quality: Statistical criteria and guidelines for sediment quality assessments  

Microsoft Academic Search

Current sediment quality guidelines generally adopt a tiered approach in order to assess sediment quality more cost-effectively. The uncertainties involved in the tiered approach of an integrative assessment, however, have not been quantified, resulting in a risk of committing a type I or type II error at the final confirmatory stage. This study develops statistical criteria and guidelines for the

Y. H. Chang; M. D. Scrimshaw; J. N. Lester

2004-01-01

403

Preliminary Results from CONTRAST  

NASA Astrophysics Data System (ADS)

The CONvective TRansport of Active Species in the Tropics (CONTRAST) experiment is designed to quantify how convection redistributes atmospheric gases in the tropical atmosphere. Observations will be obtained by the NSF/NCAR HIAPER aircraft from a series of flights in Guam during January and February 2014. We will describe preliminary results from the CONTRAST experiment.

Salawitch, Ross J.; Pan, Laura; Atlas, Elliot

2014-05-01

404

Quantifying nanoscale order in amorphous materials: simulating fluctuation electron microscopy of amorphous silicon  

NASA Astrophysics Data System (ADS)

Fluctuation electron microscopy (FEM) is explicitly sensitive to 3- and 4-body atomic correlation functions in amorphous materials; this is sufficient to establish the existence of structural order on the nanoscale, even when the radial distribution function extracted from diffraction data appears entirely amorphous. However, it remains a formidable challenge to invert the FEM data into a quantitative model of the structure. Here, we quantify the FEM method for a-Si by forward simulating the FEM data from a family of high quality atomistic models. Using a modified WWW method, we construct computational models that contain 10-40 vol% of topologically crystalline grains, 1-3 nm in diameter, in an amorphous matrix and calculate the FEM signal, which consists of the statistical variance V (k) of the dark-field image as a function of scattering vector k. We show that V (k) is a complex function of the size and volume fraction of the ordered regions present in the amorphous matrix. However, the ratio of the variance peaks as a function of k affords the size of the ordered regions; and the magnitude of the variance affords a semi-quantitative measure of the volume fraction. We have also compared models that contain various amounts of strain in the ordered regions. This analysis shows that the amount of strain in realistic models is sufficient to mute variance peaks at high k. We conclude with a comparison between the model results and experimental data.
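
The FEM variance V(k) has a standard normalized form; a minimal sketch computing it from a stack of simulated dark-field images, one per scattering vector k (the Poisson toy stack is illustrative, not the paper's simulation):

```python
import numpy as np

def fem_variance(images):
    """Normalized FEM variance V(k) from a stack of dark-field images.

    images: array of shape (n_k, ny, nx), one image per scattering vector k.
    V(k) = <I^2> / <I>^2 - 1, averaged over image positions.
    """
    I = images.reshape(images.shape[0], -1).astype(float)
    return (I ** 2).mean(axis=1) / I.mean(axis=1) ** 2 - 1.0

# Toy stack: uniform images give V = 0; speckled images give V > 0
imgs = np.random.poisson(lam=20.0, size=(5, 64, 64))
print(fem_variance(imgs))   # about 1/20 for pure Poisson speckle
```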

Bogle, Stephanie N.; Voyles, Paul M.; Khare, Sanjay V.; Abelson, John R.

2007-11-01

405

Quantifying Spatial Genetic Structuring in Mesophotic Populations of the Precious Coral Corallium rubrum  

PubMed Central

While shallow water red coral populations have been overharvested in the past, nowadays commercial harvesting has shifted its pressure onto mesophotic organisms. An understanding of red coral population structure, particularly larval dispersal patterns and connectivity among harvested populations, is paramount to the viability of the species. In order to determine patterns of genetic spatial structuring of deep water Corallium rubrum populations, for the first time, colonies found between 58 and 118 m depth within the Tyrrhenian Sea were collected and analyzed. Ten microsatellite loci and two regions of mitochondrial DNA (mtMSH and mtC) were used to quantify patterns of genetic diversity within populations and to define population structuring at spatial scales from tens of metres to hundreds of kilometres. Microsatellites showed heterozygote deficiencies in all populations. Significant levels of genetic differentiation were observed at all investigated spatial scales, suggesting that populations are likely to be isolated. This differentiation may be the result of biological interactions occurring at a small spatial scale and/or abiotic factors acting at a larger scale. Mitochondrial markers revealed significant genetic structuring at spatial scales greater than 100 km, showing the occurrence of a barrier to gene flow between northern and southern Tyrrhenian populations. These findings provide support for the establishment of marine protected areas in the deep sea and off-shore reefs, in order to effectively maintain genetic diversity of mesophotic red coral populations. PMID:23646109

Costantini, Federica; Carlesi, Lorenzo; Abbiati, Marco

2013-01-01

406

Baltimore WATERS Test Bed -- Quantifying Groundwater in Urban Areas  

NASA Astrophysics Data System (ADS)

The purpose of this project is to quantify the urban water cycle, with an emphasis on urban groundwater, using investigations at multiple spatial scales. The overall study focuses on the 171 sq km Gwynns Falls watershed, which spans an urban to rural gradient of land cover and is part of the Baltimore Ecosystem Study LTER. Within the Gwynns Falls, finer-scale studies focus on the 14.3 sq km Dead Run and its subwatersheds. A coarse-grid MODFLOW model has been set up to quantify groundwater flow magnitude and direction at the larger watershed scale. Existing wells in this urban area are sparse, but are being located through mining of the USGS NWIS and local well databases. Wet- and dry-season water level synoptics, stream seepage transects, and existing permeability data are being used in model calibration. In collaboration with CUAHSI HMF Geophysics, a regional-scale microgravity survey was conducted over the watershed in July 2007 and will be repeated in spring 2008. This will enable calculation of the change in groundwater levels for use in model calibration. At the smaller spatial scale (Dead Run catchment), three types of data have been collected to refine our understanding of the groundwater system. (1) Multiple bromide tracer tests were conducted along a 4 km reach of Dead Run under low-flow conditions to examine groundwater–surface water exchange as a function of land cover type and stream position in the watershed. The tests will be repeated under higher base-flow conditions in early spring 2008. Tracer test data will be interpreted using the USGS OTIS model and the results will be incorporated into the MODFLOW model. (2) Riparian-zone geophysical surveys were carried out with support from CUAHSI HMF Geophysics to delineate depth to bedrock and the water table topography as a function of distance from the stream channel. Resistivity, ground-penetrating radar, and seismic refraction surveys were run in ten transects across and around the stream channels. (3) A finer-scale microgravity survey was conducted over this area and will also be repeated in spring 2008. Efforts to quantify other components of the water cycle include: (1) deployment of an eddy covariance station for ET measurement; (2) mining of flow-metering records; (3) evaluation of long-term streamflow data records; and (4) processing of precipitation fields. The objective of the precipitation analysis is to obtain rainfall fields at a spatial scale of 1 sq km for the study area. Analyses are based on rain gage observations and radar reflectivity observations from the Sterling, Virginia WSR-88D radar. Radar rainfall analyses utilize the HydroNEXRAD system. Data are being managed using the CUAHSI HIS Observations Data Model housed on a HIS server. The dataset will be made accessible through web services and the Data Access System for Hydrology.
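
The time-lapse microgravity approach can be illustrated with the common infinite-slab (Bouguer) approximation, under which a change in water-table height produces a gravity change of roughly 41.9 uGal per metre of water per unit specific yield. A minimal sketch, with an assumed specific yield and gravity reading rather than project values:

```python
import math

# Infinite Bouguer slab: delta_g = 2 * pi * G * rho_w * Sy * delta_h
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
RHO_W = 1000.0  # density of water, kg/m^3
UGAL = 1e-8     # 1 microGal expressed in m/s^2

def water_table_change(delta_g_ugal, specific_yield):
    """Water-table change (m) implied by a time-lapse gravity change (uGal)."""
    slab = 2 * math.pi * G * RHO_W * specific_yield  # (m/s^2) per metre of water
    return delta_g_ugal * UGAL / slab

# Assumed example: a 10 uGal seasonal decrease with specific yield 0.15
print(f"{water_table_change(-10.0, 0.15):.2f} m")  # about -1.59 m
```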

Welty, C.; Miller, A. J.; Ryan, R. J.; Crook, N.; Kerchkof, T.; Larson, P.; Smith, J.; Baeck, M. L.; Kaushal, S.; Belt, K.; McGuire, M.; Scanlon, T.; Warner, J.; Shedlock, R.; Band, L.; Groffman, P.

2007-12-01

407

Rapidly quantifying the relative distention of a human bladder  

NASA Technical Reports Server (NTRS)

A device and method were developed to rapidly quantify the relative distention of the bladder of a human subject. An ultrasonic transducer is positioned on the human subject near the bladder. A microprocessor-controlled pulser excites the transducer, which sends an acoustic wave into the human subject. This wave interacts with the bladder walls and is reflected back to the ultrasonic transducer, where it is received, amplified, and processed by the receiver. The resulting signal is digitized by an analog-to-digital converter, also controlled by the microprocessor, and is stored in data memory. The software in the microprocessor determines the relative distention of the bladder as a function of the propagated ultrasonic energy. Based on programmed scientific measurements and the human subject's past history as contained in program memory, the microprocessor sends out a signal to turn on any or all of the available alarms. The alarm system includes an audible alarm, a visible alarm, a tactile alarm, and a remote wireless alarm.
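
The abstract does not spell out the signal processing, but one plausible reading of the pulse-echo step is estimating the separation of the bladder walls from the time delay between echoes. A minimal sketch under assumed values for the sampling rate, tissue sound speed, and detection threshold (none of which come from the patent):

```python
import numpy as np

FS = 10e6        # sampling rate of the A/D converter, Hz (assumed)
C_TISSUE = 1540  # speed of sound in soft tissue, m/s (textbook value)

def wall_separation(echo_trace, threshold=0.5):
    """Estimate front-to-back bladder wall separation from a pulse-echo trace.

    echo_trace: digitized, rectified echo amplitudes (arbitrary units).
    Returns the distance in metres between the first two echoes crossing the
    threshold, using d = c * dt / 2 for the two-way travel path.
    """
    above = np.flatnonzero(echo_trace > threshold * echo_trace.max())
    # group threshold crossings into distinct echoes by gaps between indices
    gaps = np.flatnonzero(np.diff(above) > 50)
    if gaps.size < 1:
        return None  # fewer than two echoes found
    first, second = above[0], above[gaps[0] + 1]
    dt = (second - first) / FS
    return C_TISSUE * dt / 2.0

# Synthetic trace with two echoes about 40 us apart (~3 cm separation):
t = np.arange(4096)
trace = np.exp(-((t - 500) / 20) ** 2) + 0.8 * np.exp(-((t - 900) / 25) ** 2)
print(f"{wall_separation(trace) * 100:.1f} cm")
```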

Companion, John A. (inventor); Heyman, Joseph S. (inventor); Mineo, Beth A. (inventor); Cavalier, Albert R. (inventor); Blalock, Travis N. (inventor)

1991-01-01

408

Rapidly quantifying the relative distention of a human bladder  

NASA Technical Reports Server (NTRS)

A device and method of rapidly quantifying the relative distention of the bladder in a human subject are disclosed. The ultrasonic transducer, which is positioned on the subject in proximity to the bladder, is excited by a pulser under the command of a microprocessor to launch an acoustic wave into the patient. This wave interacts with the bladder walls and is reflected back to the ultrasonic transducer, where it is received, amplified, and processed by the receiver. The resulting