Sample records for large-sample field test

  1. [Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].

    PubMed

    Suzukawa, Yumi; Toyoda, Hideki

    2012-04-01

    This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in fields such as perception, cognition, and learning, effect sizes were relatively large although sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In the other fields, because of the large sample sizes, even meaningless effects could be detected. This implies that researchers who cannot obtain large enough effect sizes tend to use larger samples to achieve significant results.
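
    The trade-off this abstract describes (meaningful effects missed at small n, negligible effects flagged at large n) can be made concrete with a rough power calculation. Below is a minimal sketch using the normal approximation to a two-sided, two-sample test; the effect sizes and sample sizes are illustrative and not taken from the journal survey.

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def power_two_sample(d, n_per_group, alpha_z=1.96):
    """Approximate power of a two-sided, two-sample test for a
    standardized effect size d (normal approximation, alpha = 0.05)."""
    noncentrality = d * math.sqrt(n_per_group / 2.0)
    return 1.0 - normal_cdf(alpha_z - noncentrality)

# Large effect, small sample: power ~0.59, so real effects are often missed.
print(round(power_two_sample(0.8, 15), 2))
# Tiny effect, very large sample: power ~0.89, so trivial effects reach significance.
print(round(power_two_sample(0.1, 2000), 2))
```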

  2. Field size, length, and width distributions based on LACIE ground truth data. [large area crop inventory experiment]

    NASA Technical Reports Server (NTRS)

    Pitts, D. E.; Badhwar, G.

    1980-01-01

    The development of agricultural remote sensing systems requires knowledge of agricultural field size distributions so that the sensors, sampling frames, image interpretation schemes, registration systems, and classification systems can be properly designed. Malila et al. (1976) studied the field size distribution for wheat and all other crops in two Kansas LACIE (Large Area Crop Inventory Experiment) intensive test sites using ground observations of the crops and measurements of their field areas based on current year rectified aerial photomaps. The field area and size distributions reported in the present investigation are derived from a representative subset of a stratified random sample of LACIE sample segments. In contrast to previous work, the obtained results indicate that most field-size distributions are not log-normally distributed. The most common field size observed in this study was 10 acres for most crops studied.

  3. Validation of the Puumala virus rapid field test for bank voles in Germany.

    PubMed

    Reil, D; Imholt, C; Rosenfeld, U M; Drewes, S; Fischer, S; Heuser, E; Petraityte-Burneikiene, R; Ulrich, R G; Jacob, J

    2017-02-01

    Puumala virus (PUUV) causes many human infections in large parts of Europe and can lead to mild to moderate disease. The bank vole (Myodes glareolus) is the only reservoir of PUUV in Central Europe. A commercial PUUV rapid field test for rodents was validated for bank-vole blood samples collected in two PUUV-endemic regions in Germany (North Rhine-Westphalia and Baden-Württemberg). A comparison of the results of the rapid field test and standard ELISAs indicated a test efficacy of 93-95%, largely independent of the origin of the antigens used in the ELISA. In ELISAs, reactivity for the German PUUV strain was higher compared to the Swedish strain but not compared to the Finnish strain, which was used for the rapid field test. In conclusion, the use of the rapid field test can facilitate short-term estimation of PUUV seroprevalence in bank-vole populations in Germany and can aid in assessing human PUUV infection risk.

  4. Identifying Microlensing Events in Large, Non-Uniformly Sampled Surveys: The Case of the Palomar Transient Factory

    NASA Astrophysics Data System (ADS)

    Price-Whelan, Adrian M.; Agueros, M. A.; Fournier, A.; Street, R.; Ofek, E.; Levitan, D. B.; PTF Collaboration

    2013-01-01

    Many current photometric, time-domain surveys are driven by specific goals such as searches for supernovae or transiting exoplanets, or studies of stellar variability. These goals in turn set the cadence with which individual fields are re-imaged. In the case of the Palomar Transient Factory (PTF), several such sub-surveys are being conducted in parallel, leading to extremely non-uniform sampling over the survey's nearly 20,000 sq. deg. footprint. While the typical 7.26 sq. deg. PTF field has been imaged 20 times in R-band, ~2300 sq. deg. have been observed more than 100 times. We use the existing PTF data (6.4 × 10^7 light curves) to study the trade-off that occurs when searching for microlensing events when one has access to a large survey footprint with irregular sampling. To examine the probability that microlensing events can be recovered in these data, we also test previous statistics used on uniformly sampled data to identify variables and transients. We find that one such statistic, the von Neumann ratio, performs best for identifying simulated microlensing events. We develop a selection method using this statistic and apply it to data from all PTF fields with >100 observations to uncover a number of interesting candidate events. This work can help constrain all-sky event rate predictions and tests microlensing signal recovery in large datasets, both of which will be useful to future wide-field, time-domain surveys such as the LSST.

  5. Statistical Searches for Microlensing Events in Large, Non-uniformly Sampled Time-Domain Surveys: A Test Using Palomar Transient Factory Data

    NASA Astrophysics Data System (ADS)

    Price-Whelan, Adrian M.; Agüeros, Marcel A.; Fournier, Amanda P.; Street, Rachel; Ofek, Eran O.; Covey, Kevin R.; Levitan, David; Laher, Russ R.; Sesar, Branimir; Surace, Jason

    2014-01-01

    Many photometric time-domain surveys are driven by specific goals, such as searches for supernovae or transiting exoplanets, which set the cadence with which fields are re-imaged. In the case of the Palomar Transient Factory (PTF), several sub-surveys are conducted in parallel, leading to non-uniform sampling over its ~20,000 deg^2 footprint. While the median 7.26 deg^2 PTF field has been imaged ~40 times in the R band, ~2300 deg^2 have been observed >100 times. We use PTF data to study the trade-off between searching for microlensing events in a survey whose footprint is much larger than that of typical microlensing searches, but with far-from-optimal time sampling. To examine the probability that microlensing events can be recovered in these data, we test statistics used on uniformly sampled data to identify variables and transients. We find that the von Neumann ratio performs best for identifying simulated microlensing events in our data. We develop a selection method using this statistic and apply it to data from fields with >10 R-band observations, 1.1 × 10^9 light curves, uncovering three candidate microlensing events. We lack simultaneous, multi-color photometry to confirm these as microlensing events. However, their number is consistent with predictions for the event rate in the PTF footprint over the survey's three years of operations, as estimated from near-field microlensing models. This work can help constrain all-sky event rate predictions and tests microlensing signal recovery in large data sets, which will be useful to future time-domain surveys, such as that planned with the Large Synoptic Survey Telescope.
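
    The von Neumann ratio that both PTF studies single out is easy to state: the mean squared successive difference divided by the sample variance. For uncorrelated noise it is close to 2; smooth, correlated variation such as a microlensing bump pulls it well below 2. A minimal sketch with simulated light curves (the bump shape and amplitude are illustrative, not the PTF pipeline):

```python
import random

def von_neumann_ratio(y):
    """Mean squared successive difference over sample variance:
    ~2 for uncorrelated noise, smaller for smooth, correlated signals."""
    n = len(y)
    mean = sum(y) / n
    var = sum((v - mean) ** 2 for v in y) / (n - 1)
    mssd = sum((y[i + 1] - y[i]) ** 2 for i in range(n - 1)) / (n - 1)
    return mssd / var

rng = random.Random(42)
n = 200
noise = [rng.gauss(0.0, 1.0) for _ in range(n)]
# Noise plus a crude, symmetric bump standing in for a microlensing event:
bump = [rng.gauss(0.0, 1.0) + 6.0 / (1.0 + ((i - 100) / 15.0) ** 2)
        for i in range(n)]
print(von_neumann_ratio(noise) > von_neumann_ratio(bump))  # the bump lowers the ratio
```

    Because the statistic only involves successive differences and the overall variance, it tolerates the irregular sampling described above better than period-search statistics do.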

  6. Differences in results of analyses of concurrent and split stream-water samples collected and analyzed by the US Geological Survey and the Illinois Environmental Protection Agency, 1985-91

    USGS Publications Warehouse

    Melching, C.S.; Coupe, R.H.

    1995-01-01

    During water years 1985-91, the U.S. Geological Survey (USGS) and the Illinois Environmental Protection Agency (IEPA) cooperated in the collection and analysis of concurrent and split stream-water samples from selected sites in Illinois. Concurrent samples were collected independently by field personnel from each agency at the same time and sent to the IEPA laboratory, whereas the split samples were collected by USGS field personnel and divided into aliquots that were sent to each agency's laboratory for analysis. The water-quality data from these programs were examined by means of the Wilcoxon signed ranks test to identify statistically significant differences between results of the USGS and IEPA analyses. The data sets for constituents and properties identified by the Wilcoxon test as having significant differences were further examined by use of the paired t-test, mean relative percentage difference, and scattergrams to determine if the differences were important. Of the 63 constituents and properties in the concurrent-sample analysis, differences in only 2 (pH and ammonia) were statistically significant and large enough to concern water-quality engineers and planners. Of the 27 constituents and properties in the split-sample analysis, differences in 9 (turbidity, dissolved potassium, ammonia, total phosphorus, dissolved aluminum, dissolved barium, dissolved iron, dissolved manganese, and dissolved nickel) were statistically significant and large enough to concern water-quality engineers and planners. The differences in concentration between pairs of concurrent samples were compared to the precision of the laboratory or field method used. The differences in concentration between pairs of split samples were compared to the precision of the laboratory method used and the interlaboratory precision of measuring a given concentration or property. Consideration of method precision indicated that differences between concurrent samples were insignificant for all concentrations and properties except pH, and that differences between split samples were significant for all concentrations and properties. Consideration of interlaboratory precision indicated that the differences between the split samples were not unusually large. The results for the split samples illustrate the difficulty in obtaining comparable and accurate water-quality data.
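
    The screening step described above begins with a Wilcoxon signed-rank test on paired agency results. A minimal sketch of the statistic, using hypothetical paired concentration values rather than actual USGS/IEPA data:

```python
def wilcoxon_w(x, y):
    """Smaller of the positive/negative signed-rank sums for paired data;
    zero differences are dropped and tied |d| get average ranks."""
    d = [b - a for a, b in zip(x, y) if b != a]
    order = sorted(range(len(d)), key=lambda i: abs(d[i]))
    ranks = [0.0] * len(d)
    i = 0
    while i < len(order):
        j = i
        while j < len(order) and abs(d[order[j]]) == abs(d[order[i]]):
            j += 1
        for k in range(i, j):
            ranks[order[k]] = (i + j + 1) / 2.0  # average of 1-based ranks i+1..j
        i = j
    w_plus = sum(r for r, v in zip(ranks, d) if v > 0)
    w_minus = sum(r for r, v in zip(ranks, d) if v < 0)
    return min(w_plus, w_minus)

# Hypothetical paired determinations (e.g. a constituent in ug/L) by two labs:
lab_a = [12, 15, 11, 14, 13, 16]
lab_b = [14, 15, 13, 16, 12, 18]
print(wilcoxon_w(lab_a, lab_b))  # a small W suggests a systematic difference
```

    A small W relative to its null distribution flags a systematic difference; the study then judged separately whether each flagged difference was large enough to matter in practice.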

  7. Temperature-and field dependent characterization of a twisted stacked-tape cable

    NASA Astrophysics Data System (ADS)

    Barth, C.; Takayasu, M.; Bagrets, N.; Bayer, C. M.; Weiss, K.-P.; Lange, C.

    2015-04-01

    The twisted stacked-tape cable (TSTC) is one of the major high-temperature superconductor cable concepts, combining scalability, ease of fabrication, and high current density, which makes it a possible candidate conductor for large-scale magnets. To simulate the boundary conditions of such magnets, as well as the temperature dependence of TSTCs, a 1.16 m long sample consisting of 40 SuperPower REBCO tapes, each 4 mm wide, is characterized using the ‘FBI’ (force-field-current) superconductor test facility of the Institute for Technical Physics of the Karlsruhe Institute of Technology. In a first step, the magnetic background field is cycled while measuring the current carrying capabilities to determine the impact of Lorentz forces on the TSTC sample performance. In the first field cycle, the critical current of the TSTC sample is tested up to 12 T. A significant Lorentz force of up to 65.6 kN m^-1 at the maximal magnetic background field of 12 T results in an 11.8% irreversible degradation of the current carrying capabilities. The degradation saturates (critical cable current of 5.46 kA at 4.2 K and 12 T background field) and does not increase in subsequent field cycles. In a second step, the sample is characterized at different background fields (4-12 T) and surface temperatures (4.2-37.8 K) utilizing the variable temperature insert of the ‘FBI’ test facility. In a third step, the performance along the length of the sample is determined at 77 K, self-field. A 15% degradation is obtained for the central part of the sample, which was within the high-field region of the magnet during the in-field measurements.
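
    As a quick consistency check, the quoted peak Lorentz load follows from the per-unit-length force f = B·I at full background field and the saturated critical current:

```python
B = 12.0       # T, maximal background field
I = 5.46e3     # A, critical cable current after the irreversible degradation
f = B * I      # N/m, Lorentz force per unit cable length
print(round(f / 1e3, 1))  # 65.5 kN/m, consistent with the ~65.6 kN/m quoted
```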

  8. Application of laboratory permeability data

    USGS Publications Warehouse

    Johnson, A.I.

    1963-01-01

    Some of the basic material contained in this report originally was prepared in 1952 as instructional handouts for ground-water short courses and for training of foreign participants. The material has been revised and expanded and is presented in the present form to make it more readily available to the field hydrologist. Illustrations now present published examples of the applications suggested in the 1952 material. For small areas, a field pumping test is sufficient to predict the characteristics of an aquifer. With a large area under study, the aquifer properties must be determined at many different locations and it is not usually economically feasible to make sufficient field tests to define the aquifer properties in detail for the whole aquifer. By supplementing a few field tests with laboratory permeability data and geologic interpretation, more point measurements representative of the hydrologic properties of the aquifer may be obtained. A sufficient number of samples seldom can be obtained to completely identify the permeability or transmissibility in detail for a project area. However, a few judiciously chosen samples of high quality, combined with good geologic interpretation, often will permit the extrapolation of permeability information over a large area with a fair degree of reliability. The importance of adequate geologic information, as well as the importance of collecting samples representative of at least all major textural units lying within the section or area of study, cannot be overemphasized.

  9. Fast solver for large scale eddy current non-destructive evaluation problems

    NASA Astrophysics Data System (ADS)

    Lei, Naiguang

    Eddy current testing plays a very important role in non-destructive evaluations of conducting test samples. Based on Faraday's law, an alternating magnetic field source generates induced currents, called eddy currents, in an electrically conducting test specimen. The eddy currents generate induced magnetic fields that oppose the direction of the inducing magnetic field in accordance with Lenz's law. In the presence of discontinuities in material property or defects in the test specimen, the induced eddy current paths are perturbed and the associated magnetic fields can be detected by coils or magnetic field sensors, such as Hall elements or magneto-resistance sensors. Due to the complexity of the test specimen and the inspection environments, the availability of theoretical simulation models is extremely valuable for studying the basic field/flaw interactions in order to obtain a fuller understanding of non-destructive testing phenomena. Theoretical models of the forward problem are also useful for training and validation of automated defect detection systems, since they generate defect signatures that are expensive to replicate experimentally. In general, modelling methods can be classified into two categories: analytical and numerical. Although analytical approaches offer closed-form solutions, such solutions are generally unobtainable for the complex sample and defect geometries encountered in practice, especially in three-dimensional space. Numerical modelling has become popular with advances in computer technology and computational methods. However, because large-scale problems are enormously time consuming, accelerations and fast solvers are needed to enhance numerical models. This dissertation describes a numerical simulation model for eddy current problems using finite element analysis. Validation of the accuracy of this model is demonstrated via comparison with experimental measurements of steam generator tube wall defects. These simulations, which generate two-dimensional raster-scan data, typically take one to two days on a dedicated eight-core PC. A novel direct integral solver for eddy current problems, with a GPU-based implementation, is also investigated in this research to reduce the computational time.

  10. Field trials of line transect methods applied to estimation of desert tortoise abundance

    USGS Publications Warehouse

    Anderson, David R.; Burnham, Kenneth P.; Lubow, Bruce C.; Thomas, Len; Corn, Paul Stephen; Medica, Philip A.; Marlow, R.W.

    2001-01-01

    We examine the degree to which field observers can meet the assumptions underlying line transect sampling to monitor populations of desert tortoises (Gopherus agassizii). We present the results of 2 field trials using artificial tortoise models in 3 size classes. The trials were conducted on 2 occasions on an area south of Las Vegas, Nevada, where the density of the test population was known. In the first trials, conducted largely by experienced biologists who had been involved in tortoise surveys for many years, the density of adult tortoise models was well estimated (-3.9% bias), while the bias was higher (-20%) for subadult tortoise models. The bias for combined data was -12.0%. The bias was largely attributed to the failure to detect all tortoise models on or near the transect centerline. The second trials were conducted with a group of largely inexperienced student volunteers and used somewhat different searching methods, and the results were similar to the first trials. Estimated combined density of subadult and adult tortoise models had a negative bias (-7.3%), again attributable to failure to detect some models on or near the centerline. Experience in desert tortoise biology, either comparing the first and second trials or in the second trial with 2 experienced biologists versus 16 novices, did not have an apparent effect on the quality of the data or the accuracy of the estimates. Observer training, specific to line transect sampling, and field testing are important components of a reliable survey. Line transect sampling represents a viable method for large-scale monitoring of populations of desert tortoise; however, field protocol must be improved to assure the key assumptions are met.
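
    The estimator these trials exercise is the standard line-transect form D = n / (2 L mu), where mu is the effective strip half-width implied by the detection function. The sketch below assumes a half-normal detection function and invented numbers throughout; it also shows the mechanism behind the reported negative bias: animals missed on or near the centerline shrink n while leaving the fitted width essentially unchanged.

```python
import math
import random

def halfnormal_density(distances, L):
    """Line-transect density estimate D = n / (2 L mu) with a half-normal
    detection function g(x) = exp(-x^2 / (2 sigma^2)), sigma fit by MLE."""
    n = len(distances)
    sigma = math.sqrt(sum(x * x for x in distances) / n)
    mu = sigma * math.sqrt(math.pi / 2.0)  # effective strip half-width
    return n / (2.0 * L * mu)

rng = random.Random(7)
L, w = 5000.0, 20.0              # transect length and strip half-width, m
sigma_true, D_true = 5.0, 0.002  # detection scale (m), true density (per m^2)
n_animals = int(D_true * 2 * w * L)  # 400 animals placed in the strip
seen = [x for x in (rng.uniform(0.0, w) for _ in range(n_animals))
        if rng.random() < math.exp(-x * x / (2 * sigma_true ** 2))]
print(round(halfnormal_density(seen, L), 4))  # close to the true 0.002
```

    If g(0) < 1 (some animals on the centerline are missed), the count n drops roughly in proportion while sigma, and hence mu, barely changes, so the density estimate is biased low, which is exactly what both field trials found.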

  11. Potential, velocity, and density fields from sparse and noisy redshift-distance samples - Method

    NASA Technical Reports Server (NTRS)

    Dekel, Avishai; Bertschinger, Edmund; Faber, Sandra M.

    1990-01-01

    A method for recovering the three-dimensional potential, velocity, and density fields from large-scale redshift-distance samples is described. Galaxies are taken as tracers of the velocity field, not of the mass. The density field and the initial conditions are calculated using an iterative procedure that applies the no-vorticity assumption at an initial time and uses the Zel'dovich approximation to relate initial and final positions of particles on a grid. The method is tested using a cosmological N-body simulation 'observed' at the positions of real galaxies in a redshift-distance sample, taking into account their distance measurement errors. Malmquist bias and other systematic and statistical errors are extensively explored using both analytical techniques and Monte Carlo simulations.
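
    The Zel'dovich step in the iteration can be written compactly (standard textbook form, not necessarily the paper's exact notation): particles move along a time-independent, irrotational displacement field scaled by the linear growth factor D(t),

```latex
\mathbf{x}(\mathbf{q},t) = \mathbf{q} + D(t)\,\boldsymbol{\psi}(\mathbf{q}),
\qquad
\mathbf{v} = a\,\dot{D}(t)\,\boldsymbol{\psi}(\mathbf{q}),
\qquad
\delta(\mathbf{x},t) \simeq -D(t)\,\nabla_{\mathbf{q}}\cdot\boldsymbol{\psi}(\mathbf{q}).
```

    The no-vorticity assumption, \(\boldsymbol{\psi} = \nabla_{\mathbf{q}}\phi(\mathbf{q})\), is what allows the full three-dimensional velocity, potential, and density fields to be reconstructed from the radial velocity components that redshift-distance samples actually provide.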

  12. An improved filter elution and cell culture assay procedure for evaluating public groundwater systems for culturable enteroviruses.

    PubMed

    Dahling, Daniel R

    2002-01-01

    Large-scale virus studies of groundwater systems require practical and sensitive procedures for both sample processing and viral assay. Filter adsorption-elution procedures have traditionally been used to process large-volume water samples for viruses. In this study, five filter elution procedures using cartridge filters were evaluated for their effectiveness in processing samples. Of the five procedures tested, the third method, which incorporated two separate beef extract elutions (one being an overnight filter immersion in beef extract), recovered 95% of seeded poliovirus compared with recoveries of 36 to 70% for the other methods. For viral enumeration, an expanded roller bottle quantal assay was evaluated using seeded poliovirus. This cytopathic-based method was considerably more sensitive than the standard plaque assay method. The roller bottle system was more economical than the plaque assay for the evaluation of comparable samples. Using roller bottles required less time and manipulation than the plaque procedure and greatly facilitated the examination of large numbers of samples. The combination of the improved filter elution procedure and the roller bottle assay for viral analysis makes large-scale virus studies of groundwater systems practical. This procedure was subsequently field tested during a groundwater study in which large-volume samples (exceeding 800 L) were processed through the filters.

  13. Combined target factor analysis and Bayesian soft-classification of interference-contaminated samples: forensic fire debris analysis.

    PubMed

    Williams, Mary R; Sigman, Michael E; Lewis, Jennifer; Pitan, Kelly McHugh

    2012-10-10

    A Bayesian soft classification method combined with target factor analysis (TFA) is described and tested for the analysis of fire debris data. The method relies on analysis of the average mass spectrum across the chromatographic profile (i.e., the total ion spectrum, TIS) from multiple samples taken from a single fire scene. A library of TIS from reference ignitable liquids with assigned ASTM classification is used as the target factors in TFA. The class-conditional distributions of correlations between the target and predicted factors for each ASTM class are represented by kernel functions and analyzed by Bayesian decision theory. The soft classification approach assists in assessing the probability that ignitable liquid residue from a specific ASTM E1618 class is present in a set of samples from a single fire scene, even in the presence of unspecified background contributions from pyrolysis products. The method is demonstrated with sample data sets and then tested on laboratory-scale burn data and large-scale field test burns. The overall performance achieved in laboratory and field tests of the method is approximately 80% correct classification of fire debris samples.
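
    The pipeline in this abstract (TFA correlation, then kernel-estimated class-conditional densities, then Bayes' rule) can be sketched compactly. Everything below is synthetic: the correlation values, class names, and bandwidth are invented for illustration, not taken from the paper.

```python
import math

def gaussian_kde(samples, bw):
    """1-D Gaussian kernel density estimate with fixed bandwidth bw."""
    norm = len(samples) * bw * math.sqrt(2.0 * math.pi)
    return lambda x: sum(math.exp(-0.5 * ((x - s) / bw) ** 2)
                         for s in samples) / norm

# Invented target-factor correlations from training burns for two classes:
classes = {
    "gasoline":        gaussian_kde([0.92, 0.88, 0.95, 0.90, 0.85], 0.05),
    "background only": gaussian_kde([0.40, 0.55, 0.35, 0.50, 0.45], 0.05),
}

def posterior(r, priors=None):
    """Soft classification: P(class | observed TFA correlation r)."""
    priors = priors or {c: 1.0 / len(classes) for c in classes}
    like = {c: pdf(r) * priors[c] for c, pdf in classes.items()}
    total = sum(like.values())
    return {c: v / total for c, v in like.items()}

print(posterior(0.90)["gasoline"] > 0.99)  # high correlation: almost surely gasoline
print(posterior(0.45)["gasoline"] < 0.01)  # low correlation: background products only
```

    The "soft" part is that the output is a posterior probability per ASTM class rather than a hard label, which is what lets the method hedge against pyrolysis-product interference.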

  14. Determining optimal parameters of the self-referent encoding task: A large-scale examination of self-referent cognition and depression.

    PubMed

    Dainer-Best, Justin; Lee, Hae Yeon; Shumake, Jason D; Yeager, David S; Beevers, Christopher G

    2018-06-07

    Although the self-referent encoding task (SRET) is commonly used to measure self-referent cognition in depression, many different SRET metrics can be obtained. The current study used best subsets regression with cross-validation and independent test samples to identify the SRET metrics most reliably associated with depression symptoms in three large samples: a college student sample (n = 572), a sample of adults from Amazon Mechanical Turk (n = 293), and an adolescent sample from a school field study (n = 408). Across all 3 samples, SRET metrics associated most strongly with depression severity included number of words endorsed as self-descriptive and rate of accumulation of information required to decide whether adjectives were self-descriptive (i.e., drift rate). These metrics had strong intratask and split-half reliability and high test-retest reliability across a 1-week period. Recall of SRET stimuli and traditional reaction time (RT) metrics were not robustly associated with depression severity.
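
    The model-selection procedure named above (best subsets regression scored by cross-validated prediction error) can be sketched with synthetic data. Everything here is invented: three stand-in "metrics" of which two carry signal, a linear outcome, and plain least squares; the point is only the subset-search-plus-CV structure.

```python
import itertools
import random

def ols_fit(X, y):
    """Least squares via normal equations (tiny Gaussian elimination)."""
    k = len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(k)]
         for i in range(k)]
    b = [sum(X[r][i] * y[r] for r in range(len(X))) for i in range(k)]
    for i in range(k):                      # forward elimination
        for j in range(i + 1, k):
            fctr = A[j][i] / A[i][i]
            for c in range(i, k):
                A[j][c] -= fctr * A[i][c]
            b[j] -= fctr * b[i]
    beta = [0.0] * k                        # back substitution
    for i in reversed(range(k)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta

def cv_error(cols, X, y, folds=5):
    """Mean squared prediction error over simple contiguous CV folds."""
    n = len(y)
    err = 0.0
    for f in range(folds):
        held = range(f * n // folds, (f + 1) * n // folds)
        tr = [r for r in range(n) if r not in held]
        beta = ols_fit([[1.0] + [X[r][c] for c in cols] for r in tr],
                       [y[r] for r in tr])
        for r in held:
            pred = beta[0] + sum(b * X[r][c] for b, c in zip(beta[1:], cols))
            err += (pred - y[r]) ** 2
    return err / n

# Synthetic "metrics": columns 0 and 2 drive the outcome, column 1 is noise.
rng = random.Random(0)
X = [[rng.gauss(0, 1) for _ in range(3)] for _ in range(120)]
y = [2.0 * x[0] - 1.5 * x[2] + rng.gauss(0, 0.5) for x in X]
best = min(itertools.chain.from_iterable(
               itertools.combinations(range(3), k) for k in (1, 2, 3)),
           key=lambda cols: cv_error(cols, X, y))
print(best)
```

    Cross-validated scoring is what keeps the exhaustive subset search honest: a subset only wins if it predicts held-out data, not merely fits the training sample.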

  15. Microbial Groundwater Sampling Protocol for Fecal-Rich Environments

    PubMed Central

    Harter, Thomas; Watanabe, Naoko; Li, Xunde; Atwill, Edward R; Samuels, William

    2014-01-01

    Inherently, confined animal farming operations (CAFOs) and other intense fecal-rich environments are potential sources of groundwater contamination by enteric pathogens. The ubiquity of microbial matter poses unique technical challenges in addition to economic constraints when sampling wells in such environments. In this paper, we evaluate a groundwater sampling protocol that relies on extended purging with a portable submersible stainless steel pump and Teflon® tubing as an alternative to equipment sterilization. The protocol allows for collecting a large number of samples quickly, relatively inexpensively, and under field conditions with limited access to capacity for sterilizing equipment. The protocol is tested on CAFO monitoring wells and considers three cross-contamination sources: equipment, wellbore, and ambient air. For the assessment, we use Enterococcus, a ubiquitous fecal indicator bacterium (FIB), in laboratory and field tests with spiked and blank samples, and in an extensive, multi-year field sampling campaign on 17 wells within 2 CAFOs. The assessment shows that extended purging can successfully control for equipment cross-contamination, but also controls for significant contamination of the well-head, within the well casing and within the immediate aquifer vicinity of the well-screen. Importantly, our tests further indicate that Enterococcus is frequently entrained in water samples when exposed to ambient air at a CAFO during sample collection. Wellbore and air contamination pose separate challenges in the design of groundwater monitoring strategies on CAFOs that are not addressed by equipment sterilization, but require adequate QA/QC procedures and can be addressed by the proposed sampling strategy. PMID:24903186

  16. Serological tests for detecting Rift Valley fever viral antibodies in sheep from the Nile Delta.

    PubMed Central

    Scott, R M; Feinsod, F M; Allam, I H; Ksiazek, T G; Peters, C J; Botros, B A; Darwish, M A

    1986-01-01

    To determine the accuracy of serological methods in detecting Rift Valley fever (RVF) viral antibodies, we examined serum samples obtained from 418 sheep in the Nile Delta by using five tests. The plaque reduction neutralization test (PRNT) was considered the standard serological method against which the four other tests were compared. Twenty-four serum samples had RVF viral antibodies detected by PRNT. Hemagglutination inhibition and enzyme-linked immunosorbent assay antibodies to RVF virus were also present in the same 24 serum samples. Indirect immunofluorescence was less sensitive in comparison with PRNT, and complement fixation was the least sensitive. These results extend observations made with laboratory animals to a large field-collected group of Egyptian sheep. PMID:3533977

  17. Depiction of pneumothoraces in a large animal model using x-ray dark-field radiography.

    PubMed

    Hellbach, Katharina; Baehr, Andrea; De Marco, Fabio; Willer, Konstantin; Gromann, Lukas B; Herzen, Julia; Dmochewitz, Michaela; Auweter, Sigrid; Fingerle, Alexander A; Noël, Peter B; Rummeny, Ernst J; Yaroshenko, Andre; Maack, Hanns-Ingo; Pralow, Thomas; van der Heijden, Hendrik; Wieberneit, Nataly; Proksa, Roland; Koehler, Thomas; Rindt, Karsten; Schroeter, Tobias J; Mohr, Juergen; Bamberg, Fabian; Ertl-Wagner, Birgit; Pfeiffer, Franz; Reiser, Maximilian F

    2018-02-08

    The aim of this study was to assess the diagnostic value of x-ray dark-field radiography to detect pneumothoraces in a pig model. Eight pigs were imaged with an experimental grating-based large-animal dark-field scanner before and after induction of a unilateral pneumothorax. Image contrast-to-noise ratios between lung tissue and the air-filled pleural cavity were quantified for transmission and dark-field radiograms. The projected area in the object plane of the inflated lung was measured in dark-field images to quantify the collapse of lung parenchyma due to a pneumothorax. Means and standard deviations for lung sizes and signal intensities from dark-field and transmission images were tested for statistical significance using Student's two-tailed t-test for paired samples. The contrast-to-noise ratio between the air-filled pleural space of lateral pneumothoraces and lung tissue was significantly higher in the dark-field (3.65 ± 0.9) than in the transmission images (1.13 ± 1.1; p = 0.002). In case of dorsally located pneumothoraces, a significant decrease (-20.5%; p < 0.0001) in the projected area of inflated lung parenchyma was found after a pneumothorax was induced. Therefore, the detection of pneumothoraces in x-ray dark-field radiography was facilitated compared to transmission imaging in a large animal model.

  18. Development of improved space sampling strategies for ocean chemical properties: Total carbon dioxide and dissolved nitrate

    NASA Technical Reports Server (NTRS)

    Goyet, Catherine; Davis, Daniel; Peltzer, Edward T.; Brewer, Peter G.

    1995-01-01

    Large-scale ocean observing programs such as the Joint Global Ocean Flux Study (JGOFS) and the World Ocean Circulation Experiment (WOCE) today must face the problem of designing an adequate sampling strategy. For ocean chemical variables, the goals and observing technologies are quite different from those for ocean physical variables (temperature, salinity, pressure). We have recently acquired data on the ocean CO2 properties on WOCE cruises P16c and P17c that are sufficiently dense to test for sampling redundancy. We use linear and quadratic interpolation methods on the sampled field to investigate the minimum number of samples required to define the deep ocean total inorganic carbon (TCO2) field within the limits of experimental accuracy (+/- 4 micromol/kg). Within the limits of current measurements, these lines were oversampled in the deep ocean. Should the precision of the measurement be improved, then a denser sampling pattern may be desirable in the future. This approach rationalizes the efficient use of resources for field work and for estimating gridded TCO2 fields needed to constrain geochemical models.
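
    The redundancy test described here has a simple core: drop stations, interpolate them back, and compare the worst interpolation error with the +/- 4 micromol/kg measurement accuracy. A minimal sketch on an invented smooth deep-ocean TCO2 profile:

```python
def linear_interp(xs, ys, x):
    """Piecewise-linear interpolation (xs strictly ascending)."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return ys[i] + t * (ys[i + 1] - ys[i])
    raise ValueError("x outside sampled range")

# Invented smooth deep-ocean TCO2 profile (depth in m -> umol/kg):
depths = list(range(1000, 5001, 100))
tco2 = [2350.0 + 0.004 * (d - 1000) - 2.0e-7 * (d - 1000) ** 2 for d in depths]

# Keep every 5th sample and reconstruct the rest by interpolation:
coarse_d, coarse_c = depths[::5], tco2[::5]
max_err = max(abs(linear_interp(coarse_d, coarse_c, d) - c)
              for d, c in zip(depths, tco2))
print(max_err < 4.0)  # worst error within measurement accuracy: oversampled
```

    If the worst-case interpolation error stays inside the experimental accuracy, the dropped stations carried no information at current precision, which is the sense in which the P16c and P17c lines were oversampled.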

  19. Paleomagnetism of a primitive achondrite parent body: The acapulcoite-lodranites

    NASA Astrophysics Data System (ADS)

    Schnepf, N. R.; Weiss, B. P.; Andrade Lima, E.; Fu, R. R.; Uehara, M.; Gattacceca, J.; Wang, H.; Suavet, C. R.

    2014-12-01

    Primitive achondrites are a recently recognized meteorite grouping with textures and compositions intermediate between unmelted meteorites (chondrites) and igneous meteorites (achondrites). Their existence demonstrates prima facie that some planetesimals only experienced partial rather than complete melting. We present the first paleomagnetic measurements of acapulcoite-lodranite meteorites to determine the existence and intensity of ancient magnetic fields on their parent body. Our paleomagnetic study tests the hypothesis that their parent body had an advecting metallic core, with the goal of providing one of the first geophysical constraints on its large-scale structure and the extent of interior differentiation. In particular, by analyzing samples whose petrologic textures require an origin on a partially differentiated body, we will be able to critically test a recent proposal that some achondrites and chondrite groups could have originated on a single body (Weiss and Elkins-Tanton 2013). We analyzed samples of the meteorites Acapulco and Lodran. Like other acapulcoites and lodranites, these meteorites are granular rocks containing large (~0.1-0.3 mm) kamacite and taenite grains along with similarly sized silicate crystals. Many silicate grains contain numerous fine (1-10 μm) FeNi metal inclusions. Our compositional measurements and rock magnetic data suggest that tetrataenite is rare or absent. Bulk paleomagnetic measurements were done on four mutually oriented bulk samples of Acapulco and one bulk sample of Lodran. Alternating field (AF) demagnetization revealed that the magnetization of the bulk samples is highly unstable, likely due to the large (~0.1-0.3 mm) interstitial kamacite grains throughout the samples. To overcome this challenge, we are analyzing individual ~0.2 mm mutually oriented silicate grains extracted using a wire saw micromill. 
Preliminary SQUID microscopy measurements of a Lodran silicate grain suggest magnetization stable to AF levels of at least 25-40 mT.

  20. Large-scale monitoring of effects of clothianidin-dressed oilseed rape seeds on pollinating insects in northern Germany: residues of clothianidin in pollen, nectar and honey.

    PubMed

    Rolke, Daniel; Persigehl, Markus; Peters, Britta; Sterk, Guido; Blenau, Wolfgang

    2016-11-01

This study was part of a large-scale monitoring project to assess the possible effects of Elado®-dressed (10 g clothianidin & 2 g β-cyfluthrin/kg seed) oilseed rape seeds on different pollinators in Northern Germany. Firstly, residues of clothianidin and its active metabolites thiazolylnitroguanidine and thiazolylmethylurea were measured in nectar and pollen from Elado®-dressed (test site, T) and undressed (reference site, R) oilseed rape collected by honey bees confined within tunnel tents. Clothianidin and its metabolites could not be detected or quantified in samples from R fields. Clothianidin concentrations in samples from T fields were 1.3 ± 0.9 μg/kg in nectar and 1.7 ± 0.9 μg/kg in pollen. Secondly, pollen and nectar for residue analyses were sampled from free-flying honey bees, bumble bees and mason bees placed at six study locations each in the R and T sites at the start of oilseed rape flowering. Honey samples were analysed from all honey bee colonies at the end of oilseed rape flowering. Neither clothianidin nor its metabolites were detectable or quantifiable in R site samples. Clothianidin concentrations in samples from the T site were below the limit of quantification (LOQ, 1.0 μg/kg) in most pollen and nectar samples collected by bees and 1.4 ± 0.5 μg/kg in honey taken from honey bee colonies. In summary, the study provides reliable semi-field and field data on clothianidin residues in nectar and pollen collected by different bee species in oilseed rape fields under common agricultural conditions.

  1. Discrimination of winter wheat on irrigated land in southern Finney County, Kansas

    NASA Technical Reports Server (NTRS)

    Morain, S. A. (Principal Investigator); Williams, D. L.; Barker, B.; Coiner, J. C.

    1973-01-01

The author has identified the following significant results. Winter wheat in the large-field irrigated landscape of southern Finney County, Kansas, was successfully discriminated by use of four ERTS-1 images acquired 16 August 1972, 21 September 1972, and 2 December 1972. The MSS-5 image from each date and the MSS-7 image from 2 December 1972 were used. Human interpretation of the four images resulted in a classification scheme that produced a 98% correct estimate of the number of wheat fields in the training sample and a 100% correct estimate in the test sample. Overall correct separation of wheat from non-wheat fields was 93% and 86%, respectively. Offsetting omission and commission errors account for the higher accuracy of the wheat-field count.
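A minimal sketch (with hypothetical numbers, not figures from the report) of how omission and commission errors can offset so that a field count stays accurate even though individual fields are misclassified:

```python
def wheat_field_count(true_wheat, omission, commission):
    """Estimated wheat-field count: fields missed (omission errors) are
    compensated by non-wheat fields wrongly labeled wheat (commission errors)."""
    return true_wheat - omission + commission

# 50 true wheat fields; 3 missed and 3 falsely added: per-field accuracy
# suffers, yet the count estimate is exact because the errors offset.
estimate = wheat_field_count(50, 3, 3)
```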

  2. Cross-cultural equivalence of the patient- and parent-reported quality of life in short stature youth (QoLISSY) questionnaire.

    PubMed

    Bullinger, Monika; Quitmann, Julia; Silva, Neuza; Rohenkohl, Anja; Chaplin, John E; DeBusk, Kendra; Mimoun, Emmanuelle; Feigerlova, Eva; Herdman, Michael; Sanz, Dolores; Wollmann, Hartmut; Pleil, Andreas; Power, Michael

    2014-01-01

Testing cross-cultural equivalence of patient-reported outcomes requires sufficiently large samples per country, which is difficult to achieve in rare endocrine paediatric conditions. We describe a novel approach to cross-cultural testing of the Quality of Life in Short Stature Youth (QoLISSY) questionnaire in five countries by sequentially taking one country out (TOCO) of the total sample and iteratively comparing the resulting psychometric performance. Development of the QoLISSY proceeded from focus group discussions through pilot testing to field testing in 268 short-statured patients and their parents. To explore cross-cultural equivalence, the iterative TOCO technique was used to examine and compare the validity, reliability, and convergence of patient and parent responses on the QoLISSY in the field test dataset, and to predict QoLISSY scores from clinical, socio-demographic, and psychosocial variables. Validity and reliability indicators were satisfactory for each sample after iteratively omitting one country. Comparisons with the total sample revealed cross-cultural equivalence in internal consistency and construct validity for patients and parents, high inter-rater agreement, and a substantial proportion of QoLISSY variance explained by predictors. The TOCO technique is a powerful method for overcoming problems of country-specific testing of patient-reported outcome instruments. It provides empirical support for QoLISSY's cross-cultural equivalence and is recommended for future research.
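The iterative TOCO idea can be sketched as a leave-one-country-out loop. The Cronbach's-alpha reliability check below is an illustrative stand-in for the full psychometric battery, and the data layout (one row of item scores per respondent, grouped by country) is an assumption, not the study's actual format:

```python
from statistics import pvariance

def cronbach_alpha(rows):
    """Internal-consistency reliability; rows = one list of item scores per respondent."""
    items = list(zip(*rows))            # transpose to per-item score columns
    k = len(items)
    totals = [sum(r) for r in rows]     # total score per respondent
    return k / (k - 1) * (1 - sum(pvariance(col) for col in items) / pvariance(totals))

def toco_alphas(scores_by_country):
    """Take-one-country-out: recompute reliability with each country omitted in turn."""
    return {
        omit: cronbach_alpha([row for country, rows in scores_by_country.items()
                              if country != omit for row in rows])
        for omit in scores_by_country
    }
```

Cross-cultural equivalence is then judged by how little each left-out alpha deviates from the total-sample value.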

  3. Laboratory measurements of reservoir rock from the Geysers geothermal field, California

    USGS Publications Warehouse

    Lockner, D.A.; Summers, R.; Moore, D.; Byerlee, J.D.

    1982-01-01

Rock samples taken from two outcrops, as well as rare cores from three well bores at the Geysers geothermal field, California, were tested at temperatures and pressures similar to those found in the geothermal field. Both intact and 30° sawcut cylinders were deformed at confining pressures of 200-1000 bars, a pore pressure of 30 bars, and temperatures of 150° and 240°C. Thin-section and X-ray analysis revealed that some borehole samples had undergone extensive alteration and recrystallization. Constant strain rate tests at 10^-4 and 10^-6 per sec gave a coefficient of friction of 0.68. Due to the highly fractured nature of the rocks taken from the production zone, intact samples were rarely more than 50% stronger than the frictional strength. This result suggests that the Geysers reservoir can support shear stresses only as large as its frictional shear strength. P-wave velocity (6.2 km/sec) was measured on one sample. Acoustic emission and sliding on a sawcut were related to changes in pore pressure. b-values computed from the acoustic emissions generated during fluid injection were typically about 0.55. An unusually high b-value (approximately 1.3) observed during sudden injection of water into the sample may have been related to thermal cracking.
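The b-values mentioned above come from the Gutenberg-Richter frequency-magnitude relation. The abstract does not state which estimator was used, so the maximum-likelihood (Aki) form below, with made-up event magnitudes, is only an illustrative sketch:

```python
import math

def b_value(magnitudes, m_min):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki estimator):
    b = log10(e) / (mean magnitude - completeness magnitude m_min)."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)
```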

  4. Automated cognitive testing of monkeys in social groups yields results comparable to individual laboratory-based testing.

    PubMed

    Gazes, Regina Paxton; Brown, Emily Kathryn; Basile, Benjamin M; Hampton, Robert R

    2013-05-01

    Cognitive abilities likely evolved in response to specific environmental and social challenges and are therefore expected to be specialized for the life history of each species. Specialized cognitive abilities may be most readily engaged under conditions that approximate the natural environment of the species being studied. While naturalistic environments might therefore have advantages over laboratory settings for cognitive research, it is difficult to conduct certain types of cognitive tests in these settings. We implemented methods for automated cognitive testing of monkeys (Macaca mulatta) in large social groups (Field station) and compared the performance to that of laboratory-housed monkeys (Laboratory). The Field station animals shared access to four touch-screen computers in a large naturalistic social group. Each Field station subject had an RFID chip implanted in each arm for computerized identification and individualized assignment of cognitive tests. The Laboratory group was housed and tested in a typical laboratory setting, with individual access to testing computers in their home cages. Monkeys in both groups voluntarily participated at their own pace for food rewards. We evaluated performance in two visual psychophysics tests, a perceptual classification test, a transitive inference test, and a delayed matching-to-sample memory test. Despite the differences in housing, social environment, age, and sex, monkeys in the two groups performed similarly in all tests. Semi-free ranging monkeys living in complex social environments are therefore viable subjects for cognitive testing designed to take advantage of the unique affordances of naturalistic testing environments.

  6. Weighted Kolmogorov-Smirnov test: accounting for the tails.

    PubMed

    Chicheportiche, Rémy; Bouchaud, Jean-Philippe

    2012-10-01

Accurate goodness-of-fit tests for the extreme tails of empirical distributions are an important issue, relevant in many contexts, including geophysics, insurance, and finance. We have derived exact asymptotic results for a generalization of the large-sample Kolmogorov-Smirnov test, well suited to testing these extreme tails. In passing, we have rederived and made more precise the approximate limit solutions found originally in unrelated fields, first in [L. Turban, J. Phys. A 25, 127 (1992)] and later in [P. L. Krapivsky and S. Redner, Am. J. Phys. 64, 546 (1996)].
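A sketch of the tail-weighted statistic at issue: the empirical-CDF deviation is divided by its local standard deviation sqrt(F(1-F)), which grows in the tails and so puts them on equal footing with the bulk. This is a generic illustration of the weighting idea, not the authors' exact derivation:

```python
import math

def weighted_ks(sample, cdf):
    """Sup of |F_n(x) - F(x)| / sqrt(F(x)(1 - F(x))) over the sample points,
    where F_n is the empirical CDF and F the hypothesized CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        if 0.0 < f < 1.0:
            w = 1.0 / math.sqrt(f * (1.0 - f))
            # the empirical CDF jumps from i/n to (i+1)/n at x
            d = max(d, w * abs(i / n - f), w * abs((i + 1) / n - f))
    return d
```

With the uniform null `cdf=lambda x: x`, the weight is smallest at the median and diverges toward 0 and 1, so tail misfits dominate the supremum.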

  7. Detailed design of the large-bore 8 T superconducting magnet for the NAFASSY test facility

    NASA Astrophysics Data System (ADS)

    Corato, V.; Affinito, L.; Anemona, A.; Besi Vetrella, U.; Di Zenobio, A.; Fiamozzi Zignani, C.; Freda, R.; Messina, G.; Muzzi, L.; Perrella, M.; Reccia, L.; Tomassetti, G.; Turtù, S.; della Corte, A.

    2015-03-01

The ‘NAFASSY’ (NAtional FAcility for Superconducting SYstems) facility is designed to test wound conductor samples under high-field conditions at variable temperatures. Due to its unique features, it is reasonable to assume that in the near future NAFASSY will play a preeminent role at the international level in the qualification of long coiled cables under operating conditions. The magnetic system consists of a large warm-bore background solenoid, made up of three series-connected grading sections obtained by winding three different Nb3Sn Cable-in-Conduit Conductors. Thanks to the financial support of the Italian Ministry for University and Research, the low-field coil is currently under production. The design has been modified to allow the system to operate also as a stand-alone facility, with an inner bore diameter of 1144 mm. This magnet can provide about 7 T on its axis and about 8 T close to the insert inner radius, making it possible to perform tests relevant to large-sized NbTi or medium-field Nb3Sn conductors. The detailed design of the 8 T magnet, including the electromagnetic, structural and thermo-hydraulic analyses, is reported here, along with the production status.

  8. Field Testing New Plot Designs and Methods for Determining Hydrophytic Vegetation during Wetland Delineations in the United States

    DTIC Science & Technology

    2014-03-01

Trees and woody vines are sampled in large plots with 9 m (30 ft) radii. Saplings, shrubs, and herbs are sampled in nested smaller plots with 2 m (5 ft) ... woody vines in 9 m (30 ft) radius plots and saplings, shrubs, and herbaceous species in 2 m (5 ft) radius plots. In herbaceous meadows, only the 2 m (5 ... suggests stratifying vegetation by growth forms of trees, shrubs, herbs, and vines and sampling plant communities by using nested circular plots

  9. Development of a large-scale, outdoor, ground-based test capability for evaluating the effect of rain on airfoil lift

    NASA Technical Reports Server (NTRS)

    Bezos, Gaudy M.; Campbell, Bryan A.

    1993-01-01

    A large-scale, outdoor, ground-based test capability for acquiring aerodynamic data in a simulated rain environment was developed at the Langley Aircraft Landing Dynamics Facility (ALDF) to assess the effect of heavy rain on airfoil performance. The ALDF test carriage was modified to transport a 10-ft-chord NACA 64210 wing section along a 3000-ft track at full-scale aircraft approach speeds. An overhead rain simulation system was constructed along a 525-ft section of the track with the capability of producing simulated rain fields of 2, 10, 30, and 40 in/hr. The facility modifications, the aerodynamic testing and rain simulation capability, the design and calibration of the rain simulation system, and the operational procedures developed to minimize the effect of wind on the simulated rain field and aerodynamic data are described in detail. The data acquisition and reduction processes are also presented along with sample force data illustrating the environmental effects on data accuracy and repeatability for the 'rain-off' test condition.

  10. Conducting field studies for testing pesticide leaching models

    USGS Publications Warehouse

    Smith, Charles N.; Parrish, Rudolph S.; Brown, David S.

    1990-01-01

A variety of predictive models are being applied to evaluate the transport and transformation of pesticides in the environment. These include well-known models such as the Pesticide Root Zone Model (PRZM), the Risk of Unsaturated-Saturated Transport and Transformation Interactions for Chemical Concentrations Model (RUSTIC), and the Groundwater Loading Effects of Agricultural Management Systems Model (GLEAMS). The potentially large impacts of using these models as tools for developing pesticide management strategies and regulatory decisions necessitate development of sound model validation protocols. This paper offers guidance on many of the theoretical and practical problems encountered in the design and implementation of field-scale model validation studies. Recommendations are provided for site selection and characterization, test compound selection, data needs, measurement techniques, statistical design considerations, and sampling techniques. A strategy is provided for quantitatively testing models using field measurements.

  11. Moscow Test Well, INEL Oversight Program: Aqueous geochemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCurry, M.; Fromm, J.; Welhan, J.

    1992-09-29

This report presents a summary and interpretation of data gathered during sampling of the Moscow Test Well at Moscow, Idaho, during April and May of 1992. The principal objectives of this chemical survey were to validate sampling procedures with a new straddle packer sampling tool in a previously hydrologically well-characterized and simple sampling environment, and to compare analytical results from two independent labs for reproducibility. Analytes included a wide range of metals, anions, nutrients, BNAs, and VOCs. Secondary objectives included analyzing waters from a large distilled-water tank (utilized for all field laboratory purposes as "pure" stock water), water that passed through a steamer used to clean the packer, and rinsates from the packer tool itself before it was lowered into the test well. Analyses were also obtained of blanks and spikes for data validation purposes.

  12. Ultra-Gradient Test Cavity for Testing SRF Wafer Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N.J. Pogue, P.M. McIntyre, A.I. Sattarov, C. Reece

    2010-11-01

A 1.3 GHz test cavity has been designed to test wafer samples of superconducting materials. This mushroom-shaped cavity, operating in the TE01 mode, creates a unique distribution of surface fields. The surface magnetic field on the sample wafer is 3.75 times greater than elsewhere on the niobium cavity surface. This field design is made possible by dielectrically loading the cavity: a hemisphere of ultra-pure sapphire is located just above the sample wafer. The sapphire pulls the fields away from the walls, so the maximum field the Nb surface sees is 25% of the surface field on the sample. In this manner, it should be possible to drive the sample wafer well beyond the BCS limit for niobium while still maintaining a respectable Q. The sapphire's purity must be tested for its loss tangent and dielectric constant to finalize the design of the mushroom test cavity. A sapphire-loaded CEBAF cavity has been constructed and tested. The results on the dielectric constant and loss tangent will be presented.

  13. A New Facility for Testing Superconducting Solenoid Magnets with Large Fringe Fields at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orris, D.; Carcagno, R.; Nogiec, J.

    2013-09-01

Testing a superconducting solenoid with no iron flux return can be problematic for a magnet test facility due to the large magnetic fringe fields generated. These large external fields can interfere with the operation of equipment, and precautions must be taken for personnel supporting the test. The magnetic forces between the solenoid under test and the external infrastructure must also be taken into consideration. A new test facility has been designed and built at Fermilab specifically for testing superconducting magnets with large external fringe fields. This paper discusses the test stand design, capabilities, and details of the instrumentation and controls, with data from the first solenoid tested in this facility: the Muon Ionization Cooling Experiment (MICE) coupling coil.

  14. What drives the evolution of Luminous Compact Blue Galaxies in Clusters vs. the Field?

    NASA Astrophysics Data System (ADS)

    Wirth, Gregory

    2017-08-01

Present-day galaxy clusters consist chiefly of low-mass dwarf elliptical galaxies, but the progenitors of this dominant population remain unclear. A prime candidate is the class of objects known as Luminous Compact Blue Galaxies (LCBGs), common in intermediate-redshift clusters but virtually extinct today. Recent cosmological simulations suggest that the present-day dwarf galaxies begin as irregular field galaxies, undergo an environmentally driven starburst phase as they enter the cluster, and stop forming stars earlier than their counterparts in the field. This model predicts that cluster dwarfs should have lower stellar mass per unit dynamical mass than their counterparts in the field. We propose a two-pronged archival research program to test this key prediction using the combination of precision photometry from space and high-quality spectroscopy. First, we will combine optical HST/ACS imaging of five z=0.55 clusters (including two HST Frontier Fields) with Spitzer IR imaging and publicly released Keck/DEIMOS spectroscopy to measure stellar-to-dynamical-mass ratios for a large sample of cluster LCBGs. Second, we will exploit a new catalog of LCBGs in the COSMOS field to gather corresponding data for a significant sample of field LCBGs. By comparing mass ratios from these datasets, we will test theoretical predictions and determine the primary physical driver of cluster dwarf-galaxy evolution.

  15. Ecological survey of M-Field, Edgewood Area Aberdeen Proving Ground, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Downs, J.L.; Eberhardt, L.E.; Fitzner, R.E.

    1991-12-01

An ecological survey was conducted on M-Field, at the Edgewood Area, Aberdeen Proving Ground, Maryland. M-Field is used routinely to test army smokes and obscurants, including brass flakes, carbon fibers, and fog oils. The field has been used for testing purposes for the past 40 years, but little documented history is available. Under current environmental regulations, the test field must be assessed periodically to document the presence or potential use of the area by threatened and endangered species. The M-Field area is approximately 370 acres and is part of the US Army's Edgewood Area at Aberdeen Proving Ground in Harford County, Maryland. The grass-covered field is primarily lowlands with elevations from about 1.0 to 8 m above sea level, and several buildings and structures are present on the field. The ecological assessment of M-Field was conducted in three stages, beginning with a preliminary site visit in May to assess sampling requirements. Two field site visits were made June 3-7 and August 12-15, 1991, to identify the biota existing on the site. Data were gathered on vegetation, small mammals, invertebrates, birds, large mammals, amphibians, and reptiles.

  17. Lichen elements as pollution indicators: evaluation of methods for large monitoring programmes

    Treesearch

    Susan Will-Wolf; Sarah Jovan; Michael C. Amacher

    2017-01-01

    Lichen element content is a reliable indicator for relative air pollution load in research and monitoring programmes requiring both efficiency and representation of many sites. We tested the value of costly rigorous field and handling protocols for sample element analysis using five lichen species. No relaxation of rigour was supported; four relaxed protocols generated...

  18. Detection of cocaine in cargo containers by high-volume vapor sampling: field test at Port of Miami

    NASA Astrophysics Data System (ADS)

    Neudorfl, Pavel; Hupe, Michael; Pilon, Pierre; Lawrence, Andre H.; Drolet, Gerry; Su, Chih-Wu; Rigdon, Stephen W.; Kunz, Terry D.; Ulwick, Syd; Hoglund, David E.; Wingo, Jeff J.; Demirgian, Jack C.; Shier, Patrick

    1997-02-01

The use of marine containers is a well-known smuggling method for large shipments of drugs. Such containers present an ideal means of smuggling because the examination method is time-consuming, difficult, and expensive for the importing community. At present, various methods are being studied for screening containers that would allow officers to rapidly distinguish between innocent and suspicious cargo. Air sampling is one such method: air is withdrawn from the inside of containers and analyzed for telltale vapors uniquely associated with the drug. An attractive feature of the technique is that containers can be sampled without destuffing and opening, since air can be conveniently withdrawn via ventilation ducts. In the present paper, the development of air sampling methodology for the detection of cocaine hydrochloride will be discussed, and the results from a recent field test will be presented. The results indicated that vapors of cocaine and its decomposition product, ecgonidine methyl ester, could serve as sensitive indicators of the presence of the drug in the containers.

  19. Utilization of early soybeans for food and reproduction by the tarnished plant bug (Hemiptera: Miridae) in the delta of Mississippi.

    PubMed

    Snodgrass, G L; Jackson, R E; Abel, C A; Perera, O P

    2010-08-01

    Commercially produced maturity group (MG) IV soybeans, Glycine max L., were sampled during bloom for tarnished plant bugs, Lygus lineolaris (Palisot de Beauvois), during May and June 1999 (3 fields) and 2001 (18 fields). The adults and nymphs were found primarily in single population peaks in both years, indicating a single new generation was produced during each year. The peak mean numbers of nymphs were 0.61 and 0.84 per drop cloth sample in 1999 and 2001, respectively. Adults peaked at 3.96 (1999) and 3.76 (2001) per sweep net sample (25 sweeps). Tests using laboratory-reared and field-collected tarnished plant bugs resulted in very poor survival of nymphs on 16 different soybean varieties (MG III, one; IV, four; V, nine; VI, two). A large cage (0.06 ha) field test found that the number of nymphs produced on eight soybean varieties after mated adults were released into the cages was lower than could be expected on a suitable host. These results indicated that soybean was a marginal host for tarnished plant bugs. However, the numbers of adults and nymphs found in the commercially produced fields sampled in the study may have been high enough to cause feeding damage to the flowering soybeans. The nature of the damage and its possible economic importance were not determined. Reproduction of tarnished plant bugs in the commercially produced early soybean fields showed that the early soybeans provided tarnished plant bugs with a very abundant host at a time when only wild hosts were previously available.

  20. Assessment and mitigation of errors associated with a large-scale field investigation of methane emissions from the Marcellus Shale

    NASA Astrophysics Data System (ADS)

    Caulton, D.; Golston, L.; Li, Q.; Bou-Zeid, E.; Pan, D.; Lane, H.; Lu, J.; Fitts, J. P.; Zondlo, M. A.

    2015-12-01

Recent work suggests the distribution of methane emissions from fracking operations is skewed, with a small percentage of emitters contributing a large proportion of the total emissions. In order to provide a statistically robust distribution of emitters and determine the presence of super-emitters, errors in current techniques need to be constrained and mitigated. The Marcellus shale, the most productive natural gas shale field in the United States, has received less intense focus for well-level emissions and is investigated here to provide the distribution of methane emissions. In July of 2015, approximately 250 unique well pads were sampled using the Princeton Atmospheric Chemistry Mobile Acquisition Node (PAC-MAN). This mobile lab includes a Garmin GPS unit, a Vaisala weather station (WXT520), a LICOR 7700 CH4 open-path sensor, and a LICOR 7500 CO2/H2O open-path sensor. Sampling sites were preselected based on wind direction, sampling distance, and elevation grade. All sites were sampled during low boundary layer conditions (600-1000 and 1800-2200 local time). The majority of sites were sampled 1-3 times, while selected test sites were sampled multiple times or resampled several times during the day. For selected sites, a sampling tower was constructed consisting of a Metek uSonic-3 Class A sonic anemometer and an additional LICOR 7700 and 7500. Data were recorded for at least one hour at these sites. A robust study and inter-comparison of different methodologies will be presented. The Gaussian plume model will be used to calculate fluxes for all sites and to compare results from test sites with multiple passes. Tower data are used to constrain the Gaussian plume model. Additionally, large eddy simulation (LES) modeling will be used to calculate emissions from the tower sites. Alternative techniques will also be discussed. Results from these techniques will be compared to identify best practices and provide robust error estimates.

  1. Ultra-broadband ptychography with self-consistent coherence estimation from a high harmonic source

    NASA Astrophysics Data System (ADS)

    Odstrčil, M.; Baksh, P.; Kim, H.; Boden, S. A.; Brocklesby, W. S.; Frey, J. G.

    2015-09-01

With the aim of improving imaging using table-top extreme ultraviolet sources, we demonstrate coherent diffraction imaging (CDI) with a relative bandwidth of 20%. The coherence properties of the illumination probe are identified using the same imaging setup. The presented method allows for the use of fewer monochromating optics, obtaining higher flux at the sample and thus reaching higher resolution or shorter exposure times. This is important in the case of ptychography, where a large number of diffraction patterns need to be collected. Our microscopy setup was tested on the reconstruction of an extended sample to show the quality of the reconstruction. We show that a high-harmonic-generation-based EUV tabletop microscope can provide reconstruction of samples with a large field of view and high resolution without additional prior knowledge about the sample or illumination.

  2. Investigation of natural gas plume dispersion using mobile observations and large eddy simulations

    NASA Astrophysics Data System (ADS)

    Caulton, Dana R.; Li, Qi; Golston, Levi; Pan, Da; Bou-Zeid, Elie; Fitts, Jeff; Lane, Haley; Lu, Jessica; Zondlo, Mark A.

    2016-04-01

    Recent work suggests the distribution of methane emissions from fracking operations is skewed with a small percentage of emitters contributing a large proportion of the total emissions. These sites are known as 'super-emitters.' The Marcellus shale, the most productive natural gas shale field in the United States, has received less intense focus for well-level emissions and is here used as a test site for targeted analysis between current standard trace-gas advection practices and possible improvements via advanced modeling techniques. The Marcellus shale is topographically complex, making traditional techniques difficult to implement and evaluate. For many ground based mobile studies, the inverse Gaussian plume method (IGM) is used to produce emission rates. This method is best applied to well-mixed plumes from strong point sources and may not currently be well-suited for use with disperse weak sources, short-time frame measurements or data collected in complex terrain. To assess the quality of IGM results and to improve source-strength estimations, a robust study that combines observational data with a hierarchy of models of increasing complexity will be presented. The field test sites were sampled with multiple passes using a mobile lab as well as a stationary tower. This mobile lab includes a Garmin GPS unit, Vaisala weather station (WTX520), LICOR 7700 CH4 open path sensor and LICOR 7500 CO2/H2O open path sensor. The sampling tower was constructed consisting of a Metek uSonic-3 Class A sonic anemometer, and an additional LICOR 7700 and 7500. Data were recorded for at least one hour at these sites. The modeling will focus on large eddy simulations (LES) of the wind and CH4 concentration fields for these test sites. The LES model used 2 m horizontal and 1 m vertical resolution and was integrated in time for 45 min for various test sites under stable, neutral and unstable conditions. 
It is considered here as the reference against which various IGM approaches can be compared. Preliminary results show large variability in this region, which, under the observed meteorological conditions, amounts to a factor of 2 in IGM results. While this level of uncertainty appears adequate to identify super-emitters under most circumstances, there is large uncertainty in individual measurements. LES can provide insights into the expected variability and its sources, and into sampling patterns that will allow more robust error estimates.
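As a rough illustration of the inverse Gaussian plume method discussed above, the textbook ground-reflection plume equation can be inverted for the source rate from a single measured concentration enhancement. This is a minimal sketch under idealized assumptions (known wind speed, source height and dispersion lengths; all values below are illustrative, not the study's):

```python
import math

def gaussian_plume_conc(Q, u, y, z, H, sigma_y, sigma_z):
    """Concentration downwind of a point source emitting Q (kg/s) into a
    wind of speed u (m/s); standard Gaussian plume with ground reflection.
    y, z: crosswind and vertical receptor coordinates (m); H: source
    height (m); sigma_y, sigma_z: dispersion lengths (m) at this range."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

def invert_plume(C_meas, u, y, z, H, sigma_y, sigma_z):
    """Invert the plume equation for the source rate Q given a single
    measured concentration enhancement C_meas above background; the
    equation is linear in Q, so one unit-source evaluation suffices."""
    C_unit = gaussian_plume_conc(1.0, u, y, z, H, sigma_y, sigma_z)
    return C_meas / C_unit
```

In practice the dispersion lengths depend on distance and stability class, which is where the LES comparison above becomes valuable.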

  3. Excellent field emission properties of vertically oriented CuO nanowire films

    NASA Astrophysics Data System (ADS)

    Feng, Long; Yan, Hui; Li, Heng; Zhang, Rukang; Li, Zhe; Chi, Rui; Yang, Shuaiyu; Ma, Yaya; Fu, Bin; Liu, Jiwen

    2018-04-01

    Oriented CuO nanowire films were synthesized on a large scale using a simple method of directly heating copper grids in air. The field emission properties of the sample can be enhanced by improving the aspect ratio of the nanowires simply by controlling the synthesis conditions. Although the density of the nanowires is high, the screening effect is not an important factor in this field emission process because few nanowires stick out above the rest. Benefiting from these unique geometrical and structural features, the CuO nanowire samples show excellent field emission (FE) properties. The FE measurements of CuO nanowire films illustrate that the sample synthesized at 500 °C for 8 h has a comparatively low turn-on field of 0.68 V/μm, a low threshold field of 1.1 V/μm, and a large field enhancement factor β of 16782 (a record high value for CuO nanostructures, to the best of our knowledge), indicating that the samples are promising candidates for field emission applications.
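The field enhancement factor β reported above is conventionally extracted from the slope of a Fowler-Nordheim plot, ln(J/E²) versus 1/E. The abstract does not give the fit details; the sketch below assumes the standard FN relation and an illustrative work function φ (the value 4.5 eV is an assumption, not from the paper):

```python
import math

B_FN = 6.83e9  # Fowler-Nordheim constant, V * eV^(-3/2) * m^(-1)

def beta_from_fn_plot(E_fields, currents, phi=4.5):
    """Estimate the field enhancement factor beta from the slope of a
    Fowler-Nordheim plot: ln(J/E^2) vs 1/E has slope -B_FN*phi^1.5/beta.
    E_fields in V/m, currents in arbitrary units, phi in eV."""
    xs = [1.0 / E for E in E_fields]
    ys = [math.log(J / E**2) for J, E in zip(currents, E_fields)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # ordinary least-squares slope of the FN plot
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -B_FN * phi**1.5 / slope
```

On synthetic data generated from the FN relation, the estimator recovers the input β exactly; on real I-V data the linearity of the plot itself is the first check.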

  4. Source apportionment of methane using a triple isotope approach - Method development and application in the Baltic Sea

    NASA Astrophysics Data System (ADS)

    Steinbach, Julia; Holmstrand, Henry; Semiletov, Igor; Shakhova, Natalia; Shcherbakova, Kseniia; Kosmach, Denis; Sapart, Célia J.; Gustafsson, Örjan

    2015-04-01

    We present a method for measurements of the stable and radiocarbon isotope systems of methane in seawater and sediments. The triple isotope characterization of methane is useful for distinguishing different sources and for improving our understanding of biogeochemical processes affecting methane in the water column. Δ14C-CH4 is an especially powerful addition to stable isotope analyses for distinguishing between thermogenic and biogenic origins of the methane. Such measurements require large sample sizes due to the low natural abundance of radiocarbon in CH4. Our system for sample collection, methane extraction and purification builds on the approach of Kessler and Reeburgh (Limn. & Ocean. Meth., 2005). An in-field system extracts methane from 30-120 l of water or 1-2 l of sediment (depending on the in-situ methane concentration) by purging the samples with helium to transfer the dissolved methane to the headspace and circulating it through cryogenically cooled absorbent traps where the methane is collected. The in-field preparation eliminates the risks of storage and transport of large seawater quantities and subsequent leakage of sample gas, as well as ongoing microbial processes and chemical reactions that may alter the sample composition. In the subsequent shore-based treatment, a laboratory system is used to purify and combust the collected CH4 to AMS-amenable CO2. Subsamples from the methane traps are analyzed for stable isotopes and compared to stable isotope values measured directly on small water samples taken in parallel, to correct for any potential fractionation occurring during this process. The system has been successfully tested and used on several shorter shipboard expeditions in the Baltic Sea and on a long summer expedition across the Arctic Ocean. Here we present the details of the method and testing, as well as the first triple isotope field data from two cruises to the Landsort Deep area in the Central Baltic Sea.

  5. Field Tests of Real-time In-situ Dissolved CO2 Monitoring for CO2 Leakage Detection in Groundwater

    NASA Astrophysics Data System (ADS)

    Yang, C.; Zou, Y.; Delgado, J.; Guzman, N.; Pinedo, J.

    2016-12-01

    Groundwater monitoring for detecting CO2 leakage relies on groundwater sampling from water wells drilled into aquifers. Usually groundwater samples are required to be collected periodically in the field and analyzed in the laboratory, so groundwater sampling is labor- and cost-intensive for long-term monitoring of large areas. Potential damage and contamination of water samples during the sampling process can degrade accuracy, and intermittent monitoring may miss changes in the geochemical parameters of groundwater, and therefore signs of CO2 leakage. Real-time in-situ monitoring of geochemical parameters with chemical sensors may play an important role in CO2 leakage detection in groundwater at a geological carbon sequestration site. This study presents a field demonstration of a real-time in-situ monitoring system capable of covering large areas for detection of low levels of dissolved CO2 in groundwater and reliably differentiating natural variations in dissolved CO2 concentration from small changes resulting from leakage. The stand-alone system includes fully distributed fiber optic sensors for carbon dioxide detection with a unique sensor technology developed by Intelligent Optical Systems. The systems were deployed at two research sites: the Brackenridge Field Laboratory, where the aquifer is shallow at depths of 10-20 ft below the surface, and the Devine site, where the aquifer is much deeper at depths of 140 to 150 ft. Groundwater samples were periodically collected from the water wells in which the chemical sensors were installed and compared with the measurements from the chemical sensors. Our study shows that geochemical monitoring of dissolved CO2 with fiber optic sensors could provide reliable CO2 leakage signal detection in groundwater as long as CO2 leakage signals are stronger than the background noise at the monitoring locations.

  6. Reducing uncertainty in dust monitoring to detect aeolian sediment transport responses to land cover change

    NASA Astrophysics Data System (ADS)

    Webb, N.; Chappell, A.; Van Zee, J.; Toledo, D.; Duniway, M.; Billings, B.; Tedela, N.

    2017-12-01

    Anthropogenic land use and land cover change (LULCC) influence global rates of wind erosion and dust emission, yet our understanding of the magnitude of the responses remains poor. Field measurements and monitoring provide essential data to resolve aeolian sediment transport patterns and assess the impacts of human land use and management intensity. Data collected in the field are also required for dust model calibration and testing, as models have become the primary tool for assessing LULCC-dust cycle interactions. However, there is considerable uncertainty in estimates of dust emission due to the spatial variability of sediment transport. Field sampling designs are currently rudimentary and considerable opportunities are available to reduce the uncertainty. Establishing the minimum detectable change is critical for measuring spatial and temporal patterns of sediment transport, detecting potential impacts of LULCC and land management, and for quantifying the uncertainty of dust model estimates. Here, we evaluate the effectiveness of common sampling designs (e.g., simple random sampling, systematic sampling) used to measure and monitor aeolian sediment transport rates. Using data from the US National Wind Erosion Research Network across diverse rangeland and cropland cover types, we demonstrate how only large changes in sediment mass flux (of the order 200% to 800%) can be detected when small sample sizes are used, crude sampling designs are implemented, or when the spatial variation is large. We then show how statistical rigour and the straightforward application of a sampling design can reduce the uncertainty and detect change in sediment transport over time and between land use and land cover types.
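A minimum detectable change of the kind discussed above can be estimated with the standard normal-approximation formula for a two-sample comparison. The formula and the z values (alpha = 0.05, power = 0.80) are textbook assumptions, not taken from the study:

```python
def minimum_detectable_change(sd, n, z_alpha=1.96, z_beta=0.84):
    """Smallest difference in mean sediment mass flux detectable between
    two groups of n samples each, given within-group standard deviation
    sd. Normal-approximation two-sample formula; defaults correspond to
    a two-sided alpha of 0.05 and 80% power (assumed values)."""
    return (z_alpha + z_beta) * sd * (2.0 / n) ** 0.5
```

For example, with only n = 3 samplers per treatment and spatial variability as large as the mean (sd = mean), the detectable change is about 2.3 times the mean, i.e. roughly a 230% change, consistent in spirit with the 200% to 800% range quoted above; quadrupling n halves the detectable change.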

  7. Sand effects on thermal barrier coatings for gas turbine engines

    NASA Astrophysics Data System (ADS)

    Walock, Michael; Barnett, Blake; Ghoshal, Anindya; Murugan, Muthuvel; Swab, Jeffrey; Pepi, Marc; Hopkins, David; Gazonas, George; Kerner, Kevin

    Accumulation and infiltration of molten/semi-molten sand and subsequent formation of calcia-magnesia-alumina-silicate (CMAS) deposits in gas turbine engines continues to be a significant problem for aviation assets. This complex problem is compounded by the large variations in the composition, size, and topology of natural sands, gas generator turbine temperatures, thermal barrier coating properties, and the incoming particulate's momentum. In order to simplify the materials testing process, significant time and resources have been spent on the development of synthetic sand mixtures. However, there is debate over whether these mixtures accurately mimic the damage observed in field-returned engines. With this study, we provide a direct comparison of CMAS deposits from both natural and synthetic sands. Using spray deposition techniques, 7% yttria-stabilized zirconia coatings are deposited onto bond-coated Ni-superalloy discs. Each sample is coated with a sand slurry, either natural or synthetic, and exposed to a high-temperature flame for 1 hour. Test samples are characterized before and after flame exposure. In addition, the test samples will be compared to field-returned equipment. This research was sponsored by the US Army Research Laboratory and was accomplished under Cooperative Agreement # W911NF-12-2-0019.

  8. What drives the evolution of Luminous Compact Blue Galaxies in Clusters vs. the Field?

    NASA Astrophysics Data System (ADS)

    Wirth, Gregory D.; Bershady, Matthew A.; Crawford, Steven M.; Hunt, Lucas; Pisano, Daniel J.; Randriamampandry, Solohery M.

    2018-06-01

    Low-mass dwarf ellipticals are the most numerous members of present-day galaxy clusters, but the progenitors of this dominant population remain unclear. A prime candidate is the class of objects known as Luminous Compact Blue Galaxies (LCBGs), common in intermediate-redshift clusters but virtually extinct today. Recent cosmological simulations suggest that present-day dwarf galaxies begin as irregular field galaxies, undergo an environmentally-driven starburst phase as they enter the cluster, and stop forming stars earlier than their counterparts in the field. This model predicts that cluster dwarfs should have lower stellar mass per unit dynamical mass than their counterparts in the field. We are undertaking a two-pronged archival research program to test this key prediction using the combination of precision photometry from space and high-quality spectroscopy. First, we are combining optical HST/ACS imaging of five z=0.55 clusters (including two HST Frontier Fields) with Spitzer IR imaging and publicly-released Keck/DEIMOS spectroscopy to measure stellar-to-dynamical-mass ratios for a large sample of cluster LCBGs. Second, we are exploiting a new catalog of LCBGs in the COSMOS field to gather corresponding data for a significant sample of field LCBGs. By comparing mass ratios from these datasets, we aim to test theoretical predictions and determine the primary physical driver of cluster dwarf-galaxy evolution.

  9. [Avian influenza virus infection in people occupied in poultry fields in Guangzhou city].

    PubMed

    Liu, Yang; Lu, En-jie; Wang, Yu-lin; Di, Biao; Li, Tie-gang; Zhou, Yong; Yang, Li-li; Xu, Xiao-yin; Fu, Chuan-xi; Wang, Ming

    2009-11-01

    To conduct a serological investigation of H5N1/H9N2/H7N7 infection among people in poultry-related occupations. Serum samples were collected from people working in live-poultry and non-poultry retailing food markets, poultry wholesaling, large-scale poultry breeding factories, small-scale farms, wild-bird breeding, and swine slaughtering houses, and from the general population. Antibodies against H5, H9 and H7 were tested and analyzed using hemagglutination inhibition and neutralization tests. Logistic regression and chi-square tests were used. Among 2881 samples, 4 were positive for H5-Ab (0.14%) and 146 were positive for H9-Ab (5.07%); the prevalence of H9 among people in live-poultry retailing (14.96%) was the highest. Prevalence rates of H9 were as follows: 8.90% in large-scale poultry breeding factories, 6.69% in live-poultry wholesaling, 3.75% in wild-bird breeding, 2.40% in swine slaughtering, 2.21% in non-poultry retailing, 1.77% among rural poultry farmers and 2.30% in the general population. None of the 1926 poultry workers tested was positive for H7-Ab. The H5 prevalence among people was much lower than expected, but the H9 prevalence was higher. None of the populations tested was found positive for H7-Ab. There was a higher risk of AIV infection in live-poultry retailing, wholesaling and large-scale breeding businesses, with live-poultry retailing carrying the highest risk. The longer the length of service, the higher the risk.

  10. Sampling and Data Analysis for Environmental Microbiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murray, Christopher J.

    2001-06-01

    A brief review of the literature indicates the importance of statistical analysis in applied and environmental microbiology. Sampling designs are particularly important for successful studies, and it is highly recommended that researchers review their sampling design before heading to the laboratory or the field. Most statisticians have numerous stories of scientists who approached them after their study was complete only to have to tell them that the data they gathered could not be used to test the hypothesis they wanted to address. Once the data are gathered, a large and complex body of statistical techniques is available for analysis of the data. Those methods include both numerical and graphical techniques for exploratory characterization of the data. Hypothesis testing and analysis of variance (ANOVA) are techniques that can be used to compare the mean and variance of two or more groups of samples. Regression can be used to examine the relationships between sets of variables and is often used to examine the dependence of microbiological populations on microbiological parameters. Multivariate statistics provides several methods that can be used for interpretation of datasets with a large number of variables and to partition samples into similar groups, a task that is very common in taxonomy but also has applications in other fields of microbiology. Geostatistics and other techniques have been used to examine the spatial distribution of microorganisms. The objectives of this chapter are to provide a brief survey of some of the statistical techniques that can be used for sample design and data analysis of microbiological data in environmental studies, and to provide some examples of their use from the literature.
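As an example of one of the techniques surveyed, the F statistic of a one-way ANOVA comparing group means can be computed directly. This is a minimal sketch (a real analysis would also report the p-value from the F distribution, e.g. via SciPy's `scipy.stats.f_oneway`):

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA comparing group means.
    groups: list of lists of measurements (e.g. log CFU counts per site).
    F = (between-group mean square) / (within-group mean square)."""
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n_total
    # sum of squares between groups and within groups
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_b, df_w = k - 1, n_total - k
    return (ss_between / df_b) / (ss_within / df_w)
```

A large F indicates that between-group variation dominates within-group variation, i.e. at least one group mean likely differs.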

  11. Temperature- and field-dependent characterization of a conductor on round core cable

    NASA Astrophysics Data System (ADS)

    Barth, C.; van der Laan, D. C.; Bagrets, N.; Bayer, C. M.; Weiss, K.-P.; Lange, C.

    2015-06-01

    The conductor on round core (CORC) cable is one of the major high temperature superconductor cable concepts, combining scalability, flexibility, mechanical strength, ease of fabrication and high current density, making it a possible candidate as a conductor for large, high-field magnets. To simulate the boundary conditions of such magnets as well as the temperature dependence of CORC cables, a 1.16 m long sample consisting of 15 SuperPower REBCO tapes, each 4 mm wide, was characterized using the ‘FBI’ (force-field-current) superconductor test facility of the Institute for Technical Physics of the Karlsruhe Institute of Technology. In a five-step investigation, the CORC cable’s performance was determined under different transverse mechanical loads, magnetic background fields and temperatures, as well as its response to swift current changes. In the first step, the sample’s 77 K self-field current was measured in a liquid nitrogen bath. In the second step, the temperature dependence was measured under self-field conditions and compared with extrapolated single-tape data. In the third step, the magnetic background field was repeatedly cycled while measuring the current carrying capabilities to determine the impact of transverse Lorentz forces on the CORC cable sample’s performance. In the fourth step, the sample’s current carrying capabilities were measured at different background fields (2-12 T) and surface temperatures (4.2-51.5 K). Through finite element method simulations, the surface temperatures are converted into average sample temperatures, and the resulting field and temperature dependence is compared with extrapolated single-tape data. In the fifth step, the response of the CORC cable sample to rapid current changes (8.3 kA s⁻¹) was observed with a fast data acquisition system. During these tests, the sample performance remained constant and no degradation was observed. The sample’s measured current carrying capabilities correlate with those of single tapes, assuming the field and temperature dependence published by the manufacturer.

  12. A flux extraction device to measure the magnetic moment of large samples; application to bulk superconductors.

    PubMed

    Egan, R; Philippe, M; Wera, L; Fagnard, J F; Vanderheyden, B; Dennis, A; Shi, Y; Cardwell, D A; Vanderbemden, P

    2015-02-01

    We report the design and construction of a flux extraction device to measure the DC magnetic moment of large samples (i.e., several cm³) at cryogenic temperature. The signal is constructed by integrating the electromotive force generated by two coils wound in series-opposition that move around the sample. We show that an octupole expansion of the magnetic vector potential can be used conveniently to treat near-field effects for this geometrical configuration. The resulting expansion is tested for the case of a large, permanently magnetized, type-II superconducting sample. The dimensions of the sensing coils are determined in such a way that the measurement is influenced by the dipole magnetic moment of the sample and not by moments of higher order, within user-determined upper bounds. The device, which is able to measure magnetic moments in excess of 1 A m² (1000 emu), is validated by (i) a direct calibration experiment using a small coil driven by a known current and (ii) comparison with the results of numerical calculations obtained previously using a flux measurement technique. The sensitivity of the device is demonstrated by the measurement of flux-creep relaxation of the magnetization in a large bulk superconductor sample at liquid nitrogen temperature (77 K).
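The flux-extraction principle described above, integrating the electromotive force over the extraction to obtain the flux change and hence the magnetic moment, can be sketched numerically. The trapezoidal integration and the coil calibration constant `k_coil` are illustrative assumptions, not the authors' implementation (they calibrate with a small coil driven by a known current):

```python
def magnetic_moment(times, emf, k_coil):
    """Magnetic moment from a flux-extraction measurement: integrate the
    electromotive force (V) over the extraction (s) by the trapezoid rule
    to get the flux change, then divide by the coil calibration constant
    k_coil (V*s per A*m^2, assumed known from a calibration experiment)."""
    flux = 0.0
    for i in range(1, len(times)):
        flux += 0.5 * (emf[i] + emf[i - 1]) * (times[i] - times[i - 1])
    return flux / k_coil
```

The sensing-coil geometry described in the abstract ensures this integrated signal reflects the dipole moment alone, so a single scalar calibration constant suffices.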

  13. Analysis of the Effect of Chronic and Low-Dose Radiation Exposure on Spermatogenic Cells of Male Large Japanese Field Mice (Apodemus speciosus) after the Fukushima Daiichi Nuclear Power Plant Accident.

    PubMed

    Takino, Sachio; Yamashiro, Hideaki; Sugano, Yukou; Fujishima, Yohei; Nakata, Akifumi; Kasai, Kosuke; Hayashi, Gohei; Urushihara, Yusuke; Suzuki, Masatoshi; Shinoda, Hisashi; Miura, Tomisato; Fukumoto, Manabu

    2017-02-01

    In this study we analyzed the effect of chronic and low-dose-rate (LDR) radiation on spermatogenic cells of large Japanese field mice (Apodemus speciosus) after the Fukushima Daiichi Nuclear Power Plant (FNPP) accident. In March 2014, large Japanese field mice were collected from two sites located in, and one site adjacent to, the FNPP ex-evacuation zone: Tanashio, Murohara and Akogi, respectively. Testes from these animals were analyzed histologically. The external dose rate from radiocesium (combined 134Cs and 137Cs) in these animals at the sampling sites was 21 μGy/day in Tanashio, 304-365 μGy/day in Murohara and 407-447 μGy/day in Akogi. In the Akogi group, the numbers of spermatogenic cells and proliferating cell nuclear antigen (PCNA)-positive cells per seminiferous tubule were significantly higher compared to the Tanashio and Murohara groups, respectively. TUNEL-positive apoptotic cells tended to be detected at a lower level in the Murohara and Akogi groups compared to the Tanashio group. These results suggest that enhanced spermatogenesis occurred in large Japanese field mice living in and around the FNPP ex-evacuation zone. It remains to be elucidated whether this phenomenon, attributed to chronic exposure to LDR radiation, will benefit or adversely affect large Japanese field mice.

  14. Wafer level reliability for high-performance VLSI design

    NASA Technical Reports Server (NTRS)

    Root, Bryan J.; Seefeldt, James D.

    1987-01-01

    As very large scale integration architecture requires higher package density, reliability of these devices has approached a critical level. Previous processing techniques allowed a large window for varying reliability. However, as scaling and higher current densities push reliability to its limit, tighter control and instant feedback becomes critical. Several test structures developed to monitor reliability at the wafer level are described. For example, a test structure was developed to monitor metal integrity in seconds as opposed to weeks or months for conventional testing. Another structure monitors mobile ion contamination at critical steps in the process. Thus the reliability jeopardy can be assessed during fabrication preventing defective devices from ever being placed in the field. Most importantly, the reliability can be assessed on each wafer as opposed to an occasional sample.

  15. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 1. Baseline Studies. Volume VIII. Summary of Baseline Studies and Data.

    DTIC Science & Technology

    1982-05-01

    in May 1976, and, by July 1976, all sampling techniques were employed. In addition to routine displays of data analysis such as frequency tables and...amphibian and reptile communities in large aquatic habitats in Florida, comparison with similar herpetofaunal assemblages or populations is not possible... field environment was initiated at Lake Conway near Orlando, Fla., to study the effectiveness of the fish as a biological macrophyte control agent. A

  16. Evaluation of a rapid diagnostic field test kit for identification of Phytophthora ramorum, P. kernoviae and other Phytophthora species at the point of inspection

    Treesearch

    C.R. Lane; E. Hobden; L. Laurenson; V.C. Barton; K.J.D. Hughes; H. Swan; N. Boonham; A.J. Inman

    2008-01-01

    Plant health regulations to prevent the introduction and spread of Phytophthora ramorum and P. kernoviae require rapid, cost effective diagnostic methods for screening large numbers of plant samples at the time of inspection. Current on-site techniques require expensive equipment, considerable expertise and are not suited for plant...

  17. Instrument to collect fogwater for chemical analysis

    NASA Astrophysics Data System (ADS)

    Jacob, Daniel J.; Waldman, Jed M.; Haghi, Mehrdad; Hoffmann, Michael R.; Flagan, Richard C.

    1985-06-01

    An instrument is presented which collects large samples of ambient fogwater by impaction of droplets on a screen. The collection efficiency of the instrument is determined as a function of droplet size, and it is shown that fog droplets in the 3-100 μm diameter range are efficiently collected. No significant evaporation or condensation occurs at any stage of the collection process. Field testing indicates that the samples collected are representative of the ambient fogwater. The instrument may easily be automated and is suitable for use in routine air quality monitoring programs.

  18. Efficacy of a lead based paint XRF analyzer and a commercially available colorimetric lead test kit as qualitative field tools for determining presence of lead in religious powders.

    PubMed

    Shah, Manthan P; Shendell, Derek G; Meng, Qingyu; Ohman-Strickland, Pamela; Halperin, William

    2018-04-23

    The performances of a portable X-Ray Fluorescence (XRF) lead paint analyzer (RMD LPA-1, Protec Instrument Corp., Waltham, MA) and a commercially available colorimetric lead test kit (First Alert Lead Test Kit, eAccess Solutions, Inc., Palatine, IL) were evaluated for use by local or state health departments as potential cost-effective rapid analysis or "spot test" field techniques for tentative identification of lead content in sindoor powders. For both field-sampling methods, sensitivity, specificity and predictive values varied widely for samples containing <300,000 μg/g lead. For samples containing ≥300,000 μg/g lead, the aforementioned metrics were 100% (however, the CIs had a wide range). In addition, both field sampling methods showed clear, consistent positive readings only for samples containing ≥300,000 μg/g lead. Even samples with lead content as high as 5,110 μg/g were not positively identified by either field analysis technique. The results of this study suggest the XRF analyzer and colorimetric lead test kit cannot be used as a rapid field test for sindoor by health department inspectors.
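The screening metrics reported above (sensitivity, specificity and predictive values) follow from a simple confusion-matrix calculation when a field test is judged against a laboratory reference method. A minimal sketch; the counts in the test below are illustrative, not the study's data:

```python
def screening_metrics(tp, fp, fn, tn):
    """Performance of a qualitative field test against a laboratory
    reference: tp/fp/fn/tn are counts of true positives, false positives,
    false negatives and true negatives."""
    return {
        "sensitivity": tp / (tp + fn),  # fraction of true leaded samples flagged
        "specificity": tn / (tn + fp),  # fraction of clean samples passed
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }
```

Wide confidence intervals of the kind noted in the abstract arise when any of these denominators is small.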

  19. Glimpsing the imprint of local environment on the galaxy stellar mass function

    NASA Astrophysics Data System (ADS)

    Tomczak, Adam R.; Lemaux, Brian C.; Lubin, Lori M.; Gal, Roy R.; Wu, Po-Feng; Holden, Bradford; Kocevski, Dale D.; Mei, Simona; Pelliccia, Debora; Rumbaugh, Nicholas; Shen, Lu

    2017-12-01

    We investigate the impact of local environment on the galaxy stellar mass function (SMF) spanning a wide range of galaxy densities, from the field up to the dense cores of massive galaxy clusters. Data are drawn from a sample of eight fields from the Observations of Redshift Evolution in Large-Scale Environments (ORELSE) survey. Deep photometry allows us to select mass-complete samples of galaxies down to 10⁹ M⊙. Taking advantage of >4000 secure spectroscopic redshifts from ORELSE and precise photometric redshifts, we construct three-dimensional density maps between 0.55 < z < 1.3 using a Voronoi tessellation approach. We find that the shape of the SMF depends strongly on local environment, exhibited by a smooth, continual increase in the relative numbers of high- to low-mass galaxies towards denser environments. A straightforward implication is that local environment proportionally increases the efficiency of (a) destroying lower-mass galaxies and/or (b) growth of higher-mass galaxies. We also find the presence of this environmental dependence in the SMFs of star-forming and quiescent galaxies, although not quite as strongly for the quiescent subsample. To characterize the connection between the SMF of field galaxies and that of denser environments, we devise a simple semi-empirical model. The model begins with a sample of ≈10⁶ galaxies at z_start = 5 with stellar masses distributed according to the field. Simulated galaxies then evolve down to z_final = 0.8 following empirical prescriptions for star formation, quenching and galaxy-galaxy merging. We run the simulation multiple times, testing a variety of scenarios with differing overall amounts of merging. Our model suggests that a large number of mergers are required to reproduce the SMF in dense environments. Additionally, a large majority of these mergers would have to occur in intermediate-density environments (e.g. galaxy groups).

  20. Weighted Kolmogorov-Smirnov test: Accounting for the tails

    NASA Astrophysics Data System (ADS)

    Chicheportiche, Rémy; Bouchaud, Jean-Philippe

    2012-10-01

    Accurate goodness-of-fit tests for the extreme tails of empirical distributions are a very important issue, relevant in many contexts, including geophysics, insurance, and finance. We have derived exact asymptotic results for a generalization of the large-sample Kolmogorov-Smirnov test, well suited to testing these extreme tails. In passing, we have rederived and made more precise the approximate limit solutions found originally in unrelated fields, first in [L. Turban, J. Phys. A 25, 127 (1992)] and later in [P. L. Krapivsky and S. Redner, Am. J. Phys. 64, 546 (1996)].
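For reference, the classical (unweighted) large-sample Kolmogorov-Smirnov statistic that the paper generalizes can be computed as follows; the weighted variant reweights the deviations so the tails are not underemphasized, which this baseline sketch does not do:

```python
def ks_statistic(sample, cdf):
    """Two-sided Kolmogorov-Smirnov statistic D_n = sup_x |F_n(x) - F(x)|
    for a sample against a theoretical CDF. The supremum over the
    empirical step function is attained just before or after each sample
    point, so both one-sided gaps are checked at every order statistic."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        F = cdf(x)
        d = max(d, abs((i + 1) / n - F), abs(i / n - F))
    return d
```

Because D_n weights all quantiles equally, it is insensitive to tail discrepancies, which is precisely the motivation for the weighted test described above.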

  1. Semi-Automated Air-Coupled Impact-Echo Method for Large-Scale Parkade Structure.

    PubMed

    Epp, Tyler; Svecova, Dagmar; Cha, Young-Jin

    2018-03-29

    Structural Health Monitoring (SHM) has moved to data-dense systems, utilizing numerous sensor types to monitor infrastructure, such as bridges and dams, more regularly. One of the issues faced in this endeavour is the scale of the inspected structures and the time it takes to carry out testing. Installing automated systems that can provide measurements in a timely manner is one way of overcoming these obstacles. This study proposes an Artificial Neural Network (ANN) application that determines intact and damaged locations from a small training sample of impact-echo data, using air-coupled microphones from a reinforced concrete beam in lab conditions and data collected from a field experiment in a parking garage. The impact-echo testing in the field is carried out in a semi-autonomous manner to expedite the front end of the in situ damage detection testing. The use of an ANN removes the need for a user-defined cutoff value for the classification of intact and damaged locations when a least-square distance approach is used. It is postulated that this may contribute significantly to testing time reduction when monitoring large-scale civil Reinforced Concrete (RC) structures.


  3. Efficacy of water-dispersible formulations of biological control strains of Aspergillus flavus for aflatoxin management in corn.

    PubMed

    Weaver, Mark A; Abbas, Hamed K; Jin, Xixuan; Elliott, Brad

    2016-01-01

    Field experiments were conducted in 2011 and 2012 to evaluate the efficacy of water-dispersible granule (WDG) formulations of biocontrol strains of Aspergillus flavus in controlling aflatoxin contamination of corn. In 2011, when aflatoxin was present at very high levels, there was no WDG treatment that could provide significant protection against aflatoxin contamination. The following year a new WDG formulation was tested that resulted in 100% reduction in aflatoxin in one field experiment and ≥ 49% reduction in all five WDG treatments with biocontrol strain 21882. Large sampling error, however, limited the resolution of various treatment effects. Corn samples were also subjected to microbial analysis to understand better the mechanisms of successful biocontrol. In the samples examined here, the size of the A. flavus population on the grain was associated with the amount of aflatoxin, but the toxigenic status of that population was a poor predictor of aflatoxin concentration.

  4. Where are Low Mass X-ray Binaries Formed?

    NASA Astrophysics Data System (ADS)

    Kundu, A.; Maccarone, T. J.; Zepf, S. E.

    2004-08-01

    Chandra images of nearby galaxies reveal large numbers of low mass X-ray binaries (LMXBs). As in the Galaxy, a significant fraction of these are associated with globular clusters. We exploit the LMXB-globular cluster link in order to probe both the physical properties of globular clusters that promote the formation of LMXBs within clusters with specific characteristics, and to study whether the non-cluster field LMXB population was originally formed in clusters and then released into the field. The large population of globular clusters around nearby galaxies and the range of properties such as age, metallicity and host galaxy environment spanned by these objects enables us to identify and probe the link between these characteristics and the formation of LMXBs. We present the results of our study of a large sample of elliptical and S0 galaxies which reveals, among other things, that bright LMXBs definitively prefer metal-rich cluster hosts and that this relationship is unlikely to be driven by age effects. The ancestry of the non-cluster field LMXBs is a matter of some debate, with suggestions that they might have formed in the field, or were created in globular clusters and subsequently released into the field, either by being ejected from clusters by dynamical processes or as remnants of dynamically destroyed clusters. Each of these scenarios has a specific spatial signature that can be tested by our combined optical and X-ray study. Furthermore, these scenarios predict additional statistical variations that may be driven by the specific host galaxy environment. We present a detailed analysis of our sample galaxies and comment on the probability that the field sources were actually formed in clusters.

  5. Lessons from Astrobiological Planetary Analogue Exploration in Iceland: Biomarker Assay Performance and Downselection

    NASA Technical Reports Server (NTRS)

    Gentry, D. M.; Amador, E. S.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Cullen, T.; Duca, Z.; Kirby, J.; Jacobsen, M.; McCaig, H.

    2017-01-01

    Understanding the sensitivity of biomarker assays to the local physicochemical environment, and the underlying spatial distribution of the target biomarkers in 'homogeneous' environments, can increase mission science return. We have conducted four expeditions to Icelandic Mars analogue sites in which an increasingly refined battery of physicochemical measurements and biomarker assays were performed, staggered with scouting of further sites. Completed expeditions took place in 2012 (location scouting and field assay use testing), 2013 (sampling of two major sites with three assays and observational physicochemical measurements), 2015 (repeat sampling of prior sites and one new site, scouting of new sites, three assays and three instruments), and 2016 (preliminary sampling of new sites with analysis of returned samples). Target sites were geologically recent basaltic lava flows, and sample loci were arranged in hierarchically nested grids at 10 cm, 1 m, 10 m, 100 m, and >1 km order scales, subject to field constraints. Assays were intended to represent a diversity of potential biomarker types (cell counting via nucleic acid staining and fluorescence microscopy, ATP quantification via luciferase luminescence, and relative DNA quantification with simple domain-level primers) rather than a specific mission science target, and were selected to reduce laboratory overhead, require limited consumables, and allow rapid turnaround. All analytical work was performed in situ or in a field laboratory within a day's travel of the field sites unless otherwise noted. We have demonstrated the feasibility of performing ATP quantification and qPCR analysis in a field-based laboratory with single-day turnaround. The ATP assay was generally robust and reliable and required minimal field equipment and training to produce a large amount of useful data.
DNA was successfully extracted from all samples, but the serial-batch nature of qPCR significantly limited the number of primers (hence classifications) and replicates that could be run in a single day. Fluorescence microscopy did not prove feasible under the same constraints, primarily due to the large number of person-hours required to view, analyze, and record results from the images; however, this could be mitigated with higher-quality imaging instruments and appropriate image analysis software.

  6. Detection of nanoplastics in food by asymmetric flow field-flow fractionation coupled to multi-angle light scattering: possibilities, challenges and analytical limitations.

    PubMed

    Correia, Manuel; Loeschner, Katrin

    2018-02-06

    We tested the suitability of asymmetric flow field-flow fractionation (AF4) coupled to multi-angle light scattering (MALS) for detection of nanoplastics in fish. A homogenized fish sample was spiked with 100 nm polystyrene nanoparticles (PSNPs) (1.3 mg/g fish). Two sample preparation strategies were tested: acid digestion and enzymatic digestion with proteinase K. Both procedures were found suitable for degradation of the organic matrix. However, acid digestion resulted in large PSNP aggregates/agglomerates (> 1 μm). The presence of large particulates was not observed after enzymatic digestion, and consequently enzymatic digestion was chosen as the sample preparation method. The results demonstrated that it was possible to use AF4 for separating the PSNPs from the digested fish and to determine their size by MALS. The PSNPs could be easily detected by following their light scattering (LS) signal with a limit of detection of 52 μg/g fish. The AF4-MALS method could also be exploited for another type of nanoplastics in solution, namely polyethylene (PE). However, it was not possible to detect the PE particles in fish, due to the presence of an elevated LS background. Our results demonstrate that an analytical method developed for a certain type of nanoplastics may not be directly applicable to other types of nanoplastics and may require further adjustment. This work describes for the first time the detection of nanoplastics in a food matrix by AF4-MALS. Despite the current limitations, this is a promising methodology for detecting nanoplastics in food and in experimental studies (e.g., toxicity tests, uptake studies). Graphical abstract Basic concept for the detection of nanoplastics in fish by asymmetric flow field-flow fractionation coupled to multi-angle light scattering.

  7. FIELD-SCALE STUDIES: HOW DOES SOIL SAMPLE PRETREATMENT AFFECT REPRESENTATIVENESS ? (ABSTRACT)

    EPA Science Inventory

    Samples from field-scale studies are very heterogeneous and can contain large soil and rock particles. Oversize materials are often removed before chemical analysis of the soil samples because it is not practical to include these materials. Is the extracted sample representativ...

  9. A Multilevel, Hierarchical Sampling Technique for Spatially Correlated Random Fields

    DOE PAGES

    Osborn, Sarah; Vassilevski, Panayot S.; Villa, Umberto

    2017-10-26

    In this paper, we propose an alternative method to generate samples of a spatially correlated random field with applications to large-scale problems for forward propagation of uncertainty. A classical approach for generating these samples is the Karhunen-Loève (KL) decomposition. However, the KL expansion requires solving a dense eigenvalue problem and is therefore computationally infeasible for large-scale problems. Sampling methods based on stochastic partial differential equations provide a highly scalable way to sample Gaussian fields, but the resulting parametrization is mesh dependent. We propose a multilevel decomposition of the stochastic field to allow for scalable, hierarchical sampling based on solving a mixed finite element formulation of a stochastic reaction-diffusion equation with a random, white noise source function. Lastly, numerical experiments are presented to demonstrate the scalability of the sampling method as well as numerical results of multilevel Monte Carlo simulations for a subsurface porous media flow application using the proposed sampling method.
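    The dense Karhunen-Loève approach that the authors contrast against can be sketched as follows. The exponential covariance, grid size and correlation length are illustrative assumptions; the dense O(n³) eigensolve in the middle is exactly the cost the paper's multilevel SPDE-based method avoids.

```python
import numpy as np

# Classical KL sampling of a 1-D Gaussian field with exponential covariance.
n = 200                                   # grid points (dense eigensolve is O(n^3))
x = np.linspace(0.0, 1.0, n)
corr_len, sigma2 = 0.1, 1.0               # illustrative correlation length / variance
C = sigma2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

eigvals, eigvecs = np.linalg.eigh(C)      # the dense eigenvalue problem
eigvals = np.clip(eigvals, 0.0, None)     # guard tiny negative round-off values

rng = np.random.default_rng(0)
xi = rng.standard_normal(n)               # i.i.d. N(0,1) KL coefficients
sample = eigvecs @ (np.sqrt(eigvals) * xi)   # one realization of the field

# In practice the expansion is truncated to the m largest modes:
m = 20
idx = np.argsort(eigvals)[::-1][:m]
truncated = eigvecs[:, idx] @ (np.sqrt(eigvals[idx]) * xi[idx])
```

Even truncated, the expansion still requires the full eigendecomposition up front, which is why mesh-scalable SPDE samplers are attractive for large-scale forward uncertainty propagation.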

  11. Eucalyptus plantations for energy production in Hawaii. 1980 annual report, January 1980-December 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitesell, C. D.

    1980-01-01

    In 1980, 200 acres of eucalyptus trees were planted for a research and development biomass energy plantation, bringing the total area under cultivation to 300 acres. Of this total acreage, 90 acres or 30% was planted in experimental plots. The remaining 70% of the cultivated area was closely monitored to determine the economic cost/benefit ratio of large scale biomass energy production. In the large scale plantings, standard field practices were set up for all phases of production: nursery, clearing, planting, weed control and fertilization. These practices were constantly evaluated for potential improvements in efficiency and reduced cost. Promising experimental treatments were implemented on a large scale to test their effectiveness under field production conditions. In the experimental areas, all scheduled data collection for 1980 has been completed and most measurements have been keypunched and analyzed. Soil samples and leaf samples have been analyzed for nutrient concentrations. Crop logging procedures have been set up to monitor tree growth through plant tissue analysis. An intensive computer search on biomass, nursery practices, harvesting equipment and herbicide applications has been completed through the services of the US Forest Service.

  12. Novel failure mechanism and improvement for split-gate trench MOSFET with large current under unclamped inductive switch stress

    NASA Astrophysics Data System (ADS)

    Tian, Ye; Yang, Zhuo; Xu, Zhiyuan; Liu, Siyang; Sun, Weifeng; Shi, Longxing; Zhu, Yuanzheng; Ye, Peng; Zhou, Jincheng

    2018-04-01

    In this paper, a novel failure mechanism under unclamped inductive switch (UIS) stress for a Split-Gate Trench Metal Oxide Semiconductor Field Effect Transistor (MOSFET) with large current is investigated. The device sample is tested and analyzed in detail. The simulation results demonstrate that the nonuniform potential distribution of the source poly is responsible for the failure. Three structures that improve the device's UIS ruggedness are proposed and verified by TCAD simulation. The best of these, a device with the source metal inserted into the source poly through contacts in the field oxide, is fabricated and measured. The results demonstrate that the optimized structure can balance the trade-off between UIS ruggedness and static characteristics.

  13. Macroscopic inhomogeneous deformation behavior arising in single crystal Ni-Mn-Ga foils under tensile loading

    NASA Astrophysics Data System (ADS)

    Murasawa, Go; Yeduru, Srinivasa R.; Kohl, Manfred

    2016-12-01

    This study investigated macroscopic inhomogeneous deformation occurring in single-crystal Ni-Mn-Ga foils under uniaxial tensile loading. Two types of single-crystal Ni-Mn-Ga foil samples were examined: as-received and after thermo-mechanical training. Local strain and the strain field were measured under tensile loading using laser speckle and digital image correlation. The as-received sample showed a strongly inhomogeneous strain field with intermittence under progressive deformation, whereas the trained sample showed strain field homogeneity throughout the specimen surface. The as-received sample is mainly in a polycrystalline-like state composed of domain structures; it contains many domain boundaries and large domain structures in the body, which would cause intermittent nucleation of large local strain bands. The trained sample, in contrast, is an ideal single-crystalline state with a transformation-preferential orientation of variants, almost all domain boundaries and large domain structures having vanished during thermo-mechanical training. As a result, macroscopic homogeneous deformation occurs on the trained sample surface during deformation.

  14. Occurrence of severe gastroenteritis in pups after canine parvovirus vaccine administration: a clinical and laboratory diagnostic dilemma.

    PubMed

    Decaro, Nicola; Desario, Costantina; Elia, Gabriella; Campolo, Marco; Lorusso, Alessio; Mari, Viviana; Martella, Vito; Buonavoglia, Canio

    2007-01-26

    A total of 29 faecal samples collected from dogs with diarrhoea following canine parvovirus (CPV) vaccination were tested by minor groove binder (MGB) probe assays for discrimination between CPV vaccine and field strains and by diagnostic tests for detection of other canine pathogens. Fifteen samples tested positive only for CPV field strains; however, both vaccine and field strains were detected in three samples. Eleven samples were found to contain only the vaccine strain, although eight of them tested positive for other pathogens of dogs. Only three samples were found to contain the vaccine strain without evidence of canine pathogens. The present study confirms that most cases of parvovirus-like disease occurring shortly after vaccination are related to infection with field strains of canine parvovirus type 2 (CPV-2) rather than to reversion to virulence of the modified live virus contained in the vaccine.

  15. MUSE field splitter unit: fan-shaped separator for 24 integral field units

    NASA Astrophysics Data System (ADS)

    Laurent, Florence; Renault, Edgard; Anwand, Heiko; Boudon, Didier; Caillier, Patrick; Kosmalski, Johan; Loupias, Magali; Nicklas, Harald; Seifert, Walter; Salaun, Yves; Xu, Wenli

    2014-07-01

    MUSE (Multi Unit Spectroscopic Explorer) is a second generation Very Large Telescope (VLT) integral field spectrograph developed for the European Southern Observatory (ESO). It combines a 1' x 1' field of view sampled at 0.2 arcsec for its Wide Field Mode (WFM) and a 7.5"x7.5" field of view for its Narrow Field Mode (NFM). Both modes will operate with the improved spatial resolution provided by GALACSI (Ground Atmospheric Layer Adaptive Optics for Spectroscopic Imaging), which will use the VLT deformable secondary mirror and 4 Laser Guide Stars (LGS) foreseen in 2015. MUSE operates in the visible wavelength range (0.465-0.93 μm). A consortium of seven institutes is currently commissioning MUSE at the Very Large Telescope for the Preliminary Acceptance in Chile, scheduled for September 2014. MUSE is composed of several subsystems, each under the responsibility of one institute. The Fore Optics derotates and anamorphoses the image at the focal plane. A Splitting and Relay Optics feeds the 24 identical Integral Field Units (IFU), which are mounted within a large monolithic instrument mechanical structure. Each IFU incorporates an image slicer, a fully refractive spectrograph with VPH-grating and a detector system connected to a global vacuum and cryogenic system. During 2012 and 2013, all MUSE subsystems were integrated, aligned and tested at the P.I. institute in Lyon. After successful PAE in September 2013, the MUSE instrument was shipped to the Very Large Telescope in Chile, where it was aligned and tested in the ESO integration hall at Paranal. MUSE was then transferred as a monolithic unit onto the VLT telescope, where first light was achieved. This paper describes the MUSE main optical component, the Field Splitter Unit: it splits the VLT image into 24 subfields and provides the first separation of the beam for the 24 Integral Field Units. The paper also covers its manufacturing at Winlight Optics and its alignment into the MUSE instrument. The success of the MUSE alignment is demonstrated by the excellent results obtained for MUSE positioning, image quality and throughput on sky. MUSE commissioning at the VLT is planned for September 2014.

  16. Experimental Study of the Effect of the Initial Spectrum Width on the Statistics of Random Wave Groups

    NASA Astrophysics Data System (ADS)

    Shemer, L.; Sergeeva, A.

    2009-12-01

    The statistics of a random water wave field determine the probability of appearance of extremely high (freak) waves. This probability is strongly related to the spectral characteristics of the wave field. Laboratory investigation of the spatial variation of random wave-field statistics for various initial conditions is thus of substantial practical importance. Unidirectional nonlinear random wave groups are investigated experimentally in the 300 m long Large Wave Channel (GWK) in Hannover, Germany, the biggest facility of its kind in Europe. Numerous realizations of a wave field with the prescribed frequency power spectrum, yet randomly-distributed initial phases of each harmonic, were generated by a computer-controlled piston-type wavemaker. Several initial spectral shapes with identical dominant wave length but different width were considered. For each spectral shape, the total duration of sampling over all realizations was long enough to yield a sufficient sample size for reliable statistics. Throughout the experiments, an effort was made to retain the characteristic wave height value and thus the degree of nonlinearity of the wave field. Spatial evolution of numerous statistical wave field parameters (skewness, kurtosis and probability distributions) is studied using about 25 wave gauges distributed along the tank. It is found that, depending on the initial spectral shape, the frequency spectrum of the wave field may undergo significant modification in the course of its evolution along the tank; the values of all statistical wave parameters are strongly related to the local spectral width. A sample of the measured wave height probability functions (scaled by the variance of surface elevation) is plotted in Fig. 1 for the initially narrow rectangular spectrum. The results in Fig. 1 resemble findings obtained in [1] for the initial Gaussian spectral shape. The probability of large waves notably surpasses that predicted by the Rayleigh distribution and is highest at a distance of about 100 m. Acknowledgement: This study was carried out in the framework of the EC supported project "Transnational access to large-scale tests in the Large Wave Channel (GWK) of Forschungszentrum Küste" (Contract HYDRALAB III - No. 022441). [1] L. Shemer and A. Sergeeva, J. Geophys. Res. Oceans 114, C01015 (2009). Figure 1. Variation along the tank of the measured wave height distribution for rectangular initial spectral shape; the carrier wave period is T0 = 1.5 s.
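    The Rayleigh benchmark mentioned in this record is easy to state explicitly: for a narrow-banded linear wave field with surface-elevation variance σ², the wave-height exceedance probability is P(H > h) = exp(-h²/(8σ²)). A minimal sketch follows; the freak-wave criterion H > 2·Hs is a common convention, not a result of this study.

```python
import numpy as np

def rayleigh_exceedance(h, sigma):
    """P(H > h) for a narrow-banded linear wave field with elevation variance sigma^2."""
    return np.exp(-h**2 / (8.0 * sigma**2))

sigma = 1.0                      # surface-elevation standard deviation (illustrative)
hs = 4.0 * sigma                 # significant wave height, Hs ~ 4*sigma
# Conventional freak-wave criterion H > 2*Hs gives exp(-8) ~ 3.4e-4 under Rayleigh.
p_freak = rayleigh_exceedance(2.0 * hs, sigma)
```

The experimental finding is that the measured exceedance probabilities surpass this Rayleigh prediction, most strongly at around 100 m along the tank.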

  17. First results of tests on the WEAVE fibres

    NASA Astrophysics Data System (ADS)

    Sayède, Frédéric; Younes, Youssef; Fasola, Gilles; Dorent, Stéphane; Abrams, Don Carlos; Aguerri, J. Alphonso L.; Bonifacio, Piercarlo; Carrasco, Esperanza; Dalton, Gavin; Dee, Kevin; Laporte, Philippe; Lewis, Ian; Lhome, Emilie; Middleton, Kevin; Pragt, Johan H.; Rey, Juerg; Stuik, Remko; Trager, Scott C.; Vallenari, Antonella

    2016-07-01

    WEAVE is a new wide-field spectroscopy facility proposed for the prime focus of the 4.2m William Herschel Telescope. The facility comprises a new 2-degree field of view prime focus corrector with a 1000-multiplex fibre positioner, a small number of individually deployable integral field units, and a large single integral field unit. The IFUs (Integral Field Units) and the MOS (Multi Object Spectrograph) fibres can be used to feed a dual-beam spectrograph that will provide full coverage of the majority of the visible spectrum in a single exposure at a spectral resolution of 5000, or modest wavelength coverage in both arms at a resolution of 20000. The instrument is expected to be on-sky by the first quarter of 2018 to provide spectroscopic sampling of the fainter end of the Gaia astrometric catalogue, chemical labeling of stars to V ~ 17, and dedicated follow-up of substantial numbers of sources from the medium-deep LOFAR surveys. After a brief description of the Fibre System, we describe the fibre test bench, its calibration, and some test results. We have to verify 1920 fibres from the MOS bundles and 740 fibres from the mini-IFU bundles with the test bench. In particular, we present the Focal Ratio Degradation of a cable.

  18. Comparability between various field and laboratory wood-stove emission-measurement methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCrillis, R.C.; Jaasma, D.R.

    1991-01-01

    The paper compares various field and laboratory woodstove emission measurement methods. In 1988, the U.S. EPA promulgated performance standards for residential wood heaters (woodstoves). Over the past several years, a number of field studies have been undertaken to determine the actual level of emission reduction achieved by new technology woodstoves in everyday use. The studies have required the development and use of particulate and gaseous emission sampling equipment compatible with operation in private homes. Since woodstoves are tested for certification in the laboratory using EPA Methods 5G and 5H, it is of interest to determine the correlation between these regulatory methods and the in-house equipment. Two in-house sampling systems have been used most widely: one is an intermittent, pump-driven particulate sampler that collects particulate and condensible organics on a filter and organic adsorbent resin; the other uses an evacuated cylinder as the motive force, with particulate and condensible organics collected in a condenser and dual filter. Both samplers can operate unattended for 1-week periods. A large number of tests have been run comparing Methods 5G and 5H to both samplers. The paper presents these comparison data and determines the relationships between the regulatory methods and the field samplers.

  19. Field Exploration Science for a Return to the Moon

    NASA Astrophysics Data System (ADS)

    Schmitt, H. H.; Helper, M. A.; Muehlberger, W.; Snoke, A. W.

    2006-12-01

    Apollo field exploration science, and subsequent analysis, and interpretation of its findings and collected samples, underpin our current understanding of the origin and history of the Moon. That understanding, in turn, continues to provide new and important insights into the early histories of the Earth and other bodies in the solar system, particularly during the period that life formed and began to evolve on Earth and possibly on Mars. Those early explorations also have disclosed significant and potentially commercially viable lunar resources that might help satisfy future demand for both terrestrial energy alternatives and space consumables. Lunar sortie missions as part of the Vision for Space Exploration provide an opportunity to continue and expand the human geological, geochemical and geophysical exploration of the Moon. Specific objectives of future field exploration science include: (1) Testing of the consensus "giant impact" hypothesis for the origin of the Moon by further investigation of materials that may augment understanding of the chondritic geochemistry of the lower lunar mantle; (2) Testing of the consensus impact "cataclysm" hypothesis by obtaining absolute ages on large lunar basins of relative ages older than the 3.8-3.9 Ga mascon basins dated by Apollo 15 and 17; (3) Calibration of the end of large impacts in the inner solar system; (4) Global delineation of the internal structure of the Moon; (5) Global sampling and field investigations that extend the data necessary to remotely correlate major lunar geological and geochemical units; (6) Definition of the depositional history of polar volatiles - cometary, solar wind, or otherwise; (7) Determine the recoverable in situ concentrations and distribution of potential volatile resources; and (8) Acquisition of information and samples related to relatively less site-specific aspects of lunar geological processes. 
Planning for renewed field exploration of the Moon depends largely on the selection, training and use of sortie crews; the selection of landing sites; and the adopted operational approach to sortie extravehicular activity (EVA). The equipment necessary for successful exploration consists of that required for sampling, sample documentation, communications, mobility, and position knowledge. Other types of active geophysical, geochemical and petrographic equipment, if available, could clearly enhance the scientific and operational return of extended exploration over that possible during Apollo missions. Equipment to increase the efficiency of exploration should include the following helmet-mounted systems: (1) a voice-activated or automatic, electronic, stereo photo-documentation camera that is photometrically and geometrically fully calibrated; (2) an automatic position and elevation determination system; and (3) a laser-ranging device aligned with the stereo camera axis. Heads-up displays and controls on the helmet, activated and selected by voice, should be available for control and use of this equipment.

  20. Sampling errors in the estimation of empirical orthogonal functions. [for climatology studies

    NASA Technical Reports Server (NTRS)

    North, G. R.; Bell, T. L.; Cahalan, R. F.; Moeng, F. J.

    1982-01-01

    Empirical Orthogonal Functions (EOF's), eigenvectors of the spatial cross-covariance matrix of a meteorological field, are reviewed with special attention given to the necessary weighting factors for gridded data and the sampling errors incurred when too small a sample is available. The geographical shape of an EOF shows large intersample variability when its associated eigenvalue is 'close' to a neighboring one. A rule of thumb indicating when an EOF is likely to be subject to large sampling fluctuations is presented. An explicit example, based on the statistics of the 500 mb geopotential height field, displays large intersample variability in the EOF's for sample sizes of a few hundred independent realizations, a size seldom exceeded by meteorological data sets.
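    The rule of thumb from North et al. can be sketched numerically: the sampling error of an eigenvalue estimated from N independent realizations is roughly λ·sqrt(2/N), and an EOF is likely subject to large sampling fluctuations when that error is comparable to the gap to its neighboring eigenvalue. The white-noise field below is a synthetic illustration, not the 500 mb geopotential height data.

```python
import numpy as np

# Synthetic "meteorological field": 300 independent realizations on 40 grid points.
rng = np.random.default_rng(1)
n_samples, n_points = 300, 40
field = rng.standard_normal((n_samples, n_points))
field -= field.mean(axis=0)                    # remove the sample mean

# EOFs are eigenvectors of the spatial cross-covariance matrix.
cov = field.T @ field / (n_samples - 1)
eigvals, eofs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]              # sort descending
eigvals, eofs = eigvals[order], eofs[:, order]

# North et al.'s rule of thumb: delta_lambda ~ lambda * sqrt(2/N).
sampling_error = eigvals * np.sqrt(2.0 / n_samples)
gaps = eigvals[:-1] - eigvals[1:]              # spacing to the next eigenvalue
# EOF k is "suspect" when its eigenvalue's error bar spans the neighboring one.
suspect = sampling_error[:-1] >= gaps
```

For a pure-noise field the eigenvalues are nearly degenerate, so most EOFs fail the criterion, which is exactly the intersample-variability situation the abstract describes.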

  1. Impact of spatial variability and sampling design on model performance

    NASA Astrophysics Data System (ADS)

    Schrape, Charlotte; Schneider, Anne-Kathrin; Schröder, Boris; van Schaik, Loes

    2017-04-01

    Many environmental physical and chemical parameters, as well as species distributions, display spatial variability at different scales. When measurements are costly in labour time or money, a choice has to be made between high sampling resolution at small scales with low spatial cover of the study area, and lower sampling resolution at small scales, which leads to local data uncertainties but better spatial cover of the whole area. This dilemma is often faced in the design of field sampling campaigns for large scale studies. When the gathered field data are subsequently used for modelling purposes, the choice of sampling design and the resulting data quality influence the model performance criteria. We studied this influence with a virtual model study based on a large dataset of field information on the spatial variation of earthworms at different scales. We built a virtual map of anecic earthworm distributions over the Weiherbach catchment (Baden-Württemberg, Germany). First, the field-scale abundance of earthworms was estimated using a catchment scale model based on 65 field measurements. Subsequently, the high small-scale variability was added using semi-variograms, based on five fields with a total of 430 measurements in a spatially nested sampling design over these fields, to estimate the nugget, range and standard deviation of measurements within the fields. With the produced maps, we performed virtual samplings of one up to 50 random points per field. We then used these data to rebuild the catchment scale models of anecic earthworm abundance with the same model parameters as in the work by Palm et al. (2013). The results show clearly that a large part of the unexplained deviance of the models is due to the very high small-scale variability in earthworm abundance: the models based on single virtual sampling points on average obtain an explained deviance of 0.20 and a correlation coefficient of 0.64. For larger numbers of sampling points per field, we averaged the measured abundances within each field to obtain a more representative value of the field average. Doubling the number of samplings per field strongly improved the model performance criteria (explained deviance 0.38 and correlation coefficient 0.73). With 50 sampling points per field the performance criteria were 0.91 and 0.97, respectively, for explained deviance and correlation coefficient. The relationship between the number of samplings and the performance criteria can be described by a saturation curve; beyond five samples per field the model improvement becomes rather small. With this contribution we wish to discuss the impact of data variability at the sampling scale on model performance, and the implications for sampling design, assessment of model results and ecological inferences.
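    The saturation effect described above can be reproduced with a toy simulation: averaging n noisy samples per field shrinks the small-scale noise as 1/sqrt(n), so the correlation between estimated and true field means rises steeply at first and then flattens. The distributions and noise level below are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)
n_fields = 65                     # same order as the catchment-scale measurements
# Hypothetical "true" per-field abundances and small-scale sampling noise.
true_mean = rng.gamma(shape=2.0, scale=5.0, size=n_fields)
noise_sd = 10.0                   # large small-scale variability relative to signal

def corr_with_truth(n_samples_per_field):
    """Correlation between true field means and means of n noisy samples per field."""
    # Averaging n samples reduces the noise standard deviation by sqrt(n).
    est = true_mean + rng.normal(0.0, noise_sd / np.sqrt(n_samples_per_field), n_fields)
    return float(np.corrcoef(true_mean, est)[0, 1])

# Correlation vs. samples per field traces out a saturation curve.
curve = {n: corr_with_truth(n) for n in (1, 2, 5, 10, 50)}
```

With the noise dominating at n = 1 and nearly averaged out at n = 50, the curve flattens after a handful of samples per field, mirroring the explained-deviance behaviour reported above.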

  2. Impact of Different Visual Field Testing Paradigms on Sample Size Requirements for Glaucoma Clinical Trials.

    PubMed

    Wu, Zhichao; Medeiros, Felipe A

    2018-03-20

    Visual field testing is an important endpoint in glaucoma clinical trials, and the testing paradigm used can have a significant impact on the sample size requirements. To investigate this, the study included 353 eyes of 247 glaucoma patients seen over a 3-year period to extract real-world visual field rates of change and variability estimates to provide sample size estimates from computer simulations. The clinical trial scenario assumed that a new treatment was added to one of two groups that were both under routine clinical care, with various treatment effects examined. Three different visual field testing paradigms were evaluated: a) evenly spaced testing; b) the United Kingdom Glaucoma Treatment Study (UKGTS) follow-up scheme, which adds clustered tests at the beginning and end of follow-up in addition to evenly spaced testing; and c) a clustered testing paradigm, with clusters of tests at the beginning and end of the trial period and two intermediary visits. The sample size requirements were reduced by 17-19% and 39-40% using the UKGTS and clustered testing paradigms, respectively, when compared to the evenly spaced approach. These findings highlight how the clustered testing paradigm can substantially reduce sample size requirements and improve the feasibility of future glaucoma clinical trials.
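    The simulation logic behind such sample-size estimates can be sketched as follows. The slopes, test-retest variability, visit schedules and crude test statistic are illustrative assumptions rather than the authors' actual models; the point is how the visit schedule enters the power calculation.

```python
import numpy as np

rng = np.random.default_rng(3)

def fitted_slope(times, true_slope, test_retest_sd):
    """Simulate noisy visual-field values at the given visit times; fit a linear slope."""
    y = true_slope * times + rng.normal(0.0, test_retest_sd, times.size)
    return np.polyfit(times, y, 1)[0]

def power(n_per_group, times, effect=0.5, base_slope=-1.0, sd=1.5, trials=200):
    """Monte Carlo power of detecting a treatment effect on slope (dB/year)."""
    hits = 0
    for _ in range(trials):
        ctrl = [fitted_slope(times, base_slope, sd) for _ in range(n_per_group)]
        trt = [fitted_slope(times, base_slope + effect, sd) for _ in range(n_per_group)]
        diff = np.mean(trt) - np.mean(ctrl)
        se = np.sqrt(np.var(ctrl, ddof=1) / n_per_group + np.var(trt, ddof=1) / n_per_group)
        hits += diff / se > 1.645            # crude one-sided 5% z-test
    return hits / trials

# Two hypothetical 3-year schedules with 7 visits each.
evenly = np.linspace(0.0, 3.0, 7)
clustered = np.array([0.0, 0.05, 0.1, 1.5, 2.9, 2.95, 3.0])
```

Because the clustered schedule pushes visits toward the ends of the trial, its sum of squared time deviations is larger, which lowers the variance of each fitted slope and therefore the sample size needed for a target power; the per-group size would be grown until `power` reaches, say, 0.8.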

  3. Field trapping and magnetic levitation performances of large single-grain Gd-Ba-Cu-O at different temperatures

    NASA Astrophysics Data System (ADS)

    Nariki, S.; Fujikura, M.; Sakai, N.; Hirabayashi, I.; Murakami, M.

    2005-10-01

We measured the temperature dependence of the trapped field and the magnetic levitation force for c-axis-oriented single-grain Gd-Ba-Cu-O bulk samples 48 mm in diameter. The trapped magnetic field of the samples was 2.1-2.2 T at 77 K and increased with decreasing temperature, reaching 4.1 T at 70 K; however, the samples fractured during measurements at lower temperatures due to the large electromagnetic force. Reinforcement with a metal ring was effective in improving the mechanical strength: a sample encapsulated in an Al ring could trap a very high magnetic field of 9.0 T at 50 K. In liquid O2, the Gd-Ba-Cu-O bulk exhibited a trapped field of 0.42 T and a magnetic levitation force about half of that in liquid N2.

  4. Temperature Control Diagnostics for Sample Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santodonato, Louis J; Walker, Lakeisha MH; Church, Andrew J

    2010-01-01

In a scientific laboratory setting, standard equipment such as a cryocooler is often used as part of a custom sample environment system designed to regulate temperature over a wide range. The end user may be more concerned with precise sample temperature control than with base temperature, yet cryogenic systems tend to be specified mainly in terms of cooling capacity and base temperature. Technical staff at scientific user facilities (and perhaps elsewhere) often wonder how best to specify and evaluate temperature control capabilities. Here we describe test methods and give results obtained at a user facility that operates a large sample environment inventory. Although this inventory includes a wide variety of temperature, pressure, and magnetic field devices, the present work focuses on cryocooler-based systems.

  5. Transport and attenuation of carboxylate-modified latex microspheres in fractured rock laboratory and field tracer tests

    USGS Publications Warehouse

    Becker, M.W.; Reimus, P.W.; Vilks, P.

    1999-01-01

Understanding colloid transport in ground water is essential to assessing the migration of colloid-size contaminants, the facilitation of dissolved contaminant transport by colloids, in situ bioremediation, and the health risks of pathogen contamination in drinking water wells. Much has been learned through laboratory and field-scale colloid tracer tests, but progress has been hampered by a lack of consistent tracer testing methodology at different scales and fluid velocities. This paper presents laboratory and field tracer tests in fractured rock that use the same type of colloid tracer over an almost three orders-of-magnitude range in scale and fluid velocity. Fluorescently dyed carboxylate-modified latex (CML) microspheres (0.19 to 0.98 μm diameter) were used as tracers in (1) a naturally fractured tuff sample, (2) a large block of naturally fractured granite, (3) a fractured granite field site, and (4) another fractured granite/schist field site. In all cases, the mean transport time of the microspheres was shorter than that of the solutes, regardless of detection limit. In all but the smallest scale test, only a fraction of the injected microsphere mass was recovered, with the smaller microspheres being recovered to a greater extent than the larger microspheres. Using existing theory, we hypothesize that the observed microsphere early arrival was due to volume exclusion and that attenuation was due to aggregation and/or settling during transport. In most tests, microspheres were detected using flow cytometry, which proved to be an excellent method of analysis.
CML microspheres appear to be useful tracers for fractured rock in forced-gradient and short-term natural-gradient tests, but longer residence times may result in small microsphere recoveries.

  6. Advances toward field application of 3D hydraulic tomography

    NASA Astrophysics Data System (ADS)

    Cardiff, M. A.; Barrash, W.; Kitanidis, P. K.

    2011-12-01

Hydraulic tomography (HT) is a technique that shows great potential for aquifer characterization and holds the promise of producing 3D hydraulic property distributions, given suitable equipment. First suggested over 15 years ago, HT assimilates distributed aquifer pressure (head) response data collected during a series of multiple pumping tests to produce estimates of aquifer property variability. Unlike traditional curve-matching analyses, which assume homogeneity or "effective" parameters within the radius of influence of a hydrologic test, HT analysis relies on numerical models with detailed heterogeneity in order to invert for the highly resolved 3D parameter distribution that jointly fits all data. Several numerical and laboratory investigations of characterization using HT have shown that property distributions can be accurately estimated between observation locations when experiments are correctly designed - a property not always shared by other, simpler 1D characterization approaches such as partially penetrating slug tests. HT may represent one of the best methods available for obtaining detailed 3D aquifer property descriptions, especially in deep or "hard" aquifer materials, where direct-push methods may not be feasible. However, to date HT has not been widely adopted at contaminated field sites. We believe that current perceived impediments to HT adoption center on four key issues: 1) a paucity in the scientific literature of proven, cross-validated 3D field applications; 2) a lack of guidelines and best practices for performing 3D HT field experiments; 3) the practical difficulty and time commitment associated with installing a large number of high-accuracy sampling locations and running a large number of pumping tests; and 4) the computational difficulty associated with solving large-scale inverse problems for parameter identification.
In this talk, we present current results in 3D HT research that address these four issues and thus bring HT closer to field practice. Topics to be discussed include: improving field efficiency through the design and implementation of new modular, easily installed equipment for 3D HT; validating field-scale 3D HT through application and cross-validation at the Boise Hydrogeophysical Research Site; developing guidelines for HT implementation based on field experience, numerical modeling, and a comprehensive review of the past 15 years of HT literature; and applying novel, fast numerical methods to large-scale HT data analysis. The results presented will focus on the application of 3D HT, but more generally we hope to provide insights on aquifer characterization that stimulate thought on continually updating estimates of aquifer characteristics while recognizing uncertainties and providing guidance for future data collection.

  7. A field test of point relascope sampling of down coarse woody material in managed stands in the Acadian Forest

    Treesearch

    John C. Brissette; Mark J. Ducey; Jeffrey H. Gove

    2003-01-01

    We field tested a new method for sampling down coarse woody material (CWM) using an angle gauge and compared it with the more traditional line intersect sampling (LIS) method. Permanent sample locations in stands managed with different silvicultural treatments within the Penobscot Experimental Forest (Maine, USA) were used as the sampling locations. Point relascope...

  8. Measurement of dynamic magnetization induced by a pulsed field: Proposal for a new rock magnetism method

    NASA Astrophysics Data System (ADS)

    Kodama, Kazuto

    2015-02-01

This study proposes a new method for measuring the transient magnetization of natural samples induced by a pulsed field with a duration of 11 ms using a pulse magnetizer. An experimental system was constructed, consisting of a pair of differential sensing coils connected to a high-speed digital oscilloscope for data acquisition. The data were transferred to a computer to obtain an initial magnetization curve and a descending branch of a hysteresis loop in a rapidly changing positive field. This system was tested with synthetic samples (permalloy ribbon, aluminum plate, and nickel powder) as well as two volcanic rock samples. Results from the synthetic samples showed considerable differences from those measured by a quasi-static method using a vibrating sample magnetometer (VSM). These differences were principally due to time-dependent magnetic properties or to electromagnetic effects, such as magnetic viscosity, eddy current loss, or magnetic relaxation. Results from the natural samples showed that the transient magnetization-field curves were largely comparable to the corresponding portions of the hysteresis loops. However, the relative magnetization (scaled to the saturation magnetization) at the end of a pulse was greater than that measured by a VSM. This discrepancy, together with the occurrence of rapid exponential decay after a pulse, indicates magnetic relaxations that could be interpreted in terms of domain wall displacement. These results suggest that, with further development, the proposed technique can become a useful tool for characterizing magnetic particles contained in a variety of natural materials.
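The sensing-coil principle behind such a system follows Faraday's law: the pickup voltage is proportional to dM/dt, so the transient magnetization is recovered by numerically integrating the digitized voltage trace. A minimal sketch of that reconstruction step, with an invented coil constant, sampling interval, and waveform (none of these values come from the paper):

```python
import math

# Hypothetical reconstruction: integrate a digitized pickup-coil voltage
# (proportional to dM/dt) with the trapezoidal rule to recover M(t).
def integrate_trace(voltages, dt, coil_constant=1.0):
    m, out = 0.0, [0.0]
    for v0, v1 in zip(voltages, voltages[1:]):
        m += 0.5 * (v0 + v1) * dt / coil_constant
        out.append(m)
    return out

# Synthetic trace: dM/dt for a magnetization rising as 1 - exp(-t/tau),
# sampled every 10 us over an 11 ms pulse (all made-up numbers).
dt, tau = 1e-5, 2e-3
ts = [i * dt for i in range(1100)]
dmdt = [math.exp(-t / tau) / tau for t in ts]
m = integrate_trace(dmdt, dt)
# m rises monotonically and approaches 1 (saturation) by the end of the pulse.
```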

  9. Design of shared instruments to utilize simulated gravities generated by a large-gradient, high-field superconducting magnet.

    PubMed

    Wang, Y; Yin, D C; Liu, Y M; Shi, J Z; Lu, H M; Shi, Z H; Qian, A R; Shang, P

    2011-03-01

    A high-field superconducting magnet can provide both high-magnetic fields and large-field gradients, which can be used as a special environment for research or practical applications in materials processing, life science studies, physical and chemical reactions, etc. To make full use of a superconducting magnet, shared instruments (the operating platform, sample holders, temperature controller, and observation system) must be prepared as prerequisites. This paper introduces the design of a set of sample holders and a temperature controller in detail with an emphasis on validating the performance of the force and temperature sensors in the high-magnetic field.

  10. Design of shared instruments to utilize simulated gravities generated by a large-gradient, high-field superconducting magnet

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Yin, D. C.; Liu, Y. M.; Shi, J. Z.; Lu, H. M.; Shi, Z. H.; Qian, A. R.; Shang, P.

    2011-03-01

    A high-field superconducting magnet can provide both high-magnetic fields and large-field gradients, which can be used as a special environment for research or practical applications in materials processing, life science studies, physical and chemical reactions, etc. To make full use of a superconducting magnet, shared instruments (the operating platform, sample holders, temperature controller, and observation system) must be prepared as prerequisites. This paper introduces the design of a set of sample holders and a temperature controller in detail with an emphasis on validating the performance of the force and temperature sensors in the high-magnetic field.

  11. Alchemical prediction of hydration free energies for SAMPL

    PubMed Central

    Mobley, David L.; Liu, Shuai; Cerutti, David S.; Swope, William C.; Rice, Julia E.

    2013-01-01

Hydration free energy calculations have become important tests of force fields. Alchemical free energy calculations based on molecular dynamics simulations provide a rigorous way to calculate these free energies for a particular force field, given sufficient sampling. Here, we report results of alchemical hydration free energy calculations for the set of small molecules comprising the 2011 Statistical Assessment of Modeling of Proteins and Ligands (SAMPL) challenge. Our calculations are largely based on the Generalized Amber Force Field (GAFF) with several different charge models, and we achieved RMS errors in the 1.4-2.2 kcal/mol range depending on charge model, marginally higher than what we typically observed in previous studies [1-5]. The test set consists of ethane, biphenyl, and a dibenzyl dioxin, as well as a series of chlorinated derivatives of each. We found that, for this set, using high-quality partial charges from MP2/cc-PVTZ SCRF RESP fits provided marginally improved agreement with experiment over using AM1-BCC partial charges as we have more typically done, in keeping with our recent findings [5]. Switching to OPLS Lennard-Jones parameters with AM1-BCC charges also improves agreement with experiment. We also find a number of chemical trends within each molecular series which we can explain, but there are also some surprises, including some that are captured by the calculations and some that are not. PMID:22198475

  12. Species detection using HyBeacon(®) probe technology: Working towards rapid onsite testing in non-human forensic and food authentication applications.

    PubMed

    Dawnay, Nick; Hughes, Rebecca; Court, Denise Syndercombe; Duxbury, Nicola

    2016-01-01

Identifying individual species or determining species composition in an unknown sample is important for a variety of forensic applications. Food authentication, monitoring illegal trade in endangered species, forensic entomology, sexual assault casework and counter-terrorism are just some of the fields that can require the detection of the biological species present. Traditional laboratory-based approaches employ a wide variety of tools and technologies and exploit a number of different species-specific traits including morphology, molecular differences and immuno-chemical analyses. A large number of these approaches require laboratory-based apparatus, and results can take a number of days to be returned to investigating authorities. Having a presumptive test for rapid identification could lead to savings in terms of cost and time and allow sample prioritisation if confirmatory testing in a laboratory is required later. This model study describes the development of an assay using a single HyBeacon(®) probe and melt curve analyses allowing rapid screening and authentication of food products labelled as Atlantic cod (Gadus morhua). Exploiting melt-curve detection of species-specific SNP sites on the COI gene, the test allows detection of a target species (Atlantic cod) and closely related species which may be used as substitutes. The assay has been designed for use with the Field Portable ParaDNA system, a molecular detection platform for non-expert users. The entire process from sampling to result takes approximately 75 min. Validation studies were performed on single-source genomic DNA, mixed genomic DNA and commercial samples. Data suggest the assay has a lower limit of detection of 31 pg DNA. The specificity of the assay to Atlantic cod was measured by testing highly processed food samples including frozen, defrosted and cooked fish fillets as well as fish fingers, battered fish fillet and fish pie.
Ninety-six (92.7%) of the Atlantic cod food products tested provided a correct single-species result, with the remaining samples erroneously identified as containing non-target species. The data show that the assay was quick to design and characterise and is capable of yielding results that would be beneficial in a variety of fields, not least the authentication of food. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
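At its core, melt-curve species calling of this kind reduces to matching an observed probe melt temperature (Tm) against expected values for the target species and its likely substitutes. A toy classifier along those lines follows; the species list, Tm values, and tolerance are invented for illustration and are not the assay's calibrated parameters.

```python
# Hypothetical expected probe melt temperatures (deg C) per species; these
# numbers are invented, not the HyBeacon assay's calibrated values.
EXPECTED_TM = {
    "Atlantic cod": 61.5,
    "Pacific cod": 58.0,
    "haddock": 54.5,
}

def call_species(observed_tm, tolerance=1.0):
    """Return the species whose expected Tm is nearest, or None if no
    expected peak lies within the tolerance (no-call)."""
    best = min(EXPECTED_TM, key=lambda sp: abs(EXPECTED_TM[sp] - observed_tm))
    return best if abs(EXPECTED_TM[best] - observed_tm) <= tolerance else None

result = call_species(61.2)  # close to the Atlantic cod peak
miss = call_species(50.0)    # far from every expected peak -> no call
```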

  13. Pilot model expansion tunnel test flow properties obtained from velocity, pressure, and probe measurements

    NASA Technical Reports Server (NTRS)

    Friesen, W. J.; Moore, J. A.

    1973-01-01

Velocity-profile, pitot-pressure, and supplemental probe measurements were made at the nozzle exit of an expansion tunnel (a modification of the Langley pilot model expansion tube) for a nozzle inlet condition of a nitrogen test sample with a velocity of 4.5 km/sec and a density 0.005 times that of nitrogen at standard conditions, both with the nozzle initially immersed in a helium atmosphere and with the nozzle initially evacuated. The purpose of the report is to present the results of these measurements and some of the physical properties of the nitrogen test sample that can be inferred from them. The main conclusions are that the velocity profiles differ for the two nozzle conditions; that regions of the flow field can be found where the velocity is uniform to within 5 percent and constant for several hundred microseconds; that the velocity of the nitrogen test sample is reduced by passage through the nozzle; and that the velocity profiles do not significantly reflect the large variations that occur in the inferred density profiles.

  14. Charged-Particle Transport in the Data-Driven, Non-Isotropic Turbulent Magnetic Field in the Solar Wind

    NASA Astrophysics Data System (ADS)

    Sun, P.; Jokipii, J. R.; Giacalone, J.

    2016-12-01

Anisotropies in astrophysical turbulence have long been proposed and observed, and recent observations adopting multi-scale analysis techniques have provided a detailed description of the scale-dependent power spectrum of the magnetic field parallel and perpendicular to the scale-dependent magnetic field line in the solar wind. In previous work, we proposed a multi-scale method to synthesize a non-isotropic turbulent magnetic field with pre-determined power spectra of the fluctuating magnetic field as a function of scale. We present the effect of the resulting field on test-particle transport using a two-scale algorithm. We find that scale-dependent turbulence anisotropy affects charged-particle transport significantly differently than isotropy or global anisotropy does. It is important to apply this field synthesis method to the solar wind magnetic field based on spacecraft data; however, this relies on how we extract the power spectra of the turbulent magnetic field across different scales. In this study, we propose a power spectrum synthesis method based on Fourier analysis to extract the large- and small-scale power spectra from a single spacecraft observation with a long enough period and a high sampling frequency. We apply the method to solar wind measurements by the magnetometer onboard the ACE spacecraft and regenerate the large-scale isotropic 2D spectrum and the small-scale anisotropic 2D spectrum. We run test-particle simulations in the magnetic field generated in this way to estimate the transport coefficients and to compare with the isotropic turbulence model.
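Extracting a power spectrum from a single long, regularly sampled magnetometer series is, at its simplest, a periodogram. The sketch below demonstrates the idea with a naive DFT on a synthetic series; the window length, harmonic, and noise level are invented, and a real analysis would use FFTs and proper windowing rather than this O(n²) loop.

```python
import math
import random

random.seed(1)

# Naive periodogram: power at each harmonic k of an N-point real series.
def periodogram(x):
    n = len(x)
    power = []
    for k in range(n // 2):
        re = sum(x[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
        im = -sum(x[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
        power.append((re * re + im * im) / n)
    return power

# Synthetic "spacecraft" series: one strong oscillation at harmonic 8
# plus weak Gaussian noise (all made-up values).
n = 256
series = [math.sin(2 * math.pi * 8 * i / n) + 0.1 * random.gauss(0, 1)
          for i in range(n)]
spec = periodogram(series)
peak = max(range(1, len(spec)), key=lambda k: spec[k])
# The dominant spectral line is recovered at harmonic 8.
```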

  15. Lessons learned in preparing method 29 filters for compliance testing audits.

    PubMed

    Martz, R F; McCartney, J E; Bursey, J T; Riley, C E

    2000-01-01

Companies conducting compliance testing are required to analyze audit samples at the time they collect and analyze the stack samples, if audit samples are available. Eastern Research Group (ERG) provides technical support to the EPA Emission Measurements Center's Stationary Source Audit Program (SSAP) for developing, preparing, and distributing performance evaluation samples and audit materials. These audit samples are requested through the regulatory agency and include spiked audit materials for EPA Method 29-Metals Emissions from Stationary Sources, as well as other methods. To provide appropriate audit materials to federal, state, tribal, and local governments, as well as agencies performing environmental activities and conducting emission compliance tests, ERG has recently performed testing of blank filter materials and preparation of spiked filters for EPA Method 29. For sampling stationary sources using an EPA Method 29 sampling train, the use of filters without organic binders containing less than 1.3 µg/in.² of each of the metals to be measured is required. Risk assessment testing imposes even stricter requirements on clean-filter background levels. Three vendor sources of quartz fiber filters were evaluated for background contamination to ensure that audit samples would be prepared using filters with the lowest metal background levels. A procedure was developed to test new filters, and a cleaning procedure was evaluated to see whether a greater level of cleanliness could be achieved using an acid rinse on new filters. Background levels for filters supplied by different vendors, and within lots of filters from the same vendor, showed wide variation, confirmed through contact with several analytical laboratories that frequently perform EPA Method 29 analyses. It has been necessary to repeat more than one compliance test because of suspect metals background contamination levels.
An acid cleaning step produced an improvement in contamination level, but the difference was not significant for most of the Method 29 target metals. As a result of our studies, we conclude that filters for Method 29 testing should be purchased in lots as large as possible, and that testing firms should pre-screen new boxes and/or new lots of filters used for Method 29 testing. Random analysis of three filters (from the top, middle, and bottom of the box) from a new box of vendor filters before allowing them to be used in field tests is a prudent approach. A box of filters from a given vendor should be screened, and filters from this screened box should be used both for testing and as field blanks in each test scenario, to provide the level of quality assurance required for stationary source testing.
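The pre-screening recommendation above amounts to a simple acceptance rule: measure each target metal's background on a few filters sampled from the box and reject the lot if any metal exceeds the method limit. A sketch of such a rule, where the helper name and the measured values are hypothetical (only the 1.3 µg/in.² limit comes from the text):

```python
LIMIT_UG_PER_IN2 = 1.3  # Method 29 background limit per target metal

def lot_acceptable(filter_measurements):
    """filter_measurements: one dict of {metal: background in ug/in^2} per
    screened filter, e.g. from the top, middle, and bottom of a box."""
    return all(level <= LIMIT_UG_PER_IN2
               for f in filter_measurements
               for level in f.values())

# Hypothetical screening results for two boxes of filters.
clean_box = [{"Pb": 0.2, "As": 0.05, "Cd": 0.01},
             {"Pb": 0.3, "As": 0.04, "Cd": 0.02},
             {"Pb": 0.1, "As": 0.06, "Cd": 0.01}]
dirty_box = clean_box + [{"Pb": 2.5, "As": 0.05, "Cd": 0.01}]  # Pb over limit

ok = lot_acceptable(clean_box)
bad = lot_acceptable(dirty_box)
```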

  16. Development of large field-of-view two photon microscopy for imaging mouse cortex (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Bumstead, Jonathan; Côté, Daniel C.; Culver, Joseph P.

    2017-02-01

Spontaneous neuronal activity has been measured at cellular resolution in mice, zebrafish, and C. elegans using optical sectioning microscopy techniques, such as light sheet microscopy (LSM) and two-photon microscopy (TPM). Recent improvements in these modalities and in genetically encoded calcium indicators (GECIs) have enabled whole-brain imaging of calcium dynamics in zebrafish and C. elegans. However, these whole-brain microscopy studies have not been extended to mice due to the limited field of view (FOV) of TPM and the cumbersome geometry of LSM. Conventional TPM is restricted to diffraction-limited imaging over a small FOV (around 500 × 500 µm) due to the use of high-magnification objectives (e.g., 1.0 NA, 20×) and the aberrations introduced by the relay optics used in scanning the beam across the sample. To overcome these limitations, we have redesigned the entire optical path of the two-photon microscope (scanning optics and objective lens) to support a field of view of Ø7 mm with relatively high spatial resolution (<10 µm). Using the optical engineering software Zemax, we designed our system with commercially available optics that minimize astigmatism, field curvature, chromatic focal shift, and vignetting. Performance of the system was also tested experimentally with fluorescent beads in agarose, fixed samples, and in vivo structural imaging. Our large-FOV TPM provides a modality capable of studying distributed brain networks in mice at cellular resolution.

  17. Near-isothermal furnace for in situ and real time X-ray radiography solidification experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, M., E-mail: maike.becker@dlr.de; Dreißigacker, C.; Klein, S.

    2015-06-15

In this paper, we present a newly developed near-isothermal X-ray transparent furnace for in situ imaging of solidification processes in thin metallic samples. We show that the furnace is ideally suited to studying equiaxed microstructure evolution and grain interaction. To observe the growth dynamics of equiaxed dendritic structures, a minimal temperature gradient across the sample is required. A uniform thermal profile inside a circular sample is achieved by positioning the sample in the center of a cylindrical furnace body surrounded by a circular heater arrangement. Performance tests with the hypo-eutectic Al-15wt.%Cu and the near-eutectic Al-33wt.%Cu alloys validate the near-isothermal character of the sample environment. Controlled cooling rates from less than 0.5 K min⁻¹ up to 10 K min⁻¹ can be achieved in a temperature range of 720 K-1220 K. Integrated in our rotatable laboratory X-ray facility, X-RISE, the furnace provides a large field of view of 10.5 mm in diameter and a high spatial resolution of ∼4 μm. With the furnace presented here, equiaxed dendrite growth models can be rigorously tested against experiments on metal alloys by, e.g., enabling dendrite growth velocities to be determined as a function of undercooling, or solutal fields in front of the growing dendrite to be measured.

  18. Microstructure Design of Tempered Martensite by Atomistically Informed Full-Field Simulation: From Quenching to Fracture

    PubMed Central

    Borukhovich, Efim; Du, Guanxing; Stratmann, Matthias; Boeff, Martin; Shchyglo, Oleg; Hartmaier, Alexander; Steinbach, Ingo

    2016-01-01

Martensitic steels form a material class with a versatile range of properties that can be selected by varying the processing chain. In order to study and design the desired processing with minimal experimental effort, modeling tools are required. In this work, a full processing cycle from quenching through tempering to mechanical testing is simulated with a single modeling framework that combines the features of the phase-field method and a coupled chemo-mechanical approach. In order to perform the mechanical testing, the mechanical part is extended to the large-deformation case and coupled to crystal plasticity and a linear damage model. The quenching process is governed by the austenite-martensite transformation. In the tempering step, carbon segregation to the grain boundaries and the resulting cementite formation occur. During mechanical testing, the obtained material sample undergoes a large deformation that leads to local failure. The initial formation of the damage zones is observed next to the carbides, while the final damage morphology follows the martensite microstructure. This multi-scale approach can be applied to design optimal microstructures depending on processing and material composition. PMID:28773791

  19. Permeability and compression characteristics of municipal solid waste samples

    NASA Astrophysics Data System (ADS)

    Durmusoglu, Ertan; Sanchez, Itza M.; Corapcioglu, M. Yavuz

    2006-08-01

Four series of laboratory tests were conducted to evaluate the permeability and compression characteristics of municipal solid waste (MSW) samples. While two series of tests were conducted using a conventional small-scale consolidometer, the other two were conducted in a large-scale consolidometer specially constructed for this study. In each consolidometer, the MSW samples were tested at two different moisture contents, i.e., the original moisture content and field capacity. A scale effect between the two consolidometers of different sizes was investigated. The tests were carried out on samples reconsolidated to pressures of 123, 246, and 369 kPa. Time-settlement data gathered from each load increment were used to plot strain versus log-time graphs. The data acquired from the compression tests were used to back-calculate primary and secondary compression indices. The consolidometers were later adapted for permeability experiments. The values of the indices and the coefficient of compressibility for the MSW samples tested fell within a relatively narrow range despite the different consolidometer sizes and moisture contents of the specimens. The values of the coefficient of permeability were within a band of two orders of magnitude (10⁻⁶-10⁻⁴ m/s). The data presented in this paper agree very well with data reported by previous researchers. It was concluded that the scale effect in the compression behavior was significant; however, there was generally no linear relationship between the results obtained in the two consolidometers.
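Back-calculating the secondary compression index from a strain vs log-time plot, as described above, uses the slope of the straight-line portion of the settlement curve after primary consolidation. A minimal sketch of that calculation, with illustrative readings that are not the paper's data:

```python
import math

# The secondary compression index C_alpha is the strain change per log10
# cycle of time on the secondary (post-primary-consolidation) portion of
# the strain vs log-time curve.
def secondary_compression_index(t1, eps1, t2, eps2):
    return (eps2 - eps1) / (math.log10(t2) - math.log10(t1))

# Hypothetical readings: strain grows from 4.0% to 5.2% over one log cycle
# of time, giving C_alpha = 0.012.
c_alpha = secondary_compression_index(1.0e3, 0.040, 1.0e4, 0.052)
```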

  20. Detection of Invasive Mosquito Vectors Using Environmental DNA (eDNA) from Water Samples

    PubMed Central

    Schneider, Judith; Valentini, Alice; Dejean, Tony; Montarsi, Fabrizio; Taberlet, Pierre

    2016-01-01

Repeated introductions and spread of invasive mosquito species (IMS) have been recorded worldwide on a large scale in recent decades. In this context, members of the mosquito genus Aedes can present serious risks to public health, as they have or may develop vector competence for various viral diseases. While the tiger mosquito (Aedes albopictus) is a well-known vector for, e.g., dengue and chikungunya viruses, the Asian bush mosquito (Ae. j. japonicus) and Ae. koreicus have shown vector competence in the field and the laboratory for a number of viruses including dengue, West Nile fever and Japanese encephalitis. Early detection and identification are therefore crucial for successful eradication or control strategies. Traditional specific identification and monitoring of different and/or cryptic life stages of the invasive Aedes species on morphological grounds may lead to misidentifications and are problematic when extensive surveillance is needed. In this study, we developed, tested and applied an environmental DNA (eDNA) approach for the detection of three IMS, based on water samples collected in the field in several European countries. We compared real-time quantitative PCR (qPCR) assays specific for these three species and an eDNA metabarcoding approach with traditional sampling, and discuss the advantages and limitations of these methods. Detection probabilities for eDNA-based approaches were higher in most of the specific comparisons than for traditional surveys, and the results were congruent between both molecular methods, confirming the reliability and efficiency of eDNA-based techniques for the early and unambiguous detection and surveillance of invasive mosquito vectors. The ease of the water sampling procedure in the eDNA approach tested here allows the development of large-scale monitoring and surveillance programs for IMS, especially through citizen science projects. PMID:27626642

  1. Detection of Invasive Mosquito Vectors Using Environmental DNA (eDNA) from Water Samples.

    PubMed

    Schneider, Judith; Valentini, Alice; Dejean, Tony; Montarsi, Fabrizio; Taberlet, Pierre; Glaizot, Olivier; Fumagalli, Luca

    2016-01-01

Repeated introductions and spread of invasive mosquito species (IMS) have been recorded worldwide on a large scale in recent decades. In this context, members of the mosquito genus Aedes can present serious risks to public health as they have or may develop vector competence for various viral diseases. While the Tiger mosquito (Aedes albopictus) is a well-known vector for e.g. dengue and chikungunya viruses, the Asian bush mosquito (Ae. j. japonicus) and Ae. koreicus have shown vector competence in the field and the laboratory for a number of viruses including dengue, West Nile fever and Japanese encephalitis. Early detection and identification are therefore crucial for successful eradication or control strategies. Traditional specific identification and monitoring of different and/or cryptic life stages of the invasive Aedes species based on morphological grounds may lead to misidentifications, and are problematic when extensive surveillance is needed. In this study, we developed, tested and applied an environmental DNA (eDNA) approach for the detection of three IMS, based on water samples collected in the field in several European countries. We compared real-time quantitative PCR (qPCR) assays specific for these three species and an eDNA metabarcoding approach with traditional sampling, and discussed the advantages and limitations of these methods. Detection probabilities for eDNA-based approaches were higher than for traditional surveys in most of the specific comparisons, and the results were congruent between both molecular methods, confirming the reliability and efficiency of alternative eDNA-based techniques for the early and unambiguous detection and surveillance of invasive mosquito vectors. The ease of water sampling procedures in the eDNA approach tested here allows the development of large-scale monitoring and surveillance programs of IMS, especially using citizen science projects. PMID:27626642
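The gain from replicate water samples that underlies eDNA detection probabilities compounds multiplicatively; a minimal sketch, where the per-sample probability and replicate count are illustrative values, not figures from the study:

```python
def detection_prob(per_sample_p, n_replicates):
    """Probability of at least one positive among independent replicate
    samples, each detecting the target with probability per_sample_p."""
    return 1.0 - (1.0 - per_sample_p) ** n_replicates

# illustrative: a modest 40% per-sample detection rate reaches ~87%
# after four replicate water samples
p = detection_prob(0.4, 4)
```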

  2. Deep Borehole Field Test Research Activities at LBNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobson, Patrick; Tsang, Chin-Fu; Kneafsey, Timothy

The goal of the U.S. Department of Energy Used Fuel Disposition’s (UFD) Deep Borehole Field Test is to drill two 5 km large-diameter boreholes: a characterization borehole with a bottom-hole diameter of 8.5 inches and a field test borehole with a bottom-hole diameter of 17 inches. These boreholes will be used to demonstrate the ability to drill such holes in crystalline rocks, effectively characterize the bedrock repository system using geophysical, geochemical, and hydrological techniques, and emplace and retrieve test waste packages. These studies will be used to test the deep borehole disposal concept, which requires a hydrologically isolated environment characterized by low permeability, stable fluid density, reducing fluid chemistry conditions, and an effective borehole seal. During FY16, Lawrence Berkeley National Laboratory scientists conducted a number of research studies to support the UFD Deep Borehole Field Test effort. This work included providing supporting data for the Los Alamos National Laboratory geologic framework model for the proposed deep borehole site, conducting an analog study using an extensive suite of geoscience data and samples from a deep (2.5 km) research borehole in Sweden, conducting laboratory experiments and coupled process modeling related to borehole seals, and developing a suite of potential techniques that could be applied to the characterization and monitoring of the deep borehole environment. The results of these studies are presented in this report.

  3. Uses of infrared thermography in the low-cost solar array program

    NASA Technical Reports Server (NTRS)

    Glazer, S. D.

    1982-01-01

    The Jet Propulsion Laboratory has used infrared thermography extensively in the Low-Cost Solar Array (LSA) photovoltaics program. A two-dimensional scanning infrared radiometer has been used to make field inspections of large free-standing photovoltaic arrays and smaller demonstration sites consisting of integrally mounted rooftop systems. These field inspections have proven especially valuable in the research and early development phases of the program, since certain types of module design flaws and environmental degradation manifest themselves in unique thermal patterns. The infrared camera was also used extensively in a series of laboratory tests on photovoltaic cells to obtain peak cell temperatures and thermal patterns during off-design operating conditions. The infrared field inspections and the laboratory experiments are discussed, and sample results are presented.

  4. Finding SDSS Galaxy Clusters in 4-dimensional Color Space Using the False Discovery Rate

    NASA Astrophysics Data System (ADS)

    Nichol, R. C.; Miller, C. J.; Reichart, D.; Wasserman, L.; Genovese, C.; SDSS Collaboration

    2000-12-01

We describe a recently developed statistical technique that provides a meaningful cut-off in probability-based decision making. We are concerned with multiple testing, where each test produces a well-defined probability (or p-value). By well-defined, we mean that the null hypothesis used to determine the p-value is fully understood and appropriate. The method is called the False Discovery Rate (FDR), and its largest advantage over other measures is that it allows one to specify a maximal amount of acceptable error. As an example of this tool, we apply FDR to a four-dimensional clustering algorithm using SDSS data. For each galaxy (or test galaxy), we count the number of neighbors that fit within one standard deviation of a four-dimensional Gaussian centered on that test galaxy. The mean and standard deviation of that Gaussian are determined from the colors and errors of the test galaxy. We then take that same Gaussian and place it on a random selection of n galaxies and make a similar count. In the limit of large n, we expect the median count around these random galaxies to represent a typical field galaxy. For every test galaxy we determine the probability (or p-value) that it is a field galaxy based on these counts. A low p-value implies that the test galaxy is in a cluster environment. Once we have a p-value for every galaxy, we use FDR to determine at what level we should make our probability cut-off. Once this cut-off is made, we have a final sample of galaxies that are cluster-like galaxies. Using FDR, we also know the maximum amount of field contamination in our cluster galaxy sample. We present our preliminary galaxy clustering results using these methods.
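The cut-off selection described here is commonly implemented as the Benjamini-Hochberg step-up rule; a minimal sketch under that assumption (the function name and toy p-values are illustrative, not from the study):

```python
import numpy as np

def fdr_cutoff(p_values, alpha=0.05):
    """Benjamini-Hochberg: return the largest p-value threshold such that
    the expected fraction of false discoveries is at most alpha."""
    p = np.sort(np.asarray(p_values))
    m = len(p)
    # compare each sorted p-value to its BH critical value i/m * alpha
    below = p <= np.arange(1, m + 1) / m * alpha
    if not below.any():
        return 0.0  # nothing passes: reject no hypotheses
    return p[np.nonzero(below)[0].max()]

# toy p-values: a few strong signals among noise
pvals = [0.001, 0.008, 0.039, 0.041, 0.09, 0.2, 0.35, 0.6, 0.8, 0.95]
cut = fdr_cutoff(pvals, alpha=0.05)
```

All tests with p-values at or below `cut` are then accepted, with at most a fraction `alpha` of them expected to be false discoveries.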

  5. Electromagnetic analysis of a superconducting transformer for high current characterization of cable in conduit conductors in background magnetic field

    NASA Astrophysics Data System (ADS)

    Wu, Xiangyang; Tan, Yunfei; Fang, Zhen; Jiang, Donghui; Chen, Zhiyou; Chen, Wenge; Kuang, Guangli

    2017-10-01

A large cable-in-conduit-conductor (CICC) test facility has been designed and fabricated at the High Magnetic Field Laboratory of the Chinese Academy of Sciences (CHMFL) in order to meet the test requirements of conductors intended for future fusion reactors. The critical component of the test facility is an 80 kA superconducting transformer which consists of a multi-turn primary coil and a few-turn secondary coil. As the current source for the conductor samples, the electromagnetic performance of the superconducting transformer determines the stability and safety of the test facility. In this paper, the key factors and parameters that strongly affect the performance of the transformer are analyzed in detail. The conceptual design and optimization principles of the transformer are discussed. A coupled electromagnetic-circuit model built in ANSYS Multiphysics is successfully used to investigate the electromagnetic characteristics of the transformer under dynamic operating conditions.
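The current amplification of a multi-turn primary driving a few-turn secondary can be sanity-checked to first order with ideal ampere-turn balance; a sketch with made-up turn counts and currents, not the CHMFL design values:

```python
def secondary_current(primary_current_a, n_primary, n_secondary):
    """Ideal ampere-turn balance: N1*I1 = N2*I2, so a many-turn primary
    driving a few-turn secondary multiplies the current.
    (A real superconducting transformer is governed by mutual and self
    inductances; this is only the ideal limit.)"""
    return primary_current_a * n_primary / n_secondary

# illustrative: a 500-turn primary at 100 A driving a single-turn secondary
i2 = secondary_current(100.0, 500, 1)  # -> 50000.0 A
```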

  6. Measuring ignitability for in situ burning of oil spills weathered under Arctic conditions: from laboratory studies to large-scale field experiments.

    PubMed

    Fritt-Rasmussen, Janne; Brandvik, Per Johan

    2011-08-01

This paper compares the ignitability of Troll B crude oil weathered under simulated Arctic conditions (0%, 50% and 90% ice cover). The experiments were performed at different scales at SINTEF's laboratories in Trondheim, at the field research station on Svalbard, and in broken ice (70-90% ice cover) in the Barents Sea. Samples from the weathering experiments were tested for ignitability using the same laboratory burning cell. The measured ignitability from the experiments at these different scales showed good agreement for samples with similar weathering. The ice conditions clearly affected the weathering process: 70% ice cover or more reduced the weathering and allowed a longer time window for in situ burning. The results from the Barents Sea revealed that weathering and ignitability can vary within an oil slick. This field use of the burning cell demonstrated that it can be used as an operational tool to monitor the ignitability of oil spills.

  7. Octet baryons in large magnetic fields

    NASA Astrophysics Data System (ADS)

    Deshmukh, Amol; Tiburzi, Brian C.

    2018-01-01

    Magnetic properties of octet baryons are investigated within the framework of chiral perturbation theory. Utilizing a power counting for large magnetic fields, the Landau levels of charged mesons are treated exactly giving rise to baryon energies that depend nonanalytically on the strength of the magnetic field. In the small-field limit, baryon magnetic moments and polarizabilities emerge from the calculated energies. We argue that the magnetic polarizabilities of hyperons provide a testing ground for potentially large contributions from decuplet pole diagrams. In external magnetic fields, such contributions manifest themselves through decuplet-octet mixing, for which possible results are compared in a few scenarios. These scenarios can be tested with lattice QCD calculations of the octet baryon energies in magnetic fields.
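In the small-field limit referred to above, the baryon energy is conventionally expanded as follows (signs and normalization of the polarizability term vary between references, so this is one common convention rather than necessarily the paper's):

```latex
E(\vec{B}) = M - \vec{\mu} \cdot \vec{B} - 2\pi \beta_M B^{2} + \mathcal{O}(B^{3})
```

so the magnetic moment $\vec{\mu}$ is the linear response and the magnetic polarizability $\beta_M$ the quadratic response of the energy to the external field, which is how both quantities "emerge from the calculated energies."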

  8. MUSE alignment onto VLT

    NASA Astrophysics Data System (ADS)

    Laurent, Florence; Renault, Edgard; Boudon, Didier; Caillier, Patrick; Daguisé, Eric; Dupuy, Christophe; Jarno, Aurélien; Lizon, Jean-Louis; Migniau, Jean-Emmanuel; Nicklas, Harald; Piqueras, Laure

    2014-07-01

    MUSE (Multi Unit Spectroscopic Explorer) is a second generation Very Large Telescope (VLT) integral field spectrograph developed for the European Southern Observatory (ESO). It combines a 1' x 1' field of view sampled at 0.2 arcsec for its Wide Field Mode (WFM) and a 7.5"x7.5" field of view for its Narrow Field Mode (NFM). Both modes will operate with the improved spatial resolution provided by GALACSI (Ground Atmospheric Layer Adaptive Optics for Spectroscopic Imaging), that will use the VLT deformable secondary mirror and 4 Laser Guide Stars (LGS) foreseen in 2015. MUSE operates in the visible wavelength range (0.465-0.93 μm). A consortium of seven institutes is currently commissioning MUSE in the Very Large Telescope for the Preliminary Acceptance in Chile, scheduled for September, 2014. MUSE is composed of several subsystems which are under the responsibility of each institute. The Fore Optics derotates and anamorphoses the image at the focal plane. A Splitting and Relay Optics feed the 24 identical Integral Field Units (IFU), that are mounted within a large monolithic structure. Each IFU incorporates an image slicer, a fully refractive spectrograph with VPH-grating and a detector system connected to a global vacuum and cryogenic system. During 2012 and 2013, all MUSE subsystems were integrated, aligned and tested to the P.I. institute at Lyon. After successful PAE in September 2013, MUSE instrument was shipped to the Very Large Telescope in Chile where that was aligned and tested in ESO integration hall at Paranal. After, MUSE was directly transported, fully aligned and without any optomechanical dismounting, onto VLT telescope where the first light was overcame the 7th of February, 2014. This paper describes the alignment procedure of the whole MUSE instrument with respect to the Very Large Telescope (VLT). It describes how 6 tons could be move with accuracy better than 0.025mm and less than 0.25 arcmin in order to reach alignment requirements. 
The success of the MUSE alignment is demonstrated by the excellent results obtained onto MUSE image quality and throughput directly onto the sky.

  9. A Field-Portable Cell Analyzer without a Microscope and Reagents.

    PubMed

    Seo, Dongmin; Oh, Sangwoo; Lee, Moonjin; Hwang, Yongha; Seo, Sungkyu

    2017-12-29

This paper demonstrates a commercial-level field-portable lens-free cell analyzer called the NaviCell (No-stain and Automated Versatile Innovative cell analyzer) capable of automatically analyzing cell count and viability without employing an optical microscope and reagents. Based on the lens-free shadow imaging technique, the NaviCell (162 × 135 × 138 mm³ and 1.02 kg) has the advantage of providing analysis results with improved standard deviation between measurement results, owing to its large field of view. Importantly, the cell counting and viability testing can be analyzed without the use of any reagent, thereby simplifying the measurement procedure and reducing potential errors during sample preparation. In this study, the performance of the NaviCell for cell counting and viability testing was demonstrated using 13 and six cell lines, respectively. Based on the results of the hemocytometer (de facto standard), the error rate (ER) and coefficient of variation (CV) of the NaviCell are approximately 3.27 and 2.16 times better than the commercial cell counter, respectively. The cell viability testing of the NaviCell also showed an ER and CV performance improvement of 5.09 and 1.8 times, respectively, demonstrating sufficient potential in the field of cell analysis.
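The two figures of merit quoted here (ER against a reference method, CV across replicates) are simple to compute; a sketch with hypothetical cell counts, not data from the paper:

```python
import statistics

def error_rate(measured, reference):
    """Percent deviation of a measurement from the reference method."""
    return abs(measured - reference) / reference * 100.0

def coefficient_of_variation(replicates):
    """CV (%) = sample standard deviation / mean; a repeatability measure."""
    return statistics.stdev(replicates) / statistics.mean(replicates) * 100.0

# hypothetical counts (cells/mL): hemocytometer reference vs. five
# replicate readings from an automated counter
reference = 1.00e6
replicates = [0.97e6, 1.02e6, 0.99e6, 1.01e6, 0.98e6]
er = error_rate(statistics.mean(replicates), reference)
cv = coefficient_of_variation(replicates)
```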

  10. Effect of Electropolishing and Low-Temperature Baking on the Superconducting Properties of Large-Grain Niobium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Dhavale, A. S.; Ciovati, G.; Myneni, G. R.

Measurements of superconducting properties such as bulk and surface critical fields and thermal conductivity have been carried out in the temperature range from 2 K to 8 K on large-grain samples of different purity and on a high-purity fine-grain sample, for comparison. The samples were treated by electropolishing and low temperature baking (120° C, 48 h). While the residual resistivity ratio changed by a factor of ~3 among the samples, no significant variation was found in their superconducting properties. The onset field for flux penetration at 2 K, Hffp, measured within a ~30 µm depth from the surface, was ~160 mT, close to the bulk value. The baking effect was mainly to increase the field range up to which a coherent superconducting phase persists on the surface, above the upper critical field.

  11. On the classification of normally distributed neurons: an application to human dentate nucleus.

    PubMed

    Ristanović, Dušan; Milošević, Nebojša T; Marić, Dušica L

    2011-03-01

One of the major goals in cellular neurobiology is meaningful cell classification. However, cell classification involves many unresolved issues that need to be addressed. Neuronal classification usually starts with grouping cells into classes according to their main morphological features. If one tries to test such a qualitative classification quantitatively, a considerable overlap in cell types often appears, and little has been published on this problem. To address this shortcoming, we undertook the present study with the aim of offering a novel method for solving the class-overlap problem. To illustrate our method, we analyzed a sample of 124 neurons from adult human dentate nucleus. Among them we qualitatively selected 55 neurons with small dendritic fields (the small neurons), and 69 asymmetrical neurons with large dendritic fields (the large neurons). We showed that these two samples are normally and independently distributed. By measuring the neuronal soma areas of both samples, we observed that the corresponding normal curves intersect. We proved that the abscissa of the point of intersection of the curves can represent the boundary between two adjacent overlapping neuronal classes, since the error introduced by such a division is minimal. A statistical evaluation of the division was also performed.
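The class boundary described here, the abscissa where the two fitted normal curves cross, follows from equating the two densities, which reduces to a quadratic in x; a sketch with hypothetical soma-area parameters, not the paper's fitted values:

```python
import math

def gaussian_intersection(mu1, sd1, mu2, sd2):
    """x where N(x; mu1, sd1) = N(x; mu2, sd2): equating log-densities
    gives A*x^2 + B*x + C = 0."""
    if abs(sd1 - sd2) < 1e-12:          # equal spread: single midpoint crossing
        return [(mu1 + mu2) / 2.0]
    A = 1 / sd2**2 - 1 / sd1**2
    B = 2 * (mu1 / sd1**2 - mu2 / sd2**2)
    C = mu2**2 / sd2**2 - mu1**2 / sd1**2 - 2 * math.log(sd1 / sd2)
    disc = math.sqrt(B * B - 4 * A * C)
    return sorted([(-B - disc) / (2 * A), (-B + disc) / (2 * A)])

# hypothetical soma-area distributions (um^2) for small vs. large neurons;
# with unequal variances there are two roots, and the class boundary is
# the crossing that lies between the two means
cuts = gaussian_intersection(250.0, 60.0, 420.0, 90.0)
boundary = [x for x in cuts if 250.0 < x < 420.0][0]
```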

  12. Ancient DNA studies: new perspectives on old samples

    PubMed Central

    2012-01-01

In spite of past controversies, the field of ancient DNA is now a reliable research area due to recent methodological improvements. A series of recent large-scale studies have revealed the true potential of ancient DNA samples to study the processes of evolution and to test models and assumptions commonly used to reconstruct patterns of evolution and to analyze population genetics and palaeoecological changes. Recent advances in DNA technologies, such as next-generation sequencing, make it possible to recover DNA information from archaeological and paleontological remains, allowing us to go back in time and study the genetic relationships between extinct organisms and their contemporary relatives. With next-generation sequencing methodologies, DNA sequences can be retrieved even from samples (for example human remains) for which the technical pitfalls of classical methodologies required stringent criteria to guarantee the reliability of the results. In this paper, we review the methodologies applied to ancient DNA analysis and the perspectives that next-generation sequencing applications provide in this field. PMID:22697611

  13. Pre-Mission Input Requirements to Enable Successful Sample Collection by a Remote Field/EVA Team

    NASA Technical Reports Server (NTRS)

    Cohen, B. A.; Young, K. E.; Lim, D. S.

    2015-01-01

This paper is intended to evaluate the sample collection process with respect to sample characterization and decision making. In some cases, it may be sufficient to know whether a given outcrop or hand sample is the same as or different from previous sampling localities or samples. In other cases, it may be important to have more in-depth characterization of the sample, such as basic composition, mineralogy, and petrology, in order to effectively identify the best sample. Contextual field observations, in situ/handheld analysis, and backroom evaluation may all play a role in understanding field lithologies and their importance for return. For example, whether a rock is a breccia or a clast-laden impact melt may be difficult to determine from a single sample, but becomes clear as exploration of a field site puts it into context. The FINESSE (Field Investigations to Enable Solar System Science and Exploration) team is a new activity focused on a science and exploration field-based research program aimed at generating strategic knowledge in preparation for the human and robotic exploration of the Moon, near-Earth asteroids (NEAs) and Phobos and Deimos. We used the FINESSE field excursion to the West Clearwater Lake Impact structure (WCIS) as an opportunity to test factors related to sampling decisions. In contrast to other technology-driven NASA analog studies, the FINESSE WCIS activity is science-focused and, moreover, sampling-focused, with the explicit intent to return the best samples for geochronology studies in the laboratory. This specific objective effectively reduces the number of variables in the goals of the field test and enables a more controlled investigation of the role of the crewmember in selecting samples. We formulated one hypothesis to test: that providing details regarding the analytical fate of the samples (e.g. geochronology, XRF/XRD, etc.)
to the crew prior to their traverse will result in samples that are more likely to meet specific analytical objectives than samples collected in the absence of this premission information. We conducted three tests of this hypothesis. Our investigation was designed to document processes, tools and procedures for crew sampling of planetary targets. This is not meant to be a blind, controlled test of crew efficacy, but rather an effort to recognize the relevant variables that enter into sampling protocol and to develop recommendations for crew and backroom training in future endeavors. Methods: One of the primary FINESSE field deployment objectives was to collect impact melt rocks and impact melt-bearing breccias from a number of locations around the WCIS structure to enable high precision geochronology of the crater to be performed [1]. We conducted three tests at WCIS after two full days of team participation in field site activities, including using remote sensing data and geologic maps, hiking overland to become familiar with the terrain, and examining previously-collected samples from other islands. In addition, the team members shared their projects and techniques with the entire team. We chose our "crew members" as volunteers from the team, all of whom had had moderate training in geologic fieldwork and became familiar with the general field setting. The first two tests were short, focused tests of our hypothesis. Test A was to obtain hydrothermal vugs; Test B was to obtain impact melt and intrusive rock as well as the contact between the two to check for contact metamorphism and age differences. In both cases, the test director had prior knowledge of the site geology and had developed a study-specific objective for sampling prior to deployment. Prior to the field deployment, the crewmember was briefed on the sampling objective and the laboratory techniques that would be used on the samples. At the field sites (Fig. 
2), the crewmember was given 30 minutes to survey a small section of outcrop (10-15 m) and acquire a suite of three samples. The crewmember talked through his process and the test director kept track of the timeline in verbal cues to the crewmember. At the conclusion, the team member conducting the scientific study appraised the samples and the crewmember's train of thought. Test C was a 90-minute EVA simulation using two crewmembers working out of line-of-sight in communication with a science backroom. The science objectives were determined by the science backroom team in advance using a Gigapan image of the outcrop (Fig. 1). The science team formulated hypotheses for the outcrop units and created sampling objectives for impact-melt lithologies; the science team turned these into a science plan, which they communicated to the crew in camp prior to crew deployment. As part of the science plan, the science team also discussed their sample needs in depth with the crewmembers, including laboratory methods, objectives, and sample sizes needed. During the deployment, the two crewmembers relayed real-time information to the science backroom by radio with no time delay. Both the crew and science team re-evaluated their hypotheses and science plans in real-time. Discussion: Upon evaluation, we found that the focused tests (Tests A and B) were successful in meeting their scientific objectives. The crewmember used their knowledge of how the samples were to be used in further study (technique, sample size, and scientific need) to focus on the sampling task. The crewmember was comfortable spending minimal time describing and mapping the outcrop. The crewmember used all available time to get a good sample. The larger test was unsuccessful in meeting the sampling objectives. When the crewmembers began describing the lithologies, it was quickly apparent that the lithologies were not as the backroom expected and had communicated to the crew.
When the outcrop wasn't as expected, the crew members instinctively switched to field characterization mode, taking significant time to characterize and map the outcrop. One crew member admitted that he "kind of lost track" of the sampling strategy as he focused on the basic outcrop characterization. This is the logical first step in a field geology campaign: a significant amount of time must be spent by the crew and backroom to understand the outcrop and its significance. Basic field characterization of an outcrop is a focused activity that takes significant time and training [2, 3]. Sampling of representative lithologies can be added to this activity for little cost [4]. However, we have shown that identification of unusual or specific samples for laboratory study also takes significant time and knowledge. We suggest that sampling of this type be considered a separate activity from field characterization, and that crewmembers be trained in sampling needs for different kinds of studies (representative lithologies vs. specialized samples) to acquire a mindset for sampling similar to field mapping. Sampling activities should be given a significant amount of specifically allocated time when scheduling EVA activities, and, ideally, sampling should be done as a follow-up activity at a previously studied outcrop where both crew and backroom are comfortable with its context and characteristics. Our hypothesis posited that crewmember knowledge of how the samples would be used upon return would aid them in choosing relevant samples. Our testing bore this hypothesis out to some extent. We therefore recommend that crewmember training should include exposure to the laboratory techniques and analyses that will be used on the samples to foster this knowledge. There is also the potential for increasing crewmember contextual knowledge real-time in the field through the introduction of in situ geochemical technologies such as field portable XRF.
The presence of field portable geochemical technology could enable the astronauts to interrogate the samples for K abundance real-time, ensuring they could collect valuable and dateable samples [5]. Though simulations such as these can teach us a fair bit about decision making processes and timeline building, one EVA participant noted that when he wasn't collecting "real" samples, he wasn't at his best. This effect suggests that higher-fidelity studies involving truly remote participants conducting actual scientific studies merit further attention to capture lessons for application to future crew situations.

  14. A passive ozone sampler based on a reaction with nitrite.

    PubMed

    Koutrakis, P; Wolfson, J M; Bunyaviroch, A; Froehlich, S

    1994-02-01

Standard ozone monitoring techniques utilize large, heavy, and expensive instruments that are not easily adapted for personal or microenvironmental monitoring. For large-scale monitoring projects that examine spatial variations of a pollutant and human exposure assessments, passive sampling devices can provide the methodology to meet monitoring and statistical goals. Recently, we developed a coated filter for ozone collection that we used in a commercially available passive sampling device. Successful preliminary results merited further validation tests, which are presented in this report. The passive ozone sampler used in field and laboratory experiments consists of a badge clip supporting a barrel-shaped body that contains two coated glass fiber filters. The principal component of the coating is nitrite ion, which in the presence of ozone is oxidized to nitrate ion on the filter medium (NO2- + O3 → NO3- + O2). After sample collection, the filters were extracted with ultrapure water and analyzed for nitrate ion by ion chromatography. The results from laboratory and field validation tests indicated excellent agreement between the passive method and standard ozone monitoring techniques. We determined that relative humidity (ranging from 10% to 80%) and temperature (ranging from 0 degrees C to 40 degrees C) at typical ambient ozone levels (40 to 100 parts per billion) do not influence sampler performance. Face velocity and sampler orientation with respect to wind direction were found to affect the sampler's collection rate of ozone. Using a protective cup, which acts as both a wind screen and a rain cover, we were able to obtain a constant collection rate over a wide range of wind speeds.
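Reducing such a passive-sampler measurement to a time-weighted-average ozone level follows from the 1:1 stoichiometry of the nitrite reaction; a sketch in which the collection rate and nitrate mass are made-up illustrations, not the published calibration constants:

```python
# constants: molar mass of nitrate and molar volume of air at 25 C, 1 atm
MOLAR_MASS_NO3 = 62.0   # g/mol
MOLAR_VOLUME = 24.45    # L/mol

def ozone_ppb(nitrate_ug, hours, collection_rate_l_per_min):
    """1:1 stoichiometry (NO2- + O3 -> NO3- + O2): moles of nitrate found
    equal moles of ozone collected; divide by the sampled air volume."""
    mol_o3 = nitrate_ug * 1e-6 / MOLAR_MASS_NO3
    air_l = collection_rate_l_per_min * hours * 60.0
    return mol_o3 * MOLAR_VOLUME / air_l * 1e9   # ppb by volume

# hypothetical 24 h exposure recovering 2.5 ug nitrate at an assumed
# effective collection rate of 0.02 L/min
c = ozone_ppb(nitrate_ug=2.5, hours=24.0, collection_rate_l_per_min=0.02)
```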

  15. Kinematic Clues to OB Field Star Origins: Radial Velocities, Runaways, and Binaries

    NASA Astrophysics Data System (ADS)

    Januszewski, Helen; Castro, Norberto; Oey, Sally; Becker, Juliette; Kratter, Kaitlin M.; Mateo, Mario; Simón-Díaz, Sergio; Bjorkman, Jon E.; Bjorkman, Karen; Sigut, Aaron; Smullen, Rachel; M2FS Team

    2018-01-01

    Field OB stars are a crucial probe of star formation in extreme conditions. Properties of massive stars formed in relative isolation can distinguish between competing star formation theories, while the statistics of runaway stars allow an indirect test of the densest conditions in clusters. To address these questions, we have obtained multi-epoch, spectroscopic observations for a spatially complete sample of 48 OB field stars in the SMC Wing with the IMACS and M2FS multi-object spectrographs at the Magellan Telescopes. The observations span 3-6 epochs per star, with sampling frequency ranging from one day to about one year. From these spectra, we have calculated the radial velocities (RVs) and, in particular, the systemic velocities for binaries. Thus, we present the intrinsic RV distribution largely uncontaminated by binary motions. We estimate the runaway frequency, corresponding to the high velocity stars in our sample, and we also constrain the binary frequency. The binary frequency and fitted orbital parameters also place important constraints on star formation theories, as these properties drive the process of runaway ejection in clusters, and we discuss these properties as derived from our sample. This unique kinematic analysis of a high mass field star population thus provides a new look at the processes governing formation and interaction of stars in environments at extreme densities, from isolation to dense clusters.

  16. Formability analysis of sheet metals by cruciform testing

    NASA Astrophysics Data System (ADS)

    Güler, B.; Alkan, K.; Efe, M.

    2017-09-01

Cruciform biaxial tests are becoming increasingly popular for testing the formability of sheet metals, as they achieve frictionless, in-plane, multi-axial stress states with a single sample geometry. However, premature fracture of the samples during testing prevents the large-strain deformation necessary for formability analysis. In this work, we introduce a miniature cruciform sample design (test region of a few mm) and a test setup to achieve centre fracture and large uniform strains. With its excellent surface finish and optimized geometry, the sample deforms with diagonal strain bands intersecting at the test region. These bands prevent local necking and concentrate the strains at the sample centre. Imaging and strain analysis during testing confirm that uniform strain distributions and centre fracture are possible for various strain paths ranging from plane-strain to equibiaxial tension. Moreover, the sample deforms without deviating from the predetermined strain ratio at all test conditions, allowing formability analysis under large strains. We demonstrate these features of the cruciform test for three sample materials: Aluminium 6061-T6 alloy, DC-04 steel and Magnesium AZ31 alloy, and investigate their formability at both the millimetre scale and the microstructure scale.
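For bookkeeping the strain paths mentioned above (plane strain through equibiaxial tension), a common scalar measure is the von Mises equivalent strain under plastic incompressibility; a minimal sketch with illustrative strain values, not data from the study:

```python
import math

def equivalent_strain(e_major, e_minor):
    """Von Mises equivalent strain for an in-plane strain state, assuming
    plastic incompressibility (thickness strain = -(e1 + e2))."""
    e_thick = -(e_major + e_minor)
    return math.sqrt(2.0 / 3.0 * (e_major**2 + e_minor**2 + e_thick**2))

# strain paths reachable in a cruciform test, from plane strain
# (minor/major ratio 0) to equibiaxial tension (ratio 1)
plane_strain = equivalent_strain(0.2, 0.0)
equibiaxial = equivalent_strain(0.2, 0.2)
```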

  17. Application of field dependent polynomial model

    NASA Astrophysics Data System (ADS)

    Janout, Petr; Páta, Petr; Skala, Petr; Fliegel, Karel; Vítek, Stanislav; Bednář, Jan

    2016-09-01

    Extremely wide-field imaging systems offer many advantages for imaging large scenes, whether in microscopy, all-sky cameras, or security technologies. The large viewing angle, however, comes at the cost of substantial aberrations inherent to these imaging systems. Modeling wavefront aberrations with Zernike polynomials has long been known and is widely used. Our method does not model the system aberrations via the wavefront; instead, it directly models the aberrated point spread function (PSF) of the imaging system. This is a complicated task, and the desired accuracy was difficult to achieve with conventional methods. Our optimization technique, which searches for the coefficients of space-variant Zernike polynomials, can be described as a comprehensive model for ultra-wide-field imaging systems. The advantage of this model is that it describes the whole space-variant system, unlike the majority of models, which treat the system as partly invariant. A remaining issue is that the model attempts to equalize the size of the modeled PSF with the pixel size, so issues associated with sampling, pixel size, and the pixel sensitivity profile must be taken into account in the design. The model was verified on a series of laboratory test patterns, on test images of laboratory light sources, and finally on real images obtained with the extremely wide-field imaging system WILLIAM. Results of modeling this system are presented in this article.

  18. Removing systematic errors in interionic potentials of mean force computed in molecular simulations using reaction-field-based electrostatics

    PubMed Central

    Baumketner, Andrij

    2009-01-01

    The performance of reaction-field methods for treating electrostatic interactions is tested in simulations of ions solvated in water. The potentials of mean force between a sodium chloride ion pair and between the side chains of lysine and aspartate are computed using umbrella sampling and molecular dynamics simulations. It is found that, in comparison with lattice-sum calculations, the charge-group-based approaches to reaction-field treatments produce a large error in the association energy of the ions that exhibits a strong systematic dependence on the size of the simulation box. The atom-based implementation of the reaction field is seen to (i) improve the overall quality of the potential of mean force and (ii) remove the dependence on the size of the simulation box. It is suggested that atom-based truncation be used in reaction-field simulations of mixed media. PMID:19292522
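The reaction-field treatment discussed above replaces explicit long-range electrostatics with a dielectric continuum beyond a cutoff. As an illustration only (not the paper's code), here is a minimal Python sketch of the standard Barker-Watts reaction-field pair energy; the cutoff `r_c` and continuum dielectric `eps_rf` are assumed parameters:

```python
def reaction_field_energy(q1, q2, r, r_c, eps_rf=78.5):
    # Barker-Watts reaction-field pair energy (MD units: e, nm, kJ/mol).
    # The continuum beyond r_c contributes the k_rf*r^2 term, and the
    # shift c_rf makes the energy vanish continuously at the cutoff.
    if r >= r_c:
        return 0.0
    k_rf = (eps_rf - 1.0) / ((2.0 * eps_rf + 1.0) * r_c**3)
    c_rf = 1.0 / r_c + k_rf * r_c**2
    f = 138.935458  # Coulomb constant in kJ mol^-1 nm e^-2
    return f * q1 * q2 * (1.0 / r + k_rf * r**2 - c_rf)
```

The shift term is what makes atom-based truncation smooth at the cutoff; applying the same form per charge group rather than per atom is the variant the abstract reports as producing box-size-dependent errors.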

  19. Targeting allergenic fungi in agricultural environments aids the identification of major sources and potential risks for human health.

    PubMed

    Weikl, F; Radl, V; Munch, J C; Pritsch, K

    2015-10-01

    Fungi are, after pollen, the second most important producers of outdoor airborne allergens. To identify sources of airborne fungal allergens, a workflow for qPCR quantification from environmental samples was developed, thoroughly tested, and finally applied. We concentrated on determining the levels of allergenic fungi belonging to Alternaria, Cladosporium, Fusarium, and Trichoderma in plant and soil samples from agricultural fields in which cereals were grown. Our aims were to identify the major sources of allergenic fungi and the factors potentially influencing their occurrence. Plant materials were the main source of the tested fungi at and after harvest. Amounts of A. alternata and C. cladosporioides varied significantly in fields under different management conditions, but absolute levels were very high in all cases. This finding suggests that high numbers of allergenic fungi may be an inevitable side effect of farming in several crops. Applied in large-scale studies, the concept described here may help to explain the high rate of sensitization to airborne fungal allergens. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Enhanced Conformational Sampling Using Replica Exchange with Collective-Variable Tempering.

    PubMed

    Gil-Ley, Alejandro; Bussi, Giovanni

    2015-03-10

    The computational study of conformational transitions in RNA and proteins with atomistic molecular dynamics often requires suitable enhanced sampling techniques. We here introduce a novel method where concurrent metadynamics are integrated in a Hamiltonian replica-exchange scheme. The ladder of replicas is built with different strengths of the bias potential exploiting the tunability of well-tempered metadynamics. Using this method, free-energy barriers of individual collective variables are significantly reduced compared with simple force-field scaling. The introduced methodology is flexible and allows adaptive bias potentials to be self-consistently constructed for a large number of simple collective variables, such as distances and dihedral angles. The method is tested on alanine dipeptide and applied to the difficult problem of conformational sampling in a tetranucleotide.
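In the scheme described above, the replicas differ only in the strength of a shared bias potential, so the Hamiltonian replica-exchange Metropolis criterion takes a particularly simple form. A minimal sketch (illustrative, not the authors' implementation; `lam_i`, `lam_j` are assumed bias-scaling factors and `v_xi`, `v_xj` the bias potential evaluated at each replica's configuration):

```python
import math
import random

def swap_accept(beta, lam_i, lam_j, v_xi, v_xj, rng=random.random):
    # Metropolis criterion for exchanging configurations between replicas
    # i and j that share a bias potential V(x) applied with strengths
    # lam_i and lam_j:  Delta = beta * (lam_i - lam_j) * (V(x_j) - V(x_i))
    delta = beta * (lam_i - lam_j) * (v_xj - v_xi)
    return delta <= 0.0 or rng() < math.exp(-delta)
```

Because only the bias terms differ, the unbiased force-field energies cancel in the acceptance ratio, which is what makes such ladders cheap to exchange along.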

  1. Validation and Parameter Sensitivity Tests for Reconstructing Swell Field Based on an Ensemble Kalman Filter

    PubMed Central

    Wang, Xuan; Tandeo, Pierre; Fablet, Ronan; Husson, Romain; Guan, Lei; Chen, Ge

    2016-01-01

    The swell propagation model built on geometric optics is known to work well when simulating swells radiated from a distant storm. Based on this simple approximation, satellites have acquired large numbers of samples of basin-traversing swells generated by intense mid-latitude storms. How to routinely reconstruct swell fields from these irregularly sampled observations from space via the known swell propagation principle requires further examination. In this study, we apply 3-h-interval pseudo SAR observations in an ensemble Kalman filter (EnKF) to reconstruct a swell field in an ocean basin, and compare the result with buoy swell partitions and polynomial regression results. As validated against in situ measurements, the EnKF works well in terms of spatial-temporal consistency in far-field swell propagation scenarios. Using this framework, we further address the influence of the EnKF parameters and perform a sensitivity analysis to evaluate estimations made under different sets of parameters. Such analysis is of key interest with respect to future multiple-source routinely recorded swell field data. Satellite-derived swell data can serve as a valuable complement to in situ or wave re-analysis datasets. PMID:27898005
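For readers unfamiliar with the EnKF analysis step used here, a minimal stochastic (perturbed-observation) update can be sketched as follows. This is a generic textbook form, not the study's actual configuration; the observation operator `H` and error level are placeholders:

```python
import numpy as np

def enkf_analysis(ensemble, obs, H, obs_err_std, rng):
    # ensemble: (n_members, n_state) prior ensemble
    # obs:      (n_obs,) observation vector
    # H:        (n_obs, n_state) linear observation operator
    n = ensemble.shape[0]
    n_obs = len(obs)
    X = ensemble - ensemble.mean(axis=0)        # state anomalies
    HX = ensemble @ H.T                         # ensemble in observation space
    HA = HX - HX.mean(axis=0)                   # observation-space anomalies
    R = obs_err_std**2 * np.eye(n_obs)          # observation error covariance
    # Kalman gain K = P H^T (H P H^T + R)^-1, built from ensemble covariances
    P_HT = X.T @ HA / (n - 1)
    S = HA.T @ HA / (n - 1) + R
    K = np.linalg.solve(S.T, P_HT.T).T
    # each member assimilates an independently perturbed observation
    obs_pert = obs + rng.normal(0.0, obs_err_std, size=(n, n_obs))
    return ensemble + (obs_pert - HX) @ K.T
```

With a tight observation error, the analysis ensemble mean is pulled strongly toward the observation while the spread shrinks accordingly.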

  2. Large-scale changes in network interactions as a physiological signature of spatial neglect

    PubMed Central

    Baldassarre, Antonello; Ramsey, Lenny; Hacker, Carl L.; Callejas, Alicia; Astafiev, Serguei V.; Metcalf, Nicholas V.; Zinn, Kristi; Rengachary, Jennifer; Snyder, Abraham Z.; Carter, Alex R.; Shulman, Gordon L.

    2014-01-01

    The relationship between spontaneous brain activity and behaviour following focal injury is not well understood. Here, we report a large-scale study of resting state functional connectivity MRI and spatial neglect following stroke in a large (n = 84) heterogeneous sample of first-ever stroke patients (within 1–2 weeks). Spatial neglect, which is typically more severe after right than left hemisphere injury, includes deficits of spatial attention and motor actions contralateral to the lesion, and low general attention due to impaired vigilance/arousal. Patients underwent structural and resting state functional MRI scans, and spatial neglect was measured using the Posner spatial cueing task, and Mesulam and Behavioural Inattention Test cancellation tests. A principal component analysis of the behavioural tests revealed a main factor accounting for 34% of variance that captured three correlated behavioural deficits: visual neglect of the contralesional visual field, visuomotor neglect of the contralesional field, and low overall performance. In an independent sample (21 healthy subjects), we defined 10 resting state networks consisting of 169 brain regions: visual-fovea and visual-periphery, sensory-motor, auditory, dorsal attention, ventral attention, language, fronto-parietal control, cingulo-opercular control, and default mode. We correlated the neglect factor score with the strength of resting state functional connectivity within and across the 10 resting state networks. All damaged brain voxels were removed from the functional connectivity:behaviour correlational analysis. We found that the correlated behavioural deficits summarized by the factor score were associated with correlated multi-network patterns of abnormal functional connectivity involving large swaths of cortex. 
Specifically, dorsal attention and sensory-motor networks showed: (i) reduced interhemispheric functional connectivity; (ii) reduced anti-correlation with fronto-parietal and default mode networks in the right hemisphere; and (iii) increased intrahemispheric connectivity with the basal ganglia. These patterns of functional connectivity:behaviour correlations were stronger in patients with right- as compared to left-hemisphere damage and were independent of lesion volume. Our findings identify large-scale changes in resting state network interactions that are a physiological signature of spatial neglect and may relate to its right hemisphere lateralization. PMID:25367028
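The main behavioural factor reported above comes from a standard principal component analysis; the fraction of variance captured by the first component can be computed as in this generic sketch (not the study's pipeline):

```python
import numpy as np

def first_pc_variance_ratio(scores):
    # scores: (n_subjects, n_tests). Centre the columns; the squared singular
    # values of the centred data matrix give the variance along each
    # principal component.
    X = scores - scores.mean(axis=0)
    s = np.linalg.svd(X, compute_uv=False)
    return s[0]**2 / np.sum(s**2)
```

When the tests share a strong common factor, as with the correlated neglect deficits here, the first component dominates the variance ratio.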

  3. Structural design of a vertical antenna boresight 18.3 by 18.3-m planar near-field antenna measurement system

    NASA Technical Reports Server (NTRS)

    Sharp, G. R.; Trimarchi, P. A.; Wanhainen, J. S.

    1984-01-01

    A large, very precise near-field planar scanner was proposed for NASA Lewis Research Center. This scanner would permit near-field measurements over a horizontal scan plane measuring 18.3 m by 18.3 m. Large-aperture antennas mounted with the antenna boresight vertical could be tested up to 60 GHz. When such a large near-field scanner is used for pattern testing, the antenna or antenna system under test does not have to be moved. Hence, such antennas and antenna systems can be positioned and supported to simulate their configuration in zero g, and the very large and heavy machinery that would be needed to accurately move the antennas is avoided. A preliminary investigation was undertaken to address the mechanical design of such a challenging near-field antenna scanner. The configuration, structural design, and results of a parametric NASTRAN structural optimization analysis are presented. Further, the resulting design was dynamically analyzed to provide resonant frequency information to the designers of the scanner's mechanical drive system. If other large near-field scanners of comparable dimensions are to be constructed, this information can be used for their design optimization as well.

  4. A FIELD VALIDATION OF TWO SEDIMENT-AMPHIPOD TOXICITY TESTS

    EPA Science Inventory

    A field validation study of two sediment-amphipod toxicity tests was conducted using sediment samples collected subtidally in the vicinity of a polycyclic aromatic hydrocarbon (PAH)-contaminated Superfund site in Elliott Bay, WA, USA. Sediment samples were collected at 30 stati...

  5. Soils Sampling and Testing Training Guide for Field and Laboratory Technicians on Roadway Construction

    DOT National Transportation Integrated Search

    1999-12-01

    This manual has been developed as a training guide for field and laboratory technicians responsible for sampling and testing of soils used in roadway construction. Soils training and certification will increase the knowledge of laboratory, production...

  6. Three periods of one and a half decade of ischemic stroke susceptibility gene research: lessons we have learned

    PubMed Central

    2010-01-01

    Candidate gene association studies, linkage studies and genome-wide association studies have highlighted the role of genetic factors in the development of ischemic stroke. This research started over a decade ago and can be separated into three major periods. In the first wave, classic susceptibility markers associated with other diseases (such as the Leiden mutation in Factor V and mutations in the prothrombin and 5,10-methylenetetrahydrofolate reductase (MTHFR) genes) were tested for their role in stroke. These first studies used just a couple of hundred samples or even fewer. The second and still ongoing period bridges the other two and has led to a rapid increase in the spectrum of functional variants of genes or genomic regions, discovered primarily in relation to other diseases, that were tested on larger stroke samples of clinically better-stratified patients. Large numbers of these alleles were originally discovered by array-based genome-wide association studies. The third period of research involves the direct array screening of large samples; this approach represents significant progress for research in the field. Research into susceptibility genes for stroke has taught us that careful stratification of patients is critical, that susceptibility alleles are often shared between diseases, and that not all susceptibility factors that associate with clinical traits that are themselves risk factors for stroke (such as an increase of triglycerides) necessarily represent susceptibility for stroke. Research so far has mainly focused on large- and small-vessel-associated stroke, and knowledge on other types of stroke, which represent much smaller population samples, is still very scarce. Although some susceptibility allele tests are on the palette of some direct-to-consumer companies, the clinical utility and clinical validity of these test results still do not support their use in clinical practice. PMID:20831840

  7. Fort Dix Remedial Investigation/Feasibility Study for MAG-1 Area

    DTIC Science & Technology

    1994-01-01

    by PID headspace results or odor ), samples should be diluted to bring the target compound concentrations within the instrument calibration range...Conductivity Testing ................... 2-38 2.9 ANALYTICAL PROCEDURES FOR FIELD SCREENING SAMPLES .. 2-38 2.9.1 Volatile Organic Compounds ...ANALYSIS OF VOLATILE ORGANIC COMPOUNDS BY FIELD GAS CHROMATOGRAPHY - STANDARD OPERATING PROCEDURE APPENDIX B RDX EXPLOSIVES FIELD TEST KIT PROCEDURES

  8. Photospheric Magnetic Field Properties of Flaring versus Flare-quiet Active Regions. II. Discriminant Analysis

    NASA Astrophysics Data System (ADS)

    Leka, K. D.; Barnes, G.

    2003-10-01

    We apply statistical tests based on discriminant analysis to the wide range of photospheric magnetic parameters described in a companion paper by Leka & Barnes, with the goal of identifying those properties that are important for the production of energetic events such as solar flares. The photospheric vector magnetic field data from the University of Hawai'i Imaging Vector Magnetograph are well sampled both temporally and spatially, and we include here data covering 24 flare-event and flare-quiet epochs taken from seven active regions. The mean value and rate of change of each magnetic parameter are treated as separate variables, thus evaluating both the parameter's state and its evolution, to determine which properties are associated with flaring. Considering single variables first, Hotelling's T2-tests show small statistical differences between flare-producing and flare-quiet epochs. Even pairs of variables considered simultaneously, which do show a statistical difference for a number of properties, have high error rates, implying a large degree of overlap of the samples. To better distinguish between flare-producing and flare-quiet populations, larger numbers of variables are simultaneously considered; lower error rates result, but no unique combination of variables is clearly the best discriminator. The sample size is too small to directly compare the predictive power of large numbers of variables simultaneously. Instead, we rank all possible four-variable permutations based on Hotelling's T2-test and look for the most frequently appearing variables in the best permutations, with the interpretation that they are most likely to be associated with flaring. These variables include an increasing kurtosis of the twist parameter and a larger standard deviation of the twist parameter, but a smaller standard deviation of the distribution of the horizontal shear angle and a horizontal field that has a smaller standard deviation but a larger kurtosis. 
To support the "sorting all permutations" method of selecting the most frequently occurring variables, we show that the results of a single 10-variable discriminant analysis are consistent with the ranking. We demonstrate that individually, the variables considered here have little ability to differentiate between flaring and flare-quiet populations, but with multivariable combinations, the populations may be distinguished.
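Hotelling's T^2 test used above to rank variable combinations generalizes the two-sample t-test to several variables at once. A minimal sketch of the two-sample statistic (illustrative only, not the authors' code):

```python
import numpy as np

def hotelling_t2(A, B):
    # Two-sample Hotelling's T^2: multivariate analogue of the t statistic.
    # A: (n1, p) and B: (n2, p) observation matrices for the two groups.
    n1, n2 = A.shape[0], B.shape[0]
    d = A.mean(axis=0) - B.mean(axis=0)
    # pooled within-group covariance estimate
    S = ((n1 - 1) * np.cov(A, rowvar=False)
         + (n2 - 1) * np.cov(B, rowvar=False)) / (n1 + n2 - 2)
    return (n1 * n2 / (n1 + n2)) * float(d @ np.linalg.solve(S, d))
```

Identical group means give T^2 = 0; well-separated means relative to the pooled scatter give a large value, which is the basis for the permutation ranking described in the abstract.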

  9. Binary pseudo-random patterned structures for modulation transfer function calibration and resolution characterization of a full-field transmission soft x-ray microscope

    DOE PAGES

    Yashchuk, V. V.; Fischer, P. J.; Chan, E. R.; ...

    2015-12-09

    We present a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) one-dimensional sequences and two-dimensional arrays as an effective method for spectral characterization, in the spatial frequency domain, of a broad variety of metrology instrumentation, including interferometric microscopes, scatterometers, phase-shifting Fizeau interferometers, scanning and transmission electron microscopes, and, at this time, x-ray microscopes. The inherent power spectral density of BPR gratings and arrays, which has a deterministic white-noise-like character, allows a direct determination of the MTF with uniform sensitivity over the entire spatial frequency range and field of view of an instrument. We demonstrate the MTF calibration and resolution characterization over the full field of a transmission soft x-ray microscope using a BPR multilayer (ML) test sample with a 2.8 nm fundamental layer thickness. We show that beyond providing a direct measurement of the microscope's MTF, tests with the BPRML sample can be used to fine-tune the instrument's focal distance. Finally, our results confirm the universality of the method, which makes it applicable to a large variety of metrology instrumentation with spatial wavelength bandwidths from a few nanometers to hundreds of millimeters.
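The "deterministic white-noise-like" power spectral density of a binary pseudo-random sequence can be verified directly: a maximum-length sequence mapped to ±1 has an exactly flat spectrum away from DC. A short sketch, assuming a 7-bit LFSR for illustration (not the paper's actual test pattern):

```python
import numpy as np

def mls7():
    # 7-bit Fibonacci LFSR; feedback a_n = a_{n-6} XOR a_{n-7} corresponds
    # to the primitive trinomial x^7 + x + 1, so the period is 2^7 - 1 = 127
    state = [1] * 7
    out = []
    for _ in range(2**7 - 1):
        out.append(state[-1])
        fb = state[-1] ^ state[-2]
        state = [fb] + state[:-1]
    return np.array(out)

seq = 2.0 * mls7() - 1.0              # map {0, 1} -> {-1, +1}
psd = np.abs(np.fft.fft(seq))**2
# every nonzero-frequency bin equals N + 1 = 128: a flat, white-noise-like PSD
```

This flat spectrum is what gives the BPR test pattern uniform sensitivity across spatial frequencies, so the measured spectrum of its image directly traces the instrument's MTF.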

  10. Variable temperature superconducting microscope

    NASA Astrophysics Data System (ADS)

    Cheng, Bo; Yeh, W. J.

    2000-03-01

    We have developed and tested a promising type of superconducting quantum interference device (SQUID) microscope, which can be used to detect vortex motion and can operate in magnetic fields over a large temperature range. The system utilizes a single-loop coupling transformer, consisting of a patterned high Tc superconducting thin film. At one end of the transformer, a 20 μm diam detecting loop is placed close to the sample. At the other end, a large loop is coupled to a NbTi coil, which is connected to a low Tc SQUID sensor. Transformers in a variety of sizes have been tested and calibrated. The results show that the system is capable of detecting the motion of a single vortex. We have used the microscope to study the behavior of moving vortices at various positions in a YBa2Cu3O7 thin film bridge.

  11. Field demonstration of CO2 leakage detection and potential impacts on groundwater quality at Brackenridge Field Laboratory

    NASA Astrophysics Data System (ADS)

    Zou, Y.; Yang, C.; Guzman, N.; Delgado, J.; Mickler, P. J.; Horvoka, S.; Trevino, R.

    2015-12-01

    One concern related to GCS is the risk of unintended CO2 leakage from the storage formations into overlying potable aquifers or underground sources of drinking water (USDW). Here we present a series of field tests conducted in an alluvial aquifer on a river terrace at The University of Texas Brackenridge Field Laboratory (BFL). Several shallow groundwater wells were completed to the limestone bedrock at a depth of 6 m and screened in the lower 3 m. Core sediments recovered from the shallow aquifer show that the sediments vary in grain size from clay-rich layers to coarse sandy gravels. Two main types of field tests were conducted at the BFL: single- (or double-) well push-pull tests and pulse-like CO2 release tests. A single- (or double-) well push-pull test includes three phases: injection, resting, and pulling. During the injection phase, groundwater pumped from the shallow aquifer was stored in a tank, equilibrated with CO2 gas, and then injected into the shallow aquifer to mimic CO2 leakage. During the resting phase, the groundwater charged with CO2 reacted with minerals in the aquifer sediments. During the pulling phase, groundwater was pumped from the injection well and samples were collected continuously for groundwater chemistry analysis. In such tests, a large volume of CO2-charged groundwater can be injected into the shallow aquifer, maximizing its contact with the aquifer sediments. Unlike a push-pull test, a pulse-like CO2 release test for validating chemical sensors for CO2 leakage detection involves a release phase, in which CO2 gas is bubbled directly into the test well, and a post-monitoring phase, in which groundwater chemistry is continuously monitored with sensors and/or groundwater sampling. 
    Results of the single- (or double-) well push-pull tests conducted in the shallow aquifer show that unintended CO2 leakage could lead to dissolution of carbonates and some silicates and to mobilization of heavy metals from the aquifer sediments into the groundwater; however, such mobilization posed no risk to groundwater quality at this site. The pulse-like tests demonstrated that it is plausible to use chemical sensors for CO2 leakage detection in groundwater.

  12. A uniaxial stress capacitive dilatometer for high-resolution thermal expansion and magnetostriction under multiextreme conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Küchler, R.; Experimental Physics VI, Center for Electronic Correlations and Magnetism, University of Augsburg, Universitätsstrasse 2, 86135 Augsburg; Stingl, C.

    2016-07-15

    Thermal expansion and magnetostriction are direction-dependent thermodynamic quantities. For the characterization of novel quantum phases of matter, materials must be studied under multi-extreme conditions, in particular down to very low temperatures, in very high magnetic fields, or under high pressure. We developed a miniaturized capacitive dilatometer suitable for temperatures down to 20 mK and for use in high magnetic fields, which exerts a large spring force of between 40 and 75 N on the sample. This corresponds to a uniaxial stress of up to 3 kbar for a sample with a cross section of (0.5 mm)^2. We describe the design and performance tests of the dilatometer, which resolves length changes with a high resolution of 0.02 Å at low temperatures. The miniaturized device can be used in any standard cryostat, including dilution refrigerators and the commercial physical property measurement system.
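The quoted conversion from spring force to uniaxial stress is simply sigma = F/A, and the numbers in the abstract can be checked directly:

```python
def uniaxial_stress_kbar(force_newton, side_mm):
    # sigma = F / A for a square cross section, converted to kbar
    # (1 kbar = 1e8 Pa)
    area_m2 = (side_mm * 1e-3) ** 2
    return force_newton / area_m2 / 1e8

# 75 N on a (0.5 mm)^2 cross section gives the quoted 3 kbar maximum;
# the 40 N lower end of the spring-force range corresponds to 1.6 kbar
```
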

  13. Field Comparison of the Sampling Efficacy of Two Smear Media: Cotton Fiber and Kraft Paper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogue, M.G.

    Two materials were compared in field tests at the Defense Waste Processing Facility: kraft paper (a strong, brown paper made from wood pulp prepared with a sodium sulfate solution) and cotton fiber. Based on a sampling of forty-six pairs of smears, the cotton fiber smears provide a greater sensitivity. The cotton fiber smears collected an average of forty-four percent more beta activity than the kraft paper smears and twenty-nine percent more alpha activity. Results show a greater sensitivity with cotton fiber over kraft paper at the 95 percent confidence level. Regulatory requirements for smear materials are vague. The data demonstrate that the difference in sensitivity of smear materials could lead to a large difference in reported results that are subsequently used for meeting shipping regulations or evaluating workplace contamination levels.

  14. Dark-field hyperspectral X-ray imaging

    PubMed Central

    Egan, Christopher K.; Jacques, Simon D. M.; Connolley, Thomas; Wilson, Matthew D.; Veale, Matthew C.; Seller, Paul; Cernik, Robert J.

    2014-01-01

    In recent times, there has been a drive to develop non-destructive X-ray imaging techniques that provide chemical or physical insight. To date, these methods have generally been limited, either requiring raster scanning of pencil beams, using narrow-bandwidth radiation, and/or being limited to small samples. We have developed a novel full-field radiographic imaging technique that enables the entire physico-chemical state of an object to be imaged in a single snapshot. The method is sensitive to emitted and scattered radiation, using a spectral imaging detector and polychromatic hard X-radiation, making it particularly useful for studying large dense samples for materials science and engineering applications. The method and its extension to three-dimensional imaging are validated with a series of test objects and demonstrated by directly imaging the crystallographic preferred orientation and formed precipitates across an aluminium alloy friction stir weld section. PMID:24808753

  15. Replacement of Hydrochlorofluorocarbon (HCFC) -225 Solvent for Cleaning and Verification Sampling of NASA Propulsion Oxygen Systems Hardware, Ground Support Equipment, and Associated Test Systems

    NASA Technical Reports Server (NTRS)

    Burns, H. D.; Mitchell, M. A.; McMillian, J. H.; Farner, B. R.; Harper, S. A.; Peralta, S. F.; Lowrey, N. M.; Ross, H. R.; Juarez, A.

    2015-01-01

    Since the 1990's, NASA's rocket propulsion test facilities at Marshall Space Flight Center (MSFC) and Stennis Space Center (SSC) have used hydrochlorofluorocarbon-225 (HCFC-225), a Class II ozone-depleting substance, to safely clean and verify the cleanliness of large-scale propulsion oxygen systems and associated test facilities. In 2012 through 2014, test laboratories at MSFC, SSC, and Johnson Space Center-White Sands Test Facility collaborated to seek out, test, and qualify an environmentally preferred replacement for HCFC-225. Candidate solvents were selected, a test plan was developed, and the products were tested for materials compatibility, oxygen compatibility, cleaning effectiveness, and suitability for use in cleanliness verification and field cleaning operations. Honeywell Solstice (trademark) Performance Fluid (trans-1-chloro-3,3,3-trifluoropropene) was selected to replace HCFC-225 at NASA's MSFC and SSC rocket propulsion test facilities.

  16. Recovery of diverse microbes in high turbidity surface water samples using dead-end ultrafiltration

    PubMed Central

    Mull, Bonnie; Hill, Vincent R.

    2015-01-01

    Dead-end ultrafiltration (DEUF) has been reported to be a simple, field-deployable technique for recovering bacteria, viruses, and parasites from large-volume water samples for water quality testing and waterborne disease investigations. While DEUF has been reported for application to water samples having relatively low turbidity, little information is available regarding recovery efficiencies for this technique when applied to sampling turbid water samples such as those commonly found in lakes and rivers. This study evaluated the effectiveness of a DEUF technique for recovering MS2 bacteriophage, enterococci, Escherichia coli, Clostridium perfringens, and Cryptosporidium parvum oocysts in surface water samples having elevated turbidity. Average recovery efficiencies for each study microbe across all turbidity ranges were: MS2 (66%), C. parvum (49%), enterococci (85%), E. coli (81%), and C. perfringens (63%). The recovery efficiencies for MS2 and C. perfringens exhibited an inversely proportional relationship with turbidity; however, no significant differences in recovery were observed for C. parvum, enterococci, or E. coli. Although ultrafilter clogging was observed, the DEUF method was able to process 100-L surface water samples at each turbidity level within 60 min. This study supports the use of the DEUF method for recovering a wide array of microbes in large-volume surface water samples having medium to high turbidity. PMID:23064261
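The recovery efficiencies quoted above are the percentage of seeded organisms recovered after ultrafiltration; averaged across replicate seeded samples this is just a mean of per-sample ratios (a generic sketch, not the study's analysis code):

```python
def mean_recovery_pct(recovered, seeded):
    # average per-sample recovery efficiency, in percent;
    # recovered/seeded are paired counts (or concentrations x volume)
    # from replicate seeded samples
    effs = [100.0 * r / s for r, s in zip(recovered, seeded)]
    return sum(effs) / len(effs)
```
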

  18. Guidelines for the processing and quality assurance of benthic invertebrate samples collected as part of the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Cuffney, T.F.; Gurtz, M.E.; Meador, M.R.

    1993-01-01

    Benthic invertebrate samples are collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program. This is a perennial, multidisciplinary program that integrates biological, physical, and chemical indicators of water quality to evaluate status and trends and to develop an understanding of the factors controlling observed water quality. The Program examines water quality in 60 study units (coupled ground- and surface-water systems) that encompass most of the conterminous United States and parts of Alaska and Hawaii. Study-unit teams collect and process qualitative and semi-quantitative invertebrate samples according to standardized procedures. These samples are processed (elutriated and subsampled) in the field to produce as many as four sample components: large-rare, main-body, elutriate, and split. Each sample component is preserved in 10-percent formalin, and two components, large-rare and main-body, are sent to contract laboratories for further processing. The large-rare component is composed of large invertebrates that are removed from the sample matrix during field processing and placed in one or more containers. The main-body sample component consists of the remaining sample materials (sediment, detritus, and invertebrates) and is subsampled in the field to achieve a volume of 750 milliliters or less. The remaining two sample components, elutriate and split, are used for quality-assurance and quality-control purposes. Contract laboratories are used to identify and quantify invertebrates from the large-rare and main-body sample components according to the procedures and guidelines specified within this document. These guidelines allow the use of subsampling techniques to reduce the volume of sample material processed and to facilitate identifications. These processing procedures and techniques may be modified if the modifications provide equal or greater levels of accuracy and precision. 
The intent of sample processing is to determine the quantity of each taxon present in the semi-quantitative samples or to list the taxa present in qualitative samples. The processing guidelines provide standardized laboratory forms, sample labels, detailed sample processing flow charts, standardized format for electronic data, quality-assurance procedures and checks, sample tracking standards, and target levels for taxonomic determinations. The contract laboratory (1) is responsible for identifications and quantifications, (2) constructs reference collections, (3) provides data in hard copy and electronic forms, (4) follows specified quality-assurance and quality-control procedures, and (5) returns all processed and unprocessed portions of the samples. The U.S. Geological Survey's Quality Management Group maintains a Biological Quality-Assurance Unit, located at the National Water-Quality Laboratory, Arvada, Colorado, to oversee the use of contract laboratories and ensure the quality of data obtained from these laboratories according to the guidelines established in this document. This unit establishes contract specifications, reviews contractor performance (timeliness, accuracy, and consistency), enters data into the National Water Information System-II data base, maintains in-house reference collections, deposits voucher specimens in outside museums, and interacts with taxonomic experts within and outside the U.S. Geological Survey. This unit also modifies the existing sample processing and quality-assurance guidelines, establishes criteria and testing procedures for qualifying potential contract laboratories, identifies qualified taxonomic experts, and establishes voucher collections.

  19. Rotating magnetic field experiments in a pure superconducting Pb sphere

    NASA Astrophysics Data System (ADS)

    Vélez, Saül; García-Santiago, Antoni; Hernandez, Joan Manel; Tejada, Javier

    2009-10-01

The magnetic properties of a sphere of pure type-I superconducting lead (Pb) under rotating magnetic fields have been investigated under different experimental conditions by measuring the voltage generated in a set of detection coils by the response of the sample to the time variation of the magnetic field. The influence of the frequency of rotation of the magnet, the time taken to record each data point, and the temperature of the sample during the measuring process is explored. A strong reduction in the thermodynamic critical field and the onset of hysteretic effects in the magnetic field dependence of the amplitude of the magnetic susceptibility are observed for large frequencies and large values of the recording time. Heating of the sample during the motion of normal zones in the intermediate state, and the dominance of a resistive term in the contribution of Lenz's law to the magnetic susceptibility in the normal state under time-varying magnetic fields, are suggested as possible explanations for these effects.

  20. Testing paleointensity determinations on recent lava flows and scorias from Miyakejima, Japan

    NASA Astrophysics Data System (ADS)

    Fukuma, K.

    2013-12-01

No consensus has yet been reached on paleointensity methods. Even the classical Thellier method has not been fully tested on recent lava flows of known geomagnetic field intensity using a systematic sampling scheme. In this study, the Thellier method was applied to the 1983, 1962, and 1940 basaltic lava flows and scorias from Miyakejima, Japan. Several vertical lava sections and quenched scorias, which are quite variable in magnetic mineralogy and grain size, provide an unparalleled opportunity to test paleointensity methods. Thellier experiments were conducted on 'tspin', a fully automated three-component spinner magnetometer with a thermal demagnetizer. Specimens were heated in air, the applied laboratory field was 45 microT, and pTRM checks were performed at every second heating step. Curie points and hysteresis properties were obtained on small fragments removed from the cylindrical specimens. For lava flows, sigmoidal curves were commonly observed on the Arai diagrams. In particular, the interior parts of lava flows always revealed sigmoidal patterns and sometimes produced erratically blurred behavior. The directions after zero-field heating were not necessarily stable over the course of the Thellier experiments. For the interior parts, it was very difficult to identify linear segments on the Arai diagrams corresponding to the geomagnetic field intensity at the eruption. Upper and lower clinker samples also generally revealed sigmoidal or upward-concave curves on Arai diagrams. Neither the lower- nor the higher-temperature portions of the sigmoids or concave curves gave the expected geomagnetic field intensities. However, there were two exceptional cases of lava flows giving correct field intensities: upper clinkers with relatively low unblocking temperatures (< 400 deg.C) and lower clinkers with broad unblocking-temperature ranges from room temperature to 600 deg.C. The most promising paleointensity target among these volcanic rocks is scoria. 
Scoria samples always carry single Curie temperatures higher than 500 deg.C, and ratios of saturation remanence to saturation magnetization (Mr/Ms) of about 0.5, indicative of truly single-domain low-titanium titanomagnetite. Unambiguous straight lines covering broad temperature ranges, like those of the lower clinker samples, were always observed on the Arai diagrams, and their gradients gave the expected field values to within a few percent. Thellier experiments applied to the recent lava flows did not successfully recover the expected field intensity from most samples: either no linear segment was recognized, or incorrect paleointensity values were obtained from short segments spanning limited temperature ranges. In Thellier and other types of paleointensity experiments, laboratory alteration is checked in detail, but once a sample passes the alteration check, the TRM/NRM ratio of any limited temperature or field range is accepted as reflecting paleointensity. Previously published paleointensity data from lava flows are therefore likely to include many such dubious values. In general, lava flows are not suitable for paleointensity determinations in light of their large grain sizes and mixed magnetic mineralogy, with the exception of scoria and clinker.
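For an ideal linear Arai segment, the paleointensity recovery described above reduces to multiplying the segment's slope by the known laboratory field. A minimal sketch with invented, perfectly linear NRM/pTRM data (not the Miyakejima measurements):

```python
import numpy as np

# Ideal, invented Thellier data (not the Miyakejima measurements): NRM
# remaining versus pTRM gained at successive temperature steps, normalized.
ptrm_gained = np.array([0.0, 0.1, 0.25, 0.45, 0.70, 1.0])
nrm_remaining = np.array([1.0, 0.9, 0.75, 0.55, 0.30, 0.0])
B_lab = 45.0  # laboratory field in microtesla, as in the experiments above

# On an Arai diagram the ancient field follows from the slope of the
# linear segment: B_anc = |slope| * B_lab.
slope, intercept = np.polyfit(ptrm_gained, nrm_remaining, 1)
B_anc = abs(slope) * B_lab
print(round(B_anc, 1))  # -> 45.0 for this perfectly linear segment
```

Real data require selecting the linear segment and applying alteration (pTRM) checks before trusting the slope, which is exactly where the sigmoidal curves reported above fail.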

  1. Internet cognitive testing of large samples needed in genetic research.

    PubMed

    Haworth, Claire M A; Harlaar, Nicole; Kovas, Yulia; Davis, Oliver S P; Oliver, Bonamy R; Hayiou-Thomas, Marianna E; Frances, Jane; Busfield, Patricia; McMillan, Andrew; Dale, Philip S; Plomin, Robert

    2007-08-01

Quantitative and molecular genetic research requires large samples to provide adequate statistical power, but it is expensive to test large samples in person, especially when the participants are widely distributed geographically. Increasing access to inexpensive and fast Internet connections makes it possible to test large samples efficiently and economically online. Reliability and validity of Internet testing for cognitive ability have not been previously reported; these issues are especially pertinent for testing children. We developed Internet versions of reading, language, mathematics and general cognitive ability tests and investigated their reliability and validity for 10- and 12-year-old children. We tested online more than 2500 pairs of 10-year-old twins and compared their scores to similar Internet-based measures administered online to a subsample of the children when they were 12 years old (> 759 pairs). Within 3 months of the online testing at 12 years, we administered standard paper-and-pencil versions of the reading and mathematics tests in person to 30 children (15 pairs of twins). Scores on Internet-based measures at 10 and 12 years correlated .63 on average across the two years, suggesting substantial stability and high reliability. Correlations of about .80 between Internet measures and in-person testing suggest excellent validity. In addition, the comparison of the Internet-based measures to ratings from teachers based on criteria from the UK National Curriculum suggests good concurrent validity for these tests. We conclude that Internet testing can be reliable and valid for collecting cognitive test data on large samples, even for children as young as 10 years.
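The stability and validity figures in this record are Pearson correlations between repeated measurements. A minimal sketch with simulated scores (the data and noise level are invented, not the study's twin data):

```python
import numpy as np

# Simulated, invented scores for the same children tested on two occasions;
# a shared latent ability plus independent measurement noise on each test.
rng = np.random.default_rng(0)
n = 500
ability = rng.normal(size=n)                        # latent ability
web_age10 = ability + rng.normal(scale=0.7, size=n)  # online test at age 10
web_age12 = ability + rng.normal(scale=0.7, size=n)  # online test at age 12

# Test-retest stability across the two occasions (the study reports ~.63):
r_stability = np.corrcoef(web_age10, web_age12)[0, 1]
print(round(r_stability, 2))
```

With this noise level the expected correlation is about 1 / (1 + 0.7^2) ≈ .67, illustrating how a test-retest correlation near .63 is consistent with a reliable measure plus moderate occasion-specific noise.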

  2. The PneuCarriage Project: A Multi-Centre Comparative Study to Identify the Best Serotyping Methods for Examining Pneumococcal Carriage in Vaccine Evaluation Studies.

    PubMed

    Satzke, Catherine; Dunne, Eileen M; Porter, Barbara D; Klugman, Keith P; Mulholland, E Kim

    2015-11-01

    The pneumococcus is a diverse pathogen whose primary niche is the nasopharynx. Over 90 different serotypes exist, and nasopharyngeal carriage of multiple serotypes is common. Understanding pneumococcal carriage is essential for evaluating the impact of pneumococcal vaccines. Traditional serotyping methods are cumbersome and insufficient for detecting multiple serotype carriage, and there are few data comparing the new methods that have been developed over the past decade. We established the PneuCarriage project, a large, international multi-centre study dedicated to the identification of the best pneumococcal serotyping methods for carriage studies. Reference sample sets were distributed to 15 research groups for blinded testing. Twenty pneumococcal serotyping methods were used to test 81 laboratory-prepared (spiked) samples. The five top-performing methods were used to test 260 nasopharyngeal (field) samples collected from children in six high-burden countries. Sensitivity and positive predictive value (PPV) were determined for the test methods and the reference method (traditional serotyping of >100 colonies from each sample). For the alternate serotyping methods, the overall sensitivity ranged from 1% to 99% (reference method 98%), and PPV from 8% to 100% (reference method 100%), when testing the spiked samples. Fifteen methods had ≥70% sensitivity to detect the dominant (major) serotype, whilst only eight methods had ≥70% sensitivity to detect minor serotypes. For the field samples, the overall sensitivity ranged from 74.2% to 95.8% (reference method 93.8%), and PPV from 82.2% to 96.4% (reference method 99.6%). The microarray had the highest sensitivity (95.8%) and high PPV (93.7%). The major limitation of this study is that not all of the available alternative serotyping methods were included. Most methods were able to detect the dominant serotype in a sample, but many performed poorly in detecting the minor serotype populations. 
Microarray with a culture amplification step was the top-performing method. Results from this comprehensive evaluation will inform future vaccine evaluation and impact studies, particularly in low-income settings, where pneumococcal disease burden remains high.
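The two metrics compared across methods can be sketched per sample by treating serotype calls as sets, with sensitivity = TP/(TP+FN) and PPV = TP/(TP+FP) against the reference serotyping. The serotype names below are illustrative, not results from the study:

```python
def sensitivity_ppv(reference: set, detected: set):
    """Per-sample serotype detection metrics against the reference method."""
    tp = len(reference & detected)   # serotypes correctly detected
    fn = len(reference - detected)   # carried serotypes that were missed
    fp = len(detected - reference)   # reported serotypes not actually carried
    sens = tp / (tp + fn) if (tp + fn) else 0.0
    ppv = tp / (tp + fp) if (tp + fp) else 0.0
    return sens, ppv

# Hypothetical sample carrying a major serotype 19F and a minor serotype 6B,
# tested by a method that reports 19F and (wrongly) 23F:
print(sensitivity_ppv({"19F", "6B"}, {"19F", "23F"}))  # -> (0.5, 0.5)
```

Missing the minor serotype 6B is exactly the failure mode the study highlights: many methods detect the dominant serotype but lose sensitivity on minor populations.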

  3. The PneuCarriage Project: A Multi-Centre Comparative Study to Identify the Best Serotyping Methods for Examining Pneumococcal Carriage in Vaccine Evaluation Studies

    PubMed Central

    Satzke, Catherine; Dunne, Eileen M.; Porter, Barbara D.; Klugman, Keith P.; Mulholland, E. Kim

    2015-01-01

    Background The pneumococcus is a diverse pathogen whose primary niche is the nasopharynx. Over 90 different serotypes exist, and nasopharyngeal carriage of multiple serotypes is common. Understanding pneumococcal carriage is essential for evaluating the impact of pneumococcal vaccines. Traditional serotyping methods are cumbersome and insufficient for detecting multiple serotype carriage, and there are few data comparing the new methods that have been developed over the past decade. We established the PneuCarriage project, a large, international multi-centre study dedicated to the identification of the best pneumococcal serotyping methods for carriage studies. Methods and Findings Reference sample sets were distributed to 15 research groups for blinded testing. Twenty pneumococcal serotyping methods were used to test 81 laboratory-prepared (spiked) samples. The five top-performing methods were used to test 260 nasopharyngeal (field) samples collected from children in six high-burden countries. Sensitivity and positive predictive value (PPV) were determined for the test methods and the reference method (traditional serotyping of >100 colonies from each sample). For the alternate serotyping methods, the overall sensitivity ranged from 1% to 99% (reference method 98%), and PPV from 8% to 100% (reference method 100%), when testing the spiked samples. Fifteen methods had ≥70% sensitivity to detect the dominant (major) serotype, whilst only eight methods had ≥70% sensitivity to detect minor serotypes. For the field samples, the overall sensitivity ranged from 74.2% to 95.8% (reference method 93.8%), and PPV from 82.2% to 96.4% (reference method 99.6%). The microarray had the highest sensitivity (95.8%) and high PPV (93.7%). The major limitation of this study is that not all of the available alternative serotyping methods were included. 
Conclusions Most methods were able to detect the dominant serotype in a sample, but many performed poorly in detecting the minor serotype populations. Microarray with a culture amplification step was the top-performing method. Results from this comprehensive evaluation will inform future vaccine evaluation and impact studies, particularly in low-income settings, where pneumococcal disease burden remains high. PMID:26575033

  4. Genus-Specific Primers for Study of Fusarium Communities in Field Samples

    PubMed Central

    Edel-Hermann, Véronique; Gautheron, Nadine; Durling, Mikael Brandström; Kolseth, Anna-Karin; Steinberg, Christian; Persson, Paula; Friberg, Hanna

    2015-01-01

    Fusarium is a large and diverse genus of fungi of great agricultural and economic importance, containing many plant pathogens and mycotoxin producers. To date, high-throughput sequencing of Fusarium communities has been limited by the lack of genus-specific primers targeting regions with high discriminatory power at the species level. In the present study, we evaluated two Fusarium-specific primer pairs targeting translation elongation factor 1 (TEF1). We also present the new primer pair Fa+7/Ra+6. Mock Fusarium communities reflecting phylogenetic diversity were used to evaluate the accuracy of the primers in reflecting the relative abundance of the species. TEF1 amplicons were subjected to 454 high-throughput sequencing to characterize Fusarium communities. Field samples from soil and wheat kernels were included to test the method on more-complex material. For kernel samples, a single PCR was sufficient, while for soil samples, nested PCR was necessary. The newly developed primer pairs Fa+7/Ra+6 and Fa/Ra accurately reflected Fusarium species composition in mock DNA communities. In field samples, 47 Fusarium operational taxonomic units were identified, with the highest Fusarium diversity in soil. The Fusarium community in soil was dominated by members of the Fusarium incarnatum-Fusarium equiseti species complex, contradicting findings in previous studies. The method was successfully applied to analyze Fusarium communities in soil and plant material and can facilitate further studies of Fusarium ecology. PMID:26519387

  5. Improvements in Technique of NMR Imaging and NMR Diffusion Measurements in the Presence of Background Gradients.

    NASA Astrophysics Data System (ADS)

    Lian, Jianyu

In this work, a modification of the cosine-current-distribution rf coil, PCOS, has been introduced and tested. The coil produces a very homogeneous rf magnetic field, is inexpensive to build, and is easy to tune for multiple resonance frequencies. The geometrical parameters of the coil are optimized to produce the most homogeneous rf field over a large volume. To avoid rf field distortion when the coil length is comparable to a quarter wavelength, a parallel PCOS coil is proposed and discussed. For testing rf coils and correcting B_1 in NMR experiments, a simple, rugged and accurate NMR rf field mapping technique has been developed. The method has been tested and used in 1D, 2D, 3D and in vivo rf mapping experiments, and has proven very useful in the design of rf coils. To preserve the linear relation between the rf output applied to an rf coil and the modulating input for the rf modulating-amplifying system of an NMR imaging spectrometer, a quadrature feedback loop is employed in an rf modulator with two orthogonal rf channels to correct the amplitude and phase non-linearities caused by the rf components in the rf system. The modulator is very linear over a large range and can generate an arbitrary rf shape. A diffusion imaging sequence has been developed for measuring and imaging diffusion in the presence of background gradients. Cross terms between the diffusion-sensitizing gradients and background or imaging gradients can complicate diffusion measurement and make the interpretation of NMR diffusion data ambiguous, but these have been eliminated in this method. Further, the background gradients have been measured and imaged. A dipole random-distribution model has been established to study the background magnetic fields Delta B and background magnetic gradients G_0 produced by small particles in a sample when it is placed in a B_0 field. 
From this model, the minimum distance to which a spin can approach a particle can be determined by measuring the mean-square values <Delta B^2> and <G_0^2>. The model also allows the particle concentration in a sample to be determined by measuring the lineshape of a free induction decay (FID).

  6. Exploring the Energy Landscapes of Protein Folding Simulations with Bayesian Computation

    PubMed Central

    Burkoff, Nikolas S.; Várnai, Csilla; Wells, Stephen A.; Wild, David L.

    2012-01-01

    Nested sampling is a Bayesian sampling technique developed to explore probability distributions localized in an exponentially small area of the parameter space. The algorithm provides both posterior samples and an estimate of the evidence (marginal likelihood) of the model. The nested sampling algorithm also provides an efficient way to calculate free energies and the expectation value of thermodynamic observables at any temperature, through a simple post processing of the output. Previous applications of the algorithm have yielded large efficiency gains over other sampling techniques, including parallel tempering. In this article, we describe a parallel implementation of the nested sampling algorithm and its application to the problem of protein folding in a Gō-like force field of empirical potentials that were designed to stabilize secondary structure elements in room-temperature simulations. We demonstrate the method by conducting folding simulations on a number of small proteins that are commonly used for testing protein-folding procedures. A topological analysis of the posterior samples is performed to produce energy landscape charts, which give a high-level description of the potential energy surface for the protein folding simulations. These charts provide qualitative insights into both the folding process and the nature of the model and force field used. PMID:22385859

  7. Exploring the energy landscapes of protein folding simulations with Bayesian computation.

    PubMed

    Burkoff, Nikolas S; Várnai, Csilla; Wells, Stephen A; Wild, David L

    2012-02-22

    Nested sampling is a Bayesian sampling technique developed to explore probability distributions localized in an exponentially small area of the parameter space. The algorithm provides both posterior samples and an estimate of the evidence (marginal likelihood) of the model. The nested sampling algorithm also provides an efficient way to calculate free energies and the expectation value of thermodynamic observables at any temperature, through a simple post processing of the output. Previous applications of the algorithm have yielded large efficiency gains over other sampling techniques, including parallel tempering. In this article, we describe a parallel implementation of the nested sampling algorithm and its application to the problem of protein folding in a Gō-like force field of empirical potentials that were designed to stabilize secondary structure elements in room-temperature simulations. We demonstrate the method by conducting folding simulations on a number of small proteins that are commonly used for testing protein-folding procedures. A topological analysis of the posterior samples is performed to produce energy landscape charts, which give a high-level description of the potential energy surface for the protein folding simulations. These charts provide qualitative insights into both the folding process and the nature of the model and force field used. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.
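The core loop of nested sampling described above can be illustrated on a one-dimensional toy problem (this is a minimal sketch, not the article's parallel protein-folding implementation): the worst live point is repeatedly replaced under a hardening likelihood constraint while the prior volume shrinks geometrically and the evidence accumulates.

```python
import math
import random

random.seed(1)

# Toy problem: uniform prior on [0, 1], Gaussian likelihood centred at 0.5
# with sigma = 0.1, so the true evidence is Z = 0.1*sqrt(2*pi) ≈ 0.2507.
def likelihood(x):
    return math.exp(-0.5 * ((x - 0.5) / 0.1) ** 2)

n_live, n_iter = 100, 700
live = [random.random() for _ in range(n_live)]

Z = 0.0
for i in range(1, n_iter + 1):
    worst = min(live, key=likelihood)
    L_i = likelihood(worst)
    # The prior volume shrinks deterministically, X_i ≈ exp(-i / n_live);
    # the dead point contributes L_i * (X_{i-1} - X_i) to the evidence.
    Z += L_i * (math.exp(-(i - 1) / n_live) - math.exp(-i / n_live))
    # Replace the dead point with a fresh prior draw subject to L > L_i
    # (plain rejection sampling here; real codes use smarter constrained moves).
    live.remove(worst)
    while True:
        x = random.random()
        if likelihood(x) > L_i:
            live.append(x)
            break

# Add the evidence still held by the surviving live points.
Z += math.exp(-n_iter / n_live) * sum(likelihood(x) for x in live) / n_live
print(round(Z, 2))  # close to the true value 0.2507
```

The dead points, weighted by L_i * (X_{i-1} - X_i), are also the posterior samples the abstract refers to, and re-weighting them with exp(-E/kT) at any temperature is the "simple post processing" that yields free energies and thermodynamic expectations.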

  8. Wave-current induced erosion of cohesive riverbanks in northern Manitoba, Canada

    NASA Astrophysics Data System (ADS)

    Kimiaghalam, N.; Clark, S.; Ahmari, H.; Hunt, J.

    2015-03-01

Cohesive soil erosion is still not fully understood, in large part because of the many soil parameters that affect cohesive soil erodibility. This study focuses on two channels in northern Manitoba, Canada: the 2-Mile Channel, built to connect Lake Winnipeg with Playgreen Lake, and the 8-Mile Channel, which connects Playgreen Lake with Kiskikittogisu Lake. The banks of the channels consist of clay-rich soils and alluvial deposits of layered clays, silts and sands. The study of erosion at the sites is further complicated because flow-induced erosion is combined with the effects of significant wave action due to the large fetch length on the adjacent lakes, particularly Lake Winnipeg, the seventh-largest lake in North America. The study included three main components: field measurements, laboratory experiments and numerical modelling. Field measurements consisted of soil sampling from the banks and bed of the channels, current measurements and water sampling. Grab soil samples were used to measure the essential physical and electrochemical properties of the riverbanks, and standard ASTM Shelby tube samples were used to estimate the critical shear stress and erodibility of the soil samples using an erosion measurement device (EMD). Water samples were taken to estimate the sediment concentration profile and to monitor changes in sediment concentration along the channels over time. An Acoustic Doppler Current Profiler (ADCP) was used to collect bathymetry and current data, and two water level gauges were installed to record water levels at the entrance and outlet of the channels. The MIKE 21 NSW model was used to simulate waves using historical winds and the measured bathymetry of the channels and lakes. Finally, results from the wave numerical model, laboratory tests and current measurements were used to estimate the effect of each component on the erodibility of the cohesive banks.
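Critical shear stress and erodibility measured with an EMD are commonly combined in the linear excess-shear-stress model E = k_d (tau - tau_c); whether this study used exactly that form is not stated, so the sketch below, with invented parameter values, is only illustrative:

```python
def erosion_rate(tau, tau_c, k_d):
    """Excess-shear-stress model: E = k_d * (tau - tau_c) when the applied
    bed shear stress tau exceeds the critical shear stress tau_c, else 0.
    Units are illustrative: tau in Pa, k_d in cm^3/(N s), E in cm/s."""
    return k_d * max(tau - tau_c, 0.0)

# Invented example values: tau_c = 2.5 Pa, k_d = 0.1
print(erosion_rate(4.5, 2.5, 0.1))  # -> 0.2 (above threshold: erosion occurs)
print(erosion_rate(1.0, 2.5, 0.1))  # -> 0.0 (below threshold: no erosion)
```

In a study like this, tau would come from the combined wave and current stresses (e.g. the MIKE 21 and ADCP results), while tau_c and k_d are the soil properties fitted from the EMD tests.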

  9. Influence of the magnetic field profile on ITER conductor testing

    NASA Astrophysics Data System (ADS)

    Nijhuis, A.; Ilyin, Y.; ten Kate, H. H. J.

    2006-08-01

    We performed simulations with the numerical CUDI-CICC code on a typical short ITER (International Thermonuclear Experimental Reactor) conductor test sample of dual leg configuration, as usually tested in the SULTAN test facility, and made a comparison with the new EFDA-Dipole test facility offering a larger applied DC field region. The new EFDA-Dipole test facility, designed for short sample testing of conductors for ITER, has a homogeneous high field region of 1.2 m, while in the SULTAN facility this region is three times shorter. The inevitable non-uniformity of the current distribution in the cable, introduced by the joints at both ends, has a degrading effect on voltage-current (VI) and voltage-temperature (VT) characteristics, particularly for these short samples. This can easily result in an underestimation or overestimation of the actual conductor performance. A longer applied DC high field region along a conductor suppresses the current non-uniformity by increasing the overall longitudinal cable electric field when reaching the current sharing mode. The numerical interpretation study presented here gives a quantitative analysis for a relevant practical case of a test of a short sample poloidal field coil insert (PFCI) conductor in SULTAN. The simulation includes the results of current distribution analysis from self-field measurements with Hall sensor arrays, current sharing measurements and inter-petal resistance measurements. The outcome of the simulations confirms that the current uniformity improves with a longer high field region but the 'measured' VI transition is barely affected, though the local peak voltages become somewhat suppressed. It appears that the location of the high field region and voltage taps has practically no influence on the VI curve as long as the transverse voltage components are adequately cancelled. 
In particular, for a thin conduit wall, the voltage taps should be connected to the conduit in the form of an (open) azimuthally soldered wire, averaging the transverse conduit surface potentials initiated in the joints.

  10. The wireless networking system of Earthquake precursor mobile field observation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Teng, Y.; Wang, X.; Fan, X.; Wang, X.

    2012-12-01

The mobile field observation network can record and transmit large amounts of data reliably and in real time, strengthening physical-signal observations in specific regions and periods and thereby improving monitoring capacity and the ability to track anomalies. Because current earthquake-precursor observation points are numerous and widely scattered, the networking technology is based on the McWILL broadband wireless access system. The communication system for earthquake-precursor mobile field observation transmits large amounts of data reliably and in real time from the measuring points to the monitoring center, through the connections between the instruments, the wireless access system, the broadband wireless access network, and the precursor mobile observation management center system, thereby implementing remote instrument monitoring and data transmission. At present, this wireless networking technology has been applied to fluxgate magnetometer array geomagnetic observations at Tianzhu, Xichang, and in Xinjiang, providing real-time monitoring of the working status of instruments deployed over a large area during the last two to three years of large-scale field operation. It can therefore provide geomagnetic field data for locally refined regions and supply high-quality observational data for tracking and forecasting impending earthquakes. Although wireless networking technology is well suited to mobile field observation, since networking is simple and flexible, packet loss can occur when transmitting large amounts of observational data over relatively weak, narrow-bandwidth wireless links. 
For high-sampling-rate instruments, this project uses data compression, which effectively solves the problem of packet loss during data transmission; control commands, status data, and observational data are transmitted with different priorities and mechanisms, which keeps the packet loss rate within an acceptable range and does not affect the real-time observation curve. After field running tests and application to earthquake-tracking projects, the field mobile observation wireless networking system operates normally, all functions are easy to operate and perform well, and the quality of data transmission meets the system design requirements and plays a significant role in practical applications.
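The priority scheme described, control commands ahead of status data ahead of bulk observational data, can be sketched with a simple priority queue; the message classes and payloads below are invented for illustration:

```python
import heapq

# Illustrative sketch (not the project's code) of priority-based transmission:
# control commands go out before status data, which go out before bulk
# observational data, so command traffic is never starved by large transfers.
PRIORITY = {"control": 0, "status": 1, "observation": 2}

queue = []
seq = 0  # tie-breaker keeps FIFO order within a priority class
for kind, payload in [("observation", "waveform-1"), ("control", "reboot"),
                      ("status", "battery=ok"), ("observation", "waveform-2")]:
    heapq.heappush(queue, (PRIORITY[kind], seq, kind, payload))
    seq += 1

order = [heapq.heappop(queue)[2] for _ in range(len(queue))]
print(order)  # -> ['control', 'status', 'observation', 'observation']
```

Draining the queue in heap order shows the control command jumping ahead of the bulk waveform data that was enqueued first.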

  11. Development of a field testing protocol for identifying Deepwater Horizon oil spill residues trapped near Gulf of Mexico beaches.

    PubMed

    Han, Yuling; Clement, T Prabhakar

    2018-01-01

The Deepwater Horizon (DWH) accident, one of the largest oil spills in U.S. history, contaminated several beaches located along the Gulf of Mexico (GOM) shoreline. Residues from the spill continue to be deposited on some of these beaches. Methods to track and monitor the fate of these residues require approaches that can differentiate the DWH residues from other types of petroleum residues. This is because, historically, crude oil released from sources such as natural seeps and anthropogenic discharges has also deposited other types of petroleum residues on GOM beaches. Therefore, identifying the origin of these residues is critical for developing effective management strategies for monitoring the long-term environmental impacts of the DWH oil spill. Advanced fingerprinting methods that are currently used for identifying the source of oil spill residues require detailed laboratory studies, which can be cost-prohibitive. Also, most agencies typically use untrained workers or volunteers to conduct shoreline monitoring surveys, and these workers will not have access to advanced laboratory facilities. Furthermore, it is impractical to routinely fingerprint the large volumes of samples that are collected after a major oil spill event, such as the DWH spill. In this study, we propose a simple field testing protocol that can identify DWH oil spill residues based on their unique physical characteristics. The robustness of the method is demonstrated by testing a variety of oil spill samples, and the results are verified by characterizing the samples using advanced chemical fingerprinting methods. The verification data show that the method yields results that are consistent with those derived from advanced fingerprinting methods. The proposed protocol is a reliable, cost-effective, practical field approach for differentiating DWH residues from other types of petroleum residues.

  12. Testing core creation in hydrodynamical simulations using the HI kinematics of field dwarfs

    NASA Astrophysics Data System (ADS)

    Papastergis, E.; Ponomareva, A. A.

    2017-05-01

    The majority of recent hydrodynamical simulations indicate the creation of central cores in the mass profiles of low-mass halos, a process that is attributed to star formation-related baryonic feedback. Core creation is regarded as one of the most promising solutions to potential issues faced by lambda cold dark matter (ΛCDM) cosmology on small scales. For example, the reduced dynamical mass enclosed by cores can explain the low rotational velocities measured for nearby dwarf galaxies, thus possibly lifting the seeming contradiction with the ΛCDM expectations (the so-called "too big to fail" problem). Here we test core creation as a solution of cosmological issues by using a sample of dwarfs with measurements of their atomic hydrogen (HI) kinematics extending to large radii. Using the NIHAO hydrodynamical simulation as an example, we show that core creation can successfully reproduce the kinematics of dwarfs with small kinematic radii, R ≲ 1.5 kpc. However, the agreement with observations becomes poor once galaxies with kinematic measurements extending beyond the core region, R ≈ 1.5-4 kpc, are considered. This result illustrates the importance of testing the predictions of hydrodynamical simulations that are relevant for cosmology against a broad range of observational samples. We would like to stress that our result is valid only under the following set of assumptions: I) that our sample of dwarfs with HI kinematics is representative of the overall population of field dwarfs; II) that there are no severe measurement biases in the observational parameters of our HI dwarfs (e.g., related to inclination estimates); and III) that the HI velocity fields of dwarfs are regular enough to allow the recovery of the true enclosed dynamical mass.
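The "reduced dynamical mass enclosed by cores" invoked above follows from the standard spherical estimate M(<R) = V^2 R / G applied to a rotation measurement at radius R; the sketch below uses illustrative numbers, not the paper's sample:

```python
# Enclosed dynamical mass implied by a rotation velocity V measured at
# radius R, in the spherical approximation: M(<R) = V^2 * R / G.
G = 4.301e-6  # gravitational constant in kpc * (km/s)^2 / Msun

def enclosed_mass(v_kms, r_kpc):
    """Dynamical mass (in solar masses) enclosed within r_kpc."""
    return v_kms ** 2 * r_kpc / G

# Illustrative dwarf: V = 30 km/s measured at R = 3 kpc
m = enclosed_mass(30.0, 3.0)
print(f"{m:.2e}")  # -> 6.28e+08 solar masses
```

A cored profile lowers M(<R) at small R relative to a cuspy one, which is why cores can reconcile the low measured velocities with ΛCDM halos, but only if the kinematic radius R actually falls inside the core region (R ≲ 1.5 kpc in the test above).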

  13. Development of a field testing protocol for identifying Deepwater Horizon oil spill residues trapped near Gulf of Mexico beaches

    PubMed Central

    Han, Yuling

    2018-01-01

    The Deepwater Horizon (DWH) accident, one of the largest oil spills in U.S. history, contaminated several beaches located along the Gulf of Mexico (GOM) shoreline. Residues from the spill continue to be deposited on some of these beaches. Methods to track and monitor the fate of these residues require approaches that can differentiate DWH residues from other types of petroleum residues. This is because, historically, crude oil released from sources such as natural seeps and anthropogenic discharges has also deposited other types of petroleum residues on GOM beaches. Therefore, identifying the origin of these residues is critical for developing effective management strategies for monitoring the long-term environmental impacts of the DWH oil spill. Advanced fingerprinting methods currently used for identifying the source of oil spill residues require detailed laboratory studies, which can be cost-prohibitive. Also, most agencies typically use untrained workers or volunteers to conduct shoreline monitoring surveys, and these workers will not have access to advanced laboratory facilities. Furthermore, it is impractical to routinely fingerprint the large volumes of samples collected after a major oil spill event such as the DWH spill. In this study, we propose a simple field testing protocol that can identify DWH oil spill residues based on their unique physical characteristics. The robustness of the method is demonstrated by testing a variety of oil spill samples, and the results are verified by characterizing the samples using advanced chemical fingerprinting methods. The verification data show that the method yields results consistent with those derived from advanced fingerprinting methods. The proposed protocol is a reliable, cost-effective, and practical field approach for differentiating DWH residues from other types of petroleum residues. PMID:29329313

  14. Pre-Mission Input Requirements to Enable Successful Sample Collection by A Remote Field/EVA Team

    NASA Technical Reports Server (NTRS)

    Cohen, B. A.; Lim, D. S. S.; Young, K. E.; Brunner, A.; Elphic, R. E.; Horne, A.; Kerrigan, M. C.; Osinski, G. R.; Skok, J. R.; Squyres, S. W.; hide

    2016-01-01

    The FINESSE (Field Investigations to Enable Solar System Science and Exploration) team, part of the Solar System Exploration Virtual Institute (SSERVI), is a field-based research program aimed at generating strategic knowledge in preparation for human and robotic exploration of the Moon, near-Earth asteroids, Phobos and Deimos, and beyond. In contrast to other technology-driven NASA analog studies, the FINESSE WCIS activity is science-focused and, moreover, is sampling-focused, with the explicit intent to return the best samples for geochronology studies in the laboratory. We used the FINESSE field excursion to the West Clearwater Lake Impact structure (WCIS) as an opportunity to test factors related to sampling decisions. We examined the in situ sample characterization and real-time decision-making process of the astronauts, with a guiding hypothesis that pre-mission training that included detailed background information on the analytical fate of a sample would better enable future astronauts to select samples that best meet science requirements. We conducted three tests of this hypothesis over several days in the field. Our investigation was designed to document processes, tools, and procedures for crew sampling of planetary targets. This was not meant to be a blind, controlled test of crew efficacy, but rather an effort to explicitly recognize the relevant variables that enter into sampling protocol and to develop recommendations for crew and backroom training in future endeavors.

  15. Methods for the field evaluation of quantitative G6PD diagnostics: a review.

    PubMed

    Ley, Benedikt; Bancone, Germana; von Seidlein, Lorenz; Thriemer, Kamala; Richards, Jack S; Domingo, Gonzalo J; Price, Ric N

    2017-09-11

    Individuals with glucose-6-phosphate dehydrogenase (G6PD) deficiency are at risk of severe haemolysis following the administration of 8-aminoquinoline compounds. Primaquine is the only widely available 8-aminoquinoline for the radical cure of Plasmodium vivax. Tafenoquine is under development with the potential to simplify treatment regimens, but point-of-care (PoC) tests will be needed to provide quantitative measurement of G6PD activity prior to its administration. There is currently a lack of appropriate G6PD PoC tests, but a number of new tests are in development and are likely to enter the market in the coming years. As these are implemented, they will need to be validated in field studies. This article outlines the technical details for the field evaluation of novel quantitative G6PD diagnostics, including sample handling, reference testing, and statistical analysis. Field evaluation is based on the comparison of paired samples, including one sample tested by the new assay at point of care and one sample tested by the gold-standard reference method, UV spectrophotometry, in an established laboratory. Samples can be collected as capillary or venous blood; the existing literature suggests that potential differences between capillary and venous blood are unlikely to affect results substantially. The collection and storage of samples is critical to ensure preservation of enzyme activity; it is recommended that samples are stored at 4 °C and tested within 4 days of collection. Test results can be visually presented as a scatter plot, a Bland-Altman plot, and a histogram of the G6PD activity distribution of the study population. Calculating the adjusted male median allows results to be categorized according to G6PD activity, standard performance indicators to be calculated, and receiver operating characteristic (ROC) analysis to be performed.
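    The adjusted-male-median (AMM) calculation mentioned above can be sketched directly; the 10% exclusion threshold and the 30%/70% category cutoffs below are common conventions in the G6PD literature, assumed here rather than quoted from this article:

```python
import statistics

def adjusted_male_median(male_activities):
    """Adjusted male median (AMM): median of male G6PD activities after
    excluding grossly deficient samples (<10% of the raw male median).
    The 10% exclusion threshold is a common convention, assumed here."""
    raw_median = statistics.median(male_activities)
    kept = [a for a in male_activities if a >= 0.1 * raw_median]
    return statistics.median(kept)

def categorize(activity, amm):
    """Categorize one sample relative to the AMM (30%/70% cutoffs assumed)."""
    if activity < 0.3 * amm:
        return "deficient"
    if activity < 0.7 * amm:
        return "intermediate"
    return "normal"

males = [8.1, 7.9, 0.4, 8.5, 7.6, 0.6, 8.0]   # U/g Hb, synthetic values
amm = adjusted_male_median(males)             # 8.0 for the synthetic data
```

With the AMM defined as 100% activity, each sample's percentage activity feeds directly into the performance-indicator and ROC calculations described above.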

  16. Recovering the full velocity and density fields from large-scale redshift-distance samples

    NASA Technical Reports Server (NTRS)

    Bertschinger, Edmund; Dekel, Avishai

    1989-01-01

    A new method for extracting the large-scale three-dimensional velocity and mass density fields from measurements of the radial peculiar velocities is presented. Galaxies are assumed to trace the velocity field rather than the mass. The key assumption made is that the Lagrangian velocity field has negligible vorticity, as might be expected from perturbations that grew by gravitational instability. By applying the method to cosmological N-body simulations, it is demonstrated that it accurately reconstructs the velocity field. This technique promises a direct determination of the mass density field and the initial conditions for the formation of large-scale structure from galaxy peculiar velocity surveys.
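    The zero-vorticity assumption is what makes the reconstruction possible: if v = ∇Φ, the radial component alone determines the potential by integration along each line of sight. A toy one-dimensional sketch of that step (our illustration, not the authors' full three-dimensional machinery):

```python
import numpy as np

# Toy potential-flow reconstruction: with zero vorticity, v = dPhi/dr along
# a line of sight, so the observable radial velocities recover Phi.
r = np.linspace(0.0, 10.0, 2001)
phi_true = np.sin(0.5 * r)            # assumed toy velocity potential
v_radial = 0.5 * np.cos(0.5 * r)      # its radial derivative, the "observable"

# Trapezoid-integrate v_r dr to reconstruct the potential
# (the zero point Phi(0) is fixed to phi_true[0]).
dr = r[1] - r[0]
phi_rec = phi_true[0] + np.concatenate(
    ([0.0], np.cumsum(0.5 * (v_radial[1:] + v_radial[:-1]) * dr)))

err = np.max(np.abs(phi_rec - phi_true))
```

In the real method the integration is done along many lines of sight and smoothed, and the transverse velocity components then follow from the gradient of the recovered potential.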

  17. Performance of internal covariance estimators for cosmic shear correlation functions

    DOE PAGES

    Friedrich, O.; Seitz, S.; Eifler, T. F.; ...

    2015-12-31

    Data re-sampling methods such as the delete-one jackknife are a common tool for estimating the covariance of large-scale structure probes. In this paper we investigate the concepts of internal covariance estimation in the context of cosmic shear two-point statistics. We demonstrate how to use log-normal simulations of the convergence field and the corresponding shear field to carry out realistic tests of internal covariance estimators and find that most estimators such as jackknife or sub-sample covariance can reach a satisfactory compromise between bias and variance of the estimated covariance. In a forecast for the complete, 5-year DES survey we show that internally estimated covariance matrices can provide a large fraction of the true uncertainties on cosmological parameters in a 2D cosmic shear analysis. The volume inside contours of constant likelihood in the Ω_m-σ_8 plane as measured with internally estimated covariance matrices is on average ≳85% of the volume derived from the true covariance matrix. The uncertainty on the parameter combination Σ_8 ∼ σ_8 Ω_m^0.5 derived from internally estimated covariances is ∼90% of the true uncertainty.
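    The delete-one jackknife mentioned above can be sketched in a few lines: measure the data vector in N sub-regions, form the N delete-one means, and scale their scatter by (N−1)/N. A minimal numpy sketch (our illustration, not the DES pipeline):

```python
import numpy as np

def jackknife_covariance(subsample_estimates):
    """Delete-one jackknife covariance of a mean data vector.

    subsample_estimates: (N, p) array, the data vector measured on each of
    N sub-regions.  Returns the (p, p) jackknife covariance estimate."""
    x = np.asarray(subsample_estimates, dtype=float)
    n = x.shape[0]
    total = x.sum(axis=0)
    # Delete-one resamples: mean of the data vector with region i removed.
    loo = (total - x) / (n - 1)
    d = loo - loo.mean(axis=0)
    # (n-1)/n prefactor compensates for the strong correlation of resamples.
    return (n - 1) / n * d.T @ d

rng = np.random.default_rng(0)
samples = rng.normal(size=(400, 2))   # 400 sub-regions, 2-component vector
cov = jackknife_covariance(samples)   # ~ diag(1/400) for iid unit-variance data
```

For independent sub-regions this reduces algebraically to the sample covariance of the mean; the interesting (and paper-relevant) regime is when the sub-regions are correlated, which is where bias appears.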

  18. Micro- and nano-tomography at the DIAMOND beamline I13L imaging and coherence

    NASA Astrophysics Data System (ADS)

    Rau, C.; Bodey, A.; Storm, M.; Cipiccia, S.; Marathe, S.; Zdora, M.-C.; Zanette, I.; Wagner, U.; Batey, D.; Shi, X.

    2017-10-01

    The Diamond Beamline I13L is dedicated to imaging on the micro- and nano-length scale, operating in the energy range between 6 and 30 keV. For this purpose two independently operating branchlines and endstations have been built. The imaging branch is fully operational for micro-tomography and in-line phase contrast imaging with micrometre resolution. Grating interferometry is currently being implemented, adding the capability of measuring phase and small-angle information. For tomography with increased resolution, a full-field microscope providing 50 nm spatial resolution with a field of view of 100 μm is being tested. The instrument provides a large working distance between optics and sample to accommodate a wide range of customised sample environments. On the coherence branch, coherent diffraction imaging techniques such as ptychography and coherent X-ray diffraction (CXRD) are currently being developed for three-dimensional imaging with the highest resolution. The imaging branch is operated in collaboration with Manchester University and is therefore called the Diamond-Manchester Branchline. The scientific applications cover a large area including bio-medicine, materials science, chemistry, geology, and more. The present paper provides an overview of the current status of the beamline and the science addressed.

  19. RECONSTRUCTING REDSHIFT DISTRIBUTIONS WITH CROSS-CORRELATIONS: TESTS AND AN OPTIMIZED RECIPE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Daniel J.; Newman, Jeffrey A., E-mail: djm70@pitt.ed, E-mail: janewman@pitt.ed

    2010-09-20

    Many of the cosmological tests to be performed by planned dark energy experiments will require extremely well-characterized photometric redshift measurements. Current estimates for cosmic shear are that the true mean redshift of the objects in each photo-z bin must be known to better than 0.002(1 + z), and the width of the bin must be known to ∼0.003(1 + z), if errors in cosmological measurements are not to be degraded significantly. A conventional approach is to calibrate these photometric redshifts with large sets of spectroscopic redshifts. However, at the depths probed by Stage III surveys (such as DES), let alone Stage IV (LSST, JDEM, and Euclid), existing large redshift samples have all been highly (25%-60%) incomplete, with a strong dependence of success rate on both redshift and galaxy properties. A powerful alternative approach is to exploit the clustering of galaxies to perform photometric redshift calibrations. Measuring the two-point angular cross-correlation between objects in some photometric redshift bin and objects with known spectroscopic redshift, as a function of the spectroscopic z, allows the true redshift distribution of a photometric sample to be reconstructed in detail, even if it includes objects too faint for spectroscopy or if spectroscopic samples are highly incomplete. We test this technique using mock DEEP2 Galaxy Redshift Survey light cones constructed from the Millennium Simulation semi-analytic galaxy catalogs. From this realistic test, which incorporates the effects of galaxy bias evolution and cosmic variance, we find that the true redshift distribution of a photometric sample can, in fact, be determined accurately with cross-correlation techniques. We also compare the empirical error in the reconstruction of redshift distributions to previous analytic predictions, finding that additional components must be included in error budgets to match the simulation results. This extra error contribution is small for surveys that sample large areas of sky (≳10°-100°), but dominant for ∼1 deg² fields. We conclude by presenting a step-by-step, optimized recipe for reconstructing redshift distributions from cross-correlation information using standard correlation measurements.

  20. Development and evaluation of a recombinant-glycoprotein-based latex agglutination test for rabies virus antibody assessment.

    PubMed

    Jemima, Ebenezer Angel; Manoharan, Seeralan; Kumanan, Kathaperumal

    2014-08-01

    The measurement of neutralizing antibodies induced by the glycoprotein of rabies virus is indispensable for assessing the level of neutralizing antibodies in animals or humans. The rapid fluorescent focus inhibition test (RFFIT) has been approved by WHO and is the most widely used method to measure the virus-neutralizing antibody content in serum, but a rapid test system would be of great value for screening large numbers of serum samples. To develop and evaluate a latex agglutination test (LAT) for measuring rabies virus antibodies, a recombinant glycoprotein was expressed in an insect cell system and purified, and the protein was coated onto latex beads at concentrations of 0.1, 0.25, 0.5, 0.75, and 1 mg/ml to determine the optimal concentration for coating. It was found that 0.5 mg/ml of recombinant protein was optimal for coating latex beads, and this concentration was used to sensitize the latex beads for screening of dog serum samples. Grading of LAT results was done with standard reference serum of known antibody titer. A total of 228 serum samples were tested, of which 145 were positive by both RFFIT and LAT, and the specificity was found to be 100%. In RFFIT, 151 samples were positive; the sensitivity was found to be 96.03%, and the accuracy/concordance was 97.39%. A rapid field test, the latex agglutination test (LAT), was thus developed and evaluated for rabies virus antibody assessment using recombinant glycoprotein of rabies virus expressed in an insect cell system.
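    The reported performance figures follow from a standard 2×2 contingency table with RFFIT as the reference test. TP and FN come directly from the abstract; the TN/FP split below is inferred from the totals and the 100% specificity, so it is our reconstruction rather than a quoted value:

```python
# Standard diagnostic-performance arithmetic behind the reported figures.
# TP/FN are quoted (151 RFFIT-positive, 145 of them also LAT-positive);
# TN = 77 and FP = 0 are inferred from the 228-sample total and the 100%
# specificity, i.e. our reconstruction, not values stated in the abstract.
TP, FN = 145, 151 - 145
TN, FP = 228 - 151, 0

sensitivity = 100.0 * TP / (TP + FN)              # vs. the RFFIT reference
specificity = 100.0 * TN / (TN + FP)
accuracy    = 100.0 * (TP + TN) / (TP + TN + FP + FN)
```

The computed concordance rounds to 97.37%, marginally below the quoted 97.39%, suggesting a small rounding difference in the original; sensitivity and specificity reproduce the quoted values exactly.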

  1. Electric Field Fluctuations in Water

    NASA Astrophysics Data System (ADS)

    Thorpe, Dayton; Limmer, David; Chandler, David

    2013-03-01

    Charge transfer in solution, such as autoionization and ion pair dissociation in water, is governed by rare electric field fluctuations of the solvent. Knowing the statistics of such fluctuations can help explain the dynamics of these rare events. Trajectories short enough to be tractable by computer simulation are virtually certain not to sample the large fluctuations that promote rare events. Here, we employ importance sampling techniques with classical molecular dynamics simulations of liquid water to study statistics of electric field fluctuations far from their means. We find that the distributions of electric fields located on individual water molecules are not in general Gaussian. Near the mean, this non-Gaussianity is due to the internal charge distribution of the water molecule. Further from the mean, however, there is a previously unreported Bjerrum-like defect that stabilizes certain large fluctuations out of equilibrium. As expected, differences in electric fields acting between molecules are Gaussian to a remarkable degree. By studying these differences, though, we are able to determine what configurations result not only in large electric fields, but also in electric fields with long spatial correlations that may be needed to promote charge separation.
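    Sampling fluctuations "far from their means" is exactly where importance sampling pays off. A scalar toy example (not the molecular-dynamics setup): estimating a far-tail Gaussian probability with a proposal shifted into the tail, reweighting each draw by the ratio of target to proposal densities.

```python
import math
import random

# Importance-sampling estimate of a far-tail probability P(X > c) for a
# standard normal -- an event that plain sampling would essentially never see.
c = 4.0
exact = 0.5 * math.erfc(c / math.sqrt(2.0))   # ~3.17e-5

random.seed(1)
n = 200_000
acc = 0.0
for _ in range(n):
    x = random.gauss(c, 1.0)                  # proposal shifted into the tail
    if x > c:
        # weight = target density / proposal density for a unit shift to c
        acc += math.exp(-0.5 * x * x + 0.5 * (x - c) ** 2)
estimate = acc / n
```

A direct Monte Carlo estimate with the same budget would typically see only a handful of events beyond c = 4; the shifted proposal makes roughly half the draws informative.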

  2. Field Tests of Optical Instruments

    DTIC Science & Technology

    1947-03-15

    Field Tests of Optical Instruments. Bureau of Ordnance, Washington, D.C. Results of a large-scale field test of optical instruments are described. The tests were instituted to check the correctness of theoretical considerations and of laboratory tests which have been used in the selection and design of such instruments. Field conditions approximated as far as possible those ...

  3. Lensfree fluorescent on-chip imaging of transgenic Caenorhabditis elegans over an ultra-wide field-of-view.

    PubMed

    Coskun, Ahmet F; Sencan, Ikbal; Su, Ting-Wei; Ozcan, Aydogan

    2011-01-06

    We demonstrate lensfree on-chip fluorescent imaging of transgenic Caenorhabditis elegans (C. elegans) over an ultra-wide field-of-view (FOV) of e.g., >2-8 cm² with a spatial resolution of ∼10 µm. This is the first time that a lensfree on-chip platform has successfully imaged fluorescent C. elegans samples. In our wide-field lensfree imaging platform, the transgenic samples are excited using a prism interface from the side, where the pump light is rejected through total internal reflection occurring at the bottom facet of the substrate. The emitted fluorescent signal from C. elegans samples is then recorded on a large area opto-electronic sensor-array over an FOV of e.g., >2-8 cm², without the use of any lenses, thin-film interference filters or mechanical scanners. Because fluorescent emission rapidly diverges, such lensfree fluorescent images recorded on a chip look blurred due to the broad point-spread function of our platform. To combat this resolution challenge, we use a compressive sampling algorithm to uniquely decode the recorded lensfree fluorescent patterns into higher resolution images, demonstrating ∼10 µm resolution. We tested the efficacy of this compressive decoding approach with different types of opto-electronic sensors to achieve a similar resolution level, independent of the imaging chip. We further demonstrate that this wide FOV lensfree fluorescent imaging platform can also perform sequential bright-field imaging of the same samples using partially-coherent lensfree digital in-line holography that is coupled from the top facet of the same prism used in fluorescent excitation. This unique combination permits ultra-wide field dual-mode imaging of C. elegans on a chip which could especially provide a useful tool for high-throughput screening applications in biomedical research.
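    The compressive decoding step can be illustrated with a generic sparse-recovery toy: fewer blurred measurements than unknowns, solved by iterative soft-thresholding (ISTA). The random sensing matrix below stands in for the platform's measured point-spread function; none of this is the authors' actual solver.

```python
import numpy as np

# Toy compressive decoding: recover a sparse source pattern x from fewer
# measurements y = A @ x via ISTA (iterative soft-thresholding).  The random
# Gaussian A is an assumed stand-in for a measured blur/PSF model.
rng = np.random.default_rng(42)
n, m, k = 200, 80, 5                   # unknowns, measurements, sparsity
A = rng.normal(size=(m, n)) / np.sqrt(m)

x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.choice([-1.0, 1.0], size=k) * rng.uniform(1.0, 2.0, size=k)
y = A @ x_true

lam = 0.05                             # l1 penalty weight
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(5000):
    g = x + (A.T @ (y - A @ x)) / L    # gradient step on 0.5*||y - Ax||^2
    x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold

recovered = set(map(int, np.flatnonzero(np.abs(x) > 0.5)))
```

With 80 measurements and only 5 active sources, the l1-regularized solution locates the sources exactly, which is the same principle that lets the blurred lensfree patterns be decoded beyond the raw pixel-level resolution.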

  4. Magnetic Barkhausen Noise Measurements Using Tetrapole Probe Designs

    NASA Astrophysics Data System (ADS)

    McNairnay, Paul

    A magnetic Barkhausen noise (MBN) testing system was developed for Defence Research and Development Canada (DRDC) to perform MBN measurements on the Royal Canadian Navy's Victoria class submarine hulls that can be correlated with material properties, including residual stress. The DRDC system was based on the design of a MBN system developed by Steven White at Queen's University, which was capable of performing rapid angular dependent measurements through the implementation of a flux controlled tetrapole probe. In tetrapole probe designs, the magnetic excitation field is rotated in the surface plane of the sample under the assumption of linear superposition of two orthogonal magnetic fields. During the course of this work, however, the validity of flux superposition in ferromagnetic materials, for the purpose of measuring MBN, was brought into question. Consequently, a study of MBN anisotropy using tetrapole probes was performed. Results indicate that MBN anisotropy measured under flux superposition does not simulate MBN anisotropy data obtained through manual rotation of a single dipole excitation field. It is inferred that MBN anisotropy data obtained with tetrapole probes is the result of the magnetic domain structure's response to an orthogonal magnetization condition and not necessarily to any bulk superposition magnetization in the sample. A qualitative model for the domain configuration under two orthogonal magnetic fields is proposed to describe the results. An empirically derived fitting equation, that describes tetrapole MBN anisotropy data, is presented. The equation describes results in terms of two largely independent orthogonal fields, and includes interaction terms arising due to competing orthogonally magnetized domain structures and interactions with the sample's magnetic easy axis. 
The equation is used to fit results obtained from a number of samples and tetrapole orientations and in each case correctly identifies the samples' magnetic easy axis.
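    The abstract does not reproduce the fitting equation itself; a commonly used first-order form for MBN anisotropy, MBN(θ) = a + b·cos²(θ − θ₀), illustrates how an easy-axis angle θ₀ can be extracted by linear least squares (our generic stand-in, not the dissertation's equation):

```python
import numpy as np

# Illustrative easy-axis extraction from angular MBN data using the common
# first-order form MBN(theta) = a + b*cos^2(theta - theta0).
# cos^2(t - t0) = 0.5 + 0.5*cos(2t)*cos(2t0) + 0.5*sin(2t)*sin(2t0),
# so the model is linear in (1, cos 2t, sin 2t) and theta0 = 0.5*atan2(s, c).
rng = np.random.default_rng(7)
theta = np.deg2rad(np.arange(0, 180, 10))
theta0_true = np.deg2rad(35.0)                        # assumed easy axis
mbn = 1.0 + 2.0 * np.cos(theta - theta0_true) ** 2
mbn += rng.normal(scale=0.02, size=theta.size)        # measurement noise

M = np.column_stack([np.ones_like(theta), np.cos(2 * theta), np.sin(2 * theta)])
coef, *_ = np.linalg.lstsq(M, mbn, rcond=None)
theta0_fit = 0.5 * np.arctan2(coef[2], coef[1])
easy_axis_deg = np.rad2deg(theta0_fit) % 180.0        # ~35 degrees
```

The dissertation's empirical equation adds interaction terms for the competing orthogonally magnetized domain structures, but the easy-axis angle enters the fit in essentially this way.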

  5. Lensfree Fluorescent On-Chip Imaging of Transgenic Caenorhabditis elegans Over an Ultra-Wide Field-of-View

    PubMed Central

    Ozcan, Aydogan

    2011-01-01

    We demonstrate lensfree on-chip fluorescent imaging of transgenic Caenorhabditis elegans (C. elegans) over an ultra-wide field-of-view (FOV) of e.g., >2–8 cm2 with a spatial resolution of ∼10µm. This is the first time that a lensfree on-chip platform has successfully imaged fluorescent C. elegans samples. In our wide-field lensfree imaging platform, the transgenic samples are excited using a prism interface from the side, where the pump light is rejected through total internal reflection occurring at the bottom facet of the substrate. The emitted fluorescent signal from C. elegans samples is then recorded on a large area opto-electronic sensor-array over an FOV of e.g., >2–8 cm2, without the use of any lenses, thin-film interference filters or mechanical scanners. Because fluorescent emission rapidly diverges, such lensfree fluorescent images recorded on a chip look blurred due to broad point-spread-function of our platform. To combat this resolution challenge, we use a compressive sampling algorithm to uniquely decode the recorded lensfree fluorescent patterns into higher resolution images, demonstrating ∼10 µm resolution. We tested the efficacy of this compressive decoding approach with different types of opto-electronic sensors to achieve a similar resolution level, independent of the imaging chip. We further demonstrate that this wide FOV lensfree fluorescent imaging platform can also perform sequential bright-field imaging of the same samples using partially-coherent lensfree digital in-line holography that is coupled from the top facet of the same prism used in fluorescent excitation. This unique combination permits ultra-wide field dual-mode imaging of C. elegans on a chip which could especially provide a useful tool for high-throughput screening applications in biomedical research. PMID:21253611

  6. Reliability and degradation of oxide VCSELs due to reaction to atmospheric water vapor

    NASA Astrophysics Data System (ADS)

    Dafinca, Alexandru; Weidberg, Anthony R.; McMahon, Steven J.; Grillo, Alexander A.; Farthouat, Philippe; Ziolkowski, Michael; Herrick, Robert W.

    2013-03-01

    850 nm oxide-aperture VCSELs are susceptible to premature failure if operated while exposed to atmospheric water vapor and not protected by hermetic packaging. The ATLAS detector in CERN's Large Hadron Collider (LHC) has had approximately 6000 channels of parallel-optic VCSELs fielded under well-documented ambient conditions. Exact time-to-failure data has been collected on this large sample, providing for the first time actual failure data at use conditions. In addition, the same VCSELs were tested under a variety of accelerated conditions to allow us to construct a more accurate acceleration model. Failure analysis information will also be presented to show what we believe causes corrosion-related failure for such VCSELs.
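    Humidity-driven corrosion failures of this kind are conventionally extrapolated from accelerated stress tests with Peck's model, TTF ∝ RH⁻ⁿ·exp(Ea/kT). A sketch with placeholder parameters (the n and Ea defaults below are illustrative, not the paper's fitted values):

```python
import math

# Peck's humidity-temperature acceleration model, a standard choice for
# corrosion-type failures: TTF ~ RH^-n * exp(Ea / (k*T)).
K_BOLTZ = 8.617e-5                     # Boltzmann constant, eV/K

def acceleration_factor(rh_use, t_use_c, rh_stress, t_stress_c, n=2.7, ea=0.7):
    """Acceleration factor of a stress condition relative to use conditions.

    n and ea (activation energy, eV) are illustrative placeholder values."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return ((rh_use / rh_stress) ** -n *
            math.exp(ea / K_BOLTZ * (1.0 / t_use - 1.0 / t_stress)))

# 85 degC / 85% RH stress vs. an assumed 25 degC / 40% RH use condition.
af = acceleration_factor(40.0, 25.0, 85.0, 85.0)   # roughly 700x
```

The value of a fielded data set like ATLAS's is precisely that it anchors the use-condition end of this extrapolation, constraining n and Ea instead of assuming them.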

  7. Sampling Long- versus Short-Range Interactions Defines the Ability of Force Fields To Reproduce the Dynamics of Intrinsically Disordered Proteins.

    PubMed

    Mercadante, Davide; Wagner, Johannes A; Aramburu, Iker V; Lemke, Edward A; Gräter, Frauke

    2017-09-12

    Molecular dynamics (MD) simulations have valuably complemented experiments describing the dynamics of intrinsically disordered proteins (IDPs), particularly since the proposal of models to solve the artificial collapse of IDPs in silico. Such models suggest redefining nonbonded interactions, by either increasing water dispersion forces or adopting the Kirkwood-Buff force field. These approaches yield extended conformers that better comply with experiments, but it is unclear if they all sample the same intrachain dynamics of IDPs. We have tested this by employing MD simulations and single-molecule Förster resonance energy transfer spectroscopy to sample the dimensions of systems with different sequence compositions, namely strong and weak polyelectrolytes. For strong polyelectrolytes in which charge effects dominate, all the proposed solutions equally reproduce the expected ensemble's dimensions. For weak polyelectrolytes, at lower cutoffs, force fields abnormally alter intrachain dynamics, overestimating excluded volume over chain flexibility or reporting no difference between the dynamics of different chains. The TIP4PD water model alone can reproduce experimentally observed changes in extensions (dimensions), but not quantitatively and with only weak statistical significance. Force field limitations are reversed with increased interaction cutoffs, showing that chain dynamics are critically defined by the presence of long-range interactions. Force field analysis aside, our study provides the first insights into how long-range interactions critically define IDP dimensions and raises the question of which length range is crucial to correctly sample the overall dimensions and internal dynamics of the large group of weakly charged yet highly polar IDPs.

  8. Hard X-ray full field microscopy and magnifying microtomography using compound refractive lenses

    NASA Astrophysics Data System (ADS)

    Schroer, Christian G.; Günzler, Til Florian; Benner, Boris; Kuhlmann, Marion; Tümmler, Johannes; Lengeler, Bruno; Rau, Christoph; Weitkamp, Timm; Snigirev, Anatoly; Snigireva, Irina

    2001-07-01

    For hard X-rays, parabolic compound refractive lenses (PCRLs) are genuine imaging devices like glass lenses for visible light. Based on these new lenses, a hard X-ray full field microscope has been constructed that is ideally suited to image the interior of opaque samples with a minimum of sample preparation. As a result of a large depth of field, CRL micrographs are sharp projection images of most samples. To obtain 3D information about a sample, tomographic techniques are combined with magnified imaging.

  9. Impact of magnetic fields on the morphology of hybrid perovskite films for solar cells

    NASA Astrophysics Data System (ADS)

    Corpus-Mendoza, Asiel N.; Moreno-Romero, Paola M.; Hu, Hailin

    2018-05-01

    The impact of magnetic fields on the morphology of hybrid perovskite films is assessed via scanning electron microscopy and X-ray diffraction. Small-grain, non-uniform perovskite films are obtained when a large magnetic flux density is applied to the sample during the reaction of PbI2 and methylammonium iodide (chloride). Similarly, X-ray diffraction reveals a change of preferential crystalline planes when large magnetic fields are applied. Furthermore, we experimentally demonstrate that the quality of the perovskite film is affected by the magnetic field induced by the magnetic stirring system of the hot plate where the samples are annealed. As a consequence, optimization of the perovskite layer varies with magnetic field and annealing temperature. Finally, we prove that uncontrolled magnetic fields in the preparation environment can severely influence the reproducibility of results.

  10. Enhanced Conformational Sampling Using Replica Exchange with Collective-Variable Tempering

    PubMed Central

    2015-01-01

    The computational study of conformational transitions in RNA and proteins with atomistic molecular dynamics often requires suitable enhanced sampling techniques. We here introduce a novel method where concurrent metadynamics are integrated in a Hamiltonian replica-exchange scheme. The ladder of replicas is built with different strengths of the bias potential exploiting the tunability of well-tempered metadynamics. Using this method, free-energy barriers of individual collective variables are significantly reduced compared with simple force-field scaling. The introduced methodology is flexible and allows adaptive bias potentials to be self-consistently constructed for a large number of simple collective variables, such as distances and dihedral angles. The method is tested on alanine dipeptide and applied to the difficult problem of conformational sampling in a tetranucleotide. PMID:25838811
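    The Hamiltonian replica-exchange idea underlying the method can be sketched with two replicas on a 1-D double well: one replica feels the full barrier, the other a scaled-down potential standing in for a strongly biased rung of the ladder, and Metropolis swap moves let the unbiased replica cross. This illustrates only the exchange scheme, not metadynamics itself:

```python
import math
import random

# Minimal Hamiltonian replica exchange on a 1-D double well.  Replica 0
# feels the full barrier; replica 1 a scaled-down one (a stand-in for a
# strongly biased rung).  Swaps let replica 0 hop between wells.
random.seed(3)
beta = 10.0
scale = [1.0, 0.2]                               # potential scaling per replica

def U(x, s):
    return s * (x * x - 1.0) ** 2                # double well, barrier s at x=0

x = [1.0, 1.0]
visited = {-1: 0, 1: 0}                          # well occupancy of replica 0
for step in range(40_000):
    for i in (0, 1):                             # Metropolis move per replica
        xn = x[i] + random.uniform(-0.3, 0.3)
        dU = U(xn, scale[i]) - U(x[i], scale[i])
        if dU <= 0.0 or random.random() < math.exp(-beta * dU):
            x[i] = xn
    if step % 10 == 0:                           # attempt a replica swap
        d = (U(x[1], scale[0]) + U(x[0], scale[1])
             - U(x[0], scale[0]) - U(x[1], scale[1]))
        if d <= 0.0 or random.random() < math.exp(-beta * d):
            x[0], x[1] = x[1], x[0]
    visited[1 if x[0] > 0 else -1] += 1
```

With a 10 kT barrier, replica 0 alone would stay trapped in one well; the exchanges with the weakly confined replica restore ergodic sampling, which is the role the bias-strength ladder plays in the paper's scheme.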

  11. A Field-Portable Cell Analyzer without a Microscope and Reagents

    PubMed Central

    Oh, Sangwoo; Lee, Moonjin; Hwang, Yongha

    2017-01-01

    This paper demonstrates a commercial-level field-portable lens-free cell analyzer called the NaviCell (No-stain and Automated Versatile Innovative cell analyzer) capable of automatically analyzing cell count and viability without employing an optical microscope or reagents. Based on the lens-free shadow imaging technique, the NaviCell (162 × 135 × 138 mm3 and 1.02 kg) has the advantage of providing analysis results with improved standard deviation between measurement results, owing to its large field of view. Importantly, cell counting and viability testing can be performed without the use of any reagent, thereby simplifying the measurement procedure and reducing potential errors during sample preparation. In this study, the performance of the NaviCell for cell counting and viability testing was demonstrated using 13 and six cell lines, respectively. Based on the results of the hemocytometer (de facto standard), the error rate (ER) and coefficient of variation (CV) of the NaviCell are approximately 3.27 and 2.16 times better, respectively, than those of a commercial cell counter. The cell viability testing of the NaviCell also showed ER and CV performance improvements of 5.09 and 1.8 times, respectively, demonstrating sufficient potential in the field of cell analysis. PMID:29286336

  12. The Development of Replicated Optical Integral Field Spectrographs and their Application to the Study of Lyman-alpha Emission at Moderate Redshifts

    NASA Astrophysics Data System (ADS)

    Chonis, Taylor Steven

    In the upcoming era of extremely large ground-based astronomical telescopes, the design of wide-field spectroscopic survey instrumentation has become increasingly complex due to the linear growth of instrument pupil size with telescope diameter for a constant spectral resolving power. The upcoming Visible Integral field Replicable Unit Spectrograph (VIRUS), a baseline array of 150 copies of a simple integral field spectrograph that will be fed by 3.36 × 10⁴ optical fibers on the upgraded Hobby-Eberly Telescope (HET) at McDonald Observatory, represents one of the first uses of large-scale replication to break the relationship between instrument pupil size and telescope diameter. By dividing the telescope's field of view between a large number of smaller and more manageable instruments, the total information grasp of a traditional monolithic survey spectrograph can be achieved at a fraction of the cost and engineering complexity. To highlight the power of this method, VIRUS will execute the HET Dark Energy Experiment (HETDEX) and survey ∼420 deg² of sky to an emission-line flux limit of ∼10⁻¹⁷ erg s⁻¹ cm⁻² to detect ∼10⁶ Lyman-alpha emitting galaxies (LAEs) as probes of large-scale structure at redshifts of 1.9 < z < 3.5. HETDEX will precisely measure the evolution of dark energy at that epoch, and will simultaneously amass an LAE sample that will be unprecedented for extragalactic astrophysics at the redshifts of interest. Large-scale replication has clear advantages for increasing the total information grasp of a spectrograph, but there are also challenges. In this dissertation, two of these challenges with respect to VIRUS are detailed. First, the VIRUS cryogenic system is discussed, specifically the design and tests of a novel thermal connector and internal camera cryogenic components that link the 150 charge-coupled device detectors to the instrument's liquid nitrogen distribution system.
Second, the design, testing, and mass production of the suite of volume phase holographic (VPH) diffraction gratings for VIRUS is presented, which highlights the challenges and successes associated with producing a very large number of highly customized optical elements whose performance is crucial to meeting the efficiency requirements of the spectrograph system. To accommodate VIRUS, the HET is undergoing a substantial wide-field upgrade to increase its field of view to 22' in diameter. The previous HET facility Low Resolution Spectrograph (LRS), which was directly fed by the telescope's previous spherical aberration corrector, must be removed from the prime focus instrument package as a result of the telescope upgrades and instead be fiber-coupled to the telescope focal plane. For a cost similar to that of modifying LRS to accommodate these changes, a new second-generation instrument (LRS2) will be based on the VIRUS unit spectrograph. The design, operational concept, construction, and laboratory testing and characterization of LRS2 is the primary focus of this dissertation, which highlights the benefits of leveraging the large engineering investment, economies of scale, and laboratory and observatory infrastructure associated with the massively replicated VIRUS instrument. LRS2 will provide integral field spectroscopy for a seeing-limited field of 12" x 6". The multiplexed VIRUS framework facilitates broad wavelength coverage from 370 nm to 1.0 μm spread between two dual-channel spectrographs at a moderate spectral resolving power of R ≈ 2000. The design departures from VIRUS are presented, including the novel integral field unit, VPH grism dispersers, and various optical changes for accommodating the broadband wavelength coverage. Laboratory testing has verified that LRS2 largely meets its image quality specification and is nearly ready for delivery to the HET, where its final verification and validation tasks will be executed. 
LRS2 will enable the continuation of most legacy LRS science programs and provide improved capability for future investigations. (Abstract shortened by ProQuest.).

  13. Cryocooler based test setup for high current applications

    NASA Astrophysics Data System (ADS)

    Pradhan, Jedidiah; Das, Nisith Kr.; Roy, Anindya; Duttagupta, Anjan

    2018-04-01

    A cryo-cooler based cryogenic test setup has been designed, fabricated, and tested. The setup incorporates two cryo-coolers, one for sample cooling and the other for cooling the large magnet coil. The performance and versatility of the setup have been tested using large samples of high-temperature superconductor magnet coil as well as short samples carrying high current. Several uncalibrated temperature sensors were calibrated using this system. This paper presents the details of the system along with the results of different performance tests.

  14. Using habitat suitability models to target invasive plant species surveys.

    PubMed

    Crall, Alycia W; Jarnevich, Catherine S; Panke, Brendon; Young, Nick; Renz, Mark; Morisette, Jeffrey

    2013-01-01

    Managers need new tools for detecting the movement and spread of nonnative, invasive species. Habitat suitability models are a popular tool for mapping the potential distribution of current invaders, but the ability of these models to prioritize monitoring efforts has not been tested in the field. We tested the utility of an iterative sampling design (i.e., models based on field observations used to guide subsequent field data collection to improve the model), hypothesizing that model performance would increase when new data were gathered from targeted sampling using criteria based on the initial model results. We also tested the ability of habitat suitability models to predict the spread of invasive species, hypothesizing that models would accurately predict occurrences in the field, and that the use of targeted sampling would detect more species with less sampling effort than a nontargeted approach. We tested these hypotheses on two species at the state scale (Centaurea stoebe and Pastinaca sativa) in Wisconsin (USA), and one genus at the regional scale (Tamarix) in the western United States. These initial data were merged with environmental data at 30-m² resolution for Wisconsin and 1-km² resolution for the western United States to produce our first iteration models. We stratified these initial models to target field sampling and compared our models and success at detecting our species of interest to other surveys being conducted during the same field season (i.e., nontargeted sampling). Although more data did not always improve our models based on correct classification rate (CCR), sensitivity, specificity, kappa, or area under the curve (AUC), our models generated from targeted sampling data always performed better than models generated from nontargeted data. 
For Wisconsin species, the models described actual locations in the field fairly well (kappa = 0.51 and 0.19, P < 0.01), and targeted sampling did detect more species than nontargeted sampling with less sampling effort (χ² = 47.42, P < 0.01). From these findings, we conclude that habitat suitability models can be highly useful tools for guiding invasive species monitoring, and we support the use of an iterative sampling design for guiding such efforts.
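
    The agreement statistics reported above (CCR, sensitivity, specificity, kappa) all derive from a presence/absence confusion matrix of model predictions against field observations. A minimal sketch with made-up counts (illustrative only, not the study's data):

```python
# 2x2 confusion matrix from a hypothetical targeted survey:
# tp/fp/fn/tn = true pos., false pos., false neg., true neg.
tp, fp, fn, tn = 40, 10, 15, 35  # illustrative counts, not the study's data

n = tp + fp + fn + tn
ccr = (tp + tn) / n               # correct classification rate
sensitivity = tp / (tp + fn)      # detected presences / true presences
specificity = tn / (tn + fp)      # detected absences / true absences

# Cohen's kappa: observed agreement corrected for chance agreement
p_obs = ccr
p_exp = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
kappa = (p_obs - p_exp) / (1 - p_exp)
```

    Kappa discounts the agreement expected by chance, which is why a model can post a high CCR yet only a modest kappa when presences and absences are unbalanced.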

  15. DNA-based species level detection of Glomeromycota: one PCR primer set for all arbuscular mycorrhizal fungi.

    PubMed

    Krüger, Manuela; Stockinger, Herbert; Krüger, Claudia; Schüssler, Arthur

    2009-01-01

    * At present, molecular ecological studies of arbuscular mycorrhizal fungi (AMF) are only possible above species level when targeting entire communities. To improve molecular species characterization and to allow species-level community analyses in the field, a set of newly designed AMF-specific PCR primers was successfully tested. * Nuclear rDNA fragments from diverse phylogenetic AMF lineages were sequenced and analysed to design four primer mixtures, each targeting one binding site in the small subunit (SSU) or large subunit (LSU) rDNA. To allow species resolution, they span a fragment covering the partial SSU, the whole internal transcribed spacer (ITS) rDNA region, and the partial LSU. * The new primers are suitable for specifically amplifying AMF rDNA from material that may be contaminated by other organisms (e.g., samples from pot cultures or the field), characterizing the diversity of AMF species from field samples, and amplifying an SSU-ITS-LSU fragment that allows phylogenetic analyses with species-level resolution. * The PCR primers can be used to monitor entire AMF field communities based on a single rDNA marker region. Their application will improve the basis for deep sequencing approaches; moreover, they can be efficiently used as DNA barcoding primers.

  16. Light-sheet enhanced resolution of light field microscopy for rapid imaging of large volumes

    NASA Astrophysics Data System (ADS)

    Madrid Wolff, Jorge; Castro, Diego; Arbeláez, Pablo; Forero-Shelton, Manu

    2018-02-01

    Whole-brain imaging is challenging because it demands microscopes with high temporal and spatial resolution, which are often at odds, especially in the context of large fields of view. We have designed and built a light-sheet microscope with digital micromirror illumination and light-field detection. On the one hand, light sheets provide high resolution optical sectioning on live samples without compromising their viability. On the other hand, light field imaging makes it possible to reconstruct full volumes of relatively large fields of view from a single camera exposure; however, its enhanced temporal resolution comes at the expense of spatial resolution, limiting its applicability. We present an approach to increase the resolution of light field images using DMD-based light sheet illumination. To that end, we develop a method to produce synthetic resolution targets for light field microscopy and a procedure to correct the depth at which planes are refocused with rendering software. We measured the axial resolution as a function of depth and show a three-fold potential improvement with structured illumination, albeit by sacrificing some temporal resolution, also three-fold. This results in an imaging system that may be adjusted to specific needs without having to reassemble and realign it. This approach could be used to image relatively large samples at high rates.

  17. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  18. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  19. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  20. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  1. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  2. Weighted Discriminative Dictionary Learning based on Low-rank Representation

    NASA Astrophysics Data System (ADS)

    Chang, Heyou; Zheng, Hao

    2017-01-01

    Low-rank representation has been widely used in the field of pattern classification, especially when both training and testing images are corrupted with large noise. The dictionary plays an important role in low-rank representation. With respect to a semantic dictionary, the optimal representation matrix should be block-diagonal. However, traditional low-rank representation based dictionary learning methods cannot effectively exploit the discriminative information between data and dictionary. To address this problem, this paper proposes weighted discriminative dictionary learning based on low-rank representation, in which a weighted representation regularization term is constructed. The regularization term associates the label information of both training samples and dictionary atoms, and encourages a discriminative representation with a class-wise block-diagonal structure, which can further improve classification performance when both training and testing images are corrupted with large noise. Experimental results demonstrate the advantages of the proposed method over state-of-the-art methods.

  3. CHARMM Force-Fields with Modified Polyphosphate Parameters Allow Stable Simulation of the ATP-Bound Structure of Ca(2+)-ATPase.

    PubMed

    Komuro, Yasuaki; Re, Suyong; Kobayashi, Chigusa; Muneyuki, Eiro; Sugita, Yuji

    2014-09-09

    Adenosine triphosphate (ATP) is an indispensable energy source in cells. In a wide variety of biological phenomena like glycolysis, muscle contraction/relaxation, and active ion transport, chemical energy released from ATP hydrolysis is converted to mechanical forces to bring about large-scale conformational changes in proteins. Investigation of structure-function relationships in these proteins by molecular dynamics (MD) simulations requires modeling of ATP in solution and ATP bound to proteins with accurate force-field parameters. In this study, we derived new force-field parameters for the triphosphate moiety of ATP based on the high-precision quantum calculations of methyl triphosphate. We tested our new parameters on membrane-embedded sarcoplasmic reticulum Ca(2+)-ATPase and four soluble proteins. The ATP-bound structure of Ca(2+)-ATPase remains stable during MD simulations, contrary to the outcome in shorter simulations using original parameters. Similar results were obtained with the four ATP-bound soluble proteins. The new force-field parameters were also tested by investigating the range of conformations sampled during replica-exchange MD simulations of ATP in explicit water. Modified parameters allowed a much wider range of conformational sampling compared with the bias toward extended forms with original parameters. A diverse range of structures agrees with the broad distribution of ATP conformations in proteins deposited in the Protein Data Bank. These simulations suggest that the modified parameters will be useful in studies of ATP in solution and of the many ATP-utilizing proteins.

  4. Studies on geotechnical properties of subsoil in south east coastal region of India

    NASA Astrophysics Data System (ADS)

    Dutta, Susom; Barik, D. K.

    2017-11-01

    Soil testing and analysis have become essential before commencing any activity on soil, e.g., residential or road construction. Such work is particularly important in coastal areas, which are more vulnerable to natural disasters such as tsunamis and cyclones. In India, facilities to collect and analyse soil from the field are lacking. Hence, to study the characteristics of coastal-region sub-soil, the Old Mahabalipuram area, in the south-east region of India, was chosen for this study. The aim of this study is to collect and analyse soil samples from various localities of the Old Mahabalipuram area. The analysed soil data will help practitioners of geotechnical engineering in the coastal regions of India to make informed decisions. The soil samples collected from different boreholes underwent various field and laboratory tests, including the Pressuremeter Test, Field Permeability Test, Electrical Resistivity Test, Standard Penetration Test, Shear Test, and Atterberg Limits, as well as rock tests, to determine the geotechnical properties of the soil samples for each stratum.

  5. Reliability and Validity of a New Test of Change-of-Direction Speed for Field-Based Sports: the Change-of-Direction and Acceleration Test (CODAT).

    PubMed

    Lockie, Robert G; Schultz, Adrian B; Callaghan, Samuel J; Jeffriess, Matthew D; Berry, Simon P

    2013-01-01

    Field sport coaches must use reliable and valid tests to assess change-of-direction speed in their athletes. Few tests feature linear sprinting with acute change-of-direction maneuvers. The Change-of-Direction and Acceleration Test (CODAT) was designed to assess field sport change-of-direction speed, and includes a linear 5-meter (m) sprint, 45° and 90° cuts, 3-m sprints to the left and right, and a linear 10-m sprint. This study analyzed the reliability and validity of this test, through comparisons to 20-m sprint (0-5, 0-10, 0-20 m intervals) and Illinois agility run (IAR) performance. Eighteen Australian footballers (age = 23.83 ± 7.04 yrs; height = 1.79 ± 0.06 m; mass = 85.36 ± 13.21 kg) were recruited. Following familiarization, subjects completed the 20-m sprint, CODAT, and IAR in 2 sessions, 48 hours apart. Intra-class correlation coefficients (ICC) assessed relative reliability. Absolute reliability was analyzed through paired samples t-tests (p ≤ 0.05) determining between-session differences. Typical error (TE), coefficient of variation (CV), and differences between the TE and smallest worthwhile change (SWC) also assessed absolute reliability and test usefulness. For the validity analysis, Pearson's correlations (p ≤ 0.05) analyzed between-test relationships. Results showed no between-session differences for any test (p = 0.19-0.86). CODAT time averaged ~6 s, and the ICC and CV equaled 0.84 and 3.0%, respectively. The homogeneous sample of Australian footballers meant that the CODAT's TE (0.19 s) exceeded the usual 0.2 x standard deviation (SD) SWC (0.10 s). However, the CODAT is capable of detecting moderate performance changes (SWC calculated as 0.5 x SD = 0.25 s). There was a near perfect correlation between the CODAT and IAR (r = 0.92), and very large correlations with the 20-m sprint (r = 0.75-0.76), suggesting that the CODAT was a valid change-of-direction speed test. 
Due to movement specificity, the CODAT has value for field sport assessment. Key points: (1) The change-of-direction and acceleration test (CODAT) was designed specifically for field sport athletes from specific speed research, and data derived from time-motion analyses of sports such as rugby union, soccer, and Australian football. The CODAT features a linear 5-meter (m) sprint, 45° and 90° cuts and 3-m sprints to the left and right, and a linear 10-m sprint. (2) The CODAT was found to be a reliable change-of-direction speed assessment when considering intra-class correlations between two testing sessions, and the coefficient of variation between trials. A homogeneous sample of Australian footballers resulted in absolute reliability limitations when considering differences between the typical error and smallest worthwhile change. However, the CODAT will detect moderate (0.5 times the test's standard deviation) changes in performance. (3) The CODAT correlated with the Illinois agility run, highlighting that it does assess change-of-direction speed. There were also significant relationships with short sprint performance (i.e. 0-5 m and 0-10 m), demonstrating that linear acceleration is assessed within the CODAT, without the extended duration and therefore metabolic limitations of the IAR. Indeed, the average duration of the test (~6 seconds) is field sport-specific. Therefore, the CODAT could be used as an assessment of change-of-direction speed in field sport athletes.
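
    The absolute-reliability measures used above (typical error, coefficient of variation, smallest worthwhile change) can be computed directly from paired session times. A sketch with hypothetical times, not the study's data:

```python
import math
from statistics import mean, stdev

# hypothetical CODAT times (s) for the same athletes in two sessions
s1 = [5.8, 6.1, 5.9, 6.3, 6.0, 5.7, 6.2, 6.4]
s2 = [5.9, 6.0, 6.0, 6.2, 6.1, 5.8, 6.1, 6.5]

diffs = [b - a for a, b in zip(s1, s2)]
te = stdev(diffs) / math.sqrt(2)     # typical error of measurement
grand_mean = mean(s1 + s2)
cv = 100 * te / grand_mean           # coefficient of variation (%)

sd = stdev(s1)                       # between-athlete SD (session 1)
swc_small = 0.2 * sd                 # smallest worthwhile change, small effect
swc_moderate = 0.5 * sd              # moderate effect, as applied to the CODAT
```

    Per the abstract's criterion, a change is confidently detectable when the typical error falls below the smallest worthwhile change; for the homogeneous CODAT sample only the moderate SWC (0.5 × SD) cleared that bar.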

  6. Technical Proposal for Loading 3000 Gallon Crude Oil Samples from Field Terminal to Sandia Pressurized Tanker to Support US DOE/DOT Crude Oil Characterization Research Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lord, David L.; Allen, Raymond

    Sandia National Laboratories is seeking access to crude oil samples for a research project evaluating crude oil combustion properties in large-scale tests at Sandia National Laboratories in Albuquerque, NM. Samples must be collected from a source location and transported to Albuquerque in a tanker that complies with all applicable regulations for transportation of crude oil over public roadways. Moreover, the samples must not gain or lose any components, to include dissolved gases, from the point of loading through the time of combustion at the Sandia testing facility. In order to achieve this, Sandia designed and is currently procuring a custom tanker that utilizes water displacement in order to achieve these performance requirements. The water displacement procedure is modeled after the GPA 2174 standard “Obtaining Liquid Hydrocarbons Samples for Analysis by Gas Chromatography” (GPA 2014) that is used routinely by crude oil analytical laboratories for capturing and testing condensates and “live” crude oils, though it is practiced at the liter scale in most applications. The Sandia testing requires 3,000 gallons of crude. As such, the water displacement method will be upscaled and implemented in a custom tanker. This report describes the loading process for acquiring a ~3,000 gallon crude oil sample from commercial process piping containing single phase liquid crude oil at nominally 50-100 psig. This document contains a general description of the process (Section 2), detailed loading procedure (Section 3) and associated oil testing protocols (Section 4).

  7. Determination of dissolved oxygen in the cryosphere: a comprehensive laboratory and field evaluation of fiber optic sensors.

    PubMed

    Bagshaw, E A; Wadham, J L; Mowlem, M; Tranter, M; Eveness, J; Fountain, A G; Telling, J

    2011-01-15

    Recent advances in the Cryospheric Sciences have shown that icy environments are host to consortia of microbial communities, whose function and dynamics are often controlled by the concentrations of dissolved oxygen (DO) in solution. To date, only limited spot determinations of DO have been possible in these environments. They reveal the potential for rates of change that exceed realistic manual sampling rates, highlighting the need to explore methods for the continuous measurement of DO concentrations. We report the first comprehensive field and laboratory performance tests of fiber-optic sensors (PreSens, Regensburg, Germany) for measuring DO in icy ecosystems. A series of laboratory tests performed at low and standard temperatures (-5 to 20 °C) demonstrates high precision (0.3% at 50 μmol/kg and 1.3% at 300 μmol/kg), rapid response times (<20 s), and minimal drift (<0.4%). Survival of freeze-thaw cycling was problematic unless the sensor film was mechanically fixed to the fiber and protected by a stainless steel sheath. Results of two field deployments of sensors to the Swiss Alps and Antarctica largely demonstrate a performance consistent with laboratory tests and superior to traditional methods.

  8. Rapid microscopy measurement of very large spectral images.

    PubMed

    Lindner, Moshe; Shotan, Zav; Garini, Yuval

    2016-05-02

    The spectral content of a sample provides important information that cannot be detected by the human eye or by using an ordinary RGB camera. The spectrum is typically a fingerprint of the chemical compound, its environmental conditions, phase, and geometry. Thus, measuring the spectrum at each point of a sample is important for a large range of applications, from art preservation through forensics to pathological analysis of a tissue section. To date, however, there is no system that can measure the spectral image of a large sample in a reasonable time. Here we present a novel method for scanning very large spectral images of microscopy samples, even if they cannot be viewed in a single field of view of the camera. The system is based on capturing information while the sample is being scanned continuously 'on the fly'. Spectral separation implements Fourier spectroscopy by using an interferometer mounted along the optical axis. High spectral resolution of ~5 nm at 500 nm could be achieved with a diffraction-limited spatial resolution. The acquisition rate is fairly high: a sample of 10 mm x 10 mm takes 6-8 minutes to measure under a bright-field microscope using a 20X magnification.
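
    The Fourier-spectroscopy step works because the interferogram recorded while scanning optical path difference (OPD) is, up to a DC offset, the cosine transform of the spectrum, so a Fourier transform recovers the spectral lines. A toy numerical sketch with synthetic data (a real instrument would use an FFT over calibrated OPD steps; all values here are illustrative):

```python
import math

N, dx = 256, 1.0  # hypothetical sampling: 256 OPD steps of unit length
# two spectral lines placed at DFT bins 20 and 55 (bin k -> wavenumber k/(N*dx))
lines = {20: 1.0, 55: 0.5}

# simulate the recorded interferogram: each wavenumber adds a (1 + cos) fringe term
interferogram = [sum(a * (1.0 + math.cos(2 * math.pi * k * n / N))
                     for k, a in lines.items()) for n in range(N)]

# remove the DC offset, then recover the spectrum with a plain DFT
m = sum(interferogram) / N
ac = [v - m for v in interferogram]

def dft_mag(sig, k):
    """Magnitude of the k-th DFT coefficient of a real signal."""
    n = len(sig)
    re = sum(v * math.cos(2 * math.pi * k * t / n) for t, v in enumerate(sig))
    im = sum(v * math.sin(2 * math.pi * k * t / n) for t, v in enumerate(sig))
    return math.hypot(re, im)

spectrum = [dft_mag(ac, k) for k in range(N // 2)]
peaks = sorted(range(1, N // 2), key=lambda k: spectrum[k], reverse=True)[:2]
```

    The two strongest bins of the recovered spectrum land exactly at the input lines, with magnitudes in proportion to the line strengths.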

  9. Results of external quality-assurance program for the National Atmospheric Deposition Program and National Trends Network during 1985

    USGS Publications Warehouse

    Brooks, M.H.; Schroder, L.J.; Willoughby, T.C.

    1988-01-01

    External quality assurance monitoring of the National Atmospheric Deposition Program (NADP) and National Trends Network (NTN) was performed by the U.S. Geological Survey during 1985. The monitoring consisted of three primary programs: (1) an intersite comparison program designed to assess the precision and accuracy of onsite pH and specific conductance measurements made by NADP and NTN site operators; (2) a blind audit sample program designed to assess the effect of routine field handling on the precision and bias of NADP and NTN wet deposition data; and (3) an interlaboratory comparison program designed to compare analytical data from the laboratory processing NADP and NTN samples with data produced by other laboratories routinely analyzing wet deposition samples and to provide estimates of individual laboratory precision. An average of 94% of the site operators participated in the four voluntary intersite comparisons during 1985. A larger percentage of participating site operators met the accuracy goal for specific conductance measurements (average, 87%) than for pH measurements (average, 67%). Overall precision was dependent on the actual specific conductance of the test solution and independent of the pH of the test solution. Data for the blind audit sample program indicated slight positive biases resulting from routine field handling for all analytes except specific conductance. These biases were not large enough to be significant for most data users. Data for the blind audit sample program also indicated that decreases in hydrogen ion concentration were accompanied by decreases in specific conductance. Precision estimates derived from the blind audit sample program indicate that the major source of uncertainty in wet deposition data is the routine field handling that each wet deposition sample receives. 
Results of the interlaboratory comparison program were similar to results of previous years' evaluations, indicating that the participating laboratories produced comparable data when they analyzed identical wet deposition samples, and that the laboratory processing NADP and NTN samples achieved the best analyte precision of the participating laboratories. (Author's abstract)

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yashchuk, V. V.; Fischer, P. J.; Chan, E. R.

    We present a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) one-dimensional sequences and two-dimensional arrays as an effective method for spectral characterization in the spatial frequency domain of a broad variety of metrology instrumentation, including interferometric microscopes, scatterometers, phase shifting Fizeau interferometers, scanning and transmission electron microscopes, and at this time, x-ray microscopes. The inherent power spectral density of BPR gratings and arrays, which has a deterministic white-noise-like character, allows a direct determination of the MTF with a uniform sensitivity over the entire spatial frequency range and field of view of an instrument. We demonstrate the MTF calibration and resolution characterization over the full field of a transmission soft x-ray microscope using a BPR multilayer (ML) test sample with 2.8 nm fundamental layer thickness. We show that beyond providing a direct measurement of the microscope's MTF, tests with the BPRML sample can be used to fine tune the instrument's focal distance. Finally, our results confirm the universality of the method that makes it applicable to a large variety of metrology instrumentation with spatial wavelength bandwidths from a few nanometers to hundreds of millimeters.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yashchuk, V. V., E-mail: VVYashchuk@lbl.gov; Chan, E. R.; Lacey, I.

    We present a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) one-dimensional sequences and two-dimensional arrays as an effective method for spectral characterization in the spatial frequency domain of a broad variety of metrology instrumentation, including interferometric microscopes, scatterometers, phase shifting Fizeau interferometers, scanning and transmission electron microscopes, and at this time, x-ray microscopes. The inherent power spectral density of BPR gratings and arrays, which has a deterministic white-noise-like character, allows a direct determination of the MTF with a uniform sensitivity over the entire spatial frequency range and field of view of an instrument. We demonstrate the MTF calibration and resolution characterization over the full field of a transmission soft x-ray microscope using a BPR multilayer (ML) test sample with 2.8 nm fundamental layer thickness. We show that beyond providing a direct measurement of the microscope’s MTF, tests with the BPRML sample can be used to fine tune the instrument’s focal distance. Our results confirm the universality of the method that makes it applicable to a large variety of metrology instrumentation with spatial wavelength bandwidths from a few nanometers to hundreds of millimeters.
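
    The "deterministic white-noise-like" power spectral density claimed for binary pseudo-random sequences can be verified numerically: a maximal-length LFSR sequence, mapped to ±1, has exactly equal power in every nonzero frequency bin. A sketch assuming a 7-bit register with the primitive polynomial x^7 + x + 1 (any primitive polynomial of the chosen degree would do):

```python
import math

# Fibonacci LFSR for x^7 + x + 1: s[n] = s[n-6] XOR s[n-7], period 2**7 - 1 = 127
reg = [1] * 7
seq = []
for _ in range(127):
    seq.append(reg[0])
    reg = reg[1:] + [reg[1] ^ reg[0]]

s = [1 if b else -1 for b in seq]  # map {0,1} -> {-1,+1}

def dft_power(sig, k):
    """Power |X[k]|^2 of the k-th DFT coefficient of a real signal."""
    n = len(sig)
    re = sum(v * math.cos(2 * math.pi * k * t / n) for t, v in enumerate(sig))
    im = sum(v * math.sin(2 * math.pi * k * t / n) for t, v in enumerate(sig))
    return re * re + im * im

# every nonzero-frequency bin carries identical power (N + 1 = 128):
powers = [dft_power(s, k) for k in range(1, 127)]
```

    This spectral flatness is what lets a BPR test sample probe an instrument's MTF uniformly across its whole spatial-frequency band: any roll-off in the measured spectrum is attributable to the instrument rather than the target.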

  12. Static and wind tunnel near-field/far-field jet noise measurements from model scale single-flow base line and suppressor nozzles. Summary report. [conducted in the Boeing large anechoic test chamber and the NASA-Ames 40by 80-foot wind tunnel

    NASA Technical Reports Server (NTRS)

    Jaeck, C. L.

    1977-01-01

    A test program was conducted in the Boeing large anechoic test chamber and the NASA-Ames 40- by 80-foot wind tunnel to study the near- and far-field jet noise characteristics of six baseline and suppressor nozzles. Static and wind-on noise source locations were determined. A technique for extrapolating near-field jet noise measurements into the far field was established. It was determined whether flight effects measured in the near field are the same as those in the far field. The flight effects on the jet noise levels of the baseline and suppressor nozzles were determined. Test models included a 15.24-cm round convergent nozzle, an annular nozzle with and without ejector, a 20-lobe nozzle with and without ejector, and a 57-tube nozzle with lined ejector. The static free-field test in the anechoic chamber covered nozzle pressure ratios from 1.44 to 2.25 and jet velocities from 412 to 594 m/s at a total temperature of 844 K. The wind tunnel flight effects test repeated these nozzle test conditions with ambient velocities of 0 to 92 m/s.

  13. Modeling a CO2 mineralization experiment of fractured peridotite from the Semail ophiolite/ Oman

    NASA Astrophysics Data System (ADS)

    Muller, Nadja; Zhang, Guoxiang; van Noort, Reinier; Spiers, Chris; Ten Grotenhuis, Saskia; Hoedeman, Gerco

    2010-05-01

    Most geologic CO2 sequestration technologies focus on sedimentary rocks, where the carbon dioxide is stored in a fluid phase. A possible alternative is to trap it as a mineral in the subsurface (in-situ) in basaltic or even (ultra)mafic rocks. Carbon dioxide in aqueous solution reacts with Mg-, Ca-, and Fe-bearing silicate minerals, precipitates as (Mg,Ca,Fe)CO3 (carbonate), and can thus be permanently sequestered. The cation donors are silicate minerals such as olivine and pyroxene, which are abundant in (ultra)mafic rocks such as peridotite. Investigations are underway to evaluate the sequestration potential of the Semail Ophiolite in Oman, utilizing the large volumes of partially serpentinized peridotite that are present. Key factors are the rate of mineralization due to dissolution of the peridotite and precipitation of carbonate, the extent of the natural and hydraulic fracture network, and the accessibility of the rock to reactive fluids. To quantify the influence of dissolution rates on the overall CO2 mineralization process, small, fractured peridotite samples were exposed to supercritical CO2 and water in laboratory experiments. The samples were cored from a large rock sample as small cylinders, 1 cm in height and diameter, with a mass of ~2 g. Several experimental conditions were tested with different equipment, from a large-volume autoclave to a small-volume cold-seal vessel. The 650 ml autoclave contained 400-500 g of water and a sample under 10 MPa of partial CO2 pressure at up to 150 °C. The small capsules in the cold-seal vessel held 1-1.5 g of water and the sample under CO2 partial pressures from 15 MPa to 70 MPa and temperatures from 60 to 200°C. The samples remained for two weeks in the reaction vessels. In addition, bench acid-bath experiments in 150 ml vials were performed open to the atmosphere at 50-80°C and a pH of ~3. 
The main observation was that the peridotite dissolved two orders of magnitude more slowly in the high-pressure, high-temperature cell of the cold seal vessel than in comparative experiments in large-volume autoclaves and bench acid-bath vials under lower and atmospheric pressure conditions. We attribute this observation to the limited water availability in the cold seal vessel, which restricts the aqueous reactions of bicarbonate formation and magnesite precipitation. To test this hypothesis, one of the cold seal vessel experiments, at 20 MPa and 100°C, was simulated with a reactive transport model using TOUGHREACT. To reproduce the actual experimental conditions, the model used a grid at the mm and 100s-of-μm scale and a fractured peridotite medium with serpentine filling the fractures. The simulation produced dissolution comparable to the experiment and showed an effective shutdown of the bicarbonate-forming reaction within one day after the start of the experiment. If the conditions of limited water supply seen in our experiments also apply in a field setting, dissolution may be limited by buffering of the pH and shutdown of bicarbonate formation. Under field conditions, water and CO2 will flow only in hydraulically induced fractures and in the natural fracture network, which is filled with serpentine and some carbonate. The simulation results and their implications for field application will require further experimental investigation in the laboratory or the field.

  14. Sample-to-answer palm-sized nucleic acid testing device towards low-cost malaria mass screening.

    PubMed

    Choi, Gihoon; Prince, Theodore; Miao, Jun; Cui, Liwang; Guan, Weihua

    2018-05-19

    The effectiveness of malaria screening and treatment depends strongly on low-cost access to highly sensitive and specific malaria tests. We report a real-time fluorescence nucleic acid testing (NAT) device for malaria field detection with automated and scalable sample preparation capability. The device consists of a compact analyzer and a disposable microfluidic reagent compact disc. Parasite DNA sample preparation and subsequent real-time LAMP detection are seamlessly integrated on a single microfluidic compact disc, driven by energy-efficient, non-centrifuge-based magnetic field interactions. Each disc contains four parallel testing units, which can be configured either as four identical tests or as four species-specific tests. When configured as species-specific tests, it can identify two of the most life-threatening malaria species (P. falciparum and P. vivax). The NAT device is capable of processing four samples simultaneously within a 50-min turnaround time. It achieves a detection limit of ~0.5 parasites/µl for whole blood, sufficient for detecting asymptomatic parasite carriers. The combination of sensitivity, specificity, cost, and scalable sample preparation suggests that the real-time fluorescence LAMP device could be particularly useful for malaria screening in field settings. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. PRELIMINARY DATA REPORT: HUMATE INJECTION AS AN ENHANCED ATTENUATION METHOD AT THE F-AREA SEEPAGE BASINS, SAVANNAH RIVER SITE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Millings, M.

    2013-09-16

    A field test of a humate technology for uranium and I-129 remediation was conducted at the F-Area Field Research Site as part of the Attenuation-Based Remedies for the Subsurface Applied Field Research Initiative (ABRS AFRI) funded by the DOE Office of Soil and Groundwater Remediation. Previous studies have shown that humic acid sorbed to sediments strongly binds uranium at mildly acidic pH and potentially binds iodine-129 (I-129). Use of humate could be applicable for contaminant stabilization at a wide variety of DOE sites; however, pilot field-scale tests and optimization of this technology are required to move this technical approach from basic science to actual field deployment and regulatory acceptance. The groundwater plume at the F-Area Field Research Site contains a large number of contaminants, the most important from a risk perspective being strontium-90 (Sr-90), uranium isotopes, I-129, tritium, and nitrate. Groundwater remains acidic, with pH as low as 3.2 near the basins, increasing to the background pH of approximately 5 at the plume fringes. The field test was conducted in monitoring well FOB 16D, which historically has shown low pH and elevated concentrations of Sr-90, uranium, I-129, and tritium. The field test included three months of baseline monitoring, followed by injection of a potassium humate solution and approximately four and a half months of post-injection monitoring. Samples were collected and analyzed for numerous constituents, but the focus was on attenuation of uranium, Sr-90, and I-129. This report provides background information, methodology, and preliminary field results for the humate field test. Results from the field monitoring show that most of the excess humate (i.e., humate that did not sorb to the sediments) has flushed through the surrounding formation. 
Furthermore, the data indicate that the test was successful in loading a band of sediment surrounding the injection point to a point where pH could return to near normal during the study timeframe. Future work will involve a final report, which will include data trends, correlations, and interpretations of laboratory data.

  16. User's guide for polyethylene-based passive diffusion bag samplers to obtain volatile organic compound concentrations in wells. Part 2, Field tests

    USGS Publications Warehouse

    Vroblesky, Don A.

    2001-01-01

    Diffusion samplers installed in observation wells were found to be capable of yielding representative water samples for chlorinated volatile organic compounds. The samplers consisted of polyethylene bags containing deionized water and relied on diffusion of chlorinated volatile organic compounds through the polyethylene membrane. The known ability of polyethylene to transmit other volatile compounds, such as benzene and toluene, indicates that the samplers can be used for a variety of volatile organic compounds. In wells at the study area, the volatile organic compound concentrations in water samples obtained using the samplers without prior purging were similar to concentrations in water samples obtained from the respective wells using traditional purging and sampling approaches. The low cost associated with this approach makes it a viable option for monitoring large observation-well networks for volatile organic compounds.

  17. 49 CFR 178.985 - Vibration test.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 3 2013-10-01 2013-10-01 false Vibration test. 178.985 Section 178.985... Packagings § 178.985 Vibration test. (a) General. All rigid Large Packaging and flexible Large Packaging design types must be capable of withstanding the vibration test. (b) Test method. (1) A sample Large...

  18. 49 CFR 178.985 - Vibration test.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 3 2014-10-01 2014-10-01 false Vibration test. 178.985 Section 178.985... Packagings § 178.985 Vibration test. (a) General. All rigid Large Packaging and flexible Large Packaging design types must be capable of withstanding the vibration test. (b) Test method. (1) A sample Large...

  19. 49 CFR 178.985 - Vibration test.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 3 2012-10-01 2012-10-01 false Vibration test. 178.985 Section 178.985... Packagings § 178.985 Vibration test. (a) General. All rigid Large Packaging and flexible Large Packaging design types must be capable of withstanding the vibration test. (b) Test method. (1) A sample Large...

  20. 49 CFR 178.985 - Vibration test.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 3 2011-10-01 2011-10-01 false Vibration test. 178.985 Section 178.985... Packagings § 178.985 Vibration test. (a) General. All rigid Large Packaging and flexible Large Packaging design types must be capable of withstanding the vibration test. (b) Test method. (1) A sample Large...

  1. Rapid detection, characterization, and enumeration of foodborne pathogens.

    PubMed

    Hoorfar, J

    2011-11-01

    As food safety management further develops, microbiological testing will continue to play an important role in assessing whether Food Safety Objectives are achieved. However, traditional culture-based microbiological methods are limited, particularly in their ability to provide timely data. The present review discusses the reasons for the increasing interest in rapid methods, current developments in the field, research needs, and future trends. The advent of biotechnology has introduced new technologies that led to the emergence of rapid diagnostic methods and altered food testing practices. Rapid methods comprise many different detection technologies, including specialized enzyme substrates, antibodies, and DNA, ranging from simple differential plating media to sophisticated instruments. The use of non-invasive sampling techniques for live animals came into particular focus with the 1990s outbreak of bovine spongiform encephalopathy, which was linked to the human outbreak of Creutzfeldt-Jakob disease. Serology is still an important tool for preventing foodborne pathogens from entering the human food supply through meat and milk from animals. One of the primary uses of rapid methods is the fast screening of large numbers of samples, most of which are expected to test negative, leading to faster product release for sale. This has been the main strength of rapid methods such as real-time polymerase chain reaction (PCR). Enrichment PCR, in which a primary culture broth is tested by PCR, is the most common approach in rapid testing. Recent reports show that it is possible both to enrich a sample and to enumerate by pathogen-specific real-time PCR if the enrichment time is short. This can be especially useful in situations where food producers ask for the level of pathogen in a contaminated product. 
Another key issue is automation, where the key drivers are miniaturization and multiplexed testing, meaning not only that one instrument is flexible enough to test for many pathogens but also that many pathogens can be detected with one test. The review is mainly based on the author's scientific work, which has contributed the following new developments to this field: (i) serologic tests for large-scale screening, surveillance, or eradication programs; (ii) same-day detection of Salmonella, which was otherwise considered difficult to achieve; (iii) pathogen enumeration following a short log-phase enrichment; (iv) detection of foodborne pathogens in air samples; and finally (v) biotracing of pathogens based on mathematical modeling, even in the absence of an isolate. Rapid methods are discussed in a broad global health perspective, in the context of the international food supply, and for the improvement of quantitative microbial risk assessments. The need for quantitative sample preparation techniques, culture-independent metagenomic-based detection, online monitoring, and a global validation infrastructure is emphasized. The cost and ease of use of rapid assays remain challenging obstacles to surmount. © 2011 The Author. APMIS © 2011 APMIS.

  2. Factor Structure of the TOEFL® Internet-Based Test (iBT): Exploration in a Field Trial Sample. TOEFL iBT Research Report. TOEFL iBT-04. ETS RR-08-09

    ERIC Educational Resources Information Center

    Sawaki, Yasuyo; Stricker, Lawrence; Oranje, Andreas

    2008-01-01

    The present study investigated the factor structure of a field trial sample of the Test of English as a Foreign Language™ Internet-based test (TOEFL® iBT). An item-level confirmatory factor analysis (CFA) was conducted for a polychoric correlation matrix of items on a test form completed by 2,720 participants in the 2003-2004 TOEFL iBT Field…

  3. Sampling scales define occupancy and underlying occupancy-abundance relationships in animals.

    PubMed

    Steenweg, Robin; Hebblewhite, Mark; Whittington, Jesse; Lukacs, Paul; McKelvey, Kevin

    2018-01-01

    Occupancy-abundance (OA) relationships are a foundational ecological phenomenon and field of study, and occupancy models are increasingly used to track population trends and understand ecological interactions. However, these two fields of ecological inquiry remain largely isolated, despite growing appreciation of the importance of integration. For example, using occupancy models to infer trends in abundance is predicated on positive OA relationships. Many occupancy studies collect data that violate geographical closure assumptions due to the choice of sampling scales and application to mobile organisms, which may change how occupancy and abundance are related. Little research, however, has explored how different occupancy sampling designs affect OA relationships. We develop a conceptual framework for understanding how sampling scales affect the definition of occupancy for mobile organisms, which drives OA relationships. We explore how spatial and temporal sampling scales, and the choice of sampling unit (areal vs. point sampling), affect OA relationships. We develop predictions using simulations, and test them using empirical occupancy data from remote cameras on 11 medium-large mammals. Surprisingly, our simulations demonstrate that when using point sampling, OA relationships are unaffected by spatial sampling grain (i.e., cell size). In contrast, when using areal sampling (e.g., species atlas data), OA relationships are affected by spatial grain. Furthermore, OA relationships are also affected by temporal sampling scales, where the curvature of the OA relationship increases with temporal sampling duration. Our empirical results support these predictions, showing that at any given abundance, the spatial grain of point sampling does not affect occupancy estimates, but longer surveys do increase occupancy estimates. For rare species (low occupancy), estimates of occupancy will quickly increase with longer surveys, even while abundance remains constant. 
Our results also clearly demonstrate that occupancy for mobile species without geographical closure is not true occupancy. The independence of occupancy estimates from spatial sampling grain depends on the sampling unit. Point-sampling surveys can, however, provide unbiased estimates of occupancy for multiple species simultaneously, irrespective of home-range size. The use of occupancy for trend monitoring needs to explicitly articulate how the chosen sampling scales define occupancy and affect the occupancy-abundance relationship. © 2017 by the Ecological Society of America.
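
    The abstract's central simulation result — that occupancy estimates for mobile species grow with temporal sampling duration even while abundance stays constant — can be illustrated with a toy point-sampling model. This is not the authors' simulation; the Poisson site abundance and the per-animal daily detection probability below are illustrative assumptions.

```python
import numpy as np

def occupancy_estimate(mean_abundance, p_daily, days, n_sites=20000, seed=42):
    """Toy point-sampling model: each site holds a Poisson number of
    animals (illustrative assumption), and each animal is detected
    independently with probability p_daily on each survey day.
    Returns the fraction of sites with at least one detection."""
    rng = np.random.default_rng(seed)
    abundance = rng.poisson(mean_abundance, n_sites)
    # P(no detection at a site over the survey) = (1 - p_daily)^(N * days)
    p_detect = 1.0 - (1.0 - p_daily) ** (abundance * days)
    return float((rng.random(n_sites) < p_detect).mean())

# Abundance is held constant; only the survey duration changes.
for days in (7, 30, 90):
    print(days, round(occupancy_estimate(0.5, 0.02, days), 3))
```

    Longer surveys push the estimated occupancy upward even though mean abundance never changes, mirroring the paper's warning that the chosen sampling scales define what "occupancy" means.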

  4. Quantitative characterization of spin-orbit torques in Pt/Co/Pt/Co/Ta/BTO heterostructures due to the magnetization azimuthal angle dependence

    NASA Astrophysics Data System (ADS)

    Engel, Christian; Goolaup, Sarjoosing; Luo, Feilong; Lew, Wen Siang

    2017-08-01

    Substantial understanding of spin-orbit interactions in heavy-metal (HM)/ferromagnet (FM) heterostructures is crucial for developing spin-orbit torque (SOT) spintronics devices utilizing the spin Hall and Rashba effects. Although the dependence of the SOT effective fields on the out-of-plane magnetization angle has been studied relatively extensively, their dependence on the in-plane magnetization angle remains poorly understood. Here, we analytically propose a method to compute the SOT effective fields as a function of the in-plane magnetization angle using the harmonic Hall technique in perpendicular magnetic anisotropy (PMA) structures. Two different samples with PMA, a Pt/Co/Pt/Co/Ta/BaTiO3 (BTO) test sample and a Pt/Co/Pt/Co/Ta reference sample, are studied using the derived formula. Our measurements reveal that only the dampinglike field of the test sample with a BTO capping layer exhibits an in-plane magnetization angle dependence, while no angular dependence is found in the reference sample. The presence of the BTO layer in the test sample, which gives rise to a Rashba effect at the interface, is ascribed as the source of the angular dependence of the dampinglike field.

  5. Aerothermal Ground Testing of Flexible Thermal Protection Systems for Hypersonic Inflatable Aerodynamic Decelerators

    NASA Technical Reports Server (NTRS)

    Bruce, Walter E., III; Mesick, Nathaniel J.; Ferlemann, Paul G.; Siemers, Paul M., III; DelCorso, Joseph A.; Hughes, Stephen J.; Tobin, Steven A.; Kardell, Matthew P.

    2012-01-01

    Flexible TPS development involves ground testing and analysis necessary to characterize performance of the FTPS candidates prior to flight testing. This paper provides an overview of the analysis and ground testing efforts performed over the last year at the NASA Langley Research Center and in the Boeing Large-Core Arc Tunnel (LCAT). In the LCAT test series, material layups were subjected to aerothermal loads commensurate with peak re-entry conditions enveloping a range of HIAD mission trajectories. The FTPS layups were tested over a heat flux range from 20 to 50 W/cm² with associated surface pressures of 3 to 8 kPa. To support the testing effort a significant redesign of the existing shear (wedge) model holder from previous testing efforts was undertaken to develop a new test technique for supporting and evaluating the FTPS in the high-temperature, arc jet flow. Since the FTPS test samples typically experience a geometry change during testing, computational fluid dynamic (CFD) models of the arc jet flow field and test model were developed to support the testing effort. The CFD results were used to help determine the test conditions experienced by the test samples as the surface geometry changes. This paper includes an overview of the Boeing LCAT facility, the general approach for testing FTPS, CFD analysis methodology and results, model holder design and test methodology, and selected thermal results of several FTPS layups.

  6. Virus fate and transport during recharge using recycled water at a research field site in the Montebello Forebay, Los Angeles County, California, 1997-2000

    USGS Publications Warehouse

    Anders, Robert; Yanko, William A.; Schroeder, Roy A.; Jackson, James L.

    2004-01-01

    Total and fecal coliform bacteria distributions in subsurface water samples collected at a research field site in Los Angeles County were found to increase from nondetectable levels immediately before artificial recharge using tertiary-treated municipal wastewater (recycled water). This rapid increase indicates that bacteria can move through the soil with the percolating recycled water over intervals of a few days and vertical and horizontal distances of about 3 meters. This conclusion formed the basis for three field-scale experiments using bacterial viruses (bacteriophage) MS2 and PRD1 as surrogates for human enteric viruses and bromide as a conservative tracer to determine the fate and transport of viruses in recycled water during subsurface transport under actual recharge conditions. The research field site consists of a test basin constructed adjacent to a large recharge facility (spreading grounds) located in the Montebello Forebay of Los Angeles County, California. The soil beneath the test basin is predominantly medium to coarse, moderately sorted, grayish-brown sand. The three tracer experiments were conducted during August 1997, August-September 1998, and August 2000. For each experiment, prepared solutions of bacteriophage and bromide were sprayed on the surface of the water in the test basin and injected, using peristaltic pumps, directly into the feed pipe delivering the recycled water to the test basin. Extensive data were obtained for water samples collected from the test basin itself and from depths of 0.3, 0.6, 1.0, 1.5, 3.0, and 7.6 meters below the bottom of the test basin. The rate of bacteriophage inactivation in the recycled water, independent of any processes occurring in the subsurface, was determined from measurements on water samples from the test basin. 
Regression analysis of the ratios of bacteriophage to bromide was used to determine the attenuation rates for MS2 and PRD1, defined as the logarithmic reduction in the ratio during each experiment. Although the inactivation rates increased during the third tracer experiment, they were nearly two orders of magnitude less than the attenuation rates. Therefore, adsorption, not inactivation, is the predominant removal mechanism for viruses during artificial recharge. Using the colloid-filtration model, the collision efficiency was determined for both bacteriophage during the second and third field-scale tracer experiments. The collision efficiency confirms that more favorable attachment conditions existed for PRD1, especially during the third tracer experiment. The different collision efficiencies between the second and third tracer experiments possibly were due to changing hydraulic conditions at the research field site during each experiment. The field data suggest that an optimal management scenario might exist to maximize the amount of recycled water that can be applied to the spreading grounds while still maintaining favorable attachment conditions for virus removal and thereby ensuring protection of the ground-water supply.
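
    The attenuation rate described above — the logarithmic reduction of the bacteriophage-to-bromide ratio over the course of an experiment — can be estimated with an ordinary least-squares fit of the log ratio against time. The sketch below uses hypothetical sample values, not data from the Montebello Forebay experiments.

```python
import numpy as np

def attenuation_rate(times_h, phage_counts, bromide_conc):
    """Log-reduction rate of the phage-to-bromide ratio per hour,
    estimated as the negated slope of a least-squares line fitted to
    log10(phage/bromide) versus time. Illustrative only."""
    ratio = np.asarray(phage_counts, float) / np.asarray(bromide_conc, float)
    slope, _intercept = np.polyfit(times_h, np.log10(ratio), 1)
    return -slope

# Hypothetical samples: phage declines by over an order of magnitude
# in 24 h while the conservative bromide tracer is merely diluted.
t = [0, 6, 12, 18, 24]                      # hours since injection
phage = [1e6, 5.0e5, 2.4e5, 1.2e5, 6.0e4]   # pfu/mL
bromide = [10, 9.5, 9.1, 8.8, 8.5]          # mg/L
print(attenuation_rate(t, phage, bromide))  # log10 units per hour
```

    Normalizing by the conservative tracer removes dilution from the estimate, so the fitted slope reflects only removal of the phage itself (by adsorption plus inactivation), which is the quantity the study compares against inactivation alone.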

  7. Thermal wave interference with high-power VCSEL arrays for locating vertically oriented subsurface defects

    NASA Astrophysics Data System (ADS)

    Thiel, Erik; Kreutzbruck, Marc; Studemund, Taarna; Ziegler, Mathias

    2018-04-01

    Among the photothermal methods, full-field thermal imaging is used to characterize materials, to determine thicknesses of layers, or to find inhomogeneities such as voids or cracks. The use of classical light sources such as flash lamps (impulse heating) or halogen lamps (modulated heating) led to a variety of nondestructive testing methods, in particular, lock-in and flash-thermography. In vertical-cavity surface-emitting lasers (VCSELs), laser light is emitted perpendicularly to the surface with a symmetrical beam profile. Due to the vertical structure, they can be arranged in large arrays of many thousands of individual lasers, which allows power scaling into the kilowatt range. Recently, a high-power yet very compact version of such a VCSEL-array became available that offers both the fast timing behavior of a laser as well as the large illumination area of a lamp. Moreover, it allows a spatial and temporal control of the heating because individual parts of the array can be controlled arbitrarily in frequency, amplitude, and phase. In conjunction with a fast infrared camera, such structured heating opens up a field of novel thermal imaging and testing methods. As a first demonstration of this approach, we chose a testing problem very challenging to conventional thermal infrared testing: The detection of very thin subsurface defects perpendicularly oriented to the surface of metallic samples. First, we generate destructively interfering thermal wave fields, which are then affected by the presence of defects within their reach. It turned out that this technique allows highly sensitive detection of subsurface defects down to depths in excess of the usual thermographic rule of thumb, with no need for a reference or surface preparation.

  8. Large-scale mapping and predictive modeling of submerged aquatic vegetation in a shallow eutrophic lake.

    PubMed

    Havens, Karl E; Harwell, Matthew C; Brady, Mark A; Sharfstein, Bruce; East, Therese L; Rodusky, Andrew J; Anson, Daniel; Maki, Ryan P

    2002-04-09

    A spatially intensive sampling program was developed for mapping the submerged aquatic vegetation (SAV) over an area of approximately 20,000 ha in a large, shallow lake in Florida, U.S. The sampling program integrates Geographic Information System (GIS) technology with traditional field sampling of SAV and has the capability of producing robust vegetation maps under a wide range of conditions, including high turbidity, variable depth (0 to 2 m), and variable sediment types. Based on sampling carried out in August-September 2000, we measured 1,050 to 4,300 ha of vascular SAV species and approximately 14,000 ha of the macroalga Chara spp. The results were similar to those reported in the early 1990s, when the last large-scale SAV sampling occurred. Occurrence of Chara was strongly associated with peat sediments, and maximal depths of occurrence varied between sediment types (mud, sand, rock, and peat). A simple model of Chara occurrence, based only on water depth, had an accuracy of 55%. It predicted occurrence of Chara over large areas where the plant actually was not found. A model based on sediment type and depth had an accuracy of 75% and produced a spatial map very similar to that based on observations. While this approach needs to be validated with independent data in order to test its general utility, we believe it may have application elsewhere. The simple modeling approach could serve as a coarse-scale tool for evaluating effects of water level management on Chara populations.
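
    The comparison between the depth-only model (55% accuracy) and the depth-plus-sediment model (75%) amounts to scoring two classification rules against field observations. A minimal sketch of that computation follows; the depth thresholds and site records are invented for illustration and are not the study's data.

```python
# Hypothetical maximum depths (m) of Chara occurrence by sediment type.
MAX_DEPTH_M = {"peat": 1.8, "mud": 1.2, "sand": 1.0, "rock": 0.6}

def predict_depth_only(depth_m, _sediment):
    return depth_m <= 1.5          # single depth cutoff, sediment ignored

def predict_depth_sediment(depth_m, sediment):
    return depth_m <= MAX_DEPTH_M[sediment]

# Hypothetical field records: (depth_m, sediment, Chara observed?)
sites = [
    (0.5, "peat", True), (1.6, "peat", True), (1.9, "peat", False),
    (0.8, "mud", True),  (1.4, "mud", False), (0.9, "sand", True),
    (1.3, "sand", False), (0.4, "rock", True), (1.0, "rock", True),
]

def accuracy(model):
    """Fraction of sites where the model's prediction matches the observation."""
    hits = sum(model(d, s) == obs for d, s, obs in sites)
    return hits / len(sites)

print(accuracy(predict_depth_only), accuracy(predict_depth_sediment))
```

    As in the study, adding sediment type to the rule corrects sites where a single depth cutoff over- or under-predicts occurrence, raising the agreement with observations.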

  9. Multilevel Provider-Based Sampling for Recruitment of Pregnant Women and Mother-Newborn Dyads.

    PubMed

    McLaughlin, Thomas J; Aupont, Onesky; Kozinetz, Claudia A; Hubble, David; Moore-Simas, Tiffany A; Davis, Deborah; Park, Christina; Brenner, Ruth; Sepavich, Deidre; Felice, Marianne; Caviness, Chantal; Downs, Tim; Selwyn, Beatrice J; Forman, Michele R

    2016-06-01

    In 2010, the National Children's Study launched 3 alternative recruitment methods to test possible improvements in efficiency compared with traditional household-based recruitment and participant enrollment. In 2012, a fourth method, provider-based sampling (PBS), tested a probability-based sampling of prenatal provider locations supplemented by a second cohort of neonates born at a convenience sample of maternity hospitals. From a sampling frame of 472 prenatal care provider locations and 59 maternity hospitals, 49 provider and 7 hospital locations within or just outside 3 counties participated in study recruitment. During first prenatal care visits or immediately postdelivery at these locations, face-to-face contact was used to screen and recruit eligible women. Of 1450 screened women, 1270 were eligible. Consent rates at prenatal provider locations (62%-74% by county) were similar to those at birth locations (64%-77% by county). During 6 field months, 3 study centers enrolled a total prenatal cohort of 530 women (the majority in the first trimester) and during 2 months enrolled a birth cohort of an additional 320 mother-newborn dyads. As personnel became experienced in the field, the time required to enroll a woman in the prenatal cohort declined from up to 200 hours to 50 to 100 hours per woman recruited. We demonstrated that PBS was feasible and operationally efficient in recruiting a representative cohort of newborns from 3 diverse US counties. Our findings suggest that PBS is a practical approach to recruit large pregnancy and birth cohorts across the United States. Copyright © 2016 by the American Academy of Pediatrics.

  10. Estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean.

    PubMed

    Schillaci, Michael A; Schillaci, Mario E

    2009-02-01

    The use of small sample sizes in human and primate evolutionary research is commonplace. Estimating how well small samples represent the underlying population, however, is not commonplace. Because the accuracy of determinations of taxonomy, phylogeny, and evolutionary process depends upon how well the study sample represents the population of interest, characterizing the uncertainty, or potential error, associated with analyses of small sample sizes is essential. We present a method for estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean using small (n < 10) or very small (n ≤ 5) sample sizes. This method can be used by researchers to determine post hoc the probability that their sample is a meaningful approximation of the population parameter. We tested the method using a large craniometric data set commonly used by researchers in the field. Given our results, we suggest that sample estimates of the population mean can be reasonable and meaningful even when based on small, and perhaps even very small, sample sizes.
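
    Under the common normality assumption, the quantity being estimated has a closed form: the sample mean has standard error σ/√n, so P(|x̄ − μ| < f·σ) = 2Φ(f√n) − 1, where Φ is the standard normal CDF. The sketch below computes this normal-theory version; it illustrates the idea and is not the authors' exact procedure.

```python
from math import erf, sqrt

def prob_within(f, n):
    """P(|sample mean - true mean| < f * sigma) for a sample of size n
    drawn from a normal population. Since x-bar ~ N(mu, sigma/sqrt(n)),
    this equals 2*Phi(f*sqrt(n)) - 1, and 2*Phi(z) - 1 == erf(z/sqrt(2))."""
    z = f * sqrt(n)
    return erf(z / sqrt(2))

# Probability the sample mean lies within 0.5 sigma of the true mean:
for n in (2, 5, 10):
    print(n, round(prob_within(0.5, n), 3))
```

    Even n = 5 gives a fairly high probability of landing within half a standard deviation of the true mean, which is consistent with the abstract's conclusion that very small samples can still be meaningful.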

  11. Large Area Crop Inventory Experiment (LACIE). Development of procedure M for multicrop inventory, with tests of a spring-wheat configuration

    NASA Technical Reports Server (NTRS)

    Horvath, R. (Principal Investigator); Cicone, R.; Crist, E.; Kauth, R. J.; Lambeck, P.; Malila, W. A.; Richardson, W.

    1979-01-01

    The author has identified the following significant results. An outgrowth of research and development activities in support of LACIE was a multicrop area estimation procedure, Procedure M. This procedure was a flexible, modular system that could be operated within the LACIE framework. Its distinctive features were refined preprocessing (including spatially varying correction for atmospheric haze), definition of field-like spatial features for labeling, spectral stratification, unbiased selection of samples to label, and crop area estimation without conventional maximum likelihood classification.

  12. A laboratory and field evaluation of a portable immunoassay test for triazine herbicides in environmental water samples

    USGS Publications Warehouse

    Schulze, P.A.; Capel, P.D.; Squillace, P.J.; Helsel, D.R.

    1993-01-01

    The usefulness and sensitivity of a portable immunoassay test for the semiquantitative field screening of water samples were evaluated by means of laboratory and field studies. Laboratory results indicated that the tests were useful for the determination of atrazine concentrations of 0.1 to 1.5 μg/L. At a concentration of 1 μg/L, the relative standard deviation of the difference between the regression line and the actual result was about 40 percent. The immunoassay was less sensitive to other triazine herbicides and produced similar errors for them. After standardization, the test results were relatively insensitive to ionic content and variations in pH (range, 4 to 10), mildly sensitive to temperature changes, and quite sensitive to the timing of the final incubation step; variances in timing can be a significant source of error. Almost all of the immunoassays predicted a higher atrazine concentration in water samples than did gas chromatography. If these tests are used as a semiquantitative screening tool, this tendency toward overprediction does not diminish their usefulness. Overall, the tests appear to be a valuable method for screening water samples for triazine herbicides.

  13. Toward detecting California shrubland canopy chemistry with AIS data

    NASA Technical Reports Server (NTRS)

    Price, Curtis V.; Westman, Walter E.

    1987-01-01

    Airborne Imaging Spectrometer (AIS)-2 data of coastal sage scrub vegetation were examined for fine spectral features that might be used to predict concentrations of certain canopy chemical constituents. A Fourier notch filter was applied to the AIS data and the TREE and ROCK mode spectra were ratioed to a flat field. Portions of the resulting spectra resemble spectra for plant cellulose and starch in that both show reduced reflectance at 2100 and 2270 nm. The latter are regions of absorption of energy by organic bonds found in starch and cellulose. Whether the relationship is sufficient to predict the concentration of these chemicals from AIS spectra will require testing of the predictive ability of these wavebands with large field sample sizes.

  14. A panel of microsatellites to individually identify leopards and its application to leopard monitoring in human dominated landscapes.

    PubMed

    Mondol, Samrat; Navya, R; Athreya, Vidya; Sunagar, Kartik; Selvaraj, Velu Mani; Ramakrishnan, Uma

    2009-12-04

    Leopards are the most widely distributed of the large cats, ranging from Africa to the Russian Far East. Because of habitat fragmentation, high human population densities and the inherent adaptability of this species, they now occupy landscapes close to human settlements. As a result, they are the most common species involved in human-wildlife conflict in India, necessitating their monitoring. However, their elusive nature makes such monitoring difficult. Recent advances in DNA methods along with non-invasive sampling techniques can be used to monitor populations and individuals across large landscapes, including human-dominated ones. In this paper, we describe a DNA-based method for leopard individual identification where we used fecal DNA samples to obtain genetic material. Further, we apply our methods to non-invasive samples collected in a human-dominated landscape to estimate the minimum number of leopards in this human-leopard conflict area in Western India. In this study, 25 of the 29 tested cross-specific microsatellite markers showed positive amplification in 37 wild-caught leopards. These loci revealed varied levels of polymorphism (4 to 12 alleles) and heterozygosity (0.05 to 0.79). Combining data on amplification success (including non-invasive samples) and locus-specific polymorphisms, we showed that eight loci provide a sibling probability of identity of 0.0005, suggesting that this panel can be used to discriminate individuals in the wild. When this microsatellite panel was applied to fecal samples collected from a human-dominated landscape, we identified seven individuals, with a sibling probability of identity of 0.001. Amplification success of field-collected scats was up to 72%, and genotyping error ranged from 0 to 7.4%. Our results demonstrated that the selected panel of eight microsatellite loci can conclusively identify leopards from various kinds of biological samples. Our methods can be used to monitor leopards over small and large landscapes to assess population trends, and could also be tested for population assignment in forensic applications.
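    The sibling probability-of-identity figures quoted above (0.0005 for the panel; 0.001 for the field samples) are conventionally computed per locus from allele frequencies and multiplied across loci. As a minimal sketch of that calculation, using the standard P(ID)sib formula (Waits et al. 2001) with hypothetical allele frequencies rather than the study's leopard data:

```python
# Sibling probability of identity per locus:
#   P(ID)sib = 0.25 + 0.5*S2 + 0.5*S2**2 - 0.25*S4,
# where S2 = sum(p_i**2) and S4 = sum(p_i**4) over allele frequencies p_i.
# Multiplying across independent loci gives the panel-wide value.

def pid_sib_locus(freqs):
    s2 = sum(p * p for p in freqs)
    s4 = sum(p ** 4 for p in freqs)
    return 0.25 + 0.5 * s2 + 0.5 * s2 ** 2 - 0.25 * s4

def pid_sib_panel(loci):
    prod = 1.0
    for freqs in loci:
        prod *= pid_sib_locus(freqs)
    return prod

# Eight moderately polymorphic loci with hypothetical allele frequencies.
panel = [[0.4, 0.3, 0.2, 0.1]] * 8
print(pid_sib_panel(panel))  # a small value; more alleles per locus drive it lower
```

A monomorphic locus gives P(ID)sib = 1 (no discriminating power), which is a quick sanity check on the formula.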

  15. A panel of microsatellites to individually identify leopards and its application to leopard monitoring in human dominated landscapes

    PubMed Central

    2009-01-01

    Background Leopards are the most widely distributed of the large cats, ranging from Africa to the Russian Far East. Because of habitat fragmentation, high human population densities and the inherent adaptability of this species, they now occupy landscapes close to human settlements. As a result, they are the most common species involved in human-wildlife conflict in India, necessitating their monitoring. However, their elusive nature makes such monitoring difficult. Recent advances in DNA methods along with non-invasive sampling techniques can be used to monitor populations and individuals across large landscapes, including human-dominated ones. In this paper, we describe a DNA-based method for leopard individual identification where we used fecal DNA samples to obtain genetic material. Further, we apply our methods to non-invasive samples collected in a human-dominated landscape to estimate the minimum number of leopards in this human-leopard conflict area in Western India. Results In this study, 25 of the 29 tested cross-specific microsatellite markers showed positive amplification in 37 wild-caught leopards. These loci revealed varied levels of polymorphism (4 to 12 alleles) and heterozygosity (0.05 to 0.79). Combining data on amplification success (including non-invasive samples) and locus-specific polymorphisms, we showed that eight loci provide a sibling probability of identity of 0.0005, suggesting that this panel can be used to discriminate individuals in the wild. When this microsatellite panel was applied to fecal samples collected from a human-dominated landscape, we identified seven individuals, with a sibling probability of identity of 0.001. Amplification success of field-collected scats was up to 72%, and genotyping error ranged from 0 to 7.4%. Conclusion Our results demonstrated that the selected panel of eight microsatellite loci can conclusively identify leopards from various kinds of biological samples. Our methods can be used to monitor leopards over small and large landscapes to assess population trends, and could also be tested for population assignment in forensic applications. PMID:19961605

  16. Lead in rice: analysis of baseline lead levels in market and field collected rice grains.

    PubMed

    Norton, Gareth J; Williams, Paul N; Adomako, Eureka E; Price, Adam H; Zhu, Yongguan; Zhao, Fang-Jie; McGrath, Steve; Deacon, Claire M; Villada, Antia; Sommella, Alessia; Lu, Ying; Ming, Lei; De Silva, P Mangala C S; Brammer, Hugh; Dasgupta, Tapash; Islam, M Rafiqul; Meharg, Andrew A

    2014-07-01

    In a large-scale survey of rice grains from markets (13 countries) and fields (6 countries), a total of 1578 rice grain samples were analysed for lead. Of the market-collected samples, only 0.6% exceeded the Chinese and EU limit of 0.2 μg g⁻¹ lead in rice (when excluding samples collected from known contaminated/mine-impacted regions). When the rice grain samples were evaluated against the Food and Drug Administration's (FDA) provisional total tolerable intake (PTTI) values for children and pregnant women, it was found that only people consuming large quantities of rice were at risk of exceeding the PTTI from rice alone. Furthermore, six field experiments were conducted to evaluate the proportion of the variation in lead concentration in rice grains due to genetics. Four of the six field experiments showed significant differences between genotypes, but when the genotypes common across all six field sites were assessed, only 4% of the variation was explained by genotype, with 9.5% and 11% of the variation explained by the environment and the genotype-by-environment interaction, respectively. Further work is needed to identify the sources of lead contamination in rice, with detailed information obtained on the locations and environments where the rice is sampled, so that specific risk assessments can be performed. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Within-field variability of plant and soil parameters

    NASA Technical Reports Server (NTRS)

    Ulaby, F. T. (Principal Investigator); Brisco, B.; Dobson, C.

    1981-01-01

    The variability of ground truth data collected for vegetation experiments was investigated. Two fields of wheat and one field of corn were sampled on two different dates. The variability of crop and soil parameters within a field, between two fields of the same type, and within a field over time were compared statistically. The number of samples from each test site required to determine with confidence the mean and standard deviation of a given variable was established. Eight samples were found to be adequate for plant height determinations, while twenty samples were required for plant moisture and soil moisture characterization. Eighteen samples were necessary for detecting within-field variability over time and between-field variability for the same crop. The necessary sample sizes vary according to the physiological growth stage of the crop and recent weather events that affect the moisture and/or height characteristics of the field in question.

  18. Conditional sampling technique to test the applicability of the Taylor hypothesis for the large-scale coherent structures

    NASA Technical Reports Server (NTRS)

    Hussain, A. K. M. F.

    1980-01-01

    Comparisons of the distributions of large-scale structures in turbulent flow with distributions based on time-dependent signals from stationary probes and the Taylor hypothesis are presented. The study investigated an area in the near field of a 7.62 cm circular air jet at a Reynolds number of 32,000, with coherent structures induced through small-amplitude controlled excitation and stable vortex pairing in the jet column mode. Hot-wire and X-wire anemometry were employed to establish phase-averaged spatial distributions of longitudinal and lateral velocities, coherent Reynolds stress and vorticity, background turbulent intensities, streamlines and pseudo-stream functions. The Taylor hypothesis was used to calculate spatial distributions of the phase-averaged properties, with results indicating that the use of the local time-average velocity or streamwise velocity produces large distortions.

  19. Prototype sampling system for measuring workplace protection factors for gases and vapors.

    PubMed

    Groves, William A; Reynolds, Stephen J

    2003-05-01

    A prototype sampling system for measuring respirator workplace protection factors (WPFs) was developed. Methods for measuring the concentration of contaminants inside respirators have previously been described; however, these studies have typically involved continuous sampling of aerosols. Our work focuses on developing an intermittent sampling system designed to measure the concentration of gases and vapors during inspiration. This approach addresses two potential problems associated with continuous sampling: biased results due to lower contaminant concentrations and high humidity in exhaled air. The system consists of a pressure transducer circuit designed to activate a pair of personal sampling pumps during inspiration based on differential pressure inside the respirator. One pump draws air from inside the respirator while the second samples the ambient air. Solid granular adsorbent tubes are used to trap the contaminants, making the approach applicable to a large number of gases and vapors. Laboratory testing was performed using a respirator mounted on a headform connected to a breathing machine producing a sinusoidal flow pattern with an average flow rate of 20 L/min and a period of 3 seconds. The sampling system was adjusted to activate the pumps when the pressure inside the respirator was less than −0.1 inch H₂O. Quantitative fit-tests using human subjects were conducted to evaluate the effect of the sampling system on respirator performance. A total of 299 fit-tests were completed for two different types of respirators (half- and full-facepiece) from two different manufacturers (MSA and North). Statistical tests showed no significant differences between mean fit factors for respirators equipped with the sampling system versus unmodified respirators. Field testing of the prototype sampling system was performed in livestock production facilities and estimates of WPFs for ammonia were obtained.
Results demonstrate the feasibility of this approach and will be used in developing improved instrumentation for measuring WPFs.

  20. Testing the application of Teflon/quartz soil solution samplers for DOM sampling in the Critical Zone: Field and laboratory approaches

    NASA Astrophysics Data System (ADS)

    Dolan, E. M.; Perdrial, J. N.; Vazquez, A.; Hernández, S.; Chorover, J.

    2010-12-01

    Elizabeth Dolan1,2, Julia Perdrial3, Angélica Vázquez-Ortega3, Selene Hernández-Ruiz3, Jon Chorover3 1Department of Soil, Environmental, and Atmospheric Science, University of Missouri. 2Biosphere 2, University of Arizona. 3Department of Soil, Water, and Environmental Science, University of Arizona. Abstract: The behavior of dissolved organic matter (DOM) in soil is important to many biogeochemical processes. Extraction methods to obtain DOM from the unsaturated zone remain a current focus of research, as different methods can influence the type and concentration of DOM obtained. Thus, the present comparison study evaluates three methods of soil solution sampling to assess their impact on DOM quantity and quality: 1) aqueous soil extracts, 2) solution yielded from laboratory-installed suction cup samplers, and 3) solutions from field-installed suction cup samplers. All samples were analyzed for dissolved organic carbon and total nitrogen concentrations. Moreover, DOM quality was analyzed using fluorescence, UV-Vis and FTIR spectroscopies. Results indicate higher DOC values for laboratory-extracted DOM: 20 mg/L for aqueous soil extracts and 31 mg/L for lab-installed samplers, compared to 12 mg/L for field-installed samplers. Large variations in C/N ratios were also observed, ranging from 1.5 in laboratory-extracted DOM to 11 in field samples. Fluorescence excitation-emission matrices of DOM solutions obtained by the laboratory extraction methods showed higher intensities in regions typical of fulvic and humic acid-like materials relative to those extracted in the field. Similarly, the molar absorptivity calculated from DOC-normalized UV-Vis absorbance of the laboratory-derived solutions was significantly higher, indicating greater aromaticity. The observed differences can be attributed to soil disturbance associated with obtaining laboratory-derived solution samples. Our results indicate that laboratory extraction methods are not comparable to in-situ field soil solution extraction in terms of DOM quantity and quality.

  1. Large-scale changes in network interactions as a physiological signature of spatial neglect.

    PubMed

    Baldassarre, Antonello; Ramsey, Lenny; Hacker, Carl L; Callejas, Alicia; Astafiev, Serguei V; Metcalf, Nicholas V; Zinn, Kristi; Rengachary, Jennifer; Snyder, Abraham Z; Carter, Alex R; Shulman, Gordon L; Corbetta, Maurizio

    2014-12-01

    The relationship between spontaneous brain activity and behaviour following focal injury is not well understood. Here, we report a large-scale study of resting state functional connectivity MRI and spatial neglect following stroke in a large (n=84) heterogeneous sample of first-ever stroke patients (within 1-2 weeks). Spatial neglect, which is typically more severe after right than left hemisphere injury, includes deficits of spatial attention and motor actions contralateral to the lesion, and low general attention due to impaired vigilance/arousal. Patients underwent structural and resting state functional MRI scans, and spatial neglect was measured using the Posner spatial cueing task, and Mesulam and Behavioural Inattention Test cancellation tests. A principal component analysis of the behavioural tests revealed a main factor accounting for 34% of variance that captured three correlated behavioural deficits: visual neglect of the contralesional visual field, visuomotor neglect of the contralesional field, and low overall performance. In an independent sample (21 healthy subjects), we defined 10 resting state networks consisting of 169 brain regions: visual-fovea and visual-periphery, sensory-motor, auditory, dorsal attention, ventral attention, language, fronto-parietal control, cingulo-opercular control, and default mode. We correlated the neglect factor score with the strength of resting state functional connectivity within and across the 10 resting state networks. All damaged brain voxels were removed from the functional connectivity:behaviour correlational analysis. We found that the correlated behavioural deficits summarized by the factor score were associated with correlated multi-network patterns of abnormal functional connectivity involving large swaths of cortex. 
Specifically, dorsal attention and sensory-motor networks showed: (i) reduced interhemispheric functional connectivity; (ii) reduced anti-correlation with fronto-parietal and default mode networks in the right hemisphere; and (iii) increased intrahemispheric connectivity with the basal ganglia. These patterns of functional connectivity:behaviour correlations were stronger in patients with right- as compared to left-hemisphere damage and were independent of lesion volume. Our findings identify large-scale changes in resting state network interactions that are a physiological signature of spatial neglect and may relate to its right hemisphere lateralization. © The Author (2014). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  2. The cost of large numbers of hypothesis tests on power, effect size and sample size.

    PubMed

    Lazzeroni, L C; Ray, A

    2012-01-01

    Advances in high-throughput biology and computer science are driving an exponential increase in the number of hypothesis tests in genomics and other scientific disciplines. Studies using current genotyping platforms frequently include a million or more tests. In addition to the monetary cost, this increase imposes a statistical cost owing to the multiple testing corrections needed to avoid large numbers of false-positive results. To safeguard against the resulting loss of power, some have suggested sample sizes on the order of tens of thousands, which can be impractical for many diseases or may lower the quality of phenotypic measurements. This study examines the relationship between the number of tests on the one hand and power, detectable effect size or required sample size on the other. We show that once the number of tests is large, power can be maintained at a constant level, with comparatively small increases in the effect size or sample size. For example, at the 0.05 significance level, a 13% increase in sample size is needed to maintain 80% power for ten million tests compared with one million tests, whereas a 70% increase in sample size is needed for 10 tests compared with a single test. Relative costs are less when measured by increases in the detectable effect size. We provide an interactive Excel calculator to compute power, effect size or sample size when comparing study designs or genome platforms involving different numbers of hypothesis tests. The results are reassuring in an era of extreme multiple testing.
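    The paper's calculator is an Excel tool; the 13% and 70% figures quoted above can be reproduced with a normal-approximation sketch, assuming a Bonferroni-corrected two-sided z-test in which the required sample size scales as (z for the corrected significance level plus z for the target power) squared:

```python
from statistics import NormalDist

def sample_size_ratio(m1, m2, alpha=0.05, power=0.80):
    """Ratio of sample sizes needed to keep `power` when the number of
    tests grows from m1 to m2, under a Bonferroni-corrected two-sided
    z-test: n is proportional to (z_{alpha/(2m)} + z_{power})**2."""
    nd = NormalDist()
    z_power = nd.inv_cdf(power)

    def n_factor(m):
        return (nd.inv_cdf(1 - alpha / (2 * m)) + z_power) ** 2

    return n_factor(m2) / n_factor(m1)

print(round(100 * (sample_size_ratio(1, 10) - 1)))      # 70 (% more samples)
print(round(100 * (sample_size_ratio(1e6, 1e7) - 1)))   # 13 (% more samples)
```

The Bonferroni correction is a simplifying assumption here, but it makes the abstract's point concrete: the required z grows only logarithmically with the number of tests, so each further tenfold increase in tests costs comparatively little extra sample.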

  3. High-resolution hydrodynamic chromatographic separation of large DNA using narrow, bare open capillaries: a rapid and economical alternative technology to pulsed-field gel electrophoresis?

    PubMed

    Liu, Lei; Veerappan, Vijaykumar; Pu, Qiaosheng; Cheng, Chang; Wang, Xiayan; Lu, Liping; Allen, Randy D; Guo, Guangsheng

    2014-01-07

    A high-resolution, rapid, and economical hydrodynamic chromatographic (HDC) method for large DNA separations in free solution was developed using narrow (5 μm diameter), bare open capillaries. Size-based separation was achieved in a chromatographic format, with larger DNA molecules eluting faster than smaller ones. Lambda DNA Mono Cut Mix was baseline-separated, with percentage resolutions generally less than 9.0% for all DNA fragments (1.5 to 48.5 kbp) tested in this work. High efficiencies were achieved for large DNA with this chromatographic technique: the number of theoretical plates reached 3.6 × 10⁵ for the longest (48.5 kbp) and 3.7 × 10⁵ for the shortest (1.5 kbp) fragments. HDC parameters and performance are also discussed. The method was further applied to fractionating large DNA fragments from real-world samples (SacII-digested Arabidopsis plant bacterial artificial chromosome (BAC) DNA and PmeI-digested rice BAC DNA) to demonstrate its feasibility for BAC DNA fingerprinting. Rapid separation of PmeI-digested rice BAC DNA covering 0.44 to 119.041 kbp was achieved in less than 26 min. All DNA fragments of these samples were baseline-separated in narrow bare open capillaries, whereas the smallest fragment (0.44 kbp) was missing in the pulsed-field gel electrophoresis (PFGE) separation mode. It is demonstrated that narrow bare open capillary chromatography can achieve rapid separation of a wide size range of DNA mixtures containing both small and large DNA fragments in a single run.

  4. 'Nano-immuno test' for the detection of live Mycobacterium avium subspecies paratuberculosis bacilli in the milk samples using magnetic nano-particles and chromogen.

    PubMed

    Singh, Manju; Singh, Shoor Vir; Gupta, Saurabh; Chaubey, Kundan Kumar; Stephan, Bjorn John; Sohal, Jagdip Singh; Dutta, Manali

    2018-04-26

    Early rapid detection of Mycobacterium avium subspecies paratuberculosis (MAP) bacilli in milk samples is a major challenge, since the traditional culture method is time consuming and laboratory dependent. We report a simple, sensitive and specific nanotechnology-based 'nano-immuno test' capable of detecting viable MAP bacilli in milk samples within 10 h. Viable MAP bacilli were captured by MAP-specific antibody-conjugated magnetic nano-particles, using resazurin dye as chromogen. The test was optimized using true culture-positive (10 bovine and 12 goat) and true culture-negative (16 bovine and 25 goat) raw milk samples. Domestic livestock species in India are endemically infected with MAP. After successful optimization, the sensitivity and specificity of the 'nano-immuno test' in goats with respect to milk culture were 91.7% and 96.0%, respectively, and 90.0% (sensitivity) and 92.6% (specificity) with respect to IS900 PCR. In bovine milk samples, the sensitivity and specificity of the 'nano-immuno test' with respect to milk culture were 90.0% and 93.7%, respectively; with respect to IS900 PCR, they were 88.9% and 94.1%. The test was validated with field raw milk samples (258 goat and 138 bovine) collected from domestic livestock species to detect live/viable MAP bacilli. Of the 138 bovine raw milk samples screened by six diagnostic tests, 81 (58.7%) were positive for MAP infection in one or more diagnostic tests. Of these 81 (58.7%) positive bovine raw milk samples, only 24 (17.4%) were positive for the presence of viable MAP bacilli. Of the 258 goat raw milk samples screened by six diagnostic tests, 141 (54.6%) were positive for MAP infection in one or more tests. Of these 141 (54.6%) positive raw milk samples from goats, only 48 (34.0%) were positive for live MAP bacilli. The simplicity and efficiency of this novel 'nano-immuno test' make it suitable for wide-scale screening of milk samples in the field. Standardization, validation and re-usability of the functionalized nano-particles and the test were successfully demonstrated on field samples. The test is highly specific, simple to perform and easy to read with the naked eye, and does not require laboratory support. It has the potential to be used as a screening test to estimate the bio-load of MAP in milk samples at the national level.
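    Sensitivity and specificity figures like those above come from simple 2 x 2 counts against a reference test. As an illustrative sketch (the goat counts below are inferred from the quoted percentages, 11/12 ≈ 91.7% and 24/25 = 96.0%; they are not reported directly in the abstract):

```python
def sensitivity(tp, fn):
    # true positives / all samples positive by the reference test
    return tp / (tp + fn)

def specificity(tn, fp):
    # true negatives / all samples negative by the reference test
    return tn / (tn + fp)

# Goat milk panel vs. culture: 12 culture-positive, 25 culture-negative samples.
print(round(100 * sensitivity(tp=11, fn=1), 1))  # 91.7
print(round(100 * specificity(tn=24, fp=1), 1))  # 96.0
```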

  5. Detection of microwave emission due to rock fracture as a new tool for geophysics: A field test at a volcano in Miyake Island, Japan

    NASA Astrophysics Data System (ADS)

    Takano, Tadashi; Maeda, Takashi; Miki, Yoji; Akatsuka, Sayo; Hattori, Katsumi; Nishihashi, Masahide; Kaida, Daishi; Hirano, Takuya

    2013-07-01

    This paper describes a field test to verify a newly discovered phenomenon of microwave emission due to rock fracture in a volcano. The field test was carried out on Miyake Island, 150 km south of Tokyo. The main objective of the test was to investigate the applicability of the phenomenon to the study of geophysics, volcanology, and seismology by extending observations of the phenomenon from the laboratory to the natural field. We installed measuring systems for the 300 MHz, 2 GHz, and 18 GHz bands on the mountaintop and at the mountain foot in order to discriminate local events from regional and global events. The systems include dedicated data subsystems that store slowly sampled data over the long term, and fast-sampled data when triggered. We successfully obtained data from January to February 2008. During this period, characteristic microwave pulses were intermittently detected at 300 MHz. Two photographs taken before and after this period revealed that a considerably large-scale collapse occurred on the crater cliff. Moreover, seismograms obtained by nearby observatories strongly suggest that the crater subsidence occurred simultaneously with microwave signals on the same day during the observation period. To confirm that the microwave emission was caused by rock fracture, these microwave signals had to be clearly discriminated from noise, interference, and other disturbances. We carefully compared the microwave data taken at the mountaintop and foot, checked the lightning strike data around the island, and consequently concluded that these microwave signals could not be attributed to lightning. Artificial interference was discriminated by the nature of its waveforms. Thus, we inferred that the signals detected at 300 MHz were due to rock fractures during cliff collapses. This result may provide a useful new tool for geoscientists and for the mitigation of natural hazards.

  6. A field test of cut-off importance sampling for bole volume

    Treesearch

    Jeffrey H. Gove; Harry T. Valentine; Michael J. Holmes

    2000-01-01

    Cut-off importance sampling has recently been introduced as a technique for estimating bole volume to some point below the tree tip, termed the cut-off point. A field test of this technique was conducted on a small population of eastern white pine trees using dendrometry as the standard for volume estimation. Results showed that the differences in volume estimates...

  7. Role of extrinsic noise in the sensitivity of the rod pathway: rapid dark adaptation of nocturnal vision in humans.

    PubMed

    Reeves, Adam; Grayhem, Rebecca

    2016-03-01

    Rod-mediated 500 nm test spots were flashed in Maxwellian view at 5 deg eccentricity, both on steady 10.4 deg fields of intensities (I) from 0.00001 to 1.0 scotopic troland (sc td) and from 0.2 s to 1 s after extinguishing the field. On dim fields, thresholds of tiny (5') tests were proportional to √I (Rose-DeVries law), while thresholds after extinction fell within 0.6 s to the fully dark-adapted absolute threshold. Thresholds of large (1.3 deg) tests were proportional to I (Weber law), and extinction thresholds to √I. Thus, rod thresholds are elevated by photon-driven noise from dim fields that disappears at field extinction; large-spot thresholds are additionally elevated by neural light adaptation proportional to √I. At night, recovery from dimly lit fields is fast, not slow.
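    The two threshold laws contrasted above can be written down directly. A toy sketch with arbitrary proportionality constants (not fitted to the study's data):

```python
# De Vries-Rose behaviour (small spots): threshold grows as sqrt(I).
# Weber behaviour (large spots): threshold grows in proportion to I.
# k_small and k_large are arbitrary illustration constants.

def small_spot_threshold(I, k_small=1.0):
    return k_small * I ** 0.5

def large_spot_threshold(I, k_large=1.0):
    return k_large * I

# A 4-fold drop in field intensity lowers the small-spot threshold
# 2-fold but the large-spot threshold 4-fold.
print(small_spot_threshold(1.0) / small_spot_threshold(0.25))  # 2.0
print(large_spot_threshold(1.0) / large_spot_threshold(0.25))  # 4.0
```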

  8. Evaluation of EMIT and RIA high volume test procedures for THC metabolites in urine utilizing GC/MS confirmation.

    PubMed

    Abercrombie, M L; Jewell, J S

    1986-01-01

    Results of EMIT, Abuscreen RIA, and GC/MS tests for THC metabolites in a high volume random urinalysis program are compared. Samples were field tested by non-laboratory personnel with an EMIT system using a 100 ng/mL cutoff. Samples were then sent to the Army Forensic Toxicology Drug Testing Laboratory (WRAMC) at Fort Meade, Maryland, where they were tested by RIA (Abuscreen) using a statistical 100 ng/mL cutoff. Confirmations of all RIA positives were accomplished using a GC/MS procedure. EMIT and RIA results agreed for 91% of samples. Data indicated a 4% false positive rate and a 10% false negative rate for EMIT field testing. In a related study, results for samples which tested positive by RIA for THC metabolites using a statistical 100 ng/mL cutoff were compared with results by GC/MS utilizing a 20 ng/mL cutoff for the THCA metabolite. Presence of THCA metabolite was detected in 99.7% of RIA positive samples. No relationship between quantitations determined by the two tests was found.

  9. Measurements of dimethyl sulfide and SO2 during GTE/CITE 3

    NASA Technical Reports Server (NTRS)

    Ferek, Ronald J.; Hegg, Dean A.

    1993-01-01

    As part of NASA's Global Tropospheric Experiment Chemical Instrumentation Test and Evaluation (GTE/CITE 3) Sulfur Gas Intercomparison, we conducted measurements of dimethyl sulfide (DMS) and SO2 using two techniques well suited to sampling from an aircraft owing to their simplicity of design. DMS was collected by preconcentration on gold wire preceded by a KOH-impregnated filter oxidant scrubber, and analyzed by gas chromatography with flame photometric detection. SO2 was collected on K2CO3/glycerol-impregnated filters and analyzed by ion chromatography. In blind tests, both techniques produced excellent agreement with National Institute of Standards and Technology (NIST) standards. For field measurements, the DMS technique correlated excellently with the mean of the six different techniques intercompared. For SO2, the five techniques intercompared were rather poorly correlated, but correlations among the three techniques that passed the NIST standards tests were somewhat better. Our SO2 filter measurements exhibited rather large uncertainties due to higher than normal variability of the filter blanks, which we believe was caused by extended storage in the field. In measurements conducted off the coast of Natal, Brazil, a diurnal afternoon minimum in DMS concentrations accompanied by a corresponding maximum in SO2 concentrations was observed. However, owing to the rather large uncertainties in the SO2 measurements, any conclusions about the SO2 trend must be considered tentative.

  10. A continuously weighing, high frequency sand trap: Wind tunnel and field evaluations

    NASA Astrophysics Data System (ADS)

    Yang, Fan; Yang, XingHua; Huo, Wen; Ali, Mamtimin; Zheng, XinQian; Zhou, ChengLong; He, Qing

    2017-09-01

    A new continuously weighing, high frequency sand trap (CWHF) has been designed. Its sampling efficiency was evaluated in a wind tunnel, and the potential of the new trap has been demonstrated in field trials. The newly designed sand trap allows fully automated, high frequency measurement of sediment fluxes over extensive periods. We show that it can capture the variations and structures of wind-driven sand transport processes and horizontal sediment flux, and reveal the relationships between sand transport and meteorological parameters. Its maximum sampling frequency can reach 10 Hz. Wind tunnel tests indicated that the sampling efficiency of the CWHF sand trap varies between 39.2% and 64.3%, with an average of 52.5%. It achieved a maximum sampling efficiency of 64.3% at a wind speed of 10 m s⁻¹. This is largely achieved by the inclusion of a vent hole, which leads to a higher sampling efficiency than that of a step-like sand trap at high wind speeds. In field experiments, we show good agreement between the mass of sediment from the CWHF sand trap, the wind speed at 2 m, and the number of saltating particles at 5 cm above the ground surface. According to analysis of the horizontal sediment flux at four heights from the CWHF sand trap (25, 35, 50, and 100 cm), the vertical distribution of the horizontal sediment flux up to a height of 100 cm above the sand surface follows an exponential function. Our field experiments show that the new instrument can capture more detailed information on sediment transport with a much reduced labor requirement. It therefore has great potential for application in wind-blown sand monitoring and process studies.
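    The exponential decay of horizontal flux with height reported above can be recovered with a log-linear least-squares fit. A sketch using the paper's four sampling heights but hypothetical catch masses (chosen to follow q(z) = q0 · exp(−z/z0), not taken from the study):

```python
import math

# Hypothetical trap catches (g) at the four CWHF sampling heights (cm);
# the values roughly follow q(z) = q0 * exp(-z / z0) with z0 near 25 cm.
heights = [25, 35, 50, 100]
catch_g = [12.0, 8.1, 4.4, 0.6]

# Ordinary least squares on log q = log(q0) - z / z0.
ys = [math.log(q) for q in catch_g]
n = len(heights)
mx = sum(heights) / n
my = sum(ys) / n
sxx = sum((x - mx) ** 2 for x in heights)
sxy = sum((x - mx) * (y - my) for x, y in zip(heights, ys))
slope = sxy / sxx
q0 = math.exp(my - slope * mx)   # extrapolated flux at the surface
z0 = -1.0 / slope                # e-folding (decay) height in cm
print(f"q0 = {q0:.1f} g, z0 = {z0:.1f} cm")
```

Fitting in log space is the usual shortcut for an exponential profile; with real trap data one would weight points or fit nonlinearly, since log-transforming inflates the influence of the small catches at the top of the profile.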

  11. Soil Sampling Techniques For Alabama Grain Fields

    NASA Technical Reports Server (NTRS)

    Thompson, A. N.; Shaw, J. N.; Mask, P. L.; Touchton, J. T.; Rickman, D.

    2003-01-01

    Characterizing the spatial variability of nutrients facilitates precision soil sampling. Questions exist regarding the best technique for directed soil sampling based on a priori knowledge of soil and crop patterns. The objective of this study was to evaluate zone delineation techniques for Alabama grain fields to determine which method best minimized soil test variability. Site one (25.8 ha) and site three (20.0 ha) were located in the Tennessee Valley region, and site two (24.2 ha) was located in the Coastal Plain region of Alabama. Tennessee Valley soils ranged from well drained Rhodic and Typic Paleudults to somewhat poorly drained Aquic Paleudults and Fluventic Dystrudepts. Coastal Plain soils ranged from coarse-loamy Rhodic Kandiudults to loamy Arenic Kandiudults. Soils were sampled by grid soil sampling methods (grid sizes of 0.40 ha and 1 ha) consisting of 1) twenty composited cores collected randomly throughout each grid (grid-cell sampling) and 2) six composited cores collected randomly from a 3 x 3 m area at the center of each grid (grid-point sampling). Zones were established from 1) an Order 1 Soil Survey, 2) corn (Zea mays L.) yield maps, and 3) airborne remote sensing images. All soil properties were moderately to strongly spatially dependent as per semivariogram analyses. Differences in grid-point and grid-cell soil test values suggested grid-point sampling does not accurately represent grid values. Zones created by soil survey, yield data, and remote sensing images displayed lower coefficients of variation (%CV) for soil test values than overall field values, suggesting these techniques effectively group soil test variability. However, few differences were observed between the three zone delineation techniques. Results suggest directed sampling using the zone delineation techniques outlined in this paper would result in more efficient soil sampling for these Alabama grain fields.

  12. Gravitational waves and large field inflation

    NASA Astrophysics Data System (ADS)

    Linde, Andrei

    2017-02-01

    According to the famous Lyth bound, one can confirm large field inflation by finding tensor modes with a sufficiently large tensor-to-scalar ratio r. Here we will try to answer two related questions: is it possible to rule out all large field inflationary models by not finding tensor modes with r above some critical value, and what can we say about the scale of inflation by measuring r? However, in order to answer these questions one should distinguish between two different definitions of large field inflation and three different definitions of the scale of inflation. We will examine these issues using the theory of cosmological α-attractors as a convenient testing ground.

  13. Optimal spatial sampling techniques for ground truth data in microwave remote sensing of soil moisture

    NASA Technical Reports Server (NTRS)

    Rao, R. G. S.; Ulaby, F. T.

    1977-01-01

    The paper examines optimal sampling techniques for obtaining accurate spatial averages of soil moisture, at various depths and for cell sizes in the range 2.5-40 acres, with a minimum number of samples. Both simple random sampling and stratified sampling procedures are used to reach a set of recommended sample sizes for each depth and for each cell size. Major conclusions from statistical sampling test results are that (1) the number of samples required decreases with increasing depth; (2) when the total number of samples cannot be prespecified or the moisture in only one single layer is of interest, then a simple random sample procedure should be used which is based on the observed mean and SD for data from a single field; (3) when the total number of samples can be prespecified and the objective is to measure the soil moisture profile with depth, then stratified random sampling based on optimal allocation should be used; and (4) decreasing the sensor resolution cell size leads to fairly large decreases in sample sizes with stratified sampling procedures, whereas only a moderate decrease is obtained in simple random sampling procedures.
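    Conclusion (3) above, stratified random sampling with optimal allocation, is commonly implemented as Neyman allocation: each stratum receives samples in proportion to its size times its within-stratum standard deviation. A sketch under invented strata (the depth layers and SDs are assumptions for illustration):

    ```python
    def neyman_allocation(total_n, strata):
        """Allocate a fixed total sample size across strata in proportion
        to stratum size times stratum SD (Neyman optimal allocation).
        Note: rounding may make the sum differ slightly from total_n."""
        weights = {name: size * sd for name, (size, sd) in strata.items()}
        total_w = sum(weights.values())
        return {name: round(total_n * w / total_w) for name, w in weights.items()}

    # Hypothetical soil-moisture strata: (number of units, within-stratum SD).
    strata = {
        "surface_0_2cm": (40, 4.0),   # surface moisture varies most
        "depth_2_5cm":   (40, 2.5),
        "depth_5_9cm":   (40, 1.5),   # deeper layers vary least -> fewer samples
    }

    print(neyman_allocation(32, strata))
    ```

    The more variable surface layer absorbs most of the budget, consistent with conclusion (1) that deeper, less variable layers need fewer samples.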

  14. Understanding the effect of pulsed electric fields on thermostability of connective tissue isolated from beef pectoralis muscle using a model system.

    PubMed

    Alahakoon, A U; Oey, I; Silcock, P; Bremer, P

    2017-10-01

    Brisket is a low-value, tough meat cut that contains a large amount of connective tissue. Conversion of collagen into gelatin during heating reduces the toughness of the connective tissue; however, this conversion is slow at low cooking temperatures (around 60°C). The objective of this project was to determine the ability of pulsed electric field (PEF) processing to reduce the thermal stability of connective tissue. To achieve this, a novel model system was designed in which connective tissue obtained from beef deep pectoralis muscle (brisket) was exposed to PEF at combinations of electric field strength (1.0 and 1.5 kV/cm) and specific energy (50 and 100 kJ/kg) within an agar matrix at electrical conductivities representing the electrical conductivity found in brisket. Differential scanning calorimetry showed that PEF treatment significantly (p<0.05) decreased the denaturation temperature of connective tissue compared to untreated samples. Increasing the electric field strength and the specific energy increased the Ringer soluble collagen fraction. PEF treated samples showed higher solubilization than untreated samples at both 60°C and 70°C in the heat solubility test. SEM examination of PEF treated (at 1.5 kV/cm and 100 kJ/kg) and untreated samples revealed that PEF appeared to increase the porosity of the connective tissue structure. These findings suggest that PEF processing is a technology that could be used to improve the tenderness and decrease the cooking time of collagen-rich meat cuts. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Environmental Monitoring of a Titan 34D 5 1/2 Segment Solid Rocket Motor Static Firing.

    DTIC Science & Technology

    1988-03-01

    concentrations. The sampling scheme called for three near-field sampling sites (AFAL Experimental Areas 1-90, 1-100, and the Receiving, Inspection and Storage...regeneration from acidic rainout. 4. Field-testing the Aerospace and AFESC/LLNL experimental HCl monitors. The firing was first attempted on 4 June...was designed to take advantage of the specified wind corridor, and provided for both near-field and far-field sampling of ground-level HCl

  16. Continuous improvement of medical test reliability using reference methods and matrix-corrected target values in proficiency testing schemes: application to glucose assay.

    PubMed

    Delatour, Vincent; Lalere, Beatrice; Saint-Albin, Karène; Peignaux, Maryline; Hattchouel, Jean-Marc; Dumont, Gilles; De Graeve, Jacques; Vaslin-Reimann, Sophie; Gillery, Philippe

    2012-11-20

    The reliability of biological tests is a major public health issue for patient care, with high economic stakes. Reference methods, as well as regular external quality assessment schemes (EQAS), are needed to monitor the analytical performance of field methods. However, control material commutability is a major concern when assessing method accuracy. To overcome material non-commutability, we investigated the possibility of using lyophilized serum samples together with a limited number of frozen serum samples to assign matrix-corrected target values, taking the example of glucose assays. Trueness of the current glucose assays was first measured against a primary reference method by using human frozen sera. Methods using hexokinase and glucose oxidase with spectroreflectometric detection proved very accurate, with bias ranging between -2.2% and +2.3%. Bias of methods using glucose oxidase with spectrophotometric detection was +4.5%. Matrix-related bias of the lyophilized materials was then determined and ranged from +2.5% to -14.4%. Matrix-corrected target values were assigned and used to assess the trueness of 22 sub-peer groups. We demonstrated that matrix-corrected target values can be a valuable tool to assess field method accuracy in large-scale surveys where commutable materials are not available in sufficient amounts at acceptable cost. Copyright © 2012 Elsevier B.V. All rights reserved.
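    The bias arithmetic described above (trueness against the reference method, and matrix-corrected targets for non-commutable lyophilized materials) can be sketched as follows; the function names and the glucose numbers are illustrative assumptions, not values from the study:

    ```python
    def percent_bias(measured, reference):
        """Relative bias of a field method against the reference value, in percent."""
        return 100.0 * (measured - reference) / reference

    def matrix_corrected_target(nominal_target, matrix_bias_pct):
        """Shift a lyophilized-material target by its matrix-related bias so it
        approximates what a commutable (frozen serum) sample would give."""
        return nominal_target * (1.0 + matrix_bias_pct / 100.0)

    glucose_ref = 5.00  # mmol/L, hypothetical reference-method value

    print(percent_bias(5.12, glucose_ref))       # field method reads 2.4 % high
    print(matrix_corrected_target(5.00, -14.4))  # target corrected for matrix bias
    ```

    Comparing field results to corrected rather than nominal targets is what lets a non-commutable material still score method trueness.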

  17. Evaluation of field methods for vertical high resolution aquifer characterization

    NASA Astrophysics Data System (ADS)

    Vienken, T.; Tinter, M.; Rogiers, B.; Leven, C.; Dietrich, P.

    2012-12-01

    The delineation and characterization of subsurface (hydro)-stratigraphic structures is one of the challenging tasks of hydrogeological site investigations. The knowledge about the spatial distribution of soil specific properties and hydraulic conductivity (K) is the prerequisite for understanding flow and fluid transport processes. This is especially true for heterogeneous unconsolidated sedimentary deposits with a complex sedimentary architecture. One commonly used approach to investigate and characterize sediment heterogeneity is soil sampling and lab analyses, e.g. grain size distribution. Tests conducted on 108 samples show that calculation of K based on grain size distribution is not suitable for high resolution aquifer characterization of highly heterogeneous sediments due to sampling effects and large differences of calculated K values between applied formulas (Vienken & Dietrich 2011). Therefore, extensive tests were conducted at two test sites under different geological conditions to evaluate the performance of innovative Direct Push (DP) based approaches for the vertical high resolution determination of K. Different DP based sensor probes for the in-situ subsurface characterization based on electrical, hydraulic, and textural soil properties were used to obtain high resolution vertical profiles. The applied DP based tools proved to be a suitable and efficient alternative to traditional approaches. Despite resolution differences, all of the applied methods captured the main aquifer structure. Correlation of the DP based K estimates and proxies with DP based slug tests show that it is possible to describe the aquifer hydraulic structure on less than a meter scale by combining DP slug test data and continuous DP measurements. 
Even though correlations are site specific and appropriate DP tools must be chosen, DP is a reliable and efficient alternative for characterizing even strongly heterogeneous sites with complex structured sedimentary aquifers (Vienken et al. 2012). References: Vienken, T., Leven, C., and Dietrich, P. 2012. Use of CPT and other direct push methods for (hydro-) stratigraphic aquifer characterization — a field study. Canadian Geotechnical Journal, 49(2): 197-206. Vienken, T., and Dietrich, P. 2011. Field evaluation of methods for determining hydraulic conductivity from grain size data. Journal of Hydrology, 400(1-2): 58-71.

  18. Retrieving cosmological signal using cosmic flows

    NASA Astrophysics Data System (ADS)

    Bouillot, V.; Alimi, J.-M.

    2011-12-01

    To understand the origin of the anomalously high bulk flow at large scales, we use very large simulations in various cosmological models. To disentangle between cosmological and environmental effects, we select samples with bulk flow profiles similar to the observational data of Watkins et al. (2009), which exhibit a maximum in the bulk flow at 53 h^{-1} Mpc. The estimation of the cosmological parameters Ω_M and σ_8, done on those samples, is correct from the rms mass fluctuation, whereas this estimation gives completely false values when done on bulk flow measurements, hence showing a dependence of velocity fields on larger scales. By drawing a clear link between velocity fields at 53 h^{-1} Mpc and asymmetric patterns of the density field at 85 h^{-1} Mpc, we show that the bulk flow can depend largely on the environment. The retrieval of the cosmological signal is achieved by studying the convergence of the bulk flow towards the linear prediction at very large scale (˜ 150 h^{-1} Mpc).

  19. Similar reliability and equivalent performance of female and male mice in the open field and water‐maze place navigation task

    PubMed Central

    Fritz, Ann‐Kristina; Amrein, Irmgard

    2017-01-01

    Although most nervous system diseases affect women and men differentially, most behavioral studies using mouse models do not include subjects of both sexes. Many researchers worry that data of female mice may be unreliable due to the estrous cycle. Here, we retrospectively evaluated sex effects on coefficient of variation (CV) in 5,311 mice which had performed the same place navigation protocol in the water‐maze and in 4,554 mice tested in the same open field arena. Confidence intervals for Cohen's d as measure of effect size were computed and tested for equivalence with 0.2 as equivalence margin. Despite the large sample size, only few behavioral parameters showed a significant sex effect on CV. Confidence intervals of effect size indicated that CV was either equivalent or showed a small sex difference at most, accounting for less than 2% of total group to group variation of CV. While female mice were potentially slightly more variable in water‐maze acquisition and in the open field, males tended to perform less reliably in the water‐maze probe trial. In addition to evaluating variability, we also directly compared mean performance of female and male mice and found them to be equivalent in both water‐maze place navigation and open field exploration. Our data confirm and extend other large scale studies in demonstrating that including female mice in experiments does not cause a relevant increase of data variability. Our results make a strong case for including mice of both sexes whenever open field or water‐maze are used in preclinical research. PMID:28654717

  20. Similar reliability and equivalent performance of female and male mice in the open field and water-maze place navigation task.

    PubMed

    Fritz, Ann-Kristina; Amrein, Irmgard; Wolfer, David P

    2017-09-01

    Although most nervous system diseases affect women and men differentially, most behavioral studies using mouse models do not include subjects of both sexes. Many researchers worry that data of female mice may be unreliable due to the estrous cycle. Here, we retrospectively evaluated sex effects on coefficient of variation (CV) in 5,311 mice which had performed the same place navigation protocol in the water-maze and in 4,554 mice tested in the same open field arena. Confidence intervals for Cohen's d as measure of effect size were computed and tested for equivalence with 0.2 as equivalence margin. Despite the large sample size, only few behavioral parameters showed a significant sex effect on CV. Confidence intervals of effect size indicated that CV was either equivalent or showed a small sex difference at most, accounting for less than 2% of total group to group variation of CV. While female mice were potentially slightly more variable in water-maze acquisition and in the open field, males tended to perform less reliably in the water-maze probe trial. In addition to evaluating variability, we also directly compared mean performance of female and male mice and found them to be equivalent in both water-maze place navigation and open field exploration. Our data confirm and extend other large scale studies in demonstrating that including female mice in experiments does not cause a relevant increase of data variability. Our results make a strong case for including mice of both sexes whenever open field or water-maze are used in preclinical research. © 2017 The Authors. American Journal of Medical Genetics Part C Published by Wiley Periodicals, Inc.
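    The equivalence logic used in the two records above (a confidence interval for Cohen's d tested against a ±0.2 margin) can be sketched as below. The group summaries are invented, and the CI uses a common large-sample normal approximation rather than the authors' exact procedure:

    ```python
    import math

    def cohens_d(m1, s1, n1, m2, s2, n2):
        """Cohen's d using the pooled standard deviation."""
        sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
        return (m1 - m2) / sp

    def d_confidence_interval(d, n1, n2, z=1.96):
        """Approximate 95% CI for Cohen's d (large-sample normal SE)."""
        se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
        return d - z * se, d + z * se

    def equivalent(ci, margin=0.2):
        """Equivalence holds when the entire CI lies inside (-margin, +margin)."""
        lo, hi = ci
        return -margin < lo and hi < margin

    # Hypothetical female/male group summaries (mean, SD, n) for one parameter.
    d = cohens_d(50.1, 10.0, 2500, 50.5, 10.2, 2500)
    ci = d_confidence_interval(d, 2500, 2500)
    print(d, ci, equivalent(ci))
    ```

    With thousands of mice per group, even a tiny effect yields a narrow CI, so equivalence within ±0.2 can actually be demonstrated rather than merely "not rejected".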

  1. Validation of large-scale, monochromatic UV disinfection systems for drinking water using dyed microspheres.

    PubMed

    Blatchley, E R; Shen, C; Scheible, O K; Robinson, J P; Ragheb, K; Bergstrom, D E; Rokjer, D

    2008-02-01

    Dyed microspheres have been developed as a new method for validation of ultraviolet (UV) reactor systems. When properly applied, dyed microspheres allow measurement of the UV dose distribution delivered by a photochemical reactor for a given operating condition. Prior to this research, dyed microspheres had only been applied to a bench-scale UV reactor. The goal of this research was to extend the application of dyed microspheres to large-scale reactors. Dyed microsphere tests were conducted on two prototype large-scale UV reactors at the UV Validation and Research Center of New York (UV Center) in Johnstown, NY. All microsphere tests were conducted under conditions that had been used previously in biodosimetry experiments involving two challenge bacteriophage: MS2 and Qbeta. Numerical simulations based on computational fluid dynamics and irradiance field modeling were also performed for the same set of operating conditions used in the microspheres assays. Microsphere tests on the first reactor illustrated difficulties in sample collection and discrimination of microspheres against ambient particles. Changes in sample collection and work-up were implemented in tests conducted on the second reactor that allowed for improvements in microsphere capture and discrimination against the background. Under these conditions, estimates of the UV dose distribution from the microspheres assay were consistent with numerical simulations and the results of biodosimetry, using both challenge organisms. The combined application of dyed microspheres, biodosimetry, and numerical simulation offers the potential to provide a more in-depth description of reactor performance than any of these methods individually, or in combination. This approach also has the potential to substantially reduce uncertainties in reactor validation, thereby leading to better understanding of reactor performance, improvements in reactor design, and decreases in reactor capital and operating costs.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bourdon, J.C.; Peltier, B.; Cooper, G.A.

    In this paper, field drill-off test results are compared with data from laboratory simulations. A simple theory for analyzing drill-off tests is developed. The weight-on-bit (WOB) decay with time is close to exponential, but large threshold WOBs, resulting from poor weight transmission downhole, are sometimes observed in field tests.
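    The near-exponential WOB decay with a residual threshold described above suggests a simple functional form; the model and the numbers below are assumptions for illustration, not the paper's theory:

    ```python
    import math

    def wob(t, wob0, threshold, tau):
        """Weight on bit during a drill-off test: exponential decay from the
        initial WOB toward a residual threshold (poor downhole weight transmission
        keeps WOB from decaying all the way to zero)."""
        return threshold + (wob0 - threshold) * math.exp(-t / tau)

    # Hypothetical values: 200 kN initial WOB, 30 kN threshold, 90 s time constant.
    for t in (0, 90, 300):
        print(t, round(wob(t, 200.0, 30.0, 90.0), 1))
    ```

    On a semi-log plot of (WOB - threshold) versus time, such data would fall on a straight line, which is one way a field drill-off test could be checked against the exponential model.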

  3. Electrofracturing test system and method of determining material characteristics of electrofractured material samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bauer, Stephen J.; Glover, Steven F.; Pfeifle, Tom

    A device for electrofracturing a material sample and analyzing the material sample is disclosed. The device simulates an in situ electrofracturing environment so as to obtain electrofractured material characteristics representative of field applications while allowing permeability testing of the fractured sample under in situ conditions.

  4. A magneto-resistance and magnetisation study of TaAs2 semimetal

    NASA Astrophysics Data System (ADS)

    Harimohan, V.; Bharathi, A.; Rajaraman, R.; Sundar, C. S.

    2018-04-01

    Here we report on the magneto-transport and magnetization studies on single crystalline samples of TaAs2. The resistivity versus temperature of the single crystalline sample shows a metallic behavior with a large residual resistivity ratio. The TaAs2 crystal shows large magneto resistance at low temperature, reaching 91000% at 2.5 K in a field of 15 T, and the resistivity versus temperature shows an upturn at low temperature, when measured with increase in magnetic field. Resistivity and magnetization measurements as a function of magnetic field show characteristic Shubnikov-de Haas and de Haas-van Alphen oscillations, displaying anisotropy with respect to the crystalline direction. The effective mass and Dingle temperature were estimated from the analysis of the oscillation amplitude as a function of temperature and magnetic field. Negative magneto-resistance was not observed with current parallel to the magnetic field direction, suggesting that TaAs2 is not an archetypical Weyl metal.

  5. Magneto-Optic Kerr Effect in a Magnetized Electron Gun

    NASA Astrophysics Data System (ADS)

    Hardy, Benjamin; Grames, Joseph; Center for Injectors and Sources Team

    2016-09-01

    Magnetized electron sources have the potential to improve ion beam cooling efficiency. At the Gun Test Stand at Jefferson Lab, a solenoid magnet will be installed adjacent to the photogun to magnetize the electron beam. Due to the photocathode operating in a vacuum chamber, measuring and monitoring the magnetic field at the beam source location with conventional probes is impractical. The Magneto-Optical Kerr Effect (MOKE) describes the change on polarized light by reflection from a magnetized surface. The reflection from the surface may alter the polarization direction, ellipticity, or intensity, and depends linearly upon the surface magnetization of the sample. By replacing the photocathode with a magnetized sample and reflecting polarized light from the sample surface, the magnetic field at the beam source is inferred. A controlled MOKE system has been assembled to test the magnetic field. Calibration of the solenoid magnet is performed by comparing the MOKE signal with magnetic field measurements. The apparatus will provide a description of the field at electron beam source. The report summarizes the method and results of controlled tests and calibration of the MOKE sample with the solenoid magnet field measurements. This work is supported by the National Science Foundation, Research Experience for Undergraduates Award 1359026 and the Department of Energy, Laboratory Directed Research and Development Contract DE-AC05-06OR23177.

  6. New Computational Methods for the Prediction and Analysis of Helicopter Noise

    NASA Technical Reports Server (NTRS)

    Strawn, Roger C.; Oliker, Leonid; Biswas, Rupak

    1996-01-01

    This paper describes several new methods to predict and analyze rotorcraft noise. These methods are: 1) a combined computational fluid dynamics and Kirchhoff scheme for far-field noise predictions, 2) parallel computer implementation of the Kirchhoff integrations, 3) audio and visual rendering of the computed acoustic predictions over large far-field regions, and 4) acoustic tracebacks to the Kirchhoff surface to pinpoint the sources of the rotor noise. The paper describes each method and presents sample results for three test cases. The first case consists of in-plane high-speed impulsive noise and the other two cases show idealized parallel and oblique blade-vortex interactions. The computed results show good agreement with available experimental data but convey much more information about the far-field noise propagation. When taken together, these new analysis methods exploit the power of new computer technologies and offer the potential to significantly improve our prediction and understanding of rotorcraft noise.

  7. Using habitat suitability models to target invasive plant species surveys

    USGS Publications Warehouse

    Crall, Alycia W.; Jarnevich, Catherine S.; Panke, Brendon; Young, Nick; Renz, Mark; Morisette, Jeffrey

    2013-01-01

    Managers need new tools for detecting the movement and spread of nonnative, invasive species. Habitat suitability models are a popular tool for mapping the potential distribution of current invaders, but the ability of these models to prioritize monitoring efforts has not been tested in the field. We tested the utility of an iterative sampling design (i.e., models based on field observations used to guide subsequent field data collection to improve the model), hypothesizing that model performance would increase when new data were gathered from targeted sampling using criteria based on the initial model results. We also tested the ability of habitat suitability models to predict the spread of invasive species, hypothesizing that models would accurately predict occurrences in the field, and that the use of targeted sampling would detect more species with less sampling effort than a nontargeted approach. We tested these hypotheses on two species at the state scale (Centaurea stoebe and Pastinaca sativa) in Wisconsin (USA), and one genus at the regional scale (Tamarix) in the western United States. These initial data were merged with environmental data at 30-m2 resolution for Wisconsin and 1-km2 resolution for the western United States to produce our first iteration models. We stratified these initial models to target field sampling and compared our models and success at detecting our species of interest to other surveys being conducted during the same field season (i.e., nontargeted sampling). Although more data did not always improve our models based on correct classification rate (CCR), sensitivity, specificity, kappa, or area under the curve (AUC), our models generated from targeted sampling data always performed better than models generated from nontargeted data. For Wisconsin species, the model described actual locations in the field fairly well (kappa = 0.51 and 0.19; χ2(2) = 47.42, P < 0.01). 
From these findings, we conclude that habitat suitability models can be highly useful tools for guiding invasive species monitoring, and we support the use of an iterative sampling design for guiding such efforts.
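    The model-performance measures named above (CCR, sensitivity, specificity, kappa) all derive from a 2x2 confusion matrix of predicted versus observed presence. A sketch with an invented matrix, not the study's survey counts:

    ```python
    def confusion_metrics(tp, fp, fn, tn):
        """CCR, sensitivity, specificity, and Cohen's kappa from a 2x2 confusion
        matrix (tp = predicted present & found, tn = predicted absent & absent)."""
        n = tp + fp + fn + tn
        ccr = (tp + tn) / n                      # correct classification rate
        sensitivity = tp / (tp + fn)             # found fraction of true presences
        specificity = tn / (tn + fp)             # correctly predicted absences
        # Agreement expected by chance, from the marginal totals.
        pe = ((tp + fp) / n) * ((tp + fn) / n) + ((fn + tn) / n) * ((fp + tn) / n)
        kappa = (ccr - pe) / (1 - pe)            # chance-corrected agreement
        return ccr, sensitivity, specificity, kappa

    # Hypothetical presence/absence predictions checked against field surveys.
    print(confusion_metrics(tp=40, fp=10, fn=20, tn=130))
    ```

    Kappa is the most informative of the four when presences are rare, since a model that predicts "absent" everywhere can still score a high CCR.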

  8. Gravel Transport Measured With Bedload Traps in Mountain Streams: Field Data Sets to be Published

    NASA Astrophysics Data System (ADS)

    Bunte, K.; Swingle, K. W.; Abt, S. R.; Ettema, R.; Cenderelli, D. A.

    2017-12-01

    Direct, accurate measurements of coarse bedload transport exist for only a few streams worldwide, because the task is laborious and requires a suitable device. However, sets of accurate field data would be useful for reference with unsampled sites and as a basis for model developments. The authors have carefully measured gravel transport and are compiling their data sets for publication. To ensure accurate measurements of gravel bedload in wadeable flow, the designed instrument consisted of an unflared aluminum frame (0.3 x 0.2 m) large enough for entry of cobbles. The attached 1 m or longer net with a 4 mm mesh held large bedload volumes. The frame was strapped onto a ground plate anchored onto the channel bed. This setup avoided involuntary sampler particle pick-up and enabled long sampling times, integrating over fluctuating transport. Beveled plates and frames facilitated easy particle entry. Accelerating flow over smooth plates compensated for deceleration within the net. Spacing multiple frames by 1 m enabled sampling much of the stream width. Long deployment, and storage of sampled bedload away from the frame's entrance, were attributes of traps rather than samplers; hence the name "bedload traps". The authors measured gravel transport with 4-6 bedload traps per cross-section at 10 mountain streams in CO, WY, and OR, accumulating 14 data sets (>1,350 samples). In 10 data sets, measurements covered much of the snowmelt high-flow season yielding 50-200 samples. Measurement time was typically 1 hour but ranged from 3 minutes to 3 hours, depending on transport intensity. Measuring back-to-back provided 6 to 10 samples over a 6 to 10-hour field day. Bedload transport was also measured with a 3-inch Helley-Smith sampler. The data set provides fractional (0.5 phi) transport rates in terms of particle mass and number for each bedload trap in the cross-section, the largest particle size, as well as total cross-sectional gravel transport rates. 
Ancillary field data include stage, discharge, long-term flow records if available, surface and subsurface sediment sizes, as well as longitudinal and cross-sectional site surveys. Besides transport relations, incipient motion conditions, hysteresis, and lateral variation, the data provide a reliable modeling basis to test insights and hypotheses regarding bedload transport.

  9. Detection of antibodies to egg drop syndrome virus in chicken serum using a field-based immunofiltration (flow-through) test.

    PubMed

    Raj, G Dhinakar; Thiagarajan, V; Nachimuthu, K

    2007-09-01

    A simple, user-friendly, and rapid method to detect the presence of antibodies to egg drop syndrome 76 (EDS) virus in chicken sera based on an immunofiltration (flow-through) test was developed. Purified EDS virus antigen was coated onto nitrocellulose membranes housed in a plastic module with layers of absorbent filter pads underneath. Following addition of the serum to be tested and washing, monoclonal antibodies or polyclonal serum to chicken immunoglobulin G (IgG) was used as a bridge antibody to mediate binding between EDS virus-specific IgG and protein A gold conjugate. The appearance of a pink dot indicated the presence of antibodies to EDS virus in the sample tested. The results could be obtained within 5-10 min. The developed immunofiltration test could detect antibodies in the sera of experimentally vaccinated chickens from 2 wk postvaccination. With field serum samples, this test was positive in samples having hemagglutination inhibition titers of 8 and above. This test has the potential to be used as a field-based kit to assess seroconversion in EDS-vaccinated flocks.

  10. A SYSTEMATIC SEARCH FOR PERIODICALLY VARYING QUASARS IN PAN-STARRS1: AN EXTENDED BASELINE TEST IN MEDIUM DEEP SURVEY FIELD MD09

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, T.; Gezari, S.; Burgett, W.

    We present a systematic search for periodically varying quasars and supermassive black hole binary (SMBHB) candidates in the Pan-STARRS1 (PS1) Medium Deep Survey’s MD09 field. From a color-selected sample of 670 quasars extracted from a multi-band deep-stack catalog of point sources, we locally select variable quasars and look for coherent periods with the Lomb–Scargle periodogram. Three candidates from our sample demonstrate strong variability for more than ∼3 cycles, and their PS1 light curves are well fitted to sinusoidal functions. We test the persistence of the candidates’ apparent periodic variations detected during the 4.2 years of the PS1 survey with archival photometric data from the SDSS Stripe 82 survey or new monitoring with the Large Monolithic Imager at the Discovery Channel Telescope. None of the three periodic candidates (including PSO J334.2028+1.4075) remain persistent over the extended baseline of 7–14 years, corresponding to a detection rate of <1 in 670 quasars in a search area of ≈5 deg². Even though SMBHBs should be a common product of the hierarchical growth of galaxies, and periodic variability in SMBHBs has been theoretically predicted, a systematic search for such signatures in a large optical survey is strongly limited by its temporal baseline and the “red noise” associated with normal quasar variability. We show that follow-up long-term monitoring (≳5 cycles) is crucial to our search for these systems.
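    The coherent-period search described above rests on the Lomb–Scargle periodogram for unevenly sampled time series. A minimal pure-NumPy version of the classic estimator, run on synthetic data rather than PS1 light curves:

    ```python
    import numpy as np

    def lomb_scargle(t, y, freqs):
        """Classic (unnormalized) Lomb-Scargle periodogram for uneven sampling."""
        y = y - y.mean()
        power = np.empty(len(freqs))
        for i, f in enumerate(freqs):
            w = 2.0 * np.pi * f
            # Time offset tau makes the sine and cosine terms orthogonal.
            tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                             np.sum(np.cos(2 * w * t))) / (2 * w)
            c, s = np.cos(w * (t - tau)), np.sin(w * (t - tau))
            power[i] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
        return power

    # Synthetic, unevenly sampled "light curve" with a frequency-0.5 sinusoid.
    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0.0, 20.0, 120))
    y = np.sin(2 * np.pi * 0.5 * t) + 0.3 * rng.normal(size=120)

    freqs = np.linspace(0.05, 2.0, 400)
    best = freqs[np.argmax(lomb_scargle(t, y, freqs))]
    print(best)  # peak expected near the injected frequency of 0.5
    ```

    As the abstract notes, the hard part in real surveys is not finding a peak but distinguishing it from the red noise of ordinary quasar variability, which this toy white-noise example does not capture.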

  11. RELIABILITY OF THE DETECTION OF THE BARYON ACOUSTIC PEAK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martínez, Vicent J.; Arnalte-Mur, Pablo; De la Cruz, Pablo

    2009-05-01

    The correlation function of the distribution of matter in the universe shows, at large scales, baryon acoustic oscillations, which were imprinted prior to recombination. This feature was first detected in the correlation function of the luminous red galaxies of the Sloan Digital Sky Survey (SDSS). Recently, the final release (DR7) of the SDSS has been made available, and the useful volume is about two times bigger than in the old sample. We present here, for the first time, the redshift-space correlation function of this sample at large scales together with that for one shallower, but denser volume-limited subsample drawn from the Two-Degree Field Redshift Survey. We test the reliability of the detection of the acoustic peak at about 100 h⁻¹ Mpc and the behavior of the correlation function at larger scales by means of careful estimation of errors. We confirm the presence of the peak in the latest data although broader than in previous detections.

  12. Low-field magnetoresistance up to 400 K in double perovskite Sr₂FeMoO₆ synthesized by a citrate route

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harnagea, L., E-mail: harnagealuminita@gmail.com; Jurca, B.; Physical Chemistry Department, University of Bucharest, 4-12 Bd. Elisabeta, 030018 Bucharest

    2014-03-15

    A wet-chemistry technique, namely the citrate route, has been used to prepare high-quality polycrystalline samples of the double perovskite Sr₂FeMoO₆. We report on the evolution of magnetic and magnetoresistive properties of the synthesized samples as a function of three parameters: (i) the pH of the starting solution, (ii) the decomposition temperature of the citrate precursors, and (iii) the sintering conditions. The low-field magnetoresistance (LFMR) value of our best samples is as high as 5% at room temperature for an applied magnetic field of 1 kOe. Additionally, the distinguishing feature of these samples is the persistence of LFMR, with a reasonably large value, up to 400 K, which is a crucial parameter for any practical application. Our study indicates that the enhancement of LFMR observed is due to a good compromise between the grain size distribution and their magnetic polarization. -- Graphical abstract: The microstructure (left panel) and corresponding low-field magnetoresistance of one of the Sr₂FeMoO₆ samples synthesized in the course of this work. Highlights: • Samples of Sr₂FeMoO₆ are prepared using a citrate route under varying conditions. • Magnetoresistive properties are improved and optimized. • Low-field magnetoresistance values as large as 5% at 300 K/1 kOe are reported. • Persistence of low-field magnetoresistance up to 400 K.

  13. TRACING THE MAGNETIC FIELD MORPHOLOGY OF THE LUPUS I MOLECULAR CLOUD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franco, G. A. P.; Alves, F. O., E-mail: franco@fisica.ufmg.br, E-mail: falves@mpe.mpg.de

    2015-07-01

    Deep R-band CCD linear polarimetry collected for fields with lines of sight toward the Lupus I molecular cloud is used to investigate the properties of the magnetic field within this molecular cloud. The observed sample contains about 7000 stars, almost 2000 of them with a polarization signal-to-noise ratio larger than 5. These data cover almost the entire main molecular cloud and also sample two diffuse infrared patches in the neighborhood of Lupus I. The large-scale pattern of the plane-of-sky projection of the magnetic field is perpendicular to the main axis of Lupus I, but parallel to the two diffuse infrared patches. A detailed analysis of our polarization data combined with the Herschel/SPIRE 350 μm dust emission map shows that the principal filament of Lupus I is constituted by three main clumps that are acted on by magnetic fields with different large-scale structural properties. These differences may be the reason for the observed distribution of pre- and protostellar objects along the molecular cloud and the cloud’s apparent evolutionary stage. On the other hand, assuming that the magnetic field is composed of large-scale and turbulent components, we find that the latter is rather similar in all three clumps. The estimated plane-of-sky component of the large-scale magnetic field ranges from about 70 to 200 μG in these clumps. The intensity increases toward the Galactic plane. The mass-to-magnetic flux ratio is much smaller than unity, implying that Lupus I is magnetically supported on large scales.
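
    Plane-of-sky field strengths like the 70-200 μG quoted above are commonly obtained from polarimetry with the Davis-Chandrasekhar-Fermi method, B_pos ≈ q·sqrt(4πρ)·(σ_v/σ_θ). A minimal sketch; the input density, velocity dispersion and polarization-angle dispersion below are illustrative assumptions, not the paper's measured values:

```python
import math

def cf_field_uG(n_h2_cm3, sigma_v_kms, sigma_theta_deg, q=0.5):
    """Davis-Chandrasekhar-Fermi estimate of the plane-of-sky magnetic
    field in microgauss: B = q * sqrt(4*pi*rho) * sigma_v / sigma_theta
    (CGS units), with q ~ 0.5 from turbulence simulations."""
    mu_h2 = 2.8 * 1.67e-24                 # mean mass per H2 molecule [g]
    rho = mu_h2 * n_h2_cm3                 # mass density [g/cm^3]
    sigma_v = sigma_v_kms * 1.0e5          # km/s -> cm/s
    sigma_theta = math.radians(sigma_theta_deg)
    b_gauss = q * math.sqrt(4.0 * math.pi * rho) * sigma_v / sigma_theta
    return b_gauss * 1.0e6                 # Gauss -> microgauss

# Illustrative inputs (assumed, not the paper's measurements):
print(round(cf_field_uG(1.0e4, 0.5, 10.0), 1))  # ~110 microgauss
```

    With these assumed inputs the estimate lands inside the 70-200 μG range quoted above; the result scales linearly with the velocity dispersion and inversely with the angle dispersion.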

  14. Leaching behaviour of bottom ash from RDF high-temperature gasification plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gori, M., E-mail: manuela.gori@dicea.unifi.it; Pifferi, L.; Sirini, P.

    2011-07-15

    This study investigated the physical properties, the chemical composition and the leaching behaviour of two bottom ash (BA) samples from two different refuse derived fuel high-temperature gasification plants, as a function of particle size. The X-ray diffraction patterns showed that the materials contained large amounts of glass. This was also confirmed by the results of availability and ANC leaching tests. The chemical composition indicated that Fe, Mn, Cu and Cr were the most abundant metals, with a slight enrichment in the finest fractions. The suitability of the samples for inert waste landfilling and reuse was evaluated through the leaching test EN 12457-2. In one sample the concentration of all metals was below the limit set by law, while limits were exceeded for Cu, Cr and Ni in the other sample, where the finest fraction was shown to make the main contribution to the leaching of Cu and Ni. Preliminary results of physical and geotechnical characterisation indicated the suitability of vitrified BA for reuse in the field of civil engineering. The possible application of a size separation pre-treatment to improve the chemical characteristics of the materials is also discussed.

  15. A Field-Based Cleaning Protocol for Sampling Devices Used in Life-Detection Studies

    NASA Astrophysics Data System (ADS)

    Eigenbrode, Jennifer; Benning, Liane G.; Maule, Jake; Wainwright, Norm; Steele, Andrew; Amundsen, Hans E. F.

    2009-06-01

    Analytical approaches to extant and extinct life detection involve molecular detection often at trace levels. Thus, removal of biological materials and other organic molecules from the surfaces of devices used for sampling is essential for ascertaining meaningful results. Organic decontamination to levels consistent with null values on life-detection instruments is particularly challenging at remote field locations where Mars analog field investigations are carried out. Here, we present a seven-step, multi-reagent decontamination method that can be applied to sampling devices while in the field. In situ lipopolysaccharide detection via low-level endotoxin assays and molecular detection via gas chromatography-mass spectrometry were used to test the effectiveness of the decontamination protocol for sampling of glacial ice with a coring device and for sampling of sediments with a rover scoop during deployment at Arctic Mars-analog sites in Svalbard, Norway. Our results indicate that the protocols and detection technique sufficiently remove and detect low levels of molecular constituents necessary for life-detection tests.

  16. A field-based cleaning protocol for sampling devices used in life-detection studies.

    PubMed

    Eigenbrode, Jennifer; Benning, Liane G; Maule, Jake; Wainwright, Norm; Steele, Andrew; Amundsen, Hans E F

    2009-06-01

    Analytical approaches to extant and extinct life detection involve molecular detection often at trace levels. Thus, removal of biological materials and other organic molecules from the surfaces of devices used for sampling is essential for ascertaining meaningful results. Organic decontamination to levels consistent with null values on life-detection instruments is particularly challenging at remote field locations where Mars analog field investigations are carried out. Here, we present a seven-step, multi-reagent decontamination method that can be applied to sampling devices while in the field. In situ lipopolysaccharide detection via low-level endotoxin assays and molecular detection via gas chromatography-mass spectrometry were used to test the effectiveness of the decontamination protocol for sampling of glacial ice with a coring device and for sampling of sediments with a rover scoop during deployment at Arctic Mars-analog sites in Svalbard, Norway. Our results indicate that the protocols and detection technique sufficiently remove and detect low levels of molecular constituents necessary for life-detection tests.

  17. Comparing Standard and Selective Degradation DNA Extraction Methods: Results from a Field Experiment with Sexual Assault Kits.

    PubMed

    Campbell, Rebecca; Pierce, Steven J; Sharma, Dhruv B; Shaw, Jessica; Feeney, Hannah; Nye, Jeffrey; Schelling, Kristin; Fehler-Cabral, Giannina

    2017-01-01

    A growing number of U.S. cities have large numbers of untested sexual assault kits (SAKs) in police property facilities. Testing older kits and maintaining current case work will be challenging for forensic laboratories, creating a need for more efficient testing methods. We evaluated selective degradation methods for DNA extraction using actual case work from a sample of previously unsubmitted SAKs in Detroit, Michigan. We randomly assigned 350 kits to either standard or selective degradation testing methods and then compared DNA testing rates and CODIS entry rates between the two groups. Continuation-ratio modeling showed no significant differences, indicating that the selective degradation method had no decrement in performance relative to customary methods. Follow-up equivalence tests indicated that CODIS entry rates for the two methods could differ by more than ±5%. Selective degradation methods required less personnel time for testing and scientific review than standard testing. © 2016 American Academy of Forensic Sciences.
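
    Equivalence within a ±5% margin, as in the follow-up tests mentioned above, is commonly assessed with two one-sided tests (TOST). A sketch using the normal approximation for two proportions; the per-arm CODIS-entry counts below are hypothetical, not the study's data:

```python
import math

def tost_two_proportions(x1, n1, x2, n2, margin=0.05):
    """Two one-sided tests (TOST) for equivalence of two proportions
    within +/- margin, using the normal approximation.

    Equivalence is claimed when the larger of the two one-sided
    p-values falls below the chosen alpha."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))  # normal CDF
    p_lower = 1 - phi((diff + margin) / se)   # H0: diff <= -margin
    p_upper = phi((diff - margin) / se)       # H0: diff >= +margin
    return max(p_lower, p_upper)

# Hypothetical counts: 88/175 vs 85/175 kits yielding CODIS entries.
p = tost_two_proportions(88, 175, 85, 175, margin=0.05)
print(p < 0.05)  # False here: equivalence within +/-5% is not shown
```

    With small per-arm samples the TOST p-value stays above alpha even when the observed rates are close, which is why equivalence within a narrow margin is hard to establish; much larger samples shrink the standard error and can demonstrate it.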

  18. Absorption of Solar Radiation by Clouds: A Second Look at Irradiance Measurements

    NASA Technical Reports Server (NTRS)

    Tsay, Si-Chee; King, Michael D.; Cahalan, Robert F.; Lau, William K.-M. (Technical Monitor)

    2001-01-01

    A decade ago, Stephens and Tsay provided an overview of the subject of absorption of solar radiation by clouds in the earth's atmosphere. They summarized the available evidence that pointed to disagreements between theoretical and observed values of cloud absorption (and reflection). At that time, a theoretician's approach (assuming perfect flux measurements) was adopted to test the model uncertainty under various hypotheses, such as the omitted large drops, excess absorbing aerosols, enhanced water vapor continuum absorption, and cloud inhomogeneity. Since then, several advances in theoretical work have been made, but a satisfactory answer for the discrepancy is still lacking. Now, we offer an experimentalist's approach (focusing on field, not laboratory) to examine the observational uncertainty under numerous field factors, such as the temperature dependence, attitude control, and sampling strategy in the spatial and spectral domain. Examples from recent field campaigns have pointed out that these sources of error may be responsible for the unacceptable level of uncertainty (e.g., as large as 20 W/square m). We give examples of each, discuss their contribution to overall uncertainty in shortwave absorption, and suggest a coordinated approach to their solution.

  19. Velocimetry with refractive index matching for complex flow configurations, phase 1

    NASA Technical Reports Server (NTRS)

    Thompson, B. E.; Vafidis, C.; Whitelaw, J. H.

    1987-01-01

    The feasibility of obtaining detailed velocity field measurements in large Reynolds number flow of the Space Shuttle Main Engine (SSME) main injector bowl was demonstrated using laser velocimetry and the developed refractive-index-matching technique. An experimental system to provide appropriate flow rates and temperature control of the refractive-index-matching fluid was designed and tested. Test results are presented to establish the feasibility of obtaining accurate velocity measurements that map the entire field, including the flow through the LOX post bundles: sample mean velocity, turbulence intensity, and spectral results are presented. The results indicate that a suitable fluid and control system is feasible for the representation of complex rocket-engine configurations and that measurements of velocity characteristics can be obtained without the optical access restrictions normally associated with laser velocimetry. The refractive-index-matching technique needs to be further developed and extended to represent other rocket-engine flows where current methods either cannot measure with adequate accuracy or fail altogether.

  20. Paleomagnetic full vector record of four consecutive Mid Miocene geomagnetic reversals

    NASA Astrophysics Data System (ADS)

    Linder, J.; Leonhardt, R.

    2009-11-01

    Seventy Mid Miocene lava flows from flood basalt piles near Neskaupstadur (East Iceland) were sampled, which provide a quasi-continuous record of geomagnetic field variations. Samples were collected along the profile B of Watkins and Walker [Watkins, N., Walker, G.P.L., 1977. Magnetostratigraphy of eastern Iceland. Am. J. Sci. 277, 513-584], which was extended about 250 m farther down in a neighboring stream bed. Published radiometric age determinations [Harrison, C., McDougall, I., Watkins, N., 1979. A geomagnetic field reversal time scale back to 13.0 million years before present. Earth Planet. Sci. Lett. 42, 143-152] range from 12.2 to 12.8 Ma for the sampled sequence. Four reversals were recorded in this profile, with 18 transitional lavas found within or between 17 normal and 30 reversed polarity flows. The large number of transitional lavas and the large virtual geomagnetic pole dispersion for stable field directions are noteworthy, as such features are commonly observed in Icelandic lavas and manifest in a far-sidedness of the average VGP. The reason for this characteristic, which could be related to an anomaly beneath Iceland, a global field phenomenon, local tectonics, and/or non-horizontal flow emplacement, is scrutinized. Non-horizontal flow emplacement is likely in volcanic environments, particularly if the sampled lavas are located on the paleoslopes of a central volcano. From the difference of the observed paleomagnetic mean directions to the expected directions assuming a geocentric axial dipole (GAD), a paleoslope which would explain the observed difference was calculated numerically. The obtained dip and dip direction point consistently to a possible volcanic extrusion center of the lavas. The determined paleodip, however, proved to be significantly too high compared to the usual slope of a central volcano, suggesting further reasons for deviations from the GAD.
Other datasets of this age from Europe also show enhanced VGP dispersion, suggesting further contributions of geomagnetic origin for this observation. Basically all reversal paths move across the Pacific. Transitions were identified as belonging to C5An.1r-C5Ar.3r based on the Astronomically Tuned Neogene Timescale [Lourens, L., Hilgen, F.J., Laskar, J., Shackleton, N.J., Wilson, D., 2004. A Geological Time Scale. Cambridge University Press]. We selected 122 samples for paleointensity measurements using a modified Thellier method including tests for alteration and multidomain bias. Of these, 85 samples yielded data of sufficient quality to calculate paleointensities for 26 lava flows. The average paleointensity for stable field directions was 23.3 μT, whereas the intensity drops to a minimum of 5.8 μT during field transitions. The stable field intensities represent only about half of the present day field. The saw-tooth pattern of intensities, which is characterized by a sharp increase of intensity directly after a reversal followed by a gradual decrease towards the next reversal, was not found in this study.

  1. Introducing Simple Detection of Bioavailable Arsenic at Rafaela (Santa Fe Province, Argentina) Using the ARSOlux Biosensor.

    PubMed

    Siegfried, Konrad; Hahn-Tomer, Sonja; Koelsch, Andreas; Osterwalder, Eva; Mattusch, Juergen; Staerk, Hans-Joachim; Meichtry, Jorge M; De Seta, Graciela E; Reina, Fernando D; Panigatti, Cecilia; Litter, Marta I; Harms, Hauke

    2015-05-21

    Numerous articles have reported the occurrence of arsenic in drinking water in Argentina, and the resulting health effects in severely affected regions of the country. Arsenic in drinking water in Argentina is largely naturally occurring due to elevated background content of the metalloid in volcanic sediments, although, in some regions, mining can contribute. While the origin of arsenic release has been discussed extensively, the problem of drinking water contamination has not yet been solved. One key step in progress towards mitigation of problems related with the consumption of As-containing water is the availability of simple detection tools. A chemical test kit and the ARSOlux biosensor were evaluated as simple analytical tools for field measurements of arsenic in the groundwater of Rafaela (Santa Fe, Argentina), and the results were compared with ICP-MS and HPLC-ICP-MS measurements. A survey of the groundwater chemistry was performed to evaluate possible interferences with the field tests. The results showed that the ARSOlux biosensor performed better than the chemical field test, that the predominant species of arsenic in the study area was arsenate and that arsenic concentration in the studied samples had a positive correlation with fluoride and vanadium, and a negative one with calcium and iron.

  2. Introducing Simple Detection of Bioavailable Arsenic at Rafaela (Santa Fe Province, Argentina) Using the ARSOlux Biosensor

    PubMed Central

    Siegfried, Konrad; Hahn-Tomer, Sonja; Koelsch, Andreas; Osterwalder, Eva; Mattusch, Juergen; Staerk, Hans-Joachim; Meichtry, Jorge M.; De Seta, Graciela E.; Reina, Fernando D.; Panigatti, Cecilia; Litter, Marta I.; Harms, Hauke

    2015-01-01

    Numerous articles have reported the occurrence of arsenic in drinking water in Argentina, and the resulting health effects in severely affected regions of the country. Arsenic in drinking water in Argentina is largely naturally occurring due to elevated background content of the metalloid in volcanic sediments, although, in some regions, mining can contribute. While the origin of arsenic release has been discussed extensively, the problem of drinking water contamination has not yet been solved. One key step in progress towards mitigation of problems related with the consumption of As-containing water is the availability of simple detection tools. A chemical test kit and the ARSOlux biosensor were evaluated as simple analytical tools for field measurements of arsenic in the groundwater of Rafaela (Santa Fe, Argentina), and the results were compared with ICP-MS and HPLC-ICP-MS measurements. A survey of the groundwater chemistry was performed to evaluate possible interferences with the field tests. The results showed that the ARSOlux biosensor performed better than the chemical field test, that the predominant species of arsenic in the study area was arsenate and that arsenic concentration in the studied samples had a positive correlation with fluoride and vanadium, and a negative one with calcium and iron. PMID:26006123

  3. Mutual Inductance Problem for a System Consisting of a Current Sheet and a Thin Metal Plate

    NASA Technical Reports Server (NTRS)

    Fulton, J. P.; Wincheski, B.; Nath, S.; Namkung, M.

    1993-01-01

    Rapid inspection of aircraft structures for flaws is of vital importance to the commercial and defense aircraft industry. In particular, inspecting thin aluminum structures for flaws is the focus of a large-scale R&D effort in the nondestructive evaluation (NDE) community. Traditional eddy current methods used today are effective, but require long inspection times. New electromagnetic techniques that monitor the normal component of the magnetic field above a sample, using a sheet of current as the excitation, appear promising. This paper is an attempt to understand and analyze the magnetic field distribution due to a current sheet above an aluminum test sample. A simple theoretical model, coupled with a two-dimensional finite element model (FEM) and experimental data, will be presented in the next few sections. A current sheet above a conducting sample generates eddy currents in the material, while a sensor above the current sheet or in between the two plates monitors the normal component of the magnetic field. A rivet or a surface flaw near a rivet in an aircraft aluminum skin will disturb the magnetic field, which is imaged by the sensor. Initial results showed a strong dependence of the flaw-induced normal magnetic field strength on the thickness and conductivity of the current sheet that could not be accounted for by skin depth attenuation alone. It was initially believed that the eddy current imaging method explained this dependence of the flaw-induced normal magnetic field on current-sheet thickness and conductivity. Further investigation suggested that the complexity associated with the mutual inductance of the system needed to be studied. The next section gives an analytical model to better understand the phenomenon.

  4. Benefits of GMR sensors for high spatial resolution NDT applications

    NASA Astrophysics Data System (ADS)

    Pelkner, M.; Stegemann, R.; Sonntag, N.; Pohl, R.; Kreutzbruck, M.

    2018-04-01

    Magnetoresistance sensors such as GMR (giant magnetoresistance) or TMR (tunnel magnetoresistance) are widely used in industrial applications; examples are position measurement and the read heads of hard disk drives. In non-destructive testing (NDT), however, these sensors have not yet made the transition to applications beyond the scientific scope, despite outstanding properties such as high spatial resolution, high field sensitivity, low cost and low energy consumption. This paper deals with the benefits of GMR/TMR sensors for high-spatial-resolution testing in different NDT applications. The first example demonstrates the advantages of MR elements over the conventional coils used in eddy current testing (ET). The probe comprises a one-wire excitation with an array of MR elements, which yields better spatial resolution for separating neighboring defects. The second section concentrates on MFL (magnetic flux leakage) testing with active field excitation during and before testing; the latter illustrates the capability of highly resolved crack detection on a crossed notch. This example is well suited to show the ability of tiny magnetic field sensors to characterize the magnetic state of a sample surface. A final example is based on the characterization of samples after tensile testing. Here, no external field is applied: the magnetization is changed only by the external load and magnetostriction, leading to a field signature which GMR sensors can resolve. This gives access to internal changes of the magnetization state of the sample under test.

  5. Large Field Photogrammetry Techniques in Aircraft and Spacecraft Impact Testing

    NASA Technical Reports Server (NTRS)

    Littell, Justin D.

    2010-01-01

    The Landing and Impact Research Facility (LandIR) at NASA Langley Research Center is a 240 ft. high A-frame structure which is used for full-scale crash testing of aircraft and rotorcraft vehicles. Because the LandIR provides a unique capability to introduce impact velocities in the forward and vertical directions, it is also serving as the facility for landing tests on full-scale and sub-scale Orion spacecraft mass simulators. Recently, a three-dimensional photogrammetry system was acquired to assist with the gathering of vehicle flight data before, throughout and after the impact. This data provides the basis for the post-test analysis and data reduction. Experimental setups for pendulum swing tests on vehicles having both forward and vertical velocities can extend to 50 x 50 x 50 foot cubes, while weather, vehicle geometry, and other constraints make each experimental setup unique to each test. This paper will discuss the specific calibration techniques for large fields of views, camera and lens selection, data processing, as well as best practice techniques learned from using the large field of view photogrammetry on a multitude of crash and landing test scenarios unique to the LandIR.

  6. Effects of Cluster Environment on Chemical Abundances in Virgo Cluster Spirals

    NASA Astrophysics Data System (ADS)

    Kennicutt, R. C.; Skillman, E. D.; Shields, G. A.; Zaritsky, D.

    1995-12-01

    We have obtained new chemical abundance measurements of HII regions in Virgo cluster spiral galaxies, in order to test whether the cluster environment has significantly influenced the gas-phase abundances and chemical evolution of spiral disks. The sample of 9 Virgo spirals covers a narrow range of morphological type (Sbc - Sc) but shows broad ranges in HI deficiencies and radii in the cluster. This allows us to compare the Virgo sample as a whole to field spirals, using a large sample from Zaritsky, Kennicutt, & Huchra, and to test for systematic trends with HI content and location within the cluster. The Virgo spirals show a wide dispersion in mean disk abundances and abundance gradients. Strongly HI deficient spirals closest to the cluster core show anomalously high oxygen abundances (by 0.3 to 0.5 dex), while outlying spirals with normal HI content show abundances similar to those of field spirals. The most HI depleted spirals also show weaker abundance gradients on average, but the formal significance of this trend is marginal. We find a strong correlation between mean abundance and HI/optical diameter ratio that is quite distinct from the behavior seen in field galaxies. This suggests that dynamical processes associated with the cluster environment are more important than cluster membership in determining the evolution of chemical abundances and stellar populations in spiral galaxies. Simple chemical evolution models are calculated to predict the magnitude of the abundance enhancement expected if ram-pressure stripping or curtailment of infall is responsible for the gas deficiencies. The increased abundances of the spirals in the cluster core may have significant effects on their use as cosmological standard candles.

  7. The Rapid-Heat LAMPellet Method: A Potential Diagnostic Method for Human Urogenital Schistosomiasis

    PubMed Central

    Carranza-Rodríguez, Cristina; Pérez-Arellano, José Luis; Vicente, Belén; López-Abán, Julio; Muro, Antonio

    2015-01-01

    Background Urogenital schistosomiasis due to Schistosoma haematobium is a serious, underestimated public health problem affecting 112 million people, particularly in sub-Saharan Africa. Microscopic examination of urine samples to detect parasite eggs still remains the definitive diagnosis. This work focussed on developing a novel loop-mediated isothermal amplification (LAMP) assay for the detection of S. haematobium DNA in human urine samples as a high-throughput, simple, accurate and affordable diagnostic tool for use in the diagnosis of urogenital schistosomiasis. Methodology/Principal Findings A LAMP assay targeting a species-specific sequence of the S. haematobium ribosomal intergenic spacer was designed. The effectiveness of our LAMP was assessed in a number of patients' urine samples with microscopy-confirmed S. haematobium infection. For potentially large-scale application in field conditions, different DNA extraction methods, including a commercial kit, a modified NaOH extraction method and a rapid heating method, were tested using small volumes of urine fractions (whole urine, supernatants and pellets). The heating of pellets from clinical samples was the most efficient method to obtain good-quality DNA detectable by LAMP. The detection limit of our LAMP was 1 fg/µL of S. haematobium DNA in urine samples. When testing all patients' urine samples included in our study, diagnostic parameters for sensitivity and specificity were calculated for the LAMP assay, 100% sensitivity (95% CI: 81.32%-100%) and 86.67% specificity (95% CI: 75.40%-94.05%), and also for microscopy detection of eggs in urine samples, 69.23% sensitivity (95% CI: 48.21%-85.63%) and 100% specificity (95% CI: 93.08%-100%). Conclusions/Significance We have developed and evaluated, for the first time, a LAMP assay for the detection of S. haematobium DNA in heated pellets from patients' urine samples without requiring a complicated DNA extraction procedure.
The procedure has been named the Rapid-Heat LAMPellet method and has the potential to be developed further as a field diagnostic tool for use in urogenital schistosomiasis-endemic areas. PMID:26230990
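
    The confidence intervals quoted above are exact (Clopper-Pearson) binomial intervals; a common closed-form alternative for such diagnostic proportions is the Wilson score interval. A sketch with hypothetical counts (18 of 26 positives is an illustrative example, not a figure reported by the paper):

```python
import math

def wilson_ci(successes, n, z=1.959964):
    """Wilson score confidence interval (default 95%) for a binomial
    proportion such as a diagnostic sensitivity or specificity."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

# Hypothetical: 18 of 26 true positives detected (sensitivity ~69.2%).
lo, hi = wilson_ci(18, 26)
print(round(100 * lo, 1), round(100 * hi, 1))
```

    Wilson intervals are slightly narrower than the exact Clopper-Pearson intervals at small n, which is why the two methods give somewhat different bounds for the same counts.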

  8. Children's knowledge of the earth: a new methodological and statistical approach.

    PubMed

    Straatemeier, Marthe; van der Maas, Han L J; Jansen, Brenda R J

    2008-08-01

    In the field of children's knowledge of the earth, much debate has concerned the question of whether children's naive knowledge (that is, their knowledge before they acquire the standard scientific theory) is coherent (i.e., theory-like) or fragmented. We conducted two studies with large samples (N=328 and N=381) using a new paper-and-pencil test, denoted the EARTH (EArth Representation Test for cHildren), to discriminate between these two alternatives. We performed latent class analyses on the responses to the EARTH to test the mental models associated with these alternatives. The naive mental models, as formulated by Vosniadou and Brewer, were not supported by the results. The results indicated that children's knowledge of the earth becomes more consistent as children grow older. These findings support the view that children's naive knowledge is fragmented.

  9. Effect of Marangoni Convection Generated by Voids on Segregation During Low-G and 1-G Solidification

    NASA Technical Reports Server (NTRS)

    Kassemi, M.; Fripp, A.; Rashidnia, N.; deGroh, H.

    2001-01-01

    Solidification experiments, especially microgravity solidification experiments, are often compromised by the evolution of unwanted voids or bubbles in the melt. Although these voids and/or bubbles are highly undesirable, there is currently no effective means of preventing their formation or of eliminating their adverse effects, particularly during microgravity experiments. Marangoni convection caused by these voids can drastically change the transport processes in the melt. Recent microgravity experiments by Matthiesen (1), Andrews (2) and Fripp (3) are perfect examples of how voids and bubbles can affect the outcome of costly space experiments and significantly increase the level of difficulty in interpreting their results. The formation of bubbles has caused problems in microgravity experiments for a long time. Even in the early Skylab mission, an unexpectedly large number of bubbles were detected in the four materials processing experiments reported by Papazian and Wilcox (4). They demonstrated that while during ground-based tests bubbles were seen to detach from the interface easily and float to the top of the melt, in low-gravity tests no detachment from the interface occurred and large voids were grown in the crystal. More recently, the lead-tin-telluride crystal growth experiment of Fripp et al. (3) flown aboard the USMP-3 mission has provided very interesting results. The purpose of the study was to investigate the effect of natural convection on the solidification process by growing the samples at different orientations with respect to the gravitational field. Large pores and voids were found in the three solid crystal samples processed in space. Post-growth characterization of the compositional profiles of the cells indicated considerable levels of mixing, even in the sample grown in the hot-on-top stable configuration. The mixing was attributed to thermocapillary convection caused by the voids and bubbles which evolved during growth.
Since the thermocapillary convection is orientation-independent, diffusion-controlled growth was not possible in any of the samples, even the top-heated one. These results are consistent with recent studies of thermocapillary convection generated by a bubble on a heated surface undertaken by Kassemi and Rashidnia (5-7) where it is numerically and experimentally shown that the thermocapillary flow generated by a bubble in a model fluid (silicone oil) can drastically modify the temperature field through vigorous mixing of the fluid around it, especially under microgravity conditions.

  10. Proteomic Workflows for Biomarker Identification Using Mass Spectrometry — Technical and Statistical Considerations during Initial Discovery

    PubMed Central

    Orton, Dennis J.; Doucette, Alan A.

    2013-01-01

    Identification of biomarkers capable of differentiating between pathophysiological states of an individual is a laudable goal in the field of proteomics. Protein biomarker discovery generally employs high throughput sample characterization by mass spectrometry (MS), being capable of identifying and quantifying thousands of proteins per sample. While MS-based technologies have rapidly matured, the identification of truly informative biomarkers remains elusive, with only a handful of clinically applicable tests stemming from proteomic workflows. This underlying lack of progress is attributed in large part to erroneous experimental design, biased sample handling, as well as improper statistical analysis of the resulting data. This review will discuss in detail the importance of experimental design and provide some insight into the overall workflow required for biomarker identification experiments. Proper balance between the degree of biological vs. technical replication is required for confident biomarker identification. PMID:28250400

  11. Optimization and performance of the Robert Stobie Spectrograph Near-InfraRed detector system

    NASA Astrophysics Data System (ADS)

    Mosby, Gregory; Indahl, Briana; Eggen, Nathan; Wolf, Marsha; Hooper, Eric; Jaehnig, Kurt; Thielman, Don; Burse, Mahesh

    2018-01-01

    At the University of Wisconsin-Madison, we are building and testing RSS-NIR, the near-infrared (NIR) arm of the Robert Stobie Spectrograph for the Southern African Large Telescope. RSS-NIR will be an enclosed, cooled integral field spectrograph. The RSS-NIR detector system uses a HAWAII-2RG (H2RG) HgCdTe detector from Teledyne controlled by the SIDECAR ASIC and an Inter-University Centre for Astronomy and Astrophysics (IUCAA) ISDEC card. We have successfully characterized and optimized the detector system and report on the optimization steps and performance of the system. We have reduced the CDS read noise to ~20 e- for 200 kHz operation by optimizing ASIC settings. We show an additional factor of 3 reduction in read noise using Fowler sampling techniques and a factor of 2 reduction using up-the-ramp group sampling techniques. We also provide calculations to quantify the conditions for sky-limited observations using these sampling techniques.
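
    The Fowler-sampling gain quoted above follows from simple averaging statistics: averaging N uncorrelated reads at each end of the exposure reduces the noise of their difference by sqrt(N) relative to plain CDS. A sketch of this idealized scaling (real detectors deviate once 1/f noise and drifts dominate):

```python
import math

def fowler_read_noise(cds_noise_e, n_reads):
    """Idealized effective read noise (e-) for Fowler-N sampling:
    averaging n_reads uncorrelated samples at each endpoint reduces
    the CDS difference noise by sqrt(n_reads)."""
    return cds_noise_e / math.sqrt(n_reads)

# Starting from a ~20 e- CDS figure (the value quoted for this system):
for n in (1, 4, 9, 16):
    print(n, round(fowler_read_noise(20.0, n), 1))
```

    Under this scaling the factor-of-3 reduction reported above corresponds to roughly Fowler-9.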

  12. Evaluation damage threshold of optical thin-film using an amplified spontaneous emission source

    NASA Astrophysics Data System (ADS)

    Zhou, Qiong; Sun, Mingying; Zhang, Zhixiang; Yao, Yudong; Peng, Yujie; Liu, Dean; Zhu, Jianqiang

    2014-10-01

    An accurate evaluation method using amplified spontaneous emission (ASE) as the irradiation source has been developed for testing thin-film damage thresholds. The partial coherence of the ASE source results in a very smooth beam profile in the near field and a uniform intensity distribution of the focal spot in the far field. The ASE is generated by an Nd:glass rod amplifier in the SG-II high-power laser facility, with a pulse duration of 9 ns and a spectral width (FWHM) of 1 nm. The damage threshold of the TiO2 high-reflection film is 14.4 J/cm2 using ASE as the irradiation source, about twice the 7.4 J/cm2 measured with a laser source of the same pulse duration and central wavelength. The damage area induced by ASE is small, with small-scale desquamation and a few pits corresponding to the defect distribution of the samples. Large-area desquamation is observed in the laser-damaged area, mainly because of the non-uniformity of the laser light. The ASE damage threshold leads to more accurate evaluation of the samples' damage probability by reducing the influence of hot spots in the irradiation beam. Furthermore, the ASE source has great potential for detecting the defect distribution of optical elements.

  13. Test of 60 kA coated conductor cable prototypes for fusion magnets

    NASA Astrophysics Data System (ADS)

    Uglietti, D.; Bykovsky, N.; Sedlak, K.; Stepanov, B.; Wesche, R.; Bruzzone, P.

    2015-12-01

    Coated conductors could be promising materials for the fabrication of the large magnet systems of future fusion devices. Two prototype conductors (flat cables in steel conduits), each about 2 m long, were manufactured using coated conductor tapes (4 mm wide) from SuperPower and SuperOx, with a total tape length of 1.6 km. Each flat cable is assembled from 20 strands, each strand consisting of a stack of 16 tapes surrounded by two half-circular copper profiles, twisted and soldered. The tapes were measured at 12 T and 4.2 K, and the results were used to assess the conductor electromagnetic properties at low temperature and high field. The two conductors were assembled together in a sample that was tested in the European Dipole (EDIPO) facility. The current sharing temperatures of the two conductors were measured at background fields from 8 T up to 12 T and for currents from 30 kA up to 70 kA; the measured values are within a few percent of the values expected from the measurements on tapes (short samples). After electromagnetic cycling, the current sharing temperature at 12 T and 50 kA decreased from about 12 K to 11 K (about 10%), corresponding to less than 3% of the critical current.

  14. Performance analysis of the toroidal field ITER production conductors

    NASA Astrophysics Data System (ADS)

    Breschi, M.; Macioce, D.; Devred, A.

    2017-05-01

    The production of the superconducting cables for the toroidal field (TF) magnets of the ITER machine has recently been completed at the manufacturing companies selected during the previous qualification phase. The quality assurance/quality control programs that have been implemented to ensure production uniformity across numerous suppliers include performance tests of several conductor samples from selected unit lengths. The short full-size samples (4 m long) were subjected to DC and AC tests in the SULTAN facility at CRPP in Villigen, Switzerland. In a previous work the results of the tests of the conductor performance qualification samples were reported. This work reports the analyses of the results of the tests of the production conductor samples. The results reported here concern the values of current sharing temperature, critical current, effective strain and n-value from the DC tests and the energy dissipated per cycle from the AC loss tests. A detailed comparison is also presented between the performance of the conductors and that of their constituting strands.

  15. Development of an online analyzer of atmospheric H2O2 and several organic hydroperoxides for field campaigns

    NASA Astrophysics Data System (ADS)

    François, S.; Sowka, I.; Monod, A.; Temime-Roussel, B.; Laugier, J. M.; Wortham, H.

    2005-03-01

    An online automated instrument was developed for atmospheric measurements of hydroperoxides with separation and quantification of H2O2 and several organic hydroperoxides. Samples were trapped in aqueous solution in a scrubbing glass coil. Analyses were performed on an HPLC column, followed by derivatization with para-hydroxyphenylacetic acid (POPHA) and peroxidase, and fluorescence detection. Analytical and sampling tests were performed on different parameters to obtain optimum signal-to-noise ratios, high resolution, and collection efficiencies higher than 95% for H2O2 and organic hydroperoxides. The obtained performances show large improvements compared to previous studies. The sampling and analytical devices can be coupled, providing an online analyzer. The device was used during two field campaigns in the Marseilles area, in June 2001 (offline analyzer) and in July 2002 (online analyzer), at rural sites at low and high altitudes, respectively, during the ESCOMPTE and BOND campaigns. During the ESCOMPTE campaign, H2O2 was detected occasionally, and no organic hydroperoxides were observed. During the BOND campaign, substantial amounts of H2O2 and 1-HEHP+MHP were often detected, and two other organic hydroperoxides were occasionally detected. These observations are discussed.

  16. Habitat and food resources of otters (Mustelidae) in Peninsular Malaysia

    NASA Astrophysics Data System (ADS)

    Abdul-Patah, P.; Nur-Syuhada, N.; Md-Nor, S.; Sasaki, H.; Md-Zain, B. M.

    2014-09-01

    Habitat and food resources of otters were studied in several locations in Peninsular Malaysia. A total of 210 fecal samples believed to be from otters were collected from April 2010 to March 2011 and analyzed for diet composition and habitat preferences. DNA testing revealed that only 126 samples could be identified, as Lutrogale perspicillata (105 samples) and Aonyx cinereus (21 samples). Habitat analyses revealed that these two species preferred paddy fields and mangroves as their main habitats, but L. perspicillata preferred to hunt near habitats with large water bodies, such as mangroves, rivers, ponds, and lakes. A. cinereus, on the other hand, was mainly found near land-based habitats, such as paddy fields, casuarina forests, and oil palms near mangroves. The habitats chosen were influenced by food preferences: L. perspicillata consumed a variety of fish species with a supplementary diet of prawns, small mammals, and amphibians, whereas A. cinereus consumed less fish and more non-fish food items, such as insects, crabs, and snails. Since most of the otter habitats in this study are not located within protected areas, conservation efforts involving administrations, landowners, private organizations, and the public are necessary.

  17. Finite Element Modeling of the Bulk Magnetization of Railroad Wheels to Improve Test Conditions for Magnetoacoustic Residual Stress Measurements

    NASA Technical Reports Server (NTRS)

    Fulton, J. P.; Wincheski, B.; Namkung, M.; Utrata, D.

    1992-01-01

    The magnetoacoustic measurement technique has been used successfully for residual stress measurements in laboratory samples. However, when used to field test samples with complex geometries, such as railroad wheels, the sensitivity of the method declines dramatically. It has been suggested that the decrease in performance may be due, in part, to an insufficient or nonuniform magnetic induction in the test sample. The purpose of this paper is to optimize the test conditions by using finite element modeling to predict the distribution of the induced bulk magnetization of railroad wheels. The results suggest that it is possible to obtain a sufficiently large and uniform bulk magnetization by altering the shape of the electromagnet used in the tests. Consequently, problems associated with bulk magnetization can be overcome, and should not prohibit the magnetoacoustic technique from being used to make residual stress measurements in railroad wheels. We begin by giving a brief overview of the magnetoacoustic technique as it applies to residual stress measurements of railroad wheels. We then define the finite element model used to predict the behavior of the current test configuration along with the nonlinear constitutive relations which we obtained experimentally through measurements on materials typically used to construct both railroad wheels and electromagnets. Finally, we show that by modifying the pole of the electromagnet it is possible to obtain a significantly more uniform bulk magnetization in the region of interest.

  18. Effect of sample initial magnetic field on the metal magnetic memory NDT result

    NASA Astrophysics Data System (ADS)

    Moonesan, Mahdi; Kashefi, Mehrdad

    2018-08-01

    One of the major concerns regarding the use of the Metal Magnetic Memory (MMM) technique is the complexity of the effect of residual magnetization on output signals. The present study investigates the influence of the residual magnetic field on stress-induced magnetization. To this end, various initial magnetic fields were induced in a low-carbon steel sample, and for each level of residual magnetic field the sample was subjected to a set of 4-point bending tests; the corresponding MMM signals were collected from the surface of the bent sample using a tailored metal magnetic memory scanning device. Results showed a strong correlation between the sample's residual magnetic field and its corresponding level of stress-induced magnetic field. It was observed that the sample's magnetic field increases when bending stress is applied as long as the initial residual magnetic field is low (i.e., <117 mG), but starts decreasing at higher levels of initial residual magnetic field. In addition, the effect of bending stress on the MMM output of a notched sample was investigated. The result again showed that MMM signals exhibit a drop at the stress concentration zone when the sample has a high level of initial residual magnetic field.

  19. Beginning Postsecondary Students Longitudinal Study First Follow-up (BPS:96/98) Field Test Report. Working Paper Series.

    ERIC Educational Resources Information Center

    Pratt, Daniel J.; Wine, Jennifer S.; Heuer, Ruth E.; Whitmore, Roy W.; Kelly, Janice E.; Doherty, John M.; Simpson, Joe B.; Marti, Norma

    This report describes the methods and procedures used for the field test of the Beginning Postsecondary Students Longitudinal Study First Followup 1996-98 (BPS:96/98). Students in this survey were first interviewed during 1995 as part of the National Postsecondary Student Aid Study 1996 field test. The BPS:96/98 full-scale student sample includes…

  20. Biphasic Finite Element Modeling Reconciles Mechanical Properties of Tissue-Engineered Cartilage Constructs Across Testing Platforms.

    PubMed

    Meloni, Gregory R; Fisher, Matthew B; Stoeckl, Brendan D; Dodge, George R; Mauck, Robert L

    2017-07-01

    Cartilage tissue engineering is emerging as a promising treatment for osteoarthritis, and the field has progressed toward utilizing large animal models for proof of concept and preclinical studies. Mechanical testing of the regenerative tissue is an essential outcome for functional evaluation. However, testing modalities and constitutive frameworks used to evaluate in vitro grown samples differ substantially from those used to evaluate in vivo derived samples. To address this, we developed finite element (FE) models (using FEBio) of unconfined compression and indentation testing, modalities commonly used for such samples. We determined the model sensitivity to tissue radius and subchondral bone modulus, as well as its ability to estimate material parameters using the built-in parameter optimization tool in FEBio. We then sequentially tested agarose gels of 4%, 6%, 8%, and 10% weight/weight using a custom indentation platform, followed by unconfined compression. Similarly, we evaluated the ability of the model to generate material parameters for living constructs by evaluating engineered cartilage. Juvenile bovine mesenchymal stem cells were seeded (2 × 10^7 cells/mL) in 1% weight/volume hyaluronic acid hydrogels and cultured in a chondrogenic medium for 3, 6, and 9 weeks. Samples were planed and tested sequentially in indentation and unconfined compression. The model successfully completed parameter optimization routines for each testing modality for both acellular and cell-based constructs. Traditional outcome measures and the FE-derived outcomes showed significant changes in material properties during the maturation of engineered cartilage tissue, capturing dynamic changes in functional tissue mechanics. These outcomes were significantly correlated with one another, establishing this FE modeling approach as a singular method for the evaluation of functional engineered and native tissue regeneration, both in vitro and in vivo.

  1. 18/20 T high magnetic field scanning tunneling microscope with fully low voltage operability, high current resolution, and large scale searching ability.

    PubMed

    Li, Quanfeng; Wang, Qi; Hou, Yubin; Lu, Qingyou

    2012-04-01

    We present a home-built 18/20 T high magnetic field scanning tunneling microscope (STM) featuring fully low voltage (lower than ±15 V) operability in low temperatures, large scale searching ability, and 20 fA high current resolution (measured by using a 100 GOhm dummy resistor to replace the tip-sample junction) with a bandwidth of 3.03 kHz. To accomplish low voltage operation which is important in achieving high precision, low noise, and low interference with the strong magnetic field, the coarse approach is implemented with an inertial slider driven by the lateral bending of a piezoelectric scanner tube (PST) whose inner electrode is axially split into two for enhanced bending per volt. The PST can also drive the same sliding piece to inertial slide in the other bending direction (along the sample surface) of the PST, which realizes the large area searching ability. The STM head is housed in a three segment tubular chamber, which is detachable near the STM head for the convenience of sample and tip changes. Atomic resolution images of a graphite sample taken under 17.6 T and 18.0001 T are presented to show its performance. © 2012 American Institute of Physics

  2. LAMPhimerus: A novel LAMP assay for detecting Amphimerus sp. DNA in human stool samples

    PubMed Central

    Calvopiña, Manuel; Fontecha-Cuenca, Cristina; Sugiyama, Hiromu; Sato, Megumi; López Abán, Julio; Vicente, Belén; Muro, Antonio

    2017-01-01

    Background Amphimeriasis is a fish-borne disease caused by the liver fluke Amphimerus spp. that has recently been reported as endemic on the tropical Pacific side of Ecuador, with a high prevalence in humans and domestic animals. Diagnosis is based on stool examination to identify parasite eggs, but this lacks sensitivity; additionally, the morphology of the eggs may be confused with that of other liver and intestinal flukes. No immunological or molecular methods have been developed to date. New diagnostic techniques for specific and sensitive detection of Amphimerus spp. DNA in clinical samples are needed. Methodology/Principal findings A LAMP assay targeting a sequence of the Amphimerus sp. internal transcribed spacer 2 region was designed. Amphimerus sp. DNA was obtained from adult worms recovered from animals and used to optimize the molecular assays. Conventional PCR was performed using the outer primers F3-B3 to verify proper amplification of the Amphimerus sp. DNA target sequence. LAMP was optimized using different reaction mixtures and temperatures, and it was finally set up as LAMPhimerus. The specificity and sensitivity of both PCR and LAMP were evaluated; the detection limit was 1 pg of genomic DNA. Field testing was done using 44 human stool samples collected from localities where the fluke is endemic. Twenty-five samples were positive for Amphimerus sp. eggs by microscopy. In molecular testing, PCR with F3-B3 was ineffective when DNA from fecal samples was used. When testing all the human stool samples included in our study, the sensitivity and specificity of our LAMPhimerus assay were 76.67% and 80.77%, respectively. Conclusions/Significance We have developed and evaluated, for the first time, a specific and sensitive LAMP assay for detecting Amphimerus sp. in human stool samples.
The procedure has been named LAMPhimerus method and has the potential to be adapted for field diagnosis and disease surveillance in amphimeriasis-endemic areas. Future large-scale studies will assess the applicability of this novel LAMP assay. PMID:28628614

  3. Field Detection of Drugs of Abuse in Oral Fluid Using the Alere™ DDS®2 Mobile Test System with Confirmation by Liquid Chromatography Tandem Mass Spectrometry (LC-MS/MS).

    PubMed

    Krotulski, Alex J; Mohr, Amanda L A; Friscia, Melissa; Logan, Barry K

    2018-04-01

    The collection and analysis of drugs in oral fluid (OF) at the roadside has become more feasible with the introduction of portable testing devices such as the Alere™ DDS®2 Mobile Test System (DDS®2). The objective of this study was to compare the on-site results for the DDS®2 to laboratory-based confirmatory assays with respect to detection of drugs of abuse in human subjects. As part of a larger Institutional Review Board approved study, two OF samples were collected from each participant at a music festival in Miami, FL, USA. One OF sample was field screened using the DDS®2, and a confirmatory OF sample was collected using the Quantisal™ OF collection device and submitted to the laboratory for testing. In total, 124 subjects participated in this study, providing two contemporaneous OF samples. DDS®2 field screening yielded positive results for delta-9-tetrahydrocannabinol (THC) (n = 27), cocaine (n = 12), amphetamine (n = 3), methamphetamine (n = 3) and benzodiazepine (n = 1). No opiate-positive OF samples were detected. For cocaine, amphetamine, methamphetamine and benzodiazepines, the DDS®2 displayed sensitivity, specificity and accuracy of 100%. For THC, the DDS®2 displayed sensitivity of 90%, specificity of 100% and accuracy of 97.5% when the confirmation threshold matched the manufacturer's advertised cut-off. When this confirmatory threshold was lowered to the analytical limit of detection (i.e., 1 ng/mL), apparent device performance for THC was poorer due to additional samples testing positive by confirmatory assay that had tested negative on the DDS®2, demonstrating a need for correlation between manufacturer cut-off and analytical reporting limit. These results from drug-using subjects demonstrate the value of field-based OF testing, and illustrate the significance of selecting an appropriate confirmation cut-off concentration with respect to performance evaluation and detection of drug use.
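The sensitivity, specificity, and accuracy figures quoted above follow from standard confusion-matrix definitions. A minimal sketch; the counts below are hypothetical, chosen only to reproduce the THC percentages, not taken from the study:

```python
def screening_performance(tp, fp, tn, fn):
    """Standard binary-screening metrics from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)           # true-positive rate
    specificity = tn / (tn + fp)           # true-negative rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Hypothetical counts: 18 true positives, 2 missed positives,
# 60 true negatives, 0 false positives (80 paired samples total)
sens, spec, acc = screening_performance(tp=18, fp=0, tn=60, fn=2)
print(sens, spec, acc)  # 0.9 1.0 0.975
```

Note how the choice of confirmation cut-off shifts samples between the true-negative and false-negative cells, which is exactly why the apparent THC performance changed when the threshold was lowered.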

  4. Spin-orbit torque induced magnetization switching in Ta/Co{sub 20}Fe{sub 60}B{sub 20}/MgO structures under small in-plane magnetic fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Jiangwei, E-mail: caojw@lzu.edu.cn; Zheng, Yuqiang; Su, Xianpeng

    2016-04-25

    Spin-orbit torque (SOT)-induced magnetization switching under small in-plane magnetic fields in as-deposited and annealed Ta/CoFeB/MgO structures is studied. For the as-deposited samples, partial SOT-induced switching behavior is observed under an in-plane field of less than 100 Oe. Conversely, for the annealed samples, an in-plane field of 10 Oe is large enough to achieve full deterministic magnetization switching. The Dzyaloshinskii-Moriya interaction at the Ta/CoFeB interface is believed to be the main reason for the discrepancy in the requisite in-plane magnetic fields for switching in the as-deposited and annealed samples. In addition, asymmetric field-dependence behavior of SOT-induced magnetization switching is observed in the annealed samples. Deterministic magnetization switching in the absence of an external magnetic field is obtained in the annealed samples, which is extremely important for developing SOT-based magnetoresistive random access memory.

  5. Chemical analysis of whale breath volatiles: a case study for non-invasive field health diagnostics of marine mammals.

    PubMed

    Cumeras, Raquel; Cheung, William H K; Gulland, Frances; Goley, Dawn; Davis, Cristina E

    2014-09-12

    We explored the feasibility of collecting exhaled breath from a moribund gray whale (Eschrichtius robustus) for potential non-invasive health monitoring of marine mammals. Biogenic volatile organic compound (VOC) profiling is a relatively new field of research, in which the chemical composition of breath is used to non-invasively assess the health and physiological processes on-going within an animal or human. In this study, two telescopic sampling poles were designed and tested with the primary aim of collecting whale breath exhalations (WBEs). Once the WBEs were successfully collected, they were immediately transferred onto a stable matrix sorbent through a custom manifold system. A total of two large volume WBEs were successfully captured and pre-concentrated onto two Tenax®-TA traps (one exhalation per trap). The samples were then returned to the laboratory where they were analyzed using solid phase micro extraction (SPME) and gas chromatography/mass spectrometry (GC/MS). A total of 70 chemicals were identified (58 positively identified) in the whale breath samples. These chemicals were also matched against a database of VOCs found in humans, and 44% of chemicals found in the whale breath are also released by healthy humans. The exhaled gray whale breath showed a rich diversity of chemicals, indicating the analysis of whale breath exhalations is a promising new field of research.

  6. Feasibility and acceptability of the DSM-5 Field Trial procedures in the Johns Hopkins Community Psychiatry Programs†

    PubMed Central

    Clarke, Diana E.; Wilcox, Holly C.; Miller, Leslie; Cullen, Bernadette; Gerring, Joan; Greiner, Lisa H.; Newcomer, Alison; Mckitty, Mellisha V.; Regier, Darrel A.; Narrow, William E.

    2014-01-01

    The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) contains criteria for psychiatric diagnoses that reflect advances in the science and conceptualization of mental disorders and address the needs of clinicians. DSM-5 also recommends research on dimensional measures of cross-cutting symptoms and diagnostic severity, which are expected to better capture patients’ experiences with mental disorders. Prior to its May 2013 release, the American Psychiatric Association (APA) conducted field trials to examine the feasibility, clinical utility, reliability, and where possible, the validity of proposed DSM-5 diagnostic criteria and dimensional measures. The methods and measures proposed for the DSM-5 field trials were pilot tested in adult and child/adolescent clinical samples, with the goal to identify and correct design and procedural problems with the proposed methods before resources were expended for the larger DSM-5 Field Trials. Results allowed for the refinement of the protocols, procedures, and measures, which facilitated recruitment, implementation, and completion of the DSM-5 Field Trials. These results highlight the benefits of pilot studies in planning large multisite studies. PMID:24615761

  7. One-step solvothermal deposition of ZnO nanorod arrays on a wood surface for robust superamphiphobic performance and superior ultraviolet resistance

    PubMed Central

    Yao, Qiufang; Wang, Chao; Fan, Bitao; Wang, Hanwei; Sun, Qingfeng; Jin, Chunde; Zhang, Hong

    2016-01-01

    In the present paper, uniformly large-scale wurtzite-structured ZnO nanorod arrays (ZNAs) were deposited onto a wood surface through a one-step solvothermal method. The as-prepared samples were characterized by X-ray diffraction (XRD), field-emission scanning electron microscopy (FE-SEM), transmission electron microscopy (TEM), Fourier transform infrared spectroscopy (FTIR), thermogravimetry (TG), and differential thermal analysis (DTA). ZNAs with a diameter of approximately 85 nm and a length of approximately 1.5 μm were chemically bonded onto the wood surface through hydrogen bonds. The superamphiphobic performance and ultraviolet resistance were measured and evaluated by water or oil contact angles (WCA or OCA) and roll-off angles, sand abrasion tests and an artificially accelerated ageing test. The results show that the ZNA-treated wood demonstrates a robust superamphiphobic performance under mechanical impact, corrosive liquids, intermittent and transpositional temperatures, and water spray. Additionally, the as-prepared wood sample shows superior ultraviolet resistance. PMID:27775091

  8. One-step solvothermal deposition of ZnO nanorod arrays on a wood surface for robust superamphiphobic performance and superior ultraviolet resistance

    NASA Astrophysics Data System (ADS)

    Yao, Qiufang; Wang, Chao; Fan, Bitao; Wang, Hanwei; Sun, Qingfeng; Jin, Chunde; Zhang, Hong

    2016-10-01

    In the present paper, uniformly large-scale wurtzite-structured ZnO nanorod arrays (ZNAs) were deposited onto a wood surface through a one-step solvothermal method. The as-prepared samples were characterized by X-ray diffraction (XRD), field-emission scanning electron microscopy (FE-SEM), transmission electron microscopy (TEM), Fourier transform infrared spectroscopy (FTIR), thermogravimetry (TG), and differential thermal analysis (DTA). ZNAs with a diameter of approximately 85 nm and a length of approximately 1.5 μm were chemically bonded onto the wood surface through hydrogen bonds. The superamphiphobic performance and ultraviolet resistance were measured and evaluated by water or oil contact angles (WCA or OCA) and roll-off angles, sand abrasion tests and an artificially accelerated ageing test. The results show that the ZNA-treated wood demonstrates a robust superamphiphobic performance under mechanical impact, corrosive liquids, intermittent and transpositional temperatures, and water spray. Additionally, the as-prepared wood sample shows superior ultraviolet resistance.

  9. Wide-field lensless fluorescent microscopy using a tapered fiber-optic faceplate on a chip.

    PubMed

    Coskun, Ahmet F; Sencan, Ikbal; Su, Ting-Wei; Ozcan, Aydogan

    2011-09-07

    We demonstrate lensless fluorescent microscopy over a large field-of-view of ~60 mm(2) with a spatial resolution of <4 µm. In this on-chip fluorescent imaging modality, the samples are placed on a fiber-optic faceplate that is tapered such that the density of the fiber-optic waveguides on the top facet is >5 fold larger than the bottom one. Placed on this tapered faceplate, the fluorescent samples are pumped from the side through a glass hemisphere interface. After excitation of the samples, the pump light is rejected through total internal reflection that occurs at the bottom facet of the sample substrate. The fluorescent emission from the sample is then collected by the smaller end of the tapered faceplate and is delivered to an opto-electronic sensor-array to be digitally sampled. Using a compressive sampling algorithm, we decode these raw lensfree images to validate the resolution (<4 µm) of this on-chip fluorescent imaging platform using microparticles as well as labeled Giardia muris cysts. This wide-field lensfree fluorescent microscopy platform, being compact and high-throughput, might provide a valuable tool especially for cytometry, rare cell analysis (involving large area microfluidic systems) as well as for microarray imaging applications.

  10. Field test of a motorcycle safety education course for novice riders

    DOT National Transportation Integrated Search

    1982-07-01

    The purpose of this study was to subject the Motorcycle Safety Foundation's Motorcycle Rider Course (MRC) to a large-scale field test designed to evaluate the following aspects of the course: (1) Instructional Effectiveness, (2) User Acceptance, and ...

  11. Evaluation of Niobium as Candidate Electrode Material for DC High Voltage Photoelectron Guns

    NASA Technical Reports Server (NTRS)

    BastaniNejad, M.; Mohamed, Abdullah; Elmustafa, A. A.; Adderley, P.; Clark, J.; Covert, S.; Hansknecht, J.; Hernandez-Garcia, C.; Poelker, M.; Mammei, R.; et al.

    2012-01-01

    The field emission characteristics of niobium electrodes were compared to those of stainless steel electrodes using a DC high voltage field emission test apparatus. A total of eight electrodes were evaluated: two 304 stainless steel electrodes polished to mirror-like finish with diamond grit and six niobium electrodes (two single-crystal, two large-grain, and two fine-grain) that were chemically polished using a buffered-chemical acid solution. Upon the first application of high voltage, the best large-grain and single-crystal niobium electrodes performed better than the best stainless steel electrodes, exhibiting less field emission at comparable voltage and field strength. In all cases, field emission from electrodes (stainless steel and/or niobium) could be significantly reduced and sometimes completely eliminated, by introducing krypton gas into the vacuum chamber while the electrode was biased at high voltage. Of all the electrodes tested, a large-grain niobium electrode performed the best, exhibiting no measurable field emission (< 10 pA) at 225 kV with 20 mm cathode/anode gap, corresponding to a field strength of 18.7 MV/m.

  12. Testing the gravitational instability hypothesis?

    NASA Technical Reports Server (NTRS)

    Babul, Arif; Weinberg, David H.; Dekel, Avishai; Ostriker, Jeremiah P.

    1994-01-01

    We challenge a widely accepted assumption of observational cosmology: that successful reconstruction of observed galaxy density fields from measured galaxy velocity fields (or vice versa), using the methods of gravitational instability theory, implies that the observed large-scale structures and large-scale flows were produced by the action of gravity. This assumption is false, in that there exist nongravitational theories that pass the reconstruction tests and gravitational theories with certain forms of biased galaxy formation that fail them. Gravitational instability theory predicts specific correlations between large-scale velocity and mass density fields, but the same correlations arise in any model where (a) structures in the galaxy distribution grow from homogeneous initial conditions in a way that satisfies the continuity equation, and (b) the present-day velocity field is irrotational and proportional to the time-averaged velocity field. We demonstrate these assertions using analytical arguments and N-body simulations. If large-scale structure is formed by gravitational instability, then the ratio of the galaxy density contrast to the divergence of the velocity field yields an estimate of the density parameter Omega (or, more generally, an estimate of beta ≡ Omega^0.6/b, where b is an assumed constant of proportionality between galaxy and mass density fluctuations). In nongravitational scenarios, the values of Omega or beta estimated in this way may fail to represent the true cosmological values. However, even if nongravitational forces initiate and shape the growth of structure, gravitationally induced accelerations can dominate the velocity field at late times, long after the action of any nongravitational impulses. The estimated beta approaches the true value in such cases, and in our numerical simulations the estimated beta values are reasonably accurate for both gravitational and nongravitational models. 
Reconstruction tests that show correlations between galaxy density and velocity fields can rule out some physically interesting models of large-scale structure. In particular, successful reconstructions constrain the nature of any bias between the galaxy and mass distributions, since processes that modulate the efficiency of galaxy formation on large scales in a way that violates the continuity equation also produce a mismatch between the observed galaxy density and the density inferred from the peculiar velocity field. We obtain successful reconstructions for a gravitational model with peaks biasing, but we also show examples of gravitational and nongravitational models that fail reconstruction tests because of more complicated modulations of galaxy formation.
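    The beta estimator described above follows from linear theory, where div(v) = -H0 * f(Omega) * delta_mass and delta_g = b * delta_mass, so beta = f/b can be read off as a regression slope. A minimal numerical sketch with made-up (hypothetical) field values, not the paper's simulations:

```python
import numpy as np

# Toy illustration of the linear-theory beta estimate: build a biased
# galaxy field and a consistent velocity divergence, then recover
# beta = f(Omega)/b by regressing -div(v)/H0 against delta_g.
# All numbers here are illustrative assumptions.

rng = np.random.default_rng(0)
H0 = 100.0            # km/s/Mpc (h = 1 units)
Omega, b = 0.3, 1.5   # assumed density parameter and linear bias
f = Omega**0.6        # linear growth-rate approximation
beta_true = f / b

delta_mass = rng.normal(0.0, 0.2, size=10_000)  # toy mass density contrast
delta_gal = b * delta_mass                      # linearly biased galaxy field
div_v = -H0 * f * delta_mass                    # linear-theory velocity divergence

# least-squares slope of -div(v)/H0 against the galaxy density contrast
beta_est = np.sum(-div_v / H0 * delta_gal) / np.sum(delta_gal**2)
print(beta_est, beta_true)
```

In a nongravitational scenario the velocity divergence would not be tied to delta_mass this way, and the recovered slope would no longer equal f/b.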

  13. Development of a Tunnel Diode Resonator technique for magnetic measurements in Electrostatic Levitation chamber

    NASA Astrophysics Data System (ADS)

    Spyrison, N. S.; Prommapan, P.; Kim, H.; Maloney, J.; Rustan, G. E.; Kreyssig, A.; Goldman, A. I.; Prozorov, R.

    2011-03-01

    The incorporation of the Tunnel Diode Resonator (TDR) technique into an ElectroStatic Levitation (ESL) apparatus was explored. The TDR technique is known to operate and behave well at low temperatures with careful attention to coil-sample positioning in a dark, shielded environment. Under these conditions a frequency resolution of 10^-9 in a few seconds of counting time can be achieved. Complications arise when the technique is applied in the ESL chamber, where a sample of molten metal levitates less than 10 mm from the coil in a large electrostatic field. We have tested a variety of coils unconventional for TDR, including Helmholtz pairs and Archimedean spiral coils. Work was supported by the National Science Foundation under grant DMR-08-17157.

  14. Towards Using NMR to Screen for Spoiled Tomatoes Stored in 1,000 L, Aseptically Sealed, Metal-Lined Totes

    PubMed Central

    Pinter, Michael D.; Harter, Tod; McCarthy, Michael J.; Augustine, Matthew P.

    2014-01-01

    Nuclear magnetic resonance (NMR) spectroscopy is used to track factory relevant tomato paste spoilage. It was found that spoilage in tomato paste test samples leads to longer spin lattice relaxation times T1 using a conventional low magnetic field NMR system. The increase in T1 value for contaminated samples over a five day room temperature exposure period prompted the work to be extended to the study of industry standard, 1,000 L, non-ferrous, metal-lined totes. NMR signals and T1 values were recovered from a large format container with a single-sided NMR sensor. The results of this work suggest that a handheld NMR device can be used to study tomato paste spoilage in factory process environments. PMID:24594611
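    The spin-lattice relaxation measurement described rests on fitting a recovery curve. As a hedged sketch (illustrative values, not the paper's data): in saturation recovery, S(t) = S0 * (1 - exp(-t/T1)), so with the amplitude normalized to 1 the curve linearizes as ln(1 - S) = -t/T1 and T1 follows from a line fit.

```python
import numpy as np

# Sketch of extracting T1 from a saturation-recovery curve.
# The T1 value and delay grid are assumed for illustration only.

t1_true = 0.8                        # seconds, assumed relaxation time
t = np.linspace(0.05, 2.0, 30)       # recovery delays (s)
signal = 1.0 - np.exp(-t / t1_true)  # ideal noiseless recovery, S0 = 1

# linearize and fit: ln(1 - S) = -t / T1
slope = np.polyfit(t, np.log(1.0 - signal), 1)[0]
t1_est = -1.0 / slope
print(f"estimated T1 = {t1_est:.3f} s")
```

With noisy data one would instead fit the exponential model directly by nonlinear least squares, since the log transform amplifies noise near full recovery.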

  15. Towards using NMR to screen for spoiled tomatoes stored in 1,000 L, aseptically sealed, metal-lined totes.

    PubMed

    Pinter, Michael D; Harter, Tod; McCarthy, Michael J; Augustine, Matthew P

    2014-03-03

    Nuclear magnetic resonance (NMR) spectroscopy is used to track factory relevant tomato paste spoilage. It was found that spoilage in tomato paste test samples leads to longer spin lattice relaxation times T1 using a conventional low magnetic field NMR system. The increase in T1 value for contaminated samples over a five day room temperature exposure period prompted the work to be extended to the study of industry standard, 1,000 L, non-ferrous, metal-lined totes. NMR signals and T1 values were recovered from a large format container with a single-sided NMR sensor. The results of this work suggest that a handheld NMR device can be used to study tomato paste spoilage in factory process environments.

  16. Crystallization of Calcium Carbonate in a Large Scale Field Study

    NASA Astrophysics Data System (ADS)

    Ueckert, Martina; Wismeth, Carina; Baumann, Thomas

    2017-04-01

    The long term efficiency of geothermal facilities and aquifer thermal energy storage in the carbonaceous Malm aquifer in the Bavarian Molasse Basin is seriously affected by precipitation of carbonates. This is mainly caused by pressure and temperature changes leading to oversaturation during production. Crystallization starts with polymorphic nuclei of calcium carbonate and is often described as diffusion-reaction controlled. Here, calcite crystallization is favoured by high concentration gradients, while aragonite crystallization occurs at high reaction rates. The factors affecting the crystallization processes have been described for simplified, well controlled laboratory experiments; the knowledge about the behaviour in more complex natural systems is still limited. The crystallization of the polymorphic forms of calcium carbonate was investigated during a heat storage test at our test site in the eastern part of the Bavarian Molasse Basin, and complementary laboratory experiments in an autoclave were run. Both field and laboratory experiments were conducted with carbonaceous tap water; in the laboratory experiments ultrapure water was additionally used. To avoid precipitation from the tap water, a calculated amount of CO2 was added prior to heating the water from 45 - 110°C (laboratory) and 65 - 110°C (field), respectively. A total water volume of 0.5 L (laboratory) and 1 L (field), respectively, was immediately sampled and filtrated through 10 - 0.1

  17. MRI issues for ballistic objects: information obtained at 1.5-, 3- and 7-Tesla.

    PubMed

    Dedini, Russell D; Karacozoff, Alexandra M; Shellock, Frank G; Xu, Duan; McClellan, R Trigg; Pekmezci, Murat

    2013-07-01

    Few studies exist for magnetic resonance imaging (MRI) issues and ballistics, and there are no studies addressing movement, heating, and artifacts associated with ballistics at 3-tesla (T). Movement because of magnetic field interactions and radiofrequency (RF)-induced heating of retained bullets may injure nearby critical structures. Artifacts may also interfere with the diagnostic use of MRI. To investigate these potential hazards of MRI on a sample of bullets and shotgun pellets. Laboratory investigation, ex vivo. Thirty-two different bullets and seven different shotgun pellets, commonly encountered in criminal trauma, were assessed relative to 1.5-, 3-, and 7-T magnetic resonance systems. Magnetic field interactions, including translational attraction and torque, were measured. A representative sample of five bullets was then tested for magnetic field interactions, RF-induced heating, and the generation of artifacts at 3-T. At all static magnetic field strengths, non-steel-containing bullets and pellets exhibited no movement, whereas one steel core bullet and two steel pellets exhibited movement in excess of what might be considered safe for patients in MRI at 1.5-, 3-, and 7-T. At 3-T, the maximum temperature increase of the five bullets tested was 1.7°C versus background heating of 1.5°C. Of five bullets tested for artifacts, those without a steel core exhibited small signal voids, whereas a single steel core bullet exhibited a very large signal void. Ballistics made of lead with copper or alloy jackets appear to be safe with respect to MRI-related movement at 1.5-, 3-, and 7-T static magnetic fields, whereas ballistics containing steel may pose a danger if near critical body structures because of strong magnetic field interactions. Temperature increases of selected ballistics during 3-T MRI were not clinically significant, even for the ferromagnetic projectiles.
Finally, ballistics containing steel generated larger artifacts when compared with ballistics made of lead with copper and alloy jackets and may impair the diagnostic use of MRI. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. Computational fluid dynamic (CFD) investigation of thermal uniformity in a thermal cycling based calibration chamber for MEMS

    NASA Astrophysics Data System (ADS)

    Gui, Xulong; Luo, Xiaobing; Wang, Xiaoping; Liu, Sheng

    2015-12-01

    Micro-electro-mechanical systems (MEMS) have become important for many industries such as automotive, home appliances, and portable electronics, especially with the emergence of the Internet of Things. Volume testing with temperature compensation has been essential in order to provide MEMS-based sensors with repeatability, consistency, reliability, and durability at low cost. In particular, in the temperature calibration test, the temperature uniformity of the thermal cycling based calibration chamber becomes important for obtaining precision sensors, as each sensor is different before calibration. When sensor samples are loaded into the chamber, the chamber door is usually opened, fixtures are placed into the chamber, and the samples are mounted on the fixtures. These operations may affect temperature uniformity in the chamber. In order to study the influence of sample loading on temperature uniformity in the chamber during calibration testing, numerical simulation work was conducted first. Temperature and flow fields were simulated in an empty chamber, a chamber with an open door, a chamber with samples, and a chamber with fixtures, respectively. The simulations showed that opening the chamber door, the sample size, and the number of fixture layers all affect the flow and temperature fields. Experimental validation found the measured temperature values consistent with the simulated values.

  19. Large strain cruciform biaxial testing for FLC detection

    NASA Astrophysics Data System (ADS)

    Güler, Baran; Efe, Mert

    2017-10-01

    Selection of the proper test method, specimen design, and analysis method are key issues for studying the formability of sheet metals and detecting their forming limit curves (FLC). Materials with complex microstructures may need an additional micro-mechanical investigation and accurate modelling. The cruciform biaxial test stands as an alternative to standard tests as it achieves frictionless, in-plane, multi-axial stress states with a single sample geometry. In this study, we introduce a small-scale (less than 10 cm) cruciform sample allowing micro-mechanical investigation at stress states ranging from plane strain to equibiaxial. With successful specimen design and surface finish, large forming limit strains are obtained at the test region of the sample. The large forming limit strains obtained by experiments are compared to the values obtained from the Marciniak-Kuczynski (M-K) local necking model and the Cockcroft-Latham damage model. This comparison shows that the experimental limiting strains are beyond the theoretical values, approaching the fracture strain of the two test materials: Al-6061-T6 aluminum alloy and DC-04 high formability steel.

  20. Calculating p-values and their significances with the Energy Test for large datasets

    NASA Astrophysics Data System (ADS)

    Barter, W.; Burr, C.; Parkes, C.

    2018-04-01

    The energy test method is a multi-dimensional test of whether two samples are consistent with arising from the same underlying population, through the calculation of a single test statistic (called the T-value). The method has recently been used in particle physics to search for samples that differ due to CP violation. The generalised extreme value function has previously been used to describe the distribution of T-values under the null hypothesis that the two samples are drawn from the same underlying population. We show that, in a simple test case, the distribution is not sufficiently well described by the generalised extreme value function. We present a new method, where the distribution of T-values under the null hypothesis when comparing two large samples can be found by scaling the distribution found when comparing small samples drawn from the same population. This method can then be used to quickly calculate the p-values associated with the results of the test.
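    The T-value at the heart of the method can be sketched as follows. This uses one common convention (Gaussian distance weighting psi(d) = exp(-d^2 / 2 sigma^2), same-sample pair sums with self-pairs removed); normalizations vary between references and may differ in constants from the paper's exact definition.

```python
import numpy as np

# Energy-test T-value between two point samples x and y.
# Larger T indicates the samples are less consistent with a
# common underlying population.

def t_value(x, y, sigma=1.0):
    def psi_sum(a, b):
        # sum of exp(-d^2 / 2 sigma^2) over all pairs between a and b
        d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
        return np.sum(np.exp(-d2 / (2.0 * sigma**2)))
    n, m = len(x), len(y)
    term_xx = (psi_sum(x, x) - n) / (2.0 * n * n)  # drop self-pairs (psi(0) = 1)
    term_yy = (psi_sum(y, y) - m) / (2.0 * m * m)
    term_xy = psi_sum(x, y) / (n * m)
    return term_xx + term_yy - term_xy

rng = np.random.default_rng(0)
same = t_value(rng.normal(0, 1, (300, 2)), rng.normal(0, 1, (300, 2)))
diff = t_value(rng.normal(0, 1, (300, 2)), rng.normal(0.5, 1, (300, 2)))
print(same, diff)  # T is larger when the samples differ
```

In practice the null distribution of T, and hence the p-value, is obtained by recomputing T under many random permutations of the combined sample, which is exactly the expensive step the paper's scaling method is designed to shortcut for large samples.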

  1. International Collaboration Activities on Engineered Barrier Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jove-Colon, Carlos F.

    The Used Fuel Disposition Campaign (UFDC) within the DOE Fuel Cycle Technologies (FCT) program has been engaging in international collaborations between repository R&D programs for high-level waste (HLW) disposal to leverage the gathered knowledge and laboratory/field data on near- and far-field processes from experiments at underground research laboratories (URLs). Heater test experiments at URLs provide a unique opportunity to study, under representative conditions, the thermal effects of heat-generating nuclear waste in subsurface repository environments. Various configurations of these experiments have been carried out at various URLs according to the disposal design concepts of the hosting country's repository program. The FEBEX (Full-scale Engineered Barrier Experiment in Crystalline Host Rock) project is a large-scale heater test experiment originated by the Spanish radioactive waste management agency (Empresa Nacional de Residuos Radiactivos S.A. – ENRESA) at the Grimsel Test Site (GTS) URL in Switzerland. The project was subsequently managed by CIEMAT. FEBEX-DP is a concerted effort of various international partners working on the evaluation of sensor data and the characterization of samples obtained during the course of this field test and its subsequent dismantling. The main purpose of these field-scale experiments is to evaluate the feasibility of creating an engineered barrier system (EBS) with a horizontal configuration according to the Spanish concept of deep geological disposal of high-level radioactive waste in crystalline rock. Another key aspect of this project is to improve the knowledge of coupled processes such as the thermal-hydro-mechanical (THM) and thermal-hydro-chemical (THC) processes operating in the near-field environment. The focus is on model development and the validation of predictions through model implementation in computational tools that simulate coupled THM and THC processes.

  2. Characterization of YBa2Cu3O7, including critical current density Jc, by trapped magnetic field

    NASA Technical Reports Server (NTRS)

    Chen, In-Gann; Liu, Jianxiong; Weinstein, Roy; Lau, Kwong

    1992-01-01

    Spatial distributions of persistent magnetic field trapped by sintered and melt-textured ceramic-type high-temperature superconductor (HTS) samples have been studied. The trapped field can be reproduced by a model of the current consisting of two components: (1) a surface current Js and (2) a uniform volume current Jv. This Js + Jv model gives a satisfactory account of the spatial distribution of the magnetic field trapped by different types of HTS samples. The magnetic moment can be calculated, based on the Js + Jv model, and the result agrees well with that measured by standard vibrating sample magnetometer (VSM). As a consequence, Jc predicted by VSM methods agrees with Jc predicted from the Js + Jv model. The field mapping method described is also useful to reveal the granular structure of large HTS samples and regions of weak links.
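    The magnetic moment implied by the Js + Jv decomposition can be sketched for an idealized disk-shaped sample. This is a hedged illustration with assumed dimensions and current densities (the paper's actual model is fit to measured field maps, which are not reproduced here): a rim sheet current Js contributes the moment of a single loop, while a uniform azimuthal volume current density Jv contributes an integral of nested loops over radius.

```python
import numpy as np

# Js + Jv moment for a disk of radius R and thickness h.
# All parameter values are illustrative assumptions.

R, h = 0.01, 0.002   # sample radius and thickness (m)
Js = 5.0e4           # rim sheet current density (A/m)
Jv = 1.0e7           # uniform azimuthal volume current density (A/m^2)

m_surface = Js * h * np.pi * R**2        # one current loop at r = R
m_volume = Jv * h * np.pi * R**3 / 3.0   # integral of pi r^2 Jv h dr from 0 to R

# numerical check of the volume term by summing thin rings (midpoint rule)
dr = R / 100_000
r_mid = (np.arange(100_000) + 0.5) * dr
m_volume_num = np.sum(np.pi * r_mid**2 * Jv * h * dr)

print(m_surface + m_volume, m_volume_num)
```

The total moment m_surface + m_volume is what would be compared against a vibrating sample magnetometer measurement, as the abstract describes.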

  3. Photo stratification improves northwest timber volume estimates.

    Treesearch

    Colin D. MacLean

    1972-01-01

    Data from extensive timber inventories of 12 counties in western and central Washington were analyzed to test the relative efficiency of double sampling for stratification as a means of estimating total volume. Photo and field plots, when combined in a stratified sampling design, proved about twice as efficient as simple field sampling. Although some gains were made by...

  4. Deployment of a Reverse Transcription Loop-Mediated Isothermal Amplification Test for Ebola Virus Surveillance in Remote Areas in Guinea.

    PubMed

    Kurosaki, Yohei; Magassouba, N'Faly; Bah, Hadja Aïssatou; Soropogui, Barré; Doré, Amadou; Kourouma, Fodé; Cherif, Mahamoud Sama; Keita, Sakoba; Yasuda, Jiro

    2016-10-15

    To strengthen the laboratory diagnostic capacity for Ebola virus disease (EVD) in the remote areas of Guinea, we deployed a mobile field laboratory and implemented reverse transcription loop-mediated isothermal amplification (RT-LAMP) for postmortem testing. We tested 896 oral swab specimens and 21 serum samples, using both RT-LAMP and reverse transcription-polymerase chain reaction (RT-PCR). Neither test yielded a positive result, and the results from RT-LAMP and RT-PCR were consistent. More than 95% of the samples were tested within 2 days of sample collection. These results highlight the usefulness of the RT-LAMP assay as an EVD diagnostic testing method in the field or remote areas. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  5. Accelerated Creep Testing of High Strength Aramid Webbing

    NASA Technical Reports Server (NTRS)

    Jones, Thomas C.; Doggett, William R.; Stnfield, Clarence E.; Valverde, Omar

    2012-01-01

    A series of preliminary accelerated creep tests were performed on four variants of 12K and 24K lbf rated Vectran webbing to help develop an accelerated creep test methodology and analysis capability for high strength aramid webbings. The variants included pristine, aged, folded and stitched samples. This class of webbings is used in the restraint layer of habitable, inflatable space structures, for which the lifetime properties are currently not well characterized. The Stepped Isothermal Method was used to accelerate the creep life of the webbings and a novel stereo photogrammetry system was used to measure the full-field strains. A custom MATLAB code is described, and used to reduce the strain data to produce master creep curves for the test samples. Initial results show good correlation between replicates; however, it is clear that a larger number of samples are needed to build confidence in the consistency of the results. It is noted that local fiber breaks affect the creep response in a similar manner to increasing the load, thus raising the creep rate and reducing the time to creep failure. The stitched webbings produced the highest variance between replicates, due to the combination of higher local stresses and thread-on-fiber damage. Large variability in the strength of the webbings is also shown to have an impact on the range of predicted creep life.
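    The bookkeeping behind the Stepped Isothermal Method can be sketched as follows. This is a simplified illustration, not NASA's exact procedure: each higher-temperature creep segment is stretched along the time axis by a shift factor and spliced onto the end of the previous segment to form a master curve at the reference temperature. Here the shift factors are assumed inputs; in practice they are determined by matching creep rates at the segment boundaries.

```python
import numpy as np

def build_master_curve(segments, shift_factors):
    """Splice stepped-temperature creep segments into one master curve.
    segments: list of (t, strain) arrays, t measured from each step's start;
    shift_factors: assumed acceleration factor a_T (>= 1) per segment."""
    master_t, master_strain = [], []
    t_end = 0.0
    for (t, strain), a_T in zip(segments, shift_factors):
        t_shifted = t_end + t * a_T   # stretch time, append after the last segment
        master_t.append(t_shifted)
        master_strain.append(strain)
        t_end = t_shifted[-1]
    return np.concatenate(master_t), np.concatenate(master_strain)

# toy data: three isothermal steps with slowly rising strain (illustrative)
steps = [(np.linspace(0.1, 10.0, 20),
          0.01 * (i + 1) + 0.001 * np.linspace(0.0, 1.0, 20))
         for i in range(3)]
master_t, master_e = build_master_curve(steps, shift_factors=[1.0, 10.0, 100.0])
print(master_t[-1])  # master curve extends far beyond the lab test duration
```

The acceleration comes from the fact that a short lab test at elevated temperature maps onto a much longer span of equivalent time at the reference temperature.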

  6. Optimal Bayesian Adaptive Design for Test-Item Calibration.

    PubMed

    van der Linden, Wim J; Ren, Hao

    2015-06-01

    An optimal adaptive design for test-item calibration based on Bayesian optimality criteria is presented. The design adapts the choice of field-test items to the examinees taking an operational adaptive test using both the information in the posterior distributions of their ability parameters and the current posterior distributions of the field-test parameters. Different criteria of optimality based on the two types of posterior distributions are possible. The design can be implemented using an MCMC scheme with alternating stages of sampling from the posterior distributions of the test takers' ability parameters and the parameters of the field-test items while reusing samples from earlier posterior distributions of the other parameters. Results from a simulation study demonstrated the feasibility of the proposed MCMC implementation for operational item calibration. A comparison of performances for different optimality criteria showed faster calibration of substantial numbers of items for the criterion of D-optimality relative to A-optimality, a special case of c-optimality, and random assignment of items to the test takers.
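    The D-optimality criterion mentioned above can be illustrated with a deliberately simplified sketch for one field-test 2PL item: each examinee's response contributes a rank-one Fisher information matrix for the item parameters (a, b), and the item is assigned to the examinee whose ability maximizes the determinant of the accumulated information. True abilities stand in for the posterior samples of the full MCMC scheme, and all numbers are assumed.

```python
import numpy as np

def item_info_matrix(theta, a, b):
    """Fisher information for the 2PL item parameters (a, b) from one response."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    g = np.array([theta - b, -a])   # score direction for (a, b)
    return p * (1.0 - p) * np.outer(g, g)

a_prov, b_prov = 1.2, 0.3           # provisional field-test item parameters
info = 1e-3 * np.eye(2)             # weak prior information
rng = np.random.default_rng(2)
thetas = list(rng.normal(0.0, 1.0, 200))   # abilities of available test takers

for _ in range(30):                 # administer the field-test item 30 times
    # D-optimality: pick the examinee maximizing det of the updated information
    dets = [np.linalg.det(info + item_info_matrix(t, a_prov, b_prov))
            for t in thetas]
    pick = int(np.argmax(dets))
    info += item_info_matrix(thetas.pop(pick), a_prov, b_prov)

print(np.linalg.det(info))  # grows as calibration data accumulates
```

A single response carries only rank-one information, so the determinant criterion is only meaningful for the accumulated matrix, which is why the selection is sequential.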

  7. Parallel traveling-wave MRI: a feasibility study.

    PubMed

    Pang, Yong; Vigneron, Daniel B; Zhang, Xiaoliang

    2012-04-01

    Traveling-wave magnetic resonance imaging utilizes the far fields of a single-piece patch antenna in the magnet bore to generate radio frequency fields for imaging large samples, such as the human body. In this work, the feasibility of applying the traveling-wave technique to parallel imaging is studied using microstrip patch antenna arrays, with both numerical analysis and experimental tests. A specific patch array model is built in which each array element is a microstrip patch antenna. Bench tests show that decoupling between two adjacent elements is better than -26 dB while the matching of each element reaches -36 dB, demonstrating excellent isolation performance and impedance match capability. The sensitivity patterns are simulated and g-factors are calculated for both unloaded and loaded cases. The results on B1 sensitivity patterns and g-factors demonstrate the feasibility of traveling-wave parallel imaging. Simulations also suggest that different array configurations, such as patch shape, position, and orientation, lead to different sensitivity patterns and g-factor maps, which provides a way to manipulate B1 fields and improve parallel imaging performance. The proposed method is also validated by 7T MR imaging experiments. Copyright © 2011 Wiley-Liss, Inc.
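    The g-factor computation referenced above has a compact standard form in SENSE reconstruction: for a coil sensitivity matrix S (coils by aliased pixels) and unit noise covariance, the noise amplification at pixel i is g_i = sqrt([(S^H S)^-1]_ii [S^H S]_ii). A small sketch with made-up sensitivities (not the paper's simulated patterns):

```python
import numpy as np

def g_factors(S):
    """SENSE g-factors for coil sensitivity matrix S (coils x aliased pixels),
    assuming unit noise covariance."""
    shs = S.conj().T @ S
    inv = np.linalg.inv(shs)
    return np.sqrt(np.real(np.diag(inv) * np.diag(shs)))

# 4 coils, acceleration factor 2: two pixels alias onto each sample
rng = np.random.default_rng(3)
S = rng.normal(size=(4, 2)) + 1j * rng.normal(size=(4, 2))
g = g_factors(S)
print(g)  # each value >= 1; exactly 1 means no SNR penalty from unfolding
```

Changing the array geometry changes S, and hence the g-factor map, which is the lever the abstract describes for improving parallel imaging performance.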

  8. The Vimos VLT deep survey. Global properties of 20,000 galaxies in the IAB < 22.5 WIDE survey

    NASA Astrophysics Data System (ADS)

    Garilli, B.; Le Fèvre, O.; Guzzo, L.; Maccagni, D.; Le Brun, V.; de la Torre, S.; Meneux, B.; Tresse, L.; Franzetti, P.; Zamorani, G.; Zanichelli, A.; Gregorini, L.; Vergani, D.; Bottini, D.; Scaramella, R.; Scodeggio, M.; Vettolani, G.; Adami, C.; Arnouts, S.; Bardelli, S.; Bolzonella, M.; Cappi, A.; Charlot, S.; Ciliegi, P.; Contini, T.; Foucaud, S.; Gavignaud, I.; Ilbert, O.; Iovino, A.; Lamareille, F.; McCracken, H. J.; Marano, B.; Marinoni, C.; Mazure, A.; Merighi, R.; Paltani, S.; Pellò, R.; Pollo, A.; Pozzetti, L.; Radovich, M.; Zucca, E.; Blaizot, J.; Bongiorno, A.; Cucciati, O.; Mellier, Y.; Moreau, C.; Paioro, L.

    2008-08-01

    The VVDS-Wide survey has been designed to trace the large-scale distribution of galaxies at z ~ 1 on comoving scales reaching ~100~h-1 Mpc, while providing a good control of cosmic variance over areas as large as a few square degrees. This is achieved by measuring redshifts with VIMOS at the ESO VLT to a limiting magnitude IAB = 22.5, targeting four independent fields with sizes of up to 4 deg2 each. We discuss the survey strategy which covers 8.6 deg2 and present the general properties of the current redshift sample. This includes 32 734 spectra in the four regions, covering a total area of 6.1 deg2 with a sampling rate of 22 to 24%. This paper accompanies the public release of the first 18 143 redshifts of the VVDS-Wide survey from the 4 deg2 contiguous area of the F22 field at RA = 22^h. We have devised and tested an objective method to assess the quality of each spectrum, providing a compact figure-of-merit. This is particularly effective in the case of long-lasting spectroscopic surveys with varying observing conditions. Our figure of merit is a measure of the robustness of the redshift measurement and, most importantly, can be used to select galaxies with uniform high-quality spectra to carry out reliable measurements of spectral features. We also use the data available over the four independent regions to directly measure the variance in galaxy counts. We compare it with general predictions from the observed galaxy two-point correlation function at different redshifts and with that measured in mock galaxy surveys built from the Millennium simulation. The purely magnitude-limited VVDS Wide sample includes 19 977 galaxies, 304 type I AGNs, and 9913 stars. The redshift success rate is above 90% independent of magnitude. A cone diagram of the galaxy spatial distribution provides us with the current largest overview of large-scale structure up to z ~ 1, showing a rich texture of over- and under-dense regions. 
We give the mean N(z) distribution averaged over 6.1 deg2 for a sample limited in magnitude to IAB = 22.5. Comparing galaxy densities from the four fields shows that in a redshift bin Δz = 0.1 at z ~ 1 one still has factor-of-two variations over areas as large as ~ 0.25 deg2. This level of cosmic variance agrees with that obtained by integrating the galaxy two-point correlation function estimated from the F22 field alone. It is also in fairly good statistical agreement with that predicted by the Millennium simulations. The VVDS WIDE survey currently provides the largest area coverage among redshift surveys reaching z ~ 1. The variance estimated over the survey fields shows explicitly how clustering results from deep surveys of even 1 deg2 size should be interpreted with caution. The survey data represent a rich data base to select complete sub-samples of high-quality spectra and to study galaxy ensemble properties and galaxy clustering over unprecedented scales at these redshifts. The redshift catalog of the 4 deg2 F22 field is publicly available at http://cencosw.oamp.fr.
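    The quoted factor-of-two count variations can be put in rough perspective with a standard cosmic-variance estimate. The numbers below are illustrative assumptions, not the survey's measurements: the relative variance of galaxy counts in a cell combines shot noise and clustering, sigma_rel^2 ~ 1/N + xi_bar, where xi_bar is the two-point correlation function averaged over the cell volume.

```python
import numpy as np

# Back-of-envelope cosmic-variance estimate with assumed inputs.

N_mean = 500     # assumed mean galaxy count in a (dz = 0.1, ~0.25 deg^2) cell
xi_bar = 0.15    # assumed volume-averaged two-point correlation function
sigma_rel = np.sqrt(1.0 / N_mean + xi_bar)
print(f"fractional count scatter: {sigma_rel:.2f}")
# an rms scatter near 40% makes occasional factor-of-two excursions plausible
```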

  9. Comparison of American Fisheries Society (AFS) standard fish sampling techniques and environmental DNA for characterizing fish communities in a large reservoir

    USGS Publications Warehouse

    Perez, Christina R.; Bonar, Scott A.; Amberg, Jon J.; Ladell, Bridget; Rees, Christopher B.; Stewart, William T.; Gill, Curtis J.; Cantrell, Chris; Robinson, Anthony

    2017-01-01

    Recently, methods involving examination of environmental DNA (eDNA) have shown promise for characterizing fish species presence and distribution in waterbodies. We evaluated the use of eDNA for standard fish monitoring surveys in a large reservoir. Specifically, we compared the presence, relative abundance, biomass, and relative percent composition of Largemouth Bass Micropterus salmoides and Gizzard Shad Dorosoma cepedianum measured through eDNA methods and established American Fisheries Society standard sampling methods for Theodore Roosevelt Lake, Arizona. Catches at electrofishing and gillnetting sites were compared with eDNA water samples at sites, within spatial strata, and over the entire reservoir. Gizzard Shad were detected at a higher percentage of sites with eDNA methods than with boat electrofishing in both spring and fall. In contrast, spring and fall gillnetting detected Gizzard Shad at more sites than eDNA. Boat electrofishing and gillnetting detected Largemouth Bass at more sites than eDNA; the exception was fall gillnetting, for which the number of sites of Largemouth Bass detection was equal to that for eDNA. We observed no relationship between relative abundance and biomass of Largemouth Bass and Gizzard Shad measured by established methods and eDNA copies at individual sites or lake sections. Reservoirwide catch composition for Largemouth Bass and Gizzard Shad (numbers and total weight [g] of fish) as determined through a combination of gear types (boat electrofishing plus gillnetting) was similar to the proportion of total eDNA copies from each species in spring and fall field sampling. However, no similarity existed between proportions of fish caught via spring and fall boat electrofishing and the proportion of total eDNA copies from each species. 
Our study suggests that eDNA field sampling protocols, filtration, DNA extraction, primer design, and DNA sequencing methods need further refinement and testing before incorporation into standard fish sampling surveys.

  10. A high-damping magnetorheological elastomer with bi-directional magnetic-control modulus for potential application in seismology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Miao, E-mail: yumiao@cqu.edu.cn; Qi, Song; Fu, Jie

    A high-damping magnetorheological elastomer (MRE) with bi-directional magnetic-control modulus is developed. This MRE was synthesized by filling NdFeB particles into a polyurethane (PU)/epoxy (EP) interpenetrating network (IPN) structure. The anisotropic samples were prepared in a permanent magnetic field and magnetized in an electromagnetic field of 1 T. The dynamic mechanical responses of the MRE to applied magnetic fields are investigated with a magneto-rheometer, and the morphology of the MREs is observed via scanning electron microscopy (SEM). Test results indicate that when the test field orientation is parallel to that of the sample's magnetization, the shear modulus of the sample increases. On the other hand, when the orientation is opposite to that of the sample's magnetization, the shear modulus decreases. In addition, this PU/EP IPN matrix based MRE has a high-damping property, with a high loss factor that can be controlled by applying a magnetic field. It is expected that the high damping and the bi-directional magnetic-control modulus of this MRE offer promising advantages in seismologic applications.

  11. A Carbon Free Filter for Collection of Large Volume Samples of Cellular Biomass from Oligotrophic Waters

    PubMed Central

    Mailloux, Brian J.; Dochenetz, Audra; Bishop, Michael; Dong, Hailiang; Ziolkowski, Lori A.; Wommack, K. Eric; Sakowski, Eric G.; Onstott, Tullis C.; Slater, Greg F.

    2018-01-01

    Isotopic analysis of cellular biomass has greatly improved our understanding of carbon cycling in the environment. Compound specific radiocarbon analysis (CSRA) of cellular biomass is being increasingly applied in a number of fields. However, it is often difficult to collect sufficient cellular biomass for analysis from oligotrophic waters because easy-to-use filtering methods that are free of carbon contaminants do not exist. The goal of this work was to develop a new column-based filter to autonomously collect high volume samples of biomass from oligotrophic waters for CSRA, using material that can be baked at 450°C to remove potential organic contaminants. A series of filter materials were tested, including uncoated sand, ferrihydrite-coated sand, goethite-coated sand, aluminum-coated sand, uncoated glass wool, ferrihydrite-coated glass wool, and aluminum-coated glass wool, in the lab with 0.1 and 1.0 µm microspheres and E. coli. Results indicated that aluminum-coated glass wool was the most efficient filter and that the retention capacity of the filter far exceeded the biomass requirements for CSRA. Results from laboratory tests indicate that for oligotrophic waters with 1×10^5 cells ml^-1, 117 L of water would need to be filtered to collect 100 µg of PLFA for bulk PLFA analysis and 2000 L for analysis of individual PLFAs. For field sampling, filtration tests on South African mine water indicated that after filtering 5955 liters, 450 µg of total PLFAs were present, ample biomass for radiocarbon analysis. In summary, we have developed a filter that is easy to use and deploy for the collection of biomass for CSRA, including total and individual PLFAs. PMID:22561839
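    The filtration volumes quoted above can be checked with simple arithmetic: the stated figures (100 µg of PLFA from 117 L at 1×10^5 cells/ml) imply a per-cell PLFA content, which then scales the other quoted volumes. A back-of-envelope sketch:

```python
# Implied per-cell PLFA content from the paper's quoted numbers.

cells_per_liter = 1e5 * 1000.0            # 1x10^5 cells/ml -> cells per liter
target_ug, volume_l = 100.0, 117.0
plfa_fg_per_cell = target_ug * 1e9 / (volume_l * cells_per_liter)  # 1 ug = 1e9 fg
print(f"implied PLFA content: {plfa_fg_per_cell:.1f} fg per cell")

# the 2000 L quoted for individual-PLFA work would then collect roughly:
ug_from_2000_l = plfa_fg_per_cell * 2000.0 * cells_per_liter / 1e9
print(f"2000 L would collect about {ug_from_2000_l:.0f} ug of total PLFA")
```

The implied content of roughly 8.5 fg of PLFA per cell is only what the quoted numbers entail; the paper's own per-cell assumption is not restated here.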

  12. Comparison of Pumped and Diffusion Sampling Methods to Monitor Concentrations of Perchlorate and Explosive Compounds in Ground Water, Camp Edwards, Cape Cod, Massachusetts, 2004-05

    USGS Publications Warehouse

    LeBlanc, Denis R.; Vroblesky, Don A.

    2008-01-01

    Laboratory and field tests were conducted at Camp Edwards on the Massachusetts Military Reservation on Cape Cod to examine the utility of passive diffusion sampling for long-term monitoring of concentrations of perchlorate and explosive compounds in ground water. The diffusion samplers were constructed of 1-inch-diameter rigid, porous polyethylene tubing. The results of laboratory tests in which diffusion samplers were submerged in containers filled with ground water containing perchlorate, RDX (hexahydro-1,3,5-trinitro-1,3,5-triazine), and HMX (octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine) indicate that concentrations inside the diffusion samplers equilibrated with concentrations in the containers within the 19-day-long test period. Field tests of the diffusion samplers were conducted in 15 wells constructed of 2- or 2.5-inch-diameter polyvinyl chloride pipe with 10-foot-long slotted screens. Concentrations of perchlorate, RDX, and HMX in the diffusion samplers placed in the wells for 42 to 52 days were compared to concentrations in samples collected by low-flow pumped sampling from 53 days before to 109 days after retrieval of the diffusion samples. The results of the field tests indicate generally good agreement between the pumped and diffusion samples for concentrations of perchlorate, RDX, and HMX. The concentration differences indicate no systematic bias related to contaminant type or concentration levels.
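    The equilibration behavior described can be framed as first-order uptake into the sampler: C_in(t) = C_out * (1 - exp(-k t)), so the time to reach a fraction f of the external concentration is -ln(1 - f)/k. The rate constant below is an assumed illustration, not a value measured in the study.

```python
import numpy as np

# First-order equilibration sketch for a passive diffusion sampler.
# k is an assumed, illustrative rate constant.

k = 0.25                                   # assumed uptake rate constant (1/day)
days_to_95 = -np.log(1.0 - 0.95) / k
print(f"time to 95% equilibration: {days_to_95:.1f} days")
```

Under this assumed rate constant, 95% equilibration takes about 12 days, comfortably inside the 19-day laboratory test window the abstract reports.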

  13. New design of cable-in-conduit conductor for application in future fusion reactors

    NASA Astrophysics Data System (ADS)

    Qin, Jinggang; Wu, Yu; Li, Jiangang; Liu, Fang; Dai, Chao; Shi, Yi; Liu, Huajun; Mao, Zhehua; Nijhuis, Arend; Zhou, Chao; Yagotintsev, Konstantin A.; Lubkemann, Ruben; Anvar, V. A.; Devred, Arnaud

    2017-11-01

    The China Fusion Engineering Test Reactor (CFETR) is a new tokamak device whose magnet system includes toroidal field, central solenoid (CS) and poloidal field coils. The main goal is to build a fusion engineering tokamak reactor with about 1 GW fusion power and fuel self-sufficiency via its blanket. In order to reach this high performance, the magnetic field target is 15 T. However, the huge electromagnetic load caused by the high field and current poses a risk of conductor degradation under cycling. A conductor with a short-twist-pitch (STP) design has large stiffness, which enables a significant performance improvement in view of load and thermal cycling. But the STP design has a remarkable disadvantage: it can easily cause severe strand indentation during cabling. The indentation can reduce the strand performance, especially under high load cycling. In order to overcome this disadvantage, a new design is proposed. Its main characteristic is an updated layout in the triplet. The triplet is made of two Nb3Sn strands and one soft copper strand. The two Nb3Sn strands are cabled first with a large twist pitch; the copper strand is then wound around the two superconducting strands (CWS) with a shorter twist pitch. The layouts and twist pitches of the subsequent cable stages are similar to those of the ITER CS conductor with the STP design. One short conductor sample with a scale similar to the ITER CS was manufactured and tested with the Twente Cable Press to investigate the mechanical properties, AC loss and, by destructive examination, the internal structure. The results are compared to tests of STP conductors (ITER CS and CFETR CSMC). The results show that the new conductor design has similar stiffness, but much lower strand indentation than the STP design. The new design shows potential for application in future fusion reactors.

  14. Induced earthquake magnitudes are as large as (statistically) expected

    USGS Publications Warehouse

    Van Der Elst, Nicholas; Page, Morgan T.; Weiser, Deborah A.; Goebel, Thomas; Hosseini, S. Mehran

    2016-01-01

    A major question for the hazard posed by injection-induced seismicity is how large induced earthquakes can be. Are their maximum magnitudes determined by injection parameters or by tectonics? Deterministic limits on induced earthquake magnitudes have been proposed based on the size of the reservoir or the volume of fluid injected. However, if induced earthquakes occur on tectonic faults oriented favorably with respect to the tectonic stress field, then they may be limited only by the regional tectonics and connectivity of the fault network. In this study, we show that the largest magnitudes observed at fluid injection sites are consistent with the sampling statistics of the Gutenberg-Richter distribution for tectonic earthquakes, assuming no upper magnitude bound. The data pass three specific tests: (1) the largest observed earthquake at each site scales with the log of the total number of induced earthquakes, (2) the order of occurrence of the largest event is random within the induced sequence, and (3) the injected volume controls the total number of earthquakes rather than the total seismic moment. All three tests point to an injection control on earthquake nucleation but a tectonic control on earthquake magnitude. Given that the largest observed earthquakes are exactly as large as expected from the sampling statistics, we should not conclude that these are the largest earthquakes possible. Instead, the results imply that induced earthquake magnitudes should be treated with the same maximum magnitude bound that is currently used to treat seismic hazard from tectonic earthquakes.
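The scaling in test (1), that the largest observed magnitude grows with the log of the event count, follows from sampling an unbounded Gutenberg-Richter distribution, whose magnitudes above the catalog floor are exponentially distributed. A minimal sketch of that expectation; the b-value and magnitude floor are illustrative defaults, not values from the study:

```python
import math
import random

def expected_max_magnitude(n_events, b=1.0, m_min=0.0):
    """Expected largest magnitude among n_events drawn from an
    unbounded Gutenberg-Richter distribution: to first order it sits
    log10(n_events)/b units above the catalog floor m_min."""
    return m_min + math.log10(n_events) / b

def simulated_max_magnitude(n_events, b=1.0, m_min=0.0, rng=None):
    """Monte Carlo version: G-R magnitudes above m_min are
    exponentially distributed with rate b * ln(10)."""
    rng = rng or random.Random()
    beta = b * math.log(10)
    return max(m_min + rng.expovariate(beta) for _ in range(n_events))

# Doubling the induced-event count raises the expected maximum by
# log10(2)/b, i.e. about 0.3 magnitude units for b = 1.
print(expected_max_magnitude(1000))
```

This is why, in the abstract's argument, an injection control on the number of nucleated events translates into larger maxima at sites with more seismicity, without implying any injection-set upper bound.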

  15. Spatio-temporal optimization of sampling for bluetongue vectors (Culicoides) near grazing livestock

    PubMed Central

    2013-01-01

    Background Estimating the abundance of Culicoides using light traps is influenced by a large variation in abundance in time and place. This study investigates the optimal trapping strategy to estimate the abundance or presence/absence of Culicoides on a field with grazing animals. We used 45 light traps to sample specimens from the Culicoides obsoletus species complex on a 14 hectare field during 16 nights in 2009. Findings The large number of traps and catch nights enabled us to simulate a series of samples consisting of different numbers of traps (1-15) on each night. We also varied the number of catch nights when simulating the sampling, and sampled with increasing minimum distances between traps. We used resampling to generate a distribution of different mean and median abundance in each sample. Finally, we used the hypergeometric distribution to estimate the probability of falsely detecting absence of vectors on the field. The variation in the estimated abundance decreased steeply when using up to six traps, and was less pronounced when using more traps, although no clear cutoff was found. Conclusions Despite spatial clustering in vector abundance, we found no effect of increasing the distance between traps. We found that 18 traps were generally required to reach 90% probability of a true positive catch when sampling just one night. But when sampling over two nights the same probability level was obtained with just three traps per night. The results are useful for the design of vector monitoring programmes on fields with grazing animals. PMID:23705770
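The hypergeometric false-absence calculation mentioned above can be sketched as the zero-draw term of the distribution. The trap counts and the number of "positive" positions below are hypothetical, chosen only to illustrate the calculation, not taken from the study:

```python
from math import comb

def p_false_absence(n_positions, n_positive, n_sampled):
    """Probability that n_sampled traps drawn without replacement from
    n_positions, of which n_positive would catch at least one midge,
    all come up empty (the hypergeometric zero-draw term)."""
    if n_sampled > n_positions - n_positive:
        return 0.0   # cannot avoid every positive position
    return comb(n_positions - n_positive, n_sampled) / comb(n_positions, n_sampled)

# HYPOTHETICAL illustration: 45 trap positions, 30 of which would
# catch at least one specimen on a given night.
for n in (3, 6, 18):
    print(n, p_false_absence(45, 30, n))
```

One minus this quantity is the probability of a true positive catch, the criterion the study evaluates at the 90% level.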

  16. New procedure for sampling infiltration to assess post-fire soil water repellency

    Treesearch

    P. R. Robichaud; S. A. Lewis; L. E. Ashmun

    2008-01-01

    The Mini-disk Infiltrometer has been adapted for use as a field test of post-fire infiltration and soil water repellency. Although the Water Drop Penetration Time (WDPT) test is the common field test for soil water repellency, the Mini-disk Infiltrometer (MDI) test takes less time, is less subjective, and provides a relative infiltration rate. For each test, the porous...

  17. Examining the effects of turnover intentions on organizational citizenship behaviors and deviance behaviors: A psychological contract approach.

    PubMed

    Mai, Ke Michael; Ellis, Aleksander P J; Christian, Jessica Siegel; Porter, Christopher O L H

    2016-08-01

    Although turnover intentions are considered the most proximal antecedent of organizational exit, there is often temporal separation between thinking about leaving and actual exit. Using field data from 2 diverse samples of working adults, we explore a causal model of the effects of turnover intentions on employee behavior while they remain with the organization, focusing specifically on organizational citizenship behaviors (OCBs) and deviance behaviors (DBs). Utilizing expectancy theory as an explanatory framework, we argue that turnover intentions result in high levels of transactional contract orientation and low levels of relational contract orientation, which in turn lead to a decrease in the incidence of OCBs and an increase in the incidence of DBs. We first used a pilot study to investigate the direction of causality between turnover intentions and psychological contract orientations. Then, in Study 1, we tested our mediated model using a sample of employees from a large drug retailing chain. In Study 2, we expanded our model by arguing that the mediated effects are much stronger when the organization is deemed responsible for potential exit. We then tested our full model using a sample of employees from a large state-owned telecommunications corporation in China. Across both studies, results were generally consistent and supportive of our hypotheses. We discuss the implications of our findings for future theory, research, and practice regarding the management of both the turnover process and discretionary behaviors at work. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  18. Flight capacities of yellow-legged hornet (Vespa velutina nigrithorax, Hymenoptera: Vespidae) workers from an invasive population in Europe.

    PubMed

    Sauvard, Daniel; Imbault, Vanessa; Darrouzet, Éric

    2018-01-01

    The invasive yellow-legged hornet, Vespa velutina nigrithorax Lepeletier, 1836 (Hymenoptera: Vespidae), is native to Southeast Asia. It was first detected in France (in the southwest) in 2005. It has since expanded throughout Europe and has caused significant harm to honeybee populations. We must better characterize the hornet's flight capacity to understand the species' success and develop improved control strategies. Here, we carried out a study in which we quantified the flight capacities of V. velutina workers using computerized flight mills. We observed that workers were able to spend around 40% of the daily 7-hour flight tests flying. On average, they flew 10 km to 30 km during each flight test, although there was a large amount of variation. Workers sampled in early summer had lower flight capacities than workers sampled later in the season. Flight capacity decreased as workers aged. However, in the field, workers probably often die before this decrease becomes significant. During each flight test, workers performed several continuous flight phases of variable length that were separated by rest phases. Based on the length of those continuous flight phases and certain key assumptions, we estimated that the V. velutina colony foraging radius is at least 700 m (half that in early summer); however, some workers are able to forage much farther. While these laboratory findings remain to be confirmed by field studies, our results can nonetheless help inform V. velutina biology and control efforts.

  19. Flight capacities of yellow-legged hornet (Vespa velutina nigrithorax, Hymenoptera: Vespidae) workers from an invasive population in Europe

    PubMed Central

    Imbault, Vanessa; Darrouzet, Éric

    2018-01-01

    The invasive yellow-legged hornet, Vespa velutina nigrithorax Lepeletier, 1836 (Hymenoptera: Vespidae), is native to Southeast Asia. It was first detected in France (in the southwest) in 2005. It has since expanded throughout Europe and has caused significant harm to honeybee populations. We must better characterize the hornet’s flight capacity to understand the species’ success and develop improved control strategies. Here, we carried out a study in which we quantified the flight capacities of V. velutina workers using computerized flight mills. We observed that workers were able to spend around 40% of the daily 7-hour flight tests flying. On average, they flew 10 km to 30 km during each flight test, although there was a large amount of variation. Workers sampled in early summer had lower flight capacities than workers sampled later in the season. Flight capacity decreased as workers aged. However, in the field, workers probably often die before this decrease becomes significant. During each flight test, workers performed several continuous flight phases of variable length that were separated by rest phases. Based on the length of those continuous flight phases and certain key assumptions, we estimated that the V. velutina colony foraging radius is at least 700 m (half that in early summer); however, some workers are able to forage much farther. While these laboratory findings remain to be confirmed by field studies, our results can nonetheless help inform V. velutina biology and control efforts. PMID:29883467

  20. Towards a compact and precise sample holder for macromolecular crystallography.

    PubMed

    Papp, Gergely; Rossi, Christopher; Janocha, Robert; Sorez, Clement; Lopez-Marrero, Marcos; Astruc, Anthony; McCarthy, Andrew; Belrhali, Hassan; Bowler, Matthew W; Cipriani, Florent

    2017-10-01

    Most of the sample holders currently used in macromolecular crystallography offer limited storage density and poor initial crystal-positioning precision upon mounting on a goniometer. This has now become a limiting factor at high-throughput beamlines, where data collection can be performed in a matter of seconds. Furthermore, this lack of precision limits the potential benefits emerging from automated harvesting systems that could provide crystal-position information which would further enhance alignment at beamlines. This situation provided the motivation for the development of a compact and precise sample holder with corresponding pucks, handling tools and robotic transfer protocols. The development process included four main phases: design, prototype manufacture, testing with a robotic sample changer and validation under real conditions on a beamline. Two sample-holder designs are proposed: NewPin and miniSPINE. They share the same robot gripper and allow the storage of 36 sample holders in uni-puck footprint-style pucks, which represents 252 samples in a dry-shipping dewar commonly used in the field. The pucks are identified with human- and machine-readable codes, as well as with radio-frequency identification (RFID) tags. NewPin offers a crystal-repositioning precision of up to 10 µm but requires a specific goniometer socket. The storage density could reach 64 samples using a special puck designed for fully robotic handling. miniSPINE is less precise but uses a goniometer mount compatible with the current SPINE standard. miniSPINE is proposed for the first implementation of the new standard, since it is easier to integrate at beamlines. An upgraded version of the SPINE sample holder with a corresponding puck named SPINEplus is also proposed in order to offer a homogenous and interoperable system. The project involved several European synchrotrons and industrial companies in the fields of consumables and sample-changer robotics. 
Manual handling of miniSPINE was tested at different institutes using evaluation kits, and pilot beamlines are being equipped with compatible robotics for large-scale evaluation. A companion paper describes a new sample changer FlexED8 (Papp et al., 2017, Acta Cryst., D73, 841-851).

  1. Assessing spatio-temporal eruption forecasts in a monogenetic volcanic field

    NASA Astrophysics Data System (ADS)

    Bebbington, Mark S.

    2013-02-01

    Many spatio-temporal models have been proposed for forecasting the location and timing of the next eruption in a monogenetic volcanic field. These have almost invariably been fitted retrospectively. That is, the model has been tuned to all of the data, and hence an assessment of the goodness of fit has not been carried out on independent data. The low rate of eruptions in monogenetic fields means that there is not the opportunity to carry out a purely prospective test, as thousands of years would be required to accumulate the necessary data. This leaves open the possibility of a retrospective sequential test, where the parameters are calculated only on the basis of prior events and the resulting forecast compared statistically with the location and time of the next eruption. In general, events in volcanic fields are not dated with sufficient accuracy and precision to pursue this line of investigation; an exception is the Auckland Volcanic Field (New Zealand), consisting of c. 50 centers formed during the last c. 250 kyr, for which an age-order model exists in the form of a Monte Carlo sampling algorithm, facilitating repeated sequential testing. I examine a suite of spatial, temporal and spatio-temporal hazard models, comparing the degree of fit, and attempt to draw lessons from how and where each model is particularly successful or unsuccessful. A relatively simple (independent) combination of a renewal model (temporal term) and a spatially uniform ellipse (spatial term) performs as well as any other model. Both avoid overfitting the data, and hence large errors, when the spatio-temporal occurrence pattern changes.
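The final sentences describe an independent combination of a temporal renewal term and a spatially uniform ellipse. A minimal sketch of such a combined intensity, assuming a Weibull renewal hazard (the abstract does not name the renewal distribution) and illustrative geometry:

```python
import math

def weibull_renewal_hazard(t_since_last, shape, scale):
    """Hazard h(t) = (k/s) * (t/s)**(k-1) for a Weibull renewal model
    (an assumed stand-in; the abstract does not specify the form)."""
    return (shape / scale) * (t_since_last / scale) ** (shape - 1)

def uniform_ellipse_density(x, y, cx, cy, a, b):
    """Spatially uniform density over an ellipse with semi-axes a, b;
    zero outside, 1/(pi*a*b) inside so it integrates to one."""
    inside = ((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 <= 1.0
    return 1.0 / (math.pi * a * b) if inside else 0.0

def eruption_intensity(t, x, y, shape, scale, cx, cy, a, b):
    # Independent combination: temporal hazard times spatial density.
    return (weibull_renewal_hazard(t, shape, scale)
            * uniform_ellipse_density(x, y, cx, cy, a, b))
```

In a sequential test, both terms would be refit after each event using only prior eruptions, and the resulting intensity scored against the next event's time and location.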

  2. Brightness Induction and Suprathreshold Vision: Effects of Age and Visual Field

    PubMed Central

    McCourt, Mark E.; Leone, Lynnette M.; Blakeslee, Barbara

    2014-01-01

    A variety of visual capacities show significant age-related alterations. We assessed suprathreshold contrast and brightness perception across the lifespan in a large sample of healthy participants (N = 155; 142) ranging in age from 16–80 years. Experiment 1 used a quadrature-phase motion cancelation technique (Blakeslee & McCourt, 2008) to measure canceling contrast (in central vision) for induced gratings at two temporal frequencies (1 Hz and 4 Hz) at two test field heights (0.5° or 2° × 38.7°; 0.052 c/d). There was a significant age-related reduction in canceling contrast at 4 Hz, but not at 1 Hz. We find no age-related change in induction magnitude in the 1 Hz condition. We interpret the age-related decline in grating induction magnitude at 4 Hz to reflect a diminished capacity for inhibitory processing at higher temporal frequencies. In Experiment 2 participants adjusted the contrast of a matching grating (0.5° or 2° × 38.7°; 0.052 c/d) to equal that of both real (30% contrast, 0.052 c/d) and induced (McCourt, 1982) standard gratings (100% inducing grating contrast; 0.052 c/d). Matching gratings appeared in the upper visual field (UVF) and test gratings appeared in the lower visual field (LVF), and vice versa, at eccentricities of ±7.5°. Average induction magnitude was invariant with age for both test field heights. There was a significant age-related reduction in perceived contrast of stimuli in the LVF versus UVF for both real and induced gratings. PMID:25462024

  3. Field and laboratory data describing physical and chemical characteristics of metal-contaminated flood-plain deposits downstream from Lead, west-central South Dakota

    USGS Publications Warehouse

    Marron, D.C.

    1988-01-01

    Samples from metal-contaminated flood-plain sediments at 9 sites downstream from Lead, in west-central South Dakota, were collected during the summers of 1985-87 to characterize aspects of the sedimentology, chemistry, and geometry of a deposit that resulted from the discharge of a large volume of mining wastes into a river system. Field and laboratory data include stratigraphic descriptions, chemical contents and grain-size distributions of samples, and surveyed flood-plain positions of samples. This report describes sampling-site locations, methods of sample collection and preservation, and subsequent laboratory analysis. Field and laboratory data are presented in 4 figures and 11 tables in the 'Supplemental Data' section at the back of the report. (USGS)

  4. Prediction of acid mine drainage generation potential of various lithologies using static tests: Etili coal mine (NW Turkey) as a case study.

    PubMed

    Yucel, Deniz Sanliyuksel; Baba, Alper

    2016-08-01

    The Etili neighborhood in Can County (northwestern Turkey) has large reserves of coal and has been the site of many small- to medium-scale mining operations since the 1980s. Some of these have ceased working while others continue to operate. Once activities cease, the mining facilities and fields are usually abandoned without rehabilitation. The most significant environmental problem is acid mine drainage (AMD). This study was carried out to determine the acid generation potential of various lithological units in the Etili coal mine using static test methods. Seventeen samples were selected from areas with highly acidic waters: from different alteration zones belonging to volcanic rocks, from sedimentary rocks, and from coals and mine wastes. Static tests (paste pH, standard acid-base accounting, and net acid generation tests) were performed on these samples. The consistency of the static test results showed that oxidation of sulfide minerals, especially pyrite, which is widely found not only in the alteration zones of volcanic rocks but also in the coals and mine wastes, is the main factor controlling the generation of AMD in this mine. Lack of carbonate minerals in the region also increases the occurrence of AMD.

  5. Lack of association between digit ratio (2D:4D) and assertiveness: replication in a large sample.

    PubMed

    Voracek, Martin

    2009-12-01

    Findings regarding within-sex associations of digit ratio (2D:4D), a putative pointer to long-lasting effects of prenatal androgen action, and sexually differentiated personality traits have generally been inconsistent or unreplicable, suggesting that effects in this domain, if any, are likely small. In contrast to evidence from Wilson's important 1983 study, a forerunner of modern 2D:4D research, two recent studies (Freeman et al., 2005; Hampson et al., 2008) showed that assertiveness, a presumably male-typed personality trait, was not associated with 2D:4D; however, these studies were clearly statistically underpowered. Hence this study examined the question anew, based on a large sample of 491 men and 627 women. Assertiveness was only modestly sexually differentiated, favoring men, and was a positive correlate of age and education and a negative correlate of weight and body mass index among women, but not men. Replicating the two prior studies, 2D:4D was unrelated to assertiveness scores throughout. This null finding was preserved with controls for correlates of assertiveness, in nonparametric analysis, and in tests for curvilinear relations. Discussed are the implications of this specific null finding, now replicated in a large sample, for studies of 2D:4D and personality in general, and novel research approaches for proceeding in this field.

  6. Information engineering for molecular diagnostics.

    PubMed Central

    Sorace, J. M.; Ritondo, M.; Canfield, K.

    1994-01-01

    Clinical laboratories are beginning to apply the recent advances in molecular biology to the testing of patient samples. The emerging field of Molecular Diagnostics will require a new Molecular Diagnostics Laboratory Information System that handles the data types, samples and test methods found in this field. The system must be very flexible with regard to supporting ad hoc queries. The requirements that are shaping developments in this field are reviewed and a data model is developed. Several queries that demonstrate the data model's ability to support the information needs of this area have been developed and run. These results demonstrate the ability of the proposed data model to meet the current and projected needs of this rapidly expanding field. PMID:7949937

  7. Experimental study on infrared radiation temperature field of concrete under uniaxial compression

    NASA Astrophysics Data System (ADS)

    Lou, Quan; He, Xueqiu

    2018-05-01

    Infrared thermography, as a nondestructive, non-contact and real-time monitoring method, has great significance for assessing the stability of concrete structures and monitoring their failure. In-depth study of the mechanism and application of infrared radiation (IR) from concrete failure under loading is therefore necessary. In this study, concrete specimens measuring 100 × 100 × 100 mm were subjected to uniaxial compression for the IR tests. The distribution of IR temperatures (IRTs), the surface topography of the IRT field and the reconstructed IR images were studied. The results show that the IRT distribution follows a Gaussian distribution, and the R2 of the Gaussian fit changes over the loading time. Anomalies in R2 and acoustic emission (AE) counts display opposite variation trends. The surface topography of the IRT field is similar to a hyperbolic paraboloid, which is related to the stress distribution in the sample. The R2 of the hyperbolic-paraboloid fit presents an upward trend prior to fractures that change the IRT field significantly, then drops sharply in response to such large-scale failure. Normalization images of the IRT field, including row and column normalization images, were proposed as auxiliary means of analyzing the IRT field. The row and column normalization images show the transverse and longitudinal distributions of the IRT field, respectively, and respond clearly to destruction occurring on the sample surface. The new methods and quantitative indices proposed here for the analysis of the IRT field have theoretical and practical significance for characterizing the IRT field and for monitoring the instability and failure of concrete structures.
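The Gaussian-fit R2 indicator described above can be sketched as follows. The binning, moment-matched fit, and synthetic temperature data are illustrative assumptions, not the authors' procedure:

```python
import math
import random

def gaussian_fit_r2(samples, bins=20):
    """R^2 of a moment-matched Gaussian against the density histogram
    of a temperature sample; a drop in R^2 flags departure of the IRT
    distribution from the Gaussian shape."""
    n = len(samples)
    mu = sum(samples) / n
    sigma = math.sqrt(sum((t - mu) ** 2 for t in samples) / n)
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins
    counts = [0] * bins
    for t in samples:
        counts[min(int((t - lo) / width), bins - 1)] += 1
    observed = [c / (n * width) for c in counts]   # density histogram
    centers = [lo + (i + 0.5) * width for i in range(bins)]
    predicted = [math.exp(-((c - mu) ** 2) / (2 * sigma ** 2))
                 / (sigma * math.sqrt(2 * math.pi)) for c in centers]
    mean_obs = sum(observed) / bins
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Synthetic stand-in for a pre-loading IRT field: near-Gaussian data
# should score close to 1; localized heating would pull R^2 down.
rng = random.Random(42)
r2 = gaussian_fit_r2([rng.gauss(25.0, 0.5) for _ in range(5000)])
print(round(r2, 3))
```

Tracking this statistic frame by frame over the loading history is the kind of time series the abstract pairs against AE counts.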

  8. Distance-limited perpendicular distance sampling for coarse woody debris: theory and field results

    Treesearch

    Mark J. Ducey; Micheal S. Williams; Jeffrey H. Gove; Steven Roberge; Robert S. Kenning

    2013-01-01

    Coarse woody debris (CWD) has been identified as an important component in many forest ecosystem processes. Perpendicular distance sampling (PDS) is one of the several efficient new methods that have been proposed for CWD inventory. One drawback of PDS is that the maximum search distance can be very large, especially if CWD diameters are large or the volume factor...

  9. Constraints to estimating the prevalence of trypanosome infections in East African zebu cattle.

    PubMed

    Cox, Andrew P; Tosas, Olga; Tilley, Aimee; Picozzi, Kim; Coleman, Paul; Hide, Geoff; Welburn, Susan C

    2010-09-06

    In East Africa, animal trypanosomiasis is caused by many tsetse transmitted protozoan parasites including Trypanosoma vivax, T. congolense and subspecies of T. brucei s.l. (T. b. brucei and zoonotic human infective T. b. rhodesiense) that may co-circulate in domestic and wild animals. Accurate species-specific prevalence measurements of these parasites in animal populations are complicated by mixed infections of trypanosomes within individual hosts, low parasite densities and difficulties in conducting field studies. Many Polymerase Chain Reaction (PCR) based diagnostic tools are available to characterise and quantify infection in animals. These are important for assessing the contribution of infections in animal reservoirs and the risk posed to humans from zoonotic trypanosome species. New matrices for DNA capture have simplified large scale field PCR analyses but few studies have examined the impact of these techniques on prevalence estimations. The Whatman FTA matrix has been evaluated using a random sample of 35 village zebu cattle from a population naturally exposed to trypanosome infection. Using a generic trypanosome-specific PCR, prevalence was systematically evaluated. Multiple PCR samples taken from single FTA cards demonstrated that a single punch from an FTA card is not sufficient to confirm the infectivity status of an individual animal as parasite DNA is unevenly distributed across the card. At low parasite densities in the host, this stochastic sampling effect results in underestimation of prevalence based on single punch PCR testing. Repeated testing increased the estimated prevalence of all Trypanosoma spp. from 9.7% to 86%. Using repeat testing, a very high prevalence of pathogenic trypanosomes was detected in these local village cattle: T. brucei (34.3%), T. congolense (42.9%) and T. vivax (22.9%). 
These results show that, despite the convenience of Whatman FTA cards and specific PCR-based detection tools, the chronically low parasitaemias in indigenous African zebu cattle make it difficult to establish true prevalence. Although this study specifically applies to FTA cards, a similar effect would be experienced with other approaches using blood samples containing low parasite densities, such as blood film microscopy or PCR detection from liquid samples, in which the probability of detecting a parasite or DNA molecule in the required number of fields of view or in a single PCR reaction is less than one.
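The gain from repeated punches is consistent with a simple independent-detection model. In the sketch below, the per-punch sensitivity is back-calculated from the reported 9.7% single-punch and 86% repeat-testing prevalences, and the independence of punches is our assumption, not the authors':

```python
def detection_probability(p_single, n_punches):
    """Chance that at least one of n_punches detects parasite DNA,
    ASSUMING punches are independent draws from the card."""
    return 1.0 - (1.0 - p_single) ** n_punches

# Per-punch sensitivity back-calculated from the reported figures:
# apparent single-punch prevalence 9.7% against 86% after repeats.
p_single = 0.097 / 0.86
for n in (1, 5, 10, 20, 40):
    print(n, round(detection_probability(p_single, n), 3))
```

Under this model, detection probability saturates only slowly with punch count, which is why single-punch testing so badly underestimates prevalence at low parasite densities.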

  10. A Sample of Very Young Field L Dwarfs and Implications for the Brown Dwarf "Lithium Test" at Early Ages

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, J. Davy; Cruz, Kelle L.; Barman, Travis S.; Burgasser, Adam J.; Looper, Dagny L.; Tinney, C. G.; Gelino, Christopher R.; Lowrance, Patrick J.; Liebert, James; Carpenter, John M.; Hillenbrand, Lynne A.; Stauffer, John R.

    2008-12-01

    Using a large sample of optical spectra of late-type dwarfs, we identify a subset of late-M through L field dwarfs that, because of the presence of low-gravity features in their spectra, are believed to be unusually young. From a combined sample of 303 field L dwarfs, we find observationally that 7.6% ± 1.6% are younger than 100 Myr. This percentage is in agreement with theoretical predictions once observing biases are taken into account. We find that these young L dwarfs tend to fall in the southern hemisphere (decl. < 0°) and may be previously unrecognized, low-mass members of nearby, young associations like Tucana-Horologium, TW Hydrae, β Pictoris, and AB Doradus. We use a homogeneously observed sample of ~150 optical spectra to examine lithium strength as a function of L/T spectral type and further corroborate the trends noted by Kirkpatrick and coworkers. We use our low-gravity spectra to investigate lithium strength as a function of age. The data weakly suggest that for early- to mid-L dwarfs the line strength reaches a maximum at ages of a few × 100 Myr, whereas for much older (few Gyr) and much younger (<100 Myr) L dwarfs the line is weaker or undetectable. We show that a weakening of lithium at lower gravities is predicted by model atmosphere calculations, an effect partially corroborated by existing observational data. Larger samples containing L dwarfs of well-determined ages are needed to further test this empirically. If verified, this result would reinforce the caveat first cited by Kirkpatrick and coworkers that the lithium test should be used with caution when attempting to confirm the substellar nature of the youngest brown dwarfs. Most of the spectroscopic data presented herein were obtained at the W. M. Keck Observatory, which is operated as a scientific partnership among the California Institute of Technology, the University of California, and the National Aeronautics and Space Administration. 
The Observatory was made possible by the generous financial support of the W. M. Keck Foundation. Other spectroscopic data were collected at the Subaru Telescope, the twin telescopes of the Gemini Observatory, the Magellan-Clay Telescope, the Kitt Peak National Observatory Mayall Telescope, and the Cerro Tololo Interamerican Observatory Blanco Telescope.

  11. Constraints to estimating the prevalence of trypanosome infections in East African zebu cattle

    PubMed Central

    2010-01-01

    Background In East Africa, animal trypanosomiasis is caused by many tsetse transmitted protozoan parasites including Trypanosoma vivax, T. congolense and subspecies of T. brucei s.l. (T. b. brucei and zoonotic human infective T. b. rhodesiense) that may co-circulate in domestic and wild animals. Accurate species-specific prevalence measurements of these parasites in animal populations are complicated by mixed infections of trypanosomes within individual hosts, low parasite densities and difficulties in conducting field studies. Many Polymerase Chain Reaction (PCR) based diagnostic tools are available to characterise and quantify infection in animals. These are important for assessing the contribution of infections in animal reservoirs and the risk posed to humans from zoonotic trypanosome species. New matrices for DNA capture have simplified large scale field PCR analyses but few studies have examined the impact of these techniques on prevalence estimations. Results The Whatman FTA matrix has been evaluated using a random sample of 35 village zebu cattle from a population naturally exposed to trypanosome infection. Using a generic trypanosome-specific PCR, prevalence was systematically evaluated. Multiple PCR samples taken from single FTA cards demonstrated that a single punch from an FTA card is not sufficient to confirm the infectivity status of an individual animal as parasite DNA is unevenly distributed across the card. At low parasite densities in the host, this stochastic sampling effect results in underestimation of prevalence based on single punch PCR testing. Repeated testing increased the estimated prevalence of all Trypanosoma spp. from 9.7% to 86%. Using repeat testing, a very high prevalence of pathogenic trypanosomes was detected in these local village cattle: T. brucei (34.3%), T. congolense (42.9%) and T. vivax (22.9%). 
Conclusions These results show that, despite the convenience of Whatman FTA cards and specific PCR-based detection tools, the chronically low parasitaemias in indigenous African zebu cattle make it difficult to establish true prevalence. Although this study specifically applies to FTA cards, a similar effect would arise with other approaches using blood samples containing low parasite densities, for example blood film microscopy or PCR detection from liquid samples, where the probability of detecting a parasite or DNA molecule in the examined fields of view or in a single PCR reaction is less than one. PMID:20815940
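The stochastic sampling effect described above can be illustrated with a short probability sketch. This is an illustration, not the study's analysis: it assumes template DNA lands on the FTA card as a Poisson process, with a hypothetical mean copy number per punch:

```python
import math

def detection_probability(mean_copies_per_punch: float, n_punches: int) -> float:
    """Probability that at least one of n punches contains >= 1 template
    molecule, assuming molecules are Poisson-distributed across the card."""
    p_miss_one = math.exp(-mean_copies_per_punch)  # P(a single punch is empty)
    return 1.0 - p_miss_one ** n_punches

# At a hypothetical ~0.1 template copies per punch, a single punch rarely
# detects the infection, but repeated punches approach certain detection.
single = detection_probability(0.1, 1)   # ~0.10
many = detection_probability(0.1, 30)    # ~0.95
```

This is why single-punch PCR underestimates prevalence at low parasite density while repeat testing recovers it.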

  12. A magnetic shield/dual purpose mission

    NASA Technical Reports Server (NTRS)

    Watkins, Seth; Albertelli, Jamil; Copeland, R. Braden; Correll, Eric; Dales, Chris; Davis, Dana; Davis, Nechole; Duck, Rob; Feaster, Sandi; Grant, Patrick

    1994-01-01

    The objective of this work is to design, build, and fly a dual-purpose payload whose function is to produce a large volume, low intensity magnetic field and to test the concept of using such a magnetic field to protect manned spacecraft against particle radiation. An additional mission objective is to study the effect of this moving field on upper atmosphere plasmas. Both mission objectives appear to be capable of being tested using the same superconducting coil. The potential benefits of this magnetic shield concept apply directly to both earth-orbital and interplanetary missions. This payload would be a first step in assessing the true potential of large volume magnetic fields in the U.S. space program. Either converted launch systems or piggyback payload opportunities may be appropriate for this mission. The use of superconducting coils for magnetic shielding against solar flare radiation during manned interplanetary missions has long been contemplated and was considered in detail in the years preceding the Apollo mission. With the advent of new superconductors, it has now become realistic to reconsider this concept for a Mars mission. Even in near-earth orbits, large volume magnetic fields produced using conventional metallic superconductors allow novel plasma physics experiments to be contemplated. Both deployed field-coil and non-deployed field-coil shielding arrangements have been investigated, with the latter being most suitable for an initial test payload in a polar orbit.

  13. Bloodstain detection and discrimination impacted by spectral shift when using an interference filter-based visible and near-infrared multispectral crime scene imaging system

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Messinger, David W.; Dube, Roger R.

    2018-03-01

Bloodstain detection and discrimination from nonblood substances on various substrates are critical in forensic science as bloodstains are a critical source for confirmatory DNA tests. Conventional bloodstain detection methods often involve time-consuming sample preparation, a chance of harm to investigators, the possibility of destruction of blood samples, and acquisition of too little data at crime scenes either in the field or in the laboratory. An imaging method has the advantages of being nondestructive, noncontact, real-time, and covering a large field-of-view. The abundant spectral information provided by multispectral imaging makes it a potential presumptive bloodstain detection and discrimination method. This article proposes an interference filter (IF)-based, area-scanning, three-spectral-band crime scene imaging system for forensic bloodstain detection and discrimination. The impact of large angles of view on the spectral shift of the calibrated IFs is determined for both detecting and discriminating bloodstains from visually similar substances on multiple substrates. Spectral features in the visible and near-infrared portion of the spectrum are exploited using the relative band depth method. This study shows that a 1 ml bloodstain on black felt, gray felt, red felt, white cotton, white polyester, and raw wood can be detected. Bloodstains on the above substrates can be discriminated from cola, coffee, ketchup, orange juice, red wine, and green tea.
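The relative band depth method mentioned above compares reflectance at an absorption band center against its shoulders. A minimal sketch of one common formulation; the band positions and reflectance values below are hypothetical, not the article's calibrated bands:

```python
def relative_band_depth(r_left: float, r_center: float, r_right: float) -> float:
    """One common formulation of relative band depth: mean reflectance of
    the two shoulder bands divided by reflectance at the absorption band
    center. Values well above 1 indicate a deep absorption feature."""
    return 0.5 * (r_left + r_right) / r_center

# Hypothetical reflectances around a hemoglobin absorption feature:
rbd_blood = relative_band_depth(0.40, 0.10, 0.35)   # deep feature -> 3.75
rbd_fabric = relative_band_depth(0.40, 0.38, 0.42)  # flat spectrum -> ~1.08
```

A stain is then flagged when its RBD exceeds a threshold tuned on known samples.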

  14. Sampling the sound field in auditoria using large natural-scale array measurements.

    PubMed

    Witew, Ingo B; Vorländer, Michael; Xiang, Ning

    2017-03-01

Suitable data for spatial wave field analyses in concert halls need to satisfy the sampling theorem and hence require densely spaced measurement positions over extended regions. The described measurement apparatus is capable of automatically sampling the sound field in auditoria over a surface of 5.30 m × 8.00 m at any specified resolution. In addition to discussing design features, a case study based on measured impulse responses is presented. The experimental data allow wave field animations demonstrating how sound propagating at grazing incidence over theater seating is scattered from rows of chairs (seat-dip effect). The visualized data of reflections and scattering from an auditorium's boundaries give insights and opportunities for advanced analyses.

  15. 40 CFR 1048.515 - What are the field-testing procedures?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... CFR part 1065, subpart J, we describe the equipment and sampling methods for testing engines in the... specify in § 1048.101(c) for any continuous sampling period of at least 120 seconds under the following ranges of operation and operating conditions: (1) Engine operation during the emission sampling period...

  16. Rasch model based analysis of the Force Concept Inventory

    NASA Astrophysics Data System (ADS)

    Planinic, Maja; Ivanjek, Lana; Susac, Ana

    2010-06-01

The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One such tool is the stochastic Rasch model, which enables construction of linear measures for persons and items from raw test scores and which can provide important insight into the structure and functioning of the test (how item difficulties are distributed within the test, how well the items fit the model, and how well the items work together to define the underlying construct). The data for the Rasch analysis come from large-scale research conducted in 2006-07, which investigated Croatian high school students’ conceptual understanding of mechanics on a representative sample of 1676 students (age 17-18 years). The instrument used in the research was the FCI. The average FCI score for the whole sample was found to be (27.7 ± 0.4)%, indicating that most of the students were still non-Newtonians at the end of high school, despite the fact that physics is a compulsory subject in Croatian schools. The large set of obtained data was analyzed with the Rasch measurement computer software WINSTEPS 3.66. Since the FCI is routinely used as pretest and post-test on two very different types of population (non-Newtonian and predominantly Newtonian), an additional predominantly Newtonian sample (N = 141, average FCI score of 64.5%) of first-year students enrolled in an introductory physics course at the University of Zagreb was also analyzed. The Rasch model based analysis suggests that the FCI has succeeded in defining a sufficiently unidimensional construct for each population. The analysis of fit of data to the model found no grossly misfitting items which would degrade measurement. Some items with larger misfit and items with significantly different difficulties in the two samples of students do require further examination. The analysis revealed some problems with item distribution in the FCI and suggested that the FCI may function differently in non-Newtonian and predominantly Newtonian populations. Some possible improvements of the test are suggested.
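The Rasch model underlying this analysis expresses the probability of a correct response as a logistic function of the difference between person ability and item difficulty, both on the same logit scale. A minimal sketch of that relation (illustrative values, not the study's estimates):

```python
import math

def rasch_probability(theta: float, b: float) -> float:
    """Rasch model: probability that a person with ability theta (logits)
    answers an item of difficulty b (logits) correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A person whose ability equals the item difficulty has a 50% chance of
# success; each logit of advantage multiplies the odds of success by e.
p_equal = rasch_probability(0.0, 0.0)   # 0.5
p_easier = rasch_probability(1.0, 0.0)  # ~0.73
```

Fitting the model (as WINSTEPS does) estimates all theta and b values jointly from the raw response matrix.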

  17. Risk in cleaning: chemical and physical exposure.

    PubMed

    Wolkoff, P; Schneider, T; Kildesø, J; Degerth, R; Jaroszewski, M; Schunk, H

    1998-04-23

Cleaning is a large enterprise involving a large fraction of the workforce worldwide. A broad spectrum of cleaning agents has been developed to facilitate dust and dirt removal, disinfection, and surface maintenance. The cleaning agents are used in large quantities throughout the world. Although a complex pattern of exposure to cleaning agents and resulting health problems, such as allergies and asthma, is reported among cleaners, only a few surveys of this type of product have been performed. This paper gives a broad introduction to cleaning agents and the impact of cleaning on cleaners, occupants of indoor environments, and the quality of cleaning. Cleaning agents are usually grouped into different product categories according to their technical functions and the purpose of their use (e.g. disinfectants and surface care products). The paper also indicates the adverse health and comfort effects associated with the use of these agents in connection with the cleaning process. The paper identifies disinfectants as the most hazardous group of cleaning agents. Cleaning agents contain evaporative and non-evaporative substances. The major toxicologically significant constituents of the former are volatile organic compounds (VOCs), defined as substances with boiling points in the range of 0 degrees C to about 400 degrees C. Although laboratory emission testing has shown many VOCs with quite different time-concentration profiles, few field studies have been carried out measuring the exposure of cleaners. However, both field studies and emission testing indicate that the use of cleaning agents results in a temporal increase in the overall VOC level. This increase may occur during the cleaning process and thus can increase the probability of short-term exposure of the cleaners.
However, the increased levels can also be present after the cleaning and result in an overall increased VOC level that can possibly affect the indoor air quality (IAQ) perceived by occupants. The variety and duration of the emissions depend inter alia on the use of fragrances and high boiling VOCs. Some building materials appear to increase their VOC emission through wet cleaning and thus may affect the IAQ. Particles and dirt contain a great variety of both volatile and non-volatile substances, including allergens. While the volatile fraction can consist of more than 200 different VOCs including formaldehyde, the non-volatile fraction can contain considerable amounts (> 0.5%) of fatty acid salts and tensides (e.g. linear alkyl benzene sulphonates). The level of these substances can be high immediately after the cleaning process, but few studies have been conducted concerning this problem. The substances partly originate from the use of cleaning agents. Both types are suspected to be airway irritants. Cleaning activities generate dust, mostly by resuspension, but other occupant activities may also resuspend dust over longer periods of time. Personal sampling of VOCs and airborne dust gives higher results than stationary sampling. International bodies have proposed air sampling strategies. A variety of field sampling techniques for VOC and surface particle sampling is listed.

  18. Low energy prompt gamma-ray tests of a large volume BGO detector.

    PubMed

    Naqvi, A A; Kalakada, Zameer; Al-Anezi, M S; Raashid, M; Khateeb-ur-Rehman; Maslehuddin, M; Garwan, M A

    2012-01-01

Tests of a large volume bismuth germanate (BGO) detector were carried out to detect low energy prompt gamma-rays from boron- and cadmium-contaminated water samples using a portable neutron generator-based Prompt Gamma Neutron Activation Analysis (PGNAA) setup. In spite of strong interference between the sample- and detector-associated prompt gamma-rays, an excellent agreement has been observed between the experimental and calculated yields of the prompt gamma-rays, indicating successful application of the large volume BGO detector in the PGNAA analysis of bulk samples using low energy prompt gamma-rays. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Development of a persistent superconducting joint between Bi-2212/Ag-alloy multifilamentary round wires

    NASA Astrophysics Data System (ADS)

    Chen, Peng; Trociewitz, Ulf P.; Davis, Daniel S.; Bosque, Ernesto S.; Hilton, David K.; Kim, Youngjae; Abraimov, Dmytro V.; Starch, William L.; Jiang, Jianyi; Hellstrom, Eric E.; Larbalestier, David C.

    2017-02-01

Superconducting joints are one of the key components needed to make Ag-alloy clad Bi2Sr2CaCu2O8+x (Bi-2212) superconducting round wire (RW) successful for high-field, high-homogeneity magnet applications, especially for nuclear magnetic resonance magnets in which persistent current mode operation is highly desired. In this study, a procedure for fabricating superconducting joints between Bi-2212 RWs during coil reaction was developed. Melting temperatures of Bi-2212 powder with different amounts of Ag addition were investigated by differential thermal analysis so as to provide information for selecting the proper joint matrix. Test joints of 1.3 mm diameter wires heat treated in 1 bar flowing oxygen using the typical partial melt Bi-2212 heat treatment (HT) had transport critical currents Ic of ~900 A at 4.2 K and self-field, decreasing to ~480 A at 14 T, evaluated at 0.1 μV cm⁻¹ at 4.2 K. Compared to the Ic of the open-ended short conductor samples with identical 1 bar HT, the Ic values of the superconducting joint are ~20% smaller than those of conductor samples measured in parallel field but ~20% larger than conductor samples measured in perpendicular field. Microstructures examined by scanning electron microscopy clearly showed the formation of a superconducting Bi-2212 interface between the two Bi-2212 RWs. Furthermore, a Bi-2212 RW closed-loop solenoid with a superconducting joint heat treated in 1 bar flowing oxygen showed an estimated joint resistance below 5 × 10⁻¹² Ω based on its field decay rate. This value is sufficiently low to demonstrate the potential for persistent operation of large inductance Bi-2212 coils.
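The joint-resistance figure quoted above follows from the exponential decay of the current trapped in a closed superconducting loop, B(t) = B0·exp(-Rt/L). A minimal sketch of the inference, using a hypothetical loop inductance and decay measurement (the paper's actual coil parameters are not given here):

```python
import math

def joint_resistance(inductance_h: float, b_ratio: float, elapsed_s: float) -> float:
    """Estimate joint resistance R (ohms) from the field decay of a closed
    superconducting loop: B(t) = B0 * exp(-R t / L), hence
    R = -(L / t) * ln(B(t) / B0)."""
    return -(inductance_h / elapsed_s) * math.log(b_ratio)

# Hypothetical numbers: a 1 mH loop whose trapped field drops by 0.1%
# over one week of monitoring.
week = 7 * 24 * 3600.0
r_est = joint_resistance(1e-3, 0.999, week)  # ~1.7e-12 ohm
```

The slower the observed decay (b_ratio closer to 1), the lower the inferred joint resistance.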

  20. Effects of Vegetated Field Borders on Arthropods in Cotton Fields in Eastern North Carolina

    PubMed Central

    Outward, Randy; Sorenson, Clyde E.; Bradley, J. R.

    2008-01-01

The influence, if any, of 5 m wide, feral, herbaceous field borders on pest and beneficial arthropods in commercial cotton, Gossypium hirsutum (L.) (Malvales: Malvaceae), fields was measured through a variety of sampling techniques over three years. In each year, five fields with managed, feral vegetation borders and five fields without such borders were examined. Sampling was stratified from the field border or edge in each field in an attempt to elucidate any edge effects that might have occurred. Early season thrips populations appeared to be unaffected by the presence of a border. Pitfall sampling disclosed no differences in ground-dwelling predaceous arthropods but did detect increased populations of crickets around fields with borders. Cotton aphid (Aphis gossypii Glover) (Hemiptera: Aphididae) populations were too low during the study to adequately assess border effects. Heliothines, Heliothis virescens (F.) and Helicoverpa zea (Boddie) (Lepidoptera: Noctuidae), egg numbers and damage rates were largely unaffected by the presence or absence of a border, although in one instance egg numbers were significantly lower in fields with borders. Overall, foliage-dwelling predaceous arthropods were somewhat more abundant in fields with borders than in fields without borders. Tarnished plant bugs, Lygus lineolaris (Palisot de Beauvois) (Heteroptera: Miridae), were significantly more abundant in fields with borders, but stink bug, Acrosternum hilare (Say) and Euschistus servus (Say) (Hemiptera: Pentatomidae), numbers appeared to be largely unaffected by border treatment. Few taxa clearly exhibited distributional edge effects relative to the presence or absence of border vegetation. Field borders like those examined in this study likely will have little impact on insect pest management in cotton under current insect management regimens. PMID:20345293

  1. When the Test of Mediation is More Powerful than the Test of the Total Effect

    PubMed Central

    O'Rourke, Holly P.; MacKinnon, David P.

    2014-01-01

    Although previous research has studied power in mediation models, the extent to which the inclusion of a mediator will increase power has not been investigated. First, a study compared analytical power of the mediated effect to the total effect in a single mediator model to identify the situations in which the inclusion of one mediator increased statistical power. Results from the first study indicated that including a mediator increased statistical power in small samples with large coefficients and in large samples with small coefficients, and when coefficients were non-zero and equal across models. Next, a study identified conditions where power was greater for the test of the total mediated effect compared to the test of the total effect in the parallel two mediator model. Results indicated that including two mediators increased power in small samples with large coefficients and in large samples with small coefficients, the same pattern of results found in the first study. Finally, a study assessed analytical power for a sequential (three-path) two mediator model and compared power to detect the three-path mediated effect to power to detect both the test of the total effect and the test of the mediated effect for the single mediator model. Results indicated that the three-path mediated effect had more power than the mediated effect from the single mediator model and the test of the total effect. Practical implications of these results for researchers are then discussed. PMID:24903690
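The single-mediator power comparison described above can be reproduced in miniature by Monte Carlo. This sketch is not the authors' analytical method: it uses a joint-significance test of the two paths as the mediation test, assumes complete mediation (c' = 0), and picks hypothetical coefficients and sample size:

```python
import random
import statistics

def slope_z(pred, out):
    """z statistic of the slope in a simple linear regression of out on pred."""
    mp, mo = statistics.fmean(pred), statistics.fmean(out)
    sxx = sum((p - mp) ** 2 for p in pred)
    slope = sum((p - mp) * (o - mo) for p, o in zip(pred, out)) / sxx
    resid = [(o - mo) - slope * (p - mp) for p, o in zip(pred, out)]
    se = (sum(r * r for r in resid) / (len(out) - 2) / sxx) ** 0.5
    return slope / se

def simulate_power(n, a, b, reps=500, crit=1.96, seed=1):
    """Empirical power of the joint-significance mediation test vs. the
    total-effect test in a model X -> M -> Y with c' = 0."""
    rng = random.Random(seed)
    hits_med = hits_tot = 0
    for _ in range(reps):
        x = [rng.gauss(0, 1) for _ in range(n)]
        m = [a * xi + rng.gauss(0, 1) for xi in x]   # M = a*X + e1
        y = [b * mi + rng.gauss(0, 1) for mi in m]   # Y = b*M + e2
        if abs(slope_z(x, m)) > crit and abs(slope_z(m, y)) > crit:
            hits_med += 1
        if abs(slope_z(x, y)) > crit:                # total effect a*b
            hits_tot += 1
    return hits_med / reps, hits_tot / reps

# Moderately large sample with small paths: the mediation test detects the
# effect far more often than the total-effect test, matching the pattern above.
power_med, power_tot = simulate_power(n=100, a=0.3, b=0.3)
```

Here the total effect is only a·b = 0.09, so the total-effect test has little power while each individual path is easily detected.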

  2. Biostatistics primer: part I.

    PubMed

    Overholser, Brian R; Sowinski, Kevin M

    2007-12-01

    Biostatistics is the application of statistics to biologic data. The field of statistics can be broken down into 2 fundamental parts: descriptive and inferential. Descriptive statistics are commonly used to categorize, display, and summarize data. Inferential statistics can be used to make predictions based on a sample obtained from a population or some large body of information. It is these inferences that are used to test specific research hypotheses. This 2-part review will outline important features of descriptive and inferential statistics as they apply to commonly conducted research studies in the biomedical literature. Part 1 in this issue will discuss fundamental topics of statistics and data analysis. Additionally, some of the most commonly used statistical tests found in the biomedical literature will be reviewed in Part 2 in the February 2008 issue.
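The descriptive/inferential distinction can be made concrete with a toy example (hypothetical measurements; a two-sample t statistic stands in for the inferential tests surveyed in Part 2):

```python
import statistics
from math import sqrt

# Descriptive statistics: summarize the samples themselves.
sample_a = [5.1, 4.8, 5.6, 5.0, 4.9, 5.3, 5.2, 4.7]
sample_b = [5.9, 5.4, 6.1, 5.8, 5.6, 6.0, 5.7, 5.5]
mean_a, sd_a = statistics.fmean(sample_a), statistics.stdev(sample_a)
mean_b, sd_b = statistics.fmean(sample_b), statistics.stdev(sample_b)

# Inferential statistics: use the samples to test a hypothesis about the
# populations they were drawn from (here, an unpooled two-sample t statistic).
n_a, n_b = len(sample_a), len(sample_b)
se = sqrt(sd_a**2 / n_a + sd_b**2 / n_b)
t_stat = (mean_b - mean_a) / se
```

A large |t| relative to its reference distribution would lead us to infer that the two population means differ, which is a claim about the populations, not just the samples.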

  3. Woodstove emission measurement methods: Comparison and emission factors update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCrillis, R.C.; Jaasma, D.R.

    1993-01-01

Since woodstoves are tested for certification in the laboratory using EPA Methods 5G and 5H, it is of interest to determine the correlation between these regulatory methods and the in-house equipment. Two in-house sampling systems have been used most widely: one is an intermittent, pump-driven particulate sampler that collects particulate and condensible organics on a filter and organic adsorbent resin; the other uses an evacuated cylinder as the motive force, with particulate and condensible organics collected in a condenser and dual filter. Both samplers can operate unattended for 1-week periods. A large number of tests have been run comparing Methods 5G and 5H to both samplers. The paper presents these comparison data and determines the relationships between the regulatory methods and the field samplers.

  4. Design and construction of a point-contact spectroscopy rig with lateral scanning capability.

    PubMed

    Tortello, M; Park, W K; Ascencio, C O; Saraf, P; Greene, L H

    2016-06-01

The design and realization of a cryogenic rig for point-contact spectroscopy measurements in the needle-anvil configuration is presented. Thanks to the use of two piezoelectric nano-positioners, the tip can move along the vertical (z) and horizontal (x) directions and thus the rig is suitable to probe different regions of a sample in situ. Moreover, it can form double point-contacts on different facets of a single crystal for achieving, e.g., an interferometer configuration for phase-sensitive measurements. For the latter purpose, the sample holder can also host a Helmholtz coil for applying a small transverse magnetic field to the junction. A semi-rigid coaxial cable can be easily added for studying the behavior of Josephson junctions under microwave irradiation. The rig can be detached from the probe and thus used with different cryostats. The performance of this new probe has been tested in a Quantum Design PPMS system by conducting point-contact Andreev reflection measurements on Nb thin films over large areas as a function of temperature and magnetic field.

  5. Design and construction of a point-contact spectroscopy rig with lateral scanning capability

    NASA Astrophysics Data System (ADS)

    Tortello, M.; Park, W. K.; Ascencio, C. O.; Saraf, P.; Greene, L. H.

    2016-06-01

The design and realization of a cryogenic rig for point-contact spectroscopy measurements in the needle-anvil configuration is presented. Thanks to the use of two piezoelectric nano-positioners, the tip can move along the vertical (z) and horizontal (x) directions and thus the rig is suitable to probe different regions of a sample in situ. Moreover, it can form double point-contacts on different facets of a single crystal for achieving, e.g., an interferometer configuration for phase-sensitive measurements. For the latter purpose, the sample holder can also host a Helmholtz coil for applying a small transverse magnetic field to the junction. A semi-rigid coaxial cable can be easily added for studying the behavior of Josephson junctions under microwave irradiation. The rig can be detached from the probe and thus used with different cryostats. The performance of this new probe has been tested in a Quantum Design PPMS system by conducting point-contact Andreev reflection measurements on Nb thin films over large areas as a function of temperature and magnetic field.

  6. Design and construction of a point-contact spectroscopy rig with lateral scanning capability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tortello, M.; Park, W. K., E-mail: wkpark@illinois.edu; Ascencio, C. O.

    2016-06-15

The design and realization of a cryogenic rig for point-contact spectroscopy measurements in the needle-anvil configuration is presented. Thanks to the use of two piezoelectric nano-positioners, the tip can move along the vertical (z) and horizontal (x) directions and thus the rig is suitable to probe different regions of a sample in situ. Moreover, it can form double point-contacts on different facets of a single crystal for achieving, e.g., an interferometer configuration for phase-sensitive measurements. For the latter purpose, the sample holder can also host a Helmholtz coil for applying a small transverse magnetic field to the junction. A semi-rigid coaxial cable can be easily added for studying the behavior of Josephson junctions under microwave irradiation. The rig can be detached from the probe and thus used with different cryostats. The performance of this new probe has been tested in a Quantum Design PPMS system by conducting point-contact Andreev reflection measurements on Nb thin films over large areas as a function of temperature and magnetic field.

  7. The Architecture Design of Detection and Calibration System for High-voltage Electrical Equipment

    NASA Astrophysics Data System (ADS)

    Ma, Y.; Lin, Y.; Yang, Y.; Gu, Ch; Yang, F.; Zou, L. D.

    2018-01-01

With the construction of the Material Quality Inspection Center of the Shandong electric power company, the Electric Power Research Institute is taking on more work in quality analysis and laboratory calibration for high-voltage electrical equipment, and building an information system has become urgent. In this paper we design a consolidated system that implements electronic management and online process automation for material sampling, test apparatus detection, and field testing. For these three tasks we use QR code scanning, online Word editing, and electronic signatures. These techniques simplify the complex processes of warehouse management and test report transfer, and greatly reduce manual procedures. The construction of the standardized detection information platform realizes integrated management of high-voltage electrical equipment from grid connection and operation through periodic detection. According to a system operation evaluation, report transfer is twice as fast, and querying data is also easier and faster.

  8. Uniform field loop-gap resonator and rectangular TEU02 for aqueous sample EPR at 94 GHz

    NASA Astrophysics Data System (ADS)

    Sidabras, Jason W.; Sarna, Tadeusz; Mett, Richard R.; Hyde, James S.

    2017-09-01

In this work we present the design and implementation of two uniform-field resonators: a seven-loop-six-gap loop-gap resonator (LGR) and a rectangular TEU02 cavity resonator. Each resonator has uniform-field-producing end-sections. These resonators have been designed for electron paramagnetic resonance (EPR) of aqueous samples at 94 GHz. The LGR geometry employs low-loss Rexolite end-sections to improve the field homogeneity over a 3 mm sample region-of-interest from a near-cosine distribution to 90% uniformity. The LGR was designed to accommodate large degassable polytetrafluoroethylene (PTFE) tubes (0.81 mm O.D.; 0.25 mm I.D.) for aqueous samples. Additionally, field modulation slots are designed for uniform 100 kHz field modulation incident at the sample. Experiments using a point sample of lithium phthalocyanine (LiPC) were performed to measure the uniformity of both the microwave magnetic field and the 100 kHz field modulation, and confirm simulations. The rectangular TEU02 cavity resonator employs over-sized end-sections with sample shielding to provide an 87% uniform field for a 0.1 × 2 × 6 mm3 sample geometry. An evanescent slotted window was designed for light access to irradiate 90% of the sample volume. A novel dual-slot iris was used to minimize microwave magnetic field perturbations and maintain cross-sectional uniformity. Practical EPR experiments using light-irradiated rose bengal (4,5,6,7-tetrachloro-2‧,4‧,5‧,7‧-tetraiodofluorescein) were performed in the TEU02 cavity. These geometries provide practical designs for uniform-field resonators and continue resonator advancements toward quantitative EPR spectroscopy.

  9. Paleomagnetism studies at micrometer scales using quantum diamond microscopy

    NASA Astrophysics Data System (ADS)

    Kehayias, P.; Fu, R. R.; Glenn, D. R.; Lima, E. A.; Men, M.; Merryman, H.; Walsworth, A.; Weiss, B. P.; Walsworth, R. L.

    2017-12-01

Traditional paleomagnetic experiments generally measure the net magnetic moment of cm-size rock samples. Field tests such as the conglomerate and fold tests, based on the measurements of such cm-size samples, are frequently used to constrain the timing of magnetization. However, structures permitting such field tests may occur at the micron scale in geological samples, precluding paleomagnetic field tests using traditional bulk measurement techniques. The quantum diamond microscope (QDM) is a recently developed technology that uses magnetically-sensitive nitrogen-vacancy (NV) color centers in diamond for magnetic mapping with micron resolution [1]. QDM data were previously used to identify the ferromagnetic carriers in chondrules and terrestrial zircons and to image the magnetization distribution in multi-domain dendritic magnetite. Taking advantage of new hardware components, we have developed an optimized QDM setup with a 1 × 10⁻¹⁵ J/T moment sensitivity over a measurement area of several square millimeters. The improved moment sensitivity of the new QDM setup permits us to image natural remanent magnetization (NRM) in weakly magnetized samples, thereby enabling paleomagnetic field tests at the millimeter scale. We will present recent and ongoing QDM measurements of (1) the Renazzo class carbonaceous (CR) chondrite GRA 95229 and (2) 1 cm scale folds in a post-Bitter Springs Stage (∼790 Ma) carbonate from the Svanbergfjellet Formation (Svalbard). Results from the GRA 95229 micro-conglomerate test, performed on single chondrules containing dusty olivine metals crystallized during chondrule formation, hold implications for the preservation of nebular magnetic field records. The Svanbergfjellet Formation micro-fold test can help confirm the primary origin of a paleomagnetic pole at 790 Ma, which has been cited as evidence for rapid true polar wander in the 820-790 Ma interval.
In addition, we will detail technical aspects of the new QDM setup, emphasizing key elements that enable improved sensitivity. [1] D. R. Glenn et al., arXiv:1707.06714 (2017).

  10. Characteristics of large particles and their effects on the submarine light field

    NASA Astrophysics Data System (ADS)

    Hou, Weilin

Large particles play important roles in the ocean by modifying the underwater light field and affecting material transfer. The particle size distribution of large particles has been measured in situ with multiple-camera video microscopy, and automated particle sizing and recognition software was developed. Results show that there are more large particles in coastal waters than previously thought, described by a hyperbolic size-distribution curve with a (log-log) slope parameter close to 3, instead of 4, for particles larger than 100 μm diameter. Larger slopes are more typical for particles in the open ocean. This slope permits extrapolation of the distribution into the small-particle size range for use in correcting beam-attenuation measurements for near-forward scattering. The large-particle slope and c-meter were used to estimate the small-particle size distributions, which nearly matched those measured with a Coulter Counter (3.05%). There is also a fair correlation (r2 = 0.729) between the slope of the distribution and its concentration parameters. Scattering by large particles is influenced not only by the concentrations of these particles, but also by the scattering phase functions. This first in-situ measurement of large-particle scattering at multiple angles reveals that they scatter more in the backward direction than was previously believed, and the enhanced backscattering can be explained in part by multiple scattering within aggregated particles. Proper identification of these large particles can be of great help in understanding the status of the ecosystem. By extracting particle features from high-resolution video images via moment-invariant functions and applying this information to lower-resolution images, we increase the effective sample volume without severely degrading classification efficiency. Traditional image pattern-recognition algorithms classified zooplankton with results within 24% of counts from bottle samples. A faster particle recognition scheme using optical scattering is introduced, and test results are satisfactory with an average error of 32%. This method shows promise, provided that the signal-to-noise ratio of the observations can be improved.
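The hyperbolic (Junge-type) size distribution and its log-log slope can be sketched as a least-squares fit. The sizes and counts below are synthetic, constructed to follow the coastal slope of 3 reported above:

```python
import math

def junge_slope(diameters_um, counts):
    """Least-squares slope of log10(count) vs log10(diameter) for a
    hyperbolic (Junge-type) size distribution N(D) ~ D**(-k).
    Returns k as a positive number."""
    xs = [math.log10(d) for d in diameters_um]
    ys = [math.log10(c) for c in counts]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Synthetic counts following N(D) = 1e9 * D**(-3), sampled at a few
# diameters (in micrometers) in the >100 um large-particle range:
sizes = [100, 200, 400, 800]
counts = [1e9 * d ** -3 for d in sizes]
k = junge_slope(sizes, counts)  # -> 3.0
```

A fitted k near 3 implies relatively more large particles than the open-ocean value of 4, which is the coastal result described above.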

  11. 40 CFR 53.58 - Operational field precision and blank test.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... samplers are also subject to a test for possible deposition of particulate matter on inactive filters... deposition is defined as the mass of material inadvertently deposited on a sample filter that is stored in a... electrical power to accommodate three test samplers are required. (2) Teflon sample filters, as specified in...

  12. Assuring the Quality of Test Results in the Field of Nuclear Techniques and Ionizing Radiation. The Practical Implementation of Section 5.9 of the EN ISO/IEC 17025 Standard

    NASA Astrophysics Data System (ADS)

    Cucu, Daniela; Woods, Mike

    2008-08-01

    The paper aims to present a practical approach for testing laboratories to ensure the quality of their test results. It is based on the experience gained in assessing a large number of testing laboratories, discussing with management and staff, reviewing results obtained in national and international PTs and ILCs and exchanging information in the EA laboratory committee. According to EN ISO/IEC 17025, an accredited laboratory has to implement a programme to ensure the quality of its test results for each measurand. Pre-analytical, analytical and post-analytical measures shall be applied in a systematic manner. They shall include both quality control and quality assurance measures. When designing the quality assurance programme a laboratory should consider pre-analytical activities (like personnel training, selection and validation of test methods, qualifying equipment), analytical activities (ranging from sampling and sample preparation to instrumental analysis) and post-analytical activities (like decoding, calculation, use of statistical tests or packages, management of results). Designed on different levels (analyst, quality manager and technical manager), and including a variety of measures, the programme shall ensure the validity and accuracy of test results and the adequacy of the management system, demonstrate the laboratory's competence in performing tests under accreditation and, not least, show the comparability of test results. Laboratory management should establish performance targets and periodically review QC/QA results against them, implementing appropriate measures in case of non-compliance.

  13. Site Environmental Report for Calendar Year 2004. DOE Operations at The Boeing Company Santa Susana Field Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ning; Rutherford, Phil; Lee, Majelle

    2005-09-01

    This Annual Site Environmental Report (ASER) for 2004 describes the environmental conditions related to work performed for the Department of Energy (DOE) at Area IV of Boeing’s Santa Susana Field Laboratory (SSFL). In the past, the Energy Technology Engineering Center (ETEC), a government-owned, company-operated test facility, was located in Area IV. The operations in Area IV included development, fabrication, and disassembly of nuclear reactors, reactor fuel, and other radioactive materials. Other activities in the area involved the operation of large-scale liquid metal facilities that were used for testing non-nuclear liquid metal fast breeder components. All nuclear work was terminated in 1988; all subsequent radiological work has been directed toward decontamination and decommissioning (D&D) of the former nuclear facilities and their associated sites. Closure of the liquid metal test facilities began in 1996. Results of the radiological monitoring program for the calendar year 2004 continue to indicate that there are no significant releases of radioactive material from Area IV of SSFL. All potential exposure pathways are sampled and/or monitored, including air, soil, surface water, groundwater, direct radiation, transfer of property (land, structures, waste), and recycling.

  14. Site Environmental Report for Calendar Year 2006. DOE Operations at The Boeing Company Santa Susana Field Laboratory, Area IV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ning; Rutherford, Phil

    2007-09-01

    This Annual Site Environmental Report (ASER) for 2006 describes the environmental conditions related to work performed for the Department of Energy (DOE) at Area IV of Boeing’s Santa Susana Field Laboratory (SSFL). In the past, the Energy Technology Engineering Center (ETEC), a government-owned, company-operated test facility, was located in Area IV. The operations in Area IV included development, fabrication, and disassembly of nuclear reactors, reactor fuel, and other radioactive materials. Other activities in the area involved the operation of large-scale liquid metal facilities that were used for testing non-nuclear liquid metal fast breeder components. All nuclear work was terminated in 1988; all subsequent radiological work has been directed toward decontamination and decommissioning (D&D) of the former nuclear facilities and their associated sites. Closure of the liquid metal test facilities began in 1996. Results of the radiological monitoring program for the calendar year 2006 continue to indicate that there are no significant releases of radioactive material from Area IV of SSFL. All potential exposure pathways are sampled and/or monitored, including air, soil, surface water, groundwater, direct radiation, transfer of property (land, structures, waste), and recycling.

  15. Small-scale dynamic confinement gap test

    NASA Astrophysics Data System (ADS)

    Cook, Malcolm

    2011-06-01

    Gap tests are routinely used to ascertain the shock sensitiveness of new explosive formulations. The tests are popular since they are easy and relatively cheap to perform. However, modern insensitive formulations with large critical diameters require large test samples, which makes testing and screening of new formulations expensive. Thus a new test that uses significantly smaller sample quantities would be very beneficial. In this paper we describe a new small-scale test that has been designed using our CHARM ignition and growth routine in the DYNA2D hydrocode. The new test is a modified gap test and uses detonating nitromethane to provide dynamic confinement (instead of a thick metal case) whilst exposing the sample to a long-duration shock wave. The long-duration shock wave allows less reactive materials that are below their critical diameter more time to react. We present details of the modelling of the test together with some preliminary experiments to demonstrate the potential of the new test method.

  16. Rapid Identification of a Cooling Tower-Associated Legionnaires' Disease Outbreak Supported by Polymerase Chain Reaction Testing of Environmental Samples, New York City, 2014-2015.

    PubMed

    Benowitz, Isaac; Fitzhenry, Robert; Boyd, Christopher; Dickinson, Michelle; Levy, Michael; Lin, Ying; Nazarian, Elizabeth; Ostrowsky, Belinda; Passaretti, Teresa; Rakeman, Jennifer; Saylors, Amy; Shamoonian, Elena; Smith, Terry-Ann; Balter, Sharon

    2018-04-01

    We investigated an outbreak of eight Legionnaires' disease cases among persons living in an urban residential community of 60,000 people. Possible environmental sources included two active cooling towers (air-conditioning units for large buildings) <1 km from patient residences, a market misting system, a community-wide water system used for heating and cooling, and potable water. To support a timely public health response, we used real-time polymerase chain reaction (PCR) to identify Legionella DNA in environmental samples within hours of specimen collection. We detected L. pneumophila serogroup 1 DNA only at a power plant cooling tower, supporting the decision to order remediation before culture results were available. An isolate from a power plant cooling tower sample was indistinguishable from a patient isolate by pulsed-field gel electrophoresis, suggesting the cooling tower was the outbreak source. PCR results were available <1 day after sample collection, and culture results were available as early as 5 days after plating. PCR is a valuable tool for identifying Legionella DNA in environmental samples in outbreak settings.

  17. Successful application of FTA Classic Card technology and use of bacteriophage phi29 DNA polymerase for large-scale field sampling and cloning of complete maize streak virus genomes.

    PubMed

    Owor, Betty E; Shepherd, Dionne N; Taylor, Nigel J; Edema, Richard; Monjane, Adérito L; Thomson, Jennifer A; Martin, Darren P; Varsani, Arvind

    2007-03-01

    Leaf samples from 155 maize streak virus (MSV)-infected maize plants were collected from 155 farmers' fields in 23 districts in Uganda in May/June 2005 by leaf-pressing infected samples onto FTA Classic Cards. Viral DNA was successfully extracted from cards stored at room temperature for 9 months. The diversity of 127 MSV isolates was analysed by PCR-generated RFLPs. Six representative isolates having different RFLP patterns and causing either severe, moderate or mild disease symptoms, were chosen for amplification from FTA cards by bacteriophage phi29 DNA polymerase using the TempliPhi system. Full-length genomes were inserted into a cloning vector using a unique restriction enzyme site, and sequenced. The 1.3-kb PCR product amplified directly from FTA-eluted DNA and used for RFLP analysis was also cloned and sequenced. Comparison of cloned whole genome sequences with those of the original PCR products indicated that the correct virus genome had been cloned and that no errors were introduced by the phi29 polymerase. This is the first successful large-scale application of FTA card technology to the field, and illustrates the ease with which large numbers of infected samples can be collected and stored for downstream molecular applications such as diversity analysis and cloning of potentially new virus genomes.

  18. A monitoring of chemical contaminants in waters used for field irrigation and livestock watering in the Veneto region (Italy), using bioassays as a screening tool.

    PubMed

    De Liguoro, Marco; Bona, Mirco Dalla; Gallina, Guglielmo; Capolongo, Francesca; Gallocchio, Federica; Binato, Giovanni; Di Leva, Vincenzo

    2014-03-01

    In this study, 50 livestock watering sources (ground water) and 50 field irrigation sources (surface water) from various industrialised areas of the Veneto region were monitored for chemical contaminants. From each site, four water samples (one in each season) were collected during the period from summer 2009 through to spring 2010. Surface water samples and ground water samples were first screened for toxicity using the growth inhibition test on Pseudokirchneriella subcapitata and the immobilisation test on Daphnia magna, respectively. Then, based on the results of these toxicity tests, 28 ground water samples and 26 surface water samples were submitted to chemical analysis for various contaminants (insecticides/acaricides, fungicides, herbicides, metals and anions) by means of UPLC-MS(n), AAS and IEC. With the exception of one surface water sample where the total pesticide concentration exceeded 4 μg L(-1), positive samples (51.9%) showed only traces (nanograms per liter) of pesticides. Metals were generally under the detection limit. High concentrations of chlorides (up to 692 mg L(-1)) were found in some ground water samples, while some surface water samples showed an excess of nitrites (up to 336 mg L(-1)). Detected levels of contamination were generally too low to justify the toxicity recorded in the bioassays, especially in the case of surface water samples; the analytical results painted quite a reassuring picture, while tests on P. subcapitata showed strong growth-inhibition activity. It was concluded that, from an ecotoxicological point of view, surface waters used for field irrigation in the Veneto region cannot be considered safe.

  19. Issues Related to Large Flight Hardware Acoustic Qualification Testing

    NASA Technical Reports Server (NTRS)

    Kolaini, Ali R.; Perry, Douglas C.; Kern, Dennis L.

    2011-01-01

    The characteristics of the acoustic testing volumes generated by reverberant chambers or a circle of loudspeakers, with and without large flight hardware within the testing volume, are significantly different. The parameters contributing to these differences are normally not accounted for through analysis or acoustic tests performed prior to the qualification testing without the test hardware present. In most cases the control microphones are kept at least 2 ft away from hardware surfaces, chamber walls, and speaker surfaces to minimize the impact of the hardware in controlling the sound field. However, the acoustic absorption and radiation of sound by hardware surfaces may significantly alter the sound pressure field controlled within the chamber/speaker volume to a given specification. These parameters often result in an acoustic field that may produce under- or over-testing scenarios for flight hardware. In this paper the acoustic absorption by hardware surfaces is discussed in some detail. A simple model is provided to account for some of the observations made on the Mars Science Laboratory spacecraft, which recently underwent acoustic qualification tests in a reverberant chamber.

  20. Detection of toxoplasma-specific immunoglobulin G in human sera: performance comparison of in house Dot-ELISA with ECLIA and ELISA.

    PubMed

    Teimouri, Aref; Modarressi, Mohammad Hossein; Shojaee, Saeedeh; Mohebali, Mehdi; Zouei, Nima; Rezaian, Mostafa; Keshavarz, Hossein

    2018-05-08

    In the current study, the performance of electrochemiluminescence immunoassay (ECLIA) in the detection of anti-toxoplasma IgG in human sera was compared with that of enzyme-linked immunosorbent assay (ELISA). Furthermore, the performance of an in house Dot-ELISA in the detection of anti-toxoplasma IgG was compared with that of ECLIA and ELISA. In total, 219 human sera were tested for anti-toxoplasma IgG using Dynex DS2® and Roche Cobas® e411 Automated Analyzers. Discordant results were rechecked using immunofluorescence assay (IFA). Then, the sera were used in an in house Dot-ELISA to assess toxoplasma-specific IgG. Of the 219 samples, two were found undetermined using ECLIA but reactive using ELISA; using IFA, these two sera were reported unreactive. Furthermore, two samples were found reactive using ECLIA and unreactive using ELISA; these samples were reported reactive using IFA. The overall agreement between the two former methods was 98% (r = 0.98; P < 0.001). The intrinsic parameters calculated for the in house Dot-ELISA, compared to ECLIA and ELISA, were a sensitivity of 79.5%, specificity of 78.2%, and accuracy of 78.9%. Positive and negative predictive values were 82.9 and 74.2%, respectively. The in house Dot-ELISA showed 100% sensitivity for sera that were highly reactive in ECLIA and ELISA. ECLIA is appropriate for first-line serological screening tests and can replace ELISA owing to its high speed, sensitivity, and specificity, particularly in large laboratories. Dot-ELISA is a rapid, sensitive, specific, cost-effective, user-friendly, and field-portable technique and hence can be used for screening for toxoplasmosis, especially in rural areas or less equipped laboratories.
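The performance figures quoted above (sensitivity, specificity, accuracy, PPV, NPV) all derive from a 2×2 confusion matrix of test results against a reference method. A minimal sketch with illustrative counts (hypothetical, not the study's raw data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic-test performance measures, as percentages,
    from true/false positive and negative counts versus a reference."""
    sensitivity = 100.0 * tp / (tp + fn)          # detected among truly positive
    specificity = 100.0 * tn / (tn + fp)          # cleared among truly negative
    accuracy = 100.0 * (tp + tn) / (tp + fp + tn + fn)
    ppv = 100.0 * tp / (tp + fp)                  # positive predictive value
    npv = 100.0 * tn / (tn + fn)                  # negative predictive value
    return sensitivity, specificity, accuracy, ppv, npv

# Illustrative counts only
sens, spec, acc, ppv, npv = diagnostic_metrics(tp=90, fp=10, tn=80, fn=20)
print(f"sens {sens:.1f}%, spec {spec:.1f}%, acc {acc:.1f}%")  # sens 81.8%, spec 88.9%, acc 85.0%
```

Note that predictive values, unlike sensitivity and specificity, depend on the prevalence of positives in the tested panel.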

  1. Micromechanical Characterization of Polysilicon Films through On-Chip Tests

    PubMed Central

    Mirzazadeh, Ramin; Eftekhar Azam, Saeed; Mariani, Stefano

    2016-01-01

    When the dimensions of polycrystalline structures become comparable to the average grain size, some reliability issues can be reported for the moving parts of inertial microelectromechanical systems (MEMS). Not only the overall behavior of the device turns out to be affected by a large scattering, but also the sensitivity to imperfections gets enhanced. In this work, through on-chip tests, we experimentally investigate the behavior of thin polysilicon samples using standard electrostatic actuation/sensing. The discrepancy between the target and actual responses of each sample has then been exploited to identify: (i) the overall stiffness of the film and, according to standard continuum elasticity, a morphology-based value of its Young’s modulus; (ii) the relevant over-etch induced by the fabrication process. To properly account for the aforementioned stochastic features at the micro-scale, the identification procedure has been based on particle filtering. A simple analytical reduced-order model of the moving structure has been also developed to account for the nonlinearities in the electrical field, up to pull-in. Results are reported for a set of ten film samples of constant slenderness, and the effects of different actuation mechanisms on the identified micromechanical features are thoroughly discussed. PMID:27483268

  2. Micromechanical Characterization of Polysilicon Films through On-Chip Tests.

    PubMed

    Mirzazadeh, Ramin; Eftekhar Azam, Saeed; Mariani, Stefano

    2016-07-28

    When the dimensions of polycrystalline structures become comparable to the average grain size, some reliability issues can be reported for the moving parts of inertial microelectromechanical systems (MEMS). Not only the overall behavior of the device turns out to be affected by a large scattering, but also the sensitivity to imperfections gets enhanced. In this work, through on-chip tests, we experimentally investigate the behavior of thin polysilicon samples using standard electrostatic actuation/sensing. The discrepancy between the target and actual responses of each sample has then been exploited to identify: (i) the overall stiffness of the film and, according to standard continuum elasticity, a morphology-based value of its Young's modulus; (ii) the relevant over-etch induced by the fabrication process. To properly account for the aforementioned stochastic features at the micro-scale, the identification procedure has been based on particle filtering. A simple analytical reduced-order model of the moving structure has been also developed to account for the nonlinearities in the electrical field, up to pull-in. Results are reported for a set of ten film samples of constant slenderness, and the effects of different actuation mechanisms on the identified micromechanical features are thoroughly discussed.

  3. Assessment protocols of maximum oxygen consumption in young people with Down syndrome--a review.

    PubMed

    Seron, Bruna Barboza; Greguol, Márcia

    2014-03-01

    Maximum oxygen consumption is considered the gold standard measure of cardiorespiratory fitness. Young people with Down syndrome (DS) present low values of this indicator compared to their peers without disabilities and to young people with an intellectual disability but without DS. The use of reliable and valid assessment methods provides more reliable results for the diagnosis of cardiorespiratory fitness and the response of this variable to exercise. The aim of the present study was to review the literature on the assessment protocols used to measure maximum oxygen consumption in children and adolescents with Down syndrome giving emphasis to the protocols used, the validation process and their feasibility. The search was carried out in eight electronic databases--Scopus, Medline-Pubmed, Web of science, SportDiscus, Cinhal, Academic Search Premier, Scielo, and Lilacs. The inclusion criteria were: (a) articles which assessed VO2peak and/or VO2max (independent of the validation method), (b) samples composed of children and/or adolescents with Down syndrome, (c) participants of up to 20 years old, and (d) studies performed after 1990. Fifteen studies were selected and, of these, 11 measured the VO2peak using tests performed in a laboratory, 2 used field tests and the remaining 2 used both laboratory and field tests. The majority of the selected studies used maximal tests and conducted familiarization sessions. All the studies took into account the clinical conditions that could hamper testing or endanger the individuals. However, a large number of studies used tests which had not been specifically validated for the evaluated population. Finally, the search emphasized the small number of studies which use field tests to evaluate oxygen consumption. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. A 4 Tesla Superconducting Magnet Developed for a 6 Circle Huber Diffractometer at the XMaS Beamline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, P. B. J.; Brown, S. D.; Bouchenoire, L.

    2007-01-19

    We report here on the development and testing of a 4 Tesla cryogen-free superconducting magnet designed to fit within the Euler cradle of a 6 circle Huber diffractometer, allowing scattering in both the vertical and horizontal planes. The geometry of this magnet allows the field to be applied in three orientations: the first along the beam direction, the second with the field transverse to the beam direction in a horizontal plane, and finally with the field applied vertically with respect to the beam. The magnet has a warm bore and an open geometry of 180°, allowing large access to reciprocal space. A variable temperature insert has been developed, which is capable of working down to a temperature of 1.7 K and operating over a wide range of angles whilst maintaining a temperature stability of a few mK. Initial ferromagnetic diffraction measurements have been carried out on single crystal Tb and Dy samples.

  5. Large-field high-contrast hard x-ray Zernike phase-contrast nano-imaging beamline at Pohang Light Source.

    PubMed

    Lim, Jun; Park, So Yeong; Huang, Jung Yun; Han, Sung Mi; Kim, Hong-Tae

    2013-01-01

    We developed an off-axis-illuminated zone-plate-based hard x-ray Zernike phase-contrast microscope beamline at Pohang Light Source. Owing to the condenser-optics-free, off-axis illumination, a large field of view was achieved. The pinhole-type Zernike phase plate affords high-contrast images of a cell with minimal artifacts such as shade-off and halo effects. The setup, including the optics and the alignment, is simple and easy, and allows faster and easier imaging of large bio-samples.

  6. Preparation and Testing of Impedance-based Fluidic Biochips with RTgill-W1 Cells for Rapid Evaluation of Drinking Water Samples for Toxicity

    PubMed Central

    Brennan, Linda M.; Widder, Mark W.; McAleer, Michael K.; Mayo, Michael W.; Greis, Alex P.; van der Schalie, William H.

    2016-01-01

    This manuscript describes how to prepare fluidic biochips with Rainbow trout gill epithelial (RTgill-W1) cells for use in a field portable water toxicity sensor. A monolayer of RTgill-W1 cells forms on the sensing electrodes enclosed within the biochips. The biochips are then used for testing in a field portable electric cell-substrate impedance sensing (ECIS) device designed for rapid toxicity testing of drinking water. The manuscript further describes how to run a toxicity test using the prepared biochips. A control water sample and the test water sample are mixed with pre-measured powdered media and injected into separate channels of the biochip. Impedance readings from the sensing electrodes in each of the biochip channels are measured and compared by an automated statistical software program. The screen on the ECIS instrument will indicate either "Contamination Detected" or "No Contamination Detected" within an hour of sample injection. Advantages are ease of use and rapid response to a broad spectrum of inorganic and organic chemicals at concentrations that are relevant to human health concerns, as well as the long-term stability of stored biochips in a ready state for testing. Limitations are the requirement for cold storage of the biochips and limited sensitivity to cholinesterase-inhibiting pesticides. Applications for this toxicity detector are for rapid field-portable testing of drinking water supplies by Army Preventative Medicine personnel or for use at municipal water treatment facilities. PMID:27023147
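The pass/fail decision described above can be illustrated with a toy rule that compares mean impedance between the control and test channels. This is only a sketch of the general idea; the instrument's actual automated statistical program, and its thresholds, are not described in the source, so the function, threshold, and readings below are all hypothetical:

```python
import statistics

def classify_sample(control_ohms, test_ohms, threshold_pct=20.0):
    """Toy decision rule (NOT the instrument's algorithm): flag contamination
    when mean test-channel impedance deviates from the control channel
    by more than threshold_pct."""
    c = statistics.mean(control_ohms)
    t = statistics.mean(test_ohms)
    change = 100.0 * abs(t - c) / c
    return "Contamination Detected" if change > threshold_pct else "No Contamination Detected"

control = [1850, 1870, 1860, 1855]   # intact RTgill-W1 monolayer, ohms
toxic = [1400, 1350, 1300, 1280]     # impedance drops as cells detach or die
print(classify_sample(control, toxic))  # -> Contamination Detected
```

The physical premise matches the abstract: a healthy cell monolayer on the sensing electrodes holds impedance high, and toxicant-induced damage lowers it relative to the control channel.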

  7. Preparation and Testing of Impedance-based Fluidic Biochips with RTgill-W1 Cells for Rapid Evaluation of Drinking Water Samples for Toxicity.

    PubMed

    Brennan, Linda M; Widder, Mark W; McAleer, Michael K; Mayo, Michael W; Greis, Alex P; van der Schalie, William H

    2016-03-07

    This manuscript describes how to prepare fluidic biochips with Rainbow trout gill epithelial (RTgill-W1) cells for use in a field portable water toxicity sensor. A monolayer of RTgill-W1 cells forms on the sensing electrodes enclosed within the biochips. The biochips are then used for testing in a field portable electric cell-substrate impedance sensing (ECIS) device designed for rapid toxicity testing of drinking water. The manuscript further describes how to run a toxicity test using the prepared biochips. A control water sample and the test water sample are mixed with pre-measured powdered media and injected into separate channels of the biochip. Impedance readings from the sensing electrodes in each of the biochip channels are measured and compared by an automated statistical software program. The screen on the ECIS instrument will indicate either "Contamination Detected" or "No Contamination Detected" within an hour of sample injection. Advantages are ease of use and rapid response to a broad spectrum of inorganic and organic chemicals at concentrations that are relevant to human health concerns, as well as the long-term stability of stored biochips in a ready state for testing. Limitations are the requirement for cold storage of the biochips and limited sensitivity to cholinesterase-inhibiting pesticides. Applications for this toxicity detector are for rapid field-portable testing of drinking water supplies by Army Preventative Medicine personnel or for use at municipal water treatment facilities.

  8. Study on Thermal Decomposition Characteristics of Ammonium Nitrate Emulsion Explosive in Different Scales

    NASA Astrophysics Data System (ADS)

    Wu, Qiujie; Tan, Liu; Xu, Sen; Liu, Dabin; Min, Li

    2018-04-01

    Numerous accidents involving emulsion explosive (EE) are attributed to uncontrolled thermal decomposition of large quantities of ammonium nitrate emulsion (ANE, the intermediate of EE) and EE. To study the thermal decomposition characteristics of ANE and EE at different scales, a large-scale modified vented pipe test (MVPT) and two laboratory-scale tests, differential scanning calorimetry (DSC) and accelerating rate calorimetry (ARC), were applied in the present study. Both scale effects and water content play an important role in the thermal stability of ANE and EE. The measured decomposition temperatures of ANE and EE in the MVPT are 146°C and 144°C, respectively, much lower than those in DSC and ARC. As the size of the same sample in DSC, ARC, and MVPT successively increases, the onset temperatures decrease. In the same test, the measured onset temperature of ANE is higher than that of EE, and the water content of the sample has a stabilizing effect. The large-scale MVPT can provide information relevant to real-life operations, which carry greater risks; continuous overheating should be avoided.

  9. Prevalence of pfhrp2 and/or pfhrp3 Gene Deletion in Plasmodium falciparum Population in Eight Highly Endemic States in India

    PubMed Central

    Bharti, Praveen Kumar; Chandel, Himanshu Singh; Ahmad, Amreen; Krishna, Sri; Udhayakumar, Venkatachalam; Singh, Neeru

    2016-01-01

    Background Plasmodium falciparum encoded histidine rich protein (HRP2) based malaria rapid diagnostic tests (RDTs) are used in India. Deletion of the pfhrp2 and pfhrp3 genes contributes to false negative test results, and large numbers of such deletions have been reported from South America, highlighting the importance of surveillance to detect them. Methods This is the first prospective field study, carried out at 16 sites located in eight endemic states of India, to assess the performance of PfHRP2 based RDT kits used in the national malaria control programme. In this study, microscopically confirmed P. falciparum but RDT negative samples were assessed for the presence of pfhrp2, pfhrp3, and their flanking genes using PCR. Results Among 1521 microscopically positive P. falciparum samples screened, 50 were negative by the HRP2 based RDT test. Molecular testing was carried out on these 50 RDT negative samples, assuming that the 1471 RDT positive samples carried the pfhrp2 gene. It was found that 2.4% (36/1521) and 1.8% (27/1521) of samples were negative for the pfhrp2 and pfhrp3 genes, respectively. However, the frequency of pfhrp2 deletions varied between the sites, ranging from 0 to 25% (overall 2.4%, 95% CI 1.6–3.3). The frequency of deletion of both the pfhrp2 and pfhrp3 genes varied from 0 to 8% (overall 1.6%, 95% CI 1.0–2.4). Conclusion This study provides evidence for the low-level presence of pfhrp2 and pfhrp3 deleted P. falciparum parasites in different endemic regions of India, and periodic surveillance is warranted for reliable use of PfHRP2 based RDTs. PMID:27518538
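The deletion frequencies above are reported with 95% confidence intervals. A Wilson score interval for the 36 pfhrp2-negative samples out of 1521 gives an interval close to the quoted 1.6–3.3%; the paper's exact CI method is not stated, so small differences from the published bounds are expected:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion,
    returned as percentages."""
    p = successes / n
    denom = 1.0 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return 100.0 * (centre - half), 100.0 * (centre + half)

# 36 of 1521 samples lacked pfhrp2 (point estimate 2.4%)
lo, hi = wilson_ci(36, 1521)
print(f"2.4% (95% CI {lo:.1f}-{hi:.1f})")
```

The Wilson interval is preferred over the simple normal approximation for proportions this close to zero, since it never produces a negative lower bound.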

  10. Search Coil vs. Fluxgate Magnetometer Measurements at Interplanetary Shocks

    NASA Technical Reports Server (NTRS)

    Wilson, L.B., III

    2012-01-01

    We present magnetic field observations at interplanetary shocks comparing two different sample rates that show significantly different results. Fluxgate magnetometer measurements at roughly 11 samples/s show relatively laminar supercritical shock transitions. Search coil magnetometer measurements at 1875 samples/s, however, show large amplitude (dB/B as large as 2) fluctuations that are not resolved by the fluxgate magnetometer. We show that these fluctuations, identified as whistler mode waves, would produce a significant perturbation to the shock transition region, changing the interpretation from laminar to turbulent. Thus, previous observations of supercritical interplanetary shocks classified as laminar may have been undersampled.
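The undersampling argument is the Nyquist criterion: an 11 samples/s fluxgate can only resolve fluctuations below ~5.5 Hz, so a whistler-band wave above that is aliased to a spurious low frequency, while the 1875 samples/s search coil resolves it directly. A minimal sketch (the 20 Hz wave frequency is illustrative, not taken from the observations):

```python
import numpy as np

f_wave = 20.0  # Hz; illustrative whistler-band fluctuation frequency

def dominant_freq(fs, duration=2.0):
    """Frequency at which a sampled f_wave sinusoid appears in the spectrum.
    Above the Nyquist frequency fs/2, the wave is aliased to a lower bin."""
    n = int(round(fs * duration))
    t = np.arange(n) / fs
    b = np.sin(2 * np.pi * f_wave * t)
    spec = np.abs(np.fft.rfft(b))
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    return freqs[spec.argmax()]

print(dominant_freq(1875.0))  # search coil: wave resolved near 20 Hz
print(dominant_freq(11.0))    # fluxgate: aliased to ~2 Hz, well below Nyquist
```

At 11 samples/s the 20 Hz wave folds down to about 2 Hz, so a genuinely turbulent transition can masquerade as a smooth, laminar one in the low-rate record.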

  11. Evidence of Zika Virus RNA Fragments in Aedes albopictus (Diptera: Culicidae) Field-Collected Eggs From Camaçari, Bahia, Brazil.

    PubMed

    Smartt, Chelsea T; Stenn, Tanise M S; Chen, Tse-Yu; Teixeira, Maria Gloria; Queiroz, Erivaldo P; Souza Dos Santos, Luciano; Queiroz, Gabriel A N; Ribeiro Souza, Kathleen; Kalabric Silva, Luciano; Shin, Dongyoung; Tabachnick, Walter J

    2017-07-01

    A major mosquito-borne viral disease outbreak caused by Zika virus (ZIKV) occurred in Bahia, Brazil, in 2015, largely due to transmission by the mosquito Aedes aegypti (L.). Detecting ZIKV in field samples of Ae. aegypti has proven problematic in some locations, suggesting that other mosquito species might be contributing to the spread of ZIKV. In this study, five adult Aedes albopictus (Skuse) mosquitoes that emerged from a 2015 field collection of eggs from Camaçari, Bahia, Brazil, were positive for ZIKV RNA; however, attempts to isolate live virus were not successful. Results from this study suggest that field-collected Ae. albopictus eggs may contain ZIKV RNA, which requires further tests for infectious ZIKV. There is a need to investigate the role of Ae. albopictus in the ZIKV infection process in Brazil and to study the potential for vertical and sexual transmission of ZIKV in this species. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. Applicability of ELISA-based Determination of Pesticides for Groundwater Quality Monitoring

    NASA Astrophysics Data System (ADS)

    Tsuchihara, Takeo; Yoshimoto, Shuhei; Ishida, Satoshi; Imaizumi, Masayuki

    The principles and procedures of ELISA (Enzyme-linked Immunosorbent Assay)-based determination of pesticides (Fenitrothion) in environmental samples are reviewed, and the applicability of the ELISA method to groundwater quality monitoring was validated through experimental tracer tests in soil columns and a field test on Okinoerabu Island. The test results showed that the ELISA method can be useful not only for screening but also for quantitative analysis of pesticides. In the experimental tracer tests in soil columns, retardation of pesticide leaching relative to conservative tracers was observed. In the field test, contamination by the pesticide was detected in groundwater samples on Okinoerabu Island, even though the targeted pesticide was considered to have been applied to the upland field 4 months earlier. To investigate the transport and fate of pesticides in groundwater, taking into account retardation between the field and the groundwater table as well as the residue in groundwater, continuous observation of pesticides in groundwater is strongly needed, and the ELISA method is applicable to such long-term groundwater quality monitoring.

  13. Isolation of Leptospira serovars Canicola and Copenhageni from cattle urine in the state of Paraná, Brazil

    PubMed Central

    Zacarias, Francielle Gibson da Silva; Vasconcellos, Silvio Arruda; Anzai, Eleine Kuroki; Giraldi, Nilson; de Freitas, Julio Cesar; Hartskeerl, Rudy

    2008-01-01

    In 2001, 698 urine samples were randomly collected from cattle at a slaughterhouse in the State of Paraná, Brazil. Direct examination using dark-field microscopy was carried out immediately after collection. Five putative positive samples were cultured in modified EMJH medium, yielding two positive cultures (LO-14 and LO-10). Typing with monoclonal antibodies revealed that the two isolates were similar to Canicola (LO-14) and Copenhageni (LO-10). Microscopic agglutination test results show that Hardjo is the most common serovar in cattle in Brazil. Rats and dogs are the common maintenance hosts of serovars Copenhageni and Canicola, respectively. The excretion of highly pathogenic serovars such as Copenhageni and Canicola by cattle can represent an increasing risk of severe leptospirosis in large populations, mainly those living in rural areas. PMID:24031301

  14. High-precision processing and detection of the large-aperture off-axis aspheric mirror

    NASA Astrophysics Data System (ADS)

    Dai, Chen; Li, Ang; Xu, Lingdi; Zhang, Yingjie

    2017-10-01

    To achieve efficient, controllable, digital processing and high-precision measurement of a large-aperture off-axis aspheric mirror, meeting the needs of modern high-resolution, wide-field space optical remote sensing cameras, we carried out research on high-precision machining and testing of off-axis aspheric mirrors. First, an off-axis aspheric sample with an aperture of 574 mm × 302 mm was formed on a milling machine, and then intelligent robot equipment was used for high-precision polishing of the off-axis asphere. After fine polishing with ion beam equipment, the surface of the sample was measured with both off-axis aspheric contact contour detection technology and off-axis aspheric surface interference detection technology. The final surface accuracy is 12 nm RMS.

  15. A novel enhanced diffusion sampler for collecting gaseous pollutants without air agitation.

    PubMed

    Pan, Xuelian; Zhuo, Shaojie; Zhong, Qirui; Chen, Yuanchen; Du, Wei; Cheng, Hefa; Wang, Xilong; Zeng, Eddy Y; Xing, Baoshan; Tao, Shu

    2018-03-06

    A novel enhanced diffusion sampler for collecting gas-phase polycyclic aromatic hydrocarbons (PAHs) without air agitation is proposed. The diffusion of target compounds into a sampling chamber is facilitated by continuously purging the chamber through a closed-loop flow, creating a large concentration difference between the ambient air and the air in the sampling chamber. A glass-fiber filter-based prototype was developed and shown to collect gaseous PAHs at a much higher rate (1.6 ± 1.4 L/min) than regular passive samplers, while leaving the ambient air unagitated. The prototype was also tested in both the laboratory and the field to characterize concentration gradients over a short distance from the soil surface. The sampler could be applied in other, similar situations to characterize the concentration profiles of other chemicals.

  16. Modeling apparent color for visual evaluation of camouflage fabrics

    NASA Astrophysics Data System (ADS)

    Ramsey, S.; Mayo, T.; Shabaev, A.; Lambrakos, S. G.

    2017-08-01

    As the U.S. Navy, Army, and Special Operations Forces progress towards fielding more advanced uniforms with multi-colored and highly detailed camouflage patterning, additional test methodologies are necessary for evaluating color in these types of camouflage textiles. The apparent color is the combination of all visible wavelengths (380-760 nm) of light reflected from large (≥1 m²) fabric samples at a given standoff distance (10-25 ft). Camouflage patterns lose resolution with increasing standoff distance, and eventually all colors within the pattern appear monotone (the "apparent color" of the pattern). This paper presents an apparent color prediction model that can be used for evaluation of camouflage fabrics.

  17. Precise mapping of the magnetic field in the CMS barrel yoke using cosmic rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatrchyan, S., et al.

    2010-03-01

    The CMS detector is designed around a large 4 T superconducting solenoid, enclosed in a 12000-tonne steel return yoke. A detailed map of the magnetic field is required for the accurate simulation and reconstruction of physics events in the CMS detector, not only in the inner tracking region inside the solenoid but also in the large and complex structure of the steel yoke, which is instrumented with muon chambers. Using a large sample of cosmic muon events collected by CMS in 2008, the field in the steel of the barrel yoke has been determined with a precision of 3 to 8%, depending on the location.

  18. Contamination and effects in freshwater ditches resulting from an aerial application of cypermethrin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shires, S.W.; Bennett, D.

    1985-04-01

    Cypermethrin (Ripcord) was applied at 25 g ai ha-1 by fixed-wing aircraft to a large field (11.6 ha) of winter wheat bordered on three sides by drainage ditches. About 60% of the nominal application rate was deposited on the crop and about 6% (maximum) was deposited over the water surface. The amount of spray drift deposited upwind declined sharply with increasing distance from the treated field. Downwind, the spray drift was small but occurred over a much greater distance. Very low (0.03 micrograms liter-1 maximum) concentrations of cypermethrin were found in subsurface water samples, and these declined rapidly after spraying. Bioassay tests, using a sensitive indicator species, confirmed that only a small amount of cypermethrin contamination had occurred in the ditch adjacent to the downwind perimeter of the field. Frequent sampling of the zooplankton and macroinvertebrate fauna of the ditches indicated that there were no marked biological effects resulting from the cypermethrin application. Only a few air-breathing corixids and the highly susceptible water mites showed minor short-term reductions in abundance after spraying. No effects were observed on either caged or indigenous fish stocks and no significant residues of cypermethrin were found in fish tissues.

  19. Pure phase encode magnetic field gradient monitor.

    PubMed

    Han, Hui; MacGregor, Rodney P; Balcom, Bruce J

    2009-12-01

    Numerous methods have been developed to measure MRI gradient waveforms and k-space trajectories. The most promising new strategy appears to be magnetic field monitoring with RF microprobes. Multiple RF microprobes may record the magnetic field evolution associated with a wide variety of imaging pulse sequences. The method involves exciting one or more test samples and measuring the time evolution of magnetization through the FIDs. Two critical problems remain: the gradient waveform duration is limited by the sample T2*, while the k-space maxima are limited by gradient dephasing. The method presented is based on pure phase-encode FIDs and solves both problems, in addition to permitting high-strength gradient measurement. A small doped water phantom (1-3 mm droplet; T1, T2, T2* < 100 μs) within a microprobe is excited by a series of closely spaced broadband RF pulses, each followed by single-point FID acquisition. Two trial gradient waveforms have been chosen to illustrate the technique, neither of which could be measured by the conventional RF microprobe measurement. The first is an extended-duration gradient waveform, while the other illustrates the new method's ability to measure gradient waveforms with large net area and/or high amplitude. The new method is a point monitor with simple implementation and low-cost hardware requirements.

  20. Immobilization of copper flotation waste using red mud and clinoptilolite.

    PubMed

    Coruh, Semra

    2008-10-01

    The flash smelting process has been used in the copper industry for a number of years and has replaced most of the reverberatory applications, known as conventional copper smelting processes. Copper smelters produce large amounts of copper slag or copper flotation waste and the dumping of these quantities of copper slag causes economic, environmental and space problems. The aim of this study was to perform a laboratory investigation to assess the feasibility of immobilizing the heavy metals contained in copper flotation waste. For this purpose, samples of copper flotation waste were immobilized with relatively small proportions of red mud and large proportions of clinoptilolite. The results of laboratory leaching demonstrate that addition of red mud and clinoptilolite to the copper flotation waste drastically reduced the heavy metal content in the effluent and the red mud performed better than clinoptilolite. This study also compared the leaching behaviour of metals in copper flotation waste by short-time extraction tests such as the toxicity characteristic leaching procedure (TCLP), deionized water (DI) and field leach test (FLT). The results of leach tests showed that the results of the FLT and DI methods were close and generally lower than those of the TCLP methods.

  1. Development of an automated asbestos counting software based on fluorescence microscopy.

    PubMed

    Alexandrov, Maxym; Ichida, Etsuko; Nishimura, Tomoki; Aoki, Kousuke; Ishida, Takenori; Hirota, Ryuichi; Ikeda, Takeshi; Kawasaki, Tetsuo; Kuroda, Akio

    2015-01-01

    An emerging alternative to the commonly used analytical methods for asbestos analysis is fluorescence microscopy (FM), which relies on highly specific asbestos-binding probes to distinguish asbestos from interfering non-asbestos fibers. However, all types of microscopic asbestos analysis require laborious examination of a large number of fields of view and are prone to subjective errors and large variability between asbestos counts by different analysts and laboratories. A possible solution to these problems is automated counting of asbestos fibers by image analysis software, which would lower the cost and increase the reliability of asbestos testing. This study seeks to develop fiber recognition and counting software for FM-based asbestos analysis. We discuss the main features of the developed software and the results of its testing. Software testing showed good correlation between automated and manual counts for samples with medium and high fiber concentrations. At low fiber concentrations, the automated counts were less accurate, leading us to implement a correction mode for automated counts. While full automation of asbestos analysis would require further improvements in the accuracy of fiber identification, the developed software can already assist professional asbestos analysts and record detailed fiber dimensions for use in epidemiological research.

  2. Field test to intercompare carbon monoxide, nitric oxide and hydroxyl instrumentation at Wallops Island, Virginia

    NASA Technical Reports Server (NTRS)

    Gregory, Gerald L.; Beck, Sherwin M.; Bendura, Richard J.

    1987-01-01

    Documentation of the first of three instrument intercomparisons conducted as part of the NASA Global Tropospheric Experiment/Chemical Instrumentation Test and Evaluation (GTE/CITE-1) is given. This ground-based intercomparison was conducted during July 1983 at the NASA Wallops Flight Facility. Instruments intercompared included one laser system and three grab-sample approaches for CO; two chemiluminescent systems and one laser-induced fluorescence (LIF) technique for NO; and two different LIF systems and a radiochemical tracer technique for OH. The major objectives of this intercomparison were to compare ambient measurements of CO, NO, and OH at a common site using techniques with fundamentally different detection principles and to identify any major biases among the techniques prior to intercomparison on an aircraft platform. Included in the report are comprehensive discussions of workshop requirements, philosophies, and operations, as well as intercomparison analyses and results. In addition, the large body of non-intercomparison data incorporated into the workshop measurements is summarized. The report is an important source document for those interested in conducting similarly large and complex intercomparison tests, as well as those interested in using the data base for purposes other than instrument intercomparison.

  3. High-pressure swing system for measurements of radioactive fission gases in air samples

    NASA Astrophysics Data System (ADS)

    Schell, W. R.; Vives-Battle, J.; Yoon, S. R.; Tobin, M. J.

    1999-01-01

    Radionuclides emitted from nuclear reactors, fuel reprocessing facilities and nuclear weapons tests are distributed widely in the atmosphere but have very low concentrations. As part of the Comprehensive Test Ban Treaty (CTBT), identification and verification of the emission of radionuclides from such sources are fundamental in maintaining nuclear security. To detect underground and underwater nuclear weapons tests, only the gaseous components need to be analyzed. Equipment has now been developed that can be used to collect large volumes of air, separate and concentrate the radioactive gas constituents, such as xenon and krypton, and measure them quantitatively. By measuring xenon isotopes with different half-lives, the time since the fission event can be determined. Developments in high-pressure (3500 kPa) swing chromatography using molecular sieve adsorbents have provided the means to collect and purify trace quantities of the gases from large volumes of air automatically. New scintillation detectors, together with timing and pulse shaping electronics, have provided the low-background levels essential in identifying the gamma ray, X-ray, and electron energy spectra of specific radionuclides. System miniaturization and portability with remote control could be designed for a field-deployable production model.
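
    The dating principle mentioned above (xenon isotopes with different half-lives fixing the time since the fission event) can be sketched numerically. This is an illustration only, not part of the cited system: the half-lives are nominal literature values, and the initial activity ratio is an assumed input.

```python
import math

# Half-lives of two fission-product xenon isotopes (nominal literature
# values, used here only for illustration): Xe-135 ~9.14 h, Xe-133 ~5.25 d.
T_HALF_XE135_H = 9.14
T_HALF_XE133_H = 5.25 * 24.0

def decay_constant(t_half_h):
    """Decay constant (1/h) from half-life (h)."""
    return math.log(2) / t_half_h

def time_since_event(ratio_now, ratio_at_t0):
    """Elapsed time (h) inferred from the Xe-135/Xe-133 activity ratio.

    Because Xe-135 decays faster than Xe-133, the ratio falls exponentially:
        R(t) = R0 * exp(-(lam135 - lam133) * t)
    so t = ln(R0 / R(t)) / (lam135 - lam133).
    """
    lam135 = decay_constant(T_HALF_XE135_H)
    lam133 = decay_constant(T_HALF_XE133_H)
    return math.log(ratio_at_t0 / ratio_now) / (lam135 - lam133)

# Example: the measured ratio has dropped to 10% of its assumed initial
# value, placing the fission event roughly a day and a half in the past.
t = time_since_event(ratio_now=0.1, ratio_at_t0=1.0)
```

    The initial ratio R0 would in practice come from known fission yields; here it is simply assumed to be 1.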

  4. Reverberation Chamber Uniformity Validation and Radiated Susceptibility Test Procedures for the NASA High Intensity Radiated Fields Laboratory

    NASA Technical Reports Server (NTRS)

    Koppen, Sandra V.; Nguyen, Truong X.; Mielnik, John J.

    2010-01-01

    The NASA Langley Research Center's High Intensity Radiated Fields Laboratory has developed a capability, based on the RTCA/DO-160F Section 20 guidelines, for radiated electromagnetic susceptibility testing in reverberation chambers. Phase 1 of the test procedure utilizes mode-tuned stirrer techniques and E-field probe measurements to validate chamber uniformity, determine chamber loading effects, and define a radiated susceptibility test process. The test procedure is segmented into numbered operations that are largely software controlled. This document is intended as a laboratory test reference and includes diagrams of test setups, equipment lists, and test results and analysis. Phase 2 of development is discussed.

  5. Cluster Lensing with the BTC

    NASA Astrophysics Data System (ADS)

    Fischer, P.

    1997-12-01

    Weak distortions of background galaxies are rapidly emerging as a powerful tool for the measurement of galaxy cluster mass distributions. Lensing-based studies have the advantage of being direct measurements of mass and are not model-dependent as are other techniques (X-ray, radial velocities). To date, studies have been limited by CCD field size, meaning that full coverage of the clusters out to the virial radii and beyond has not been possible. Probing this large-radius region is essential for testing models of large-scale structure formation. New wide-field CCD mosaics, for the first time, allow mass measurements out to very large radius. We have obtained images for a sample of clusters with the "Big Throughput Camera" (BTC) on the CTIO 4m. This camera comprises four thinned SITe 2048 × 2048 CCDs, each 15 arcmin on a side, for a total area of one quarter of a square degree. We have developed an automated reduction pipeline which: 1) corrects for spatial distortions, 2) corrects for PSF anisotropy, 3) determines relative scaling and background levels, and 4) combines multiple exposures. In this poster we present some preliminary results of our cluster lensing study, including radial mass and light profiles and 2-d mass and galaxy density maps.

  6. Utilizing the ultrasensitive Schistosoma up-converting phosphor lateral flow circulating anodic antigen (UCP-LF CAA) assay for sample pooling-strategies.

    PubMed

    Corstjens, Paul L A M; Hoekstra, Pytsje T; de Dood, Claudia J; van Dam, Govert J

    2017-11-01

    Methodological applications of the highly sensitive genus-specific Schistosoma CAA strip test, which allows detection of single-worm active infections (ultimate sensitivity), are discussed for efficient utilization in sample pooling strategies. Besides substantial cost reduction, pooling of samples rather than individual testing can provide valuable data for large-scale mapping, surveillance, and monitoring. The laboratory-based CAA strip test utilizes luminescent quantitative up-converting phosphor (UCP) reporter particles and a rapid, user-friendly lateral flow (LF) assay format. The test includes a sample preparation step that permits virtually unlimited sample concentration with urine, reaching ultimate sensitivity (single-worm detection) at 100% specificity. This facilitates testing large urine pools from many individuals with minimal loss of sensitivity and specificity. The test determines the average CAA level of the individuals in the pool, thus indicating overall worm burden and prevalence. When test results are required at the individual level, smaller pools need to be analysed, with the pool size based on the expected prevalence or, when unknown, on the average CAA level of a larger group; CAA-negative pools do not require individual testing and thus reduce the number of tests. Straightforward pooling strategies indicate that at the sub-population level the CAA strip test is an efficient assay for general mapping, identification of hotspots, determination of stratified infection levels, and accurate monitoring of mass drug administration (MDA). At the individual level, the number of tests can be reduced, e.g. in low-endemic settings, because the pool size can be increased as prevalence decreases. At the sub-population level, average CAA concentrations determined in urine pools can be an appropriate measure of worm burden. Pooling strategies allowing this type of large-scale testing are feasible with the various CAA strip test formats and do not affect sensitivity and specificity. They allow cost-efficient stratified testing and monitoring of worm burden at the sub-population level, ideally for large-scale surveillance generating hard data on the performance of MDA programs and for strategic planning when moving towards transmission interruption and elimination.
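
    The pooling economics described above can be illustrated with the classic two-stage (Dorfman) scheme. The abstract does not give formulas, so this is a sketch under idealized assumptions: a perfectly sensitive and specific test per pool, which the CAA concentration step is said to approximate.

```python
def expected_tests_per_person(p, n):
    """Expected tests per individual under two-stage Dorfman pooling:
    one pooled test shared by n people, plus n individual retests
    whenever the pool is positive (probability 1 - (1-p)^n at prevalence p)."""
    return 1.0 / n + 1.0 - (1.0 - p) ** n

def best_pool_size(p, max_n=100):
    """Pool size minimizing expected tests per person at prevalence p."""
    return min(range(2, max_n + 1), key=lambda n: expected_tests_per_person(p, n))

# Lower prevalence favors larger pools and fewer tests per person,
# matching the abstract's point about low-endemic settings.
n_low = best_pool_size(0.01)    # 1% prevalence -> large optimal pool
n_high = best_pool_size(0.10)   # 10% prevalence -> small optimal pool
```

    At 1% prevalence the optimal pool needs only about a fifth of a test per person, a roughly five-fold saving over individual testing.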

  7. 75 FR 38774 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-06

    ... design research as part of testing for its censuses and surveys. At this time, the Census Bureau is... follows: Field test, Respondent debriefing questionnaire, Split sample experiments, Cognitive interviews... each round will be provided separately. When split sample experiments are conducted, either in small...

  8. Test plan for evaluating the operational performance of the prototype nested, fixed-depth fluidic sampler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    REICH, F.R.

    The PHMC will provide Low Activity Waste (LAW) tank wastes for final treatment by a privatization contractor from two double-shell feed tanks, 241-AP-102 and 241-AP-104. Concerns about the inability of the baseline "grab" sampling to provide large-volume samples within time constraints have led to the development of a nested, fixed-depth sampling system. This sampling system will provide large-volume, representative samples without the environmental, radiation-exposure, and sample-volume impacts of the current baseline "grab" sampling method. A plan has been developed for the cold testing of this nested, fixed-depth sampling system with simulant materials. The sampling system will fill 500-ml bottles and provide inner packaging to interface with the Hanford Site's cask shipping systems (PAS-1 and/or "safe-send"). The sampling system will provide a waste stream that will be used for on-line, real-time measurements with an at-tank analysis system. The cold tests evaluate the performance and the ability to provide samples that are representative of the tanks' content within a 95 percent confidence interval, to sample while mixing pumps are operating, to provide large sample volumes (1-15 liters) within a short time interval, to sample supernatant wastes with over 25 wt% solids content, to recover from precipitation- and settling-based plugging, and the potential to operate over the 20-year expected time span of the privatization contract.

  9. Rapid wide-field Mueller matrix polarimetry imaging based on four photoelastic modulators with no moving parts.

    PubMed

    Alali, Sanaz; Gribble, Adam; Vitkin, I Alex

    2016-03-01

    A new polarimetry method is demonstrated to image the entire Mueller matrix of a turbid sample using four photoelastic modulators (PEMs) and a charge coupled device (CCD) camera, with no moving parts. Accurate wide-field imaging is enabled with a field-programmable gate array (FPGA) optical gating technique and an evolutionary algorithm (EA) that optimizes imaging times. This technique accurately and rapidly measured the Mueller matrices of air, polarization elements, and turbid phantoms. The system should prove advantageous for Mueller matrix analysis of turbid samples (e.g., biological tissues) over large fields of view, in less than a second.

  10. Free flux flow in two single crystals of V3Si with differing pinning strengths

    NASA Astrophysics Data System (ADS)

    Gafarov, O.; Gapud, A. A.; Moraes, S.; Thompson, J. R.; Christen, D. K.; Reyes, A. P.

    2011-10-01

    Results of measurements on two very clean, single-crystal samples of the A15 superconductor V3Si are presented. Magnetization and transport data have confirmed the "clean" quality of both samples, as manifested by: (i) a high residual electrical resistivity ratio, (ii) very low critical current densities Jc, and (iii) a "peak" effect in the field dependence of the critical current. The (H,T) phase line for this peak effect is shifted down for the slightly "dirtier" sample, which consequently also has a higher critical current density Jc(H). Large Lorentz forces are applied to mixed-state vortices via large currents in order to induce the highly ordered free flux flow (FFF) phase, using experimental methods developed previously. The traditional model by Bardeen and Stephen (BS) predicts a simple field dependence of the flux-flow resistivity, ρf(H) ∼ H/Hc2, presuming a field-independent flux-core size. A model by Kogan and Zelezhina (KZ) takes into account the effect of the magnetic field on core size and predicts a clear deviation from the linear BS dependence. In this study, ρf(H) is confirmed to be consistent with the predictions of KZ.
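
    The linear Bardeen-Stephen form quoted above can be sketched in a few lines. The normal-state resistivity and upper critical field below are placeholder values, not measured parameters for these V3Si crystals.

```python
# Bardeen-Stephen estimate of flux-flow resistivity: rho_f ~ rho_n * H / Hc2,
# valid under the assumption of a field-independent vortex-core size.
RHO_N = 1.0e-7   # normal-state resistivity, ohm*m (assumed placeholder)
HC2 = 20.0       # upper critical field, tesla (assumed placeholder)

def rho_flux_flow_bs(h_tesla):
    """Linear Bardeen-Stephen flux-flow resistivity for 0 <= H <= Hc2."""
    if not 0.0 <= h_tesla <= HC2:
        raise ValueError("BS form applies only for 0 <= H <= Hc2")
    return RHO_N * h_tesla / HC2

# The Kogan-Zelezhina correction makes the core size field dependent,
# bending rho_f(H) away from this straight line at intermediate fields;
# both forms must recover the normal-state resistivity at H = Hc2.
```

    At H = Hc2 the expression returns the normal-state resistivity, which is the physical boundary condition both models share.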

  11. Threshold Theory Tested in an Organizational Setting: The Relation between Perceived Innovativeness and Intelligence in a Large Sample of Leaders

    ERIC Educational Resources Information Center

    Christensen, Bo T.; Hartmann, Peter V. W.; Rasmussen, Thomas Hedegaard

    2017-01-01

    A large sample of leaders (N = 4257) was used to test the link between leader innovativeness and intelligence. The threshold theory of the link between creativity and intelligence assumes that below a certain IQ level (approximately IQ 120), there is some correlation between IQ and creative potential, but above this cutoff point, there is no…

  12. Rheological properties of isotropic magnetorheological elastomers featuring an epoxidized natural rubber

    NASA Astrophysics Data System (ADS)

    Azhani Yunus, Nurul; Amri Mazlan, Saiful; Ubaidillah; Choi, Seung-Bok; Imaduddin, Fitrian; Aziz, Siti Aishah Abdul; Khairi, Muntaz Hana Ahmad

    2016-10-01

    This study presents the principal field-dependent rheological properties of magnetorheological elastomers (MREs) in which an epoxidized natural rubber (ENR) is adopted as the matrix (in short, ENR-based MREs). Isotropic ENR-based MRE samples are fabricated by mixing the ENR compound with carbonyl iron particles (CIPs) at different weight percentages. The morphological properties of the samples are first analysed using microstructure assessment. The influence of the magnetic field on the viscoelastic properties of the ENR-based MREs is then examined through dynamic tests under various excitation frequencies. The microstructure of the MRE samples exhibits a homogeneous distribution of CIPs in the ENR matrix. A dramatic increase in the storage modulus, loss modulus and loss tangent of the ENR-based MREs is also observed in the field-dependent rheological tests, directly demonstrating that the stiffness and damping properties of the samples can be adjusted by the magnetic field. It is also seen that the CIP content, excitation frequency and magnetic field essentially influence the dynamic properties of the ENR-based MREs. The strong correlation between the magnetization and the magneto-induced storage modulus could serve as useful guidance in synthesizing ENR-based MREs for particular applications.

  13. Modification of a Pollen Trap Design To Capture Airborne Conidia of Entomophaga maimaiga and Detection of Conidia by Quantitative PCR.

    PubMed

    Bittner, Tonya D; Hajek, Ann E; Liebhold, Andrew M; Thistle, Harold

    2017-09-01

    The goal of this study was to develop effective and practical field sampling methods for quantification of aerial deposition of airborne conidia of Entomophaga maimaiga over space and time. This important fungal pathogen is a major cause of larval death in invasive gypsy moth (Lymantria dispar) populations in the United States. Airborne conidia of this pathogen are relatively large (similar in size to pollen), with unusual characteristics, and require specialized methods for collection and quantification. Initially, dry sampling (settling of spores from the air onto a dry surface) was used to confirm the detectability of E. maimaiga at field sites with L. dispar deaths caused by E. maimaiga, using quantitative PCR (qPCR) methods. We then measured the signal degradation of conidial DNA on dry surfaces under field conditions, ultimately rejecting dry sampling as a reliable method due to rapid DNA degradation. We modified a chamber-style trap commonly used in palynology to capture settling spores in buffer. We tested this wet-trapping method in a large-scale (137-km) spore-trapping survey across gypsy moth outbreak regions in Pennsylvania undergoing epizootics in the summer of 2016. Using 4-day collection periods during late instar and pupal development, we detected variable amounts of target DNA settling from the air. The amounts declined over the season and with distance from the nearest defoliated area, indicating airborne spore dispersal from outbreak areas. IMPORTANCE We report on a method for trapping and quantifying airborne spores of Entomophaga maimaiga, an important fungal pathogen affecting gypsy moth (Lymantria dispar) populations. This method can be used to track dispersal of E. maimaiga from epizootic areas and ultimately to provide critical understanding of the spatial dynamics of gypsy moth-pathogen interactions. Copyright © 2017 American Society for Microbiology.

  14. The Relationship between Sample Sizes and Effect Sizes in Systematic Reviews in Education

    ERIC Educational Resources Information Center

    Slavin, Robert; Smith, Dewi

    2009-01-01

    Research in fields other than education has found that studies with small sample sizes tend to have larger effect sizes than those with large samples. This article examines the relationship between sample size and effect size in education. It analyzes data from 185 studies of elementary and secondary mathematics programs that met the standards of…
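
    The inverse relationship between sample size and reported effect size described above follows from standard power analysis: with a small n, only large effects reach significance. As an illustration (a normal-approximation formula, not taken from the reviewed studies):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size_d, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided, two-sample z-test
    to detect standardized effect size d:
        n ~ 2 * (z_{1-alpha/2} + z_{power})^2 / d^2
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    return ceil(2 * (z_alpha + z_power) ** 2 / effect_size_d ** 2)

# A small effect needs a far larger sample than a large one, which is
# why small-n studies can only report large effects as significant.
n_small_effect = n_per_group(0.2)  # Cohen's "small" d
n_large_effect = n_per_group(0.8)  # Cohen's "large" d
```

    The z-test approximation slightly understates the exact t-test requirement, but the orders of magnitude (hundreds per group for d = 0.2 versus a few dozen for d = 0.8) are what matter here.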

  15. M-H characteristics and demagnetization resistance of samarium-cobalt permanent magnets to 300 C

    NASA Technical Reports Server (NTRS)

    Niedra, Janis M.

    1992-01-01

    The influence of temperature on the M-H demagnetization characteristics of permanent magnets is important information for full utilization of the capabilities of samarium-cobalt magnets at high temperatures in demagnetization-resistant permanent magnet devices. In high-temperature space power converters, such as free-piston Stirling engine driven linear alternators, magnet demagnetization can occur as a long-term consequence of thermal agitation of domains and of metallurgical change, and also as an immediate consequence of too large an applied field. Investigated here is the short-term demagnetization resistance to applied fields, derived from basic M-H data. These quasistatic demagnetization data were obtained for commercial, high-intrinsic-coercivity, Sm2Co17-type magnets from 5 sources, in the temperature range 23 to 300 C. An electromagnet-driven electronic hysteresigraph was used to test the 1-cm cubic samples. The observed variation of the 2nd-quadrant M-H characteristics was a typically rapid loss of M-coercivity and a relatively smaller loss of remanence with increasing temperature. The knee point of the 2nd-quadrant M-H curve is used to define the limits of operation safe against irreversible demagnetization due to an excessive bucking field for a given flux density swing at temperature. Such safe-operating-area plots are shown to differentiate the high-temperature capabilities of the samples from the different sources. For most of the samples, 2nd-quadrant M-H loop squareness increased with temperature, reaching a peak or a plateau above 250 C.

  16. Personal medical electronic devices and walk-through metal detector security systems: assessing electromagnetic interference effects.

    PubMed

    Guag, Joshua; Addissie, Bisrat; Witters, Donald

    2017-03-20

    There have been concerns that electromagnetic security systems such as walk-through metal detectors (WTMDs) can cause electromagnetic interference (EMI) in certain active medical devices, including implantable cardiac pacemakers and implantable neurostimulators. Incidents of EMI between WTMDs and active medical devices, also known as personal medical electronic devices (PMEDs), continue to be reported. This paper reports on emission measurements of sample WTMDs and testing of 20 PMEDs in a WTMD simulation system. Magnetic fields from sample WTMD systems were characterized for emissions and exposure of certain PMEDs. A WTMD simulator system designed and evaluated by FDA in previous studies was used to mimic PMED exposures to the waveforms from sample WTMDs. The simulation system allows for controlled PMED exposure, enabling careful study with adjustable magnetic field strengths and exposure durations, and provides flexibility for PMED exposure at elevated levels in order to study EMI effects on the PMED. The PMED samples consisted of six implantable cardiac pacemakers, six implantable cardioverter defibrillators (ICDs), five implantable neurostimulators, and three insulin pumps. Each PMED was exposed in the simulator to the sample WTMD waveforms using methods based on the appropriate consensus test standards for each device type. Testing the sample PMEDs using the WTMD simulator revealed EMI effects on two implantable pacemakers and one implantable neurostimulator at exposure field strengths comparable to actual WTMD field strengths. The observed effects were transient, and the PMEDs returned to pre-exposure operation within a few seconds after removal from the simulated WTMD exposure fields. No EMI was observed for the sample ICDs or insulin pumps. The findings are consistent with earlier studies in which certain sample PMEDs exhibited EMI effects. Clinical implications were not addressed in this study. Additional studies are needed to evaluate potential PMED EMI susceptibilities over a broader range of security systems.

  17. Confirmatory analysis of field-presumptive GSR test sample using SEM/EDS

    NASA Astrophysics Data System (ADS)

    Toal, Sarah J.; Niemeyer, Wayne D.; Conte, Sean; Montgomery, Daniel D.; Erikson, Gregory S.

    2014-09-01

    RedXDefense has developed an automated red-light/green-light field presumptive lead test using a sampling pad which can be subsequently processed in a scanning electron microscope for GSR confirmation. The XCAT's sampling card is used to acquire a sample from a suspect's hands on the scene and give investigators an immediate presumptive result as to the presence of lead, possibly from primer residue. Positive results can be obtained after firing as little as one shot. The same sampling card can then be sent to a crime lab and processed on the SEM for GSR following ASTM E-1588-10 Standard Guide for Gunshot Residue Analysis by Scanning Electron Microscopy/Energy Dispersive X-Ray Spectrometry, in the same manner as the existing tape lifts currently used in the field. Detection of GSR-characteristic particles (fused lead, barium, and antimony) as small as 0.8 microns (0.5 micron resolution) has been achieved using a JEOL JSM-6480LV SEM equipped with an Oxford Instruments INCA EDS system with a 50 mm² SDD detector, 350X magnification, in low-vacuum mode and in high-vacuum mode after coating with carbon in a sputter coater. GSR particles remain stable on the sampling pad for a minimum of two months after chemical exposure (long-term stability tests are in progress). The presumptive result provided by the XCAT yields immediate actionable intelligence to law enforcement to facilitate their investigation, without compromising the confirmatory test necessary to further support the investigation and legal case.

  18. Psychometric properties of the Chinese-version Quality of Nursing Work Life Scale.

    PubMed

    Lee, Ya-Wen; Dai, Yu-Tzu; McCreary, Linda L; Yao, Grace; Brooks, Beth A

    2014-09-01

    In this study, we developed and tested the psychometric properties of the Chinese-version Quality of Nursing Work Life Scale along seven subscales: supportive milieu with security and professional recognition, work arrangement and workload, work/home life balance, head nurse's/supervisor's management style, teamwork and communication, nursing staffing and patient care, and milieu of respect and autonomy. An instrument-development procedure with three phases was conducted in seven hospitals in 2010-2011. Phase I comprised translation and the cultural-adaptation process, phase II comprised a pilot study, and phase III comprised a field-testing process. Purposive sampling was used in the pilot study (n = 150) and the large field study (n = 1254). Five new items were added, and 85.7% of the original items were retained in the 41-item Chinese version. Principal component analysis revealed that a model accounted for 56.6% of the variance with acceptable internal consistency, concurrent validity, and discriminant validity. This study provided evidence of the reliability and validity of the 41-item Chinese-version Quality of Nursing Work Life Scale. © 2014 Wiley Publishing Asia Pty Ltd.

  19. The Washback Effect of Konkoor on Teachers' Attitudes toward Their Teaching

    ERIC Educational Resources Information Center

    Birjandi, Parviz; Shirkhani, Servat

    2012-01-01

    Large scale tests have been considered by many scholars in the field of language testing and teaching to influence teaching and learning considerably. The present study looks at the effect of a large scale test (Konkoor) on the attitudes of teachers in high schools. Konkoor is the university entrance examination in Iran which is taken by at least…

  20. Large-Scale Academic Achievement Testing of Deaf and Hard-of-Hearing Students: Past, Present, and Future

    ERIC Educational Resources Information Center

    Qi, Sen; Mitchell, Ross E.

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the…

  1. The Use of Handheld X-Ray Fluorescence (XRF) Technology in Unraveling the Eruptive History of the San Francisco Volcanic Field, Arizona

    NASA Technical Reports Server (NTRS)

    Young, Kelsey E.; Evans, C. A.; Hodges, K. V.

    2012-01-01

    While traditional geologic mapping includes the examination of structural relationships between rock units in the field, more advanced technology now enables us to simultaneously collect and combine analytical datasets with field observations. Information about tectonomagmatic processes can be gleaned from these combined data products. Historically, construction of multi-layered field maps that include sample data has been accomplished serially (first map and collect samples, analyze samples, combine data, and finally, readjust maps and conclusions about geologic history based on combined data sets). New instruments that can be used in the field, such as a handheld x-ray fluorescence (XRF) unit, are now available. Targeted use of such instruments enables geologists to collect preliminary geochemical data while in the field so that they can optimize scientific data return from each field traverse. Our study tests the application of this technology and projects the benefits gained by real-time geochemical data in the field. The integrated data set produces a richer geologic map and facilitates a stronger contextual picture for field geologists when collecting field observations and samples for future laboratory work. Real-time geochemical data on samples also provide valuable insight regarding sampling decisions by the field geologist.

  2. Polarized neutron imaging and three-dimensional calculation of magnetic flux trapping in bulk of superconductors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Treimer, Wolfgang; Ebrahimi, Omid; Karakas, Nursel

    Polarized neutron radiography was used to study the three-dimensional magnetic flux distribution inside of single-crystal and polycrystalline Pb cylinders with large (cm³) volume and virtually zero demagnetization. Experiments with single crystals being in the Meissner phase (T

  3. Towards wide-angle neutron polarization analysis with a 3He spin filter for TOPAS and NEAT: Testing magic PASTIS on V20 at HZB

    NASA Astrophysics Data System (ADS)

    Babcock, Earl; Salhi, Zahir; Gainov, Ramil; Woracek, Robin; Soltner, Helmut; Pistel, Patrick; Beule, Fabian; Bussmann, Klaus; Heynen, Achim; Kämmerling, Hans; Suxdorf, Frank; Strobl, Marcus; Russina, Margarita; Voigt, Jörg; Ioffe, Alexander

    2018-05-01

    An XYZ polarization analysis solution has been developed for the new thermal time-of-flight spectrometer TOPAS [1], to be operated in the coming east neutron guide hall at the MLZ. This prototype is currently being prepared to use on NEAT at HZB [2]. Polarization Analysis Studies on a Thermal Inelastic Spectrometer, commonly called PASTIS [3], is based on polarized 3He neutron spin filters and an XYZ field configuration for the sample environment and a polarization-preserving neutron guide field. The complete system was designed to provide adiabatic transport of the neutron polarization to the sample position on TOPAS while maintaining the homogeneity of the XYZ field. This complete system has now been tested on the polarized time-of-flight ESS test beam line V20 at HZB [4]. We present results of this test and the next steps forward.

  4. Galaxy groups in the low-redshift Universe

    NASA Astrophysics Data System (ADS)

    Lim, S. H.; Mo, H. J.; Lu, Yi; Wang, Huiyuan; Yang, Xiaohu

    2017-09-01

    We apply a halo-based group finder to four large redshift surveys, the 2MRS (Two Micron All-Sky Redshift Survey), 6dFGS (Six-degree Field Galaxy Survey), SDSS (Sloan Digital Sky Survey) and 2dFGRS (Two-degree Field Galaxy Redshift Survey), to construct group catalogues in the low-redshift Universe. The group finder is based on that of Yang et al. but with an improved halo mass assignment so that it can be applied uniformly to various redshift surveys of galaxies. Halo masses are assigned to groups according to proxies based on the stellar mass/luminosity of member galaxies. The performances of the group finder in grouping galaxies according to common haloes and in halo mass assignments are tested using realistic mock samples constructed from hydrodynamical simulations and empirical models of galaxy occupation in dark matter haloes. Our group finder finds ∼94 per cent of the correct true member galaxies for 90-95 per cent of the groups in the mock samples; the halo masses assigned by the group finder are un-biased with respect to the true halo masses, and have a typical uncertainty of ∼0.2 dex. The properties of group catalogues constructed from the observational samples are described and compared with other similar catalogues in the literature.
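
The mock-sample performance figures quoted above (∼94 per cent of true members recovered for 90-95 per cent of groups) are per-group completeness statistics. A minimal sketch of such a check, assuming hypothetical galaxy-to-halo labels rather than the authors' actual mock catalogues:

```python
from collections import defaultdict

def group_completeness(true_groups, found_groups):
    """For each found group, the fraction of its members that belong to
    its best-matching true halo (a simple completeness proxy)."""
    scores = {}
    for gid, members in found_groups.items():
        counts = defaultdict(int)
        for gal in members:
            counts[true_groups[gal]] += 1  # tally true-halo labels
        scores[gid] = max(counts.values()) / len(members)
    return scores

# toy example: two true haloes "A" and "B"; the finder mixes one galaxy
true = {1: "A", 2: "A", 3: "A", 4: "B", 5: "B"}
found = {"g1": [1, 2, 3, 4], "g2": [5]}
scores = group_completeness(true, found)  # g1: 0.75, g2: 1.0
```

In a real test the same statistic would be computed over all mock groups and summarized as the fraction of groups exceeding a completeness threshold.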

  5. High explosive spot test analyses of samples from Operable Unit (OU) 1111

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McRae, D.; Haywood, W.; Powell, J.

    1995-01-01

    A preliminary evaluation has been completed of environmental contaminants at selected sites within the Group DX-10 (formerly Group M-7) area. Soil samples taken from specific locations at this detonator facility were analyzed for harmful metals and screened for explosives. A sanitary outflow, a burn pit, a pentaerythritol tetranitrate (PETN) production outflow field, an active firing chamber, an inactive firing chamber, and a leach field were sampled. Energy dispersive x-ray fluorescence (EDXRF) was used to obtain semi-quantitative concentrations of metals in the soil. Two field spot-test kits for explosives were used to assess the presence of energetic materials in the soil and in items found at the areas tested. PETN is the major explosive in detonators manufactured and destroyed at Los Alamos. No measurable amounts of PETN or other explosives were detected in the soil, but items taken from the burn area and a high-energy explosive (HE)/chemical sump were contaminated. The concentrations of lead, mercury, and uranium are given.

  6. When the test of mediation is more powerful than the test of the total effect.

    PubMed

    O'Rourke, Holly P; MacKinnon, David P

    2015-06-01

    Although previous research has studied power in mediation models, the extent to which the inclusion of a mediator will increase power has not been investigated. To address this deficit, in a first study we compared the analytical power values of the mediated effect and the total effect in a single-mediator model, to identify the situations in which the inclusion of one mediator increased statistical power. The results from this first study indicated that including a mediator increased statistical power in small samples with large coefficients and in large samples with small coefficients, and when coefficients were nonzero and equal across models. Next, we identified conditions under which power was greater for the test of the total mediated effect than for the test of the total effect in the parallel two-mediator model. These results indicated that including two mediators increased power in small samples with large coefficients and in large samples with small coefficients, the same pattern of results that had been found in the first study. Finally, we assessed the analytical power for a sequential (three-path) two-mediator model and compared the power to detect the three-path mediated effect to the power to detect both the test of the total effect and the test of the mediated effect for the single-mediator model. The results indicated that the three-path mediated effect had more power than the mediated effect from the single-mediator model and the test of the total effect. Practical implications of these results for researchers are then discussed.
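
The power comparison described above can be illustrated with a crude normal-approximation sketch for the single-mediator model (total effect c = c' + ab; mediated effect ab tested with the first-order Sobel standard error). The unit-variance standard errors below are simplifying assumptions for illustration, not the authors' analytical derivation:

```python
from math import sqrt, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def power_z(effect, se, z_crit=1.96):
    """Approximate two-sided power of a z-test of effect/se."""
    z = effect / se
    return norm_cdf(-z_crit - z) + 1 - norm_cdf(z_crit - z)

def mediation_vs_total(a, b, c_prime, n):
    """Power of the Sobel test of ab vs. the test of c = c' + ab,
    assuming standardized variables and unit residual variances."""
    se_a = se_b = se_c = 1 / sqrt(n)               # crude simplification
    se_ab = sqrt(a**2 * se_b**2 + b**2 * se_a**2)  # first-order Sobel SE
    return power_z(a * b, se_ab), power_z(c_prime + a * b, se_c)

# small sample, large coefficients: the mediated-effect test is more powerful
p_med, p_tot = mediation_vs_total(a=0.5, b=0.5, c_prime=0.0, n=50)
```

Under these assumptions the mediated-effect power exceeds the total-effect power, matching the pattern the study reports for small samples with large coefficients.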

  7. Sensitive and Rapid Detection of Viable Giardia Cysts and Cryptosporidium parvum Oocysts in Large-Volume Water Samples with Wound Fiberglass Cartridge Filters and Reverse Transcription-PCR

    PubMed Central

    Kaucner, Christine; Stinear, Timothy

    1998-01-01

    We recently described a reverse transcription-PCR (RT-PCR) for detecting low numbers of viable Cryptosporidium parvum oocysts spiked into clarified environmental water concentrates. We have now modified the assay for direct analysis of primary sample concentrates with simultaneous detection of viable C. parvum oocysts, Giardia cysts, and a novel type of internal positive control (IPC). The IPC was designed to assess both efficiency of mRNA isolation and potential RT-PCR inhibition. Sensitivity testing showed that low numbers of organisms, in the range of a single viable cyst and oocyst, could be detected when spiked into 100-μl packed pellet volumes of concentrates from creek and river water samples. The RT-PCR was compared with an immunofluorescence (IF) assay by analyzing 29 nonspiked environmental water samples. Sample volumes of 20 to 1,500 liters were concentrated with a wound fiberglass cartridge filter. Frequency of detection for viable Giardia cysts increased from 24% by IF microscopy to 69% by RT-PCR. Viable C. parvum oocysts were detected only once by RT-PCR (3%) in contrast to detection of viable Cryptosporidium spp. in four samples by IF microscopy (14%), suggesting that Cryptosporidium species other than C. parvum were present in the water. This combination of the large-volume sampling method with RT-PCR represents a significant advance in terms of protozoan pathogen monitoring and in the wider application of PCR technology to this field of microbiology. PMID:9572946

  8. The T dwarf population in the UKIDSS LAS .

    NASA Astrophysics Data System (ADS)

    Cardoso, C. V.; Burningham, B.; Smith, L.; Smart, R.; Pinfield, D.; Magazzù, A.; Ghinassi, F.; Lattanzi, M.

    We present the most recent results from the UKIDSS Large Area Survey (LAS) census and follow up of new T brown dwarfs in the local field. The new brown dwarf candidates are identified using optical and infrared survey photometry (UKIDSS and SDSS) and followed up with narrow band methane photometry (TNG) and spectroscopy (Gemini and Subaru) to confirm their brown dwarf nature. Employing this procedure we have discovered several dozen new T brown dwarfs in the field. Using methane differential photometry as a proxy for spectral type for T brown dwarfs has proved to be a very efficient technique. This method can be useful in the future to reliably identify brown dwarfs in deep surveys that produce large samples of faint targets where spectroscopy is not feasible for all candidates. With this statistically robust sample of the mid and late T brown dwarf field population we were also able to address the discrepancies between the observed field space density and the expected values given the most accepted forms of the IMF of young clusters.

  9. Field comparison of real-time polymerase chain reaction and bacterial culture for identification of bovine mastitis bacteria.

    PubMed

    Koskinen, M T; Wellenberg, G J; Sampimon, O C; Holopainen, J; Rothkamp, A; Salmikivi, L; van Haeringen, W A; Lam, T J G M; Pyörälä, S

    2010-12-01

    Fast and reliable identification of the microorganisms causing mastitis is important for management of the disease and for targeting antimicrobial treatment. Methods based on PCR are being used increasingly in mastitis diagnostics. Comprehensive field comparisons of PCR and traditional milk bacteriology have not been available. The results of a PCR kit capable of detecting 11 important etiological agents of mastitis directly from milk in 4 h were compared with those of conventional bacterial culture (48 h). In total, 1,000 quarter milk samples were taken from cows with clinical or subclinical mastitis, or from clinically healthy quarters with low somatic cell count (SCC). Bacterial culture identified udder pathogens in 600/780 (77%) of the clinical samples, whereas PCR identified bacteria in 691/780 (89%) of the clinical samples. The PCR analysis detected major pathogens in a large number of clinical samples that were negative for the species in culture. These included 53 samples positive for Staphylococcus aureus by PCR, but negative by culture. A total of 137 samples from clinical mastitis, 5 samples from subclinical mastitis, and 1 sample from a healthy quarter were positive for 3 or more bacterial species in PCR, whereas culture identified 3 or more species in 60 samples from clinical mastitis. Culture identified a species not targeted by the PCR test in 44 samples from clinical mastitis and in 9 samples from subclinical mastitis. Low SCC samples provided a small number of positive results both in culture (4/93; 4.3%) and by PCR (7/93; 7.5%). In conclusion, the PCR kit provided several benefits over conventional culture, including speed, automated interpretation of results, and increased sensitivity. This kit holds much promise as a tool to complement traditional methods in identification of pathogens. In conventional mastitis bacteriology, a sample with 3 or more species is considered contaminated, and resampling of the cow is recommended. 
Further study is required to investigate how high sensitivity of PCR and its quantitative features can be applied to improve separation of relevant udder pathogens from likely contaminants in samples where multiple species are detected. Furthermore, increasing the number of species targeted by the PCR test would be advantageous. Copyright © 2010 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
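
The headline sensitivity difference (600/780 detections by culture vs. 691/780 by PCR) can be checked with a simple two-proportion z-test. This is illustrative only: the samples are paired (each milk sample received both tests), so a McNemar test on the discordant pairs, which the abstract does not report, would be the proper analysis.

```python
from math import sqrt, erf

def two_prop_z(x1, n1, x2, n2):
    """Two-proportion z-test (unpaired approximation; the milk samples
    here are actually paired, so treat the p-value as illustrative)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    pval = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p1, p2, z, pval

# detection rates from the abstract: culture 600/780, PCR 691/780
p_cult, p_pcr, z, pval = two_prop_z(600, 780, 691, 780)
```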

  10. The development of a short domain-general measure of working memory capacity.

    PubMed

    Oswald, Frederick L; McAbee, Samuel T; Redick, Thomas S; Hambrick, David Z

    2015-12-01

    Working memory capacity is one of the most frequently measured individual difference constructs in cognitive psychology and related fields. However, implementation of complex span and other working memory measures is generally time-consuming for administrators and examinees alike. Because researchers often must manage the tension between limited testing time and measuring numerous constructs reliably, a short and effective measure of working memory capacity would often be a major practical benefit in future research efforts. The current study developed a shortened computerized domain-general measure of working memory capacity by representatively sampling items from three existing complex working memory span tasks: operation span, reading span, and symmetry span. Using a large archival data set (Study 1, N = 4,845), we developed and applied a principled strategy for developing the reduced measure, based on testing a series of confirmatory factor analysis models. Adequate fit indices from these models lent support to this strategy. The resulting shortened measure was then administered to a second independent sample (Study 2, N = 172), demonstrating that the new measure saves roughly 15 min (30%) of testing time on average, and even up to 25 min depending on the test-taker. On the basis of these initial promising findings, several directions for future research are discussed.

  11. Using He I λ10830 to Diagnose Mass Flows Around Herbig Ae/Be Stars

    NASA Astrophysics Data System (ADS)

    Cauley, Paul W.; Johns-Krull, Christopher M.

    2015-01-01

    The pre-main sequence Herbig Ae/Be stars (HAEBES) are the intermediate mass cousins of the low mass T Tauri stars (TTSs). However, it is not clear that the same accretion and mass outflow mechanisms operate identically in both mass regimes. Classical TTSs (CTTSs) accrete material from their disks along stellar magnetic field lines in a scenario called magnetospheric accretion. Magnetospheric accretion requires a strong stellar dipole field in order to truncate the inner gas disk. These fields are either absent or very weak on a large majority of HAEBES, challenging the view that magnetospheric accretion is the dominant accretion mechanism. If magnetospheric accretion does not operate similarly around HAEBES as it does around CTTSs, then strong magnetocentrifugal outflows, which are directly linked to accretion and are ubiquitous around CTTSs, may be driven less efficiently from HAEBE systems. Here we present high resolution spectroscopic observations of the He I λ10830 line in a sample of 48 HAEBES. He I λ10830 is an excellent tracer of both mass infall and outflow which is directly manifested as red and blue-shifted absorption in the profile morphologies. These features, among others, are common in our sample. The occurrence of both red and blue-shifted absorption profiles is less frequent, however, than is found in CTTSs. Statistical contingency tests confirm this difference at a significant level. In addition, we find strong evidence for smaller disk truncation radii in the objects displaying red-shifted absorption profiles. This is expected for HAEBES experiencing magnetospheric accretion based on their large rotation rates and weak magnetic field strengths. Finally, the low incidence of blue-shifted absorption in our sample compared to CTTSs and the complete lack of simultaneous red and blue-shifted absorption features suggests that magnetospheric accretion in HAEBES is less efficient at driving strong outflows. 
The stellar wind-like outflows that are observed are likely driven, at least in part, by boundary layer accretion. The smaller (or absent) disk truncation radii in HAEBES may have consequences for the frequency of planets in close orbits around main sequence B and A stars.

  12. Evaluation of niobium as candidate electrode material for DC high voltage photoelectron guns

    DOE PAGES

    BastaniNejad, M.; Mohamed, Md. Abdullah; Elmustafa, A. A.; ...

    2012-08-17

    In this study, the field emission characteristics of niobium electrodes were compared to those of stainless steel electrodes using a DC high voltage field emission test apparatus. A total of eight electrodes were evaluated: two 304 stainless steel electrodes polished to mirror-like finish with diamond grit and six niobium electrodes (two single-crystal, two large-grain, and two fine-grain) that were chemically polished using a buffered-chemical acid solution. Upon the first application of high voltage, the best large-grain and single-crystal niobium electrodes performed better than the best stainless steel electrodes, exhibiting less field emission at comparable voltage and gradient. In all cases, field emission from electrodes (stainless steel and/or niobium) could be significantly reduced and sometimes completely eliminated, by introducing krypton gas into the vacuum chamber while the electrode was biased at high voltage. Of all the electrodes tested, a large-grain niobium electrode performed the best, exhibiting no measurable field emission (< 10 pA) at 225 kV with 20 mm cathode/anode gap, corresponding to a gradient of 18.7 MV/m.

  13. Test-particle simulations of SEP propagation in IMF with large-scale fluctuations

    NASA Astrophysics Data System (ADS)

    Kelly, J.; Dalla, S.; Laitinen, T.

    2012-11-01

    The results of full-orbit test-particle simulations of SEPs propagating through an IMF which exhibits large-scale fluctuations are presented. A variety of propagation conditions are simulated - scatter-free, and scattering with mean free path, λ, of 0.3 and 2.0 AU - and the cross-field transport of SEPs is investigated. When calculating cross-field displacements the Parker spiral geometry is accounted for and the role of magnetic field expansion is taken into account. It is found that transport across the magnetic field is enhanced in the λ = 0.3 AU and λ = 2.0 AU cases, compared to the scatter-free case, with the λ = 2.0 AU case in particular containing outlying particles that had strayed a large distance across the IMF. Outliers are categorized by means of Chauvenet's criterion and it is found that typically between 1 and 2% of the population falls within this category. The ratio of latitudinal to longitudinal diffusion coefficient perpendicular to the magnetic field is typically 0.2, suggesting that transport in latitude is less efficient.
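
Chauvenet's criterion, used above to flag outlying particles, rejects a point when the expected number of equally extreme values in a sample of size N falls below 0.5. A minimal sketch on toy data (not the simulation output):

```python
from math import erfc, sqrt

def chauvenet_outliers(data):
    """Flag values x for which N * P(|X - mu| >= |x - mu|) < 0.5
    under a normal fit (Chauvenet's criterion)."""
    n = len(data)
    mu = sum(data) / n
    sigma = sqrt(sum((x - mu) ** 2 for x in data) / n)
    flagged = []
    for x in data:
        # two-sided normal tail probability of a deviation this large
        tail = erfc(abs(x - mu) / (sigma * sqrt(2)))
        if n * tail < 0.5:
            flagged.append(x)
    return flagged

sample = [1.0, 1.1, 0.9, 1.05, 0.95, 1.02, 0.98, 5.0]  # 5.0 is an outlier
out = chauvenet_outliers(sample)  # → [5.0]
```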

  14. Implementation of low temperature tests for asphalt mixtures to improve the longevity of road surfaces.

    DOT National Transportation Integrated Search

    2013-12-01

    Field samples were obtained from cores taken from multiple roads around the Salt Lake Valley in Utah and prepared for BBR testing. The response of field cores showed that even though the binder grade used in the region was the same, the resultin...

  15. Analysis of field permeability and laboratory shear stress for Western Kentucky Parkway, milepost 18.240 to milepost 25.565, Caldwell-Hopkins counties

    DOT National Transportation Integrated Search

    2003-02-01

    This report lists and discusses results of field permeability tests and laboratory shear tests on samples from a construction project on the Western Kentucky Parkway in Caldwell-Hopkins Counties. Approximately 6,500 tons of asphaltic concrete overlay...

  16. Objective instrumental memory and performance tests for evaluation of patients with brain damage: a search for a behavioral diagnostic tool.

    PubMed

    Harness, B Z; Bental, E; Carmon, A

    1976-03-01

    Cognition and performance of patients with localized and diffuse brain damage were evaluated through the application of objective perceptual testing. A series of visual perceptual and verbal tests, memory tests, as well as reaction time tasks were administered to the patients by logic programming equipment. In order to avoid a bias due to communicative disorders, all responses were motor, and achievement was scored in terms of correct identification and latencies of response. Previously established norms based on a large sample of non-brain-damaged hospitalized patients served to standardize the performance of the brain-damaged patients, since preliminary results showed that age and educational level constituted important variables affecting performance of the control group. The achievement of brain-damaged patients, corrected for these factors, was impaired significantly in all tests with respect to both recognition and speed of performance. Lateralized effects of brain damage were not significantly demonstrated. However, when the performance was analyzed with respect to the locus of visual input, it was found that patients with right hemispheric lesions showed impairment mainly on perception of figurative material, and that this deficit was more apparent in the left visual field. Conversely, patients with left hemispheric lesions tended to show impairment on perception of visually presented verbal material when the input was delivered to the right visual field.

  17. An approach for detecting five typical vegetation types on the Chinese Loess Plateau using Landsat TM data.

    PubMed

    Wang, Zhi-Jie; Jiao, Ju-Ying; Lei, Bo; Su, Yuan

    2015-09-01

    Remote sensing can provide large-scale spatial data for the detection of vegetation types. In this study, two shortwave infrared spectral bands (TM5 and TM7) and one visible spectral band (TM3) of Landsat 5 TM data were used to detect five typical vegetation types (communities dominated by Bothriochloa ischaemum, Artemisia gmelinii, Hippophae rhamnoides, Robinia pseudoacacia, and Quercus liaotungensis) using 270 field survey data points in the Yanhe watershed on the Loess Plateau. The relationships between 200 field data points and their corresponding radiance reflectance were analyzed, and the equation termed the vegetation type index (VTI) was generated. The VTI values of five vegetation types were calculated, and the accuracy was tested using the remaining 70 field data points. The applicability of VTI was also tested by the distribution of vegetation type of two small watersheds in the Yanhe watershed and field sample data collected from other regions (Ziwuling Region, Huangling County, and Luochuan County) on the Loess Plateau. The results showed that the VTI can effectively detect the five vegetation types with an average accuracy exceeding 80% and a representativeness above 85%. As a new approach for monitoring vegetation types using remote sensing at a larger regional scale, VTI can play an important role in the assessment of vegetation restoration and in the investigation of the spatial distribution and community diversity of vegetation on the Loess Plateau.

  18. Statistical Measures of Large-Scale Structure

    NASA Astrophysics Data System (ADS)

    Vogeley, Michael; Geller, Margaret; Huchra, John; Park, Changbom; Gott, J. Richard

    1993-12-01

    To quantify clustering in the large-scale distribution of galaxies and to test theories for the formation of structure in the universe, we apply statistical measures to the CfA Redshift Survey. This survey is complete to m_B(0) = 15.5 over two contiguous regions which cover one-quarter of the sky and include ~11,000 galaxies. The salient features of these data are voids with diameter 30-50 h⁻¹ Mpc and coherent dense structures with a scale ~100 h⁻¹ Mpc. Comparison with N-body simulations rules out the "standard" CDM model (Ω = 1, b = 1.5, σ₈ = 1) at the 99% confidence level because this model has insufficient power on scales λ > 30 h⁻¹ Mpc. An unbiased open universe CDM model (Ωh = 0.2) and a biased CDM model with non-zero cosmological constant (Ωh = 0.24, λ₀ = 0.6) match the observed power spectrum. The amplitude of the power spectrum depends on the luminosity of galaxies in the sample; bright (L > L*) galaxies are more strongly clustered than faint galaxies. The paucity of bright galaxies in low-density regions may explain this dependence. To measure the topology of large-scale structure, we compute the genus of isodensity surfaces of the smoothed density field. On scales in the "non-linear" regime, ≤ 10 h⁻¹ Mpc, the high- and low-density regions are multiply-connected over a broad range of density threshold, as in a filamentary net. On smoothing scales > 10 h⁻¹ Mpc, the topology is consistent with statistics of a Gaussian random field. Simulations of CDM models fail to produce the observed coherence of structure on non-linear scales (> 95% confidence level). The underdensity probability (the frequency of regions with density contrast δρ/ρ̄ = -0.8) depends strongly on the luminosity of galaxies; underdense regions are significantly more common (> 2σ) in bright (L > L*) galaxy samples than in samples which include fainter galaxies.
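
The Gaussian-random-field comparison rests on the analytic genus curve, g(ν) ∝ (1 − ν²) exp(−ν²/2), where ν is the density threshold in units of the field's standard deviation. A sketch of its shape, with the amplitude (which depends on the power spectrum) set to 1:

```python
from math import exp

def gaussian_genus(nu, amplitude=1.0):
    """Genus-per-unit-volume curve of a Gaussian random field at
    density threshold nu (in units of sigma):
        g(nu) = A * (1 - nu**2) * exp(-nu**2 / 2)
    A is power-spectrum dependent; set to 1 for shape only."""
    return amplitude * (1 - nu**2) * exp(-nu**2 / 2)

# positive (sponge-like) genus near the median threshold nu = 0,
# zero crossings at |nu| = 1, negative (isolated clumps/voids) beyond
curve = [(nu / 4, gaussian_genus(nu / 4)) for nu in range(-12, 13)]
```

Deviations of a measured genus curve from this shape are what signal non-Gaussian coherence of structure.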

  19. Hydrogeologic data from test drilling near Verna, Florida, 1978

    USGS Publications Warehouse

    Barker, Michael; Bowman, Geronia; Sutcliffe, Horace

    1981-01-01

    Four test wells were drilled in the vicinity of the city of Sarasota well field near Verna, Fla., to provide hydrologic and geologic information. An expedient and economical method of air lifting water samples from isolated water-producing zones while drilling was utilized. Lithologic logs of drill cuttings and geophysical logs, including point resistance and spontaneous potential electric logs, gamma-ray logs, and caliper logs, were made. Chemical quality of water was determined for principal producing zones at each well. Dissolved solids from composite water samples ranged from 313 milligrams per liter in test well 0-1 north of the well field to 728 milligrams per liter in test well 0-3 within the well field. Each test well was pumped to determine maximum discharge, water-level drawdown, and recovery time. A leaking pump column on test well 0-1 prevented accurate measurement of drawdown on the well. Test well 0-2, located east of the well field, had a pumping rate of 376 gallons per minute and 13.11 feet of drawdown after 3 hours and 50 minutes; test well 0-3 had a maximum yield of 320 gallons per minute, a drawdown of 31.91 feet after 2 hours and 35 minutes of pumping, and a recovery time of 20 minutes; and test well 0-4, south of the well field, had a pumping rate of 200 gallons per minute with 63.34 feet of drawdown after 2 hours and 35 minutes. (USGS)
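
A standard way to compare such pumping-test results, though not computed in the report summary, is specific capacity: yield per foot of drawdown, using the rates and drawdowns quoted above:

```python
def specific_capacity(q_gpm, drawdown_ft):
    """Specific capacity in gallons per minute per foot of drawdown."""
    return q_gpm / drawdown_ft

# (pumping rate in gpm, drawdown in ft) from the abstract
wells = {"0-2": (376, 13.11), "0-3": (320, 31.91), "0-4": (200, 63.34)}
sc = {name: round(specific_capacity(q, s), 2) for name, (q, s) in wells.items()}
```

The spread (roughly 29 down to about 3 gpm/ft) quantifies how much more productive the eastern well is than the southern one, though the differing pumping durations make the comparison approximate.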

  20. Design and validation of a passive deposition sampler.

    PubMed

    Einstein, Stephanie A; Yu, Chang-Ho; Mainelis, Gediminas; Chen, Lung Chi; Weisel, Clifford P; Lioy, Paul J

    2012-09-01

A new passive particle deposition air sampler, called the Einstein-Lioy Deposition Sampler (ELDS), has been developed to fill a gap in passive sampling for near-field particle emissions. The sampler can be configured in several ways: with a protective hood for outdoor sampling, without a protective hood, and as a dust plate. In addition, an XRF-ready option allows direct sampling onto a filter-mounted XRF cartridge and can be used in conjunction with all configurations. A wind tunnel was designed and constructed to test the performance of the different sampler configurations using a test dust with a known particle size distribution. The configurations were also tested against each other to evaluate whether the protective hood affects the collected particle size distribution. A field study was conducted to test the sampler under actual environmental conditions and to evaluate its ability to collect samples for chemical analysis. Individual experiments for each configuration demonstrated the precision of the sampler. The field experiment demonstrated the ability of the sampler both to collect mass and to allow measurement of an environmental contaminant, i.e., Cr(6+). Particle size distributions from the Hooded and Non-Hooded models were statistically indistinguishable from each other and from the test dust; thus, the ELDS can be used indoors and outdoors in a variety of configurations to suit the user's needs.

  1. Field and laboratory comparative evaluation of ten rapid malaria diagnostic tests.

    PubMed

    Craig, M H; Bredenkamp, B L; Williams, C H Vaughan; Rossouw, E J; Kelly, V J; Kleinschmidt, I; Martineau, A; Henry, G F J

    2002-01-01

The paper reports on a comparative evaluation of 10 rapid malaria tests available in South Africa in 1998: AccuCheck (AC, developmental), Cape Biotech (CB), ICT Malaria Pf (ICT1) and Pf/Pv (ICT2), Kat Medical (KAT), MakroMal (MM), OptiMAL (OP), ParaSight-F (PS), Quorum (Q), Determine-Malaria (DM). In a laboratory study, designed to test absolute detection limits, Plasmodium falciparum-infected blood was diluted with uninfected blood to known parasite concentrations ranging from 500 to 0.1 parasites per microlitre (P/microL). The 50% detection limits were: ICT1, 3.28; ICT2, 4.86; KAT, 6.36; MM, 9.37; CB, 11.42; DM, 12.40; Q, 16.98; PS, 20; AC, 31.15 and OP, 91.16 P/microL. A field study was carried out to test post-treatment specificity. Blood samples from malaria patients were tested with all products (except AC and DM) on the day of treatment and 3 and 7 days thereafter, against a gold standard of microscopy and polymerase chain reaction (PCR). OP and PS produced fewer false-positive results on day 7 (18 and 19%, respectively) than the other rapid tests (38-56%). However, microscopy, PCR, OP and PS largely disagreed as to which individuals remained positive. The tests were further compared with regard to general specificity, particularly cross-reactivity with rheumatoid factor, speed, simplicity, ability to detect other species, storage requirements, and general presentation.
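The 50% detection limits quoted above are derived from a dilution series. A minimal sketch of how such a limit can be interpolated from replicate test results on a log-concentration scale; the positivity fractions below are hypothetical illustration values, not the study's raw data:

```python
# Sketch: estimating a 50% detection limit from a dilution series by
# log-linear interpolation. Data are hypothetical, not from the paper.
import math

def detection_limit_50(concentrations, detect_fractions):
    """Interpolate the concentration (P/microL) at which 50% of
    replicate tests are positive, on a log-concentration scale."""
    pairs = sorted(zip(concentrations, detect_fractions))
    for (c_lo, f_lo), (c_hi, f_hi) in zip(pairs, pairs[1:]):
        if f_lo <= 0.5 <= f_hi and f_hi > f_lo:
            t = (0.5 - f_lo) / (f_hi - f_lo)
            log_c = math.log10(c_lo) + t * (math.log10(c_hi) - math.log10(c_lo))
            return 10 ** log_c
    return None

# Hypothetical dilution series: fraction of replicates testing positive.
conc = [0.1, 1, 5, 10, 50, 100, 500]
frac = [0.0, 0.1, 0.4, 0.6, 0.9, 1.0, 1.0]
limit = detection_limit_50(conc, frac)
```

Here the 50% crossing falls between 5 and 10 P/microL, so the interpolated limit lands between those two dilution points.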

  2. Feasibility and acceptability of the DSM-5 Field Trial procedures in the Johns Hopkins Community Psychiatry Programs.

    PubMed

    Clarke, Diana E; Wilcox, Holly C; Miller, Leslie; Cullen, Bernadette; Gerring, Joan; Greiner, Lisa H; Newcomer, Alison; McKitty, Mellisha V; Regier, Darrel A; Narrow, William E

    2014-06-01

    The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) contains criteria for psychiatric diagnoses that reflect advances in the science and conceptualization of mental disorders and address the needs of clinicians. DSM-5 also recommends research on dimensional measures of cross-cutting symptoms and diagnostic severity, which are expected to better capture patients' experiences with mental disorders. Prior to its May 2013 release, the American Psychiatric Association (APA) conducted field trials to examine the feasibility, clinical utility, reliability, and where possible, the validity of proposed DSM-5 diagnostic criteria and dimensional measures. The methods and measures proposed for the DSM-5 field trials were pilot tested in adult and child/adolescent clinical samples, with the goal to identify and correct design and procedural problems with the proposed methods before resources were expended for the larger DSM-5 Field Trials. Results allowed for the refinement of the protocols, procedures, and measures, which facilitated recruitment, implementation, and completion of the DSM-5 Field Trials. These results highlight the benefits of pilot studies in planning large multisite studies. Copyright © 2013, American Psychiatric Association. All rights reserved.

  3. Gibbs sampling on large lattice with GMRF

    NASA Astrophysics Data System (ADS)

    Marcotte, Denis; Allard, Denis

    2018-02-01

Gibbs sampling is routinely used to sample truncated Gaussian distributions. These distributions naturally occur when associating latent Gaussian fields to category fields obtained by discrete simulation methods such as multipoint, sequential indicator, and object-based simulation. The latent Gaussians are often used in data assimilation and history-matching algorithms. When Gibbs sampling is applied on a large lattice, the computing cost can become prohibitive. The usual practice of using local neighborhoods is unsatisfactory, as it can diverge and does not exactly reproduce the desired covariance. A better approach is to use Gaussian Markov Random Fields (GMRF), which enable computing the conditional distributions at any point without computing and inverting the full covariance matrix. Because the GMRF is locally defined, it allows simultaneous updating of all points that do not share neighbors (coding sets). We propose a new simultaneous Gibbs updating strategy on coding sets that can be efficiently computed by convolution and applied with an acceptance/rejection method in the truncated case. We study empirically the speed of convergence and the effects of the choice of boundary conditions, the correlation range, and GMRF smoothness. We show that convergence is slower in the Gaussian case on the torus than for the finite case studied in the literature. However, in the truncated Gaussian case, we show that short-scale correlation is quickly restored and the conditioning categories at each lattice point imprint the long-scale correlation. Hence our approach makes it practical to apply Gibbs sampling on large 2D or 3D lattices with the desired GMRF covariance.
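The coding-set idea can be illustrated with a checkerboard colouring: sites of the same colour share no neighbours under a first-order stencil, so a whole colour can be updated in one pass. This is a minimal sketch with a generic CAR-type conditional (mean proportional to the neighbour average, fixed variance) on a torus; it is not the paper's specific convolution/acceptance-rejection scheme:

```python
# Sketch: simultaneous Gibbs updates on the two checkerboard "coding
# sets" of a first-order GMRF on a torus. The conditional used here
# (mean = phi * average of the 4 neighbours, unit variance) is a
# generic CAR-type model chosen for illustration.
import random

def gibbs_sweep(field, phi=0.9, sigma=1.0, rng=random):
    n = len(field)
    for colour in (0, 1):  # each coding set is conditionally independent
        for i in range(n):
            for j in range(n):
                if (i + j) % 2 != colour:
                    continue
                nbrs = [field[(i - 1) % n][j], field[(i + 1) % n][j],
                        field[i][(j - 1) % n], field[i][(j + 1) % n]]
                mean = phi * sum(nbrs) / len(nbrs)
                field[i][j] = rng.gauss(mean, sigma)
    return field

rng = random.Random(0)
field = [[rng.gauss(0, 1) for _ in range(8)] for _ in range(8)]
for _ in range(50):  # burn-in sweeps
    gibbs_sweep(field, rng=rng)
```

Because the update within a colour touches only sites of the other colour, the inner double loop could be vectorized or computed by convolution, which is the efficiency the abstract refers to.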

  4. Technical Note: Harmonic analysis applied to MR image distortion fields specific to arbitrarily shaped volumes.

    PubMed

    Stanescu, T; Jaffray, D

    2018-05-25

    Magnetic resonance imaging is expected to play a more important role in radiation therapy given the recent developments in MR-guided technologies. MR images need to consistently show high spatial accuracy to facilitate RT specific tasks such as treatment planning and in-room guidance. The present study investigates a new harmonic analysis method for the characterization of complex 3D fields derived from MR images affected by system-related distortions. An interior Dirichlet problem based on solving the Laplace equation with boundary conditions (BCs) was formulated for the case of a 3D distortion field. The second-order boundary value problem (BVP) was solved using a finite elements method (FEM) for several quadratic geometries - i.e., sphere, cylinder, cuboid, D-shaped, and ellipsoid. To stress-test the method and generalize it, the BVP was also solved for more complex surfaces such as a Reuleaux 9-gon and the MR imaging volume of a scanner featuring a high degree of surface irregularities. The BCs were formatted from reference experimental data collected with a linearity phantom featuring a volumetric grid structure. The method was validated by comparing the harmonic analysis results with the corresponding experimental reference fields. The harmonic fields were found to be in good agreement with the baseline experimental data for all geometries investigated. In the case of quadratic domains, the percentage of sampling points with residual values larger than 1 mm were 0.5% and 0.2% for the axial components and vector magnitude, respectively. For the general case of a domain defined by the available MR imaging field of view, the reference data showed a peak distortion of about 12 mm and 79% of the sampling points carried a distortion magnitude larger than 1 mm (tolerance intrinsic to the experimental data). 
The upper limits of the residual values after comparison with the harmonic fields showed a maximum of 1.4 mm and a mean of 0.25 mm, respectively, with only 1.5% of sampling points exceeding 1 mm. A novel harmonic analysis approach relying on finite element methods was introduced and validated for multiple volumes with surface shape functions ranging from simple to highly complex. Since a boundary value problem is solved, the method requires input data from only the surface of the desired domain of interest. It is believed that the harmonic method will facilitate (a) the design of new phantoms dedicated to the quantification of MR image distortions in large volumes and (b) an integrative approach of combining multiple imaging tests specific to radiotherapy into a single test object for routine imaging quality control. This article is protected by copyright. All rights reserved.
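The core idea, that a harmonic field inside a volume is fully determined by its boundary values, can be seen in a toy 2D analogue. This sketch solves the interior Dirichlet problem on a square with a simple finite-difference Gauss-Seidel iteration rather than the paper's FEM; the boundary function x^2 - y^2 is chosen because it is itself harmonic, so the interior solution should reproduce it:

```python
# Sketch: interior Dirichlet problem for the Laplace equation on a
# square, solved by Gauss-Seidel iteration on a 5-point stencil.
# Illustrative stand-in for the paper's FEM on complex 3D surfaces.
def solve_laplace(n=20, sweeps=2000):
    def bc(i, j):
        # boundary data: u = x^2 - y^2, an exactly harmonic function
        x, y = i / (n - 1), j / (n - 1)
        return x * x - y * y

    u = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i in (0, n - 1) or j in (0, n - 1):
                u[i][j] = bc(i, j)  # Dirichlet boundary condition
    for _ in range(sweeps):
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                u[i][j] = 0.25 * (u[i-1][j] + u[i+1][j]
                                  + u[i][j-1] + u[i][j+1])
    return u, bc

u, bc = solve_laplace()
# the recovered interior field should match the analytic harmonic one
err = max(abs(u[i][j] - bc(i, j))
          for i in range(1, 19) for j in range(1, 19))
```

The same principle is what lets the paper's method characterize a full 3D distortion field from measurements on the domain surface alone.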

  5. A Novel Method for Quantifying Helmeted Field of View of a Spacesuit - And What It Means for Constellation

    NASA Technical Reports Server (NTRS)

    McFarland, Shane M.

    2010-01-01

    Field of view has always been a design feature paramount to helmet design, and in particular spacesuit design, where the helmet must provide an adequate field of view for a large range of activities, environments, and body positions. Historically, suited field of view has been evaluated either qualitatively in parallel with design or quantitatively using various test methods and protocols. As such, oftentimes legacy suit field of view information is either ambiguous for lack of supporting data or contradictory to other field of view tests performed with different subjects and test methods. This paper serves to document a new field of view testing method that is more reliable and repeatable than its predecessors. It borrows heavily from standard ophthalmologic field of vision tests such as the Goldmann kinetic perimetry test, but is designed specifically for evaluating field of view of a spacesuit helmet. In this test, four suits utilizing three different helmet designs were tested for field of view. Not only do these tests provide more reliable field of view data for legacy and prototype helmet designs, they also provide insight into how helmet design impacts field of view and what this means for the Constellation Project spacesuit helmet, which must meet stringent field of view requirements that are more generous to the crewmember than legacy designs.

  6. A field study of solid rocket exhaust impacts on the near-field environment

    NASA Technical Reports Server (NTRS)

    Anderson, B. J.; Keller, Vernon W.

    1990-01-01

    Large solid rocket motors release large quantities of hydrogen chloride and aluminum oxide exhaust during launch and testing. Measurements and analysis of the interaction of this material with the deluge water spray and other environmental factors in the near field (within 1 km of the launch or test site) are summarized. Measurements of mixed solid and liquid deposition (typically 2 normal HCl) following space shuttle launches and 6.4 percent scale model tests are described. Hydrogen chloride gas concentrations measured in the hours after the launch of STS 41D and STS 51A are reported. Concentrations of 9 ppm, which are above the 5 ppm exposure limits for workers, were detected an hour after STS 51A. A simplified model which explains the primary features of the gas concentration profiles is included.

  7. ff14IDPs Force Field Improving the Conformation Sampling of Intrinsically Disordered Proteins

    PubMed Central

    Song, Dong; Wang, Wei; Ye, Wei; Ji, Dingjue; Luo, Ray; Chen, Hai-Feng

    2017-01-01

Intrinsically disordered proteins (IDPs) are proteins that lack a specific tertiary structure and are unable to fold spontaneously without a binding partner. IDPs are associated with various diseases, such as diabetes, cancer, and neurodegenerative diseases. However, current widely used force fields, such as ff99SB, ff14SB, OPLS/AA, and Charmm27, are insufficient for sampling the conformational characteristics of IDPs. In this study, the CMAP method was used to correct the φ/ψ distributions of disorder-promoting amino acids. The simulation results show that the force field parameters (ff14IDPs) can improve the φ/ψ distributions of the disorder-promoting amino acids, with RMSD less than 0.10% relative to the benchmark data of IDPs. Further tests suggest that the secondary chemical shifts calculated under the ff14IDPs force field are in quantitative agreement with NMR experimental data for five tested systems. In addition, the simulation results show that ff14IDPs can still be used to model structured proteins, such as the tested lysozyme and ubiquitin, with better performance in coil regions than the original general Amber force field ff14SB. These findings confirm that the newly developed Amber ff14IDPs force field is a robust model for improving the conformation sampling of IDPs. PMID:27484738

  8. From Cosmic Dusk till Dawn with RELICS

    NASA Astrophysics Data System (ADS)

    Bradac, Marusa

When did galaxies start forming stars? What is the role of distant galaxies in galaxy formation models and the epoch of reionization? What are the conditions in typical low-mass, star-forming galaxies at z ~ 4? Why is galaxy evolution dependent on environment? Recent observations indicate several critical puzzles in studies that address these questions. Chief among these, galaxies might have started forming stars earlier than previously thought (<400 Myr after the Big Bang), and their star formation history differs from what is predicted by simulations. Furthermore, the details of the mechanisms that regulate star formation and morphological transformation in dense environments are still unknown. To solve these puzzles of galaxy evolution, we will use 41 galaxy clusters from the RELICS program (Reionization Lensing Cluster Survey) that are among the most powerful cosmic telescopes. Their magnification will allow us to study stellar properties of a large number of galaxies all the way to the reionization era. Accurate knowledge of stellar masses, ages, and star formation rates (SFRs) requires measuring both rest-frame UV and optical light, which only Spitzer can probe at z>0.5-11 for a sufficiently large sample of typical galaxies. This program will combine Spitzer imaging from two large programs, Director's Discretionary Time (DDT) and the SRELICS program led by the PI. The main challenge in a study such as this is the capability to perform reliable photometry in crowded fields. Our team recently helped develop TPHOT, which is a much improved and much faster version of previously available codes. TPHOT is specifically designed to extract fluxes in crowded fields with very different PSFs. We will combine Spitzer photometry with ground-based imaging and spectroscopy to obtain robust measurements of galaxy star formation rates, stellar masses, and stellar ages.
This program will be a crucial legacy complement to previous Spitzer/IRAC deep blank-field surveys and cluster studies, and will open up new parameter space by probing intrinsically fainter objects than most current surveys with significantly improved sample variance over deep-field surveys. It will allow us to study the properties (e.g., SFRs and stellar masses) of a large number of galaxies (200 at z=6-10), thus meeting our goal of reconstructing the cosmic SFR density with sufficient precision to better understand the role of galaxies in the reionization process. We will measure the presence (or absence) of established stellar populations with Spitzer for the largest sample to date. Furthermore, this proposal will allow us to study the SFRs of intrinsically faint (and magnified) intermediate-redshift (z ~ 4) galaxies, as well as the stellar mass function of z=0.3-0.7 galaxy members of our cluster sample, thereby expanding our understanding of star formation from reionization to the epoch of galaxy formation and dense environments. Many of the science goals of this proposal are main science drivers for JWST. Due to magnification, our effective depth and resolution match those of the JWST blank fields and afford us a sneak preview of JWST sources with Spitzer now. This program will thus provide a valuable test-bed for simulations, observation planning, and source selection just in time for JWST Cycle 1.

  9. Static and wind tunnel near-field/far-field jet noise measurements from model scale single-flow baseline and suppressor nozzles. Volume 1: Noise source locations and extrapolation of static free-field jet noise data

    NASA Technical Reports Server (NTRS)

    Jaeck, C. L.

    1976-01-01

    A test was conducted in the Boeing Large Anechoic Chamber to determine static jet noise source locations of six baseline and suppressor nozzle models, and establish a technique for extrapolating near field data into the far field. The test covered nozzle pressure ratios from 1.44 to 2.25 and jet velocities from 412 to 594 m/s at a total temperature of 844 K.

  10. Measuring salivary analytes from free-ranging monkeys

    PubMed Central

    Higham, James P.; Vitale, Alison; Rivera, Adaris Mas; Ayala, James E.; Maestripieri, Dario

    2014-01-01

    Studies of large free-ranging mammals have been revolutionized by non-invasive methods for assessing physiology, which usually involve the measurement of fecal or urinary biomarkers. However, such techniques are limited by numerous factors. To expand the range of physiological variables measurable non-invasively from free-ranging primates, we developed techniques for sampling monkey saliva by offering monkeys ropes with oral swabs sewn on the ends. We evaluated different attractants for encouraging individuals to offer samples, and the proportions of individuals in different age/sex categories willing to give samples. We tested the saliva samples we obtained in three commercially available assays: cortisol, salivary alpha-amylase (SAA), and secretory immunoglobulin A (SIgA). We show that habituated free-ranging rhesus macaques will give saliva samples voluntarily without training, with 100% of infants, and over 50% of adults, willing to chew on collection devices. Our field methods are robust even for analytes that show poor recovery from cotton, and/or that have concentrations dependent on salivary flow rate. We validated the cortisol and SAA assays for use in rhesus macaques by showing aspects of analytical validation, such as that samples dilute linearly and in parallel to assay standards. We also found that values measured correlated with biologically meaningful characteristics of sampled individuals (age and dominance rank). The SIgA assay tested did not react to samples. Given the wide range of analytes measurable in saliva but not in feces or urine, our methods considerably improve our ability to study physiological aspects of the behavior and ecology of free-ranging primates, and are also potentially adaptable to other mammalian taxa. PMID:20837036

  11. A new sampler for stratified lagoon chemical and microbiological assessments.

    PubMed

    McLaughlin, M R; Brooks, J P; Adeli, A

    2014-07-01

    A sampler was needed for a spatial and temporal study of microbial and chemical stratification in a large swine manure lagoon that was known to contain zoonotic bacteria. Conventional samplers were limited to collections of surface water samples near the bank or required a manned boat. A new sampler was developed to allow simultaneous collection of multiple samples at different depths, up to 2.3 m, without a manned boat. The sampler was tethered for stability, used remote control (RC) for sample collection, and accommodated rapid replacement of sterile tubing modules and sample containers. The sampler comprised a PVC pontoon with an acrylic deck and watertight enclosures housing a 12 VDC gearmotor to operate the collection module and a vacuum system to draw samples into reusable autoclavable tubing and 250-mL bottles. Although designed primarily for water samples, the sampler was easily modified to collect sludge. The sampler held a stable position during deployment, created minimal disturbance in the water column, and was readily cleaned and sanitized for transport. The sampler was field tested initially in a shallow fresh water lake and subsequently in a swine manure treatment lagoon. Analyses of water samples from the lagoon tests showed that chemical and bacterial levels, pH, and EC did not differ between the 0.04, 0.47, and 1.0 m depths, but some chemical and bacterial levels differed between winter and spring collections. These results demonstrated the utility of the sampler and suggested that future manure lagoon studies employ fewer or different depths and more sampling dates.

  12. Selected Field Parameters from Streams and Analytical Data from Water and Macroinvertebrate Samples, Central Colorado Assessment Project, Environmental Assessment Task, 2004 and 2005

    USGS Publications Warehouse

    Fey, David L.; Church, Stan E.; Schmidt, Travis S.; Wanty, Richard B.; Verplanck, Philip L.; Lamothe, Paul J.; Adams, Monique; Anthony, Michael W.

    2007-01-01

    The U.S. Geological Survey (USGS) Central Colorado Assessment Project (CCAP) began in October 2003 and is planned to last through September 2008. One major goal of this project is to compare the relationships between surface-water chemistry and aquatic fauna in mined and unmined areas. To accomplish this goal, we are conducting a State-scale reconnaissance sampling program, in which we are collecting water and macroinvertebrate samples. Selected results from the first two years of project analyses are reported here. We plan to develop statistical models and use geographic information system (GIS) technology to quantify the relationships between ecological indicators of metal contamination in Rocky Mountain streams and water quality, landscape and land-use characteristics (for example, mine density, geology, geomorphology, vegetation, topography). Our research will test the hypothesis that physicochemical variables and ecological responses to metal concentrations in stream water in Rocky Mountain streams are ultimately determined largely by historical land uses.

  13. Field test comparison of an autocorrelation technique for determining grain size using a digital 'beachball' camera versus traditional methods

    USGS Publications Warehouse

    Barnard, P.L.; Rubin, D.M.; Harney, J.; Mustain, N.

    2007-01-01

    This extensive field test of an autocorrelation technique for determining grain size from digital images was conducted using a digital bed-sediment camera, or 'beachball' camera. Using 205 sediment samples and >1200 images from a variety of beaches on the west coast of the US, grain size ranging from sand to granules was measured from field samples using both the autocorrelation technique developed by Rubin [Rubin, D.M., 2004. A simple autocorrelation algorithm for determining grain size from digital images of sediment. Journal of Sedimentary Research, 74(1): 160-165.] and traditional methods (i.e. settling tube analysis, sieving, and point counts). To test the accuracy of the digital-image grain size algorithm, we compared results with manual point counts of an extensive image data set in the Santa Barbara littoral cell. Grain sizes calculated using the autocorrelation algorithm were highly correlated with the point counts of the same images (r2 = 0.93; n = 79) and had an error of only 1%. Comparisons of calculated grain sizes and grain sizes measured from grab samples demonstrated that the autocorrelation technique works well on high-energy dissipative beaches with well-sorted sediment such as in the Pacific Northwest (r2 ≥ 0.92; n = 115). On less dissipative, more poorly sorted beaches such as Ocean Beach in San Francisco, results were not as good (r2 ≥ 0.70; n = 67; within 3% accuracy). Because the algorithm works well compared with point counts of the same image, the poorer correlation with grab samples must be a result of actual spatial and vertical variability of sediment in the field; closer agreement between grain size in the images and grain size of grab samples can be achieved by increasing the sampling volume of the images (taking more images, distributed over a volume comparable to that of a grab sample). In all field tests the autocorrelation method was able to predict the mean and median grain size with ≥96% accuracy, which is more than adequate for the majority of sedimentological applications, especially considering that the autocorrelation technique is estimated to be at least 100 times faster than traditional methods.
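The principle behind Rubin's algorithm is that images of coarser sediment decorrelate more slowly with pixel offset, so the autocorrelation at a fixed lag indexes grain size. A minimal sketch using synthetic 1D "image rows" (runs of constant intensity standing in for grains); the real method calibrates 2D image autocorrelation curves against sieved samples:

```python
# Sketch: autocorrelation as a grain-size proxy. Synthetic rows of
# piecewise-constant intensity stand in for real sediment images.
import random

def autocorr(signal, lag):
    """Normalized autocorrelation of a 1D signal at a given lag."""
    n = len(signal) - lag
    mean = sum(signal) / len(signal)
    var = sum((s - mean) ** 2 for s in signal) / len(signal)
    cov = sum((signal[i] - mean) * (signal[i + lag] - mean)
              for i in range(n)) / n
    return cov / var if var else 0.0

def synthetic_row(grain_px, length=4000, rng=random):
    """Intensity row made of runs of `grain_px` identical pixels."""
    row = []
    while len(row) < length:
        row.extend([rng.random()] * grain_px)
    return row[:length]

rng = random.Random(1)
fine = synthetic_row(4, rng=rng)     # "grains" 4 pixels across
coarse = synthetic_row(16, rng=rng)  # "grains" 16 pixels across
# at lag 8, the coarse row stays correlated; the fine row does not
```

At a lag of 8 pixels, pairs in the fine row always fall in different grains (autocorrelation near zero), while in the coarse row about half still fall within the same grain, so its autocorrelation stays well above zero.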

  14. A direct observation method for auditing large urban centers using stratified sampling, mobile GIS technology and virtual environments.

    PubMed

    Lafontaine, Sean J V; Sawada, M; Kristjansson, Elizabeth

    2017-02-16

    With the expansion and growth of research on neighbourhood characteristics, there is an increased need for direct observational field audits. Herein, we introduce a novel direct observational audit method and systematic social observation instrument (SSOI) for efficiently assessing neighbourhood aesthetics over large urban areas. Our audit method uses spatial random sampling stratified by residential zoning and incorporates both mobile geographic information systems technology and virtual environments. The reliability of our method was tested in two ways: first, in 15 Ottawa neighbourhoods, we compared results at audited locations over two subsequent years; and second, we audited every residential block (167 blocks) in one neighbourhood and compared the distribution of SSOI aesthetics index scores with results from the randomly audited locations. Finally, we present interrater reliability and consistency results on all observed items. The observed neighbourhood average aesthetics index score estimated from four or five stratified random audit locations is sufficient to characterize the average neighbourhood aesthetics. The SSOI was internally consistent and demonstrated good to excellent interrater reliability. At the neighbourhood level, aesthetics is positively related to SES and physical activity and negatively correlated with BMI. The proposed approach to direct neighbourhood auditing performs sufficiently well and has the advantage of financial and temporal efficiency when auditing a large city.
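The allocation step of such a stratified design can be sketched briefly: audit locations are distributed across zoning strata in proportion to the number of candidate blocks in each, then drawn at random within strata. The zone names and block counts below are hypothetical, not taken from the Ottawa study:

```python
# Sketch: proportional allocation for spatial random sampling
# stratified by residential zoning. Zone names/counts are hypothetical.
import random

def stratified_sample(strata, total, rng=random):
    """Allocate `total` audit locations across strata proportionally
    to stratum size (at least 1 each), then draw blocks at random."""
    n_all = sum(len(blocks) for blocks in strata.values())
    sample = {}
    for zone, blocks in strata.items():
        k = max(1, round(total * len(blocks) / n_all))
        sample[zone] = rng.sample(blocks, min(k, len(blocks)))
    return sample

rng = random.Random(7)
strata = {"low_density": list(range(120)),
          "mid_density": list(range(60)),
          "high_density": list(range(20))}
picked = stratified_sample(strata, total=10, rng=rng)
```

With these counts, a budget of 10 locations splits 6/3/1 across the three strata, mirroring the 120/60/20 block proportions.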

  15. How Big Is Big Enough? Sample Size Requirements for CAST Item Parameter Estimation

    ERIC Educational Resources Information Center

    Chuah, Siang Chee; Drasgow, Fritz; Luecht, Richard

    2006-01-01

    Adaptive tests offer the advantages of reduced test length and increased accuracy in ability estimation. However, adaptive tests require large pools of precalibrated items. This study looks at the development of an item pool for 1 type of adaptive administration: the computer-adaptive sequential test. An important issue is the sample size required…

  16. Industrial Raman gas sensing for real-time system control

    NASA Astrophysics Data System (ADS)

    Buric, M.; Mullen, J.; Chorpening, B.; Woodruff, S.

    2014-06-01

    Opportunities exist to improve on-line process control in energy applications with a fast, non-destructive measurement of gas composition. Here, we demonstrate a Raman sensing system capable of reporting the concentrations of numerous species simultaneously, with sub-percent accuracy and sampling times below one second, for process control applications in energy or chemical production. The sensor is based upon a hollow-core capillary waveguide with a 300-micron bore lined with reflective thin-film metal and dielectric coatings. The effect of using such a waveguide in a Raman process is to integrate Raman photons along the length of the sample-filled waveguide, thus permitting the acquisition of very large Raman signals for low-density gases in a short time. The resultant integrated Raman signals can then be used for quick and accurate analysis of a gaseous mixture. The sensor is currently being tested for energy applications such as coal gasification, turbine control, well-head monitoring for exploration or production, and non-conventional gas utilization. In conjunction with an ongoing commercialization effort, the researchers have recently completed two prototype instruments suitable for hazardous area operation and testing. Here, we report pre-commercialization testing of those field prototypes for control applications in gasification or similar processes. Results will be discussed with respect to accuracy, calibration requirements, gas sampling techniques, and possible control strategies of industrial significance.

  17. Field assessment of dried Plasmodium falciparum samples for malaria rapid diagnostic test quality control and proficiency testing in Ethiopia.

    PubMed

    Tamiru, Afework; Boulanger, Lucy; Chang, Michelle A; Malone, Joseph L; Aidoo, Michael

    2015-01-21

    Rapid diagnostic tests (RDTs) are now widely used for laboratory confirmation of suspected malaria cases to comply with the World Health Organization recommendation for universal testing before treatment. However, many malaria programmes lack quality control (QC) processes to assess RDT use under field conditions. Prior research showed the feasibility of using the dried tube specimen (DTS) method for preserving Plasmodium falciparum parasites for use as QC samples for RDTs. This study focused on the use of DTS for RDT QC and proficiency testing under field conditions. DTS were prepared using cultured P. falciparum at densities of 500 and 1,000 parasites/μL; 50 μL aliquots of these along with parasite negative human blood controls (0 parasites/μL) were air-dried in specimen tubes and reactivity verified after rehydration. The DTS were used in a field study in the Oromia Region of Ethiopia. Replicate DTS samples containing 0, 500 and 1,000 parasites/μL were stored at 4°C at a reference laboratory and at ambient temperatures at two nearby health facilities. At weeks 0, 4, 8, 12, 16, 20, and 24, the DTS were rehydrated and tested on RDTs stored under manufacturer-recommended temperatures at the RL and on RDTs stored under site-specific conditions at the two health facilities. Reactivity of DTS stored at 4°C at the reference laboratory on RDTs stored at the reference laboratory was considered the gold standard for assessing DTS stability. A proficiency-testing panel consisting of one negative and three positive samples, monitored with a checklist was administered at weeks 12 and 24. At all the seven time points, DTS stored at both the reference laboratory and health facility were reactive on RDTs stored under the recommended temperature and under field conditions, and the DTS without malaria parasites were negative. 
At the reference laboratory and one health facility, a 500 parasites/μL DTS from the proficiency panel was falsely reported as negative at week 24 due to errors in interpreting faint test lines. The DTS method can be used under field conditions to supplement other RDT QC methods and to assess health worker proficiency in Ethiopia and possibly other malaria-endemic countries.

  18. High-Temperature Fluid-Wall Reactor Technology Research, Test and Evaluation Performed at Naval Construction Battalion Center, Gulfport, MS, for the USAF Installation/Restoration Program

    DTIC Science & Technology

    1988-01-01

    under field conditions. Sampling and analytical laboratory activities were performed by Ecology and Environment, Inc., and California Analytical...the proposed AER3 test conditions. All test samples would be obtained onsite by Ecology and Environment, Inc., of Buffalo, New York, and sent to...ensuring its safe operation. Ecology and Environment performed onsite verification sampling. This activity was coordinated with the Huber project team

  19. Theoretical study on the laser-driven ion-beam trace probe in toroidal devices with large poloidal magnetic field

    NASA Astrophysics Data System (ADS)

    Yang, X.; Xiao, C.; Chen, Y.; Xu, T.; Yu, Y.; Xu, M.; Wang, L.; Wang, X.; Lin, C.

    2018-03-01

    Recently, a new diagnostic method, the Laser-driven Ion-beam Trace Probe (LITP), has been proposed to reconstruct 2D profiles of the poloidal magnetic field (Bp) and radial electric field (Er) in tokamak devices. A linear assumption and a test particle model were used in those reconstructions. In some toroidal devices, such as the spherical tokamak and the Reversed Field Pinch (RFP), Bp is not small enough to satisfy the linear assumption. In those cases, the reconstruction error increases quickly once Bp exceeds 10% of the toroidal magnetic field (Bt), and the previous test particle model may introduce large errors in the tomography process. Here a nonlinear reconstruction method is proposed for those cases. Preliminary numerical results show that LITP could be applied not only in tokamak devices, but also in other toroidal devices, such as the spherical tokamak, RFP, etc.

  20. Large-scale variability of wind erosion mass flux rates at Owens Lake 1. Vertical profiles of horizontal mass fluxes of wind-eroded particles with diameter greater than 50 μm

    USGS Publications Warehouse

    Gillette, Dale A.; Fryrear, D.W.; Xiao, Jing Bing; Stockton, Paul; Ono, Duane; Helm, Paula J.; Gill, Thomas E; Ley, Trevor

    1997-01-01

    A field experiment at Owens (dry) Lake, California, tested whether and how the relative profiles of airborne horizontal mass fluxes for >50-μm wind-eroded particles changed with friction velocity. The horizontal mass flux at almost all measured heights increased proportionally to the cube of friction velocity above an apparent threshold friction velocity for all sediment tested, and increased with height except at one coarse-sand site where the relative horizontal mass flux profile did not change with friction velocity. Size distributions for long-time-averaged horizontal mass flux samples showed a saltation layer extending from the surface to a height between 30 and 50 cm, above which suspended particles dominate. Measurements from a large dust source area on a line parallel to the wind showed that even though the saltation flux reached equilibrium ∼650 m downwind of the starting point of erosion, weakly suspended particles were still being input into the atmosphere 1567 m downwind of the starting point; thus the saltating fraction of the total mass flux decreased beyond 650 m. The scale length difference and the 70:30 ratio of suspended mass flux to saltation mass flux at the farthest downwind sampling site confirm that suspended particles are very important for mass budgets in large source areas and that saltation mass flux can be a variable fraction of total horizontal mass flux for soils with a substantial fraction of <100-μm particles.
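    The cubic flux scaling described above can be sketched numerically. This is a minimal reading of the abstract, not the authors' model: the exact functional form near the threshold and the proportionality constant `c` are assumptions for illustration.

```python
def horizontal_mass_flux(u_star, u_star_threshold, c=1.0):
    """Horizontal mass flux as a hypothetical cubic function of friction
    velocity u_star (m/s): zero at or below the apparent threshold,
    proportional to u_star**3 above it. The constant c is a site-specific
    placeholder, not a value from the paper."""
    if u_star <= u_star_threshold:
        return 0.0
    return c * u_star ** 3
```

    With this form, doubling the friction velocity well above threshold increases the flux eightfold, which is what cubic scaling implies.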

  1. The Gumbel hypothesis test for left censored observations using regional earthquake records as an example

    NASA Astrophysics Data System (ADS)

    Thompson, E. M.; Hewlett, J. B.; Baise, L. G.; Vogel, R. M.

    2011-01-01

    Annual maximum (AM) time series are incomplete (i.e., censored) when no events are included above the assumed censoring threshold (i.e., magnitude of completeness). We introduce a distributional hypothesis test for left-censored Gumbel observations based on the probability plot correlation coefficient (PPCC). Critical values of the PPCC hypothesis test statistic are computed from Monte Carlo simulations and are a function of sample size, censoring level, and significance level. When applied to a global catalog of earthquake observations, the left-censored Gumbel PPCC tests are unable to reject the Gumbel hypothesis for 45 of 46 seismic regions. We apply four different field significance tests for combining individual tests into a collective hypothesis test. None of the field significance tests are able to reject the global hypothesis that AM earthquake magnitudes arise from a Gumbel distribution. Because the field significance levels are not conclusive, we also compute the likelihood that these field significance tests are unable to reject the Gumbel model when the samples arise from a more complex distributional alternative. A power study documents that the censored Gumbel PPCC test is unable to reject some important and viable Generalized Extreme Value (GEV) alternatives. Thus, we cannot rule out the possibility that the global AM earthquake time series could arise from a GEV distribution with a finite upper bound, also known as a reverse Weibull distribution. Our power study also indicates that the binomial and uniform field significance tests are substantially more powerful than the more commonly used Bonferroni and false discovery rate multiple comparison procedures.
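    The PPCC machinery can be sketched for the simpler uncensored case. This is an illustration, not the authors' code: the Gringorten plotting positions, the simulation count, and the omission of censoring are all assumptions.

```python
import numpy as np

def gumbel_ppcc(x):
    """Probability plot correlation coefficient against the Gumbel
    distribution: Pearson correlation between the sorted sample and the
    Gumbel reduced variates at Gringorten plotting positions."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    p = (np.arange(1, n + 1) - 0.44) / (n + 0.12)  # Gringorten positions
    q = -np.log(-np.log(p))                         # Gumbel reduced variates
    return np.corrcoef(x, q)[0, 1]

def critical_value(n, alpha=0.05, nsim=500, rng=None):
    """Monte Carlo critical value: the alpha-quantile of the PPCC statistic
    under the Gumbel null, for sample size n. The null is rejected when an
    observed PPCC falls below this value."""
    rng = np.random.default_rng(rng)
    stats = [gumbel_ppcc(rng.gumbel(size=n)) for _ in range(nsim)]
    return np.quantile(stats, alpha)
```

    The paper extends this scheme to left-censored samples, where the critical values also depend on the censoring level.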

  2. Evaluation of children with ADHD on the Ball-Search Field Task

    PubMed Central

    Rosetti, Marcos F.; Ulloa, Rosa E.; Vargas-Vargas, Ilse L.; Reyes-Zamorano, Ernesto; Palacios-Cruz, Lino; de la Peña, Francisco; Larralde, Hernán; Hudson, Robyn

    2016-01-01

    Searching, defined for the purpose of the present study as the displacement of an individual to locate resources, is a fundamental behavior of all mobile organisms. In humans this behavior underlies many aspects of everyday life, involving cognitive processes such as sustained attention, memory and inhibition. We explored the performance of 36 treatment-free children diagnosed with attention-deficit hyperactivity disorder (ADHD) and 132 children from a control school sample on the ecologically based ball-search field task (BSFT), which required them to locate and collect golf balls in a large outdoor area. Children of both groups enjoyed the task and were motivated to participate in it. However, performance showed that ADHD-diagnosed subjects were significantly less efficient in their searching. We suggest that the BSFT provides a promising basis for developing more complex ecologically-derived tests that might help to better identify particular cognitive processes and impairments associated with ADHD. PMID:26805450

  3. Diagnostic performance characteristics of a rapid field test for anthrax in cattle.

    PubMed

    Muller, Janine; Gwozdz, Jacek; Hodgeman, Rachel; Ainsworth, Catherine; Kluver, Patrick; Czarnecki, Jill; Warner, Simone; Fegan, Mark

    2015-07-01

    Although diagnosis of anthrax can be made in the field with a peripheral blood smear, and in the laboratory with bacterial culture or molecular-based tests, these tests require either considerable experience or specialised equipment. Here we report on the evaluation of the diagnostic sensitivity and specificity of a simple and rapid in-field diagnostic test for anthrax, the anthrax immunochromatographic test (AICT). The AICT detects the protective antigen (PA) component of the anthrax toxin present within the blood of an animal that has died from anthrax. The test provides a result in 15 min and offers the advantage of avoiding the necessity for on-site necropsy and the associated occupational risks and environmental contamination. The specificity of the test was determined by testing samples taken from 622 animals not infected with Bacillus anthracis. Diagnostic sensitivity was estimated on samples taken from 58 animals naturally infected with B. anthracis, collected over a 10-year period. All samples used to estimate the diagnostic sensitivity and specificity of the AICT were also tested using the gold standard of bacterial culture. The diagnostic specificity of the test was estimated to be 100% (99.4-100%; 95% CI) and the diagnostic sensitivity was estimated to be 93.1% (83.3-98.1%; 95% CI) (Clopper-Pearson method). Four samples produced false-negative AICT results. These were among 9 samples, all of which tested positive for B. anthracis by culture, where there was a time delay between collection and testing of >48 h and/or the samples were collected from animals that were >48 h post-mortem. A statistically significant difference (P<0.001; Fisher's exact test) was found between the ability of the AICT to detect PA in samples from culture-positive animals <48 h post-mortem (49 of 49; Se=100%, 92.8-100%; 95% CI) compared with samples tested >48 h post-mortem (5 of 9; Se=56%, 21-86.3%; 95% CI) (Clopper-Pearson method). Based on these results, a post hoc cut-off of 48 h post-mortem was applied for use of the AICT, giving Se=100% (92.8-100%; 95% CI) and Sp=100% (99.4-100%; 95% CI). The high diagnostic sensitivity and specificity and the simplicity of the AICT enable it to be used for active surveillance in areas with a history of anthrax, or as a preliminary tool in investigating sudden, unexplained death in cattle. Copyright © 2015 Elsevier B.V. All rights reserved.
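    The Clopper-Pearson intervals quoted above follow directly from the raw counts (54 of 58 for sensitivity; 49 of 49 below the post-mortem cut-off). A stdlib-only sketch, using bisection on the exact binomial CDF rather than a beta-quantile routine:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), by direct summation."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) two-sided confidence interval for a binomial
    proportion, found by bisection on the CDF, which is decreasing in p."""
    def solve(f):
        lo, hi = 0.0, 1.0
        for _ in range(60):
            mid = (lo + hi) / 2
            if f(mid):
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    # Lower bound: P(X >= k | p) = alpha/2, i.e. CDF(k-1) = 1 - alpha/2.
    lower = 0.0 if k == 0 else solve(lambda p: binom_cdf(k - 1, n, p) > 1 - alpha / 2)
    # Upper bound: P(X <= k | p) = alpha/2.
    upper = 1.0 if k == n else solve(lambda p: binom_cdf(k, n, p) > alpha / 2)
    return lower, upper
```

    Running `clopper_pearson(54, 58)` reproduces the 83.3-98.1% sensitivity interval, and `clopper_pearson(49, 49)` the 92.8-100% interval reported for the <48 h samples.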

  4. Structural morphology of zinc oxide structures with antibacterial application of calamine lotion

    NASA Astrophysics Data System (ADS)

    Ann, Ling Chuo; Mahmud, Shahrom; Bakhori, Siti Khadijah Mohd; Sirelkhatim, Amna; Mohamad, Dasmawati; Hasan, Habsah; Seeni, Azman; Rahman, Rosliza Abdul

    2015-04-01

    In this study, we report the structural morphology of a zinc oxide (ZnO) sample and the antibacterial application of the ZnO structures in calamine lotion. Antibacterial activities of the calamine lotion towards Staphylococcus aureus and Pseudomonas aeruginosa were investigated. The structural morphology of the ZnO sample was studied using a transmission electron microscope (TEM) and a field-emission scanning electron microscope (FESEM). The ZnO sample consisted of many rod and spherical structures, with particle sizes ranging from 40 nm to 150 nm. A calamine lotion was prepared by mixing the ZnO structures with other constituents in suitable proportions. Energy-dispersive x-ray spectroscopy (EDS) revealed the presence of a large amount of ZnO structures, while the X-ray diffraction (XRD) results showed good crystallinity of the ZnO in the calamine lotion mixture. FESEM imaging showed that the morphology of the ZnO structures remained unchanged in the calamine lotion mixture. In the antibacterial test, the prepared calamine lotion exhibited remarkable inhibition of Staphylococcus aureus and Pseudomonas aeruginosa after 24 h of treatment. The bactericidal capability of the calamine lotion was largely due to the presence of the ZnO structures, which induce high toxicity and a killing effect on the bacteria.

  5. Laser Light-field Fusion for Wide-field Lensfree On-chip Phase Contrast Microscopy of Nanoparticles

    NASA Astrophysics Data System (ADS)

    Kazemzadeh, Farnoud; Wong, Alexander

    2016-12-01

    Wide-field lensfree on-chip microscopy, which leverages holography principles to capture interferometric light-field encodings without lenses, is an emerging imaging modality with widespread interest given the large field-of-view compared to lens-based techniques. In this study, we introduce the idea of laser light-field fusion for lensfree on-chip phase contrast microscopy for detecting nanoparticles, where interferometric laser light-field encodings acquired using a lensfree, on-chip setup with laser pulsations at different wavelengths are fused to produce marker-free phase contrast images of particles at the nanometer scale. As a proof of concept, we demonstrate, for the first time, a wide-field lensfree on-chip instrument successfully detecting 300 nm particles across a large field-of-view of ~30 mm2 without any specialized or intricate sample preparation, or the use of synthetic aperture- or shift-based techniques.

  6. Laser Light-field Fusion for Wide-field Lensfree On-chip Phase Contrast Microscopy of Nanoparticles.

    PubMed

    Kazemzadeh, Farnoud; Wong, Alexander

    2016-12-13

    Wide-field lensfree on-chip microscopy, which leverages holography principles to capture interferometric light-field encodings without lenses, is an emerging imaging modality with widespread interest given the large field-of-view compared to lens-based techniques. In this study, we introduce the idea of laser light-field fusion for lensfree on-chip phase contrast microscopy for detecting nanoparticles, where interferometric laser light-field encodings acquired using a lensfree, on-chip setup with laser pulsations at different wavelengths are fused to produce marker-free phase contrast images of particles at the nanometer scale. As a proof of concept, we demonstrate, for the first time, a wide-field lensfree on-chip instrument successfully detecting 300 nm particles across a large field-of-view of ~30 mm2 without any specialized or intricate sample preparation, or the use of synthetic aperture- or shift-based techniques.

  7. An optimised protocol for molecular identification of Eimeria from chickens

    PubMed Central

    Kumar, Saroj; Garg, Rajat; Moftah, Abdalgader; Clark, Emily L.; Macdonald, Sarah E.; Chaudhry, Abdul S.; Sparagano, Olivier; Banerjee, Partha S.; Kundu, Krishnendu; Tomley, Fiona M.; Blake, Damer P.

    2014-01-01

    Molecular approaches supporting identification of Eimeria parasites infecting chickens have been available for more than 20 years, although they have largely failed to replace traditional measures such as microscopy and pathology. Limitations of microscopy-led diagnostics, including a requirement for specialist parasitological expertise and low sample throughput, are yet to be outweighed by the difficulties associated with accessing genomic DNA from environmental Eimeria samples. A key step towards the use of Eimeria species-specific PCR as a sensitive and reproducible discriminatory tool for use in the field is the production of a standardised protocol that includes sample collection and DNA template preparation, as well as primer selection from the numerous PCR assays now published. Such a protocol will facilitate development of valuable epidemiological datasets which may be easily compared between studies and laboratories. The outcome of an optimisation process undertaken in laboratories in India and the UK is described here, identifying four steps. First, samples were collected into a 2% (w/v) potassium dichromate solution. Second, oocysts were enriched by flotation in saturated saline. Third, genomic DNA was extracted using a QIAamp DNA Stool mini kit protocol including a mechanical homogenisation step. Finally, nested PCR was carried out using previously published primers targeting the internal transcribed spacer region 1 (ITS-1). Alternative methods tested included sample processing in the presence of faecal material, DNA extraction using a traditional phenol/chloroform protocol, the use of SCAR multiplex PCR (one tube and two tube versions) and speciation using the morphometric tool COCCIMORPH for the first time with field samples. PMID:24138724

  8. Comparison of geochemical data obtained using four brine sampling methods at the SECARB Phase III Anthropogenic Test CO2 injection site, Citronelle Oil Field, Alabama

    USGS Publications Warehouse

    Conaway, Christopher; Thordsen, James J.; Manning, Michael A.; Cook, Paul J.; Trautz, Robert C.; Thomas, Burt; Kharaka, Yousif K.

    2016-01-01

    The chemical composition of formation water and associated gases from the lower Cretaceous Paluxy Formation was determined using four different sampling methods at a characterization well in the Citronelle Oil Field, Alabama, as part of the Southeast Regional Carbon Sequestration Partnership (SECARB) Phase III Anthropogenic Test, which is an integrated carbon capture and storage project. In this study, formation water and gas samples were obtained from well D-9-8 #2 at Citronelle using gas lift, electric submersible pump, U-tube, and a downhole vacuum sampler (VS) and subjected to both field and laboratory analyses. Field chemical analyses included electrical conductivity, dissolved sulfide concentration, alkalinity, and pH; laboratory analyses included major, minor and trace elements, dissolved carbon, volatile fatty acids, free and dissolved gas species. The formation water obtained from this well is a Na–Ca–Cl-type brine with a salinity of about 200,000 mg/L total dissolved solids. Differences were evident between sampling methodologies, particularly in pH, Fe and alkalinity. There was little gas in samples, and gas composition results were strongly influenced by sampling methods. The results of the comparison demonstrate the difficulty and importance of preserving volatile analytes in samples, with the VS and U-tube system performing most favorably in this aspect.

  9. Permeability and compressibility of resedimented Gulf of Mexico mudrock

    NASA Astrophysics Data System (ADS)

    Betts, W. S.; Flemings, P. B.; Schneider, J.

    2011-12-01

    We use a constant-rate-of-strain consolidation test on resedimented Gulf of Mexico mudrock to determine the compression index (Cc) to be 0.618 and the expansion index (Ce) to be 0.083. We used crushed, homogenized Pliocene and Pleistocene mudrock extracted from cored wells in the Eugene Island block 330 oil field. This powdered material has a liquid limit (LL) of 87, a plastic limit (PL) of 24, and a plasticity index (PI) of 63. The particle size distribution from hydrometer analyses is approximately 65% clay-sized particles (<2 μm), with the remainder less than 70 μm in diameter. Resedimented specimens have been used to characterize the geotechnical and geophysical behavior of soils and mudstones independent of the variability of natural samples and without the effects of sampling disturbance. Previous investigations of resedimented offshore Gulf of Mexico sediments (e.g. Mazzei, 2008) have been limited in scope. This is the first test of the homogenized Eugene Island core material. These results will be compared to in situ measurements to determine the controls on consolidation over large stress ranges.

  10. Efficient bootstrap estimates for tail statistics

    NASA Astrophysics Data System (ADS)

    Breivik, Øyvind; Aarnes, Ole Johan

    2017-03-01

    Bootstrap resamples can be used to investigate the tail of empirical distributions as well as return value estimates from the extremal behaviour of the sample. Specifically, the confidence intervals on return value estimates or bounds on in-sample tail statistics can be obtained using bootstrap techniques. However, non-parametric bootstrapping from the entire sample is expensive. It is shown here that it suffices to bootstrap from a small subset consisting of the highest entries in the sequence to make estimates that are essentially identical to bootstraps from the entire sample. Similarly, bootstrap estimates of confidence intervals of threshold return estimates are found to be well approximated by using a subset consisting of the highest entries. This has practical consequences in fields such as meteorology, oceanography and hydrology where return values are calculated from very large gridded model integrations spanning decades at high temporal resolution or from large ensembles of independent and identically distributed model fields. In such cases the computational savings are substantial.
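    The paper's efficiency idea can be sketched for the m-th largest order statistic. In a full resample of size n, the number of draws landing in the top-k pool is Binomial(n, k/n), and conditional on that count being at least m, the resample's m-th largest value is exactly the m-th largest of the pool draws. A minimal illustration (function names and defaults are ours, not the paper's):

```python
import numpy as np

def boot_mth_largest_full(x, m, nboot, rng):
    """Reference: bootstrap the m-th largest value by resampling the
    entire sample with replacement each time."""
    n = len(x)
    out = np.empty(nboot)
    for b in range(nboot):
        res = rng.choice(x, size=n, replace=True)
        out[b] = np.sort(res)[-m]
    return out

def boot_mth_largest_tail(x, m, k, nboot, rng):
    """Efficient version: only the k largest entries are ever needed.
    Draw the Binomial(n, k/n) count of resample points falling in the
    top-k pool, then resample just that many values from the pool."""
    n = len(x)
    pool = np.sort(x)[-k:]
    out = np.empty(nboot)
    for b in range(nboot):
        j = rng.binomial(n, k / n)
        if j < m:  # vanishingly rare when k >> m; fall back to pool minimum
            out[b] = pool[0]
        else:
            out[b] = np.sort(rng.choice(pool, size=j, replace=True))[-m]
    return out
```

    The two bootstrap distributions coincide (conditional on at least m draws hitting the pool), so confidence intervals computed from the tail subset match full-sample ones at a fraction of the cost.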

  11. SMERFS: Stochastic Markov Evaluation of Random Fields on the Sphere

    NASA Astrophysics Data System (ADS)

    Creasey, Peter; Lang, Annika

    2018-04-01

    SMERFS (Stochastic Markov Evaluation of Random Fields on the Sphere) creates large realizations of random fields on the sphere. It uses a fast algorithm based on Markov properties and 1D fast Fourier transforms that generates samples on an n × n grid in O(n^2 log n) and efficiently derives the necessary conditional covariance matrices.
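    SMERFS itself works on the sphere with Markov conditioning; as a much simpler illustration of the FFT ingredient alone, a stationary Gaussian random field on a periodic 1D grid can be generated by shaping white noise in Fourier space. The power-law spectrum below is an arbitrary choice for the sketch, not part of SMERFS.

```python
import numpy as np

def gaussian_random_field_1d(n, power=-2.0, rng=None):
    """Toy stationary Gaussian random field on a periodic 1D grid of n
    points: draw complex white noise for each Fourier mode, scale by a
    power-law amplitude, and invert with a real FFT. The k=0 (mean) mode
    is zeroed, so the field has zero mean by construction."""
    rng = np.random.default_rng(rng)
    k = np.fft.rfftfreq(n) * n                  # mode numbers 0 .. n/2
    amp = np.zeros(k.shape)
    amp[1:] = k[1:] ** (power / 2)              # power-law spectrum
    noise = rng.normal(size=k.shape) + 1j * rng.normal(size=k.shape)
    return np.fft.irfft(amp * noise, n=n)
```

    The FFT makes each realization cost O(n log n), which is the same ingredient that gives SMERFS its O(n^2 log n) scaling on an n × n grid.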

  12. Graphene/graphene oxide and their derivatives in the separation/isolation and preconcentration of protein species: A review.

    PubMed

    Chen, Xuwei; Hai, Xin; Wang, Jianhua

    2016-05-30

    Distinctive electrical, chemical and optical properties make graphene/graphene oxide-based materials popular in the field of analytical chemistry. Their large surface area offers excellent capacity to anchor target analytes, making them powerful sorbents for the adsorption and preconcentration of trace-level analytes of interest in sample preparation. The large delocalized π-electron system of the graphene framework provides strong affinity for species containing aromatic rings, such as proteins, and the abundant active sites on its surface make it possible to modulate the adsorption tendency towards specific proteins via functional modification/decoration. This review provides an overview of current research on graphene/graphene oxide-based materials as attractive and powerful adsorption media for the separation/isolation and preconcentration of protein species from biological sample matrixes. These practices aim to provide protein samples of high purity for further investigation and application, or to achieve a certain degree of enrichment prior to quantitative assay. In addition, the challenges and future perspectives in the related research fields are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. A new FIA-Type strategic inventory (NFI)

    Treesearch

    Richard A. Grotefendt; Hans T. Schreuder

    2006-01-01

    New remote sensing technologies are now available to lower the cost of doing strategic surveys. A new sampling approach for the Forest Inventory and Analysis program (FIA) of the U.S.D.A. Forest Service is discussed involving a bi-sampling unit (BSU) that is composed of a field sample unit (FSU) centered within a large scale (1:1,000 to 1:3,000) photo sample unit (PSU...

  14. Test of Relativistic Gravity for Propulsion at the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Felber, Franklin

    2010-01-01

    A design is presented of a laboratory experiment that could test the suitability of relativistic gravity for propulsion of spacecraft to relativistic speeds. An exact time-dependent solution of Einstein's gravitational field equation confirms that even the weak field of a mass moving at relativistic speeds could serve as a driver to accelerate a much lighter payload from rest to a good fraction of the speed of light. The time-dependent field of ultrarelativistic particles in a collider ring is calculated. An experiment is proposed as the first test of the predictions of general relativity in the ultrarelativistic limit by measuring the repulsive gravitational field of bunches of protons in the Large Hadron Collider (LHC). The estimated 'antigravity beam' signal strength at a resonant detector of each proton bunch is 3 nm/s2 for 2 ns during each revolution of the LHC. This experiment can be performed off-line, without interfering with the normal operations of the LHC.

  15. The cosmological principle is not in the sky

    NASA Astrophysics Data System (ADS)

    Park, Chan-Gyung; Hyun, Hwasu; Noh, Hyerim; Hwang, Jai-chan

    2017-08-01

    The homogeneity of the matter distribution at large scales, known as the cosmological principle, is a central assumption of the standard cosmological model. The assumption is testable, however, and thus no longer needs to be taken as a principle. Here we perform a test for spatial homogeneity using the Sloan Digital Sky Survey Luminous Red Galaxies (LRG) sample by counting galaxies within a specified volume, with the radius scale varying up to 300 h-1 Mpc. We directly confront the large-scale structure data with the definition of spatial homogeneity by comparing the averages and dispersions of galaxy number counts with the ranges allowed for a random (homogeneous) distribution. The LRG sample shows significantly larger dispersions of number counts than the random catalogues up to the 300 h-1 Mpc scale, and even the average lies far outside the range allowed in the random distribution; the deviations are statistically impossible to realize in a random distribution. This implies that the cosmological principle does not hold even at such large scales. The same analysis of mock galaxies derived from an N-body simulation, however, suggests that the LRG sample is consistent with the current paradigm of cosmology; thus the simulation is also not homogeneous at that scale. We conclude that the cosmological principle is neither found in the observed sky nor demanded to be there by the standard cosmological world model. This reveals the nature of the cosmological principle adopted in the modern cosmology paradigm, and opens a new field of research in theoretical cosmology.
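    The statistic used above, the dispersion of counts-in-cells compared against a random catalogue, can be sketched in a few lines. For a genuinely random (Poisson) point set the variance of the counts is close to their mean, so a variance well above the mean signals clustering. The point counts, radii, and box below are arbitrary choices for the sketch.

```python
import numpy as np

def count_dispersion(points, centers, r):
    """Counts-in-cells statistic: for each centre, count the points within
    radius r, then return the mean and variance of the counts across
    centres. For a homogeneous Poisson catalogue, variance ~ mean."""
    counts = np.array([np.sum(np.linalg.norm(points - c, axis=1) < r)
                       for c in centers])
    return counts.mean(), counts.var()
```

    Applying this to a uniform random catalogue recovers a variance-to-mean ratio near 1; the paper's finding is that the LRG counts have a much larger dispersion than such random catalogues even at 300 h-1 Mpc.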

  16. Field spectrometer (S191H) preprocessor tape quality test program design document

    NASA Technical Reports Server (NTRS)

    Campbell, H. M.

    1976-01-01

    Program QA191H performs quality assurance tests on field spectrometer data recorded on 9-track magnetic tape. The quality testing involves the comparison of key housekeeping and data parameters with historic and predetermined tolerance limits. Samples of key parameters are processed during the calibration period and wavelength cal period, and the results are printed out and recorded on an historical file tape.

  17. A coronagraphic search for brown dwarfs around nearby stars

    NASA Technical Reports Server (NTRS)

    Nakajima, T.; Durrance, S. T.; Golimowski, D. A.; Kulkarni, S. R.

    1994-01-01

    Brown dwarf companions have been searched for around stars within 10 pc of the Sun using the Johns Hopkins University Adaptive Optics Coronagraph (AOC), a stellar coronagraph with an image stabilizer. The AOC covers the field around the target star with a minimum search radius of 1.5 arcsec and a field of view of 1 arcmin square. We have reached an unprecedented dynamic range of Delta m = 13 in our search for faint companions at I band. Comparison of our survey with other brown dwarf searches shows that the AOC technique is unique in its dynamic range while remaining just as sensitive to brown dwarfs as the recent brown dwarf surveys. The present survey covered 24 target stars selected from the Gliese catalog. A total of 94 stars were detected in 16 fields. The low-latitude fields are completely dominated by background star contamination. Kolmogorov-Smirnov tests were carried out for a sample restricted to high latitudes and a sample with small angular separations. The high-latitude sample (b greater than or equal to 44 deg) appears to show spatial concentration toward target stars. The small-separation sample (Delta Theta less than 20 arcsec) shows weaker dependence on Galactic coordinates than field stars. These statistical tests suggest that both the high-latitude sample and the small-separation sample can include a substantial fraction of true companions. However, the nature of these putative companions is mysterious. They are too faint to be white dwarfs and too blue to be brown dwarfs. Ignoring the significance of the statistical tests, we can reconcile most of the detections with distant main-sequence stars or white dwarfs, except for a candidate next to GL 475. Given the small size of our sample, we conclude that considerably more targets need to be surveyed before a firm conclusion on the possibility of a new class of companions can be made.

  18. Sequential accelerated tests: Improving the correlation of accelerated tests to module performance in the field

    NASA Astrophysics Data System (ADS)

    Felder, Thomas; Gambogi, William; Stika, Katherine; Yu, Bao-Ling; Bradley, Alex; Hu, Hongjie; Garreau-Iles, Lucie; Trout, T. John

    2016-09-01

    DuPont has been working steadily to develop accelerated backsheet tests that correlate with solar panel observations in the field. This report updates efforts in sequential testing. Single-exposure tests are more commonly used and can be completed more quickly, and certain tests provide helpful predictions of certain backsheet failure modes. DuPont recommendations for single-exposure tests are based on 25-year exposure levels for UV and humidity/temperature, and form a good basis for sequential test development. We recommend a sequential exposure of damp heat followed by UV, then repetitions of thermal cycling and UVA. This sequence preserves 25-year exposure levels for humidity/temperature and UV, and correlates well with a large body of field observations. Measurements can be taken at intervals during the test, although the full test runs 10 months. A second, shorter sequential test, based on damp heat and thermal cycling, tests mechanical durability and correlates with the loss of mechanical properties seen in the field. Ongoing work is directed toward shorter sequential tests that preserve good correlation to field data.

  19. Likelihood inference of non-constant diversification rates with incomplete taxon sampling.

    PubMed

    Höhna, Sebastian

    2014-01-01

    Large-scale phylogenies provide a valuable source for studying background diversification rates and investigating whether the rates have changed over time. Unfortunately, most large-scale, dated phylogenies are sparsely sampled (fewer than 5% of the described species) and taxon sampling is not uniform. Instead, taxa are frequently sampled to obtain at least one representative per subgroup (e.g. family) and thus to maximize diversity (diversified sampling). So far, such complications have been ignored, potentially biasing the conclusions that have been reached. In this study I derive the likelihood of a birth-death process with non-constant (time-dependent) diversification rates and diversified taxon sampling. Using simulations I test whether the true parameters and the sampling method can be recovered when the trees are small or medium sized (fewer than 200 taxa). The results show that the diversification rates can be inferred, and the estimates are unbiased for large trees but biased for small trees (fewer than 50 taxa). Furthermore, model selection by means of Akaike's Information Criterion favors the true model if the true rates differ sufficiently from alternative models (e.g. the birth-death model is recovered over a pure-birth model if the extinction rate is large). Finally, I applied six different diversification rate models, ranging from a constant-rate pure-birth process to a birth-death process with decreasing speciation rate, but excluding any rate-shift models, to three large-scale empirical phylogenies (ants, mammals and snakes, with 149, 164 and 41 sampled species, respectively). All three phylogenies were constructed by diversified taxon sampling, as stated by the authors, yet only the snake phylogeny supported diversified taxon sampling. Moreover, a parametric bootstrap test revealed that none of the tested models provided a good fit to the observed data. The model assumptions, such as homogeneous rates across species or no rate shifts, appear to be violated.
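    The likelihood-and-AIC workflow above can be illustrated with the simplest diversification model, a constant-rate pure-birth (Yule) process, where the inter-speciation waiting times give a closed-form MLE. This is a toy version only; the paper's models add time-dependent rates, extinction, and incomplete taxon sampling.

```python
from math import log

def yule_loglik(waiting_times, lam):
    """Log-likelihood of a pure-birth (Yule) process: while i lineages
    exist, the waiting time to the next speciation is Exponential(i*lam).
    waiting_times[j] is the interval during which j + 2 lineages exist."""
    ll = 0.0
    for j, t in enumerate(waiting_times):
        i = j + 2
        ll += log(i * lam) - i * lam * t
    return ll

def yule_mle(waiting_times):
    """Closed-form MLE: lam_hat = (#events) / sum_j ((j + 2) * t_j),
    from setting d/d lam of the log-likelihood to zero."""
    total = sum((j + 2) * t for j, t in enumerate(waiting_times))
    return len(waiting_times) / total

def aic(loglik, n_params):
    """Akaike's Information Criterion: 2k - 2 ln L (lower is better)."""
    return 2 * n_params - 2 * loglik
```

    Comparing AIC values across fitted models (here, the MLE against a fixed wrong rate) is the same model-selection step the study applies to its six diversification models.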

  20. Vertical Stratification of Soil Phosphorus as a Concern for Dissolved Phosphorus Runoff in the Lake Erie Basin.

    PubMed

    Baker, David B; Johnson, Laura T; Confesor, Remegio B; Crumrine, John P

    2017-11-01

    During the re-eutrophication of Lake Erie, dissolved reactive phosphorus (DRP) loading and concentrations to the lake have nearly doubled, while particulate phosphorus (PP) has remained relatively constant. One potential cause of increased DRP concentrations is P stratification, the buildup of soil-test P (STP) in the upper soil layer (<5 cm). Stratification often accompanies no-till and mulch-till practices that reduce erosion and PP loading, practices that have been widely implemented throughout the Lake Erie Basin. To evaluate the extent of P stratification in the Sandusky Watershed, certified crop advisors were enlisted to collect stratified soil samples (0-5 or 0-2.5 cm) alongside their normal agronomic samples (0-20 cm) (n = 1758 fields). The mean STP level in the upper 2.5 cm was 55% higher than the mean of the agronomic samples used for fertilizer recommendations. The amounts of stratification were highly variable and did not correlate with agronomic STP (Spearman's ρ = 0.039, P = 0.178). Agronomic STP in 70% of the fields was within the buildup or maintenance ranges for corn (Zea mays L.) and soybeans [Glycine max (L.) Merr.] (0-46 mg kg-1 Mehlich-3 P). The cumulative risks for DRP runoff from the large number of fields in the buildup and maintenance ranges exceeded the risks from fields above those ranges. Reducing stratification by a one-time soil inversion has the potential for larger and quicker reductions in DRP runoff risk than practices aimed at drawing down agronomic STP levels. Periodic soil inversion and mixing, targeted by stratified STP data, should be considered a viable practice to reduce DRP loading to Lake Erie. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
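    The Spearman correlation used above to test the stratification-STP relationship is simply the Pearson correlation of ranks. A small self-contained version (scipy.stats.spearmanr computes the same statistic along with a P-value):

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks,
    with tied values assigned their average rank."""
    def rank(a):
        a = np.asarray(a, dtype=float)
        order = np.argsort(a)
        ranks = np.empty(len(a))
        ranks[order] = np.arange(1, len(a) + 1)
        for v in np.unique(a):          # average ranks over ties
            mask = a == v
            ranks[mask] = ranks[mask].mean()
        return ranks
    return np.corrcoef(rank(x), rank(y))[0, 1]
```

    Because it uses ranks, the statistic is insensitive to the skewed distributions typical of soil-test data, which is presumably why the study reports Spearman rather than Pearson correlation.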

  1. Comparison of supervised machine learning algorithms for waterborne pathogen detection using mobile phone fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Ceylan Koydemir, Hatice; Feng, Steve; Liang, Kyle; Nadkarni, Rohan; Benien, Parul; Ozcan, Aydogan

    2017-06-01

    Giardia lamblia is a waterborne parasite that affects millions of people every year worldwide, causing a diarrheal illness known as giardiasis. Timely detection of the presence of the cysts of this parasite in drinking water is important to prevent the spread of the disease, especially in resource-limited settings. Here we provide extended experimental testing and evaluation of the performance and repeatability of a field-portable and cost-effective microscopy platform for automated detection and counting of Giardia cysts in water samples, including tap water, non-potable water, and pond water. This compact platform is based on our previous work, and is composed of a smartphone-based fluorescence microscope, a disposable sample processing cassette, and a custom-developed smartphone application. Our mobile phone microscope has a large field of view of 0.8 cm2 and weighs only 180 g, excluding the phone. A custom-developed smartphone application provides a user-friendly graphical interface, guiding the users to capture a fluorescence image of the sample filter membrane and analyze it automatically at our servers using an image processing algorithm and training data, consisting of >30,000 images of cysts and >100,000 images of other fluorescent particles that are captured, including, e.g. dust. The total time that it takes from sample preparation to automated cyst counting is less than an hour for each 10 ml of water sample that is tested. We compared the sensitivity and the specificity of our platform using multiple supervised classification models, including support vector machines and nearest neighbors, and demonstrated that a bootstrap aggregating (i.e. bagging) approach using raw image file format provides the best performance for automated detection of Giardia cysts. We evaluated the performance of this machine learning enabled pathogen detection device with water samples taken from different sources (e.g. 
tap water, non-potable water, pond water) and achieved a limit of detection of 12 cysts per 10 ml, an average cyst capture efficiency of 79%, and an accuracy of 95%. Providing rapid detection and quantification of waterborne pathogens without the need for a microbiology expert, this field-portable imaging and sensing platform running on a smartphone could be very useful for water quality monitoring in resource-limited settings.
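The classifier comparison described above can be sketched with off-the-shelf scikit-learn models. Everything below is a toy reconstruction: the features and labels are synthetic stand-ins, not the authors' >130,000 training images, and the model settings are illustrative assumptions.

```python
# Hedged sketch: comparing a bagging ensemble against SVM and k-NN for a
# two-class "cyst vs. other fluorescent particle" problem, in the spirit
# of the study above. Data are synthetic, not the authors' image features.
import numpy as np
from sklearn.ensemble import BaggingClassifier  # default base: decision tree
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic stand-ins for per-particle image features (e.g. intensity,
# size, eccentricity): 400 "cysts" and 400 "other" particles.
X = np.vstack([rng.normal(0.0, 1.0, size=(400, 3)),
               rng.normal(1.5, 1.0, size=(400, 3))])
y = np.array([0] * 400 + [1] * 400)

models = {
    "bagging": BaggingClassifier(n_estimators=50, random_state=0),
    "svm": SVC(),
    "5-nn": KNeighborsClassifier(n_neighbors=5),
}
# 5-fold cross-validated accuracy for each classifier.
accs = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}
for name, acc in accs.items():
    print(f"{name}: {acc:.2f}")
```

On real data, the paper's finding was that bagging on raw image files performed best; on this synthetic toy problem the three models are roughly comparable.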

  2. Effectiveness of Cool Roof Coatings with Ceramic Particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brehob, Ellen G; Desjarlais, Andre Omer; Atchley, Jerald Allen

    2011-01-01

    Liquid-applied coatings promoted as cool roof coatings, including several with ceramic particles, were tested at Oak Ridge National Laboratory (ORNL), Oak Ridge, Tenn., to quantify their thermal performance. Solar reflectance measurements were made for new and aged samples using a portable reflectometer (ASTM C1549, Standard Test Method for Determination of Solar Reflectance Near Ambient Temperature Using a Portable Solar Reflectometer) and for new samples using the integrating spheres method (ASTM E903, Standard Test Method for Solar Absorptance, Reflectance, and Transmittance of Materials Using Integrating Spheres). Thermal emittance was measured for the new samples using a portable emissometer (ASTM C1371, Standard Test Method for Determination of Emittance of Materials Near Room Temperature Using Portable Emissometers). Thermal conductivity of the coatings was measured using a FOX 304 heat flow meter (ASTM C518, Standard Test Method for Steady-State Thermal Transmission Properties by Means of the Heat Flow Meter Apparatus). The cool roof coatings had higher solar reflectance than the reference black and white materials, but there were no significant differences between coatings with and without ceramics. The coatings were applied to EPDM (ethylene propylene diene monomer) membranes and installed on the Roof Thermal Research Apparatus (RTRA), an instrumented facility at ORNL for testing roofs. Roof temperatures and heat flux through the roof were obtained for a year of exposure in east Tennessee. The field tests showed a significant reduction in required cooling compared with the black reference roof (~80 percent) and a modest reduction compared with the white reference roof (~33 percent). The coating material with the highest solar reflectivity (no ceramic particles) demonstrated the best overall thermal performance (a combination of reducing the cooling load cost while not incurring a large heating penalty cost), suggesting that solar reflectivity is the key characteristic for selecting cool roof coatings.

  3. Comparing Apollo and Mars Exploration Rover (MER) Operations Paradigms for Human Exploration During NASA Desert-Rats Science Operations

    NASA Technical Reports Server (NTRS)

    Yingst, R. A.; Cohen, B. A.; Ming, D. W.; Eppler, D. B.

    2011-01-01

    NASA's Desert Research and Technology Studies (D-RATS) field test is one of several analog tests that NASA conducts each year to combine operations development, technology advances and science under planetary surface conditions. The D-RATS focus is testing preliminary operational concepts for extravehicular activity (EVA) systems in the field using simulated surface operations and EVA hardware and procedures. For 2010, hardware included the Space Exploration Vehicles, Habitat Demonstration Units, Tri-ATHLETE, and a suite of new geology sample collection tools, including a self-contained GeoLab glove box for conducting in-field analysis of various collected rock samples. The D-RATS activities develop technical skills and experience for the mission planners, engineers, scientists, technicians, and astronauts responsible for realizing the goals of exploring planetary surfaces.

  4. Quantifying the Availability of Vertebrate Hosts to Ticks: A Camera-Trapping Approach

    PubMed Central

    Hofmeester, Tim R.; Rowcliffe, J. Marcus; Jansen, Patrick A.

    2017-01-01

    The availability of vertebrate hosts is a major determinant of the occurrence of ticks and tick-borne zoonoses in natural and anthropogenic ecosystems and thus drives disease risk for wildlife, livestock, and humans. However, it remains challenging to quantify the availability of vertebrate hosts in field settings, particularly for medium-sized to large-bodied mammals. Here, we present a method that uses camera traps to quantify the availability of warm-bodied vertebrates to ticks. The approach is to deploy camera traps at questing height at a representative sample of random points across the study area, measure the average photographic capture rate for vertebrate species, and then correct these rates for the effective detection distance. The resulting “passage rate” is a standardized measure of the frequency at which vertebrates approach questing ticks, which we show is proportional to contact rate. A field test across twenty 1-ha forest plots in the Netherlands indicated that this method effectively captures differences in wildlife assemblage composition between sites. Also, the relative abundances of three life stages of the sheep tick Ixodes ricinus from drag sampling were correlated with passage rates of deer, which agrees with the known association with this group of host species, suggesting that passage rate effectively reflects the availability of medium- to large-sized hosts to ticks. This method will facilitate quantitative studies of the relationship between densities of questing ticks and the availability of different vertebrate species—wild as well as domesticated species—in natural and anthropogenic settings. PMID:28770219
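The core of the method above is a standardization: raw photographic capture rates are corrected by the effective detection distance so that species with different detectabilities become comparable. The exact correction used by the authors may differ; the simple linear form and all numbers below are assumptions for illustration.

```python
# Hedged sketch of a "passage rate" style standardization: detections per
# camera-day, divided by the effective detection distance (EDD) so that a
# hard-to-detect small animal is not undercounted relative to a deer.
# The linear-in-EDD form and the numbers are illustrative assumptions.
def passage_rate(detections, camera_days, edd_m):
    """Detections per camera-day per metre of effective detection distance."""
    return detections / (camera_days * edd_m)

# Two hypothetical species with identical raw capture rates but different
# detectability: the less-detectable one gets the higher passage rate.
deer = passage_rate(detections=30, camera_days=100, edd_m=6.0)
mouse = passage_rate(detections=30, camera_days=100, edd_m=1.5)
print(deer, mouse)
```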

  5. Using SPMDs To Assess Natural Recovery Of PCB-Contaminated Sediments In Lake Hartwell, SC: I. A Field Test Of New In-Situ Deployment Methods

    EPA Science Inventory

    Results from the field testing of some innovative sampling methods developed to evaluate risk management strategies for polychlorinated biphenyl (PCB) contaminated sediments are presented. Semipermeable membrane devices (SPMDs) were combined with novel deployment methods to quan...

  6. The upper critical field of filamentary Nb3Sn conductors

    NASA Astrophysics Data System (ADS)

    Godeke, A.; Jewell, M. C.; Fischer, C. M.; Squitieri, A. A.; Lee, P. J.; Larbalestier, D. C.

    2005-05-01

    We have examined the upper critical field of a large and representative set of present multifilamentary Nb3Sn wires and one bulk sample over a temperature range from 1.4 K up to the zero-field critical temperature. Since all present wires use a solid-state diffusion reaction to form the A15 layers, inhomogeneities with respect to Sn content are inevitable, in contrast to some previously studied homogeneous samples. Our study emphasizes the effects that these inevitable inhomogeneities have on the field-temperature phase boundary. The property inhomogeneities are extracted from field-dependent resistive transitions, which we find broaden with increasing inhomogeneity. The upper 90%-99% of the transitions clearly separates alloyed and binary wires, but a pure, Cu-free binary bulk sample also exhibits a zero-temperature critical field that is comparable to the ternary wires. The highest μ0Hc2 values detected in the ternary wires are remarkably constant: the highest zero-temperature upper critical fields and zero-field critical temperatures fall within 29.5 ± 0.3 T and 17.8 ± 0.3 K, respectively, independent of the wire layout. The complete field-temperature phase boundary can be described very well with the relatively simple Maki-DeGennes model using a two-parameter fit, independent of composition, strain state, sample layout, or applied critical state criterion.
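The two-parameter Maki-DeGennes curve mentioned above can be sketched numerically. The implicit equation used here, ln(t) + ψ(1/2 + h/2t) − ψ(1/2) = 0 with t = T/Tc and h a reduced field, is the standard dirty-limit form from the literature (it reproduces the familiar Hc2(0) ≈ 0.69·Tc·|dHc2/dT| result); whether it matches the authors' exact parametrization is an assumption, and the plugged-in values simply reuse the Tc and Hc2(0) quoted in the abstract.

```python
# Sketch: evaluate a two-parameter (Tc, Hc2(0)) Maki-de Gennes style
# upper-critical-field curve by solving the dirty-limit implicit equation
#   ln(t) + psi(1/2 + h/(2t)) - psi(1/2) = 0,  t = T/Tc.
import numpy as np
from scipy.special import digamma
from scipy.optimize import brentq

PSI_HALF = digamma(0.5)

def reduced_hc2(t):
    """Reduced field h at reduced temperature t (0 < t <= 1)."""
    if t >= 1.0:
        return 0.0
    f = lambda h: np.log(t) + digamma(0.5 + h / (2.0 * t)) - PSI_HALF
    return brentq(f, 1e-12, 10.0)

def hc2(T, Tc, Hc2_0):
    """Hc2(T) in the same units as Hc2_0, from the two fit parameters."""
    h0 = reduced_hc2(1e-6)          # reduced field in the T -> 0 limit (~0.281)
    return Hc2_0 * reduced_hc2(T / Tc) / h0

# Illustration with the ternary-wire values quoted in the abstract:
print(round(hc2(4.2, Tc=17.8, Hc2_0=29.5), 1))  # Hc2 at 4.2 K, in tesla
```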

  7. Latest Results From the QuakeFinder Statistical Analysis Framework

    NASA Astrophysics Data System (ADS)

    Kappler, K. N.; MacLean, L. S.; Schneider, D.; Bleier, T.

    2017-12-01

    Since 2005 QuakeFinder (QF) has acquired a unique dataset with outstanding spatial and temporal sampling of earth's magnetic field along several active fault systems. This QF network consists of 124 stations in California and 45 stations along fault zones in Greece, Taiwan, Peru, Chile and Indonesia. Each station is equipped with three feedback induction magnetometers, two ion sensors, a 4 Hz geophone, a temperature sensor, and a humidity sensor. Data are continuously recorded at 50 Hz with GPS timing and transmitted daily to the QF data center in California for analysis. QF is attempting to detect and characterize anomalous EM activity occurring ahead of earthquakes. There have been many reports of anomalous variations in the earth's magnetic field preceding earthquakes. Specifically, several authors have drawn attention to apparent anomalous pulsations seen preceding earthquakes. Often, studies in long-term monitoring of seismic activity are limited by the availability of event data. It is particularly difficult to acquire a large dataset for rigorous statistical analyses of the magnetic field near earthquake epicenters because large events are relatively rare. Since QF has recorded hundreds of earthquakes in more than 70 TB of data, we developed an automated approach for finding statistical significance of precursory behavior and developed an algorithm framework. Previously QF reported on the development of an Algorithmic Framework for data processing and hypothesis testing. The particular instance of the algorithm we discuss identifies and counts magnetic variations from time series data and ranks each station-day according to the aggregate number of pulses in a time window preceding the day in question. If the hypothesis is true that magnetic field activity increases over some time interval preceding earthquakes, this should reveal itself by the station-days on which earthquakes occur receiving higher ranks than they would if the ranking scheme were random.
This can be analysed using the Receiver Operating Characteristic test. In this presentation we give a status report of our latest results, largely focussed on reproducibility of results, robust statistics in the presence of missing data, and exploring optimization landscapes in our parameter space.
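The ranking test described above maps directly onto a standard ROC analysis: treat the per-station-day pulse count as a score, earthquake occurrence as the label, and check whether the AUC exceeds the 0.5 expected under random ranking. The data below are synthetic, with an assumed precursory excess built in purely to illustrate the mechanics.

```python
# Illustrative ROC sketch (synthetic data, not QuakeFinder's): score each
# station-day by its preceding-window pulse count and test whether
# earthquake days systematically receive higher ranks.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n_days = 5000
eq_day = rng.random(n_days) < 0.01             # rare earthquake station-days
pulse_count = rng.poisson(5, n_days)           # background pulse activity
# Assumed (invented) precursory excess on earthquake days:
pulse_count[eq_day] += rng.poisson(3, eq_day.sum())

auc = roc_auc_score(eq_day, pulse_count)
print(f"AUC = {auc:.2f}")  # 0.5 would mean the ranking is no better than random
```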

  8. Sampling scales define occupancy and underlying occupancy-abundance relationships in animals

    Treesearch

    Robin Steenweg; Mark Hebblewhite; Jesse Whittington; Paul Lukacs; Kevin McKelvey

    2018-01-01

    Occupancy-abundance (OA) relationships are a foundational ecological phenomenon and field of study, and occupancy models are increasingly used to track population trends and understand ecological interactions. However, these two fields of ecological inquiry remain largely isolated, despite growing appreciation of the importance of integration. For example, using...

  9. Computational Cosmology: From the Early Universe to the Large Scale Structure.

    PubMed

    Anninos, Peter

    2001-01-01

    In order to account for the observable Universe, any comprehensive theory or model of cosmology must draw from many disciplines of physics, including gauge theories of strong and weak interactions, the hydrodynamics and microphysics of baryonic matter, electromagnetic fields, and spacetime curvature, for example. Although it is difficult to incorporate all these physical elements into a single complete model of our Universe, advances in computing methods and technologies have contributed significantly towards our understanding of cosmological models, the Universe, and astrophysical processes within them. A sample of numerical calculations (and numerical methods) applied to specific issues in cosmology are reviewed in this article: from the Big Bang singularity dynamics to the fundamental interactions of gravitational waves; from the quark-hadron phase transition to the large scale structure of the Universe. The emphasis, although not exclusively, is on those calculations designed to test different models of cosmology against the observed Universe.

  10. Computational Cosmology: from the Early Universe to the Large Scale Structure.

    PubMed

    Anninos, Peter

    1998-01-01

    In order to account for the observable Universe, any comprehensive theory or model of cosmology must draw from many disciplines of physics, including gauge theories of strong and weak interactions, the hydrodynamics and microphysics of baryonic matter, electromagnetic fields, and spacetime curvature, for example. Although it is difficult to incorporate all these physical elements into a single complete model of our Universe, advances in computing methods and technologies have contributed significantly towards our understanding of cosmological models, the Universe, and astrophysical processes within them. A sample of numerical calculations addressing specific issues in cosmology are reviewed in this article: from the Big Bang singularity dynamics to the fundamental interactions of gravitational waves; from the quark-hadron phase transition to the large scale structure of the Universe. The emphasis, although not exclusively, is on those calculations designed to test different models of cosmology against the observed Universe.

  11. Heterogeneous asymmetric recombinase polymerase amplification (haRPA) for rapid hygiene control of large-volume water samples.

    PubMed

    Elsäßer, Dennis; Ho, Johannes; Niessner, Reinhard; Tiehm, Andreas; Seidel, Michael

    2018-04-01

    Hygiene of drinking water is periodically controlled by cultivation and enumeration of indicator bacteria. Rapid and comprehensive measurements of emerging pathogens are of increasing interest to improve drinking water safety. In this study, the feasibility of detecting bacteriophage PhiX174 as a potential indicator for virus contamination in large volumes of water is demonstrated. Three consecutive concentration methods (continuous ultrafiltration, monolithic adsorption filtration, and centrifugal ultrafiltration) were combined to concentrate phages stepwise from 1250 L of drinking water into 1 mL. Heterogeneous asymmetric recombinase polymerase amplification (haRPA) was applied as the rapid detection method. Field measurements were conducted to test the developed system for hygiene online monitoring under realistic conditions. We could show that this system allows the detection of artificial contaminations of bacteriophage PhiX174 in drinking water pipelines. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. Validation of ELISAs for the detection of antibodies to Sarcoptes scabiei in pigs.

    PubMed

    van der Heijden, H M; Rambags, P G; Elbers, A R; van Maanen, C; Hunneman, W A

    2000-03-28

    An enzyme-linked immunosorbent assay (ELISA) was developed for the detection of antibodies to Sarcoptes scabiei. This 'Animal Health Service'-ELISA (AHS-ELISA) was compared with a commercial test (Checkit® Sarcoptest) using experimental and field sera. The experimental study was a contact infestation experiment. Eighty piglets were randomly divided between the experimental and control groups. After introduction of three Sarcoptes scabiei var. suis infested pigs into the experimental group, both groups were monitored by determining scratching indices and taking ear scrapings and blood samples in Weeks 0, 2, 4, 6, 8, 12 and 16. Four pigs in the control group were immunised with either Dermatophagoides pteronyssinus (Dp) antigens (n=2) or Acarus siro (As) antigens (n=2). In the control group, all non-immunised pigs were negative in all tests. In the experimental group, only slightly elevated scratching indices were observed, with a maximum in Week 8. The first positive ear scraping was found after 2 weeks (2.5%), and the highest proportion of positive ear scrapings was found in Week 8 (25.0%). Positive results in the Sarcoptest were first obtained in Week 12 (10.5% positive), and eventually 29.0% of the finishing pigs were positive after 16 weeks. The AHS-ELISA first detected a serological response after 6 weeks (5.0% positives), increasing until, after 16 weeks, a large proportion (74.2%) of the finishing pigs were seropositive, making the AHS-ELISA the most sensitive test. In the AHS-ELISA, one As-immunised pig remained seronegative, but the other hyper-immunised pigs cross-reacted. In the Sarcoptest, only Dp-immunised pigs had elevated optical densities (ODs), albeit below the cut-off level. Although hyper-immunisation does not represent field conditions, it cannot be excluded that the AHS-ELISA is not 100% specific. Field samples were taken from 20 sows in 30 herds, classified as mange-free, suspect, or infested. On a herd level there was high agreement between the ELISAs. Both serological tests were suitable for distinguishing mange-free herds from infested herds. In one infested herd, the decline of maternal antibody in piglets was studied by sampling 40 piglets from 20 different litters. The lowest average OD using the AHS-ELISA was found at 5 weeks of age, followed by a significant increase at 7 weeks. The average OD with the Sarcoptest was at a minimum level at 3 weeks, but no increase was found later. For screening of herds, interference from maternal antibodies is avoided by sampling at an age of 7 weeks or older.

  13. Insecticidal Management and Movement of the Brown Stink Bug, Euschistus servus, in Corn

    PubMed Central

    Reisig, Dominic D.

    2011-01-01

    In eastern North Carolina, some brown stink bugs, Euschistus servus (Say) (Hemiptera: Pentatomidae), are suspected to pass the F1 generation in wheat (Triticum aestivum L.) (Poales: Poaceae) before moving into corn (Zea mays L.) (Poales: Poaceae). These pests can injure corn ears as they develop. To test their effectiveness as a management tactic, pyrethroids were aerially applied to field corn in two experiments, one with 0.77 ha plots and another with 85 ha plots. Euschistus servus population abundance was monitored over time in both experiments and yield was assessed in the larger of the two experiments. In the smaller experiment, the populations were spatially monitored in a 6.3 ha area of corn adjacent to a recently harvested wheat field (352 sampling points of 6.1 row-meters in all but the first sampling event). Overall E. servus abundance decreased throughout the monitoring period in the sampling area of the smaller experiment, but remained unchanged over time in the large-scale experiment. During all sampling periods in both experiments, abundance was the same between treatments. Yield was unaffected by treatment where it was measured in the larger experiment. In the smaller experiment, E. servus were initially aggregated at the field edge of the corn (two, six and 13 days following the wheat harvest). Sixteen days following the wheat harvest they were randomly distributed in the corn. Although it was not directly measured, stink bugs are suspected to have moved to the cornfield edge as a result of the adjacent wheat harvest. More study of the biology of E. servus is needed, specifically in the areas of host preference, phenology and movement, to explain these phenomena and to produce better management strategies for these pests. PMID:22950984

  14. Comparison of in situ uranium KD values with a laboratory determined surface complexation model

    USGS Publications Warehouse

    Curtis, G.P.; Fox, P.; Kohler, M.; Davis, J.A.

    2004-01-01

    Reactive solute transport simulations in groundwater require a large number of parameters to describe hydrologic and chemical reaction processes. Appropriate methods for determining chemical reaction parameters required for reactive solute transport simulations are still under investigation. This work compares U(VI) distribution coefficients (i.e. KD values) measured under field conditions with KD values calculated from a surface complexation model developed in the laboratory. Field studies were conducted in an alluvial aquifer at a former U mill tailings site near the town of Naturita, CO, USA, by suspending approximately 10 g samples of Naturita aquifer background sediments (NABS) in 17 wells (5.1 cm diameter) for periods of 3 to 15 months. Adsorbed U(VI) on these samples was determined by extraction with a pH 9.45 NaHCO3/Na2CO3 solution. In wells where the chemical conditions in groundwater were nearly constant, adsorbed U concentrations for samples taken after 3 months of exposure to groundwater were indistinguishable from samples taken after 15 months. Measured in situ KD values calculated from the measurements of adsorbed and dissolved U(VI) ranged from 0.50 to 10.6 mL/g, and the KD values decreased with increasing groundwater alkalinity, consistent with increased formation of soluble U(VI)-carbonate complexes at higher alkalinities. The in situ KD values were compared with KD values predicted from a surface complexation model (SCM) developed under laboratory conditions in a separate study. Good agreement between the predicted and measured in situ KD values was observed. The demonstration that the laboratory-derived SCM can predict U(VI) adsorption in the field provides a critical independent test of a submodel used in a reactive transport model. © 2004 Elsevier Ltd. All rights reserved.
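The distribution coefficient compared above is just the ratio of adsorbed to dissolved concentration. A minimal sketch, with invented values chosen to fall in the reported 0.50-10.6 mL/g range:

```python
# Minimal sketch of the in situ distribution coefficient:
#   KD = adsorbed U(VI) per gram of sediment / dissolved U(VI) per mL.
# The concentrations below are hypothetical, for illustration only.
def kd(adsorbed_ug_per_g, dissolved_ug_per_ml):
    """Distribution coefficient in mL/g."""
    return adsorbed_ug_per_g / dissolved_ug_per_ml

print(kd(adsorbed_ug_per_g=0.84, dissolved_ug_per_ml=0.40))
```

The study's observation that KD falls as alkalinity rises corresponds to the denominator (dissolved U) growing when soluble U(VI)-carbonate complexes form.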

  15. Field Geologic Observation and Sample Collection Strategies for Planetary Surface Exploration: Insights from the 2010 Desert RATS Geologist Crewmembers

    NASA Technical Reports Server (NTRS)

    Hurtado, Jose M., Jr.; Young, Kelsey; Bleacher, Jacob E.; Garry, W. Brent; Rice, James W., Jr.

    2012-01-01

    Observation is the primary role of all field geologists, and geologic observations put into an evolving conceptual context will be the most important data stream relayed to Earth during a planetary exploration mission. Sample collection is also an important planetary field activity, and its success is closely tied to the quality of contextual observations. To test protocols for doing effective planetary geologic fieldwork, the Desert RATS (Research and Technology Studies) project deployed two prototype rovers for two weeks of simulated exploratory traverses in the San Francisco volcanic field of northern Arizona. The authors of this paper represent the geologist crew members who participated in the 2010 field test. We document the procedures adopted for Desert RATS 2010 and report on our experiences with these protocols. Careful consideration must be given to various issues that affect the interplay between field geologic observations and sample collection, including time management; strategies related to duplication of samples and observations; logistical constraints on the volume and mass of samples and the volume/transfer of data collected; and paradigms for evaluation of mission success. We find that the 2010 field protocols brought to light important aspects of each of these issues, and we recommend best practices and modifications to training and operational protocols to address them. Underlying our recommendations is the recognition that the capacity of the crew to flexibly execute their activities is paramount. Careful design of mission parameters, especially field geologic protocols, is critical for enabling crews to successfully meet their science objectives.

  16. Modelling lidar volume-averaging and its significance to wind turbine wake measurements

    NASA Astrophysics Data System (ADS)

    Meyer Forsting, A. R.; Troldborg, N.; Borraccino, A.

    2017-05-01

    Lidar velocity measurements need to be interpreted differently than conventional in-situ readings. A commonly ignored factor is "volume-averaging", which refers to the fact that lidars do not sample a single, distinct point but rather their entire beam length. In regions with large velocity gradients, such as the rotor wake, this effect can be detrimental. Hence, an efficient algorithm mimicking lidar flow sampling is presented, which considers both pulsed and continuous-wave lidar weighting functions. The flow field around a 2.3 MW turbine is simulated using Detached Eddy Simulation in combination with an actuator line to test the algorithm and investigate the potential impact of volume-averaging. Volume-averaging is captured accurately even with very few points discretising the lidar beam. The difference between a lidar and a point measurement is greatest at the wake edges and increases from 30% one rotor diameter (D) downstream of the rotor to 60% at 3D.
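The volume-averaging effect itself is easy to demonstrate numerically: the lidar reports a weighted average of the line-of-sight velocity along the beam, which smears out sharp features such as a wake deficit. The triangular probe weighting below is a common idealization for a pulsed lidar (the paper considers both pulsed and continuous-wave weighting functions), and the wake-like velocity profile is invented for illustration.

```python
# Sketch: a lidar "measurement" as a weighted average of velocity along
# the beam. Assumed triangular probe weighting and an invented Gaussian
# wake deficit; not the paper's DES flow field or exact weighting.
import numpy as np

def lidar_sample(u, s, focus, probe_half_length):
    """Weighted average of velocities u(s) around the focus position."""
    w = np.clip(1.0 - np.abs(s - focus) / probe_half_length, 0.0, None)
    return np.sum(w * u) / np.sum(w)

s = np.linspace(-100.0, 100.0, 2001)              # position along beam [m]
u = 8.0 - 4.0 * np.exp(-(s / 20.0) ** 2)          # wake-like velocity deficit

point = 8.0 - 4.0                                 # true velocity at wake centre
averaged = lidar_sample(u, s, focus=0.0, probe_half_length=30.0)
print(point, averaged)  # the averaged value under-resolves the deficit
```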

  17. Prediction of near-surface soil moisture at large scale by digital terrain modeling and neural networks.

    PubMed

    Lavado Contador, J F; Maneta, M; Schnabel, S

    2006-10-01

    The capability of artificial neural network models to forecast near-surface soil moisture at fine spatial resolution was tested for a 99.5 ha watershed located in SW Spain, using several easily obtained digital models of topographic and land-cover variables as inputs and a series of soil moisture measurements as the training data set. The study was designed to determine the potential of the neural network model as a tool for gaining insight into the factors controlling soil moisture distribution, and to optimize the data sampling scheme by finding the optimum size of the training data set. Results demonstrate the efficiency of the method in forecasting soil moisture, its usefulness as a tool for assessing the optimum number of field samples, and the importance of the selected variables in explaining the final map obtained.
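The approach above can be sketched as a small regression network mapping terrain and land-cover inputs to soil moisture. Everything below is a synthetic stand-in: the feature set, the moisture relation, and the sample sizes are assumptions, not the study's data or architecture.

```python
# Hedged sketch: a small neural network predicting near-surface soil
# moisture from terrain/land-cover inputs. All data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 600
slope = rng.uniform(0, 30, n)                 # slope [degrees]
twi = rng.uniform(2, 12, n)                   # topographic wetness index
cover = rng.integers(0, 2, n).astype(float)   # 0 = pasture, 1 = tree cover
# Invented ground truth: wetter at high TWI, drier on steep slopes.
moisture = 5.0 + 2.0 * twi - 0.3 * slope + 3.0 * cover + rng.normal(0, 1, n)

X = np.column_stack([slope, twi, cover])
X_tr, X_te, y_tr, y_te = train_test_split(X, moisture, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                                   random_state=0))
model.fit(X_tr, y_tr)
r2 = model.score(X_te, y_te)
print(f"held-out R^2: {r2:.2f}")
```

Varying the training-set size in a loop like this is one way to probe the "optimum number of field samples" question the study raises.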

  18. Keck Geology Consortium Lava Project: Undergraduate Research Linking Natural and Experimental Basaltic Lava Flows

    NASA Astrophysics Data System (ADS)

    Karson, J. A.; Hazlett, R. W.; Wysocki, R.; Bromfield, M. E.; Browne, N. C.; Davis, N. C.; Pelland, C. G.; Rowan, W. L.; Warner, K. A.

    2014-12-01

    Undergraduate students in the Keck Geology Consortium Lava Project participated in a month-long investigation of features of basaltic lava flows from two very different perspectives. The first half of the project focused on field relations in basaltic lava flows from the 1984 Krafla Fires eruption in northern Iceland. Students gained valuable experience in the collection of observations and samples in the field leading to hypotheses for the formation of selected features related to lava flow dynamics. Studies focused on a wide range of features including: morphology and heat loss in lava tubes (pyroducts), growth and collapse of lava ponds and overflow deposits, textural changes of lava falls (flow over steep steps), spaced spatter cones from flows over wet ground, and anisotropy of magnetic susceptibility related to flow kinematics. In the second half of the program students designed, helped execute, documented, and analyzed features similar to those they studied in the field with large-scale (50-250 kg) basaltic lava flows created in the Syracuse University Lava Project (http://lavaproject.syr.edu). Data collected included video from multiple perspectives, infrared thermal (FLIR) images, still images, detailed measurements of flow dimensions and rates, and samples for textural and magnetic analyses. Experimental lava flow features provided critical tests of hypotheses generated in the field and a refined understanding of the behavior and final morphology of basaltic lava flows. The linked field and experimental studies formed the basis for year-long independent research projects under the supervision of their faculty mentors, leading to senior theses at the students' respective institutions.

  19. Diagnostic performance of a novel loop-mediated isothermal amplification (LAMP) assay targeting the apicoplast genome for malaria diagnosis in a field setting in sub-Saharan Africa.

    PubMed

    Oriero, Eniyou C; Okebe, Joseph; Jacobs, Jan; Van Geertruyden, Jean-Pierre; Nwakanma, Davis; D'Alessandro, Umberto

    2015-10-09

    New diagnostic tools that reliably and rapidly detect asymptomatic and low-density malaria infections are needed, as treating such infections could interrupt transmission. Isothermal amplification techniques are being explored for field diagnosis of malaria. In this study, a novel molecular tool (loop-mediated isothermal amplification, LAMP) targeting the apicoplast genome of Plasmodium falciparum was evaluated for the detection of asymptomatic malaria-infected individuals in a rural setting in The Gambia. Blood was collected from 341 subjects (median age 9 years, range 1-68 years) screened for malaria. On site, a rapid diagnostic test (RDT, SD Bioline Malaria Antigen P.f) was performed, thick blood film (TBF) slides for microscopy were prepared, and dry blood spots (DBS) were collected on Whatman® 903 Specimen collection paper. The TBF and DBS were transported to the field laboratory, where microscopy and LAMP testing were performed. The latter was done on DNA extracted from the DBS using a crude (methanol/heating) extraction method. A laboratory-based PCR amplification was done on all the samples using DNA extracted with the Qiagen kit, and its results were taken as the reference for all the other tests. Plasmodium falciparum malaria prevalence was 37% (127/341) as detected by LAMP, 30% (104/341) by microscopy and 37% (126/341) by RDT. Compared to the reference PCR method, sensitivity was 92% for LAMP, 78% for microscopy, and 76% for RDT; specificity was 97% for LAMP, 99% for microscopy, and 88% for RDT. The area under the receiver operating characteristic (ROC) curve in comparison with the reference standard was 0.94 for LAMP, 0.88 for microscopy and 0.81 for RDT. Turn-around time for the entire LAMP assay was approximately 3 h and 30 min for an average of 27 ± 9.5 samples collected per day, compared to a minimum of 10 samples an hour per operator by RDT and over 8 h by microscopy. The LAMP assay could produce reliable results the same day as the screening.
It could detect a higher proportion of low density malaria infections than the other methods tested and may be used for large campaigns of systematic screening and treatment.
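The sensitivity, specificity, and predictive values quoted above follow from a 2 × 2 comparison of each test against the reference PCR. A minimal sketch of that arithmetic, using hypothetical confusion counts chosen only for illustration (not the study's raw data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard test-performance measures from a 2x2 confusion table,
    where the reference method defines true infection status."""
    return {
        "sensitivity": tp / (tp + fn),   # detected fraction of true positives
        "specificity": tn / (tn + fp),   # correctly negative fraction
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts giving 92% sensitivity and 97% specificity:
m = diagnostic_metrics(tp=46, fp=3, fn=4, tn=97)
print(f"sensitivity={m['sensitivity']:.2f}, specificity={m['specificity']:.2f}")
```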

  20. Infrared Testing of the Wide-field Infrared Survey Telescope Grism Using Computer Generated Holograms

    NASA Technical Reports Server (NTRS)

    Dominguez, Margaret Z.; Content, David A.; Gong, Qian; Griesmann, Ulf; Hagopian, John G.; Marx, Catherine T; Whipple, Arthur L.

    2017-01-01

    Infrared Computer Generated Holograms (CGHs) were designed, manufactured, and used to measure the performance of the grism (grating prism) prototype, which included testing of its Diffractive Optical Elements (DOEs). The grism in the Wide Field Infrared Survey Telescope (WFIRST) will allow the surveying of a large section of the sky to find bright galaxies.

  1. Study Abroad Field Trip Improves Test Performance through Engagement and New Social Networks

    ERIC Educational Resources Information Center

    Houser, Chris; Brannstrom, Christian; Quiring, Steven M.; Lemmons, Kelly K.

    2011-01-01

    Although study abroad trips provide an opportunity for affective and cognitive learning, it is largely assumed that they improve learning outcomes. The purpose of this study is to determine whether a study abroad field trip improved cognitive learning by comparing test performance between the study abroad participants (n = 20) and their peers who…

  2. Diagnosing the Role of Alfvén Waves in Magnetosphere-Ionosphere Coupling: Swarm Observations of Large Amplitude Nonstationary Magnetic Perturbations During an Interval of Northward IMF

    NASA Astrophysics Data System (ADS)

    Pakhotin, I. P.; Mann, I. R.; Lysak, R. L.; Knudsen, D. J.; Gjerloev, J. W.; Rae, I. J.; Forsyth, C.; Murphy, K. R.; Miles, D. M.; Ozeke, L. G.; Balasis, G.

    2018-01-01

    High-resolution multispacecraft Swarm data are used to examine magnetosphere-ionosphere coupling during a period of northward interplanetary magnetic field (IMF) on 31 May 2014. The observations reveal a prevalence of unexpectedly large amplitude (>100 nT) and time-varying magnetic perturbations during the polar passes, with especially large amplitude magnetic perturbations being associated with large-scale downward field-aligned currents. Differences between the magnetic field measurements sampled at 50 Hz from Swarm A and C, approximately 10 s apart along track, and the correspondence between the observed electric and magnetic fields at 16 samples per second, provide significant evidence for an important role for Alfvén waves in magnetosphere-ionosphere coupling even during northward IMF conditions. Spectral comparison between the wave E- and B-fields reveals a frequency-dependent phase difference and amplitude ratio consistent with interference between incident and reflected Alfvén waves. At low frequencies, the E- and B-fields are in phase, with an amplitude ratio determined by the Pedersen conductance. At higher frequencies, the amplitude and phase change as a function of frequency in good agreement with an ionospheric Alfvén resonator model including Pedersen conductance effects. Indeed, within this Alfvén wave incidence, reflection, and interference paradigm, even quasi-static field-aligned currents might be reasonably interpreted as very low frequency (ω → 0) Alfvén waves. Overall, our results not only indicate the importance of Alfvén waves for magnetosphere-ionosphere coupling but also demonstrate a method for using Swarm data for the innovative experimental diagnosis of Pedersen conductance from low-Earth orbit satellite measurements.
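In the low-frequency limit described above, the ratio of the wave electric to magnetic field above the ionosphere tends to 1/(μ0 Σ_P), which is what allows the Pedersen conductance to be estimated from E and B measurements. A sketch of that relation only; the study's actual inversion uses the full resonator model, and this is just the ω → 0 limit:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m

def low_freq_e_over_b(sigma_p):
    """Low-frequency E/B amplitude ratio (m/s) above a resistive ionosphere,
    E/B -> 1/(mu0 * Sigma_P), with Sigma_P the height-integrated Pedersen
    conductance in siemens."""
    return 1.0 / (MU0 * sigma_p)

def pedersen_from_e_over_b(e_mv_per_m, b_nt):
    """Invert the relation: infer Pedersen conductance (S) from measured
    wave amplitudes, E in mV/m and B in nT (1 mV/m per nT = 1e6 m/s)."""
    ratio = (e_mv_per_m * 1e-3) / (b_nt * 1e-9)  # convert to m/s
    return 1.0 / (MU0 * ratio)

# e.g. a 10 S ionosphere gives E/B of roughly 80 km/s:
print(f"{low_freq_e_over_b(10.0) / 1e3:.1f} km/s")
```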

  3. Single-particle imaging for biosensor applications

    NASA Astrophysics Data System (ADS)

    Yorulmaz, Mustafa; Isil, Cagatay; Seymour, Elif; Yurdakul, Celalettin; Solmaz, Berkan; Koc, Aykut; Ünlü, M. Selim

    2017-10-01

    Current state-of-the-art technology for in-vitro diagnostics employs laboratory tests such as ELISA, which consist of multi-step procedures and give results in an analog format. Results of these tests are interpreted by the color change in a set of diluted samples in a multi-well plate. However, detection of minute changes in the color poses challenges and can lead to false interpretations. Instead, a technique that allows individual counting of specific binding events would be useful to overcome such challenges. Digital imaging has been applied recently for diagnostics applications. Surface plasmon resonance (SPR) is one of the techniques allowing quantitative measurements. However, the limit of detection of this technique is on the order of nM, whereas the currently required detection limit, already achieved with the analog techniques, is around pM. Optical techniques that are simple to implement and can offer better sensitivities have great potential to be used in medical diagnostics. Interference microscopy is one of the tools that has been investigated over the years in the optics field. Most of these studies have been performed in confocal geometry, with each individual nanoparticle observed separately. Here, we achieve wide-field imaging of individual nanoparticles over a large field-of-view (approximately 166 μm × 250 μm) on a micro-array based sensor chip in a fraction of a second. We tested the sensitivity of our technique on dielectric nanoparticles because they exhibit optical properties similar to viruses and cells. We can detect non-resonant dielectric polystyrene nanoparticles of 100 nm. Moreover, we apply post-processing to further enhance particle visibility.

  4. The ISRU Field Tests 2010 and 2012 at Mauna Kea, Hawaii: Results from the Miniaturised Mossbauer Spectrometers Mimos II and Mimos IIA

    NASA Technical Reports Server (NTRS)

    Klingelhoefer, G.; Morris, R. V.; Blumers, M; Bernhardt, B.; Graff, T.

    2014-01-01

    The 2010 and 2012 In-Situ Resource Utilization (ISRU) Analogue Tests [1] on the Mauna Kea volcano in Hawai'i were coordinated by the Northern Centre for Advanced Technology (NORCAT) in collaboration with the Canadian Space Agency (CSA), the German Aerospace Center (DLR), and the National Aeronautics and Space Administration (NASA), through the PISCES program. Several instruments were tested as reference candidates for future analogue testing at the new field test site at the Mauna Kea volcano. The fine-grained, volcanic nature of the material is a suitable lunar and martian analogue, and can be used to test excavation, site preparation, and resource utilization techniques. The 2010 location, Pu'u Hiwahine, a cinder cone located below the summit of Mauna Kea (19deg45'39.29" N, 155deg28'14.56" W) at an elevation of 2800 m, provides a large number of slopes, rock avalanches, etc. for performing mobility tests, site preparation, or resource prospecting. In addition to hardware testing of technologies and systems related to resource identification, in situ science measurements also played a significant role in the integration of ISRU and science instruments. For the advanced Mössbauer instrument MIMOS IIA, new detector technologies and electronic components increase sensitivity and performance significantly. In combination with the high energy resolution of the silicon drift detector (SDD), it is possible to perform X-ray fluorescence analysis simultaneously with Mössbauer spectroscopy. In addition to the Fe-mineralogy, information on the sample's elemental composition will be gathered. The 2010 and 2012 field campaigns demonstrated that in-situ Mössbauer spectroscopy is an effective tool for both science and feedstock exploration and process monitoring. Engineering tests showed that a compact nickel metal hydride battery provided sufficient power for over 12 hr of continuous operation for the MIMOS instruments.

  5. GeoLab's First Field Trials, 2010 Desert RATS: Evaluating Tools for Early Sample Characterization

    NASA Technical Reports Server (NTRS)

    Evans, Cindy A.; Bell, M. S.; Calaway, M. J.; Graff, Trevor; Young, Kelsey

    2011-01-01

    As part of an accelerated prototyping project to support science operations tests for future exploration missions, we designed and built a geological laboratory, GeoLab, that was integrated into NASA's first generation Habitat Demonstration Unit-1/Pressurized Excursion Module (HDU1-PEM). GeoLab includes a pressurized glovebox for transferring and handling samples collected on geological traverses, and a suite of instruments for collecting preliminary data to help characterize those samples. The GeoLab and the HDU1-PEM were tested for the first time as part of the 2010 Desert Research and Technology Studies (DRATS), NASA's analog field exercise for testing mission technologies. The HDU1- PEM and GeoLab participated in two weeks of joint operations in northern Arizona with two crewed rovers and the DRATS science team.

  6. Magnetic Torque in Single Crystal Ni-Mn-Ga

    NASA Astrophysics Data System (ADS)

    Hobza, Anthony; Müllner, Peter

    2017-06-01

    Magnetic shape memory alloys deform in an external magnetic field in two distinct ways: by axial straining—known as magnetic-field-induced strain—and by bending when exposed to torque. Here, we examine the magnetic torque that a magnetic field exerts on a long Ni-Mn-Ga rod. A single crystal specimen of Ni-Mn-Ga was constrained with respect to bending and subjected to an external magnetic field. The torque required to rotate the specimen in the field was measured as a function of the orientation of the sample with the external magnetic field, strain, and the magnitude of the external magnetic field. The torque was analyzed based on the changes in the free energy with the angle between the field and the sample. The contributions of magnetocrystalline anisotropy and shape anisotropy, together with the Zeeman energy, determine the net torque. The torque is large when magnetocrystalline and shape anisotropies act synergistically and small when these anisotropies act antagonistically.
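The angle dependence described above can be illustrated with a minimal single-domain free-energy sketch: one uniaxial anisotropy term plus a Zeeman term, with the torque as the negative angular derivative of the free energy. The symbols and values here are illustrative, not the paper's model; shape anisotropy could be folded in as a second sin² term:

```python
import math

def free_energy(theta, k_u, mu0_ms_h):
    """Free-energy density (J/m^3) for a single-domain uniaxial magnet with
    the field applied along theta = 0: anisotropy plus Zeeman term."""
    return k_u * math.sin(theta) ** 2 - mu0_ms_h * math.cos(theta)

def torque_density(theta, k_u, mu0_ms_h, h=1e-6):
    """Torque density tau = -dF/dtheta, via a central finite difference."""
    return -(free_energy(theta + h, k_u, mu0_ms_h)
             - free_energy(theta - h, k_u, mu0_ms_h)) / (2 * h)
```

The torque vanishes when the field is aligned with the easy axis (θ = 0) and is largest at intermediate angles, consistent with the synergistic/antagonistic picture above.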

  7. Validation of a Rapid Rabies Diagnostic Tool for Field Surveillance in Developing Countries

    PubMed Central

    Léchenne, Monique; Naïssengar, Kemdongarti; Lepelletier, Anthony; Alfaroukh, Idriss Oumar; Bourhy, Hervé; Zinsstag, Jakob; Dacheux, Laurent

    2016-01-01

    Background One root cause of the neglect of rabies is the lack of adequate diagnostic tests in the context of low income countries. A rapid, easy-to-use and low cost method to detect rabies virus (RABV) in brain samples would contribute positively to surveillance and, consequently, to accurate data reporting, which is presently missing in the majority of rabies endemic countries. Methodology/Principal findings We evaluated a rapid immunodiagnostic test (RIDT) in comparison with the standard fluorescent antibody test (FAT) and confirmed the detection of the viral RNA by real time reverse transcription polymerase chain reaction (RT-qPCR). Our analysis is a multicentre approach to validate the performance of the RIDT in both a field laboratory (N’Djamena, Chad) and an international reference laboratory (Institut Pasteur, Paris, France). In the field laboratory, 48 samples from dogs were tested, and in the reference laboratory setting, a total of 73 samples were tested, representing a wide diversity of RABV in terms of animal species tested (13 different species), geographical origin of isolates with special emphasis on Africa, and different phylogenetic clades. Under reference laboratory conditions, specificity was 93.3% and sensitivity was 95.3% compared to the gold standard FAT. Under field laboratory conditions, the RIDT yielded higher reliability than the FAT, particularly on fresh and decomposed samples. Viral RNA was later extracted directly from the test filter paper and further used successfully for sequencing and genotyping. Conclusion/Significance The RIDT shows excellent performance qualities both in regard to user friendliness and reliability of the result. In addition, the test cassettes can be used as a vehicle to ship viral RNA to reference laboratories for further laboratory confirmation of the diagnosis and for epidemiological investigations using nucleotide sequencing.
The potential for satisfactory use in remote locations is therefore very high to improve the global knowledge of rabies epidemiology. However, we suggest some changes to the protocol, as well as careful further validation, before promotion and wider use. PMID:27706156

  8. Desert Research and Technology Studies 2005 Report

    NASA Technical Reports Server (NTRS)

    Ross, Amy J.; Kosmo, Joseph J.; Janoiko, Barbara A.; Bernard, Craig; Splawn, Keith; Eppler, Dean B.

    2006-01-01

    During the first two weeks of September 2005, the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) Advanced Extravehicular Activity (AEVA) team led the field test portion of the 2005 Desert Research and Technology Studies (RATS). The Desert RATS field test is the culmination of year-long technology and operations development efforts across the individual science and advanced engineering disciplines, brought together in a coordinated field demonstration under representative (analog) planetary surface terrain conditions. The purpose of the RATS is to drive out preliminary extravehicular activity (EVA) system requirements for exploration concepts of operations by providing hands-on experience with simulated planetary surface EVA hardware and procedures. The RATS activities are also of significant importance in helping to develop the necessary levels of technical skills and experience for the next generation of engineers, scientists, technicians, and astronauts who will be responsible for realizing the goals of the Constellation Program. The 2005 Desert RATS was the eighth RATS field test and was the most systems-oriented, integrated field test to date, with participants from NASA field centers, the United States Geological Survey (USGS), industry partners, and research institutes. Each week of the test, the 2005 RATS addressed specific sets of objectives. The first week focused on the performance of surface science astro-biological sampling operations, including planetary protection considerations and procedures. The second week supported evaluation of the Science, Crew, Operations, and Utility Testbed (SCOUT) prototype rover and its sub-systems. Throughout the duration of the field test, the Communications, Avionics, and Informatics pack (CAI-pack) was tested. This year the CAI-pack served to provide information on surface navigation, science sample collection procedures, and EVA timeline awareness.
Additionally, 2005 was the first year since the Apollo program that two pressurized suited test subjects worked together simultaneously. Another first was the demonstration of recharge of cryogenic life support systems while in use by the suited test subjects. The recharge capability allowed the simulated EVA test duration to be doubled, facilitating SCOUT prototype rover testing. This paper summarizes Desert RATS 2005 test hardware, detailed test objectives, test operations, and test results.

  9. Comparison of no-purge and pumped sampling methods for monitoring concentrations of ordnance-related compounds in groundwater, Camp Edwards, Massachusetts Military Reservation, Cape Cod, Massachusetts, 2009-2010

    USGS Publications Warehouse

    Savoie, Jennifer G.; LeBlanc, Denis R.

    2012-01-01

    Field tests were conducted near the Impact Area at Camp Edwards on the Massachusetts Military Reservation, Cape Cod, Massachusetts, to determine the utility of no-purge groundwater sampling for monitoring concentrations of ordnance-related explosive compounds and perchlorate in the sand and gravel aquifer. The no-purge methods included (1) a diffusion sampler constructed of rigid porous polyethylene, (2) a diffusion sampler constructed of regenerated-cellulose membrane, and (3) a tubular grab sampler (bailer) constructed of polyethylene film. In samples from 36 monitoring wells, concentrations of perchlorate (ClO4-), hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX), and octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine (HMX), the major contaminants of concern in the Impact Area, in the no-purge samples were compared to concentrations of these compounds in samples collected by low-flow pumped sampling with dedicated bladder pumps. The monitoring wells are constructed of 2- and 2.5-inch-diameter polyvinyl chloride pipe and have approximately 5- to 10-foot-long slotted screens. The no-purge samplers were left in place for 13-64 days to ensure that ambient groundwater flow had flushed the well screen and concentrations in the screen represented water in the adjacent formation. The sampling methods were compared first in six monitoring wells. Concentrations of ClO4-, RDX, and HMX in water samples collected by the three no-purge sampling methods and low-flow pumped sampling were in close agreement for all six monitoring wells. There is no evidence of a systematic bias in the concentration differences among the methods on the basis of type of sampling device, type of contaminant, or order in which the no-purge samplers were tested. 
A subsequent examination of vertical variations in concentrations of ClO4- in the 10-foot-long screens of six wells by using rigid porous polyethylene diffusion samplers indicated that concentrations in a given well varied by less than 15 percent and the small variations were unlikely to affect the utility of the various sampling methods. The grab sampler was selected for additional tests in 29 of the 36 monitoring wells used during the study. Concentrations of ClO4-, RDX, HMX, and other minor explosive compounds in water samples collected by using a 1-liter grab sampler and low-flow pumped sampling were in close agreement in field tests in the 29 wells. A statistical analysis based on the sign test indicated that there was no bias in the concentration differences between the methods. There also was no evidence for a systematic bias in concentration differences between the methods related to location of the monitoring wells laterally or vertically in the groundwater-flow system. Field tests in five wells also demonstrated that sample collection by using a 2-liter grab sampler and sequential bailing with the 1-liter grab sampler were options for obtaining sufficient sample volume for replicate and spiked quality assurance and control samples. The evidence from the field tests supports the conclusion that diffusion sampling with the rigid porous polyethylene and regenerated-cellulose membranes and grab sampling with the polyethylene-film samplers provide comparable data on the concentrations of ordnance-related compounds in groundwater at the MMR to that obtained by low-flow pumped sampling. These sampling methods are useful methods for monitoring these compounds at the MMR and in similar hydrogeologic environments.
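The sign test used above is simple to reproduce: under the null hypothesis of no systematic bias between paired no-purge and pumped concentrations, positive and negative differences are equally likely, so the sign count is binomial. A minimal exact version (the paired differences below are hypothetical, not the study's data):

```python
from math import comb

def sign_test_p(diffs):
    """Exact two-sided sign test for paired differences (zeros dropped).
    Under H0 the number of positive signs is Binomial(n, 0.5)."""
    pos = sum(1 for d in diffs if d > 0)
    neg = sum(1 for d in diffs if d < 0)
    n, k = pos + neg, min(pos, neg)
    # one-sided tail probability of a split at least this extreme, doubled
    p = sum(comb(n, i) for i in range(0, k + 1)) / 2 ** n
    return min(1.0, 2 * p)

# Hypothetical paired differences (no-purge minus pumped), for illustration:
print(sign_test_p([0.3, -0.1, 0.2, -0.4, 0.1, -0.2, 0.0, 0.15]))
```

A large p-value, as in the study, gives no evidence of bias between the two sampling methods.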

  10. Determination of MLC model parameters for Monaco using commercial diode arrays.

    PubMed

    Kinsella, Paul; Shields, Laura; McCavana, Patrick; McClean, Brendan; Langan, Brian

    2016-07-08

    Multileaf collimators (MLCs) need to be characterized accurately in treatment planning systems to facilitate accurate intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT). The aim of this study was to examine the use of MapCHECK 2 and ArcCHECK diode arrays for optimizing MLC parameters in the Monaco X-ray voxel Monte Carlo (XVMC) dose calculation algorithm. A series of radiation test beams designed to evaluate MLC model parameters were delivered to MapCHECK 2, ArcCHECK, and EBT3 Gafchromic film for comparison. Initial comparison of the calculated and ArcCHECK-measured dose distributions revealed that it was unclear how to change the MLC parameters to gain agreement. This ambiguity arose from an insufficient sampling of the test field dose distributions and unexpected discrepancies in the open parts of some test fields. Consequently, the XVMC MLC parameters were optimized based on MapCHECK 2 measurements. Gafchromic EBT3 film was used to verify the accuracy of the MapCHECK 2 measured dose distributions. It was found that adjustment of the MLC parameters from their default values resulted in improved global gamma analysis pass rates for MapCHECK 2 measurements versus calculated dose. The lowest pass rate of any MLC-modulated test beam improved from 68.5% to 93.5% with 3% and 2 mm gamma criteria. Given the close agreement of the optimized model to both MapCHECK 2 and film, the optimized model was used as a benchmark to highlight the relatively large discrepancies in some of the test field dose distributions found with ArcCHECK. Comparison between the optimized model-calculated dose and the ArcCHECK-measured dose resulted in global gamma pass rates ranging from 70.0% to 97.9% for gamma criteria of 3% and 2 mm. The simple square fields yielded high pass rates. The lower gamma pass rates were attributed to the ArcCHECK overestimating the in-field dose for the rectangular test fields whose long axis was parallel to the long axis of the ArcCHECK.
Considering ArcCHECK measurement issues and the lower gamma pass rates for the MLC-modulated test beams, it was concluded that MapCHECK 2 was a more suitable detector than ArcCHECK for the optimization process. © 2016 The Authors
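The gamma pass rates above combine a global dose-difference criterion (3% of the maximum dose) with a distance-to-agreement criterion (2 mm). A coarse 1-D sketch of the computation, without the sub-grid interpolation and full 3-D search a clinical tool would use:

```python
import math

def gamma_pass_rate(ref, meas, spacing_mm, dose_pct=3.0, dta_mm=2.0):
    """Global 1-D gamma analysis sketch. `ref` and `meas` are dose profiles
    on the same grid with point spacing `spacing_mm`; the dose difference is
    normalised to the reference maximum (global criterion)."""
    d_crit = dose_pct / 100.0 * max(ref)
    passed = 0
    for i, dm in enumerate(meas):
        # gamma = minimum combined dose/distance metric over reference points
        gamma = min(
            math.sqrt(((i - j) * spacing_mm / dta_mm) ** 2
                      + ((dm - dr) / d_crit) ** 2)
            for j, dr in enumerate(ref)
        )
        if gamma <= 1.0:
            passed += 1
    return 100.0 * passed / len(meas)
```

Identical profiles score 100%; a point whose dose disagrees by more than the criterion everywhere within reach of the distance criterion fails (γ > 1).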

  11. Periprosthetic infection: where do we stand with regard to Gram stain?

    PubMed

    Ghanem, Elie; Ketonis, Constantinos; Restrepo, Camilo; Joshi, Ashish; Barrack, Robert; Parvizi, Javad

    2009-02-01

    One of the routinely used intraoperative tests for diagnosis of periprosthetic infection (PPI) is the Gram stain. It is not known if the result of this test can vary according to the type of joint affected or the number of specimen samples collected. We examined the role of this diagnostic test in a large cohort of patients from a single institution. A positive Gram stain was defined as the visualization of bacterial cells or "many neutrophils" (> 5 per high-power field) in the smear. The sensitivity, specificity, and predictive values of each individual diagnostic arm of Gram stain were determined. Combinations were performed in series, which required both tests to be positive to confirm infection, and also in parallel, which required both tests to be negative to rule out infection. The presence of organisms and "many" neutrophils on a Gram smear had high specificity (98-100%) and positive predictive value (89-100%) in both THA and TKA. The sensitivities (30-50%) and negative predictive values (70-79%) of the 2 tests were low for both joint types. When the 2 tests were combined in series, the specificity and positive predictive value were absolute (100%). The sensitivity and the negative predictive value improved for both THA and TKA (43-64% and 82%, respectively). Although the 2 diagnostic arms of Gram staining can be combined to achieve improved negative predictive value (82%), Gram stain continues to have little value in ruling out PPI. With the advances in the field of molecular biology, novel diagnostic modalities need to be designed that can replace these traditional and poor tests.
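If the two diagnostic arms were conditionally independent given infection status, the series and parallel combinations would follow directly from the individual sensitivities and specificities. The study combined the observed joint results rather than using this approximation, but the sketch shows why series combination boosts specificity while parallel combination boosts sensitivity:

```python
def combine_series(sens1, spec1, sens2, spec2):
    """Both tests must be positive to confirm infection (series).
    Assumes conditional independence of the two tests."""
    return sens1 * sens2, 1 - (1 - spec1) * (1 - spec2)

def combine_parallel(sens1, spec1, sens2, spec2):
    """Positive if either test is positive; both must be negative to rule
    out infection (parallel). Same independence assumption."""
    return 1 - (1 - sens1) * (1 - sens2), spec1 * spec2
```

Series multiplies sensitivities (both must fire), pushing specificity toward 100%; parallel multiplies specificities, raising sensitivity instead.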

  12. Chemical Synthesis and Oxide Dispersion Properties of Strengthened Tungsten via Spark Plasma Sintering

    PubMed Central

    Ding, Xiao-Yu; Luo, Lai-Ma; Chen, Hong-Yu; Zhu, Xiao-Yong; Zan, Xiang; Cheng, Ji-Gui; Wu, Yu-Cheng

    2016-01-01

    Highly uniform oxide dispersion-strengthened materials W–1 wt % Nd2O3 and W–1 wt % CeO2 were successfully fabricated via a novel wet chemical method followed by hydrogen reduction. The powders were consolidated by spark plasma sintering at 1700 °C to suppress grain growth. The samples were characterized by performing field emission scanning electron microscopy and transmission electron microscopy analyses, Vickers microhardness measurements, thermal conductivity measurements, and tensile testing. The oxide particles were dispersed at the tungsten grain boundaries and within the grains. The thermal conductivity of the samples at room temperature exceeded 140 W/m·K. The tensile tests indicated that W–1 wt % CeO2 exhibited a ductile–brittle transition temperature between 500 °C and 550 °C, which was a lower range than that for W–1 wt % Nd2O3. Surface topography and Vickers microhardness analyses were conducted before and after irradiations with 50 eV He ions at a fluence of 1 × 10²² m⁻² for 1 h in the large-powder material irradiation experiment system. The grain boundaries of the irradiated area became more evident than those of the unirradiated area for both samples. Irradiation hardening was recognized for the W–1 wt % Nd2O3 and W–1 wt % CeO2 samples. PMID:28773999

  13. Comparison of talc-Celite and polyelectrolyte 60 in virus recovery from sewage: development of technique and experiments with poliovirus (type 1, Sabin)-contaminated multilitre samples.

    PubMed

    Sattar, S A; Westwood, J C

    1976-11-01

    For virus recovery from sewage, a mixture of talc and Celite was tested as a possible inexpensive substitute for polyelectrolyte 60 (PE 60). After adjustment of pH to 6 and the addition of 45-60 plaque forming units (PFU)/ml of poliovirus type 1 (Sabin) to the sewage sample under test, 100 ml of it was passed through either a PE 60 (400 mg) or a talc (300 mg)-Celite (100 mg) layer; the layer-adsorbed virus was eluted with 10 ml of 10% fetal calf serum (FCS) in saline (pH 7.2). In these experiments, PE 60 layers recovered 73-80% (mean 76%) of the input virus. In comparison, virus recoveries with the talc-Celite layers were 65-70% (mean 68%). Passage of 5 litres of raw sewage (containing 50 to 1.26 × 10⁵ PFU/100 ml of the poliovirus) through the talc (15 g)-Celite (5 g) layers and virus elution with 50 ml of 10% FCS in saline gave virus recoveries of 33-63% (mean 49%). Except for pH adjustment and prefiltration through two layers of gauze to remove large solids, no other sample pretreatment was found to be necessary. Application of this technique to recovery of indigenous viruses from field samples of raw sewage and effluents has been highly satisfactory.
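Because 100 ml of sample is eluted into only 10 ml, the layer-adsorption step also concentrates the virus; the effective concentration factor is the volume ratio scaled by the recovery fraction. A one-line sketch using the mean PE 60 recovery reported above:

```python
def concentration_factor(sample_ml, eluate_ml, recovery_frac):
    """Effective virus concentration factor: ratio of eluate titre to the
    original sample titre, accounting for incomplete recovery."""
    return (sample_ml / eluate_ml) * recovery_frac

# 100 ml passed through the layer, eluted in 10 ml at 76% mean recovery:
print(f"{concentration_factor(100, 10, 0.76):.1f}-fold")
```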

  14. Deep Extragalactic VIsible Legacy Survey (DEVILS): Motivation, Design and Target Catalogue

    NASA Astrophysics Data System (ADS)

    Davies, L. J. M.; Robotham, A. S. G.; Driver, S. P.; Lagos, C. P.; Cortese, L.; Mannering, E.; Foster, C.; Lidman, C.; Hashemizadeh, A.; Koushan, S.; O'Toole, S.; Baldry, I. K.; Bilicki, M.; Bland-Hawthorn, J.; Bremer, M. N.; Brown, M. J. I.; Bryant, J. J.; Catinella, B.; Croom, S. M.; Grootes, M. W.; Holwerda, B. W.; Jarvis, M. J.; Maddox, N.; Meyer, M.; Moffett, A. J.; Phillipps, S.; Taylor, E. N.; Windhorst, R. A.; Wolf, C.

    2018-06-01

    The Deep Extragalactic VIsible Legacy Survey (DEVILS) is a large spectroscopic campaign at the Anglo-Australian Telescope (AAT) aimed at bridging the near and distant Universe by producing the highest completeness survey of galaxies and groups at intermediate redshifts (0.3 < z < 1.0). Our sample consists of ˜60,000 galaxies to Y<21.2 mag, over ˜6 deg2 in three well-studied deep extragalactic fields (Cosmic Origins Survey field, COSMOS, Extended Chandra Deep Field South, ECDFS and the X-ray Multi-Mirror Mission Large-Scale Structure region, XMM-LSS - all Large Synoptic Survey Telescope deep-drill fields). This paper presents the broad experimental design of DEVILS. Our target sample has been selected from deep Visible and Infrared Survey Telescope for Astronomy (VISTA) Y-band imaging (VISTA Deep Extragalactic Observations, VIDEO and UltraVISTA), with photometry measured by PROFOUND. Photometric star/galaxy separation is done on the basis of NIR colours, and has been validated by visual inspection. To maximise our observing efficiency for faint targets we employ a redshift feedback strategy, which continually updates our target lists, feeding back the results from the previous night's observations. We also present an overview of the initial spectroscopic observations undertaken in late 2017 and early 2018.

  15. Molecular diagnosis of Plasmodium ovale by photo-induced electron transfer fluorogenic primers: PET-PCR

    PubMed Central

    Akerele, David; Ljolje, Dragan; Talundzic, Eldin; Udhayakumar, Venkatachalam

    2017-01-01

    Accurate diagnosis of malaria infections continues to be challenging and elusive, especially in the detection of submicroscopic infections. Developing new malaria diagnostic tools that are sensitive enough to detect low-level infections, user friendly, cost effective and capable of performing large scale diagnosis, remains critical. We have designed novel self-quenching photo-induced electron transfer (PET) fluorogenic primers for the detection of P. ovale by real-time PCR. In our study, a total of 173 clinical samples, consisting of different malaria species, were utilized to test this novel PET-PCR primer. The sensitivity and specificity were calculated using nested-PCR as the reference test. The novel primer set demonstrated a sensitivity of 97.5% and a specificity of 99.2% (95% CI 85.2–99.8% and 95.2–99.9% respectively). Furthermore, the limit of detection for P. ovale was found to be 1 parasite/μl. The PET-PCR assay is a new molecular diagnostic tool with comparable performance to other commonly used PCR methods. It is relatively easy to perform and amenable to large scale malaria surveillance studies and malaria control and elimination programs. Further field validation of this novel primer will be helpful to ascertain the utility for large scale malaria screening programs. PMID:28640824

  16. Molecular diagnosis of Plasmodium ovale by photo-induced electron transfer fluorogenic primers: PET-PCR.

    PubMed

    Akerele, David; Ljolje, Dragan; Talundzic, Eldin; Udhayakumar, Venkatachalam; Lucchi, Naomi W

    2017-01-01

    Accurate diagnosis of malaria infections continues to be challenging and elusive, especially in the detection of submicroscopic infections. Developing new malaria diagnostic tools that are sensitive enough to detect low-level infections, user friendly, cost effective and capable of performing large scale diagnosis, remains critical. We have designed novel self-quenching photo-induced electron transfer (PET) fluorogenic primers for the detection of P. ovale by real-time PCR. In our study, a total of 173 clinical samples, consisting of different malaria species, were utilized to test this novel PET-PCR primer. The sensitivity and specificity were calculated using nested-PCR as the reference test. The novel primer set demonstrated a sensitivity of 97.5% and a specificity of 99.2% (95% CI 85.2-99.8% and 95.2-99.9% respectively). Furthermore, the limit of detection for P. ovale was found to be 1 parasite/μl. The PET-PCR assay is a new molecular diagnostic tool with comparable performance to other commonly used PCR methods. It is relatively easy to perform and amenable to large scale malaria surveillance studies and malaria control and elimination programs. Further field validation of this novel primer will be helpful to ascertain the utility for large scale malaria screening programs.
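The confidence intervals quoted for sensitivity and specificity are binomial proportion intervals. A sketch using the Wilson score interval; the denominator below is hypothetical, and the study's exact counts and interval method may differ:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion, e.g. the
    sensitivity of a diagnostic test estimated from n reference positives."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

# e.g. a hypothetical 39 of 40 reference positives detected (97.5% point
# estimate) gives a wide interval, reflecting the small denominator:
lo, hi = wilson_ci(39, 40)
print(f"({lo:.1%}, {hi:.1%})")
```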

  17. Laboratory toxicity and benthic invertebrate field colonization of Upper Columbia River sediments: finding adverse effects using multiple lines of evidence.

    PubMed

    Fairchild, J F; Kemble, N E; Allert, A L; Brumbaugh, W G; Ingersoll, C G; Dowling, B; Gruenenfelder, C; Roland, J L

    2012-07-01

    From 1930 to 1995, the Upper Columbia River (UCR) of northeast Washington State received approximately 12 million metric tons of smelter slag and associated effluents from a large smelter facility located in Trail, British Columbia, approximately 10 km north of the United States-Canadian border. Studies conducted during the past two decades have demonstrated the presence of toxic concentrations of heavy metals in slag-based sandy sediments, including cadmium, copper, zinc, and lead in the UCR area as well as the downstream reservoir portion of Lake Roosevelt. We conducted standardized whole-sediment toxicity tests with the amphipod Hyalella azteca (28-day) and the midge Chironomus dilutus (10-day) on 11 samples, including both UCR and study-specific reference sediments. Metal concentrations in sediments were modeled for potential toxicity using three approaches: (1) probable effects quotients (PEQs) based on total recoverable metals (TRMs) and simultaneously extracted metals (SEMs); (2) SEMs corrected for acid-volatile sulfides (AVS; i.e., ∑SEM - AVS); and (3) ∑SEM - AVS normalized to the fractional organic carbon (f(oc)) (i.e., ∑SEM - AVS/f(oc)). The most highly metal-contaminated sample (∑PEQ(TRM) = 132; ∑PEQ(SEM) = 54; ∑SEM - AVS = 323; and ∑SEM - AVS/f(oc) = 64,600 μmol/g) from the UCR was dominated by weathered slag sediment particles and resulted in 80% mortality and 94% decrease in biomass of amphipods; in addition, this sample significantly decreased growth of midge by 10%. The traditional ∑SEM - AVS, uncorrected for organic carbon, was the most accurate approach for estimating the effects of metals in the UCR. Treatment of the toxic slag sediment with 20% Resinex SIR-300 metal-chelating resin significantly decreased the toxicity of the sample. Samples with ∑SEM - AVS values below 244 were not toxic to amphipods or midge in laboratory testing, indicating that this value may be an approximate threshold for effects in the UCR.
In situ benthic invertebrate colonization studies in an experimental pond (8-week duration) indicated that two of the most metal-contaminated UCR sediments (dominated by high levels of sand-sized slag particles) exhibited decreased invertebrate colonization compared with sand-based reference sediments. Field-exposed SIR-300 resin samples also exhibited decreased invertebrate colonization numbers compared with reference materials, which may indicate behavioral avoidance of this material under field conditions. Multiple lines of evidence (analytical chemistry, laboratory toxicity, and field colonization results), along with findings from previous studies, indicate that high metal concentrations associated with slag-enriched sediments in the UCR are likely to adversely impact the growth and survival of native benthic invertebrate communities. Additional laboratory toxicity testing, refinement of the applications of sediment benchmarks for metal toxicity, and in situ benthic invertebrate studies will assist in better defining the spatial extent, temporal variations, and ecological impacts of metal-contaminated sediments in the UCR system.
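The benchmarks named in this record reduce to simple arithmetic: ∑SEM - AVS is the molar excess of simultaneously extracted metals over acid-volatile sulfide, and the carbon-normalized form divides that excess by the organic-carbon fraction. The sketch below, with hypothetical per-metal SEM values (only the totals 323 and 64,600 µmol/g are taken from the record), shows the calculation:

```python
# Illustrative sketch of the sediment benchmarks described above.
# The individual SEM concentrations are invented; only the resulting
# totals (∑SEM - AVS = 323; normalized = 64,600 µmol/g OC) come from
# the record.

def sem_avs_excess(sem_umol_g, avs_umol_g):
    """Molar excess of simultaneously extracted metals over
    acid-volatile sulfide (∑SEM - AVS), in µmol/g dry sediment."""
    return sum(sem_umol_g.values()) - avs_umol_g

def sem_avs_oc_normalized(sem_umol_g, avs_umol_g, f_oc):
    """(∑SEM - AVS) / f(oc), in µmol/g organic carbon."""
    return sem_avs_excess(sem_umol_g, avs_umol_g) / f_oc

# Hypothetical slag-influenced sample: SEM for Cd, Cu, Pb, Zn (µmol/g)
sem = {"Cd": 0.5, "Cu": 40.0, "Pb": 12.0, "Zn": 275.0}
avs = 4.5      # acid-volatile sulfide, µmol/g
f_oc = 0.005   # 0.5% organic carbon, as a fraction

excess = sem_avs_excess(sem, avs)                    # 323.0
normalized = sem_avs_oc_normalized(sem, avs, f_oc)   # 64,600.0
print(excess, normalized)
```

Note how a low organic-carbon fraction, typical of sandy slag sediments, inflates the normalized value by two orders of magnitude relative to the raw excess.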

  18. Sample stacking of fast-moving anions in capillary zone electrophoresis with pH-suppressed electroosmotic flow.

    PubMed

    Quirino, J P; Terabe, S

    1999-07-30

    On-line sample concentration of fast-moving inorganic anions by large volume sample stacking (LVSS) and field enhanced sample injection (FESI) with a water plug under acidic conditions is presented. Detection sensitivity enhancements were approximately 100- and 1000-fold for LVSS and FESI, respectively. However, the reproducibility and linearity of response of the LVSS approach are superior to those of the FESI approach.
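Enhancement factors like the 100- and 1000-fold figures above are conventionally estimated by comparing the detector response from the stacking injection against a conventional injection, corrected for the concentration ratio of the two test solutions. A minimal sketch of that calculation, with made-up numbers (the peak heights and concentrations below are hypothetical, not from the study):

```python
# Hypothetical sketch: estimating a sensitivity enhancement factor
# for an on-line stacking method. All numeric values are invented.

def enhancement_factor(peak_stacked, peak_normal, c_stacked, c_normal):
    """Sensitivity enhancement = response ratio, corrected for the
    concentration ratio of the stacked vs. conventional standards."""
    return (peak_stacked / peak_normal) * (c_normal / c_stacked)

# e.g. a 100x-diluted standard analysed with stacking gives nearly
# the same peak height as the undiluted standard injected normally:
ef = enhancement_factor(peak_stacked=0.98, peak_normal=1.0,
                        c_stacked=0.01, c_normal=1.0)
print(round(ef))  # ≈ 98-fold enhancement
```

Using calibration slopes instead of single peak heights gives the same ratio while averaging out injection-to-injection variability.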

  19. Integrative Genomics and Computational Systems Medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDermott, Jason E.; Huang, Yufei; Zhang, Bing

    The exponential growth in the generation of genomic data from biological samples has driven the emerging field of systems medicine. This field is promising because it improves our understanding of disease processes at the systems level. However, it is still in its early stages, and there remains a great need for novel computational methods and approaches to effectively utilize and integrate various omics data.

  20. Comparison of sampling and test methods for determining asphalt content and moisture correction in asphalt concrete mixtures.

    DOT National Transportation Integrated Search

    1985-03-01

    The purpose of this report is to identify the difference, if any, between AASHTO and OSHD test procedures and results. This report addresses the effect of the size of samples taken in the field and evaluates the methods of determining the moisture content...
