Sample records for large-sample field test

  1. Internet cognitive testing of large samples needed in genetic research.

    PubMed

    Haworth, Claire M A; Harlaar, Nicole; Kovas, Yulia; Davis, Oliver S P; Oliver, Bonamy R; Hayiou-Thomas, Marianna E; Frances, Jane; Busfield, Patricia; McMillan, Andrew; Dale, Philip S; Plomin, Robert

    2007-08-01

    Quantitative and molecular genetic research requires large samples to provide adequate statistical power, but it is expensive to test large samples in person, especially when the participants are widely distributed geographically. Increasing access to inexpensive and fast Internet connections makes it possible to test large samples efficiently and economically online. Reliability and validity of Internet testing for cognitive ability have not been previously reported; these issues are especially pertinent for testing children. We developed Internet versions of reading, language, mathematics and general cognitive ability tests and investigated their reliability and validity for 10- and 12-year-old children. We tested online more than 2500 pairs of 10-year-old twins and compared their scores to similar Internet-based measures administered online to a subsample of the children when they were 12 years old (> 759 pairs). Within 3 months of the online testing at 12 years, we administered standard paper-and-pencil versions of the reading and mathematics tests in person to 30 children (15 pairs of twins). Scores on Internet-based measures at 10 and 12 years correlated .63 on average across the two years, suggesting substantial stability and high reliability. Correlations of about .80 between Internet measures and in-person testing suggest excellent validity. In addition, the comparison of the Internet-based measures to ratings from teachers based on criteria from the UK National Curriculum suggests good concurrent validity for these tests. We conclude that Internet testing can be reliable and valid for collecting cognitive test data on large samples even for children as young as 10 years.

  2. A New Facility for Testing Superconducting Solenoid Magnets with Large Fringe Fields at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orris, D.; Carcagno, R.; Nogiec, J.

    2013-09-01

    Testing a superconducting solenoid with no iron flux return can be problematic for a magnet test facility because of the large magnetic fringe fields generated. These large external fields can interfere with the operation of equipment, and precautions must be taken for personnel supporting the test. The magnetic forces between the solenoid under test and the external infrastructure must also be taken into consideration. A new test facility has been designed and built at Fermilab specifically for testing superconducting magnets with large external fringe fields. This paper discusses the test stand design, capabilities, and details of the instrumentation and controls, with data from the first solenoid tested in this facility: the Muon Ionization Cooling Experiment (MICE) coupling coil.

  3. Large Field Photogrammetry Techniques in Aircraft and Spacecraft Impact Testing

    NASA Technical Reports Server (NTRS)

    Littell, Justin D.

    2010-01-01

    The Landing and Impact Research Facility (LandIR) at NASA Langley Research Center is a 240 ft. high A-frame structure which is used for full-scale crash testing of aircraft and rotorcraft vehicles. Because the LandIR provides a unique capability to introduce impact velocities in the forward and vertical directions, it is also serving as the facility for landing tests on full-scale and sub-scale Orion spacecraft mass simulators. Recently, a three-dimensional photogrammetry system was acquired to assist with the gathering of vehicle flight data before, throughout and after the impact. These data provide the basis for the post-test analysis and data reduction. Experimental setups for pendulum swing tests on vehicles having both forward and vertical velocities can extend to 50 x 50 x 50 foot cubes, while weather, vehicle geometry, and other constraints make each experimental setup unique to each test. This paper will discuss the specific calibration techniques for large fields of view, camera and lens selection, data processing, as well as best practice techniques learned from using large field of view photogrammetry on a multitude of crash and landing test scenarios unique to the LandIR.

  4. Impact of Different Visual Field Testing Paradigms on Sample Size Requirements for Glaucoma Clinical Trials.

    PubMed

    Wu, Zhichao; Medeiros, Felipe A

    2018-03-20

    Visual field testing is an important endpoint in glaucoma clinical trials, and the testing paradigm used can have a significant impact on the sample size requirements. To investigate this, the study included 353 eyes of 247 glaucoma patients seen over a 3-year period, from which real-world visual field rates of change and variability estimates were extracted to provide sample size estimates through computer simulations. The clinical trial scenario assumed that a new treatment was added to one of two groups that were both under routine clinical care, with various treatment effects examined. Three different visual field testing paradigms were evaluated: a) evenly spaced testing, b) United Kingdom Glaucoma Treatment Study (UKGTS) follow-up scheme, which adds clustered tests at the beginning and end of follow-up in addition to evenly spaced testing, and c) clustered testing paradigm, with clusters of tests at the beginning and end of the trial period and two intermediary visits. The sample size requirements were reduced by 17-19% and 39-40% using the UKGTS and clustered testing paradigms, respectively, when compared to the evenly spaced approach. These findings highlight how the clustered testing paradigm can substantially reduce sample size requirements and improve the feasibility of future glaucoma clinical trials.

  5. Confirmatory analysis of field-presumptive GSR test sample using SEM/EDS

    NASA Astrophysics Data System (ADS)

    Toal, Sarah J.; Niemeyer, Wayne D.; Conte, Sean; Montgomery, Daniel D.; Erikson, Gregory S.

    2014-09-01

    RedXDefense has developed an automated red-light/green-light field presumptive lead test using a sampling pad which can be subsequently processed in a scanning electron microscope for GSR confirmation. The XCAT's sampling card is used to acquire a sample from a suspect's hands on the scene and give investigators an immediate presumptive result as to the presence of lead, possibly from primer residue. Positive results can be obtained after firing as few as one shot. The same sampling card can then be sent to a crime lab and processed on the SEM for GSR following ASTM E-1588-10, Standard Guide for Gunshot Residue Analysis by Scanning Electron Microscopy/Energy Dispersive X-Ray Spectrometry, in the same manner as the existing tape lifts currently used in the field. Detection of GSR-characteristic particles (fused lead, barium, and antimony) as small as 0.8 microns (0.5 micron resolution) has been achieved using a JEOL JSM-6480LV SEM equipped with an Oxford Instruments INCA EDS system with a 50 mm² SDD detector, at 350X magnification, in low-vacuum mode and in high-vacuum mode after coating with carbon in a sputter coater. GSR particles remain stable on the sampling pad for a minimum of two months after chemical exposure (long-term stability tests are in progress). The presumptive result provided by the XCAT yields immediate actionable intelligence to law enforcement to facilitate their investigation, without compromising the confirmatory test necessary to further support the investigation and legal case.

  6. The cost of large numbers of hypothesis tests on power, effect size and sample size.

    PubMed

    Lazzeroni, L C; Ray, A

    2012-01-01

    Advances in high-throughput biology and computer science are driving an exponential increase in the number of hypothesis tests in genomics and other scientific disciplines. Studies using current genotyping platforms frequently include a million or more tests. In addition to the monetary cost, this increase imposes a statistical cost owing to the multiple testing corrections needed to avoid large numbers of false-positive results. To safeguard against the resulting loss of power, some have suggested sample sizes on the order of tens of thousands, which can be impractical for many diseases or may lower the quality of phenotypic measurements. This study examines the relationship between the number of tests on the one hand and power, detectable effect size or required sample size on the other. We show that once the number of tests is large, power can be maintained at a constant level with comparatively small increases in the effect size or sample size. For example, at the 0.05 significance level, a 13% increase in sample size is needed to maintain 80% power for ten million tests compared with one million tests, whereas a 70% increase in sample size is needed for 10 tests compared with a single test. Relative costs are less when measured by increases in the detectable effect size. We provide an interactive Excel calculator to compute power, effect size or sample size when comparing study designs or genome platforms involving different numbers of hypothesis tests. The results are reassuring in an era of extreme multiple testing.
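
    The scaling the abstract describes follows from the normal-approximation sample-size formula, in which the required n is proportional to (z_{1-α/2} + z_{power})². Below is a minimal sketch using a Bonferroni correction (a generic illustration of the relationship, not the authors' Excel calculator):

```python
from statistics import NormalDist

def relative_n(n_tests, alpha=0.05, power=0.80):
    """Required-sample-size factor (z_{1-a/2} + z_{power})^2 for a
    two-sided z-test at a Bonferroni-corrected level alpha / n_tests."""
    nd = NormalDist()
    a = alpha / n_tests
    return (nd.inv_cdf(1 - a / 2) + nd.inv_cdf(power)) ** 2

# Going from one test to ten costs ~70% more sample...
print(round(relative_n(10) / relative_n(1) - 1, 2))     # 0.7
# ...but going from one million to ten million costs only ~13% more.
print(round(relative_n(1e7) / relative_n(1e6) - 1, 2))  # 0.13
```

    The ratios reproduce the 13% and 70% figures quoted in the abstract.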

  7. Statistical Searches for Microlensing Events in Large, Non-uniformly Sampled Time-Domain Surveys: A Test Using Palomar Transient Factory Data

    NASA Astrophysics Data System (ADS)

    Price-Whelan, Adrian M.; Agüeros, Marcel A.; Fournier, Amanda P.; Street, Rachel; Ofek, Eran O.; Covey, Kevin R.; Levitan, David; Laher, Russ R.; Sesar, Branimir; Surace, Jason

    2014-01-01

    Many photometric time-domain surveys are driven by specific goals, such as searches for supernovae or transiting exoplanets, which set the cadence with which fields are re-imaged. In the case of the Palomar Transient Factory (PTF), several sub-surveys are conducted in parallel, leading to non-uniform sampling over its ~20,000 deg² footprint. While the median 7.26 deg² PTF field has been imaged ~40 times in the R band, ~2300 deg² have been observed >100 times. We use PTF data to study the trade-off of searching for microlensing events in a survey whose footprint is much larger than that of typical microlensing searches but whose time sampling is far from optimal. To examine the probability that microlensing events can be recovered in these data, we test statistics used on uniformly sampled data to identify variables and transients. We find that the von Neumann ratio performs best for identifying simulated microlensing events in our data. We develop a selection method using this statistic and apply it to data from fields with >10 R-band observations, 1.1 × 10⁹ light curves, uncovering three candidate microlensing events. We lack simultaneous, multi-color photometry to confirm these as microlensing events. However, their number is consistent with predictions for the event rate in the PTF footprint over the survey's three years of operations, as estimated from near-field microlensing models. This work can help constrain all-sky event rate predictions and tests microlensing signal recovery in large data sets, which will be useful to future time-domain surveys, such as that planned with the Large Synoptic Survey Telescope.
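
    The von Neumann ratio the authors settled on is straightforward to compute. The sketch below uses a Gaussian-shaped brightening as an illustrative stand-in for a microlensing event (not PTF's actual pipeline, cadence, or selection thresholds) to show why the statistic separates smooth events from uncorrelated noise:

```python
import math
import random

def von_neumann_ratio(values):
    """Mean squared successive difference divided by the sample variance.
    Close to 2 for uncorrelated noise; well below 2 for smooth,
    correlated variation such as a microlensing bump."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    msd = sum((values[i + 1] - values[i]) ** 2 for i in range(n - 1)) / (n - 1)
    return msd / var

random.seed(7)
noise = [random.gauss(0.0, 0.05) for _ in range(200)]
# Superpose a smooth brightening (in magnitudes, so negative) on the same epochs
bump = [m - 0.8 * math.exp(-(((i - 100) / 25.0) ** 2)) for i, m in enumerate(noise)]

print(von_neumann_ratio(noise) > 1.5)  # True: consistent with pure noise
print(von_neumann_ratio(bump) < 1.0)   # True: flags the correlated event
```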

  8. Ultra-Gradient Test Cavity for Testing SRF Wafer Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pogue, N. J.; McIntyre, P. M.; Sattarov, A. I.; Reece, C.

    2010-11-01

    A 1.3 GHz test cavity has been designed to test wafer samples of superconducting materials. This mushroom-shaped cavity, operating in the TE01 mode, creates a unique distribution of surface fields. The surface magnetic field on the sample wafer is 3.75 times greater than elsewhere on the niobium cavity surface. This field design is made possible by dielectrically loading the cavity: a hemisphere of ultra-pure sapphire is located just above the sample wafer. The sapphire pulls the fields away from the walls, so the maximum field the Nb surface sees is 25% of the surface field on the sample. In this manner, it should be possible to drive the sample wafer well beyond the BCS limit for niobium while still maintaining a respectable Q. The sapphire's loss tangent and dielectric constant must be measured to finalize the design of the mushroom test cavity. A sapphire-loaded CEBAF cavity has been constructed and tested. The results on the dielectric constant and loss tangent will be presented.

  9. A field test of cut-off importance sampling for bole volume

    Treesearch

    Jeffrey H. Gove; Harry T. Valentine; Michael J. Holmes

    2000-01-01

    Cut-off importance sampling has recently been introduced as a technique for estimating bole volume to some point below the tree tip, termed the cut-off point. A field test of this technique was conducted on a small population of eastern white pine trees using dendrometry as the standard for volume estimation. Results showed that the differences in volume estimates...

  10. Soils Sampling and Testing Training Guide for Field and Laboratory Technicians on Roadway Construction

    DOT National Transportation Integrated Search

    1999-12-01

    This manual has been developed as a training guide for field and laboratory technicians responsible for sampling and testing of soils used in roadway construction. Soils training and certification will increase the knowledge of laboratory, production...

  11. Octet baryons in large magnetic fields

    NASA Astrophysics Data System (ADS)

    Deshmukh, Amol; Tiburzi, Brian C.

    2018-01-01

    Magnetic properties of octet baryons are investigated within the framework of chiral perturbation theory. Utilizing a power counting for large magnetic fields, the Landau levels of charged mesons are treated exactly giving rise to baryon energies that depend nonanalytically on the strength of the magnetic field. In the small-field limit, baryon magnetic moments and polarizabilities emerge from the calculated energies. We argue that the magnetic polarizabilities of hyperons provide a testing ground for potentially large contributions from decuplet pole diagrams. In external magnetic fields, such contributions manifest themselves through decuplet-octet mixing, for which possible results are compared in a few scenarios. These scenarios can be tested with lattice QCD calculations of the octet baryon energies in magnetic fields.

  12. Field assessment of dried Plasmodium falciparum samples for malaria rapid diagnostic test quality control and proficiency testing in Ethiopia.

    PubMed

    Tamiru, Afework; Boulanger, Lucy; Chang, Michelle A; Malone, Joseph L; Aidoo, Michael

    2015-01-21

    Rapid diagnostic tests (RDTs) are now widely used for laboratory confirmation of suspected malaria cases to comply with the World Health Organization recommendation for universal testing before treatment. However, many malaria programmes lack quality control (QC) processes to assess RDT use under field conditions. Prior research showed the feasibility of using the dried tube specimen (DTS) method for preserving Plasmodium falciparum parasites for use as QC samples for RDTs. This study focused on the use of DTS for RDT QC and proficiency testing under field conditions. DTS were prepared using cultured P. falciparum at densities of 500 and 1,000 parasites/μL; 50 μL aliquots of these, along with parasite-negative human blood controls (0 parasites/μL), were air-dried in specimen tubes and reactivity verified after rehydration. The DTS were used in a field study in the Oromia Region of Ethiopia. Replicate DTS samples containing 0, 500 and 1,000 parasites/μL were stored at 4°C at a reference laboratory and at ambient temperatures at two nearby health facilities. At weeks 0, 4, 8, 12, 16, 20, and 24, the DTS were rehydrated and tested on RDTs stored under manufacturer-recommended temperatures at the reference laboratory and on RDTs stored under site-specific conditions at the two health facilities. Reactivity of DTS stored at 4°C at the reference laboratory on RDTs stored at the reference laboratory was considered the gold standard for assessing DTS stability. A proficiency-testing panel consisting of one negative and three positive samples, monitored with a checklist, was administered at weeks 12 and 24. At all seven time points, DTS stored at both the reference laboratory and the health facilities were reactive on RDTs stored under the recommended temperature and under field conditions, and the DTS without malaria parasites were negative. At the reference laboratory and one health facility, a 500 parasites/μL DTS from the proficiency panel was falsely reported as negative at week 24.

  13. Detection of cocaine in cargo containers by high-volume vapor sampling: field test at Port of Miami

    NASA Astrophysics Data System (ADS)

    Neudorfl, Pavel; Hupe, Michael; Pilon, Pierre; Lawrence, Andre H.; Drolet, Gerry; Su, Chih-Wu; Rigdon, Stephen W.; Kunz, Terry D.; Ulwick, Syd; Hoglund, David E.; Wingo, Jeff J.; Demirgian, Jack C.; Shier, Patrick

    1997-02-01

    The use of marine containers is a well-known smuggling method for large shipments of drugs. Such containers present an ideal means of smuggling, as examination is time consuming, difficult, and expensive for the importing community. At present, various methods are being studied for screening containers that would allow officers to rapidly distinguish between innocent and suspicious cargo. Air sampling is one such method: air is withdrawn from the inside of containers and analyzed for telltale vapors uniquely associated with the drug. The attractive feature of the technique is that the containers can be sampled without destuffing and opening, since air can be conveniently withdrawn via ventilation ducts. In the present paper, the development of air sampling methodology for the detection of cocaine hydrochloride will be discussed, and the results from a recent field test will be presented. The results indicated that vapors of cocaine and its decomposition product, ecgonidine methyl ester, could serve as sensitive indicators of the presence of the drug in the containers.

  14. Recovering the full velocity and density fields from large-scale redshift-distance samples

    NASA Technical Reports Server (NTRS)

    Bertschinger, Edmund; Dekel, Avishai

    1989-01-01

    A new method for extracting the large-scale three-dimensional velocity and mass density fields from measurements of the radial peculiar velocities is presented. Galaxies are assumed to trace the velocity field rather than the mass. The key assumption made is that the Lagrangian velocity field has negligible vorticity, as might be expected from perturbations that grew by gravitational instability. By applying the method to cosmological N-body simulations, it is demonstrated that it accurately reconstructs the velocity field. This technique promises a direct determination of the mass density field and the initial conditions for the formation of large-scale structure from galaxy peculiar velocity surveys.

  15. Gravitational waves and large field inflation

    NASA Astrophysics Data System (ADS)

    Linde, Andrei

    2017-02-01

    According to the famous Lyth bound, one can confirm large field inflation by finding tensor modes with sufficiently large tensor-to-scalar ratio r. Here we will try to answer two related questions: is it possible to rule out all large field inflationary models by not finding tensor modes with r above some critical value, and what can we say about the scale of inflation by measuring r? However, in order to answer these questions one should distinguish between two different definitions of the large field inflation and three different definitions of the scale of inflation. We will examine these issues using the theory of cosmological α-attractors as a convenient testing ground.
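
    For reference, the Lyth bound invoked in this abstract can be written in its standard form: over ΔN e-folds of observable inflation, the inflaton excursion obeys

```latex
\frac{\Delta\phi}{M_{\rm Pl}} \gtrsim \Delta N \sqrt{\frac{r}{8}},
\qquad \text{e.g. } \Delta N \simeq 60,\ r \simeq 0.01
\ \Rightarrow\ \Delta\phi \gtrsim 2.1\, M_{\rm Pl},
```

    so a detection of r at the 0.01 level would indicate a super-Planckian field excursion, i.e. large field inflation in the field-excursion sense.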

  16. A laboratory and field evaluation of a portable immunoassay test for triazine herbicides in environmental water samples

    USGS Publications Warehouse

    Schulze, P.A.; Capel, P.D.; Squillace, P.J.; Helsel, D.R.

    1993-01-01

    The usefulness and sensitivity of a portable immunoassay test for the semiquantitative field screening of water samples were evaluated by means of laboratory and field studies. Laboratory results indicated that the tests were useful for the determination of atrazine concentrations of 0.1 to 1.5 μg/L. At a concentration of 1 μg/L, the relative standard deviation in the difference between the regression line and the actual result was about 40 percent. The immunoassay was less sensitive and produced similar errors for other triazine herbicides. After standardization, the test results were relatively insensitive to ionic content and variations in pH (range, 4 to 10), mildly sensitive to temperature changes, and quite sensitive to the timing of the final incubation step; variance in timing can be a significant source of error. Almost all of the immunoassays predicted a higher atrazine concentration in water samples when compared to results of gas chromatography. If these tests are used as a semiquantitative screening tool, this tendency for overprediction does not diminish the tests' usefulness. Generally, the tests seem to be a valuable method for screening water samples for triazine herbicides.

  17. Sampling the sound field in auditoria using large natural-scale array measurements.

    PubMed

    Witew, Ingo B; Vorländer, Michael; Xiang, Ning

    2017-03-01

    Suitable data for spatial wave field analyses in concert halls need to satisfy the sampling theorem and hence require densely spaced measurement positions over extended regions. The described measurement apparatus is capable of automatically sampling the sound field in auditoria over a surface of 5.30 m × 8.00 m at any specified resolution. In addition to discussing design features, a case study based on measured impulse responses is presented. The experimental data allow wave field animations demonstrating how sound propagating at grazing incidence over theater seating is scattered from rows of chairs (seat-dip effect). The visualized data of reflections and scattering from an auditorium's boundaries give insights and opportunities for advanced analyses.

  18. Field Tests of Optical Instruments

    DTIC Science & Technology

    1947-03-15

    Field Tests of Optical Instruments. Bureau of Ordnance, Washington, D.C. The methods and results of a large-scale field test of optical instruments are described. The tests were instituted to check the correctness of theoretical considerations and of laboratory tests which have been used in the selection and design of such instruments. Field conditions approximated as far as possible those...

  19. Wood Dust Sampling: Field Evaluation of Personal Samplers When Large Particles Are Present

    PubMed Central

    Lee, Taekhee; Harper, Martin; Slaven, James E.; Lee, Kiyoung; Rando, Roy J.; Maples, Elizabeth H.

    2011-01-01

    Recent recommendations for wood dust sampling include sampling according to the inhalable convention of International Organization for Standardization (ISO) 7708 (1995) Air quality—particle size fraction definitions for health-related sampling. However, a specific sampling device is not mandated, and while several samplers have laboratory performance approaching theoretical for an ‘inhalable’ sampler, the best choice of sampler for wood dust is not clear. A side-by-side field study was considered the most practical test of samplers, as laboratory performance tests consider overall performance based on a wider range of particle sizes than is commonly encountered in the wood products industry. Seven companies in the wood products industry of the Southeast USA (MS, KY, AL, and WV) participated in this study. The products included hardwood flooring, engineered hardwood flooring, door skins, shutter blinds, kitchen cabinets, plywood, and veneer. The samplers selected were the 37-mm closed-face cassette with ACCU-CAP™, Button, CIP10-I, GSP, and Institute of Occupational Medicine samplers. Approximately 30 of each possible pairwise combination of samplers were collected as personal sample sets. Paired samplers of the same type were used to calculate environmental variance, which was then used to determine the number of pairs of samples necessary to detect any difference at a specified level of confidence. The total number of valid samples was 888 (444 valid pairs). The mass concentration of wood dust ranged from 0.02 to 195 mg m⁻³. The geometric mean (geometric standard deviation) and arithmetic mean (standard deviation) of wood dust were 0.98 mg m⁻³ (3.06) and 2.12 mg m⁻³ (7.74), respectively. One percent of the samples exceeded 15 mg m⁻³, 6% exceeded 5 mg m⁻³, and 48% exceeded 1 mg m⁻³. The number of collected pairs is generally appropriate to detect a 35% difference when outliers (negative mass loadings) are removed. Statistical evaluation of the nonsimilar sampler pair...

  20. Gibbs sampling on large lattice with GMRF

    NASA Astrophysics Data System (ADS)

    Marcotte, Denis; Allard, Denis

    2018-02-01

    Gibbs sampling is routinely used to sample truncated Gaussian distributions. These distributions naturally occur when associating latent Gaussian fields with category fields obtained by discrete simulation methods such as multipoint, sequential indicator and object-based simulation. The latent Gaussians are often used in data assimilation and history matching algorithms. When Gibbs sampling is applied on a large lattice, the computing cost can become prohibitive. The usual practice of using local neighborhoods is unsatisfactory, as it can diverge and does not reproduce exactly the desired covariance. A better approach is to use Gaussian Markov Random Fields (GMRF), which make it possible to compute the conditional distributions at any point without having to compute and invert the full covariance matrix. As the GMRF is locally defined, it allows simultaneous updating of all points that do not share neighbors (coding sets). We propose a new simultaneous Gibbs updating strategy on coding sets that can be efficiently computed by convolution and applied with an acceptance/rejection method in the truncated case. We study empirically the speed of convergence and the effects of the choice of boundary conditions, the correlation range and the GMRF smoothness. We show that convergence is slower in the Gaussian case on the torus than in the finite case studied in the literature. However, in the truncated Gaussian case, we show that short-scale correlation is quickly restored and the conditioning categories at each lattice point imprint the long-scale correlation. Hence our approach makes it possible to realistically apply Gibbs sampling on large 2D or 3D lattices with the desired GMRF covariance.
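
    The core idea, drawing each coordinate from its univariate truncated-normal full conditional, can be sketched in a toy bivariate case (a minimal illustration of Gibbs sampling for a truncated Gaussian, not the authors' GMRF coding-set scheme):

```python
import random
from statistics import NormalDist

def gibbs_truncated_bivariate(rho, n_iter=20000, seed=0):
    """Gibbs sampler for a standard bivariate Gaussian with correlation
    rho, truncated to the quadrant x > 0, y > 0. Each full conditional
    is a univariate truncated normal, sampled by CDF inversion."""
    rng = random.Random(seed)
    nd = NormalDist()
    sd = (1.0 - rho ** 2) ** 0.5  # conditional standard deviation
    x, y = 1.0, 1.0
    out = []
    for _ in range(n_iter):
        # x | y ~ N(rho * y, 1 - rho^2), truncated to x > 0
        lo = nd.cdf((0.0 - rho * y) / sd)
        x = rho * y + sd * nd.inv_cdf(lo + rng.random() * (1.0 - lo))
        # y | x, by symmetry
        lo = nd.cdf((0.0 - rho * x) / sd)
        y = rho * x + sd * nd.inv_cdf(lo + rng.random() * (1.0 - lo))
        out.append((x, y))
    return out

samples = gibbs_truncated_bivariate(0.8)
print(all(x > 0 and y > 0 for x, y in samples))  # True
```

    Every draw respects the truncation by construction, which is what makes Gibbs updates attractive for truncated latent fields.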

  1. Soil Sampling Techniques For Alabama Grain Fields

    NASA Technical Reports Server (NTRS)

    Thompson, A. N.; Shaw, J. N.; Mask, P. L.; Touchton, J. T.; Rickman, D.

    2003-01-01

    Characterizing the spatial variability of nutrients facilitates precision soil sampling. Questions exist regarding the best technique for directed soil sampling based on a priori knowledge of soil and crop patterns. The objective of this study was to evaluate zone delineation techniques for Alabama grain fields to determine which method best minimized the soil test variability. Site one (25.8 ha) and site three (20.0 ha) were located in the Tennessee Valley region, and site two (24.2 ha) was located in the Coastal Plain region of Alabama. Tennessee Valley soils ranged from well drained Rhodic and Typic Paleudults to somewhat poorly drained Aquic Paleudults and Fluventic Dystrudepts. Coastal Plain soils ranged from coarse-loamy Rhodic Kandiudults to loamy Arenic Kandiudults. Soils were sampled by grid soil sampling methods (grid sizes of 0.40 ha and 1 ha) consisting of: 1) twenty composited cores collected randomly throughout each grid (grid-cell sampling) and, 2) six composited cores collected randomly from a 3 x 3 m area at the center of each grid (grid-point sampling). Zones were established from 1) an Order 1 Soil Survey, 2) corn (Zea mays L.) yield maps, and 3) airborne remote sensing images. All soil properties were moderately to strongly spatially dependent as per semivariogram analyses. Differences in grid-point and grid-cell soil test values suggested grid-point sampling does not accurately represent grid values. Zones created by soil survey, yield data, and remote sensing images displayed lower coefficients of variation (CV) for soil test values than overall field values, suggesting these techniques effectively group soil test variability. However, few differences were observed between the three zone delineation techniques. Results suggest directed sampling using zone delineation techniques outlined in this paper would result in more efficient soil sampling for these Alabama grain fields.

  2. Validation of the Puumala virus rapid field test for bank voles in Germany.

    PubMed

    Reil, D; Imholt, C; Rosenfeld, U M; Drewes, S; Fischer, S; Heuser, E; Petraityte-Burneikiene, R; Ulrich, R G; Jacob, J

    2017-02-01

    Puumala virus (PUUV) causes many human infections in large parts of Europe and can lead to mild to moderate disease. The bank vole (Myodes glareolus) is the only reservoir of PUUV in Central Europe. A commercial PUUV rapid field test for rodents was validated for bank-vole blood samples collected in two PUUV-endemic regions in Germany (North Rhine-Westphalia and Baden-Württemberg). A comparison of the results of the rapid field test and standard ELISAs indicated a test efficacy of 93-95%, largely independent of the origin of the antigens used in the ELISA. In ELISAs, reactivity for the German PUUV strain was higher compared to the Swedish strain but not compared to the Finnish strain, which was used for the rapid field test. In conclusion, the use of the rapid field test can facilitate short-term estimation of PUUV seroprevalence in bank-vole populations in Germany and can aid in assessing human PUUV infection risk.
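
    The reported test efficacy is a percent-agreement figure between the rapid test and the ELISA reference. With a hypothetical 2x2 table (illustrative counts only, not the study's data), the calculation is:

```python
# Hypothetical 2x2 table: rapid field test vs. ELISA (illustrative counts)
tp, fp, fn, tn = 88, 3, 4, 105

agreement = (tp + tn) / (tp + fp + fn + tn)  # overall test efficacy
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

print(round(agreement, 3))    # 0.965
print(round(sensitivity, 3))  # 0.957
print(round(specificity, 3))  # 0.972
```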

  3. A field test of point relascope sampling of down coarse woody material in managed stands in the Acadian Forest

    Treesearch

    John C. Brissette; Mark J. Ducey; Jeffrey H. Gove

    2003-01-01

    We field tested a new method for sampling down coarse woody material (CWM) using an angle gauge and compared it with the more traditional line intersect sampling (LIS) method. Permanent sample locations in stands managed with different silvicultural treatments within the Penobscot Experimental Forest (Maine, USA) were used as the sampling locations. Point relascope...

  4. A Multilevel, Hierarchical Sampling Technique for Spatially Correlated Random Fields

    DOE PAGES

    Osborn, Sarah; Vassilevski, Panayot S.; Villa, Umberto

    2017-10-26

    In this paper, we propose an alternative method to generate samples of a spatially correlated random field with applications to large-scale problems for forward propagation of uncertainty. A classical approach for generating these samples is the Karhunen--Loève (KL) decomposition. However, the KL expansion requires solving a dense eigenvalue problem and is therefore computationally infeasible for large-scale problems. Sampling methods based on stochastic partial differential equations provide a highly scalable way to sample Gaussian fields, but the resulting parametrization is mesh dependent. We propose a multilevel decomposition of the stochastic field to allow for scalable, hierarchical sampling based on solving a mixed finite element formulation of a stochastic reaction-diffusion equation with a random, white noise source function. Lastly, numerical experiments are presented to demonstrate the scalability of the sampling method as well as numerical results of multilevel Monte Carlo simulations for a subsurface porous media flow application using the proposed sampling method.
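
    The classical KL route that the abstract contrasts against can be sketched on a small 1D grid; the dense eigendecomposition below is precisely the step whose O(n³) cost becomes infeasible at scale (numpy assumed available; the exponential covariance is an illustrative choice):

```python
import numpy as np

# Karhunen-Loeve sampling of a 1D Gaussian field with exponential
# covariance on a small grid.
n, corr_len = 200, 0.1
x = np.linspace(0.0, 1.0, n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)  # dense covariance

# Dense symmetric eigendecomposition: O(n^3) time, O(n^2) memory
eigvals, eigvecs = np.linalg.eigh(C)
eigvals = np.clip(eigvals, 0.0, None)  # guard against tiny negative values

rng = np.random.default_rng(42)
xi = rng.standard_normal(n)
sample = eigvecs @ (np.sqrt(eigvals) * xi)  # one realization of the field

# A truncated expansion keeping the top-k modes captures most variance
k = 50
captured = float(eigvals[-k:].sum() / eigvals.sum())
print(sample.shape, round(captured, 2))  # captured fraction is close to 1
```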

  6. Field Exploration and Life Detection Sampling Through Planetary Analogue Sampling (FELDSPAR).

    NASA Technical Reports Server (NTRS)

    Stockton, A.; Amador, E. S.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Cullen, T.; Duca, Z.; Gentry, D. M.; Kirby, J.; Jacobsen, M.

    2017-01-01

    Exploration missions to Mars rely on rovers to perform analyses over small sampling areas; however, landing sites for these missions are selected based on large-scale, low-resolution remote data. The use of Earth analogue environments to estimate the multi-scale spatial distributions of key signatures of habitability can help ensure mission science goals are met. A main goal of FELDSPAR is to conduct field operations analogous to Mars sample return in its science, operations, and technology from landing site selection, to in-field sampling location selection, remote or stand-off analysis, in situ analysis, and home laboratory analysis. Lava fields and volcanic regions are relevant analogues to Martian landscapes due to desiccation, low nutrient availability, and temperature extremes. Operationally, many Icelandic lava fields are remote enough to require that field expeditions address several sampling constraints that are experienced in robotic exploration, including in situ and sample return missions. The Fimmvörðuháls lava field was formed by a basaltic effusive eruption associated with the 2010 Eyjafjallajökull eruption. Mælifellssandur is a recently deglaciated plain to the north of the Mýrdalsjökull glacier. Holuhraun was formed by a 2014 fissure eruption just north of the large Vatnajökull glacier. Dyngjusandur is an alluvial plain apparently kept barren by repeated mechanical weathering. Informed by our 2013 expedition, we collected samples in nested triangular grids every decade from the 10 cm scale to the 1 km scale (as permitted by the size of the site). Satellite imagery is available for older sites, and for Mælifellssandur, Holuhraun, and Dyngjusandur we obtained overhead imagery at 1 m to 200 m elevation. PanCam-style photographs were taken in the field by sampling personnel. In-field reflectance spectroscopy was also obtained with an ASD spectrometer in Dyngjusandur. All sites chosen were 'homogeneous' in apparent color, morphology, moisture, grain size, and

  7. Potential, velocity, and density fields from sparse and noisy redshift-distance samples - Method

    NASA Technical Reports Server (NTRS)

    Dekel, Avishai; Bertschinger, Edmund; Faber, Sandra M.

    1990-01-01

    A method for recovering the three-dimensional potential, velocity, and density fields from large-scale redshift-distance samples is described. Galaxies are taken as tracers of the velocity field, not of the mass. The density field and the initial conditions are calculated using an iterative procedure that applies the no-vorticity assumption at an initial time and uses the Zel'dovich approximation to relate initial and final positions of particles on a grid. The method is tested using a cosmological N-body simulation 'observed' at the positions of real galaxies in a redshift-distance sample, taking into account their distance measurement errors. Malmquist bias and other systematic and statistical errors are extensively explored using both analytical techniques and Monte Carlo simulations.

  8. Identifying Microlensing Events in Large, Non-Uniformly Sampled Surveys: The Case of the Palomar Transient Factory

    NASA Astrophysics Data System (ADS)

    Price-Whelan, Adrian M.; Agueros, M. A.; Fournier, A.; Street, R.; Ofek, E.; Levitan, D. B.; PTF Collaboration

    2013-01-01

    Many current photometric, time-domain surveys are driven by specific goals such as searches for supernovae or transiting exoplanets, or studies of stellar variability. These goals in turn set the cadence with which individual fields are re-imaged. In the case of the Palomar Transient Factory (PTF), several such sub-surveys are being conducted in parallel, leading to extremely non-uniform sampling over the survey's nearly 20,000 sq. deg. footprint. While the typical 7.26 sq. deg. PTF field has been imaged 20 times in R-band, ~2300 sq. deg. have been observed more than 100 times. We use the existing PTF data (6.4 × 10⁷ light curves) to study the trade-off that occurs when searching for microlensing events when one has access to a large survey footprint with irregular sampling. To examine the probability that microlensing events can be recovered in these data, we also test previous statistics used on uniformly sampled data to identify variables and transients. We find that one such statistic, the von Neumann ratio, performs best for identifying simulated microlensing events. We develop a selection method using this statistic and apply it to data from all PTF fields with >100 observations to uncover a number of interesting candidate events. This work can help constrain all-sky event rate predictions and test microlensing signal recovery in large datasets, both of which will be useful to future wide-field, time-domain surveys such as the LSST.
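The von Neumann ratio used in this record is simply the mean squared successive difference of a light curve divided by its variance; for uncorrelated noise it is close to 2, while a smooth, correlated excursion such as a microlensing bump drives it well below 2. A minimal sketch on synthetic data (not PTF's actual pipeline):

```python
import numpy as np

def von_neumann_ratio(mags):
    """Mean squared successive difference over variance.
    ~2 for pure noise; << 2 for smooth, correlated variation."""
    mags = np.asarray(mags, dtype=float)
    return np.mean(np.diff(mags) ** 2) / np.var(mags)

rng = np.random.default_rng(1)
noise = rng.standard_normal(500)                            # uncorrelated scatter
bump = np.exp(-0.5 * ((np.arange(500) - 250) / 30.0) ** 2)  # smooth "event"

print(round(von_neumann_ratio(noise), 2))                # close to 2
print(round(von_neumann_ratio(noise * 0.05 + bump), 2))  # well below 2
```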

  9. Noninvasive pH monitoring of platelet concentrates: a large field test.

    PubMed

    Gkoumassi, Effimia; Klein-Bosgoed, Christa; Dijkstra-Tiekstra, Margriet J; de Korte, Dirk; de Wildt-Eggen, Janny

    2013-10-01

    Developing new quality control methods for platelet concentrates (PCs) can contribute to increasing transfusion safety and efficiency. The aim of this study was to investigate in a large field test the quality of expired PCs and whether 100% noninvasive pH monitoring can be used to predict PC quality. The pH of 13,693 PCs produced for transfusion was monitored daily using Blood Storage, Inc.'s pH sterile, automated fluoroscopic evaluation technology. Upon indication of compromised quality or expiration, PCs were returned and in vitro tests were performed. A total of 998 PCs were returned, of which 962 were outdated, 26 had a positive BacT/ALERT reaction, seven had aggregates, one was without swirl, one had low pH, and one had high pH. BacT/ALERT was faster in identifying bacterial contamination than pH measurements. The pH at the end of the storage period was significantly lower than at the beginning. In vitro tests indicated that while the PC quality was acceptable upon expiration, it rapidly declined after expiration. In this setting, where the vast majority of PCs were of good quality and within acceptable pH limits, daily, noninvasive routine pH measurement has limited added value in identifying quality-compromised PCs. © 2013 Sanquin Research. Transfusion © 2013 American Association of Blood Banks.

  10. Evaluation of the Biological Sampling Kit (BiSKit) for Large-Area Surface Sampling

    PubMed Central

    Buttner, Mark P.; Cruz, Patricia; Stetzenbach, Linda D.; Klima-Comba, Amy K.; Stevens, Vanessa L.; Emanuel, Peter A.

    2004-01-01

    Current surface sampling methods for microbial contaminants are designed to sample small areas and utilize culture analysis. The total number of microbes recovered is low because a small area is sampled, making detection of a potential pathogen more difficult. Furthermore, sampling of small areas requires a greater number of samples to be collected, which delays the reporting of results, taxes laboratory resources and staffing, and increases analysis costs. A new biological surface sampling method, the Biological Sampling Kit (BiSKit), designed to sample large areas and to be compatible with testing with a variety of technologies, including PCR and immunoassay, was evaluated and compared to other surface sampling strategies. In experimental room trials, wood laminate and metal surfaces were contaminated by aerosolization of Bacillus atrophaeus spores, a simulant for Bacillus anthracis, into the room, followed by settling of the spores onto the test surfaces. The surfaces were sampled with the BiSKit, a cotton-based swab, and a foam-based swab. Samples were analyzed by culturing, quantitative PCR, and immunological assays. The results showed that the large surface area (1 m²) sampled with the BiSKit resulted in concentrations of B. atrophaeus in samples that were up to 10-fold higher than the concentrations obtained with the other methods tested. A comparison of wet and dry sampling with the BiSKit indicated that dry sampling was more efficient (efficiency, 18.4%) than wet sampling (efficiency, 11.3%). The sensitivities of detection of B. atrophaeus on metal surfaces were 42 ± 5.8 CFU/m² for wet sampling and 100.5 ± 10.2 CFU/m² for dry sampling. These results demonstrate that the use of a sampling device capable of sampling larger areas results in higher sensitivity than that obtained with currently available methods and has the advantage of sampling larger areas, thus requiring collection of fewer samples per site. PMID:15574898
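The wet and dry efficiencies quoted above are recovery fractions: organisms recovered by the sampler divided by organisms deposited on the sampled surface. A hypothetical back-calculation (the seeded count is invented for illustration):

```python
# Recovery efficiency of a surface sampler: CFU recovered divided by
# CFU deposited on the sampled area. The counts are hypothetical.

def recovery_efficiency(cfu_recovered, cfu_deposited):
    """Fraction of deposited organisms recovered by the sampler."""
    return cfu_recovered / cfu_deposited

# Hypothetical: 184 CFU recovered from a surface seeded with 1000 CFU
# reproduces the dry-sampling figure of 18.4% reported above.
print(f"{recovery_efficiency(184, 1000):.1%}")  # 18.4%
```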

  11. Conducting field studies for testing pesticide leaching models

    USGS Publications Warehouse

    Smith, Charles N.; Parrish, Rudolph S.; Brown, David S.

    1990-01-01

    A variety of predictive models are being applied to evaluate the transport and transformation of pesticides in the environment. These include well known models such as the Pesticide Root Zone Model (PRZM), the Risk of Unsaturated-Saturated Transport and Transformation Interactions for Chemical Concentrations Model (RUSTIC) and the Groundwater Loading Effects of Agricultural Management Systems Model (GLEAMS). The potentially large impacts of using these models as tools for developing pesticide management strategies and regulatory decisions necessitates development of sound model validation protocols. This paper offers guidance on many of the theoretical and practical problems encountered in the design and implementation of field-scale model validation studies. Recommendations are provided for site selection and characterization, test compound selection, data needs, measurement techniques, statistical design considerations and sampling techniques. A strategy is provided for quantitatively testing models using field measurements.

  12. Field size, length, and width distributions based on LACIE ground truth data. [large area crop inventory experiment

    NASA Technical Reports Server (NTRS)

    Pitts, D. E.; Badhwar, G.

    1980-01-01

    The development of agricultural remote sensing systems requires knowledge of agricultural field size distributions so that the sensors, sampling frames, image interpretation schemes, registration systems, and classification systems can be properly designed. Malila et al. (1976) studied the field size distribution for wheat and all other crops in two Kansas LACIE (Large Area Crop Inventory Experiment) intensive test sites using ground observations of the crops and measurements of their field areas based on current year rectified aerial photomaps. The field area and size distributions reported in the present investigation are derived from a representative subset of a stratified random sample of LACIE sample segments. In contrast to previous work, the obtained results indicate that most field-size distributions are not log-normally distributed. The most common field size observed in this study was 10 acres for most crops studied.

  13. FIELD-SCALE STUDIES: HOW DOES SOIL SAMPLE PRETREATMENT AFFECT REPRESENTATIVENESS?

    EPA Science Inventory

    Samples from field-scale studies are very heterogeneous and can contain large soil and rock particles. Oversize materials are often removed before chemical analysis of the soil samples because it is not practical to include these materials. Is the extracted sample representativ...

  14. Characteristics of the LacTek test as applied to tissue samples: assessment of performance using incurred field samples.

    PubMed

    Mitchell, J M; McNab, W B; Yee, A J; Griffiths, M W; McEwen, S A; Spilsbury, L; Boison, J O

    1998-08-01

    The LacTek test, marketed for antimicrobial residue detection in milk, was validated for the detection of antimicrobial residues in tissues. A previous study found that the LacTek test could confidently identify tissue samples spiked with antimicrobial residues. However, the test could not reliably distinguish violative from nonviolative spiked samples relative to Canadian maximum residue limits (MRLs). The objectives of this study were to assess and compare the performance of the LacTek tests for beta-lactams, tetracyclines, gentamicin, and sulfamethazine on samples containing naturally incurred residues by running the test in parallel with the standard microbial inhibition test (MIT) presently used for the routine testing of tissues at our facility and to assess the agreement with high pressure liquid chromatographic (HPLC) determinative methods. Parallel testing with the official MIT found that the LacTek tests could be confidently used for testing tissue samples containing incurred residues. Among 1,008 MIT-positive samples, the LacTek test found that 90% contained beta-lactams and/or tetracyclines. A further 7.3% of violative residues could not be identified to an antimicrobial class. In addition, 9% of samples testing negative on the MIT were found to contain an antimicrobial residue by the LacTek tests. Comparative testing with HPLC methods found that there was very good agreement between the two tests and that most violations were due to penicillin G and oxytetracycline. Although the LacTek test cannot be used to distinguish violative from nonviolative residue levels, it does offer several advantages over the present MIT. These include speed, ease of use, the ability to identify residues to a specific class, and an improved sensitivity at the MRL level for the most commonly found antimicrobials in tissue.

  15. Depiction of pneumothoraces in a large animal model using x-ray dark-field radiography.

    PubMed

    Hellbach, Katharina; Baehr, Andrea; De Marco, Fabio; Willer, Konstantin; Gromann, Lukas B; Herzen, Julia; Dmochewitz, Michaela; Auweter, Sigrid; Fingerle, Alexander A; Noël, Peter B; Rummeny, Ernst J; Yaroshenko, Andre; Maack, Hanns-Ingo; Pralow, Thomas; van der Heijden, Hendrik; Wieberneit, Nataly; Proksa, Roland; Koehler, Thomas; Rindt, Karsten; Schroeter, Tobias J; Mohr, Juergen; Bamberg, Fabian; Ertl-Wagner, Birgit; Pfeiffer, Franz; Reiser, Maximilian F

    2018-02-08

    The aim of this study was to assess the diagnostic value of x-ray dark-field radiography to detect pneumothoraces in a pig model. Eight pigs were imaged with an experimental grating-based large-animal dark-field scanner before and after induction of a unilateral pneumothorax. Image contrast-to-noise ratios between lung tissue and the air-filled pleural cavity were quantified for transmission and dark-field radiograms. The projected area in the object plane of the inflated lung was measured in dark-field images to quantify the collapse of lung parenchyma due to a pneumothorax. Means and standard deviations for lung sizes and signal intensities from dark-field and transmission images were tested for statistical significance using Student's two-tailed t-test for paired samples. The contrast-to-noise ratio between the air-filled pleural space of lateral pneumothoraces and lung tissue was significantly higher in the dark-field (3.65 ± 0.9) than in the transmission images (1.13 ± 1.1; p = 0.002). In the case of dorsally located pneumothoraces, a significant decrease (-20.5%; p < 0.0001) in the projected area of inflated lung parenchyma was found after a pneumothorax was induced. Therefore, the detection of pneumothoraces in x-ray dark-field radiography was facilitated compared to transmission imaging in a large animal model.

  16. Cross-validation of the Dot Counting Test in a large sample of credible and non-credible patients referred for neuropsychological testing.

    PubMed

    McCaul, Courtney; Boone, Kyle B; Ermshar, Annette; Cottingham, Maria; Victor, Tara L; Ziegler, Elizabeth; Zeller, Michelle A; Wright, Matthew

    2018-01-18

    To cross-validate the Dot Counting Test in a large neuropsychological sample. Dot Counting Test scores were compared in credible (n = 142) and non-credible (n = 335) neuropsychology referrals. Non-credible patients scored significantly higher than credible patients on all Dot Counting Test scores. While the original E-score cut-off of ≥17 achieved excellent specificity (96.5%), it was associated with mediocre sensitivity (52.8%). However, the cut-off could be substantially lowered to ≥13.80, while still maintaining adequate specificity (≥90%), and raising sensitivity to 70.0%. Examination of non-credible subgroups revealed that Dot Counting Test sensitivity in feigned mild traumatic brain injury (mTBI) was 55.8%, whereas sensitivity was 90.6% in patients with non-credible cognitive dysfunction in the context of claimed psychosis, and 81.0% in patients with non-credible cognitive performance in depression or severe TBI. Thus, the Dot Counting Test may have a particular role in detection of non-credible cognitive symptoms in claimed psychiatric disorders. Alternative to use of the E-score, failure on ≥1 cut-offs applied to individual Dot Counting Test scores (≥6.0″ for mean grouped dot counting time, ≥10.0″ for mean ungrouped dot counting time, and ≥4 errors) occurred in 11.3% of the credible sample, while nearly two-thirds (63.6%) of the non-credible sample failed one or more of these cut-offs. An E-score cut-off of 13.80, or failure on ≥1 individual score cut-offs, resulted in few false positive identifications in credible patients and achieved high sensitivity (64.0-70.0%), and therefore appears appropriate for use in identifying neurocognitive performance invalidity.
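Sensitivity and specificity trade off as the E-score cut-off moves, which is how the authors justify lowering it from ≥17 to ≥13.80. A generic sketch of that computation on simulated scores (the score distributions below are invented, not the study's data):

```python
import numpy as np

def sens_spec(scores_noncredible, scores_credible, cutoff):
    """Sensitivity: fraction of non-credible cases at/above the cutoff.
    Specificity: fraction of credible cases below the cutoff."""
    sens = np.mean(scores_noncredible >= cutoff)
    spec = np.mean(scores_credible < cutoff)
    return sens, spec

rng = np.random.default_rng(2)
credible = rng.normal(9.0, 3.0, 142)      # hypothetical E-score distributions
noncredible = rng.normal(17.0, 5.0, 335)

# Lowering the cutoff raises sensitivity at the cost of specificity.
for cutoff in (17.0, 13.8):
    sens, spec = sens_spec(noncredible, credible, cutoff)
    print(f"cutoff {cutoff}: sensitivity {sens:.1%}, specificity {spec:.1%}")
```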

  17. A Sample of Very Young Field L Dwarfs and Implications for the Brown Dwarf "Lithium Test" at Early Ages

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, J. Davy; Cruz, Kelle L.; Barman, Travis S.; Burgasser, Adam J.; Looper, Dagny L.; Tinney, C. G.; Gelino, Christopher R.; Lowrance, Patrick J.; Liebert, James; Carpenter, John M.; Hillenbrand, Lynne A.; Stauffer, John R.

    2008-12-01

    Using a large sample of optical spectra of late-type dwarfs, we identify a subset of late-M through L field dwarfs that, because of the presence of low-gravity features in their spectra, are believed to be unusually young. From a combined sample of 303 field L dwarfs, we find observationally that 7.6% ± 1.6% are younger than 100 Myr. This percentage is in agreement with theoretical predictions once observing biases are taken into account. We find that these young L dwarfs tend to fall in the southern hemisphere (decl. < 0°) and may be previously unrecognized, low-mass members of nearby, young associations like Tucana-Horologium, TW Hydrae, β Pictoris, and AB Doradus. We use a homogeneously observed sample of ~150 optical spectra to examine lithium strength as a function of L/T spectral type and further corroborate the trends noted by Kirkpatrick and coworkers. We use our low-gravity spectra to investigate lithium strength as a function of age. The data weakly suggest that for early- to mid-L dwarfs the line strength reaches a maximum at ages of a few × 100 Myr, whereas for much older (few Gyr) and much younger (<100 Myr) L dwarfs the line is weaker or undetectable. We show that a weakening of lithium at lower gravities is predicted by model atmosphere calculations, an effect partially corroborated by existing observational data. Larger samples containing L dwarfs of well-determined ages are needed to further test this empirically. If verified, this result would reinforce the caveat first cited by Kirkpatrick and coworkers that the lithium test should be used with caution when attempting to confirm the substellar nature of the youngest brown dwarfs. Most of the spectroscopic data presented herein were obtained at the W. M. Keck Observatory, which is operated as a scientific partnership among the California Institute of Technology, the University of California, and the National Aeronautics and Space Administration. The Observatory was made possible by the generous

  18. Further examination of embedded performance validity indicators for the Conners' Continuous Performance Test and Brief Test of Attention in a large outpatient clinical sample.

    PubMed

    Sharland, Michael J; Waring, Stephen C; Johnson, Brian P; Taran, Allise M; Rusin, Travis A; Pattock, Andrew M; Palcher, Jeanette A

    2018-01-01

    Assessing test performance validity is a standard clinical practice and although studies have examined the utility of cognitive/memory measures, few have examined attention measures as indicators of performance validity beyond the Reliable Digit Span. The current study further investigates the classification probability of embedded Performance Validity Tests (PVTs) within the Brief Test of Attention (BTA) and the Conners' Continuous Performance Test (CPT-II), in a large clinical sample. This was a retrospective study of 615 patients consecutively referred for comprehensive outpatient neuropsychological evaluation. Non-credible performance was defined two ways: failure on one or more PVTs and failure on two or more PVTs. Classification probability of the BTA and CPT-II into non-credible groups was assessed. Sensitivity, specificity, positive predictive value, and negative predictive value were derived to identify clinically relevant cut-off scores. When using failure on two or more PVTs as the indicator for non-credible responding compared to failure on one or more PVTs, highest classification probability, or area under the curve (AUC), was achieved by the BTA (AUC = .87 vs. .79). CPT-II Omission, Commission, and Total Errors exhibited higher classification probability as well. Overall, these findings corroborate previous findings, extending them to a large clinical sample. BTA and CPT-II are useful embedded performance validity indicators within a clinical battery but should not be used in isolation without other performance validity indicators.

  19. FIELD-SCALE STUDIES: HOW DOES SOIL SAMPLE PRETREATMENT AFFECT REPRESENTATIVENESS ? (ABSTRACT)

    EPA Science Inventory

    Samples from field-scale studies are very heterogeneous and can contain large soil and rock particles. Oversize materials are often removed before chemical analysis of the soil samples because it is not practical to include these materials. Is the extracted sample representativ...

  20. Deep Borehole Field Test Research Activities at LBNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobson, Patrick; Tsang, Chin-Fu; Kneafsey, Timothy

    The goal of the U.S. Department of Energy Used Fuel Disposition's (UFD) Deep Borehole Field Test is to drill two 5 km large-diameter boreholes: a characterization borehole with a bottom-hole diameter of 8.5 inches and a field test borehole with a bottom-hole diameter of 17 inches. These boreholes will be used to demonstrate the ability to drill such holes in crystalline rocks, effectively characterize the bedrock repository system using geophysical, geochemical, and hydrological techniques, and emplace and retrieve test waste packages. These studies will be used to test the deep borehole disposal concept, which requires a hydrologically isolated environment characterized by low permeability, stable fluid density, reducing fluid chemistry conditions, and an effective borehole seal. During FY16, Lawrence Berkeley National Laboratory scientists conducted a number of research studies to support the UFD Deep Borehole Field Test effort. This work included providing supporting data for the Los Alamos National Laboratory geologic framework model for the proposed deep borehole site, conducting an analog study using an extensive suite of geoscience data and samples from a deep (2.5 km) research borehole in Sweden, conducting laboratory experiments and coupled process modeling related to borehole seals, and developing a suite of potential techniques that could be applied to the characterization and monitoring of the deep borehole environment. The results of these studies are presented in this report.

  1. Large strain cruciform biaxial testing for FLC detection

    NASA Astrophysics Data System (ADS)

    Güler, Baran; Efe, Mert

    2017-10-01

    Selection of a proper test method, specimen design, and analysis method is a key issue for studying the formability of sheet metals and detecting their forming limit curves (FLC). Materials with complex microstructures may need an additional micro-mechanical investigation and accurate modelling. The cruciform biaxial test stands as an alternative to standard tests as it achieves frictionless, in-plane, multi-axial stress states with a single sample geometry. In this study, we introduce a small-scale (less than 10 cm) cruciform sample allowing micro-mechanical investigation at stress states ranging from plane strain to equibiaxial. With successful specimen design and surface finish, large forming limit strains are obtained at the test region of the sample. The large forming limit strains obtained by experiments are compared to the values obtained from the Marciniak-Kuczynski (M-K) local necking model and the Cockcroft-Latham damage model. This comparison shows that the experimental limiting strains are beyond the theoretical values, approaching the fracture strain of the two test materials: Al-6061-T6 aluminum alloy and DC-04 high formability steel.

  2. Large-scale field testing on flexible shallow landslide barriers

    NASA Astrophysics Data System (ADS)

    Bugnion, Louis; Volkwein, Axel; Wendeler, Corinna; Roth, Andrea

    2010-05-01

    Open shallow landslides occur regularly in a wide range of natural terrains. Generally, they are difficult to predict and result in damages to properties and disruption of transportation systems. In order to improve the knowledge about the physical process itself and to develop new protection measures, large-scale field experiments were conducted in Veltheim, Switzerland. Material was released down a 30° inclined test slope into a flexible barrier. The flow as well as the impact into the barrier was monitored using various measurement techniques. Laser devices recording flow heights, a special force plate measuring normal and shear basal forces, as well as load cells for impact pressures were installed along the test slope. In addition, load cells were built into the support and retaining cables of the barrier to provide data for detailed back-calculation of load distribution during impact. For the last test series, an additional guiding wall in flow direction was installed on both sides of the barrier to achieve higher impact pressures in the middle of the barrier. With these guiding walls the flow is not able to spread out before hitting the barrier. A specially constructed release mechanism simulating the sudden failure of the slope was designed such that about 50 m3 of mixed earth and gravel saturated with water can be released in an instant. Analysis of cable forces combined with impact pressures and velocity measurements during a test series now allows us to develop a load model for the barrier design. First numerical simulations with the software tool FARO, originally developed for rockfall barriers and afterwards calibrated for debris flow impacts, have already led to structural improvements in barrier design. Decisive for the barrier design are the initial dynamic impact pressure, which depends on the flow velocity, and afterwards the hydrostatic pressure of the complete retained material behind the barrier. Therefore, volume estimation of open shallow landslides by assessing

  3. Remedial investigation sampling and analysis plan for J-Field, Aberdeen Proving Ground, Maryland. Volume 1: Field Sampling Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benioff, P.; Biang, R.; Dolak, D.

    1995-03-01

    The Environmental Management Division (EMD) of Aberdeen Proving Ground (APG), Maryland, is conducting a remedial investigation and feasibility study (RI/FS) of the J-Field area at APG pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), as amended. J-Field is within the Edgewood Area of APG in Harford County, Maryland (Figure 1.1). Since World War II, activities in the Edgewood Area have included the development, manufacture, testing, and destruction of chemical agents and munitions. These materials were destroyed at J-Field by open burning and open detonation (OB/OD). Considerable archival information about J-Field exists as a result of efforts by APG staff to characterize the hazards associated with the site. Contamination of J-Field was first detected during an environmental survey of the Edgewood Area conducted in 1977 and 1978 by the US Army Toxic and Hazardous Materials Agency (USATHAMA) (predecessor to the US Army Environmental Center [AEC]). As part of a subsequent USATHAMA environmental survey, 11 wells were installed and sampled at J-Field. Contamination at J-Field was also detected during a munitions disposal survey conducted by Princeton Aqua Science in 1983. The Princeton Aqua Science investigation involved the installation and sampling of nine wells and the collection and analysis of surficial and deep composite soil samples. In 1986, a Resource Conservation and Recovery Act (RCRA) permit (MD3-21-002-1355) requiring a basewide RCRA Facility Assessment (RFA) and a hydrogeologic assessment of J-Field was issued by the US Environmental Protection Agency (EPA). In 1987, the US Geological Survey (USGS) began a two-phased hydrogeologic assessment in which data were collected to model groundwater flow at J-Field. Soil gas investigations were conducted, several well clusters were installed, a groundwater flow model was developed, and groundwater and surface water monitoring programs were established that continue

  4. A flux extraction device to measure the magnetic moment of large samples; application to bulk superconductors.

    PubMed

    Egan, R; Philippe, M; Wera, L; Fagnard, J F; Vanderheyden, B; Dennis, A; Shi, Y; Cardwell, D A; Vanderbemden, P

    2015-02-01

    We report the design and construction of a flux extraction device to measure the DC magnetic moment of large samples (i.e., several cm³) at cryogenic temperature. The signal is constructed by integrating the electromotive force generated by two coils wound in series-opposition that move around the sample. We show that an octupole expansion of the magnetic vector potential can be used conveniently to treat near-field effects for this geometrical configuration. The resulting expansion is tested for the case of a large, permanently magnetized, type-II superconducting sample. The dimensions of the sensing coils are determined in such a way that the measurement is influenced by the dipole magnetic moment of the sample and not by moments of higher order, within user-determined upper bounds. The device, which is able to measure magnetic moments in excess of 1 A·m² (1000 emu), is validated by (i) a direct calibration experiment using a small coil driven by a known current and (ii) by comparison with the results of numerical calculations obtained previously using a flux measurement technique. The sensitivity of the device is demonstrated by the measurement of flux-creep relaxation of the magnetization in a large bulk superconductor sample at liquid nitrogen temperature (77 K).
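In a flux-extraction measurement of this kind, the moment follows from the time-integrated electromotive force: integrating ε = -dΦ/dt over the coil's travel gives the total flux change, which divided by the coil pair's coupling constant yields the dipole moment. A numerical sketch with a synthetic signal and an invented coupling constant (not the paper's calibration values):

```python
import numpy as np

# Magnetic moment from a flux-extraction signal: integrate the EMF over
# the extraction to get the total flux change, then divide by the
# coupling constant k of the pickup-coil pair. Signal and k are invented.
t = np.linspace(0.0, 1.0, 1001)           # time during extraction, s
emf = 2e-3 * np.sin(np.pi * t) ** 2       # synthetic EMF trace, V
k = 1.5e-3                                # coupling constant, Wb per A*m^2

# Trapezoidal integration of the EMF gives the flux change in Wb.
delta_phi = float(np.sum(0.5 * (emf[1:] + emf[:-1]) * np.diff(t)))
moment = delta_phi / k                    # dipole moment, A*m^2
print(round(moment, 3))  # 0.667
```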

  5. Statistical Analysis of a Large Sample Size Pyroshock Test Data Set Including Post Flight Data Assessment. Revision 1

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Anne M.

    2010-01-01

    The Earth Observing System (EOS) Terra spacecraft was launched on an Atlas IIAS launch vehicle on its mission to observe planet Earth in late 1999. Prior to launch, the new design of the spacecraft's pyroshock separation system was characterized by a series of 13 separation ground tests. The analysis methods used to evaluate this unusually large amount of shock data will be discussed in this paper, with particular emphasis on population distributions and finding statistically significant families of data, leading to an overall shock separation interface level. The wealth of ground test data also allowed a derivation of a Mission Assurance level for the flight. All of the flight shock measurements were below the EOS Terra Mission Assurance level thus contributing to the overall success of the EOS Terra mission. The effectiveness of the statistical methodology for characterizing the shock interface level and for developing a flight Mission Assurance level from a large sample size of shock data is demonstrated in this paper.

  6. Threshold Theory Tested in an Organizational Setting: The Relation between Perceived Innovativeness and Intelligence in a Large Sample of Leaders

    ERIC Educational Resources Information Center

    Christensen, Bo T.; Hartmann, Peter V. W.; Rasmussen, Thomas Hedegaard

    2017-01-01

    A large sample of leaders (N = 4257) was used to test the link between leader innovativeness and intelligence. The threshold theory of the link between creativity and intelligence assumes that below a certain IQ level (approximately IQ 120), there is some correlation between IQ and creative potential, but above this cutoff point, there is no…

  7. Genus-Specific Primers for Study of Fusarium Communities in Field Samples

    PubMed Central

    Edel-Hermann, Véronique; Gautheron, Nadine; Durling, Mikael Brandström; Kolseth, Anna-Karin; Steinberg, Christian; Persson, Paula; Friberg, Hanna

    2015-01-01

    Fusarium is a large and diverse genus of fungi of great agricultural and economic importance, containing many plant pathogens and mycotoxin producers. To date, high-throughput sequencing of Fusarium communities has been limited by the lack of genus-specific primers targeting regions with high discriminatory power at the species level. In the present study, we evaluated two Fusarium-specific primer pairs targeting translation elongation factor 1 (TEF1). We also present the new primer pair Fa+7/Ra+6. Mock Fusarium communities reflecting phylogenetic diversity were used to evaluate the accuracy of the primers in reflecting the relative abundance of the species. TEF1 amplicons were subjected to 454 high-throughput sequencing to characterize Fusarium communities. Field samples from soil and wheat kernels were included to test the method on more-complex material. For kernel samples, a single PCR was sufficient, while for soil samples, nested PCR was necessary. The newly developed primer pairs Fa+7/Ra+6 and Fa/Ra accurately reflected Fusarium species composition in mock DNA communities. In field samples, 47 Fusarium operational taxonomic units were identified, with the highest Fusarium diversity in soil. The Fusarium community in soil was dominated by members of the Fusarium incarnatum-Fusarium equiseti species complex, contradicting findings in previous studies. The method was successfully applied to analyze Fusarium communities in soil and plant material and can facilitate further studies of Fusarium ecology. PMID:26519387

  8. Influence of the magnetic field profile on ITER conductor testing

    NASA Astrophysics Data System (ADS)

    Nijhuis, A.; Ilyin, Y.; ten Kate, H. H. J.

    2006-08-01

    We performed simulations with the numerical CUDI-CICC code on a typical short ITER (International Thermonuclear Experimental Reactor) conductor test sample of dual leg configuration, as usually tested in the SULTAN test facility, and made a comparison with the new EFDA-Dipole test facility offering a larger applied DC field region. The new EFDA-Dipole test facility, designed for short sample testing of conductors for ITER, has a homogeneous high field region of 1.2 m, while in the SULTAN facility this region is three times shorter. The inevitable non-uniformity of the current distribution in the cable, introduced by the joints at both ends, has a degrading effect on voltage-current (VI) and voltage-temperature (VT) characteristics, particularly for these short samples. This can easily result in an underestimation or overestimation of the actual conductor performance. A longer applied DC high field region along a conductor suppresses the current non-uniformity by increasing the overall longitudinal cable electric field when reaching the current sharing mode. The numerical interpretation study presented here gives a quantitative analysis for a relevant practical case of a test of a short sample poloidal field coil insert (PFCI) conductor in SULTAN. The simulation includes the results of current distribution analysis from self-field measurements with Hall sensor arrays, current sharing measurements and inter-petal resistance measurements. The outcome of the simulations confirms that the current uniformity improves with a longer high field region but the 'measured' VI transition is barely affected, though the local peak voltages become somewhat suppressed. It appears that the location of the high field region and voltage taps has practically no influence on the VI curve as long as the transverse voltage components are adequately cancelled. In particular, for a thin conduit wall, the voltage taps should be connected to the conduit in the form of an (open) azimuthally

  9. Empirical Tests of Acceptance Sampling Plans

    NASA Technical Reports Server (NTRS)

    White, K. Preston, Jr.; Johnson, Kenneth L.

    2012-01-01

    Acceptance sampling is a quality control procedure applied as an alternative to 100% inspection. A random sample of items is drawn from a lot to determine the fraction of items which have a required quality characteristic. Both the number of items to be inspected and the criterion for determining conformance of the lot to the requirement are given by an appropriate sampling plan with specified risks of Type I and Type II sampling errors. In this paper, we present the results of empirical tests of the accuracy of selected sampling plans reported in the literature. These plans are for measurable quality characteristics which are known to have either binomial, exponential, normal, gamma, Weibull, inverse Gaussian, or Poisson distributions. In the main, results support the accepted wisdom that variables acceptance plans are superior to attributes (binomial) acceptance plans, in the sense that these provide comparable protection against risks at reduced sampling cost. For the Gaussian and Weibull plans, however, there are ranges of the shape parameters for which the required sample sizes are in fact larger than the corresponding attributes plans, dramatically so for instances of large skew. Tests further confirm that the published inverse-Gaussian (IG) plan is flawed, as reported by White and Johnson (2011).
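    As a rough sketch of the attributes (binomial) case described above, both sampling risks of a single-sampling plan follow directly from the binomial acceptance probability. The plan parameters and quality levels below are illustrative assumptions, not values from the paper.

    ```python
    from math import comb

    def accept_prob(n: int, c: int, p: float) -> float:
        # Probability of accepting the lot: P(defects <= c) under Binomial(n, p).
        return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

    # Hypothetical single-sampling plan: inspect n = 80 items, accept the lot
    # if at most c = 2 are defective.  AQL/LTPD values are assumed here.
    n, c = 80, 2
    aql, ltpd = 0.01, 0.08
    alpha = 1 - accept_prob(n, c, aql)   # Type I risk: rejecting a good lot
    beta = accept_prob(n, c, ltpd)       # Type II risk: accepting a bad lot
    print(f"alpha = {alpha:.3f}, beta = {beta:.3f}")
    ```

    A variables plan achieves comparable alpha/beta with a smaller sample by exploiting the measured distribution of the characteristic, which is the trade-off the paper's empirical tests probe.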

  10. Effect of sample initial magnetic field on the metal magnetic memory NDT result

    NASA Astrophysics Data System (ADS)

    Moonesan, Mahdi; Kashefi, Mehrdad

    2018-08-01

    One of the major concerns regarding the use of the Metal Magnetic Memory (MMM) technique is the complexity of the residual magnetization effect on output signals. The present study investigates the influence of residual magnetic field on stress-induced magnetization. To this end, various initial magnetic fields were induced on a low carbon steel sample, and for each level of residual magnetic field, the sample was subjected to a set of 4-point bending tests, and their corresponding MMM signals were collected from the surface of the bent sample using a tailored metal magnetic memory scanning device. Results showed a strong correlation between the sample's residual magnetic field and its corresponding level of stress-induced magnetic field. It was observed that the sample magnetic field increases as bending stress is applied as long as the initial residual magnetic field is low (i.e. <117 mG), but starts decreasing at higher levels of initial residual magnetic field. In addition, the effect of bending stress on the MMM output of a notched sample was investigated. The result, again, showed that MMM signals exhibit a drop at the stress concentration zone when the sample has a high level of initial residual magnetic field.

  11. Height-resolved large-sample INAA of a 1 m long, 13 cm diameter ditch-bottom sample

    NASA Astrophysics Data System (ADS)

    Blaauw, M.; Baas, H. W.; Donze, M.

    2003-06-01

    A facility for instrumental neutron activation analysis (INAA) of large samples (up to 1 m long and 15 cm diameter) has been built. Correction methods for the simultaneous occurrence of neutron self-shielding and gamma-ray self-attenuation effects have been implemented and tested with a variety of samples. Now, the method has been extended to allow for the interpretation of scanned, collimated measurements, where results are obtained for individual voxels. As a validation and demonstration, a ditch-bottom sample of the maximum size was taken in a frozen condition. It was cut in 2 cm slices, still frozen, and put together again with each slice in a 2 cm height Petri dish divided in three sections. This allowed for verification of the results by ordinary INAA. Possible explanations for the discrepancies we observed between ordinary and large-sample INAA in the region where the concentration gradients are the steepest are discussed.

  12. Issues Related to Large Flight Hardware Acoustic Qualification Testing

    NASA Technical Reports Server (NTRS)

    Kolaini, Ali R.; Perry, Douglas C.; Kern, Dennis L.

    2011-01-01

    The characteristics of acoustical testing volumes generated by reverberant chambers or a circle of loudspeakers with and without large flight hardware within the testing volume are significantly different. The parameters contributing to these differences are normally not accounted for through analysis or acoustic tests prior to the qualification testing without the test hardware present. In most cases the control microphones are kept at least 2 ft away from hardware surfaces, chamber walls, and speaker surfaces to minimize the impact of the hardware in controlling the sound field. However, the acoustic absorption and radiation of sound by hardware surfaces may significantly alter the sound pressure field controlled within the chamber/speaker volume to a given specification. These parameters often result in an acoustic field that may provide under/over testing scenarios for flight hardware. In this paper the acoustic absorption by hardware surfaces will be discussed in some detail. A simple model is provided to account for some of the observations made from the Mars Science Laboratory spacecraft that recently underwent acoustic qualification tests in a reverberant chamber.

  13. Pre-Mission Input Requirements to Enable Successful Sample Collection by a Remote Field/EVA Team

    NASA Technical Reports Server (NTRS)

    Cohen, B. A.; Young, K. E.; Lim, D. S.

    2015-01-01

    This paper is intended to evaluate the sample collection process with respect to sample characterization and decision making. In some cases, it may be sufficient to know whether a given outcrop or hand sample is the same as or different from previous sampling localities or samples. In other cases, it may be important to have more in-depth characterization of the sample, such as basic composition, mineralogy, and petrology, in order to effectively identify the best sample. Contextual field observations, in situ/handheld analysis, and backroom evaluation may all play a role in understanding field lithologies and their importance for return. For example, determining whether a rock is a breccia or a clast-laden impact melt may be difficult from a single sample, but the distinction becomes clear as exploration of a field site puts it into context. The FINESSE (Field Investigations to Enable Solar System Science and Exploration) team is a new science and exploration field-based research program aimed at generating strategic knowledge in preparation for the human and robotic exploration of the Moon, near-Earth asteroids (NEAs), and Phobos and Deimos. We used the FINESSE field excursion to the West Clearwater Lake Impact structure (WCIS) as an opportunity to test factors related to sampling decisions. In contrast to other technology-driven NASA analog studies, the FINESSE WCIS activity is science-focused, and moreover, is sampling-focused, with the explicit intent to return the best samples for geochronology studies in the laboratory. This specific objective effectively reduces the number of variables in the goals of the field test and enables a more controlled investigation of the role of the crewmember in selecting samples. We formulated one hypothesis to test: that providing details regarding the analytical fate of the samples (e.g. geochronology, XRF/XRD, etc.) to the crew prior to their traverse will result in samples that are more likely to meet specific analytical

  14. The large sample size fallacy.

    PubMed

    Lantz, Björn

    2013-06-01

    Significance in the statistical sense has little to do with significance in the common practical sense. Statistical significance is a necessary but not a sufficient condition for practical significance. Hence, results that are extremely statistically significant may be highly nonsignificant in practice. The degree of practical significance is generally determined by the size of the observed effect, not the p-value. The results of studies based on large samples are often characterized by extreme statistical significance despite small or even trivial effect sizes. Interpreting such results as significant in practice without further analysis is referred to as the large sample size fallacy in this article. The aim of this article is to explore the relevance of the large sample size fallacy in contemporary nursing research. Relatively few nursing articles display explicit measures of observed effect sizes or include a qualitative discussion of observed effect sizes. Statistical significance is often treated as an end in itself. Effect sizes should generally be calculated and presented along with p-values for statistically significant results, and observed effect sizes should be discussed qualitatively through direct and explicit comparisons with the effects in related literature. © 2012 Nordic College of Caring Science.
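    The fallacy is easy to demonstrate numerically. The sketch below (illustrative, not from the article) runs a two-sample z-test on a trivially small mean difference: the p-value collapses as n grows while the effect size (Cohen's d) stays fixed.

    ```python
    import math

    def z_test_and_effect(mean_diff: float, sd: float, n: int):
        """Two-sample z-test for two equal-size groups with common known sd."""
        se = sd * math.sqrt(2.0 / n)        # standard error of the difference
        z = mean_diff / se
        # Two-sided p-value from the standard normal CDF.
        p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
        d = mean_diff / sd                  # Cohen's d: does not depend on n
        return z, p, d

    # A trivial effect (d = 0.05) becomes "highly significant" once n is large.
    for n in (50, 5000, 500000):
        z, p, d = z_test_and_effect(mean_diff=0.05, sd=1.0, n=n)
        print(f"n={n:>6}  z={z:6.2f}  p={p:.2e}  d={d}")
    ```

    Reporting d alongside p, as the article recommends, makes the mismatch between statistical and practical significance immediately visible.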

  15. Testing the application of Teflon/quartz soil solution samplers for DOM sampling in the Critical Zone: Field and laboratory approaches

    NASA Astrophysics Data System (ADS)

    Dolan, E. M.; Perdrial, J. N.; Vazquez, A.; Hernández, S.; Chorover, J.

    2010-12-01

    Elizabeth Dolan1,2, Julia Perdrial3, Angélica Vázquez-Ortega3, Selene Hernández-Ruiz3, Jon Chorover3 1Department of Soil, Environmental, and Atmospheric Science, University of Missouri. 2Biosphere 2, University of Arizona. 3Department of Soil, Water, and Environmental Science, University of Arizona. Abstract: The behavior of dissolved organic matter (DOM) in soil is important to many biogeochemical processes. Extraction methods to obtain DOM from the unsaturated zone remain a current focus of research, as different methods can influence the type and concentration of DOM obtained. Thus, the present comparison study involves three methods for soil solution sampling to assess their impact on DOM quantity and quality: 1) aqueous soil extracts, 2) solution yielded from laboratory-installed suction cup samplers, and 3) solutions from field-installed suction cup samplers. All samples were analyzed for dissolved organic carbon and total nitrogen concentrations. Moreover, DOM quality was analyzed using fluorescence, UV-Vis and FTIR spectroscopies. Results indicate higher DOC values for laboratory-extracted DOM: 20 mg/L for aqueous soil extracts and 31 mg/L for lab-installed samplers compared to 12 mg/L for field-installed samplers. Large variations in C/N ratios were also observed, ranging from 1.5 in laboratory-extracted DOM to 11 in field samples. Fluorescence excitation-emission matrices of DOM solutions obtained for the laboratory extraction methods showed higher intensities in regions typical for fulvic and humic acid-like materials relative to those extracted in the field. Similarly, the molar absorptivity calculated from DOC concentration normalization of UV-Vis absorbance of the laboratory-derived solutions was significantly higher as well, indicating greater aromaticity. The observed differences can be attributed to soil disturbance associated with obtaining laboratory derived solution samples. Our results indicate that laboratory extraction methods are not

  16. Development of a Large Field-of-View PIV System for Rotorcraft Testing in the 14- x 22-Foot Subsonic Tunnel

    NASA Technical Reports Server (NTRS)

    Jenkins, Luther N.; Yao, Chung-Sheng; Bartram, Scott M.; Harris, Jerome; Allan, Brian; Wong, Oliver; Mace, W. Derry

    2009-01-01

    A Large Field-of-View Particle Image Velocimetry (LFPIV) system has been developed for rotor wake diagnostics in the 14-by 22-Foot Subsonic Tunnel. The system has been used to measure three components of velocity in a plane as large as 1.524 meters by 0.914 meters in both forward flight and hover tests. Overall, the system performance has exceeded design expectations in terms of accuracy and efficiency. Measurements synchronized with the rotor position during forward flight and hover tests have shown that the system is able to capture the complex interaction of the body and rotor wakes as well as basic details of the blade tip vortex at several wake ages. Measurements obtained with traditional techniques such as multi-hole pressure probes, Laser Doppler Velocimetry (LDV), and 2D Particle Image Velocimetry (PIV) show good agreement with LFPIV measurements.

  17. Detailed design of the large-bore 8 T superconducting magnet for the NAFASSY test facility

    NASA Astrophysics Data System (ADS)

    Corato, V.; Affinito, L.; Anemona, A.; Besi Vetrella, U.; Di Zenobio, A.; Fiamozzi Zignani, C.; Freda, R.; Messina, G.; Muzzi, L.; Perrella, M.; Reccia, L.; Tomassetti, G.; Turtù, S.; della Corte, A.

    2015-03-01

    The ‘NAFASSY’ (NAtional FAcility for Superconducting SYstems) facility is designed to test wound conductor samples under high-field conditions at variable temperatures. Due to its unique features, it is reasonable to assume that in the near future NAFASSY will have a preeminent role at the international level in the qualification of long coiled cables in operative conditions. The magnetic system consists of a large warm bore background solenoid, made up of three series-connected grading sections obtained by winding three different Nb3Sn Cable-in-Conduit Conductors. Thanks to the financial support of the Italian Ministry for University and Research the low-field coil is currently under production. The design has been properly modified to allow the system to operate also as a stand-alone facility, with an inner bore diameter of 1144 mm. This magnet is able to provide about 7 T on its axis and about 8 T close to the insert inner radius, giving the possibility of performing a test relevant for large-sized NbTi or medium-field Nb3Sn conductors. The detailed design of the 8 T magnet, including the electro-magnetic, structural and thermo-hydraulic analysis, is here reported, as well as the production status.

  18. Results of Large-Scale Spacecraft Flammability Tests

    NASA Technical Reports Server (NTRS)

    Ferkul, Paul; Olson, Sandra; Urban, David L.; Ruff, Gary A.; Easton, John; T'ien, James S.; Liao, Ta-Ting T.; Fernandez-Pello, A. Carlos; Torero, Jose L.; Eigenbrand, Christian

    2017-01-01

    For the first time, a large-scale fire was intentionally set inside a spacecraft while in orbit. Testing in low gravity aboard spacecraft had been limited to samples of modest size: for thin fuels the longest samples burned were around 15 cm in length, and thick fuel samples have been even smaller. This is despite the fact that fire is a catastrophic hazard for spaceflight and the spread and growth of a fire, combined with its interactions with the vehicle, cannot be expected to scale linearly. While every type of occupied structure on earth has been the subject of full-scale fire testing, this had never been attempted in space owing to the complexity, cost, risk and absence of a safe location. Thus, there is a gap in knowledge of fire behavior in spacecraft. The recent utilization of large, unmanned, resupply craft has provided the needed capability: a habitable but unoccupied spacecraft in low earth orbit. One such vehicle was used to study the flame spread over a 94 x 40.6 cm thin charring solid (fiberglass-cotton fabric). The sample was an order of magnitude larger than anything studied to date in microgravity and was of sufficient scale that it consumed 1.5 of the available oxygen. The experiment, called Saffire, consisted of two tests, forward or concurrent flame spread (with the direction of flow) and opposed flame spread (against the direction of flow). The average forced air speed was 20 cm/s. For the concurrent flame spread test, the flame size remained constrained after the ignition transient, which is not the case in 1-g. These results were qualitatively different from those on earth where an upward-spreading flame on a sample of this size accelerates and grows. In addition, a curious effect of the chamber size is noted. Compared to previous microgravity work in smaller tunnels, the flame in the larger tunnel spread more slowly, even for a wider sample. This is attributed to the effect of flow acceleration in the smaller tunnels as a result of hot

  19. Light-sheet enhanced resolution of light field microscopy for rapid imaging of large volumes

    NASA Astrophysics Data System (ADS)

    Madrid Wolff, Jorge; Castro, Diego; Arbeláez, Pablo; Forero-Shelton, Manu

    2018-02-01

    Whole-brain imaging is challenging because it demands microscopes with high temporal and spatial resolution, which are often at odds, especially in the context of large fields of view. We have designed and built a light-sheet microscope with digital micromirror illumination and light-field detection. On the one hand, light sheets provide high resolution optical sectioning on live samples without compromising their viability. On the other hand, light field imaging makes it possible to reconstruct full volumes of relatively large fields of view from a single camera exposure; however, its enhanced temporal resolution comes at the expense of spatial resolution, limiting its applicability. We present an approach to increase the resolution of light field images using DMD-based light sheet illumination. To that end, we develop a method to produce synthetic resolution targets for light field microscopy and a procedure to correct the depth at which planes are refocused with rendering software. We measured the axial resolution as a function of depth and show a three-fold potential improvement with structured illumination, albeit by sacrificing some temporal resolution, also three-fold. This results in an imaging system that may be adjusted to specific needs without having to reassemble and realign it. This approach could be used to image relatively large samples at high rates.

  20. A Field-Based Cleaning Protocol for Sampling Devices Used in Life-Detection Studies

    NASA Astrophysics Data System (ADS)

    Eigenbrode, Jennifer; Benning, Liane G.; Maule, Jake; Wainwright, Norm; Steele, Andrew; Amundsen, Hans E. F.

    2009-06-01

    Analytical approaches to extant and extinct life detection involve molecular detection often at trace levels. Thus, removal of biological materials and other organic molecules from the surfaces of devices used for sampling is essential for ascertaining meaningful results. Organic decontamination to levels consistent with null values on life-detection instruments is particularly challenging at remote field locations where Mars analog field investigations are carried out. Here, we present a seven-step, multi-reagent decontamination method that can be applied to sampling devices while in the field. In situ lipopolysaccharide detection via low-level endotoxin assays and molecular detection via gas chromatography-mass spectrometry were used to test the effectiveness of the decontamination protocol for sampling of glacial ice with a coring device and for sampling of sediments with a rover scoop during deployment at Arctic Mars-analog sites in Svalbard, Norway. Our results indicate that the protocols and detection technique sufficiently remove and detect low levels of molecular constituents necessary for life-detection tests.

  2. GICHD mine dog testing project : soil sample results #5.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnett, James L.; Phelan, James M.; Archuleta, Luisa M.

    2004-01-01

    A mine dog evaluation project initiated by the Geneva International Center for Humanitarian Demining is evaluating the capability and reliability of mine detection dogs. The performance of field-operational mine detection dogs will be measured in test minefields in Afghanistan containing actual, but unfused, landmines. Repeated performance testing over two years through various seasonal weather conditions will provide data simulating near real-world conditions. Soil samples will be obtained adjacent to the buried targets repeatedly over the course of the test. Chemical analysis results from these soil samples will be used to evaluate correlations between mine dog detection performance and seasonal weather conditions. This report documents the analytical chemical methods and results from the fifth batch of soils received. This batch contained samples from Kharga, Afghanistan, collected in June 2003.

  3. A FIELD VALIDATION OF TWO SEDIMENT-AMPHIPOD TOXICITY TESTS

    EPA Science Inventory

    A field validation study of two sediment-amphipod toxicity tests was conducted using sediment samples collected subtidally in the vicinity of a polycyclic aromatic hydrocarbon (PAH)-contaminated Superfund site in Elliott Bay, WA, USA. Sediment samples were collected at 30 stati...

  4. Sampling Large Graphs for Anticipatory Analytics

    DTIC Science & Technology

    2015-05-15

    Edwards, Lauren; Johnson, Luke; Milosavljevic, Maja; Gadepally, Vijay; Miller, Benjamin A. (Lincoln Laboratory) — Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and areas… …systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges.

  5. Flexible sampling large-scale social networks by self-adjustable random walk

    NASA Astrophysics Data System (ADS)

    Xu, Xiao-Ke; Zhu, Jonathan J. H.

    2016-12-01

    Online social networks (OSNs) have become an increasingly attractive gold mine for academic and commercial researchers. However, research on OSNs faces a number of difficult challenges. One bottleneck lies in the massive quantity and often unavailability of OSN population data. Sampling perhaps becomes the only feasible solution to the problems. How to draw samples that can represent the underlying OSNs has remained a formidable task because of a number of conceptual and methodological reasons. Especially, most of the empirically-driven studies on network sampling are confined to simulated data or sub-graph data, which are fundamentally different from real and complete-graph OSNs. In the current study, we propose a flexible sampling method, called Self-Adjustable Random Walk (SARW), and test it against the population data of a real large-scale OSN. We evaluate the strengths of the sampling method in comparison with four prevailing methods, including uniform, breadth-first search (BFS), random walk (RW), and revised RW (i.e., MHRW) sampling. We try to mix both induced-edge and external-edge information of sampled nodes together in the same sampling process. Our results show that the SARW sampling method has been able to generate unbiased samples of OSNs with maximal precision and minimal cost. The study is helpful for the practice of OSN research by providing a much-needed sampling tool, for the methodological development of large-scale network sampling by comparative evaluations of existing sampling methods, and for the theoretical understanding of human networks by highlighting discrepancies and contradictions between existing knowledge/assumptions of large-scale real OSN data.
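    The paper's SARW adjusts its transition rule on the fly; the exact adjustment is not given in this abstract, but the baseline random walk it modifies can be sketched as follows (the toy graph and function names are illustrative, not from the paper). A plain RW visits nodes in proportion to their degree, which is exactly the bias that revised walks such as MHRW and SARW try to correct.

    ```python
    import random

    def random_walk_sample(adj, start, n_samples, seed=0):
        """Baseline random-walk sampling: hop to a uniformly chosen neighbor."""
        rng = random.Random(seed)
        node, sample = start, []
        for _ in range(n_samples):
            sample.append(node)
            node = rng.choice(adj[node])
        return sample

    # Toy undirected graph: hub node 0 connected to all, plus an edge 3-4.
    adj = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0, 4], 4: [0, 3]}
    sample = random_walk_sample(adj, start=0, n_samples=10000)
    freq = {v: sample.count(v) / len(sample) for v in adj}
    print(freq)  # hub node 0 (degree 4 of 10 edge-ends) dominates the sample
    ```

    On this graph the RW's stationary visit frequency is degree/(2 x edges), so the hub is sampled about four times as often as a leaf; an unbiased sampler must reweight or adjust transitions to undo this.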

  6. Sequential accelerated tests: Improving the correlation of accelerated tests to module performance in the field

    NASA Astrophysics Data System (ADS)

    Felder, Thomas; Gambogi, William; Stika, Katherine; Yu, Bao-Ling; Bradley, Alex; Hu, Hongjie; Garreau-Iles, Lucie; Trout, T. John

    2016-09-01

    DuPont has been working steadily to develop accelerated backsheet tests that correlate with solar panel observations in the field. This report updates efforts in sequential testing. Single-exposure tests are more commonly used and can be completed more quickly, and certain tests provide helpful predictions of certain backsheet failure modes. DuPont recommendations for single-exposure tests are based on 25-year exposure levels for UV and humidity/temperature, and form a good basis for sequential test development. We recommend a sequential exposure of damp heat followed by UV, then repetitions of thermal cycling and UVA. This sequence preserves 25-year exposure levels for humidity/temperature and UV, and correlates well with a large body of field observations. Measurements can be taken at intervals in the test, although the full test runs 10 months. A second, shorter sequential test based on damp heat and thermal cycling tests mechanical durability and correlates with the loss of mechanical properties seen in the field. Ongoing work is directed toward shorter sequential tests that preserve good correlation to field data.

  7. Field-aligned currents and large-scale magnetospheric electric fields

    NASA Technical Reports Server (NTRS)

    Dangelo, N.

    1979-01-01

    The existence of field-aligned currents (FAC) at northern and southern high latitudes was confirmed by a number of observations, most clearly by experiments on the TRIAD and ISIS 2 satellites. The high-latitude FAC system is used to relate what is presently known about the large-scale pattern of high-latitude ionospheric electric fields and their relation to solar wind parameters. Recently a simplified model was presented for polar cap electric fields. The model is of considerable help in visualizing the large-scale features of FAC systems. A summary of the FAC observations is given. The simplified model is used to visualize how the FAC systems are driven by their generators.

  8. Pre-Mission Input Requirements to Enable Successful Sample Collection by A Remote Field/EVA Team

    NASA Technical Reports Server (NTRS)

    Cohen, B. A.; Lim, D. S. S.; Young, K. E.; Brunner, A.; Elphic, R. E.; Horne, A.; Kerrigan, M. C.; Osinski, G. R.; Skok, J. R.; Squyres, S. W.; hide

    2016-01-01

    The FINESSE (Field Investigations to Enable Solar System Science and Exploration) team, part of the Solar System Exploration Virtual Institute (SSERVI), is a field-based research program aimed at generating strategic knowledge in preparation for human and robotic exploration of the Moon, near-Earth asteroids, Phobos and Deimos, and beyond. In contrast to other technology-driven NASA analog studies, the FINESSE WCIS activity is science-focused and, moreover, sampling-focused, with the explicit intent to return the best samples for geochronology studies in the laboratory. We used the FINESSE field excursion to the West Clearwater Lake Impact Structure (WCIS) as an opportunity to test factors related to sampling decisions. We examined the in situ sample characterization and real-time decision-making process of the astronauts, with a guiding hypothesis that pre-mission training that includes detailed background information on the analytical fate of a sample would better enable future astronauts to select samples that best meet science requirements. We conducted three tests of this hypothesis over several days in the field. Our investigation was designed to document processes, tools and procedures for crew sampling of planetary targets. This was not meant to be a blind, controlled test of crew efficacy, but rather an effort to explicitly recognize the relevant variables that enter into a sampling protocol and to develop recommendations for crew and backroom training in future endeavors.

  9. Field testing of fugitive dust control techniques at a uranium mill tailings pile - 1982 Field Test, Gas Hills, Wyoming.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elmore, M.R.; Hartley, J.N.

    A field test was conducted on a uranium tailings pile to evaluate the effectiveness of 15 chemical stabilizers for control of fugitive dust from uranium mill tailings. A tailings pile at the Federal American Partners (FAP) Uranium Mill, Gas Hills, Wyoming, was used for the field test. Preliminary laboratory tests using a wind tunnel were conducted to select the more promising stabilizers for field testing. Fourteen of the chemical stabilizers were applied with a field spray system pulled behind a tractor; one--Hydro Mulch--was applied with a hydroseeder. A portable weather station and data logger were installed to record the weather conditions at the test site. After 1 year of monitoring (including three site visits), all of the stabilizers have degraded to some degree, but those applied at the manufacturers' recommended rate are still somewhat effective in reducing fugitive emissions. The following synthetic polymer emulsions appear to be the more effective stabilizers: Wallpol 40-133 from Reichold Chemicals, SP-400 from Johnson and March Corporation, and CPB-12 from Wen Don Corporation. Installed costs for the test plots ranged from $8400 to $11,300/ha; this range results from differences in stabilizer costs. Large-scale stabilization costs of the test materials are expected to range from $680 to $3600/ha based on FAP experience. Evaluation of the chemical stabilizers will continue for approximately 1 year. 2 references, 33 figures, 22 tables.

  10. GICHD mine dog testing project - soil sample results #4.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnett, James L.; Phelan, James M.; Archuleta, Luisa M.

    2003-08-01

    A mine dog evaluation project initiated by the Geneva International Center for Humanitarian Demining is evaluating the capability and reliability of mine detection dogs. The performance of field-operational mine detection dogs will be measured in test minefields in Afghanistan and Bosnia containing actual, but unfused landmines. Repeated performance testing over two years through various seasonal weather conditions will provide data simulating near real world conditions. Soil samples will be obtained adjacent to the buried targets repeatedly over the course of the test. Chemical analysis results from these soil samples will be used to evaluate correlations between mine dog detection performance and seasonal weather conditions. This report documents the analytical chemical methods and results from the fourth batch of soils received. This batch contained samples from Kharga, Afghanistan collected in April 2003 and Sarajevo, Bosnia collected in May 2003.

  11. FIELD SAMPLING PROTOCOLS AND ANALYSIS

    EPA Science Inventory

    I have been asked to speak again to the environmental science class regarding actual research scenarios related to my work at Kerr Lab. I plan to discuss sampling protocols along with various field analyses performed during sampling activities. Many of the students have never see...

  12. Large antenna experiments aboard the space shuttle: Application of nonuniform sampling techniques

    NASA Technical Reports Server (NTRS)

    Rahmatsamii, Y.

    1988-01-01

    Future satellite communication and scientific spacecraft will utilize antennas with dimensions as large as 20 meters. In order to commercially use these large, low sidelobe and multiple beam antennas, a high level of confidence must be established as to their performance in the 0-g and space environment. Furthermore, it will be desirable to demonstrate the applicability of surface compensation techniques for slowly varying surface distortions which could result from thermal effects. An overview of recent advances in performing RF measurements on large antennas is presented with emphasis given to the application of a space based far-field range utilizing the Space Shuttle and the concept of a newly developed nonuniform sampling technique.

  13. Measuring the Large-scale Solar Magnetic Field

    NASA Astrophysics Data System (ADS)

    Hoeksema, J. T.; Scherrer, P. H.; Peterson, E.; Svalgaard, L.

    2017-12-01

    The Sun's large-scale magnetic field is important for determining the global structure of the corona and for quantifying the evolution of the polar field, which is sometimes used for predicting the strength of the next solar cycle. Having confidence in the determination of the large-scale magnetic field of the Sun is difficult because the field is often near the detection limit, the various observing methods all measure something slightly different, and various systematic effects can be very important. We compare resolved and unresolved observations of the large-scale magnetic field from the Wilcox Solar Observatory (WSO), the Helioseismic and Magnetic Imager (HMI), the Michelson Doppler Imager (MDI), and SOLIS. Cross-comparison does not enable us to establish an absolute calibration, but it does allow us to discover and compensate for instrument problems, such as the sensitivity decrease seen in the WSO measurements in late 2016 and early 2017.

  14. Successful application of FTA Classic Card technology and use of bacteriophage phi29 DNA polymerase for large-scale field sampling and cloning of complete maize streak virus genomes.

    PubMed

    Owor, Betty E; Shepherd, Dionne N; Taylor, Nigel J; Edema, Richard; Monjane, Adérito L; Thomson, Jennifer A; Martin, Darren P; Varsani, Arvind

    2007-03-01

    Leaf samples from 155 maize streak virus (MSV)-infected maize plants were collected from 155 farmers' fields in 23 districts in Uganda in May/June 2005 by leaf-pressing infected samples onto FTA Classic Cards. Viral DNA was successfully extracted from cards stored at room temperature for 9 months. The diversity of 127 MSV isolates was analysed by PCR-generated RFLPs. Six representative isolates having different RFLP patterns and causing either severe, moderate or mild disease symptoms, were chosen for amplification from FTA cards by bacteriophage phi29 DNA polymerase using the TempliPhi system. Full-length genomes were inserted into a cloning vector using a unique restriction enzyme site, and sequenced. The 1.3-kb PCR product amplified directly from FTA-eluted DNA and used for RFLP analysis was also cloned and sequenced. Comparison of cloned whole genome sequences with those of the original PCR products indicated that the correct virus genome had been cloned and that no errors were introduced by the phi29 polymerase. This is the first successful large-scale application of FTA card technology to the field, and illustrates the ease with which large numbers of infected samples can be collected and stored for downstream molecular applications such as diversity analysis and cloning of potentially new virus genomes.

  15. Hydrogen Field Test Standard: Laboratory and Field Performance

    PubMed Central

    Pope, Jodie G.; Wright, John D.

    2015-01-01

    The National Institute of Standards and Technology (NIST) developed a prototype field test standard (FTS) that incorporates three test methods that could be used by state weights and measures inspectors to periodically verify the accuracy of retail hydrogen dispensers, much as gasoline dispensers are tested today. The three field test methods are: 1) gravimetric, 2) pressure-volume-temperature (PVT), and 3) master meter. The FTS was tested in NIST's Transient Flow Facility with helium gas and in the field at a hydrogen dispenser location. All three methods agree within 0.57 % and 1.53 % for all test drafts of helium gas in the laboratory setting and of hydrogen gas in the field, respectively. The time required to perform six test drafts is similar for all three methods, ranging from 6 h for the gravimetric and master meter methods to 8 h for the PVT method. The laboratory tests show that 1) it is critical to wait for thermal equilibrium to achieve density measurements in the FTS that meet the desired uncertainty requirements for the PVT and master meter methods (in general, we found that a wait time of 20 min introduces errors < 0.1 % and < 0.04 % in the PVT and master meter methods, respectively) and 2) buoyancy corrections are important for the lowest-uncertainty gravimetric measurements. The field tests show that sensor drift can become the largest component of uncertainty, one not present in the laboratory setting. The scale was calibrated after it was set up at the field location; checks of the calibration throughout testing showed a drift of 0.031 %. Calibration of the master meter and the pressure sensors prior to travel to the field location and upon return showed significant drifts in their calibrations: 0.14 % and up to 1.7 %, respectively. This highlights the need for better sensor selection and/or more robust sensor testing prior to putting sensors into field service. All three test methods are capable of being successfully performed in the field.
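    The PVT method above infers dispensed mass from the tank volume and the gas density before and after a draft, which is why waiting for thermal equilibrium matters: the density depends directly on temperature. A minimal sketch of the arithmetic (our own illustration, not NIST's implementation; the compressibility factor Z is left as an ideal-gas placeholder, whereas an actual standard would use a hydrogen equation of state):

```python
R = 8.314462618     # J/(mol K), molar gas constant
M_H2 = 2.01588e-3   # kg/mol, molar mass of hydrogen

def gas_density(p_pa, t_k, z=1.0):
    """Gas density rho = p*M/(Z*R*T). Z = 1 is an ideal-gas placeholder;
    an actual test standard would use a hydrogen equation of state."""
    return p_pa * M_H2 / (z * R * t_k)

def pvt_mass_dispensed(volume_m3, p1, t1, p2, t2, z1=1.0, z2=1.0):
    """PVT method: mass collected in a fixed tank of known volume is
    m = V * (rho_after - rho_before). Temperature errors feed straight
    into the densities, hence the wait for thermal equilibrium."""
    return volume_m3 * (gas_density(p2, t2, z2) - gas_density(p1, t1, z1))

# Hypothetical draft: 0.2 m^3 tank filled from 1 MPa to 35 MPa at 300 K.
m = pvt_mass_dispensed(0.2, 1.0e6, 300.0, 35.0e6, 300.0)  # ~5.5 kg
```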

  16. A simplified field protocol for genetic sampling of birds using buccal swabs

    USGS Publications Warehouse

    Vilstrup, Julia T.; Mullins, Thomas D.; Miller, Mark P.; McDearman, Will; Walters, Jeffrey R.; Haig, Susan M.

    2018-01-01

    DNA sampling is an essential prerequisite for conducting population genetic studies. For many years, blood sampling has been the preferred method for obtaining DNA in birds because of their nucleated red blood cells. Nonetheless, use of buccal swabs has been gaining favor because they are less invasive yet still yield adequate amounts of DNA for amplifying mitochondrial and nuclear markers; however, buccal swab protocols often include steps (e.g., extended air-drying and storage under frozen conditions) not easily adapted to field settings. Furthermore, commercial extraction kits and swabs for buccal sampling can be expensive for large population studies. We therefore developed an efficient, cost-effective, and field-friendly protocol for sampling wild birds after comparing DNA yield among 3 inexpensive buccal swab types (2 with foam tips and 1 with a cotton tip). Extraction and amplification success was high (100% and 97.2% respectively) using inexpensive generic swabs. We found foam-tipped swabs provided higher DNA yields than cotton-tipped swabs. We further determined that omitting a drying step and storing swabs in Longmire buffer increased efficiency in the field while still yielding sufficient amounts of DNA for detailed population genetic studies using mitochondrial and nuclear markers. This new field protocol allows time- and cost-effective DNA sampling of juveniles or small-bodied birds for which drawing blood may cause excessive stress to birds and technicians alike.

  17. Diagnostic test accuracy and prevalence inferences based on joint and sequential testing with finite population sampling.

    PubMed

    Su, Chun-Lung; Gardner, Ian A; Johnson, Wesley O

    2004-07-30

    The two-test two-population model, originally formulated by Hui and Walter for estimating test accuracy and prevalence, assumes conditionally independent tests, constant accuracy across populations and binomial sampling. The binomial assumption is incorrect if all individuals in a population (e.g. a child-care centre, a village in Africa, or a cattle herd) are sampled, or if the sample size is large relative to the population size. In this paper, we develop statistical methods for evaluating diagnostic test accuracy and prevalence estimation based on finite sample data in the absence of a gold standard. Moreover, two tests are often applied simultaneously for the purpose of obtaining a 'joint' testing strategy that has either higher overall sensitivity or specificity than either of the two tests considered singly. Sequential versions of such strategies are often applied in order to reduce the cost of testing. We thus discuss joint (simultaneous and sequential) testing strategies and inference for them. Using the developed methods, we analyse two real data sets and one simulated data set, and we compare 'hypergeometric' and 'binomial-based' inferences. Our findings indicate that the posterior standard deviations for prevalence (but not sensitivity and specificity) based on finite population sampling tend to be smaller than their counterparts for infinite population sampling. Finally, we make recommendations about how small the sample size should be relative to the population size to warrant use of the binomial model for prevalence estimation. Copyright 2004 John Wiley & Sons, Ltd.
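    The finding that finite-population (hypergeometric) sampling tightens the prevalence posterior can be illustrated with a toy computation. This is our own simplification assuming a perfect test and uniform priors; the paper itself models imperfect sensitivity and specificity without a gold standard:

```python
from math import comb, sqrt

def hypergeom_posterior_sd(N, n, y):
    """Posterior SD of prevalence D/N for a finite population of size N
    with D positives (uniform prior on D), having observed y positives
    in a sample of n. Assumes a perfect test; the paper additionally
    models imperfect sensitivity and specificity."""
    ds = range(y, N - n + y + 1)               # feasible positive counts
    w = [comb(d, y) * comb(N - d, n - y) for d in ds]
    total = sum(w)
    mean = sum(wi * d / N for wi, d in zip(w, ds)) / total
    var = sum(wi * (d / N - mean) ** 2 for wi, d in zip(w, ds)) / total
    return sqrt(var)

def binomial_posterior_sd(n, y):
    """Posterior SD of prevalence p under a uniform prior and a
    Binomial(n, p) likelihood: the Beta(y+1, n-y+1) posterior."""
    a, b = y + 1, n - y + 1
    return sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))

# Sampling 40 animals of a 50-animal herd, 10 test positive: the
# finite-population (hypergeometric) posterior is markedly tighter.
sd_h = hypergeom_posterior_sd(N=50, n=40, y=10)
sd_b = binomial_posterior_sd(n=40, y=10)
```

    With 80% of the herd sampled, the hypergeometric posterior SD is roughly half the binomial one, matching the paper's qualitative conclusion for prevalence.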

  18. Characteristics of large particles and their effects on the submarine light field

    NASA Astrophysics Data System (ADS)

    Hou, Weilin

    Large particles play important roles in the ocean by modifying the underwater light field and affecting material transfer. The particle size distribution of large particles was measured in situ with multiple-camera video microscopy, and automated particle sizing and recognition software was developed. Results show that there are more large particles in coastal waters than previously thought, as indicated by a hyperbolic size-distribution curve with a (log-log) slope parameter close to 3, instead of 4, for particles larger than 100 μm in diameter. Larger slopes are more typical of particles in the open ocean. This slope permits extrapolation of the distribution into the small-particle size range for use in correcting beam-attenuation measurements for near-forward scattering. The large-particle slope and a c-meter were used to estimate small-particle size distributions, which nearly matched those measured with a Coulter Counter (3.05%). There is also a fair correlation (r²=0.729) between the slope of the distribution and its concentration parameters. Scattering by large particles is influenced not only by the concentrations of these particles but also by their scattering phase functions. This first in-situ measurement of large-particle scattering at multiple angles reveals that they scatter more in the backward direction than was previously believed; the enhanced backscattering can be explained in part by multiple scattering within aggregated particles. Proper identification of these large particles can be of great help in understanding the status of the ecosystem. By extracting particle features from high-resolution video images via moment-invariant functions and applying this information to lower-resolution images, we increase the effective sample volume without severely degrading classification efficiency. Traditional pattern-recognition algorithms classified zooplankton images with results within 24% of zooplankton collected using bottle samples.
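    The (log-log) slope of a Junge-type size distribution n(d) ∝ d^(-k) is conventionally estimated as an ordinary least-squares fit in log space. A minimal sketch with made-up numbers (function and variable names are ours, not the author's):

```python
from math import log

def psd_slope(diams_um, counts):
    """Least-squares estimate of k for a Junge-type particle size
    distribution n(d) ~ d**(-k), fitted as a straight line in
    log-log space; returns k as a positive number."""
    xs = [log(d) for d in diams_um]
    ys = [log(c) for c in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Synthetic coastal-style PSD generated with a true slope of 3
# (all numbers are made up for illustration).
diams = [100, 150, 200, 300, 400, 600]     # diameters in micrometres
counts = [1e6 * d ** -3.0 for d in diams]  # concentration per size bin
k = psd_slope(diams, counts)               # recovers k = 3.0
```

    A fitted k near 3 corresponds to the coastal-water result above; k near 4 would indicate the steeper distribution typical of the open ocean.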

  19. Electrofracturing test system and method of determining material characteristics of electrofractured material samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bauer, Stephen J.; Glover, Steven F.; Pfeifle, Tom

    A device for electrofracturing a material sample and analyzing the material sample is disclosed. The device simulates an in situ electrofracturing environment so as to obtain electrofractured material characteristics representative of field applications while allowing permeability testing of the fractured sample under in situ conditions.

  20. GICHD Mine Dog Testing Project - Soil Sample Results No.3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PHELAN, JAMES M.; BARNETT, JAMES L.; BENDER, SUSAN FAE ANN

    2003-03-01

    A mine dog evaluation project initiated by the Geneva International Center for Humanitarian Demining is evaluating the capability and reliability of mine detection dogs. The performance of field-operational mine detection dogs will be measured in test minefields in Afghanistan and Bosnia containing actual, but unfused landmines. Repeated performance testing over two years through various seasonal weather conditions will provide data simulating near real world conditions. Soil samples will be obtained adjacent to the buried targets repeatedly over the course of the test. Chemical analysis results from these soil samples will be used to evaluate correlations between mine dog detection performance and seasonal weather conditions. This report documents the analytical chemical methods and results from the third batch of soils received. This batch contained samples from Kharga, Afghanistan collected in October 2002.

  1. Large-area photogrammetry based testing of wind turbine blades

    NASA Astrophysics Data System (ADS)

    Poozesh, Peyman; Baqersad, Javad; Niezrecki, Christopher; Avitabile, Peter; Harvey, Eric; Yarala, Rahul

    2017-03-01

    An optically based sensing system that can measure displacement and strain over essentially the entire area of a utility-scale blade leads to a measurement system that can significantly reduce the time and cost associated with traditional instrumentation. This paper evaluates the performance of conventional three-dimensional digital image correlation (3D DIC) and three-dimensional point tracking (3DPT) approaches over the surface of wind turbine blades and proposes a multi-camera measurement system using dynamic spatial data stitching. The potential advantages of the proposed approach include: (1) full-field measurement distributed over a very large area, (2) elimination of time-consuming wiring and expensive sensors, and (3) no need for large-channel data acquisition systems. There are several challenges associated with extending the capability of a standard 3D DIC system to measure the entire surface of utility-scale blades to extract distributed strain, deflection, and modal parameters. This paper addresses some of these difficulties, including: (1) assessing the accuracy of the 3D DIC system in measuring full-field distributed strain and displacement over a large area, (2) understanding the geometrical constraints associated with a wind turbine testing facility (e.g. lighting, working distance, and speckle pattern size), (3) evaluating the performance of the dynamic stitching method to combine two different fields of view by extracting modal parameters from aligned point clouds, and (4) determining the feasibility of employing output-only system identification to estimate modal parameters of a utility-scale wind turbine blade from optically measured data. Within the current work, the results of an optical measurement (one stereo-vision system) performed over a large area of a 50-m utility-scale blade subjected to quasi-static and cyclic loading are presented.

  2. Comparison of geochemical data obtained using four brine sampling methods at the SECARB Phase III Anthropogenic Test CO2 injection site, Citronelle Oil Field, Alabama

    USGS Publications Warehouse

    Conaway, Christopher; Thordsen, James J.; Manning, Michael A.; Cook, Paul J.; Trautz, Robert C.; Thomas, Burt; Kharaka, Yousif K.

    2016-01-01

    The chemical composition of formation water and associated gases from the lower Cretaceous Paluxy Formation was determined using four different sampling methods at a characterization well in the Citronelle Oil Field, Alabama, as part of the Southeast Regional Carbon Sequestration Partnership (SECARB) Phase III Anthropogenic Test, an integrated carbon capture and storage project. In this study, formation water and gas samples were obtained from well D-9-8 #2 at Citronelle using gas lift, an electric submersible pump, a U-tube, and a downhole vacuum sampler (VS), and subjected to both field and laboratory analyses. Field chemical analyses included electrical conductivity, dissolved sulfide concentration, alkalinity, and pH; laboratory analyses included major, minor and trace elements, dissolved carbon, volatile fatty acids, and free and dissolved gas species. The formation water obtained from this well is a Na–Ca–Cl-type brine with a salinity of about 200,000 mg/L total dissolved solids. Differences were evident between sampling methodologies, particularly in pH, Fe and alkalinity. There was little gas in the samples, and gas composition results were strongly influenced by the sampling methods. The results of the comparison demonstrate the difficulty and importance of preserving volatile analytes in samples, with the VS and U-tube systems performing most favorably in this respect.

  3. LSA field test

    NASA Technical Reports Server (NTRS)

    Jaffe, P.

    1979-01-01

    Degradation tests indicate that electrical degradation is not a slow, monotonically increasing phenomenon as originally thought, but occurs abruptly as the result of some traumatic event. This finding has led to a change in the test philosophy. A discussion of this change is presented, along with a summary of degradation and failure data from all the sites and results from a variety of special tests. New instrumentation for in-field measurements is described. Field testing activity was expanded by the addition of twelve remote sites located as far away as Alaska and the Canal Zone. Descriptions of the new sites are included.

  4. Deep Borehole Field Test Laboratory and Borehole Testing Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhlman, Kristopher L.; Brady, Patrick V.; MacKinnon, Robert J.

    2016-09-19

    Deep Borehole Disposal (DBD) of high-level radioactive wastes has been considered an option for geological isolation for many years (Hess et al. 1957). Recent advances in drilling technology have decreased costs and increased reliability for large-diameter (i.e., ≥50 cm [19.7”]) boreholes to depths of several kilometers (Beswick 2008; Beswick et al. 2014). These advances have therefore also increased the feasibility of the DBD concept (Brady et al. 2009; Cornwall 2015), and the current field test design will demonstrate the DBD concept and these advances. The US Department of Energy (DOE) Strategy for the Management and Disposal of Used Nuclear Fuel and High-Level Radioactive Waste (DOE 2013) specifically recommended developing a research and development plan for DBD. DOE sought input or expression of interest from States, local communities, individuals, private groups, academia, or any other stakeholders willing to host a Deep Borehole Field Test (DBFT). The DBFT includes drilling two boreholes nominally 200 m [656’] apart to approximately 5 km [16,400’] total depth, in a region where crystalline basement is expected to begin at less than 2 km depth [6,560’]. The characterization borehole (CB) is the smaller-diameter borehole (i.e., 21.6 cm [8.5”] diameter at total depth), and will be drilled first. The geologic, hydrogeologic, geochemical, geomechanical and thermal testing will take place in the CB. The field test borehole (FTB) is the larger-diameter borehole (i.e., 43.2 cm [17”] diameter at total depth). Surface handling and borehole emplacement of a test package will be demonstrated using the FTB to evaluate engineering feasibility and safety of disposal operations (SNL 2016).

  5. [Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].

    PubMed

    Suzukawa, Yumi; Toyoda, Hideki

    2012-04-01

    This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and fields of the studies. The results show that in fields like perception, cognition or learning, the effect sizes were relatively large although the sample sizes were small; at the same time, because of the small sample sizes, some meaningful effects could not be detected. In other fields, because of the large sample sizes, meaningless effects could be detected. This implies that researchers who cannot obtain large enough effect sizes tend to use larger samples to obtain significant results.
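    The interplay of effect size and sample size described above can be made concrete with a standard power calculation for a two-sample, two-sided test at α = .05. This is a textbook normal approximation, not the authors' procedure:

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def power_two_sample(d, n_per_group):
    """Approximate power of a two-sided, two-sample comparison at
    alpha = .05 for Cohen's d, using the normal approximation to
    the t distribution (adequate for moderate n)."""
    z_crit = 1.959963985                # two-sided 5% critical value
    ncp = d * sqrt(n_per_group / 2.0)   # noncentrality parameter
    return norm_cdf(ncp - z_crit) + norm_cdf(-ncp - z_crit)

# Perception-style study (large effect, small n) vs a study chasing a
# small effect with the same n: power differs dramatically.
p_large_effect = power_two_sample(d=0.8, n_per_group=20)  # ~0.72
p_small_effect = power_two_sample(d=0.2, n_per_group=20)  # ~0.10
```

    With the same n = 20 per group, a large effect (d = 0.8) is detected most of the time while a small effect (d = 0.2) is almost always missed, which is exactly the pattern the survey reports.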

  6. Sample-to-answer palm-sized nucleic acid testing device towards low-cost malaria mass screening.

    PubMed

    Choi, Gihoon; Prince, Theodore; Miao, Jun; Cui, Liwang; Guan, Weihua

    2018-05-19

    The effectiveness of malaria screening and treatment depends strongly on low-cost access to highly sensitive and specific malaria tests. We report a real-time fluorescence nucleic acid testing (NAT) device for malaria field detection with automated and scalable sample preparation capability. The device consists of a compact analyzer and a disposable microfluidic reagent compact disc. Parasite DNA sample preparation and subsequent real-time LAMP detection are seamlessly integrated on a single microfluidic compact disc, driven by energy-efficient, non-centrifuge-based magnetic field interactions. Each disc contains four parallel testing units, which can be configured either as four identical tests or as four species-specific tests. When configured as species-specific tests, the device can identify two of the most life-threatening malaria species (P. falciparum and P. vivax). The NAT device processes four samples simultaneously within a 50-min turnaround time. It achieves a detection limit of ~0.5 parasites/µl in whole blood, sufficient for detecting asymptomatic parasite carriers. The combination of sensitivity, specificity, cost, and scalable sample preparation suggests that the real-time fluorescence LAMP device could be particularly useful for malaria screening in field settings. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. CALIFA: a diameter-selected sample for an integral field spectroscopy galaxy survey

    NASA Astrophysics Data System (ADS)

    Walcher, C. J.; Wisotzki, L.; Bekeraité, S.; Husemann, B.; Iglesias-Páramo, J.; Backsmann, N.; Barrera Ballesteros, J.; Catalán-Torrecilla, C.; Cortijo, C.; del Olmo, A.; Garcia Lorenzo, B.; Falcón-Barroso, J.; Jilkova, L.; Kalinova, V.; Mast, D.; Marino, R. A.; Méndez-Abreu, J.; Pasquali, A.; Sánchez, S. F.; Trager, S.; Zibetti, S.; Aguerri, J. A. L.; Alves, J.; Bland-Hawthorn, J.; Boselli, A.; Castillo Morales, A.; Cid Fernandes, R.; Flores, H.; Galbany, L.; Gallazzi, A.; García-Benito, R.; Gil de Paz, A.; González-Delgado, R. M.; Jahnke, K.; Jungwiert, B.; Kehrig, C.; Lyubenova, M.; Márquez Perez, I.; Masegosa, J.; Monreal Ibero, A.; Pérez, E.; Quirrenbach, A.; Rosales-Ortega, F. F.; Roth, M. M.; Sanchez-Blazquez, P.; Spekkens, K.; Tundo, E.; van de Ven, G.; Verheijen, M. A. W.; Vilchez, J. V.; Ziegler, B.

    2014-09-01

    We describe and discuss the selection procedure and statistical properties of the galaxy sample used by the Calar Alto Legacy Integral Field Area (CALIFA) survey, a public legacy survey of 600 galaxies using integral field spectroscopy. The CALIFA "mother sample" was selected from the Sloan Digital Sky Survey (SDSS) DR7 photometric catalogue to include all galaxies with an r-band isophotal major axis between 45'' and 79.2'' and with a redshift 0.005 < z < 0.03. The mother sample contains 939 objects, 600 of which will be observed in the course of the CALIFA survey. The selection of targets for observations is based solely on visibility and thus keeps the statistical properties of the mother sample. By comparison with a large set of SDSS galaxies, we find that the CALIFA sample is representative of galaxies over a luminosity range of -19 > M_r > -23.1 and over a stellar mass range between 10^9.7 and 10^11.4 M⊙. In particular, within these ranges, the diameter selection does not lead to any significant bias against - or in favour of - intrinsically large or small galaxies. Only below luminosities of M_r = -19 (or stellar masses < 10^9.7 M⊙) is there a prevalence of galaxies with larger isophotal sizes, especially of nearly edge-on late-type galaxies, but such galaxies form < 10% of the full sample. We estimate volume-corrected distribution functions in luminosities and sizes and show that these are statistically fully compatible with estimates from the full SDSS when accounting for large-scale structure. For full characterization of the sample, we also present a number of value-added quantities determined for the galaxies in the CALIFA sample. These include consistent multi-band photometry based on growth curve analyses; stellar masses; distances and quantities derived from these; morphological classifications; and an overview of available multi-wavelength photometric measurements. We also explore different ways of characterizing the environments of CALIFA galaxies.
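    The diameter selection described above amounts to two simple catalogue cuts, one on apparent isophotal size and one on redshift. A minimal sketch (the dictionary keys are our own placeholders, not actual SDSS column names, and the boundary handling is illustrative):

```python
def califa_mother_sample(catalogue):
    """Apply the CALIFA selection cuts: r-band isophotal major axis
    between 45 and 79.2 arcsec and redshift 0.005 < z < 0.03.
    Rows are dicts; the keys 'isoA_r' and 'z' are hypothetical,
    not actual SDSS column names."""
    return [g for g in catalogue
            if 45.0 < g["isoA_r"] < 79.2 and 0.005 < g["z"] < 0.03]

# Toy catalogue with made-up values; only the middle galaxy passes.
cat = [
    {"isoA_r": 30.0, "z": 0.010},   # too small on the sky
    {"isoA_r": 60.0, "z": 0.020},   # passes both cuts
    {"isoA_r": 60.0, "z": 0.050},   # too distant
]
sel = califa_mother_sample(cat)
```

    Because the cut is on angular size at a narrow redshift range, it corresponds to a fairly well-defined range of physical sizes, which is what allows the survey's representativeness argument.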

  8. Intact preservation of environmental samples by freezing under an alternating magnetic field.

    PubMed

    Morono, Yuki; Terada, Takeshi; Yamamoto, Yuhji; Xiao, Nan; Hirose, Takehiro; Sugeno, Masaya; Ohwada, Norio; Inagaki, Fumio

    2015-04-01

    The study of environmental samples requires a preservation system that stabilizes the sample structure, including cells and biomolecules. To address this fundamental issue, we tested the cell alive system (CAS)-freezing technique on subseafloor sediment core samples. In the CAS-freezing technique, an alternating magnetic field is applied during the freezing process to produce vibration of water molecules and achieve a stable, super-cooled liquid phase. Upon further cooling, uniform freezing of the sample is achieved with minimal ice crystal formation. In this study, samples were preserved using the CAS and conventional freezing techniques at 4, -20, -80 and -196 (liquid nitrogen) °C. After 6 months of storage, microbial cell counts under conventional freezing decreased significantly (down to 10.7% of the initial count), whereas the decrease under CAS freezing was minimal. When Escherichia coli cells were tested under the same freezing conditions and stored for 2.5 months, CAS-frozen E. coli cells showed higher viability than those under the other conditions. In addition, the alternating magnetic field does not affect the direction of remanent magnetization in sediment core samples, although slight partial demagnetization in intensity due to freezing was observed. Consequently, our data indicate that the CAS technique is highly useful for the preservation of environmental samples. © 2014 Society for Applied Microbiology and John Wiley & Sons Ltd.

  9. Evaluation of membrane filter field monitors for microbiological air sampling

    NASA Technical Reports Server (NTRS)

    Fields, N. D.; Oxborrow, G. S.; Puleo, J. R.; Herring, C. M.

    1974-01-01

    Due to area constraints encountered in assembly and testing areas of spacecraft, the membrane filter field monitor (MF) and the National Aeronautics and Space Administration-accepted Reyniers slit air sampler were compared for recovery of airborne microbial contamination. The intramural air in a microbiological laboratory area and a clean room environment used for the assembly and testing of the Apollo spacecraft was studied. A significantly higher number of microorganisms was recovered by the Reyniers sampler. A high degree of consistency between the two sampling methods was shown by a regression analysis, with a correlation coefficient of 0.93. The MF samplers detected 79% of the concentration measured by the Reyniers slit samplers. The types of microorganisms identified from both sampling methods were similar.

  10. Gaussian vs. Bessel light-sheets: performance analysis in live large sample imaging

    NASA Astrophysics Data System (ADS)

    Reidt, Sascha L.; Correia, Ricardo B. C.; Donnachie, Mark; Weijer, Cornelis J.; MacDonald, Michael P.

    2017-08-01

    Light-sheet fluorescence microscopy (LSFM) has rapidly progressed in the past decade from an emerging technology into an established methodology. This progress has largely been driven by its suitability to developmental biology, where it is able to give excellent spatio-temporal resolution over relatively large fields of view with good contrast and low phototoxicity. In many respects it is superseding confocal microscopy. However, it is no magic bullet and still struggles to image deeply in more highly scattering samples. Many solutions to this challenge have been presented, including Airy and Bessel illumination, 2-photon operation and deconvolution techniques. In this work, we show a comparison between a simple but effective Gaussian beam illumination and Bessel illumination for imaging in chicken embryos. Whilst Bessel illumination is shown to be of benefit when a greater depth of field is required, it is not possible to see any benefits for imaging into the highly scattering tissue of the chick embryo.

  11. A Test of the Interpersonal Theory of Suicide in a Large Sample of Current Firefighters

    PubMed Central

    Chu, Carol; Buchman-Schmitt, Jennifer M.; Hom, Melanie A.; Stanley, Ian H.; Joiner, Thomas E.

    2017-01-01

    Recent research suggests that firefighters experience elevated rates of suicidal ideation and behaviors. The interpersonal theory of suicide may shed light on this finding. This theory postulates that suicidal desire is strongest among individuals experiencing perceived burdensomeness and thwarted belongingness, and that the combination of suicide desire and acquired capability for suicide is necessary for the development of suicidal behaviors. We tested the propositions of the interpersonal theory in a large sample of current United States firefighters (N=863). Participants completed self-report measures of perceived burdensomeness, thwarted belongingness, fearlessness about death (FAD; a component of acquired capability), and career suicidal ideation and suicide attempt history. Regression models were used to examine the association between interpersonal theory constructs, career suicidal ideation severity, and the presence of career suicide attempts. In line with theory predictions, the three-way interaction between perceived burdensomeness, thwarted belongingness, and FAD was significantly associated with career suicide attempts, beyond participant sex. However, findings were no longer significant after accounting for years of firefighter service or age. Contrary to predictions, the two-way interaction between perceived burdensomeness and thwarted belongingness was not significantly related to career suicidal ideation severity. Applications of the theory to firefighters and future research are discussed. PMID:27078756

  12. A test of the interpersonal theory of suicide in a large sample of current firefighters.

    PubMed

    Chu, Carol; Buchman-Schmitt, Jennifer M; Hom, Melanie A; Stanley, Ian H; Joiner, Thomas E

    2016-06-30

    Recent research suggests that firefighters experience elevated rates of suicidal ideation and behaviors. The interpersonal theory of suicide may shed light on this finding. This theory postulates that suicidal desire is strongest among individuals experiencing perceived burdensomeness and thwarted belongingness, and that the combination of suicide desire and acquired capability for suicide is necessary for the development of suicidal behaviors. We tested the propositions of the interpersonal theory in a large sample of current United States firefighters (N=863). Participants completed self-report measures of perceived burdensomeness, thwarted belongingness, fearlessness about death (FAD; a component of acquired capability), and career suicidal ideation and suicide attempt history. Regression models were used to examine the association between interpersonal theory constructs, career suicidal ideation severity, and the presence of career suicide attempts. In line with theory predictions, the three-way interaction between perceived burdensomeness, thwarted belongingness, and FAD was significantly associated with career suicide attempts, beyond participant sex. However, findings were no longer significant after accounting for years of firefighter service or age. Contrary to predictions, the two-way interaction between perceived burdensomeness and thwarted belongingness was not significantly related to career suicidal ideation severity. Applications of the theory to firefighters and future research are discussed. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
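
    The regression setup described above can be sketched with synthetic data. The scores below are randomly generated stand-ins for the study's self-report measures, and the column names (`pb`, `tb`, `fad`) are hypothetical labels, not the study's variables:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 863  # matches the firefighter sample size

# Randomly generated stand-in scores; the study used self-report
# measures, which are not reproduced here.
df = pd.DataFrame({
    "pb": rng.normal(size=n),    # perceived burdensomeness
    "tb": rng.normal(size=n),    # thwarted belongingness
    "fad": rng.normal(size=n),   # fearlessness about death
    "sex": rng.integers(0, 2, size=n),
})
# Simulate attempts driven by the three-way product (assumed effect).
logit_true = 0.4 * df.pb * df.tb * df.fad - 2.5
df["attempt"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_true)))

# 'pb * tb * fad' expands to all main effects, all two-way terms, and
# the theory-critical three-way interaction; sex enters as a covariate.
model = smf.logit("attempt ~ pb * tb * fad + sex", data=df).fit(disp=0)
coef = model.params["pb:tb:fad"]
```

    The `*` operator in the model formula is what encodes the theory's prediction: the three-way term `pb:tb:fad` is tested over and above all lower-order effects.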

  13. Statistical characterization of a large geochemical database and effect of sample size

    USGS Publications Warehouse

    Zhang, C.; Manheim, F.T.; Hinde, J.; Grossman, J.N.

    2005-01-01

    The authors investigated statistical distributions for concentrations of chemical elements from the National Geochemical Survey (NGS) database of the U.S. Geological Survey. At the time of this study, the NGS data set encompasses 48,544 stream sediment and soil samples from the conterminous United States analyzed by ICP-AES following a 4-acid near-total digestion. This report includes 27 elements: Al, Ca, Fe, K, Mg, Na, P, Ti, Ba, Ce, Co, Cr, Cu, Ga, La, Li, Mn, Nb, Nd, Ni, Pb, Sc, Sr, Th, V, Y and Zn. The goal and challenge for the statistical overview was to delineate chemical distributions in a complex, heterogeneous data set spanning a large geographic range (the conterminous United States), and many different geological provinces and rock types. After declustering to create a uniform spatial sample distribution with 16,511 samples, histograms and quantile-quantile (Q-Q) plots were employed to delineate subpopulations that have coherent chemical and mineral affinities. Probability groupings are discerned by changes in slope (kinks) on the plots. Major rock-forming elements, e.g., Al, Ca, K and Na, tend to display linear segments on normal Q-Q plots. These segments can commonly be linked to petrologic or mineralogical associations. For example, linear segments on K and Na plots reflect dilution of clay minerals by quartz sand (low in K and Na). Minor and trace element relationships are best displayed on lognormal Q-Q plots. These sensitively reflect discrete relationships in subpopulations within the wide range of the data. For example, small but distinctly log-linear subpopulations for Pb, Cu, Zn and Ag are interpreted to represent ore-grade enrichment of naturally occurring minerals such as sulfides. None of the 27 chemical elements could pass the test for either normal or lognormal distribution on the declustered data set. Part of the reason relates to the presence of mixtures of subpopulations and outliers. Random samples of the data set with successively
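
    The Q-Q-based subpopulation screening described above can be illustrated with a synthetic mixture. The distribution parameters are illustrative assumptions, not NGS values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical trace-element concentrations: a lognormal background
# plus a small enriched ("ore-grade") subpopulation, as the paper
# infers for Pb, Cu, Zn and Ag.
background = rng.lognormal(mean=3.0, sigma=0.4, size=9000)
enriched = rng.lognormal(mean=5.5, sigma=0.3, size=1000)
conc = np.concatenate([background, enriched])

# D'Agostino's K^2 test: the mixture passes neither a normal nor a
# lognormal fit, mirroring the paper's result for all 27 elements.
_, p_normal = stats.normaltest(conc)
_, p_lognormal = stats.normaltest(np.log(conc))

# On a lognormal Q-Q plot (normal Q-Q plot of the logs), the two
# subpopulations appear as two near-linear segments joined by a kink.
theoretical, ordered = stats.probplot(np.log(conc), dist="norm", fit=False)
```

    Plotting `ordered` against `theoretical` reproduces the kinked, piecewise-linear pattern the authors use to separate coherent subpopulations.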

  14. Development of a large-scale, outdoor, ground-based test capability for evaluating the effect of rain on airfoil lift

    NASA Technical Reports Server (NTRS)

    Bezos, Gaudy M.; Campbell, Bryan A.

    1993-01-01

    A large-scale, outdoor, ground-based test capability for acquiring aerodynamic data in a simulated rain environment was developed at the Langley Aircraft Landing Dynamics Facility (ALDF) to assess the effect of heavy rain on airfoil performance. The ALDF test carriage was modified to transport a 10-ft-chord NACA 64210 wing section along a 3000-ft track at full-scale aircraft approach speeds. An overhead rain simulation system was constructed along a 525-ft section of the track with the capability of producing simulated rain fields of 2, 10, 30, and 40 in/hr. The facility modifications, the aerodynamic testing and rain simulation capability, the design and calibration of the rain simulation system, and the operational procedures developed to minimize the effect of wind on the simulated rain field and aerodynamic data are described in detail. The data acquisition and reduction processes are also presented along with sample force data illustrating the environmental effects on data accuracy and repeatability for the 'rain-off' test condition.

  15. Testing AGN unification via inference from large catalogs

    NASA Astrophysics Data System (ADS)

    Nikutta, Robert; Ivezic, Zeljko; Elitzur, Moshe; Nenkova, Maia

    2018-01-01

    Source orientation and clumpiness of the central dust are the main factors in AGN classification. Type-1 QSOs are easy to observe and large samples are available (e.g. in SDSS), but obscured type-2 AGN are dimmer and redder as our line of sight is more obscured, making it difficult to obtain a complete sample. WISE has found up to a million QSOs. With only 4 bands and a relatively small aperture, the analysis of individual sources is challenging, but the large sample allows inference of bulk properties with high statistical significance. CLUMPY (www.clumpy.org) is arguably the most popular database of AGN torus SEDs. We model the ensemble properties of the entire WISE AGN content using regularized linear regression, with orientation-dependent CLUMPY color-color-magnitude (CCM) tracks as basis functions. We can reproduce the observed number counts per CCM bin with percent-level accuracy, and simultaneously infer the probability distributions of all torus parameters, redshifts, additional SED components, and identify type-1/2 AGN populations through their IR properties alone. We increase the statistical power of our AGN unification tests even further, by adding other datasets as axes in the regression problem. To this end, we make use of the NOAO Data Lab (datalab.noao.edu), which hosts several high-level large datasets and provides very powerful tools for handling large data, e.g. cross-matched catalogs, fast remote queries, etc.

  16. 30 CFR 15.5 - Test samples.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Test samples. 15.5 Section 15.5 Mineral... § 15.5 Test samples. (a) Submission of test samples. (1) The applicant shall not submit explosives or... magazine for at least 30 days before gallery tests are conducted. ...

  17. 30 CFR 15.5 - Test samples.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Test samples. 15.5 Section 15.5 Mineral... § 15.5 Test samples. (a) Submission of test samples. (1) The applicant shall not submit explosives or... magazine for at least 30 days before gallery tests are conducted. ...

  18. 30 CFR 15.5 - Test samples.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Test samples. 15.5 Section 15.5 Mineral... § 15.5 Test samples. (a) Submission of test samples. (1) The applicant shall not submit explosives or... magazine for at least 30 days before gallery tests are conducted. ...

  19. From large-eddy simulation to multi-UAVs sampling of shallow cumulus clouds

    NASA Astrophysics Data System (ADS)

    Lamraoui, Fayçal; Roberts, Greg; Burnet, Frédéric

    2016-04-01

    In-situ sampling of clouds at spatio-temporal resolutions sufficient to capture small-scale 3D physical processes continues to present challenges. This project (SKYSCANNER) aims at bringing together cloud sampling strategies using a swarm of unmanned aerial vehicles (UAVs) based on large-eddy simulation (LES). Multi-UAV field campaigns with a personalized sampling strategy for individual clouds and cloud fields will significantly improve the understanding of unresolved cloud physical processes. An extensive set of LES experiments for case studies from the ARM-SGP site has been performed using the MesoNH model at high resolutions down to 10 m. These simulations led to a macroscopic model that quantifies the interrelationship between micro- and macrophysical properties of shallow convective clouds. Both the geometry and evolution of individual clouds are critical to multi-UAV cloud sampling and path planning. The preliminary findings of the current project reveal several linear relationships that associate many cloud geometric parameters with cloud-related meteorological variables. In addition, the horizontal wind speed has a proportional impact on cloud number concentration as well as on triggering and prolonging the occurrence of cumulus clouds. In the framework of the joint collaboration that involves a multidisciplinary team (including institutes specializing in aviation, robotics and atmospheric science), this model will be a reference point for multi-UAV sampling strategies and path planning.

  20. Large Block Test Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, W

    2001-12-01

    This report documents the Large-Block Test (LBT) conducted at Fran Ridge near Yucca Mountain, Nevada. The LBT was a thermal test conducted on an exposed block of middle non-lithophysal Topopah Spring tuff (Tptpmn) and was designed to assist in understanding the thermal-hydrological-mechanical-chemical (THMC) processes associated with heating and then cooling a partially saturated fractured rock mass. The LBT was unique in that it was a large (3 x 3 x 4.5 m) block with top and sides exposed. Because the block was exposed at the surface, boundary conditions on five of the six sides of the block were relatively well known and controlled, making this test both easier to model and easier to monitor. This report presents a detailed description of the test as well as analyses of the data and conclusions drawn from the test. The rock block that was tested during the LBT was exposed by excavation and removal of the surrounding rock. The block was characterized and instrumented, and the sides were sealed and insulated to inhibit moisture and heat loss. Temperature on the top of the block was also controlled. The block was heated for 13 months, during which time temperature, moisture distribution, and deformation were monitored. After the test was completed and the block cooled down, a series of boreholes were drilled, and one of the heater holes was over-cored to collect samples for post-test characterization of mineralogy and mechanical properties. Section 2 provides background on the test. Section 3 lists the test objectives and describes the block site, the site configuration, and measurements made during the test. Section 3 also presents a chronology of events associated with the LBT, characterization of the block, and the pre-heat analyses of the test. Section 4 describes the fracture network contained in the block. 
Section 5 describes the heating/cooling system used to control the temperature in the block and presents the thermal history of the block during the

  1. On two-sample McNemar test.

    PubMed

    Xiang, Jim X

    2016-01-01

    A change in the presence of disease symptoms before and after a treatment is examined for statistical significance by means of the McNemar test. When comparing two treatments, Feuer and Kessler (1989) proposed a two-sample McNemar test. In this article, we show that this test usually inflates the type I error in hypothesis testing, and propose a new two-sample McNemar test that is superior in terms of preserving type I error. We also make the connection between the two-sample McNemar test and the test statistic for equal residual effects in a 2 × 2 crossover design. The limitations of the two-sample McNemar test are also discussed.
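
    For reference, the classic one-sample McNemar test that the two-sample version extends can be sketched as follows; the counts are hypothetical:

```python
from statsmodels.stats.contingency_tables import mcnemar

# Paired symptom status for one treatment group (hypothetical counts):
# rows = symptom present before, columns = symptom present after.
table = [[20, 15],   # before: yes -> after: yes / no
         [5, 60]]    # before: no  -> after: yes / no

# The one-sample McNemar test uses only the discordant cells
# (15 improved vs. 5 worsened); the two-sample version discussed
# above compares such discordant changes across two treatment groups.
result = mcnemar(table, exact=True)
print(result.statistic, result.pvalue)
```

    With `exact=True` the statistic is the smaller discordant count and the p-value comes from the binomial distribution, which is the usual choice for modest discordant totals.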

  2. Development of large field-of-view two photon microscopy for imaging mouse cortex (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Bumstead, Jonathan; Côté, Daniel C.; Culver, Joseph P.

    2017-02-01

    Spontaneous neuronal activity has been measured at cellular resolution in mice, zebrafish, and C. elegans using optical sectioning microscopy techniques, such as light sheet microscopy (LSM) and two photon microscopy (TPM). Recent improvements in these modalities and genetically encoded calcium indicators (GECIs) have enabled whole brain imaging of calcium dynamics in zebrafish and C. elegans. However, these whole brain microscopy studies have not been extended to mice due to the limited field of view (FOV) of TPM and the cumbersome geometry of LSM. Conventional TPM is restricted to diffraction-limited imaging over a small FOV (around 500 x 500 microns) due to the use of high magnification objectives (e.g. 1.0 NA; 20X) and the aberrations introduced by relay optics used in scanning the beam across the sample. To overcome these limitations, we have redesigned the entire optical path of the two photon microscope (scanning optics and objective lens) to support a field of view of Ø7 mm with relatively high spatial resolution (<10 microns). Using the optical engineering software Zemax, we designed our system with commercially available optics that minimize astigmatism, field curvature, chromatic focal shift, and vignetting. Performance of the system was also tested experimentally with fluorescent beads in agarose, fixed samples, and in vivo structural imaging. Our large-FOV TPM provides a modality capable of studying distributed brain networks in mice at cellular resolution.

  3. Large Field Visualization with Demand-Driven Calculation

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.; Henze, Chris

    1999-01-01

    We present a system designed for the interactive definition and visualization of fields derived from large data sets: the Demand-Driven Visualizer (DDV). The system allows the user to write arbitrary expressions to define new fields, and then apply a variety of visualization techniques to the result. Expressions can include differential operators and numerous other built-in functions, all of which are evaluated at specific field locations completely on demand. The payoff of following a demand-driven design philosophy throughout becomes particularly evident when working with large time-series data, where the costs of eager evaluation alternatives can be prohibitive.
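
    The demand-driven idea, evaluating a user-defined field expression only at the locations a visualization actually requests, can be sketched minimally. The field definitions and helper below are hypothetical illustrations, not DDV's API:

```python
import numpy as np

# A derived field is a closure over its input fields; nothing is
# computed until a set of query points arrives.
def derived_field(expr, *fields):
    return lambda pts: expr(*(f(pts) for f in fields))

# Hypothetical base fields, defined analytically for this sketch
# (a real system would read them from large time-series data sets).
u = lambda p: np.sin(p[:, 0])
v = lambda p: np.cos(p[:, 1])

# User-defined expression, e.g. flow speed from two velocity components.
speed = derived_field(lambda x, y: np.sqrt(x**2 + y**2), u, v)

# Only these two probe locations are evaluated -- the expression is
# never computed eagerly over the whole domain.
probe = np.array([[0.0, 0.0], [1.0, 2.0]])
result = speed(probe)
```

    The contrast with eager evaluation is that `speed` stores no grid of values; cost scales with the number of probe points, not the domain size.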

  4. Investigation of flow fields within large scale hypersonic inlet models

    NASA Technical Reports Server (NTRS)

    Gnos, A. V.; Watson, E. C.; Seebaugh, W. R.; Sanator, R. J.; Decarlo, J. P.

    1973-01-01

    Analytical and experimental investigations were conducted to determine the internal flow characteristics in model passages representative of hypersonic inlets for use at Mach numbers up to about 12. The passages were large enough to permit measurements to be made in both the core flow and boundary layers. The analytical techniques for designing the internal contours and predicting the internal flow-field development accounted for coupling between the boundary layers and inviscid flow fields by means of a displacement-thickness correction. Three large-scale inlet models, each having a different internal compression ratio, were designed to provide high internal performance with an approximately uniform static-pressure distribution at the throat station. The models were tested in the Ames 3.5-Foot Hypersonic Wind Tunnel at a nominal free-stream Mach number of 7.4 and a unit free-stream Reynolds number of 8.86 × 10^6 per meter.

  5. Low energy prompt gamma-ray tests of a large volume BGO detector.

    PubMed

    Naqvi, A A; Kalakada, Zameer; Al-Anezi, M S; Raashid, M; Khateeb-ur-Rehman; Maslehuddin, M; Garwan, M A

    2012-01-01

    Tests of a large volume Bismuth Germanate (BGO) detector were carried out to detect low energy prompt gamma-rays from boron- and cadmium-contaminated water samples using a portable neutron generator-based Prompt Gamma Neutron Activation Analysis (PGNAA) setup. In spite of strong interference between the sample- and detector-associated prompt gamma-rays, excellent agreement was observed between the experimental and calculated yields of the prompt gamma-rays, indicating successful application of the large volume BGO detector in PGNAA analysis of bulk samples using low energy prompt gamma-rays. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. A hard-to-read font reduces the framing effect in a large sample.

    PubMed

    Korn, Christoph W; Ries, Juliane; Schalk, Lennart; Oganian, Yulia; Saalbach, Henrik

    2018-04-01

    How can apparent decision biases, such as the framing effect, be reduced? Intriguing findings within recent years indicate that foreign language settings reduce framing effects, which has been explained in terms of deeper cognitive processing. Because hard-to-read fonts have been argued to trigger deeper cognitive processing, so-called cognitive disfluency, we tested whether hard-to-read fonts reduce framing effects. We found no reliable evidence for an effect of hard-to-read fonts on four framing scenarios in a laboratory study (final N = 158) and an online study (N = 271). However, in a preregistered online study with a rather large sample (N = 732), a hard-to-read font reduced the framing effect in the classic "Asian disease" scenario (in a one-sided test). This suggests that hard-to-read fonts can modulate decision biases, albeit with rather small effect sizes. Overall, our findings stress the importance of large samples for the reliability and replicability of modulations of decision biases.

  7. Design of shared instruments to utilize simulated gravities generated by a large-gradient, high-field superconducting magnet.

    PubMed

    Wang, Y; Yin, D C; Liu, Y M; Shi, J Z; Lu, H M; Shi, Z H; Qian, A R; Shang, P

    2011-03-01

    A high-field superconducting magnet can provide both high-magnetic fields and large-field gradients, which can be used as a special environment for research or practical applications in materials processing, life science studies, physical and chemical reactions, etc. To make full use of a superconducting magnet, shared instruments (the operating platform, sample holders, temperature controller, and observation system) must be prepared as prerequisites. This paper introduces the design of a set of sample holders and a temperature controller in detail with an emphasis on validating the performance of the force and temperature sensors in the high-magnetic field.

  8. Design of shared instruments to utilize simulated gravities generated by a large-gradient, high-field superconducting magnet

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Yin, D. C.; Liu, Y. M.; Shi, J. Z.; Lu, H. M.; Shi, Z. H.; Qian, A. R.; Shang, P.

    2011-03-01

    A high-field superconducting magnet can provide both high-magnetic fields and large-field gradients, which can be used as a special environment for research or practical applications in materials processing, life science studies, physical and chemical reactions, etc. To make full use of a superconducting magnet, shared instruments (the operating platform, sample holders, temperature controller, and observation system) must be prepared as prerequisites. This paper introduces the design of a set of sample holders and a temperature controller in detail with an emphasis on validating the performance of the force and temperature sensors in the high-magnetic field.

  9. Uniform field loop-gap resonator and rectangular TEU02 for aqueous sample EPR at 94 GHz

    NASA Astrophysics Data System (ADS)

    Sidabras, Jason W.; Sarna, Tadeusz; Mett, Richard R.; Hyde, James S.

    2017-09-01

    In this work we present the design and implementation of two uniform-field resonators: a seven-loop-six-gap loop-gap resonator (LGR) and a rectangular TEU02 cavity resonator. Each resonator has uniform-field-producing end-sections. These resonators have been designed for electron paramagnetic resonance (EPR) of aqueous samples at 94 GHz. The LGR geometry employs low-loss Rexolite end-sections to improve the field homogeneity over a 3 mm sample region-of-interest from a near-cosine distribution to 90% uniform. The LGR was designed to accommodate large degassable polytetrafluoroethylene (PTFE) tubes (0.81 mm O.D.; 0.25 mm I.D.) for aqueous samples. Additionally, field modulation slots are designed for uniform 100 kHz field modulation incident at the sample. Experiments using a point sample of lithium phthalocyanine (LiPC) were performed to measure both the uniformity of the microwave magnetic field and the 100 kHz field modulation, and confirm simulations. The rectangular TEU02 cavity resonator employs over-sized end-sections with sample shielding to provide an 87% uniform field for a 0.1 × 2 × 6 mm3 sample geometry. An evanescent slotted window was designed for light access to irradiate 90% of the sample volume. A novel dual-slot iris was used to minimize microwave magnetic field perturbations and maintain cross-sectional uniformity. Practical EPR experiments with light-irradiated rose bengal (4,5,6,7-tetrachloro-2′,4′,5′,7′-tetraiodofluorescein) were performed in the TEU02 cavity. These geometries provide practical designs for uniform-field resonators and continue resonator advancements toward quantitative EPR spectroscopy.

  10. Testing for independence in J×K contingency tables with complex sample survey data.

    PubMed

    Lipsitz, Stuart R; Fitzmaurice, Garrett M; Sinha, Debajyoti; Hevelone, Nathanael; Giovannucci, Edward; Hu, Jim C

    2015-09-01

    The test of independence of row and column variables in a (J×K) contingency table is a widely used statistical test in many areas of application. For complex survey samples, use of the standard Pearson chi-squared test is inappropriate due to correlation among units within the same cluster. Rao and Scott (1981, Journal of the American Statistical Association 76, 221-230) proposed an approach in which the standard Pearson chi-squared statistic is multiplied by a design effect to adjust for the complex survey design. Unfortunately, this test fails to exist when one of the observed cell counts equals zero. Even with the large samples typical of many complex surveys, zero cell counts can occur for rare events, small domains, or contingency tables with a large number of cells. Here, we propose Wald and score test statistics for independence based on weighted least squares estimating equations. In contrast to the Rao-Scott test statistic, the proposed Wald and score test statistics always exist. In simulations, the score test is found to perform best with respect to type I error. The proposed method is motivated by, and applied to, post surgical complications data from the United States' Nationwide Inpatient Sample (NIS) complex survey of hospitals in 2008. © 2015, The International Biometric Society.
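
    The first-order Rao-Scott adjustment that the proposed Wald and score tests are contrasted with can be sketched as follows. The table counts and design effect are hypothetical, and the paper's own estimating-equation tests are not reproduced here:

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 contingency table from a complex survey.
observed = np.array([[120, 80],
                     [60, 140]])

# Standard Pearson chi-squared statistic (no continuity correction);
# its usual reference distribution is invalid under clustering.
chi2, _, dof, _ = stats.chi2_contingency(observed, correction=False)

# First-order Rao-Scott adjustment: deflate the statistic by an
# assumed average design effect before referring it to chi-square.
design_effect = 1.8  # would be estimated from the survey design
chi2_rs = chi2 / design_effect
p_rs = stats.chi2.sf(chi2_rs, dof)
```

    Note the failure mode the abstract highlights: if any observed cell is zero, the design-effect estimate underlying this correction breaks down, whereas the proposed Wald and score statistics always exist.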

  11. Field trapping and magnetic levitation performances of large single-grain Gd Ba Cu O at different temperatures

    NASA Astrophysics Data System (ADS)

    Nariki, S.; Fujikura, M.; Sakai, N.; Hirabayashi, I.; Murakami, M.

    2005-10-01

    We measured the temperature dependence of the trapped field and the magnetic levitation force for c-axis-oriented single-grain Gd-Ba-Cu-O bulk samples 48 mm in diameter. The trapped magnetic field of the samples was 2.1-2.2 T at 77 K, increased with decreasing temperature, and reached 4.1 T at 70 K; however, the samples fractured during measurements at lower temperatures due to the large electromagnetic force. Reinforcement with a metal ring was effective in improving the mechanical strength. A sample encapsulated in an Al ring could trap a very high magnetic field of 9.0 T at 50 K. In liquid O2 the Gd-Ba-Cu-O bulk exhibited a trapped field of 0.42 T and a magnetic levitation force about half of that in liquid N2.

  12. Sample features associated with success rates in population-based EGFR mutation testing.

    PubMed

    Shiau, Carolyn J; Babwah, Jesse P; da Cunha Santos, Gilda; Sykes, Jenna R; Boerner, Scott L; Geddie, William R; Leighl, Natasha B; Wei, Cuihong; Kamel-Reid, Suzanne; Hwang, David M; Tsao, Ming-Sound

    2014-07-01

    Epidermal growth factor receptor (EGFR) mutation testing has become critical in the treatment of patients with advanced non-small-cell lung cancer. This study reviews a large, epidemiologically unselected series of EGFR mutation tests for patients with nonsquamous non-small-cell lung cancer in a North American population to determine the sample-related factors that influence success in clinical EGFR testing. Data from consecutive cases of Canadian province-wide testing at a centralized diagnostic laboratory for a 24-month period were reviewed. Samples were tested for exon-19 deletion and exon-21 L858R mutations using a validated polymerase chain reaction method with 1% to 5% detection sensitivity. Of 2651 samples submitted, 2404 were tested, with 2293 eligible for analysis (1780 histology and 513 cytology specimens). The overall test-failure rate was 5.4%, with an overall mutation rate of 20.6%. No significant differences in the failure rate, mutation rate, or mutation type were found between histology and cytology samples. Although tumor cellularity was significantly associated with test-success or mutation rates in histology and cytology specimens, respectively, mutations could be detected in all specimen types. Significant rates of EGFR mutation were detected in cases with thyroid transcription factor (TTF)-1-negative immunohistochemistry (6.7%) and a mucinous component (9.0%). EGFR mutation testing should be attempted in any specimen, whether histologic or cytologic. Samples should not be excluded from testing based on TTF-1 status or histologic features. Pathologists should report the amount of available tumor for testing. However, suboptimal samples with a negative EGFR mutation result should be considered for repeat testing with an alternate sample.

  13. Crash testing difference-smoothing algorithm on a large sample of simulated light curves from TDC1

    NASA Astrophysics Data System (ADS)

    Rathna Kumar, S.

    2017-09-01

    In this work, we propose refinements to the difference-smoothing algorithm for the measurement of time delay from the light curves of the images of a gravitationally lensed quasar. The refinements mainly consist of a more pragmatic approach to choosing the smoothing time-scale free parameter, generation of more realistic synthetic light curves for the estimation of time delay uncertainty, and use of a plot of normalized χ2 computed over a wide range of trial time delay values to assess the reliability of a measured time delay and to identify instances of catastrophic failure. We rigorously tested the difference-smoothing algorithm on a large sample of more than a thousand pairs of simulated light curves having known true time delays between them from the two most difficult 'rungs' - rung3 and rung4 - of the first edition of the Strong Lens Time Delay Challenge (TDC1) and found an inherent tendency of the algorithm to measure the magnitude of time delay to be higher than the true value of time delay. However, we find that this systematic bias is eliminated by applying a correction to each measured time delay according to the magnitude and sign of the systematic error inferred by applying the time delay estimator on synthetic light curves simulating the measured time delay. Following these refinements, the TDC performance metrics for the difference-smoothing algorithm are found to be competitive with those of the best performing submissions of TDC1 for both the tested 'rungs'. The MATLAB codes used in this work and the detailed results are made publicly available.
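
    The general idea of scanning trial delays and minimizing a mismatch statistic can be sketched with a crude grid-search estimator. This is not the paper's difference-smoothing algorithm; the light curves, noise level, and cost function below are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def signal(x):
    # Smooth intrinsic quasar variability (arbitrary analytic stand-in).
    return np.sin(x / 25.0) + 0.5 * np.sin(x / 7.0)

# Irregularly sampled light curves of two lensed images; image B lags
# image A by a known true delay (in days).
t = np.sort(rng.uniform(0.0, 400.0, 300))
true_delay = 14.0
a = signal(t) + rng.normal(0.0, 0.02, t.size)
b = signal(t - true_delay) + rng.normal(0.0, 0.02, t.size)

# Scan trial delays: shift B by each trial value via interpolation and
# pick the delay that minimizes the mean squared mismatch with A.
trials = np.arange(0.0, 30.0, 0.5)
cost = [np.mean((a - np.interp(t, t - d, b)) ** 2) for d in trials]
best = trials[int(np.argmin(cost))]
```

    Plotting `cost` against `trials` gives the analogue of the paper's normalized χ2 curve: a single sharp minimum indicates a reliable measurement, while a flat or multi-minimum curve flags a catastrophic failure.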

  14. ff14IDPs Force Field Improving the Conformation Sampling of Intrinsically Disordered Proteins

    PubMed Central

    Song, Dong; Wang, Wei; Ye, Wei; Ji, Dingjue; Luo, Ray; Chen, Hai-Feng

    2017-01-01

    Intrinsically disordered proteins (IDPs) are proteins that lack a specific tertiary structure and are unable to fold spontaneously without a binding partner. IDPs are associated with various diseases, such as diabetes, cancer, and neurodegenerative diseases. However, current widely used force fields, such as ff99SB, ff14SB, OPLS/AA, and Charmm27, are insufficient for sampling the conformational characteristics of IDPs. In this study, the CMAP method was used to correct the φ/ψ distributions of disorder-promoting amino acids. The simulation results show that the force field parameters (ff14IDPs) improve the φ/ψ distributions of the disorder-promoting amino acids, with RMSD less than 0.10% relative to the benchmark data of IDPs. Further tests suggest that the secondary chemical shifts calculated under the ff14IDPs force field are in quantitative agreement with NMR experimental data for five tested systems. In addition, the simulation results show that ff14IDPs can still be used to model structured proteins, such as the tested lysozyme and ubiquitin, with better performance in coil regions than the original general Amber force field ff14SB. These findings confirm that the newly developed Amber ff14IDPs force field is a robust model for improving the conformation sampling of IDPs. PMID:27484738

  15. Static and wind tunnel near-field/far-field jet noise measurements from model scale single-flow base line and suppressor nozzles. Summary report. [conducted in the Boeing large anechoic test chamber and the NASA-Ames 40- by 80-foot wind tunnel

    NASA Technical Reports Server (NTRS)

    Jaeck, C. L.

    1977-01-01

    A test program was conducted in the Boeing large anechoic test chamber and the NASA-Ames 40- by 80-foot wind tunnel to study the near- and far-field jet noise characteristics of six baseline and suppressor nozzles. Static and wind-on noise source locations were determined. A technique for extrapolating near field jet noise measurements into the far field was established. It was determined if flight effects measured in the near field are the same as those in the far field. The flight effects on the jet noise levels of the baseline and suppressor nozzles were determined. Test models included a 15.24-cm round convergent nozzle, an annular nozzle with and without ejector, a 20-lobe nozzle with and without ejector, and a 57-tube nozzle with lined ejector. The static free-field test in the anechoic chamber covered nozzle pressure ratios from 1.44 to 2.25 and jet velocities from 412 to 594 m/s at a total temperature of 844 K. The wind tunnel flight effects test repeated these nozzle test conditions with ambient velocities of 0 to 92 m/s.

  16. Long-term average non-dipole fields; how large or how small?

    NASA Astrophysics Data System (ADS)

    Van Der Voo, R.; Domeier, M. M.; Torsvik, T. H.

    2012-12-01

    Paul Louis Mercanton suggested as early as the late 1920s that paleomagnetism might provide a test of continental drift. However, the absence of an adequate understanding of the ancient geomagnetic field structure hampered such a test until some 25 years later. The results of the paleomagnetic study of Neogene Icelandic lavas by Hospers in the early 1950s then provided a breakthrough. Two very important findings were: (1) that the field in the Neogene was predominantly dipolar, implying that higher-order fields (quadrupoles, octupoles) averaged to near-zero, and (2) that the dipole axis remained on average aligned with the rotation axis, during normal- as well as reversed-polarity field intervals. The latter conclusion prompted Creer, Irving, and Runcorn to remark that "The coincidence of the magnetic and rotation axes [...] covering many reversals is explained by the dominance of the Coriolis force". The geocentric axial dipole (GAD) hypothesis has remained ever since the main guiding principle of paleomagnetic analysis, allowing declination anomalies to be interpreted as rotations and inclinations as representative of paleolatitudes. It is generally agreed that the long-term averaged field structure is largely, but not perfectly, dipolar. The critical question about non-dipole fields is "how large" (or, perhaps, "how small"). Analysis of the magnitude of non-dipole fields is restricted to zonal fields of degree (n) two or three, i.e., axial quadrupole and octupole fields, characterized by Gaussian coefficient ratios (Gn), where Gn is the ratio of the appropriate higher-order field coefficient to the axial dipole field coefficient. For the last 5 million years G2 and G3 are small, but not zero (Johnson et al., 2008, G-cubed), and for earlier geological times (Permian, Triassic) some speculations by some of us have considered values up to 0.2, on the basis of inclination patterns.
The underlying assumption that inclination anomalies were attributable

  17. Compressive and Flexural Tests on Adobe Samples Reinforced with Wire Mesh

    NASA Astrophysics Data System (ADS)

    Jokhio, G. A.; Al-Tawil, Y. M. Y.; Syed Mohsin, S. M.; Gul, Y.; Ramli, N. I.

    2018-03-01

    Adobe is an economical, naturally available, and environmentally friendly construction material that offers excellent thermal and sound insulation as well as indoor air quality. It is important to understand and enhance the mechanical properties of this material, for which a high degree of variation is reported in the literature owing to the lack of research and standardization in this field. The present paper focuses first on understanding the mechanical behaviour of adobe subjected to compressive stresses as well as flexure, and then on enhancing the same with the help of steel wire mesh as reinforcement. A total of 22 samples were tested, of which 12 cube samples were tested for compressive strength and 10 beam samples were tested for modulus of rupture. Half of the samples in each category were control samples, i.e., without wire mesh reinforcement, whereas the remaining half were reinforced with a single layer of wire mesh per sample. It was found that the compressive strength of adobe increases by about 43% after adding a single layer of wire mesh reinforcement. The flexural response of adobe also showed improvement with the addition of wire mesh reinforcement.

  18. Calculating p-values and their significances with the Energy Test for large datasets

    NASA Astrophysics Data System (ADS)

    Barter, W.; Burr, C.; Parkes, C.

    2018-04-01

    The energy test method is a multi-dimensional test of whether two samples are consistent with arising from the same underlying population, through the calculation of a single test statistic (called the T-value). The method has recently been used in particle physics to search for samples that differ due to CP violation. The generalised extreme value function has previously been used to describe the distribution of T-values under the null hypothesis that the two samples are drawn from the same underlying population. We show that, in a simple test case, the distribution is not sufficiently well described by the generalised extreme value function. We present a new method, where the distribution of T-values under the null hypothesis when comparing two large samples can be found by scaling the distribution found when comparing small samples drawn from the same population. This method can then be used to quickly calculate the p-values associated with the results of the test.
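    For concreteness, the T-value at the heart of the energy test can be sketched as follows. A Gaussian distance weight and one common normalisation are assumed here; conventions vary between papers, and this is not the authors' code:

```python
import numpy as np

def energy_test_T(x, y, sigma=1.0):
    """Energy test statistic for samples x (n, d) and y (m, d), using a
    Gaussian distance weight psi = exp(-d^2 / (2 sigma^2)). T fluctuates
    around zero when both samples share one parent population; larger T
    indicates a difference. (One common convention; normalisations vary.)"""
    def psi(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma**2))
    n, m = len(x), len(y)
    pxx = psi(x, x); np.fill_diagonal(pxx, 0.0)
    pyy = psi(y, y); np.fill_diagonal(pyy, 0.0)
    pxy = psi(x, y)
    return (pxx.sum() / (2 * n * (n - 1))
            + pyy.sum() / (2 * m * (m - 1))
            - pxy.sum() / (n * m))

rng = np.random.default_rng(1)
same = energy_test_T(rng.normal(0, 1, (200, 2)), rng.normal(0, 1, (200, 2)))
diff = energy_test_T(rng.normal(0, 1, (200, 2)), rng.normal(1.0, 1, (200, 2)))
print(same < diff)  # the shifted sample yields a larger T
```

The paper's contribution concerns the next step: obtaining the null distribution of such T-values for large samples cheaply, by scaling the distribution found for small samples.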

  19. Transport and attenuation of carboxylate-modified latex microspheres in fractured rock laboratory and field tracer tests

    USGS Publications Warehouse

    Becker, M.W.; Reimus, P.W.; Vilks, P.

    1999-01-01

    Understanding colloid transport in ground water is essential to assessing the migration of colloid-size contaminants, the facilitation of dissolved contaminant transport by colloids, in situ bioremediation, and the health risks of pathogen contamination in drinking water wells. Much has been learned through laboratory and field-scale colloid tracer tests, but progress has been hampered by a lack of consistent tracer testing methodology at different scales and fluid velocities. This paper presents laboratory and field tracer tests in fractured rock that use the same type of colloid tracer over an almost three orders-of-magnitude range in scale and fluid velocity. Fluorescently dyed carboxylate-modified latex (CML) microspheres (0.19- to 0.98-μm diameter) were used as tracers in (1) a naturally fractured tuff sample, (2) a large block of naturally fractured granite, (3) a fractured granite field site, and (4) another fractured granite/schist field site. In all cases, the mean transport time of the microspheres was shorter than that of the solutes, regardless of detection limit. In all but the smallest scale test, only a fraction of the injected microsphere mass was recovered, with the smaller microspheres being recovered to a greater extent than the larger microspheres. Using existing theory, we hypothesize that the observed microsphere early arrival was due to volume exclusion and that attenuation was due to aggregation and/or settling during transport. In most tests, microspheres were detected using flow cytometry, which proved to be an excellent method of analysis.
    CML microspheres appear to be useful tracers for fractured rock in forced gradient and short-term natural gradient tests, but longer residence times may result in small microsphere recoveries.

  20. Comparative Field Tests of Pressurised Rover Prototypes

    NASA Astrophysics Data System (ADS)

    Mann, G. A.; Wood, N. B.; Clarke, J. D.; Piechochinski, S.; Bamsey, M.; Laing, J. H.

    The conceptual designs, interior layouts and operational performances of three pressurised rover prototypes - Aonia, ARES and Everest - were field tested during a recent simulation at the Mars Desert Research Station in Utah. A human factors experiment, in which the same crew of three executed the same simulated science mission in each of the three vehicles, yielded comparative data on the capacity of each vehicle to safely and comfortably carry explorers away from the main base, enter and exit the vehicle in spacesuits, perform science tasks in the field, and manage geological and biological samples. As well as offering recommendations for design improvements to specific vehicles, the results suggest that a conventional Sports Utility Vehicle (SUV) would not be suitable for analog field work; that a pressurised docking tunnel to the main habitat is essential; that better provisions for spacesuit storage are required; and that a crew consisting of one driver/navigator and two field science specialists may be optimal. From a field operations viewpoint, a recurring conflict between rover and habitat crews at the time of return to the habitat was observed. An analysis of these incidents leads to proposed refinements of operational protocols and specific crew training for rover returns, and again points to the need for a pressurised docking tunnel. Sound field testing, circulation of results, and building the lessons learned into new vehicles are advocated as a way of producing ever higher fidelity rover analogues.

  1. Factor Structure of the TOEFL® Internet-Based Test (iBT): Exploration in a Field Trial Sample. TOEFL iBT Research Report. TOEFL iBT-04. ETS RR-08-09

    ERIC Educational Resources Information Center

    Sawaki, Yasuyo; Stricker, Lawrence; Oranje, Andreas

    2008-01-01

    The present study investigated the factor structure of a field trial sample of the Test of English as a Foreign Language™ Internet-based test (TOEFL® iBT). An item-level confirmatory factor analysis (CFA) was conducted for a polychoric correlation matrix of items on a test form completed by 2,720 participants in the 2003-2004 TOEFL iBT Field…

  2. Test of Relativistic Gravity for Propulsion at the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Felber, Franklin

    2010-01-01

    A design is presented of a laboratory experiment that could test the suitability of relativistic gravity for propulsion of spacecraft to relativistic speeds. An exact time-dependent solution of Einstein's gravitational field equation confirms that even the weak field of a mass moving at relativistic speeds could serve as a driver to accelerate a much lighter payload from rest to a good fraction of the speed of light. The time-dependent field of ultrarelativistic particles in a collider ring is calculated. An experiment is proposed as the first test of the predictions of general relativity in the ultrarelativistic limit by measuring the repulsive gravitational field of bunches of protons in the Large Hadron Collider (LHC). The estimated `antigravity beam' signal strength at a resonant detector of each proton bunch is 3 nm/s2 for 2 ns during each revolution of the LHC. This experiment can be performed off-line, without interfering with the normal operations of the LHC.

  3. Large photocathode 20-inch PMT testing methods for the JUNO experiment

    NASA Astrophysics Data System (ADS)

    Anfimov, N.

    2017-06-01

    The 20 kt Liquid Scintillator (LS) JUNO detector is being constructed by the International Collaboration in China, with the primary goal of addressing the question of neutrino mass ordering (hierarchy). The main challenge for JUNO is to achieve a record energy resolution, ~ 3% at 1 MeV of energy released in the LS, which is required to perform the neutrino mass hierarchy determination. About 20 000 large 20'' PMTs with high Photon Detection Efficiency (PDE) and good photocathode uniformity will ensure an approximately 80% surface coverage of the JUNO detector. The JUNO collaboration is preparing equipment for the mass testing of all PMTs using 4 dedicated containers. Each container consists of 36 drawers, and each drawer will test a single PMT. This approach allows us to test 144 PMTs in parallel. The primary measurement in the container will be the PMT response to illumination of its photocathode by low-intensity uniform light. Each of the 20 000 PMTs will undergo the container test. Additionally, a dedicated scanning system was constructed for sampled tests of PMTs that allows us to study the variation of the PDE over the entire PMT photocathode surface. A sophisticated laboratory for PMT testing was recently built. It includes a dark room where the scanning station is housed. The core of the scanning station is a rotating frame with 7 LED sources of calibrated short light flashes that are placed along the photocathode surface, covering zenith angles from the top of a PMT to its equator. It allows for the testing of individual PMTs in all relevant aspects by scanning the photocathode and identifying any potential problems. The collection efficiency of a large PMT is known to be very sensitive to the Earth's magnetic field (EMF); therefore, understanding the necessary level of EMF suppression is crucial for the JUNO experiment. A dark room with Helmholtz coils compensating the EMF components is available for these tests at a JUNO facility.
The Hamamatsu R12860 20'' PMT is

  4. Evaluation of a Commercial Field Test to Detect African Swine Fever.

    PubMed

    Cappai, Stefano; Loi, Federica; Coccollone, Annamaria; Cocco, Manuele; Falconi, Caterina; Dettori, Giovanna; Feliziani, Francesco; Sanna, Maria Luisa; Oggiano, Annalisa; Rolesu, Sandro

    2017-07-01

    African swine fever (ASF) is one of the most important and complex infectious diseases affecting pigs (Sus scrofa). The disease has been present in Sardinia, Italy, since 1978. Factors influencing the presence of the disease on the island are the presence of illegally bred pigs, uncontrolled movements of animals, and local traditions. Implementation of public health programs is essential for controlling ASF. The use of new diagnostic techniques on both wild boar (WB) and illegally bred pigs would provide tools for faster and less expensive control of the disease. We evaluated a commercial serological test kit (Pen-side [PS]) for use in the field. We sampled 113 hunter-harvested WB during the 2014-15 season, collecting blood and lung samples to conduct serological analyses and to screen for the ASF virus. Although the sensitivity (81.8%) and specificity (95.9%) of tests performed in the field were reduced compared to the same test in the laboratory, they nevertheless allowed for rapid diagnosis and reduced unnecessary carcass destruction. The test, conducted in the field, was less expensive than in the laboratory and required less manpower. Therefore, we conclude that the combined use of the antibody PS test and antigen PS test may be a valuable emergency management method during an outbreak as well as a useful tool for conducting regular monitoring activities as a preventive policy.
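    The field sensitivity and specificity quoted above translate into predictive values that depend strongly on prevalence. A small sketch of the standard Bayes arithmetic; the 10% prevalence is purely an illustrative assumption, not a figure from the study:

```python
def predictive_values(se, sp, prev):
    """Positive and negative predictive value from sensitivity,
    specificity and prevalence (standard Bayes identities)."""
    ppv = se * prev / (se * prev + (1 - sp) * (1 - prev))
    npv = sp * (1 - prev) / (sp * (1 - prev) + (1 - se) * prev)
    return ppv, npv

# Field figures from the study; 10% prevalence is an illustrative assumption.
ppv, npv = predictive_values(0.818, 0.959, 0.10)
print(round(ppv, 2), round(npv, 2))  # 0.69 0.98
```

Even with reduced field sensitivity, the high negative predictive value at moderate prevalence is what makes the pen-side screen useful for triage.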

  5. Pilot Field Test Study

    NASA Technical Reports Server (NTRS)

    Sherriff, Abigail

    2015-01-01

    The Field Test study is currently in full swing, preceded by the successful completion of the Pilot Field Test study that paved the way for collecting data on the astronauts in the medical tent in Kazakhstan. Abigail Sherriff worked alongside Logan Dobbe on one Field Test aspect to determine foot clearance over obstacles (5 cm, 10 cm, and 15 cm) using APDM, Inc. Inertial Measurement Units (IMUs) worn by the astronauts. They created a program to accurately calculate foot clearance using the accelerometer, magnetometer, and gyroscope data with the IMUs attached to the tops of the shoes. To validate the functionality of their program, they completed a successful study on test subjects performing various tasks in an optical motion capture studio, considered a gold standard in biomechanics research. Future work will include further validation and expanding the program to include other analyses.

  6. Exploring proximity effects and large depth of field in helium ion beam lithography: large-area dense patterns and tilted surface exposure.

    PubMed

    Flatabø, Ranveig; Agarwal, Akshay; Hobbs, Richard; Greve, Martin M; Holst, Bodil; Berggren, Karl K

    2018-07-06

    Helium ion beam lithography (HIL) is an emerging nanofabrication technique. It benefits from a reduced interaction volume compared to that of an electron beam of similar energy, and hence reduced long-range scattering (proximity effect), higher resist sensitivity and potentially higher resolution. Furthermore, the small angular spread of the helium ion beam gives rise to a large depth of field. This should enable patterning on tilted and curved surfaces without the need for additional adjustments, such as laser autofocus. So far, most work on HIL has focused on exploiting the reduced proximity effect to reach single-digit nanometer resolution, and has thus concentrated on single-pixel exposures over small areas. Here we explore two new areas of application. Firstly, we investigate the proximity effect in large-area exposures and demonstrate HIL's capabilities in fabricating precise high-density gratings on large planar surfaces (100 μm × 100 μm, with pitch down to 35 nm) using an area dose for exposure. Secondly, we exploit the large depth of field by making the first HIL patterns on tilted surfaces (sample stage tilted 45°). We demonstrate a depth of field greater than 100 μm for a resolution of about 20 nm.
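    The depth-of-field claim follows from small-angle beam geometry: a beam with convergence half-angle α stays within a spot of size d over a distance of order d/α. A back-of-the-envelope check, where the half-angle value is our assumption rather than a figure from the paper:

```python
alpha = 1e-4        # rad: illustrative helium-ion beam convergence half-angle
d = 20e-9           # m: ~20 nm resolution reported in the paper
dof = d / alpha     # small-angle estimate of depth of field
print(dof)          # about 2e-4 m (200 um), consistent with the >100 um observed
```

The same geometry explains why an electron beam, with a much larger convergence angle, cannot match this without refocusing.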

  7. Testing of high-volume sampler inlets for the sampling of atmospheric radionuclides.

    PubMed

    Irshad, Hammad; Su, Wei-Chung; Cheng, Yung S; Medici, Fausto

    2006-09-01

    Sampling of air for radioactive particles is one of the most important techniques used to detect nuclear debris from a nuclear weapon test in the Earth's atmosphere or particles vented from underground or underwater tests. Massive-flow air samplers are used to sample air for any indication of radionuclides that are a signature of nuclear tests. The International Monitoring System of the Comprehensive Nuclear Test Ban Treaty Organization includes seismic, hydroacoustic, infrasound, and gaseous xenon isotope sampling technologies, in addition to radionuclide sampling, to monitor for any violation of the treaty. Lovelace Respiratory Research Institute has developed a large wind tunnel to test the outdoor radionuclide samplers for the International Monitoring System. The inlets for these samplers are tested for their collection efficiencies for different particle sizes at various wind speeds. This paper describes the results from the testing of two radionuclide sampling units used in the International Monitoring System. The possible areas of depositional wall losses are identified and the losses in these areas are determined. Sampling inlet type 1 was tested at 2.2 m/s wind speed for 5-, 10-, and 20-μm aerodynamic diameter particles. The global collection efficiency was about 87.6% for 10-μm particles for sampling inlet type 1. Sampling inlet type 2 was tested at three wind speeds (0.56, 2.2, and 6.6 m/s) for 5-, 10-, and 20-μm aerodynamic diameter particles in two different configurations (sampling head lowered and raised). The global collection efficiencies for these configurations for 10-μm particles at 2.2 m/s wind speed were 77.4% and 82.5%, respectively. The sampling flow rate was 600 m³/h for both sampling inlets.

  8. High explosive spot test analyses of samples from Operable Unit (OU) 1111

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McRae, D.; Haywood, W.; Powell, J.

    1995-01-01

    A preliminary evaluation has been completed of environmental contaminants at selected sites within the Group DX-10 (formerly Group M-7) area. Soil samples taken from specific locations at this detonator facility were analyzed for harmful metals and screened for explosives. A sanitary outflow, a burn pit, a pentaerythritol tetranitrate (PETN) production outflow field, an active firing chamber, an inactive firing chamber, and a leach field were sampled. Energy dispersive x-ray fluorescence (EDXRF) was used to obtain semi-quantitative concentrations of metals in the soil. Two field spot-test kits for explosives were used to assess the presence of energetic materials in the soil and in items found at the areas tested. PETN is the major explosive in detonators manufactured and destroyed at Los Alamos. No measurable amounts of PETN or other explosives were detected in the soil, but items taken from the burn area and a high-energy explosive (HE)/chemical sump were contaminated. The concentrations of lead, mercury, and uranium are given.

  9. Commutability of food microbiology proficiency testing samples.

    PubMed

    Abdelmassih, M; Polet, M; Goffaux, M-J; Planchon, V; Dierick, K; Mahillon, J

    2014-03-01

    Food microbiology proficiency testing (PT) is a useful tool to assess analytical performance among laboratories. PT items should be close to routine samples to accurately evaluate the acceptability of the methods. However, most PT providers distribute exclusively artificial samples such as reference materials or irradiated foods. This raises the issue of the suitability of these samples, because the equivalence-or 'commutability'-between results obtained on artificial vs. authentic food samples has not been demonstrated. In the clinical field, the use of noncommutable PT samples has led to erroneous evaluation of performance when different analytical methods were used. This study aimed to provide a first assessment of the commutability of samples distributed in food microbiology PT. REQUASUD and IPH organized 13 food microbiology PTs involving 10-28 participants. Three types of PT items were used: genuine food samples, sterile food samples and reference materials. The commutability of the artificial samples (reference materials or sterile samples) was assessed by plotting the distribution of the results on natural and artificial PT samples. This comparison highlighted matrix-correlated issues when nonfood matrices, such as reference materials, were used. Artificially inoculated food samples, on the other hand, raised only isolated commutability issues. In the organization of a PT scheme, authentic or artificially inoculated food samples are necessary to accurately evaluate analytical performance. Reference materials, used as PT items because of their convenience, may present commutability issues leading to inaccurate, penalizing conclusions for methods that would have provided accurate results on food samples. For the first time, the commutability of food microbiology PT samples was investigated. The nature of the samples provided by the organizer turned out to be an important factor, because matrix effects can impact the analytical results.

  10. Attitude Estimation for Large Field-of-View Sensors

    NASA Technical Reports Server (NTRS)

    Cheng, Yang; Crassidis, John L.; Markley, F. Landis

    2005-01-01

    The QUEST measurement noise model for unit vector observations has been widely used in spacecraft attitude estimation for more than twenty years. It was derived under the approximation that the noise lies in the tangent plane of the respective unit vector and is axially symmetrically distributed about the vector. For large field-of-view sensors, however, this approximation may be poor, especially when the measurement falls near the edge of the field of view. In this paper a new measurement noise model is derived based on a realistic noise distribution in the focal-plane of a large field-of-view sensor, which shows significant differences from the QUEST model for unit vector observations far away from the sensor boresight. An extended Kalman filter for attitude estimation is then designed with the new measurement noise model. Simulation results show that with the new measurement model the extended Kalman filter achieves better estimation performance using large field-of-view sensor observations.
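    A minimal sketch of the focal-plane noise idea (our construction, not the paper's exact model): map a detector-plane position to a unit line-of-sight vector and push isotropic focal-plane noise through the Jacobian of that mapping. Off boresight, the resulting 3×3 covariance is visibly non-axisymmetric, which is what the QUEST approximation misses:

```python
import numpy as np

def unit_vector_and_cov(x, y, f, sigma_fp):
    """Line-of-sight unit vector for a star imaged at focal-plane
    position (x, y) with focal length f, plus the 3x3 covariance from
    propagating isotropic focal-plane noise (variance sigma_fp^2 in
    x and y) through the Jacobian of the pinhole mapping."""
    r = np.sqrt(x * x + y * y + f * f)
    u = np.array([x, y, f]) / r
    J = np.empty((3, 2))                                   # du/d(x, y)
    J[:, 0] = (np.array([1.0, 0.0, 0.0]) - u * u[0]) / r
    J[:, 1] = (np.array([0.0, 1.0, 0.0]) - u * u[1]) / r
    cov = sigma_fp**2 * (J @ J.T)
    return u, cov

u0, c0 = unit_vector_and_cov(0.0, 0.0, 1.0, 1e-3)  # boresight: axisymmetric
u1, c1 = unit_vector_and_cov(0.5, 0.0, 1.0, 1e-3)  # ~27 deg off-axis: not
print(np.isclose(c0[0, 0], c0[1, 1]), np.isclose(c1[0, 0], c1[1, 1]))  # True False
```

An extended Kalman filter then uses this position-dependent covariance in place of the constant tangent-plane covariance of the QUEST model.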

  11. MEGARA: large pupil element tests and performance

    NASA Astrophysics Data System (ADS)

    Martínez-Delgado, I.; Sánchez-Blanco, E.; Pérez-Calpena, A.; García-Vargas, M. L.; Maldonado, X. M.; Gil de Paz, A.; Carrasco, E.; Gallego, J.; Iglesias-Páramo, J.; Sánchez-Moreno, F. M.

    2016-07-01

    MEGARA is a third generation spectrograph for the Spanish 10.4m telescope (GTC) providing two observing modes: a large central Integral Field Unit (IFU), called the Large Compact Bundle (LCB), covering a FOV of 12.5 × 11.3 arcsec2, and a Multi-Object Spectrograph (MOS) with a FOV of 3.5 × 3.5 arcmin2. MEGARA will observe the whole visible range from 3650A to 10000A allowing different spectral resolutions (low, medium and high) with R = 6000, 11000 and 18000 respectively. The dispersive elements are placed at the spectrograph pupil position in the path of the collimated beam and they are composed of a set of volume phase hologram gratings (VPHs) sandwiched between two flat windows and coupled in addition to two prisms in the case of the medium- and high-resolution units. We will describe the tests and setups developed to check the requirements of all units, as well as the obtained performance at laboratory

  12. The four-meter confrontation visual field test.

    PubMed Central

    Kodsi, S R; Younge, B R

    1992-01-01

    The 4-m confrontation visual field test has been successfully used at the Mayo Clinic for many years in addition to the standard 0.5-m confrontation visual field test. The 4-m confrontation visual field test is a test of macular function and can identify small central or paracentral scotomas that the examiner may not find when the patient is tested only at 0.5 m. Also, macular sparing in homonymous hemianopias and quadrantanopias may be identified with the 4-m confrontation visual field test. We recommend use of this confrontation visual field test, in addition to the standard 0.5-m confrontation visual field test, on appropriately selected patients to obtain the most information possible by confrontation visual field tests. PMID:1494829

  14. Sampling Error in Relation to Cyst Nematode Population Density Estimation in Small Field Plots.

    PubMed

    Župunski, Vesna; Jevtić, Radivoje; Jokić, Vesna Spasić; Župunski, Ljubica; Lalošević, Mirjana; Ćirić, Mihajlo; Ćurčić, Živko

    2017-06-01

    Cyst nematodes are serious plant-parasitic pests which can cause severe yield losses and extensive damage. Since there is still very little information about the error of population density estimation in small field plots, this study contributes to the broad issue of population density assessment. It was shown that there was no significant difference between cyst counts of five or seven bulk samples taken per 1-m² plot if the average cyst count per examined plot exceeds 75 cysts per 100 g of soil. Goodness of fit of the data to probability distributions, tested with the χ² test, confirmed a negative binomial distribution of cyst counts for 21 out of 23 plots. The recommended measure of sampling precision of 17%, expressed through the coefficient of variation (cv), was achieved if plots of 1 m² contaminated with more than 90 cysts per 100 g of soil were sampled with 10-core bulk samples taken in five repetitions. If plots were contaminated with fewer than 75 cysts per 100 g of soil, 10-core bulk samples taken in seven repetitions gave a cv higher than 23%. This study indicates that more attention should be paid to the estimation of sampling error in experimental field plots to ensure more reliable estimation of the population density of cyst nematodes.
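    The precision criterion above is just the coefficient of variation across repeated bulk samples. A minimal sketch with hypothetical counts (the numbers are ours, for illustration only, not study data):

```python
import statistics

# Hypothetical cyst counts per 100 g of soil from five 10-core bulk
# samples taken on one 1-m2 plot (illustrative values only).
counts = [88, 102, 95, 110, 79]

mean = statistics.mean(counts)
cv = 100 * statistics.stdev(counts) / mean  # sampling precision, %
print(round(cv, 1))  # 12.7 -- within the recommended 17% for a well-infested plot
```

On sparsely infested plots the counts scatter relatively more (consistent with a negative binomial distribution), which is why the cv degrades below ~75 cysts per 100 g.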

  15. Reliability of air displacement plethysmography in a large, heterogeneous sample.

    PubMed

    Noreen, Eric E; Lemon, Peter W R

    2006-08-01

    Several studies have assessed the validity of air displacement plethysmography (ADP), but few have assessed the reliability of ADP using a large, heterogeneous sample. This study was conducted to determine the reliability of ADP using the Bod Pod in a large, heterogeneous sample. A total of 980 healthy men and women (30 ± 15 yr, mean ± SD) completed two body composition assessments separated by 15-30 min. All testing was done in accordance with the manufacturer's instructions. A significant correlation (r = 0.992, P = 0.001) was found between body density (BD) 1 (1.046 ± 0.001 kg·L⁻¹; mean ± SEM) and BD 2 (1.046 ± 0.001 kg·L⁻¹). A paired t-test revealed no significant difference between BD 1 and 2 (P = 0.935). The coefficient of variation (CV) for BD was 0.15%. A significant intraclass correlation coefficient (ICC) was found for BD (ICC = 0.996, P = 0.001), and the standard error of measurement (SEM) was 0.001 kg·L⁻¹. Body mass (BM) 1 and 2 were correlated significantly (r = 0.999, P = 0.001); however, a significant (P = 0.001) decrease was seen from BM 1 (75.510 ± 0.461 kg) to BM 2 (75.497 ± 0.461 kg). Body volume (BV) tended to decrease (P = 0.08) from BV 1 (69.900 ± 0.449 L) to BV 2 (69.884 ± 0.449 L). ADP using the Bod Pod appears to assess BD reliably; however, the observed CV suggests that multiple trials are necessary to detect small treatment effects.
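    The reliability statistics reported above (CV, ICC, SEM) can be illustrated on synthetic test-retest data. This sketch uses made-up noise levels chosen only to be on the scale of the paper's figures, not the actual data:

```python
import numpy as np

# Synthetic test-retest body densities: between-subject spread plus small
# trial-to-trial measurement noise (all numbers are illustrative).
rng = np.random.default_rng(0)
true_bd = rng.normal(1.046, 0.020, 980)      # "true" densities, kg/L
bd1 = true_bd + rng.normal(0, 0.0016, 980)   # trial 1
bd2 = true_bd + rng.normal(0, 0.0016, 980)   # trial 2

diff = bd1 - bd2
sem = diff.std(ddof=1) / np.sqrt(2)                  # standard error of measurement
cv = 100 * sem / np.concatenate([bd1, bd2]).mean()   # within-subject CV, %
icc = np.corrcoef(bd1, bd2)[0, 1]                    # test-retest reliability
print(round(cv, 2), round(icc, 3))
```

Note that a high ICC depends on the between-subject spread being much larger than the trial-to-trial noise, which is why a heterogeneous sample is a demanding but informative setting for reliability.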

  16. Detecting Superior Face Recognition Skills in a Large Sample of Young British Adults

    PubMed Central

    Bobak, Anna K.; Pampoulov, Philip; Bate, Sarah

    2016-01-01

    The Cambridge Face Memory Test Long Form (CFMT+) and Cambridge Face Perception Test (CFPT) are typically used to assess the face processing ability of individuals who believe they have superior face recognition skills. Previous large-scale studies have presented norms for the CFPT but not the CFMT+. However, previous research has also highlighted the necessity for establishing country-specific norms for these tests, indicating that norming data are required for both tests using young British adults. The current study addressed this issue in 254 British participants. In addition to providing the first norm for performance on the CFMT+ in any large sample, we also report the first UK-specific cut-off for superior face recognition on the CFPT. Further analyses identified a small advantage for females on both tests, and only small associations between objective face recognition skills and self-report measures. A secondary aim of the study was to examine the relationship between trait or social anxiety and face processing ability, and no associations were noted. The implications of these findings for the classification of super-recognizers are discussed. PMID:27713706

  17. Salmonella testing of pooled pre-enrichment broth cultures for screening multiple food samples.

    PubMed

    Price, W R; Olsen, R A; Hunter, J E

    1972-04-01

    A method has been described for testing multiple food samples for Salmonella without loss in sensitivity. The method pools multiple pre-enrichment broth cultures into single enrichment broths. The subsequent stages of the Salmonella analysis are not altered. The method was found applicable to several dry food materials including nonfat dry milk, dried egg albumin, cocoa, cottonseed flour, wheat flour, and shredded coconut. As many as 25 pre-enrichment broth cultures were pooled without apparent loss in the sensitivity of Salmonella detection as compared to individual sample analysis. The procedure offers a simple, yet effective, way to increase sample capacity in the Salmonella testing of foods, particularly where a large proportion of samples ordinarily is negative. It also permits small portions of pre-enrichment broth cultures to be retained for subsequent individual analysis if positive tests are found. Salmonella testing of pooled pre-enrichment broths provides increased consumer protection for a given amount of analytical effort as compared to individual sample analysis.
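    The economics of pooling can be sketched as a two-stage testing scheme: assay each pooled pre-enrichment broth once, then retest every member of a positive pool individually. The sample data below are hypothetical, and this is an illustration of the counting logic rather than the published protocol:

```python
def pooled_assays(samples, pool_size):
    """Total assays under two-stage pooling: one test per pooled broth,
    plus individual retests of every member of a positive pool."""
    total = 0
    for i in range(0, len(samples), pool_size):
        pool = samples[i:i + pool_size]
        total += 1            # one test on the pooled enrichment broth
        if any(pool):         # Salmonella detected -> retest individually
            total += len(pool)
    return total

# 100 samples with 2 positives, pooled 25 at a time (the study's largest pool)
samples = [False] * 100
samples[10] = samples[60] = True
```

    With mostly negative samples, pooling cuts the assay count sharply; here 54 assays replace 100 individual tests, and an all-negative batch needs only 4.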

  18. 30 CFR 14.5 - Test samples.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Test samples. 14.5 Section 14.5 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR TESTING, EVALUATION, AND APPROVAL OF... Test samples. Upon request by MSHA, the applicant must submit 3 precut, unrolled, flat conveyor belt...

  19. 30 CFR 14.5 - Test samples.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Test samples. 14.5 Section 14.5 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR TESTING, EVALUATION, AND APPROVAL OF... Test samples. Upon request by MSHA, the applicant must submit 3 precut, unrolled, flat conveyor belt...

  20. 30 CFR 14.5 - Test samples.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Test samples. 14.5 Section 14.5 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR TESTING, EVALUATION, AND APPROVAL OF... Test samples. Upon request by MSHA, the applicant must submit 3 precut, unrolled, flat conveyor belt...

  1. Benthic macroinvertebrate field sampling effort required to ...

    EPA Pesticide Factsheets

    This multi-year pilot study evaluated a proposed field method for its effectiveness in the collection of a benthic macroinvertebrate sample adequate for use in the condition assessment of streams and rivers in the Neuquén Province, Argentina. A total of 13 sites, distributed across three rivers, were sampled. At each site, benthic macroinvertebrates were collected at 11 transects. Each sample was processed independently in the field and laboratory. Based on a literature review and resource considerations, the collection of 300 organisms (minimum) at each site was determined to be necessary to support a robust condition assessment, and therefore, selected as the criterion for judging the adequacy of the method. This targeted number of organisms was collected at all sites, at a minimum, when collections from all 11 transects were combined. Subsequent bootstrapping analysis of the data was used to estimate whether collecting at fewer transects would reach the minimum target number of organisms for all sites. In a subset of sites, the total number of organisms frequently fell below the target when fewer than 11 transect collections were combined. Site conditions where <300 organisms might be collected are discussed. These preliminary results suggest that the proposed field method results in a sample that is adequate for robust condition assessment of the rivers and streams of interest. When data become available from a broader range of sites, the adequacy of the field
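    The bootstrapping step described in this record can be sketched as resampling transect collections with replacement and checking how often a reduced design still reaches the 300-organism target. The per-transect counts and function name here are hypothetical:

```python
import random

def prob_hits_target(transect_counts, k, target=300, reps=5000, seed=1):
    """Bootstrap: resample k transect collections (with replacement) and
    estimate the probability their combined count reaches the target."""
    rng = random.Random(seed)
    hits = sum(
        sum(rng.choice(transect_counts) for _ in range(k)) >= target
        for _ in range(reps)
    )
    return hits / reps

# Hypothetical per-transect organism counts at one site (11 transects)
site = [42, 35, 51, 28, 40, 33, 47, 38, 44, 30, 36]
reduced_design = prob_hits_target(site, k=8)
```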

  2. Large-Scale Spacecraft Fire Safety Tests

    NASA Technical Reports Server (NTRS)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests.

  3. Sampling large random knots in a confined space

    NASA Astrophysics Data System (ADS)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (such as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is on the order of O(n²). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  4. A Rigorous Test of the Fit of the Circumplex Model to Big Five Personality Data: Theoretical and Methodological Issues and Two Large Sample Empirical Tests.

    PubMed

    DeGeest, David Scott; Schmidt, Frank

    2015-01-01

    Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.

  5. Importance sampling large deviations in nonequilibrium steady states. I.

    PubMed

    Ray, Ushnish; Chan, Garnet Kin-Lic; Limmer, David T

    2018-03-28

    Large deviation functions contain information on the stability and response of systems driven into nonequilibrium steady states and in such a way are similar to free energies for systems at equilibrium. As with equilibrium free energies, evaluating large deviation functions numerically for all but the simplest systems is difficult because by construction they depend on exponentially rare events. In this first paper of a series, we evaluate different trajectory-based sampling methods capable of computing large deviation functions of time integrated observables within nonequilibrium steady states. We illustrate some convergence criteria and best practices using a number of different models, including a biased Brownian walker, a driven lattice gas, and a model of self-assembly. We show how two popular methods for sampling trajectory ensembles, transition path sampling and diffusion Monte Carlo, suffer from exponentially diverging correlations in trajectory space as a function of the bias parameter when estimating large deviation functions. Improving the efficiencies of these algorithms requires introducing guiding functions for the trajectories.
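    As a toy illustration of why importance sampling helps with exponentially rare events (a far simpler setting than this paper's trajectory ensembles), consider estimating a binomial tail probability by sampling from an exponentially tilted distribution and reweighting each draw by its likelihood ratio. All names and values here are illustrative:

```python
import random
from math import comb

def tail_exact(n, p, a):
    """P(heads >= a) for n tosses of a p-coin, by direct summation."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(a, n + 1))

def tail_importance(n, p, a, q, reps=20000, seed=7):
    """Estimate the same rare-event probability by sampling a tilted coin
    with bias q > p and reweighting by the likelihood ratio."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(reps):
        heads = sum(rng.random() < q for _ in range(n))
        if heads >= a:
            # likelihood ratio of the original measure to the tilted one
            acc += (p / q) ** heads * ((1 - p) / (1 - q)) ** (n - heads)
    return acc / reps
```

    Tilting the sampling distribution toward the rare event plays the same role as the guiding functions discussed in the abstract: it concentrates samples where the estimator needs them, at the cost of tracking weights.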

  6. Importance sampling large deviations in nonequilibrium steady states. I

    NASA Astrophysics Data System (ADS)

    Ray, Ushnish; Chan, Garnet Kin-Lic; Limmer, David T.

    2018-03-01

    Large deviation functions contain information on the stability and response of systems driven into nonequilibrium steady states and in such a way are similar to free energies for systems at equilibrium. As with equilibrium free energies, evaluating large deviation functions numerically for all but the simplest systems is difficult because by construction they depend on exponentially rare events. In this first paper of a series, we evaluate different trajectory-based sampling methods capable of computing large deviation functions of time integrated observables within nonequilibrium steady states. We illustrate some convergence criteria and best practices using a number of different models, including a biased Brownian walker, a driven lattice gas, and a model of self-assembly. We show how two popular methods for sampling trajectory ensembles, transition path sampling and diffusion Monte Carlo, suffer from exponentially diverging correlations in trajectory space as a function of the bias parameter when estimating large deviation functions. Improving the efficiencies of these algorithms requires introducing guiding functions for the trajectories.

  7. Large Payload Transportation and Test Considerations

    NASA Technical Reports Server (NTRS)

    Rucker, Michelle A.; Pope, James C.

    2011-01-01

    Ironically, the limiting factor to a national heavy lift strategy may not be the rocket technology needed to throw a heavy payload, but rather the terrestrial infrastructure - roads, bridges, airframes, and buildings - necessary to transport, acceptance test, and process large spacecraft. Failure to carefully consider how large spacecraft are designed, and where they are manufactured, tested, or launched, could result in unforeseen cost to modify/develop infrastructure, or incur additional risk due to increased handling or elimination of key verifications. During test and verification planning for the Altair project, a number of transportation and test issues related to the large payload diameter were identified. Although the entire Constellation Program - including Altair - was canceled in the 2011 NASA budget, issues identified by the Altair project serve as important lessons learned for future payloads that may be developed to support national "heavy lift" strategies. A feasibility study performed by the Constellation Ground Operations (CxGO) project found that neither the Altair Ascent nor Descent Stage would fit inside available transportation aircraft. Ground transportation of a payload this large over extended distances is generally not permitted by most states, so overland transportation alone would not have been an option. Limited ground transportation to the nearest waterway may be permitted, but water transportation could take as long as 66 days per production unit, depending on point of origin and acceptance test facility; transportation from the western United States would require transit through the Panama Canal to access the Kennedy Space Center launch site. Large payloads also pose acceptance test and ground processing challenges. 
Although propulsion, mechanical vibration, and reverberant acoustic test facilities at NASA s Plum Brook Station have been designed to accommodate large spacecraft, special handling and test work-arounds may be necessary

  8. Test-particle simulations of SEP propagation in IMF with large-scale fluctuations

    NASA Astrophysics Data System (ADS)

    Kelly, J.; Dalla, S.; Laitinen, T.

    2012-11-01

    The results of full-orbit test-particle simulations of SEPs propagating through an IMF which exhibits large-scale fluctuations are presented. A variety of propagation conditions are simulated - scatter-free, and scattering with mean free path, λ, of 0.3 and 2.0 AU - and the cross-field transport of SEPs is investigated. When calculating cross-field displacements the Parker spiral geometry is accounted for and the role of magnetic field expansion is taken into account. It is found that transport across the magnetic field is enhanced in the λ = 0.3 AU and λ = 2 AU cases, compared to the scatter-free case, with the λ = 2 AU case in particular containing outlying particles that had strayed a large distance across the IMF. Outliers are categorized by means of Chauvenet's criterion and it is found that typically between 1 and 2% of the population falls within this category. The ratio of latitudinal to longitudinal diffusion coefficient perpendicular to the magnetic field is typically 0.2, suggesting that transport in latitude is less efficient.
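    Chauvenet's criterion, used in this record to categorize outlying particles, rejects a point if the expected number of equally extreme points in the sample is below one half. A minimal sketch assuming normally distributed data (the threshold and normal fit are the standard textbook form, not these authors' exact implementation):

```python
from math import erf, sqrt
from statistics import mean, stdev

def chauvenet_outliers(data):
    """Flag values whose expected frequency under a normal fit is below 0.5
    (Chauvenet's criterion)."""
    n = len(data)
    m, s = mean(data), stdev(data)
    flagged = []
    for x in data:
        z = abs(x - m) / s
        p_two_sided = 1.0 - erf(z / sqrt(2.0))  # P(|Z| >= z)
        if n * p_two_sided < 0.5:
            flagged.append(x)
    return flagged
```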

  9. Field test of a motorcycle safety education course for novice riders

    DOT National Transportation Integrated Search

    1982-07-01

    The purpose of this study was to subject the Motorcycle Safety Foundation's Motorcycle Rider Course (MRC) to a large-scale field test designed to evaluate the following aspects of the course: (1) Instructional Effectiveness, (2) User Acceptance, and ...

  10. An evaluation of soil sampling for 137Cs using various field-sampling volumes.

    PubMed

    Nyhan, J W; White, G C; Schofield, T G; Trujillo, G

    1983-05-01

    The sediments from a liquid effluent receiving area at the Los Alamos National Laboratory and soils from an intensive study area in the fallout pathway of Trinity were sampled for 137Cs using 25-, 500-, 2500- and 12,500-cm3 field sampling volumes. A highly replicated sampling program was used to determine mean concentrations and inventories of 137Cs at each site, as well as estimates of spatial, aliquoting, and counting variance components of the radionuclide data. The sampling methods were also analyzed as a function of soil size fractions collected in each field sampling volume and of the total cost of the program for a given variation in the radionuclide survey results. Coefficients of variation (CV) of 137Cs inventory estimates ranged from 0.063 to 0.14 for Mortandad Canyon sediments, whereas CV values for Trinity soils were observed from 0.38 to 0.57. Spatial variance components of 137Cs concentration data were usually found to be larger than either the aliquoting or counting variance estimates and were inversely related to field sampling volume at the Trinity intensive site. Subsequent optimization studies of the sampling schemes demonstrated that each aliquot should be counted once, and that only 2-4 aliquots out of as many as 30 collected need be assayed for 137Cs. The optimization studies showed that as sample costs increased to 45 man-hours of labor per sample, the variance of the mean 137Cs concentration decreased dramatically, but decreased very little with additional labor.
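    The optimization described in this record trades off variance components against assay effort. Under a standard nested (hierarchical) design, the variance of the mean concentration decomposes as sketched below; the function and example values are illustrative, not the study's data:

```python
def var_of_mean(spatial, aliquot, counting, n_field, n_aliquots, n_counts=1):
    """Variance of the mean under a nested design: field samples,
    aliquots per field sample, counts per aliquot."""
    return (spatial / n_field
            + aliquot / (n_field * n_aliquots)
            + counting / (n_field * n_aliquots * n_counts))

# Illustrative variance components: spatial dominates, as the study found
v = var_of_mean(spatial=9.0, aliquot=4.0, counting=1.0,
                n_field=3, n_aliquots=2)
```

    Because the spatial term shrinks only with more field samples, adding aliquots or repeat counts quickly hits diminishing returns, which is why counting each aliquot once and assaying only a few aliquots per sample was near-optimal.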

  11. Non-mean-field theory of anomalously large double layer capacitance

    NASA Astrophysics Data System (ADS)

    Loth, M. S.; Skinner, Brian; Shklovskii, B. I.

    2010-07-01

    Mean-field theories claim that the capacitance of the double layer formed at a metal/ionic conductor interface cannot be larger than that of the Helmholtz capacitor, whose width is equal to the radius of an ion. However, in some experiments the apparent width of the double layer capacitor is substantially smaller. We propose an alternate non-mean-field theory of the ionic double layer to explain such large capacitance values. Our theory allows for the binding of discrete ions to their image charges in the metal, which results in the formation of interface dipoles. We focus primarily on the case where only small cations are mobile and other ions form an oppositely charged background. In this case, at small temperature and zero applied voltage dipoles form a correlated liquid on both contacts. We show that at small voltages the capacitance of the double layer is determined by the transfer of dipoles from one electrode to the other and is therefore limited only by the weak dipole-dipole repulsion between bound ions so that the capacitance is very large. At large voltages the depletion of bound ions from one of the capacitor electrodes triggers a collapse of the capacitance to the much smaller mean-field value, as seen in experimental data. We test our analytical predictions with a Monte Carlo simulation and find good agreement. We further argue that our “one-component plasma” model should work well for strongly asymmetric ion liquids. We believe that this work also suggests an improved theory of pseudocapacitance.

  12. Sampling design for groundwater solute transport: Tests of methods and analysis of Cape Cod tracer test data

    USGS Publications Warehouse

    Knopman, Debra S.; Voss, Clifford I.; Garabedian, Stephen P.

    1991-01-01

    Tests of a one-dimensional sampling design methodology on measurements of bromide concentration collected during the natural gradient tracer test conducted by the U.S. Geological Survey on Cape Cod, Massachusetts, demonstrate its efficacy for field studies of solute transport in groundwater and the utility of one-dimensional analysis. The methodology was applied to design of sparse two-dimensional networks of fully screened wells typical of those often used in engineering practice. In one-dimensional analysis, designs consist of the downstream distances to rows of wells oriented perpendicular to the groundwater flow direction and the timing of sampling to be carried out on each row. The power of a sampling design is measured by its effectiveness in simultaneously meeting objectives of model discrimination, parameter estimation, and cost minimization. One-dimensional models of solute transport, differing in processes affecting the solute and assumptions about the structure of the flow field, were considered for description of tracer cloud migration. When fitting each model using nonlinear regression, additive and multiplicative error forms were allowed for the residuals which consist of both random and model errors. The one-dimensional single-layer model of a nonreactive solute with multiplicative error was judged to be the best of those tested. Results show the efficacy of the methodology in designing sparse but powerful sampling networks. Designs that sample five rows of wells at five or fewer times in any given row performed as well for model discrimination as the full set of samples taken up to eight times in a given row from as many as 89 rows. Also, designs for parameter estimation judged to be good by the methodology were as effective in reducing the variance of parameter estimates as arbitrary designs with many more samples. 
Results further showed that estimates of velocity and longitudinal dispersivity in one-dimensional models based on data from only five

  13. Sample Holder for Cryogenic Adhesive Shear Test

    NASA Technical Reports Server (NTRS)

    Ledbetter, F. E.; Clemons, J. M.; White, W. T.; Penn, B.; Semmel, M. L.

    1983-01-01

    Five samples tested in one cooldown. Holder mounted in testing machine. Submerged in cryogenic liquid held in cryostat. Movable crosshead of testing machine moves gradually downward. Samples placed under tension, one after another, starting with top one; each sample fails in turn before next is stressed.

  14. Measurement and modeling of polarized specular neutron reflectivity in large magnetic fields.

    PubMed

    Maranville, Brian B; Kirby, Brian J; Grutter, Alexander J; Kienzle, Paul A; Majkrzak, Charles F; Liu, Yaohua; Dennis, Cindi L

    2016-08-01

    The presence of a large applied magnetic field removes the degeneracy of the vacuum energy states for spin-up and spin-down neutrons. For polarized neutron reflectometry, this must be included in the reference potential energy of the Schrödinger equation that is used to calculate the expected scattering from a magnetic layered structure. For samples with magnetization that is purely parallel or antiparallel to the applied field which defines the quantization axis, there is no mixing of the spin states (no spin-flip scattering) and so this additional potential is constant throughout the scattering region. When there is non-collinear magnetization in the sample, however, there will be significant scattering from one spin state into the other, and the reference potentials will differ between the incoming and outgoing wavefunctions, changing the angle and intensities of the scattering. The theory of the scattering and recommended experimental practices for this type of measurement are presented, as well as an example measurement.

  15. Measurement and modeling of polarized specular neutron reflectivity in large magnetic fields

    PubMed Central

    Maranville, Brian B.; Kirby, Brian J.; Grutter, Alexander J.; Kienzle, Paul A.; Majkrzak, Charles F.; Liu, Yaohua; Dennis, Cindi L.

    2016-01-01

    The presence of a large applied magnetic field removes the degeneracy of the vacuum energy states for spin-up and spin-down neutrons. For polarized neutron reflectometry, this must be included in the reference potential energy of the Schrödinger equation that is used to calculate the expected scattering from a magnetic layered structure. For samples with magnetization that is purely parallel or antiparallel to the applied field which defines the quantization axis, there is no mixing of the spin states (no spin-flip scattering) and so this additional potential is constant throughout the scattering region. When there is non-collinear magnetization in the sample, however, there will be significant scattering from one spin state into the other, and the reference potentials will differ between the incoming and outgoing wavefunctions, changing the angle and intensities of the scattering. The theory of the scattering and recommended experimental practices for this type of measurement are presented, as well as an example measurement. PMID:27504074

  16. NOAO testing procedures for large optics

    NASA Astrophysics Data System (ADS)

    Stepp, Larry M.; Poczulp, Gary A.; Pearson, Earl T.; Roddier, Nicolas A.

    1992-03-01

    This paper describes optical testing procedures used at the National Optical Astronomy Observatories (NOAO) for testing large optics. It begins with a discussion of the philosophy behind the testing approach and then describes a number of different testing methods used at NOAO, including the wire test, full-aperture and sub-aperture Hartmann testing, and scatterplate interferometry. Specific innovations that enhance the testing capabilities are mentioned. NOAO data reduction software is described. Examples are given of specific output formats that are useful to the optician, using illustrations taken from recent testing of a 3.5- meter, f/1.75 borosilicate honeycomb mirror. Finally, we discuss some of the optical testing challenges posed by the large optics for the Gemini 8-meter Telescopes Project.

  17. Lack of association between digit ratio (2D:4D) and assertiveness: replication in a large sample.

    PubMed

    Voracek, Martin

    2009-12-01

    Findings regarding within-sex associations of digit ratio (2D:4D), a putative pointer to long-lasting effects of prenatal androgen action, and sexually differentiated personality traits have generally been inconsistent or unreplicable, suggesting that effects in this domain, if any, are likely small. In contrast to evidence from Wilson's important 1983 study, a forerunner of modern 2D:4D research, two recent studies in 2005 and 2008 by Freeman, et al. and Hampson, et al. showed assertiveness, a presumably male-typed personality trait, was not associated with 2D:4D; however, these studies were clearly statistically underpowered. Hence this study examined this question anew, based on a large sample of 491 men and 627 women. Assertiveness was only modestly sexually differentiated, favoring men, and a positive correlate of age and education and a negative correlate of weight and Body Mass Index among women, but not men. Replicating the two prior studies, 2D:4D was throughout unrelated to assertiveness scores. This null finding was preserved with controls for correlates of assertiveness, also in nonparametric analysis and with tests for curvilinear relations. Discussed are implications of this specific null finding, now replicated in a large sample, for studies of 2D:4D and personality in general and novel research approaches to proceed in this field.

  18. Preservation of RNA and DNA from mammal samples under field conditions.

    PubMed

    Camacho-Sanchez, Miguel; Burraco, Pablo; Gomez-Mestre, Ivan; Leonard, Jennifer A

    2013-07-01

    Ecological and conservation genetics require sampling of organisms in the wild. Appropriate preservation of the collected samples, usually by cryostorage, is key to the quality of the genetic data obtained. Nevertheless, cryopreservation in the field to ensure RNA and DNA stability is not always possible. We compared several nucleic acid preservation solutions appropriate for field sampling and tested them on rat (Rattus rattus) blood, ear and tail tip, liver, brain and muscle. We compared the efficacy of a nucleic acid preservation (NAP) buffer for DNA preservation against 95% ethanol and Longmire buffer, and for RNA preservation against RNAlater (Qiagen) and Longmire buffer, under simulated field conditions. For DNA, the NAP buffer was slightly better than cryopreservation or 95% ethanol, but high molecular weight DNA was preserved in all conditions. The NAP buffer preserved RNA as well as RNAlater. Liver yielded the best RNA and DNA quantity and quality; thus, liver should be the tissue preferentially collected from euthanized animals. We also show that DNA persists in nonpreserved muscle tissue for at least 1 week at ambient temperature, although degradation is noticeable in a matter of hours. When cryopreservation is not possible, the NAP buffer is an economical alternative for RNA preservation at ambient temperature for at least 2 months and DNA preservation for at least 10 months. © 2013 John Wiley & Sons Ltd.

  19. The development of large-aperture test system of infrared camera and visible CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

    Dual-band imaging systems combining an infrared camera and a visible CCD camera are widely used in many kinds of equipment and applications. If such a system is tested with a traditional infrared camera test system and a separate visible CCD test system, installation and alignment must be performed twice. The large-aperture test system for infrared and visible CCD cameras instead shares a common large-aperture reflective collimator, target wheel, frame grabber, and computer, which reduces both the cost and the time spent on installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift in the collimator's focal position as the environmental temperature changes, improving both the image quality of the wide-field collimator and the test accuracy. Its performance matches that of comparable foreign systems at a much lower cost, and it is expected to do well in the market.
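    Multiple-frame averaging suppresses uncorrelated random noise by roughly 1/sqrt(N) for N frames. A minimal sketch of the pixel-wise mean (the frame layout is an assumption for illustration):

```python
def average_frames(frames):
    """Pixel-wise mean of N equally sized frames; uncorrelated noise
    shrinks approximately as 1/sqrt(N)."""
    n = len(frames)
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[sum(f[r][c] for f in frames) / n for c in range(cols)]
            for r in range(rows)]

# Two tiny 2x2 frames: noise in the top row cancels in the average
averaged = average_frames([[[1, 2], [3, 4]], [[3, 2], [1, 4]]])
```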

  20. Synthesizing Information From Language Samples and Standardized Tests in School-Age Bilingual Assessment

    PubMed Central

    Pham, Giang

    2017-01-01

    Purpose Although language samples and standardized tests are regularly used in assessment, few studies provide clinical guidance on how to synthesize information from these testing tools. This study extends previous work on the relations between tests and language samples to a new population—school-age bilingual speakers with primary language impairment—and considers the clinical implications for bilingual assessment. Method Fifty-one bilingual children with primary language impairment completed narrative language samples and standardized language tests in English and Spanish. Children were separated into younger (ages 5;6 [years;months]–8;11) and older (ages 9;0–11;2) groups. Analysis included correlations with age and partial correlations between language sample measures and test scores in each language. Results Within the younger group, positive correlations with large effect sizes indicated convergence between test scores and microstructural language sample measures in both Spanish and English. There were minimal correlations in the older group for either language. Age related to English but not Spanish measures. Conclusions Tests and language samples complement each other in assessment. Wordless picture-book narratives may be more appropriate for ages 5–8 than for older children. We discuss clinical implications, including a case example of a bilingual child with primary language impairment, to illustrate how to synthesize information from these tools in assessment. PMID:28055056
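    Partial correlations of the kind reported here (relating test scores to language sample measures while holding a third variable such as age constant) can be computed from pairwise Pearson correlations with the standard first-order formula. A minimal sketch with illustrative data:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

def partial_corr(x, y, z):
    """Correlation of x and y with the control variable z partialled out:
    r_xy.z = (r_xy - r_xz * r_yz) / sqrt((1 - r_xz^2) * (1 - r_yz^2))."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / sqrt((1 - rxz ** 2) * (1 - ryz ** 2))
```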

  1. Field Comparison of the Sampling Efficacy of Two Smear Media: Cotton Fiber and Kraft Paper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogue, M.G.

    Two materials were compared in field tests at the Defense Waste Processing Facility: kraft paper (a strong, brown paper made from wood pulp prepared with a sodium sulfate solution) and cotton fiber. Based on a sample of forty-six pairs of smears, the cotton fiber smears provided greater sensitivity, collecting an average of forty-four percent more beta activity and twenty-nine percent more alpha activity than the kraft paper smears. Results show a greater sensitivity for cotton fiber over kraft paper at the 95 percent confidence level. Regulatory requirements for smear materials are vague. The data demonstrate that the difference in sensitivity of smear materials could lead to a large difference in reported results that are subsequently used for meeting shipping regulations or evaluating workplace contamination levels.
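    The paired design above (each location smeared with both media) lends itself to a paired analysis. A minimal sketch with hypothetical count data, not the study's actual measurements:

    ```python
    import math
    import statistics

    def paired_t(a, b):
        """One-sample t statistic on the paired differences a_i - b_i."""
        diffs = [x - y for x, y in zip(a, b)]
        n = len(diffs)
        return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

    # Hypothetical smear activities (cpm): cotton vs. kraft on the same surfaces.
    cotton = [120.0, 95.0, 140.0, 80.0, 110.0, 130.0]
    kraft  = [ 85.0, 70.0, 100.0, 60.0,  75.0,  90.0]

    t = paired_t(cotton, kraft)
    mean_excess = statistics.mean([(c - k) / k for c, k in zip(cotton, kraft)])
    print(t, mean_excess)
    ```

    Pairing each cotton smear with a kraft smear from the same spot removes the large location-to-location variation in contamination, which is what lets a difference in media sensitivity reach significance with only a few dozen pairs.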

  2. Large Scale Flame Spread Environmental Characterization Testing

    NASA Technical Reports Server (NTRS)

    Clayman, Lauren K.; Olson, Sandra L.; Gokoglu, Suleyman A.; Brooker, John E.; Ferkul, Paul V.; Kacher, Henry F.

    2013-01-01

    Under the Advanced Exploration Systems (AES) Spacecraft Fire Safety Demonstration Project (SFSDP), as a risk-mitigation activity in support of the development of a large-scale fire demonstration experiment in microgravity, flame-spread tests were conducted in normal gravity on thin, cellulose-based fuels in a sealed chamber. The primary objective of the tests was to measure the pressure rise in the chamber as the sample material, burning direction (upward/downward), total heat release, heat release rate, and heat loss mechanisms were varied between tests. A Design of Experiments (DOE) method was used to produce an array of tests from a fixed set of constraints, and a coupled response model was developed. Supplementary tests were run outside the experimental design to vary additional parameters such as initial chamber pressure. The starting chamber pressure for each test was set below atmospheric to prevent chamber overpressure. Bottom ignition, producing upward-propagating burns, resulted in rapidly accelerating, turbulent flame spread. The pressure rise in the chamber increases with the amount of fuel burned, mainly because of the larger heat generation and, to a much smaller extent, because of the increase in the number of moles of gas. Top ignition, producing downward-propagating burns, gave steady flame spread with a very small flat flame across the burning edge. Steady-state pressure is achieved during downward flame spread as the pressure rises and plateaus, indicating that heat generation by the flame matches heat loss to the surroundings during the longer, slower downward burns. One heat-loss mechanism involved mounting a heat exchanger directly above the burning sample, in the path of the plume, to act as a heat sink and dissipate the heat of combustion more efficiently. This proved effective in mitigating chamber overpressure for the tests producing the greatest total heat release and was thus judged a feasible mitigation strategy.

  3. Conditional sampling technique to test the applicability of the Taylor hypothesis for the large-scale coherent structures

    NASA Technical Reports Server (NTRS)

    Hussain, A. K. M. F.

    1980-01-01

    Comparisons are presented between the distributions of large-scale structures in turbulent flow and distributions inferred from time-dependent signals from stationary probes via the Taylor hypothesis. The study investigated the near field of a 7.62 cm circular air jet at Re = 32,000, in which coherent structures were induced through small-amplitude controlled excitation with stable vortex pairing in the jet column mode. Hot-wire and X-wire anemometry were employed to establish phase-averaged spatial distributions of longitudinal and lateral velocities, coherent Reynolds stress and vorticity, background turbulent intensities, streamlines, and pseudo-stream functions. The Taylor hypothesis was used to calculate spatial distributions of the phase-averaged properties; the results indicate that use of the local time-average velocity or streamwise velocity produces large distortions.

  4. Testing core creation in hydrodynamical simulations using the HI kinematics of field dwarfs

    NASA Astrophysics Data System (ADS)

    Papastergis, E.; Ponomareva, A. A.

    2017-05-01

    The majority of recent hydrodynamical simulations indicate the creation of central cores in the mass profiles of low-mass halos, a process that is attributed to star formation-related baryonic feedback. Core creation is regarded as one of the most promising solutions to potential issues faced by lambda cold dark matter (ΛCDM) cosmology on small scales. For example, the reduced dynamical mass enclosed by cores can explain the low rotational velocities measured for nearby dwarf galaxies, thus possibly lifting the seeming contradiction with the ΛCDM expectations (the so-called "too big to fail" problem). Here we test core creation as a solution of cosmological issues by using a sample of dwarfs with measurements of their atomic hydrogen (HI) kinematics extending to large radii. Using the NIHAO hydrodynamical simulation as an example, we show that core creation can successfully reproduce the kinematics of dwarfs with small kinematic radii, R ≲ 1.5 kpc. However, the agreement with observations becomes poor once galaxies with kinematic measurements extending beyond the core region, R ≈ 1.5-4 kpc, are considered. This result illustrates the importance of testing the predictions of hydrodynamical simulations that are relevant for cosmology against a broad range of observational samples. We would like to stress that our result is valid only under the following set of assumptions: I) that our sample of dwarfs with HI kinematics is representative of the overall population of field dwarfs; II) that there are no severe measurement biases in the observational parameters of our HI dwarfs (e.g., related to inclination estimates); and III) that the HI velocity fields of dwarfs are regular enough to allow the recovery of the true enclosed dynamical mass.

  5. Field and laboratory comparative evaluation of ten rapid malaria diagnostic tests.

    PubMed

    Craig, M H; Bredenkamp, B L; Williams, C H Vaughan; Rossouw, E J; Kelly, V J; Kleinschmidt, I; Martineau, A; Henry, G F J

    2002-01-01

    The paper reports on a comparative evaluation of 10 rapid malaria tests available in South Africa in 1998: AccuCheck (AC, developmental), Cape Biotech (CB), ICT Malaria Pf (ICT1) and Pf/Pv (ICT2), Kat Medical (KAT), MakroMal (MM), OptiMAL (OP), ParaSight-F (PS), Quorum (Q), Determine-Malaria (DM). In a laboratory study, designed to test absolute detection limits, Plasmodium falciparum-infected blood was diluted with uninfected blood to known parasite concentrations ranging from 500 to 0.1 parasites per microlitre (P/microL). The 50% detection limits were: ICT1, 3.28; ICT2, 4.86; KAT, 6.36; MM, 9.37; CB, 11.42; DM, 12.40; Q, 16.98; PS, 20; AC, 31.15 and OP, 91.16 P/microL. A field study was carried out to test post-treatment specificity. Blood samples from malaria patients were tested with all products (except AC and DM) on the day of treatment and 3 and 7 days thereafter, against a gold standard of microscopy and polymerase chain reaction (PCR). OP and PS produced fewer false-positive results on day 7 (18 and 19%, respectively) than the other rapid tests (38-56%). However, microscopy, PCR, OP and PS disagreed largely as to which individuals remained positive. The tests were further compared with regard to general specificity, particularly cross-reactivity with rheumatoid factor, speed, simplicity, their ability to detect other species, storage requirements and general presentation.
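    A 50% detection limit of the kind reported above can be estimated from a dilution series by interpolating the detection rate on a log-concentration scale. The sketch below uses hypothetical detection rates and simple log-linear interpolation rather than the study's (unstated) fitting method:

    ```python
    import math

    def c50(levels, hit_rates):
        """Concentration at 50% detection probability, by log-linear
        interpolation between the two dilution levels that bracket 0.5.
        `levels` must be in descending concentration order."""
        pairs = list(zip(levels, hit_rates))
        for (c_hi, p_hi), (c_lo, p_lo) in zip(pairs, pairs[1:]):
            if p_hi >= 0.5 > p_lo:
                f = (0.5 - p_lo) / (p_hi - p_lo)
                return math.exp(math.log(c_lo)
                                + f * (math.log(c_hi) - math.log(c_lo)))
        raise ValueError("hit rates never cross 0.5")

    # Hypothetical dilution series (parasites/microlitre) and detection rates.
    levels = [500, 100, 50, 10, 5, 1]
    rates  = [1.0, 1.0, 0.9, 0.6, 0.3, 0.0]
    print(c50(levels, rates))
    ```

    Interpolating on a log scale reflects the roughly log-linear response of detection probability to parasite density over the wide 0.1-500 P/microL range used in the laboratory study.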

  6. Magnetostatic modes in ferromagnetic samples with inhomogeneous internal fields

    NASA Astrophysics Data System (ADS)

    Arias, Rodrigo

    2015-03-01

    Magnetostatic modes in ferromagnetic samples are very well characterized and understood in samples with uniform internal magnetic fields. More recently, interest has shifted to the study of magnetization modes in ferromagnetic samples with inhomogeneous internal fields. The present work shows that, under the magnetostatic approximation and for samples of arbitrary shape and/or arbitrary inhomogeneous internal magnetic fields, the modes can be classified as elliptic or hyperbolic, and their associated frequency spectrum can be delimited. This results from analyzing the character of the second-order partial differential equation for the magnetostatic potential under these general conditions. In general, a sample with an inhomogeneous internal field at a given frequency may have regions of elliptic and hyperbolic character separated by a boundary. In elliptic regions the magnetostatic modes have a smooth, monotonic character, generally decaying away from the surfaces (a "tunneling" behavior), while in hyperbolic regions they have an oscillatory, wave-like character. A simple local criterion, the sign of a susceptibility parameter, distinguishes hyperbolic from elliptic regions. This study shows that one may control magnetostatic modes to some extent via external fields or geometry. R.E.A. acknowledges Financiamiento Basal para Centros Cientificos y Tecnologicos de Excelencia under Project No. FB 0807 (Chile), Grant No. ICM P10-061-F by Fondo de Innovacion para la Competitividad-MINECON, and Proyecto Fondecyt 1130192.

  7. A portable x-ray fluorescence instrument for analyzing dust wipe samples for lead: evaluation with field samples.

    PubMed

    Sterling, D A; Lewis, R D; Luke, D A; Shadel, B N

    2000-06-01

    Dust wipe samples collected in the field were tested by nondestructive X-ray fluorescence (XRF) followed by laboratory analysis with flame atomic absorption spectrophotometry (FAAS). Data were analyzed for precision and accuracy of measurement. Replicate samples with the XRF show high precision, with an intraclass correlation coefficient (ICC) of 0.97 (P<0.0001) and an overall coefficient of variation of 11.6%. Paired comparison indicates no statistical difference (P=0.272) between XRF and FAAS analysis. Paired samples are highly correlated, with an R^2 ranging between 0.89 for samples that contain paint chips and 0.93 for samples that do not. The ICC for absolute agreement between XRF and laboratory results was 0.95 (P<0.0001). The relative error over the concentration range of 25 to 14,200 microgram Pb is -12% (95% CI, -18 to -5). The XRF appears to be an excellent method for rapid on-site evaluation of dust wipes for clearance and risk assessment purposes, although there are indications of some confounding when paint chips are present. Copyright 2000 Academic Press.

  8. Analysis of large soil samples for actinides

    DOEpatents

    Maxwell, Sherrod L., III (Aiken, SC)

    2009-03-24

    A method of analyzing relatively large soil samples for actinides employs a separation process in which cerium fluoride precipitation removes the soil matrix and co-precipitates plutonium, americium, and curium with cerium and hydrofluoric acid; the actinides are then separated using chromatography cartridges.

  9. Stochastic locality and master-field simulations of very large lattices

    NASA Astrophysics Data System (ADS)

    Lüscher, Martin

    2018-03-01

    In lattice QCD and other field theories with a mass gap, the field variables in distant regions of a physically large lattice are only weakly correlated. Accurate stochastic estimates of the expectation values of local observables may therefore be obtained from a single representative field. Such master-field simulations potentially allow very large lattices to be simulated, but require various conceptual and technical issues to be addressed. In this talk, an introduction to the subject is provided and some encouraging results of master-field simulations of the SU(3) gauge theory are reported.

  10. Diagnostic performance characteristics of a rapid field test for anthrax in cattle.

    PubMed

    Muller, Janine; Gwozdz, Jacek; Hodgeman, Rachel; Ainsworth, Catherine; Kluver, Patrick; Czarnecki, Jill; Warner, Simone; Fegan, Mark

    2015-07-01

    Although anthrax can be diagnosed in the field with a peripheral blood smear, and in the laboratory with bacterial culture or molecular-based tests, these tests require either considerable experience or specialised equipment. Here we report on the evaluation of the diagnostic sensitivity and specificity of a simple and rapid in-field diagnostic test for anthrax, the anthrax immunochromatographic test (AICT). The AICT detects the protective antigen (PA) component of the anthrax toxin present in the blood of an animal that has died from anthrax. The test provides a result in 15 min and avoids the need for on-site necropsy, with its attendant occupational risks and environmental contamination. The specificity of the test was determined by testing samples taken from 622 animals not infected with Bacillus anthracis. Diagnostic sensitivity was estimated on samples taken from 58 animals naturally infected with B. anthracis, collected over a 10-year period. All samples used to estimate the diagnostic sensitivity and specificity of the AICT were also tested using the gold standard of bacterial culture. The diagnostic specificity of the test was estimated to be 100% (99.4-100%; 95% CI) and the diagnostic sensitivity 93.1% (83.3-98.1%; 95% CI) (Clopper-Pearson method). Four samples produced false-negative AICT results. These were among 9 samples, all of which tested positive for B. anthracis by culture, where there was a time delay of >48 h between collection and testing and/or the samples were collected from animals that were >48 h post-mortem. A statistically significant difference (P<0.001; Fisher's exact test) was found between the ability of the AICT to detect PA in samples from culture-positive animals <48 h post-mortem (49 of 49; Se=100%; 92.8-100% 95% CI) and in samples tested >48 h post-mortem (5 of 9; Se=56%; 21-86.3% 95% CI) (Clopper-Pearson method). Based upon these results a post hoc cut-off for use of
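    The Clopper-Pearson intervals quoted above are exact binomial confidence limits, and they can be reproduced with nothing more than the binomial tail probability and bisection. A minimal sketch:

    ```python
    import math

    def binom_tail_ge(k, n, p):
        """P(X >= k) for X ~ Binomial(n, p)."""
        return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
                   for i in range(k, n + 1))

    def clopper_pearson(k, n, alpha=0.05):
        """Exact (Clopper-Pearson) 95% CI for k successes in n trials,
        found by bisection on the binomial tail probabilities."""
        def solve(too_high):
            lo, hi = 0.0, 1.0
            for _ in range(60):
                mid = (lo + hi) / 2
                if too_high(mid):
                    hi = mid
                else:
                    lo = mid
            return (lo + hi) / 2
        lower = 0.0 if k == 0 else solve(
            lambda p: binom_tail_ge(k, n, p) > alpha / 2)
        upper = 1.0 if k == n else solve(
            lambda p: 1 - binom_tail_ge(k + 1, n, p) < alpha / 2)
        return lower, upper

    # Sensitivity: 54 of 58 culture-positive animals detected.
    print(clopper_pearson(54, 58))   # reported as 83.3-98.1%
    # Specificity: 622 of 622 uninfected animals negative.
    print(clopper_pearson(622, 622)) # reported lower limit 99.4%
    ```

    The one-sided lower limit for 622/622 illustrates why a perfect result on a finite sample still yields a bound below 100%: the interval is the set of proportions that the data cannot reject at the 2.5% tail.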

  11. Anxiety in visual field testing.

    PubMed

    Chew, Shenton S L; Kerr, Nathan M; Wong, Aaron B C; Craig, Jennifer P; Chou, Chi-Ying; Danesh-Meyer, Helen V

    2016-08-01

    To determine if Humphrey visual field (HVF) testing induces anxiety and how anxiety relates to visual field parameters of reliability and severity. A prospective cohort study at a university affiliated private ophthalmic practice. 137 consecutive age-matched and gender-matched patients with glaucoma undergoing either HVF testing only (n=102) or Heidelberg retinal tomography (HRT) only (n=35) were enrolled. Prior to testing, participants completed the State-Trait Anxiety Inventory questionnaire. A 5-point Likert scale was used to grade pretest anxiety and was repeated after testing to grade intratest anxiety. Subjective discomfort parameters were also recorded. Anxiety scores were used to make non-parametrical comparisons and correlations between cohorts and also against visual field reliability and severity indices. Trait anxiety (p=0.838) and pretest anxiety (p=0.802) were not significantly different between test groups. Within the HVF group, intratest anxiety was 1.2 times higher than pretest anxiety (p=0.0001), but was not significantly different in the HRT group (p=0.145). Pretest anxiety was correlated with test unreliability (Spearman's r=0.273, p=0.006), which was predictive of worse test severity (p=0.0027). Subjects who had undergone more than 10 visual field tests had significantly lower pretest and intratest anxiety levels than those who had not (p=0.0030 and p=0.0004, respectively). HVF testing induces more anxiety than HRT. Increased pretest anxiety may reduce HVF test reliability. Increased test experience or interventions aimed at reducing pretest anxiety may result in improved test reliability and accuracy. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  12. Pumping tests in networks of multilevel sampling wells: Motivation and methodology

    USGS Publications Warehouse

    Butler, J.J.; McElwee, C.D.; Bohling, Geoffrey C.

    1999-01-01

    The identification of spatial variations in hydraulic conductivity (K) on a scale of relevance for transport investigations has proven to be a considerable challenge. Recently, a new field method for the estimation of interwell variations in K has been proposed. This method, hydraulic tomography, essentially consists of a series of short‐term pumping tests performed in a tomographic‐like arrangement. In order to fully realize the potential of this approach, information about lateral and vertical variations in pumping‐induced head changes (drawdown) is required with detail that has previously been unobtainable in the field. Pumping tests performed in networks of multilevel sampling (MLS) wells can provide data of the needed density if drawdown can accurately and rapidly be measured in the small‐diameter tubing used in such wells. Field and laboratory experiments show that accurate transient drawdown data can be obtained in the small‐diameter MLS tubing either directly with miniature fiber‐optic pressure sensors or indirectly using air‐pressure transducers. As with data from many types of hydraulic tests, the quality of drawdown measurements from MLS tubing is quite dependent on the effectiveness of well development activities. Since MLS ports of the standard design are prone to clogging and are difficult to develop, alternate designs are necessary to ensure accurate drawdown measurements. Initial field experiments indicate that drawdown measurements obtained from pumping tests performed in MLS networks have considerable potential for providing valuable information about spatial variations in hydraulic conductivity.

  13. Field Sampling Tools for Explosives Residues Developed at CRREL

    DTIC Science & Technology

    2004-04-01

    [Table-of-figures residue; recoverable captions:] Figure 2 shows field cleaning supplies and equipment. Figure 3 depicts tools used for non-cohesive soils, such as sand.

  14. Open-field test site

    NASA Astrophysics Data System (ADS)

    Gyoda, Koichi; Shinozuka, Takashi

    1995-06-01

    An open-field test site with measurement equipment, a turntable, antenna positioners, and auxiliary measurement equipment was remodelled at the CRL north site. This paper introduces the configuration, specifications, and characteristics of the new open-field test site. Measured 3-m and 10-m site attenuations are in good agreement with theoretical values, which means the site is suitable for EMI/EMC measurements using the 3-m and 10-m methods. The site is also expected to be useful for antenna measurement, antenna calibration, and studies of EMI/EMC measurement methods.

  15. Reliability and Validity of a New Test of Change-of-Direction Speed for Field-Based Sports: the Change-of-Direction and Acceleration Test (CODAT).

    PubMed

    Lockie, Robert G; Schultz, Adrian B; Callaghan, Samuel J; Jeffriess, Matthew D; Berry, Simon P

    2013-01-01

    Field sport coaches must use reliable and valid tests to assess change-of-direction speed in their athletes. Few tests feature linear sprinting with acute change-of-direction maneuvers. The Change-of-Direction and Acceleration Test (CODAT) was designed to assess field sport change-of-direction speed, and includes a linear 5-meter (m) sprint, 45° and 90° cuts, 3-m sprints to the left and right, and a linear 10-m sprint. This study analyzed the reliability and validity of this test through comparisons to 20-m sprint (0-5, 0-10, 0-20 m intervals) and Illinois agility run (IAR) performance. Eighteen Australian footballers (age = 23.83 ± 7.04 yrs; height = 1.79 ± 0.06 m; mass = 85.36 ± 13.21 kg) were recruited. Following familiarization, subjects completed the 20-m sprint, CODAT, and IAR in 2 sessions, 48 hours apart. Intra-class correlation coefficients (ICC) assessed relative reliability. Absolute reliability was analyzed through paired samples t-tests (p ≤ 0.05) determining between-session differences. Typical error (TE), coefficient of variation (CV), and differences between the TE and smallest worthwhile change (SWC) also assessed absolute reliability and test usefulness. For the validity analysis, Pearson's correlations (p ≤ 0.05) analyzed between-test relationships. Results showed no between-session differences for any test (p = 0.19-0.86). CODAT time averaged ~6 s, and the ICC and CV equaled 0.84 and 3.0%, respectively. The homogeneous sample of Australian footballers meant that the CODAT's TE (0.19 s) exceeded the usual 0.2 x standard deviation (SD) SWC (0.10 s). However, the CODAT is capable of detecting moderate performance changes (SWC calculated as 0.5 x SD = 0.25 s). There was a near perfect correlation between the CODAT and IAR (r = 0.92), and very large correlations with the 20-m sprint (r = 0.75-0.76), suggesting that the CODAT is a valid change-of-direction speed test. Due to movement specificity, the CODAT has value for field sport athletes.
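    The typical error and smallest worthwhile change statistics used above follow standard definitions (TE = SD of the between-session difference scores divided by √2; SWC = 0.2 or 0.5 times the between-subject SD). A sketch with hypothetical times, not the study's data:

    ```python
    import math
    import statistics

    def typical_error(test1, test2):
        """TE = SD of between-session differences / sqrt(2)."""
        diffs = [a - b for a, b in zip(test1, test2)]
        return statistics.stdev(diffs) / math.sqrt(2)

    # Hypothetical CODAT times (s) for the same athletes in two sessions.
    session1 = [5.8, 6.1, 6.3, 5.9, 6.0, 6.2]
    session2 = [5.9, 6.0, 6.4, 5.8, 6.1, 6.2]

    te = typical_error(session1, session2)
    sd = statistics.stdev(session1)          # between-subject SD
    swc_small, swc_moderate = 0.2 * sd, 0.5 * sd
    print(te, swc_small, swc_moderate)
    ```

    As in the abstract, a test is "useful" for a given change when TE is below the corresponding SWC; with these hypothetical numbers the TE falls between the small and moderate thresholds, mirroring the CODAT result.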

  16. Downselection for Sample Return — Defining Sampling Strategies Using Lessons from Terrestrial Field Analogues

    NASA Astrophysics Data System (ADS)

    Stevens, A. H.; Gentry, D.; Amador, E.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Cullen, T.; Duca, Z.; Jacobsen, M.; Kirby, J.; McCaig, H.; Murukesan, G.; Rader, E.; Rennie, V.; Schwieterman, E.; Sutton, S.; Tan, G.; Yin, C.; Cullen, D.; Geppert, W.; Stockton, A.

    2018-04-01

    We detail multi-year field investigations in Icelandic Mars analogue environments that have yielded results that can help inform strategies for sample selection and downselection for Mars Sample Return.

  17. Sample Size Determination for One- and Two-Sample Trimmed Mean Tests

    ERIC Educational Resources Information Center

    Luh, Wei-Ming; Olejnik, Stephen; Guo, Jiin-Huarng

    2008-01-01

    Formulas to determine the necessary sample sizes for parametric tests of group comparisons are available from several sources and appropriate when population distributions are normal. However, in the context of nonnormal population distributions, researchers recommend Yuen's trimmed mean test, but formulas to determine sample sizes have not been…
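    For context, the standard normal-theory sample-size formula for two-sample mean comparisons (which trimmed-mean procedures such as Yuen's test extend by substituting Winsorized variances and adjusted degrees of freedom) can be sketched as:

    ```python
    from math import ceil
    from statistics import NormalDist

    def n_per_group(delta_sd, alpha=0.05, power=0.80):
        """Normal-theory sample size per group for a two-sided two-sample
        mean test; delta_sd is the effect size in SD units (Cohen's d)."""
        z = NormalDist()
        z_alpha = z.inv_cdf(1 - alpha / 2)
        z_beta = z.inv_cdf(power)
        return ceil(2 * (z_alpha + z_beta) ** 2 / delta_sd ** 2)

    print(n_per_group(0.5))  # medium effect, 80% power
    ```

    The familiar result that a medium effect (d = 0.5) at 80% power needs about 63 participants per group falls out directly; the trimmed-mean formulas in the work above adjust this calculation for nonnormal populations.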

  18. Large space structures testing

    NASA Technical Reports Server (NTRS)

    Waites, Henry; Worley, H. Eugene

    1987-01-01

    There is considerable interest in developing testing concepts and facilities that accurately simulate the pathologies believed to exist in future spacecraft. Both government and industry have participated in the development of such facilities over the past several years. The progress and problems associated with the development of the Large Space Structure Test Facility at the Marshall Space Flight Center are presented. This facility has been in existence for a number of years, and its utilization has run the gamut from total in-house involvement and third-party contractor testing to joint endeavors with other government agencies.

  19. Primordial large-scale electromagnetic fields from gravitoelectromagnetic inflation

    NASA Astrophysics Data System (ADS)

    Membiela, Federico Agustín; Bellini, Mauricio

    2009-04-01

    We investigate the origin and evolution of primordial electric and magnetic fields in the early universe, when the expansion is governed by a cosmological constant Λ0. Using the gravitoelectromagnetic inflationary formalism with A0 = 0, we obtain the power spectra of large-scale magnetic fields and of the inflaton field fluctuations during inflation. A very important feature is that our formalism is naturally non-conformally invariant.

  20. Measurement and modeling of polarized specular neutron reflectivity in large magnetic fields

    DOE PAGES

    Maranville, Brian B.; Kirby, Brian J.; Grutter, Alexander J.; ...

    2016-06-09

    The presence of a large applied magnetic field removes the degeneracy of the vacuum energy states for spin-up and spin-down neutrons. For polarized neutron reflectometry, this must be included in the reference potential energy of the Schrödinger equation that is used to calculate the expected scattering from a magnetic layered structure. For samples with magnetization that is purely parallel or antiparallel to the applied field which defines the quantization axis, there is no mixing of the spin states (no spin-flip scattering) and so this additional potential is constant throughout the scattering region. When there is non-collinear magnetization in the sample, however, there will be significant scattering from one spin state into the other, and the reference potentials will differ between the incoming and outgoing wavefunctions, changing the angle and intensities of the scattering. In conclusion, the theory of the scattering and recommended experimental practices for this type of measurement are presented, as well as an example measurement.

  1. Large-Grain Superconducting Gun Cavity Testing Program Phase One Closing Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammons, L.; Bellavia, S.; Belomestnykh, S.

    2013-10-31

    This report details the experimental configuration and RF testing results for the first phase of a large-grain niobium electron gun cavity testing program being conducted in the Small Vertical Testing Facility in the Collider-Accelerator Department. The testing is meant to explore multipacting in the cavity and shed light on the behavior of a counterpart cavity of identical geometry installed in the Energy Recovery LINAC being constructed in the Collider-Accelerator Department at Brookhaven National Laboratory. The test found that the Q of the large-grain cavity reached ~6.5 × 10^8 at 4 K and ~6 × 10^9 at 2 K. Both of these values are about a factor of 10 lower than would be expected for this type of cavity given the calculated surface resistance and the estimated geometry factor for this half-cell cavity. In addition, the cavity reached a peak voltage of 0.6 MV before there was a significant decline in the Q value and a substantial increase in field emission. This relatively low voltage, coupled with the low Q and considerable field emission, suggests contamination of the cavity interior, possibly during experimental assembly. The results may also suggest that additional chemical etching of the interior surface of the cavity would be beneficial. Throughout the course of testing, various challenges arose, including slow helium transfer to the cryostat and cable difficulties. These difficulties and others were eventually resolved, and the report discusses the operating experience of the experiment thus far and the plans for future work aimed at exploring the nature of multipacting with a copper cathode inserted into the cavity.

  2. Crystallization of Calcium Carbonate in a Large Scale Field Study

    NASA Astrophysics Data System (ADS)

    Ueckert, Martina; Wismeth, Carina; Baumann, Thomas

    2017-04-01

    The long-term efficiency of geothermal facilities and aquifer thermal energy storage in the carbonaceous Malm aquifer in the Bavarian Molasse Basin is seriously affected by precipitation of carbonates. This is mainly caused by pressure and temperature changes that lead to oversaturation during production. Crystallization starts with polymorphic nuclei of calcium carbonate and is often described as diffusion-reaction controlled: calcite crystallization is favoured by high concentration gradients, while aragonite crystallization occurs at high reaction rates. The factors affecting the crystallization processes have been described for simplified, well-controlled laboratory experiments, but knowledge about the behaviour of more complex natural systems is still limited. The crystallization of the polymorphic forms of calcium carbonate was investigated during a heat storage test at our test site in the eastern part of the Bavarian Molasse Basin, with complementary laboratory experiments run in an autoclave. Both field and laboratory experiments were conducted with carbonaceous tap water; the laboratory experiments additionally used ultrapure water. To avoid precipitation from the tap water, a calculated amount of CO2 was added prior to heating the water from 45-110°C (laboratory) and 65-110°C (field). A total water volume of 0.5 L (laboratory) and 1 L (field) was immediately sampled and filtered through 10 - 0.1

  3. Performance of the goulden large-sample extractor in multiclass pesticide isolation and preconcentration from stream water

    USGS Publications Warehouse

    Foster, G.D.; Foreman, W.T.; Gates, Paul M.

    1991-01-01

    The reliability of the Goulden large-sample extractor in preconcentrating pesticides from water was evaluated from the recoveries of 35 pesticides amended to filtered stream waters. Recoveries greater than 90% were observed for many of the pesticides in each major chemical class, but recoveries for some of the individual pesticides varied in seemingly unpredictable ways. Corrections cannot yet be factored into liquid-liquid extraction theory to account for matrix effects, which were apparent between the two stream waters tested. The Goulden large-sample extractor appears to be well suited for rapid chemical screening applications, with quantitative analysis requiring special quality control considerations. ?? 1991 American Chemical Society.

  4. Assessment and mitigation of errors associated with a large-scale field investigation of methane emissions from the Marcellus Shale

    NASA Astrophysics Data System (ADS)

    Caulton, D.; Golston, L.; Li, Q.; Bou-Zeid, E.; Pan, D.; Lane, H.; Lu, J.; Fitts, J. P.; Zondlo, M. A.

    2015-12-01

    Recent work suggests that the distribution of methane emissions from fracking operations is skewed, with a small percentage of emitters contributing a large proportion of the total emissions. In order to provide a statistically robust distribution of emitters and determine the presence of super-emitters, errors in current techniques need to be constrained and mitigated. The Marcellus Shale, the most productive natural gas shale field in the United States, has received less intense focus for well-level emissions and is investigated here to provide the distribution of methane emissions. In July 2015, approximately 250 unique well pads were sampled using the Princeton Atmospheric Chemistry Mobile Acquisition Node (PAC-MAN). This mobile lab includes a Garmin GPS unit, a Vaisala weather station (WTX520), a LICOR 7700 CH4 open-path sensor, and a LICOR 7500 CO2/H2O open-path sensor. Sampling sites were preselected based on wind direction, sampling distance, and elevation grade. All sites were sampled during low boundary-layer conditions (0600-1000 and 1800-2200 local time). The majority of sites were sampled 1-3 times, while selected test sites were sampled multiple times or resampled several times during the day. At selected sites a sampling tower was constructed, consisting of a Metek uSonic-3 Class A sonic anemometer and an additional LICOR 7700 and 7500; data were recorded for at least one hour at these sites. A robust study and inter-comparison of the different methodologies will be presented. The Gaussian plume model will be used to calculate fluxes for all sites and to compare results from test sites with multiple passes, with the tower data providing constraints on the plume model. Additionally, large-eddy simulation (LES) modeling will be used to calculate emissions from the tower sites. Alternative techniques will also be discussed. Results from these techniques will be compared to identify best practices and provide robust error estimates.
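    A Gaussian plume flux inversion of the kind described can be sketched as follows. Because the predicted concentration is linear in the source strength, a single downwind measurement yields the flux directly. The dispersion coefficients and all numbers here are assumed illustrative values (in practice sigma_y and sigma_z come from Pasquill-Gifford stability classes at the sampling distance), and the function names are hypothetical:

    ```python
    import math

    def plume_conc(q, u, y, z, h, sigma_y, sigma_z):
        """Ground-reflected Gaussian plume concentration at crosswind offset y
        and height z, for a point source of strength q at height h in wind u.
        sigma_y, sigma_z are the dispersion coefficients evaluated at the
        downwind sampling distance."""
        lateral = math.exp(-y**2 / (2 * sigma_y**2))
        vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                    math.exp(-(z + h)**2 / (2 * sigma_z**2)))
        return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

    def invert_flux(c_meas, u, y, z, h, sigma_y, sigma_z):
        """Since C is linear in q, the flux is q = C_measured / C(q=1)."""
        return c_meas / plume_conc(1.0, u, y, z, h, sigma_y, sigma_z)

    # Round-trip check: synthesize a concentration, then recover the flux.
    c = plume_conc(0.05, 2.0, 5.0, 2.0, 3.0, 20.0, 10.0)
    q = invert_flux(c, 2.0, 5.0, 2.0, 3.0, 20.0, 10.0)
    print(q)
    ```

    The weakness this mobile-lab study must control for is exactly what the sketch hides: the inferred flux is only as good as the assumed wind speed and dispersion coefficients, which is why the tower and LES data are used as constraints.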

  5. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR FIELD COLLECTION OF INDOOR FLOOR SAMPLES AND POST FIELD SAMPLE HANDLING (UA-F-7.1)

    EPA Science Inventory

    The purpose of this SOP is to establish a uniform procedure for the collection of indoor floor dust samples in the field. This procedure was followed to ensure consistent data retrieval of dust samples during the Arizona NHEXAS project and the "Border" study. Keywords: field; va...

  6. Sampling Long- versus Short-Range Interactions Defines the Ability of Force Fields To Reproduce the Dynamics of Intrinsically Disordered Proteins.

    PubMed

    Mercadante, Davide; Wagner, Johannes A; Aramburu, Iker V; Lemke, Edward A; Gräter, Frauke

    2017-09-12

    Molecular dynamics (MD) simulations have valuably complemented experiments describing the dynamics of intrinsically disordered proteins (IDPs), particularly since the proposal of models to solve the artificial collapse of IDPs in silico. Such models suggest redefining nonbonded interactions, by either increasing water dispersion forces or adopting the Kirkwood-Buff force field. These approaches yield extended conformers that better comply with experiments, but it is unclear if they all sample the same intrachain dynamics of IDPs. We have tested this by employing MD simulations and single-molecule Förster resonance energy transfer spectroscopy to sample the dimensions of systems with different sequence compositions, namely strong and weak polyelectrolytes. For strong polyelectrolytes in which charge effects dominate, all the proposed solutions equally reproduce the expected ensemble's dimensions. For weak polyelectrolytes, at lower cutoffs, force fields abnormally alter intrachain dynamics, overestimating excluded volume over chain flexibility or reporting no difference between the dynamics of different chains. The TIP4P-D water model alone can reproduce experimentally observed changes in extensions (dimensions), but not quantitatively and with only weak statistical significance. Force field limitations are reversed with increased interaction cutoffs, showing that chain dynamics are critically defined by the presence of long-range interactions. Force field analysis aside, our study provides the first insights into how long-range interactions critically define IDP dimensions and raises the question of which length range is crucial to correctly sample the overall dimensions and internal dynamics of the large group of weakly charged yet highly polar IDPs.

  7. Testing the efficiency of rover science protocols for robotic sample selection: A GeoHeuristic Operational Strategies Test

    NASA Astrophysics Data System (ADS)

    Yingst, R. A.; Bartley, J. K.; Chidsey, T. C.; Cohen, B. A.; Gilleaudeau, G. J.; Hynek, B. M.; Kah, L. C.; Minitti, M. E.; Williams, R. M. E.; Black, S.; Gemperline, J.; Schaufler, R.; Thomas, R. J.

    2018-05-01

    The GHOST field tests are designed to isolate and test science-driven rover operations protocols, to determine best practices. During a recent field test at a potential Mars 2020 landing site analog, we tested two Mars Science Laboratory data-acquisition and decision-making methods to assess resulting science return and sample quality: a linear method, where sites of interest are studied in the order encountered, and a "walkabout-first" method, where sites of interest are examined remotely before down-selecting to a subset of sites that are interrogated with more resource-intensive instruments. The walkabout method cost less time and fewer resources, while increasing confidence in interpretations. Contextual data critical to evaluating site geology were acquired earlier than for the linear method, and given a higher priority, which resulted in development of more mature hypotheses earlier in the analysis process. Combined, this saved time and energy in the collection of data with more limited spatial coverage. Based on these results, we suggest that the walkabout method be used where doing so would provide early context and time for the science team to develop hypotheses and critical tests; and that in gathering context, coverage may be more important than higher resolution.

  8. Rapid, topology-based particle tracking for high-resolution measurements of large complex 3D motion fields.

    PubMed

    Patel, Mohak; Leggett, Susan E; Landauer, Alexander K; Wong, Ian Y; Franck, Christian

    2018-04-03

    Spatiotemporal tracking of tracer particles or objects of interest can reveal localized behaviors in biological and physical systems. However, existing tracking algorithms are most effective for relatively low numbers of particles that undergo displacements smaller than their typical interparticle separation distance. Here, we demonstrate a single particle tracking algorithm to reconstruct large complex motion fields with large particle numbers, orders of magnitude larger than previously tractably resolvable, thus opening the door for attaining very high Nyquist spatial frequency motion recovery in the images. Our key innovations are feature vectors that encode nearest neighbor positions, a rigorous outlier removal scheme, and an iterative deformation warping scheme. We test this technique for its accuracy and computational efficacy using synthetically and experimentally generated 3D particle images, including non-affine deformation fields in soft materials, complex fluid flows, and cell-generated deformations. We augment this algorithm with additional particle information (e.g., color, size, or shape) to further enhance tracking accuracy for high gradient and large displacement fields. These applications demonstrate that this versatile technique can rapidly track unprecedented numbers of particles to resolve large and complex motion fields in 2D and 3D images, particularly when spatial correlations exist.
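
    The feature-vector idea can be illustrated with a toy version: describe each particle by the sorted distances to its k nearest neighbours (invariant to translation and rotation), then match particles across frames by feature similarity. This is only a sketch of the general approach; the published algorithm's actual descriptors, outlier-removal scheme, and iterative deformation warping are more elaborate:

```python
import math

def knn_features(points, k=3):
    """For each particle, a feature vector of the sorted distances to its
    k nearest neighbours -- a simple topology-inspired descriptor."""
    feats = []
    for i, p in enumerate(points):
        d = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        feats.append(tuple(d[:k]))
    return feats

def match_particles(frame_a, frame_b, k=3):
    """Greedily pair particles across two frames by smallest
    feature-vector distance (no outlier rejection in this sketch)."""
    fa, fb = knn_features(frame_a, k), knn_features(frame_b, k)
    matches, used = {}, set()
    for i, f in enumerate(fa):
        j_best = min((j for j in range(len(fb)) if j not in used),
                     key=lambda j: sum((x - y)**2 for x, y in zip(f, fb[j])))
        matches[i] = j_best
        used.add(j_best)
    return matches
```

    Because the descriptor depends only on local inter-particle distances, it survives bulk displacements far larger than the mean particle spacing, which is the regime where naive nearest-position tracking fails.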

  9. Imaging samples larger than the field of view: the SLS experience

    NASA Astrophysics Data System (ADS)

    Vogiatzis Oikonomidis, Ioannis; Lovric, Goran; Cremona, Tiziana P.; Arcadu, Filippo; Patera, Alessandra; Schittny, Johannes C.; Stampanoni, Marco

    2017-06-01

    Volumetric datasets with micrometer spatial and sub-second temporal resolutions are nowadays routinely acquired using synchrotron X-ray tomographic microscopy (SRXTM). Although SRXTM technology allows the examination of multiple samples with short scan times, many specimens are larger than the field-of-view (FOV) provided by the detector. The extension of the FOV in the direction perpendicular to the rotation axis remains non-trivial. We present a method that can efficiently increase the FOV by merging volumetric datasets obtained by region-of-interest tomographies at different 3D positions of the sample, with a minimal amount of artefacts and with the ability to handle large amounts of data. The method has been successfully applied for the three-dimensional imaging of a small number of mouse lung acini of intact animals, where pixel sizes down to the micrometer range and short exposure times are required.

  10. A spinner magnetometer for large Apollo lunar samples.

    PubMed

    Uehara, M; Gattacceca, J; Quesnel, Y; Lepaulard, C; Lima, E A; Manfredi, M; Rochette, P

    2017-10-01

    We developed a spinner magnetometer to measure the natural remanent magnetization of large Apollo lunar rocks in the storage vault of the Lunar Sample Laboratory Facility (LSLF) of NASA. The magnetometer mainly consists of a commercially available three-axial fluxgate sensor and a hand-rotating sample table with an optical encoder recording the rotation angles. The distance between the sample and the sensor is adjustable according to the sample size and magnetization intensity. The sensor and the sample are placed in a two-layer mu-metal shield to measure the sample natural remanent magnetization. The magnetic signals are acquired together with the rotation angle to obtain stacking of the measured signals over multiple revolutions. The developed magnetometer has a sensitivity of 5 × 10 -7 Am 2 at the standard sensor-to-sample distance of 15 cm. This sensitivity is sufficient to measure the natural remanent magnetization of almost all the lunar basalt and breccia samples with mass above 10 g in the LSLF vault.
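
    For a rough sense of what the quoted sensitivity implies, the sample can be treated as a point dipole: the on-axis field of a moment m at distance r is B = (μ0/4π)·2m/r³. The sketch below is the dipole approximation only, not the instrument's actual stacked multi-revolution inversion:

```python
# Point-dipole approximation relating a remanent moment m (A m^2) to the
# on-axis field B (T) it produces at distance r (m): B = (mu0/4pi) * 2m / r^3.
MU0_OVER_4PI = 1e-7  # T m / A

def axial_dipole_field(m, r):
    """Field at distance r on the dipole axis for moment m."""
    return MU0_OVER_4PI * 2.0 * m / r**3

def moment_from_field(b, r):
    """Invert the same relation to estimate the moment from a field reading."""
    return b * r**3 / (2.0 * MU0_OVER_4PI)
```

    At the standard sensor-to-sample distance of 15 cm, the quoted sensitivity of 5 × 10-7 Am2 corresponds to an on-axis field of roughly 3 × 10-11 T (tens of picotesla), which is the signal level that stacking over multiple revolutions must recover from the fluxgate noise floor.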

  11. A spinner magnetometer for large Apollo lunar samples

    NASA Astrophysics Data System (ADS)

    Uehara, M.; Gattacceca, J.; Quesnel, Y.; Lepaulard, C.; Lima, E. A.; Manfredi, M.; Rochette, P.

    2017-10-01

    We developed a spinner magnetometer to measure the natural remanent magnetization of large Apollo lunar rocks in the storage vault of the Lunar Sample Laboratory Facility (LSLF) of NASA. The magnetometer mainly consists of a commercially available three-axial fluxgate sensor and a hand-rotating sample table with an optical encoder recording the rotation angles. The distance between the sample and the sensor is adjustable according to the sample size and magnetization intensity. The sensor and the sample are placed in a two-layer mu-metal shield to measure the sample natural remanent magnetization. The magnetic signals are acquired together with the rotation angle to obtain stacking of the measured signals over multiple revolutions. The developed magnetometer has a sensitivity of 5 × 10-7 Am2 at the standard sensor-to-sample distance of 15 cm. This sensitivity is sufficient to measure the natural remanent magnetization of almost all the lunar basalt and breccia samples with mass above 10 g in the LSLF vault.

  12. Imaging a Large Sample with Selective Plane Illumination Microscopy Based on Multiple Fluorescent Microsphere Tracking

    NASA Astrophysics Data System (ADS)

    Ryu, Inkeon; Kim, Daekeun

    2018-04-01

    A typical selective plane illumination microscopy (SPIM) image size is fundamentally limited by the field of view, which is a characteristic of the objective lens. If an image larger than the imaging area of the sample is to be obtained, image stitching, which combines step-scanned images into a single panoramic image, is required. However, accurately registering the step-scanned images is very difficult because the SPIM system uses a customized sample mount where uncertainties in the translational and rotational motions exist. In this paper, an image registration technique based on multiple fluorescent microsphere tracking is proposed, which quantifies the constellations of, and measures the distances between, at least two fluorescent microspheres embedded in the sample. Image stitching results are demonstrated for optically cleared large tissue with various staining methods. Compensation for the effect of the sample rotation that occurs during the translational motion in the sample mount is also discussed.
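
    Once corresponding microspheres have been located in two overlapping stacks, the relative rotation and shift of the sample mount can be recovered with a standard least-squares rigid registration. A 2D Procrustes/Kabsch sketch is shown here for brevity; it is a generic illustration, not the paper's exact procedure:

```python
import math

def rigid_registration_2d(src, dst):
    """Least-squares rotation angle and translation mapping the src bead
    positions onto dst (2D Procrustes/Kabsch). Needs >= 2 point pairs."""
    n = len(src)
    cax = sum(p[0] for p in src) / n; cay = sum(p[1] for p in src) / n
    cbx = sum(p[0] for p in dst) / n; cby = sum(p[1] for p in dst) / n
    s_cross = s_dot = 0.0
    for (ax, ay), (bx, by) in zip(src, dst):
        ax -= cax; ay -= cay; bx -= cbx; by -= cby   # center both clouds
        s_cross += ax * by - ay * bx                 # sum of 2D cross products
        s_dot += ax * bx + ay * by                   # sum of dot products
    theta = math.atan2(s_cross, s_dot)               # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = cbx - (c * cax - s * cay)                   # translation that aligns
    ty = cby - (s * cax + c * cay)                   # the rotated centroid
    return theta, (tx, ty)
```

    With exact correspondences the recovery is exact; with noisy bead centroids it is the least-squares optimum, which is why using more than the minimum two microspheres improves the stitching.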

  13. Field spectrometer (S191H) preprocessor tape quality test program design document

    NASA Technical Reports Server (NTRS)

    Campbell, H. M.

    1976-01-01

    Program QA191H performs quality assurance tests on field spectrometer data recorded on 9-track magnetic tape. The quality testing involves the comparison of key housekeeping and data parameters with historic and predetermined tolerance limits. Samples of key parameters are processed during the calibration period and wavelength cal period, and the results are printed out and recorded on an historical file tape.

  14. A Mission Control Architecture for robotic lunar sample return as field tested in an analogue deployment to the sudbury impact structure

    NASA Astrophysics Data System (ADS)

    Moores, John E.; Francis, Raymond; Mader, Marianne; Osinski, G. R.; Barfoot, T.; Barry, N.; Basic, G.; Battler, M.; Beauchamp, M.; Blain, S.; Bondy, M.; Capitan, R.-D.; Chanou, A.; Clayton, J.; Cloutis, E.; Daly, M.; Dickinson, C.; Dong, H.; Flemming, R.; Furgale, P.; Gammel, J.; Gharfoor, N.; Hussein, M.; Grieve, R.; Henrys, H.; Jaziobedski, P.; Lambert, A.; Leung, K.; Marion, C.; McCullough, E.; McManus, C.; Neish, C. D.; Ng, H. K.; Ozaruk, A.; Pickersgill, A.; Preston, L. J.; Redman, D.; Sapers, H.; Shankar, B.; Singleton, A.; Souders, K.; Stenning, B.; Stooke, P.; Sylvester, P.; Tornabene, L.

    2012-12-01

    A Mission Control Architecture is presented for a Robotic Lunar Sample Return Mission which builds upon the experience of the landed missions of the NASA Mars Exploration Program. This architecture consists of four separate processes working in parallel at Mission Control and achieving buy-in for plans sequentially instead of simultaneously from all members of the team. These four processes were: science processing, science interpretation, planning and mission evaluation. Science Processing was responsible for creating products from data downlinked from the field and was organized by instrument. Science Interpretation was responsible for determining whether or not science goals were being met and what measurements needed to be taken to satisfy these goals. These two processes were assisted by the Planning process, responsible for scheduling and sequencing observations, and by the Evaluation process, which fostered inter-process communications, reporting and documentation. This organization is advantageous for its flexibility as shown by the ability of the structure to produce plans for the rover every two hours, for the rapidity with which Mission Control team members may be trained and for the relatively small size of each individual team. This architecture was tested in an analogue mission to the Sudbury impact structure from June 6-17, 2011. A rover was used which was capable of developing a network of locations that could be revisited using a teach and repeat method. This allowed the science team to process several different outcrops in parallel, downselecting at each stage to ensure that the samples selected for caching were the most representative of the site. Over the course of 10 days, 18 rock samples were collected from 5 different outcrops, 182 individual field activities - such as roving or acquiring an image mosaic or other data product - were completed within 43 command cycles, and the rover travelled over 2200 m.
Data transfer from communications passes were filled to 74

  15. GeoLab's First Field Trials, 2010 Desert RATS: Evaluating Tools for Early Sample Characterization

    NASA Technical Reports Server (NTRS)

    Evans, Cindy A.; Bell, M. S.; Calaway, M. J.; Graff, Trevor; Young, Kelsey

    2011-01-01

    As part of an accelerated prototyping project to support science operations tests for future exploration missions, we designed and built a geological laboratory, GeoLab, that was integrated into NASA's first generation Habitat Demonstration Unit-1/Pressurized Excursion Module (HDU1-PEM). GeoLab includes a pressurized glovebox for transferring and handling samples collected on geological traverses, and a suite of instruments for collecting preliminary data to help characterize those samples. The GeoLab and the HDU1-PEM were tested for the first time as part of the 2010 Desert Research and Technology Studies (DRATS), NASA's analog field exercise for testing mission technologies. The HDU1-PEM and GeoLab participated in two weeks of joint operations in northern Arizona with two crewed rovers and the DRATS science team.

  16. Large, nonsaturating thermopower in a quantizing magnetic field

    PubMed Central

    Fu, Liang

    2018-01-01

    The thermoelectric effect is the generation of an electrical voltage from a temperature gradient in a solid material due to the diffusion of free charge carriers from hot to cold. Identifying materials with a large thermoelectric response is crucial for the development of novel electric generators and coolers. We theoretically consider the thermopower of Dirac/Weyl semimetals subjected to a quantizing magnetic field. We contrast their thermoelectric properties with those of traditional heavily doped semiconductors and show that, under a sufficiently large magnetic field, the thermopower of Dirac/Weyl semimetals grows linearly with the field without saturation and can reach extremely high values. Our results suggest an immediate pathway for achieving record-high thermopower and thermoelectric figure of merit, and they compare well with a recent experiment on Pb1–xSnxSe. PMID:29806031

  17. 40 CFR 133.104 - Sampling and test procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 22 2011-07-01 2011-07-01 false Sampling and test procedures. 133.104 Section 133.104 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS SECONDARY TREATMENT REGULATION § 133.104 Sampling and test procedures. (a) Sampling and test procedures for...

  18. 40 CFR 133.104 - Sampling and test procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 21 2010-07-01 2010-07-01 false Sampling and test procedures. 133.104 Section 133.104 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS SECONDARY TREATMENT REGULATION § 133.104 Sampling and test procedures. (a) Sampling and test procedures for...

  19. 40 CFR 133.104 - Sampling and test procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 23 2012-07-01 2012-07-01 false Sampling and test procedures. 133.104 Section 133.104 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS SECONDARY TREATMENT REGULATION § 133.104 Sampling and test procedures. (a) Sampling and test procedures for...

  20. Magnetospheric Multiscale Observations of the Electron Diffusion Region of Large Guide Field Magnetic Reconnection

    NASA Technical Reports Server (NTRS)

    Eriksson, S.; Wilder, F. D.; Ergun, R. E.; Schwartz, S. J.; Cassak, P. A.; Burch, J. L.; Chen, Li-Jen; Torbert, R. B.; Phan, T. D.; Lavraud, B.; hide

    2016-01-01

    We report observations from the Magnetospheric Multiscale (MMS) satellites of a large guide field magnetic reconnection event. The observations suggest that two of the four MMS spacecraft sampled the electron diffusion region, whereas the other two spacecraft detected the exhaust jet from the event. The guide magnetic field amplitude is approximately 4 times that of the reconnecting field. The event is accompanied by a significant parallel electric field (E∥) that is larger than predicted by simulations. The high-speed (approximately 300 km/s) crossing of the electron diffusion region limited the data set to one complete electron distribution inside of the electron diffusion region, which shows significant parallel heating. The data suggest that E∥ is balanced by a combination of electron inertia and a parallel gradient of the gyrotropic electron pressure.

  1. Large Payload Ground Transportation and Test Considerations

    NASA Technical Reports Server (NTRS)

    Rucker, Michelle A.

    2016-01-01

    Many spacecraft concepts under consideration by the National Aeronautics and Space Administration’s (NASA’s) Evolvable Mars Campaign take advantage of a Space Launch System payload shroud that may be 8 to 10 meters in diameter. Large payloads can theoretically save cost by reducing the number of launches needed--but only if it is possible to build, test, and transport a large payload to the launch site in the first place. Analysis performed previously for the Altair project identified several transportation and test issues with an 8.973-meter-diameter payload. Although the entire Constellation Program—including Altair—has since been canceled, these issues serve as important lessons learned for spacecraft designers and program managers considering large payloads for future programs. A transportation feasibility study found that, even broken up into an Ascent and Descent Module, the Altair spacecraft would not fit inside available aircraft. Ground transportation of such large payloads over extended distances is not generally permitted, so overland transportation alone would not be an option. Limited ground transportation to the nearest waterway may be possible, but water transportation could take as long as 67 days per production unit, depending on point of origin and acceptance test facility; transportation from the western United States would require transit through the Panama Canal to access the Kennedy Space Center launch site. Large payloads also pose acceptance test and ground processing challenges. Although propulsion, mechanical vibration, and reverberant acoustic test facilities at NASA’s Plum Brook Station have been designed to accommodate large spacecraft, special handling and test work-arounds may be necessary, which could increase cost, schedule, and technical risk. Once at the launch site, there are no facilities currently capable of accommodating the combination of large payload size and hazardous processing such as hypergolic fuels.

  2. A Pipeline for Large Data Processing Using Regular Sampling for Unstructured Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berres, Anne Sabine; Adhinarayanan, Vignesh; Turton, Terece

    2017-05-12

    Large simulation data requires a lot of time and computational resources to compute, store, analyze, visualize, and run user studies. Today, the largest cost of a supercomputer is not hardware but maintenance, in particular energy consumption. Our goal is to balance energy consumption and cognitive value of visualizations of resulting data. This requires us to go through the entire processing pipeline, from simulation to user studies. To reduce the amount of resources, data can be sampled or compressed. While this adds more computation time, the computational overhead is negligible compared to the simulation time. We built a processing pipeline using the example of regular sampling. The reasons for this choice are two-fold: using a simple example reduces unnecessary complexity, as we know what to expect from the results, and it provides a good baseline for future, more elaborate sampling methods. We measured time and energy for each test we ran, and we conducted user studies on Amazon Mechanical Turk (AMT) for a range of different results we produced through sampling.
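
    As an illustration of the regular-sampling stage, a minimal stand-in is to average scattered (unstructured) samples onto a regular grid. The binning scheme and names below are illustrative, not taken from the paper:

```python
def resample_to_grid(points, values, nx, ny, bounds):
    """Average scattered 2D samples onto a regular nx-by-ny grid.
    Cells with no samples are left as None."""
    (x0, x1), (y0, y1) = bounds
    sums = [[0.0] * nx for _ in range(ny)]
    counts = [[0] * nx for _ in range(ny)]
    for (x, y), v in zip(points, values):
        i = min(int((x - x0) / (x1 - x0) * nx), nx - 1)  # clamp to last column
        j = min(int((y - y0) / (y1 - y0) * ny), ny - 1)  # clamp to last row
        sums[j][i] += v
        counts[j][i] += 1
    return [[sums[j][i] / counts[j][i] if counts[j][i] else None
             for i in range(nx)] for j in range(ny)]
```

    The appeal of this baseline in the pipeline's terms is that its cost is a single linear pass over the data, which is negligible next to simulation time while sharply reducing what must be stored and rendered.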

  3. Virtually distortion-free imaging system for large field, high resolution lithography

    DOEpatents

    Hawryluk, A.M.; Ceglio, N.M.

    1993-01-05

    Virtually distortion free large field high resolution imaging is performed using an imaging system which contains large field distortion or field curvature. A reticle is imaged in one direction through the optical system to form an encoded mask. The encoded mask is then imaged back through the imaging system onto a wafer positioned at the reticle position.

  4. Virtually distortion-free imaging system for large field, high resolution lithography

    DOEpatents

    Hawryluk, Andrew M.; Ceglio, Natale M.

    1993-01-01

    Virtually distortion free large field high resolution imaging is performed using an imaging system which contains large field distortion or field curvature. A reticle is imaged in one direction through the optical system to form an encoded mask. The encoded mask is then imaged back through the imaging system onto a wafer positioned at the reticle position.

  5. Communication: Multiple atomistic force fields in a single enhanced sampling simulation

    NASA Astrophysics Data System (ADS)

    Hoang Viet, Man; Derreumaux, Philippe; Nguyen, Phuong H.

    2015-07-01

    The main concerns of biomolecular dynamics simulations are the convergence of the conformational sampling and the dependence of the results on the force fields. While the first issue can be addressed by employing enhanced sampling techniques such as simulated tempering or replica exchange molecular dynamics, repeating these simulations with different force fields is very time consuming. Here, we propose an automatic method that includes different force fields in a single advanced sampling simulation. Conformational sampling using three all-atom force fields is enhanced by simulated tempering; by formulating the weight parameters of the simulated tempering method in terms of the energy fluctuations, the system is able to perform a random walk in both temperature and force field spaces. The method is first demonstrated on a 1D system and then validated by the folding of the 10-residue chignolin peptide in explicit water.
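
    The tempering move at the heart of such a scheme can be sketched generically: propose a neighbouring rung on the (temperature, force-field) ladder and accept with the standard Metropolis weight min(1, exp((β_old − β_new)E + g_new − g_old)), where the weights g are what the authors express in terms of energy fluctuations. This is a generic sketch, not the paper's implementation:

```python
import math
import random

def tempering_move(idx, energy, betas, g):
    """One simulated-tempering exchange move.

    betas: ladder of inverse temperatures (in the paper's scheme the ladder
    effectively spans force fields as well as temperatures).
    g: tempering weight parameters, one per rung.
    Returns the new rung index (possibly unchanged)."""
    new = idx + random.choice((-1, 1))      # propose a neighbouring rung
    if not 0 <= new < len(betas):
        return idx                          # proposal off the ladder: stay put
    # Metropolis acceptance: min(1, exp((beta_old - beta_new)*E + g_new - g_old))
    log_acc = (betas[idx] - betas[new]) * energy + g[new] - g[idx]
    if random.random() < math.exp(min(0.0, log_acc)):
        return new
    return idx
```

    Choosing the weights g well is the whole game: with poorly chosen weights the walk gets stuck on a few rungs, whereas fluctuation-derived weights keep the visitation of all temperatures and force fields roughly uniform.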

  6. Useful field of view test.

    PubMed

    Wood, Joanne M; Owsley, Cynthia

    2014-01-01

    The useful field of view (UFOV) test was developed to reflect the visual difficulties that older adults experience with everyday tasks. Importantly, the UFOV is one of the most extensively researched and promising predictor tests for a range of driving outcome measures, including driving ability and crash risk, as well as other everyday tasks. Currently available commercial versions of the test can be administered using personal computers; these measure the speed of visual processing for rapid detection and localization of targets under conditions of divided visual attention and in the presence and absence of visual clutter. The test is believed to assess higher-order cognitive abilities, but performance also relies on visual sensory function because in order for targets to be attended to, they must be visible. The format of the UFOV has been modified over the years; the original version estimated the spatial extent of the useful field of view, while the latest version measures visual processing speed. While deficits in the useful field of view are associated with functional impairments in everyday activities in older adults, there is also emerging evidence from several research groups that improvements in visual processing speed can be achieved through training. These improvements have been shown to reduce crash risk, and can have a positive impact on health and functional well-being, with the potential to increase the mobility and hence the independence of older adults. © 2014 S. Karger AG, Basel

  7. OXTR polymorphism in depression and completed suicide-A study on a large population sample.

    PubMed

    Wasilewska, Krystyna; Pawlak, Aleksandra; Kostrzewa, Grażyna; Sobczyk-Kopcioł, Agnieszka; Kaczorowska, Aleksandra; Badowski, Jarosław; Brzozowska, Małgorzata; Drygas, Wojciech; Piwoński, Jerzy; Bielecki, Wojciech; Płoski, Rafał

    2017-03-01

    In the light of contradictory results concerning the OXTR polymorphism rs53576 and depression, we decided to verify the potential association between the two in 1) a large, ethnically homogeneous sample of 1185 individuals who completed the Beck Depression Inventory (BDI), as well as 2) a sample of 763 suicide victims. In the population sample, AA males showed significantly lower BDI scores (p=0.005, p_cor=0.030). Exploratory analyses suggested that this effect was limited to a subgroup within the 0-9 BDI score range (p=0.0007, Mann-Whitney U test), whereas no main effect on depressive symptoms (BDI>9) was found. In the suicide sample no association with the rs53576 genotype was present. Exploratory analyses in suicides revealed a higher blood alcohol concentration (BAC) among AA than GG/GA males (p=0.014, Mann-Whitney U test). Our results show that the OXTR rs53576 variant modulates mood in male individuals and may positively correlate with alcohol intake among male suicides, but is not associated with suicide or depression. The study adds to the growing knowledge of rs53576 genotype characteristics. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Chi-Squared Test of Fit and Sample Size-A Comparison between a Random Sample Approach and a Chi-Square Value Adjustment Method.

    PubMed

    Bergh, Daniel

    2015-01-01

    Chi-square statistics are commonly used for tests of fit of measurement models. Chi-square is also sensitive to sample size, which is why several approaches to handling large samples in test-of-fit analysis have been developed. One strategy to handle the sample size problem is to adjust the sample size in the analysis of fit. An alternative is to adopt a random sample approach. The purpose of this study was to analyze and compare these two strategies using simulated data. Given an original sample size of 21,000, for reductions of sample size down to the order of 5,000 the adjusted sample size function works as well as the random sample approach. In contrast, when applying adjustments to sample sizes of a lower order, the adjustment function is less effective at approximating the chi-square value for an actual random sample of the relevant size. Hence, the fit is exaggerated and misfit underestimated when using the adjusted sample size function. Although there are big differences in chi-square values between the two approaches at lower sample sizes, the inferences based on the p-values may be the same.
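
    The sample-size adjustment strategy can be illustrated with a linear (n − 1)-proportional rescaling of the chi-square statistic. This is one common form of such adjustments and may differ in detail from the function evaluated in the study:

```python
def adjust_chi2(chi2, n_orig, n_adj):
    """Rescale a chi-square test-of-fit statistic from the original sample
    size n_orig to a smaller nominal sample size n_adj, in proportion to
    (n - 1). A common linear adjustment; illustrative only."""
    return chi2 * (n_adj - 1) / (n_orig - 1)
```

    For example, a statistic of 4200 at n = 21,000 rescales to roughly 1000 at n = 5,000, the regime in which the study finds the adjustment and the random-sample approach to agree; the study's point is that this agreement breaks down for much smaller nominal sizes.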

  9. Acceptance sampling for attributes via hypothesis testing and the hypergeometric distribution

    NASA Astrophysics Data System (ADS)

    Samohyl, Robert Wayne

    2017-10-01

    This paper questions some aspects of attribute acceptance sampling in light of the original concepts of hypothesis testing from Neyman and Pearson (NP). Attribute acceptance sampling in industry, as developed by Dodge and Romig (DR), generally follows the international standards of ISO 2859, and similarly the Brazilian standards NBR 5425 to NBR 5427 and the United States standards ANSI/ASQC Z1.4. The paper evaluates and extends the area of acceptance sampling in two directions. First, by suggesting the use of the hypergeometric distribution to calculate the parameters of sampling plans, avoiding the unnecessary use of approximations such as the binomial or Poisson distributions. We show that, under usual conditions, discrepancies can be large. The conclusion is that the hypergeometric distribution, ubiquitously available in commonly used software, is more appropriate than other distributions for acceptance sampling. Second, and more importantly, we elaborate the theory of acceptance sampling in terms of hypothesis testing, rigorously following the original concepts of NP. By offering a common theoretical structure, hypothesis testing from NP can produce a better understanding of applications even beyond the usual areas of industry and commerce, such as public health and political polling. With the new procedures, both sample size and sample error can be reduced. What is unclear in traditional acceptance sampling is the necessity of linking the acceptable quality limit (AQL) exclusively to the producer and the lot quality percent defective (LTPD) exclusively to the consumer. In reality, the consumer should also be preoccupied with a value of AQL, as should the producer with LTPD. Furthermore, we can also question why type I error is always uniquely associated with the producer as producer risk, and likewise the same question arises with consumer risk, which is necessarily associated with type II error. The resolution of these questions is new to the literature.
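
    The hypergeometric calculation the paper advocates is straightforward with exact binomial coefficients. A sketch of one operating-characteristic point for a plan (n, c), alongside the binomial approximation it replaces:

```python
from math import comb

def hypergeom_pmf(k, N, K, n):
    """P(exactly k defectives in a sample of n drawn without replacement
    from a lot of N items containing K defectives)."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

def accept_prob(N, K, n, c):
    """Operating-characteristic point: probability that a lot with K
    defectives is accepted under plan (n, c), i.e. at most c defects found."""
    return sum(hypergeom_pmf(k, N, K, n) for k in range(c + 1))

def binomial_accept_prob(p, n, c):
    """The with-replacement (binomial) approximation for comparison."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))
```

    For a lot of N = 50 with K = 5 defectives and a plan (n = 10, c = 1), the exact acceptance probability is about 0.742 versus about 0.736 from the binomial approximation; the discrepancy grows as the sample becomes a larger fraction of the lot, which is the paper's case for using the hypergeometric directly.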

  10. Large-scale magnetic fields at high Reynolds numbers in magnetohydrodynamic simulations.

    PubMed

    Hotta, H; Rempel, M; Yokoyama, T

    2016-03-25

    The 11-year solar magnetic cycle shows a high degree of coherence in spite of the turbulent nature of the solar convection zone. It has been found in recent high-resolution magnetohydrodynamics simulations that the maintenance of a large-scale coherent magnetic field is difficult with small viscosity and magnetic diffusivity (≲10^12 square centimeters per second). We reproduced previous findings that indicate a reduction of the energy in the large-scale magnetic field for lower diffusivities, and we demonstrate the recovery of the global-scale magnetic field using unprecedentedly high resolution. We found an efficient small-scale dynamo that suppresses small-scale flows, which mimics the properties of large diffusivity. As a result, the global-scale magnetic field is maintained even in the regime of small diffusivities, that is, large Reynolds numbers. Copyright © 2016, American Association for the Advancement of Science.

  11. Ground test experiment for large space structures

    NASA Technical Reports Server (NTRS)

    Tollison, D. K.; Waites, H. B.

    1985-01-01

    In recent years a new body of control theory has been developed for the design of control systems for Large Space Structures (LSS). The problems of testing this theory on LSS hardware are aggravated by the expense and risk of actual in orbit tests. Ground tests on large space structures can provide a proving ground for candidate control systems, but such tests require a unique facility for their execution. The current development of such a facility at the NASA Marshall Space Flight Center (MSFC) is the subject of this report.

  12. Numerically modelling the large scale coronal magnetic field

    NASA Astrophysics Data System (ADS)

    Panja, Mayukh; Nandi, Dibyendu

    2016-07-01

    The solar corona spews out vast amounts of magnetized plasma into the heliosphere, which has a direct impact on the Earth's magnetosphere. It is therefore important that we develop an understanding of the dynamics of the solar corona. With present technology it has not been possible to generate 3D magnetic maps of the solar corona; this warrants the use of numerical simulations to study the coronal magnetic field. A very popular method is to extrapolate the photospheric magnetic field using NLFF or PFSS codes. However, the extrapolations at different time intervals are completely independent of each other and do not capture the temporal evolution of magnetic fields. On the other hand, full MHD simulations of the global coronal field, apart from being computationally very expensive, would be physically less transparent, owing to the large number of free parameters typically used in such codes. This brings us to the magnetofrictional model, which is simpler and computationally more economical. We have developed a magnetofrictional model in 3D spherical polar coordinates to study the large-scale global coronal field. Here we present studies of changing connectivities between active regions in response to photospheric motions.

  13. Field testing a soil site field guide for Allegheny hardwoods

    Treesearch

    S.B. Jones

    1991-01-01

    A site quality evaluation decision model, developed for Allegheny hardwoods on the non-glaciated Allegheny Plateau of Pennsylvania and New York, was field tested by International Paper (IP) foresters and the author, on sites within the region of derivation and on glaciated sites north and west of the Wisconsin drift line. Results from the field testing are presented...

  14. Large Payload Ground Transportation and Test Considerations

    NASA Technical Reports Server (NTRS)

    Rucker, Michelle A.

    2016-01-01

    During test and verification planning for the Altair lunar lander project, a National Aeronautics and Space Administration (NASA) study team identified several ground transportation and test issues related to the large payload diameter. Although the entire Constellation Program-including Altair-has since been canceled, issues identified by the Altair project serve as important lessons learned for payloads greater than 7 m diameter being considered for NASA's new Space Launch System (SLS). A transportation feasibility study found that Altair's 8.97 m diameter Descent Module would not fit inside available aircraft. Although the Ascent Module cabin was only 2.35 m diameter, the long reaction control system booms extended nearly to the Descent Module diameter, making it equally unsuitable for air transportation without removing the booms and invalidating assembly workmanship screens or acceptance testing that had already been performed. Ground transportation of very large payloads over extended distances is not generally permitted by most states, so overland transportation alone would not be an option. Limited ground transportation to the nearest waterway may be possible, but water transportation could take as long as 66 days per production unit, depending on point of origin and acceptance test facility; transportation from the western United States would require transit through the Panama Canal to access the Kennedy Space Center launch site. Large payloads also pose acceptance test and ground processing challenges. Although propulsion, mechanical vibration, and reverberant acoustic test facilities at NASA's Plum Brook Station have been designed to accommodate large spacecraft, special handling and test work-arounds may be necessary, which could increase cost, schedule, and technical risk. Once at the launch site, there are no facilities currently capable of accommodating the combination of large payload size and hazardous processing such as hypergolic fuels

  15. A graphical method for determining the dry-depositional component of aerosol samples and their field blanks

    NASA Astrophysics Data System (ADS)

    Huang, Suilou; Rahn, Kenneth A.; Arimoto, Richard

    During the Atmosphere/Ocean Chemistry Experiment (AEROCE), field blanks of certain elements in aerosol samples occasionally increased abruptly, always during periods of unusually high atmospheric concentrations. We hypothesized that the anomalous blanks were created by coarse aerosol entering the sampling shelters and depositing onto the blank filters. If so, samples taken nearby should have been similarly affected. To test this hypothesis, we developed a simple graphical method in which elemental masses in field blanks are plotted against elemental masses in pumped samples, and zones of proportionality between the two are sought. Data from Bermuda and Mace Head (coastal western Ireland) confirmed that depositional zones did indeed appear, but only for coarse-particle elements and only under certain conditions. Actual increases of crustal and pollution-derived elements agreed well with values predicted from settling velocities and sampling rates: blanks increased up to an order of magnitude or more but samples by less than 1%. Marine elements behaved like crustal elements in most samples but occasionally were much more enriched: blanks increased up to 30-fold and samples up to about 3%. It thus appears that when coarse-particle elements are present in high concentrations, their field blanks and samples may be measurably affected by dry deposition. Depending on the elements of interest, this dry deposition may have to be measured and the concentrations corrected.

  16. Large-Scale Academic Achievement Testing of Deaf and Hard-of-Hearing Students: Past, Present, and Future

    ERIC Educational Resources Information Center

    Qi, Sen; Mitchell, Ross E.

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using the Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the…

  17. Large-Scale Hybrid Motor Testing. Chapter 10

    NASA Technical Reports Server (NTRS)

    Story, George

    2006-01-01

    Hybrid rocket motors can be successfully demonstrated at a small scale virtually anywhere. Many suitcase-sized portable test stands have been assembled for demonstrations of hybrids, showing the safety of hybrid rockets to audiences. These small show motors and small laboratory-scale motors can give comparative burn-rate data for the development of different fuel/oxidizer combinations. However, the questions always asked when hybrids are mentioned for large-scale applications are: how do they scale, and has it been shown in a large motor? To answer those questions, large-scale motor testing is required to verify the hybrid motor at its true size. The necessity of conducting large-scale hybrid rocket motor tests to validate the burn rate from small motors at application size has been documented in several places. Comparison of small-scale hybrid data to larger-scale data indicates that the fuel burn rate goes down with increasing port size, even at the same oxidizer flux. This trend holds for conventional hybrid motors with forward oxidizer injection and HTPB-based fuels. While the reason this occurs would make a great study in itself, it is not thoroughly understood at this time. Potential causes include the fact that, since hybrid combustion is boundary-layer driven, larger port sizes reduce the interaction (radiation, mixing, and heat transfer) with the core region of the port. This chapter focuses on some of the large, prototype-sized testing of hybrid motors. The largest motors tested have been AMROC's 250K-lbf thrust motor at Edwards Air Force Base and the Hybrid Propulsion Demonstration Program's 250K-lbf thrust motor at Stennis Space Center. Numerous smaller tests were performed to support the burn-rate, stability, and scaling concepts that went into the development of those large motors.

  18. Connecting the large- and the small-scale magnetic fields of solar-like stars

    NASA Astrophysics Data System (ADS)

    Lehmann, L. T.; Jardine, M. M.; Mackay, D. H.; Vidotto, A. A.

    2018-05-01

    A key question in understanding the observed magnetic field topologies of cool stars is the link between the small- and the large-scale magnetic field and the influence of the stellar parameters on the magnetic field topology. We examine various simulated stars to connect the small-scale with the observable large-scale field. The highly resolved 3D simulations we used couple a flux transport model with a non-potential coronal model using a magnetofrictional technique. The surface magnetic field of these simulations is decomposed into spherical harmonics, which enables us to analyse the magnetic field topologies on a wide range of length scales and to filter the large-scale magnetic field for a direct comparison with the observations. We show that the large-scale field of the self-consistent simulations fits the observed solar-like stars and is mainly set up by the global dipolar field and the large-scale properties of the flux pattern, e.g. the averaged latitudinal position of the emerging small-scale field and its global polarity pattern. The stellar parameters (flux emergence rate, differential rotation and meridional flow) affect the large-scale magnetic field topology. An increased flux emergence rate increases the magnetic flux in all field components, and an increased differential rotation increases the toroidal field fraction by decreasing the poloidal field. The meridional flow affects the distribution of the magnetic energy across the spherical harmonic modes.

  19. Renormalizable Quantum Field Theories in the Large-N Limit

    NASA Astrophysics Data System (ADS)

    Guruswamy, Sathya

    1995-01-01

    In this thesis, we study two examples of renormalizable quantum field theories in the large-N limit. Chapter one is a general introduction describing physical motivations for studying such theories. In chapter two, we describe the large-N method in field theory and discuss the pioneering work of 't Hooft in large-N two-dimensional Quantum Chromodynamics (QCD). In chapter three we study a spherically symmetric approximation to four-dimensional QCD ('spherical QCD'). We recast spherical QCD into a bilocal (constrained) theory of hadrons which in the large-N limit is equivalent to large-N spherical QCD for all energy scales. The linear approximation to this theory gives an eigenvalue equation which is the analogue of the well-known 't Hooft's integral equation in two dimensions. This eigenvalue equation is a scale invariant one and therefore leads to divergences in the theory. We give a non-perturbative renormalization prescription to cure this and obtain a beta function which shows that large-N spherical QCD is asymptotically free. In chapter four, we review the essentials of conformal field theories in two and higher dimensions, particularly in the context of critical phenomena. In chapter five, we study the O(N) non-linear sigma model on three-dimensional curved spaces in the large-N limit and show that there is a non-trivial ultraviolet stable critical point at which it becomes conformally invariant. We study this model at this critical point on examples of spaces of constant curvature and compute the mass gap in the theory, the free energy density (which turns out to be a universal function of the information contained in the geometry of the manifold) and the two-point correlation functions. The results we get give an indication that this model is an example of a three-dimensional analogue of a rational conformal field theory. A conclusion with a brief summary and remarks follows at the end.

  20. A field test of three LQAS designs to assess the prevalence of acute malnutrition.

    PubMed

    Deitchler, Megan; Valadez, Joseph J; Egge, Kari; Fernandez, Soledad; Hennigan, Mary

    2007-08-01

    The conventional method for assessing the prevalence of Global Acute Malnutrition (GAM) in emergency settings is the 30 x 30 cluster-survey. This study describes alternative approaches: three Lot Quality Assurance Sampling (LQAS) designs to assess GAM. The LQAS designs were field-tested and their results compared with those from a 30 x 30 cluster-survey. Computer simulations confirmed that small clusters instead of a simple random sample could be used for LQAS assessments of GAM. Three LQAS designs were developed (33 x 6, 67 x 3, Sequential design) to assess GAM thresholds of 10, 15 and 20%. The designs were field-tested simultaneously with a 30 x 30 cluster-survey in Siraro, Ethiopia during June 2003. Using a nested study design, anthropometric, morbidity and vaccination data were collected on all children 6-59 months in sampled households. Hypothesis tests about GAM thresholds were conducted for each LQAS design. Point estimates were obtained for the 30 x 30 cluster-survey and the 33 x 6 and 67 x 3 LQAS designs. Hypothesis tests showed GAM as <10% for the 33 x 6 design and GAM as ≥10% for the 67 x 3 and Sequential designs. Point estimates for the 33 x 6 and 67 x 3 designs were similar to those of the 30 x 30 cluster-survey for GAM (6.7%, CI = 3.2-10.2%; 8.2%, CI = 4.3-12.1%; 7.4%, CI = 4.8-9.9%) and all other indicators. The CIs for the LQAS designs were only slightly wider than the CIs for the 30 x 30 cluster-survey; yet the LQAS designs required substantially less time to administer. The LQAS designs provide statistically appropriate alternatives to the more time-consuming 30 x 30 cluster-survey. However, additional field-testing is needed using independent samples rather than a nested study design.
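    The classification logic behind an LQAS hypothesis test can be illustrated with a simple binomial sketch. Reading "33 x 6" as 33 clusters of 6 children gives a total sample of 198; the decision rule d = 15 is a hypothetical choice for a 10% GAM threshold, not the study's actual rule, and the binomial calculation ignores the design effect that cluster sampling introduces:

```python
from math import comb

def prob_at_least(n, d, p):
    """P(X >= d) for X ~ Binomial(n, p). A simple-random-sample model;
    clustered designs would inflate the variance (design effect)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d, n + 1))

# Classify an area as GAM >= 10% if at least d of n children are wasted.
n, d = 198, 15  # 33 clusters x 6 children; d is a hypothetical rule
alpha = prob_at_least(n, d, 0.05)  # misclassification risk if true GAM is 5%
power = prob_at_least(n, d, 0.10)  # detection probability if true GAM is 10%
print(alpha, power)
```

    Varying d trades the two error probabilities against each other, which is how such designs are tuned to a chosen threshold.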

  1. Approximate sample size formulas for the two-sample trimmed mean test with unequal variances.

    PubMed

    Luh, Wei-Ming; Guo, Jiin-Huarng

    2007-05-01

    Yuen's two-sample trimmed mean test statistic is one of the most robust methods to apply when variances are heterogeneous. The present study develops formulas for the sample size required for the test. The formulas are applicable for the cases of unequal variances, non-normality and unequal sample sizes. Given the specified alpha and the power (1-beta), the minimum sample size needed by the proposed formulas under various conditions is less than is given by the conventional formulas. Moreover, given a specified size of sample calculated by the proposed formulas, simulation results show that Yuen's test can achieve statistical power which is generally superior to that of the approximate t test. A numerical example is provided.
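    Yuen's statistic itself is standard and can be sketched from its textbook definition: trimmed means in the numerator, winsorized variances in the denominator, and Welch-type degrees of freedom. This is a generic illustration of the test statistic, not the paper's sample-size formulas:

```python
import math

def yuen_statistic(x, y, trim=0.2):
    """Yuen's two-sample trimmed-mean test statistic and Welch-type df.
    `trim` is the proportion of observations cut from each tail."""
    def pieces(a):
        a = sorted(a)
        n = len(a)
        g = int(trim * n)              # observations trimmed per tail
        h = n - 2 * g                  # effective sample size
        tmean = sum(a[g:n - g]) / h    # trimmed mean
        # Winsorize: replace each trimmed value with the nearest kept value.
        wins = [a[g]] * g + a[g:n - g] + [a[n - g - 1]] * g
        wmean = sum(wins) / n
        swv = sum((v - wmean) ** 2 for v in wins) / (n - 1)  # winsorized var
        d = (n - 1) * swv / (h * (h - 1))
        return tmean, d, h
    t1, d1, h1 = pieces(x)
    t2, d2, h2 = pieces(y)
    t = (t1 - t2) / math.sqrt(d1 + d2)
    df = (d1 + d2) ** 2 / (d1 ** 2 / (h1 - 1) + d2 ** 2 / (h2 - 1))
    return t, df
```

    Because the outliers are trimmed before the means are compared, a wild value such as 100 in an otherwise small-valued sample barely moves the statistic, which is why the test remains usable under non-normality and unequal variances.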

  2. Microbial Groundwater Sampling Protocol for Fecal-Rich Environments

    PubMed Central

    Harter, Thomas; Watanabe, Naoko; Li, Xunde; Atwill, Edward R; Samuels, William

    2014-01-01

    Inherently, confined animal farming operations (CAFOs) and other intense fecal-rich environments are potential sources of groundwater contamination by enteric pathogens. The ubiquity of microbial matter poses unique technical challenges in addition to economic constraints when sampling wells in such environments. In this paper, we evaluate a groundwater sampling protocol that relies on extended purging with a portable submersible stainless steel pump and Teflon® tubing as an alternative to equipment sterilization. The protocol allows for collecting a large number of samples quickly, relatively inexpensively, and under field conditions with limited access to capacity for sterilizing equipment. The protocol is tested on CAFO monitoring wells and considers three cross-contamination sources: equipment, wellbore, and ambient air. For the assessment, we use Enterococcus, a ubiquitous fecal indicator bacterium (FIB), in laboratory and field tests with spiked and blank samples, and in an extensive, multi-year field sampling campaign on 17 wells within 2 CAFOs. The assessment shows that extended purging can successfully control for equipment cross-contamination, but also controls for significant contamination of the well-head, within the well casing and within the immediate aquifer vicinity of the well-screen. Importantly, our tests further indicate that Enterococcus is frequently entrained in water samples when exposed to ambient air at a CAFO during sample collection. Wellbore and air contamination pose separate challenges in the design of groundwater monitoring strategies on CAFOs that are not addressed by equipment sterilization, but require adequate QA/QC procedures and can be addressed by the proposed sampling strategy. PMID:24903186

  3. Infrared-temperature variability in a large agricultural field

    NASA Technical Reports Server (NTRS)

    Millard, J. P.; Goettelman, R. C.; Leroy, M. J.

    1981-01-01

    Dunnigan Agro-Meteorological Experiment airborne thermal scanner images of a large, varying-terrain barley field were acquired and analyzed. Temperature variability that may occur within instantaneous fields of view (IFOV) is characterized (coefficient of variation: standard deviation/mean temperature in °C), and the percentage of the area within various IFOVs lying within ±1, 2, 3, and 5 °C of the mean is determined. With the exception of very rugged terrain, over 80% of the area within 4, 16, 65 and 258 ha cells was at temperatures within ±3 °C of the mean cell temperature. Remote measurements of field temperature appeared to be only slightly influenced by pixel size in the range 4 ha to 259 ha, and the percentage of the area within any pixel lying within ±1, 2, 3, and 5 °C of the mean is nominally the same. In conclusion, no great advantage is found in utilizing a small IFOV instead of a large one for remote sensing of crop temperature.
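    The two variability measures used above can be restated concretely; the 25 pixel temperatures below are invented for illustration:

```python
import statistics

def cv_and_fraction(temps, tol=3.0):
    """Coefficient of variation (standard deviation / mean, temperatures
    in deg C) and the fraction of pixels within +/- tol deg C of the mean."""
    mean = statistics.mean(temps)
    sd = statistics.pstdev(temps)
    frac = sum(abs(t - mean) <= tol for t in temps) / len(temps)
    return sd / mean, frac

# Hypothetical 25-pixel cell of canopy temperatures (deg C):
temps = [24.1, 25.0, 23.8, 26.2, 24.9, 25.5, 24.4, 23.9, 25.1, 24.7,
         26.0, 24.3, 25.8, 24.0, 25.2, 24.6, 25.4, 23.7, 24.8, 25.6,
         24.2, 25.3, 24.5, 25.7, 26.1]
print(cv_and_fraction(temps))
```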

  4. Highly effective action from large N gauge fields

    NASA Astrophysics Data System (ADS)

    Yang, Hyun Seok

    2014-10-01

    Recently Schwarz put forward a conjecture that the world-volume action of a probe D3-brane in an AdS5×S5 background of type IIB superstring theory can be reinterpreted as the highly effective action (HEA) of four-dimensional N =4 superconformal field theory on the Coulomb branch. We argue that the HEA can be derived from the noncommutative (NC) field theory representation of the AdS/CFT correspondence and the Seiberg-Witten (SW) map defining a spacetime field redefinition between ordinary and NC gauge fields. It is based only on the well-known facts that the master fields of large N matrices are higher-dimensional NC U(1) gauge fields and the SW map is a local coordinate transformation eliminating U(1) gauge fields known as the Darboux theorem in symplectic geometry.

  5. Laboratory and field testing of bednet traps for mosquito (Diptera: Culicidae) sampling in West Java, Indonesia.

    PubMed

    Stoops, Craig A; Gionar, Yoyo R; Rusmiarto, Saptoro; Susapto, Dwiko; Andris, Heri; Elyazar, Iqbal R F; Barbara, Kathryn A; Munif, Amrul

    2010-06-01

    Surveillance of medically important mosquitoes is critical to determine the risk of mosquito-borne disease transmission. The purpose of this research was to test self-supporting, exposure-free bednet traps to survey mosquitoes. In the laboratory we tested human-baited and unbaited CDC light trap/cot bednet (CDCBN) combinations against three types of traps: the Mbita Trap (MBITA), a Tent Trap (TENT), and a modified Townes-style Malaise trap (TSM). In the laboratory, 16 runs comparing the MBITA, TSM, and TENT to the CDCBN were conducted, for a total of 48 runs of the experiment using 13,600 mosquitoes. The TENT trap collected significantly more mosquitoes than the CDCBN. The CDCBN collected significantly more than the MBITA, and there was no difference between the TSM and the CDCBN. Two field trials were conducted in Cibuntu, Sukabumi, West Java, Indonesia. The first test compared human-baited and unbaited CDCBN, TENT, and TSM traps during six nights over two consecutive weeks per month from January 2007 to September 2007, for a total of 54 trap-nights. A total of 8,474 mosquitoes representing 33 species were collected using the six trapping methods. The baited TENT trap collected significantly more mosquitoes than both the CDCBN and the TSM. The second field trial was a comparison of the baited and unbaited TENT and CDCBN traps and Human Landing Collections (HLCs). The trial was carried out from January 2008 to May 2008, for a total of 30 trap-nights. A total of 11,923 mosquitoes were collected, representing 24 species. Human Landing Collections captured significantly more mosquitoes than either the TENT or the CDCBN. The baited and unbaited TENT collected significantly more mosquitoes than the CDCBN. The TENT trap was found to be an effective, light-weight substitute for the CDC light-trap/bednet combination in the field and should be considered for use in surveys of mosquito-borne diseases such as malaria, arboviruses, and filariasis.

  6. The Expanded Large Scale Gap Test

    DTIC Science & Technology

    1987-03-01

    NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987. Approved for public release. Only fragments of the abstract are legible in this record: the expanded test is intended, in part, to reduce the spread in the LSGT 50% gap value; the worst charges are those with the highest or lowest densities and the largest re-pressed...

  7. Large-scale solar magnetic fields and H-alpha patterns

    NASA Technical Reports Server (NTRS)

    Mcintosh, P. S.

    1972-01-01

    Coronal and interplanetary magnetic fields computed from measurements of large-scale photospheric magnetic fields suffer from interruptions in day-to-day observations and the limitation of using only measurements made near the solar central meridian. Procedures were devised for inferring the lines of polarity reversal from H-alpha solar patrol photographs that map the same large-scale features found on Mt. Wilson magnetograms. These features may be monitored without interruption by combining observations from the global network of observatories associated with NOAA's Space Environment Services Center. The patterns of inferred magnetic fields may be followed accurately as far as 60 deg from central meridian. Such patterns will be used to improve predictions of coronal features during the next solar eclipse.

  8. 7 CFR 28.952 - Testing of samples.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Testing of samples. Section 28.952 of 7 CFR (Agriculture; Regulations of the Department of Agriculture; Agricultural Marketing Service — Standards, Inspections, Marketing...) provides for processing tests of the properties of cotton samples and reporting the results thereof to the persons from whom...

  9. Digital Curation of Earth Science Samples Starts in the Field

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Hsu, L.; Song, L.; Carter, M. R.

    2014-12-01

    Collection of physical samples in the field is an essential part of research in the Earth Sciences. Samples provide a basis for progress across many disciplines, from the study of global climate change now and over the Earth's history, to present and past biogeochemical cycles, to magmatic processes and mantle dynamics. The types of samples, methods of collection, and scope and scale of sampling campaigns are highly diverse, ranging from large-scale programs to drill rock and sediment cores on land, in lakes, and in the ocean, to environmental observation networks with continuous sampling, to single investigator or small team expeditions to remote areas around the globe or trips to local outcrops. Cyberinfrastructure for sample-related fieldwork needs to cater to the different needs of these diverse sampling activities, aligning with specific workflows, regional constraints such as connectivity or climate, and processing of samples. In general, digital tools should assist with capture and management of metadata about the sampling process (location, time, method) and the sample itself (type, dimension, context, images, etc.), management of the physical objects (e.g., sample labels with QR codes), and the seamless transfer of sample metadata to data systems and software relevant to the post-sampling data acquisition, data processing, and sample curation. In order to optimize CI capabilities for samples, tools and workflows need to adopt community-based standards and best practices for sample metadata, classification, identification and registration. This presentation will provide an overview and updates of several ongoing efforts that are relevant to the development of standards for digital sample management: the ODM2 project that has generated an information model for spatially-discrete, feature-based earth observations resulting from in-situ sensors and environmental samples, aligned with OGC's Observation & Measurements model (Horsburgh et al, AGU FM 2014
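    A minimal sketch of the kind of field-capture record described above, covering the sampling process (location, time, method), the sample itself, and a machine-readable label payload. All field names and values are illustrative and do not follow any specific published schema such as ODM2 or IGSN registration:

```python
import json
from datetime import datetime, timezone

# Hypothetical field-sample metadata record; a real system would follow a
# community standard for sample identification and registration.
record = {
    "sample_id": "XYZ-2014-0417",           # hypothetical local identifier
    "sample_type": "rock",
    "collection_method": "hammer, outcrop surface",
    "latitude": 35.1983,
    "longitude": -111.6513,
    "collected_utc": datetime(2014, 7, 12, 16, 40,
                              tzinfo=timezone.utc).isoformat(),
    "collector": "example collector",
    "images": ["img_0042.jpg"],
}

# Compact JSON string suitable for encoding into a printed QR label,
# so the metadata travels with the physical sample.
qr_payload = json.dumps(record, separators=(",", ":"))
print(qr_payload)
```

    The point of capturing this at collection time is that the same record can later be handed off unchanged to data-acquisition and curation systems, avoiding retranscription.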

  10. Can test fields destroy the event horizon in the Kerr–Taub–NUT spacetime?

    NASA Astrophysics Data System (ADS)

    Düztaş, Koray

    2018-02-01

    In this work we investigate if the interaction of the Kerr–Taub–NUT spacetime with test scalar and neutrino fields can lead to the destruction of the event horizon. It turns out that both extremal and nearly extremal black holes can be destroyed by scalar and neutrino fields if the initial angular momentum of the spacetime is sufficiently large relative to its mass and NUT charge. This is the first example in which a classical field satisfying the null energy condition can actually destroy an extremal black hole. For scalar fields, the modes that can lead to the destruction of the horizon are restricted to a narrow range due to superradiance. Since superradiance does not occur for neutrino fields, the destruction of the horizon by neutrino fields is generic, and it cannot be fixed by backreaction effects. We also show that the extremal black holes that can be destroyed by scalar fields correspond to naked singularities in the Kerr limit, in accord with the previous results which imply that extremal Kerr black holes cannot be destroyed by scalar test fields.

  11. Development testing of large volume water sprays for warm fog dispersal

    NASA Technical Reports Server (NTRS)

    Keller, V. W.; Anderson, B. J.; Burns, R. A.; Lala, G. G.; Meyer, M. B.; Beard, K. V.

    1986-01-01

    A new brute-force method of warm fog dispersal is described. The method uses large volume recycled water sprays to create curtains of falling drops through which the fog is processed by the ambient wind and spray induced air flow. Fog droplets are removed by coalescence/rainout. The efficiency of the technique depends upon the drop size spectra in the spray, the height to which the spray can be projected, the efficiency with which fog laden air is processed through the curtain of spray, and the rate at which new fog may be formed due to temperature differences between the air and spray water. Results of a field test program, implemented to develop the data base necessary to assess the proposed method, are presented. Analytical calculations based upon the field test results indicate that this proposed method of warm fog dispersal is feasible. Even more convincingly, the technique was successfully demonstrated in the one natural fog event which occurred during the test program. Energy requirements for this technique are an order of magnitude less than those to operate a thermokinetic system. An important side benefit is the considerable emergency fire extinguishing capability it provides along the runway.

  12. Field Geologic Observation and Sample Collection Strategies for Planetary Surface Exploration: Insights from the 2010 Desert RATS Geologist Crewmembers

    NASA Technical Reports Server (NTRS)

    Hurtado, Jose M., Jr.; Young, Kelsey; Bleacher, Jacob E.; Garry, W. Brent; Rice, James W., Jr.

    2012-01-01

    Observation is the primary role of all field geologists, and geologic observations put into an evolving conceptual context will be the most important data stream relayed to Earth during a planetary exploration mission. Sample collection is also an important planetary field activity, and its success is closely tied to the quality of contextual observations. To test protocols for doing effective planetary geologic fieldwork, the Desert RATS (Research and Technology Studies) project deployed two prototype rovers for two weeks of simulated exploratory traverses in the San Francisco volcanic field of northern Arizona. The authors of this paper represent the geologist crew members who participated in the 2010 field test. We document the procedures adopted for Desert RATS 2010 and report on our experiences regarding these protocols. Careful consideration must be given to various issues that impact the interplay between field geologic observations and sample collection, including time management; strategies related to duplication of samples and observations; logistical constraints on the volume and mass of samples and the volume/transfer of data collected; and paradigms for evaluation of mission success. We find that the 2010 field protocols brought to light important aspects of each of these issues, and we recommend best practices and modifications to training and operational protocols to address them. Underlying our recommendations is the recognition that the capacity of the crew to flexibly execute their activities is paramount. Careful design of mission parameters, especially field geologic protocols, is critical for enabling the crews to successfully meet their science objectives.

  13. 21 CFR 864.3260 - OTC test sample collection systems for drugs of abuse testing.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    OTC test sample collection systems for drugs of abuse testing. Section 864.3260 of 21 CFR (Food and Drugs; ... Instrumentation and Accessories). (a) Identification. An over-the-counter (OTC) test sample collection system for drugs of abuse testing is a device...

  14. Integration and verification testing of the Large Synoptic Survey Telescope camera

    NASA Astrophysics Data System (ADS)

    Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.

    2016-08-01

    We present an overview of the Integration and Verification Testing activities of the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Laboratory (SLAC). The LSST Camera, the sole instrument for LSST and now under construction, comprises a 3.2-gigapixel imager and a three-element corrector with a 3.5-degree-diameter field of view. LSST Camera integration and test will take place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for integration and test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of integration and verification hardware systems will be discussed, highlighting some of the technical challenges anticipated.

  15. Large-field high-resolution mosaic movies

    NASA Astrophysics Data System (ADS)

    Hammerschlag, Robert H.; Sliepen, Guus; Bettonvil, Felix C. M.; Jägers, Aswin P. L.; Sütterlin, Peter; Martin, Sara F.

    2012-09-01

    Movies with fields-of-view larger than normal for high-resolution telescopes will give a better understanding of processes on the Sun, such as filament and active region developments and their possible interactions. New active regions can influence, by their emergence, their environment to the extent of possibly serving as an igniter of the eruption of a nearby filament. A method to create a large field-of-view is to join several fields-of-view into a mosaic. Fields are imaged quickly one after another using fast telescope-pointing. Such a pointing cycle has been automated at the Dutch Open Telescope (DOT), a high-resolution solar telescope located on the Canary Island La Palma. The observer can draw with the computer mouse the desired total field in the guider-telescope image of the whole Sun. The guider telescope is equipped with an H-alpha filter and electronic enhancement of contrast in the image for good visibility of filaments and prominences. The number and positions of the subfields are calculated automatically and represented by an array of bright points indicating the subfield centers inside the drawn rectangle of the total field on the computer screen with the whole-sun image. When the exposures start the telescope repeats automatically the sequence of subfields. Automatic production of flats is also programmed including defocusing and fast motion over the solar disk of the image field. For the first time mosaic movies were programmed from stored information on automated telescope motions from one field to the next. The mosaic movies fill the gap between whole-sun images with limited resolution of synoptic telescopes including space instruments and small-field high-cadence movies of high-resolution solar telescopes.

  16. Problems with sampling desert tortoises: A simulation analysis based on field data

    USGS Publications Warehouse

    Freilich, J.E.; Camp, R.J.; Duda, J.J.; Karl, A.E.

    2005-01-01

    The desert tortoise (Gopherus agassizii) was listed as a U.S. threatened species in 1990 based largely on population declines inferred from mark-recapture surveys of 2.59-km2 (1-mi2) plots. Since then, several census methods have been proposed and tested, but all methods still pose logistical or statistical difficulties. We conducted computer simulations using actual tortoise location data from two 1-mi2 plot surveys in southern California, USA, to identify strengths and weaknesses of current sampling strategies. We considered tortoise population estimates based on these plots as "truth" and then tested various sampling methods based on sampling smaller plots or transect lines passing through the mile squares. Data were analyzed using Schnabel's mark-recapture estimate and program CAPTURE. Experimental subsampling with replacement of the 1-mi2 data using 1-km2 and 0.25-km2 plot boundaries produced data sets of smaller plot sizes, which we compared to estimates from the 1-mi2 plots. We also tested distance sampling by saturating a 1-mi2 site with computer-simulated transect lines, once again evaluating bias in density estimates. Subsampling estimates from 1-km2 plots did not differ significantly from the estimates derived at 1-mi2. The 0.25-km2 subsamples significantly overestimated population sizes, chiefly because too few recaptures were made. Distance sampling simulations were biased 80% of the time and had high coefficient-of-variation-to-density ratios. Furthermore, a prospective power analysis suggested limited ability to detect population declines as high as 50%. We concluded that the poor performance and bias of both sampling procedures were driven by insufficient sample size, suggesting that all efforts must be directed to increasing the numbers found in order to produce reliable results. Our results suggest that present methods may not be capable of accurately estimating desert tortoise populations.
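    The abstract names Schnabel's mark-recapture estimate as one of its analysis methods. As a hedged illustration only (not the study's code, and with made-up capture data), the standard Chapman-modified Schnabel estimator can be sketched in a few lines of Python:

    ```python
    # Sketch of the Chapman-modified Schnabel mark-recapture estimator.
    # The capture/recapture counts below are illustrative, not field data.
    def schnabel_estimate(catches, recaptures):
        """Estimate population size N from repeated capture occasions.

        catches[t]    : animals caught on occasion t
        recaptures[t] : how many of those were already marked
        """
        marked_at_large = 0        # M_t: marked animals in the population
        numerator = 0.0            # sum of C_t * M_t over occasions
        total_recaptures = 0       # sum of R_t
        for c_t, r_t in zip(catches, recaptures):
            numerator += c_t * marked_at_large
            total_recaptures += r_t
            marked_at_large += c_t - r_t   # newly marked animals released
        return numerator / (total_recaptures + 1)  # Chapman correction

    # Toy survey: four capture occasions on a hypothetical plot.
    n_hat = schnabel_estimate([30, 28, 25, 27], [0, 8, 11, 14])
    print(round(n_hat))
    ```

    Few recaptures inflate the estimate's variance, which matches the abstract's finding that the 0.25-km2 subsamples failed chiefly because too few recaptures were made.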

  17. Large-scale academic achievement testing of deaf and hard-of-hearing students: past, present, and future.

    PubMed

    Qi, Sen; Mitchell, Ross E

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the validity and reliability of using the Stanford for this special student population still require extensive scrutiny. Recent shifts in educational policy environment, which require that schools enable all children to achieve proficiency through accountability testing, warrants a close examination of the adequacy and relevance of the current large-scale testing of deaf and hard-of-hearing students. This study has three objectives: (a) it will summarize the historical data over the last three decades to indicate trends in academic achievement for this special population, (b) it will analyze the current federal laws and regulations related to educational testing and special education, thereby identifying gaps between policy and practice in the field, especially identifying the limitations of current testing programs in assessing what deaf and hard-of-hearing students know, and (c) it will offer some insights and suggestions for future testing programs for deaf and hard-of-hearing students.

  18. A novel computational approach towards the certification of large-scale boson sampling

    NASA Astrophysics Data System (ADS)

    Huh, Joonsuk

    Recent proposals of boson sampling and the corresponding experiments exhibit the possible disproof of the extended Church-Turing thesis. Furthermore, the application of boson sampling to molecular computation has been suggested theoretically. Until now, however, only small-scale experiments with a few photons have been successfully performed. Boson sampling experiments with 20-30 photons are expected to reveal the computational superiority of the quantum device. A novel theoretical proposal for large-scale boson sampling using microwave photons is highly promising due to its deterministic photon sources and scalability. Therefore, a certification protocol for large-scale boson sampling experiments should be presented to complete the exciting story. We propose, in this presentation, a computational protocol towards the certification of large-scale boson sampling. The correlations of paired photon modes and the time-dependent characteristic functional with its Fourier component can show the fingerprint of large-scale boson sampling. This work was supported by Basic Science Research Program through the National Research Foundation of Korea(NRF) funded by the Ministry of Education, Science and Technology(NRF-2015R1A6A3A04059773), the ICT R&D program of MSIP/IITP [2015-019, Fundamental Research Toward Secure Quantum Communication] and Mueunjae Institute for Chemistry (MIC) postdoctoral fellowship.
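    For context on why certification is hard: boson-sampling output probabilities are proportional to |Perm(A)|^2 of submatrices of the interferometer unitary, and the matrix permanent is classically expensive (the best exact methods scale as O(2^n · n)). A hedged sketch of Ryser's formula, a standard exact permanent algorithm for small n (illustrative only; not part of the certification protocol the abstract proposes):

    ```python
    # Ryser's formula for the matrix permanent, the quantity whose
    # evaluation underlies the classical hardness of boson sampling.
    from itertools import combinations

    def permanent(matrix):
        n = len(matrix)
        total = 0.0
        for k in range(1, n + 1):              # column subsets, by size
            sign = (-1) ** (n - k)
            for cols in combinations(range(n), k):
                prod = 1.0
                for row in matrix:             # product of row sums over the subset
                    prod *= sum(row[c] for c in cols)
                total += sign * prod
        return total

    # Permanent of the all-ones 3x3 matrix is 3! = 6.
    print(permanent([[1, 1, 1], [1, 1, 1], [1, 1, 1]]))  # → 6.0
    ```

    The exponential cost of this computation for 20-30 modes is precisely what makes indirect fingerprints, such as the mode correlations the abstract proposes, attractive for certification.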

  19. Preliminary Investigation of Curved Liner Sample in the NASA LaRC Curved Duct Test Rig

    NASA Technical Reports Server (NTRS)

    Gerhold, Carl H.; Jones, Michael G.; Brown, Martha C.

    2007-01-01

    This viewgraph presentation reviews the preliminary investigation of a curved liner sample in the NASA LaRC Curved Duct Test Rig (CDTR). The purpose of the CDTR is to develop the capability to investigate acoustic and aerodynamic properties in ducts. It has several features to accomplish that purpose: (1) large scale, (2) flow rate to M = 0.275, (3) higher-order mode control, (4) curved flow path, (5) adaptable test section, and (6) flexible test configurations. The liner has minimal effect on turbulence or boundary layer growth in the duct. The curved duct sample attenuation is affected by mode scattering. In conclusion, the CDTR is a valid tool for aerodynamic and acoustic evaluation of duct treatment.

  20. Analyzing Educational Testing Service Graduate Major Field Test Results

    ERIC Educational Resources Information Center

    Thornton, Barry; Arbogast, Gordon

    2012-01-01

    The Educational Testing Service (ETS) created the Graduate Major Field Test in Business (GMFT-B) for MBA students. This test is administered to all MBA classes at Jacksonville University for the purpose of measuring student academic achievement and growth, as well as to assess educational outcomes. The test is given in the capstone course,…

  1. IN SITU FIELD TESTING OF PROCESSES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.S.Y. YANG

    2004-11-08

    and analyses provide data useful for refining and confirming the understanding of flow, drift seepage, and transport processes in the UZ. The UZ testing activities included measurement of permeability distribution, quantification of the seepage of water into the drifts, evaluation of fracture-matrix interaction, study of flow along faults, testing of flow and transport between drifts, characterization of hydrologic heterogeneity along drifts, estimation of drying effects on the rock surrounding the drifts due to ventilation, monitoring of moisture conditions in open and sealed drifts, and determination of the degree of minimum construction water migration below drift. These field tests were conducted in two underground drifts at Yucca Mountain, the Exploratory Studies Facility (ESF) drift, and the cross-drift for Enhanced Characterization of the Repository Block (ECRB), as described in Section 1.2. Samples collected in boreholes and underground drifts have been used for additional hydrochemical and isotopic analyses for additional understanding of the UZ setting. The UZ transport tests conducted at the nearby Busted Butte site (see Figure 1-4) are also described in this scientific analysis report.

  2. 40 CFR 205.57-2 - Test vehicle sample selection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Test vehicle sample selection. 205.57... vehicle sample selection. (a) Vehicles comprising the batch sample which are required to be tested... test request from a batch of vehicles of the category or configuration specified in the test request...

  3. Large-scale modeling of rain fields from a rain cell deterministic model

    NASA Astrophysics Data System (ADS)

    Féral, Laurent; Sauvageot, Henri; Castanet, Laurent; Lemorton, Joël; Cornet, Frédéric; Leconte, Katia

    2006-04-01

    A methodology to simulate two-dimensional rain rate fields at large scale (1000 × 1000 km2, the scale of a satellite telecommunication beam or a terrestrial fixed broadband wireless access network) is proposed. It relies on a rain rate field cellular decomposition. At small scale (˜20 × 20 km2), the rain field is split up into its macroscopic components, the rain cells, described by the Hybrid Cell (HYCELL) cellular model. At midscale (˜150 × 150 km2), the rain field results from the conglomeration of rain cells modeled by HYCELL. To account for the rain cell spatial distribution at midscale, the latter is modeled by a doubly aggregative isotropic random walk, the optimal parameterization of which is derived from radar observations at midscale. The extension of the simulation area from the midscale to the large scale (1000 × 1000 km2) requires the modeling of the weather frontal area. The latter is first modeled by a Gaussian field with anisotropic covariance function. The Gaussian field is then turned into a binary field, giving the large-scale locations over which it is raining. This transformation requires the definition of the rain occupation rate over large-scale areas. Its probability distribution is determined from observations by the French operational radar network ARAMIS. The coupling with the rain field modeling at midscale is immediate whenever the large-scale field is split up into midscale subareas. The rain field thus generated accounts for the local CDF at each point, defining a structure spatially correlated at small scale, midscale, and large scale. It is then suggested that this approach be used by system designers to evaluate diversity gain, terrestrial path attenuation, or slant path attenuation for different azimuth and elevation angle directions.
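    The large-scale step described above, turning a correlated Gaussian field into a binary "raining / not raining" mask whose wet fraction matches a prescribed occupation rate, can be sketched in pure Python. This is an illustrative toy, with assumptions: a small grid, isotropic moving-average smoothing in place of the paper's anisotropic covariance function, and an arbitrary 10% occupation rate rather than the ARAMIS-derived distribution.

    ```python
    # Toy version of the Gaussian-to-binary rain-occupation transformation:
    # smooth white noise into a correlated field, then threshold it at the
    # quantile that leaves the target fraction of cells "raining".
    import random

    def binary_rain_mask(n, occupation_rate, smooth=2, seed=1):
        random.seed(seed)
        white = [[random.gauss(0.0, 1.0) for _ in range(n)] for _ in range(n)]
        # Periodic moving-average smoothing stands in for the covariance model.
        field = [[0.0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                vals = [white[(i + di) % n][(j + dj) % n]
                        for di in range(-smooth, smooth + 1)
                        for dj in range(-smooth, smooth + 1)]
                field[i][j] = sum(vals) / len(vals)
        # Threshold so that `occupation_rate` of cells exceed the cut.
        flat = sorted(v for row in field for v in row)
        cut = flat[int((1.0 - occupation_rate) * len(flat))]
        return [[1 if v > cut else 0 for v in row] for row in field]

    mask = binary_rain_mask(64, 0.10)
    wet = sum(map(sum, mask))
    print(wet / (64 * 64))  # close to the 0.10 target occupation rate
    ```

    In the paper's pipeline each wet region of such a mask would then be populated with HYCELL rain cells at midscale; the sketch only covers the thresholding step.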

  4. Large-field high-contrast hard x-ray Zernike phase-contrast nano-imaging beamline at Pohang Light Source.

    PubMed

    Lim, Jun; Park, So Yeong; Huang, Jung Yun; Han, Sung Mi; Kim, Hong-Tae

    2013-01-01

    We developed an off-axis-illuminated zone-plate-based hard x-ray Zernike phase-contrast microscope beamline at Pohang Light Source. Owing to the condenser-optics-free, off-axis illumination, a large field of view was achieved. The pinhole-type Zernike phase plate affords high-contrast images of a cell with minimal artifacts such as the shade-off and halo effects. The setup, including the optics and the alignment, is simple and easy, and allows faster and easier imaging of large bio-samples.

  5. Beginning Postsecondary Students Longitudinal Study First Follow-up (BPS:96/98) Field Test Report. Working Paper Series.

    ERIC Educational Resources Information Center

    Pratt, Daniel J.; Wine, Jennifer S.; Heuer, Ruth E.; Whitmore, Roy W.; Kelly, Janice E.; Doherty, John M.; Simpson, Joe B.; Marti, Norma

    This report describes the methods and procedures used for the field test of the Beginning Postsecondary Students Longitudinal Study First Followup 1996-98 (BPS:96/98). Students in this survey were first interviewed during 1995 as part of the National Postsecondary Student Aid Study 1996 field test. The BPS:96/98 full-scale student sample includes…

  6. Amplification of large scale magnetic fields in a decaying MHD system

    NASA Astrophysics Data System (ADS)

    Park, Kiwan

    2017-10-01

    Dynamo theory explains the amplification of magnetic fields in conducting fluids (plasmas) driven by continuous external energy input. It is known that continuous nonhelical kinetic or magnetic energy amplifies the small-scale magnetic field, while helical energy, instability, or shear with a rotation effect amplifies the large-scale magnetic field. However, it was recently reported that decaying magnetic energy, independent of helicity or instability, can generate a large-scale magnetic field. This phenomenon may seem to contradict conventional dynamo theory, but it gives us some clues to the fundamental mechanism of energy transfer in magnetized conducting fluids. It also implies that an ephemeral astrophysical event emitting magnetic and kinetic energy can be a direct cause of the large-scale magnetic fields observed in space. The exact physical mechanism is not yet understood in spite of several numerical results. The plasma motion coupled with a nearly conserved vector potential in the magnetohydrodynamic (MHD) system may transfer magnetic energy to large scales. The intrinsic properties of the scaling-invariant MHD equations may also decide the direction of energy transfer. In this paper we present simulation results of inversely transferred helical and nonhelical energy in a decaying MHD system. We introduce a field structure model based on the MHD equations to show that the transfer of magnetic energy is essentially bidirectional, depending on the plasma motion and the initial energy distribution. We then derive the α coefficient algebraically, in line with the field structure model, to explain how the large-scale magnetic field is induced by helical energy in the system regardless of an external forcing source. For the algebraic analysis of nonhelical magnetic energy, we use the eddy-damped quasi-normal Markovian approximation to show the inverse transfer of magnetic energy.

  7. Large sample area and size are needed for forest soil seed bank studies to ensure low discrepancy with standing vegetation.

    PubMed

    Shen, You-xin; Liu, Wei-li; Li, Yu-hui; Guan, Hui-lin

    2014-01-01

    A large number of small-sized samples invariably shows that woody species are absent from forest soil seed banks, leading to a large discrepancy with the seedling bank on the forest floor. We ask: 1) Does this conventional sampling strategy limit the detection of seeds of woody species? 2) Are large sample areas and sample sizes needed for higher recovery of seeds of woody species? We collected 100 samples that were 10 cm (length) × 10 cm (width) × 10 cm (depth), referred to as a large number of small-sized samples (LNSS), in a 1 ha forest plot and placed them to germinate in a greenhouse, and we collected 30 samples that were 1 m × 1 m × 10 cm, referred to as a small number of large-sized samples (SNLS), and placed them (10 each) in a nearby secondary forest, shrubland and grassland. Only 15.7% of woody plant species of the forest stand were detected by the 100 LNSS, contrasting with 22.9%, 37.3% and 20.5% of woody plant species being detected by SNLS in the secondary forest, shrubland and grassland, respectively. The increase in number of species vs. sampled area confirmed power-law relationships for the forest stand, the LNSS and the SNLS at all three recipient sites. Our results, although based on one forest, indicate that the conventional LNSS did not yield a high percentage of detection for woody species, but the SNLS strategy yielded a higher percentage of detection for woody species in the seed bank if samples were exposed to a better field germination environment. A 4 m2 minimum sample area derived from the power equations is larger than the sampled area in most studies in the literature. An increased sample size is also needed to obtain an increased sample area if the number of samples is to remain relatively low.
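    The species-area power-law relationships mentioned above have the standard form S = c·A^z, and the minimum sample area follows from the fitted parameters. A hedged sketch of the usual log-log least-squares fit, using synthetic data rather than the study's measurements:

    ```python
    # Fit the species-area power law S = c * A**z by ordinary least
    # squares in log-log space. The areas/species counts are synthetic,
    # constructed from c=2, z=0.25 so the fit recovers them exactly.
    import math

    def fit_power_law(areas, species_counts):
        """Returns (c, z) from the linear fit log S = log c + z log A."""
        xs = [math.log(a) for a in areas]
        ys = [math.log(s) for s in species_counts]
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        z = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
            / sum((x - mean_x) ** 2 for x in xs)
        c = math.exp(mean_y - z * mean_x)
        return c, z

    areas = [0.01, 0.1, 1.0, 4.0]            # sample areas in m^2
    counts = [2 * a ** 0.25 for a in areas]  # exact power-law data
    c, z = fit_power_law(areas, counts)
    print(round(c, 3), round(z, 3))  # → 2.0 0.25
    ```

    Inverting the fitted law at a target species count then gives a minimum sample area, which is how a threshold like the 4 m2 figure can be derived from such a fit.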

  8. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling: Subsystem Design and Test Challenges

    NASA Technical Reports Server (NTRS)

    Jandura, Louise

    2010-01-01

    The Sample Acquisition/Sample Processing and Handling subsystem for the Mars Science Laboratory is a highly-mechanized, Rover-based sampling system that acquires powdered rock and regolith samples from the Martian surface, sorts the samples into fine particles through sieving, and delivers small portions of the powder into two science instruments inside the Rover. SA/SPaH utilizes 17 actuated degrees-of-freedom to perform the functions needed to produce 5 sample pathways in support of the scientific investigation on Mars. Both hardware redundancy and functional redundancy are employed in configuring this sampling system so that some functionality is retained even with the loss of a degree-of-freedom. Intentional dynamic environments are created to move sample while vibration isolators attenuate this environment at the sensitive instruments located near the dynamic sources. In addition to the typical flight hardware qualification test program, two additional types of testing are essential for this kind of sampling system: characterization of the intentionally-created dynamic environment and testing of the sample acquisition and processing hardware functions using Mars analog materials in a low pressure environment. The overall subsystem design and configuration are discussed along with some of the challenges, tradeoffs, and lessons learned in the areas of fault tolerance, intentional dynamic environments, and special testing.

  9. 47 CFR 73.1515 - Special field test authorizations.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Special field test authorizations. 73.1515 Section 73.1515 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES RADIO BROADCAST SERVICES Rules Applicable to All Broadcast Stations § 73.1515 Special field test authorizations. (a) A special field test...

  10. 47 CFR 73.1515 - Special field test authorizations.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Special field test authorizations. 73.1515 Section 73.1515 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES RADIO BROADCAST SERVICES Rules Applicable to All Broadcast Stations § 73.1515 Special field test authorizations. (a) A special field test...

  11. Neutron Scattering Studies on Large Length Scale Sample Structures

    NASA Astrophysics Data System (ADS)

    Feng, Hao

    Neutron scattering can be used to study the structure of matter. Depending on the sample properties of interest, different scattering techniques can be chosen. Neutron reflectivity is more often used to detect the in-depth profile of layered structures and interfacial roughness, while transmission is more sensitive to sample bulk properties. The Neutron Reflectometry (NR) technique, one technique in neutron reflectivity, is first discussed in this thesis. Both the specular reflectivity and the first-order Bragg intensity were measured in the NR experiment with a diffraction grating in order to study the in-depth and lateral structure of a sample (polymer) deposited on the grating. However, the first-order Bragg intensity alone is sometimes inadequate to determine the lateral structure, and higher-order Bragg intensities are difficult to measure using traditional neutron scattering techniques due to the low brightness of current neutron sources. The Spin Echo Small Angle Neutron Scattering (SESANS) technique overcomes this resolution problem by measuring the Fourier transforms of all the Bragg intensities, thereby measuring the real-space density correlations of samples and making length scales from a few tens of nanometers to several microns accessible. SESANS can be implemented by using two pairs of magnetic Wollaston prisms (WP), and the accessible length scale is proportional to the magnetic field intensity in the WPs. To increase the magnetic field, and thus the accessible length scale, an apparatus named the Superconducting Wollaston Prisms (SWP), which has a series of strong, well-defined shaped magnetic fields created by superconducting coils, was developed at Indiana University in 2016. Since then, various kinds of optimization have been implemented, which are addressed in this thesis. Finally, applications of SWPs in other neutron scattering techniques, such as Neutron Larmor Diffraction (NLD), are discussed.

  12. Research on the self-absorption corrections for PGNAA of large samples

    NASA Astrophysics Data System (ADS)

    Yang, Jian-Bo; Liu, Zhi; Chang, Kang; Li, Rui

    2017-02-01

    When a large sample is analysed with prompt gamma neutron activation analysis (PGNAA), neutron self-shielding and gamma self-absorption affect the accuracy; a correction method for the detection efficiency, relative to H, of each element in a large sample is described. The influences of the thickness and density of the cement samples on the H detection efficiency, as well as of the impurities Fe2O3 and SiO2 on the prompt γ-ray yield for each element in the cement samples, were studied. Phase functions for Ca, Fe, and Si relative to H as functions of sample thickness and density were provided to avoid the complicated procedure of preparing a corresponding density or thickness scale for measuring samples at each density or thickness value, and to present a simplified method for the measurement efficiency scale for prompt-gamma neutron activation analysis.

  13. Influences of Electrification and Salt on Hydrophobicity of Sample Surface in Dynamic Drop Test

    NASA Astrophysics Data System (ADS)

    Shiibara, Daiki; Arata, Yoshihiro; Haji, Kenichi; Miyake, Takuma; Sakoda, Tatsuya; Otsubo, Masahisa

    Studies on the development of deterioration/performance evaluation methods for outdoor electric insulation of polymer materials are currently being advanced within the International Council on Large Electric Systems (CIGRE). A small-scale test method (the dynamic drop test, DDT), which can easily evaluate the disappearance characteristics of hydrophobicity, has been suggested. This test evaluates the resistance of a sample to loss of hydrophobicity due to moisture and simultaneous electric stress. Various factors, such as electrical influence and physical influence by water droplets, were considered as causes of the deterioration of hydrophobicity of a sample in the DDT. In this study, we investigated two kinds of factors (electrification and salt) affecting the deterioration of hydrophobicity on the surface of a silicone rubber up to the ignition of continuous electrical discharge in the DDT.

  14. Imprint of thawing scalar fields on the large scale galaxy overdensity

    NASA Astrophysics Data System (ADS)

    Dinda, Bikash R.; Sen, Anjan A.

    2018-04-01

    We investigate the observed galaxy power spectrum for the thawing class of scalar field models, taking into account various general relativistic corrections that occur on very large scales. We consider the full general relativistic perturbation equations for the matter as well as the dark energy fluid. We form a single autonomous system of equations containing both the background and the perturbed equations of motion, which we subsequently solve for different scalar field potentials. First we study the percentage deviation from the ΛCDM model for different cosmological parameters as well as in the observed galaxy power spectra on different scales in scalar field models for various choices of scalar field potentials. Interestingly, the difference in background expansion results in an enhancement of power over ΛCDM on small scales, whereas the inclusion of general relativistic (GR) corrections results in a suppression of power relative to ΛCDM on large scales. This can be useful for distinguishing scalar field models from ΛCDM with future optical/radio surveys. We also compare the observed galaxy power spectra for tracking and thawing types of scalar field using some particular choices for the scalar field potentials. We show that thawing and tracking models can have large differences in observed galaxy power spectra on large scales and for smaller redshifts due to different GR effects. But on smaller scales and for larger redshifts, the difference is small and is mainly due to the difference in background expansion.

  15. Second harmonic sound field after insertion of a biological tissue sample

    NASA Astrophysics Data System (ADS)

    Zhang, Dong; Gong, Xiu-Fen; Zhang, Bo

    2002-01-01

    Second harmonic sound field after inserting a biological tissue sample is investigated by theory and experiment. The sample is inserted perpendicular to the sound axis, whose acoustical properties are different from those of surrounding medium (distilled water). By using the superposition of Gaussian beams and the KZK equation in quasilinear and parabolic approximations, the second harmonic field after insertion of the sample can be derived analytically and expressed as a linear combination of self- and cross-interaction of the Gaussian beams. Egg white, egg yolk, porcine liver, and porcine fat are used as the samples and inserted in the sound field radiated from a 2 MHz uniformly excited focusing source. Axial normalized sound pressure curves of the second harmonic wave before and after inserting the sample are measured and compared with the theoretical results calculated with 10 items of Gaussian beam functions.

  16. Large Deviations for Nonlocal Stochastic Neural Fields

    PubMed Central

    2014-01-01

    We study the effect of additive noise on integro-differential neural field equations. In particular, we analyze an Amari-type model driven by a Q-Wiener process, and focus on noise-induced transitions and escape. We argue that proving a sharp Kramers’ law for neural fields poses substantial difficulties, but that one may transfer techniques from stochastic partial differential equations to establish a large deviation principle (LDP). Then we demonstrate that an efficient finite-dimensional approximation of the stochastic neural field equation can be achieved using a Galerkin method and that the resulting finite-dimensional rate function for the LDP can have a multiscale structure in certain cases. These results form the starting point for an efficient practical computation of the LDP. Our approach also provides the technical basis for further rigorous study of noise-induced transitions in neural fields based on Galerkin approximations. Mathematics Subject Classification (2000): 60F10, 60H15, 65M60, 92C20. PMID:24742297

  17. Large-scale dynamo growth rates from numerical simulations and implications for mean-field theories

    NASA Astrophysics Data System (ADS)

    Park, Kiwan; Blackman, Eric G.; Subramanian, Kandaswamy

    2013-05-01

    Understanding large-scale magnetic field growth in turbulent plasmas in the magnetohydrodynamic limit is a goal of magnetic dynamo theory. In particular, assessing how well large-scale helical field growth and saturation in simulations match those predicted by existing theories is important for progress. Using numerical simulations of isotropically forced turbulence without large-scale shear with its implications, we focus on several additional aspects of this comparison: (1) Leading mean-field dynamo theories which break the field into large and small scales predict that large-scale helical field growth rates are determined by the difference between kinetic helicity and current helicity with no dependence on the nonhelical energy in small-scale magnetic fields. Our simulations show that the growth rate of the large-scale field from fully helical forcing is indeed unaffected by the presence or absence of small-scale magnetic fields amplified in a precursor nonhelical dynamo. However, because the precursor nonhelical dynamo in our simulations produced fields that were strongly subequipartition with respect to the kinetic energy, we cannot yet rule out the potential influence of stronger nonhelical small-scale fields. (2) We have identified two features in our simulations which cannot be explained by the most minimalist versions of two-scale mean-field theory: (i) fully helical small-scale forcing produces significant nonhelical large-scale magnetic energy and (ii) the saturation of the large-scale field growth is time delayed with respect to what minimalist theory predicts. We comment on desirable generalizations to the theory in this context and future desired work.

  18. Large-scale dynamo growth rates from numerical simulations and implications for mean-field theories.

    PubMed

    Park, Kiwan; Blackman, Eric G; Subramanian, Kandaswamy

    2013-05-01

    Understanding large-scale magnetic field growth in turbulent plasmas in the magnetohydrodynamic limit is a goal of magnetic dynamo theory. In particular, assessing how well large-scale helical field growth and saturation in simulations match those predicted by existing theories is important for progress. Using numerical simulations of isotropically forced turbulence without large-scale shear with its implications, we focus on several additional aspects of this comparison: (1) Leading mean-field dynamo theories which break the field into large and small scales predict that large-scale helical field growth rates are determined by the difference between kinetic helicity and current helicity with no dependence on the nonhelical energy in small-scale magnetic fields. Our simulations show that the growth rate of the large-scale field from fully helical forcing is indeed unaffected by the presence or absence of small-scale magnetic fields amplified in a precursor nonhelical dynamo. However, because the precursor nonhelical dynamo in our simulations produced fields that were strongly subequipartition with respect to the kinetic energy, we cannot yet rule out the potential influence of stronger nonhelical small-scale fields. (2) We have identified two features in our simulations which cannot be explained by the most minimalist versions of two-scale mean-field theory: (i) fully helical small-scale forcing produces significant nonhelical large-scale magnetic energy and (ii) the saturation of the large-scale field growth is time delayed with respect to what minimalist theory predicts. We comment on desirable generalizations to the theory in this context and future desired work.

  19. The Large-scale Magnetic Fields of Thin Accretion Disks

    NASA Astrophysics Data System (ADS)

    Cao, Xinwu; Spruit, Hendrik C.

    2013-03-01

    A large-scale magnetic field threading an accretion disk is a key ingredient in the jet formation model. The most attractive scenario for the origin of such a large-scale field is the advection of the field by the gas in the accretion disk from the interstellar medium or a companion star. However, it is realized that outward diffusion of the accreted field is fast compared with the inward accretion velocity in a geometrically thin accretion disk if the value of the magnetic Prandtl number Pm is around unity. In this work, we revisit this problem, considering the angular momentum of the disk to be removed predominantly by magnetically driven outflows. The radial velocity of the disk is significantly increased due to the presence of the outflows. Using a simplified model for the vertical disk structure, we find that even moderately weak fields can cause sufficient angular momentum loss via a magnetic wind to balance outward diffusion. There are two equilibrium points, one at low field strengths corresponding to a plasma beta at the midplane of order several hundred, and one for strong accreted fields, β ~ 1. We surmise that the first is relevant for the accretion of weak, possibly external, fields through the outer parts of the disk, while the latter could explain the tendency, observed in full three-dimensional numerical simulations, of strong flux bundles at the centers of disks to stay confined in spite of the strong magnetorotational instability turbulence surrounding them.
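
    For reference, the plasma beta quoted above is the usual ratio of midplane gas pressure to magnetic pressure (standard definition in Gaussian units, added here for context rather than taken from the abstract):

    ```latex
    \beta \equiv \frac{p_{\mathrm{gas}}}{B^{2}/8\pi}
    ```

    so β of order several hundred describes a weakly magnetized, gas-pressure-dominated midplane, while β ~ 1 means the accreted field's magnetic pressure is comparable to the gas pressure.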

  20. Long-Term Ecological Monitoring Field Sampling Plan for 2007

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T. Haney

    2007-07-31

    This field sampling plan describes the field investigations planned for the Long-Term Ecological Monitoring Project at the Idaho National Laboratory Site in 2007. This plan and the Quality Assurance Project Plan for Waste Area Groups 1, 2, 3, 4, 5, 6, 7, 10, and Removal Actions constitute the sampling and analysis plan supporting long-term ecological monitoring sampling in 2007. The data collected under this plan will become part of the long-term ecological monitoring data set that is being collected annually. The data will be used to determine the requirements for the subsequent long-term ecological monitoring. This plan guides the 2007 investigations, including sampling, quality assurance, quality control, analytical procedures, and data management. As such, this plan will help to ensure that the resulting monitoring data will be scientifically valid, defensible, and of known and acceptable quality.

  1. Trip Report-Produced-Water Field Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, Enid J.

    2012-05-25

    Los Alamos National Laboratory (LANL) conducted field testing of a produced-water pretreatment apparatus with assistance from faculty at the Texas A&M University (TAMU) protein separation sciences laboratory located on the TAMU main campus. The following report details all of the logistics surrounding the testing. The purpose of the test was to use a new, commercially available filter media housing containing modified zeolite (surfactant-modified zeolite or SMZ) porous medium for use in pretreatment of oil and gas produced water (PW) and frac-flowback waters. The SMZ was tested previously in October 2010 in a lab-constructed configuration ('old multicolumn system'), and performed well for removal of benzene, toluene, ethylbenzene, and xylenes (BTEX) from PW. However, a less-expensive, modular configuration is needed for field use. A modular system will allow the field operator to add or subtract SMZ filters as needed to accommodate site-specific conditions, and to swap out used filters easily in a multi-unit system. This test demonstrated the use of a commercial filter housing with a simple flow modification and packed with SMZ for removing BTEX from a PW source in College Station, Texas. The system will be tested in June 2012 at a field site in Pennsylvania for treating frac-flowback waters. The goals of this test are: (1) to determine sorption efficiency of BTEX in the new configuration; and (2) to observe the range of flow rates, backpressures, and total volume treated at a given flow rate.

  2. Large field inflation from axion mixing

    NASA Astrophysics Data System (ADS)

    Shiu, Gary; Staessens, Wieland; Ye, Fang

    2015-06-01

    We study general multi-axion systems, focusing on the possibility of large field inflation driven by axions. We find that through axion mixing from a non-diagonal metric on the moduli space and/or from Stückelberg coupling to a U(1) gauge field, an effectively super-Planckian decay constant can be generated without the need for "alignment" in the axion decay constants. We also investigate the consistency conditions related to the gauge symmetries in the multi-axion systems, such as vanishing gauge anomalies and the potential presence of generalized Chern-Simons terms. Our scenario applies generally to field theory models whose axion periodicities are intrinsically sub-Planckian, but it is most naturally realized in string theory. The types of axion mixings invoked in our scenario appear quite commonly in D-brane models, and we present its implementation in type II superstring theory. Explicit stringy models exhibiting all the characteristics of our ideas are constructed within the frameworks of Type IIA intersecting D6-brane models on and Type IIB intersecting D7-brane models on Swiss-Cheese Calabi-Yau orientifolds.

  3. Development of Dynamic Flow Field Pressure Probes Suitable for Use in Large Scale Supersonic Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Porro, A. Robert

    2000-01-01

    A series of dynamic flow field pressure probes were developed for use in large-scale supersonic wind tunnels at NASA Glenn Research Center. These flow field probes include pitot, static, and five-hole conical pressure probes that are capable of capturing fast-acting flow field pressure transients that occur on a millisecond time scale. The pitot and static probes can be used to determine local Mach number time histories during a transient event. The five-hole conical pressure probes are used primarily to determine local flow angularity, but can also determine local Mach number. These probes were designed, developed, and tested at the NASA Glenn Research Center. They were also used in a NASA Glenn 10- by 10-Foot Supersonic Wind Tunnel (SWT) test program where they successfully acquired flow field pressure data in the vicinity of a propulsion system during an engine compressor stall and inlet unstart transient event. Details of the design, development, and subsequent use of these probes are discussed in this report.

  4. Field testing the Raman gas composition sensor for gas turbine operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buric, M.; Chorpening, B.; Mullem, J.

    2012-01-01

    A gas composition sensor based on Raman spectroscopy using reflective metal-lined capillary waveguides is tested under field conditions for feed-forward applications in gas turbine control. The capillary waveguide enables effective use of low-powered lasers and rapid composition determination, for computation of required parameters to pre-adjust burner control based on incoming fuel. Tests on high-pressure fuel streams show sub-second time response and better than one percent accuracy on natural gas fuel mixtures. Fuel composition and Wobbe constant values are provided at one-second intervals or faster. The sensor, designed and constructed at NETL, is packaged for Class I, Division 2 operations typical of gas turbine environments, and samples gas at up to 800 psig. Simultaneous determination of the hydrocarbons methane, ethane, and propane plus CO, CO2, H2O, H2, N2, and O2 is realized. The capillary waveguide permits use of miniature spectrometers and laser power of less than 100 mW. The capillary dimensions of 1 m length and 300 μm ID also enable a full sample exchange in 0.4 s or less at 5 psig pressure differential, which allows a fast response to changes in sample composition. Sensor operation under field operation conditions will be reported.
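
    The Wobbe constant reported by the sensor is the mixture heating value divided by the square root of the gas's relative density with respect to air. A minimal sketch of that calculation (the heating values and molar masses below are approximate round reference numbers assumed for illustration, not taken from this record):

    ```python
    import math

    # Approximate higher heating values (MJ/m^3 at standard conditions) and
    # molar masses (g/mol); round reference values, assumed for illustration.
    HHV = {"CH4": 39.8, "C2H6": 70.3, "C3H8": 101.2, "N2": 0.0, "CO2": 0.0}
    M = {"CH4": 16.04, "C2H6": 30.07, "C3H8": 44.10, "N2": 28.01, "CO2": 44.01}
    M_AIR = 28.96  # mean molar mass of air (g/mol)

    def wobbe_index(x):
        """Wobbe index W = HHV_mix / sqrt(specific gravity) for mole fractions x."""
        hhv = sum(x[c] * HHV[c] for c in x)        # mole-fraction-weighted heating value
        sg = sum(x[c] * M[c] for c in x) / M_AIR   # relative density vs. air
        return hhv / math.sqrt(sg)

    # A typical pipeline natural gas composition (mole fractions summing to 1).
    mix = {"CH4": 0.94, "C2H6": 0.03, "C3H8": 0.01, "N2": 0.01, "CO2": 0.01}
    print(round(wobbe_index(mix), 1))  # ≈ 52.7
    ```

    Because the index combines energy content and density in one number, two fuels with equal Wobbe index deliver similar heat rates through the same burner orifice, which is why it is the natural feed-forward quantity for burner control.
    
    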

  5. Large tensor non-Gaussianity from axion-gauge field dynamics

    NASA Astrophysics Data System (ADS)

    Agrawal, Aniket; Fujita, Tomohiro; Komatsu, Eiichiro

    2018-05-01

    We show that an inflation model in which a spectator axion field is coupled to an SU(2) gauge field produces a large three-point function (bispectrum) of primordial gravitational waves, B_h, on the scales relevant to the cosmic microwave background experiments. The amplitude of the bispectrum at the equilateral configuration is characterized by B_h/P_h^2 = O(10) × Ω_A^{-1}, where Ω_A is a fraction of the energy density in the gauge field and P_h is the power spectrum of gravitational waves produced by the gauge field.

  6. Recent developments on field gas extraction and sample preparation methods for radiokrypton dating of groundwater

    NASA Astrophysics Data System (ADS)

    Yokochi, Reika

    2016-09-01

    Current and foreseen population growth will lead to increased demand for freshwater, large quantities of which are stored as groundwater. The ventilation age is crucial to the assessment of groundwater resources, complementing the hydrological model approach based on hydrogeological parameters. Ultra-trace radioactive isotopes of Kr (81Kr and 85Kr) possess the ideal physical and chemical properties for groundwater dating. The recent advent of atom trap trace analysis (ATTA) has enabled determination of ultra-trace noble gas radioisotope abundances using 5-10 μL of pure Kr. Anticipated developments will enable ATTA to analyze radiokrypton isotope abundances at high sample throughput, which necessitates simple and efficient sample preparation techniques that are adaptable to various sample chemistries. Recent developments of field gas extraction devices and a simple and rapid Kr separation method at the University of Chicago are presented herein. Two field gas extraction devices optimized for different sampling conditions were recently designed and constructed, aiming at operational simplicity and portability. A newly developed Kr purification system enriches Kr by flowing a sample gas through a moderately cooled (138 K) activated charcoal column, followed by a gentle fractionating desorption. This simple process uses a single adsorbent and separates 99% of the bulk atmospheric gases from Kr without significant loss. The subsequent two stages of gas chromatographic separation and a hot Ti sponge getter further purify the Kr-enriched gas. Abundant CH4 necessitates multiple passages through one of the gas chromatographic separation columns. The presented Kr separation system has a demonstrated capability of extracting Kr with > 90% yield and 99% purity within 75 min from 1.2 to 26.8 L STP of atmospheric air with various concentrations of CH4. The apparatuses have successfully been deployed for sampling in the field and purification of groundwater samples.

  7. Analysis of the Effect of Chronic and Low-Dose Radiation Exposure on Spermatogenic Cells of Male Large Japanese Field Mice (Apodemus speciosus) after the Fukushima Daiichi Nuclear Power Plant Accident.

    PubMed

    Takino, Sachio; Yamashiro, Hideaki; Sugano, Yukou; Fujishima, Yohei; Nakata, Akifumi; Kasai, Kosuke; Hayashi, Gohei; Urushihara, Yusuke; Suzuki, Masatoshi; Shinoda, Hisashi; Miura, Tomisato; Fukumoto, Manabu

    2017-02-01

    In this study we analyzed the effect of chronic and low-dose-rate (LDR) radiation on spermatogenic cells of large Japanese field mice (Apodemus speciosus) after the Fukushima Daiichi Nuclear Power Plant (FNPP) accident. In March 2014, large Japanese field mice were collected from two sites located in, and one site adjacent to, the FNPP ex-evacuation zone: Tanashio, Murohara and Akogi, respectively. Testes from these animals were analyzed histologically. The external dose rate from radiocesium (combined 134Cs and 137Cs) in these animals at the sampling sites was 21 μGy/day in Tanashio, 304-365 μGy/day in Murohara and 407-447 μGy/day in Akogi. In the Akogi group, the numbers of spermatogenic cells and proliferating cell nuclear antigen (PCNA)-positive cells per seminiferous tubule were significantly higher compared to the Tanashio and Murohara groups, respectively. TUNEL-positive apoptotic cells tended to be detected at a lower level in the Murohara and Akogi groups compared to the Tanashio group. These results suggest that enhanced spermatogenesis occurred in large Japanese field mice living in and around the FNPP ex-evacuation zone. It remains to be elucidated whether this phenomenon, attributed to chronic exposure to LDR radiation, will benefit or adversely affect large Japanese field mice.

  8. Termiticide Field Tests - 1989 Update

    Treesearch

    Bradford M. Kard; Joe K. Mauldin

    1993-01-01

    For several years, organophosphate and pyrethroid termiticides have undergone field evaluation as treatments to soil for control of subterranean termites. These termiticides remained effective at some application rates for 5 or more years. Field data are reported for ground-board and concrete slab tests at sites in the continental United States. Generally, pyrethroids...

  9. Acute toxic tests of rainwater samples using Daphnia magna.

    PubMed

    Sakai, Manabu

    2006-06-01

    Rainwater samples were collected at Isogo Ward of Yokohama City, Japan, from 23 June to 31 July 2003. The toxic potency of pollutants present in 13 rainwater samples was tested using Daphnia magna. Most test animals died within 48 h in five test solutions that were prepared from rainwater samples. On the other hand, when nonpolar compounds such as pesticides were removed from rainwater samples before the toxic tests, mortalities in all test solutions were less than 10%. Eight kinds of pesticides were detected in rainwater samples. The highest concentration was of dichlorvos, at 0.74 microg/L. Results indicated that insecticides in rainwater sometimes lethally affected D. magna and that toxic potency of insecticides that are present in rainwater constitutes an important problem for environmental protection.

  10. New prior sampling methods for nested sampling - Development and testing

    NASA Astrophysics Data System (ADS)

    Stokes, Barrie; Tuyl, Frank; Hudson, Irene

    2017-06-01

    Nested Sampling is a powerful algorithm for fitting models to data in the Bayesian setting, introduced by Skilling [1]. The nested sampling algorithm proceeds by carrying out a series of compressive steps, involving successively nested iso-likelihood boundaries, starting with the full prior distribution of the problem parameters. The "central problem" of nested sampling is to draw at each step a sample from the prior distribution whose likelihood is greater than the current likelihood threshold, i.e., a sample falling inside the current likelihood-restricted region. For both flat and informative priors this ultimately requires uniform sampling restricted to the likelihood-restricted region. We present two new methods of carrying out this sampling step, and illustrate their use with the lighthouse problem [2], a bivariate likelihood used by Gregory [3] and a trivariate Gaussian mixture likelihood. All the algorithm development and testing reported here has been done with Mathematica® [4].
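
    The "central problem" described above can be sketched with the crudest valid approach, rejection sampling from the prior; the paper's new methods are more efficient replacements for exactly this step. All names and the toy likelihood below are illustrative assumptions, not the authors' code:

    ```python
    import random

    def log_likelihood(theta):
        """Toy Gaussian log-likelihood on [0, 1]; stands in for the real problem."""
        return -((theta - 0.7) ** 2) / (2 * 0.05 ** 2)

    def sample_prior():
        """Flat prior on [0, 1]."""
        return random.random()

    def constrained_draw(logl_min, max_tries=100_000):
        """Draw from the prior subject to L(theta) > L_min (rejection sampling)."""
        for _ in range(max_tries):
            theta = sample_prior()
            if log_likelihood(theta) > logl_min:
                return theta
        raise RuntimeError("likelihood-restricted region too small for rejection")

    # One compression step of nested sampling: discard the worst live point and
    # replace it with a prior draw from inside the current iso-likelihood boundary.
    random.seed(1)
    live = [sample_prior() for _ in range(50)]
    worst = min(live, key=log_likelihood)
    live[live.index(worst)] = constrained_draw(log_likelihood(worst))
    ```

    Rejection sampling stays exact but becomes exponentially slow as the restricted region shrinks over many compression steps, which is what motivates smarter constrained samplers.
    
    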

  11. Field portable low temperature porous layer open tubular cryoadsorption headspace sampling and analysis part II: Applications.

    PubMed

    Harries, Megan; Bukovsky-Reyes, Santiago; Bruno, Thomas J

    2016-01-15

    This paper details the sampling methods used with the field portable porous layer open tubular cryoadsorption (PLOT-cryo) approach, described in Part I of this two-part series, applied to several analytes of interest. We conducted tests with coumarin and 2,4,6-trinitrotoluene (two solutes that were used in initial development of PLOT-cryo technology), naphthalene, aviation turbine kerosene, and diesel fuel, on a variety of matrices and test beds. We demonstrated that these analytes can be easily detected and reliably identified using the portable unit for analyte collection. By leveraging efficiency-boosting temperature control and the high flow rate multiple capillary wafer, very short collection times (as low as 3s) yielded accurate detection. For diesel fuel spiked on glass beads, we determined a method detection limit below 1 ppm. We observed greater variability among separate samples analyzed with the portable unit than previously documented in work using the laboratory-based PLOT-cryo technology. We identify three likely sources that may help explain the additional variation: the use of a compressed air source to generate suction, matrix geometry, and variability in the local vapor concentration around the sampling probe as solute depletion occurs both locally around the probe and in the test bed as a whole. This field-portable adaptation of the PLOT-cryo approach has numerous and diverse potential applications. Published by Elsevier B.V.

  12. Field Portable Low Temperature Porous Layer Open Tubular Cryoadsorption Headspace Sampling and Analysis Part II: Applications*

    PubMed Central

    Harries, Megan; Bukovsky-Reyes, Santiago; Bruno, Thomas J.

    2016-01-01

    This paper details the sampling methods used with the field portable porous layer open tubular cryoadsorption (PLOT-cryo) approach, described in Part I of this two-part series, applied to several analytes of interest. We conducted tests with coumarin and 2,4,6-trinitrotoluene (two solutes that were used in initial development of PLOT-cryo technology), naphthalene, aviation turbine kerosene, and diesel fuel, on a variety of matrices and test beds. We demonstrated that these analytes can be easily detected and reliably identified using the portable unit for analyte collection. By leveraging efficiency-boosting temperature control and the high flow rate multiple capillary wafer, very short collection times (as low as 3 s) yielded accurate detection. For diesel fuel spiked on glass beads, we determined a method detection limit below 1 ppm. We observed greater variability among separate samples analyzed with the portable unit than previously documented in work using the laboratory-based PLOT-cryo technology. We identify three likely sources that may help explain the additional variation: the use of a compressed air source to generate suction, matrix geometry, and variability in the local vapor concentration around the sampling probe as solute depletion occurs both locally around the probe and in the test bed as a whole. This field-portable adaptation of the PLOT-cryo approach has numerous and diverse potential applications. PMID:26726934

  13. 'Nano-immuno test' for the detection of live Mycobacterium avium subspecies paratuberculosis bacilli in the milk samples using magnetic nano-particles and chromogen.

    PubMed

    Singh, Manju; Singh, Shoor Vir; Gupta, Saurabh; Chaubey, Kundan Kumar; Stephan, Bjorn John; Sohal, Jagdip Singh; Dutta, Manali

    2018-04-26

    Early, rapid detection of Mycobacterium avium subspecies paratuberculosis (MAP) bacilli in milk samples is a major challenge, since the traditional culture method is time consuming and laboratory dependent. We report a simple, sensitive and specific nanotechnology-based 'nano-immuno test' capable of detecting viable MAP bacilli in milk samples within 10 h. Viable MAP bacilli were captured by MAP-specific antibody-conjugated magnetic nano-particles using resazurin dye as chromogen. The test was optimized using true culture positive (10 bovine and 12 goat) and true culture negative (16 bovine and 25 goat) raw milk samples. Domestic livestock species in India are endemically infected with MAP. After successful optimization, the sensitivity and specificity of the 'nano-immuno test' in goats with respect to milk culture were 91.7% and 96.0%, respectively, and 90.0% (sensitivity) and 92.6% (specificity) with respect to IS900 PCR. In bovine milk samples, the sensitivity and specificity of the 'nano-immuno test' with respect to milk culture were 90.0% and 93.7%, respectively; with respect to IS900 PCR, they were 88.9% and 94.1%. The test was validated with field raw milk samples (258 goat and 138 bovine) collected from domestic livestock species to detect live/viable MAP bacilli. Of 138 bovine raw milk samples screened by six diagnostic tests, 81 (58.7%) were positive for MAP infection in one or more diagnostic tests. Of these 81 (58.7%) positive bovine raw milk samples, only 24 (17.4%) were positive for the presence of viable MAP bacilli. Of 258 goat raw milk samples screened by six diagnostic tests, 141 (54.6%) were positive for MAP infection in one or more tests. Of these 141 (54.6%) positive raw milk samples from goats, only 48 (34.0%) were positive for live MAP bacilli. The simplicity and efficiency of this novel 'nano-immuno test' makes it suitable for wide-scale screening of milk
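
    The goat figures are consistent with the stated sample sizes: with 12 culture-positive and 25 culture-negative goat samples, one false negative and one false positive reproduce the reported 91.7% sensitivity and 96.0% specificity. A minimal check (the individual true/false counts are inferred for illustration; the abstract gives only the percentages):

    ```python
    def sensitivity(tp, fn):
        """Fraction of true positives detected: TP / (TP + FN)."""
        return tp / (tp + fn)

    def specificity(tn, fp):
        """Fraction of true negatives correctly cleared: TN / (TN + FP)."""
        return tn / (tn + fp)

    # Goat milk vs. culture: 12 positives, 25 negatives; one miss each (inferred).
    sens = sensitivity(tp=11, fn=1)   # 11/12
    spec = specificity(tn=24, fp=1)   # 24/25
    print(round(100 * sens, 1), round(100 * spec, 1))  # 91.7 96.0
    ```
    
    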

  14. Field spectroscopy sampling strategies for improved measurement of Earth surface reflectance

    NASA Astrophysics Data System (ADS)

    Mac Arthur, A.; Alonso, L.; Malthus, T. J.; Moreno, J. F.

    2013-12-01

    Over the last two decades extensive networks of research sites have been established to measure the flux of carbon compounds and water vapour between the Earth's surface and the atmosphere using eddy covariance (EC) techniques. However, contributing Earth surface components cannot be determined and (as the 'footprints' are spatially constrained) these measurements cannot be extrapolated to regional cover using this technique. At many of these EC sites researchers have been integrating spectral measurements with EC and ancillary data to better understand light use efficiency and carbon dioxide flux. These spectroscopic measurements could also be used to assess contributing components and provide support for imaging spectroscopy, from airborne or satellite platforms, which can provide unconstrained spatial cover. Furthermore, there is an increasing interest in 'smart' database and information retrieval systems such as those proposed by EcoSIS and OPTIMISE to store, analyse, QA and merge spectral and biophysical measurements and provide information to end users. However, as Earth surfaces are spectrally heterogeneous and imaging and field spectrometers sample different spatial extents, appropriate field sampling strategies need to be adopted. To sample Earth surfaces, spectroscopists adopt single, random, regular-grid, transect, or 'swiping' point sampling strategies, although little comparative work has been carried out to determine the most appropriate approach; the work by Goetz (2012) is a limited exception. Mac Arthur et al (2012) demonstrated that, for two full-wavelength (400 nm to 2,500 nm) field spectroradiometers, the measurement area sampled is defined by each spectroradiometer/fore-optic system's directional response function (DRF) rather than the field-of-view (FOV) specified by instrument manufacturers. Mac Arthur et al (2012) also demonstrated that each reflecting element within the sampled area was not weighted equally in the integrated

  15. Descent advisor preliminary field test

    NASA Technical Reports Server (NTRS)

    Green, Steven M.; Vivona, Robert A.; Sanford, Beverly

    1995-01-01

    A field test of the Descent Advisor (DA) automation tool was conducted at the Denver Air Route Traffic Control Center in September 1994. DA is being developed to assist Center controllers in the efficient management and control of arrival traffic. DA generates advisories, based on trajectory predictions, to achieve accurate meter-fix arrival times in a fuel efficient manner while assisting the controller with the prediction and resolution of potential conflicts. The test objectives were to evaluate the accuracy of DA trajectory predictions for conventional- and flight-management-system-equipped jet transports, to identify significant sources of trajectory prediction error, and to investigate procedural and training issues (both air and ground) associated with DA operations. Various commercial aircraft (97 flights total) and a Boeing 737-100 research aircraft participated in the test. Preliminary results from the primary test set of 24 commercial flights indicate a mean DA arrival time prediction error of 2.4 sec late with a standard deviation of 13.1 sec. This paper describes the field test and presents preliminary results for the commercial flights.

  16. Descent Advisor Preliminary Field Test

    NASA Technical Reports Server (NTRS)

    Green, Steven M.; Vivona, Robert A.; Sanford, Beverly

    1995-01-01

    A field test of the Descent Advisor (DA) automation tool was conducted at the Denver Air Route Traffic Control Center in September 1994. DA is being developed to assist Center controllers in the efficient management and control of arrival traffic. DA generates advisories, based on trajectory predictions, to achieve accurate meter-fix arrival times in a fuel efficient manner while assisting the controller with the prediction and resolution of potential conflicts. The test objectives were: (1) to evaluate the accuracy of DA trajectory predictions for conventional and flight-management system equipped jet transports, (2) to identify significant sources of trajectory prediction error, and (3) to investigate procedural and training issues (both air and ground) associated with DA operations. Various commercial aircraft (97 flights total) and a Boeing 737-100 research aircraft participated in the test. Preliminary results from the primary test set of 24 commercial flights indicate a mean DA arrival time prediction error of 2.4 seconds late with a standard deviation of 13.1 seconds. This paper describes the field test and presents preliminary results for the commercial flights.

  17. A large scale test of the gaming-enhancement hypothesis.

    PubMed

    Przybylski, Andrew K; Wang, John C

    2016-01-01

    A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people's gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses was compared. Results provided no substantive evidence supporting the idea that having a preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support of the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.
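
    The kind of Bayesian model comparison described here can be illustrated with the common BIC approximation to the Bayes factor, where a small likelihood gain from an extra "gaming effect" parameter fails to offset the complexity penalty in a large sample. This is a generic sketch; every number below is made up for illustration and is not from the study:

    ```python
    import math

    def bic(log_lik_max, k, n):
        """Bayesian information criterion: k*ln(n) - 2*max log-likelihood."""
        return k * math.log(n) - 2 * log_lik_max

    def bf01_from_bic(bic0, bic1):
        """Approximate Bayes factor BF01 = p(data|H0)/p(data|H1) via exp(ΔBIC/2)."""
        return math.exp((bic1 - bic0) / 2)

    # Toy setting: n children; H1 adds one slope parameter for gaming exposure.
    # The fitted log-likelihoods are hypothetical values for illustration only.
    n = 1847
    bic_null = bic(log_lik_max=-2600.0, k=1, n=n)
    bic_alt = bic(log_lik_max=-2599.5, k=2, n=n)
    bf01 = bf01_from_bic(bic_null, bic_alt)
    # bf01 > 1 favors the null: the extra parameter does not pay for itself.
    ```

    With n = 1847, the ln(n) ≈ 7.5 penalty per parameter means the alternative must improve the fit substantially before the Bayes factor turns against the null, mirroring the paper's "equivocal to very strong" support for H0.
    
    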

  18. A large scale test of the gaming-enhancement hypothesis

    PubMed Central

    Wang, John C.

    2016-01-01

    A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people’s gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses were compared. Results provided no substantive evidence supporting the idea that having preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support for the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work. PMID:27896035

  19. Testing of Large Diameter Fresnel Optics for Space Based Observations of Extensive Air Showers

    NASA Technical Reports Server (NTRS)

    Adams, James H.; Christl, Mark J.; Young, Roy M.

    2011-01-01

    The JEM-EUSO mission will detect extensive air showers (EAS) produced by extreme energy cosmic rays. It operates from the ISS, looking down on Earth's nighttime atmosphere to detect the nitrogen fluorescence and Cherenkov light produced by the charged particles in an EAS. The JEM-EUSO science objectives require a large field of view and sensitivity to energies below 50 EeV, and the instrument must fit within available ISS resources. The JEM-EUSO optic module uses three large-diameter, thin plastic lenses with Fresnel surfaces to meet the instrument requirements. A bread-board model of the optic has been manufactured and has undergone preliminary tests. We report the results of optical performance tests and evaluate the present capability to manufacture these optical elements.

  20. Statistical techniques for detecting the intergalactic magnetic field from large samples of extragalactic Faraday rotation data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akahori, Takuya; Gaensler, B. M.; Ryu, Dongsu, E-mail: akahori@physics.usyd.edu.au, E-mail: bryan.gaensler@sydney.edu.au, E-mail: ryu@sirius.unist.ac.kr

    2014-08-01

    Rotation measure (RM) grids of extragalactic radio sources have been widely used for studying cosmic magnetism. However, their potential for exploring the intergalactic magnetic field (IGMF) in filaments of galaxies is unclear, since other Faraday-rotation media such as the radio source itself, intervening galaxies, and the interstellar medium of our Galaxy are all significant contributors. We study statistical techniques for discriminating the Faraday rotation of filaments from other sources of Faraday rotation in future large-scale surveys of radio polarization. We consider a 30° × 30° field of view toward the south Galactic pole, while varying the number of sources detected in both present and future observations. We select sources located at high redshifts and toward which depolarization and optical absorption systems are not observed, so as to reduce the RM contributions from the sources and intervening galaxies. It is found that a high-pass filter can satisfactorily reduce the RM contribution from the Galaxy, since the angular scale of this component toward high Galactic latitudes would be much larger than that expected for the IGMF. Present observations do not yet provide a sufficient source density to be able to estimate the RM of filaments. However, from the proposed approach with forthcoming surveys, we predict significant residuals of RM that should be ascribable to filaments. The predicted structure of the IGMF down to scales of 0.1° should be observable with data from the Square Kilometre Array, if we achieve selections of sources toward which sightlines do not contain intervening galaxies and RM errors are less than a few rad m^-2.
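
    The high-pass filtering idea, removing the smooth Galactic RM screen while keeping small-angular-scale structure attributable to filaments, can be sketched on a gridded RM map by subtracting a Gaussian-smoothed baseline. This is a generic stand-in (periodic boundaries via FFT, toy field geometry), not the paper's actual filter:

    ```python
    import numpy as np

    def high_pass_rm(rm_map, sigma_pix):
        """Subtract a large-scale (Gaussian-smoothed) component from an RM map."""
        ny, nx = rm_map.shape
        ky = np.fft.fftfreq(ny)[:, None]   # cycles per pixel along y
        kx = np.fft.fftfreq(nx)[None, :]   # cycles per pixel along x
        # Fourier transform of a Gaussian of width sigma_pix (periodic boundaries).
        gauss = np.exp(-2 * (np.pi * sigma_pix) ** 2 * (kx ** 2 + ky ** 2))
        smooth = np.fft.ifft2(np.fft.fft2(rm_map) * gauss).real
        return rm_map - smooth

    # Toy field on a 64x64 grid: one smooth, periodic "Galactic" mode plus
    # small-scale "filament" fluctuations (both invented for illustration).
    rng = np.random.default_rng(0)
    y, x = np.mgrid[0:64, 0:64]
    galactic = 10.0 + 5.0 * np.sin(2 * np.pi * y / 64)      # large angular scale
    filaments = rng.normal(0.0, 1.0, (64, 64))              # small angular scale
    residual = high_pass_rm(galactic + filaments, sigma_pix=8)
    # The residual retains the small-scale variance, not the smooth screen.
    ```

    The key assumption, as in the abstract, is scale separation: the filter only works because the Galactic component varies on much larger angular scales than the IGMF signal being sought.
    
    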

  1. Gleeble Testing of Tungsten Samples

    DTIC Science & Technology

    2013-02-01

    as a diffusion barrier to prevent the tungsten samples from fusing to the tungsten carbide inserts at elevated temperatures. After the anvils were...anvils with removable tungsten carbide inserts. The inserts were 19.05 mm (0.75 in) in diameter and 25.4 mm (1 in) long; they were purchased from...rhenium are shown in tables 6 and 7 and figure 7. The sample tested at 1300 °C, T4, partially embedded into the tungsten carbide (WC) inserts during

  2. An improved large-field focusing schlieren system

    NASA Technical Reports Server (NTRS)

    Weinstein, Leonard M.

    1991-01-01

    The analysis and performance of a high-brightness large-field focusing schlieren system is described. The system can be used to examine complex two- and three-dimensional flows. Techniques are described to obtain focusing schlieren through distorting optical elements, to use multiple colors in a time multiplexing technique, and to use diffuse screen holography for three-dimensional photographs.

  3. Comparison of American Fisheries Society (AFS) standard fish sampling techniques and environmental DNA for characterizing fish communities in a large reservoir

    USGS Publications Warehouse

    Perez, Christina R.; Bonar, Scott A.; Amberg, Jon J.; Ladell, Bridget; Rees, Christopher B.; Stewart, William T.; Gill, Curtis J.; Cantrell, Chris; Robinson, Anthony

    2017-01-01

    Recently, methods involving examination of environmental DNA (eDNA) have shown promise for characterizing fish species presence and distribution in waterbodies. We evaluated the use of eDNA for standard fish monitoring surveys in a large reservoir. Specifically, we compared the presence, relative abundance, biomass, and relative percent composition of Largemouth Bass Micropterus salmoides and Gizzard Shad Dorosoma cepedianum measured through eDNA methods and established American Fisheries Society standard sampling methods for Theodore Roosevelt Lake, Arizona. Catches at electrofishing and gillnetting sites were compared with eDNA water samples at sites, within spatial strata, and over the entire reservoir. Gizzard Shad were detected at a higher percentage of sites with eDNA methods than with boat electrofishing in both spring and fall. In contrast, spring and fall gillnetting detected Gizzard Shad at more sites than eDNA. Boat electrofishing and gillnetting detected Largemouth Bass at more sites than eDNA; the exception was fall gillnetting, for which the number of sites of Largemouth Bass detection was equal to that for eDNA. We observed no relationship between relative abundance and biomass of Largemouth Bass and Gizzard Shad measured by established methods and eDNA copies at individual sites or lake sections. Reservoirwide catch composition for Largemouth Bass and Gizzard Shad (numbers and total weight [g] of fish) as determined through a combination of gear types (boat electrofishing plus gillnetting) was similar to the proportion of total eDNA copies from each species in spring and fall field sampling. However, no similarity existed between proportions of fish caught via spring and fall boat electrofishing and the proportion of total eDNA copies from each species. Our study suggests that eDNA field sampling protocols, filtration, DNA extraction, primer design, and DNA sequencing methods need further refinement and testing before incorporation into standard

  4. A direct observation method for auditing large urban centers using stratified sampling, mobile GIS technology and virtual environments.

    PubMed

    Lafontaine, Sean J V; Sawada, M; Kristjansson, Elizabeth

    2017-02-16

With the expansion and growth of research on neighbourhood characteristics, there is an increased need for direct observational field audits. Herein, we introduce a novel direct observational audit method and systematic social observation instrument (SSOI) for efficiently assessing neighbourhood aesthetics over large urban areas. Our audit method uses spatial random sampling stratified by residential zoning and incorporates both mobile geographic information systems technology and virtual environments. The reliability of our method was tested in two ways: first, in 15 Ottawa neighbourhoods, we compared results at audited locations over two subsequent years; and second, we audited every residential block (167 blocks) in one neighbourhood and compared the distribution of SSOI aesthetics index scores with results from the randomly audited locations. Finally, we present interrater reliability and consistency results on all observed items. The observed neighbourhood average aesthetics index score estimated from four or five stratified random audit locations is sufficient to characterize the average neighbourhood aesthetics. The SSOI was internally consistent and demonstrated good to excellent interrater reliability. At the neighbourhood level, aesthetics is positively related to SES and physical activity and negatively correlated with BMI. The proposed approach to direct neighbourhood auditing performs adequately and has the advantage of financial and temporal efficiency when auditing a large city.
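The stratified spatial sampling step described in this record can be sketched as follows. The zone names, block identifiers, and the choice of five sites per stratum are illustrative assumptions (the record itself reports that four or five stratified random audit locations suffice to characterize a neighbourhood).

```python
import random

def stratified_sample(blocks_by_zone, per_zone, seed=42):
    """Pick a fixed number of audit locations from each zoning stratum."""
    rng = random.Random(seed)  # fixed seed for a reproducible audit plan
    sample = {}
    for zone, blocks in blocks_by_zone.items():
        k = min(per_zone, len(blocks))
        sample[zone] = rng.sample(blocks, k)  # without replacement
    return sample

# Hypothetical residential zoning strata and block identifiers:
zones = {
    "R1-detached": [f"R1-block-{i}" for i in range(40)],
    "R3-apartment": [f"R3-block-{i}" for i in range(25)],
    "R4-mixed": [f"R4-block-{i}" for i in range(12)],
}
audit_sites = stratified_sample(zones, per_zone=5)
for zone, sites in audit_sites.items():
    print(zone, sites)
```

Stratifying by zoning before sampling guarantees every residential type contributes audit locations, which a simple random draw over all blocks would not.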

  5. Remote sensing and field test capabilities at U.S. Army Dugway Proving Ground

    NASA Astrophysics Data System (ADS)

    Pearson, James T.; Herron, Joshua P.; Marshall, Martin S.

    2011-11-01

    U.S. Army Dugway Proving Ground (DPG) is a Major Range and Test Facility Base (MRTFB) with the mission of testing chemical and biological defense systems and materials. DPG facilities include state-of-the-art laboratories, extensive test grids, controlled environment calibration facilities, and a variety of referee instruments for required test measurements. Among these referee instruments, DPG has built up a significant remote sensing capability for both chemical and biological detection. Technologies employed for remote sensing include FTIR spectroscopy, UV spectroscopy, Raman-shifted eye-safe lidar, and other elastic backscatter lidar systems. These systems provide referee data for bio-simulants, chemical simulants, toxic industrial chemicals (TICs), and toxic industrial materials (TIMs). In order to realize a successful large scale open-air test, each type of system requires calibration and characterization. DPG has developed specific calibration facilities to meet this need. These facilities are the Joint Ambient Breeze Tunnel (JABT), and the Active Standoff Chamber (ASC). The JABT and ASC are open ended controlled environment tunnels. Each includes validation instrumentation to characterize simulants that are disseminated. Standoff systems are positioned at typical field test distances to measure characterized simulants within the tunnel. Data from different types of systems can be easily correlated using this method, making later open air test results more meaningful. DPG has a variety of large scale test grids available for field tests. After and during testing, data from the various referee instruments is provided in a visual format to more easily draw conclusions on the results. This presentation provides an overview of DPG's standoff testing facilities and capabilities, as well as example data from different test scenarios.

  6. Remote sensing and field test capabilities at U.S. Army Dugway Proving Ground

    NASA Astrophysics Data System (ADS)

    Pearson, James T.; Herron, Joshua P.; Marshall, Martin S.

    2012-05-01

    U.S. Army Dugway Proving Ground (DPG) is a Major Range and Test Facility Base (MRTFB) with the mission of testing chemical and biological defense systems and materials. DPG facilities include state-of-the-art laboratories, extensive test grids, controlled environment calibration facilities, and a variety of referee instruments for required test measurements. Among these referee instruments, DPG has built up a significant remote sensing capability for both chemical and biological detection. Technologies employed for remote sensing include FTIR spectroscopy, UV spectroscopy, Raman-shifted eye-safe lidar, and other elastic backscatter lidar systems. These systems provide referee data for bio-simulants, chemical simulants, toxic industrial chemicals (TICs), and toxic industrial materials (TIMs). In order to realize a successful large scale open-air test, each type of system requires calibration and characterization. DPG has developed specific calibration facilities to meet this need. These facilities are the Joint Ambient Breeze Tunnel (JABT), and the Active Standoff Chamber (ASC). The JABT and ASC are open ended controlled environment tunnels. Each includes validation instrumentation to characterize simulants that are disseminated. Standoff systems are positioned at typical field test distances to measure characterized simulants within the tunnel. Data from different types of systems can be easily correlated using this method, making later open air test results more meaningful. DPG has a variety of large scale test grids available for field tests. After and during testing, data from the various referee instruments is provided in a visual format to more easily draw conclusions on the results. This presentation provides an overview of DPG's standoff testing facilities and capabilities, as well as example data from different test scenarios.

  7. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, Terry D.; Beller, Laurence S.; Clark, Michael L.; Klingler, Kerry M.

    1997-01-01

A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus are also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container.

  8. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, T.D.; Beller, L.S.; Clark, M.L.; Klingler, K.M.

    1997-10-14

A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: (a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; (b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; (c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and (d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container. 8 figs.

  9. Results from Field Testing the RIMFAX GPR on Svalbard.

    NASA Astrophysics Data System (ADS)

    Hamran, S. E.; Amundsen, H. E. F.; Berger, T.; Carter, L. M.; Dypvik, H.; Ghent, R. R.; Kohler, J.; Mellon, M. T.; Nunes, D. C.; Paige, D. A.; Plettemeier, D.; Russell, P.

    2017-12-01

The Radar Imager for Mars' Subsurface Experiment - RIMFAX is a ground-penetrating radar being developed for NASA's Mars 2020 rover mission. The principal goals of the RIMFAX investigation are to image subsurface structures, provide context for sample sites, derive information regarding subsurface composition, and search for ice or brines. In meeting these goals, RIMFAX will provide a view of the stratigraphic section and a window into the geological and environmental history of Mars. To verify the design, an Engineering Model (EM) of the radar was tested in the field in the spring of 2017. Different sounding modes of the EM were tested in different types of subsurface geology on Svalbard. Deep soundings were performed on polythermal glaciers down to a couple of hundred meters. Shallow soundings were used to map a groundwater table in the firn area of a glacier. A combination of deep and shallow soundings was used to image buried ice under a sedimentary layer a couple of meters thick. Subsurface sedimentary layers were imaged down to more than 20 meters in sandstone permafrost. This presentation will give an overview of the RIMFAX investigation, describe the development of the radar system, and show results from field tests of the radar.

  10. Reverberation Chamber Uniformity Validation and Radiated Susceptibility Test Procedures for the NASA High Intensity Radiated Fields Laboratory

    NASA Technical Reports Server (NTRS)

    Koppen, Sandra V.; Nguyen, Truong X.; Mielnik, John J.

    2010-01-01

The NASA Langley Research Center's High Intensity Radiated Fields Laboratory has developed a capability, based on the RTCA/DO-160F Section 20 guidelines, for radiated electromagnetic susceptibility testing in reverberation chambers. Phase 1 of the test procedure utilizes mode-tuned stirrer techniques and E-field probe measurements to validate chamber uniformity, determine chamber loading effects, and define a radiated susceptibility test process. The test procedure is segmented into numbered operations that are largely software controlled. This document is intended as a laboratory test reference and includes diagrams of test setups, equipment lists, as well as test results and analysis. Phase 2 of development is discussed.

  11. Apparatus for testing skin samples or the like

    DOEpatents

    Holland, J.M.

    1982-08-31

An apparatus for testing the permeability of living skin samples has a flat base with a plurality of sample-holding cavities formed in its upper surface, the samples being placed in counterbores in the cavities with the epidermis uppermost. O-rings or Teflon washers are respectively placed on the samples, and a flat cover is connected to the base to press the rings against the upper surfaces of the samples. Media to maintain tissue viability and permit recovery of metabolites are introduced into the lower portion of the sample-holding cavities through passages in the base. Test materials are introduced through holes in the cover plate after assembly of the chamber.

  12. Generation of Large-Scale Magnetic Fields by Small-Scale Dynamo in Shear Flows.

    PubMed

    Squire, J; Bhattacharjee, A

    2015-10-23

    We propose a new mechanism for a turbulent mean-field dynamo in which the magnetic fluctuations resulting from a small-scale dynamo drive the generation of large-scale magnetic fields. This is in stark contrast to the common idea that small-scale magnetic fields should be harmful to large-scale dynamo action. These dynamos occur in the presence of a large-scale velocity shear and do not require net helicity, resulting from off-diagonal components of the turbulent resistivity tensor as the magnetic analogue of the "shear-current" effect. Given the inevitable existence of nonhelical small-scale magnetic fields in turbulent plasmas, as well as the generic nature of velocity shear, the suggested mechanism may help explain the generation of large-scale magnetic fields across a wide range of astrophysical objects.

  13. SAMPLING LARGE RIVERS FOR ALGAE, BENTHIC MACROINVERTEBRATES AND FISH

    EPA Science Inventory

    Multiple projects are currently underway to increase our understanding of the effects of different sampling methods and designs used for the biological assessment and monitoring of large (boatable) rivers. Studies include methods used to assess fish, benthic macroinvertebrates, ...

  14. 21 CFR 864.3260 - OTC test sample collection systems for drugs of abuse testing.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... abuse testing. 864.3260 Section 864.3260 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... Instrumentation and Accessories § 864.3260 OTC test sample collection systems for drugs of abuse testing. (a) Identification. An over-the-counter (OTC) test sample collection system for drugs of abuse testing is a device...

  15. 21 CFR 864.3260 - OTC test sample collection systems for drugs of abuse testing.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... abuse testing. 864.3260 Section 864.3260 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... Instrumentation and Accessories § 864.3260 OTC test sample collection systems for drugs of abuse testing. (a) Identification. An over-the-counter (OTC) test sample collection system for drugs of abuse testing is a device...

  16. 21 CFR 864.3260 - OTC test sample collection systems for drugs of abuse testing.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... abuse testing. 864.3260 Section 864.3260 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... Instrumentation and Accessories § 864.3260 OTC test sample collection systems for drugs of abuse testing. (a) Identification. An over-the-counter (OTC) test sample collection system for drugs of abuse testing is a device...

  17. 21 CFR 864.3260 - OTC test sample collection systems for drugs of abuse testing.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... abuse testing. 864.3260 Section 864.3260 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... Instrumentation and Accessories § 864.3260 OTC test sample collection systems for drugs of abuse testing. (a) Identification. An over-the-counter (OTC) test sample collection system for drugs of abuse testing is a device...

  18. Design of a mobile, homogeneous, and efficient electromagnet with a large field of view for neonatal low-field MRI.

    PubMed

    Lother, Steffen; Schiff, Steven J; Neuberger, Thomas; Jakob, Peter M; Fidler, Florian

    2016-08-01

In this work, a prototype of an effective electromagnet with a field-of-view (FoV) of 140 mm for neonatal head imaging is presented. The efficient implementation succeeded by exploiting the use of steel plates as a housing system. We achieved a compromise between large sample volumes, high homogeneity, high B0 field, low power consumption, light weight, simple fabrication, and conserved mobility without the necessity of a dedicated water cooling system. The entire magnetic resonance imaging (MRI) system (electromagnet, gradient system, transmit/receive coil, control system) is introduced and its unique features discussed. Furthermore, simulations using a numerical optimization algorithm for magnet and gradient system are presented. Functionality and quality of this low-field scanner operating at 23 mT (generated with 500 W) is illustrated using spin-echo imaging (in-plane resolution 1.6 mm × 1.6 mm, slice thickness 5 mm, and signal-to-noise ratio (SNR) of 23 with an acquisition time of 29 min). B0 field-mapping measurements are presented to characterize the homogeneity of the magnet, and the system's B0 field limitation of 80 mT is fully discussed. The cryogen-free system presented here demonstrates that this electromagnet with a ferromagnetic housing can be optimized for MRI with an enhanced and homogeneous magnetic field. It offers an alternative to prepolarized MRI designs in both readout field strength and power use. There are multiple indications for the clinical medical application of such low-field devices.

  19. Design of a mobile, homogeneous, and efficient electromagnet with a large field of view for neonatal low-field MRI

    PubMed Central

    Schiff, Steven J.; Neuberger, Thomas; Jakob, Peter M.; Fidler, Florian

    2017-01-01

Objective In this work, a prototype of an effective electromagnet with a field-of-view (FoV) of 140 mm for neonatal head imaging is presented. The efficient implementation succeeded by exploiting the use of steel plates as a housing system. We achieved a compromise between large sample volumes, high homogeneity, high B0 field, low power consumption, light weight, simple fabrication, and conserved mobility without the necessity of a dedicated water cooling system. Materials and methods The entire magnetic resonance imaging (MRI) system (electromagnet, gradient system, transmit/receive coil, control system) is introduced and its unique features discussed. Furthermore, simulations using a numerical optimization algorithm for magnet and gradient system are presented. Results Functionality and quality of this low-field scanner operating at 23 mT (generated with 500 W) is illustrated using spin-echo imaging (in-plane resolution 1.6 mm × 1.6 mm, slice thickness 5 mm, and signal-to-noise ratio (SNR) of 23 with an acquisition time of 29 min). B0 field-mapping measurements are presented to characterize the homogeneity of the magnet, and the system's B0 field limitation of 80 mT is fully discussed. Conclusion The cryogen-free system presented here demonstrates that this electromagnet with a ferromagnetic housing can be optimized for MRI with an enhanced and homogeneous magnetic field. It offers an alternative to pre-polarized MRI designs in both readout field strength and power use. There are multiple indications for the clinical medical application of such low-field devices. PMID:26861046

  20. Field Test Evaluation of Conservation Retrofits of Low-Income, Single-Family Buildings in Wisconsin: Blower-Door-Directed Infiltration Reduction Procedure, Field Test Implementation and Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gettings, M.B.

A blower-door-directed infiltration retrofit procedure was field tested on 18 homes in south central Wisconsin. The procedure, developed by the Wisconsin Energy Conservation Corporation, includes recommended retrofit techniques as well as criteria for estimating the amount of cost-effective work to be performed on a house. A recommended expenditure level and target air leakage reduction, in air changes per hour at 50 Pascals (ACH50), are determined from the initial leakage rate measured. The procedure produced an average 16% reduction in air leakage rate. For the 7 houses recommended for retrofit, 89% of the targeted reductions were accomplished with 76% of the recommended expenditures. The average cost of retrofits per house was reduced by a factor of four compared with previous programs. The average payback period for recommended retrofits was 4.4 years, based on predicted energy savings computed from achieved air leakage reductions. Although exceptions occurred, the procedure's 8 ACH50 minimum initial leakage rate for advising retrofits to be performed appeared a good choice, based on cost-effective air leakage reduction. Houses with initial rates of 7 ACH50 or below consistently required substantially higher costs to achieve significant air leakage reductions. No statistically significant average annual energy savings was detected as a result of the infiltration retrofits. Average measured savings were -27 therms per year, indicating an increase in energy use, with a 90% confidence interval of 36 therms. Measured savings for individual houses varied widely in both positive and negative directions, indicating that factors not considered affected the results. Large individual confidence intervals indicate a need to increase the accuracy of such measurements as well as understand the factors which may cause such disparity. Recommendations for the procedure include more extensive training of retrofit crews, checks for minimum air exchange rates to ensure air
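The arithmetic behind the quantities in this record can be sketched as follows: converting a blower-door flow reading to ACH50 and estimating a simple payback period. The flow reading, house volume, and cost figures below are invented illustrative numbers (chosen so the payback lands near the reported 4.4-year average), not data from the field test.

```python
def ach50(cfm50, house_volume_ft3):
    """Air changes per hour at 50 Pa from a blower-door flow reading (CFM50)."""
    return cfm50 * 60.0 / house_volume_ft3  # ft³/min × 60 min/h ÷ house volume

def simple_payback_years(retrofit_cost, annual_energy_savings):
    """Years to recover the retrofit cost from predicted annual savings."""
    return retrofit_cost / annual_energy_savings

# Hypothetical house: 12,000 ft³ volume, leakage reduced from 2400 to 2000 CFM50.
initial = ach50(cfm50=2400, house_volume_ft3=12000)   # 12.0 ACH50
final = ach50(cfm50=2000, house_volume_ft3=12000)     # 10.0 ACH50
reduction = (initial - final) / initial               # 16.7% reduction
print(round(initial, 1), round(final, 1), round(reduction * 100, 1))
# Hypothetical $440 retrofit cost against $100/yr predicted savings:
print(simple_payback_years(retrofit_cost=440, annual_energy_savings=100))
```

Note that the study's key finding is precisely that payback periods computed this way from predicted savings were not borne out by measured energy use.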

  1. High-Field Liquid-State Dynamic Nuclear Polarization in Microliter Samples.

    PubMed

    Yoon, Dongyoung; Dimitriadis, Alexandros I; Soundararajan, Murari; Caspers, Christian; Genoud, Jeremy; Alberti, Stefano; de Rijk, Emile; Ansermet, Jean-Philippe

    2018-05-01

Nuclear hyperpolarization in the liquid state by dynamic nuclear polarization (DNP) has been of great interest because of its potential use in NMR spectroscopy of small samples of biological and chemical compounds in aqueous media. Liquid-state DNP generally requires microwave resonators in order to generate an alternating magnetic field strong enough to saturate electron spins in the solution. As a consequence, the sample size is limited to dimensions of the order of the wavelength, and this restricts the sample volume to less than 100 nL for DNP at 9 T (∼260 GHz). We show here a new approach that overcomes this sample size limitation. Large saturation of electron spins was obtained with a high-power (∼150 W) gyrotron without microwave resonators. Since high-power microwaves can cause serious dielectric heating in polar solutions, we designed a planar probe which effectively alleviates dielectric heating. A thin liquid sample of 100 μm thickness is placed on a block of high thermal conductivity aluminum nitride, with a gold coating that serves both as a ground plane and as a heat sink. A meander or a coil were used for NMR. We performed 1H DNP at 9.2 T (∼260 GHz) and at room temperature with 10 μL of water, a volume that is more than 100× larger than reported so far. The 1H NMR signal is enhanced by a factor of about -10 with 70 W of microwave power. We also demonstrated liquid-state 31P DNP in fluorobenzene containing triphenylphosphine and obtained an enhancement of ∼200.

  2. ELECTRON ACCELERATION AT A CORONAL SHOCK PROPAGATING THROUGH A LARGE-SCALE STREAMER-LIKE MAGNETIC FIELD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kong, Xiangliang; Chen, Yao; Feng, Shiwei

    2016-04-10

Using a test-particle simulation, we investigate the effect of large-scale coronal magnetic fields on electron acceleration at an outward-propagating coronal shock with a circular front. The coronal field is approximated by an analytical solution with a streamer-like magnetic field featuring a partially open magnetic field and a current sheet at the equator atop the closed region. We show that the large-scale shock-field configuration, especially the relative curvature of the shock and the magnetic field line across which the shock is sweeping, plays an important role in the efficiency of electron acceleration. At low shock altitudes, when the shock curvature is larger than that of the magnetic field lines, the electrons are mainly accelerated at the shock flanks; at higher altitudes, when the shock curvature is smaller, the electrons are mainly accelerated at the shock nose around the top of closed field lines. The above process reveals the shift of the efficient electron acceleration region along the shock front during its propagation. We also find that, in general, the electron acceleration at the shock flank is not as efficient as that at the top of the closed field because a collapsing magnetic trap can be formed at the top. In addition, we find that the energy spectra of electrons are power-law-like, first hardening then softening with the spectral index varying in a range of −3 to −6. Physical interpretations of the results and implications for the study of solar radio bursts are discussed.

  3. Electron acceleration at a coronal shock propagating through a large-scale streamer-like magnetic field

    DOE PAGES

    Kong, Xiangliang; Chen, Yao; Guo, Fan; ...

    2016-04-05

With a test-particle simulation, we investigate the effect of large-scale coronal magnetic fields on electron acceleration at an outward-propagating coronal shock with a circular front. The coronal field is approximated by an analytical solution with a streamer-like magnetic field featuring a partially open magnetic field and a current sheet at the equator atop the closed region. We show that the large-scale shock-field configuration, especially the relative curvature of the shock and the magnetic field line across which the shock is sweeping, plays an important role in the efficiency of electron acceleration. At low shock altitudes, when the shock curvature is larger than that of magnetic field lines, the electrons are mainly accelerated at the shock flanks; at higher altitudes, when the shock curvature is smaller, the electrons are mainly accelerated at the shock nose around the top of closed field lines. The above process reveals the shift of the efficient electron acceleration region along the shock front during its propagation. We also found that in general the electron acceleration at the shock flank is not as efficient as that at the top of the closed field, since a collapsing magnetic trap can be formed at the top. In addition, we find that the energy spectra of electrons are power-law-like, first hardening then softening, with the spectral index varying in a range of -3 to -6. Finally, physical interpretations of the results and implications for the study of solar radio bursts are discussed.

  4. Field-Based Video Pre-Test Counseling, Oral Testing, and Telephonic Post-Test Counseling: Implementation of an HIV Field Testing Package among High-Risk Indian Men

    ERIC Educational Resources Information Center

    Snyder, Hannah; Yeldandi, Vijay V.; Kumar, G. Prem; Liao, Chuanhong; Lakshmi, Vemu; Gandham, Sabitha R.; Muppudi, Uma; Oruganti, Ganesh; Schneider, John A.

    2012-01-01

    In India, men who have sex with men (MSM) and truck drivers are high-risk groups that often do not access HIV testing due to stigma and high mobility. This study evaluated a field testing package (FTP) that identified HIV positive participants through video pre-test counseling, OraQuick oral fluid HIV testing, and telephonic post-test counseling…

  5. Red square test for visual field screening. A sensitive and simple bedside test.

    PubMed

    Mandahl, A

    1994-12-01

A reliable bedside test for screening of visual field defects is a valuable tool in the examination of patients with a putative disease affecting the sensory visual pathways. Conventional methods such as Donders' confrontation method, counting fingers in the visual field periphery, or two-hand confrontation are not sufficiently sensitive to detect minor but nevertheless serious visual field defects. More sensitive methods requiring only simple tools are also described. In this study, a test card with four red squares surrounding a fixation target, a black dot, with a total test area of about 11 × 12.5 degrees at a distance of 30 cm, was designed to test the perception of red colour saturation in the four quadrants (the red square test). The Goldmann visual field was used as reference. 125 consecutive patients with pituitary adenoma (159 eyes), craniopharyngeoma (9 eyes), meningeoma (21 eyes), vascular hemisphere lesion (40 eyes), hemisphere tumour (10 eyes) and hemisphere abscess (2 eyes) were examined. The Goldmann visual field and red square test were pathological in pituitary adenomas in 35%, in craniopharyngeomas in 44%, in meningeomas in 52% and in hemisphere tumours or abscess in 100% of the eyes. Among these, no false-normal or false-pathological tests were found. However, in vascular hemisphere disease the corresponding figures were Goldmann visual field 90% and red square test 85%. The 5% difference (4 eyes) was due to Goldmann visual field defects strictly peripheral to the central 15 degrees. These defects were easily diagnosed with two-hand confrontation and

  6. Assessing five field sampling methods to monitor Yellowstone National Park's northern ungulate winter range: the advantages and disadvantages of implementing a new sampling protocol

    Treesearch

    Pamela G. Sikkink; Roy Renkin; Geneva Chong; Art Sikkink

    2013-01-01

    The five field sampling methods tested for this study differed in richness and Simpson's Index values calculated from the raw data. How much the methods differed, and which ones were most similar to each other, depended on which diversity measure and which type of data were used for comparisons. When the number of species (richness) was used as a measure of...

  7. Generation of large-scale magnetic fields by small-scale dynamo in shear flows

    DOE PAGES

    Squire, J.; Bhattacharjee, A.

    2015-10-20

    We propose a new mechanism for a turbulent mean-field dynamo in which the magnetic fluctuations resulting from a small-scale dynamo drive the generation of large-scale magnetic fields. This is in stark contrast to the common idea that small-scale magnetic fields should be harmful to large-scale dynamo action. These dynamos occur in the presence of a large-scale velocity shear and do not require net helicity, resulting from off-diagonal components of the turbulent resistivity tensor as the magnetic analogue of the "shear-current" effect. Furthermore, given the inevitable existence of nonhelical small-scale magnetic fields in turbulent plasmas, as well as the generic nature of velocity shear, the suggested mechanism may help explain the generation of large-scale magnetic fields across a wide range of astrophysical objects.

  8. Free Field Word recognition test in the presence of noise in normal hearing adults.

    PubMed

    Almeida, Gleide Viviani Maciel; Ribas, Angela; Calleros, Jorge

    In ideal listening situations, subjects with normal hearing can easily understand speech, as can many subjects who have a hearing loss. To present the validation of the Word Recognition Test in a Free Field in the Presence of Noise in normal-hearing adults. The sample consisted of 100 healthy adults over 18 years of age with normal hearing. After pure-tone audiometry, a speech recognition test was applied in free-field conditions with monosyllables and disyllables, using standardized material in three listening situations: optimal listening condition (no noise), a signal-to-noise ratio of 0 dB, and a signal-to-noise ratio of -10 dB. For these tests, a calibrated free-field environment was arranged in which speech was presented to the subject from two speakers located at 45°, and noise from a third speaker located at 180°. All participants had free-field speech audiometry results between 88% and 100% in the three listening situations. The Word Recognition Test in Free Field in the Presence of Noise proved easy to organize and apply. The results of the test validation suggest that individuals with normal hearing should correctly identify between 88% and 100% of the stimuli. The test can be an important tool for measuring noise interference with speech perception abilities. Copyright © 2016 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  9. From Field to the Web: Management and Publication of Geoscience Samples in CSIRO Mineral Resources

    NASA Astrophysics Data System (ADS)

    Devaraju, A.; Klump, J. F.; Tey, V.; Fraser, R.; Reid, N.; Brown, A.; Golodoniuc, P.

    2016-12-01

    Inaccessible samples are an obstacle to the reproducibility of research and may cause waste of time and resources through duplication of sample collection and management. Within the Commonwealth Scientific and Industrial Research Organisation (CSIRO) Mineral Resources there are various research communities who collect or generate physical samples as part of their field studies and analytical processes. Materials can be varied and could be rock, soil, plant materials, water, and even synthetic materials. Given the wide range of applications in CSIRO, each researcher or project may follow their own method of collecting, curating and documenting samples. In many cases samples and their documentation are often only available to the sample collector. For example, the Australian Resources Research Centre stores rock samples and research collections dating as far back as the 1970s. Collecting these samples again would be prohibitively expensive and in some cases impossible because the site has been mined out. These samples would not be easily discoverable by others without an online sample catalog. We identify some of the organizational and technical challenges to provide unambiguous and systematic access to geoscience samples, and present their solutions (e.g., workflow, persistent identifier and tools). We present the workflow starting from field sampling to sample publication on the Web, and describe how the International Geo Sample Number (IGSN) can be applied to identify samples along the process. In our test case geoscientific samples are collected as part of the Capricorn Distal Footprints project, a collaboration project between the CSIRO, the Geological Survey of Western Australia, academic institutions and industry partners. We conclude by summarizing the values of our solutions in terms of sample management and publication.

  10. Study Abroad Field Trip Improves Test Performance through Engagement and New Social Networks

    ERIC Educational Resources Information Center

    Houser, Chris; Brannstrom, Christian; Quiring, Steven M.; Lemmons, Kelly K.

    2011-01-01

    Although study abroad trips provide an opportunity for affective and cognitive learning, it is largely assumed that they improve learning outcomes. The purpose of this study is to determine whether a study abroad field trip improved cognitive learning by comparing test performance between the study abroad participants (n = 20) and their peers who…

  11. Development of a Large Field of View Shadowgraph System for a 16 Ft. Transonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Talley, Michael A.; Jones, Stephen B.; Goodman, Wesley L.

    2000-01-01

    A large field of view shadowgraph flow visualization system for the Langley 16 ft. Transonic Tunnel (16 ft. TT) has been developed to provide fast, low cost, aerodynamic design concept evaluation capability to support the development of the next generation of commercial and military aircraft and space launch vehicles. Key features of the 16 ft. TT shadowgraph system are: (1) high resolution (1280 x 1024) digital snap shots and sequences; (2) video recording of shadowgraph at 30 frames per second; (3) pan, tilt, and zoom to find and observe flow features; (4) one microsecond flash for freeze frame images; (5) large field of view, approximately 12 x 6 ft; and (6) a low maintenance, high signal/noise ratio, retro-reflective screen to allow shadowgraph imaging while test section lights are on.

  12. Data Compression Algorithm Architecture for Large Depth-of-Field Particle Image Velocimeters

    NASA Technical Reports Server (NTRS)

    Bos, Brent; Memarsadeghi, Nargess; Kizhner, Semion; Antonille, Scott

    2013-01-01

    A large depth-of-field particle image velocimeter (PIV) is designed to characterize dynamic dust environments on planetary surfaces. This instrument detects lofted dust particles, and senses the number of particles per unit volume, measuring their sizes, velocities (both speed and direction), and shape factors when the particles are large. To measure these particle characteristics in-flight, the instrument gathers two-dimensional image data at a high frame rate, typically >4,000 Hz, generating large amounts of data for every second of operation, approximately 6 GB/s. To characterize a planetary dust environment that is dynamic, the instrument would have to operate for at least several minutes during an observation period, easily producing more than a terabyte of data per observation. Given current technology, this amount of data would be very difficult to store onboard a spacecraft, and downlink to Earth. Since 2007, innovators have been developing an autonomous image analysis algorithm architecture for the PIV instrument to greatly reduce the amount of data that it has to store and downlink. The algorithm analyzes PIV images and automatically reduces the image information down to only the particle measurement data that is of interest, reducing the amount of data that is handled by a factor of more than 10^3. The state of development for this innovation is now fairly mature, with a functional algorithm architecture, along with several key pieces of algorithm logic, that has been proven through field test data acquired with a proof-of-concept PIV instrument.
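    The reduction step described above is essentially a segmentation problem: keep only per-particle measurements and discard the raw pixels. As a rough, hypothetical illustration (not the instrument's actual flight algorithm), a minimal Python sketch that thresholds a 2-D intensity grid and reduces each 4-connected bright region to a centroid and pixel count might look like:

```python
def particle_centroids(image, threshold):
    """Reduce a 2-D intensity grid to a list of (row, col, n_pixels)
    particle centroids: threshold, then flood-fill each 4-connected
    group of bright pixels. A toy stand-in for onboard data reduction."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] > threshold and not seen[r][c]:
                # iterative flood fill of one bright component
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx, len(pixels)))
    return centroids
```

    The real instrument also extracts sizes, velocities, and shape factors across frame pairs; the sketch only shows why the data volume collapses: a full 1280 x 1024 frame becomes a handful of small tuples.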

  13. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR FIELD COLLECTION AND POST-FIELD SAMPLE HANDLING OF INDOOR FLOOR DUST SAMPLES (UA-F-7.1)

    EPA Science Inventory

    The purpose of this SOP is to establish a uniform procedure for the collection of indoor floor dust samples in the field. This procedure was followed to ensure consistent data retrieval of dust samples during the Arizona NHEXAS project and the Border study. Keywords: field; vacu...

  14. Phoenix Test Sample Site

    NASA Technical Reports Server (NTRS)

    2008-01-01

    This image, acquired by NASA's Phoenix Mars Lander's Surface Stereo Imager on Sol 7, the seventh day of the mission (June 1, 2008), shows the so-called 'Knave of Hearts' first-dig test area to the north of the lander. The Robotic Arm's scraping blade left a small horizontal depression above where the sample was taken.

    Scientists speculate that white material in the depression left by the dig could represent ice or salts that precipitated into the soil. This material is likely the same white material observed in the sample in the Robotic Arm's scoop.

    The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  15. Estimation of sample size and testing power (part 5).

    PubMed

    Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo

    2012-02-01

    Estimation of sample size and testing power is an important component of research design. This article introduced methods for estimating sample size and testing power for difference tests on quantitative and qualitative data under the single-group, paired, and crossover designs. Specifically, it presented the estimation formulas for each of the three designs, their realization both directly from the formulas and via the POWER procedure of the SAS software, and worked examples, which will help researchers implement the repetition principle.
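    As a hedged illustration of the kind of formula the article covers, here is the standard normal-approximation sample-size calculation for a paired design with a quantitative outcome (the function name and defaults are mine, not the article's):

```python
import math
from statistics import NormalDist

def paired_sample_size(delta, sd_diff, alpha=0.05, power=0.80):
    """Number of pairs needed to detect a mean within-pair difference
    `delta`, given the SD of the differences `sd_diff`, via the usual
    normal approximation:
        n = ((z_{1-alpha/2} + z_{power}) * sd_diff / delta)^2,
    rounded up to the next whole pair."""
    z = NormalDist().inv_cdf
    n = ((z(1 - alpha / 2) + z(power)) * sd_diff / delta) ** 2
    return math.ceil(n)

# detecting a 0.5-SD mean difference at alpha=0.05 with 80% power
# paired_sample_size(0.5, 1.0) -> 32 pairs
```

    Exact procedures such as SAS PROC POWER use the noncentral t distribution rather than this normal approximation, so they return slightly larger n for small samples.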

  16. Large-scale wind tunnel tests of a sting-supported V/STOL fighter model at high angles of attack

    NASA Technical Reports Server (NTRS)

    Stoll, F.; Minter, E. A.

    1981-01-01

    A new sting model support has been developed for the NASA/Ames 40- by 80-Foot Wind Tunnel. This addition to the facility permits testing of relatively large models to large angles of attack or angles of yaw depending on model orientation. An initial test on the sting is described. This test used a 0.4-scale powered V/STOL model designed for testing at angles of attack to 90 deg and greater. A method for correcting wake blockage was developed and applied to the force and moment data. Samples of this data and results of surface-pressure measurements are presented.

  17. Pliocene large-mammal assemblages from northern Chad: sampling and ecological structure

    NASA Astrophysics Data System (ADS)

    Fara, Emmanuel; Likius, Andossa; Mackaye, Hassane T.; Vignaud, Patrick; Brunet, Michel

    2005-11-01

    Numerous Pliocene large-mammal assemblages have been discovered in Chad over the last decade. They offer a unique opportunity to understand the settings in which important chapters of Hominid evolution took place in Central Africa. However, it is crucial to first investigate both sampling and taxonomic homogeneity for these Chadian assemblages because they occur over large sectors in a sandy desert that offers virtually no stratigraphic section. Using cluster analysis and ordination techniques, we show that the three Pliocene sectors from Chad are homogeneous and adequate sampling units. Previous stable isotope analyses on these assemblages have indicated that the environment became richer in C4 plants between approximately 5.3 and 3.5-3 Ma. To test whether this environmental change has affected the structure of palaeo-communities, we assigned body mass, trophic and locomotor eco-variables to mammal species from the three sectors. Statistical analysis shows that the overall ecological structure of the assemblages is not linked with the opening of the plant cover, and eco-variables show no temporal trend from the oldest sector to the youngest. For example, there is no significant change in the relative diversity of grazing and browsing taxa, although mixed feeders are less diversified in the youngest sector than in the preceding one. This pattern apparently does not result from potential biases such as methodological artefacts or taphonomic imprint. Instead, it seems that local heterogeneous environmental factors have played a major role in shaping the ecological spectrum of Chadian mammal palaeo-communities during the Pliocene.

  18. Field portable mobile phone based fluorescence microscopy for detection of Giardia lamblia cysts in water samples

    NASA Astrophysics Data System (ADS)

    Ceylan Koydemir, Hatice; Gorocs, Zoltan; McLeod, Euan; Tseng, Derek; Ozcan, Aydogan

    2015-03-01

    Giardia lamblia is a waterborne parasite that causes an intestinal infection known as giardiasis, and it is found not only in countries with inadequate sanitation and unsafe water but also in the streams and lakes of developed countries. Simple, sensitive, and rapid detection of this pathogen is important for the monitoring of drinking water. Here we present a cost-effective and field-portable mobile-phone-based fluorescence microscopy platform designed for automated detection of Giardia lamblia cysts in large-volume water samples (i.e., 10 ml), to be used in low-resource field settings. This fluorescence microscope is integrated with a disposable water-sampling cassette, which is based on a flow-through porous polycarbonate membrane and provides a wide surface area for fluorescence imaging and enumeration of the captured Giardia cysts on the membrane. The water sample of interest, containing fluorescently labeled Giardia cysts, is introduced by capillary action into the absorbent pads that are in contact with the membrane in the cassette, which eliminates the need for electrically driven flow during sample processing. Our fluorescence microscope weighs ~170 grams in total and has all the components of a regular microscope, capable of detecting individual fluorescently labeled cysts under light-emitting-diode (LED) based excitation. Including all the sample preparation, labeling and imaging steps, the entire measurement takes less than one hour for a sample volume of 10 ml. This mobile-phone-based, compact and cost-effective fluorescent imaging platform, together with its machine-learning-based cyst counting interface, is easy to use and can work even in resource-limited field settings for spatio-temporal monitoring of water quality.

  19. Fabrication and radio frequency test of large-area MgB2 films on niobium substrates

    DOE PAGES

    Ni, Zhimao; Guo, Xin; Welander, Paul B.; ...

    2017-01-19

    Magnesium diboride (MgB2) is a promising candidate material for superconducting radio frequency (RF) cavities because of its higher transition temperature and critical field compared with niobium. To meet the demand of RF test devices, the fabrication of large-area MgB2 films on metal substrates is needed. In this work, high quality MgB2 films of 50 mm diameter were fabricated on niobium using an improved HPCVD system at Peking University, and RF tests were carried out at SLAC National Accelerator Laboratory. The transition temperature is approximately 39.6 K and the RF surface resistance is about 120 μΩ at 4 K and 11.4 GHz. Finally, the fabrication processes, surface morphology, DC superconducting properties and RF tests of these large-area MgB2 films are presented.

  1. Test evaluation of potential heat shield contamination of an Outer Planet Probe's atmospheric sampling system

    NASA Technical Reports Server (NTRS)

    Kessler, W. C.; Woeller, F. H.; Wilkins, M. E.

    1975-01-01

    An Outer Planets Probe which retains the charred heatshield during atmospheric descent must deploy a sampling tube through the heatshield to extract atmospheric samples for analysis. Once the sampling tube is deployed, the atmospheric samples ingested must be free of contaminant gases generated by the heatshield. Outgassing products such as methane and water vapor are present in planetary atmospheres and hence, ingestion of such species would result in gas analyzer measurement uncertainties. This paper evaluates the potential for, and design impact of, the extracted atmospheric samples being contaminated by heatshield outgassing products. Flight trajectory data for Jupiter, Saturn and Uranus entries are analyzed to define the conditions resulting in the greatest potential for outgassing products being ingested into the probe's sampling system. An experimental program is defined and described which simulates the key flow field features for a planetary flight in a ground-based test facility. The primary parameters varied in the test include: sampling tube length, injectant mass flow rate and angle of attack. Measured contaminant levels predict the critical sampling tube length for contamination avoidance. Thus, the study demonstrates the compatibility of a retained heatshield concept and high quality atmospheric trace species measurements.

  2. Sampling and Data Analysis for Environmental Microbiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murray, Christopher J.

    2001-06-01

    A brief review of the literature indicates the importance of statistical analysis in applied and environmental microbiology. Sampling designs are particularly important for successful studies, and it is highly recommended that researchers review their sampling design before heading to the laboratory or the field. Most statisticians have numerous stories of scientists who approached them after their study was complete, only to be told that the data they gathered could not be used to test the hypothesis they wanted to address. Once the data are gathered, a large and complex body of statistical techniques is available for analysis of the data. Those methods include both numerical and graphical techniques for exploratory characterization of the data. Hypothesis testing and analysis of variance (ANOVA) are techniques that can be used to compare the mean and variance of two or more groups of samples. Regression can be used to examine the relationships between sets of variables and is often used to examine the dependence of microbiological populations on environmental parameters. Multivariate statistics provides several methods that can be used for interpretation of datasets with a large number of variables and to partition samples into similar groups, a task that is very common in taxonomy but also has applications in other fields of microbiology. Geostatistics and other techniques have been used to examine the spatial distribution of microorganisms. The objectives of this chapter are to provide a brief survey of some of the statistical techniques that can be used for sample design and data analysis of microbiological data in environmental studies, and to provide some examples of their use from the literature.
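    The ANOVA comparison mentioned above reduces a multi-group comparison to a single statistic. As a minimal stdlib-only illustration (my own sketch, not taken from the chapter), the one-way F statistic is the ratio of between-group to within-group mean squares:

```python
from statistics import mean

def one_way_f(*groups):
    """One-way ANOVA F statistic for two or more groups of observations:
    F = (between-group mean square) / (within-group mean square).
    Assumes at least two groups with unequal values somewhere (so the
    within-group sum of squares is nonzero)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    # variation of group means around the grand mean
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # variation of observations around their own group mean
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

    In practice one would compare F against the F distribution with (k-1, n-k) degrees of freedom (e.g. via `scipy.stats.f_oneway`, which also returns the p-value).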

  3. Relationship between large horizontal electric fields and auroral arc elements

    NASA Astrophysics Data System (ADS)

    Lanchester, B. S.; Kailá, K.; McCrea, I. W.

    1996-03-01

    High time resolution optical measurements in the magnetic zenith are compared with European Incoherent Scatter (EISCAT) field-aligned measurements of electron density at 0.2-s resolution and with horizontal electric field measurements made at 278 km with a resolution of 9 s. In one event, 20 min after a spectacular auroral breakup, a system of narrow and active arc elements moved southward into the magnetic zenith, where it remained for several minutes. During a 30-s interval of activity in a narrow arc element very close to the radar beam, the electric field vectors at 3-s resolution were found to be extremely large (up to 400 mV/m) and to point toward the bright optical features in the arc, which moved along its length. It is proposed that the large electric fields are short-lived and are directly associated with the particle precipitation that causes the bright features in auroral arc elements.

  4. Baseline-dependent sampling and windowing for radio interferometry: data compression, field-of-interest shaping, and outer field suppression

    NASA Astrophysics Data System (ADS)

    Atemkeng, M.; Smirnov, O.; Tasse, C.; Foster, G.; Keimpema, A.; Paragi, Z.; Jonas, J.

    2018-07-01

    Traditional radio interferometric correlators produce regular-gridded samples of the true uv-distribution by averaging the signal over constant, discrete time-frequency intervals. This regular sampling and averaging translates into irregular-gridded samples in the uv-space, and results in a baseline-length-dependent loss of amplitude and phase coherence that grows with distance from the image phase centre. The effect is often referred to as `decorrelation' in the uv-space, which is equivalent in the source domain to `smearing'. This work discusses and implements a regular-gridded sampling scheme in the uv-space (baseline-dependent sampling) and windowing that allow for data compression, field-of-interest shaping, and source suppression. Baseline-dependent sampling requires irregular-gridded sampling in the time-frequency space, i.e. the time-frequency interval becomes baseline dependent. Analytic models and simulations are used to show that decorrelation remains constant across all baselines when applying baseline-dependent sampling and windowing. Simulations using the MeerKAT telescope and the European Very Long Baseline Interferometry Network show that data compression, field-of-interest shaping, and outer field-of-interest suppression are all achieved.
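    The core idea, holding decorrelation constant by letting the averaging interval vary with baseline length, can be sketched as follows. This is a toy illustration under the simplifying assumption that smearing scales with baseline length times averaging time; the function name and arguments are mine, not the paper's:

```python
def averaging_interval(baseline_m, longest_baseline_m, t_min_s):
    """Baseline-dependent averaging time: the longest baseline gets the
    shortest interval t_min_s, and shorter baselines tolerate
    proportionally longer intervals for the same level of smearing.
    The data compression comes from averaging short baselines harder."""
    return t_min_s * longest_baseline_m / baseline_m

# a baseline 8x shorter than the longest can be averaged 8x longer
```

    The actual scheme in the paper additionally applies windowing functions over these baseline-dependent time-frequency bins to shape the field of interest and suppress sources outside it.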

  5. High-Resolution Large Field-of-View FUV Compact Camera

    NASA Technical Reports Server (NTRS)

    Spann, James F.

    2006-01-01

    The need for a high-resolution camera with a large field of view, capable of imaging dim emissions in the far-ultraviolet, is driven by the widely varying intensities of FUV emissions and the spatial/temporal scales of phenomena of interest in the Earth's ionosphere. In this paper, a camera concept is presented that is designed to achieve these goals in a lightweight package with sufficient visible-light rejection to be useful for both dayside and nightside emissions. The camera employs the concept of self-filtering to achieve good spectral resolution tuned to specific wavelengths. The large field of view is sufficient to image the Earth's disk at geosynchronous altitudes and is capable of a spatial resolution of >20 km. The optics and filters are emphasized.

  6. The effects of luminance contribution from large fields to chromatic visual evoked potentials.

    PubMed

    Skiba, Rafal M; Duncan, Chad S; Crognale, Michael A

    2014-02-01

    Though useful from a clinical and practical standpoint, uniform large-field chromatic stimuli are likely to contain luminance contributions from retinal inhomogeneities. Such contributions can significantly influence psychophysical thresholds. However, the degree to which small luminance artifacts influence the chromatic VEP has been debated. In particular, claims have been made that the band-pass tuning observed in chromatic VEPs results from luminance intrusion. However, no direct evidence has been presented to support these claims. Recently, large-field isoluminant stimuli have been developed to control for intrusion from retinal inhomogeneities, with particular regard to the influence of macular pigment. We report here the application of an improved version of these full-field stimuli to directly test the influence of luminance intrusion on the temporal tuning of the chromatic VEP. Our results show that band-pass tuning persists even when isoluminance is achieved throughout the extent of the stimulus. In addition, small amounts of luminance intrusion affect neither the shape of the temporal tuning function nor the major components of the VEP. These results support the conclusion that the chromatic VEP can depart substantially from threshold psychophysics with regard to temporal tuning and that obtaining a low-pass function is not requisite evidence of selective chromatic activation in the VEP. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Results from laboratory and field testing of nitrate measuring spectrophotometers

    USGS Publications Warehouse

    Snazelle, Teri T.

    2015-01-01

    In Phase II, the analyzers were deployed in field conditions at three different USGS sites. The measured nitrate concentrations were compared to discrete (reference) samples analyzed by the Direct UV method on a Shimadzu UV1800 bench top spectrophotometer, and by the National Environmental Methods Index (NEMI) method I-2548-11 at the USGS National Water Quality Laboratory. The first deployment at USGS site 0249620 on the East Pearl River in Hancock County, Mississippi, tested the ability of the TriOS ProPs (10-mm path length), Hach NITRATAX (5 mm), Satlantic SUNA (10 mm), and the S::CAN Spectro::lyser (5 mm) to accurately measure low-level (less than 2 mg-N/L) nitrate concentrations while observing the effect turbidity and colored dissolved organic matter (CDOM) would have on the analyzers' measurements. The second deployment at USGS site 01389005 Passaic River below Pompton River at Two Bridges, New Jersey, tested the analyzers' accuracy in mid-level (2-8 mg-N/L) nitrate concentrations. This site provided the means to test the analyzers' performance in two distinct matrices—the Passaic and the Pompton Rivers. In this deployment, three instruments tested in Phase I (TriOS, Hach, and SUNA) were deployed with the S::CAN Spectro::lyser (35 mm) already placed by the New Jersey Water Science Center (WSC). The third deployment at USGS site 05579610 Kickapoo Creek at 2100E Road near Bloomington, Illinois, tested the ability of the analyzers to measure high nitrate concentrations (greater than 8 mg-N/L) in turbid waters. For Kickapoo Creek, the HIF provided the TriOS (10 mm) and S::CAN (5 mm) from Phase I, and a SUNA V2 (5 mm) to be deployed adjacent to the Illinois WSC-owned Hach (2 mm). A total of 40 discrete samples were collected from the three deployment sites and analyzed. The nitrate concentration of the samples ranged from 0.3-22.2 mg-N/L. The average absolute difference between the TriOS measurements and discrete samples was 0.46 mg-N/L. For the combined data

  8. Low-Field-Triggered Large Magnetostriction in Iron-Palladium Strain Glass Alloys.

    PubMed

    Ren, Shuai; Xue, Dezhen; Ji, Yuanchao; Liu, Xiaolian; Yang, Sen; Ren, Xiaobing

    2017-09-22

    Development of miniaturized magnetostriction-associated devices requires low-field-triggered large magnetostriction. In this study, we acquired a large magnetostriction (800 ppm) triggered by a low saturation field (0.8 kOe) in iron-palladium (Fe-Pd) alloys. Magnetostriction enhancement jumping from 340 to 800 ppm was obtained with a slight increase in Pd concentration from 31.3 to 32.3 at. %. Further analysis showed that such a slight increase led to suppression of the long-range ordered martensitic phase and resulted in a frozen short-range ordered strain glass state. This strain glass state possessed a two-phase nanostructure with nanosized frozen strain domains embedded in the austenite matrix, which was responsible for the unique magnetostriction behavior. Our study provides a way to design novel magnetostrictive materials with low-field-triggered large magnetostriction.

  9. Large-Scale Multiobjective Static Test Generation for Web-Based Testing with Integer Programming

    ERIC Educational Resources Information Center

    Nguyen, M. L.; Hui, Siu Cheung; Fong, A. C. M.

    2013-01-01

    Web-based testing has become a ubiquitous self-assessment method for online learning. One useful feature that is missing from today's web-based testing systems is the reliable capability to fulfill different assessment requirements of students based on a large-scale question data set. A promising approach for supporting large-scale web-based…

  10. Penetration of Large Scale Electric Field to Inner Magnetosphere

    NASA Astrophysics Data System (ADS)

    Chen, S. H.; Fok, M. C. H.; Sibeck, D. G.; Wygant, J. R.; Spence, H. E.; Larsen, B.; Reeves, G. D.; Funsten, H. O.

    2015-12-01

    The direct penetration of large-scale global electric fields into the inner magnetosphere is a critical element in controlling how the background thermal plasma populates the radiation belts. These plasma populations provide the source of particles and free energy needed for the generation and growth of various plasma waves that, at critical points of resonance in time and phase space, can scatter or energize radiation belt particles to regulate the flux level of the relativistic electrons in the system. At high geomagnetic activity levels, the distribution of large-scale electric fields serves as an important indicator of how prevalent strong wave-particle interactions are across local times and radial distances. To understand the complex relationship between the global electric fields and thermal plasmas, particularly the effects of the ionospheric dynamo and magnetospheric convection, and their relation to geomagnetic activity, we analyze electric field and cold plasma measurements from the Van Allen Probes over a period of more than two years and simulate a geomagnetic storm event using the Coupled Inner Magnetosphere-Ionosphere Model (CIMI). Our statistical analysis of the Van Allen Probes measurements and CIMI simulations of the March 17, 2013 storm event indicate that: (1) the global dawn-dusk electric field can penetrate the inner magnetosphere inside the inner belt below L~2. (2) Stronger convection occurred in the dusk and midnight sectors than in the noon and dawn sectors. (3) Strong convection at multiple locations exists at all activity levels but is more complex at higher activity levels. (4) At high activity levels, the strongest convection occurs in the midnight sector at larger distances from the Earth and in the dusk sector at closer distances. (5) Two plasma populations of distinct ion temperature anisotropy are divided at L-shell ~2, indicating distinct heating mechanisms between the inner and outer radiation belts. (6) CIMI

  11. Microbial Community Analysis of Naturally Durable Wood in an Above Ground Field Test

    Treesearch

    G.T. Kirker; S.V. Diehl; P.K. Lebow

    2014-01-01

    This paper presents preliminary results of an above ground field test wherein eight naturally durable wood species were exposed concurrently at two sites in North America. Surface samples were taken at regular intervals from non-durable controls and compared to their more durable counterparts. Terminal Restriction Fragment Length Polymorphism was performed to...

  12. Blue Whale Behavioral Response Study & Field Testing of the New Bioacoustic Probe

    DTIC Science & Technology

    2012-09-30

    L. T. HATCH and C. W. CLARK. 2003. Variation in humpback whale (Megaptera novaeangliae) song length in relation to low-frequency sound broadcasts...1 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Blue Whale Behavioral Response Study & Field Testing of...ucsd.edu Award Number: N000140811221 LONG-TERM GOALS Task 1: Blue Whales Behavioral Response Study The behavioral response of large whales

  13. Fast solver for large scale eddy current non-destructive evaluation problems

    NASA Astrophysics Data System (ADS)

    Lei, Naiguang

    Eddy current testing plays a very important role in the non-destructive evaluation of conducting test samples. Based on Faraday's law, an alternating magnetic field source generates induced currents, called eddy currents, in an electrically conducting test specimen. The eddy currents generate induced magnetic fields that oppose the direction of the inducing magnetic field in accordance with Lenz's law. In the presence of discontinuities in material properties or defects in the test specimen, the induced eddy current paths are perturbed, and the associated magnetic fields can be detected by coils or magnetic field sensors, such as Hall elements or magneto-resistance sensors. Due to the complexity of test specimens and inspection environments, theoretical simulation models are extremely valuable for studying the basic field/flaw interactions in order to obtain a fuller understanding of non-destructive testing phenomena. Theoretical models of the forward problem are also useful for training and validating automated defect detection systems, since they can generate defect signatures that would be expensive to replicate experimentally. In general, modelling methods fall into two categories: analytical and numerical. Although analytical approaches offer closed-form solutions, such solutions are generally unobtainable, largely due to the complex sample and defect geometries involved, especially in three-dimensional space. Numerical modelling has become popular with advances in computer technology and computational methods. However, because large-scale problems are very time consuming, accelerations/fast solvers are needed to enhance numerical models. This dissertation describes a numerical simulation model for eddy current problems using finite element analysis. The accuracy of this model is validated via comparison with experimental measurements of steam generator tube wall defects. These simulations generating two
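    The induction physics described above also fixes how deep an eddy current inspection can see. As a quick illustration, the standard depth of penetration (skin depth) follows a textbook relation; the function below is illustrative only and is not part of the dissertation's solver:

    ```python
    import numpy as np

    MU0 = 4e-7 * np.pi  # vacuum permeability (H/m)

    def skin_depth(freq, sigma, mu_r=1.0):
        """Standard depth of penetration for eddy current testing.

        delta = 1 / sqrt(pi * f * mu * sigma), where freq is in Hz,
        sigma is the conductivity in S/m, and mu_r the relative
        permeability. Eddy current density falls to 1/e of its surface
        value at this depth.
        """
        return 1.0 / np.sqrt(np.pi * freq * MU0 * mu_r * sigma)
    ```

    For copper (sigma ≈ 5.8e7 S/m) at 60 Hz this gives roughly 8.5 mm, which is why higher test frequencies concentrate the response near the surface.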

  14. Linear models for airborne-laser-scanning-based operational forest inventory with small field sample size and highly correlated LiDAR data

    USGS Publications Warehouse

    Junttila, Virpi; Kauranne, Tuomo; Finley, Andrew O.; Bradford, John B.

    2015-01-01

    Modern operational forest inventory often uses remotely sensed data that cover the whole inventory area to produce spatially explicit estimates of forest properties through statistical models. The data obtained by airborne light detection and ranging (LiDAR) correlate well with many forest inventory variables, such as the tree height, the timber volume, and the biomass. To construct an accurate model over thousands of hectares, LiDAR data must be supplemented with several hundred field sample measurements of forest inventory variables. This can be costly and time consuming. Different LiDAR-data-based and spatial-data-based sampling designs can reduce the number of field sample plots needed. However, problems arising from the features of the LiDAR data, such as a large number of predictors compared with the sample size (overfitting) or a strong correlation among predictors (multicollinearity), may decrease the accuracy and precision of the estimates and predictions. To overcome these problems, a Bayesian linear model with the singular value decomposition of predictors, combined with regularization, is proposed. The model performance in predicting different forest inventory variables is verified in ten inventory areas from two continents, where the number of field sample plots is reduced using different sampling designs. The results show that, with an appropriate field plot selection strategy and the proposed linear model, the total relative error of the predicted forest inventory variables is only 5%–15% larger using 50 field sample plots than the error of a linear model estimated with several hundred field sample plots when we sum up the error due to both the model noise variance and the model’s lack of fit.
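    The core numerical idea in the abstract, taming multicollinearity and overfitting by combining the singular value decomposition of the predictors with regularization, can be sketched as a ridge-type fit computed through the SVD. This is a minimal illustration with hypothetical function names, not the paper's full Bayesian model:

    ```python
    import numpy as np

    def svd_ridge_fit(X, y, lam=1.0):
        """Ridge-regularized least squares via the SVD of the predictor matrix.

        With many correlated LiDAR-derived predictors and few field plots,
        the SVD form is numerically stable:
            beta = V diag(s / (s^2 + lam)) U^T y
        Small singular values (the collinear directions) are shrunk the most.
        """
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        d = s / (s**2 + lam)              # shrunken inverse singular values
        return Vt.T @ (d * (U.T @ y))

    def svd_ridge_predict(X, beta):
        """Predict forest inventory variables for new plots."""
        return X @ beta
    ```

    As lam → 0 this reproduces ordinary least squares where it is well posed; increasing lam trades a little bias for much lower variance when plots are scarce.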

  15. Beta Testing StraboSpot: Perspectives on mobile field mapping and data collection

    NASA Astrophysics Data System (ADS)

    Bunse, E.; Graham, K. A.; Rufledt, C.; Walker, J. D.; Müller, A.; Tikoff, B.

    2017-12-01

    Geologic field mapping has recently transitioned away from traditional techniques (e.g. field notebooks, paper mapping, Brunton compasses) and towards mobile `app' mapping technology. The StraboSpot system (Strabo) is an open-source solution for collection and storage for geologic field, microstructural, and lab-based data. Strabo's mission is to "enable recording and sharing data within the geoscience community, encourage interdisciplinary research, and facilitate the investigation of scientific questions that cannot currently be addressed" (Walker et al., 2015). Several mobile application beta tests of the system, on both Android and Apple iOS platforms using smartphones and tablets, began in Summer 2016. Students at the 2016 and 2017 University of Kansas Field Camps used Strabo in place of ArcGIS for Desktop on Panasonic Toughbooks, to field map two study areas. Strabo was also field tested by students of graduate and undergraduate level for both geo/thermochronologic sample collection and reconnaissance mapping associated with regional tectonic analysis in California. Throughout this period of testing, the app was geared toward structural and tectonic geologic data collection, but is versatile enough for other communities to currently use and is expanding to accommodate the sedimentology and petrology communities. Overall, users in each of the beta tests acclimated quickly to using Strabo for field data collection. Some key advantages to using Strabo over traditional mapping methods are: (1) Strabo allows for consolidation of materials in the field; (2) helps students track their position in the field with integrated GPS; and (3) Strabo data is in a uniform format making it simple for geologists to collaborate. While traditional field methods are not likely to go out of style in the near future, Strabo acts as a bridge between professional and novice geologists by providing a tool that is intuitive on all levels of geological and technological experience and

  16. DSM-5 field trials in the United States and Canada, Part I: study design, sampling strategy, implementation, and analytic approaches.

    PubMed

    Clarke, Diana E; Narrow, William E; Regier, Darrel A; Kuramoto, S Janet; Kupfer, David J; Kuhl, Emily A; Greiner, Lisa; Kraemer, Helena C

    2013-01-01

    This article discusses the design, sampling strategy, implementation, and data analytic processes of the DSM-5 Field Trials. The DSM-5 Field Trials were conducted by using a test-retest reliability design with a stratified sampling approach across six adult and four pediatric sites in the United States and one adult site in Canada. A stratified random sampling approach was used to enhance precision in the estimation of the reliability coefficients. A web-based research electronic data capture system was used for simultaneous data collection from patients and clinicians across sites and for centralized data management. Weighted descriptive analyses, intraclass kappa and intraclass correlation coefficients for stratified samples, and receiver operating curves were computed. The DSM-5 Field Trials capitalized on advances since DSM-III and DSM-IV in statistical measures of reliability (i.e., intraclass kappa for stratified samples) and other recently developed measures to determine confidence intervals around kappa estimates. Diagnostic interviews using DSM-5 criteria were conducted by 279 clinicians of varied disciplines who received training comparable to what would be available to any clinician after publication of DSM-5. Overall, 2,246 patients with various diagnoses and levels of comorbidity were enrolled, of which over 86% were seen for two diagnostic interviews. A range of reliability coefficients were observed for the categorical diagnoses and dimensional measures. Multisite field trials and training comparable to what would be available to any clinician after publication of DSM-5 provided “real-world” testing of DSM-5 proposed diagnoses.
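    The basic test-retest statistic behind such reliability estimates can be illustrated with plain Cohen's kappa for two diagnostic ratings of the same patients. This simplified helper (written for illustration) omits the stratification and intraclass corrections the field trials actually used:

    ```python
    import numpy as np

    def cohens_kappa(r1, r2):
        """Chance-corrected agreement between two categorical ratings.

        kappa = (p_observed - p_chance) / (1 - p_chance), where chance
        agreement is computed from the marginal rates of each rater.
        """
        r1, r2 = np.asarray(r1), np.asarray(r2)
        cats = np.union1d(r1, r2)
        po = np.mean(r1 == r2)                                  # observed agreement
        pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)
        return (po - pe) / (1.0 - pe)
    ```

    Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, which is why it is preferred over raw percent agreement for diagnoses with skewed base rates.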

  17. Termiticide field test report: New termiticides & emerging technologies

    Treesearch

    Bradford M. Kard

    2000-01-01

    Testing at our nationwide field sites determines the years-of-effectiveness of currently marketed and potentially new termiticides as treatments to soil under long-term field conditions. Several new termiticide candidates and formulations have been placed in the field during the past four years and will be reported on after they complete five years of testing.

  18. Detection of antibodies to egg drop syndrome virus in chicken serum using a field-based immunofiltration (flow-through) test.

    PubMed

    Raj, G Dhinakar; Thiagarajan, V; Nachimuthu, K

    2007-09-01

    A simple, user-friendly, and rapid method to detect the presence of antibodies to egg drop syndrome 76 (EDS) virus in chicken sera based on an immunofiltration (flow-through) test was developed. Purified EDS virus antigen was coated onto nitrocellulose membranes housed in a plastic module with layers of absorbent filter pads underneath. Following addition of serum to be tested and washing, monoclonal antibodies or polyclonal serum to chicken immunoglobulin G (IgG) was used as a bridge antibody to mediate binding between EDS virus-specific IgG and protein A gold conjugate. The appearance of a pink dot indicated the presence of antibodies to EDS virus in the sample tested. The results could be obtained within 5-10 min. The developed immunofiltration test could detect antibodies in the sera of experimentally vaccinated chickens from 2 wk postvaccination. With field sera samples, this test was positive in samples having hemagglutination inhibition titers of 8 and above. This test has the potential to be used as a field-based kit to assess seroconversion in EDS-vaccinated flocks.

  19. 46 CFR 160.055-7 - Sampling, tests, and inspections.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 6 2011-10-01 2011-10-01 false Sampling, tests, and inspections. 160.055-7 Section 160.055-7 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) EQUIPMENT, CONSTRUCTION, AND... and Child, for Merchant Vessels § 160.055-7 Sampling, tests, and inspections. (a) Production tests and...

  20. 46 CFR 160.055-7 - Sampling, tests, and inspections.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 6 2010-10-01 2010-10-01 false Sampling, tests, and inspections. 160.055-7 Section 160.055-7 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) EQUIPMENT, CONSTRUCTION, AND... and Child, for Merchant Vessels § 160.055-7 Sampling, tests, and inspections. (a) Production tests and...

  1. 46 CFR 160.002-5 - Sampling, tests, and inspections.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 6 2011-10-01 2011-10-01 false Sampling, tests, and inspections. 160.002-5 Section 160.002-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) EQUIPMENT, CONSTRUCTION, AND... Type), Models 3 and 5 § 160.002-5 Sampling, tests, and inspections. (a) Production tests and...

  2. 46 CFR 160.002-5 - Sampling, tests, and inspections.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 6 2010-10-01 2010-10-01 false Sampling, tests, and inspections. 160.002-5 Section 160.002-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) EQUIPMENT, CONSTRUCTION, AND... Type), Models 3 and 5 § 160.002-5 Sampling, tests, and inspections. (a) Production tests and...

  3. Theoretical study on the laser-driven ion-beam trace probe in toroidal devices with large poloidal magnetic field

    NASA Astrophysics Data System (ADS)

    Yang, X.; Xiao, C.; Chen, Y.; Xu, T.; Yu, Y.; Xu, M.; Wang, L.; Wang, X.; Lin, C.

    2018-03-01

    Recently, a new diagnostic method, the Laser-driven Ion-beam Trace Probe (LITP), has been proposed to reconstruct 2D profiles of the poloidal magnetic field (Bp) and radial electric field (Er) in tokamak devices. A linear assumption and a test particle model were used in those reconstructions. In some toroidal devices, such as the spherical tokamak and the reversed field pinch (RFP), Bp is not small enough to meet the linear assumption. In those cases, the reconstruction error increases quickly when Bp is larger than 10% of the toroidal magnetic field (Bt), and the previous test particle model may cause large errors in the tomography process. Here a nonlinear reconstruction method is proposed for those cases. Preliminary numerical results show that LITP could be applied not only in tokamak devices but also in other toroidal devices, such as the spherical tokamak, the RFP, etc.

  4. [The assessment of radionuclide contamination and toxicity of soils sampled from "experimental field" site of Semipalatinsk nuclear test site].

    PubMed

    Evseeva, T I; Maĭstrenko, T A; Belykh, E S; Geras'kin, S A; Kriazheva, E Iu

    2009-01-01

    Large-scale maps (1:25000) of soil contamination with radionuclides, of the lateral distribution of 137Cs, 90Sr, Fe and Mn water-soluble compounds, and of soil toxicity in the "Experimental field" site of the Semipalatinsk nuclear test site were charted. At present, the soils of the studied site (4 km2) are not classified as radioactive waste under the basic sanitary standards of radiation safety adopted in the Russian Federation (OSPORB), based on artificial radionuclide concentrations, but they are so classified under the IAEA safety guide. The soils studied cannot be released from regulatory control until the year 2106, owing to the radioactive decay of 137Cs and 90Sr and the accumulation-decay of 241Am, according to the IAEA concept of exclusion, exemption and clearance. Data from the bioassay "increase of Chlorella vulgaris Beijer biomass production in aqueous extract from soils" show that the largest part of the soils of the studied site (74%) either stimulates or insignificantly influences algae reproduction through the effect of water-soluble compounds. Toxic soils occupy 26% of the territory. The main factors affecting algae reproduction in the aqueous soil extracts are Fe concentration and 90Sr specific activity: 90Sr inhibits, while Fe stimulates, algae biomass production.

  5. Heritability of metabolic syndrome traits in a large population-based sample

    PubMed Central

    van Dongen, Jenny; Willemsen, Gonneke; Chen, Wei-Min; de Geus, Eco J. C.; Boomsma, Dorret I.

    2013-01-01

    Heritability estimates of metabolic syndrome traits vary widely across studies. Some studies have suggested that the contribution of genes may vary with age or sex. We estimated the heritability of 11 metabolic syndrome-related traits and height as a function of age and sex in a large population-based sample of twin families (N = 2,792–27,021, for different traits). A moderate-to-high heritability was found for all traits [from H2 = 0.47 (insulin) to H2 = 0.78 (BMI)]. The broad-sense heritability (H2) showed little variation between age groups in women; it differed somewhat more in men (e.g., for glucose, H2 = 0.61 in young females, H2 = 0.56 in older females, H2 = 0.64 in young males, and H2= 0.27 in older males). While nonadditive genetic effects explained little variation in the younger subjects, nonadditive genetic effects became more important at a greater age. Our findings show that in an unselected sample (age range, ∼18–98 years), the genetic contribution to individual differences in metabolic syndrome traits is moderate to large in both sexes and across age. Although the prevalence of the metabolic syndrome has greatly increased in the past decades due to lifestyle changes, our study indicates that most of the variation in metabolic syndrome traits between individuals is due to genetic differences. PMID:23918046
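    The twin-design logic behind such heritability estimates can be illustrated with the classical Falconer approximation, H2 ≈ 2(r_MZ − r_DZ): monozygotic twins share all their genes, dizygotic twins on average half. This is a rough textbook sketch, not the structural-equation modelling the study itself used:

    ```python
    import numpy as np

    def falconer_h2(mz_pairs, dz_pairs):
        """Falconer-style broad-sense heritability estimate from twin pairs.

        mz_pairs, dz_pairs: (n, 2) arrays of a trait measured in both
        members of monozygotic and dizygotic twin pairs. The estimate is
        H2 = 2 * (r_MZ - r_DZ), where r is the within-pair correlation.
        """
        r_mz = np.corrcoef(mz_pairs[:, 0], mz_pairs[:, 1])[0, 1]
        r_dz = np.corrcoef(dz_pairs[:, 0], dz_pairs[:, 1])[0, 1]
        return 2.0 * (r_mz - r_dz)
    ```

    Under a purely additive model, r_MZ ≈ H2 and r_DZ ≈ H2/2, so the doubled difference recovers the heritability; nonadditive effects of the kind the study reports in older subjects inflate r_MZ relative to r_DZ.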

  6. Field Testing of Electronic Registering Fareboxes

    DOT National Transportation Integrated Search

    1987-02-01

    This report presents the findings of an evaluation of electronic registering fareboxes in field testing at the Detroit Department of Transportation. Thirty-two fareboxes were tested on one bus route, Woodward Avenue, which was selected because it gen...

  7. Sampling large geographic areas for rare species using environmental DNA: A study of bull trout Salvelinus confluentus occupancy in western Montana

    Treesearch

    Kevin McKelvey; Michael Young; W. L. Knotek; K. J. Carim; T. M. Wilcox; T. M. Padgett-Stewart; Michael Schwartz

    2016-01-01

    This study tested the efficacy of environmental DNA (eDNA) sampling to delineate the distribution of bull trout Salvelinus confluentus in headwater streams in western Montana, U.S.A. Surveys proved fast, reliable and sensitive: 124 samples were collected across five basins by a single crew in c. 8 days. Results were largely consistent with past electrofishing,...

  8. Constraining Large-Scale Solar Magnetic Field Models with Optical Coronal Observations

    NASA Astrophysics Data System (ADS)

    Uritsky, V. M.; Davila, J. M.; Jones, S. I.

    2015-12-01

    Scientific success of the Solar Probe Plus (SPP) and Solar Orbiter (SO) missions will depend to a large extent on the accuracy of the available coronal magnetic field models describing the connectivity of plasma disturbances in the inner heliosphere with their source regions. We argue that ground-based and satellite coronagraph images can provide robust geometric constraints for the next generation of improved coronal magnetic field extrapolation models. In contrast to the previously proposed loop segmentation codes designed for detecting compact closed-field structures above solar active regions, we focus on the large-scale geometry of the open-field coronal regions located at significant radial distances from the solar surface. Details of the new feature detection algorithms will be presented. By applying the developed image processing methodology to high-resolution Mauna Loa Solar Observatory images, we perform an optimized 3D B-line tracing for a full Carrington rotation using the magnetic field extrapolation code presented in a companion talk by S. Jones et al. Tracing results are shown to be in good qualitative agreement with the large-scale configuration of the optical corona. Subsequent phases of the project, the related data products for the SPP and SO missions, and the supporting global heliospheric simulations will be discussed.

  9. Heat tracer test in an alluvial aquifer: Field experiment and inverse modelling

    NASA Astrophysics Data System (ADS)

    Klepikova, Maria; Wildemeersch, Samuel; Hermans, Thomas; Jamin, Pierre; Orban, Philippe; Nguyen, Frédéric; Brouyère, Serge; Dassargues, Alain

    2016-09-01

    Using heat as an active tracer for aquifer characterization is a topic of increasing interest. In this study, we investigate the potential of heat tracer tests for characterizing a shallow alluvial aquifer. A thermal tracer test was conducted in the alluvial aquifer of the Meuse River, Belgium. The tracing experiment consisted of simultaneously injecting heated water and a dye tracer into an injection well and monitoring the evolution of groundwater temperature and tracer concentration in the pumping well and in measurement intervals. To gain insight into the 3D characteristics of the heat transport mechanisms, temperature data from a large number of observation wells closely spaced along three transects were used. Temperature breakthrough curves in the observation wells are contrasted with what would be expected in an ideal layered aquifer. They reveal strongly unequal lateral and vertical components of the transport mechanisms. The observed complex behavior of the heat plume is explained by the groundwater flow gradient on the site and heterogeneities in the hydraulic conductivity field. Moreover, because of the high injection temperatures used during the field experiment, a temperature-induced fluid density effect on heat transport occurred. By using a variable-density flow and heat transport numerical model coupled with a pilot point approach for inversion of the hydraulic conductivity field, the main preferential flow paths were delineated. The successful application of a field heat tracer test at this site suggests that heat tracer tests are a promising approach for imaging the hydraulic conductivity field. This methodology could be applied in aquifer thermal energy storage (ATES) projects to assess future efficiency, which is strongly linked to the hydraulic conductivity variability in the considered aquifer.
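    One textbook relation underlying the interpretation of such heat tracer tests is the thermal retardation factor, which predicts how much slower the heat plume travels than the groundwater because heat also loads the solid matrix. The sketch below uses assumed representative volumetric heat capacities and is not taken from the study:

    ```python
    def thermal_retardation(porosity, rho_c_solid=2.2e6, rho_c_water=4.18e6):
        """Thermal retardation factor R = (rho*c)_bulk / (n * (rho*c)_water).

        porosity: aquifer porosity n (fraction of volume that is water).
        rho_c_solid, rho_c_water: volumetric heat capacities in J/m^3/K
        (assumed textbook defaults for sediment grains and water).
        The advecting heat front moves ~R times slower than the water.
        """
        rho_c_bulk = porosity * rho_c_water + (1.0 - porosity) * rho_c_solid
        return rho_c_bulk / (porosity * rho_c_water)
    ```

    For a typical alluvial porosity of 0.3 this gives R ≈ 2.2, i.e. the thermal breakthrough in an observation well arrives roughly twice as late as the dye tracer, which is one reason combined heat-and-dye experiments like the one described are informative.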

  10. Disruption of circumstellar discs by large-scale stellar magnetic fields

    NASA Astrophysics Data System (ADS)

    ud-Doula, Asif; Owocki, Stanley P.; Kee, Nathaniel Dylan

    2018-05-01

    Spectropolarimetric surveys reveal that 8-10% of OBA stars harbor large-scale magnetic fields, but thus far no such fields have been detected in any classical Be stars. Motivated by this, we present here MHD simulations for how a pre-existing Keplerian disc - like that inferred to form from decretion of material from rapidly rotating Be stars - can be disrupted by a rotation-aligned stellar dipole field. For characteristic stellar and disc parameters of a near-critically rotating B2e star, we find that a polar surface field strength of just 10 G can significantly disrupt the disc, while a field of 100 G, near the observational upper limit inferred for most Be stars, completely destroys the disc over just a few days. Our parameter study shows that the efficacy of this magnetic disruption of a disc scales with the characteristic plasma beta (defined as the ratio between thermal and magnetic pressure) in the disc, but is surprisingly insensitive to other variations, e.g. in stellar rotation speed, or the mass loss rate of the star's radiatively driven wind. The disc disruption seen here for even a modest field strength suggests that the presumed formation of such Be discs by decretion of material from the star would likely be strongly inhibited by such fields; this provides an attractive explanation for why no large-scale fields are detected from such Be stars.
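    The plasma beta invoked in the parameter study is simply the ratio of thermal to magnetic pressure. A minimal SI-unit sketch (illustrative only, not tied to the authors' MHD setup):

    ```python
    import numpy as np

    MU0 = 4e-7 * np.pi          # vacuum permeability (H/m)
    K_B = 1.380649e-23          # Boltzmann constant (J/K)

    def plasma_beta(n, T, B):
        """beta = thermal pressure / magnetic pressure = n k T / (B^2 / 2 mu0).

        n: number density (m^-3), T: temperature (K), B: field strength (T).
        Low beta means the magnetic field dominates the disc dynamics.
        """
        return n * K_B * T / (B**2 / (2.0 * MU0))
    ```

    Because beta scales as n T / B^2, the finding that disruption efficacy tracks beta is equivalent to saying a denser or hotter disc needs a proportionally stronger surface field to be torn apart.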

  11. Tests and applications of nonlinear force-free field extrapolations in spherical geometry

    NASA Astrophysics Data System (ADS)

    Guo, Y.; Ding, M. D.

    2013-07-01

    We test a nonlinear force-free field (NLFFF) optimization code in spherical geometry against an analytical solution from Low and Lou. The potential field source surface (PFSS) model serves as the initial and boundary conditions where observed data are not available. The analytical solution can be well recovered if the boundary and initial conditions are properly handled. Next, we discuss the preprocessing procedure for the noisy bottom boundary data, and find that preprocessing is necessary for NLFFF extrapolations when we use the observed photospheric magnetic field as the bottom boundary. Finally, we apply the NLFFF model to a solar area where four active regions interact with each other. An M8.7 flare occurred in one of the active regions. NLFFF modeling in spherical geometry simultaneously constructs the small- and large-scale magnetic field configurations better than the PFSS model does.

  12. 46 CFR 160.005-5 - Sampling, tests, and inspections.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 6 2010-10-01 2010-10-01 false Sampling, tests, and inspections. 160.005-5 Section 160.005-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) EQUIPMENT, CONSTRUCTION, AND... (Jacket Type), Models 52 and 56 § 160.005-5 Sampling, tests, and inspections. (a) Production tests and...

  13. 46 CFR 160.005-5 - Sampling, tests, and inspections.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 6 2011-10-01 2011-10-01 false Sampling, tests, and inspections. 160.005-5 Section 160.005-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) EQUIPMENT, CONSTRUCTION, AND... (Jacket Type), Models 52 and 56 § 160.005-5 Sampling, tests, and inspections. (a) Production tests and...

  14. 43 CFR 3162.4-2 - Samples, tests, and surveys.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 43 Public Lands: Interior 2 2014-10-01 2014-10-01 false Samples, tests, and surveys. 3162.4-2... for Operating Rights Owners and Operators § 3162.4-2 Samples, tests, and surveys. (a) During the... tests, run logs, and make other surveys reasonably necessary to determine the presence, quantity, and...

  15. 43 CFR 3162.4-2 - Samples, tests, and surveys.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Samples, tests, and surveys. 3162.4-2... for Operating Rights Owners and Operators § 3162.4-2 Samples, tests, and surveys. (a) During the... tests, run logs, and make other surveys reasonably necessary to determine the presence, quantity, and...

  16. 43 CFR 3162.4-2 - Samples, tests, and surveys.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false Samples, tests, and surveys. 3162.4-2... for Operating Rights Owners and Operators § 3162.4-2 Samples, tests, and surveys. (a) During the... tests, run logs, and make other surveys reasonably necessary to determine the presence, quantity, and...

  17. A two-stage design for multiple testing in large-scale association studies.

    PubMed

    Wen, Shu-Hui; Tzeng, Jung-Ying; Kao, Jau-Tsuen; Hsiao, Chuhsing Kate

    2006-01-01

    Modern association studies often involve a large number of markers and hence may encounter the problem of testing multiple hypotheses. Traditional procedures are usually overly conservative, with low power to detect mild genetic effects. From the design perspective, we propose a two-stage selection procedure to address this concern. Our main principle is to reduce the total number of tests by removing clearly unassociated markers in the first-stage test. Next, conditional on the findings of the first stage, which uses a less stringent nominal level, a more conservative test is conducted in the second stage using the augmented data together with the data from the first stage. Previous studies have suggested using independent samples to avoid inflated errors. However, we found that, after accounting for the dependence between these two samples, the true discovery rate increases substantially. In addition, the cost of genotyping can be greatly reduced via this approach. Results from a study of hypertriglyceridemia and from simulations suggest that the two-stage method has a higher overall true positive rate (TPR) with a controlled overall false positive rate (FPR) when compared with single-stage approaches. We also report the analytical form of its overall FPR, which may be useful in guiding study design to achieve a high TPR while retaining the desired FPR.
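    The two-stage logic can be sketched as a liberal first-stage filter followed by a stricter test on the pooled (augmented) data. This toy version with hypothetical names uses one-sample t-tests per marker and is an illustration of the design principle, not the authors' actual procedure:

    ```python
    import numpy as np
    from scipy import stats

    def two_stage_select(x1, x2, alpha1=0.1, alpha2=0.001):
        """Two-stage marker screening (illustrative sketch).

        x1, x2: (n_samples, n_markers) per-subject effect scores for the
        first-stage sample and the augmentation sample. Stage 1 tests each
        marker at a liberal level to discard clearly null markers; stage 2
        retests the survivors on the pooled data (reusing stage-1 subjects,
        i.e. dependent samples) at a stricter level.
        """
        p1 = stats.ttest_1samp(x1, 0.0, axis=0).pvalue
        keep = p1 < alpha1                     # liberal first-stage filter
        pooled = np.vstack([x1, x2])           # augmented data reuses stage 1
        p2 = stats.ttest_1samp(pooled, 0.0, axis=0).pvalue
        return keep & (p2 < alpha2)
    ```

    Reusing the stage-1 subjects in the pooled test is exactly the dependence the paper accounts for; most markers are genotyped only on the smaller first-stage sample, which is where the genotyping savings come from.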

  18. Accuracy of accommodation in heterophoric patients: testing an interaction model in a large clinical sample.

    PubMed

    Hasebe, Satoshi; Nonaka, Fumitaka; Ohtsuki, Hiroshi

    2005-11-01

    A model of the cross-link interactions between accommodation and convergence predicted that heterophoria can induce large accommodation errors (Schor, Ophthalmic Physiol. Opt. 1999;19:134-150). In 99 consecutive patients with intermittent tropia or decompensated phoria, we tested these interactions by comparing their accommodative responses to a 2.50-D target under binocular fused conditions (BFC) and monocular occluded conditions (MOC). The accommodative response in BFC frequently differed from that in MOC. The magnitude of the accommodative errors in BFC, ranging from an accommodative lag of 1.80 D (in an esophoric patient) to an accommodative lead of 1.56 D (in an exophoric patient), was correlated with distance heterophoria and uncorrected refractive errors. These results indicate that heterophoria affects the accuracy of accommodation to various degrees, as the model predicted, and that an accommodative error larger than the depth of focus of the eye occurs in exchange for binocular single vision in some heterophoric patients.

  19. Geometric quantification of features in large flow fields.

    PubMed

    Kendall, Wesley; Huang, Jian; Peterka, Tom

    2012-01-01

    Interactive exploration of flow features in large-scale 3D unsteady-flow data is one of the most challenging visualization problems today. To comprehensively explore the complex feature spaces in these datasets, a proposed system employs a scalable framework for investigating a multitude of characteristics from traced field lines. This capability supports the examination of various neighborhood-based geometric attributes in concert with other scalar quantities. Such an analysis wasn't previously possible because of the large computational overhead and I/O requirements. The system integrates visual analytics methods by letting users procedurally and interactively describe and extract high-level flow features. An exploration of various phenomena in a large global ocean-modeling simulation demonstrates the approach's generality and expressiveness as well as its efficacy.
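    The traced field lines these geometric attributes are derived from can be computed with a classic fourth-order Runge-Kutta integrator. The sketch below is a minimal serial illustration with hypothetical names, not the paper's scalable parallel framework:

    ```python
    import numpy as np

    def trace_field_line(v, seed, h=0.1, steps=100):
        """Trace a streamline through a steady vector field with classic RK4.

        v: callable mapping a 3D point to the local flow vector.
        seed: starting point; h: step size; steps: number of RK4 steps.
        Returns the (steps+1, 3) array of traced points, from which
        neighborhood-based geometric attributes (length, curvature, winding)
        can then be computed.
        """
        p = np.asarray(seed, dtype=float)
        pts = [p]
        for _ in range(steps):
            k1 = v(p)
            k2 = v(p + 0.5 * h * k1)
            k3 = v(p + 0.5 * h * k2)
            k4 = v(p + h * k3)
            p = p + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
            pts.append(p)
        return np.array(pts)
    ```

    In an unsteady-flow setting like the ocean simulation, v would also take time and the trace becomes a pathline, but the integration scheme is the same.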

  20. Large-Scale Low-Boom Inlet Test Overview

    NASA Technical Reports Server (NTRS)

    Hirt, Stefanie

    2011-01-01

    This presentation provides a high level overview of the Large-Scale Low-Boom Inlet Test and was presented at the Fundamental Aeronautics 2011 Technical Conference. In October 2010 a low-boom supersonic inlet concept with flow control was tested in the 8'x6' supersonic wind tunnel at NASA Glenn Research Center (GRC). The primary objectives of the test were to evaluate the inlet stability and operability of a large-scale low-boom supersonic inlet concept by acquiring performance and flowfield validation data, as well as evaluate simple, passive, bleedless inlet boundary layer control options. During this effort two models were tested: a dual stream inlet intended to model potential flight hardware and a single stream design to study a zero-degree external cowl angle and to permit surface flow visualization of the vortex generator flow control on the internal centerbody surface. The tests were conducted by a team of researchers from NASA GRC, Gulfstream Aerospace Corporation, University of Illinois at Urbana-Champaign, and the University of Virginia

  1. Pretest Calculations of Temperature Changes for Field Thermal Conductivity Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N.S. Brodsky

    A large volume fraction of the potential monitored geologic repository at Yucca Mountain may reside in the Tptpll (Tertiary, Paintbrush Group, Topopah Spring Tuff, crystal poor, lower lithophysal) lithostratigraphic unit. This unit is characterized by voids, or lithophysae, which range in size from centimeters to meters. A series of thermal conductivity field tests is planned in the Enhanced Characterization of the Repository Block (ECRB) Cross Drift. The objective of the pretest calculation described in this document is to predict changes in temperature in the surrounding rock for these tests, given a heater power and a set of thermal transport properties. The calculation can be extended, as described in this document, to obtain thermal conductivity, thermal capacitance (density × heat capacity, J·m⁻³·K⁻¹), and thermal diffusivity from the field data. The work has been conducted under the ''Technical Work Plan For: Testing and Monitoring'' (BSC 2001). One of the outcomes of this analysis is to determine the initial output of the heater. This output must be high enough to provide results within a reasonably short period (several weeks to a month) and to produce a temperature increase detectable by the instruments employed in the test. The test will be conducted in stages, and heater output will be step-increased as the test progresses. If the initial temperature is set too high, the experiment will not have as many steps, and thus fewer thermal conductivity data points will result.
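
    Extracting thermal conductivity from a heater test of this kind is conventionally based on the infinite line-source solution for the temperature rise around a constant-power heater. The sketch below illustrates that forward calculation; the heater power, rock properties, and sensor distance are assumed values for illustration, not figures from the planned ECRB tests.

```python
import math

def expint_e1(x, terms=40):
    """Exponential integral E1(x) via its convergent series (valid for small/moderate x)."""
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    total = -gamma - math.log(x)
    for n in range(1, terms + 1):
        total += ((-1) ** (n + 1)) * x ** n / (n * math.factorial(n))
    return total

def line_source_dT(q, k, alpha, r, t):
    """Temperature rise (K) at radius r (m) and time t (s) around an infinite
    line heater of power q (W/m) in rock of conductivity k (W/m/K) and
    thermal diffusivity alpha (m^2/s)."""
    return q / (4.0 * math.pi * k) * expint_e1(r * r / (4.0 * alpha * t))

# Illustrative (assumed) values: 200 W/m heater, tuff-like k = 2 W/m/K,
# alpha = 1e-6 m^2/s, a sensor 0.5 m away, after 30 days of heating.
dT = line_source_dT(q=200.0, k=2.0, alpha=1e-6, r=0.5, t=30 * 86400.0)
print(f"Predicted rise after 30 days: {dT:.1f} K")
```

    Fitting this expression to measured temperature histories at several radii is what yields conductivity and diffusivity in the inverse step.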

  2. Large-scale carbon fiber tests

    NASA Technical Reports Server (NTRS)

    Pride, R. A.

    1980-01-01

    A realistic release of carbon fibers was established by burning a minimum of 45 kg of carbon fiber composite aircraft structural components in each of five large scale, outdoor aviation jet fuel fire tests. This release was quantified by several independent assessments with various instruments developed specifically for these tests. The most likely values for the mass of single carbon fibers released ranged from 0.2 percent of the initial mass of carbon fiber for the source tests (zero wind velocity) to a maximum of 0.6 percent of the initial carbon fiber mass for dissemination tests (5 to 6 m/s wind velocity). Mean fiber lengths for fibers greater than 1 mm in length ranged from 2.5 to 3.5 mm. Mean diameters ranged from 3.6 to 5.3 micrometers which was indicative of significant oxidation. Footprints of downwind dissemination of the fire released fibers were measured to 19.1 km from the fire.

  3. Preliminary Evaluation of the Field and Laboratory Emission Cell (FLEC) for Sampling Attribution Signatures from Building Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, Scott D.; He, Lijian; Wahl, Jon H.

    2012-08-30

    This study provides a preliminary evaluation of the Field and Laboratory Emission Cell (FLEC) for its suitability for sampling building materials for toxic compounds and their associated impurities and residues that might remain after a terrorist chemical attack. Chemical warfare (CW) agents and toxic industrial chemicals were represented by a range of test probes that included CW surrogates. The test probes encompassed the acid-base properties, volatilities, and polarities of the expected chemical agents and residual compounds. Results indicated that dissipation of the test probes depended heavily on the underlying material. Near-complete dissipation of almost all test probes occurred from galvanized stainless steel within 3.0 hrs, whereas far stronger retention with concomitantly slower release was observed for vinyl composition floor tiles. The test probes displayed intermediate permanence on Teflon. FLEC sampling was further evaluated by profiling residues remaining after the evaporation of 2-chloroethyl ethyl sulfide, a sulfur mustard simulant. This study lays the groundwork for the eventual goal of applying this sampling approach to the collection of forensic attribution signatures that remain after a terrorist chemical attack.

  4. Herald : field operational test evaluation final report

    DOT National Transportation Integrated Search

    2000-09-01

    The Herald Field Operational Test (FOT) tested AM radio as a low-cost way to broadcast traveler information in rural areas. It tested the feasibility of broadcasting data on the inaudible portion of an existing AM broadcast. Two systems were tested, ...

  6. Correcting Two-Sample "z" and "t" Tests for Correlation: An Alternative to One-Sample Tests on Difference Scores

    ERIC Educational Resources Information Center

    Zimmerman, Donald W.

    2012-01-01

    In order to circumvent the influence of correlation in paired-samples and repeated measures experimental designs, researchers typically perform a one-sample Student "t" test on difference scores. That procedure entails some loss of power, because it employs N - 1 degrees of freedom instead of the 2N - 2 degrees of freedom of the…
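
    The algebra behind this equivalence is easy to demonstrate: because Var(d) = Var(x) + Var(y) − 2·Cov(x, y), a two-sample statistic whose denominator is corrected by the sample covariance reproduces the one-sample t on difference scores exactly. A minimal sketch on simulated data (the correlation structure below is an assumption for illustration):

```python
import random
import statistics as st

random.seed(42)
N = 25

# simulate positively correlated pre/post scores via a shared component
x, y = [], []
for _ in range(N):
    common = random.gauss(0, 1)
    x.append(10.0 + common + random.gauss(0, 0.7))
    y.append(10.5 + common + random.gauss(0, 0.7))

# one-sample t on difference scores (N - 1 degrees of freedom)
d = [a - b for a, b in zip(x, y)]
t_paired = st.mean(d) / (st.stdev(d) / N ** 0.5)

# two-sample statistic with the denominator corrected for correlation:
# Var(d) = Var(x) + Var(y) - 2*Cov(x, y), so the two forms coincide
mx, my = st.mean(x), st.mean(y)
cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (N - 1)
t_corrected = (mx - my) / (((st.variance(x) + st.variance(y) - 2 * cov) / N) ** 0.5)

print(t_paired, t_corrected)  # identical up to rounding
```

    The difference in the corrected approach lies in the degrees of freedom attached to the statistic, which is the point the article develops.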

  7. Efficacy of a lead based paint XRF analyzer and a commercially available colorimetric lead test kit as qualitative field tools for determining presence of lead in religious powders.

    PubMed

    Shah, Manthan P; Shendell, Derek G; Meng, Qingyu; Ohman-Strickland, Pamela; Halperin, William

    2018-04-23

    The performances of a portable X-Ray Fluorescence (XRF) lead paint analyzer (RMD LPA-1, Protec Instrument Corp., Waltham, MA) and a commercially available colorimetric lead test kit (First Alert Lead Test Kit, eAccess Solutions, Inc., Palatine, IL) were evaluated for use by local or state health departments as potential cost-effective rapid analysis or "spot test" field techniques for tentative identification of lead content in sindoor powders. For both field-sampling methods, sensitivity, specificity and predictive values varied widely for samples containing <300,000 μg/g lead. For samples containing ≥300,000 μg/g lead, the aforementioned metrics were 100% (however, the CIs had a wide range). In addition, both field sampling methods showed clear, consistent positive readings only for samples containing ≥300,000 μg/g lead. Even samples with lead content as high as 5,110 μg/g were not positively identified by either field analysis technique. The results of this study suggest the XRF analyzer and colorimetric lead test kit cannot be used as a rapid field test for sindoor by health department inspectors.
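
    For reference, the sensitivity, specificity, and predictive values quoted above follow directly from a 2×2 table of field-test results against a reference method. A small sketch with hypothetical counts (not data from this study):

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and predictive values from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among truly positive samples
        "specificity": tn / (tn + fp),  # true negatives among truly negative samples
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# hypothetical counts for a spot test evaluated against a laboratory reference
m = screening_metrics(tp=18, fp=2, fn=6, tn=24)
print(m)
```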

  8. Comparison of sampling and test methods for determining asphalt content and moisture correction in asphalt concrete mixtures.

    DOT National Transportation Integrated Search

    1985-03-01

    The purpose of this report is to identify the difference, if any, in AASHTO and OSHD test procedures and results. This report addresses the effect of the size of samples taken in the field and evaluates the methods of determining the moisture content...

  9. Temperature-and field dependent characterization of a twisted stacked-tape cable

    NASA Astrophysics Data System (ADS)

    Barth, C.; Takayasu, M.; Bagrets, N.; Bayer, C. M.; Weiss, K.-P.; Lange, C.

    2015-04-01

    The twisted stacked-tape cable (TSTC) is one of the major high-temperature superconductor cable concepts, combining scalability, ease of fabrication, and high current density, which makes it a possible candidate conductor for large-scale magnets. To simulate the boundary conditions of such magnets as well as the temperature dependence of TSTCs, a 1.16 m long sample consisting of 40 SuperPower REBCO tapes, each 4 mm wide, is characterized using the ‘FBI’ (force-field-current) superconductor test facility of the Institute for Technical Physics of the Karlsruhe Institute of Technology. In a first step, the magnetic background field is cycled while measuring the current-carrying capabilities to determine the impact of Lorentz forces on the TSTC sample performance. In the first field cycle, the critical current of the TSTC sample is tested up to 12 T. A significant Lorentz force of up to 65.6 kN m-1 at the maximum magnetic background field of 12 T results in an 11.8% irreversible degradation of the current-carrying capabilities. The degradation saturates (critical cable current of 5.46 kA at 4.2 K and 12 T background field) and does not increase in subsequent field cycles. In a second step, the sample is characterized at different background fields (4-12 T) and surface temperatures (4.2-37.8 K) utilizing the variable-temperature insert of the ‘FBI’ test facility. In a third step, the performance along the length of the sample is determined at 77 K, self-field. A 15% degradation is obtained for the central part of the sample, which was within the high-field region of the magnet during the in-field measurements.
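
    The quoted transverse load is consistent with the elementary force-per-unit-length relation f = I·B for a conductor carrying current I in a transverse field B. A quick check (the current value is approximate):

```python
# Lorentz force per unit cable length: f = I * B
I = 5.47e3         # A, cable current near 12 T (approximate; the paper reports ~5.46 kA)
B = 12.0           # T, background field
f_per_len = I * B  # N/m
print(f"{f_per_len / 1e3:.1f} kN/m")  # ~65.6 kN/m, matching the reported value
```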

  10. The 2-degree Field Lensing Survey: photometric redshifts from a large new training sample to r < 19.5

    NASA Astrophysics Data System (ADS)

    Wolf, C.; Johnson, A. S.; Bilicki, M.; Blake, C.; Amon, A.; Erben, T.; Glazebrook, K.; Heymans, C.; Hildebrandt, H.; Joudaki, S.; Klaes, D.; Kuijken, K.; Lidman, C.; Marin, F.; Parkinson, D.; Poole, G.

    2017-04-01

    We present a new training set for estimating empirical photometric redshifts of galaxies, which was created as part of the 2-degree Field Lensing Survey project. This training set is located in a ˜700 deg2 area of the Kilo-Degree-Survey South field and is randomly selected and nearly complete at r < 19.5. We investigate the photometric redshift performance obtained with ugriz photometry from VST-ATLAS and W1/W2 from WISE, based on several empirical and template methods. The best redshift errors are obtained with kernel-density estimation (KDE), as are the lowest biases, which are consistent with zero within statistical noise. The 68th percentiles of the redshift scatter for magnitude-limited samples at r < (15.5, 17.5, 19.5) are (0.014, 0.017, 0.028). In this magnitude range, there are no known ambiguities in the colour-redshift map, consistent with a small rate of redshift outliers. In the fainter regime, the KDE method produces p(z) estimates per galaxy that represent unbiased and accurate redshift frequency expectations. The p(z) sum over any subsample is consistent with the true redshift frequency plus Poisson noise. Further improvements in redshift precision at r < 20 would mostly be expected from filter sets with narrower passbands to increase the sensitivity of colours to small changes in redshift.

  11. An improved method for field extraction and laboratory analysis of large, intact soil cores

    USGS Publications Warehouse

    Tindall, J.A.; Hemmen, K.; Dowd, J.F.

    1992-01-01

    Various methods have been proposed for the extraction of large, undisturbed soil cores and for subsequent analysis of fluid movement within the cores. The major problems associated with these methods are expense, cumbersome field extraction, and inadequate simulation of unsaturated flow conditions. A field and laboratory procedure is presented that is economical, convenient, and simulates unsaturated and saturated flow without interface flow problems and can be used on a variety of soil types. In the field, a stainless steel core barrel is hydraulically pressed into the soil (30-cm diam. and 38 cm high), the barrel and core are extracted from the soil, and after the barrel is removed from the core, the core is then wrapped securely with flexible sheet metal and a stainless mesh screen is attached to the bottom of the core for support. In the laboratory the soil core is set atop a porous ceramic plate over which a soil-diatomaceous earth slurry has been poured to assure good contact between plate and core. A cardboard cylinder (mold) is fastened around the core and the empty space filled with paraffin wax. Soil cores were tested under saturated and unsaturated conditions using a hanging water column for potentials ???0. Breakthrough curves indicated that no interface flow occurred along the edge of the core. This procedure proved to be reliable for field extraction of large, intact soil cores and for laboratory analysis of solute transport.

  12. Blue Whale Behavioral Response Study and Field Testing of the New Bioacoustic Probe

    DTIC Science & Technology

    2011-09-30

    DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Blue Whale Behavioral Response Study & Field Testing of the New Bioacoustic Probe. Email: jhildebrand@ucsd.edu. Award Number: N000140811221. LONG-TERM GOALS. Task 1: Blue Whale Behavioral Response Study. The behavioral response of large whales to commercial shipping and other low-frequency anthropogenic sound is not well understood. The PCAD model (NRC 2005

  13. Cryogenic Design of the New High Field Magnet Test Facility at CERN

    NASA Astrophysics Data System (ADS)

    Benda, V.; Pirotte, O.; De Rijk, G.; Bajko, M.; Craen, A. Vande; Perret, Ph.; Hanzelka, P.

    In the framework of the R&D program related to the Large Hadron Collider (LHC) upgrades, a new High Field Magnet (HFM) vertical test bench is required. This facility, located in the SM18 cryogenic test hall, shall allow testing of superconducting magnets weighing up to 15 tons, with stored energies up to 10 MJ, in a temperature range between 1.9 K and 4.5 K. The article describes the cryogenic architecture to be inserted into the general infrastructure of SM18, including the process and instrumentation diagram; the different operating phases, including the strategy for magnet cool-down and warm-up at controlled speed and for quench management; and the design of the main components.

  14. Pair-barcode high-throughput sequencing for large-scale multiplexed sample analysis.

    PubMed

    Tu, Jing; Ge, Qinyu; Wang, Shengqin; Wang, Lei; Sun, Beili; Yang, Qi; Bai, Yunfei; Lu, Zuhong

    2012-01-25

    Multiplexing has become the major limitation of next-generation sequencing (NGS) in application to low-complexity samples. Physical space segregation allows limited multiplexing, while the existing barcode approach only permits simultaneous analysis of up to several dozen samples. Here we introduce pair-barcode sequencing (PBS), an economical and flexible barcoding technique that permits parallel analysis of large-scale multiplexed samples. In two pilot runs using a SOLiD sequencer (Applied Biosystems Inc.), 32 independent pair-barcoded miRNA libraries were simultaneously analyzed through the combination of 4 unique forward barcodes and 8 unique reverse barcodes. Over 174,000,000 reads were generated and about 64% of them were assigned to both of the barcodes. After mapping all reads to pre-miRNAs in miRBase, different miRNA expression patterns were captured from the two clinical groups. The strong correlation between different barcode pairs and the high consistency of miRNA expression in two independent runs demonstrate that the PBS approach is valid. By employing the PBS approach in NGS, large-scale multiplexed pooled samples can be practically analyzed in parallel, so that high-throughput sequencing economically meets the requirements of samples with low sequencing-throughput demands.
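
    The combinatorial idea — 4 forward × 8 reverse barcodes addressing 4 × 8 = 32 libraries — can be sketched as a toy demultiplexer. The barcode sequences below are invented placeholders, not the ones used in the study:

```python
from itertools import product

# hypothetical 4 forward and 8 reverse barcodes
fwd = ["ACGT", "TGCA", "GATC", "CTAG"]
rev = ["AACC", "GGTT", "ACAC", "TGTG", "CCGG", "ATAT", "CGCG", "GAGA"]

# every (forward, reverse) pair addresses one library: 4 x 8 = 32
library_of = {pair: i for i, pair in enumerate(product(fwd, rev))}

def demultiplex(read):
    """Assign a read to a library by its terminal barcodes; None if either end is unknown."""
    pair = (read[:4], read[-4:])
    return library_of.get(pair)

# read carrying forward barcode 0 and reverse barcode 7 -> library index 7
print(demultiplex("ACGT" + "TTTTGGGG" + "GAGA"))
```

    In practice demultiplexing also tolerates sequencing errors within a barcode (e.g., by Hamming distance), which this sketch omits.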

  15. Infrared Testing of the Wide-field Infrared Survey Telescope Grism Using Computer Generated Holograms

    NASA Technical Reports Server (NTRS)

    Dominguez, Margaret Z.; Content, David A.; Gong, Qian; Griesmann, Ulf; Hagopian, John G.; Marx, Catherine T; Whipple, Arthur L.

    2017-01-01

    Infrared Computer Generated Holograms (CGHs) were designed, manufactured and used to measure the performance of the grism (grating prism) prototype which includes testing Diffractive Optical Elements (DOE). The grism in the Wide Field Infrared Survey Telescope (WFIRST) will allow the surveying of a large section of the sky to find bright galaxies.

  16. 49 CFR 199.111 - Retention of samples and additional testing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.111 Retention of samples and additional testing. (a... other than the unauthorized use of a prohibited drug, and if timely additional testing is requested by... 49 Transportation 3 2011-10-01 2011-10-01 false Retention of samples and additional testing. 199...

  17. 49 CFR 199.111 - Retention of samples and additional testing.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.111 Retention of samples and additional testing. (a... other than the unauthorized use of a prohibited drug, and if timely additional testing is requested by... 49 Transportation 3 2013-10-01 2013-10-01 false Retention of samples and additional testing. 199...

  18. 49 CFR 199.111 - Retention of samples and additional testing.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.111 Retention of samples and additional testing. (a... other than the unauthorized use of a prohibited drug, and if timely additional testing is requested by... 49 Transportation 3 2012-10-01 2012-10-01 false Retention of samples and additional testing. 199...

  19. 49 CFR 199.111 - Retention of samples and additional testing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.111 Retention of samples and additional testing. (a... other than the unauthorized use of a prohibited drug, and if timely additional testing is requested by... 49 Transportation 3 2010-10-01 2010-10-01 false Retention of samples and additional testing. 199...

  20. 49 CFR 199.111 - Retention of samples and additional testing.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.111 Retention of samples and additional testing. (a... other than the unauthorized use of a prohibited drug, and if timely additional testing is requested by... 49 Transportation 3 2014-10-01 2014-10-01 false Retention of samples and additional testing. 199...

  1. Spatial considerations during cryopreservation of a large volume sample.

    PubMed

    Kilbride, Peter; Lamb, Stephen; Milne, Stuart; Gibbons, Stephanie; Erro, Eloy; Bundy, James; Selden, Clare; Fuller, Barry; Morris, John

    2016-08-01

    There have been relatively few studies on the implications of the physical conditions experienced by cells during large volume (litres) cryopreservation - most studies have focused on the problem of cryopreservation of smaller volumes, typically up to 2 ml. This study explores the effects of ice growth by progressive solidification, generally seen during larger scale cryopreservation, on encapsulated liver hepatocyte spheroids, and it develops a method to reliably sample different regions across the frozen cores of samples experiencing progressive solidification. These issues are examined in the context of a Bioartificial Liver Device which requires cryopreservation of a 2 L volume in a strict cylindrical geometry for optimal clinical delivery. Progressive solidification cannot be avoided in this arrangement. In such a system optimal cryoprotectant concentrations and cooling rates are known. However, applying these parameters to a large volume is challenging due to the thermal mass and subsequent thermal lag. The specific impact of this to the cryopreservation outcome is required. Under conditions of progressive solidification, the spatial location of Encapsulated Liver Spheroids had a strong impact on post-thaw recovery. Cells in areas first and last to solidify demonstrated significantly impaired post-thaw function, whereas areas solidifying through the majority of the process exhibited higher post-thaw outcome. It was also found that samples where the ice thawed more rapidly had greater post-thaw viability 24 h post-thaw (75.7 ± 3.9% and 62.0 ± 7.2% respectively). These findings have implications for the cryopreservation of large volumes with a rigid shape and for the cryopreservation of a Bioartificial Liver Device. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Redding Responder Field Test - UTC

    DOT National Transportation Integrated Search

    2008-10-30

    This UTC project facilitated field testing and evaluation of the "Responder" system between Phases 1 and 2 of the Redding Responder Project, sponsored by the California Department of Transportation. A pilot system, with hardware purchased by Caltrans...

  3. Abundance Ratios in a Large Sample of EMPS with VLT+UVES

    NASA Astrophysics Data System (ADS)

    Hill, Vanessa; Cayrel, Roger; Spite, Monique; Bonifacio, Piercarlo; Depagne, Éric; François, Patrick; Beers, Timothy C.; Andersen, Johannes; Barbuy, Beatriz; Nordström, Birgitta

    Constraints on Early Galactic Enrichment from a Large Sample of Extremely Metal-Poor Stars. I will present the overall results from a large effort conducted at ESO VLT+UVES to measure abundances in a sample of extremely metal-poor stars (EMPS) from high-resolution, high signal-to-noise spectra. More than 70 EMPS with [Fe/H]<-2.7 were observed, equally distributed between turnoff and giant stars, and very precise abundance ratios could be derived thanks to the high quality of the data. Among the results, those of specific interest are lithium measurements in unevolved EMPS, the much-debated abundance of oxygen in the early Galaxy (we present [OI] line measurements down to [Fe/H]=-3.5), and the trends of alpha elements, iron-group elements, and zinc. The scatter around these trends will also be discussed, taking advantage of the small observational error bars of this dataset. The implications for early Galactic enrichment will be reviewed, while more specific topics covered by this large effort (and large team) will be addressed in dedicated posters.

  4. NHEXAS PHASE I MARYLAND STUDY--STANDARD OPERATING PROCEDURE FOR FIELD SAMPLING--GENERAL INFORMATION (F01)

    EPA Science Inventory

    The purpose of this SOP is to provide an overview of field sampling procedures. This SOP details the samples taken, the responsibilities of the field staff, an approximate schedule for field operations, persons responsible for analyses, equipment used for sampling, and contents ...

  5. Double Sampling with Multiple Imputation to Answer Large Sample Meta-Research Questions: Introduction and Illustration by Evaluating Adherence to Two Simple CONSORT Guidelines

    PubMed Central

    Capers, Patrice L.; Brown, Andrew W.; Dawson, John A.; Allison, David B.

    2015-01-01

    Background: Meta-research can involve manual retrieval and evaluation of research, which is resource intensive. Creation of high-throughput methods (e.g., search heuristics, crowdsourcing) has improved the feasibility of large meta-research questions, but possibly at the cost of accuracy. Objective: To evaluate the use of double sampling combined with multiple imputation (DS + MI) to address meta-research questions, using as an example the adherence of PubMed entries to two simple Consolidated Standards of Reporting Trials (CONSORT) guidelines for titles and abstracts. Methods: For the DS large sample, we retrieved all PubMed entries satisfying the filters: RCT, human, abstract available, and English language (n = 322,107). For the DS subsample, we randomly sampled 500 entries from the large sample. The large sample was evaluated with a lower-rigor, higher-throughput (RLOTHI) method using search heuristics, while the subsample was evaluated using a higher-rigor, lower-throughput (RHITLO) human rating method. Multiple imputation of the missing-completely-at-random RHITLO data for the large sample was informed by: RHITLO data from the subsample; RLOTHI data from the large sample; whether a study was an RCT; and country and year of publication. Results: The RHITLO and RLOTHI methods in the subsample largely agreed (phi coefficients: title = 1.00, abstract = 0.92). Compliance with abstract and title criteria has increased over time, with non-US countries improving more rapidly. DS + MI logistic regression estimates were more precise than subsample estimates (e.g., 95% CI for change in title and abstract compliance by year: subsample RHITLO 1.050–1.174 vs. DS + MI 1.082–1.151). As evidence of improved accuracy, DS + MI coefficient estimates were closer to RHITLO than the large sample RLOTHI. Conclusion: Our results support our hypothesis that DS + MI would result in improved precision and accuracy. This method is flexible and may provide a practical
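
    The DS + MI idea can be sketched in miniature: cheap, noisy ratings on the full sample; gold-standard ratings on a random subsample; then repeated imputation of the missing gold labels from their conditional distribution given the cheap rating. All numbers below (sample sizes, error rate, true compliance rate) are invented for illustration:

```python
import random

random.seed(1)
N_LARGE, N_SUB, M = 20000, 500, 20   # large sample, double-sampled subset, imputations

# simulate a true compliance flag and a cheap heuristic rating right ~90% of the time
truth = [random.random() < 0.30 for _ in range(N_LARGE)]
cheap = [t if random.random() < 0.90 else not t for t in truth]

# gold-standard (high-rigor) labels exist only for a random subsample
sub = random.sample(range(N_LARGE), N_SUB)
gold = {i: truth[i] for i in sub}

# estimate P(gold = True | cheap rating) from the subsample
def cond_rate(val):
    idx = [i for i in sub if cheap[i] == val]
    return sum(gold[i] for i in idx) / len(idx)

p_given = {True: cond_rate(True), False: cond_rate(False)}

# multiple imputation: draw the missing gold labels M times, average the estimates
estimates = []
for _ in range(M):
    imputed = [gold[i] if i in gold else (random.random() < p_given[cheap[i]])
               for i in range(N_LARGE)]
    estimates.append(sum(imputed) / N_LARGE)

print(sum(estimates) / M)   # recovers a rate close to the simulated 0.30
```

    The paper's version additionally conditions the imputation model on study covariates (RCT status, country, year), which tightens the estimates further.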

  6. Power of tests of normality for detecting contaminated normal samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thode, H.C. Jr.; Smith, L.A.; Finch, S.J.

    1981-01-01

    Seventeen tests of normality or goodness of fit were evaluated for power at detecting a contaminated normal sample. This study used 1000 replications each of samples of size 12, 17, 25, 33, 50, and 100 from six different contaminated normal distributions. The kurtosis test was the most powerful over all sample sizes and contaminations. The Hogg and weighted Kolmogorov-Smirnov tests were second. The Kolmogorov-Smirnov, chi-squared, Anderson-Darling, and Cramer-von-Mises tests had very low power at detecting contaminated normal random variables. Tables of the power of the tests and the power curves of certain tests are given.
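
    The winning statistic above, the kurtosis test, is easy to reproduce in outline: simulate contaminated-normal samples, calibrate an empirical critical value under normality, and count rejections. The mixture weight and scale below are assumptions for illustration, not the study's exact settings:

```python
import random
import statistics as st

random.seed(7)

def kurtosis(xs):
    """Sample kurtosis b2 = m4 / m2^2 (equals ~3 under normality)."""
    m = st.fmean(xs)
    m2 = st.fmean([(x - m) ** 2 for x in xs])
    m4 = st.fmean([(x - m) ** 4 for x in xs])
    return m4 / m2 ** 2

def draw(n, contaminated):
    # contaminated normal: 10% of draws come from a wider N(0, 3^2) component
    return [random.gauss(0, 3 if (contaminated and random.random() < 0.10) else 1)
            for _ in range(n)]

n, reps = 50, 400
null = sorted(kurtosis(draw(n, False)) for _ in range(reps))
cutoff = null[int(0.95 * reps)]   # empirical 5% critical value under normality
power = st.fmean(kurtosis(draw(n, True)) > cutoff for _ in range(reps))
print(f"cutoff b2 = {cutoff:.2f}, power = {power:.2f}")
```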

  7. Dynamical implications of sample shape for avalanches in 2-dimensional random-field Ising model with saw-tooth domain wall

    NASA Astrophysics Data System (ADS)

    Tadić, Bosiljka

    2018-03-01

    We study the dynamics of a built-in domain wall (DW) in 2-dimensional disordered ferromagnets with different sample shapes using the random-field Ising model on a square lattice rotated by 45 degrees. The saw-tooth DW of length Lx is created along one side and swept through the sample by slow ramping of the external field until the complete magnetisation reversal and the wall annihilation at the open top boundary at a distance Ly. By fixing the number of spins N = Lx × Ly = 10⁶ and the random-field distribution at a value above the critical disorder, we vary the ratio of the DW length to the annihilation distance in the range Lx/Ly ∈ [1/16, 16]. Periodic boundary conditions are applied in the y-direction so that these ratios comprise different samples, i.e., surfaces of cylinders with changing perimeter Lx and height Ly. We analyse the avalanches of the DW slips between following field updates, and the multifractal structure of the magnetisation fluctuation time series. Our main findings are that the domain-wall lengths materialised in different sample shapes have an impact on the dynamics at all scales. Moreover, the domain-wall motion at the beginning of the hysteresis loop (HLB) probes the disorder effects, resulting in fluctuations that are significantly different from the large avalanches in the central part of the loop (HLC), where the strong fields dominate. Specifically, the fluctuations in the HLB exhibit a wide multifractal spectrum, which shifts towards higher values of the exponents when the DW length is reduced. The distributions of the avalanches in these segments of the loops obey power-law decay with exponential cutoffs, with the exponents firmly in the mean-field universality class for long DWs. In contrast, the avalanches in the HLC obey a Tsallis density distribution with power-law tails, which indicate new categories of scale-invariant behaviour for different ratios Lx/Ly. The large fluctuations in the HLC, on the other
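
    For orientation, the zero-temperature random-field Ising hysteresis dynamics underlying such avalanche studies can be sketched on a small lattice. This toy version uses periodic boundaries in both directions and no built-in domain wall, so it only illustrates the field-ramp and avalanche mechanics, not the paper's sample-shape setup; the lattice size, coupling, and disorder strength are assumed values:

```python
import random

random.seed(5)
L, J, R = 24, 1.0, 3.0     # lattice size, coupling, disorder strength (assumed)
N = L * L
h = [random.gauss(0, R) for _ in range(N)]   # quenched random fields
s = [-1] * N                                 # start fully magnetised down

# precomputed neighbour lists, periodic boundaries in both directions
nbrs = []
for i in range(N):
    x, y = i % L, i // L
    nbrs.append([((x + 1) % L) + y * L, ((x - 1) % L) + y * L,
                 x + ((y + 1) % L) * L, x + ((y - 1) % L) * L])

def unstable_at(i, H):
    """A down spin flips when its local field H + h_i + J*sum(neighbours) turns positive."""
    return s[i] < 0 and H + h[i] + J * sum(s[j] for j in nbrs[i]) > 0

avalanches, H, flipped = [], -6.0, 0
while flipped < N:
    H += 0.05                                # slow ramp of the external field
    front = [i for i in range(N) if unstable_at(i, H)]
    size = 0
    while front:                             # propagate the avalanche to completion
        nxt = []
        for i in front:
            if unstable_at(i, H):
                s[i] = 1
                size += 1
                nxt.extend(j for j in nbrs[i] if s[j] < 0)
        front = nxt
    if size:
        avalanches.append(size)
        flipped += size

print(len(avalanches), max(avalanches), flipped == N)
```

    Collecting `avalanches` over many disorder realisations is what produces the size distributions whose power-law decay and cutoffs the abstract analyses.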

  8. Scanning Electron Microscopy with Samples in an Electric Field

    PubMed Central

    Frank, Ludĕk; Hovorka, Miloš; Mikmeková, Šárka; Mikmeková, Eliška; Müllerová, Ilona; Pokorná, Zuzana

    2012-01-01

    The high negative bias of a sample in a scanning electron microscope constitutes the “cathode lens” with a strong electric field just above the sample surface. This mode offers a convenient tool for controlling the landing energy of electrons down to units or even fractions of electronvolts with only slight readjustments of the column. Moreover, the field accelerates and collimates the signal electrons to earthed detectors above and below the sample, thereby assuring high collection efficiency and high amplification of the image signal. One important feature is the ability to acquire the complete emission of the backscattered electrons, including those emitted at high angles with respect to the surface normal. The cathode lens aberrations are proportional to the landing energy of electrons so the spot size becomes nearly constant throughout the full energy scale. At low energies and with their complete angular distribution acquired, the backscattered electron images offer enhanced information about crystalline and electronic structures thanks to contrast mechanisms that are otherwise unavailable. Examples from various areas of materials science are presented.

  9. A large-scale test of free-energy simulation estimates of protein-ligand binding affinities.

    PubMed

    Mikulskis, Paulius; Genheden, Samuel; Ryde, Ulf

    2014-10-27

    We have performed a large-scale test of alchemical perturbation calculations with the Bennett acceptance-ratio (BAR) approach to estimate relative affinities for the binding of 107 ligands to 10 different proteins. Employing 20-Å truncated spherical systems and only one intermediate state in the perturbations, we obtain an error of less than 4 kJ/mol for 54% of the studied relative affinities and a precision of 0.5 kJ/mol on average. However, only four of the proteins gave acceptable errors, correlations, and rankings. The results could be improved by using nine intermediate states in the simulations or including the entire protein in the simulations using periodic boundary conditions. However, 27 of the calculated affinities still gave errors of more than 4 kJ/mol, and for three of the proteins the results were not satisfactory. This shows that the performance of BAR calculations depends on the target protein and that several transformations gave poor results owing to limitations in the molecular-mechanics force field or the restricted sampling possible within a reasonable simulation time. Still, the BAR results are better than docking calculations for most of the proteins.
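
    The BAR estimator itself is compact: the free-energy difference is the root of Bennett's self-consistency equation over forward and reverse work samples. A sketch on synthetic Gaussian work distributions that satisfy the Crooks relation (kT = 1, equal sample sizes; all parameters are toy values):

```python
import math
import random

random.seed(3)

def fermi(x):
    return 1.0 / (1.0 + math.exp(x))

def bar_delta_f(w_f, w_r, lo=-50.0, hi=50.0):
    """Solve Bennett's self-consistency equation by bisection (kT = 1,
    equal numbers of forward and reverse work samples)."""
    def g(df):  # increasing in df; its root is the BAR estimate
        return (sum(fermi(w - df) for w in w_f)
                - sum(fermi(w + df) for w in w_r))
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Gaussian work distributions consistent with Crooks: true dF = 2.0 (kT units),
# variance 2, so the mean forward work is dF + var/2 (dissipation shifts it up)
true_df, var = 2.0, 2.0
w_f = [random.gauss(true_df + var / 2, var ** 0.5) for _ in range(2000)]
w_r = [random.gauss(-true_df + var / 2, var ** 0.5) for _ in range(2000)]
df_est = bar_delta_f(w_f, w_r)
print(df_est)  # close to 2.0
```

    In the paper's setting the "work" values are the potential-energy differences sampled in each alchemical intermediate state; with only one intermediate, overlap between the two distributions governs the achievable precision.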

  10. Relationships between field performance tests in high-level soccer players.

    PubMed

    Ingebrigtsen, Jørgen; Brochmann, Marit; Castagna, Carlo; Bradley, Paul S; Ade, Jack; Krustrup, Peter; Holtermann, Andreas

    2014-04-01

    submaximal HRs after 2 and 4 minutes were independently associated with Yo-Yo IR1 and IR2 performances (p ≤ 0.01). In conclusion, Yo-Yo IR1 and IR2 test performances, as well as sprint and RSA performances, correlated very largely, and it may therefore be sufficient to use only one of the Yo-Yo tests and one RSA test in a general soccer-specific field test protocol. The submaximal HR measures during the Yo-Yo tests are reproducible and may be used for frequent, time-efficient, and nonexhaustive testing of the intermittent exercise capacity of high-level soccer players.

  11. Field Testing of Cryogenic Carbon Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sayre, Aaron; Frankman, Dave; Baxter, Andrew

    Sustainable Energy Solutions has been developing Cryogenic Carbon Capture™ (CCC) since 2008. In that time, two processes have been developed: the External Cooling Loop and Compressed Flue Gas Cryogenic Carbon Capture processes (CCC ECL™ and CCC CFG™, respectively). The CCC ECL™ process has been scaled up to a 1 ton per day CO2 system. In this process the flue gas is cooled by an external refrigerant loop. SES has tested CCC ECL™ on real flue gas slip streams from subbituminous coal, bituminous coal, biomass, natural gas, shredded tires, and municipal waste fuels at field sites that include utility power stations, heating plants, cement kilns, and pilot-scale research reactors. The CO2 concentrations for these tests ranged from 5 to 22% on a dry basis. CO2 capture ranged from 95-99+% during these tests. Several other condensable species were also captured, including NO2, SO2, and PMxx, at 95+%. NO was also captured at a modest rate. The CCC CFG™ process has been scaled up to a 0.25 ton per day system. This system has been tested on real flue gas streams including subbituminous coal, bituminous coal, and natural gas at field sites that include utility power stations, heating plants, and pilot-scale research reactors. CO2 concentrations for these tests ranged from 5 to 15% on a dry basis. CO2 capture ranged from 95-99+% during these tests. Several other condensable species were also captured, including NO2, SO2, and PMxx, at 95+%. NO was also captured at 90+%. Hg capture was also verified, and the resulting effluent from CCC CFG™ was below a 1 ppt concentration. This paper will focus on discussion of the capabilities of CCC, the results of field testing, and the future steps surrounding the development of this technology.

  12. Direct Field Acoustic Testing

    NASA Technical Reports Server (NTRS)

    Larkin, Paul; Goldstein, Bob

    2008-01-01

    This paper presents an update to the methods and procedures used in Direct Field Acoustic Testing (DFAT). The paper discusses some of the recent techniques and developments currently in use and the future publication of a reference standard. Acoustic testing using commercial sound system components is becoming a popular and cost-effective way of generating a required acoustic test environment, both in and out of a reverberant chamber. This paper presents the DFAT test method, the usual setup and procedure, and the development and use of a closed-loop, narrow-band control system. Narrow-band control of the acoustic PSD allows all standard techniques and procedures currently used in random control to be applied to acoustics, and some examples are given. The paper concludes with a summary of the development of a standard practice guideline, which is expected to be available in the first quarter of next year.

  13. Location tests for biomarker studies: a comparison using simulations for the two-sample case.

    PubMed

    Scheinhardt, M O; Ziegler, A

    2013-01-01

    Gene, protein, or metabolite expression levels are often non-normally distributed, heavy-tailed, and contain outliers. Standard statistical approaches may fail as location tests in this situation. In three Monte-Carlo simulation studies, we aimed at comparing the type I error levels and empirical power of standard location tests and three adaptive tests [O'Gorman, Can J Stat 1997; 25: 269-279; Keselman et al., Brit J Math Stat Psychol 2007; 60: 267-293; Szymczak et al., Stat Med 2013; 32: 524-537] for a wide range of distributions. We simulated two-sample scenarios using the g-and-k-distribution family to systematically vary tail length and skewness with identical and varying variability between groups. All tests kept the type I error level when groups did not vary in their variability. The standard non-parametric U-test performed well in all simulated scenarios. It was outperformed by the two non-parametric adaptive methods in case of heavy tails or large skewness. Most tests did not keep the type I error level for skewed data in the case of heterogeneous variances. The standard U-test was a powerful and robust location test for most of the simulated scenarios except for very heavy-tailed or heavily skewed data, and it is thus to be recommended except for these cases. The non-parametric adaptive tests were powerful for both normal and non-normal distributions under sample variance homogeneity. But when sample variances differed, they did not keep the type I error level. The parametric adaptive test lacks power for skewed and heavy-tailed distributions.
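The level-robustness comparison described above can be reproduced in miniature. The sketch below is a minimal Monte-Carlo check of type I error for the two-sample t-test and the Mann-Whitney U-test under an equal-variance null; a lognormal distribution stands in for the g-and-k family (which has no SciPy implementation), and the simulation counts and sample sizes are illustrative, not the paper's.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def type_i_error(sampler, test, n_sim=2000, n=30, alpha=0.05):
    """Fraction of simulations in which a two-sample location test
    rejects H0 when both groups come from the same distribution."""
    rejections = 0
    for _ in range(n_sim):
        x, y = sampler(n), sampler(n)
        rejections += test(x, y) < alpha
    return rejections / n_sim

# Heavy-tailed, skewed null distribution (stand-in for g-and-k).
skewed = lambda n: rng.lognormal(0.0, 1.0, n)

t_err = type_i_error(skewed, lambda x, y: stats.ttest_ind(x, y).pvalue)
u_err = type_i_error(skewed, lambda x, y: stats.mannwhitneyu(x, y).pvalue)

# Both should sit near the nominal 5% level when variances are equal.
print(f"t-test type I error: {t_err:.3f}")
print(f"U-test type I error: {u_err:.3f}")
```

Repeating this with unequal variances between the two groups is the quickest way to see the level violations the abstract reports.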

  14. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE PAGES

    Steed, Chad A.; Halsey, William; Dehoff, Ryan; ...

    2017-02-16

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.

  15. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A.; Halsey, William; Dehoff, Ryan

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.

  16. Outgassing tests on iras solar panel samples

    NASA Technical Reports Server (NTRS)

    Premat, G.; Zwaal, A.; Pennings, N. H.

    1980-01-01

    Several outgassing tests were carried out on representative solar panel samples in order to determine the extent of contamination that could be expected from this source. The materials for the construction of the solar panels were selected as a result of contamination obtained in micro volatile condensable materials tests.

  17. The relationship between cavum septum pellucidum and psychopathic traits in a large forensic sample.

    PubMed

    Crooks, Dana; Anderson, Nathaniel E; Widdows, Matthew; Petseva, Nia; Koenigs, Michael; Pluto, Charles; Kiehl, Kent A

    2018-04-01

    Cavum septum pellucidum (CSP) is a neuroanatomical variant of the septum pellucidum that is considered a marker for disrupted brain development. Several small sample studies have reported CSP to be related to disruptive behavior, persistent antisocial traits, and even psychopathy. However, no large-scale samples have comprehensively examined the relationship between CSP, psychopathic traits, and antisocial behavior in forensic samples. Here we test hypotheses about the presence of CSP and its relationship to psychopathic traits in incarcerated males (N = 1432). We also examined the incidence of CSP in two non-incarcerated male control samples for comparison (N = 208 and 125). The sample was ethnically and racially diverse, with a mean age of 33.1 years and an average IQ of 96.96. CSP was evaluated via structural magnetic resonance imaging. CSP was measured by length (number of 1.0 mm slices) in continuous analyses, and classified as absent (0) or present (1+ mm), as well as by size (absent (0), small (1-3), medium (4-5), or large (6+ mm)) for comparison with prior work. The Wechsler Adult Intelligence Scale (WAIS-III), Structured Clinical Interview (SCID-I/P), and Hare Psychopathy Checklist-Revised (PCL-R) were used to assess IQ, substance dependence, and psychopathy, respectively. CSP length was positively associated with PCL-R total, Factor 1 (interpersonal/affective) and Facets 1 (interpersonal) and 2 (affective). CSP was no more prevalent among inmates than among non-incarcerated controls, with similar distributions of size. These results support the hypotheses that abnormal septal/limbic development may contribute to dimensional affective/interpersonal traits of psychopathy, but CSP is not closely associated with antisocial behavior, per se. Published by Elsevier Ltd.

  18. Temperature Control Diagnostics for Sample Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santodonato, Louis J; Walker, Lakeisha MH; Church, Andrew J

    2010-01-01

    In a scientific laboratory setting, standard equipment such as cryocoolers are often used as part of a custom sample environment system designed to regulate temperature over a wide range. The end user may be more concerned with precise sample temperature control than with base temperature. But cryogenic systems tend to be specified mainly in terms of cooling capacity and base temperature. Technical staff at scientific user facilities (and perhaps elsewhere) often wonder how to best specify and evaluate temperature control capabilities. Here we describe test methods and give results obtained at a user facility that operates a large sample environment inventory. Although this inventory includes a wide variety of temperature, pressure, and magnetic field devices, the present work focuses on cryocooler-based systems.

  19. Generation of large-scale magnetic fields by small-scale dynamo in shear flows

    NASA Astrophysics Data System (ADS)

    Squire, Jonathan; Bhattacharjee, Amitava

    2015-11-01

    A new mechanism for turbulent mean-field dynamo is proposed, in which the magnetic fluctuations resulting from a small-scale dynamo drive the generation of large-scale magnetic fields. This is in stark contrast to the common idea that small-scale magnetic fields should be harmful to large-scale dynamo action. These dynamos occur in the presence of large-scale velocity shear and do not require net helicity, resulting from off-diagonal components of the turbulent resistivity tensor as the magnetic analogue of the "shear-current" effect. The dynamo is studied using a variety of computational and analytic techniques, both when the magnetic fluctuations arise self-consistently through the small-scale dynamo and in lower Reynolds number regimes. Given the inevitable existence of non-helical small-scale magnetic fields in turbulent plasmas, as well as the generic nature of velocity shear, the suggested mechanism may help to explain generation of large-scale magnetic fields across a wide range of astrophysical objects. This work was supported by a Procter Fellowship at Princeton University, and the US Department of Energy Grant DE-AC02-09-CH11466.

  20. Towards an explicit model of large field inflation

    NASA Astrophysics Data System (ADS)

    Dorronsoro, Juan Diaz; Schillo, Marjorie

    2018-05-01

    The unwinding inflation mechanism is studied in a type IIB flux compactification where all moduli are stabilized using flux, non-perturbative effects, and the leading α' corrections of the large volume scenario. We consider the backreaction on the geometry due to the presence of anti-D3 branes as well as the backreaction of inflation on the Kähler moduli, and compute the resulting corrections to the slow-roll potential. By taking large flux numbers, we are able to find inflationary epochs where backreaction effects are under control, the inflaton traverses a super-Planckian field range, and the resulting amplitude of scalar perturbations is consistent with observation.

  1. A Carbon Free Filter for Collection of Large Volume Samples of Cellular Biomass from Oligotrophic Waters

    PubMed Central

    Mailloux, Brian J.; Dochenetz, Audra; Bishop, Michael; Dong, Hailiang; Ziolkowski, Lori A.; Wommack, K. Eric; Sakowski, Eric G.; Onstott, Tullis C.; Slater, Greg F.

    2018-01-01

    Isotopic analysis of cellular biomass has greatly improved our understanding of carbon cycling in the environment. Compound specific radiocarbon analysis (CSRA) of cellular biomass is being increasingly applied in a number of fields. However, it is often difficult to collect sufficient cellular biomass for analysis from oligotrophic waters because easy-to-use filtering methods that are free of carbon contaminants do not exist. The goal of this work was to develop a new column based filter to autonomously collect high volume samples of biomass from oligotrophic waters for CSRA using material that can be baked at 450°C to remove potential organic contaminants. A series of filter materials were tested, including uncoated sand, ferrihydrite-coated sand, goethite-coated sand, aluminum-coated sand, uncoated glass wool, ferrihydrite-coated glass wool, and aluminum-coated glass wool, in the lab with 0.1 and 1.0 µm microspheres and E. coli. Results indicated that aluminum-coated glass wool was the most efficient filter and that the retention capacity of the filter far exceeded the biomass requirements for CSRA. Results from laboratory tests indicate that for oligotrophic waters with 1×10⁵ cells ml⁻¹, 117 L of water would need to be filtered to collect 100 µg of PLFA for bulk PLFA analysis and 2000 L for analysis of individual PLFAs. For field sampling, filtration tests on South African mine water indicated that after filtering 5955 liters, 450 µg of total PLFAs were present, ample biomass for radiocarbon analysis. In summary, we have developed a filter that is easy to use and deploy for collection of biomass for CSRA including total and individual PLFAs. PMID:22561839
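The filtration-volume figures above imply a fixed PLFA yield per cell, which makes the scaling easy to check. The arithmetic below is a back-of-envelope sketch that reproduces the quoted 117 L and extrapolates to denser water; the per-cell yield is inferred from the paper's numbers, not measured.

```python
# Quoted figures: 100 ug of PLFA from 117 L at 1e5 cells/mL.
CELLS_PER_ML = 1e5
TARGET_UG = 100.0
VOLUME_L = 117.0

cells_filtered = CELLS_PER_ML * VOLUME_L * 1000   # 1000 mL per L
ug_per_cell = TARGET_UG / cells_filtered          # implied yield, ~8.5 fg/cell

def liters_needed(target_ug, cells_per_ml):
    """Volume to filter for a given PLFA target at a given cell density,
    assuming the per-cell yield implied by the paper's figures."""
    return target_ug / (ug_per_cell * cells_per_ml * 1000)

print(round(liters_needed(100.0, 1e5)))   # 117, reproducing the quoted volume
print(round(liters_needed(100.0, 1e6)))   # ~12 L in tenfold denser water
```

The same relation explains the 2000 L figure for individual PLFAs: a larger per-target mass at the same cell density scales the volume linearly.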

  2. Large Solar Flares and Sheared Magnetic Field Configuration

    NASA Technical Reports Server (NTRS)

    Choudhary, Debi Prasad

    2001-01-01

    This Comment gives additional information about the nature of flaring locations on the Sun described in the article "Sun unleashes Halloween storm", by R. E. Lopez, et al. What causes the large explosions from solar active regions that unleash huge magnetic storms and adverse space weather? It is now beyond doubt that the magnetic field in solar active regions harbors free energy that is released during these events. Direct measurements of the longitudinal and transverse components of active region magnetic fields with the vector magnetograph at NASA Marshall Space Flight Center (MSFC), taken on a regular basis for the last 30 years, have found key signatures of the locations of powerful flares. A vector magnetograph detects and measures the magnetic shear, which is the deviation of the observed transverse magnetic field direction from the potential field. The sheared locations possess abundant free magnetic energy for solar flares. In addition to active region NOAA 10486, the one that produced the largest flares last October, the NASA/MSFC vector magnetograph has observed several other such complex super active regions, including NOAA 6555 and 6659.

  3. Automated cognitive testing of monkeys in social groups yields results comparable to individual laboratory-based testing.

    PubMed

    Gazes, Regina Paxton; Brown, Emily Kathryn; Basile, Benjamin M; Hampton, Robert R

    2013-05-01

    Cognitive abilities likely evolved in response to specific environmental and social challenges and are therefore expected to be specialized for the life history of each species. Specialized cognitive abilities may be most readily engaged under conditions that approximate the natural environment of the species being studied. While naturalistic environments might therefore have advantages over laboratory settings for cognitive research, it is difficult to conduct certain types of cognitive tests in these settings. We implemented methods for automated cognitive testing of monkeys (Macaca mulatta) in large social groups (Field station) and compared the performance to that of laboratory-housed monkeys (Laboratory). The Field station animals shared access to four touch-screen computers in a large naturalistic social group. Each Field station subject had an RFID chip implanted in each arm for computerized identification and individualized assignment of cognitive tests. The Laboratory group was housed and tested in a typical laboratory setting, with individual access to testing computers in their home cages. Monkeys in both groups voluntarily participated at their own pace for food rewards. We evaluated performance in two visual psychophysics tests, a perceptual classification test, a transitive inference test, and a delayed matching-to-sample memory test. Despite the differences in housing, social environment, age, and sex, monkeys in the two groups performed similarly in all tests. Semi-free ranging monkeys living in complex social environments are therefore viable subjects for cognitive testing designed to take advantage of the unique affordances of naturalistic testing environments.

  4. Automated cognitive testing of monkeys in social groups yields results comparable to individual laboratory based testing

    PubMed Central

    Gazes, Regina Paxton; Brown, Emily Kathryn; Basile, Benjamin M.; Hampton, Robert R.

    2013-01-01

    Cognitive abilities likely evolved in response to specific environmental and social challenges and are therefore expected to be specialized for the life history of each species. Specialized cognitive abilities may be most readily engaged under conditions that approximate the natural environment of the species being studied. While naturalistic environments might therefore have advantages over laboratory settings for cognitive research, it is difficult to conduct certain types of cognitive tests in these settings. We implemented methods for automated cognitive testing of monkeys (Macaca mulatta) in large social groups (Field station) and compared the performance to that of laboratory housed monkeys (Laboratory). The Field station animals shared access to four touch screen computers in a large naturalistic social group. Each Field station subject had an RFID chip implanted in each arm for computerized identification and individualized assignment of cognitive tests. The Laboratory group was housed and tested in a typical laboratory setting, with individual access to testing computers in their home cages. Monkeys in both groups voluntarily participated at their own pace for food rewards. We evaluated performance in two visual psychophysics tests, a perceptual classification test, a transitive inference test, and a delayed matching to sample memory test. Despite differences in housing, social environment, age, and sex, monkeys in the two groups performed similarly in all tests. Semi-free ranging monkeys living in complex social environments are therefore viable subjects for cognitive testing designed to take advantage of the unique affordances of naturalistic testing environments. PMID:23263675

  5. Effect of Facet Displacement on Radiation Field and Its Application for Panel Adjustment of Large Reflector Antenna

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Lian, Peiyuan; Zhang, Shuxin; Xiang, Binbin; Xu, Qian

    2017-05-01

    Large reflector antennas are widely used in radars, satellite communication, radio astronomy, and so on. Rapid developments in these fields have created demand for better performance and higher surface accuracy. However, low accuracy and low efficiency are common disadvantages of traditional panel alignment and adjustment. In order to improve the surface accuracy of large reflector antennas, a new method is presented to determine panel adjustment values from the far-field pattern. Based on the method of Physical Optics (PO), the effect of panel facet displacement on the radiation field value is derived. A linear system is then constructed between the panel adjustment vector and the far-field pattern. Using Singular Value Decomposition (SVD), the adjustment values for all panel adjustors are obtained by solving the linear equations. An experiment was conducted on a 3.7 m reflector antenna with 12 segmented panels. The results of simulation and test are similar, which shows that the presented method is feasible. Moreover, the discussion of validation shows that the method can be used for many reflector shapes. The proposed research provides guidance for adjusting surface panels efficiently and accurately.
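The SVD solution step can be sketched in a few lines. The example below recovers adjustor values from a linear system J a = e by truncated-SVD pseudoinverse; the sensitivity matrix J is random stand-in data, since the paper derives it from Physical Optics.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sensitivity matrix J: column j is the change in the sampled
# far-field error pattern per unit motion of adjustor j.
n_field, n_adjustors = 40, 12
J = rng.normal(size=(n_field, n_adjustors))

# True (unknown) adjustor offsets and the pattern error they would cause.
a_true = rng.normal(scale=0.1, size=n_adjustors)
e = J @ a_true

# Solve J a = e via SVD, truncating negligible singular values so the
# pseudoinverse stays stable for ill-conditioned sensitivity matrices.
U, s, Vt = np.linalg.svd(J, full_matrices=False)
keep = s > 1e-10 * s[0]
a_est = Vt.T[:, keep] @ ((U[:, keep].T @ e) / s[keep])

print(np.allclose(a_est, a_true))  # noise-free, full-rank: exact recovery
```

With noisy measured patterns, the truncation threshold (or Tikhonov regularization) trades fit accuracy against sensitivity to measurement error.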

  6. Moving your laboratories to the field – Advantages and limitations of the use of field portable instruments in environmental sample analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gałuszka, Agnieszka, E-mail: Agnieszka.Galuszka@ujk.edu.pl; Migaszewski, Zdzisław M.; Namieśnik, Jacek

    The recent rapid progress in technology of field portable instruments has increased their applications in environmental sample analysis. These instruments offer a possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field portable techniques. Several stationary analytical instruments have their portable versions. The most popular ones include: gas chromatographs with different detectors (mass spectrometer (MS), flame ionization detector, photoionization detector), ultraviolet–visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses and electronic tongues. The use of portable instruments in environmental sample analysis gives a possibility of on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process monitoring applications. However, quantification of results is still problematic in many cases. The other disadvantages include: higher detection limits and lower sensitivity than these obtained in laboratory conditions, a strong influence of environmental factors on the instrument performance and a high possibility of sample contamination in the field. This paper reviews recent applications of field portable instruments in environmental sample analysis and discusses their analytical capabilities. - Highlights: • Field portable instruments are widely used in environmental sample analysis. • Field portable instruments are indispensable for analysis in emergency response. • Miniaturization of field portable instruments reduces resource consumption. • In situ analysis is in agreement with green analytical

  7. Information Content of the Near-Field I: Two-Dimensional Samples

    NASA Technical Reports Server (NTRS)

    Frazin, Richard A.; Fischer, David G.; Carney, P. Scott

    2004-01-01

    Limits on the effective resolution of many optical near-field experiments are investigated. The results are applicable to variants of total-internal-reflection microscopy (TIRM), photon-scanning-tunneling microscopy (PSTM), and near-field-scanning-optical microscopy (NSOM) in which the sample is weakly scattering and the direction of illumination may be controlled. Analytical expressions for the variance of the estimate of the complex susceptibility of an unknown two-dimensional object as a function of spatial frequency are obtained for Gaussian and Poisson noise models, and a model-independent measure is examined. The results are used to explore the transition from near-zone to far-zone detection. It is demonstrated that the information content of the measurements made at a distance of even one wavelength away from the sample is already not much different from the information content of the far field. Copyright 2004 Optical Society of America

  8. Pair-barcode high-throughput sequencing for large-scale multiplexed sample analysis

    PubMed Central

    2012-01-01

    Background Multiplexing capacity is the major limitation of next-generation sequencing (NGS) when applied to low-complexity samples. Physical space segregation allows limited multiplexing, while the existing barcode approach only permits simultaneous analysis of up to several dozen samples. Results Here we introduce pair-barcode sequencing (PBS), an economic and flexible barcoding technique that permits parallel analysis of large-scale multiplexed samples. In two pilot runs using a SOLiD sequencer (Applied Biosystems Inc.), 32 independent pair-barcoded miRNA libraries were simultaneously analyzed by the combination of 4 unique forward barcodes and 8 unique reverse barcodes. Over 174,000,000 reads were generated and about 64% of them were assigned to both of the barcodes. After mapping all reads to pre-miRNAs in miRBase, different miRNA expression patterns were captured from the two clinical groups. The strong correlation using different barcode pairs and the high consistency of miRNA expression in two independent runs demonstrate that the PBS approach is valid. Conclusions By employing the PBS approach in NGS, large-scale multiplexed pooled samples can be practically analyzed in parallel, so that high-throughput sequencing economically meets the requirements of samples with low sequencing-throughput demand. PMID:22276739
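The combinatorial gain of pair-barcoding is easy to illustrate. The sketch below builds 4 × 8 = 32 sample assignments from hypothetical barcode sequences and demultiplexes a read pair by prefix match; the sequences, and the assumption that each barcode sits at the start of its read, are illustrative rather than taken from the paper (real designs also need base balance and edit-distance separation).

```python
from itertools import product

# Hypothetical 4 forward and 8 reverse barcodes (sequences are made up).
forward = ["ACGT", "TGCA", "GATC", "CTAG"]
reverse = ["AACC", "GGTT", "ACAC", "TGTG", "CAGT", "GTCA", "TTAA", "CCGG"]

# Each sample gets a unique (forward, reverse) pair: 4 x 8 = 32 libraries
# from only 12 distinct barcode sequences.
assignment = {f"sample_{i:02d}": pair
              for i, pair in enumerate(product(forward, reverse))}

def demultiplex(read_fwd, read_rev, assignment):
    """Return the sample whose barcode pair matches both read ends, or None."""
    for sample, (f, r) in assignment.items():
        if read_fwd.startswith(f) and read_rev.startswith(r):
            return sample
    return None

print(len(assignment))                                  # 32 distinct pairs
print(demultiplex("TGCAGGGG", "GGTTAAAA", assignment))  # sample_09
```

The economy comes from the product structure: n forward plus m reverse barcodes index n × m samples, so barcode count grows with the square root of the number of libraries.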

  9. Investigating large-scale brain dynamics using field potential recordings: analysis and interpretation.

    PubMed

    Pesaran, Bijan; Vinck, Martin; Einevoll, Gaute T; Sirota, Anton; Fries, Pascal; Siegel, Markus; Truccolo, Wilson; Schroeder, Charles E; Srinivasan, Ramesh

    2018-06-25

    New technologies to record electrical activity from the brain on a massive scale offer tremendous opportunities for discovery. Electrical measurements of large-scale brain dynamics, termed field potentials, are especially important to understanding and treating the human brain. Here, our goal is to provide best practices on how field potential recordings (electroencephalograms, magnetoencephalograms, electrocorticograms and local field potentials) can be analyzed to identify large-scale brain dynamics, and to highlight critical issues and limitations of interpretation in current work. We focus our discussion of analyses around the broad themes of activation, correlation, communication and coding. We provide recommendations for interpreting the data using forward and inverse models. The forward model describes how field potentials are generated by the activity of populations of neurons. The inverse model describes how to infer the activity of populations of neurons from field potential recordings. A recurring theme is the challenge of understanding how field potentials reflect neuronal population activity given the complexity of the underlying brain systems.

  10. Scalable parallel distance field construction for large-scale applications

    DOE PAGES

    Yu, Hongfeng; Xie, Jinrong; Ma, Kwan -Liu; ...

    2015-10-01

    Computing distance fields is fundamental to many scientific and engineering applications. Distance fields can be used to direct analysis and reduce data. In this paper, we present a highly scalable method for computing 3D distance fields on massively parallel distributed-memory machines. A new distributed spatial data structure, named parallel distance tree, is introduced to manage the level sets of data and facilitate surface tracking over time, resulting in significantly reduced computation and communication costs for calculating the distance to the surface of interest from any spatial locations. Our method supports several data types and distance metrics from real-world applications. We demonstrate its efficiency and scalability on state-of-the-art supercomputers using both large-scale volume datasets and surface models. We also demonstrate in-situ distance field computation on dynamic turbulent flame surfaces for a petascale combustion simulation. In conclusion, our work greatly extends the usability of distance fields for demanding applications.

  11. Scalable Parallel Distance Field Construction for Large-Scale Applications.

    PubMed

    Yu, Hongfeng; Xie, Jinrong; Ma, Kwan-Liu; Kolla, Hemanth; Chen, Jacqueline H

    2015-10-01

    Computing distance fields is fundamental to many scientific and engineering applications. Distance fields can be used to direct analysis and reduce data. In this paper, we present a highly scalable method for computing 3D distance fields on massively parallel distributed-memory machines. A new distributed spatial data structure, named parallel distance tree, is introduced to manage the level sets of data and facilitate surface tracking over time, resulting in significantly reduced computation and communication costs for calculating the distance to the surface of interest from any spatial locations. Our method supports several data types and distance metrics from real-world applications. We demonstrate its efficiency and scalability on state-of-the-art supercomputers using both large-scale volume datasets and surface models. We also demonstrate in-situ distance field computation on dynamic turbulent flame surfaces for a petascale combustion simulation. Our work greatly extends the usability of distance fields for demanding applications.
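For readers unfamiliar with the underlying quantity, the serial miniature below computes a distance field on a small 3D grid with scipy.ndimage: the distance from every voxel to a spherical level-set surface, which is what the parallel distance tree computes at scale across distributed memory. The grid, the synthetic surface, and the choice of the exact Euclidean transform are illustrative only.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# Small 3D grid with a synthetic "surface of interest": a spherical shell.
n = 32
z, y, x = np.mgrid[:n, :n, :n]
r = np.sqrt((x - n / 2) ** 2 + (y - n / 2) ** 2 + (z - n / 2) ** 2)
surface = np.abs(r - 8) < 0.7          # voxels near the level set r = 8

# distance_transform_edt measures the distance to the nearest zero voxel,
# so pass the complement of the surface mask.
dist = distance_transform_edt(~surface)

print(dist.shape)            # (32, 32, 32)
print(dist[surface].max())   # 0.0 on the surface itself
```

A parallel implementation must do the same computation when the surface and the query voxels live on different ranks, which is where the reduced-communication data structure in the paper earns its keep.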

  12. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    NASA Astrophysics Data System (ADS)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

    Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be available at regional to national levels and cover long time periods. As a result of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain in developing operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat imagery (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive, combined with other satellite, climate, and weather data, is creating previously unimagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field scales.

  13. 46 CFR 161.006-5 - Sampling, inspections and tests.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 6 2011-10-01 2011-10-01 false Sampling, inspections and tests. 161.006-5 Section 161.006-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) EQUIPMENT, CONSTRUCTION, AND... Vessels § 161.006-5 Sampling, inspections and tests. (a) General. Motor lifeboat searchlights specified by...

  14. 46 CFR 161.006-5 - Sampling, inspections and tests.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 6 2010-10-01 2010-10-01 false Sampling, inspections and tests. 161.006-5 Section 161.006-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) EQUIPMENT, CONSTRUCTION, AND... Vessels § 161.006-5 Sampling, inspections and tests. (a) General. Motor lifeboat searchlights specified by...

  15. 7 CFR 29.426 - Collection of pesticide test samples.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Collection of pesticide test samples. 29.426 Section... CONTAINER REGULATIONS TOBACCO INSPECTION Regulations Miscellaneous § 29.426 Collection of pesticide test samples. Any lot of tobacco not certified by the importer as being free of prohibited pesticide residues...

  16. 7 CFR 29.426 - Collection of pesticide test samples.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Collection of pesticide test samples. 29.426 Section... CONTAINER REGULATIONS TOBACCO INSPECTION Regulations Miscellaneous § 29.426 Collection of pesticide test samples. Any lot of tobacco not certified by the importer as being free of prohibited pesticide residues...

  17. 7 CFR 29.426 - Collection of pesticide test samples.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Collection of pesticide test samples. 29.426 Section... CONTAINER REGULATIONS TOBACCO INSPECTION Regulations Miscellaneous § 29.426 Collection of pesticide test samples. Any lot of tobacco not certified by the importer as being free of prohibited pesticide residues...

  18. 7 CFR 29.426 - Collection of pesticide test samples.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Collection of pesticide test samples. 29.426 Section... CONTAINER REGULATIONS TOBACCO INSPECTION Regulations Miscellaneous § 29.426 Collection of pesticide test samples. Any lot of tobacco not certified by the importer as being free of prohibited pesticide residues...

  19. 7 CFR 29.426 - Collection of pesticide test samples.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Collection of pesticide test samples. 29.426 Section... CONTAINER REGULATIONS TOBACCO INSPECTION Regulations Miscellaneous § 29.426 Collection of pesticide test samples. Any lot of tobacco not certified by the importer as being free of prohibited pesticide residues...

  20. Large scale cryogenic fluid systems testing

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA Lewis Research Center's Cryogenic Fluid Systems Branch (CFSB) within the Space Propulsion Technology Division (SPTD) has the ultimate goal of enabling the long term storage and in-space fueling/resupply operations for spacecraft and reusable vehicles in support of space exploration. Using analytical modeling, ground based testing, and on-orbit experimentation, the CFSB is studying three primary categories of fluid technology: storage, supply, and transfer. The CFSB is also investigating fluid handling, advanced instrumentation, and tank structures and materials. Ground based testing of large-scale systems is done using liquid hydrogen as a test fluid at the Cryogenic Propellant Tank Facility (K-site) at Lewis' Plum Brook Station in Sandusky, Ohio. A general overview of tests involving liquid transfer, thermal control, pressure control, and pressurization is given.

  1. Evaluation of genotoxicity testing of FDA approved large molecule therapeutics.

    PubMed

    Sawant, Satin G; Fielden, Mark R; Black, Kurt A

    2014-10-01

    Large molecule therapeutics (MW > 1000 daltons) are not expected to enter the cell and thus have reduced potential to interact directly with DNA or related physiological processes. Genotoxicity studies are therefore not relevant and typically not required for large molecule therapeutic candidates. Regulatory guidance supports this approach; however, there are examples of marketed large molecule therapeutics for which sponsors have conducted genotoxicity studies. A retrospective analysis was performed on genotoxicity studies of United States FDA-approved large molecule therapeutics since 1998, identified through the Drugs@FDA website. This information was used to provide a data-driven rationale for genotoxicity evaluations of large molecule therapeutics. Fifty-three of the 99 therapeutics identified were tested for genotoxic potential. None of the therapeutics tested showed a positive outcome in any study except the peptide glucagon (GlucaGen®), which showed equivocal in vitro results, as stated in the product labeling. Scientific rationale and data from this review indicate that testing of a majority of large molecule modalities does not add value to risk assessment and support current regulatory guidance. Similarly, the data do not support testing of peptides containing only natural amino acids. Peptides containing non-natural amino acids and small molecules in conjugated products may need to be tested. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Photovoltaic-Powered Vaccine Refrigerator: Freezer Systems Field Test Results

    NASA Technical Reports Server (NTRS)

    Ratajczak, A. F.

    1985-01-01

    A project to develop and field test photovoltaic-powered refrigerator/freezers suitable for vaccine storage was undertaken. Three refrigerator/freezers were qualified: one by Solar Power Corp. and two by Solvolt. Follow-on contracts were awarded for 19 field test systems and for 10 field test systems. A total of 29 systems were installed in 24 countries between October 1981 and October 1984. The project, system descriptions, installation experiences, performance data for the 22 systems for which field test data were reported, an operational reliability summary, and recommendations relative to system designs and future use of such systems are presented. Performance data indicate that the systems are highly reliable and are capable of maintaining proper vaccine storage temperatures in a wide range of climatological and user environments.

  3. MUSE field splitter unit: fan-shaped separator for 24 integral field units

    NASA Astrophysics Data System (ADS)

    Laurent, Florence; Renault, Edgard; Anwand, Heiko; Boudon, Didier; Caillier, Patrick; Kosmalski, Johan; Loupias, Magali; Nicklas, Harald; Seifert, Walter; Salaun, Yves; Xu, Wenli

    2014-07-01

    MUSE (Multi Unit Spectroscopic Explorer) is a second-generation Very Large Telescope (VLT) integral field spectrograph developed for the European Southern Observatory (ESO). It combines a 1' x 1' field of view sampled at 0.2 arcsec for its Wide Field Mode (WFM) and a 7.5" x 7.5" field of view for its Narrow Field Mode (NFM). Both modes will operate with the improved spatial resolution provided by GALACSI (Ground Atmospheric Layer Adaptive Optics for Spectroscopic Imaging), which will use the VLT deformable secondary mirror and 4 Laser Guide Stars (LGS) foreseen in 2015. MUSE operates in the visible wavelength range (0.465-0.93 μm). A consortium of seven institutes is currently commissioning MUSE at the Very Large Telescope for the Preliminary Acceptance in Chile, scheduled for September 2014. MUSE is composed of several subsystems, each under the responsibility of one institute. The Fore Optics derotates and anamorphoses the image at the focal plane. The Splitting and Relay Optics feed the 24 identical Integral Field Units (IFU), which are mounted within a large monolithic instrument mechanical structure. Each IFU incorporates an image slicer, a fully refractive spectrograph with VPH grating, and a detector system connected to a global vacuum and cryogenic system. During 2012 and 2013, all MUSE subsystems were integrated, aligned and tested at the P.I. institute in Lyon. After successful PAE in September 2013, the MUSE instrument was shipped to the Very Large Telescope in Chile, where it was aligned and tested in the ESO integration hall at Paranal. MUSE was then transferred as a monolithic unit onto the VLT, where first light was achieved. This paper describes the MUSE main optical component: the Field Splitter Unit. It splits the VLT image into 24 subfields and provides the first separation of the beam for the 24 Integral Field Units. This paper depicts its manufacture at Winlight Optics and its alignment into the MUSE instrument. The success of the MUSE

  4. Creating a testing field where delta technology and water innovations are tested and demonstrated with the help of citizen science methods

    NASA Astrophysics Data System (ADS)

    de Vries, Sandra; Rutten, Martine; de Vries, Liselotte; Anema, Kim; Klop, Tanja; Kaspersma, Judith

    2017-04-01

    In highly populated deltas, much work is to be done. Complex problems ask for new, knowledge-driven solutions. Innovations in delta technology and water can bring relief to managing water-rich urban areas. Testing fields form a fundamental part of the knowledge valorisation for such innovations. In such testing fields, product development by start-ups is coupled with researchers, thus supplying new scientific insights. With the help of tests, demonstrations and large-scale applications by the end-users, these innovations find their way to the daily practices of delta management. More and more cities embrace the concept of Smart Cities to tackle the ongoing complexity of urban problems and to manage the city's assets - such as its water supply networks and other water management infrastructure. Through the use of new technologies and innovative systems, data are collected from and with citizens and devices - then processed and analysed. The information and knowledge gathered are keys to enabling a better quality of life. By testing water innovations together with citizens in order to find solutions for water management problems, not only are highly spatial amounts of data provided by and/or about these innovations, but the innovations are also improved and demonstrated to the public. A consortium consisting of a water authority, a science centre, a valorisation program and two universities has joined forces to create a testing field for delta technology and water innovations using citizen science methods. In this testing field, the use of citizen science for water technologies is researched and validated by facilitating pilot projects. In these projects, researchers, start-ups and citizens work together to find answers to present-day water management problems. The testing field examines the use of crowd-sourced data, for example as hydrological model inputs, to validate remote sensing applications, or to improve water management decisions. Currently the

  5. Pre-Clinical Evaluation of a Real-Time PCR Assay on a Portable Instrument as a Possible Field Diagnostic Tool: Experiences from the Testing of Clinical Samples for African and Classical Swine Fever Viruses.

    PubMed

    Liu, L; Luo, Y; Accensi, F; Ganges, L; Rodríguez, F; Shan, H; Ståhl, K; Qiu, H-J; Belák, S

    2017-10-01

    African swine fever (ASF) and classical swine fever (CSF) are two highly infectious transboundary animal diseases (TADs) that are serious threats to the pig industry worldwide, including in China, the world's largest pork producer. In this study, a duplex real-time PCR assay was developed for the rapid detection and differentiation of African swine fever virus (ASFV) and classical swine fever virus (CSFV). The assay was performed on a portable, battery-powered PCR thermocycler with a low sample throughput (termed the 'T-COR4 assay'). The feasibility and reliability of the T-COR4 assay as a possible field method was investigated by testing clinical samples collected in China. When evaluated with reference materials or samples from experimental infections, the assay performed in a reliable manner, producing results comparable to those obtained from stationary PCR platforms. Of 59 clinical samples, 41 had results identical to a two-step CSFV real-time PCR assay. No ASFV was detected in these samples. The T-COR4 assay was technically easy to perform and produced results within 3 h, including sample preparation. In combination with a simple sample preparation method, the T-COR4 assay provides a new tool for the field diagnosis and differentiation of ASF and CSF, which could be of particular value in remote areas. © 2016 Blackwell Verlag GmbH.

  6. FIELD SAMPLING OF RESIDUAL AVIATION GASOLINE IN SANDY SOIL

    EPA Science Inventory

    Two complementary field sampling methods for the determination of residual aviation gasoline content in the contaminated capillary fringe of a fine, uniform, sandy soil were investigated. The first method featured field extrusion of core barrels into pint-size Mason jars, while ...

  7. A Large-Sample Test of a Semi-Automated Clavicle Search Engine to Assist Skeletal Identification by Radiograph Comparison.

    PubMed

    D'Alonzo, Susan S; Guyomarc'h, Pierre; Byrd, John E; Stephan, Carl N

    2017-01-01

    In 2014, a morphometric capability to search chest radiograph databases by quantified clavicle shape was published to assist skeletal identification. Here, we extend the validation tests conducted by increasing the search universe 18-fold, from 409 to 7361 individuals, to determine whether there is any associated decrease in performance under these more challenging circumstances. The number of trials and analysts were also increased, from 17 to 30 skeletons and from two to four examiners, respectively. Elliptical Fourier analysis was conducted on clavicles from each skeleton by each analyst (shadowgrams trimmed from scratch in every instance) and compared to the search universe. Correctly matching individuals were found in shortlists of 10% of the sample 70% of the time. This rate is similar to, although slightly lower than, rates previously found for much smaller samples (80%). Accuracy and reliability are thereby maintained, even when the comparison system is challenged by much larger search universes. © 2016 American Academy of Forensic Sciences.
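    The shortlist-style search described above — ranking a large universe of shape descriptors by distance to a query and reviewing only the closest fraction — can be sketched generically. This is not the published clavicle software; the Euclidean metric, the 20-dimensional coefficient vectors, and the measurement-noise level are illustrative assumptions.

```python
import numpy as np

def shortlist_rank(query, database, frac=0.10):
    """Rank database rows by Euclidean distance to the query vector
    and return the indices of the closest `frac` fraction."""
    dist = np.linalg.norm(database - query, axis=1)
    k = max(1, int(round(frac * len(database))))
    return np.argsort(dist)[:k]

rng = np.random.default_rng(0)
db = rng.normal(size=(7361, 20))      # search universe of shape-coefficient vectors
true_idx = 42
query = db[true_idx] + rng.normal(scale=0.05, size=20)  # noisy re-measurement
hits = shortlist_rank(query, db)
print(true_idx in hits, len(hits))
```

    With a 7361-record universe, a 10% shortlist still contains 736 candidates, which is why the paper's 70% shortlist hit-rate matters for examiner workload.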

  8. Testing large flats with computer generated holograms

    NASA Astrophysics Data System (ADS)

    Pariani, Giorgio; Tresoldi, Daniela; Spanò, Paolo; Bianco, Andrea

    2012-09-01

    We describe the optical test of a large flat based on a spherical mirror and a dedicated CGH. The spherical mirror, which can be accurately manufactured and tested in an absolute way, makes it possible to obtain a quasi-collimated light beam, and the hologram performs the residual wavefront correction. Alignment tools for the spherical mirror and the hologram itself are encoded in the CGH. Sensitivity to fabrication and alignment errors has been evaluated. Tests to verify the effectiveness of our approach are now under way.

  9. Optimum sample size allocation to minimize cost or maximize power for the two-sample trimmed mean test.

    PubMed

    Guo, Jiin-Huarng; Luh, Wei-Ming

    2009-05-01

    When planning a study, sample size determination is one of the most important tasks facing the researcher. The size will depend on the purpose of the study, the cost limitations, and the nature of the data. By specifying the standard deviation ratio and/or the sample size ratio, the present study considers the problem of heterogeneous variances and non-normality for Yuen's two-group test and develops sample size formulas to minimize the total cost or maximize the power of the test. For a given power, the sample size allocation ratio can be manipulated so that the proposed formulas can minimize the total cost, the total sample size, or the sum of total sample size and total cost. On the other hand, for a given total cost, the optimum sample size allocation ratio can maximize the statistical power of the test. After the sample size is determined, the present simulation applies Yuen's test to the sample generated, and then the procedure is validated in terms of Type I errors and power. Simulation results show that the proposed formulas can control Type I errors and achieve the desired power under the various conditions specified. Finally, the implications for determining sample sizes in experimental studies and future research are discussed.
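    Yuen's two-sample trimmed-mean test, on which the paper's sample-size and cost formulas are built, can be sketched as a minimal textbook-style implementation with 20% trimming (the cost and allocation formulas themselves are not reproduced here):

```python
import numpy as np
from scipy import stats

def yuen_test(x, y, trim=0.2):
    """Yuen's two-sample test on trimmed means with winsorized variances.
    Returns (t statistic, degrees of freedom, two-sided p-value)."""
    def parts(a):
        a = np.sort(np.asarray(a, dtype=float))
        n = len(a)
        g = int(trim * n)                      # observations trimmed per tail
        h = n - 2 * g                          # effective sample size
        tmean = a[g:n - g].mean()              # trimmed mean
        w = a.copy()
        w[:g], w[n - g:] = a[g], a[n - g - 1]  # winsorize the tails
        d = (n - 1) * w.var(ddof=1) / (h * (h - 1))
        return tmean, d, h
    m1, d1, h1 = parts(x)
    m2, d2, h2 = parts(y)
    t_stat = (m1 - m2) / np.sqrt(d1 + d2)
    df = (d1 + d2) ** 2 / (d1 ** 2 / (h1 - 1) + d2 ** 2 / (h2 - 1))
    return t_stat, df, 2 * stats.t.sf(abs(t_stat), df)

# Two equal-size samples whose trimmed means differ by 10
t_stat, df, p = yuen_test(np.arange(20.0), np.arange(20.0) + 10.0)
print(round(df, 1), p < 0.01)
```

    Trimming and winsorizing make the test robust to heavy tails, and the Welch-style degrees of freedom accommodate unequal variances — exactly the non-normality and heterogeneity conditions the paper's formulas address.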

  10. A single mini-barcode test to screen for Australian mammalian predators from environmental samples

    PubMed Central

    MacDonald, Anna J; Sarre, Stephen D

    2017-01-01

    Identification of species from trace samples is now possible through the comparison of diagnostic DNA fragments against reference DNA sequence databases. DNA detection of animals from non-invasive samples, such as predator faeces (scats) that contain traces of DNA from their species of origin, has proved to be a valuable tool for the management of elusive wildlife. However, application of this approach can be limited by the availability of appropriate genetic markers. Scat DNA is often degraded, meaning that longer DNA sequences, including standard DNA barcoding markers, are difficult to recover. Instead, targeted short diagnostic markers are required to serve as diagnostic mini-barcodes. The mitochondrial genome is a useful source of such trace DNA markers because it provides good resolution at the species level and occurs in high copy numbers per cell. We developed a mini-barcode based on a short (178 bp) fragment of the conserved 12S ribosomal ribonucleic acid mitochondrial gene sequence, with the goal of discriminating amongst the scats of large mammalian predators of Australia. We tested the sensitivity and specificity of our primers and can accurately detect and discriminate amongst quolls, cats, dogs, foxes, and devils from trace DNA samples. Our approach provides a cost-effective, time-efficient, and non-invasive tool that enables identification of all 8 medium-large mammal predators in Australia, including native and introduced species, using a single test. With modification, this approach is likely to be of broad applicability elsewhere. PMID:28810700
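    The species-assignment step described above — comparing a recovered mini-barcode fragment against reference sequences — can be illustrated with a toy identity-threshold classifier. The 12-bp sequences and the 0.98 threshold below are hypothetical stand-ins for real 178-bp 12S references.

```python
def identity(a, b):
    """Fraction of matching bases between two aligned, equal-length sequences."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def classify(fragment, references, threshold=0.98):
    """Assign the fragment to the reference species with the highest identity,
    provided that identity exceeds the acceptance threshold; else return None."""
    best = max(references, key=lambda sp: identity(fragment, references[sp]))
    return best if identity(fragment, references[best]) >= threshold else None

# Hypothetical 12-bp stand-ins for real 178-bp 12S reference sequences
refs = {"cat": "ACGTACGTACGT", "dog": "ACGTACGTTCGA", "fox": "TCGTACCTACGT"}
print(classify("ACGTACGTACGT", refs))
```

    Real pipelines align against curated databases and handle sequencing error and partial fragments, but the accept/reject logic follows this pattern.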

  11. The persistence of large-scale blowouts in largely vegetated coastal dune fields

    NASA Astrophysics Data System (ADS)

    Delgado-Fernandez, Irene; Smyth, Thomas; Jackson, Derek; Davidson-Arnott, Robin; Smith, Alexander

    2016-04-01

    Coastal dunes move through natural phases of stability and instability during their evolution, displaying various temporal and spatial patterns across the dune field. Recent observations, however, have shown exceptionally rapid rates of stabilisation through increased vegetative growth. This progressive vegetation colonisation and consequent loss of bare sand on coastal dune systems has been noted worldwide. Percentage reductions in bare sand of as much as 80% within just a few decades can be seen in examples from South Africa, Canada and Brazil as well as coastal dune sites across NW Europe. Despite these dramatic trends towards dune stabilisation, it is not uncommon to find particular examples of large-scale active blowouts and parabolic dunes within largely vegetated coastal dunes. While turbulence and airflow dynamics within features such as blowouts and other dune forms have been studied in detail in recent years, there is a lack of knowledge about what maintains dune mobility at these specific points in otherwise largely stabilised dune fields. This work explores the particular example of the 'Devil's Hole' blowout, Sefton Dunes, NW England. Approximately 300 m long by 100 m wide, its basin is below the water table, which leads to frequent flooding. Sefton Dunes in general have seen a dramatic loss of bare sand since the 1940s. However, and coinciding with this period of dune stabilisation, the 'Devil's Hole' has not only remained active but also grown in size at a rate of 4.5 m per year along its main axis. An exploration of the factors controlling the maintenance of open bare sand areas at this particular location is carried out using a variety of techniques, including Computational Fluid Dynamics (CFD) airflow modelling and in situ empirical measurements (short-term experiments) of wind turbulence and sand transport. Field measurements of wind parameters and transport processes were collected over a 2-week period during October 2015. Twenty three 3D ultrasonic

  12. Large-scale magnetic fields, non-Gaussianity, and gravitational waves from inflation

    NASA Astrophysics Data System (ADS)

    Bamba, Kazuharu

    2017-12-01

    We explore the generation of large-scale magnetic fields in the so-called moduli inflation. The hypercharge electromagnetic fields couple to not only a scalar field but also a pseudoscalar one, so that the conformal invariance of the hypercharge electromagnetic fields can be broken. We explicitly analyze the strength of the magnetic fields on the Hubble horizon scale at the present time, the local non-Gaussianity of the curvature perturbations originating from the massive gauge fields, and the tensor-to-scalar ratio of the density perturbations. As a consequence, we find that the local non-Gaussianity and the tensor-to-scalar ratio are compatible with the recent Planck results.

  13. Preparation and Testing of Impedance-based Fluidic Biochips with RTgill-W1 Cells for Rapid Evaluation of Drinking Water Samples for Toxicity

    PubMed Central

    Brennan, Linda M.; Widder, Mark W.; McAleer, Michael K.; Mayo, Michael W.; Greis, Alex P.; van der Schalie, William H.

    2016-01-01

    This manuscript describes how to prepare fluidic biochips with Rainbow trout gill epithelial (RTgill-W1) cells for use in a field portable water toxicity sensor. A monolayer of RTgill-W1 cells forms on the sensing electrodes enclosed within the biochips. The biochips are then used for testing in a field portable electric cell-substrate impedance sensing (ECIS) device designed for rapid toxicity testing of drinking water. The manuscript further describes how to run a toxicity test using the prepared biochips. A control water sample and the test water sample are mixed with pre-measured powdered media and injected into separate channels of the biochip. Impedance readings from the sensing electrodes in each of the biochip channels are measured and compared by an automated statistical software program. The screen on the ECIS instrument will indicate either "Contamination Detected" or "No Contamination Detected" within an hour of sample injection. Advantages are ease of use and rapid response to a broad spectrum of inorganic and organic chemicals at concentrations that are relevant to human health concerns, as well as the long-term stability of stored biochips in a ready state for testing. Limitations are the requirement for cold storage of the biochips and limited sensitivity to cholinesterase-inhibiting pesticides. Applications for this toxicity detector are for rapid field-portable testing of drinking water supplies by Army Preventative Medicine personnel or for use at municipal water treatment facilities. PMID:27023147

  14. Preparation and Testing of Impedance-based Fluidic Biochips with RTgill-W1 Cells for Rapid Evaluation of Drinking Water Samples for Toxicity.

    PubMed

    Brennan, Linda M; Widder, Mark W; McAleer, Michael K; Mayo, Michael W; Greis, Alex P; van der Schalie, William H

    2016-03-07

    This manuscript describes how to prepare fluidic biochips with Rainbow trout gill epithelial (RTgill-W1) cells for use in a field portable water toxicity sensor. A monolayer of RTgill-W1 cells forms on the sensing electrodes enclosed within the biochips. The biochips are then used for testing in a field portable electric cell-substrate impedance sensing (ECIS) device designed for rapid toxicity testing of drinking water. The manuscript further describes how to run a toxicity test using the prepared biochips. A control water sample and the test water sample are mixed with pre-measured powdered media and injected into separate channels of the biochip. Impedance readings from the sensing electrodes in each of the biochip channels are measured and compared by an automated statistical software program. The screen on the ECIS instrument will indicate either "Contamination Detected" or "No Contamination Detected" within an hour of sample injection. Advantages are ease of use and rapid response to a broad spectrum of inorganic and organic chemicals at concentrations that are relevant to human health concerns, as well as the long-term stability of stored biochips in a ready state for testing. Limitations are the requirement for cold storage of the biochips and limited sensitivity to cholinesterase-inhibiting pesticides. Applications for this toxicity detector are for rapid field-portable testing of drinking water supplies by Army Preventative Medicine personnel or for use at municipal water treatment facilities.
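    The automated control-versus-test comparison described above can be illustrated with a generic statistical check. The instrument's actual statistical software is not public; Welch's t-test, the impedance values, and the alpha level here are illustrative assumptions only.

```python
import numpy as np
from scipy import stats

def screen_sample(control_z, test_z, alpha=0.01):
    """Flag contamination when the test channel's impedance differs
    significantly from the control channel (Welch's t-test)."""
    _, p = stats.ttest_ind(control_z, test_z, equal_var=False)
    return "Contamination Detected" if p < alpha else "No Contamination Detected"

rng = np.random.default_rng(1)
control = rng.normal(1000, 15, size=30)  # ohms: intact RTgill-W1 monolayer
toxic = rng.normal(850, 40, size=30)     # impedance falls as stressed cells detach
print(screen_sample(control, control))   # identical readings -> no flag
print(screen_sample(control, toxic))
```

    A drop in monolayer impedance is the toxicity signal; the statistical gate keeps ordinary channel-to-channel noise from triggering a false alarm.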

  15. Distinct Element Modeling of the Large Block Test

    NASA Astrophysics Data System (ADS)

    Carlson, S. R.; Blair, S. C.; Wagoner, J. L.

    2001-12-01

    The Yucca Mountain Site Characterization Project is investigating Yucca Mountain, Nevada as a potential nuclear waste repository site. As part of this effort, the Large Block, a 3 m x 3 m x 4.5 m rectangular prism of Topopah Spring tuff, was excavated at Fran Ridge near Yucca Mountain. The Large Block was heated to a peak temperature of 145 °C along a horizontal plane 2.75 m below the top of the block over a period of about one year. Displacements were measured in three orthogonal directions with an array of six Multiple Point Borehole Extensometers (MPBX) and were numerically simulated in three dimensions with 3DEC, a distinct element code. The distinct element method was chosen to incorporate discrete fractures in the simulations. The model domain was extended 23 m below the ground surface and, in the subsurface, 23 m outward from each vertical face so that fixed displacement boundary conditions could be applied well away from the heated portion of the block. A single continuum model and three distinct element models, incorporating six to twenty-eight mapped fractures, were tested. Two thermal expansion coefficients were tested for the six-fracture model: a higher value taken from laboratory measurements and a lower value from an earlier field test. The MPBX data show that the largest displacements occurred in the upper portion of the block despite the higher temperatures near the center. The continuum model was found to under-predict the MPBX displacements except in the east-west direction near the base of the block. The high thermal expansion model over-predicted the MPBX displacements except in the north-south direction near the top of the block. The highly fractured model under-predicted most of the MPBX displacements and poorly simulated the cool-down portion of the test. Although no model provided the single best fit to all of the MPBX data, the six- and seven-fracture models consistently provided good fits and in most cases showed much improvement over the
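    The sensitivity of predicted displacements to the assumed thermal expansion coefficient can be illustrated with a first-order free-expansion estimate. The coefficients, temperature rise, and span below are hypothetical round numbers, not the study's values, and the 3DEC simulations are far more detailed than this back-of-envelope check.

```python
def thermal_expansion_mm(alpha_per_K, delta_T_K, length_m):
    """First-order free thermal expansion: dL = alpha * dT * L (returned in mm)."""
    return alpha_per_K * delta_T_K * length_m * 1000.0

# Hypothetical coefficients bracketing a lab-derived vs field-derived value
for label, alpha in (("lab", 9e-6), ("field", 6e-6)):
    dl = thermal_expansion_mm(alpha, 120.0, 3.0)  # ~120 K rise over a 3 m span
    print(f"{label}: {dl:.2f} mm")
```

    Even this linear estimate shows why the choice between the two coefficients shifts predicted displacements by tens of percent, consistent with the over- and under-prediction pattern reported above.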

  16. Scanning EM of non-heavy metal stained biosamples: Large-field of view, high contrast and highly efficient immunolabeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuipers, Jeroen; Boer, Pascal de; Giepmans, Ben N.G., E-mail: b.n.g.giepmans@umcg.nl

    Scanning electron microscopy (SEM) is increasing its application in life sciences for electron density measurements of ultrathin sections. These are traditionally analyzed with transmission electron microscopy (TEM); by most labs, SEM analysis is still associated with surface imaging only. Here we report several advantages of SEM for thin sections over TEM, both for structural inspection and for analyzing immuno-targeted labels such as quantum dots (QDs) and gold, where we find that QD labeling is ten times more efficient than gold labeling. Furthermore, we find that omitting post-staining with uranyl and lead leaves QDs readily detectable over the ultrastructure, but under these conditions ultrastructural contrast was almost invisible in TEM examination. Importantly, imaging in SEM with STEM detection leads to both outstanding QD and ultrastructural contrast. STEM imaging is superior to back-scattered electron imaging of these non-contrasted samples, whereas secondary electron detection cannot be used at all. We conclude that examination of ultrathin sections by SEM, which may be immunolabeled with QDs, will allow rapid and straightforward analysis of large fields with more efficient labeling than can be achieved with immunogold. The large fields of view routinely achieved with SEM, but not with TEM, allow straightforward raw data sharing using virtual microscopy, also known as nanotomy when this concerns EM data in the life sciences. - Highlights: • High resolution and large fields of view via nanotomy or virtual microscopy. • Highly relevant for EM datasets where information density is high. • Sample preparation with low contrast good for STEM, not TEM. • Quantum dots now stand out in STEM-based detection. • 10 times more efficient labeling with quantum dots compared to gold.

  17. Field comparison of OraQuick® ADVANCE Rapid HIV-1/2 antibody test and two blood-based rapid HIV antibody tests in Zambia

    PubMed Central

    2012-01-01

    Background Zambia’s national HIV testing algorithm specifies use of two rapid blood-based antibody assays, Determine® HIV-1/2 (Inverness Medical) and, if positive, then Uni-Gold™ Recombigen HIV-1/2 (Trinity Biotech). Little is known about the performance of oral-fluid-based HIV testing in Zambia. The aims of this study are two-fold: 1) to compare the diagnostic accuracy (sensitivity and specificity) under field conditions of the OraQuick® ADVANCE® Rapid HIV-1/2 (OraSure Technologies, Inc.) to two blood-based rapid antibody tests currently in use in the Zambia national algorithm, and 2) to perform a cost analysis of large-scale field testing employing the OraQuick®. Methods This was an operational retrospective analysis of HIV testing and questionnaire data collected in 2010 as part of the ZAMSTAR (Zambia South Africa TB and AIDS Reduction) study. Randomly sampled individuals in twelve communities were tested consecutively with the OraQuick® test using oral fluid and with two blood-based rapid HIV tests, Determine® and Uni-Gold™. A cost analysis of four algorithms from a health-systems perspective was performed: 1) Determine® and, if positive, then Uni-Gold™ (Determine®/Uni-Gold™), based on the current algorithm; 2) Determine® and, if positive, then OraQuick® (Determine®/OraQuick®); 3) OraQuick® and, if positive, then Determine® (OraQuick®/Determine®); 4) OraQuick® and, if positive, then Uni-Gold™ (OraQuick®/Uni-Gold™). This information was then used to construct a model using a hypothetical population of 5,000 persons with varying prevalence of HIV infection from 1-30%. Results 4,458 participants received both a Determine® and an OraQuick® test. The sensitivity and specificity of the OraQuick® test were 98.7 (95% CI, 97.5-99.4) and 99.8 (95% CI, 99.6-99.9), respectively, when compared to HIV-positive serostatus. The average unit costs per algorithm were US$3.76, US$4.03, US$7.35, and US$7.67 for Determine®/Uni-Gold™, Determine®/OraQuick®, Ora
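    The cost model described in the Methods — a serial two-test algorithm applied to a hypothetical population of 5,000 at varying prevalence — can be sketched as follows. The per-test unit costs, sensitivity, and specificity used here are illustrative placeholders, not the study's figures.

```python
def cost_per_person(prev, c_screen, c_confirm, sens=0.99, spec=0.998):
    """Expected testing cost per person for a serial two-test algorithm:
    everyone gets the screening test; screen-positives get the confirmatory test."""
    p_screen_pos = prev * sens + (1 - prev) * (1 - spec)
    return c_screen + p_screen_pos * c_confirm

pop = 5000                        # hypothetical population, as in the study's model
for prev in (0.01, 0.10, 0.30):   # spanning the 1-30% prevalence range
    total = pop * cost_per_person(prev, c_screen=2.50, c_confirm=3.00)
    print(f"prevalence {prev:.0%}: total cost ≈ ${total:,.2f}")
```

    Because the confirmatory test is only run on screen-positives, total cost grows roughly linearly with prevalence, which is why the relative ranking of the four algorithms can change across the 1-30% range.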

  18. A prototype tap test imaging system: Initial field test results

    NASA Astrophysics Data System (ADS)

    Peters, J. J.; Barnard, D. J.; Hudelson, N. A.; Simpson, T. S.; Hsu, D. K.

    2000-05-01

    This paper describes a simple, field-worthy tap test imaging system that gives quantitative information about the size, shape, and severity of defects and damages. The system consists of an accelerometer, electronic circuits for conditioning the signal and measuring the impact duration, a laptop PC and data acquisition and processing software. The images are generated manually by tapping on a grid printed on a plastic sheet laid over the part's surface. A mechanized scanner is currently under development. The prototype has produced images for a variety of aircraft composite and metal honeycomb structures containing flaws, damages, and repairs. Images of the local contact stiffness, deduced from the impact duration using a spring model, revealed quantitatively the stiffness reduction due to flaws and damages, as well as the stiffness enhancement due to substructures. The system has been field tested on commercial and military aircraft as well as rotor blades and engine decks on helicopters. Field test results will be shown and the operation of the system will be demonstrated.—This material is based upon work supported by the Federal Aviation Administration under Contract #DTFA03-98-D-00008, Delivery Order No. IA016 and performed at Iowa State University's Center for NDE as part of the Center for Aviation Systems Reliability program.
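    The spring model mentioned above relates impact duration to local contact stiffness: for a half-sine impact, tau = pi * sqrt(m/k), so k = m * (pi/tau)^2. A minimal sketch (the hammer mass and durations below are hypothetical):

```python
import math

def contact_stiffness(mass_kg, tau_s):
    """Spring-model local contact stiffness from tap impact duration:
    tau = pi * sqrt(m/k)  =>  k = m * (pi / tau)**2  (N/m)."""
    return mass_kg * (math.pi / tau_s) ** 2

# A hypothetical 20 g tap hammer: longer impact duration => lower stiffness
k_good = contact_stiffness(0.020, 300e-6)  # 300 µs on sound structure
k_flaw = contact_stiffness(0.020, 450e-6)  # 450 µs over a disbond or damage
print(k_good > k_flaw)
```

    Mapping stiffness rather than raw duration is what lets the instrument show both the stiffness reduction at flaws and the stiffness enhancement over substructure.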

  19. Field Test to Evaluate Deep Borehole Disposal.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, Ernest; Brady, Patrick Vane.; Clark, Andrew Jordan

    The U.S. Department of Energy (DOE) has embarked on the Deep Borehole Field Test (DBFT), which will investigate whether conditions suitable for disposal of radioactive waste can be found at a depth of up to 5 km in the earth’s crust. As planned, the DBFT will demonstrate drilling and construction of two boreholes, one for initial scientific characterization, and the other at a larger diameter such as could be appropriate for waste disposal (the DBFT will not involve radioactive waste). A wide range of geoscience activities is planned for the Characterization Borehole, and an engineering demonstration of test package emplacement and retrieval is planned for the larger Field Test Borehole. Characterization activities will focus on measurements and samples that are important for evaluating the long-term isolation capability of the Deep Borehole Disposal (DBD) concept. Engineering demonstration activities will focus on providing data to evaluate the concept’s operational safety and practicality. Procurement of a scientifically acceptable DBFT site and a site management contractor is now underway. The concept of deep borehole disposal (DBD) for radioactive wastes is not new. It was considered by the National Academy of Science (NAS 1957) for liquid waste, studied in the 1980s in the U.S. (Woodward–Clyde 1983), and has been evaluated by European waste disposal R&D programs in the past few decades (for example, Grundfelt and Crawford 2014; Grundfelt 2010). Deep injection of wastewater including hazardous wastes is ongoing in the U.S. and regulated by the Environmental Protection Agency (EPA 2001). The DBFT is being conducted with a view to use of the DBD concept for future disposal of smaller-quantity, DOE-managed wastes from nuclear weapons production (i.e., Cs/Sr capsules and granular solid wastes). However, the concept may also have broader applicability for nations that need to dispose of limited amounts of spent fuel from nuclear power reactors.

  20. Near-field Testing of the 15-meter Model of the Hoop Column Antenna

    NASA Technical Reports Server (NTRS)

    Hoover, J.; Kefauver, N.; Cencich, T.; Osborn, J.; Osmanski, J.

    1986-01-01

    The technical results from near-field testing of the 15-meter model of the hoop column antenna at the Martin Marietta Denver Aerospace facility are documented. The antenna consists of a deployable central column and a 15 meter hoop, stiffened by cables into a structure with a high tolerance repeatable surface and offset feed location. The surface has been configured to have four offset parabolic apertures, each about 6 meters in diameter, and is made of gold plated molybdenum wire mesh. Pattern measurements were made with feed systems radiating at frequencies of 7.73, 11.60, 2.27, 2.225, and 4.26 GHz. This report (Volume 1) covers the testing from an overall viewpoint and contains information of general interest for testing large antennas. This volume discusses the deployment of the antenna in the Martin facility and the measurements to determine mechanical stability and trueness of the reflector surface, gives the test program outline, and gives a synopsis of antenna electromagnetic performance. Three techniques for measuring surface mechanical tolerances were used (theodolites, metric cameras, and near-field phase), but only the near-field phase approach is included. The report also includes an error analysis. A detailed listing of the antenna patterns is provided for the 2.225 GHz feed in Volume 3 of this report, and for all other feeds in Volume 2.

  1. A self-sampling method to obtain large volumes of undiluted cervicovaginal secretions.

    PubMed

    Boskey, Elizabeth R; Moench, Thomas R; Hees, Paul S; Cone, Richard A

    2003-02-01

    Studies of vaginal physiology and pathophysiology sometimes require larger volumes of undiluted cervicovaginal secretions than can be obtained by current methods. A convenient method for self-sampling these secretions outside a clinical setting can facilitate such studies of reproductive health. The goal was to develop a vaginal self-sampling method for collecting large volumes of undiluted cervicovaginal secretions. A menstrual collection device (the Instead cup) was inserted briefly into the vagina to collect secretions that were then retrieved from the cup by centrifugation in a 50-ml conical tube. All 16 women asked to perform this procedure found it feasible and acceptable. Among 27 samples, an average of 0.5 g of secretions (range, 0.1-1.5 g) was collected. This is a rapid and convenient self-sampling method for obtaining relatively large volumes of undiluted cervicovaginal secretions. It should prove suitable for a wide range of assays, including those involving sexually transmitted diseases, microbicides, vaginal physiology, immunology, and pathophysiology.

  2. Validation and Parameter Sensitivity Tests for Reconstructing Swell Field Based on an Ensemble Kalman Filter

    PubMed Central

    Wang, Xuan; Tandeo, Pierre; Fablet, Ronan; Husson, Romain; Guan, Lei; Chen, Ge

    2016-01-01

    The swell propagation model built on geometric optics is known to work well when simulating radiated swells from a far located storm. Based on this simple approximation, satellites have acquired plenty of large samples on basin-traversing swells induced by fierce storms situated in mid-latitudes. How to routinely reconstruct swell fields with these irregularly sampled observations from space via known swell propagation principle requires more examination. In this study, we apply 3-h interval pseudo SAR observations in the ensemble Kalman filter (EnKF) to reconstruct a swell field in ocean basin, and compare it with buoy swell partitions and polynomial regression results. As validated against in situ measurements, EnKF works well in terms of spatial–temporal consistency in far-field swell propagation scenarios. Using this framework, we further address the influence of EnKF parameters, and perform a sensitivity analysis to evaluate estimations made under different sets of parameters. Such analysis is of key interest with respect to future multiple-source routinely recorded swell field data. Satellite-derived swell data can serve as a valuable complementary dataset to in situ or wave re-analysis datasets. PMID:27898005
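
    For readers unfamiliar with the EnKF machinery used in this record, a minimal stochastic-EnKF analysis step can be sketched as follows. This is a generic textbook sketch, not the swell-field implementation from the paper; the state dimension, observation operator, and error covariances are placeholders.

    ```python
    import numpy as np

    def enkf_update(X, y, H, R, rng):
        """One stochastic EnKF analysis step.
        X : (n, N) forecast ensemble (n state variables, N members)
        y : (m,) observation vector
        H : (m, n) linear observation operator
        R : (m, m) observation-error covariance"""
        N = X.shape[1]
        A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
        P = A @ A.T / (N - 1)                          # sample covariance
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
        # Perturb the observation for each member (stochastic EnKF)
        Y = y[:, None] + rng.multivariate_normal(np.zeros(y.size), R, size=N).T
        return X + K @ (Y - H @ X)                     # analysis ensemble
    ```

    In the paper's setting, the ensemble members would carry the swell-field state and H would map that state to the SAR-derived swell observations.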

  3. Effects of environmental radiation on testes and spermatogenesis in wild large Japanese field mice (Apodemus speciosus) from Fukushima

    PubMed Central

    Okano, Tsukasa; Ishiniwa, Hiroko; Onuma, Manabu; Shindo, Junji; Yokohata, Yasushi; Tamaoki, Masanori

    2016-01-01

    The Fukushima Daiichi Nuclear Power Plant (FDNPP) accident that occurred after the Great East Japan Earthquake in March 2011 released large quantities of radionuclides to the environment. The long-term effects of radioactive cesium (Cs) on biota are of particular concern. We investigated the accumulation of radioactive Cs derived from the FDNPP accident, and chronic effects of environmental radionuclides on male reproduction, in the large Japanese field mouse (Apodemus speciosus). In 2013 and 2014, wild mice were captured at 2 sites in Fukushima Prefecture and at 2 control sites that were distant from Fukushima. Although the median concentrations of 134Cs and 137Cs in the mice from Fukushima exceeded 4,000 Bq/kg, there were no significant differences in the apoptotic cell frequencies or the frequencies of morphologically abnormal sperm among the capture sites. Thus, we conclude that radiation did not cause substantial male subfertility in Fukushima during 2013 and 2014, and radionuclide pollution levels in the study sites would not be detrimental to spermatogenesis of the wild mice in Fukushima. PMID:27005329

  4. Measurement of physical performance by field tests in programs of cardiac rehabilitation: a systematic review and meta-analysis.

    PubMed

    Travensolo, Cristiane; Goessler, Karla; Poton, Roberto; Pinto, Roberta Ramos; Polito, Marcos Doederlein

    2018-04-13

    The literature concerning the effects of cardiac rehabilitation (CR) on field test results is inconsistent. The objective was to perform a systematic review with meta-analysis of field test results after CR programs. Studies published in the PubMed and Web of Science databases until May 2016 were analyzed. The standardized difference in means corrected for bias (Hedges' g) was used as the effect size (g) to measure the amount of change in field test performance after the CR period. Potential differences between subgroups were analyzed by Q-test based on ANOVA. Fifteen studies published between 1996 and 2016 were included in the review, totalling 932 patients with ages ranging from 54.4 to 75.3 years. Fourteen studies used the six-minute walk test to evaluate exercise capacity and one study used the Shuttle Walk Test. The random-effects Hedges' g was 0.617 (P<0.001), representing an improvement of 20% in field test performance after CR. The meta-regression showed a significant association (P=0.01) with aerobic exercise duration, i.e., for each 1-min increase in aerobic exercise duration, there is a 0.02 increase in effect size for field test performance. Field tests can detect physical changes after CR, and longer aerobic exercise duration during CR was associated with better results. Copyright © 2018 Sociedade Portuguesa de Cardiologia. Publicado por Elsevier España, S.L.U. All rights reserved.
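
    The effect size used above, Hedges' g, is a standardized mean difference with a small-sample bias correction. The sketch below uses hypothetical six-minute-walk-test summary statistics, not data from the review:

    ```python
    import math

    def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
        """Bias-corrected standardized mean difference (Hedges' g)."""
        df = n1 + n2 - 2
        s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
        d = (mean1 - mean2) / s_pooled        # Cohen's d
        correction = 1 - 3 / (4 * df - 1)     # small-sample correction J
        return correction * d

    # Hypothetical post- vs pre-rehabilitation 6MWT distances (metres)
    g = hedges_g(520, 90, 25, 470, 90, 25)
    ```

    The correction factor J shrinks Cohen's d slightly, which matters most for the small trial arms typical of CR studies.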

  5. Comparison of two sampling and culture systems for detection of Salmonella enterica in the environment of a large animal hospital.

    PubMed

    Ruple-Czerniak, A; Bolte, D S; Burgess, B A; Morley, P S

    2014-07-01

    Nosocomial salmonellosis is an important problem in veterinary hospitals that treat horses and other large animals. Detection and mitigation of outbreaks and prevention of healthcare-associated infections often require detection of Salmonella enterica in the hospital environment. To compare 2 previously published methods for detecting environmental contamination with S. enterica in a large animal veterinary teaching hospital. Hospital-based comparison of environmental sampling techniques. A total of 100 pairs of environmental samples were collected from stalls used to house large animal cases (horses, cows or New World camelids) that were confirmed to be shedding S. enterica by faecal culture. Stalls were cleaned and disinfected prior to sampling, and the same areas within each stall were sampled for the paired samples. One method of detection used sterile, premoistened sponges that were cultured using thioglycolate enrichment before plating on XLT-4 agar. The other method used electrostatic wipes that were cultured using buffered peptone water, tetrathionate and Rappaport-Vassiliadis R10 broths before plating on XLT-4 agar. Salmonella enterica was recovered from 14% of samples processed using the electrostatic wipe sampling and culture procedure, whereas S. enterica was recovered from only 4% of samples processed using the sponge sampling and culture procedure. There was test agreement for 85 pairs of culture-negative samples and 3 pairs of culture-positive samples. However, the remaining 12 pairs of samples with discordant results created significant disagreement between the 2 detection methods (P<0.01). Persistence of Salmonella in the environment of veterinary hospitals can occur even with rigorous cleaning and disinfection. Use of sensitive methods for detection of environmental contamination is critical when detecting and mitigating this problem in veterinary hospitals. These results suggest that the electrostatic wipe sampling and culture method was

  6. Protocol to obtain targeted transcript sequence data from snake venom samples collected in the Colombian field.

    PubMed

    Fonseca, Alejandra; Renjifo-Ibáñez, Camila; Renjifo, Juan Manuel; Cabrera, Rodrigo

    2018-03-21

    Snake venoms are a mixture of different molecules that can be used in the design of drugs for various diseases. The study of these venoms has relied on strategies that use complete venom extracted from animals in captivity or from venom glands that require the sacrifice of the animals. Colombia, a country with political and geographical conflicts, has difficult access to certain regions. A strategy that avoids the sacrifice of animals and allows the study of samples collected in the field is therefore necessary. We report the use of lyophilized venom from Crotalus durissus cumanensis as a model to test, for the first time, a protocol for the amplification of complete toxins from Colombian venom samples collected in the field. In this protocol, primers were designed from conserved regions of Crotalus sp. mRNA and EST sequences to maximize the likelihood of coding sequence amplification. We obtained the sequences of Metalloproteinases II, Disintegrins, Disintegrin-Like proteins, Phospholipases A2, C-type Lectins and Serine proteinases from Crotalus durissus cumanensis and compared them to different Crotalus sp. sequences available in databases, obtaining concordance between the toxins amplified and those reported. Our strategy allows the use of lyophilized venom to obtain complete toxin sequences from samples collected in the field and the study of poorly characterized venoms in challenging environments. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. 18/20 T high magnetic field scanning tunneling microscope with fully low voltage operability, high current resolution, and large scale searching ability.

    PubMed

    Li, Quanfeng; Wang, Qi; Hou, Yubin; Lu, Qingyou

    2012-04-01

    We present a home-built 18/20 T high magnetic field scanning tunneling microscope (STM) featuring fully low voltage (lower than ±15 V) operability at low temperatures, large scale searching ability, and 20 fA high current resolution (measured by using a 100 GOhm dummy resistor to replace the tip-sample junction) with a bandwidth of 3.03 kHz. To accomplish low voltage operation, which is important in achieving high precision, low noise, and low interference with the strong magnetic field, the coarse approach is implemented with an inertial slider driven by the lateral bending of a piezoelectric scanner tube (PST) whose inner electrode is axially split into two for enhanced bending per volt. The PST can also drive the same sliding piece to inertially slide in the other bending direction (along the sample surface) of the PST, which realizes the large area searching ability. The STM head is housed in a three-segment tubular chamber, which is detachable near the STM head for the convenience of sample and tip changes. Atomic resolution images of a graphite sample taken under 17.6 T and 18.0001 T are presented to show its performance. © 2012 American Institute of Physics.

  8. TESTING FOR A LARGE LOCAL VOID BY INVESTIGATING THE NEAR-INFRARED GALAXY LUMINOSITY FUNCTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keenan, R. C.; Wang, W.-H.; Barger, A. J.

    2012-08-01

    Recent cosmological modeling efforts have shown that a local underdensity on scales of a few hundred Mpc (out to z ~ 0.1) could produce the apparent acceleration of the expansion of the universe observed via Type Ia supernovae. Several studies of galaxy counts in the near-infrared (NIR) have found that the local universe appears underdense by ~25%-50% compared with regions a few hundred Mpc distant. Galaxy counts at low redshifts sample primarily L ~ L* galaxies. Thus, if the local universe is underdense, then the normalization of the NIR galaxy luminosity function (LF) at z > 0.1 should be higher than that measured for z < 0.1. Here we present a highly complete (>90%) spectroscopic sample of 1436 galaxies selected in the H band (1.6 μm) to study the normalization of the NIR LF at 0.1 < z < 0.3 and address the question of whether or not we reside in a large local underdensity. Our survey sample consists of all galaxies brighter than 18th magnitude in the H band drawn from six widely separated fields at high Galactic latitudes, which cover a total of ~2 deg² on the sky. We find that for the combination of our six fields, the product φ*L* at 0.1 < z < 0.3 is ~30% higher than that measured at lower redshifts. While our statistical errors in this measurement are on the ~10% level, we find the systematics due to cosmic variance may be larger still. We investigate the effects of cosmic variance on our measurement using the COSMOS cone mock catalogs from the Millennium Simulation and recent empirical estimates of cosmic variance. We find that our survey is subject to systematic uncertainties due to cosmic variance at the 15% level (1σ), representing an improvement by a factor of ~2 over previous studies in this redshift range. We conclude that observations cannot yet rule out the possibility that the local universe is underdense at z < 0.1. The fields studied in this work have a large amount of

  9. Sampling and Reconstruction of the Pupil and Electric Field for Phase Retrieval

    NASA Technical Reports Server (NTRS)

    Dean, Bruce; Smith, Jeffrey; Aronstein, David

    2012-01-01

    This technology is based on sampling considerations for a band-limited function, which have application to optical estimation generally, and to phase retrieval specifically. The analysis begins with the observation that the Fourier transform of an optical aperture function (pupil) can be implemented with minimal aliasing for Q values down to Q = 1. The sampling ratio, Q, is defined as the ratio of the sampling frequency to the band-limited cut-off frequency. The analytical results are given using a 1-d aperture function, with the electric field defined by the band-limited sinc(x) function. Perfect reconstruction of the Fourier transform (electric field) is derived using the Whittaker-Shannon sampling theorem for sampling ratios in the range 1 < Q < 2, yielding the electric field with no aliasing; the result has been extended to 2-d optical apertures.
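
    The Whittaker-Shannon reconstruction invoked above can be demonstrated numerically. The example below is a generic 1-d sketch (a pure tone with a hypothetical sample rate), not the optics-specific implementation from this record:

    ```python
    import numpy as np

    def sinc_reconstruct(samples, T, t):
        """Whittaker-Shannon interpolation: rebuild a band-limited signal
        from uniform samples taken at spacing T, evaluated at times t."""
        n = np.arange(len(samples))
        # sum_n x[n] * sinc((t - n*T) / T), using numpy's normalized sinc
        return np.sum(samples[None, :] * np.sinc((t[:, None] - n * T) / T), axis=1)

    # A 3 Hz tone sampled at 10 Hz: the sampling ratio Q (sampling
    # frequency over cut-off frequency) is comfortably above 1, so
    # interior points are recovered almost exactly (edges suffer
    # truncation of the infinite sinc sum).
    T = 0.1
    x = np.sin(2 * np.pi * 3.0 * np.arange(400) * T)
    t = np.linspace(15.0, 25.0, 101)       # interior of the record
    recon = sinc_reconstruct(x, T, t)
    ```

    Truncation error grows as Q approaches 1, which is why the minimal-aliasing claim at Q = 1 is the interesting limiting case.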

  10. Search for life on Mars in surface samples: Lessons from the 1999 Marsokhod rover field experiment

    USGS Publications Warehouse

    Newsom, Horton E.; Bishop, J.L.; Cockell, C.; Roush, T.L.; Johnson, J. R.

    2001-01-01

    The Marsokhod 1999 field experiment in the Mojave Desert included a simulation of a rover-based sample selection mission. As part of this mission, a test was made of strategies and analytical techniques for identifying past or present life in environments expected to be present on Mars. A combination of visual clues from high-resolution images and the detection of an important biomolecule (chlorophyll) with visible/near-infrared (NIR) spectroscopy led to the successful identification of a rock with evidence of cryptoendolithic organisms. The sample was identified in high-resolution images (3 times the resolution of the Imager for Mars Pathfinder camera) on the basis of a green tinge and textural information suggesting the presence of a thin, partially missing exfoliating layer revealing the organisms. The presence of chlorophyll bands in similar samples was observed in visible/NIR spectra of samples in the field and later confirmed in the laboratory using the same spectrometer. Raman spectroscopy in the laboratory, simulating a remote measurement technique, also detected evidence of carotenoids in samples from the same area. Laboratory analysis confirmed that the subsurface layer of the rock is inhabited by a community of coccoid Chroococcidiopsis cyanobacteria. The identification of minerals in the field, including carbonates and serpentine, that are associated with aqueous processes was also demonstrated using the visible/NIR spectrometer. Other lessons learned that are applicable to future rover missions include the benefits of web-based programs for target selection and for daily mission planning and the need for involvement of the science team in optimizing image compression schemes based on the retention of visual signature characteristics. Copyright 2000 by the American Geophysical Union.

  11. Measurement of the UH-60A Hub Large Rotor Test Apparatus Control System Stiffness

    NASA Technical Reports Server (NTRS)

    Kufeld, Robert M.

    2014-01-01

    The purpose of this report is to provide details of the measurement of the control system stiffness of the UH-60A rotor hub mounted on the Large Rotor Test Apparatus (UH-60A/LRTA). The UH-60A/LRTA was used in the 40- by 80-Foot Wind Tunnel to complete the full-scale wind tunnel test portion of the NASA/Army UH-60A Airloads Program. This report describes the LRTA control system and highlights the differences between the LRTA and the UH-60A aircraft. The test hardware, test setup, and test procedures are also described. Sample results are shown, including the azimuthal variation of the measured control system stiffness for three different loadings and two different dynamic actuator settings. Finally, the azimuthal stiffness is converted to fixed system values using multi-blade transformations for input to comprehensive rotorcraft prediction codes.

  12. Large-scale vortices in compressible turbulent medium with the magnetic field

    NASA Astrophysics Data System (ADS)

    Gvaramadze, V. V.; Dimitrov, B. G.

    1990-08-01

    An averaged equation which describes the large scale vortices and Alfven waves generation in a compressible helical turbulent medium with a constant magnetic field is presented. The presence of the magnetic field leads to anisotropization of the vortex generation. Possible applications of the anisotropic vortex dynamo effect are accretion disks of compact objects.

  13. The large N limit of superconformal field theories and supergravity

    NASA Astrophysics Data System (ADS)

    Maldacena, Juan

    1999-07-01

    We show that the large N limit of certain conformal field theories in various dimensions includes in their Hilbert space a sector describing supergravity on the product of Anti-deSitter spacetimes, spheres and other compact manifolds. This is shown by taking some branes in the full M/string theory and then taking a low energy limit where the field theory on the brane decouples from the bulk. We observe that, in this limit, we can still trust the near horizon geometry for large N. The enhanced supersymmetries of the near horizon geometry correspond to the extra supersymmetry generators present in the superconformal group (as opposed to just the super-Poincare group). The 't Hooft limit of 3+1 dimensional N=4 super-Yang-Mills at the conformal point is shown to contain strings: they are IIB strings. We conjecture that compactifications of M/string theory on various Anti-deSitter spacetimes are dual to various conformal field theories. This leads to a new proposal for a definition of M-theory which could be extended to include five non-compact dimensions.

  14. Development of a field testing protocol for identifying Deepwater Horizon oil spill residues trapped near Gulf of Mexico beaches.

    PubMed

    Han, Yuling; Clement, T Prabhakar

    2018-01-01

    The Deepwater Horizon (DWH) accident, one of the largest oil spills in U.S. history, contaminated several beaches located along the Gulf of Mexico (GOM) shoreline. The residues from the spill still continue to be deposited on some of these beaches. Methods to track and monitor the fate of these residues require approaches that can differentiate the DWH residues from other types of petroleum residues. This is because, historically, the crude oil released from sources such as natural seeps and anthropogenic discharges have also deposited other types of petroleum residues on GOM beaches. Therefore, identifying the origin of these residues is critical for developing effective management strategies for monitoring the long-term environmental impacts of the DWH oil spill. Advanced fingerprinting methods that are currently used for identifying the source of oil spill residues require detailed laboratory studies, which can be cost-prohibitive. Also, most agencies typically use untrained workers or volunteers to conduct shoreline monitoring surveys and these workers will not have access to advanced laboratory facilities. Furthermore, it is impractical to routinely fingerprint large volumes of samples that are collected after a major oil spill event, such as the DWH spill. In this study, we propose a simple field testing protocol that can identify DWH oil spill residues based on their unique physical characteristics. The robustness of the method is demonstrated by testing a variety of oil spill samples, and the results are verified by characterizing the samples using advanced chemical fingerprinting methods. The verification data show that the method yields results that are consistent with the results derived from advanced fingerprinting methods. The proposed protocol is a reliable, cost-effective, practical field approach for differentiating DWH residues from other types of petroleum residues.

  15. Development of a field testing protocol for identifying Deepwater Horizon oil spill residues trapped near Gulf of Mexico beaches

    PubMed Central

    Han, Yuling

    2018-01-01

    The Deepwater Horizon (DWH) accident, one of the largest oil spills in U.S. history, contaminated several beaches located along the Gulf of Mexico (GOM) shoreline. The residues from the spill still continue to be deposited on some of these beaches. Methods to track and monitor the fate of these residues require approaches that can differentiate the DWH residues from other types of petroleum residues. This is because, historically, the crude oil released from sources such as natural seeps and anthropogenic discharges have also deposited other types of petroleum residues on GOM beaches. Therefore, identifying the origin of these residues is critical for developing effective management strategies for monitoring the long-term environmental impacts of the DWH oil spill. Advanced fingerprinting methods that are currently used for identifying the source of oil spill residues require detailed laboratory studies, which can be cost-prohibitive. Also, most agencies typically use untrained workers or volunteers to conduct shoreline monitoring surveys and these workers will not have access to advanced laboratory facilities. Furthermore, it is impractical to routinely fingerprint large volumes of samples that are collected after a major oil spill event, such as the DWH spill. In this study, we propose a simple field testing protocol that can identify DWH oil spill residues based on their unique physical characteristics. The robustness of the method is demonstrated by testing a variety of oil spill samples, and the results are verified by characterizing the samples using advanced chemical fingerprinting methods. The verification data show that the method yields results that are consistent with the results derived from advanced fingerprinting methods. The proposed protocol is a reliable, cost-effective, practical field approach for differentiating DWH residues from other types of petroleum residues. PMID:29329313

  16. Urine sampling and collection system optimization and testing

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Geating, J. A.; Koesterer, M. G.

    1975-01-01

    A Urine Sampling and Collection System (USCS) engineering model was developed to provide for the automatic collection, volume sensing and sampling of urine from each micturition. The purpose of the engineering model was to demonstrate verification of the system concept. The objective of the optimization and testing program was to update the engineering model, to provide additional performance features and to conduct system testing to determine operational problems. Optimization tasks were defined as modifications to minimize system fluid residual and addition of thermoelectric cooling.

  17. On the BV formalism of open superstring field theory in the large Hilbert space

    NASA Astrophysics Data System (ADS)

    Matsunaga, Hiroaki; Nomura, Mitsuru

    2018-05-01

    We construct several BV master actions for open superstring field theory in the large Hilbert space. First, we show that a naive use of the conventional BV approach breaks down at the third order of the antifield number expansion, although it enables us to define a simple "string antibracket" taking the Darboux form as spacetime antibrackets. This fact implies that in the large Hilbert space, "string fields-antifields" should be reassembled to obtain master actions in a simple manner. We determine the assembly of the string anti-fields on the basis of Berkovits' constrained BV approach, and give solutions to the master equation defined by Dirac antibrackets on the constrained string field-antifield space. It is expected that partial gauge-fixing enables us to relate superstring field theories based on the large and small Hilbert spaces directly: reassembling string fields-antifields is rather natural from this point of view. Finally, inspired by these results, we revisit the conventional BV approach and construct a BV master action based on the minimal set of string fields-antifields.

  18. Report on Electrochemcial Corrosion Testing of 241-SY-102 Grab Samples from the 2012 Grab Sampling Campaign

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wyrwas, Richard B.; Lamothe, Margaret E.

    2013-05-30

    This report describes the results of the electrochemical testing performed on tank 241-SY-102 (SY-102) grab samples that were collected in support of corrosion mitigation. The objective of the work presented here was to determine the corrosion resistance of tank SY-102 to the grab samples collected, using electrochemical methods at up to 50°C, as well as to satisfy data quality objectives. Grab samples were collected at multiple elevations from Riser 003. The electrochemical corrosion testing was planned to consist of linear polarization resistance (LPR) testing and cyclic potentiodynamic polarization (CPP) testing at 50°C. The temperature would be lowered to 40°C and the test repeated if the CPP curve indicated pitting corrosion at 50°C. If no pitting was indicated by the CPP curve, then a duplicate scan would be repeated at 50°C to confirm the first result. The testing would be complete if the duplicate CPP scan was consistent with the first. This report contains the CPP results of the testing of grab sample 2SY-12-03 and the 2SY-12-03DUP composite sample tested under these conditions. There was no indication of pitting at 50°C, and the duplicate scan was in agreement with the first scan. Since no further testing was required, a third scan with a shorter rest time was performed and is presented in this report.

  19. Testing of Photomultiplier Tubes in a Magnetic Field

    NASA Astrophysics Data System (ADS)

    Waldron, Zachary; A1 Collaboration

    2016-09-01

    The A1 collaboration at MAMI in Mainz, Germany has designed a neutron detector that can be used in experiments to measure the electric form factor of the neutron. They will measure elastic scattering from the neutron, using the polarized electron beam from MAMI at A1's experimental hall. The detector will be composed of two walls of staggered scintillator bars which will be read out by photomultiplier tubes (PMTs) connected to both ends of each scintillator via light guides. The experiment requires a magnetic field with a strength of 1 Tesla, 2 m away from the first scintillator wall. The resulting fringe field is sufficient to disrupt the PMTs, despite the addition of Mu-metal shielding. The effects of the fringe field on these PMTs were tested to optimize their amplification. A Helmholtz coil was designed to generate a controlled magnetic field with strength equivalent to the field that the PMTs will encounter. The PMTs were read out using a multi-channel analyzer and tested at various angles relative to the magnetic field in order to determine the optimal orientation for minimizing signal disruption. Tests were also performed to determine the neutron detector's response to cosmic radiation and the best method for measuring a magnetic field's strength in two dimensions. National Science Foundation Grant No. IIA-1358175.

  20. Field trials of line transect methods applied to estimation of desert tortoise abundance

    USGS Publications Warehouse

    Anderson, David R.; Burnham, Kenneth P.; Lubow, Bruce C.; Thomas, L. E. N.; Corn, Paul Stephen; Medica, Philip A.; Marlow, R.W.

    2001-01-01

    We examine the degree to which field observers can meet the assumptions underlying line transect sampling to monitor populations of desert tortoises (Gopherus agassizii). We present the results of 2 field trials using artificial tortoise models in 3 size classes. The trials were conducted on 2 occasions on an area south of Las Vegas, Nevada, where the density of the test population was known. In the first trials, conducted largely by experienced biologists who had been involved in tortoise surveys for many years, the density of adult tortoise models was well estimated (-3.9% bias), while the bias was higher (-20%) for subadult tortoise models. The bias for combined data was -12.0%. The bias was largely attributed to the failure to detect all tortoise models on or near the transect centerline. The second trials were conducted with a group of largely inexperienced student volunteers and used somewhat different searching methods, and the results were similar to the first trials. Estimated combined density of subadult and adult tortoise models had a negative bias (-7.3%), again attributable to failure to detect some models on or near the centerline. Experience in desert tortoise biology, either comparing the first and second trials or in the second trial with 2 experienced biologists versus 16 novices, did not have an apparent effect on the quality of the data or the accuracy of the estimates. Observer training, specific to line transect sampling, and field testing are important components of a reliable survey. Line transect sampling represents a viable method for large-scale monitoring of populations of desert tortoise; however, field protocol must be improved to assure the key assumptions are met.
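    The bias figures above compare density estimates against a known true density. As context, a minimal sketch of the distance-sampling arithmetic is given below, assuming a half-normal detection function; this is illustrative only and not the authors' exact analysis (function names are hypothetical).

```python
from math import pi, sqrt

def density_halfnormal(perp_distances, total_line_length):
    """Line-transect density estimate with a half-normal detection function:
    the sigma^2 MLE is the mean squared perpendicular distance; the effective
    strip half-width is mu = sigma * sqrt(pi / 2); density D = n / (2 * L * mu)."""
    n = len(perp_distances)
    sigma = sqrt(sum(x * x for x in perp_distances) / n)
    mu = sigma * sqrt(pi / 2)
    return n / (2 * total_line_length * mu)

def percent_relative_bias(estimate, truth):
    """Percent relative bias, as in the -12.0% figure for combined data."""
    return 100 * (estimate - truth) / truth
```

Missed objects on or near the centerline shrink both n and the fitted detection width, which is why centerline detection failures translate directly into negative bias.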

  1. Decision Models for Determining the Optimal Life Test Sampling Plans

    NASA Astrophysics Data System (ADS)

    Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Strelchonok, Vladimir F.

    2010-11-01

    A life test sampling plan is a technique comprising sampling, inspection, and decision making for determining the acceptance or rejection of a batch of products by experiments examining the continuous usage time of the products. In life testing studies, the lifetime is usually assumed to follow either a one-parameter exponential distribution or a two-parameter Weibull distribution with the shape parameter assumed known. Such oversimplified assumptions facilitate the follow-up analyses but may overlook the fact that the lifetime distribution can significantly affect the estimation of the failure rate of a product. Moreover, sampling costs, inspection costs, warranty costs, and rejection costs are all essential and ought to be considered in choosing an appropriate sampling plan. The choice of an appropriate life test sampling plan is a crucial decision problem because a good plan not only helps producers save testing time and reduce testing cost, but also positively affects the image of the product and thus attracts more consumers. This paper develops frequentist (non-Bayesian) decision models for determining the optimal life test sampling plans with an aim of cost minimization, by identifying the appropriate number of product failures in a sample that should be used as a threshold in judging the rejection of a batch. The two-parameter exponential and Weibull distributions, each with two unknown parameters, are assumed appropriate for modelling the lifetime of a product. A practical numerical application demonstrates the proposed approach.

  2. HIV testing among at-risk adolescents and young adults: a prospective analysis of a community sample.

    PubMed

    Tolou-Shams, Marina; Payne, Nanetta; Houck, Christopher; Pugatch, David; Beausoleil, Nancy; Brown, Larry K

    2007-12-01

    Little is known about predictors of human immunodeficiency virus (HIV) testing among sexually active adolescents, who account for a large proportion of new HIV infections. This study sought to determine predictors of HIV testing among a large community-based sample of adolescents in three cities who had recent unprotected sexual intercourse. Sexually active adolescents (N = 1222) completed baseline and 3-month assessments of sexual behavior, substance use, and HIV testing behaviors as part of a larger, multi-site, brief HIV prevention program. Approximately half of the adolescents reported having previously been tested for HIV, and of those, one third were tested in the next 3 months without a specific intervention. Adolescents who received HIV testing were more likely at baseline to have ever been tested, to have an STI diagnosis, to have not used substances during sex, and to have been assertive about condom use with a partner. Health care models encouraging more widespread, universal testing may be an important public health initiative to curb the spread of HIV. Regular HIV screenings provide an opportunity to enhance awareness of behavioral risk and HIV status, as well as opportunities for early detection and care.

  3. Field Tests of Real-time In-situ Dissolved CO2 Monitoring for CO2 Leakage Detection in Groundwater

    NASA Astrophysics Data System (ADS)

    Yang, C.; Zou, Y.; Delgado, J.; Guzman, N.; Pinedo, J.

    2016-12-01

    Groundwater monitoring for detecting CO2 leakage relies on groundwater sampling from water wells drilled into aquifers. Usually groundwater samples must be collected periodically in the field and analyzed in the laboratory, making sampling labor- and cost-intensive for long-term monitoring of large areas. Potential damage and contamination of water samples during the sampling process can degrade accuracy, and intermittent monitoring may miss changes in the geochemical parameters of groundwater, and therefore signs of CO2 leakage. Real-time in-situ monitoring of geochemical parameters with chemical sensors may play an important role in CO2 leakage detection in groundwater at a geological carbon sequestration site. This study presents a field demonstration of a real-time in-situ monitoring system capable of covering large areas, detecting low levels of dissolved CO2 in groundwater, and reliably differentiating natural variations of dissolved CO2 concentration from small changes resulting from leakage. The stand-alone system includes fully distributed fiber optic sensors for carbon dioxide detection, based on a unique sensor technology developed by Intelligent Optical Systems. The systems were deployed at two research sites: the Brackenridge Field Laboratory, where the aquifer is shallow at depths of 10-20 ft below the surface, and the Devine site, where the aquifer is much deeper at depths of 140 to 150 ft. Groundwater samples were periodically collected from the water wells in which the chemical sensors were installed and compared to the measurements of the chemical sensors. Our study shows that geochemical monitoring of dissolved CO2 with fiber optic sensors can provide reliable CO2 leakage signal detection in groundwater as long as CO2 leakage signals are stronger than the background noise at the monitoring locations.

  4. A single test for rejecting the null hypothesis in subgroups and in the overall sample.

    PubMed

    Lin, Yunzhi; Zhou, Kefei; Ganju, Jitendra

    2017-01-01

    In clinical trials, some patient subgroups are likely to demonstrate larger effect sizes than other subgroups. For example, the effect size, or informally the benefit with treatment, is often greater in patients with a moderate condition of a disease than in those with a mild condition. A limitation of the usual method of analysis is that it does not incorporate this ordering of effect size by patient subgroup. We propose a test statistic which supplements the conventional test by including this information and simultaneously tests the null hypothesis in pre-specified subgroups and in the overall sample. It results in more power than the conventional test when the differences in effect sizes across subgroups are at least moderately large; otherwise it loses power. The method involves combining p-values from models fit to pre-specified subgroups and the overall sample in a manner that assigns greater weight to subgroups in which a larger effect size is expected. Results are presented for randomized trials with two and three subgroups.
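    The weighting scheme described above resembles Stouffer's weighted Z method for combining p-values. The sketch below is an illustrative stand-in for the authors' statistic, not their exact test; the function name and the example weights are assumptions.

```python
from math import sqrt
from statistics import NormalDist

def weighted_combined_p(pvals, weights):
    """Combine one-sided p-values with pre-specified weights (weighted Stouffer Z)."""
    nd = NormalDist()
    z_scores = [nd.inv_cdf(1 - p) for p in pvals]  # convert each p-value to a z-score
    z_comb = sum(w * z for w, z in zip(weights, z_scores)) / sqrt(
        sum(w * w for w in weights))               # weighted combination, unit variance
    return 1 - nd.cdf(z_comb)                      # back-transform to a p-value

# Greater weight on the subgroup expected to show the larger effect:
p = weighted_combined_p([0.03, 0.20], [2.0, 1.0])
```

As the abstract notes, such a statistic gains power when the pre-specified ordering of effect sizes holds, and loses power when the weights are misaligned with the true effects.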

  5. Measurement of field-saturated hydraulic conductivity on fractured rock outcrops near Altamura (Southern Italy) with an adjustable large ring infiltrometer

    USGS Publications Warehouse

    Caputo, Maria C.; de Carlo, L.; Masciopinto, C.; Nimmo, J.R.

    2010-01-01

    Up to now, field studies set up to measure field-saturated hydraulic conductivity to evaluate contamination risks have employed small cylinders that may not be representative of the scale of measurements in heterogeneous media. In this study, a large adjustable ring infiltrometer was designed to be installed on-site directly on rock to measure its field-saturated hydraulic conductivity. The proposed device is inexpensive and simple to implement, yet also very versatile, owing to its large adjustable diameter that can be fixed on-site. It thus allows an improved representation of the natural system's heterogeneity, while also accommodating irregularities in the soil/rock surface. The new apparatus was tested on an outcrop of karstic fractured limestone overlying the deep Murge aquifer in the south of Italy, which has recently been affected by untreated sludge disposal derived from municipal and industrial wastewater treatment plants. The quasi-steady vertical flow into the unsaturated fractures was investigated by measuring water levels during infiltrometer tests. Simultaneously, subsurface electrical resistivity measurements were used to visualize the infiltration of water into the subsoil due to unsaturated water flow in the fractures. The proposed experimental apparatus works well on rock outcrops and allows the repetition of infiltration tests at many locations in order to reduce model uncertainties in heterogeneous media. © 2009 Springer-Verlag.

  6. Posttraumatic Stress Disorder Symptom Clusters and the Interpersonal Theory of Suicide in a Large Military Sample.

    PubMed

    Pennings, Stephanie M; Finn, Joseph; Houtsma, Claire; Green, Bradley A; Anestis, Michael D

    2017-10-01

    Prior studies examining posttraumatic stress disorder (PTSD) symptom clusters and the components of the interpersonal theory of suicide (ITS) have yielded mixed results, likely stemming in part from the use of divergent samples and measurement techniques. This study aimed to expand on these findings by utilizing a large military sample, gold standard ITS measures, and multiple PTSD factor structures. Utilizing a sample of 935 military personnel, hierarchical multiple regression analyses were used to test the association between PTSD symptom clusters and the ITS variables. Additionally, we tested for indirect effects of PTSD symptom clusters on suicidal ideation through thwarted belongingness, conditional on levels of perceived burdensomeness. Results indicated that numbing symptoms are positively associated with both perceived burdensomeness and thwarted belongingness and hyperarousal symptoms (dysphoric arousal in the 5-factor model) are positively associated with thwarted belongingness. Results also indicated that hyperarousal symptoms (anxious arousal in the 5-factor model) were positively associated with fearlessness about death. The positive association between PTSD symptom clusters and suicidal ideation was inconsistent and modest, with mixed support for the ITS model. Overall, these results provide further clarity regarding the association between specific PTSD symptom clusters and suicide risk factors. © 2016 The American Association of Suicidology.

  7. Evaluating noninvasive genetic sampling techniques to estimate large carnivore abundance.

    PubMed

    Mumma, Matthew A; Zieminski, Chris; Fuller, Todd K; Mahoney, Shane P; Waits, Lisette P

    2015-09-01

    Monitoring large carnivores is difficult because of intrinsically low densities and can be dangerous if physical capture is required. Noninvasive genetic sampling (NGS) is a safe and cost-effective alternative to physical capture. We evaluated the utility of two NGS methods (scat detection dogs and hair sampling) to obtain genetic samples for abundance estimation of coyotes, black bears and Canada lynx in three areas of Newfoundland, Canada. We calculated abundance estimates using program capwire, compared sampling costs, and the cost/sample for each method relative to species and study site, and performed simulations to determine the sampling intensity necessary to achieve abundance estimates with coefficients of variation (CV) of <10%. Scat sampling was effective for both coyotes and bears and hair snags effectively sampled bears in two of three study sites. Rub pads were ineffective in sampling coyotes and lynx. The precision of abundance estimates was dependent upon the number of captures/individual. Our simulations suggested that ~3.4 captures/individual will result in a < 10% CV for abundance estimates when populations are small (23-39), but fewer captures/individual may be sufficient for larger populations. We found scat sampling was more cost-effective for sampling multiple species, but suggest that hair sampling may be less expensive at study sites with limited road access for bears. Given the dependence of sampling scheme on species and study site, the optimal sampling scheme is likely to be study-specific warranting pilot studies in most circumstances. © 2015 John Wiley & Sons Ltd.
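    The dependence of precision on captures per individual can be illustrated with a far simpler model than capwire's: assuming equal capture probabilities, abundance can be estimated by solving E[distinct individuals] = N(1 - (1 - 1/N)^t) for N given t total genetic captures. The sketch below is illustrative only and is not program capwire; the function name is hypothetical.

```python
def abundance_equal_capture(k_distinct, t_captures, n_max=1e6):
    """Solve k = N * (1 - (1 - 1/N)**t) for N by bisection.

    Requires k_distinct < t_captures (i.e., some individuals recaptured)."""
    def expected_distinct(n):
        # Expected number of distinct individuals among t equal-probability captures
        return n * (1 - (1 - 1 / n) ** t_captures)

    lo, hi = float(k_distinct), float(n_max)
    for _ in range(200):           # expected_distinct is increasing in n, so bisect
        mid = (lo + hi) / 2
        if expected_distinct(mid) < k_distinct:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# 80 genetic captures resolving to 35 distinct individuals (~2.3 captures/individual)
n_hat = abundance_equal_capture(35, 80)
```

With more captures per individual the curve is pinned down more tightly near the root, which is the intuition behind the ~3.4 captures/individual guideline for a <10% CV in small populations.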

  8. Estimation of sample size and testing power (Part 3).

    PubMed

    Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo

    2011-12-01

    This article introduces the definition and sample size estimation of three special tests (namely, non-inferiority test, equivalence test and superiority test) for qualitative data with the design of one factor with two levels having a binary response variable. Non-inferiority test refers to the research design of which the objective is to verify that the efficacy of the experimental drug is not clinically inferior to that of the positive control drug. Equivalence test refers to the research design of which the objective is to verify that the experimental drug and the control drug have clinically equivalent efficacy. Superiority test refers to the research design of which the objective is to verify that the efficacy of the experimental drug is clinically superior to that of the control drug. By specific examples, this article introduces formulas of sample size estimation for the three special tests, and their SAS realization in detail.
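    As a concrete example of the kind of formula the article presents, the per-group sample size for a non-inferiority comparison of two proportions can be sketched with the standard normal-approximation formula. This is a textbook sketch, not the article's SAS code; the function name and default alpha/power are assumptions.

```python
from math import ceil
from statistics import NormalDist

def n_noninferiority(p_exp, p_ctrl, margin, alpha=0.025, power=0.80):
    """Per-group n for a non-inferiority test of two proportions
    (normal approximation; one-sided alpha; non-inferiority margin > 0)."""
    nd = NormalDist()
    z_alpha = nd.inv_cdf(1 - alpha)     # critical value for one-sided alpha
    z_beta = nd.inv_cdf(power)          # quantile corresponding to target power
    variance = p_exp * (1 - p_exp) + p_ctrl * (1 - p_ctrl)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p_exp - p_ctrl + margin) ** 2)

# Both arms at 80% response, 10% non-inferiority margin:
n = n_noninferiority(0.8, 0.8, 0.1)
```

Equivalence and superiority designs follow the same template with the margin term reinterpreted (two one-sided tests for equivalence; a positive superiority margin in the denominator for superiority).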

  9. The Rights and Responsibility of Test Takers When Large-Scale Testing Is Used for Classroom Assessment

    ERIC Educational Resources Information Center

    van Barneveld, Christina; Brinson, Karieann

    2017-01-01

    The purpose of this research was to identify conflicts in the rights and responsibility of Grade 9 test takers when some parts of a large-scale test are marked by teachers and used in the calculation of students' class marks. Data from teachers' questionnaires and students' questionnaires from a 2009-10 administration of a large-scale test of…

  10. Light sheet theta microscopy for rapid high-resolution imaging of large biological samples.

    PubMed

    Migliori, Bianca; Datta, Malika S; Dupre, Christophe; Apak, Mehmet C; Asano, Shoh; Gao, Ruixuan; Boyden, Edward S; Hermanson, Ola; Yuste, Rafael; Tomer, Raju

    2018-05-29

    Advances in tissue clearing and molecular labeling methods are enabling unprecedented optical access to large intact biological systems. These developments fuel the need for high-speed microscopy approaches to image large samples quantitatively and at high resolution. While light sheet microscopy (LSM), with its high planar imaging speed and low photo-bleaching, can be effective, scaling up to larger imaging volumes has been hindered by the use of orthogonal light sheet illumination. To address this fundamental limitation, we have developed light sheet theta microscopy (LSTM), which uniformly illuminates samples from the same side as the detection objective, thereby eliminating limits on lateral dimensions without sacrificing the imaging resolution, depth, and speed. We present a detailed characterization of LSTM, and demonstrate its complementary advantages over LSM for rapid high-resolution quantitative imaging of large intact samples with high uniform quality. The reported LSTM approach is a significant step for the rapid high-resolution quantitative mapping of the structure and function of very large biological systems, such as a clarified thick coronal slab of human brain and uniformly expanded tissues, and also for rapid volumetric calcium imaging of highly motile animals, such as Hydra, undergoing non-isomorphic body shape changes.

  11. Planck intermediate results. XLII. Large-scale Galactic magnetic fields

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Adam, R.; Ade, P. A. R.; Alves, M. I. R.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Battaner, E.; Benabed, K.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Chiang, H. C.; Christensen, P. R.; Colombo, L. P. L.; Combet, C.; Couchot, F.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Dickinson, C.; Diego, J. M.; Dolag, K.; Doré, O.; Ducout, A.; Dupac, X.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Ferrière, K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Galeotta, S.; Ganga, K.; Ghosh, T.; Giard, M.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Harrison, D. L.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hobson, M.; Hornstrup, A.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kisner, T. S.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; Melchiorri, A.; Mennella, A.; Migliaccio, M.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Nørgaard-Nielsen, H. U.; Oppermann, N.; Orlando, E.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Pasian, F.; Perotto, L.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Pratt, G. W.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Scott, D.; Spencer, L. D.; Stolyarov, V.; Stompor, R.; Strong, A. W.; Sudiwala, R.; Sunyaev, R.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Valenziano, L.; Valiviita, J.; Van Tent, F.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zonca, A.

    2016-12-01

    Recent models for the large-scale Galactic magnetic fields in the literature have been largely constrained by synchrotron emission and Faraday rotation measures. We use three different but representative models to compare their predicted polarized synchrotron and dust emission with that measured by the Planck satellite. We first update these models to match the Planck synchrotron products using a common model for the cosmic-ray leptons. We discuss the impact on this analysis of the ongoing problems of component separation in the Planck microwave bands and of the uncertain cosmic-ray spectrum. In particular, the inferred degree of ordering in the magnetic fields is sensitive to these systematic uncertainties, and we further show the importance of considering the expected variations in the observables in addition to their mean morphology. We then compare the resulting simulated emission to the observed dust polarization and find that the dust predictions do not match the morphology in the Planck data and underpredict the dust polarization away from the plane. We modify one of the models to roughly match both observables at high latitudes by increasing the field ordering in the thin disc near the observer. Though this specific analysis depends on the component separation issues, we present the improved model as a proof of concept for how such studies can be advanced in future using complementary information from ongoing and planned observational projects.

  12. Implementing the Mars Science Laboratory Terminal Descent Sensor Field Test Campaign

    NASA Technical Reports Server (NTRS)

    Montgomery, James F.; Bodie, James H.; Brown, Joseph D.; Chen, Allen; Chen, Curtis W.; Essmiller, John C.; Fisher, Charles D.; Goldberg, Hannah R.; Lee, Steven W.; Shaffer, Scott J.

    2012-01-01

    The Mars Science Laboratory (MSL) will deliver a 900 kg rover to the surface of Mars in August 2012. MSL will utilize a new pulse-Doppler landing radar, the Terminal Descent Sensor (TDS). The TDS employs six narrow-beam antennas to provide unprecedented slant range and velocity performance at Mars to enable soft touchdown of the MSL rover using a unique sky crane Entry, Descent, and Landing (EDL) technique. Prior to use on MSL, the TDS was put through a rigorous verification and validation (V&V) process. A key element of this V&V was operating the TDS over a series of field tests, using flight-like profiles expected during the descent and landing of MSL over Mars-like terrain on Earth. Limits of TDS performance were characterized with additional testing meant to stress operational modes outside of the expected EDL flight profiles. The flight envelope over which the TDS must operate on Mars encompasses such a large range of altitudes and velocities that a variety of venues were necessary to cover the test space. These venues included an F/A-18 high performance aircraft, a Eurocopter AS350 AStar helicopter, and the 100-meter-tall Echo Towers at the China Lake Naval Air Warfare Center. Testing was carried out over a five-year period from July 2006 to June 2011. TDS performance was shown, in general, to be excellent over all venues. This paper describes the planning, design, and implementation of the field test campaign plus results and lessons learned.

  13. An Autosampler and Field Sample Carrier for Maximizing Throughput Using an Open-Air, Surface Sampling Ion Source for MS

    EPA Science Inventory

    A recently developed, commercially available, open-air, surface sampling ion source for mass spectrometers provides individual analyses in several seconds. To realize its full throughput potential, an autosampler and field sample carrier were designed and built. The autosampler ...

  14. Integrated sample-to-detection chip for nucleic acid test assays.

    PubMed

    Prakash, R; Pabbaraju, K; Wong, S; Tellier, R; Kaler, K V I S

    2016-06-01

    Nucleic acid based diagnostic techniques are routinely used for the detection of infectious agents. Most of these assays rely on nucleic acid extraction platforms for the extraction and purification of nucleic acids and a separate real-time PCR platform for quantitative nucleic acid amplification tests (NATs). Several microfluidic lab on chip (LOC) technologies have been developed, where mechanical and chemical methods are used for the extraction and purification of nucleic acids. Microfluidic technologies have also been effectively utilized for chip based real-time PCR assays. However, there are few examples of microfluidic systems which have successfully integrated these two key processes. In this study, we have implemented an electro-actuation based LOC micro-device that leverages multi-frequency actuation of samples and reagents droplets for chip based nucleic acid extraction and real-time, reverse transcription (RT) PCR (qRT-PCR) amplification from clinical samples. Our prototype micro-device combines chemical lysis with electric field assisted isolation of nucleic acid in a four channel parallel processing scheme. Furthermore, a four channel parallel qRT-PCR amplification and detection assay is integrated to deliver the sample-to-detection NAT chip. The NAT chip combines dielectrophoresis and electrostatic/electrowetting actuation methods with resistive micro-heaters and temperature sensors to perform chip based integrated NATs. The two chip modules have been validated using different panels of clinical samples and their performance compared with standard platforms. This study has established that our integrated NAT chip system has a sensitivity and specificity comparable to that of the standard platforms while providing up to 10 fold reduction in sample/reagent volumes.

  15. Self-Referencing Hartmann Test for Large-Aperture Telescopes

    NASA Technical Reports Server (NTRS)

    Korechoff, Robert P.; Oseas, Jeffrey M.

    2010-01-01

    A method is proposed for end-to-end, full aperture testing of large-aperture telescopes using an innovative variation of a Hartmann mask. This technique is practical for telescopes with primary mirrors tens of meters in diameter and of any design. Furthermore, it is applicable to the entire optical band (near IR, visible, ultraviolet), relatively insensitive to environmental perturbations, and is suitable for ambient laboratory as well as thermal-vacuum environments. The only restriction is that the telescope optical axis must be parallel to the local gravity vector during testing. The standard Hartmann test utilizes an array of pencil beams that are cut out of a well-corrected wavefront using a mask. The pencil beam array is expanded to fill the full aperture of the telescope. The detector plane of the telescope is translated back and forth along the optical axis in the vicinity of the nominal focal plane, and the centroid of each pencil beam image is recorded. Standard analytical techniques are then used to reconstruct the telescope wavefront from the centroid data. The expansion of the array of pencil beams is usually accomplished by double passing the beams through the telescope under test. However, this requires a well-corrected, autocollimation flat, the diameter or which is approximately equal to that of the telescope aperture. Thus, the standard Hartmann method does not scale well because of the difficulty and expense of building and mounting a well-corrected, large aperture flat. The innovation in the testing method proposed here is to replace the large aperture, well-corrected, monolithic autocollimation flat with an array of small-aperture mirrors. In addition to eliminating the need for a large optic, the surface figure requirement for the small mirrors is relaxed compared to that required of the large autocollimation flat. The key point that allows this method to work is that the small mirrors need to operate as a monolithic flat only with regard to

  16. A Novel High Sensitivity Sensor for Remote Field Eddy Current Non-Destructive Testing Based on Orthogonal Magnetic Field

    PubMed Central

    Xu, Xiaojie; Liu, Ming; Zhang, Zhanbin; Jia, Yueling

    2014-01-01

    Remote field eddy current is an effective non-destructive testing method for ferromagnetic tubular structures. In view of conventional sensors' disadvantages, such as low signal-to-noise ratio and poor sensitivity to axial cracks, a novel high-sensitivity sensor based on orthogonal magnetic field excitation is proposed. Firstly, through a three-dimensional finite element simulation, the remote field effect under orthogonal magnetic field excitation is determined, and an appropriate configuration which can generate an orthogonal magnetic field for a tubular structure is developed. Secondly, optimized selection of key parameters such as frequency, exciting currents, and shielding modes is analyzed in detail, and different types of pick-up coils, including a new self-differential mode pick-up coil, are designed and analyzed. Lastly, the proposed sensor is verified experimentally on various types of defects manufactured in a section of ferromagnetic tube. Experimental results show that the proposed sensor can greatly improve the sensitivity of defect detection, especially for axial cracks whose depth is less than 40% of the wall thickness, which are very difficult to detect and identify with conventional sensors. Another noteworthy advantage of the proposed sensor is that it has almost equal sensitivity to various types of defects when a self-differential mode pick-up coil is adopted. PMID:25615738

  17. Multistatic Array Sampling Scheme for Fast Near-Field Image Reconstruction

    DTIC Science & Technology

    2016-01-01

    Multistatic Array Sampling Scheme for Fast Near-Field Image Reconstruction William F. Moulder, James D. Krieger, Denise T. Maurais-Galejs, Huy...described and validated experimentally with the formation of high quality microwave images. It is further shown that the scheme is more than two orders of... scheme (wherein transmitters and receivers are co-located) which require NTNR transmit-receive elements to achieve the same sampling. The second

  18. High-resolution hydrodynamic chromatographic separation of large DNA using narrow, bare open capillaries: a rapid and economical alternative technology to pulsed-field gel electrophoresis?

    PubMed

    Liu, Lei; Veerappan, Vijaykumar; Pu, Qiaosheng; Cheng, Chang; Wang, Xiayan; Lu, Liping; Allen, Randy D; Guo, Guangsheng

    2014-01-07

    A high-resolution, rapid, and economical hydrodynamic chromatographic (HDC) method for large DNA separations in free solution was developed using narrow (5 μm diameter), bare open capillaries. Size-based separation was achieved in a chromatographic format, with larger DNA molecules eluting faster than smaller ones. Lambda DNA Mono Cut Mix was baseline-separated, with percentage resolutions generally less than 9.0% for all DNA fragments (1.5 to 48.5 kbp) tested in this work. High efficiencies were achieved for large DNA with this chromatographic technique: the number of theoretical plates reached 3.6 × 10(5) for the longest (48.5 kbp) and 3.7 × 10(5) for the shortest (1.5 kbp) fragments. HDC parameters and performance are also discussed. The method was further applied to fractionating large DNA fragments from real-world samples (SacII-digested Arabidopsis plant bacterial artificial chromosome (BAC) DNA and PmeI-digested rice BAC DNA) to demonstrate its feasibility for BAC DNA fingerprinting. Rapid separation of PmeI-digested rice BAC DNA covering 0.44 to 119.041 kbp was achieved in less than 26 min. All DNA fragments of these samples were baseline-separated in narrow bare open capillaries, while the smallest fragment (0.44 kbp) was missing in the pulsed-field gel electrophoresis (PFGE) separation mode. It is demonstrated that narrow bare open capillary chromatography can achieve rapid separation of DNA mixtures spanning a wide size range, containing both small and large fragments, in a single run.
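    Plate counts like the 3.6 × 10(5) quoted above come from the standard chromatographic efficiency formula. A minimal sketch, assuming the half-height peak-width convention (the authors may have used a different width convention; the function name is illustrative):

```python
def theoretical_plates(retention_time, width_half_height):
    """Chromatographic plate count: N = 5.54 * (t_R / w_1/2)**2,
    with retention time and peak width at half height in the same units."""
    return 5.54 * (retention_time / width_half_height) ** 2
```

Narrower peaks at a given retention time give quadratically higher plate counts, which is why the sharp peaks in narrow capillaries yield such high efficiencies.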

  19. Digital Audio Radio Field Tests

    NASA Technical Reports Server (NTRS)

    Hollansworth, James E.

    1997-01-01

Radio history continues to be made at the NASA Lewis Research Center with the beginning of phase two of Digital Audio Radio testing conducted by the Consumer Electronics Manufacturers Association (a sector of the Electronic Industries Association and the National Radio Systems Committee) and cosponsored by the Electronic Industries Association and the National Association of Broadcasters. The bulk of the field testing of the four systems should be complete by the end of October 1996, with results available soon thereafter. Lewis hosted phase one of the testing process, which included laboratory testing of seven proposed digital audio radio systems and modes (see the following table). Two of the proposed systems operate in two modes, making a total of nine systems for testing. These nine systems are divided into the following types of transmission: in-band on-channel (IBOC), in-band adjacent-channel (IBAC), and new bands: the L-band (1452 to 1492 MHz) and the S-band (2310 to 2360 MHz).

  20. ISOLOK VALVE ACCEPTANCE TESTING FOR DWPF SME SAMPLING PROCESS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, T.; Hera, K.; Coleman, C.

    2011-12-05

Evaluation of the Defense Waste Processing Facility (DWPF) Chemical Process Cell (CPC) cycle time identified several opportunities to improve the CPC processing time. Among these opportunities, a focus area related to optimizing the equipment and efficiency of the sample turnaround time for the DWPF Analytical Laboratory was identified. The Mechanical Systems & Custom Equipment Development (MS&CED) Section of the Savannah River National Laboratory (SRNL) evaluated the possibility of using an Isolok® sampling valve as an alternative to the Hydragard® valve for taking process samples. Previous viability testing was conducted with favorable results using the Isolok sampler and reported in SRNL-STI-2010-00749 (1). This task has the potential to improve operability, reduce maintenance time, and decrease CPC cycle time. This report summarizes the results from acceptance testing, which was requested in Task Technical Request (TTR) HLW-DWPF-TTR-2010-0036 (2) and conducted as outlined in Task Technical and Quality Assurance Plan (TTQAP) SRNL-RP-2011-00145 (3). The Isolok to be tested is the same model that was tested, qualified, and installed in the Sludge Receipt Adjustment Tank (SRAT) sample system. RW-0333P QA requirements apply to this task. This task was to qualify the Isolok sampler for use in the DWPF Slurry Mix Evaporator (SME) sampling process. The Hydragard, which is the current baseline sampling method, was used for comparison with the Isolok sampling data. The Isolok sampler is an air-powered grab sampler used to 'pull' a sample volume from a process line. The operation of the sampler is shown in Figure 1: the image on the left shows the Isolok's spool extended into the process line, and the image on the right shows the sampler retracted and then dispensing the liquid into the sampling container. To determine tank homogeneity, a Coliwasa sampler was used to grab samples at a high and a low location within the mixing tank. Data from the two

  1. Effect of microbiological testing on subsequent mid-infrared milk component analysis of the same milk sample.

    PubMed

    Wojciechowski, Karen L; Melilli, Caterina; Barbano, David M

    2014-09-01

Our objectives were to determine whether mixing and sampling of a raw milk sample at 4°C for determination of total bacteria count (TBC), and whether incubation at 14°C for 18 h and sampling for a preliminary incubation (PI) count, influenced the accuracy of subsequent fat, protein, or lactose measurement by mid-infrared (IR) analysis of milk from the same sample container, due either to nonrepresentative sampling or to the presence of microbial metabolites produced by microbial growth during the incubation. Milks of 4 fat levels (2.2, 3, 4, and 5%) reflected the range of fat levels encountered in producer milks. If the portion of milk removed from a cold sample was not representative, then the effect on a milk component test would likely be larger as fat content increases. Within the milks at each fat level, 3 treatments were used: (1) 20 vials of the same milk sampled for TBC testing using a BactoScan FC and then used for a milk component test; (2) 20 vials for testing TBC plus PI count, followed by a component test; and (3) 20 vials run for the IR component test without prior microbiological sampling and testing. This was repeated in 3 different weeks using a different batch of milk each week. No large effect on the accuracy of milk component testing [IR fat B (carbon-hydrogen stretch) and fat A (carbonyl stretch)] due to the cold-milk sample handling and mixing procedures used for TBC was detected, confirming that the portion of milk physically removed from the vial by the BactoScan FC (Foss Electric, Hillerød, Denmark) was representative of the milk. However, the representativeness of any other sampling procedure (manual or automated) applied to a cold milk sample before milk component testing on the same container of milk should be demonstrated and verified periodically as a matter of routine laboratory quality assurance. Running TBC with a BactoScan FC first and IR milk analysis afterward had a minimal effect on milk component tests by IR when milk bacteria counts
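The vial-to-vial comparison described above is, in spirit, a two-sample test of mean component readings. A minimal sketch with invented fat readings and a plain Welch t statistic (not the paper's actual data or statistical procedure):

```python
import math
import statistics

# Hypothetical % fat readings: vials sampled for TBC first vs. untouched controls
tbc_first = [3.02, 2.98, 3.01, 3.00, 2.99, 3.03]
control   = [3.01, 3.00, 2.99, 3.02, 3.00, 2.98]

def welch_t(a, b):
    """Welch two-sample t statistic (unequal variances)."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

t = welch_t(tbc_first, control)
print(f"Welch t = {t:.3f}")  # well below ~2 -> no evidence of a sampling effect
```

A |t| far below the usual ~2 critical value is the kind of outcome that would support "no large effect" on subsequent component tests.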

  2. 40 CFR 205.171-2 - Test exhaust system sample selection and preparation.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Test exhaust system sample selection... Systems § 205.171-2 Test exhaust system sample selection and preparation. (a)(1) Exhaust systems comprising the sample which are required to be tested under a test request in accordance with this subpart...

  3. Portable Imagery Quality Assessment Test Field for Uav Sensors

    NASA Astrophysics Data System (ADS)

    Dąbrowski, R.; Jenerowicz, A.

    2015-08-01

Nowadays, imagery data acquired from UAV sensors are the main source of data used in various remote sensing applications, photogrammetry projects, and imagery intelligence (IMINT), as well as in other tasks such as decision support. Quality assessment of such imagery is therefore an important task. The research team from the Military University of Technology, Faculty of Civil Engineering and Geodesy, Geodesy Institute, Department of Remote Sensing and Photogrammetry has designed and prepared a special test field, the Portable Imagery Quality Assessment Test Field (PIQuAT), that enables quality assessment, under field conditions, of images obtained with sensors mounted on UAVs. The PIQuAT consists of 6 individual segments that, when combined, allow the radiometric, spectral, and spatial resolution of images acquired from UAVs to be determined. All segments of the PIQuAT can be used together in various configurations or independently. All elements of the Portable Imagery Quality Assessment Test Field were tested in laboratory conditions in terms of their radiometry and spectral reflectance characteristics.

  4. Sampling large landscapes with small-scale stratification-User's Manual

    USGS Publications Warehouse

    Bart, Jonathan

    2011-01-01

This manual explains procedures for partitioning a large landscape into plots, assigning the plots to strata, and selecting plots in each stratum to be surveyed. These steps are referred to as the "sampling large landscapes (SLL) process." We assume that users of the manual have a moderate knowledge of ArcGIS and Microsoft® Excel. The manual is written for a single user, but in many cases some steps will be carried out by a biologist designing the survey and others by a quantitative assistant; thus, the manual essentially may be passed back and forth between these users. The SLL process primarily has been used to survey birds, and we refer to birds as the subjects of the counts. The process, however, could be used to count any objects.
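The stratify-then-select step can be sketched in a few lines; the strata names and per-stratum sample sizes below are invented for illustration and are not taken from the manual:

```python
import random

def select_plots(plots_by_stratum, n_by_stratum, seed=0):
    """Randomly select n plots from each stratum, without replacement."""
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    return {s: sorted(rng.sample(plots, n_by_stratum[s]))
            for s, plots in plots_by_stratum.items()}

# 1000 plots partitioned into three hypothetical habitat strata
strata = {
    "wetland": list(range(0, 200)),
    "shrub":   list(range(200, 700)),
    "forest":  list(range(700, 1000)),
}
chosen = select_plots(strata, {"wetland": 10, "shrub": 25, "forest": 15})
print({s: len(v) for s, v in chosen.items()})
```

Sampling without replacement within each stratum is what guarantees every stratum is represented regardless of how unevenly the plots are distributed across the landscape.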

  5. Large field-induced-strain at high temperature in ternary ferroelectric crystals

    PubMed Central

    Wang, Yaojin; Chen, Lijun; Yuan, Guoliang; Luo, Haosu; Li, Jiefang; Viehland, D.

    2016-01-01

The new generation of ternary Pb(In1/2Nb1/2)O3-Pb(Mg1/3Nb2/3)O3-PbTiO3 ferroelectric single crystals has potential applications in high-power devices due to its superior operational stability relative to the binary system. In this work, a reversible, large electric-field-induced strain of over 0.9% at room temperature, and in particular over 0.6% above 380 K, was obtained. The polarization rotation path and the phase transition sequence of different compositions in these ternary systems have been determined with increasing electric field applied along the [001] direction, based on x-ray diffraction data. Thereafter, composition-dependent field-temperature phase diagrams were constructed, which provide a compositional and thermal prospectus for the electromechanical properties. It was found that the structural origin of the large strain, especially at higher temperature, is the modulation of the lattice parameters by dual independent compositional variables in these ternary solid-solution crystals. PMID:27734908

  6. Large field of view, fast and low dose multimodal phase-contrast imaging at high x-ray energy.

    PubMed

    Astolfo, Alberto; Endrizzi, Marco; Vittoria, Fabio A; Diemoz, Paul C; Price, Benjamin; Haig, Ian; Olivo, Alessandro

    2017-05-19

X-ray phase contrast imaging (XPCI) is an innovative imaging technique which extends the contrast capabilities of 'conventional' absorption-based x-ray systems. However, so far all XPCI implementations have suffered from one or more of the following limitations: low x-ray energies, small field of view (FOV), and long acquisition times. These limitations have relegated XPCI to a 'research-only' technique with an uncertain future in terms of large-scale, high-impact applications. We recently succeeded in designing, realizing, and testing an XPCI system which takes significant steps toward overcoming these limitations simultaneously. Our system combines, for the first time, large FOV, high energy, and fast scanning. Importantly, it is capable of providing high image quality at low x-ray doses, compatible with or even below those currently used in medical imaging. This extends the use of XPCI to areas which were impractical or even inaccessible to previous XPCI solutions. We expect this will enable a long-overdue translation into application fields such as security screening, industrial inspections, and large-FOV medical radiography, all with the inherent advantages of XPCI multimodality.

  7. On-field measurement trial of 4×128 Gbps PDM-QPSK signals by linear optical sampling

    NASA Astrophysics Data System (ADS)

    Bin Liu; Wu, Zhichao; Fu, Songnian; Feng, Yonghua; Liu, Deming

    2017-02-01

Linear optical sampling is a promising characterization technique for advanced modulation formats when combined with digital signal processing (DSP) and a software-synchronization algorithm. We theoretically investigate the acquisition of optical samples when the high-speed signal under test is either periodic or random. In particular, when the profile of the optical sampling pulse is asymmetric, the repetition frequency of the sampling pulse needs careful adjustment in order to obtain the correct waveform. We then demonstrate an on-field measurement trial of commercial four-channel 128 Gbps polarization-division-multiplexing quadrature phase shift keying (PDM-QPSK) signals with truly random characteristics, using self-developed equipment. A passively mode-locked fiber laser (PMFL) with a repetition frequency of 95.984 MHz is used as the optical sampling source, while four balanced photodetectors (BPDs) with 400 MHz bandwidth and a four-channel analog-to-digital converter (ADC) with a 1.25 GS/s sampling rate are used for data acquisition. A performance comparison with a conventional optical modulation analyzer (OMA) verifies that the self-developed equipment has the advantages of low cost, easy implementation, and fast response.
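The need for careful adjustment of the sampling-pulse repetition frequency reflects the equivalent-time sampling principle: when the signal-to-sampler frequency ratio is an integer plus a small remainder, the integer part aliases away and successive low-rate samples step slowly through the fast periodic waveform. A toy numerical sketch (all parameters invented, not the instrument's):

```python
import math

f_sig = 1.0e9            # periodic signal under test, 1 GHz (hypothetical)
ratio = 1000 + 1e-3      # f_sig / f_samp: integer part aliases away;
f_samp = f_sig / ratio   # the 1e-3 remainder advances the sampling phase

n = 1000
# Phase of each low-rate sample within one signal period:
phases = [(ratio * k) % 1.0 for k in range(n)]
samples = [math.sin(2 * math.pi * p) for p in phases]
# Each sample advances ~0.001 of a period, so 1000 samples taken at ~1 MHz
# trace out one full 1 GHz period in equivalent time.
```

If the remainder drifts (e.g., an asymmetric pulse shifting the effective sampling instant), the reconstructed phase axis is distorted, which is why the repetition frequency must be tuned carefully.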

  8. Moving your laboratories to the field--Advantages and limitations of the use of field portable instruments in environmental sample analysis.

    PubMed

    Gałuszka, Agnieszka; Migaszewski, Zdzisław M; Namieśnik, Jacek

    2015-07-01

The recent rapid progress in the technology of field-portable instruments has increased their application in environmental sample analysis. These instruments offer the possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid, and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field-portable techniques. Several stationary analytical instruments have portable versions. The most popular include: gas chromatographs with various detectors (mass spectrometer (MS), flame ionization detector, photoionization detector), ultraviolet-visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses, and electronic tongues. The use of portable instruments in environmental sample analysis offers the possibility of on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process-monitoring applications. However, quantification of results is still problematic in many cases. Other disadvantages include: higher detection limits and lower sensitivity than those obtained under laboratory conditions, a strong influence of environmental factors on instrument performance, and a high possibility of sample contamination in the field. This paper reviews recent applications of field-portable instruments in environmental sample analysis and discusses their analytical capabilities. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Modeling and Testing of Phase Transition-Based Deployable Systems for Small Body Sample Capture

    NASA Technical Reports Server (NTRS)

    Quadrelli, Marco; Backes, Paul; Wilkie, Keats; Giersch, Lou; Quijano, Ubaldo; Keim, Jason; Mukherjee, Rudranarayan

    2009-01-01

This paper summarizes the modeling, simulation, and testing work related to the development of technology to investigate the potential that shape memory actuation has to provide mechanically simple and affordable solutions for delivering assets to a surface and for sample capture and return. We investigate the structural dynamics and controllability aspects of an adaptive beam carrying an end-effector which, by changing equilibrium phases, is able to actively decouple the end-effector dynamics from the spacecraft dynamics during the surface contact phase. Asset delivery and sample capture and return are at the heart of several emerging potential missions to small bodies, such as asteroids and comets, and to the surfaces of large bodies, such as Titan.

  10. Field Exploration and Life Detection Sampling for Planetary Analogue Research (FELDSPAR)

    NASA Astrophysics Data System (ADS)

    Gentry, D.; Stockton, A. M.; Amador, E. S.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Cullen, T.; Duca, Z. A.; Jacobsen, M. B.; Kirby, J.; McCaig, H. C.; Murukesan, G.; Rennie, V.; Rader, E.; Schwieterman, E. W.; Stevens, A. H.; Sutton, S. A.; Tan, G.; Yin, C.; Cullen, D.; Geppert, W.

    2017-12-01

Extraterrestrial studies are typically conducted on mg samples from cm-scale features, while landing sites are selected based on m- to km-scale features. It is therefore critical to understand the spatial distribution of organic molecules over scales from cm to km, particularly in geological features that appear homogeneous at m to km scales. This is addressed by FELDSPAR, a NASA-funded project that conducts field operations analogous to Mars sample return in its science, operations, and technology [1]. Here, we present recent findings from 2016 and 2017 campaigns to multiple Martian analogue sites in Iceland. Icelandic volcanic regions are Mars analogues due to desiccation, low nutrient availability, and temperature extremes [2], and are relatively young and isolated from anthropogenic contamination [3]. Operationally, many Icelandic analogue sites are remote enough that field expeditions must address several sampling constraints that are also faced by robotic exploration [1, 2]. Four field sites were evaluated in this study. The Fimmvörðuháls lava field was formed by a basaltic effusive eruption associated with the 2010 Eyjafjallajökull eruption. Mælifellssandur is a recently deglaciated plain to the north of the Mýrdalsjökull glacier. Holuhraun is a basaltic spatter and cinder cone formed by 2014 fissure eruptions just north of the Vatnajökull glacier. Dyngjusandur is a plain kept barren by repeated aeolian mechanical weathering. Samples were collected in nested triangular grids from the 10 cm to the 1 km scale. We obtained overhead imagery at 1 m to 200 m elevation to create digital elevation models. In-field reflectance spectroscopy was obtained with an ASD spectrometer, and chemical composition was measured with a Bruker handheld XRF. All sites chosen were homogeneous in apparent color, morphology, moisture, grain size, and reflectance spectra at all scales greater than 10 cm. Field-lab ATP assays were conducted to monitor microbial habitation, and home
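A nested triangular grid of the kind described above can be sketched geometrically: each sample point at one scale becomes the centre of a smaller equilateral triangle at the next scale down. A minimal sketch, with circumradii invented to mirror the 10 cm to 1 km span (not the actual FELDSPAR layout):

```python
import math

def triangle(center, r):
    """Vertices of an equilateral triangle of circumradius r about center."""
    cx, cy = center
    return [(cx + r * math.cos(math.radians(90 + 120 * i)),
             cy + r * math.sin(math.radians(90 + 120 * i))) for i in range(3)]

def nested(center, radii):
    """Recursively replace each point with a smaller triangle of points."""
    if not radii:
        return [center]
    pts = []
    for p in triangle(center, radii[0]):
        pts.extend(nested(p, radii[1:]))
    return pts

# five nesting levels, circumradii in metres (hypothetical)
points = nested((0.0, 0.0), [1000.0, 100.0, 10.0, 1.0, 0.1])
print(len(points))  # 3**5 = 243 sample locations
```

Each added level multiplies the sample count by three, which is why nested designs can probe heterogeneity across four orders of magnitude in scale with a manageable number of samples.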

  11. Development and field testing of a rapid and ultra-stable atmospheric carbon dioxide spectrometer

    DOE PAGES

    Xiang, B.; Nelson, D. D.; McManus, J. B.; ...

    2014-12-15

We present field test results for a new spectroscopic instrument to measure atmospheric carbon dioxide (CO2) with high precision (0.02 μmol mol-1, or ppm, at 1 Hz) and demonstrate high stability (within 0.1 ppm over more than 8 months), without the need for hourly, daily, or even monthly calibration against high-pressure gas cylinders. The technical novelty of this instrument (ABsolute Carbon dioxide, ABC) is the spectral null method using an internal quartz reference cell with known CO2 column density. Compared to a previously described prototype, the field instrument has better stability and benefits from more precise thermal control of the optics and more accurate pressure measurements in the sample cell (at the mTorr level). The instrument has been deployed at a long-term ecological research site (the Harvard Forest, USA), where it has measured for 8 months without on-site calibration and with minimal maintenance, showing drift bounds of less than 0.1 ppm. Field measurements agree well with those of a commercially available cavity ring-down CO2 instrument (Picarro G2301) run with a standard calibration protocol. This field test demonstrates that ABC is capable of performing high-accuracy, unattended, continuous field measurements with minimal use of reference gas cylinders.
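Figures like 0.02 ppm precision at 1 Hz and sub-0.1 ppm long-term drift are typically checked with an Allan-deviation analysis; for white noise the deviation falls as 1/√m with averaging factor m, and drift appears as a floor at long averaging times. A self-contained sketch on simulated noise (not the instrument's data):

```python
import math
import random

random.seed(1)
# Simulated 1 Hz CO2 residuals: pure white noise at 0.02 ppm (hypothetical)
noise = [random.gauss(0.0, 0.02) for _ in range(100_000)]

def allan_dev(y, m):
    """Non-overlapping Allan deviation for averaging factor m."""
    means = [sum(y[i:i + m]) / m for i in range(0, len(y) - m + 1, m)]
    diffs = [(means[i + 1] - means[i]) ** 2 for i in range(len(means) - 1)]
    return math.sqrt(0.5 * sum(diffs) / len(diffs))

for m in (1, 10, 100):
    # for white noise, sigma(m) ~ 0.02 / sqrt(m); real data would flatten
    # out at the drift floor instead of falling indefinitely
    print(m, round(allan_dev(noise, m), 4))
```

A real stability claim rests on where this curve stops falling: a 0.1 ppm drift bound means the measured deviation never rises above that level over the deployment.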

  12. Development and field testing of a rapid and ultra-stable atmospheric carbon dioxide spectrometer

    DOE PAGES

    Xiang, B.; Nelson, D. D.; McManus, J. B.; ...

    2014-08-05

We present field test results for a new spectroscopic instrument to measure atmospheric carbon dioxide (CO2) with high precision (0.02 ppm at 1 Hz) and demonstrate high stability (within 0.1 ppm over more than 8 months), without the need for hourly, daily, or even monthly calibration against high-pressure gas cylinders. The technical novelty of this instrument (ABsolute Carbon dioxide, ABC) is the spectral null method using an internal quartz reference cell with known CO2 column density. Compared to a previously described prototype, the field instrument has better stability and benefits from more precise thermal control of the optics and more accurate pressure measurements in the sample cell (at the mTorr level). The instrument has been deployed at a long-term ecological research site (the Harvard Forest, USA), where it has measured for eight months without on-site calibration and with minimal maintenance, showing drift bounds of less than 0.1 ppm. Field measurements agree well with those of another commercially available cavity ring-down CO2 instrument (Picarro G2301) run with a standard calibration protocol. This field test demonstrates that ABC is capable of performing high-accuracy, unattended, continuous field measurements with minimal use of calibration cylinders.

  13. Development and field testing of a rapid and ultra-stable atmospheric carbon dioxide spectrometer

    NASA Astrophysics Data System (ADS)

    Xiang, B.; Nelson, D. D.; McManus, J. B.; Zahniser, M. S.; Wehr, R. A.; Wofsy, S. C.

    2014-12-01

    We present field test results for a new spectroscopic instrument to measure atmospheric carbon dioxide (CO2) with high precision (0.02 μmol mol-1, or ppm at 1 Hz) and demonstrate high stability (within 0.1 ppm over more than 8 months), without the need for hourly, daily, or even monthly calibration against high-pressure gas cylinders. The technical novelty of this instrument (ABsolute Carbon dioxide, ABC) is the spectral null method using an internal quartz reference cell with known CO2 column density. Compared to a previously described prototype, the field instrument has better stability and benefits from more precise thermal control of the optics and more accurate pressure measurements in the sample cell (at the mTorr level). The instrument has been deployed at a long-term ecological research site (the Harvard Forest, USA), where it has measured for 8 months without on-site calibration and with minimal maintenance, showing drift bounds of less than 0.1 ppm. Field measurements agree well with those of a commercially available cavity ring-down CO2 instrument (Picarro G2301) run with a standard calibration protocol. This field test demonstrates that ABC is capable of performing high-accuracy, unattended, continuous field measurements with minimal use of reference gas cylinders.

  14. Development and field testing of a rapid and ultra-stable atmospheric carbon dioxide spectrometer

    NASA Astrophysics Data System (ADS)

    Xiang, B.; Nelson, D. D.; McManus, J. B.; Zahniser, M. S.; Wehr, R.; Wofsy, S. C.

    2014-08-01

    We present field test results for a new spectroscopic instrument to measure atmospheric carbon dioxide (CO2) with high precision (0.02 ppm at 1 Hz) and demonstrate high stability (within 0.1 ppm over more than 8 months), without the need for hourly, daily, or even monthly calibration against high-pressure gas cylinders. The technical novelty of this instrument (ABsolute Carbon dioxide, ABC) is the spectral null method using an internal quartz reference cell with known CO2 column density. Compared to a previously described prototype, the field instrument has better stability and benefits from more precise thermal control of the optics and more accurate pressure measurements in the sample cell (at the mTorr level). The instrument has been deployed at a long-term ecological research site (the Harvard Forest, USA), where it has measured for eight months without on-site calibration and with minimal maintenance, showing drift bounds of less than 0.1 ppm. Field measurements agree well with those of another commercially available cavity ring-down CO2 instrument (Picarro G2301) run with a standard calibration protocol. This field test demonstrates that ABC is capable of performing high-accuracy, unattended, continuous field measurements with minimal use of calibration cylinders.

  15. Large-scale Organized Magnetic Fields in O, B and A Stars

    NASA Astrophysics Data System (ADS)

    Mathys, G.

    2009-06-01

The status of our current knowledge of magnetic fields in stars of spectral types ranging from early F to O is reviewed. Fields with large-scale organised structure have now been detected and measured throughout this range. These fields are consistent with the oblique rotator model. In early F to late B stars, their occurrence is restricted to the subgroup of Ap stars, which have the best-studied fields among the early-type stars. The presence of fields with more complex topologies in other A and late B stars has been suggested, but is not firmly established. Magnetic fields have not yet been studied in a sufficient number of OB stars to establish whether they occur in all, or only in some subset, of these stars.

  16. Characterization Efforts in a Deep Borehole Field Test

    NASA Astrophysics Data System (ADS)

    Kuhlman, K. L.; Sassani, D.; Freeze, G. A.; Hardin, E. L.; Brady, P. V.

    2016-12-01

The US Department of Energy Office of Nuclear Energy is embarking on a Deep Borehole Field Test to investigate the feasibility of constructing and characterizing two boreholes in crystalline basement rock to a depth of 5 km (16,400 ft). The concept of deep borehole disposal of radioactive waste has some advantages, including incremental construction and loading and the enhanced natural barriers provided by deep continental crystalline basement. Site characterization activities will include geomechanical (i.e., hydrofracture stress measurements), geological (i.e., core and mud logging), hydrological (i.e., packer-based pulse and pumping tests), and chemical (i.e., fluids sampled in situ from packer intervals and extracted from cores) tests. Borehole-based characterization will be used to determine the variability of system state (i.e., stress, pressure, temperature, and chemistry) with depth and to interpret material and system parameters relevant to numerical site simulation. We explore the effects that fluid density and geothermal temperature gradients (i.e., thermohaline convection) have on characterization goals in light of expected downhole conditions, including a disturbed rock zone surrounding the borehole. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the US Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  17. Generation of a Large-scale Magnetic Field in a Convective Full-sphere Cross-helicity Dynamo

    NASA Astrophysics Data System (ADS)

    Pipin, V. V.; Yokoi, N.

    2018-05-01

We study the effects of cross-helicity in full-sphere large-scale mean-field dynamo models of a 0.3 M⊙ star rotating with a period of 10 days. In exploring several dynamo scenarios that stem from magnetic field generation by the cross-helicity effect, we found that cross-helicity provides a natural generation mechanism for the large-scale axisymmetric and nonaxisymmetric magnetic field. Therefore, rotating stars with convective envelopes can produce a large-scale magnetic field generated solely by the turbulent cross-helicity effect (we call it a γ2-dynamo). Using mean-field models, we compare the properties of the large-scale magnetic field organization that stems from dynamo mechanisms based on the kinetic helicity (associated with α2 dynamos) and on cross-helicity. For fully convective stars, both generation mechanisms can maintain large-scale dynamos even for a solid-body rotation law inside the star. Nonaxisymmetric magnetic configurations become preferable when the cross-helicity and the α-effect operate independently of each other. This corresponds to situations with purely γ2 or α2 dynamos. The combination of these scenarios, i.e., the γ2α2 dynamo, can generate preferentially axisymmetric, dipole-like magnetic fields with strengths of several kG. Thus, we have found a new dynamo scenario that is able to generate an axisymmetric magnetic field even in the case of solid-body rotation of the star. We discuss possible applications of our findings to stellar observations.

  18. Large-aperture space optical system testing based on the scanning Hartmann.

    PubMed

    Wei, Haisong; Yan, Feng; Chen, Xindong; Zhang, Hao; Cheng, Qiang; Xue, Donglin; Zeng, Xuefeng; Zhang, Xuejun

    2017-03-10

Based on the Hartmann testing principle, this paper proposes a novel image quality testing technology that applies to large-aperture space optical systems. Compared with the traditional testing method using a large-aperture collimator, the scanning Hartmann testing technology has great advantages due to its simple structure, low cost, and ability to perform wavefront measurement of an optical system. The basic testing principle of the scanning Hartmann technology, the data processing method, and the simulation process are presented in this paper. Simulation results are also given to verify the feasibility of the technology. Furthermore, a measuring system was developed to conduct a wavefront measurement experiment on a 200 mm aperture optical system. The small root mean square (RMS) deviation (6.3%) between experimental and interferometric results indicates that the testing system can measure low-order aberration correctly, which means that the scanning Hartmann testing technology is able to test the imaging quality of a large-aperture space optical system.
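The Hartmann principle underlying the technique: spot displacements on the detector are proportional to the local wavefront slope, so integrating measured slopes recovers the wavefront. A minimal 1-D illustration with invented numbers (a real system reconstructs a 2-D wavefront, typically by least squares):

```python
# Hypothetical geometry: screen-to-detector distance and subaperture pitch
f = 0.5       # propagation distance from Hartmann screen to detector, m
pitch = 0.01  # subaperture spacing, m

# Measured spot displacements (m): a linear ramp, i.e. a defocus-like slope
dx = [2e-6 * i for i in range(-5, 6)]
slopes = [d / f for d in dx]  # local wavefront slope dW/dx per subaperture

# Cumulative trapezoid integration of slopes -> wavefront heights W(x)
W = [0.0]
for i in range(1, len(slopes)):
    W.append(W[-1] + 0.5 * (slopes[i - 1] + slopes[i]) * pitch)
# A linear slope ramp integrates to a parabola, i.e. pure defocus.
```

Low-order aberrations such as defocus show up exactly this way, as smooth low-order polynomials in the integrated slope data, which is what the 6.3% RMS comparison against interferometry validates.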

  19. Large Area Field of View for Fast Temporal Resolution Astronomy

    NASA Astrophysics Data System (ADS)

    Covarrubias, Ricardo A.

    2018-01-01

Scientific CMOS (sCMOS) technology is especially relevant for high temporal resolution astronomy, combining high resolution and a large field of view with very fast frame rates, without sacrificing ultra-low-noise performance. Solar astronomy, near-Earth object detection, space debris tracking, transient observations, and wavefront sensing are among the many applications in which this technology can be utilized. Andor Technology is currently developing a next-generation, very large area sCMOS camera with extremely low noise, rapid frame rates, high resolution, and wide dynamic range.

  20. Drying step optimization to obtain large-size transparent magnesium-aluminate spinel samples

    NASA Astrophysics Data System (ADS)

    Petit, Johan; Lallemant, Lucile

    2017-05-01

In transparent ceramics processing, the green-body elaboration step is probably the most critical one. Among the known techniques, wet shaping processes are particularly interesting because they enable the particles to find an optimum position on their own. Nevertheless, the presence of water molecules leads to drying issues. During water removal, the water concentration gradient induces cracks that limit the sample size: laboratory samples are generally less damaged because of their small size, but upscaling the samples for industrial applications leads to an increasing cracking probability. Thanks to optimization of the drying step, large-size spinel samples were obtained.