Sample records for x-shooter science verification

  1. The X-shooter pipeline

    NASA Astrophysics Data System (ADS)

    Goldoni, P.

    2011-03-01

    The X-shooter data reduction pipeline is an integral part of the X-shooter project: it allows the production of reduced data in physical quantities from the raw data produced by the instrument. The pipeline is based on the data reduction library developed by the X-shooter consortium, with contributions from France, the Netherlands and ESO, and it uses the Common Pipeline Library (CPL) developed at ESO. The pipeline has been developed for two main functions. The first function is to monitor the operation of the instrument through the reduction of the acquired data, both at Paranal, for quick-look control, and in Garching, for a more thorough evaluation. The second function is to allow optimized data reduction for a scientific user. In the following I will first outline the main steps of data reduction with the pipeline; then I will briefly show two examples of optimization of the results for science reduction.

  2. VizieR Online Data Catalog: M17 massive pms stars X-shooter spectra (Ramirez-Tannus+, 2017)

    NASA Astrophysics Data System (ADS)

    Ramirez-Tannus, M. C.; Kaper, L.; de Koter, A.; Tramper, F.; Bik, A.; Ellerbroek, L. E.; Ochsendorf, B. B.; Ramirez-Agudelo, O. H.; Sana, H.

    2017-05-01

    Normalized X-shooter spectra of the PMS and OB stars studied in M17. The X-shooter spectra were obtained under good weather conditions, with seeing ranging from 0.5" to 1" and clear sky. With the exception of the 2012 B289 spectrum and the 2009 B275 science verification spectrum, the spectrograph slit widths used were 1" (UVB, 300-590nm), 0.9" (VIS, 550-1020nm), and 0.4" (NIR, 1000-2480nm), resulting in a spectral resolving power of 5100, 8800, and 11300, respectively. The slit widths for the 2010 B275 observations were 1.6", 0.9", and 0.9", resulting in a resolving power of 3300, 8800, and 5600, respectively. For the 2012 B289 observations we used the 0.8", 0.7", and 0.4" slits, corresponding to a resolving power of 6200, 11000, and 11300 for the UVB, VIS, and NIR arms, respectively. The spectra were taken in nodding mode and reduced using the X-shooter pipeline version 2.7.1 running under the ESO Reflex environment version 2.8.4. (2 data files).
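
    The instrument setups quoted above lend themselves to a small lookup structure. Below is a minimal sketch in Python recording the slit widths and resolving powers listed in the record; the dictionary layout, epoch keys and helper function are our own illustration, not part of the catalog.

      # Illustrative summary of the X-shooter setups quoted above, keyed by epoch.
      # The dictionary layout and helper function are ours, not part of the catalog.
      SETUPS = {
          "default":   {"UVB": (1.0, 5100), "VIS": (0.9, 8800),  "NIR": (0.4, 11300)},
          "2010_B275": {"UVB": (1.6, 3300), "VIS": (0.9, 8800),  "NIR": (0.9, 5600)},
          "2012_B289": {"UVB": (0.8, 6200), "VIS": (0.7, 11000), "NIR": (0.4, 11300)},
      }

      def resolving_power(epoch, arm):
          """Return (slit width in arcsec, resolving power R) for a given setup."""
          return SETUPS[epoch][arm]

      print(resolving_power("2012_B289", "VIS"))  # (0.7, 11000)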

  3. VLT/X-shooter Spectroscopy of a dusty planetary nebula discovered with Spitzer/IRS

    NASA Astrophysics Data System (ADS)

    Oliveira, I.; Overzier, R. A.; Pontoppidan, K. M.; van Dishoeck, E. F.; Spezzi, L.

    2011-02-01

    As part of a mid-infrared spectroscopic survey of young stars with the Spitzer Space Telescope, an unclassified red emission line object was discovered. Based on its high ionization state indicated by the Spitzer spectrum, this object could either be a dusty supernova remnant (SNR) or a planetary nebula (PN). In this research note, the object is classified and the available spectroscopic data are presented to the community for further analysis. UV/optical/NIR spectra were obtained during the science verification run of the VLT/X-shooter. A large number of emission lines are identified, allowing the determination of the nature of this object. The presence of strong, narrow (Δv ~ 8-74 km s^-1) emission lines, combined with very low line ratios of, e.g., [N II]/Hα and [S II]/Hα, shows that the object is a PN that lies at an undetermined distance behind the Serpens Molecular Cloud. This illustrates the potential of X-shooter as an efficient tool for constraining the nature of faint sources with unknown spectral properties or colors.

  4. The X-shooter pipeline

    NASA Astrophysics Data System (ADS)

    Modigliani, Andrea; Goldoni, Paolo; Royer, Frédéric; Haigron, Regis; Guglielmi, Laurent; François, Patrick; Horrobin, Matthew; Bristow, Paul; Vernet, Joel; Moehler, Sabine; Kerber, Florian; Ballester, Pascal; Mason, Elena; Christensen, Lise

    2010-07-01

    The X-shooter data reduction pipeline, as part of the ESO-VLT Data Flow System, provides recipes for Paranal Science Operations, and for Data Product and Quality Control Operations at Garching headquarters. At Paranal, it is used for quick-look data evaluation. The pipeline recipes can be executed either with EsoRex at the command line or through the Gasgano graphical user interface. The recipes are implemented with the ESO Common Pipeline Library (CPL). X-shooter is the first of the second generation of VLT instruments. It makes it possible to collect in one shot the full spectrum of the target from 300 to 2500 nm, subdivided into three arms optimised for the UVB, VIS and NIR ranges, with an efficiency between 15% and 35% including the telescope and the atmosphere, and a spectral resolution varying between 3000 and 17,000. It allows observations in stare and offset modes, using the slit or an IFU, and observing sequences nodding the target along the slit. Data reduction can be performed either with a classical approach, by determining the spectral format via 2D-polynomial transformations, or with the help of a dedicated instrument physical model, which gives insight into the instrument and allows a constrained solution that depends on a few parameters with a physical meaning. In the present paper we describe the steps of data reduction necessary to fully reduce science observations in the different modes, with examples of typical calibration and observation sequences.
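
    Recipes of this kind are executed with EsoRex against a set-of-frames (SOF) file. The sketch below shows the general invocation pattern from Python; the recipe name xsh_scired_slit_nod and the frame tags are best-effort assumptions taken from public pipeline documentation, and the FITS file names are placeholders. Verify names and tags against the X-shooter pipeline manual before use.

      # Minimal sketch of driving an ESO pipeline recipe from Python via EsoRex.
      # Recipe name and frame tags below are assumptions; the FITS files are fakes.
      import subprocess

      sof = "science_nod.sof"  # set-of-frames file: one "path tag" pair per line
      with open(sof, "w") as f:
          f.write("XSHOO_raw_1.fits  OBJECT_SLIT_NOD_NIR\n")
          f.write("master_bias.fits  MASTER_BIAS_NIR\n")

      # EsoRex is invoked as: esorex <recipe> <sof-file>
      subprocess.run(["esorex", "xsh_scired_slit_nod", sof], check=True)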

  5. Report on the Workshop "The First Year of Science with X-shooter"

    NASA Astrophysics Data System (ADS)

    Randich, S.; Covino, S.; Cristiani, S.

    2011-03-01

    The workshop was held with the aim of bringing together X-shooter users to discuss scientific results, performance and technical aspects, after the first year of successful operations of the instrument. The workshop was also organised to commemorate Roberto Pallavicini, whose scientific and human contribution to the development of X-shooter was invaluable and a source of continuous inspiration for all of us. A touching presentation focusing on the scientific personality of Roberto was given by Luca Pasquini on the second day of the workshop.

  6. The spectrum of (136199) Eris between 350 and 2350 nm: results with X-Shooter

    NASA Astrophysics Data System (ADS)

    Alvarez-Candal, A.; Pinilla-Alonso, N.; Licandro, J.; Cook, J.; Mason, E.; Roush, T.; Cruikshank, D.; Gourgeot, F.; Dotto, E.; Perna, D.

    2011-08-01

    Context. X-Shooter is the first second-generation instrument for the ESO-Very Large Telescope. It is a spectrograph covering the entire 300-2480 nm spectral range at once with a high resolving power. These properties enticed us to observe the well-known trans-Neptunian object (136199) Eris during the science verification of the instrument. The target has numerous absorption features in the optical and near-infrared domain that have been observed by different authors, showing differences in these features' positions and strengths. Aims: Besides testing the capabilities of X-Shooter to observe minor bodies, we attempt to constrain the existence of super-volatiles, e.g., CH4, CO and N2, and in particular we try to understand the physical-chemical state of the ices on Eris' surface. Methods: We observed Eris in the 300-2480 nm range and compared the newly obtained spectra with those available in the literature. We identified several absorption features, measured their positions and depths, and compared them with those of the reflectance of pure methane ice obtained from the optical constants of this ice at 30 K, to study shifts in these features' positions and find a possible explanation for their origin. Results: We identify several absorption bands in the spectrum that are all consistent with the presence of CH4 ice. We do not identify bands related to N2 or CO. We measured the central wavelengths of the bands and compared them to those measured in the spectrum of pure CH4 at 30 K, finding variable spectral shifts. Conclusions: Based on these wavelength shifts, we confirm the presence of a dilution of CH4 in other ice on the surface of Eris and the presence of pure CH4 that is spatially segregated. The comparison of the centers and shapes of these bands with previous works suggests that the surface is heterogeneous. The absence of the 2160 nm band of N2 can be explained if the surface temperature is below 35.6 K, the transition temperature between the alpha and beta phases of solid N2.
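
    The band-shift comparison described in the Methods reduces to subtracting the laboratory band centres of pure CH4 ice from the observed ones. A toy version follows, with invented wavelengths rather than the paper's measured values:

      # Toy calculation of the spectral shifts discussed above: observed CH4 band
      # centres minus those of pure CH4 ice at 30 K.  All wavelengths (nm) are
      # invented placeholders, not the measured values from the paper.
      pure_ch4 = {"band_a": 1666.0, "band_b": 1724.0, "band_c": 2208.0}
      observed = {"band_a": 1665.2, "band_b": 1723.5, "band_c": 2207.1}

      for band, lam_pure in pure_ch4.items():
          shift = observed[band] - lam_pure
          print(f"{band}: shift = {shift:+.1f} nm")  # a blueshift suggests CH4 diluted in another ice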

  7. VizieR Online Data Catalog: Lupus YSOs X-shooter spectroscopy (Alcala+, 2017)

    NASA Astrophysics Data System (ADS)

    Alcala, J. M.; Manara, C. F.; Natta, A.; Frasca, A.; Testi, L.; Nisini, B.; Stelzer, B.; Williams, J. P.; Antoniucci, S.; Biazzo, K.; Covino, E.; Esposito, M.; Getman, F.; Rigliaco, E.

    2017-07-01

    All the data used in this paper were acquired with the X-shooter spectrograph at the VLT. The capabilities of X-shooter in terms of wide spectral coverage (310-2500nm), resolution and limiting magnitudes allow us to assess simultaneously the mass accretion and outflow, and disc diagnostics, from the UV and optical to the near IR. The sample studied in this paper consists mainly of two sets of low-mass class II YSOs in the aforementioned Lupus clouds. The first one comprises the 36 objects published in Alcala et al. (2014, Cat. J/A+A/561/A2), observed within the context of the X-shooter INAF/GTO (Alcala et al. 2011AN....332..242A) project; for simplicity we will refer to it as the "GTO sample" throughout the paper. One additional source, namely Sz105, was investigated with X-shooter during the GTO, but was rejected as a legitimate YSO (see below). The second sample consists of 49 objects observed during ESO periods 95 and 97 (1 April-30 September 2015 and 1 April-30 September 2016, respectively). In addition, we include here six objects observed with X-shooter in other programmes taken from the ESO archive. In total, 55 objects were newly analysed here, and we will refer to them as the "new sample". (12 data files).

  8. The polarimeters for HARPS and X-shooter

    NASA Astrophysics Data System (ADS)

    Snik, F.; Harpspol Team; X-Shooter-Pol Team

    2013-01-01

    Spectropolarimetry enables observations of stellar magnetic fields and circumstellar asymmetries, e.g. in disks and supernova explosions. To furnish better diagnostics of such stellar physics, we designed and commissioned a polarimetric unit at the successful HARPS spectrograph at ESO's 3.6-m telescope at La Silla. We present the design and performance of HARPSpol, and show some first science results. The most striking achievement of HARPSpol is its capability to measure stellar magnetic fields as small as 0.1 G. Finally, we give a sneak preview of the polarimeter we are currently designing for X-shooter at the VLT. It contains a novel type of polarimetric modulator that is able to efficiently measure all the Stokes parameters over the huge wavelength range of 300-2500 nm.
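
    A polarimetric modulator of the kind previewed here encodes the Stokes vector S = (I, Q, U, V) into a set of measured intensities, which are recovered by inverting the modulation matrix. A minimal sketch with an idealized matrix (not the actual X-shooter-Pol modulation scheme):

      # Sketch of polarimetric demodulation: k intensity measurements are modeled
      # as I_k = O_k . S, with S = (I, Q, U, V); inverting the modulation matrix O
      # recovers the Stokes vector.  The matrix below is an ideal toy scheme.
      import numpy as np

      O = np.array([  # one row per modulation state
          [1.0,  1.0,  0.0,  0.0],
          [1.0, -1.0,  0.0,  0.0],
          [1.0,  0.0,  1.0,  0.0],
          [1.0,  0.0,  0.0,  1.0],
      ])
      measured = np.array([1.05, 0.95, 1.02, 1.00])  # fake detector intensities
      stokes = np.linalg.solve(O, measured)
      print(dict(zip("IQUV", stokes)))  # {'I': 1.0, 'Q': 0.05, 'U': 0.02, 'V': 0.0}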

  9. An X-shooter survey of star forming regions: Low-mass stars and sub-stellar objects

    NASA Astrophysics Data System (ADS)

    Alcalá, J. M.; Stelzer, B.; Covino, E.; Cupani, G.; Natta, A.; Randich, S.; Rigliaco, E.; Spezzi, L.; Testi, L.; Bacciotti, F.; Bonito, R.; Covino, S.; Flaccomio, E.; Frasca, A.; Gandolfi, D.; Leone, F.; Micela, G.; Nisini, B.; Whelan, E.

    2011-03-01

    We present preliminary results of our X-shooter survey in star forming regions. In this contribution we focus on sub-samples of young stellar and sub-stellar objects (YSOs) in the Lupus star forming region and in the TW Hya association. We show that the X-shooter spectra are suitable for conducting several parallel studies such as YSO + disk fundamental parameters, accretion and outflow activity in the very low-mass (VLM) and sub-stellar regimes, as well as magnetic activity in young VLM YSOs, and Li abundance determinations. The capabilities of X-shooter in terms of wide spectral coverage, resolution and limiting magnitudes, allow us to assess simultaneously the accretion/outflow, magnetic activity, and disk diagnostics, from the UV and optical to the near-IR, avoiding ambiguities due to possible YSO variability. Based on observations collected at the European Southern Observatory, Chile, under Programmes 084.C-0269 and 085.C-0238.

  10. The VLT/X-shooter GRB afterglow legacy survey

    NASA Astrophysics Data System (ADS)

    Kaper, Lex; Fynbo, Johan P. U.; Pugliese, Vanna; van Rest, Daan

    2017-11-01

    The Swift satellite allows us to use gamma-ray bursts (GRBs) to peer through the hearts of star forming galaxies through cosmic time. Our open collaboration, representing most of the active European researchers in this field, builds a public legacy sample of GRB X-shooter spectroscopy while Swift continues to fly. To date, our spectroscopy of more than 100 GRB afterglows covers a redshift range from 0.059 to about 8 (Tanvir et al. 2009, Nature 461, 1254), with more than 20 robust afterglow-based metallicity measurements (over a redshift range from 1.7 to 5.9). With afterglow spectroscopy (throughout the electromagnetic spectrum from X-rays to the sub-mm) we can hence characterize the properties of star-forming galaxies over cosmic history in terms of redshift, metallicity, molecular content, ISM temperature, UV-flux density, etc. These observations provide key information on the final evolution of the most massive stars collapsing into black holes, with the potential of probing the epoch of the formation of the first (very massive) stars. VLT/X-shooter (Vernet et al. 2011, A&A 536, A105) is in many ways the ideal GRB follow-up instrument, and indeed GRB follow-up was one of the primary science cases behind the instrument design and implementation. Due to the wide wavelength coverage of X-shooter, in the same observation one can detect molecular H2 absorption near the atmospheric cut-off and many strong emission lines from the host galaxy in the near-infrared (e.g., Friis et al. 2015, MNRAS 451, 167). For example, we have measured a metallicity of 0.1 Z⊙ for GRB 100219A at z = 4.67 (Thöne et al. 2013, MNRAS 428, 3590), 0.02 Z⊙ for GRB 111008A at z = 4.99 (Sparre et al. 2014, ApJ 785, 150) and 0.05 Z⊙ for GRB 130606A at z = 5.91 (Hartoog et al. 2015, A&A 580, 139). In the latter, the very high value of [Al/Fe] = 2.40 +/- 0.78 might be due to a proton capture process and may be a signature of a previous generation of massive (perhaps even the first) stars.
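
    For reference, the absolute metallicities quoted above convert to the logarithmic [M/H] scale as log10(Z/Z_sun); a quick check:

      # Converting the quoted absolute metallicities to [M/H] = log10(Z / Z_sun),
      # the logarithmic notation commonly used for afterglow abundance work.
      import math

      for grb, z_frac in [("GRB 100219A", 0.1), ("GRB 111008A", 0.02), ("GRB 130606A", 0.05)]:
          print(f"{grb}: [M/H] = {math.log10(z_frac):+.1f}")  # -1.0, -1.7, -1.3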

  11. VLT/X-shooter GRBs: Individual extinction curves of star-forming regions

    NASA Astrophysics Data System (ADS)

    Zafar, T.; Watson, D.; Møller, P.; Selsing, J.; Fynbo, J. P. U.; Schady, P.; Wiersema, K.; Levan, A. J.; Heintz, K. E.; Postigo, A. de Ugarte; D'Elia, V.; Jakobsson, P.; Bolmer, J.; Japelj, J.; Covino, S.; Gomboc, A.; Cano, Z.

    2018-05-01

    The extinction profiles in Gamma-Ray Burst (GRB) afterglow spectral energy distributions (SEDs) are usually described by the Small Magellanic Cloud (SMC)-type extinction curve. In different empirical extinction laws, the total-to-selective extinction, R_V, is an important quantity because of its relation to dust grain sizes and compositions. We here analyse a sample of 17 GRBs (redshifts from z = 0.34 upwards) observed with the VLT/X-shooter instrument, giving us an opportunity to fit individual extinction curves of GRBs for the first time. Our sample is compiled on the basis that multi-band photometry is available around the X-shooter observations. The X-shooter data are combined with the Swift X-ray data, and a single or broken power-law together with a parametric extinction law is used to model the individual SEDs. We find 10 cases with significant dust, where the derived extinction, A_V, ranges from 0.1-1.0 mag. In four of those, the inferred extinction curves are consistent with the SMC curve. The GRB individual extinction curves have a flat R_V distribution with an optimal weighted combined value of R_V = 2.61 ± 0.08 (for seven broad coverage cases). The 'average GRB extinction curve' is similar to, but slightly steeper than, the typical SMC curve, and consistent with the SMC Bar extinction curve at ~95% confidence level. The resultant steeper extinction curves imply populations of small grains, where large dust grains may be destroyed due to GRB activity. Another possibility could be that the young age and/or lower metallicities of GRB environments are responsible for the steeper curves.
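
    An "optimal weighted combined value" of R_V is conventionally an inverse-variance weighted mean. The sketch below shows the method with seven made-up (R_V, sigma) pairs; only the procedure, not the numbers, mirrors the paper.

      # Inverse-variance weighted mean, the usual way individual R_V measurements
      # are combined into a single value.  The seven (R_V, sigma) pairs are
      # invented for illustration.
      import numpy as np

      rv    = np.array([2.5, 2.7, 2.6, 2.4, 2.8, 2.6, 2.7])
      sigma = np.array([0.3, 0.2, 0.25, 0.4, 0.3, 0.2, 0.35])

      w = 1.0 / sigma**2
      rv_comb = np.sum(w * rv) / np.sum(w)
      rv_err  = np.sqrt(1.0 / np.sum(w))
      print(f"R_V = {rv_comb:.2f} +/- {rv_err:.2f}")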

  12. X-shooter Finds an Extremely Primitive Star

    NASA Astrophysics Data System (ADS)

    Caffau, E.; Bonifacio, P.; François, P.; Sbordone, L.; Monaco, L.; Spite, M.; Spite, F.; Ludwig, H.-G.; Cayrel, R.; Zaggia, S.; Hammer, F.; Randich, S.; Molaro, P.; Hill, V.

    2011-12-01

    Low-mass extremely metal-poor (EMP) stars hold the fossil record of the chemical composition of the early phases of the Universe in their atmospheres. Chemical analysis of such objects provides important constraints on these early phases. EMP stars are rather rare objects: to dig them out, large amounts of data have to be considered. We have analysed stars from the Sloan Digital Sky Survey using an automatic procedure and selected a sample of good candidate EMP stars, which we observed with the spectrographs X-shooter and UVES. We could confirm the low metallicity of our sample of stars, and we succeeded in finding a record metal-poor star.

  13. The First X-shooter Observations of Jets from Young Stars

    NASA Astrophysics Data System (ADS)

    Bacciotti, F.; Whelan, E. T.; Alcalá, J. M.; Nisini, B.; Podio, L.; Randich, S.; Stelzer, B.; Cupani, G.

    2011-08-01

    We present the first pilot study of jets from young stars conducted with X-shooter on the ESO/Very Large Telescope. As it offers simultaneous, high-quality spectra in the range 300-2500 nm, X-shooter is uniquely important for spectral diagnostics in jet studies. We chose to probe the accretion/ejection mechanisms at low stellar masses, examining two targets with well-resolved continuous jets lying on the plane of the sky: ESO-Hα 574 in Chamaeleon I and Par-Lup3-4 in Lupus III. The mass of the latter is close to the sub-stellar boundary (M* = 0.13 M_sun). A large number of emission lines probing regions of different excitation are identified, position-velocity diagrams are presented, and mass outflow/accretion rates are estimated. Comparison between the two objects is striking. ESO-Hα 574 is a weakly accreting star for which we estimate a mass accretion rate of log(Mdot_acc) = -10.8 +/- 0.5 (in M_sun yr^-1), yet it drives a powerful jet with Mdot_out ~ 1.5-2.7 × 10^-9 M_sun yr^-1. These values can be reconciled with a magneto-centrifugal jet acceleration mechanism assuming that the presence of the edge-on disk severely depresses the luminosity of the accretion tracers. In comparison, Par-Lup3-4, with stronger mass accretion (log(Mdot_acc) = -9.1 +/- 0.4, in M_sun yr^-1), drives a low-excitation jet with about Mdot_out ~ 3.2 × 10^-10 M_sun yr^-1 in both lobes. Despite the low stellar mass, Mdot_out/Mdot_acc for Par-Lup3-4 is at the upper limit of the range usually measured for young objects, but still compatible with a steady magneto-centrifugal wind scenario if all uncertainties are considered. Based on Observations collected with X-shooter at the Very Large Telescope on Cerro Paranal (Chile), operated by the European Southern Observatory (ESO). Program ID: 085.C-0238(A).

  14. Hunting for brown dwarf binaries with X-Shooter

    NASA Astrophysics Data System (ADS)

    Manjavacas, E.; Goldman, B.; Alcalá, J. M.; Zapatero-Osorio, M. R.; Béjar, B. J. S.; Homeier, D.; Bonnefoy, M.; Smart, R. L.; Henning, T.; Allard, F.

    2015-05-01

    The refinement of the brown dwarf binary fraction may contribute to the understanding of substellar formation mechanisms. Peculiar brown dwarf spectra, or a discrepancy between the optical and near-infrared spectral type classifications of a brown dwarf, may indicate an unresolved brown dwarf binary system. We obtained medium-resolution spectra of 22 brown dwarfs that are potential binary candidates using X-Shooter at the VLT. We aimed to select brown dwarf binary candidates, and we also tested whether the BT-Settl 2014 atmospheric models reproduce the physics in the atmospheres of these objects. To find spectral binaries whose components have different spectral types, we used spectral indices, and we compared the selected candidates to single spectra and to compositions of two single spectra from libraries, trying to reproduce our X-Shooter spectra. We also created artificial binaries with components of the same spectral class and tried to find them using the same method as for brown dwarf binaries with different spectral types. We compared our spectra to the BT-Settl 2014 models. We selected six possible candidates to be combinations of L plus T brown dwarfs. All candidates, except one, are better reproduced by a combination of two single brown dwarf spectra than by a single spectrum; the one-sided F-test discarded that object as a binary candidate. We found that we are not able to recover the artificial binaries with components of the same spectral type using the same method used for the L plus T brown dwarfs. Best matches to models gave effective temperatures between 950 K and 1900 K and surface gravities (log g) between 4.0 and 5.5. Some best matches corresponded to supersolar metallicity.
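
    The single-versus-composite comparison and the one-sided F-test mentioned above can be illustrated with synthetic spectra: fit the target with one template, then with a sum of two, and test whether the drop in chi-square is significant. Everything below is a self-contained toy, not the paper's templates or spectral indices.

      # Fit a target spectrum with one template and with a two-template composite,
      # then compare the chi-square values with a one-sided F-test for nested models.
      import numpy as np
      from scipy import stats, optimize

      rng = np.random.default_rng(0)
      wave = np.linspace(1.0, 2.4, 200)                 # microns
      t_L  = 1.0 + 0.5 * np.sin(3 * wave)               # stand-in "L dwarf" template
      t_T  = 0.8 + 0.4 * np.cos(5 * wave)               # stand-in "T dwarf" template
      target = 0.6 * t_L + 0.4 * t_T + rng.normal(0, 0.02, wave.size)

      def chi2(model):
          return np.sum(((target - model) / 0.02) ** 2)

      a1, _ = optimize.nnls(t_L[:, None], target)                   # single-template fit
      a2, _ = optimize.nnls(np.column_stack([t_L, t_T]), target)    # composite fit

      n = wave.size
      chi2_single = chi2(t_L[:, None] @ a1)
      chi2_comp   = chi2(np.column_stack([t_L, t_T]) @ a2)
      F = (chi2_single - chi2_comp) / (chi2_comp / (n - 2))
      p = stats.f.sf(F, 1, n - 2)
      print(f"composite preferred at p = {p:.3g}")  # small p favours the binary model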

  15. Observing metal-poor stars with X-Shooter

    NASA Astrophysics Data System (ADS)

    Caffau, E.; Bonifacio, P.; Sbordone, L.; Monaco, L.; François, P.

    The extremely metal-poor (EMP) stars hold in their atmospheres the fossil record of the chemical composition of the early phases of Galactic evolution. The chemical analysis of such objects provides important constraints on these early phases. EMP stars are very rare objects; to dig them out, large amounts of data have to be considered. With an automatic procedure, we analysed objects with the colours of turn-off stars from the Sloan Digital Sky Survey to select a sample of good candidate EMP stars. During the French-Italian GTO of the spectrograph X-Shooter, we observed a sample of these candidates. We could confirm the low metallicity of our sample of stars, and we succeeded in finding a record metal-poor star.

  16. Carbon stars in the X-Shooter Spectral Library

    NASA Astrophysics Data System (ADS)

    Gonneau, A.; Lançon, A.; Trager, S. C.; Aringer, B.; Lyubenova, M.; Nowotny, W.; Peletier, R. F.; Prugniel, P.; Chen, Y.-P.; Dries, M.; Choudhury, O. S.; Falcón-Barroso, J.; Koleva, M.; Meneses-Goytia, S.; Sánchez-Blázquez, P.; Vazdekis, A.

    2016-05-01

    We provide a new collection of spectra of 35 carbon stars obtained with the ESO/VLT X-Shooter instrument as part of the X-Shooter Spectral Library project. The spectra extend from 0.3 μm to 2.4 μm with a resolving power above ~8000. The sample contains stars with a broad range of (J - K) color and pulsation properties, located in the Milky Way and the Magellanic Clouds. We show that the distribution of spectral properties of carbon stars at a given (J - K) color becomes bimodal (in our sample) when (J - K) is larger than about 1.5. We describe the two families of spectra that emerge, characterized by the presence or absence of the absorption feature at 1.53 μm, generally associated with HCN and C2H2. This feature appears essentially only in large-amplitude variables, though not in all observations. Associated spectral signatures, which we interpret as the result of veiling by circumstellar matter, indicate that the 1.53 μm feature might point to episodes of dust production in carbon-rich Miras. Based on observations collected at the European Southern Observatory, Paranal, Chile, Prog. ID 084.B-0869(A/B), 085.B-0751(A/B), 189.B-0925(A/B/C/D). Tables 1, B.1, E.1, E.2 are also available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/589/A36. The reduced spectra are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/589/A36.

  17. Stellar parameters of Be stars observed with X-shooter

    NASA Astrophysics Data System (ADS)

    Shokry, A.; Rivinius, Th.; Mehner, A.; Martayan, C.; Hummel, W.; Townsend, R. H. D.; Mérand, A.; Mota, B.; Faes, D. M.; Hamdy, M. A.; Beheary, M. M.; Gadallah, K. A. K.; Abo-Elazm, M. S.

    2018-01-01

    Aims: The X-shooter archive of several thousand telluric standard star spectra was skimmed for Be and Be shell stars to derive the stellar fundamental parameters and statistical properties, in particular for the less investigated late-type Be stars and the extension of the Be phenomenon into early A stars. Methods: An adapted version of the BCD method is used, using the Balmer discontinuity parameters to determine effective temperature and surface gravity. This method is optimally suited for late B stars. The projected rotational velocity was obtained by profile fitting to the Mg ii lines of the targets, and the spectra were inspected visually for the presence of peculiar features such as the infrared Ca ii triplet or the presence of a double Balmer discontinuity. The Balmer line equivalent widths were measured, but they are only useful for determining the pure emission contribution in a subsample of Be stars owing to uncertainties in determining the photospheric contribution. Results: A total of 78 mostly late-type Be stars were identified in the X-shooter telluric standard star archive, out of which 48 had not been reported before. We confirm the general trend that late-type Be stars have more tenuous disks and are less variable than early-type Be stars. The relatively large number (48) of relatively bright (V > 8.5) additional Be stars casts some doubt on the statistics of late-type Be stars: they are more common than currently thought. The Be/B star fraction may not strongly depend on spectral subtype. Based on observations made with ESO Telescopes at the La Silla Paranal Observatory under program IDs 60.A-9022, 60.A-9024, 077.D-0085, 085.A-0962, 185.D-0056, 091.B-0900, and 093.D-0415. Table 6 is only available at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/609/A108

  18. CASSOWARY20: a wide separation Einstein Cross identified with the X-shooter spectrograph

    NASA Astrophysics Data System (ADS)

    Pettini, Max; Christensen, Lise; D'Odorico, Sandro; Belokurov, Vasily; Evans, N. Wyn; Hewett, Paul C.; Koposov, Sergey; Mason, Elena; Vernet, Joël

    2010-03-01

    We have used spectra obtained with X-shooter, the triple arm optical-infrared spectrograph recently commissioned on the Very Large Telescope of the European Southern Observatory, to confirm the gravitational lens nature of the CAmbridge Sloan Survey Of Wide ARcs in the skY (CASSOWARY) candidate CSWA20. This system consists of a luminous red galaxy at redshift z_abs = 0.741, with a very high velocity dispersion, σ_lens ≈ 500 km s^-1, which lenses a blue star-forming galaxy at z_em = 1.433 into four images with a mean separation of ~6 arcsec. The source shares many of its properties with those of UV-selected galaxies at z = 2-3: it is forming stars at a rate SFR ≈ 25 M_solar yr^-1, has a metallicity of ~1/4 solar and shows nebular emission from two components separated by 0.4 arcsec (in the image plane), possibly indicating a merger. It appears that foreground interstellar material within the galaxy has been evacuated from the sightline along which we observe the starburst, giving an unextinguished view of its stars and HII regions. CSWA20, with its massive lensing galaxy producing a high magnification of an intrinsically luminous background galaxy, is a promising target for future studies at a variety of wavelengths. Based on public data from the X-shooter commissioning observations collected at the European Southern Observatory VLT/Melipal telescope, Paranal, Chile.
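
    As a plausibility check on the quoted numbers: for a singular isothermal sphere the Einstein radius is theta_E = 4*pi*(sigma/c)^2 * (D_ls/D_s), and the image separation is roughly twice that. The distance ratio below is an assumed round value for these redshifts, not one computed from a cosmology.

      # SIS lens consistency check: does sigma ~ 500 km/s give a ~6 arcsec separation?
      import math

      sigma = 500e3            # lens velocity dispersion, m/s
      c = 2.998e8              # speed of light, m/s
      d_ratio = 0.4            # assumed D_ls / D_s for z_lens = 0.741, z_source = 1.433

      theta_e = 4 * math.pi * (sigma / c) ** 2 * d_ratio          # radians
      arcsec = math.degrees(theta_e) * 3600
      print(f"theta_E ~ {arcsec:.1f} arcsec, separation ~ {2 * arcsec:.1f} arcsec")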

  19. GRB host galaxies with VLT/X-Shooter: properties at 0.8 < z < 1.3

    NASA Astrophysics Data System (ADS)

    Piranomonte, S.; Japelj, J.; Vergani, S. D.; Savaglio, S.; Palazzi, E.; Covino, S.; Flores, H.; Goldoni, P.; Cupani, G.; Krühler, T.; Mannucci, F.; Onori, F.; Rossi, A.; D'Elia, V.; Pian, E.; D'Avanzo, P.; Gomboc, A.; Hammer, F.; Randich, S.; Fiore, F.; Stella, L.; Tagliaferri, G.

    2015-10-01

    Long gamma-ray bursts (LGRBs) are associated with the death of massive stars. Their host galaxies therefore represent a unique class of objects tracing star formation across the observable Universe. Indeed, recently accumulated evidence shows that GRB hosts do not differ substantially from the general population of galaxies at high (z > 2) redshifts. However, it has long been recognized that the properties of z < 1.5 hosts, compared to the general star-forming population, are unusual. To better understand the reasons for the supposed difference in LGRB host properties at z < 1.5, we obtained Very Large Telescope (VLT)/X-Shooter spectra of six hosts lying in the redshift range 0.8 < z < 1.3. Some of these hosts have been observed before, yet we still lack well-constrained information on their characteristics such as metallicity, dust extinction and star formation rate (SFR). We search for emission lines in the VLT/X-Shooter spectra of the hosts and measure their fluxes. We perform a detailed analysis, estimating host average extinction, SFRs, metallicities and electron densities where possible. Measured quantities of our hosts are compared to a larger sample of previously observed GRB hosts at z < 2. SFRs and metallicities are measured for all the hosts analysed in this paper, and metallicities are well determined for four hosts. The mass-metallicity relation, the fundamental metallicity relation and the SFRs derived from our hosts occupy a similar parameter space as other host galaxies investigated so far at the same redshift. We therefore conclude that the GRB hosts in our sample support the reported discrepancy between the properties of low-redshift GRB hosts and the general population of star-forming galaxies.

  20. VizieR Online Data Catalog: X-Shooter spectroscopy of YSOs in Lupus (Frasca+, 2017)

    NASA Astrophysics Data System (ADS)

    Frasca, A.; Biazzo, K.; Alcala, J. M.; Manara, C. F.; Stelzer, B.; Covino, E.; Antoniucci, S.

    2017-03-01

    Membership, atmospheric parameters (Teff, log g, and [Fe/H]), radial velocity (RV), projected rotational velocity (vsini) and veiling at five wavelengths are listed for 102 Lupus YSO candidates in Table 1. Mass and age are also reported in Table 1 for the members, with the exception of subluminous sources. Table 2 reports the full width at 10% maximum of the Hα line and the fluxes in the Hα, Hβ, Ca II IRT, Ca II K, and Na I D1,2 lines. Table 3 reports the fluxes for Paγ, Paβ, and Brγ measured in the NIR X-Shooter spectra. (3 data files).

  21. X-Shooter study of accretion in Chamaeleon I

    NASA Astrophysics Data System (ADS)

    Manara, C. F.; Fedele, D.; Herczeg, G. J.; Teixeira, P. S.

    2016-01-01

    We present the analysis of 34 new VLT/X-Shooter spectra of young stellar objects in the Chamaeleon I star-forming region, together with four more spectra of stars in Taurus and two in Chamaeleon II. The broad wavelength coverage and accurate flux calibration of our spectra allow us to estimate stellar and accretion parameters for our targets by fitting the photospheric and accretion continuum emission from the Balmer continuum down to ~700 nm. The dependence of accretion on stellar properties for this sample is consistent with previous results from the literature. The accretion rates for transitional disks are consistent with those of full disks in the same region. The spread of mass accretion rates at any given stellar mass is found to be smaller than in many studies, but is larger than that derived in the Lupus clouds using similar data and techniques. Differences in the stellar mass range and in the environmental conditions between our sample and that of Lupus may account for the discrepancy in scatter between Chamaeleon I and Lupus. Complete samples in Chamaeleon I and Lupus are needed to determine whether the difference in scatter of accretion rates and the lack of evolutionary trends are influenced by sample selection. This work is based on observations made with ESO Telescopes at the Paranal Observatory under programme ID 084.C-1095 and 094.C-0913.

  22. The Eta Carinae Homunculus in Full 3D with X-Shooter and Shape

    NASA Technical Reports Server (NTRS)

    Steffen, Wolfgang; Teodoro, Mairan; Madura, Thomas I.; Groh, Jose H.; Gull, Theodore R.; Mehner, Andrea; Corcoran, Michael F.; Damineli, Augusto; Hamaguchi, Kenji

    2014-01-01

    Massive stars like Eta Carinae are extremely rare in comparison to stars such as the Sun, and currently we know of only a handful of stars with masses of more than 100 solar masses in the Milky Way. Such massive stars were much more frequent in the early history of the Universe and had a huge impact on its evolution. Even among this elite club, Eta Car is outstanding, in particular because of its giant eruption around 1840 that produced the beautiful bipolar nebula now known as the Homunculus. In this study, we used detailed spatio-kinematic information obtained from X-shooter spectra to reconstruct the 3D structure of the Homunculus. The small-scale features suggest that the central massive binary played a significant role in shaping the Homunculus.

  23. X-shooter observations of low-mass stars in the η Chamaeleontis association

    NASA Astrophysics Data System (ADS)

    Rugel, Michael; Fedele, Davide; Herczeg, Gregory

    2018-01-01

    The nearby η Chamaeleontis association is a collection of 4-10 Myr old stars with a disk fraction of 35-45%. In this study, the broad wavelength coverage of VLT/X-shooter is used to measure the stellar and mass accretion properties of 15 low-mass stars in the η Chamaeleontis association. For each star, the observed spectrum is fitted with a non-accreting stellar template and an accretion spectrum obtained from assuming a plane-parallel hydrogen slab. Five of the eight stars with an IR disk excess show excess UV emission, indicating ongoing accretion. The accretion rates measured here are similar to those obtained from previous measurements of excess UV emission, but tend to be higher than past measurements from Hα modeling. The mass accretion rates are consistent with those of other young star forming regions. This work is based on observations made with ESO Telescopes at the Paranal Observatory under program ID 084.C-1095.
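
    The fit described above is linear in the two scale factors once the template and slab shapes are fixed. A minimal sketch with crude stand-in shapes (the real analysis uses model photospheres and a physical hydrogen-slab spectrum):

      # Observed flux modeled as a * (non-accreting template) + b * (slab continuum),
      # solved for a and b by linear least squares.  Shapes here are crude fakes.
      import numpy as np

      wave = np.linspace(330, 700, 300)                      # nm
      template = 1.0 + (wave - 330) / 370                    # fake photosphere (rises to the red)
      slab = np.exp(-(wave - 330) / 150)                     # fake slab (strong in the blue)
      observed = 0.9 * template + 0.3 * slab + np.random.default_rng(1).normal(0, 0.01, wave.size)

      A = np.column_stack([template, slab])
      (a, b), *_ = np.linalg.lstsq(A, observed, rcond=None)
      print(f"photosphere scale a = {a:.2f}, slab scale b = {b:.2f}")  # UV excess => accretion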

  24. The VLT LBG redshift survey - VI. Mapping H I in the proximity of z ~ 3 LBGs with X-Shooter

    NASA Astrophysics Data System (ADS)

    Bielby, R. M.; Shanks, T.; Crighton, N. H. M.; Bornancini, C. G.; Infante, L.; Lambas, D. G.; Minniti, D.; Morris, S. L.; Tummuangpak, P.

    2017-10-01

    We present an analysis of the spatial distribution and dynamics of neutral hydrogen gas around galaxies using new X-Shooter observations of z ~ 2.5-4 quasars. Adding the X-Shooter data to our existing data set of high-resolution quasar spectroscopy, we use a total sample of 29 quasars alongside ~1700 Lyman Break Galaxies (LBGs) in the redshift range 2 ≲ z ≲ 3.5. We measure the Lyα forest auto-correlation function, finding a clustering length of s0 = 0.081 ± 0.006 h^-1 Mpc, and the cross-correlation function with LBGs, finding a cross-clustering length of s0 = 0.27 ± 0.14 h^-1 Mpc and power-law slope γ = 1.1 ± 0.2. Our results highlight the weakly clustered nature of neutral hydrogen systems in the Lyα forest. Building on this, we make a first analysis of the dependence of the clustering on absorber strength, finding a clear preference for stronger Lyα forest absorption features to be more strongly clustered around the galaxy population, suggesting that they trace on average higher mass haloes. Using the projected and 2-D cross-correlation functions, we constrain the dynamics of Lyα forest clouds around z ~ 3 galaxies. We find a significant detection of large-scale infall of neutral hydrogen, with a constraint on the Lyα forest infall parameter of βF = 1.02 ± 0.22.
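
    The quoted fits parametrize a power-law correlation function xi(s) = (s/s0)^(-gamma). Evaluating both at a common separation makes the contrast in clustering strength concrete; note that the slope for the auto-correlation is assumed here to equal the cross-correlation value, since the abstract quotes gamma only for the latter.

      # Power-law correlation function with the quoted clustering lengths.
      def xi(s, s0, gamma):
          return (s / s0) ** (-gamma)

      s = 1.0  # separation in h^-1 Mpc
      print("forest auto :", xi(s, s0=0.081, gamma=1.1))   # gamma assumed equal to the cross fit
      print("forest x LBG:", xi(s, s0=0.27,  gamma=1.1))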

  25. Search with UVES and X-Shooter for signatures of the low-mass secondary in the post common-envelope binary AA Doradus

    NASA Astrophysics Data System (ADS)

    Hoyer, D.; Rauch, T.; Werner, K.; Hauschildt, P. H.; Kruk, J. W.

    2015-06-01

    Context. AA Dor is a close, totally eclipsing, post common-envelope binary with an sdOB-type primary star and an extremely low-mass secondary star, located close to the mass limit of stable central hydrogen burning. Within error limits, it may either be a brown dwarf or a late M-type dwarf. Aims: We aim to extract the secondary's contribution to the phase-dependent composite spectra. The spectrum and identified lines of the secondary will decide its nature. Methods: In January 2014, we measured the phase-dependent spectrum of AA Dor with X-Shooter over one complete orbital period. Since the secondary's rotation is presumably synchronized with the orbital period, its surface strictly divides into a day and a night side. Therefore, we may obtain the spectrum of its cool side during its transit and of its hot, irradiated side close to its occultation. We developed the Virtual Observatory (VO) tool TLISA to search for weak lines of a faint companion in a binary system. We successfully applied it to the observations of AA Dor. Results: We identified 53 spectral lines of the secondary in the ultraviolet-blue, visual, and near-infrared X-Shooter spectra that are strongest close to its occultation. We identified 57 (20 additional) lines in available Ultraviolet and Visual Echelle Spectrograph (UVES) spectra from 2001. The lines are mostly from C ii-iii and O ii, typical for a low-mass star that is irradiated and heated by the primary. We verified the orbital period of P = 22 597.033201 ± 0.00007 s and determined the orbital velocity K_sec = 232.9 +16.6/-6.5 km s^-1 of the secondary. The mass of the secondary is M_sec = 0.081 +0.018/-0.010 M_⊙ and, hence, it is not possible to reliably determine whether it is a brown dwarf or an M-type dwarf. Conclusions: Although we identified many emission lines from the secondary's irradiated surface, the resolution and signal-to-noise ratio of our UVES and X-Shooter spectra are not good enough to extract a good spectrum of the secondary.
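
    The period and secondary velocity amplitude quoted above define the spectroscopic mass function, which for an eclipsing system (sin i ≈ 1) constrains the primary mass via f(M) = M_pri^3 / (M_pri + M_sec)^2. A short check in Python using only the quoted values:

      # Spectroscopic mass function f(M) = P * K_sec**3 / (2*pi*G).
      import math

      G = 6.674e-11            # m^3 kg^-1 s^-2
      M_SUN = 1.989e30         # kg
      P = 22597.033201         # orbital period, s
      K_sec = 232.9e3          # secondary velocity amplitude, m/s

      f_m = P * K_sec**3 / (2 * math.pi * G)
      print(f"f(M_pri) = {f_m / M_SUN:.3f} M_sun")
      # ~0.34 M_sun; with M_sec = 0.081 M_sun this is consistent with an sdOB-mass primary.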

  26. Development of the Simbol-X science verification model and its contribution for the IXO Mission

    NASA Astrophysics Data System (ADS)

    Maier, Daniel; Aschauer, Florian; Dick, Jürgen; Distratis, Giuseppe; Gebhardt, Henry; Herrmann, Sven; Kendziorra, Eckhard; Lauf, Thomas; Lechner, Peter; Santangelo, Andrea; Schanz, Thomas; Strüder, Lothar; Tenzer, Chris; Treis, Johannes

    2010-07-01

    Like the International X-ray Observatory (IXO) mission, the Simbol-X mission is a projected X-ray space telescope with spectral and imaging capabilities covering the energy range from 500 eV up to 80 keV. To detect photons within this wide range of energies, a silicon based "Depleted P-channel Field Effect Transistor" (DePFET)- matrix is used as the Low Energy Detector (LED) on top of an array of CdTe-Caliste modules, which act as the High Energy Detector (HED). A Science Verification Model (SVM) consisting of one LED quadrant in front of one Caliste module will be set up at our institute (IAAT) and operated under laboratory conditions that approximate the expected environment in space. As a first step we use the SVM to test and optimize the performance of the LED operation and data acquisition chain, consisting of an ADC, an event-preprocessor, a sequencer, and an interface controller. All these components have been developed at our institute with the objective to handle the high readout rate of approximately 8000 frames per second. The second step is to study the behaviour and the interactions of LED and HED operating as a combined detector system. We report on the development status of the SVM and its associated electronics and present first results of the currently achieved spectral performance.

  27. Aiming routines and their electrocortical concomitants among competitive rifle shooters.

    PubMed

    Konttinen, N; Landers, D M; Lyytinen, H

    2000-06-01

    The present study focused on an examination of competitive shooters' aiming process during a rifle shooting task. The barrel movements of the rifle, as detected by a laser system during the last 1000-ms time period preceding the triggering, were recorded from six elite and six pre-elite shooters. Electrocortical slow potentials (SPs) from frontal (Fz), centro-lateral (C3, C4), and occipital (Oz) brain areas were recorded to gain additional insight into the underlying covert processing. The results suggested that the elite shooters did not pull the trigger until they reached a sustained rifle position. In the pre-elite shooters the rifle appeared to be in a less stable position, and their strategy was to take advantage of the first appropriate moment of steadiness, without a sustained rifle position, so they could pull the trigger. The observed pre-trigger readiness potential (RP) shifts at Fz and Oz were more positive among the elite shooters relative to the pre-elite shooters, reflecting their more pronounced covert effort, rather than increasing preparedness for the trigger pull. The present study lends support to the view that a successful aiming strategy is mainly based on sustained rifle balancing. With regard to the brain slow potentials, it can be concluded that the RP shift does not specifically reflect the preparation for the trigger pull.

  28. Preliminary report for using X-rays as verification and authentication tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Esch, Ernst Ingo; Desimone, David J.; Lakis, Rollin Evan

    2016-04-06

    We examined x-rays for use as an authentication and verification tool in treaty verification. Several x-ray pictures were taken to determine the quality and feasibility of x-rays for these tasks. This document describes the capabilities of the x-ray system used and outlines its parameters and possible uses.

  29. 2. Photocopy of photograph. View, looking southeast, showing Shooters Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Photocopy of photograph. View, looking southeast, showing Shooters Island with standard ship building plant in operation. Circa 1930 by Airmap Corporation of America. (Original in Staten Island Historic Society, Staten Island, New York) - Shooters Island, Ships Graveyard, Newark Bay, Staten Island (subdivision), Richmond County, NY

  30. Awareness and Understanding of a College Active Shooter Crisis Plan

    ERIC Educational Resources Information Center

    Williams, Christopher Brian

    2017-01-01

    Gun violence on college campuses has gained the attention of campus leaders, leading to active shooter policy and procedure development and implementation. There was little awareness within the campus leadership of a college in the Southeast United States of the college's active shooter policy and procedures. Guided by Coombs' crisis management…

  31. Exploring Ohio Police Preparedness for Active Shooter Incidents in Public Schools

    ERIC Educational Resources Information Center

    Pignatelli, Daniel A.

    2010-01-01

    School shootings, such as Columbine, have prompted police executives to explore response tactics and preparedness efforts for combating active shooters. This qualitative exploratory case study focused on specific preparation initiatives that have been implemented for the purpose of dealing with active shooters. Being prepared is one of the only…

  32. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  33. A Mini-BAL Outflow at 900 pc from the Central Source: VLT/X-shooter Observations

    NASA Astrophysics Data System (ADS)

    Xu, Xinfeng; Arav, Nahum; Miller, Timothy; Benn, Chris

    2018-05-01

    We determine the physical conditions and location of the outflow material seen in the mini-BAL quasar SDSS J1111+1437 (z = 2.138). These results are based on the analysis of a high S/N, medium-resolution VLT/X-shooter spectrum. The main outflow component spans the velocity range -1500 to -3000 km s^-1 and has detected absorption troughs from both high-ionization species: C IV, N V, O VI, Si IV, P V, and S IV; and low-ionization species: H I, C II, Mg II, Al II, Al III, Si II, and Si III. Measurements of these troughs allow us to derive an accurate photoionization solution for this absorption component: a hydrogen column density log(N_H) = 21.47 +0.21/-0.27 cm^-2 and ionization parameter log(U_H) = -1.23 +0.20/-0.25. Troughs produced from the ground and excited states of S IV, combined with the derived U_H value, allow us to determine an electron number density of log(n_e) = 3.62 +0.09/-0.11 cm^-3 and to obtain the distance of the ionized gas from the central source: R = 880 +210/-260 pc.
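
    The distance quoted above follows from the definition of the ionization parameter, U_H = Q_H / (4*pi*R^2*n_H*c), solved for R. The sketch below uses the quoted log(U_H) and log(n_e) but an assumed round value for the ionizing photon rate Q_H, so it only roughly reproduces the published distance.

      # Outflow distance from the ionization parameter definition.
      import math

      Q_H = 1e57               # hydrogen-ionizing photons per second (assumed, not from the paper)
      log_U = -1.23            # quoted ionization parameter
      log_ne = 3.62            # quoted electron density; n_H ~ n_e for ionized gas
      c = 2.998e10             # cm/s

      U = 10**log_U
      n_H = 10**log_ne
      R_cm = math.sqrt(Q_H / (4 * math.pi * U * n_H * c))
      print(f"R = {R_cm / 3.086e18:.0f} pc")  # ~1000 pc with the assumed Q_H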

  34. The detection of gunshot residues in the nasal mucus of suspected shooters.

    PubMed

    Merli, Daniele; Brandone, Alberto; Amadasi, Alberto; Cattaneo, Cristina; Profumo, Antonella

    2016-07-01

    The identification and quantification of metallic residues produced by gunshots, called gunshot residues (GSR), provide crucial elements in forensic investigations. Research has largely focused on their collection from the hands of suspected shooters, but that method is often burdened by risks of contamination. This research focused on the possibility of sampling GSR trapped inside the nasal mucus of consenting shooters. Samples of the nasal mucus of "blank" control subjects and shooters were chemically analysed by Instrumental Neutron Activation Analysis (INAA) for residues of antimony (Sb) and barium (Ba), while lead (Pb) was excluded as a ubiquitous environmental contaminant and because of the high instrumental quantification limit (IQL) of INAA for this element. Shots were fired using two types of weapons (pistols and revolvers) and different firing sequences. The mucus was sampled at different times: immediately after the shots, and after 30, 60, 120 and 180 min. Different amounts of Sb and Ba were detected between controls and shooters, demonstrating the ability of the nasal mucus to retain GSR at concentrations significantly different even from the highest basal levels. Moreover, in order to simulate actual cases, nasal mucus from five groups of shooters was sampled after different shots with the same weapon and cartridges, immediately and after 1, 3, 12, and 24 h. The highest values were always found in the first 3 h from firing, for both weapons. Interestingly, for all the weapons, significant Sb and Ba concentrations were also found up to 12 h after firing, contrary to what occurs on hands, even though a progressive decrease was detected, with values below the detection threshold only after 24 h, thus demonstrating that GSR are persistent in nasal mucus. These first results, proving that both Sb and Ba were qualitatively detectable in the nasal mucus of shooters, indicate that the chemical analysis of the nasal mucus of suspected shooters may represent a valuable complementary tool in forensic investigations.

  35. Goddard high resolution spectrograph science verification and data analysis

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The data analysis performed to support the Orbital Verification (OV) and Science Verification (SV) of the GHRS was in the areas of the Digicon detector's performance and stability, wavelength calibration, and geomagnetically induced image motion. The results of the analyses are briefly described; detailed results are given in the form of attachments. Specialized software was developed for the analyses. Calibration files were formatted according to the specifications in a Space Telescope Science report. IRAS images of the Large Magellanic Cloud were restored using a blocked iterative algorithm. The algorithm works with the raw data scans without regridding or interpolating the data on an equally spaced image grid.

  36. Information Sharing during the University of Texas at Austin Active Shooter/Suicide Event

    ERIC Educational Resources Information Center

    Egnoto, Michael J.; Griffin, Darrin J.; Svetieva, Elena; Winslow, Luke

    2016-01-01

    Emergency response systems can be improved by investigating the motives and manner in which people share information during an active shooter crisis. This article analyzed survey data collected from undergraduate participants at The University of Texas at Austin who were enrolled during the fall of 2010 when an active shooter event occurred on…

  37. System and method for bullet tracking and shooter localization

    DOEpatents

    Roberts, Randy S. [Livermore, CA]; Breitfeller, Eric F. [Dublin, CA]

    2011-06-21

    A system and method of processing infrared imagery to determine projectile trajectories and the locations of shooters with a high degree of accuracy. The method includes image processing infrared image data to reduce noise and identify streak-shaped image features, using a Kalman filter to estimate optimal projectile trajectories, updating the Kalman filter with new image data, determining projectile source locations by solving a combinatorial least-squares solution for all optimal projectile trajectories, and displaying all of the projectile source locations. Such a shooter-localization system is of great interest for military and law enforcement applications to determine sniper locations, especially in urban combat scenarios.
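
    For illustration, here is a minimal constant-velocity Kalman filter of the general kind the method describes, smoothing streak detections into a projectile track frame by frame. This is a textbook sketch, not the patented implementation.

      # Constant-velocity Kalman filter.  State: [x, y, vx, vy]; measurements: [x, y].
      import numpy as np

      dt = 1 / 30                                   # frame interval, s
      F = np.eye(4); F[0, 2] = F[1, 3] = dt         # state transition
      H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1   # measure position only
      Q = 1e-3 * np.eye(4)                          # process noise
      R = 4.0 * np.eye(2)                           # measurement noise (pixels^2)

      x = np.zeros(4)                               # initial state
      P = 100 * np.eye(4)                           # initial covariance

      for z in [np.array([10.0, 5.0]), np.array([12.1, 5.9]), np.array([14.2, 7.1])]:
          x, P = F @ x, F @ P @ F.T + Q             # predict
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
          x = x + K @ (z - H @ x)                   # update with new streak position
          P = (np.eye(4) - K @ H) @ P
      print("estimated state [x, y, vx, vy]:", x.round(2))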

  38. Narcissistic Symptoms in German School Shooters.

    PubMed

    Bondü, Rebecca; Scheithauer, Herbert

    2015-12-01

    School shooters are often described as narcissistic, but empirical evidence is scant. To provide more reliable and detailed information, we conducted an exploratory study, analyzing police investigation files on seven school shootings in Germany, looking for symptoms of narcissistic personality disorder as defined by the Diagnostic and Statistical Manual of Mental Disorders (4th ed.; DSM-IV) in witnesses' and offenders' reports and expert psychological evaluations. Three out of four offenders who had been treated for mental disorders prior to the offenses displayed detached symptoms of narcissism, but none was diagnosed with narcissistic personality disorder. Of the other three, two displayed narcissistic traits. In one case, the number of symptoms would have justified a diagnosis of narcissistic personality disorder. Offenders showed low and high self-esteem and a range of other mental disorders. Thus, narcissism is not a common characteristic of school shooters, but possibly more frequent than in the general population. This should be considered in developing adequate preventive and intervention measures. © The Author(s) 2014.

  39. Ares I-X Range Safety Simulation Verification and Analysis Independent Validation and Verification

    NASA Technical Reports Server (NTRS)

    Merry, Carl M.; Tarpley, Ashley F.; Craig, A. Scott; Tartabini, Paul V.; Brewer, Joan D.; Davis, Jerel G.; Dulski, Matthew B.; Gimenez, Adrian; Barron, M. Kyle

    2011-01-01

    NASA's Ares I-X vehicle launched on a suborbital test flight from the Eastern Range in Florida on October 28, 2009. To obtain approval for launch, a range safety final flight data package was generated to meet the data requirements defined in the Air Force Space Command Manual 91-710 Volume 2. The delivery included products such as a nominal trajectory, trajectory envelopes, stage disposal data and footprints, and a malfunction turn analysis. The Air Force's 45th Space Wing uses these products to ensure public and launch area safety. Due to the criticality of these data, an independent validation and verification effort was undertaken to ensure data quality and adherence to requirements. As a result, the product package was delivered with the confidence that independent organizations using separate simulation software generated data to meet the range requirements and yielded consistent results. This document captures the Ares I-X final flight data package verification and validation analysis, including the methodology used to validate and verify simulation inputs, execution, and results, and presents lessons learned during the process.

  40. Accretion signatures in the X-shooter spectrum of the substellar companion to SR12

    NASA Astrophysics Data System (ADS)

    Santamaría-Miranda, Alejandro; Cáceres, Claudio; Schreiber, Matthias R.; Hardy, Adam; Bayo, Amelia; Parsons, Steven G.; Gromadzki, Mariusz; Aguayo Villegas, Aurora Belén

    2018-04-01

    About a dozen substellar companions orbiting young stellar objects or pre-main sequence stars at several hundred au have been identified in the last decade. These objects are interesting both due to the uncertainties surrounding their formation, and because their large separation from the host star offers the potential to study the atmospheres of young giant planets and brown dwarfs. Here, we present X-shooter spectroscopy of SR 12 C, a ~2 Myr old brown dwarf orbiting SR 12 at a separation of 1083 au. We determine the spectral type, gravity, and effective temperature via comparison with models and observational templates of young brown dwarfs. In addition, we detect and characterize accretion using several accretion tracers. We find SR 12 C to be a brown dwarf of spectral type L0 ± 1, with log g = 4 ± 0.5 and an effective temperature of 2600 ± 100 K. Our spectra provide clear evidence for accretion at a rate of ~10^-10 M_⊙ yr^-1. This makes SR 12 C one of the few sub-stellar companions with a reliable estimate of its accretion rate. A comparison of the ages and accretion rates of sub-stellar companions with young isolated brown dwarfs does not reveal any significant differences. If further accretion rate measurements of a large number of substellar companions can confirm this trend, this would hint towards a similar formation mechanism for substellar companions at large separations and isolated brown dwarfs.
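
    Accretion-rate estimates of this kind rest on the standard conversion from accretion luminosity, Mdot = 1.25 * L_acc * R / (G * M), where the factor 1.25 assumes infall from an inner radius of 5 stellar radii. The stellar values below are assumed round numbers for a young brown dwarf, not the measured SR 12 C parameters:

      # Accretion luminosity -> mass accretion rate, in cgs units.
      G = 6.674e-8                 # cm^3 g^-1 s^-2
      M_SUN = 1.989e33             # g
      R_SUN = 6.957e10             # cm
      L_SUN = 3.828e33             # erg/s
      YEAR = 3.156e7               # s

      L_acc = 1e-4 * L_SUN         # assumed accretion luminosity
      M = 0.03 * M_SUN             # assumed mass, ~30 M_Jup
      R = 0.3 * R_SUN              # assumed radius

      mdot = 1.25 * L_acc * R / (G * M)
      print(f"Mdot ~ {mdot * YEAR / M_SUN:.1e} M_sun/yr")  # ~4e-11 with these inputs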

  41. Individual and environmental risk factors for high blood lead concentrations in Danish indoor shooters.

    PubMed

    Grandahl, Kasper; Suadicani, Poul; Jacobsen, Peter

    2012-08-01

    International studies have shown blood lead at levels causing health concern in recreational indoor shooters. We hypothesized that Danish recreational indoor shooters would also have a high level of blood lead, and that this could be explained by shooting characteristics and the physical environment at the shooting range. This was an environmental case study of 58 male and female shooters from two indoor shooting ranges with assumed different ventilation and cleaning conditions. Information was obtained on general conditions including age, gender, tobacco and alcohol use, and on shooting conditions: weapon type, number of shots fired, frequency of stays at the shooting range and hygiene habits. A venous blood sample was drawn to determine blood lead concentrations; 14 non-shooters were included as controls. Almost 60% of the shooters, including five of the 14 women, had a blood lead concentration above 0.48 micromol/l, a level causing long-term health concern. All controls had blood lead values below 0.17 micromol/l. Independent significant associations with blood lead concentrations above 0.48 micromol/l were found for shooting at a poorly ventilated range, use of heavy calibre weapons, number of shots and frequency of stays at the shooting range. A large proportion of Danish recreational indoor shooters had potentially harmful blood lead concentrations. Ventilation, amounts of shooting, use of heavy calibre weapons and stays at the shooting ranges were independently associated with increased blood lead. The technical check at the two ranges was performed by the Danish Technological Institute, and costs were defrayed by the Danish Rifle Association. To pay for the analyses of blood lead, the study was supported by The Else & Mogens Wedell-Wedellsborg Foundation. The Danish Regional Capital Scientific Ethics Committee approved the study, protocol number H-4-2010-130.

  2. Connection between jets, winds and accretion in T Tauri stars. The X-shooter view

    NASA Astrophysics Data System (ADS)

    Nisini, B.; Antoniucci, S.; Alcalá, J. M.; Giannini, T.; Manara, C. F.; Natta, A.; Fedele, D.; Biazzo, K.

    2018-01-01

    Mass loss from jets and winds is a key ingredient in the evolution of accretion discs in young stars. While slow winds have recently been studied extensively in T Tauri stars, little investigation has been devoted to the occurrence of high-velocity jets and to how the two mass-loss phenomena are connected with each other and with the disc mass accretion rates. In this framework, we have analysed the [O I]6300 Å line in a sample of 131 young stars with discs in the Lupus, Chamaeleon and σ Orionis star forming regions. The stars were observed with the X-shooter spectrograph at the Very Large Telescope and have mass accretion rates spanning from 10-12 to 10-7M⊙ yr-1. The line profile was deconvolved into a low velocity component (LVC, | Vr | < 40 km s-1) and a high velocity component (HVC, | Vr | > 40 km s-1), originating from slow winds and high velocity jets, respectively. The LVC is by far the most frequent component, with a detection rate of 77%, while only 30% of sources have a HVC. The fraction of HVC detections increases slightly (to 39%) in the sub-sample of stronger accretors (those with log (Lacc/L⊙) > -3). The [O I]6300 Å luminosity of both the LVC and HVC, when detected, correlates with stellar and accretion parameters of the central sources (i.e. L∗, M∗, Lacc, Ṁacc), with similar slopes for the two components. The line luminosity correlates better (i.e. has a lower dispersion) with the accretion luminosity than with the stellar luminosity or stellar mass. We suggest that accretion is the main driver of the line excitation and that MHD disc-winds are at the origin of both components. In the sub-sample of Lupus sources observed with ALMA, a relationship is found between the HVC peak velocity and the outer disc inclination angle, as expected if the HVC traces jets ejected perpendicularly to the disc plane. Mass ejection rates (Ṁjet) measured from the detected HVC [O I]6300 Å line luminosity span from 10-13 to 10-7M⊙ yr-1. The
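    Once the [O I]6300 Å profile has been fit with Gaussian components, the LVC/HVC split described above reduces to a simple cut on the component peak velocity. A minimal sketch of that classification step, with made-up fit results:

    ```python
    # Minimal sketch: classify fitted [O I]6300 Gaussian components into the
    # low-velocity (|Vr| < 40 km/s, slow winds) and high-velocity
    # (|Vr| > 40 km/s, jets) components used in the paper. Fit values are
    # hypothetical, not from the survey.

    components = [
        {"v_peak": -8.0,  "fwhm": 45.0},   # km/s, hypothetical fit results
        {"v_peak": -95.0, "fwhm": 60.0},
    ]

    V_CUT = 40.0  # km/s threshold separating LVC from HVC

    for comp in components:
        kind = "LVC (slow wind)" if abs(comp["v_peak"]) < V_CUT else "HVC (jet)"
        print(f"v_peak = {comp['v_peak']:+7.1f} km/s -> {kind}")
    ```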

  3. Male Adolescent Bullying and the School Shooter

    ERIC Educational Resources Information Center

    Reuter-Rice, Karin

    2008-01-01

    An extensive review of the literature reveals that adolescent male victims of peer bullying suffer somatic and emotional consequences from being victimized. Limited research on school shooters found that a significant number of them were adolescents who were targets of bullies and claimed their shootings were in response to their victimization. To…

  4. "School Shooter" Web Video Game Raises Concerns

    ERIC Educational Resources Information Center

    Rhen, Brad

    2011-01-01

    A new video game in which the player stalks and shoots fellow students and teachers in school settings is drawing fire from school district officials. "School Shooter: North American Tour 2012" is a first-person game that allows the player to move around a school and collect points by killing defenseless students and teachers. The game,…

  5. Live Scale Active Shooter Exercise: Lessons Learned

    ERIC Educational Resources Information Center

    Ervin, Randy

    2008-01-01

    On October 23, 2007, the Lake Land College Public Safety Department conducted a full-scale live exercise that simulated an active shooter and barricaded hostage. In this article, the author will emphasize what they learned, and how they intend to benefit from it. He will list the law enforcement issues and general issues they encountered, and then…

  6. The effect of an active shooter response intervention on hospital employees' response knowledge, perceived program usefulness, and perceived organizational preparedness.

    PubMed

    Landry, Gail; Zimbro, Kathie S; Morgan, Merri K; Maduro, Ralitsa S; Snyder, Tim; Sweeney, Nancy L

    2018-04-02

    Active shooter events occur frequently across the United States in a variety of locations, including health care facilities. For hospital health care workers, the response to an active shooter event may mean the difference between life and death for themselves or others. There is little research on how hospitals prepare nonmanagers to respond to active shooter events. We conducted a study to explore differences in knowledge, perceived organizational preparedness, and program utility following participation in an active shooter response program. Self-efficacy, personal characteristics, and professional characteristics were also explored. Program evaluation was conducted via a one-group pretest/posttest design. There was a significant increase in knowledge and perceived organizational preparedness postintervention. Trait-level self-efficacy did not have a significant effect on retained knowledge or perceived organizational preparedness. The current study is the first known to evaluate the efficacy of an active shooter response program for nonmanagers within an inpatient health care facility. Findings from this study may inform risk managers on how to educate employees on what to expect and how to react should an active shooter event occur. © 2018 American Society for Healthcare Risk Management of the American Hospital Association.

  7. Tell Me Why? Existential Concerns of School Shooters

    ERIC Educational Resources Information Center

    Pfeifer, Birgit; Ganzevoort, Ruard

    2017-01-01

    One of the few recurring characteristics in school shooters' stories is their expression of existential concerns. Many discuss their hatred of the world and existential loneliness in their manifestos, suicide letters, or social media updates. These expressions--called leaking--are made during the planning period preceding their deed. They are not…

  8. Spatial Rotation, Aggression, and Gender in First-Person-Shooter Video Games and Their Influence on Math Achievement

    ERIC Educational Resources Information Center

    Krone, Beth K.

    2012-01-01

    As shown by the neuropsychological educational approach to the cognitive remediation model, first-person-shooter video game play eliminates gender-related deficits in spatial rotation. Spatial rotation increases academic success and decreases social and economic disparities. Per the general aggression model, first-person-shooter video game play…

  9. Mitigating active shooter impact: Analysis for policy options based on agent/computer-based modeling.

    PubMed

    Anklam, Charles; Kirby, Adam; Sharevski, Filipo; Dietz, J Eric

    2015-01-01

    Active shooting violence in confined settings, such as educational institutions, poses serious security concerns to public safety. In studying the effects of active shooter scenarios, the common denominator associated with all events, regardless of the shooter's motive or the type of weapons used, was the location chosen and the time elapsed between the beginning of the event and its culmination. This in turn directly correlates with the number of casualties incurred in any given event: the longer the event protracts, the more casualties are incurred until law enforcement or another barrier can react and end the situation. Using AnyLogic, the authors devised modeling scenarios to test multiple hypotheses against free-agent modeling simulation to determine the best method to reduce casualties associated with active shooter scenarios. Four possible responses to an active shooter in a public school setting were tested using agent-based computer modeling techniques. Scenario 1: a basic scenario in which no access control or any other type of security is used within the school; scenario 2: concealed carry individual(s) (5-10 percent of the workforce) are present in the school; scenario 3: the school has an assigned resource officer; scenario 4: the school has both an assigned resource officer and concealed carry individual(s) (5-10 percent) present. Statistical data from the modeling scenarios indicated which tested hypothesis resulted in fewer casualties and quicker culmination of the event. The use of AnyLogic supported the initial hypothesis that a decrease in response time to an active shooter scenario directly reduces victim casualties. Modeling tests show statistically significantly fewer casualties in scenarios where on-scene armed responders, such as resource officers and concealed carry personnel, were present.
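    The core quantitative claim, that casualties scale with the time until an armed response, can be illustrated without AnyLogic. The following is a toy Monte Carlo sketch, not the authors' model: it assumes casualties accrue at a constant per-second probability until a responder (with an assumed exponentially distributed arrival time) stops the event, and all numbers are made up.

    ```python
    # Toy Monte Carlo (not the authors' AnyLogic model): illustrates the
    # reported link between armed-response time and casualties, assuming
    # casualties accrue at a constant per-second probability until stopped.
    import random

    random.seed(42)

    def mean_casualties(mean_response_s, p_casualty_per_s=0.05, trials=2000):
        total = 0
        for _ in range(trials):
            # Exponentially distributed arrival time of the first responder.
            t_stop = random.expovariate(1.0 / mean_response_s)
            total += sum(1 for _ in range(int(t_stop))
                         if random.random() < p_casualty_per_s)
        return total / trials

    for label, t in [("no on-site security", 300.0),
                     ("resource officer", 60.0),
                     ("concealed carry on scene", 30.0)]:
        print(f"{label:>24}: mean casualties ~ {mean_casualties(t):.1f}")
    ```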

  10. Body sway, aim point fluctuation and performance in rifle shooters: inter- and intra-individual analysis.

    PubMed

    Ball, Kevin A; Best, Russell J; Wrigley, Tim V

    2003-07-01

    In this study, we examined the relationships between body sway, aim point fluctuation and performance in rifle shooting on an inter- and intra-individual basis. Six elite shooters performed 20 shots under competition conditions. For each shot, body sway parameters and four aim point fluctuation parameters were quantified for the time periods 5 s to shot, 3 s to shot and 1 s to shot. Three parameters were used to indicate performance. An AMTI LG6-4 force plate was used to measure body sway parameters, while a SCATT shooting analysis system was used to measure aim point fluctuation and shooting performance. Multiple regression analysis indicated that body sway was related to performance for four shooters. Also, body sway was related to aim point fluctuation for all shooters. These relationships were specific to the individual, with the strength of association, parameters of importance and time period of importance different for different shooters. Correlation analysis of significant regressions indicated that, as body sway increased, performance decreased and aim point fluctuation increased for most relationships. We conclude that body sway and aim point fluctuation are important in elite rifle shooting and performance errors are highly individual-specific at this standard. Individual analysis should be a priority when examining elite sports performance.
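    The per-shooter multiple regression described above (shot score against body-sway parameters) can be sketched with ordinary least squares. The data below are synthetic stand-ins for force-plate and SCATT measurements, not values from the study:

    ```python
    # Minimal sketch of the per-shooter analysis: ordinary least squares
    # regressing shot score on body-sway parameters, shot by shot. The data
    # are synthetic stand-ins, not the study's measurements.
    import numpy as np

    rng = np.random.default_rng(0)
    n_shots = 20
    sway_x = rng.normal(2.0, 0.5, n_shots)   # e.g. CoP range, arbitrary units
    sway_y = rng.normal(3.0, 0.7, n_shots)
    score = 10.5 - 0.8 * sway_x - 0.4 * sway_y + rng.normal(0, 0.2, n_shots)

    X = np.column_stack([np.ones(n_shots), sway_x, sway_y])  # design matrix
    beta, *_ = np.linalg.lstsq(X, score, rcond=None)
    resid = score - X @ beta
    r2 = 1 - np.sum(resid ** 2) / np.sum((score - score.mean()) ** 2)
    print(f"intercept={beta[0]:.2f}, slopes={beta[1]:.2f}, {beta[2]:.2f}, R^2={r2:.2f}")
    ```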

  11. 75 FR 43943 - Defense Science Board; Task Force on Nuclear Treaty Monitoring and Verification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... DEPARTMENT OF DEFENSE Office of the Secretary Defense Science Board; Task Force on Nuclear Treaty... meetings. SUMMARY: The Defense Science Board Task Force on Nuclear Treaty Monitoring and Verification will... held September 13-14, and 25-26, 2010. ADDRESSES: The meetings will be held at Science Applications...

  12. Does excessive play of violent first-person-shooter-video-games dampen brain activity in response to emotional stimuli?

    PubMed

    Montag, Christian; Weber, Bernd; Trautner, Peter; Newport, Beate; Markett, Sebastian; Walter, Nora T; Felten, Andrea; Reuter, Martin

    2012-01-01

    The present case-control study investigated the processing of emotional pictures in excessive first-person-shooter players and control persons. All participants in the fMRI experiment were confronted with pictures from four categories: pleasant, unpleasant, and neutral content, and screenshots from the first-person-shooter video game 'Counterstrike'. Compared to controls, gamers showed a significantly lower activation of the left lateral medial frontal lobe while processing negative emotions. Another interesting finding of the study was the higher activation of frontal and temporal brain areas in gamers when processing screenshots from 'Counterstrike'. Higher brain activity in the lateral prefrontal cortex could represent a protection mechanism against experiencing negative emotions by down-regulating limbic brain activity. Due to frequent confrontation with violent scenes, the first-person-shooter gamers might have habituated to the effects of unpleasant stimuli, resulting in lower brain activation. Individual differences in brain activations for the contrast Counterstrike > neutral pictures potentially resemble the activation of action scripts related to the video game. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Interim Letter Report - Verification Survey of Partial Grids H19, J21, J22, X20, and X21 at the David Witherspoon, Inc. 1630 Site, Knoxville, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P.C. Weaver

    2008-03-19

    The objective was to conduct verification surveys of available grids at the David Witherspoon Incorporated 1630 Site (DWI 1630) in Knoxville, Tennessee. The IVT conducted verification activities of partial grids H19, J21, J22, X20, and X21.

  14. Brief Morning Light Exposure, Visuomotor Performance, and Biochemistry in Sport Shooters.

    PubMed

    Leichtfried, Veronika; Hanser, Friedrich; Griesmacher, Andrea; Canazei, Markus; Schobersberger, Wolfgang

    2016-09-01

    Demands on concentrative and cognitive performance are high in sport shooting and vary in a circadian pattern, driven by internal and external stimuli. The most prominent external stimulus is light. Bright light (BL) has been shown to have a certain impact on cognitive and physical performance. The aim was to evaluate the impact of a single half hour of BL exposure in the morning hours on physical and cognitive performance in 15 sport shooters. In addition, courses of 6-sulfatoxymelatonin (aMT6s), tryptophan (TRP), and kynurenine (KYN) were monitored. In a crossover design, 15 sport shooters were exposed to 30 min of BL and dim light (DL) in the early-morning hours. Shooting performance, balance, visuomotor performance, and courses of aMT6s, TRP, and KYN were evaluated. Shooting performance was 365.4 (349.7-381.0) and 368.5 (353.9-383.1), essentially the same in the two light setups. Numbers of right reactions (sustained attention) and deviations from the horizontal plane (a balance-related measure) were higher after BL. TRP concentrations decreased from 77.5 (73.5-81.4) to 66.9 (60.7-67.0) in the DL setup only. The two light conditions generated heterogeneous visuomotor and physiological effects in sport shooters. The authors therefore suggest that a single half hour of BL exposure is effective in improving cognitive aspects of performance, but not physical performance. Further research is needed to evaluate BL's impact on biochemical parameters.

  15. Gas content of transitional disks: a VLT/X-Shooter study of accretion and winds

    NASA Astrophysics Data System (ADS)

    Manara, C. F.; Testi, L.; Natta, A.; Rosotti, G.; Benisty, M.; Ercolano, B.; Ricci, L.

    2014-08-01

    Context. Transitional disks are thought to be a late evolutionary stage of protoplanetary disks whose inner regions have been depleted of dust. The mechanism responsible for this depletion is still under debate. To constrain the various models it is mandatory to have a good understanding of the properties of the gas content in the inner part of the disk. Aims: Using X-Shooter broad band - UV to near-infrared - medium-resolution spectroscopy, we derive the stellar, accretion, and wind properties of a sample of 22 transitional disks. The analysis of these properties allows us to place strong constraints on the gas content in a region very close to the star (≲0.2 AU) that is not accessible with any other observational technique. Methods: We fitted the spectra with a self-consistent procedure to simultaneously derive spectral type, extinction, and accretion properties of the targets. From the continuum excess at near-infrared wavelength we distinguished whether our targets have dust free inner holes. By analyzing forbidden emission lines, we derived the wind properties of the targets. We then compared our findings with results for classical T Tauri stars. Results: The accretion rates and wind properties of 80% of the transitional disks in our sample, which is strongly biased toward strongly accreting objects, are comparable to those of classical T Tauri stars. Thus, there are (at least) some transitional disks with accretion properties compatible with those of classical T Tauri stars, irrespective of the size of the dust inner hole. Only in two cases are the mass accretion rates much lower, while the wind properties remain similar. We detected no strong trend of the mass accretion rates with the size of the dust-depleted cavity or with the presence of a dusty optically thick disk very close to the star. These results suggest that, close to the central star, there is a gas-rich inner disk with a density similar to that of classical T Tauri star disks. Conclusions: The

  16. 75 FR 34439 - Defense Science Board Task Force on Nuclear Treaty Monitoring and Verification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-17

    ... DEPARTMENT OF DEFENSE Office of the Secretary Defense Science Board Task Force on Nuclear Treaty... meetings. SUMMARY: The Defense Science Board Task Force on Nuclear Treaty Monitoring and Verification will... Applications International Corporation, 4001 North Fairfax Drive, Suite 300, Arlington, VA. FOR FURTHER...

  17. The X-Shooter Lens Survey - I. Dark matter domination and a Salpeter-type initial mass function in a massive early-type galaxy

    NASA Astrophysics Data System (ADS)

    Spiniello, C.; Koopmans, L. V. E.; Trager, S. C.; Czoske, O.; Treu, T.

    2011-11-01

    We present the first results from the X-Shooter Lens Survey: an analysis of the massive early-type galaxy SDSS J1148+1930 at redshift z = 0.444. We combine its extended kinematic profile - derived from spectra obtained with X-Shooter on the European Southern Observatory Very Large Telescope - with strong gravitational lensing and multicolour information derived from Sloan Digital Sky Survey (SDSS) images. Our main results are as follows. (i) The luminosity-weighted stellar velocity dispersion is <σ*>(≲Reff) = 352 ± 10 ± 16 km s-1, extracted from a rectangular aperture of 1.8 × 1.6 arcsec2 centred on the galaxy, more accurate and considerably lower than a previously published value of ˜450 km s-1. (ii) A single-component (stellar plus dark) mass model of the lens galaxy yields a logarithmic total-density slope of γ' = 1.72 +0.05/-0.06 (68 per cent confidence level, CL) within a projected radius of ˜2.16 arcsec. (iii) The projected stellar mass fraction, derived solely from the lensing and dynamical data, is f*(90 per cent CL and in some cases violate the total lensing

  18. Deadly Lessons: School Shooters Tell Why. Sun-Times Exclusive Report.

    ERIC Educational Resources Information Center

    Chicago Sun-Times, IL.

    This document represents a compilation of newspaper articles analyzing information shared by the Secret Service concerning 37 school shootings. The findings are presented to educate parents and teachers concerning what has been learned about violent students. It was determined that there is no profile of a typical youth who kills. The shooter is…

  19. Local Jurisdictions and Active Shooters: Building Networks, Building Capacities

    DTIC Science & Technology

    2010-12-01

    coordination will be the foundation for identifying relevant sources and materials on the armed active shooter assault. This research will also benefit...CONCLUSION In summary, the literature review identified relevant sources and materials on the importance of an armed attack. While an armed assault...armed with the following: dozens of explosive devices of varying potency, seven knives, two Savage-Stevens 12 gauge double- barrel shotguns with the

  20. X-shooter spectroscopy of young stellar objects. VI. H I line decrements

    NASA Astrophysics Data System (ADS)

    Antoniucci, S.; Nisini, B.; Giannini, T.; Rigliaco, E.; Alcalá, J. M.; Natta, A.; Stelzer, B.

    2017-03-01

    Context. Hydrogen recombination emission lines commonly observed in accreting young stellar objects represent a powerful tracer for the gas conditions in the circumstellar structures (accretion columns, and winds or jets). Aims: Here we perform a study of the H I decrements and line profiles, from the Balmer and Paschen H I lines detected in the X-shooter spectra of a homogeneous sample of 36 T Tauri objects in Lupus, the accretion and stellar properties of which were already derived in a previous work. We aim to obtain information on the H I gas physical conditions to delineate a consistent picture of the H I emission mechanisms in pre-main sequence low-mass stars (M∗ < 2 M⊙). Methods: We have empirically classified the sources based on their H I line profiles and decrements. We identified four Balmer decrement types (which we classified as 1, 2, 3, and 4) and three Paschen decrement types (A, B, and C), characterised by different shapes. We first discussed the connection between the decrement types and the source properties and then compared the observed decrements with predictions from recently published local line excitation models. Results: We identify a few groups of sources that display similar H I properties. One third of the objects show lines with narrow symmetric profiles, and present similar Balmer and Paschen decrements (straight decrements, types 2 and A). Lines in these sources are consistent with optically thin emission from gas with hydrogen densities of order 10⁹ cm⁻³ and 5000 < T < 15 000 K. These objects are associated with low mass accretion rates. Type 4 (L-shaped) Balmer and type B Paschen decrements are found in conjunction with very wide line profiles and are characteristic of strong accretors, with optically thick emission from high-density gas (nH > 10¹¹ cm⁻³). Type 1 (curved) Balmer decrements are observed only in three sub-luminous sources viewed edge-on, so we speculate that these are actually type 2 decrements that are reddened
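    The decrements classified above are simply line-flux ratios against the upper quantum number of the transition. A minimal sketch of computing a Balmer decrement from a table of line fluxes, normalized here to Hβ; the flux values are hypothetical, not from the survey:

    ```python
    # Minimal sketch: a Balmer decrement as flux ratios F(n->2)/F(Hbeta)
    # versus upper level n, the quantity whose shape (curved, straight,
    # L-shaped) is classified in the paper. Fluxes are hypothetical.
    fluxes = {  # upper level n -> line flux, arbitrary units (made up)
        3: 12.0,   # Halpha
        4: 4.0,    # Hbeta
        5: 2.1,    # Hgamma
        6: 1.3,
        7: 0.9,
    }

    f_hbeta = fluxes[4]
    for n, f in sorted(fluxes.items()):
        print(f"n={n}: F/F(Hbeta) = {f / f_hbeta:.2f}")
    ```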

  1. The concept verification testing of materials science payloads

    NASA Technical Reports Server (NTRS)

    Griner, C. S.; Johnston, M. H.; Whitaker, A.

    1976-01-01

    The Concept Verification Testing (CVT) project at the Marshall Space Flight Center, Alabama, is a developmental activity that supports Shuttle payload projects such as Spacelab. It provides an operational 1-g environment for testing NASA and other-agency experiment and support-system concepts that may be used on the Shuttle. A dedicated Materials Science Payload was tested in the General Purpose Laboratory to assess the requirements of a space processing payload on a Spacelab-type facility. Physical and functional integration of the experiments into the facility was studied, and the impact of the experiments on the facility (and vice versa) was evaluated. A follow-up test, designated CVT Test IVA, was also held. The purpose of this test was to repeat Test IV experiments with a crew composed of selected and trained scientists. These personnel were not required to have prior knowledge of the materials science disciplines, but were required to have a basic knowledge of science and the scientific method.

  2. Appeal of playing online First Person Shooter Games.

    PubMed

    Jansz, Jeroen; Tanis, Martin

    2007-02-01

    First Person Shooter Games (FPSG) such as Counter Strike are often the subject of public concern. Surprisingly, there is no published research available about playing these games. We conducted an exploratory Internet survey (n = 751) in order to gather information about who the players of online first person shooters are, and why they spend time playing this particular kind of video game. The results of our survey on the one hand confirmed the stereotype of the gamer as it is often presented in popular media: the players of online FPS were indeed almost exclusively young men (mean age about 18 years) who spend a lot of their leisure time on gaming (about 2.6 h per day). We also found that the most committed gamers, that is, the ones who were members of a (semi)professional clan, scored highest on motives of competition and challenge, in comparison with members of amateur clans and online gamers who had not joined a clan. On the other hand, our results cast doubt on the accuracy of the stereotype. This study showed clearly that online FPSG are not played in isolation. More than 80% of our respondents were members of a clan. Also, the regression analysis showed that the social interaction motive was the strongest predictor of the time actually spent on gaming.

  3. The role of science in treaty verification.

    PubMed

    Gavron, Avigdor

    2005-01-01

    Technologically advanced nations are currently applying more science to treaty verification than ever before. Satellites gather a multitude of information relating to proliferation concerns using thermal imaging analysis, nuclear radiation measurements, and optical and radio frequency signals detection. Ground stations gather complementary signals such as seismic events and radioactive emissions. Export controls in many countries attempt to intercept materials and technical means that could be used for nuclear proliferation. Nevertheless, we have witnessed a plethora of nuclear proliferation episodes that went undetected (or were detected belatedly) by these technologies--the Indian nuclear tests in 1998, the Libyan nuclear buildup, the Iranian enrichment program and the North Korean nuclear weapons program are some prime examples. In this talk, we will discuss some of the technologies used for proliferation detection. In particular, we will note some of the issues relating to nuclear materials control agreements that epitomize political difficulties as they impact the implementation of science and technology.

  4. The DES Science Verification Weak Lensing Shear Catalogs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarvis, M.

    We present weak lensing shear catalogs for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogs of 2.12 million and 3.44 million galaxies respectively. We also detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. Furthermore, we discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogs for the full 5-year DES, which is expected to cover 5000 square degrees.

  5. The DES Science Verification Weak Lensing Shear Catalogs

    DOE PAGES

    Jarvis, M.

    2016-05-01

    We present weak lensing shear catalogs for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogs of 2.12 million and 3.44 million galaxies respectively. We also detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. Furthermore, we discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogs for the full 5-year DES, which is expected to cover 5000 square degrees.

  6. Influence of Running on Pistol Shot Hit Patterns.

    PubMed

    Kerkhoff, Wim; Bolck, Annabel; Mattijssen, Erwin J A T

    2016-01-01

    In shooting scene reconstructions, risk assessment of the situation can be important for the legal system. Shooting accuracy and precision, and thus risk assessment, might be correlated with the shooter's physical movement and experience. The hit patterns of inexperienced and experienced shooters, while shooting stationary (10 shots) and in running motion (10 shots) with a semi-automatic pistol, were compared visually (with confidence ellipses) and statistically. The results show a significant difference in precision (circumference of the hit patterns) between stationary shots and shots fired in motion for both inexperienced and experienced shooters. The decrease in precision for all shooters was significantly larger in the y-direction than in the x-direction. The precision of the experienced shooters is overall better than that of the inexperienced shooters. No significant change in accuracy (shift in the hit pattern center) between stationary shots and shots fired in motion can be seen for all shooters. © 2015 American Academy of Forensic Sciences.
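    The accuracy/precision distinction used here (shift of the hit-pattern center versus its spread) maps onto simple statistics of the 2-D hit coordinates. A minimal sketch, using synthetic hit coordinates and a chi-square-based 95% confidence ellipse rather than the study's exact procedure:

    ```python
    # Minimal sketch: a 95% confidence ellipse for a 2-D hit pattern, from
    # the eigen-decomposition of the sample covariance. Coordinates are
    # synthetic, not the study's data.
    import numpy as np

    rng = np.random.default_rng(1)
    hits = rng.normal([0.0, -1.0], [2.0, 5.0], size=(10, 2))  # cm; y spreads more

    center = hits.mean(axis=0)            # accuracy: shift of pattern center
    cov = np.cov(hits, rowvar=False)      # precision: spread of the pattern
    eigvals, eigvecs = np.linalg.eigh(cov)
    k = 5.991                             # chi-square 95% quantile, 2 dof
    axes = np.sqrt(k * eigvals)           # semi-axes of the 95% ellipse

    print(f"center (accuracy): {center.round(2)}")
    print(f"95% ellipse semi-axes (precision): {axes.round(2)}")
    ```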

  7. A longitudinal analysis of shooter games and their relationship with conduct disorder and self-reported delinquency.

    PubMed

    Smith, Sven; Ferguson, Chris; Beaver, Kevin

    Despite several decades of research, little scholarly consensus has emerged regarding the role of violent video games in the development of youth psychopathology or crime. The current study employed the Avon Longitudinal Study of Parents and Children longitudinal dataset to examine the impact of shooter-game ownership in childhood on later adolescent conduct disorder and criminal behavior. Multivariate Poisson regressions with the robust estimator correlation matrix were performed comparing effects of independent and confounding variables. Results revealed that early childhood mental health symptoms at age seven related to ADHD, depression and early conduct disorder predicted criminal behavior at age fifteen. Male gender also predicted criminal behavior at age fifteen. However, exposure to shooter games did not predict adolescent conduct disorder or criminal behavior. These findings suggest that violent video games play little if any role in the development of youth psychopathology or crime. The lack of a relationship between exposure to shooter games and later conduct disorder and criminal behavior may be understood within the context of the Catalyst Model. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Associations between active shooter incidents and gun ownership and storage among families with young children in the United States.

    PubMed

    Morrissey, Taryn W

    2017-07-01

    The presence of firearms and their unsafe storage in the home can increase risk of firearm-related death and injury, but public opinion suggests that firearm ownership is a protective factor against gun violence. This study examined the effects of a recent nearby active shooter incident on gun ownership and storage practices among families with young children. A series of regression models, with data from the nationally representative Early Childhood Longitudinal Study-Birth Cohort merged with the FBI's Active Shooter Incidents data collected in 2003-2006, were used to examine whether household gun ownership and storage practices differed in the months prior to and following an active shooter incident that occurred anywhere in the United States or within the same state. Approximately one-fifth of young children lived in households with one or more guns; of these children, only two-thirds lived in homes that stored all guns in locked cabinets. Results suggest that the experience of a recent active shooter incident was associated with an increased likelihood of storing all guns locked, with the magnitude dependent on the temporal and geographic proximity of the incident. The severity of the incident, defined as the number of fatalities, predicted an increase in storing guns locked. Findings suggest that public shootings change behaviors related to firearm storage among families with young children. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Hubble Space Telescope high speed photometer science verification test report

    NASA Technical Reports Server (NTRS)

    Richards, Evan E.

    1992-01-01

    The purpose of this report is to summarize the results of the HSP Science Verification (SV) tests, the status of the HSP at the end of the SV period, and the work remaining to be done. The HSP OV report (November 1991) covered all activities (OV, SV, and SAO) from launch to the completion of phase three alignment, OV 3233 performed in the 91154 SMS, on June 8, 1991. This report covers subsequent activities through May 1992.

  10. First Results from the UT1 Science Verification Programme

    NASA Astrophysics Data System (ADS)

    1998-11-01

    from the galaxy is taken into account. This additional material represents some of the Universe's dark matter. The gravitational lens action is also magnifying the background object by a factor of ten, providing an unparalleled view of this very distant galaxy, which is in a stage of active star formation. The scientists involved in this study are: Palle Møller (ESO), Stephen J. Warren (Blackett Laboratory, Imperial College, UK), Paul C. Hewett (Institute of Astronomy, Cambridge, UK) and Geraint F. Lewis (Dept. of Physics and Astronomy, University of Victoria, Canada). An Extremely Red Galaxy: One of the main goals of modern cosmology is to understand when and how the galaxies formed. In the last few years, many high-redshift (i.e. very distant) galaxies have been found, suggesting that some galaxies were already assembled when the Universe was much younger than now. None of these high-redshift galaxies has ever been found to be a bona-fide red elliptical galaxy. The VLT, however, with its very good capabilities for infrared observations, is an ideal instrument to investigate when and how the red elliptical galaxies formed. The VLT Science Verification images have provided unique multicolour information about an extremely red galaxy that was originally identified (Treu et al., 1998, A&A Letters, Vol. 340, p. 10) on the Hubble Deep Field South (HDF-S) Test Image. This galaxy is shown in PR Photo 48d/98, an enlargement from ESO PR Photo 35b/98. It was detected on near-IR images and also on images obtained in the optical part of the spectrum, at the very faint limit of magnitude B ~ 29 in the blue. However, this galaxy has not been detected in the near-ultraviolet band. [ESO PR Photos 48d/98 and 48e/98 accompany this release.] PR Photo 48d/98 (left) shows the very red galaxy (at

  11. X-shooter spectroscopy of young stellar objects. III. Photospheric and chromospheric properties of Class III objects

    NASA Astrophysics Data System (ADS)

    Stelzer, B.; Frasca, A.; Alcalá, J. M.; Manara, C. F.; Biazzo, K.; Covino, E.; Rigliaco, E.; Testi, L.; Covino, S.; D'Elia, V.

    2013-10-01

    Context. Traditionally, the chromospheres of late-type stars are studied through their strongest emission lines, Hα and Ca ii HK emission. Our knowledge on the whole emission line spectrum is more elusive as a result of the limited spectral range and sensitivity of most available spectrographs. Aims: We intend to reduce this gap with a comprehensive spectroscopic study of the chromospheric emission line spectrum of a sample of non-accreting pre-main sequence stars (Class III sources). Methods: We analyzed X-shooter/VLT spectra of 24 Class III sources from three nearby star-forming regions (σ Orionis, Lupus III, and TW Hya). We determined the effective temperature, surface gravity, rotational velocity, and radial velocity by comparing the observed spectra with synthetic BT-Settl model spectra. We investigated in detail the emission lines emerging from the stellar chromospheres and combined these data with archival X-ray data to allow for a comparison between chromospheric and coronal emissions. Results: For some objects in the sample the atmospheric and kinematic parameters are presented here for the first time. The effective temperatures are consistent with those derived for the same stars from an empirical calibration with spectral types. Small differences in the surface gravity found between the stars can be attributed to differences in the average age of the three star-forming regions. The strength of lithium absorption and radial velocities confirm the young age of all but one object in the sample (Sz 94). Both X-ray and Hα luminosity as measured in terms of the bolometric luminosity are independent of the effective temperature for early-M stars but decline toward the end of the spectral M sequence. For the saturated early-M stars the average emission level is almost one dex higher for X-rays than for Hα: log (Lx/Lbol) = -2.85 ± 0.36 vs. log (LHα/Lbol) = -3.72 ± 0.21. When all chromospheric emission lines (including the Balmer series up to H11, Ca ii HK

  12. VLT/X-Shooter spectroscopy of the afterglow of the Swift GRB 130606A. Chemical abundances and reionisation at z ~ 6

    NASA Astrophysics Data System (ADS)

    Hartoog, O. E.; Malesani, D.; Fynbo, J. P. U.; Goto, T.; Krühler, T.; Vreeswijk, P. M.; De Cia, A.; Xu, D.; Møller, P.; Covino, S.; D'Elia, V.; Flores, H.; Goldoni, P.; Hjorth, J.; Jakobsson, P.; Krogager, J.-K.; Kaper, L.; Ledoux, C.; Levan, A. J.; Milvang-Jensen, B.; Sollerman, J.; Sparre, M.; Tagliaferri, G.; Tanvir, N. R.; de Ugarte Postigo, A.; Vergani, S. D.; Wiersema, K.; Datson, J.; Salinas, R.; Mikkelsen, K.; Aghanim, N.

    2015-08-01

    Context. The reionisation of the Universe is a process that is thought to have ended around z ~ 6, as inferred from spectroscopy of distant bright background sources, such as quasars (QSO) and gamma-ray burst (GRB) afterglows. Furthermore, spectroscopy of a GRB afterglow provides insight in its host galaxy, which is often too dim and distant to study otherwise. Aims: For the Swift GRB 130606A at z = 5.913 we have obtained a high S/N spectrum covering the full optical and near-IR wavelength region at intermediate spectral resolution with VLT/X-Shooter. We aim to measure the degree of ionisation of the intergalactic medium (IGM) between z = 5.02 and z = 5.84 and to study the chemical abundance pattern and dust content of its host galaxy. Methods: We estimated the UV continuum of the GRB afterglow using a power-law extrapolation, then measured the flux decrement due to absorption in the Lyα, Lyβ, and Lyγ wavelength regions. Furthermore, we fitted the shape of the red damping wing of Lyα. The hydrogen and metal absorption lines formed in the host galaxy were fitted with Voigt profiles to obtain column densities. We investigated whether ionisation corrections needed to be applied. Results: Our measurements of the Lyα-forest optical depth are consistent with previous measurements of QSOs, but have a much smaller uncertainty. The analysis of the red damping wing yields a neutral fraction xH I < 0.05 (3σ). We obtain column density measurements of H, Al, Si, and Fe; for C, O, S and Ni we obtain limits. The ionisation due to the GRB is estimated to be negligible (corrections <0.03 dex), but larger corrections may apply due to the pre-existing radiation field (up to 0.4 dex based on sub-DLA studies). Assuming that [Si/Fe] = +0.79 ± 0.13 is due to dust depletion, the dust-to-metal ratio is similar to the Galactic value. Conclusions: Our measurements confirm that the Universe is already predominantly ionised over the redshift range probed in this work, but was slightly more neutral at z
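    The flux-decrement measurement described in the Methods can be sketched as follows: extrapolate the power-law continuum, average the transmitted flux in a forest window, and convert it to an effective optical depth τ_eff = −ln⟨F_obs/F_cont⟩. All numbers below are synthetic placeholders, not the GRB 130606A measurements:

    ```python
    # Minimal sketch of the flux-decrement measurement: a power-law continuum
    # extrapolated over a Ly-alpha forest window, with the mean transmission
    # converted into an effective optical depth. All values are synthetic.
    import numpy as np

    def tau_eff(wave_aa, flux, f0, wave0_aa, beta, window_aa):
        """tau_eff = -ln(<F_obs/F_cont>) in the given wavelength window [AA]."""
        cont = f0 * (wave_aa / wave0_aa) ** beta        # power-law continuum
        sel = (wave_aa > window_aa[0]) & (wave_aa < window_aa[1])
        transmission = np.mean(flux[sel] / cont[sel])
        return -np.log(transmission)

    wave = np.linspace(7300, 8300, 500)                 # observed frame, AA
    cont_true = 2.0 * (wave / 8000.0) ** -1.5
    flux = cont_true * 0.05                             # 5% mean transmission
    print(f"tau_eff = {tau_eff(wave, flux, 2.0, 8000.0, -1.5, (7300, 8300)):.2f}")
    ```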

  13. Weak lensing magnification in the Dark Energy Survey Science Verification data

    NASA Astrophysics Data System (ADS)

    Garcia-Fernandez, M.; Sanchez, E.; Sevilla-Noarbe, I.; Suchyta, E.; Huff, E. M.; Gaztanaga, E.; Aleksić, J.; Ponce, R.; Castander, F. J.; Hoyle, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Jarvis, M.; Kirk, D.; Krause, E.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; MacCrann, N.; Maia, M. A. G.; March, M.; Marshall, J. L.; Melchior, P.; Miquel, R.; Mohr, J. J.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Scarpine, V.; Schubnell, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thomas, D.; Walker, A. R.; Wester, W.; DES Collaboration

    2018-05-01

    In this paper, the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using the Dark Energy Survey Science Verification data set. This analysis is carried out for galaxies that are selected only by their photometric redshifts. An extensive analysis of the systematic effects is performed, using new methods based on simulations, including a Monte Carlo sampling of the selection function of the survey.
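    The basic estimator behind such a measurement is a cross-correlation of foreground and background galaxy positions against a random catalog. A minimal brute-force sketch on tiny synthetic catalogs (not DES SV data, and ignoring the survey mask and systematics treated in the paper):

    ```python
    # Minimal sketch of a position cross-correlation between a foreground and
    # a background sample, w(theta) ~ D1D2/D1R2 - 1 with brute-force pair
    # counts. Catalogs are tiny random stand-ins, not DES SV data.
    import numpy as np

    rng = np.random.default_rng(2)

    def positions(n):  # small flat patch, degrees
        return rng.uniform(0, 2, (n, 2))

    fg, bg, randoms = positions(300), positions(300), positions(3000)

    def pair_counts(a, b, bins):
        d = np.hypot(a[:, None, 0] - b[None, :, 0],
                     a[:, None, 1] - b[None, :, 1])
        return np.histogram(d.ravel(), bins=bins)[0]

    bins = np.linspace(0.05, 0.5, 6)
    d1d2 = pair_counts(fg, bg, bins) / (len(fg) * len(bg))
    d1r2 = pair_counts(fg, randoms, bins) / (len(fg) * len(randoms))
    w = d1d2 / d1r2 - 1.0
    print(np.round(w, 3))   # ~0 for uncorrelated catalogs, as here
    ```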

  14. Attitudes of older adults toward shooter video games: An initial study to select an acceptable game for training visual processing.

    PubMed

    McKay, Sandra M; Maki, Brian E

    2010-01-01

    A computer-based 'Useful Field of View' (UFOV) training program has been shown to be effective in improving visual processing in older adults. Studies of young adults have shown that playing video games can have similar benefits; however, these studies involved realistic and violent 'first-person shooter' (FPS) games. The willingness of older adults to play such games has not been established. OBJECTIVES: To determine the degree to which older adults would accept playing a realistic, violent FPS-game, compared to video games not involving realistic depiction of violence. METHODS: Sixteen older adults (ages 64-77) viewed and rated video-clip demonstrations of the UFOV program and three video-game genres (realistic-FPS, cartoon-FPS, fixed-shooter), and were then given an opportunity to try them out (30 minutes per game) and rate various features. RESULTS: The results supported a hypothesis that the participants would be less willing to play the realistic-FPS game in comparison to the less violent alternatives (p's<0.02). After viewing the video-clip demonstrations, 10 of 16 participants indicated they would be unwilling to try out the realistic-FPS game. Of the six who were willing, three did not enjoy the experience and were not interested in playing again. In contrast, all 12 subjects who were willing to try the cartoon-FPS game reported that they enjoyed it and would be willing to play again. A high proportion also tried and enjoyed the UFOV training (15/16) and the fixed-shooter game (12/15). DISCUSSION: A realistic, violent FPS video game is unlikely to be an appropriate choice for older adults. Cartoon-FPS and fixed-shooter games are more viable options. Although most subjects also enjoyed UFOV training, a video-game approach has a number of potential advantages (for instance, 'addictive' properties, low cost, self-administration at home). We therefore conclude that non-violent cartoon-FPS and fixed-shooter video games warrant further investigation as an

  15. Be/X-ray Binary Science for Future X-ray Timing Missions

    NASA Technical Reports Server (NTRS)

    Wilson-Hodge, Colleen A.

    2011-01-01

    The Be/X-ray binary community needs to clearly define its science priorities for the future and advocate for their inclusion in upcoming missions. In this talk, I will describe current designs for two potential future missions and the Be/X-ray binary science enabled by these designs. The Large Observatory For X-ray Timing (LOFT) is an X-ray timing mission selected in February 2011 for the assessment phase from the 2010 ESA M3 call for proposals. The Advanced X-ray Timing Array (AXTAR) is a NASA Explorer-concept X-ray timing mission. This talk is intended to initiate discussions of our science priorities for the future.

  16. Active Shooter Response: Defensive Tactics And Tactical Decision Making For Elementary School Teachers And Staff

    DTIC Science & Technology

    2017-12-01

    (Table-of-contents fragments from the report: Action Reports; Psychological Impact of Training; Integration; John Boyd's OODA Loop.) ...impact of a school-based active shooter cannot be understated. Beyond the given risk of injury and death, a potential psychological impact exists to all

  17. SU-E-T-364: 6X FFF and 10X FFF Portal Dosimetry Output Factor Verification: Application for SRS/SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gulam, M; Bellon, M; Gopal, A

    2014-06-01

    Purpose: To enhance portal dosimetry of high-dose-rate SRS/SBRT plan verifications with extensive imager measurement of output factors (OF). Methods: Electronic portal image dosimetry (EPID), implemented on the Varian Edge, allows for acquisition of its two energies, 6X FFF and 10X FFF (1400 and 2400 MU/min, respectively), at a source-to-imager distance (SID) of 100 cm without imager saturation. Square and rectangular aSi OFs were obtained following EPID calibration. The data taken were similar to those obtained during beam commissioning (almost all field sizes from 1×1 to 15×15 and 20×20 cm², [Trilogy] and [Edge], respectively) to construct a table using the OF tool for use in the Portal Dosimetry Prediction Algorithm (PDIP v11). The Trilogy 6X SRS 1000 MU/min EPID data were taken at 140 SID. The large number of OFs were obtained for comparison to those obtained with diode detectors and ion chambers (cc13 for >3×3 field size). As Edge PDIP verification is currently ongoing, EPID measurements of three SRS/SBRT plans for the Trilogy were taken and compared to results obtained prior to these measurements. Results: The relative differences of output factors for field sizes 2×2 and higher compared to commissioning data were (mean +/- SD, [range]): Edge 6X (-1.9 +/- 2.9%, [-5.9%, 3.1%]), Edge 10X (-0.7 +/- 1.2%, [-3.3%, 0.8%]) and Trilogy (0.03 +/- 0.5%, [-1.4%, 1.1%]), with EPID overpredicting. The results for the 140 SID showed excellent agreement throughout except at the 1×1 to 1×15 and 15×1 field sizes, where differences were -10.6%, -6.0% and -5.8%. The differences were also most pronounced for the 1×1 at 100 SID: -7.4% and -11.5% for 6X and 10X, respectively. The Gamma (3%, 1 mm) for three clinical plans improved by 8.7 +/- 1.8%. Conclusion: Results indicate that imager output factor measurements at any SID of high-dose-rate SRS/SBRT are quite reliable for portal dosimetry plan verification except for the smallest fields. This work was
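    The summary statistics reported in the Results (relative difference mean +/- SD and range between EPID-measured and commissioning output factors) amount to a one-line computation. A minimal sketch with placeholder output-factor values, not the measured data:

    ```python
    # Minimal sketch: relative-difference statistics between EPID-measured
    # and commissioning output factors, reported as mean +/- SD and range.
    # The factor values are placeholders, not measured data.
    import numpy as np

    of_epid = np.array([0.92, 0.97, 1.00, 1.03, 1.05])          # hypothetical
    of_commissioning = np.array([0.95, 0.98, 1.00, 1.02, 1.04]) # hypothetical

    rel_diff = 100.0 * (of_epid - of_commissioning) / of_commissioning
    print(f"mean = {rel_diff.mean():+.1f}%  SD = {rel_diff.std(ddof=1):.1f}%  "
          f"range = [{rel_diff.min():+.1f}%, {rel_diff.max():+.1f}%]")
    ```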

  18. AFRL Commander's Challenge 2015: stopping the active shooter

    NASA Astrophysics Data System (ADS)

    McIntire, John P.; Boston, Jonathan; Smith, Brandon; Swartz, Pete; Whitney-Rawls, Amy; Martinez Calderon, Julian; Magin, Jonathan

    2017-05-01

    In this work, we describe a rapid-innovation challenge to combat and deal with the problem of internal, insider physical threats (e.g., active shooters) and associated first-responder situation awareness on military installations. Our team's research and development effort described within focused on several key tech development areas: (1) indoor acoustical gunshot detection, (2) indoor spatial tracking of first responders, (3) bystander safety and protection, (4) two-way mass alerting capability, and (5) spatial information displays for command and control. The technological solutions were specifically designed to be innovative, low-cost, and (relatively) easy-to-implement, and to provide support across the spectrum of possible users including potential victims/bystanders, first responders, dispatch, and incident command.

  19. Toward the Computational Representation of Individual Cultural, Cognitive, and Physiological State: The Sensor Shooter Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    RAYBOURN,ELAINE M.; FORSYTHE,JAMES C.

    2001-08-01

    This report documents an exploratory FY 00 LDRD project that sought to demonstrate the first steps toward a realistic computational representation of the variability encountered in individual human behavior. Realism, as conceptualized in this project, required that the human representation address the underlying psychological, cultural, physiological, and environmental stressors. The present report outlines the researchers' approach to representing cognitive, cultural, and physiological variability of an individual in an ambiguous situation while faced with a high-consequence decision that would greatly impact subsequent events. The present project was framed around a sensor-shooter scenario as a soldier interacts with an unexpected target (two young Iraqi girls). A software model of the ''Sensor Shooter'' scenario from Desert Storm was developed in which the framework consisted of a computational instantiation of Recognition Primed Decision Making in the context of a Naturalistic Decision Making model [1]. Recognition Primed Decision Making was augmented with an underlying foundation based on our current understanding of human neurophysiology and its relationship to human cognitive processes. While the Gulf War scenario that constitutes the framework for the Sensor Shooter prototype is highly specific, the human decision architecture and the subsequent simulation are applicable to other problems similar in concept, intensity, and degree of uncertainty. The goal was to provide initial steps toward a computational representation of human variability in cultural, cognitive, and physiological state in order to attain a better understanding of the full depth of human decision-making processes in the context of ambiguity, novelty, and heightened arousal.

  20. Optical-NIR spectroscopy of the puzzling γ-ray source 3FGL 1603.9-4903/PMN J1603-4904 with X-Shooter

    NASA Astrophysics Data System (ADS)

    Goldoni, P.; Pita, S.; Boisson, C.; Müller, C.; Dauser, T.; Jung, I.; Krauß, F.; Lenain, J.-P.; Sol, H.

    2016-02-01

    Context. The Fermi/LAT instrument has detected about two thousand extragalactic high energy (E ≥ 100 MeV) γ-ray sources. One of the brightest is 3FGL J1603.9-4903; it is associated with the radio source PMN J1603-4904. Its nature is not yet clear: it could be either a very peculiar BL Lac or a compact symmetric object (CSO) radio source, a class considered to represent the early stage of a radio galaxy. The latter, if confirmed, would be the first detection in γ-rays for this class of objects. A redshift z = 0.18 ± 0.01 has recently been claimed on the basis of the detection of a single X-ray line at 5.44 ± 0.05 keV which has been interpreted as a 6.4 keV (rest frame) fluorescent line. Aims: We aim to investigate the nature of 3FGL J1603.9-4903/PMN J1603-4904 using optical to near-IR (NIR) spectroscopy. Methods: We observed PMN J1603-4904 with the UV-NIR VLT/X-Shooter spectrograph for two hours. We extracted spectra in the visible and NIR range that we calibrated in flux and corrected for telluric absorption. We systematically searched for absorption and emission features. Results: The source was detected starting from ~6300 Å up to 24 000 Å with an intensity similar to that of its 2MASS counterpart and a mostly featureless spectrum. The continuum lacks absorption features and thus is non-stellar in origin and most likely non-thermal. In addition to this continuum, we detected three emission lines that we interpret as the Hα-[NII] complex, the [SII]λλ6716,6731 doublet, and the [SIII]λ9530 line; we obtain a redshift estimate of z = 0.2321 ± 0.0004. The line ratios suggest that a LINER/Seyfert nucleus powers the emission. This new redshift measurement implies that the X-ray line previously detected should be interpreted as a 6.7 keV line, which is very peculiar. Based on observations collected at the European Organisation for Astronomical Research in the Southern Hemisphere, Chile, under program 095.B-0400(A). The raw FITS data files are available in the ESO archive.
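    The redshift quoted above follows directly from the identified lines via z = λ_obs/λ_rest − 1. A minimal sketch, using standard rest wavelengths; the "observed" wavelengths are back-computed placeholders rather than measurements from the paper:

    ```python
    # Minimal sketch: redshift from identified emission lines, z = obs/rest - 1.
    # Rest wavelengths are standard values; the "observed" wavelengths are
    # back-computed placeholders, not the paper's measurements.
    REST = {"Halpha": 6562.8, "[SII]6716": 6716.4, "[SIII]9530": 9530.6}  # AA

    observed = {name: lam * (1 + 0.2321) for name, lam in REST.items()}   # fake

    for name, lam_obs in observed.items():
        z = lam_obs / REST[name] - 1.0
        print(f"{name:>11}: z = {z:.4f}")
    ```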

  1. Augmenting Security on Department of Defense Installations to Defeat the Active Shooter Threat

    DTIC Science & Technology

    2016-06-10

    strategies to determine if the military could benefit from increased numbers of armed personnel to augment military and civilian law enforcement...personnel. The benefit to the DoD includes increased probability of prevention and deterrence of active shooter events, and a more efficient mitigation...

  2. Spitzer Space Telescope in-orbit checkout and science verification operations

    NASA Technical Reports Server (NTRS)

    Linick, Sue H.; Miles, John W.; Gilbert, John B.; Boyles, Carol A.

    2004-01-01

    Spitzer Space Telescope, the fourth and final of NASA's Great Observatories and the first mission in NASA's Origins Program, was launched on 25 August 2003 into an Earth-trailing solar orbit. The observatory was designed to probe and explore the universe in the infrared. Before science data could be acquired, however, the observatory had to be initialized, characterized, calibrated, and commissioned. A two-phase operations approach was defined to complete this work. These phases were identified as In-Orbit Checkout (IOC) and Science Verification (SV). Because the observatory lifetime is cryogen-limited, these operations had to be highly efficient. The IOC/SV operations design accommodated a pre-defined distributed organizational structure and a complex, cryogenic flight system. Many checkout activities were inter-dependent, and therefore the operations concept and ground data system had to provide the flexibility required for a 'short turn-around' environment. This paper describes the adaptive operations system design and evolution, implementation, and lessons learned from the completion of IOC/SV.

  3. Status on the Verification of Combustion Stability for the J-2X Engine Thrust Chamber Assembly

    NASA Technical Reports Server (NTRS)

    Casiano, Matthew; Hinerman, Tim; Kenny, R. Jeremy; Hulka, Jim; Barnett, Greg; Dodd, Fred; Martin, Tom

    2013-01-01

    Development is underway of the J-2X engine, a liquid oxygen/liquid hydrogen rocket engine for use on the Space Launch System. Engine E10001 began hot-fire testing in June 2011 and testing will continue with subsequent engines. The J-2X engine main combustion chamber contains both acoustic cavities and baffles. These stability aids are intended to dampen the acoustics in the main combustion chamber. Verification of the engine thrust chamber stability is determined primarily by examining experimental data using a dynamic stability rating technique; however, additional requirements were included to guard against any spontaneous instability or rough combustion. Startup and shutdown chug oscillations are also characterized for this engine. This paper details the stability requirements and verification, including low and high frequency dynamics, a discussion on sensor selection and sensor port dynamics, and the process developed to assess combustion stability. A status on the stability results is also provided and discussed.

  4. The CHANDRA X-Ray Observatory: Thermal Design, Verification, and Early Orbit Experience

    NASA Technical Reports Server (NTRS)

    Boyd, David A.; Freeman, Mark D.; Lynch, Nicolie; Lavois, Anthony R. (Technical Monitor)

    2000-01-01

    The CHANDRA X-ray Observatory (formerly AXAF), one of NASA's "Great Observatories", was launched aboard the Shuttle in July 1999. CHANDRA comprises a grazing-incidence X-ray telescope of unprecedented focal-length, collecting area and angular resolution -- better than two orders of magnitude improvement in imaging performance over any previous soft X-ray (0.1-10 keV) mission. Two focal-plane instruments, one with a 150 K passively-cooled detector, provide celestial X-ray images and spectra. Thermal control of CHANDRA includes active systems for the telescope mirror and environment and the optical bench, and largely passive systems for the focal-plane instruments. Performance testing of these thermal control systems required 1-1/2 years at increasing levels of integration, culminating in thermal-balance testing of the fully-configured observatory during the summer of 1998. This paper outlines details of thermal design tradeoffs and methods for both the Observatory and the two focal-plane instruments, the thermal verification philosophy of the Chandra program (what to test and at what level), and summarizes the results of the instrument, optical system and observatory testing.

  5. Supersonic projectile models for asynchronous shooter localization

    NASA Astrophysics Data System (ADS)

    Kozick, Richard J.; Whipps, Gene T.; Ash, Joshua N.

    2011-06-01

    In this work we consider the localization of a gunshot using a distributed sensor network measuring time differences of arrival between a firearm's muzzle blast and the shockwave induced by a supersonic bullet. This so-called MB-SW approach is desirable because time synchronization is not required between the sensors; however, it suffers from increased computational complexity and requires knowledge of the bullet's velocity at all points along its trajectory. While the actual velocity profile of a particular gunshot is unknown, one may use a parameterized model for the velocity profile and simultaneously fit the model and localize the shooter. In this paper we study efficient solutions for the localization problem and identify deceleration models that trade off localization accuracy and computational complexity. We also develop a statistical analysis that includes bias due to mismatch between the true and assumed deceleration models, and covariance due to additive noise.
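
    As a concrete illustration of the model-fitting step described above, the sketch below fits two hypothetical deceleration models (constant deceleration and exponential drag) to synthetic range-velocity data and compares the bullet travel times they predict. All names, numbers, and the choice of models are illustrative assumptions, not the paper's actual formulation.

```python
# Hedged sketch: comparing two parameterized bullet-velocity profiles of the
# kind used in MB-SW shooter localization (all values are illustrative).
import numpy as np
from scipy.optimize import curve_fit
from scipy.integrate import trapezoid

def v_linear(s, v0, a):
    # Constant-deceleration profile: v(s) = v0 - a*s
    return v0 - a * s

def v_exp(s, v0, k):
    # Exponential drag profile: v(s) = v0 * exp(-k*s)
    return v0 * np.exp(-k * s)

def travel_time(s_max, v_model, params, n=2000):
    # Bullet travel time t(s_max) = integral of 1/v(s) ds from 0 to s_max
    s = np.linspace(0.0, s_max, n)
    return trapezoid(1.0 / v_model(s, *params), s)

# Synthetic "truth": exponential decay from ~900 m/s plus measurement noise
rng = np.random.default_rng(0)
s_data = np.linspace(0.0, 400.0, 40)                        # arc length [m]
v_data = v_exp(s_data, 900.0, 8e-4) + rng.normal(0.0, 2.0, s_data.size)

# Fit both candidate models and compare predicted travel times at 400 m
for model, p0 in [(v_linear, (900.0, 0.5)), (v_exp, (900.0, 1e-3))]:
    popt, _ = curve_fit(model, s_data, v_data, p0=p0)
    print(model.__name__, popt, travel_time(400.0, model, popt))
```

    In a full MB-SW localizer, the fitted travel-time function would enter the predicted muzzle-blast/shockwave arrival-time difference at each sensor, so a cheaper model that still predicts travel times accurately reduces the cost of the joint fit.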

  6. Limitations of Routine Verification of Nasogastric Tube Insertion Using X-Ray and Auscultation: Two Case Reports of Life-Threatening Complications.

    PubMed

    Nejo, Takahide; Oya, Soichi; Tsukasa, Tsuchiya; Yamaguchi, Naomi; Matsui, Toru

    2016-12-01

    Several bedside approaches used in combination with thoracoabdominal X-ray are widely used to avoid severe complications that have been reported during nasogastric tube management. Although confirmation by X-ray is considered the gold standard, it is not yet perfect. We present 2 cases of rare complications in which the routine verification methods could not detect all the complications related to the nasogastric tube placement. Case 1 was a 17-year-old male who presented with a brain tumor and repeatedly required nasogastric tube placement. Despite normal auscultatory and X-ray findings, the patient's condition deteriorated rapidly after resuming the enteral nutrition (EN). Computed tomography images showed the presence of hepatic portal venous gas (HPVG). Urgent upper gastrointestinal endoscopy showed esophagogastric submucosal tunneling of the tube that required an emergency open total gastrectomy. Case 2 was a 76-year-old man with long-term EN after stroke. While the last auscultatory verification was normal, he suddenly developed extensive HPVG due to gastric mucosal injury following EN, which resulted in progressive intestinal necrosis, general peritonitis, and death. These 2 cases indicated that routine verification methods consisting of auscultation and X-ray may not be completely reliable, and the awareness of the limitations of these methods should be reaffirmed because expeditious examinations and necessary interventions are critical in preventing life-threatening complications.

  7. THE REDMAPPER GALAXY CLUSTER CATALOG FROM DES SCIENCE VERIFICATION DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rykoff, E. S.; Rozo, E.; Hollowood, D.

    We describe updates to the redMaPPer algorithm, a photometric red-sequence cluster finder specifically designed for large photometric surveys. The updated algorithm is applied to 150 deg² of Science Verification (SV) data from the Dark Energy Survey (DES), and to the Sloan Digital Sky Survey (SDSS) DR8 photometric data set. The DES SV catalog is locally volume limited and contains 786 clusters with richness λ > 20 (roughly equivalent to M500c ≳ 10¹⁴ h₇₀⁻¹ M⊙) and 0.2 < z < 0.9. The DR8 catalog consists of 26,311 clusters with 0.08 < z < 0.6, with a sharply increasing richness threshold as a function of redshift for z ≳ 0.35. The photometric redshift performance of both catalogs is shown to be excellent, with photometric redshift uncertainties controlled at the σz/(1+z) ∼ 0.01 level for z ≲ 0.7, rising to ∼0.02 at z ∼ 0.9 in DES SV. We make use of Chandra and XMM X-ray and South Pole Telescope Sunyaev-Zeldovich data to show that the centering performance and mass-richness scatter are consistent with expectations based on prior runs of redMaPPer on SDSS data. We also show how the redMaPPer photo-z and richness estimates are relatively insensitive to imperfect star/galaxy separation and small-scale star masks.

  8. Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation & Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsao, Jeffrey Y.; Trucano, Timothy G.; Kleban, Stephen D.

    This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on "Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification," as well as of pre-work that fed into the workshop. The workshop's intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?

  9. First Images from VLT Science Verification Programme

    NASA Astrophysics Data System (ADS)

    1998-09-01

    .1 hrs); Edge-on Galaxies (7.4 hrs); Globular cluster cores (6.7 hrs); QSO Hosts (4.4 hrs); TNOs (3.4 hrs); Pulsars (1.3 hrs); Calibrations (22.7 hrs). All of the SV data are now in the process of being prepared for public release by September 30, 1998 to the ESO and Chilean astronomical communities. It will be possible to retrieve the data from the VLT archive, and a set of CDs will be distributed to all astronomical research institutes within the ESO member states and Chile. Moreover, data obtained on the HDF-S will become publicly available worldwide, and retrievable from the VLT archive. Updated information on this data release can be found on the ESO web site at http://www.eso.org/vltsv/. It is expected that the first scientific results based on the SV data will become available in the course of October and November 1998. First images from the Science Verification programme: this Press Release is accompanied by three photos that reproduce some of the images obtained during the SV period. ESO PR Photo 35a/98: this colour composite was constructed from the U+B, R and I Test Camera images of the Hubble Deep Field South (HDF-S) NICMOS field, displayed as blue, green and red, respectively; it combines exposures obtained in different wavebands: ultraviolet (U) + blue (B), red (R) and near-infrared (I). For all of them, the image quality is better than 0.9 arcsec. Most of the objects seen in the field are distant galaxies. The image is scaled to show the faintest features while rendering the star below the large spiral galaxy approximately white; the spiral galaxy is displayed so that its internal structure is visible. A provisional analysis has shown that limiting magnitudes that were predicted for the HDF-S observations (27.0 - 28

  10. The basis of shooter biases: beyond cultural stereotypes.

    PubMed

    Miller, Saul L; Zielaskowski, Kate; Plant, E Ashby

    2012-10-01

    White police officers and undergraduate students mistakenly shoot unarmed Black suspects more than White suspects on computerized shoot/don't shoot tasks. This bias is typically attributed to cultural stereotypes of Black men. Yet, previous research has not examined whether such biases emerge even in the absence of cultural stereotypes. The current research investigates whether individual differences in chronic beliefs about interpersonal threat interact with target group membership to elicit shooter biases, even when group membership is unrelated to race or cultural stereotypes about danger. Across two studies, participants with strong beliefs about interpersonal threats were more likely to mistakenly shoot outgroup members than ingroup members; this was observed for unfamiliar, arbitrarily formed groups using a minimal group paradigm (Study 1) and racial groups not culturally stereotyped as dangerous (Asians; Study 2). Implications for the roles of both group membership and cultural stereotypes in shaping decisions to shoot are discussed.

  11. X-shooter spectroscopy of young stellar objects in Lupus. Atmospheric parameters, membership, and activity diagnostics

    NASA Astrophysics Data System (ADS)

    Frasca, A.; Biazzo, K.; Alcalá, J. M.; Manara, C. F.; Stelzer, B.; Covino, E.; Antoniucci, S.

    2017-06-01

    Aims: A homogeneous determination of basic stellar parameters of young stellar object (YSO) candidates is needed to confirm their pre-main sequence evolutionary stage and membership in star-forming regions (SFRs), and to obtain reliable values of the quantities related to chromospheric activity and accretion. Methods: We used the code ROTFIT and synthetic BT-Settl spectra for the determination of the atmospheric parameters (Teff and log g), veiling (r), radial velocity (RV), and projected rotational velocity (vsini) from X-shooter spectra of 102 YSO candidates (95 of infrared Class II and seven Class III) in the Lupus SFR. The spectral subtraction of inactive templates, rotationally broadened to match the vsini of the targets, enabled us to measure the line fluxes for several diagnostics of both chromospheric activity and accretion, such as the Hα, Hβ, Ca II, and Na I lines. Results: We have shown that 13 candidates can be rejected as Lupus members based on their discrepant RV with respect to Lupus and/or their very low log g values. At least 11 of them are background giants, two of which turned out to be lithium-rich giants. Regarding the members, we found that all Class III sources have Hα fluxes that are compatible with pure chromospheric activity, while objects with disks lie mostly above the boundary between chromospheres and accretion. Young stellar objects with transitional disks display both high and low Hα fluxes. We found that the line fluxes per unit surface are tightly correlated with the accretion luminosity (Lacc) derived from the Balmer continuum excess. This rules out that the relationships between Lacc and line luminosities found in previous works are simply due to calibration effects. We also found that the Ca II-IRT flux ratio, F(Ca II λ8542)/F(Ca II λ8498), is always small, indicating an optically thick emission source. The latter can be identified with the accretion shock near the stellar photosphere. The Balmer decrement reaches instead, for several accretors, high
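
    For readers unfamiliar with the spectral-subtraction technique mentioned in the Methods, the conversion below is the standard way an equivalent width measured in the residual spectrum is turned into a surface flux and line luminosity. This is a generic sketch of the approach, not necessarily the exact calibration used by these authors.

```latex
% Generic conversion used in subtraction-based activity studies: the
% equivalent width of the residual emission (after subtracting the
% rotationally broadened inactive template) times the surface continuum
% flux from a synthetic spectrum at the star's Teff and log g gives the
% line surface flux; scaling by the stellar surface gives the luminosity.
\[
  F_{\mathrm{line}} = EW_{\mathrm{res}} \times F_{\mathrm{cont}}(T_{\mathrm{eff}}, \log g),
  \qquad
  L_{\mathrm{line}} = 4 \pi R_{\star}^{2} \, F_{\mathrm{line}}.
\]
```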

  12. Death with a Story: How Story Impacts Emotional, Motivational, and Physiological Responses to First-Person Shooter Video Games

    ERIC Educational Resources Information Center

    Schneider, Edward F.; Lang, Annie; Shin, Mija; Bradley, Samuel D.

    2004-01-01

    This study investigates how game playing experience changes when a story is added to a first-person shooter game. Dependent variables include identification, presence, emotional experiences and motivations. When story was present, game players felt greater identification, sense of presence, and physiological arousal. The presence of story did not…

  13. A Framework for School Safety and Risk Management: Results from a Study of 18 Targeted School Shooters

    ERIC Educational Resources Information Center

    Lenhardt, Ann Marie C.; Graham, Lemuel W.; Farrell, Melissa L.

    2018-01-01

    Targeted violence continues to pose a threat to school safety. Reported here are the results of a study of 18 cases of school shooters from 1996 to 2012. Variables examined are individual factors and behaviors, family dynamics, and triggering events. Results indicate the need for expanded school-based mental health services, threat assessment, and…

  14. Science with Constellation-X, Choice of Instrumentation

    NASA Technical Reports Server (NTRS)

    Hornscheimeier, Ann; White, Nicholas; Tananbaum, Harvey; Garcia, Michael; Bookbinder, Jay; Petre, Robert; Cottam, Jean

    2007-01-01

    The Constellation X-ray Observatory is one of the two Beyond Einstein Great Observatories and will provide a 100-fold increase in collecting area in high-spectral-resolving-power X-ray instruments over the Chandra and XMM-Newton gratings instruments. The mission has four main science objectives which drive the requirements for the mission. This contribution to the Garmire celebration conference describes these four science areas: Black Holes, Dark Energy, Missing Baryons, and the Neutron Star Equation of State, as well as the requirements flow-down that gives rise to the choice of instrumentation and implementation for Constellation-X. As we show, each of these science areas places complementary constraints on mission performance parameters such as collecting area, spectral resolving power, timing resolution, and field of view. The mission's capabilities will enable a great breadth of science, and its resources will be open to the community through its General Observer program.

  15. Execution of the Spitzer In-orbit Checkout and Science Verification Plan

    NASA Technical Reports Server (NTRS)

    Miles, John W.; Linick, Susan H.; Long, Stacia; Gilbert, John; Garcia, Mark; Boyles, Carole; Werner, Michael; Wilson, Robert K.

    2004-01-01

    The Spitzer Space Telescope is an 85-cm telescope with three cryogenically cooled instruments. Following launch, the observatory was initialized and commissioned for science operations during the in-orbit checkout (IOC) and science verification (SV) phases, carried out over a total of 98.3 days. The execution of the IOC/SV mission plan progressively established Spitzer capabilities taking into consideration thermal, cryogenic, optical, pointing, communications, and operational designs and constraints. The plan was carried out with high efficiency, making effective use of cryogen-limited flight time. One key component of the success of the plan was the pre-launch allocation of schedule reserve in the timeline of IOC/SV activities, and how it was used in flight both to cover activity redesign and growth due to continually improving spacecraft and instrument knowledge, and to recover from anomalies. This paper describes the adaptive system design and evolution, implementation, and lessons learned from IOC/SV operations. It is hoped that this information will provide guidance to future missions with similar engineering challenges.

  16. Towards Verification and Validation for Increased Autonomy

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra

    2017-01-01

    This presentation goes over the work we have performed over the last few years on verification and validation (V&V) of the next-generation onboard collision avoidance system, ACAS X, for commercial aircraft. It describes our work on probabilistic verification and synthesis of the model that ACAS X is based on, and goes on to the validation of that model with respect to actual simulation and flight data. The presentation then moves on to identify the characteristics of ACAS X that are related to autonomy and to discuss the challenges that autonomy poses for V&V. All work presented has already been published.

  17. The redMaPPer Galaxy Cluster Catalog From DES Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rykoff, E. S.

    We describe updates to the redMaPPer algorithm, a photometric red-sequence cluster finder specifically designed for large photometric surveys. The updated algorithm is applied to 150 deg² of Science Verification (SV) data from the Dark Energy Survey (DES), and to the Sloan Digital Sky Survey (SDSS) DR8 photometric data set. The DES SV catalog is locally volume limited, and contains 786 clusters with richness λ > 20 (roughly equivalent to M500c ≳ 10¹⁴ h₇₀⁻¹ M⊙) and 0.2 < z < 0.9. The DR8 catalog consists of 26,311 clusters with 0.08 < z < 0.6, with a sharply increasing richness threshold as a function of redshift for z ≳ 0.35. The photometric redshift performance of both catalogs is shown to be excellent, with photometric redshift uncertainties controlled at the σz/(1+z) ∼ 0.01 level for z ≲ 0.7, rising to ∼0.02 at z ∼ 0.9 in DES SV. We make use of Chandra and XMM X-ray and South Pole Telescope Sunyaev-Zeldovich data to show that the centering performance and mass-richness scatter are consistent with expectations based on prior runs of redMaPPer on SDSS data. We also show how the redMaPPer photo-z and richness estimates are relatively insensitive to imperfect star/galaxy separation and small-scale star masks.

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  19. Commissioning and Science Verification of JAST/T80

    NASA Astrophysics Data System (ADS)

    Ederoclte, A.; Cenarro, A. J.; Marín-Franch, A.; Cristóbal-Hornillos, D.; Vázquez Ramió, H.; Varela, J.; Hurier, G.; Moles, M.; Lamadrid, J. L.; Díaz-Martín, M. C.; Iglesias Marzoa, R.; Tilve, V.; Rodríguez, S.; Maícas, N.; Abri, J.

    2017-03-01

    Located at the Observatorio Astrofísico de Javalambre, the "Javalambre Auxiliary Survey Telescope" is an 80 cm telescope with an unvignetted 2-square-degree field of view. The telescope is equipped with T80Cam, a camera with a large-format CCD and two filter wheels which can host, at any given time, 12 filters. The telescope has been designed to provide optical quality all across the field of view, which is achieved with a field corrector. In this talk, I will review the commissioning of the telescope. The optical performance in the centre of the field of view has been tested with the lucky-imaging technique, yielding a telescope PSF of 0.4", which is close to the one expected from theory. Moreover, the tracking of the telescope does not affect the image quality, as stars appear round even in exposures of 10 minutes obtained without guiding. Most importantly, we present the preliminary results of science verification observations which combine the two main characteristics of this telescope: the large field of view and the special filter set.

  20. X-shooter spectroscopy of young stellar objects. IV. Accretion in low-mass stars and substellar objects in Lupus

    NASA Astrophysics Data System (ADS)

    Alcalá, J. M.; Natta, A.; Manara, C. F.; Spezzi, L.; Stelzer, B.; Frasca, A.; Biazzo, K.; Covino, E.; Randich, S.; Rigliaco, E.; Testi, L.; Comerón, F.; Cupani, G.; D'Elia, V.

    2014-01-01

    We present VLT/X-shooter observations of a sample of 36 accreting low-mass stellar and substellar objects (YSOs) in the Lupus star-forming region, spanning a range in mass from ~0.03 to ~1.2 M⊙, but mostly with 0.1 M⊙

  1. Wild game consumption habits among Italian shooters: relevance for intakes of cadmium, perfluorooctanesulphonic acid, and 137cesium as priority contaminants.

    PubMed

    Ferri, Mauro; Baldi, Loredana; Cavallo, Stefania; Pellicanò, Roberta; Brambilla, Gianfranco

    2017-05-01

    The consumption habits of 766 Italian shooters (96% males, 4% females), on average 52 years old, were investigated through questionnaires distributed during the shooters' attendance at training and teaching courses held in compliance with the 853/2004/EC Regulation provisions on food hygiene. The most consumed wild species recorded were pheasant > woodcock > choke among feathered animals, and wild boar > hare > roe deer among mammals, respectively. An average of 100-200 g of game per serving (four servings per month) was consumed, with highest intakes of 3000 g per month; meat, liver, and heart were the preferred food items. Mammalian and feathered game was regularly consumed with friends and relatives in 83% and 60% of cases, respectively. Accounting for an inventoried population of 751,876 shooters in Italy, it is estimated that wild game is regularly consumed by around 3% of the Italian population. More than 80% of responders were aware of health risks related to game handling and to food safety issues. Due to the occurrence in wild boar meat and liver of the heavy metal cadmium (Cd), the persistent organic pollutant perfluorooctanesulphonic acid (PFOS), and the radionuclide cesium-137 (¹³⁷Cs), it was possible to demonstrate the usefulness of such a food consumption database for intake assessment in this sensitive group of consumers. For high consumers of wild boar, threshold concentrations for intakes have been estimated in the ranges of 48-93 ng g⁻¹ for Cd, 35-67 ng g⁻¹ for PFOS, and 0.20-0.34 Bq kg⁻¹ for ¹³⁷Cs.

  2. Signature Verification Using N-tuple Learning Machine.

    PubMed

    Maneechot, Thanin; Kitjaidure, Yuttana

    2005-01-01

    This research presents a new algorithm for signature verification using an N-tuple learning machine. The features are taken from handwritten signatures captured on a digital tablet (on-line). The recognition algorithm uses four extracted features, namely horizontal and vertical pen-tip position (x-y position), pen-tip pressure, and pen altitude angles. Verification uses the N-tuple technique with Gaussian thresholding.
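
    As a minimal sketch of how an N-tuple (RAM-based) verifier of this general kind can work, the code below assumes the four pen features have already been reduced to a fixed-length binary vector; the class name, parameters, and binarization are hypothetical, and the Gaussian thresholding step of the paper is not reproduced.

```python
# Illustrative N-tuple verifier over binary feature vectors (assumed
# preprocessing: thresholded x-y position, pressure, and altitude samples).
import numpy as np

class NTupleVerifier:
    def __init__(self, n_bits, tuple_size=4, n_tuples=64, seed=0):
        rng = np.random.default_rng(seed)
        # Fixed random partition of input bits into address tuples
        self.tuples = [rng.choice(n_bits, tuple_size, replace=False)
                       for _ in range(n_tuples)]
        self.memory = [set() for _ in range(n_tuples)]

    def _addresses(self, bits):
        # Each tuple of bits forms one hashable "RAM address"
        return [tuple(bits[idx]) for idx in self.tuples]

    def train(self, bits):
        # Enrollment: remember every address produced by a genuine signature
        for mem, addr in zip(self.memory, self._addresses(bits)):
            mem.add(addr)

    def score(self, bits):
        # Fraction of tuples whose address was seen during enrollment
        hits = sum(addr in mem
                   for mem, addr in zip(self.memory, self._addresses(bits)))
        return hits / len(self.tuples)

# Usage: enroll genuine signatures, then accept if score exceeds a threshold
rng = np.random.default_rng(1)
genuine = (rng.random((10, 256)) > 0.5).astype(int)
verifier = NTupleVerifier(n_bits=256)
for sig in genuine:
    verifier.train(sig)
print(verifier.score(genuine[0]))   # high for genuine-like input
```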

  3. Monte Carlo modeling of HD120 multileaf collimator on Varian TrueBeam linear accelerator for verification of 6X and 6X FFF VMAT SABR treatment plans

    PubMed Central

    Gete, Ermias; Duzenli, Cheryl; Teke, Tony

    2014-01-01

    A Monte Carlo (MC) validation of the vendor-supplied Varian TrueBeam 6 MV flattened (6X) phase-space file and the first implementation of the Siebers-Keall MC MLC model as applied to the HD120 MLC (for 6X flat and 6X flattening-filter-free (6X FFF) beams) are described. The MC model is validated in the context of VMAT patient-specific quality assurance. The Monte Carlo commissioning process involves: 1) validating the calculated open-field percentage depth doses (PDDs), profiles, and output factors (OF), 2) adapting the Siebers-Keall MLC model to match the new HD120-MLC geometry and material composition, 3) determining the absolute dose conversion factor for the MC calculation, and 4) validating this entire linac/MLC in the context of dose calculation verification for clinical VMAT plans. MC PDDs for the 6X beams agree with the measured data to within 2.0% for field sizes ranging from 2 × 2 to 40 × 40 cm². Measured and MC profiles show agreement in the 50% field width and the 80%-20% penumbra region to within 1.3 mm for all square field sizes. MC OFs for the 2 to 40 cm² square fields agree with measurement to within 1.6%. Verification of VMAT SABR lung, liver, and vertebra plans demonstrates that measured and MC ion chamber doses agree within 0.6% for the 6X beam and within 2.0% for the 6X FFF beam. A 3D gamma factor analysis demonstrates that for the 6X beam, > 99% of voxels meet the pass criteria (3%/3 mm). For the 6X FFF beam, > 94% of voxels meet this criterion. The TrueBeam accelerator delivering 6X and 6X FFF beams with the HD120 MLC can be modeled in Monte Carlo to provide an independent 3D dose calculation for clinical VMAT plans. This quality assurance tool has been used clinically to verify over 140 6X and 16 6X FFF TrueBeam treatment plans. PACS number: 87.55.K- PMID:24892341
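
    The pass-rate figures above come from a gamma-index comparison. As a hedged illustration of the metric, the sketch below implements a 1-D toy version with global 3%/3 mm criteria; clinical tools operate on 3-D dose grids, and the profiles here are synthetic.

```python
# Minimal 1-D gamma-index sketch (global 3%/3 mm criteria).
import numpy as np

def gamma_1d(x, d_eval, d_ref, dd=0.03, dta=3.0):
    """Gamma for each reference point: min over evaluated points of
    sqrt((dose diff / dose criterion)^2 + (distance / DTA)^2)."""
    d_max = d_ref.max()                       # global dose normalization
    gam = np.empty_like(d_ref)
    for i, (xr, dr) in enumerate(zip(x, d_ref)):
        dose_term = (d_eval - dr) / (dd * d_max)
        dist_term = (x - xr) / dta
        gam[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gam

x = np.linspace(0, 100, 501)                  # position [mm]
ref = np.exp(-((x - 50) / 20) ** 2)           # reference dose profile
ev = 1.01 * np.exp(-((x - 50.5) / 20) ** 2)   # slightly shifted/scaled
g = gamma_1d(x, ev, ref)
print(f"pass rate: {100 * np.mean(g <= 1):.1f}%")   # gamma <= 1 passes
```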

  4. The Race To X-ray Microbeam and Nanobeam Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ice, Gene E; Budai, John D; Pang, Judy

    2011-01-01

    X-ray microbeams are an emerging characterization tool with transformational implications for broad areas of science ranging from materials structure and dynamics, geophysics, and environmental science to biophysics and protein crystallography. In this review, we discuss the race toward sub-10 nm X-ray beams with the ability to penetrate tens to hundreds of microns into most materials and with the ability to determine local (crystal) structure. Examples of science enabled by current micro/nanobeam technologies are presented, and we provide a perspective on future directions. Applications highlighted are chosen to illustrate the important features of various submicron beam strategies and to highlight the directions of current and future research. While it is clear that X-ray microprobes will impact science broadly, the practical limit for hard X-ray beam size, the limit to trace element sensitivity, and the ultimate limitations associated with near-atomic structure determinations are the subject of ongoing research.

  5. Evolution of Galaxy Luminosity and Stellar-Mass Functions since $z=1$ with the Dark Energy Survey Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Capozzi, D.; et al.

    We present the first study of the evolution of the galaxy luminosity and stellar-mass functions (GLF and GSMF) carried out by the Dark Energy Survey (DES). We describe the COMMODORE galaxy catalogue selected from Science Verification images. This catalogue is made of ∼4 × 10⁶ galaxies at 0 <

  6. X-shooter spectroscopy of young stellar objects in Lupus. Lithium, iron, and barium elemental abundances

    NASA Astrophysics Data System (ADS)

    Biazzo, K.; Frasca, A.; Alcalá, J. M.; Zusi, M.; Covino, E.; Randich, S.; Esposito, M.; Manara, C. F.; Antoniucci, S.; Nisini, B.; Rigliaco, E.; Getman, F.

    2017-09-01

    Aims: With the purpose of performing a homogeneous determination of elemental abundances for members of the Lupus T association, we analyzed three chemical elements: lithium, iron, and barium. The aims were: 1) to derive the lithium abundance for the almost complete sample (∼90%) of known class II stars in the Lupus I, II, III, and IV clouds; 2) to perform chemical tagging of a region where few iron abundance measurements have been obtained in the past, and no determination of the barium content has been done up to now. We also investigated possible barium enhancement at the very young age of the region, as this element has become increasingly interesting in the last few years following the evidence of barium over-abundance in young clusters, the origin of which is still unknown. Methods: Using the X-shooter spectrograph mounted on the Unit 2 (UT2) at the Very Large Telescope (VLT), we analyzed the spectra of 89 cluster members, both class II (82) and class III (7) stars. We measured the strength of the lithium line at λ6707.8 Å and derived the abundance of this element through equivalent width measurements and curves of growth. For six class II stars we also derived the iron and barium abundances using the spectral synthesis method and the code MOOG. The veiling contribution was taken into account in the abundance analysis for all three elements. Results: We find a dispersion in the strength of the lithium line at low effective temperatures and identify three targets with severe Li depletion. The nuclear age inferred for these highly lithium-depleted stars is around 15 Myr, which exceeds by an order of magnitude the isochronal one. We derive a nearly solar metallicity for the members whose spectra could be analyzed. We find that Ba is over-abundant by 0.7 dex with respect to the Sun. Since current theoretical models cannot reproduce this abundance pattern, we investigated whether this unusually large Ba content might be related to effects due to stellar

  7. Using Small-Step Refinement for Algorithm Verification in Computer Science Education

    ERIC Educational Resources Information Center

    Simic, Danijela

    2015-01-01

    Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyse similar…

  8. Photometric redshift analysis in the Dark Energy Survey Science Verification data

    NASA Astrophysics Data System (ADS)

    Sánchez, C.; Carrasco Kind, M.; Lin, H.; Miquel, R.; Abdalla, F. B.; Amara, A.; Banerji, M.; Bonnett, C.; Brunner, R.; Capozzi, D.; Carnero, A.; Castander, F. J.; da Costa, L. A. N.; Cunha, C.; Fausti, A.; Gerdes, D.; Greisel, N.; Gschwend, J.; Hartley, W.; Jouvel, S.; Lahav, O.; Lima, M.; Maia, M. A. G.; Martí, P.; Ogando, R. L. C.; Ostrovski, F.; Pellegrini, P.; Rau, M. M.; Sadeh, I.; Seitz, S.; Sevilla-Noarbe, I.; Sypniewski, A.; de Vicente, J.; Abbot, T.; Allam, S. S.; Atlee, D.; Bernstein, G.; Bernstein, J. P.; Buckley-Geer, E.; Burke, D.; Childress, M. J.; Davis, T.; DePoy, D. L.; Dey, A.; Desai, S.; Diehl, H. T.; Doel, P.; Estrada, J.; Evrard, A.; Fernández, E.; Finley, D.; Flaugher, B.; Frieman, J.; Gaztanaga, E.; Glazebrook, K.; Honscheid, K.; Kim, A.; Kuehn, K.; Kuropatkin, N.; Lidman, C.; Makler, M.; Marshall, J. L.; Nichol, R. C.; Roodman, A.; Sánchez, E.; Santiago, B. X.; Sako, M.; Scalzo, R.; Smith, R. C.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Tucker, D. L.; Uddin, S. A.; Valdés, F.; Walker, A.; Yuan, F.; Zuntz, J.

    2014-12-01

    We present results from a study of the photometric redshift performance of the Dark Energy Survey (DES), using the early data from a Science Verification period of observations in late 2012 and early 2013 that provided science-quality images for almost 200 sq. deg. at the nominal depth of the survey. We assess the photometric redshift (photo-z) performance using about 15 000 galaxies with spectroscopic redshifts available from other surveys. These galaxies are used, in different configurations, as a calibration sample, and photo-z's are obtained and studied using most of the existing photo-z codes. A weighting method in a multidimensional colour-magnitude space is applied to the spectroscopic sample in order to evaluate the photo-z performance with sets that mimic the full DES photometric sample, which is on average significantly deeper than the calibration sample due to the limited depth of spectroscopic surveys. Empirical photo-z methods using, for instance, artificial neural networks or random forests, yield the best performance in the tests, achieving core photo-z resolutions σ68 ~ 0.08. Moreover, the results from most of the codes, including template-fitting methods, comfortably meet the DES requirements on photo-z performance, therefore, providing an excellent precedent for future DES data sets.
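
    For reference, the σ68 metric quoted above is commonly computed as half the central 68% spread of the normalized redshift errors. A minimal sketch follows, with synthetic numbers chosen only to reproduce a ~0.08 core resolution; the exact convention (e.g., median-centred vs. percentile-based) can vary between analyses.

```python
# sigma_68: half the central 68% spread of (z_phot - z_spec) / (1 + z_spec)
import numpy as np

def sigma_68(z_phot, z_spec):
    dz = (z_phot - z_spec) / (1.0 + z_spec)
    lo, hi = np.percentile(dz, [15.865, 84.135])   # central 68% interval
    return 0.5 * (hi - lo)

rng = np.random.default_rng(0)
z_spec = rng.uniform(0.2, 1.2, 15000)
z_phot = z_spec + 0.08 * (1 + z_spec) * rng.standard_normal(z_spec.size)
print(sigma_68(z_phot, z_spec))   # ~0.08 by construction
```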

  9. Playing a first-person shooter video game induces neuroplastic change.

    PubMed

    Wu, Sijing; Cheng, Cho Kin; Feng, Jing; D'Angelo, Lisa; Alain, Claude; Spence, Ian

    2012-06-01

    Playing a first-person shooter (FPS) video game alters the neural processes that support spatial selective attention. Our experiment establishes a causal relationship between playing an FPS game and neuroplastic change. Twenty-five participants completed an attentional visual field task while we measured ERPs before and after playing an FPS video game for a cumulative total of 10 hr. Early visual ERPs sensitive to bottom-up attentional processes were little affected by video game playing for only 10 hr. However, participants who played the FPS video game and also showed the greatest improvement on the attentional visual field task displayed increased amplitudes in the later visual ERPs. These potentials are thought to index top-down enhancement of spatial selective attention via increased inhibition of distractors. Individual variations in learning were observed, and these differences show that not all video game players benefit equally, either behaviorally or in terms of neural change.

  10. Modeling the Magnetospheric X-ray Emission from Solar Wind Charge Exchange with Verification from XMM-Newton Observations

    DTIC Science & Technology

    2016-08-26

    Journal of Geophysical Research: Space Physics. "Modeling the magnetospheric X-ray emission from solar wind charge exchange with verification from XMM-Newton observations," Ian C. Whittaker, Steve Sembay, Jennifer A. Carter, Andrew M. Read, Steve E. Milan, and Minna Palmroth. ... Citation: J. Geophys. Res. Space Physics, 121, 4158-4179, doi:10.1002/2015JA022292. Received 21 DEC 2015; accepted 26 FEB 2016; accepted article online 29

  11. X-shooter spectroscopy of young stellar objects in Lupus. Accretion properties of class II and transitional objects

    NASA Astrophysics Data System (ADS)

    Alcalá, J. M.; Manara, C. F.; Natta, A.; Frasca, A.; Testi, L.; Nisini, B.; Stelzer, B.; Williams, J. P.; Antoniucci, S.; Biazzo, K.; Covino, E.; Esposito, M.; Getman, F.; Rigliaco, E.

    2017-04-01

    The mass accretion rate, Ṁacc, is a key quantity for the understanding of the physical processes governing the evolution of accretion discs around young low-mass (M⋆ ≲ 2.0 M⊙) stars and substellar objects (YSOs). We present here the results of a study of the stellar and accretion properties of the (almost) complete sample of class II and transitional YSOs in the Lupus I, II, III and IV clouds, based on spectroscopic data acquired with the VLT/X-shooter spectrograph. Our study combines the dataset from our previous work with new observations of 55 additional objects. We have investigated 92 YSO candidates in total, 11 of which have been definitely identified with giant stars unrelated to Lupus. The stellar and accretion properties of the 81 bona fide YSOs, which represent more than 90% of the whole class II and transition disc YSO population in the aforementioned Lupus clouds, have been homogeneously and self-consistently derived, allowing for an unbiased study of accretion and its relationship with stellar parameters. The accretion luminosity, Lacc, increases with the stellar luminosity, L⋆, with an overall slope of 1.6, similar but with a smaller scatter than in previous studies. There is a significant lack of strong accretors below L⋆ ≈ 0.1 L⊙, where Lacc is always lower than 0.01 L⋆. We argue that the Lacc - L⋆ slope is not due to observational biases, but is a true property of the Lupus YSOs. The log Ṁacc - log M⋆ correlation shows a statistically significant evidence of a break, with a steeper relation for M⋆ ≲ 0.2 M⊙ and a flatter slope for higher masses. The bimodality of the Ṁacc - M⋆ relation is confirmed with four different evolutionary models used to derive the stellar mass. The bimodal behaviour of the observed relationship supports the importance of modelling self-gravity in the early evolution of the more massive discs, but other processes, such as photo-evaporation and planet formation during the YSO's lifetime, may

  12. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    DTIC Science & Technology

    1996-09-01

    Computer Science. A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos, September 1996, CMU-CS-96-199. ... implied, of NSF, the Semiconductor Research Corporation, ARPA or the U.S. government. Keywords: real-time systems, formal verification, symbolic

  13. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Verification Games: Crowd-Sourced Formal Verification. University of Washington, March 2016, Final Technical Report. Dates covered (from - to): JUN 2012 - SEP 2015. Contract number: FA8750... clarification memorandum dated 16 Jan 09. Abstract: Over the more than three years of the project Verification Games: Crowd-sourced

  14. Vibration transmissibility on rifle shooter: A comparison between accelerometer and laser Doppler vibrometer data

    NASA Astrophysics Data System (ADS)

    Scalise, L.; Casacanditella, L.; Santolini, C.; Martarelli, M.; Tomasini, E. P.

    2014-05-01

    The transmission of mechanical vibrations from tools to human subjects is known to be potentially dangerous for the circulatory and neurological systems. It is also known that such damage depends strongly on the intensity and the frequency range of the vibrational signals transferred to the different anatomical districts. In this paper, the very high impulsive signals generated by a rifle shot are studied; such signals are characterised by very high acceleration amplitudes as well as a wide frequency range. We present an experimental setup designed to collect data on the transmission of vibration signals from the rifle to the shoulder of a subject during the shooting action. In particular, the transmissibility of the acceleration signals, as well as of the velocity signals, between the rifle stock and the subject's back shoulder is measured using two piezoelectric accelerometers and a single-point laser Doppler vibrometer (LDV). Tests were carried out in a shooting lab where a professional shooter conducted the experiments using different experimental configurations: two different types of stocks and two kinds of bullets with different weights were considered. Two uniaxial accelerometers were fixed on the stock of the weapon and on the back of the shoulder of the shooter, respectively. Vibration from the back shoulder was also measured simultaneously by means of the LDV. A comparison of the measured results is presented, and the pros and cons of contact and non-contact transducers are discussed, taking into account possible sources of measurement uncertainty such as unwanted sensor vibrations affecting the accelerometer.
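
    One common way to estimate the transmissibility discussed above is the H1 ratio of the cross-spectrum to the input auto-spectrum. The sketch below illustrates this on synthetic signals; the sampling rate, delay, and attenuation are invented for the example and are not taken from the paper.

```python
# Hedged sketch: transmissibility spectrum between a "stock" input channel
# and a "shoulder" response channel via Welch estimates (H1 = S_xy / S_xx).
import numpy as np
from scipy.signal import welch, csd

fs = 50_000                                   # Hz (illustrative)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)
stock = rng.standard_normal(t.size)           # input: rifle-stock signal
# Response: attenuated, delayed copy of the input plus sensor noise
shoulder = 0.4 * np.roll(stock, 5) + 0.05 * rng.standard_normal(t.size)

f, s_xx = welch(stock, fs, nperseg=4096)      # input auto-spectrum
_, s_xy = csd(stock, shoulder, fs, nperseg=4096)   # cross-spectrum
transmissibility = np.abs(s_xy) / s_xx        # |H1(f)|, ~0.4 here
print(f[:3], transmissibility[:3])
```

    The H1 estimator is a natural choice when the noise sits mainly on the response channel; with noisy inputs (e.g., spurious accelerometer resonances), an H2 or Hv estimator may be preferred.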

  15. Photometric redshift analysis in the Dark Energy Survey Science Verification data

    DOE PAGES

    Sanchez, C.; Carrasco Kind, M.; Lin, H.; ...

    2014-10-09

    In this study, we present results from a study of the photometric redshift performance of the Dark Energy Survey (DES), using the early data from a Science Verification period of observations in late 2012 and early 2013 that provided science-quality images for almost 200 sq. deg. at the nominal depth of the survey. We assess the photometric redshift (photo-z) performance using about 15 000 galaxies with spectroscopic redshifts available from other surveys. These galaxies are used, in different configurations, as a calibration sample, and photo-z's are obtained and studied using most of the existing photo-z codes. A weighting method in a multidimensional colour-magnitude space is applied to the spectroscopic sample in order to evaluate the photo-z performance with sets that mimic the full DES photometric sample, which is on average significantly deeper than the calibration sample due to the limited depth of spectroscopic surveys. In addition, empirical photo-z methods using, for instance, artificial neural networks or random forests, yield the best performance in the tests, achieving core photo-z resolutions σ68 ~ 0.08. Moreover, the results from most of the codes, including template-fitting methods, comfortably meet the DES requirements on photo-z performance, therefore, providing an excellent precedent for future DES data sets.

  16. The South Carolina National Guard Secure Area Duty Officer Program: A Reserve Component Active Shooter Contingency Case Study

    DTIC Science & Technology

    2017-12-01

    ... (SADOP) was authorized as SCNG policy on October 3, 2015. This research constitutes a case study of the SCNG SADOP and catalogs the program from... concealed firearms carry law. Methodology: This thesis is a single case study of SADOP, which is an exceptional case and the only one of its kind... "The South Carolina National Guard Secure Area Duty Officer Program: A Reserve Component Active Shooter Contingency Case Study," by Barry N. Ramey, December

  17. Letter Report - Verification Survey of Final Grids at the David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P.C. Weaver

    2009-02-17

    Conduct verification surveys of grids at the DWI 1630 Site in Knoxville, Tennessee. The independent verification team (IVT) from ORISE conducted verification activities in whole and partial grids, as completed by BJC. ORISE site activities included gamma surface scans and soil sampling within 33 grids: G11 through G14; H11 through H15; X14, X15, X19, and X21; J13 through J15 and J17 through J21; K7 through K9 and K13 through K15; L13 through L15; and M14 through M16.

  18. Comparison of VLT/X-shooter OH and O2 rotational temperatures with consideration of TIMED/SABER emission and temperature profiles

    NASA Astrophysics Data System (ADS)

    Noll, S.; Kausch, W.; Kimeswenger, S.; Unterguggenberger, S.; Jones, A. M.

    2015-11-01

    Rotational temperatures Trot derived from lines of the same OH band provide an important means of studying the dynamics and long-term trends in the mesopause region near 87 km. To measure realistic temperatures, a corresponding Boltzmann distribution of the rotational level populations has to be achieved. However, this might not be fulfilled, especially at high emission altitudes. In order to quantify possible non-local thermodynamic equilibrium (non-LTE) contributions to the OH Trot as a function of the upper vibrational level v', we studied a sample of 343 echelle spectra taken with the X-shooter spectrograph at the Very Large Telescope at Cerro Paranal in Chile. These data allowed us to analyse 25 OH bands in each spectrum. Moreover, we could measure lines of O2b(0-1), which peaks at about 94 to 95 km, and O2a(0-0) with an emission peak at about 90 km. The latter altitude is reached in the second half of the night after a rise of several km because of the decay of a daytime population of excited O2. Since the radiative lifetimes for the upper levels of the two O2 bands are relatively long, the derived Trot are not significantly affected by non-LTE contributions. These bands are well suited for a comparison with OH if the differences in the emission profiles are corrected. For different sample averages, we made these corrections by using OH emission, O2a(0-0) emission, and CO2-based temperature profile data from the multi-channel radiometer SABER on the TIMED satellite. The procedure relies on differences of profile-weighted SABER temperatures. For an O2a(0-0)-based reference profile at 90 km, we found a good agreement of the O2 with the SABER-related temperatures, whereas the OH temperatures, especially for the high and even v', showed significant excesses with a maximum of more than 10 K for v' = 8. The exact value depends on the selected lines and molecular parameters. We could also find a nocturnal trend towards higher non-LTE effects, particularly for high v'. The
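
    The rotational-temperature technique referred to above rests on a Boltzmann fit to the relative populations of the upper rotational levels. In standard notation (a generic formulation, not specific to this paper's choice of lines or molecular parameters):

```latex
% Boltzmann analysis of lines from a single OH(v') band: each measured line
% intensity I is converted to an upper-level population N and plotted
% against the upper-level energy E.
\begin{align*}
  I_{J' \to J''} &\propto N_{J'}\, A_{J' \to J''}, \\
  \ln\frac{N_{J'}}{g_{J'}} &= -\frac{E_{J'}}{k_{\mathrm{B}} T_{\mathrm{rot}}} + \mathrm{const}.
\end{align*}
% Under LTE the points fall on a straight line of slope -1/(kB * Trot);
% curvature or v'-dependent offsets signal the non-LTE contributions
% discussed above.
```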

  19. CMOS VLSI Layout and Verification of a SIMD Computer

    NASA Technical Reports Server (NTRS)

    Zheng, Jianqing

    1996-01-01

    A CMOS VLSI layout and verification of a 3 x 3 processor parallel computer has been completed. The layout was done using the MAGIC tool and the verification using HSPICE. Suggestions for expanding the computer into a million processor network are presented. Many problems that might be encountered when implementing a massively parallel computer are discussed.

  20. CMB lensing tomography with the DES Science Verification galaxies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giannantonio, T.

    We measure the cross-correlation between the galaxy density in the Dark Energy Survey (DES) Science Verification data and the lensing of the cosmic microwave background (CMB) as reconstructed with the Planck satellite and the South Pole Telescope (SPT). When using the DES main galaxy sample over the full redshift range 0.2 < z_phot < 1.2, a cross-correlation signal is detected at 6σ and 4σ with SPT and Planck, respectively. We then divide the DES galaxies into five photometric redshift bins, finding significant (>2σ) detections in all bins. Comparing to the fiducial Planck cosmology, we find the redshift evolution of the signal matches expectations, although the amplitude is consistently lower than predicted across redshift bins. We test for possible systematics that could affect our result and find no evidence for significant contamination. Finally, we demonstrate how these measurements can be used to constrain the growth of structure across cosmic time. We find the data are fit by a model in which the amplitude of structure in the z < 1.2 universe is 0.73 ± 0.16 times as large as predicted in the ΛCDM Planck cosmology, a 1.7σ deviation.

  1. CMB lensing tomography with the DES Science Verification galaxies

    DOE PAGES

    Giannantonio, T.

    2016-01-07

    We measure the cross-correlation between the galaxy density in the Dark Energy Survey (DES) Science Verification data and the lensing of the cosmic microwave background (CMB) as reconstructed with the Planck satellite and the South Pole Telescope (SPT). When using the DES main galaxy sample over the full redshift range 0.2 < z_phot < 1.2, a cross-correlation signal is detected at 6σ and 4σ with SPT and Planck, respectively. We then divide the DES galaxies into five photometric redshift bins, finding significant (>2σ) detections in all bins. Comparing to the fiducial Planck cosmology, we find the redshift evolution of the signal matches expectations, although the amplitude is consistently lower than predicted across redshift bins. We test for possible systematics that could affect our result and find no evidence for significant contamination. Finally, we demonstrate how these measurements can be used to constrain the growth of structure across cosmic time. We find the data are fit by a model in which the amplitude of structure in the z < 1.2 universe is 0.73 ± 0.16 times as large as predicted in the ΛCDM Planck cosmology, a 1.7σ deviation.
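
    The quoted amplitude of 0.73 ± 0.16 is the kind of number produced by a one-parameter rescaling of a fiducial cross-spectrum prediction. Below is a minimal inverse-variance sketch with synthetic bandpowers; all names and numbers are illustrative, and a real analysis would use the full bandpower covariance rather than a diagonal one.

```python
# One-parameter amplitude fit A to measured bandpowers c_obs against a
# fiducial prediction c_th, with diagonal (inverse-variance) weights.
import numpy as np

def fit_amplitude(c_obs, c_th, sigma):
    w = 1.0 / sigma**2
    a = np.sum(w * c_obs * c_th) / np.sum(w * c_th**2)   # best-fit A
    err = np.sqrt(1.0 / np.sum(w * c_th**2))             # 1-sigma error
    return a, err

rng = np.random.default_rng(0)
c_th = 1e-7 * np.exp(-np.linspace(0, 3, 12))   # fiducial bandpowers
sigma = 0.3 * c_th + 2e-9                      # per-band uncertainties
c_obs = 0.73 * c_th + sigma * rng.standard_normal(c_th.size)
print(fit_amplitude(c_obs, c_th, sigma))       # ~(0.73, ...) by construction
```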

  2. Optical Testing and Verification Methods for the James Webb Space Telescope Integrated Science Instrument Module Element

    NASA Technical Reports Server (NTRS)

    Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; et al.

    2016-01-01

    NASA's James Webb Space Telescope (JWST) is a 6.5 m diameter, segmented, deployable telescope for cryogenic IR space astronomy (40 K). The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM), which contains four science instruments (SI) and the fine guider. The SIs are mounted to a composite metering structure. The SI and guider units were integrated to the ISIM structure and optically tested at the NASA Goddard Space Flight Center as a suite using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.

  3. Optical testing and verification methods for the James Webb Space Telescope Integrated Science Instrument Module element

    NASA Astrophysics Data System (ADS)

    Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; Eichhorn, William L.; Glasse, Alistair C.; Gracey, Renee; Hartig, George F.; Howard, Joseph M.; Kelly, Douglas M.; Kimble, Randy A.; Kirk, Jeffrey R.; Kubalak, David A.; Landsman, Wayne B.; Lindler, Don J.; Malumuth, Eliot M.; Maszkiewicz, Michael; Rieke, Marcia J.; Rowlands, Neil; Sabatke, Derek S.; Smith, Corbett T.; Smith, J. Scott; Sullivan, Joseph F.; Telfer, Randal C.; Te Plate, Maurice; Vila, M. Begoña.; Warner, Gerry D.; Wright, David; Wright, Raymond H.; Zhou, Julia; Zielinski, Thomas P.

    2016-09-01

    NASA's James Webb Space Telescope (JWST) is a 6.5m diameter, segmented, deployable telescope for cryogenic IR space astronomy. The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM), which contains four science instruments (SI) and the Fine Guidance Sensor (FGS). The SIs are mounted to a composite metering structure. The SIs and FGS were integrated to the ISIM structure and optically tested at NASA's Goddard Space Flight Center using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, implementation of associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.

  4. The Science Goals of the Constellation-X Mission

    NASA Technical Reports Server (NTRS)

    White, Nicholas E.; Tananbaum, Harvey; Weaver, Kimberly; Petre, Robert; Bookbinder, Jay

    2004-01-01

    The Constellation-X mission will address the questions: "What happens to matter close to a black hole?" and "What is Dark Energy?" These questions are central to the NASA Beyond Einstein Program, where Constellation-X plays a central role. The mission will address these questions by using high throughput X-ray spectroscopy to observe the effects of strong gravity close to the event horizon of black holes, and to observe the formation and evolution of clusters of galaxies in order to precisely determine Cosmological parameters. To achieve these primary science goals requires a factor of 25-100 increase in sensitivity for high resolution spectroscopy. The mission will also perform routine high- resolution X-ray spectroscopy of faint and extended X-ray source populations. This will provide diagnostic information such as density, elemental abundances, velocity, and ionization state for a wide range of astrophysical problems. This has enormous potential for the discovery of new unexpected phenomena. The Constellation-X mission is a high priority in the National Academy of Sciences McKee-Taylor Astronomy and Astrophysics Survey of new Astrophysics Facilities for the first decade of the 21st century.

  5. Most Efficient Spectrograph to Shoot the Southern Skies

    NASA Astrophysics Data System (ADS)

    2009-05-01

    ESO's Very Large Telescope -- Europe's flagship facility for ground-based astronomy -- has been equipped with the first of its second-generation instruments: X-shooter. It can record the entire spectrum of a celestial object in one shot -- from the ultraviolet to the near-infrared -- with high sensitivity. This unique new instrument will be particularly useful for the study of distant exploding objects called gamma-ray bursts. (ESO PR Photo 20a/09: an X-shooter spectrum; ESO PR Photo 20b/09: the X-shooter instrument; ESO PR Photo 20c/09: first light of X-shooter.) "X-shooter offers a capability that is unique among astronomical instruments installed at large telescopes," says Sandro D'Odorico, who coordinated the Europe-wide consortium of scientists and engineers that built this remarkable instrument. "Until now, different instruments at different telescopes and multiple observations were needed to cover this kind of wavelength range, making it very difficult to compare data, which, even though from the same object, could have been taken at different times and under different sky conditions." X-shooter collects the full spectrum from the ultraviolet (300 nm) to the near-infrared (2400 nm) in parallel, capturing up to half of all the light from an object that passes through the atmosphere and the various elements of the telescope. "All in all, X-shooter can save us a factor of three or more in terms of precious telescope time and opens a new window of opportunity for the study of many, still poorly understood, celestial sources," says D'Odorico. The name of the 2.5-ton instrument was chosen to stress its capacity to capture data highly efficiently from a source whose nature and energy distribution are not known in advance of the observation. This property is particularly crucial in the study of gamma-ray bursts, the most energetic explosions known to occur in the Universe (ESO 17/09). Until now, a rough estimate of the distance of the target was needed, so as to know which

  6. X-shooter study of accretion in Chamaeleon I. II. A steeper increase of accretion with stellar mass for very low-mass stars?

    NASA Astrophysics Data System (ADS)

    Manara, C. F.; Testi, L.; Herczeg, G. J.; Pascucci, I.; Alcalá, J. M.; Natta, A.; Antoniucci, S.; Fedele, D.; Mulders, G. D.; Henning, T.; Mohanty, S.; Prusti, T.; Rigliaco, E.

    2017-08-01

    The dependence of the mass accretion rate on the stellar properties is a key constraint for star formation and disk evolution studies. Here we present a study of a sample of stars in the Chamaeleon I star-forming region carried out using spectra taken with the ESO VLT/X-shooter spectrograph. The sample is nearly complete down to stellar masses (M⋆) ∼0.1 M⊙ for the young stars still harboring a disk in this region. We derive the stellar and accretion parameters using a self-consistent method to fit the broadband flux-calibrated medium-resolution spectrum. The correlation between accretion luminosity and stellar luminosity, and of mass accretion rate to stellar mass in the logarithmic plane yields slopes of 1.9 ± 0.1 and 2.3 ± 0.3, respectively. These slopes and the accretion rates are consistent with previous results in various star-forming regions and with different theoretical frameworks. However, we find that a broken power-law fit, with a steeper slope for stellar luminosities lower than 0.45 L⊙ and for stellar masses lower than 0.3 M⊙, is slightly preferred according to different statistical tests, but the single power-law model is not excluded. The steeper relation for lower-mass stars can be interpreted as a faster evolution in the past for accretion in disks around these objects, or as different accretion regimes in different stellar mass ranges. Finally, we find two regions on the mass accretion versus stellar mass plane that are empty of objects: one region at high mass accretion rates and low stellar masses, which is related to the steeper dependence of the two parameters we derived. The second region is located just above the observational limits imposed by chromospheric emission, at M⋆ ∼ 0.3-0.4 M⊙. These are typical masses where photoevaporation is known to be effective. The mass accretion rates of this region are ∼10⁻¹⁰ M⊙/yr, which is compatible with the value expected for photoevaporation to rapidly dissipate the inner disk. This work is
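
    A broken power-law fit of the sort compared against the single power law above can be sketched as follows. The synthetic data, break location, and slopes here are illustrative only and do not reproduce the paper's measurements.

```python
# Illustrative broken power-law fit in the log(Macc) - log(Mstar) plane.
import numpy as np
from scipy.optimize import curve_fit

def broken_power_law(logm, logm_b, s_lo, s_hi, c):
    """Continuous two-slope model with a break at logm_b."""
    return np.where(logm < logm_b,
                    c + s_lo * (logm - logm_b),
                    c + s_hi * (logm - logm_b))

rng = np.random.default_rng(0)
logm = rng.uniform(-1.3, 0.3, 90)                      # log10(Mstar/Msun)
truth = broken_power_law(logm, -0.5, 4.0, 2.0, -9.0)   # log10(Macc) "truth"
logmacc = truth + 0.4 * rng.standard_normal(logm.size)

p0 = (-0.5, 3.0, 2.0, -9.0)                            # initial guess
popt, pcov = curve_fit(broken_power_law, logm, logmacc, p0=p0)
print("break at log10(Mstar) =", popt[0], "slopes:", popt[1], popt[2])
```

    In practice, the single and broken power-law models would then be compared with the statistical tests mentioned above (e.g., an information criterion that penalizes the two extra parameters).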

  7. Flux Calibration and Spectral Typing of the SPLASH Sample

    NASA Astrophysics Data System (ADS)

    Chang, Caroline; Vemuri, Nikita; Hamren, Katherine; Guhathakurta, Puragra

    2015-01-01

    We present the spectroscopic identification of M-stars in the disk of the Andromeda Galaxy (M31) and revised spectral types for the M-stars in the X-Shooter Library (XSL). Our dataset consists of optical spectra taken with the DEIMOS spectrograph on the Keck II 10-m telescope as part of the Spectroscopic Landscape of Andromeda's Stellar Halo (SPLASH) survey. We use stars from the MILES and X-Shooter Libraries to perform a first-order flux calibration of these spectra, then use TiO-based indices from Fluks et al. (1994) to determine the probable M spectral subtype. While testing this procedure on the M-stars of the XSL, we find that the spectral subtypes derived from the spectra themselves differ from the spectral subtypes obtained from the literature, and that the XSL includes several spectra with subtypes seemingly later than M10. We suggest that this is due to stellar variability. We also identify ~2000 M-stars in the SPLASH sample. We present the distribution of subtypes here. This research was funded by grants from the National Science Foundation and the Space Telescope Science Institute. Some of the research presented here was conducted by high-school students under the auspices of the University of California Santa Cruz's Science Internship Program.

  8. Designing the X-Ray Microcalorimeter Spectrometer for Optimal Science Return

    NASA Technical Reports Server (NTRS)

    Ptak, Andrew; Bandler, Simon R.; Bookbinder, Jay; Kelley, Richard L.; Petre, Robert; Smith, Randall K.; Smith, Stephen

    2013-01-01

    Recent advances in X-ray microcalorimeters enable a wide range of possible focal plane designs for the X-ray Microcalorimeter Spectrometer (XMS) instrument on the future Advanced X-ray Spectroscopic Imaging Observatory (AXSIO) or X-ray Astrophysics Probe (XAP). Small pixel designs (75 microns) oversample a 5-10" PSF by a factor of 3-6 for a 10 m focal length, enabling observations at both high count rates and high energy resolution. Pixel designs utilizing multiple absorbers attached to single transition-edge sensors can extend the focal plane to cover a significantly larger field of view, albeit at a cost in maximum count rate and energy resolution. Optimizing the science return for a given cost and/or complexity is therefore a non-trivial calculation that includes consideration of issues such as the mission science drivers, likely targets, mirror size, and observing efficiency. We present a range of possible designs taking these factors into account and their impacts on the science return of future large effective-area X-ray spectroscopic missions.

  9. Weak lensing magnification in the Dark Energy Survey Science Verification Data

    DOE PAGES

    Garcia-Fernandez, M.; et al.

    2018-02-02

    In this paper the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using data from the Dark Energy Survey Science Verification dataset. The analysis is carried out for two photometrically selected galaxy samples, with mean photometric redshifts in the 0.2 < z < 0.4 and 0.7 < z < 1.0 ranges, in the riz bands. A signal is detected with a 3.5σ significance level in each of the bands tested, and is compatible with the magnification predicted by the ΛCDM model. After an extensive analysis, it cannot be attributed to any known systematic effect. The detection of the magnification signal is robust to estimated uncertainties in the outlier rate of the photometric redshifts, but this will be an important issue for use of photometric redshifts in magnification measurements from larger samples. In addition to the detection of the magnification signal, a method to select the sample with the maximum signal-to-noise is proposed and validated with data.
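
    To make the measurement idea concrete, here is a hedged toy sketch of the zero-lag cross-correlation of number-count overdensities for a foreground and a background sample. The convergence field, count normalizations, and the number-count slope alpha are invented; a real analysis bins pairs by angular separation and handles masks, randoms, and photometric redshifts.

        import numpy as np

        # Magnification bias: background counts respond to the foreground
        # convergence kappa as delta_bg ~ 2*(alpha - 1)*kappa
        rng = np.random.default_rng(1)
        npix = 100_000
        kappa = rng.normal(0.0, 0.02, npix)            # toy convergence field
        n_fg = rng.poisson(50 * (1 + 1.5 * kappa))     # foreground traces matter
        alpha = 2.0                                    # assumed count slope
        n_bg = rng.poisson(30 * (1 + 2 * (alpha - 1) * kappa))

        d_fg = n_fg / n_fg.mean() - 1                  # overdensity fields
        d_bg = n_bg / n_bg.mean() - 1
        w_cross = np.mean(d_fg * d_bg)                 # zero-lag cross-correlation
        print(f"cross-correlation amplitude: {w_cross:.2e}")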

  10. Weak lensing magnification in the Dark Energy Survey Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia-Fernandez, M.; et al.

    In this paper the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using data from the Dark Energy Survey Science Verification dataset. The analysis is carried out for two photometrically selected galaxy samples, with mean photometric redshifts in the 0.2 < z < 0.4 and 0.7 < z < 1.0 ranges, in the riz bands. A signal is detected with a 3.5σ significance level in each of the bands tested, and is compatible with the magnification predicted by the ΛCDM model. After an extensive analysis, it cannot be attributed to any known systematic effect. The detection of the magnification signal is robust to estimated uncertainties in the outlier rate of the photometric redshifts, but this will be an important issue for use of photometric redshifts in magnification measurements from larger samples. In addition to the detection of the magnification signal, a method to select the sample with the maximum signal-to-noise is proposed and validated with data.

  11. Weak lensing magnification in the Dark Energy Survey Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia-Fernandez, M.; et al.

    2016-11-30

    In this paper the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using data from the Dark Energy Survey Science Verification dataset. The analysis is carried out for two photometrically selected galaxy samples, with mean photometric redshifts in the 0.2 < z < 0.4 and 0.7 < z < 1.0 ranges, in the riz bands. A signal is detected with a 3.5σ significance level in each of the bands tested, and is compatible with the magnification predicted by the ΛCDM model. After an extensive analysis, it cannot be attributed to any known systematic effect. The detection of the magnification signal is robust to estimated uncertainties in the outlier rate of the photometric redshifts, but this will be an important issue for use of photometric redshifts in magnification measurements from larger samples. In addition to the detection of the magnification signal, a method to select the sample with the maximum signal-to-noise is proposed and validated with data.

  12. Process Document, Joint Verification Protocol, and Joint Test Plan for Verification of HACH-LANGE GmbH LUMIStox 300 Bench Top Luminometer and ECLOX Handheld Luminometer for Luminescent Bacteria Test for use in Wastewater

    EPA Science Inventory

    The Danish Environmental Technology Verification program (DANETV) Water Test Centre, operated by DHI, is supported by the Danish Ministry for Science, Technology and Innovation. DANETV, the United States Environmental Protection Agency Environmental Technology Verification Progra...

  13. Wide-Field Lensing Mass Maps from Dark Energy Survey Science Verification Data

    DOE PAGES

    Chang, C.

    2015-07-29

    We present a mass map reconstructed from weak gravitational lensing shear measurements over 139 deg2 from the Dark Energy Survey science verification data. The mass map probes both luminous and dark matter, thus providing a tool for studying cosmology. We also find good agreement between the mass map and the distribution of massive galaxy clusters identified using a red-sequence cluster finder. Potential candidates for superclusters and voids are identified using these maps. We measure the cross-correlation between the mass map and a magnitude-limited foreground galaxy sample and find a detection at the 6.8σ level with 20 arcmin smoothing. These measurements are consistent with simulated galaxy catalogs based on N-body simulations from a cold dark matter model with a cosmological constant. This suggests low systematics uncertainties in the map. Finally, we summarize our key findings in this Letter; the detailed methodology and tests for systematics are presented in a companion paper.

  14. Wide-Field Lensing Mass Maps from Dark Energy Survey Science Verification Data.

    PubMed

    Chang, C; Vikram, V; Jain, B; Bacon, D; Amara, A; Becker, M R; Bernstein, G; Bonnett, C; Bridle, S; Brout, D; Busha, M; Frieman, J; Gaztanaga, E; Hartley, W; Jarvis, M; Kacprzak, T; Kovács, A; Lahav, O; Lin, H; Melchior, P; Peiris, H; Rozo, E; Rykoff, E; Sánchez, C; Sheldon, E; Troxel, M A; Wechsler, R; Zuntz, J; Abbott, T; Abdalla, F B; Allam, S; Annis, J; Bauer, A H; Benoit-Lévy, A; Brooks, D; Buckley-Geer, E; Burke, D L; Capozzi, D; Carnero Rosell, A; Carrasco Kind, M; Castander, F J; Crocce, M; D'Andrea, C B; Desai, S; Diehl, H T; Dietrich, J P; Doel, P; Eifler, T F; Evrard, A E; Fausti Neto, A; Flaugher, B; Fosalba, P; Gruen, D; Gruendl, R A; Gutierrez, G; Honscheid, K; James, D; Kent, S; Kuehn, K; Kuropatkin, N; Maia, M A G; March, M; Martini, P; Merritt, K W; Miller, C J; Miquel, R; Neilsen, E; Nichol, R C; Ogando, R; Plazas, A A; Romer, A K; Roodman, A; Sako, M; Sanchez, E; Sevilla, I; Smith, R C; Soares-Santos, M; Sobreira, F; Suchyta, E; Tarle, G; Thaler, J; Thomas, D; Tucker, D; Walker, A R

    2015-07-31

    We present a mass map reconstructed from weak gravitational lensing shear measurements over 139 deg2 from the Dark Energy Survey science verification data. The mass map probes both luminous and dark matter, thus providing a tool for studying cosmology. We find good agreement between the mass map and the distribution of massive galaxy clusters identified using a red-sequence cluster finder. Potential candidates for superclusters and voids are identified using these maps. We measure the cross-correlation between the mass map and a magnitude-limited foreground galaxy sample and find a detection at the 6.8σ level with 20 arcmin smoothing. These measurements are consistent with simulated galaxy catalogs based on N-body simulations from a cold dark matter model with a cosmological constant. This suggests low systematics uncertainties in the map. We summarize our key findings in this Letter; the detailed methodology and tests for systematics are presented in a companion paper.

  15. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  16. Verification of space weather forecasts at the UK Met Office

    NASA Astrophysics Data System (ADS)

    Bingham, S.; Sharpe, M.; Jackson, D.; Murray, S.

    2017-12-01

    The UK Met Office Space Weather Operations Centre (MOSWOC) has produced space weather guidance twice a day since its official opening in 2014. Guidance includes 4-day probabilistic forecasts of X-ray flares, geomagnetic storms, high-energy electron events and high-energy proton events. Evaluation of such forecasts is important to forecasters, stakeholders, model developers and users, both to understand the performance of these forecasts and to identify strengths and weaknesses that enable further development. Met Office terrestrial near-real-time verification systems have been adapted to provide verification of X-ray flare and geomagnetic storm forecasts. Verification is updated daily to produce Relative Operating Characteristic (ROC) curves, reliability diagrams, and rolling Ranked Probability Skill Scores (RPSS), thus providing understanding of forecast performance and skill. Results suggest that the MOSWOC-issued X-ray flare forecasts are usually not statistically significantly better than a benchmark climatological forecast (where the climatology is based on observations from the previous few months). By contrast, the issued geomagnetic storm activity forecast typically performs better against this climatological benchmark.
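
    As a hedged sketch of one of the verification measures named above, the following Python fragment computes a Ranked Probability Skill Score for probabilistic forecasts over ordered categories. The forecast probabilities, observed categories, and climatological baseline are invented placeholders, not MOSWOC data.

        import numpy as np

        def rps(forecast_probs, obs_cat):
            """Ranked Probability Score for one forecast (lower is better)."""
            cum_f = np.cumsum(forecast_probs)
            cum_o = np.cumsum(np.eye(len(forecast_probs))[obs_cat])
            return np.sum((cum_f - cum_o) ** 2)

        forecasts = np.array([[0.6, 0.3, 0.1],     # issued 3-category forecasts
                              [0.2, 0.5, 0.3]])
        climatology = np.array([0.5, 0.35, 0.15])  # assumed climatological pdf
        observed = [0, 2]                          # observed category indices

        rps_fc = np.mean([rps(f, o) for f, o in zip(forecasts, observed)])
        rps_cl = np.mean([rps(climatology, o) for o in observed])
        rpss = 1.0 - rps_fc / rps_cl   # > 0 means skill over climatology
        print(f"RPSS = {rpss:.2f}")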

  17. SpaceX CRS-10 What's on Board Science Briefing

    NASA Image and Video Library

    2017-02-17

    During the SpaceX CRS-10 "What's On Board?" Science Briefing inside the Press Site Auditorium, members of social media learned about the science aboard the Dragon spacecraft. The briefing focused on growth of crystals in microgravity planned for the International Space Station following the arrival of a Dragon spacecraft. The Dragon is scheduled to be launched from Kennedy’s Launch Complex 39A on Feb. 18 atop a SpaceX Falcon 9 rocket on the company's 10th Commercial Resupply Services mission to the space station.

  18. Medical Provider Ballistic Protection at Active Shooter Events.

    PubMed

    Stopyra, Jason P; Bozeman, William P; Callaway, David W; Winslow, James; McGinnis, Henderson D; Sempsrott, Justin; Evans-Taylor, Lisa; Alson, Roy L

    2016-01-01

    There is some controversy about whether ballistic protective equipment (body armor) is required for medical responders who may be called to respond to active shooter mass casualty incidents. In this article, we describe the ongoing evolution of recommendations to optimize medical care to injured victims at such an incident. We propose that body armor is not mandatory for medical responders participating in a rapid-response capacity, in keeping with the Hartford Consensus and Arlington Rescue Task Force models. However, we acknowledge that the development and implementation of these programs may benefit from the availability of such equipment as one component of risk mitigation. Many police agencies regularly retire body armor on a defined time schedule before the end of its effective service life. Coordination with law enforcement may allow such retired body armor to be available to other public safety agencies, such as fire and emergency medical services, providing some degree of ballistic protection to medical responders at little or no cost during the rare mass casualty incident. To provide visual demonstration of this concept, we tested three "retired" ballistic vests with ages ranging from 6 to 27 years. The vests were shot at close range using police-issue 9mm, .40 caliber, .45 caliber, and 12-gauge shotgun rounds. Photographs demonstrate that the vests maintained their ballistic protection and defeated all of these rounds.

  19. Defending the Doomed: Implicit Strategies Concerning Protection of First-Person Shooter Games

    PubMed Central

    Munko, Daniel; Glock, Sabine; Bente, Gary

    2012-01-01

    Abstract Censorship of violent digital games, especially first-person shooter (FPS) games, is widely debated across generations. While older people are concerned about possible negative influences of these games, not only players but also nonplayers of the younger net-generation seem to deny any association with real aggressive behavior. Our study aimed at investigating the defense mechanisms players and nonplayers use to defend FPS games and peers who play them. Using a lexical decision task, we found that aggressive concepts are activated by priming the content of FPS games but suppressed afterward. Only when participants were instructed to actively suppress aggressive concepts after priming was this spontaneous thought suppression no longer necessary. Young people still do have negative associations with violent video games. These associations are neglected by implicitly applying defense strategies—independent of own playing habits—to protect this specific hobby, which is common for the net-generation. PMID:22515170

  20. The Role of Project Science in the Chandra X-Ray Observatory

    NASA Technical Reports Server (NTRS)

    O'Dell, Stephen L.; Weisskopf, Martin C.

    2006-01-01

    The Chandra X-Ray Observatory, one of NASA's Great Observatories, has an outstanding record of scientific and technical success. This success results from the efforts of a team comprising NASA, its contractors, the Smithsonian Astrophysical Observatory, the instrument groups, and other elements of the scientific community, including thousands of scientists who utilize this powerful facility for astrophysical research. We discuss the role of NASA Project Science in the formulation, development, calibration, and operation of the Chandra X-ray Observatory. In addition to representing the scientific community within the Project, Project Science performed what we term "science systems engineering". This activity encompasses translation of science requirements into technical requirements and assessment of the scientific impact of programmatic and technical trades. We briefly describe several examples of science systems engineering conducted by Chandra Project Science.

  1. Performance verification of the Gravity and Extreme Magnetism Small explorer (GEMS) x-ray polarimeter

    NASA Astrophysics Data System (ADS)

    Enoto, Teruaki; Black, J. Kevin; Kitaguchi, Takao; Hayato, Asami; Hill, Joanne E.; Jahoda, Keith; Tamagawa, Toru; Kaneko, Kenta; Takeuchi, Yoko; Yoshikawa, Akifumi; Marlowe, Hannah; Griffiths, Scott; Kaaret, Philip E.; Kenward, David; Khalid, Syed

    2014-07-01

    Polarimetry is a powerful tool for astrophysical observations that has yet to be exploited in the X-ray band. For satellite-borne and sounding rocket experiments, we have developed a photoelectric gas polarimeter to measure X-ray polarization in the 2-10 keV range utilizing a time projection chamber (TPC) and advanced micro-pattern gas electron multiplier (GEM) techniques. We carried out performance verification of a flight-equivalent unit (1/4 model) which was planned to be launched on the NASA Gravity and Extreme Magnetism Small Explorer (GEMS) satellite. The test was performed at the Brookhaven National Laboratory National Synchrotron Light Source (NSLS) facility in April 2013. The polarimeter was irradiated with linearly polarized monochromatic X-rays between 2.3 and 10.0 keV and scanned with a collimated beam at 5 different detector positions. After a systematic investigation of the detector response, a modulation factor ≥35% above 4 keV was obtained at the expected polarization angle. At energies below 4 keV, where the photoelectron track becomes short, diffusion in the region between the GEM and readout strips leaves an asymmetric photoelectron image. A correction method retrieves the expected modulation angle and the expected modulation factor, ~20% at 2.7 keV. Folding the measured values of modulation through an instrument model gives sensitivity, parameterized by the minimum detectable polarization (MDP), nearly identical to that assumed at the preliminary design review (PDR).
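
    The MDP figure of merit quoted at the end of the abstract has a standard closed form at 99% confidence, MDP99 = 4.29 / (mu * Rs) * sqrt((Rs + Rb) / T). The sketch below evaluates it for the ~35% modulation factor given above; the source rate, background rate, and exposure are assumptions for illustration, not GEMS values.

        import numpy as np

        def mdp99(mu, rate_src, rate_bkg, t_exp):
            """99%-confidence minimum detectable polarization."""
            return 4.29 / (mu * rate_src) * np.sqrt((rate_src + rate_bkg) / t_exp)

        mu = 0.35        # modulation factor above 4 keV, per the abstract
        rate_src = 1.0   # source count rate [counts/s] (assumed)
        rate_bkg = 0.1   # background count rate [counts/s] (assumed)
        t_exp = 1.0e5    # exposure time [s] (assumed)
        print(f"MDP(99%) = {100 * mdp99(mu, rate_src, rate_bkg, t_exp):.1f}%")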

  2. New Worlds / New Horizons Science with an X-ray Astrophysics Probe

    NASA Technical Reports Server (NTRS)

    Smith, Randall K.; Bookbinder, Jay A.; Hornschemeier, Ann E.; Bandler, Simon; Brandt, W. N.; Hughes, John P.; McCammon, Dan; Matsumoto, Hironori; Mushotzky, Richard; Osten, Rachel A.

    2014-01-01

    In 2013 NASA commenced a design study for an X-ray Astrophysics Probe to address the X-ray science goals and program prioritizations of the Decadal Survey New Worlds, New Horizons (NWNH), with a cost cap of approximately $1B. Both the NWNH report and the 2011 NASA X-ray mission concept study found that high-resolution X-ray spectroscopy performed with an X-ray microcalorimeter would enable the most highly rated NWNH X-ray science. Here we highlight some potential science topics, namely: 1) a direct, strong-field test of General Relativity via the study of accretion onto black holes through relativistically broadened Fe lines and their reverberation in response to a changing hard X-ray continuum; 2) understanding the evolution of galaxies and clusters by mapping temperatures, abundances and dynamics in hot gas; 3) revealing the physics of accretion onto stellar-mass black holes from companion stars and the equation of state of neutron stars through timing studies and time-resolved spectroscopy of X-ray binaries; and 4) feedback from AGN and star formation shown in galaxy-scale winds and jets. In addition to these high-priority goals, an X-ray astrophysics probe would be a general-purpose observatory that will result in invaluable data for other NWNH topics such as stellar astrophysics, protostars and their impact on protoplanetary systems, X-ray spectroscopy of transient phenomena such as high-z gamma-ray bursts and tidal capture of stars by massive black holes, and searches for dark matter decay.

  3. Weak-lensing mass calibration of redMaPPer galaxy clusters in Dark Energy Survey Science Verification data

    DOE PAGES

    Melchior, P.; Gruen, D.; McClintock, T.; ...

    2017-05-16

    Here, we use weak-lensing shear measurements to determine the mean mass of optically selected galaxy clusters in Dark Energy Survey Science Verification data. In a blinded analysis, we split the sample of more than 8000 redMaPPer clusters into 15 subsets, spanning ranges in the richness parameter 5 ≤ λ ≤ 180 and redshift 0.2 ≤ z ≤ 0.8, and fit the averaged mass density contrast profiles with a model that accounts for seven distinct sources of systematic uncertainty: shear measurement and photometric redshift errors; cluster-member contamination; miscentring; deviations from the NFW halo profile; halo triaxiality and line-of-sight projections.

  4. Weak-lensing mass calibration of redMaPPer galaxy clusters in Dark Energy Survey Science Verification data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melchior, P.; Gruen, D.; McClintock, T.

    Here, we use weak-lensing shear measurements to determine the mean mass of optically selected galaxy clusters in Dark Energy Survey Science Verification data. In a blinded analysis, we split the sample of more than 8000 redMaPPer clusters into 15 subsets, spanning ranges in the richness parameter 5 ≤ λ ≤ 180 and redshift 0.2 ≤ z ≤ 0.8, and fit the averaged mass density contrast profiles with a model that accounts for seven distinct sources of systematic uncertainty: shear measurement and photometric redshift errors; cluster-member contamination; miscentring; deviations from the NFW halo profile; halo triaxiality and line-of-sight projections.

  5. SpaceX CRS-14 What's On Board Science Briefing

    NASA Image and Video Library

    2018-04-01

    During the SpaceX CRS-14 "What's On Board?" Science Briefing inside the Kennedy Space Center Press Site Auditorium, members of the media learned about the research headed to the International Space Station aboard the Dragon spacecraft. The briefing focused on several science projects including the Metabolic Tracking experiment; the Atmosphere-Space Interactions Monitor (ASIM); the Multi-purpose Variable-g Platform (MVP); and Veggie PONDS Validation. The Dragon spacecraft is scheduled to be launched from Space Launch Complex 40 at Cape Canaveral Air Force Station in Florida atop a SpaceX Falcon 9 rocket on the company's 14th Commercial Resupply Services mission to the space station.

  6. Mechanical verification of a schematic Byzantine clock synchronization algorithm

    NASA Technical Reports Server (NTRS)

    Shankar, Natarajan

    1991-01-01

    Schneider generalizes a number of protocols for Byzantine fault-tolerant clock synchronization and presents a uniform proof of their correctness. The authors present a machine-checked proof of this schematic protocol that revises some of the details in Schneider's original analysis. The verification was carried out with the EHDM system developed at the SRI Computer Science Laboratory. The mechanically checked proofs include the verification that the egocentric mean function used in Lamport and Melliar-Smith's Interactive Convergence Algorithm satisfies the requirements of Schneider's protocol.

  7. Science Results from the Spaceborne Imaging Radar-C/X-Band Synthetic Aperture Radar (SIR-C/X-SAR): Progress Report

    NASA Technical Reports Server (NTRS)

    Evans, Diane L. (Editor); Plaut, Jeffrey (Editor)

    1996-01-01

    The Spaceborne Imaging Radar-C/X-band Synthetic Aperture Radar (SIR-C/X-SAR) is the most advanced imaging radar system to fly in Earth orbit. Carried in the cargo bay of the Space Shuttle Endeavour in April and October of 1994, SIR-C/X-SAR simultaneously recorded SAR data at three wavelengths (L-, C-, and X-bands; 23.5, 5.8, and 3.1 cm, respectively). The SIR-C/X-SAR Science Team consists of 53 investigator teams from more than a dozen countries. Science investigations were undertaken in the fields of ecology, hydrology, geology, and oceanography. This report contains 44 investigator team reports and several additional reports from coinvestigators and other researchers.

  8. Model-Based Building Verification in Aerial Photographs.

    DTIC Science & Technology

    1987-09-01

    In this paper, we have proposed an experimental knowledge-based verification system; the organization for change detection is outlined. Knowledge rules and

  9. Effects of music on arousal during imagery in elite shooters: A pilot study.

    PubMed

    Kuan, Garry; Morris, Tony; Terry, Peter

    2017-01-01

    Beneficial effects of music on several performance-related aspects of sport have been reported, but the processes involved are not well understood. The purpose of the present study was to investigate effects of relaxing and arousing classical music on physiological indicators and subjective perceptions of arousal during imagery of a sport task. First, appropriate music excerpts were selected. Then, 12 skilled shooters performed shooting imagery while listening to the three preselected music excerpts in randomized order. Participants' galvanic skin response, peripheral temperature, and electromyography were monitored during music played concurrently with imagery. Subjective music ratings and physiological measures showed, as hypothesized, that unfamiliar relaxing music was the most relaxing and unfamiliar arousing music was the most arousing. Researchers should examine the impact of unfamiliar relaxing and arousing music played during imagery on subsequent performance in diverse sports. Practitioners can apply unfamiliar relaxing and arousing music with imagery to manipulate arousal level.

  10. Cosmic shear measurements with Dark Energy Survey Science Verification data

    DOE PAGES

    Becker, M. R.

    2016-07-06

    Here, we present measurements of weak gravitational lensing cosmic shear two-point statistics using Dark Energy Survey Science Verification data. We demonstrate that our results are robust to the choice of shear measurement pipeline, either ngmix or im3shape, and robust to the choice of two-point statistic, including both real and Fourier-space statistics. Our results pass a suite of null tests including tests for B-mode contamination and direct tests for any dependence of the two-point functions on a set of 16 observing conditions and galaxy properties, such as seeing, airmass, galaxy color, galaxy magnitude, etc. We use a large suite of simulations to compute the covariance matrix of the cosmic shear measurements and assign statistical significance to our null tests. We find that our covariance matrix is consistent with the halo model prediction, indicating that it has the appropriate level of halo sample variance. We also compare the same jackknife procedure applied to the data and the simulations in order to search for additional sources of noise not captured by the simulations. We find no statistically significant extra sources of noise in the data. The overall detection significance with tomography for our highest source density catalog is 9.7σ. Cosmological constraints from the measurements in this work are presented in a companion paper.
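
    The jackknife procedure mentioned above can be illustrated with a minimal delete-one covariance estimator. The per-patch two-point values below are synthetic; the real analysis resamples spatial patches of the survey footprint rather than arbitrary data chunks.

        import numpy as np

        rng = np.random.default_rng(2)
        n_patch, n_bins = 50, 10
        xi_patch = rng.normal(1.0, 0.1, (n_patch, n_bins))  # per-patch signal

        # Delete-one means: the estimate with patch k removed
        xi_jk = np.array([xi_patch[np.arange(n_patch) != k].mean(axis=0)
                          for k in range(n_patch)])
        xi_bar = xi_jk.mean(axis=0)

        # Jackknife covariance; the (N-1)/N prefactor accounts for the strong
        # correlation between delete-one estimates
        cov = (n_patch - 1) / n_patch * (xi_jk - xi_bar).T @ (xi_jk - xi_bar)
        print(cov.shape, np.sqrt(np.diag(cov))[:3])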

  11. Design and Analysis of Modules for Segmented X-Ray Optics

    NASA Technical Reports Server (NTRS)

    McClelland, Ryan S.; Biskach, Michael P.; Chan, Kai-Wing; Saha, Timo T.; Zhang, William W.

    2012-01-01

    Future X-ray astronomy missions demand thin, light, and closely packed optics which lend themselves to segmentation of the annular mirrors and, in turn, a modular approach to the mirror design. The modular approach to X-ray Flight Mirror Assembly (FMA) design allows excellent scalability of the mirror technology to support a variety of mission sizes and science objectives. This paper describes FMA designs using slumped glass mirror segments for several X-ray astrophysics missions studied by NASA and explores the driving requirements and subsequent verification tests necessary to qualify a slumped glass mirror module for space-flight. A rigorous testing program is outlined allowing Technical Development Modules to reach technical readiness for mission implementation while reducing mission cost and schedule risk.

  12. Alignment verification procedures

    NASA Technical Reports Server (NTRS)

    Edwards, P. R.; Phillips, E. P.; Newman, J. C., Jr.

    1988-01-01

    In alignment verification procedures each laboratory is required to align its test machines and gripping fixtures to produce a nearly uniform tensile stress field on an un-notched sheet specimen. The blank specimens (50 mm wide x 305 mm long x 2.3 mm thick) supplied by the coordinators were strain gauged. Strain gauge readings were taken at all gauges (n = 1 through 10). The alignment verification procedures are as follows: (1) zero all strain gauges while the specimen is in a free-supported condition; (2) put the strain-gauged specimen in the test machine so that the specimen front face (face 1) is in contact with the reference jaw (standard position), tighten the grips, and at zero load measure the strains on all gauges (ε_nS0, the strain at gauge n, standard position, zero load); (3) with the specimen in the machine and at a tensile load of 10 kN, measure the strains (standard position; ε_nS10); (4) remove the specimen from the machine, put it back so that the specimen back face (face 2) is in contact with the reference jaw (reverse position), tighten the grips, and at zero load measure the strains on all gauges (ε_nR0); and (5) with the specimen in the machine and at a tensile load of 10 kN, measure the strains (reverse position; ε_nR10, the strain at gauge n, reverse position, 10 kN load).
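
    As a hedged illustration of how the ε_nS10, ε_nS0, ε_nR10, and ε_nR0 readings defined above might be combined, the sketch below turns loaded-minus-zero strains from the standard and reverse positions into a simple percent-bending figure. The averaging scheme and the idea that reversal separates specimen curvature from machine misalignment are stated assumptions for illustration, not the protocol's prescribed analysis.

        import numpy as np

        # Loaded-minus-zero strain per gauge n = 1..10, in microstrain:
        # eps_S[n] = eps_nS10 - eps_nS0, eps_R[n] = eps_nR10 - eps_nR0
        eps_S = np.array([980, 1010, 995, 1005, 990, 1000, 1015, 985, 1002, 998])
        eps_R = np.array([1005, 990, 1000, 995, 1010, 985, 992, 1008, 996, 1004])

        # Assumed: averaging standard and reverse positions cancels offsets
        # from specimen curvature, leaving machine/grip-induced bending
        eps_mean = 0.5 * (eps_S + eps_R)
        eps_axial = eps_mean.mean()
        pct_bending = 100 * np.abs(eps_mean - eps_axial).max() / eps_axial
        print(f"max percent bending: {pct_bending:.1f}%")  # e.g. require < 5%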

  13. SpaceX CRS-13 What's on Board Science Briefing

    NASA Image and Video Library

    2017-12-11

    During the SpaceX CRS-13 "What's On Board?" Science Briefing inside the Kennedy Space Center Press Site Auditorium, members of social media learned about the science headed to the International Space Station aboard the Dragon spacecraft. The briefing focused on several research projects including Biorasis Glucose Biosensor; Launchpad Medical; Space Debris Sensor; Total & Spectral solar Irradiance Sensor (TSIS); Fiber Optic Payload (Made in Space); Rodent Research 6; and Plant Gravity Perception. The Dragon spacecraft is scheduled to be launched from Space Launch Complex 40 at Cape Canaveral Air Force Station in Florida atop a SpaceX Falcon 9 rocket on the company's 13th Commercial Resupply Services mission to the space station.

  14. Dark Energy, Dark Matter and Science with Constellation-X

    NASA Technical Reports Server (NTRS)

    Cardiff, Ann Hornschemeier

    2005-01-01

    Constellation-X, with more than 100 times the collecting area of any previous spectroscopic mission operating in the 0.25-40 keV bandpass, will enable high-throughput, high-spectral-resolution studies of sources ranging from the most luminous accreting supermassive black holes in the Universe to the disks around young stars where planets form. This talk will review the updated Constellation-X science case, released in booklet form during summer 2005. The science areas where Constellation-X will have major impact include the exploration of the space-time geometry of black holes spanning nine orders of magnitude in mass and the nature of the dark energy and dark matter which govern the expansion and ultimate fate of the Universe. Constellation-X will also explore processes referred to as "cosmic feedback" whereby mechanical energy, radiation, and chemical elements from star formation and black holes are returned to the interstellar and intergalactic medium, profoundly affecting the development of structure in the Universe, and will also probe all the important life cycles of matter, from stellar and planetary birth to stellar death via supernova to stellar endpoints in the form of accreting binaries and supernova remnants. This talk will touch upon all these areas, with particular emphasis on Constellation-X's role in the study of Dark Energy.

  15. Responding To and Recovering From an Active Shooter Incident That Turns Into a Hostage Situation. Lessons Learned From School Crises and Emergencies, Volume 2, Issue 6, 2007

    ERIC Educational Resources Information Center

    US Department of Education, 2007

    2007-01-01

    "Lessons Learned" is a series of publications that are a brief recounting of actual school emergencies and crises. This "Lessons Learned" issue focuses on an active shooter situation that escalated to a hostage situation that required multiple law enforcement agencies and other first responders and agencies to coordinate response and recovery…

  16. Environmental Technology Verification: Baghouse Filtration Products--Sinoma Science & Technology Co. Ltd FT-806 Filtration Media

    EPA Science Inventory

    EPA created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. It seeks to achieve this goal by providing high-quality, peer r...

  17. Ares I-X Range Safety Simulation Verification and Analysis IV and V

    NASA Technical Reports Server (NTRS)

    Tarpley, Ashley; Beaty, James; Starr, Brett

    2010-01-01

    NASA's Ares I-X vehicle launched on a suborbital test flight from the Eastern Range in Florida on October 28, 2009. NASA generated a Range Safety (RS) flight data package to meet the RS trajectory data requirements defined in the Air Force Space Command Manual 91-710. Some products included in the flight data package were a nominal ascent trajectory, ascent flight envelope trajectories, and malfunction turn trajectories. These data are used by the Air Force's 45th Space Wing (45SW) to ensure Eastern Range public safety and to make flight termination decisions on launch day. Due to the criticality of the RS data in regards to public safety and mission success, an independent verification and validation (IV&V) effort was undertaken to accompany the data generation analyses to ensure utmost data quality and correct adherence to requirements. Multiple NASA centers and contractor organizations were assigned specific products to IV&V. The data generation and IV&V work was coordinated through the Launch Constellation Range Safety Panel's Trajectory Working Group, which included members from the prime and IV&V organizations as well as the 45SW. As a result of the IV&V efforts, the RS product package was delivered with confidence that two independent organizations using separate simulation software generated data to meet the range requirements and yielded similar results. This document captures the Ares I-X RS product IV&V analysis, including the methodology used to verify inputs, simulation, and output data for an RS product. Additionally, a discussion of lessons learned is presented to capture advantages and disadvantages of the IV&V processes used.

  18. Effects of music on arousal during imagery in elite shooters: A pilot study

    PubMed Central

    Kuan, Garry; Morris, Tony; Terry, Peter

    2017-01-01

    Beneficial effects of music on several performance-related aspects of sport have been reported, but the processes involved are not well understood. The purpose of the present study was to investigate effects of relaxing and arousing classical music on physiological indicators and subjective perceptions of arousal during imagery of a sport task. First, appropriate music excerpts were selected. Then, 12 skilled shooters performed shooting imagery while listening to the three preselected music excerpts in randomized order. Participants’ galvanic skin response, peripheral temperature, and electromyography were monitored during music played concurrently with imagery. Subjective music ratings and physiological measures showed, as hypothesized, that unfamiliar relaxing music was the most relaxing and unfamiliar arousing music was the most arousing. Researchers should examine the impact of unfamiliar relaxing and arousing music played during imagery on subsequent performance in diverse sports. Practitioners can apply unfamiliar relaxing and arousing music with imagery to manipulate arousal level. PMID:28414741

  19. Requirement Specifications for a Design and Verification Unit.

    ERIC Educational Resources Information Center

    Pelton, Warren G.; And Others

    A research and development activity to introduce new and improved education and training technology into Bureau of Medicine and Surgery training is recommended. The activity, called a design and verification unit, would be administered by the Education and Training Sciences Department. Initial research and development are centered on the…

  20. Verification test of the SURF and SURFplus models in xRage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menikoff, Ralph

    2016-05-18

    As a verification test of the SURF and SURFplus models in the xRage code we use a propagating underdriven detonation wave in 1-D. This is one of the only test cases for which an accurate solution can be determined based on the theoretical structure of the solution. The solution consists of a steady ZND reaction-zone profile joined with a scale-invariant rarefaction or Taylor wave and followed by a constant state. The end of the reaction profile and the head of the rarefaction coincide with the sonic CJ state of the detonation wave. The constant state is required to match a rigid-wall boundary condition. For a test case, we use PBX 9502 with the same EOS and burn rate as previously used to test the shock detector algorithm utilized by the SURF model. The detonation wave is propagated for 10 μs (slightly under 80 mm). As expected, the pointwise errors are largest in the neighborhood of discontinuities: the pressure discontinuity at the lead shock front and the pressure-derivative discontinuities at the head and tail of the rarefaction. As a quantitative measure of the overall accuracy, the L2 norm of the difference between the numerical pressure and the exact solution is used. Results are presented for simulations using both a uniform grid and an adaptive grid that refines the reaction zone.
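
    The quantitative accuracy measure described above is straightforward to state in code. This sketch computes a grid-weighted L2 norm of the difference between a numerical pressure profile and the exact solution; both profiles here are invented stand-ins, not xRage output or the ZND/Taylor-wave solution itself.

        import numpy as np

        x = np.linspace(0.0, 80.0, 801)                     # position [mm]
        p_exact = np.where(x < 60.0, 30.0 - 0.2 * x, 18.0)  # toy exact profile
        p_num = p_exact + 0.05 * np.random.default_rng(3).normal(size=x.size)

        dx = x[1] - x[0]
        l2_err = np.sqrt(np.sum((p_num - p_exact) ** 2) * dx)
        print(f"L2 error norm: {l2_err:.3f}")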

  1. 78 FR 28812 - Energy Efficiency Program for Industrial Equipment: Petition of UL Verification Services Inc. for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... are engineers. UL today is comprised of five businesses, Product Safety, Verification Services, Life..., Director--Global Technical Research, UL Verification Services. Subscribed and sworn to before me this 20... (431.447(c)(4)) General Personnel Overview UL is a global independent safety science company with more...

  2. The DES Bright Arcs Survey: Hundreds of Candidate Strongly Lensed Galaxy Systems from the Dark Energy Survey Science Verification and Year 1 Observations

    NASA Astrophysics Data System (ADS)

    Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.; Nord, B.; Gaitsch, H.; Gaitsch, S.; Lin, H.; Allam, S.; Collett, T. E.; Furlanetto, C.; Gill, M. S. S.; More, A.; Nightingale, J.; Odden, C.; Pellico, A.; Tucker, D. L.; da Costa, L. N.; Fausti Neto, A.; Kuropatkin, N.; Soares-Santos, M.; Welch, B.; Zhang, Y.; Frieman, J. A.; Abdalla, F. B.; Annis, J.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Cunha, C. E.; D'Andrea, C. B.; Desai, S.; Dietrich, J. P.; Drlica-Wagner, A.; Evrard, A. E.; Finley, D. A.; Flaugher, B.; García-Bellido, J.; Gerdes, D. W.; Goldstein, D. A.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Kuehn, K.; Kuhlmann, S.; Lahav, O.; Li, T. S.; Lima, M.; Maia, M. A. G.; Marshall, J. L.; Menanteau, F.; Miquel, R.; Nichol, R. C.; Nugent, P.; Ogando, R. L. C.; Plazas, A. A.; Reil, K.; Romer, A. K.; Sako, M.; Sanchez, E.; Santiago, B.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Sheldon, E.; Smith, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Walker, A. R.; DES Collaboration

    2017-09-01

    We report the results of searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data span approximately 250 sq. deg. with a median I-band limiting magnitude for extended objects (10σ) of 23.0. The Year 1 data span approximately 2000 sq. deg. and have an I-band limiting magnitude for extended objects (10σ) of 22.9. As these data sets are both wide and deep, they are particularly useful for identifying strong gravitational lens candidates. Potential strong gravitational lens candidate systems were initially identified based on a color and magnitude selection in the DES object catalogs or because the system is at the location of a previously identified galaxy cluster. Cutout images of potential candidates were then visually scanned using an object viewer and numerically ranked according to whether or not we judged them to be likely strong gravitational lens systems. Having scanned nearly 400,000 cutouts, we present 374 candidate strong lens systems, of which 348 are identified for the first time. We provide the R.A. and decl., the magnitudes and photometric properties of the lens and source objects, and the distance (radius) of the source(s) from the lens center for each system.

  3. Weak lensing by galaxy troughs in DES Science Verification data

    DOE PAGES

    Gruen, D.; Friedrich, O.; Amara, A.; ...

    2015-11-29

    In this study, we measure the weak lensing shear around galaxy troughs, i.e. the radial alignment of background galaxies relative to underdensities in projections of the foreground galaxy field over a wide range of redshift in Science Verification data from the Dark Energy Survey. Our detection of the shear signal is highly significant (10σ-15σ for the smallest angular scales) for troughs with the redshift range z ∈ [0.2, 0.5] of the projected galaxy field and angular diameters of 10 arcmin to 1°. These measurements probe the connection between the galaxy, matter density, and convergence fields. By assuming galaxies are biased tracers of the matter density with Poissonian noise, we find agreement of our measurements with predictions in a fiducial Λ cold dark matter model. The prediction for the lensing signal on large trough scales is virtually independent of the details of the underlying model for the connection of galaxies and matter. Our comparison of the shear around troughs with that around cylinders with large galaxy counts is consistent with a symmetry between galaxy and matter over- and underdensities. In addition, we measure the two-point angular correlation of troughs with galaxies which, in contrast to the lensing signal, is sensitive to galaxy bias on all scales. The lensing signal of troughs and their clustering with galaxies is therefore a promising probe of the statistical properties of matter underdensities and their connection to the galaxy field.

  4. Aqueous cleaning and verification processes for precision cleaning of small parts

    NASA Technical Reports Server (NTRS)

    Allen, Gale J.; Fishell, Kenneth A.

    1995-01-01

    The NASA Kennedy Space Center (KSC) Materials Science Laboratory (MSL) has developed a totally aqueous process for precision cleaning and verification of small components. In 1990 the Precision Cleaning Facility at KSC used approximately 228,000 kg (500,000 lbs) of chlorofluorocarbon (CFC) 113 in the cleaning operations. It is estimated that current CFC 113 usage has been reduced by 75 percent and it is projected that a 90 percent reduction will be achieved by the end of calendar year 1994. The cleaning process developed utilizes aqueous degreasers, aqueous surfactants, and ultrasonics in the cleaning operation and an aqueous surfactant, ultrasonics, and Total Organic Carbon Analyzer (TOCA) in the nonvolatile residue (NVR) and particulate analysis for verification of cleanliness. The cleaning and verification process is presented in its entirety, with comparison to the CFC 113 cleaning and verification process, including economic and labor costs/savings.

  5. [Uniqueness seeking behavior as a self-verification: an alternative approach to the study of uniqueness].

    PubMed

    Yamaoka, S

    1995-06-01

    Uniqueness theory explains that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness-seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness-seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts, and therefore causes self-verification behavior. Two levels of self-verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness-seeking behavior as personal-level self-verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subject factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach, and were discussed in terms of effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.

  6. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: economics, politics, intelligence, marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> questionable decisions to deploy. Availability -> inability to conceive critical tests. Representativeness -> overinterpretation of results. Positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  7. Galaxy-galaxy lensing in the Dark Energy Survey Science Verification data

    DOE PAGES

    Clampitt, J.; Sánchez, C.; Kwan, J.; ...

    2016-11-22

    We present galaxy-galaxy lensing results from 139 square degrees of Dark Energy Survey (DES) Science Verification (SV) data. Our lens sample consists of red galaxies, known as redMaGiC, which are specifically selected to have a low photometric redshift error and outlier rate. The lensing measurement has a total signal-to-noise of 29 over scales 0.09 < R < 15 Mpc/h, including all lenses over a wide redshift range 0.2 < z < 0.8. Dividing the lenses into three redshift bins for this constant moving number density sample, we find no evidence for evolution in the halo mass with redshift. We obtain consistent results for the lensing measurement with two independent shear pipelines, ngmix and im3shape. We perform a number of null tests on the shear and photometric redshift catalogs and quantify resulting systematic uncertainties. Covariances from jackknife subsamples of the data are validated with a suite of 50 mock surveys. The results and systematics checks in this work provide a critical input for future cosmological and galaxy evolution studies with the DES data and redMaGiC galaxy samples. We fit a Halo Occupation Distribution (HOD) model, and demonstrate that our data constrains the mean halo mass of the lens galaxies, despite strong degeneracies between individual HOD parameters.

  8. Galaxy-galaxy lensing in the Dark Energy Survey Science Verification data

    NASA Astrophysics Data System (ADS)

    Clampitt, J.; Sánchez, C.; Kwan, J.; Krause, E.; MacCrann, N.; Park, Y.; Troxel, M. A.; Jain, B.; Rozo, E.; Rykoff, E. S.; Wechsler, R. H.; Blazek, J.; Bonnett, C.; Crocce, M.; Fang, Y.; Gaztanaga, E.; Gruen, D.; Jarvis, M.; Miquel, R.; Prat, J.; Ross, A. J.; Sheldon, E.; Zuntz, J.; Abbott, T. M. C.; Abdalla, F. B.; Armstrong, R.; Becker, M. R.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Dietrich, J. P.; Doel, P.; Estrada, J.; Evrard, A. E.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; Gruendl, R. A.; Honscheid, K.; James, D. J.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Mohr, J. J.; Nichol, R. C.; Nord, B.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.

    2017-03-01

    We present galaxy-galaxy lensing results from 139 deg2 of Dark Energy Survey (DES) Science Verification (SV) data. Our lens sample consists of red galaxies, known as redMaGiC, which are specifically selected to have a low photometric redshift error and outlier rate. The lensing measurement has a total signal-to-noise ratio of 29 over scales 0.09 < R < 15 Mpc h-1, including all lenses over a wide redshift range 0.2 < z < 0.8. Dividing the lenses into three redshift bins for this constant moving number density sample, we find no evidence for evolution in the halo mass with redshift. We obtain consistent results for the lensing measurement with two independent shear pipelines, NGMIX and IM3SHAPE. We perform a number of null tests on the shear and photometric redshift catalogues and quantify resulting systematic uncertainties. Covariances from jackknife subsamples of the data are validated with a suite of 50 mock surveys. The result and systematic checks in this work provide a critical input for future cosmological and galaxy evolution studies with the DES data and redMaGiC galaxy samples. We fit a halo occupation distribution (HOD) model, and demonstrate that our data constrain the mean halo mass of the lens galaxies, despite strong degeneracies between individual HOD parameters.

  9. Galaxy-galaxy lensing in the Dark Energy Survey Science Verification data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clampitt, J.; Sánchez, C.; Kwan, J.

    We present galaxy-galaxy lensing results from 139 square degrees of Dark Energy Survey (DES) Science Verification (SV) data. Our lens sample consists of red galaxies, known as redMaGiC, which are specifically selected to have a low photometric redshift error and outlier rate. The lensing measurement has a total signal-to-noise of 29 over scales 0.09 < R < 15 Mpc/h, including all lenses over a wide redshift range 0.2 < z < 0.8. Dividing the lenses into three redshift bins for this constant moving number density sample, we find no evidence for evolution in the halo mass with redshift. We obtain consistent results for the lensing measurement with two independent shear pipelines, ngmix and im3shape. We perform a number of null tests on the shear and photometric redshift catalogs and quantify resulting systematic uncertainties. Covariances from jackknife subsamples of the data are validated with a suite of 50 mock surveys. The results and systematics checks in this work provide a critical input for future cosmological and galaxy evolution studies with the DES data and redMaGiC galaxy samples. We fit a Halo Occupation Distribution (HOD) model, and demonstrate that our data constrains the mean halo mass of the lens galaxies, despite strong degeneracies between individual HOD parameters.
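
    To make the galaxy-galaxy lensing observable concrete, here is a minimal, self-contained sketch of stacking tangential ellipticities in radial bins around a lens at the origin. All positions, shapes, and the 0.02/r input shear profile are synthetic placeholders; this is not DES data and does not use the ngmix or im3shape pipelines.

        import numpy as np

        rng = np.random.default_rng(4)
        n_src = 200_000
        x, y = rng.uniform(-15, 15, (2, n_src))       # source offsets [Mpc/h]
        r = np.hypot(x, y)
        phi = np.arctan2(y, x)

        gamma_t_true = 0.02 / np.clip(r, 0.1, None)   # toy shear profile
        e1 = -gamma_t_true * np.cos(2 * phi) + rng.normal(0, 0.25, n_src)
        e2 = -gamma_t_true * np.sin(2 * phi) + rng.normal(0, 0.25, n_src)

        # Tangential component relative to the lens at the origin
        e_t = -(e1 * np.cos(2 * phi) + e2 * np.sin(2 * phi))

        bins = np.geomspace(0.09, 15, 11)             # radial bins, as in the text
        idx = np.digitize(r, bins)
        profile = [e_t[idx == k].mean() for k in range(1, len(bins))]
        print(np.round(profile, 4))                   # roughly recovers 0.02/r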

  10. An Overview of Integration and Test of the James Webb Space Telescope Integrated Science Instrument Module

    NASA Technical Reports Server (NTRS)

    Drury, Michael; Becker, Neil; Bos, Brent; Davila, Pamela; Frey, Bradley; Hylan, Jason; Marsh, James; McGuffey, Douglas; Novak, Maria; Ohl, Raymond

    2007-01-01

    The James Webb Space Telescope (JWST) is a 6.6 m diameter, segmented, deployable telescope for cryogenic IR space astronomy (approx. 40 K). The JWST Observatory architecture includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) element that contains four science instruments (SI) including a Guider. The SIs and Guider are mounted to a composite metering structure with outer dimensions of 2.1 x 2.2 x 1.9 m. The SI and Guider units are integrated to the ISIM structure and optically tested at NASA/Goddard Space Flight Center as an instrument suite using a high-fidelity, cryogenic JWST telescope simulator that features a 1.5 m diameter powered mirror. The SIs are integrated and aligned to the structure under ambient, clean room conditions. SI performance, including focus, pupil shear and wavefront error, is evaluated at the operating temperature. We present an overview of the ISIM integration within the context of Observatory-level construction. We describe the integration and verification plan for the ISIM element, including an overview of our incremental verification approach, ambient mechanical integration and test plans, and optical alignment and cryogenic test plans. We describe key ground support equipment and facilities.

  11. Observatory Science with the NICER X-ray Timing Instrument

    NASA Astrophysics Data System (ADS)

    Remillard, Ronald A.

    2016-04-01

    This presentation is submitted on behalf of the NICER Observatory Science Working Group. NICER will be deployed on the International Space Station later in 2016. The X-ray sensitivity spans 0.2-12 keV, with CCD-like spectral resolution, low background rates, and unprecedented timing accuracy. A Guest Observer (GO) Program has been approved by NASA as one of the proposed Science Enhancement Options, contingent on NICER meeting its Prime Mission Science Objectives. The NICER Science team will observe limited Observatory Science targets (i.e., sources other than neutron stars) in year 1, and GO observations will constitute 50% of the exposures in year 2. Thereafter, NICER will compete for continuation via the NASA Senior Review process. NICER Instrument performance is compared with Missions such as XMM-Newton and RXTE. We briefly highlight the expected themes for Observatory Science relating to accreting black holes on all mass scales, magnetic CVs, active stars, and clusters of galaxies.

  12. Combustion Stability Verification for the Thrust Chamber Assembly of J-2X Developmental Engines 10001, 10002, and 10003

    NASA Technical Reports Server (NTRS)

    Morgan, C. J.; Hulka, J. R.; Casiano, M. J.; Kenny, R. J.; Hinerman, T. D.; Scholten, N.

    2015-01-01

    The J-2X engine, a liquid oxygen/liquid hydrogen propellant rocket engine available for future use on the upper stage of the Space Launch System vehicle, has completed testing of three developmental engines at NASA Stennis Space Center. Twenty-one tests of engine E10001 were conducted from June 2011 through September 2012, thirteen tests of the engine E10002 were conducted from February 2013 through September 2013, and twelve tests of engine E10003 were conducted from November 2013 to April 2014. Verification of combustion stability of the thrust chamber assembly was conducted by perturbing each of the three developmental engines. The primary mechanism for combustion stability verification was examining the response caused by an artificial perturbation (bomb) in the main combustion chamber, i.e., dynamic combustion stability rating. No dynamic instabilities were observed in the TCA, although a few conditions were not bombed. Additional requirements, included to guard against spontaneous instability or rough combustion, were also investigated. Under certain conditions, discrete responses were observed in the dynamic pressure data. The discrete responses were of low amplitude and posed minimal risk to safe engine operability. Rough combustion analyses showed that all three engines met requirements for broad-banded frequency oscillations. Start and shutdown transient chug oscillations were also examined to assess the overall stability characteristics, with no major issues observed.

  13. Commissioning results of an automated treatment planning verification system

    PubMed Central

    Mason, Bryan E.; Robinson, Ronald C.; Kisling, Kelly D.; Kirsner, Steven M.

    2014-01-01

    A dose calculation verification system (VS) was acquired and commissioned as a second check on the treatment planning system (TPS). This system reads DICOM CT datasets, RT plans, RT structures, and RT dose from the TPS and automatically, using its own collapsed cone superposition/convolution algorithm, computes dose on the same CT dataset. The system was commissioned by extracting basic beam parameters for simple field geometries and by dose verification for complex treatments. Percent depth doses (PDD) and profiles were extracted for field sizes using jaw settings from 3 × 3 cm2 to 40 × 40 cm2 and compared to measured data, as well as to our TPS model. Smaller fields of 1 × 1 cm2 and 2 × 2 cm2 generated using the multileaf collimator (MLC) were analyzed in the same fashion as the open fields. In addition, 40 patient plans consisting of both IMRT and VMAT were computed and the following comparisons were made: 1) TPS to the VS, 2) VS to measured data, and 3) TPS to measured data, where measured data comprise both ion chamber (IC) and film measurements. Our results indicated that, for all field sizes using jaw settings, PDD errors for the VS were on average less than 0.87%, 1.38%, and 1.07% for 6x, 15x, and 18x, respectively, relative to measured data. PDD errors for MLC field sizes were less than 2.28%, 1.02%, and 2.23% for 6x, 15x, and 18x, respectively. The infield profile analysis yielded results less than 0.58% for 6x, 0.61% for 15x, and 0.77% for 18x for the VS relative to measured data. Analysis of the penumbra region yielded results ranging from 66.5% of points meeting the DTA criteria to 100% of points for the smaller field sizes, for all energies. Analysis of profile data for field sizes generated using the MLC showed agreement with infield DTA analysis ranging from 68.8% to 100% of points passing the 1.5%/1.5 mm criteria. Results from the dose verification for IMRT and VMAT beams indicated that, on average, the ratio of TPS to IC and VS to IC measurements was
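
    The comparisons described above reduce to two standard tests: a percent-dose difference and a distance-to-agreement (DTA) check. The following is a minimal, hypothetical Python sketch of such a combined test for one depth-dose curve (the function name, tolerances, and data are illustrative; this is not the commissioned system's code):

      import numpy as np

      def pdd_pass_rate(depth_mm, dose_calc, dose_meas, pct_tol=1.5, dta_tol_mm=1.5):
          """Fraction of points passing a percent-dose test OR a 1D DTA test."""
          depth_mm = np.asarray(depth_mm, float)
          dose_calc = np.asarray(dose_calc, float)
          dose_meas = np.asarray(dose_meas, float)
          d_max = dose_meas.max()
          passed = 0
          for i, (z, d) in enumerate(zip(depth_mm, dose_meas)):
              if 100.0 * abs(dose_calc[i] - d) / d_max <= pct_tol:    # dose test
                  passed += 1
                  continue
              # DTA: is there a nearby depth where the calculated curve hits this dose?
              match = np.abs(dose_calc - d) <= 0.005 * d_max          # 0.5% dose window
              if match.any() and np.abs(depth_mm[match] - z).min() <= dta_tol_mm:
                  passed += 1
          return passed / len(depth_mm)

      # Toy curves, not beam data: exponential falloff with ~1% model ripple
      z = np.arange(0.0, 300.0, 2.0)
      meas = 100.0 * np.exp(-z / 180.0)
      calc = meas * (1.0 + 0.01 * np.sin(z / 40.0))
      print(f"pass rate: {pdd_pass_rate(z, calc, meas):.1%}")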

  14. In-Line Phase-Contrast X-ray Imaging and Tomography for Materials Science

    PubMed Central

    Mayo, Sheridan C.; Stevenson, Andrew W.; Wilkins, Stephen W.

    2012-01-01

    X-ray phase-contrast imaging and tomography make use of the refraction of X-rays by the sample in image formation. This provides considerable additional information in the image compared to conventional X-ray imaging methods, which rely solely on X-ray absorption by the sample. Phase-contrast imaging highlights edges and internal boundaries of a sample and is thus complementary to absorption contrast, which is more sensitive to the bulk of the sample. Phase-contrast can also be used to image low-density materials, which do not absorb X-rays sufficiently to form a conventional X-ray image. In the context of materials science, X-ray phase-contrast imaging and tomography have particular value in the 2D and 3D characterization of low-density materials, the detection of cracks and voids and the analysis of composites and multiphase materials where the different components have similar X-ray attenuation coefficients. Here we review the use of phase-contrast imaging and tomography for a wide variety of materials science characterization problems using both synchrotron and laboratory sources and further demonstrate the particular benefits of phase contrast in the laboratory setting with a series of case studies. PMID:28817018

  15. In-Line Phase-Contrast X-ray Imaging and Tomography for Materials Science.

    PubMed

    Mayo, Sheridan C; Stevenson, Andrew W; Wilkins, Stephen W

    2012-05-24

    X-ray phase-contrast imaging and tomography make use of the refraction of X-rays by the sample in image formation. This provides considerable additional information in the image compared to conventional X-ray imaging methods, which rely solely on X-ray absorption by the sample. Phase-contrast imaging highlights edges and internal boundaries of a sample and is thus complementary to absorption contrast, which is more sensitive to the bulk of the sample. Phase-contrast can also be used to image low-density materials, which do not absorb X-rays sufficiently to form a conventional X-ray image. In the context of materials science, X-ray phase-contrast imaging and tomography have particular value in the 2D and 3D characterization of low-density materials, the detection of cracks and voids and the analysis of composites and multiphase materials where the different components have similar X-ray attenuation coefficients. Here we review the use of phase-contrast imaging and tomography for a wide variety of materials science characterization problems using both synchrotron and laboratory sources and further demonstrate the particular benefits of phase contrast in the laboratory setting with a series of case studies.

  16. Highlights of Science Launching on SpaceX CRS-15

    NASA Image and Video Library

    2018-06-24

    A new batch of science is headed to the International Space Station aboard the SpaceX Dragon on the company’s 15th mission for commercial resupply services. Among the research being delivered is science that studies the use of artificial intelligence for crew support, plant water use all over the planet, gut health in space, more efficient drug development and the formation of inorganic structures without the influence of Earth’s gravity. The International Space Station is a convergence of science, technology and human innovation that demonstrates new technologies and enables research not possible on Earth. The space station has been occupied continuously since November 2000. In that time, more than 230 people and a variety of international and commercial spacecraft have visited the orbiting laboratory. The space station remains the springboard to NASA's next great leap in exploration, including future human missions to the Moon and eventually to Mars. Highlighted investigations shown: Mobile Companion/CIMON: https://go.nasa.gov/2JCgPRf ECOSTRESS: https://go.nasa.gov/2sT87DV Angiex Cancer Therapy: https://go.nasa.gov/2LA1Cgc Rodent Research-7: https://go.nasa.gov/2JlVQlC Chemical Gardens: https://go.nasa.gov/2JDCYie Follow updates on the science conducted aboard the space station on Twitter: https://twitter.com/iss_research For more information on how you can conduct your research in microgravity, visit https://go.nasa.gov/2q84LJj HD Download: https://archive.org/details/jsc2018m000428_Highlights_of_Science_Launching_on_SpaceX_CRS-15

  17. The DES Bright Arcs Survey: Hundreds of Candidate Strongly Lensed Galaxy Systems from the Dark Energy Survey Science Verification and Year 1 Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.

    We report the results of searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data span approximately 250 sq. deg. with a median i-band limiting magnitude for extended objects (10σ) of 23.0. The Year 1 data span approximately 2000 sq. deg. and have an i-band limiting magnitude for extended objects (10σ) of 22.9. As these data sets are both wide and deep, they are particularly useful for identifying strong gravitational lens candidates. Potential strong gravitational lens candidate systems were initially identified based on a color and magnitude selection in the DES object catalogs or because the system is at the location of a previously identified galaxy cluster. Cutout images of potential candidates were then visually scanned using an object viewer and numerically ranked according to whether or not we judged them to be likely strong gravitational lens systems. Having scanned nearly 400,000 cutouts, we present 374 candidate strong lens systems, of which 348 are identified for the first time. We provide the R.A. and decl., the magnitudes and photometric properties of the lens and source objects, and the distance (radius) of the source(s) from the lens center for each system.

  18. The life science X-ray scattering beamline at NSLS-II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiFabio, Jonathan; Yang, Lin; Chodankar, Shirish

    We report the current development status of the High Brightness X-ray Scattering for Life Sciences (or Life Science X-ray Scattering, LiX) beamline at the NSLS-II facility of Brookhaven National Laboratory. This instrument will operate in the x-ray energy range of 2.1-18 keV, provide variable beam sizes from 1 micron to ~0.5 mm, and support user experiments in three scientific areas: (1) high-throughput solution scattering, in-line size exclusion chromatography and flow mixers-based time-resolved solution scattering of biological macro-molecules, (2) diffraction from single- and multi-layered lipid membranes, and (3) scattering-based scanning probe imaging of biological tissues. In order to satisfy the beam stability required for these experiments and to switch rapidly between different types of experiments, we have adopted a secondary source with refractive lenses for secondary focusing, a detector system consisting of three Pilatus detectors, and specialized experimental modules that can be quickly exchanged and each dedicated to a defined set of experiments. The construction of this beamline is on schedule for completion in September 2015. User experiments are expected to start in Spring 2016.

  19. The life science X-ray scattering beamline at NSLS-II

    DOE PAGES

    DiFabio, Jonathan; Yang, Lin; Chodankar, Shirish; ...

    2015-09-30

    We report the current development status of the High Brightness X-ray Scattering for Life Sciences (or Life Science X-ray Scattering, LiX) beamline at the NSLS-II facility of Brookhaven National Laboratory. This instrument will operate in the x-ray energy range of 2.1-18 keV, provide variable beam sizes from 1 micron to ~0.5 mm, and support user experiments in three scientific areas: (1) high-throughput solution scattering, in-line size exclusion chromatography and flow mixers-based time-resolved solution scattering of biological macro-molecules, (2) diffraction from single- and multi-layered lipid membranes, and (3) scattering-based scanning probe imaging of biological tissues. In order to satisfy the beam stability required for these experiments and to switch rapidly between different types of experiments, we have adopted a secondary source with refractive lenses for secondary focusing, a detector system consisting of three Pilatus detectors, and specialized experimental modules that can be quickly exchanged and each dedicated to a defined set of experiments. The construction of this beamline is on schedule for completion in September 2015. User experiments are expected to start in Spring 2016.

  20. The life science x-ray scattering beamline at NSLS-II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiFabio, Jonathan; Chodankar, Shirish; Pjerov, Sal

    We report the current development status of the High Brightness X-ray Scattering for Life Sciences (or Life Science X-ray Scattering, LiX) beamline at the NSLS-II facility of Brookhaven National Laboratory. This instrument will operate in the x-ray energy range of 2.1-18 keV, provide variable beam sizes from 1 micron to ∼0.5 mm, and support user experiments in three scientific areas: (1) high-throughput solution scattering, in-line size exclusion chromatography and flow mixers-based time-resolved solution scattering of biological macro-molecules, (2) diffraction from single- and multi-layered lipid membranes, and (3) scattering-based scanning probe imaging of biological tissues. In order to satisfy the beam stability required for these experiments and to switch rapidly between different types of experiments, we have adopted a secondary source with refractive lenses for secondary focusing, a detector system consisting of three Pilatus detectors, and specialized experimental modules that can be quickly exchanged and each dedicated to a defined set of experiments. The construction of this beamline is on schedule for completion in September 2015. User experiments are expected to start in Spring 2016.

  1. VizieR Online Data Catalog: Reflectance spectra of 12 Trojans and Hildas (Marsset+, 2014)

    NASA Astrophysics Data System (ADS)

    Marsset, M.; Vernazza, P.; Gourgeot, F.; Dumas, C.; Birlan, M.; Lamy, P.; Binzel, R. P.

    2014-07-01

    We present 17 reflectance spectra of 12 high-albedo (pv>0.14) Trojans (8 objects) and Hildas (4 objects) obtained with the ESO/VLT Echelle spectrograph X-SHOOTER in the 0.3-2.2um spectral range (14 spectra) and with the NASA/IRTF spectrograph SpeX in the 0.8-2.5um spectral range (3 spectra). X-SHOOTER spectra were normalized to unity at 0.55um and SpeX spectra were normalized to unity at 2.2um. The spectra presented in this work were collected between April and December 2013. (18 data files).

  2. Mass and galaxy distributions of four massive galaxy clusters from Dark Energy Survey Science Verification data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melchior, P.; Suchyta, E.; Huff, E.

    2015-03-31

    We measure the weak-lensing masses and galaxy distributions of four massive galaxy clusters observed during the Science Verification phase of the Dark Energy Survey. This pathfinder study is meant to 1) validate the DECam imager for the task of measuring weak-lensing shapes, and 2) utilize DECam's large field of view to map out the clusters and their environments over 90 arcmin. We conduct a series of rigorous tests on astrometry, photometry, image quality, PSF modeling, and shear measurement accuracy to single out flaws in the data and also to identify the optimal data processing steps and parameters. We find Science Verification data from DECam to be suitable for the lensing analysis described in this paper. The PSF is generally well-behaved, but the modeling is rendered difficult by a flux-dependent PSF width and ellipticity. We employ photometric redshifts to distinguish between foreground and background galaxies, and a red-sequence cluster finder to provide cluster richness estimates and cluster-galaxy distributions. By fitting NFW profiles to the clusters in this study, we determine weak-lensing masses that are in agreement with previous work. For Abell 3261, we provide the first estimates of redshift, weak-lensing mass, and richness. In addition, the cluster-galaxy distributions indicate the presence of filamentary structures attached to 1E 0657-56 and RXC J2248.7-4431, stretching out as far as 1 degree (approximately 20 Mpc), showcasing the potential of DECam and DES for detailed studies of degree-scale features on the sky.
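
    For reference, the NFW profile fitted in studies like this has a simple closed form, ρ(r) = ρ0 / [(r/rs)(1 + r/rs)^2]; weak-lensing fits actually use its projected shear profile, but the 3D density is the underlying model. A minimal Python sketch with purely illustrative parameter values (not the paper's fits):

      import numpy as np

      def nfw_density(r_mpc, rho0, rs_mpc):
          """Navarro-Frenk-White density: rho0 / [(r/rs) * (1 + r/rs)^2]."""
          x = np.asarray(r_mpc, float) / rs_mpc
          return rho0 / (x * (1.0 + x) ** 2)

      # Illustrative numbers only (units schematic: Msun per Mpc^3)
      r = np.logspace(-2, 1, 5)                 # 0.01 to 10 Mpc
      print(nfw_density(r, rho0=1.0e15, rs_mpc=0.3))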

  3. Mass and galaxy distributions of four massive galaxy clusters from Dark Energy Survey Science Verification data

    DOE PAGES

    Melchior, P.; Suchyta, E.; Huff, E.; ...

    2015-03-31

    We measure the weak-lensing masses and galaxy distributions of four massive galaxy clusters observed during the Science Verification phase of the Dark Energy Survey. This pathfinder study is meant to 1) validate the DECam imager for the task of measuring weak-lensing shapes, and 2) utilize DECam's large field of view to map out the clusters and their environments over 90 arcmin. We conduct a series of rigorous tests on astrometry, photometry, image quality, PSF modelling, and shear measurement accuracy to single out flaws in the data and also to identify the optimal data processing steps and parameters. We find Science Verification data from DECam to be suitable for the lensing analysis described in this paper. The PSF is generally well-behaved, but the modelling is rendered difficult by a flux-dependent PSF width and ellipticity. We employ photometric redshifts to distinguish between foreground and background galaxies, and a red-sequence cluster finder to provide cluster richness estimates and cluster-galaxy distributions. By fitting NFW profiles to the clusters in this study, we determine weak-lensing masses that are in agreement with previous work. For Abell 3261, we provide the first estimates of redshift, weak-lensing mass, and richness. Additionally, the cluster-galaxy distributions indicate the presence of filamentary structures attached to 1E 0657-56 and RXC J2248.7-4431, stretching out as far as 1 degree (approximately 20 Mpc), showcasing the potential of DECam and DES for detailed studies of degree-scale features on the sky.

  4. Advanced X-ray Astrophysics Facility (AXAF) science instruments

    NASA Technical Reports Server (NTRS)

    Winkler, Carl E.; Dailey, Carroll C.; Cumings, Nesbitt P.

    1991-01-01

    The overall AXAF program is summarized, with particular emphasis given to its science instruments. The science objectives established for AXAF are to determine the nature of celestial objects, from normal stars to quasars, to elucidate the nature of the physical processes which take place in and between astronomical objects, and to shed light on the history and evolution of the universe. Attention is given to the AXAF CCD imaging spectrometer, which is to provide spectrally and temporally resolved imaging or, in conjunction with a transmission grating, high-resolution dispersed spectral images of celestial sources. A high-resolution camera, an X-ray spectrometer, and the Bragg Crystal Spectrometer are also discussed.

  5. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility, and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state-space explosion.

  6. Cosmology from cosmic shear with Dark Energy Survey Science Verification data

    DOE PAGES

    Becker, M. R.

    2016-07-06

    We present the first constraints on cosmology from the Dark Energy Survey (DES), using weak lensing measurements from the preliminary Science Verification (SV) data. We use 139 square degrees of SV data, which is less than 3% of the full DES survey area. Using cosmic shear 2-point measurements over three redshift bins we find σ8(Ωm/0.3)^0.5 = 0.81 ± 0.06 (68% confidence), after marginalising over 7 systematics parameters and 3 other cosmological parameters. Furthermore, we examine the robustness of our results to the choice of data vector and systematics assumed, and find them to be stable. About 20% of our error bar comes from marginalising over shear and photometric redshift calibration uncertainties. The current state-of-the-art cosmic shear measurements from CFHTLenS are mildly discrepant with the cosmological constraints from Planck CMB data. Our results are consistent with both datasets. Our uncertainties are ~30% larger than those from CFHTLenS when we carry out a comparable analysis of the two datasets, which we attribute largely to the lower number density of our shear catalogue. We investigate constraints on dark energy and find that, with this small fraction of the full survey, the DES SV constraints make negligible impact on the Planck constraints. The moderate disagreement between the CFHTLenS and Planck values of σ8(Ωm/0.3)^0.5 is present regardless of the value of w.
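
    The constrained quantity here is the common lensing combination often written S8 = σ8 (Ωm/0.3)^0.5; cosmic shear pins down this product rather than σ8 alone. A one-line numeric check with assumed illustrative parameter values:

      # Illustrative only: S8 = sigma_8 * (Omega_m / 0.3)**0.5
      sigma_8, omega_m = 0.85, 0.27      # hypothetical values, not DES results
      print(f"S8 = {sigma_8 * (omega_m / 0.3) ** 0.5:.3f}")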

  7. Cosmology from cosmic shear with Dark Energy Survey Science Verification data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, M. R.

    We present the first constraints on cosmology from the Dark Energy Survey (DES), using weak lensing measurements from the preliminary Science Verification (SV) data. We use 139 square degrees of SV data, which is less than 3% of the full DES survey area. Using cosmic shear 2-point measurements over three redshift bins we find σ8(Ωm/0.3)^0.5 = 0.81 ± 0.06 (68% confidence), after marginalising over 7 systematics parameters and 3 other cosmological parameters. Furthermore, we examine the robustness of our results to the choice of data vector and systematics assumed, and find them to be stable. About 20% of our error bar comes from marginalising over shear and photometric redshift calibration uncertainties. The current state-of-the-art cosmic shear measurements from CFHTLenS are mildly discrepant with the cosmological constraints from Planck CMB data. Our results are consistent with both datasets. Our uncertainties are ~30% larger than those from CFHTLenS when we carry out a comparable analysis of the two datasets, which we attribute largely to the lower number density of our shear catalogue. We investigate constraints on dark energy and find that, with this small fraction of the full survey, the DES SV constraints make negligible impact on the Planck constraints. The moderate disagreement between the CFHTLenS and Planck values of σ8(Ωm/0.3)^0.5 is present regardless of the value of w.

  8. VLT/X-shooter observations of the low-metallicity blue compact dwarf galaxy PHL 293B including a luminous blue variable star

    NASA Astrophysics Data System (ADS)

    Izotov, Y. I.; Guseva, N. G.; Fricke, K. J.; Henkel, C.

    2011-09-01

    Context. We present VLT/X-shooter spectroscopic observations in the wavelength range λλ3000-23 000 Å of the extremely metal-deficient blue compact dwarf (BCD) galaxy PHL 293B containing a luminous blue variable (LBV) star and compare them with previous data. Aims: This BCD is one of the two lowest-metallicity galaxies where LBV stars have been detected, allowing us to study the LBV phenomenon in the extremely low metallicity regime. Methods: We determine abundances of nitrogen, oxygen, neon, sulfur, argon, and iron by analyzing the fluxes of narrow components of the emission lines using empirical methods, and study the properties of the LBV from the fluxes and widths of broad emission lines. Results: We derive an interstellar oxygen abundance of 12+log O/H = 7.71 ± 0.02, which is in agreement with previous determinations. The observed fluxes of narrow Balmer, Paschen, and Brackett hydrogen lines correspond to the theoretical recombination values after correction for extinction with a single value C(Hβ) = 0.225. This implies that the star-forming region observed in the optical range is the only source of ionisation; there is no additional source of ionisation that is seen in the NIR range but hidden in the optical range. We detect three v = 1-0 vibrational lines of molecular hydrogen. Their flux ratios and the non-detection of v = 2-1 and 3-1 emission lines suggest that collisional excitation is the main source producing the H2 lines. For the LBV star in PHL 293B we find broad emission with P Cygni profiles in several Balmer hydrogen emission lines and, for the first time, in several Paschen hydrogen lines and in several He i emission lines, implying temporal evolution of the LBV on a time scale of 8 years. The Hα luminosity of the LBV star is one order of magnitude higher than that obtained for the LBV star in NGC 2363 ≡ Mrk 71, which has a slightly higher metallicity of 12+log O/H = 7.87. The terminal velocity of the stellar wind in the low-metallicity LBV of PHL293

  9. Commissioning and quality assurance of an integrated system for patient positioning and setup verification in particle therapy.

    PubMed

    Pella, A; Riboldi, M; Tagaste, B; Bianculli, D; Desplanques, M; Fontana, G; Cerveri, P; Seregni, M; Fattori, G; Orecchia, R; Baroni, G

    2014-08-01

    In an increasing number of clinical indications, radiotherapy with accelerated particles shows relevant advantages when compared with high energy X-ray irradiation. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The purpose of this work is to describe the commissioning and the design of the quality assurance procedures for the patient positioning and setup verification systems at the Italian National Center for Oncological Hadrontherapy (CNAO). The accuracy of the systems installed at CNAO and devoted to patient positioning and setup verification has been assessed using a laser tracking device. The accuracy in calibration and image-based setup verification relying on the in-room X-ray imaging system was also quantified. Quality assurance tests to check the integration among all patient setup systems were designed, and records of daily QA tests since the start of clinical operation (2011) are presented. The overall accuracy of the patient positioning system and patient verification system motion proved to be below 0.5 mm under all the examined conditions, with median values below the 0.3 mm threshold. Image-based registration in phantom studies exhibited sub-millimetric accuracy in setup verification at both cranial and extra-cranial sites. The calibration residuals of the optical tracking system (OTS) were found to be consistent with expectations, with peak values below 0.3 mm. Quality assurance tests, performed daily before clinical operation, confirm adequate integration and sub-millimetric setup accuracy. Robotic patient positioning was successfully integrated with optical tracking and stereoscopic X-ray verification for patient setup in particle therapy. Sub-millimetric setup accuracy was achieved and consistently verified in daily clinical operation.
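
    Accuracy claims like these come down to residual statistics over repeated positioning trials. A minimal sketch, assuming arrays of commanded versus laser-tracker-measured 3D positions (the toy data and function name are illustrative; this is not CNAO's QA software):

      import numpy as np

      def setup_residuals(commanded_mm, measured_mm):
          """Per-point 3D position errors (mm) and summary statistics."""
          err = np.linalg.norm(np.asarray(measured_mm, float) -
                               np.asarray(commanded_mm, float), axis=1)
          return {"median": float(np.median(err)),
                  "rms": float(np.sqrt(np.mean(err ** 2))),
                  "max": float(err.max())}

      # Toy data: 5 commanded couch positions with ~0.15 mm measurement noise
      rng = np.random.default_rng(42)
      cmd = np.zeros((5, 3))
      meas = cmd + rng.normal(scale=0.15, size=cmd.shape)
      print(setup_residuals(cmd, meas))    # expect sub-0.5 mm values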

  10. Pre-performance Physiological State: Heart Rate Variability as a Predictor of Shooting Performance.

    PubMed

    Ortega, E; Wang, C J K

    2018-03-01

    Heart rate variability (HRV) is commonly used in sport science for monitoring the physiology of athletes but not as an indicator of physiological state from a psychological perspective. Since HRV is established to be an indicator of emotional responding, it could be an objective means of quantifying an athlete's subjective physiological state before competition. A total of 61 sport shooters participated in this study, of which 21 were novice shooters, 19 were intermediate shooters, and 21 were advanced level shooters. HRV, self-efficacy, and use of mental skills were assessed before they completed a standard shooting performance task of 40 shots, as in a competition qualifying round. The results showed that HRV was significantly positively correlated with self-efficacy and performance and was a significant predictor of shooting performance. In addition, advanced shooters were found to have significantly lower average heart rate before shooting and used more self-talk, relaxation, imagery, and automaticity compared to novice and intermediate shooters. HRV was found to be useful in identifying the physiological state of an athlete before competing, and as such, coaches and athletes can adopt practical strategies to improve the pre-performance physiological state as a means to optimize performance.

  11. Précis of the myth of martyrdom: what really drives suicide bombers, rampage shooters, and other self-destructive killers.

    PubMed

    Lankford, Adam

    2014-08-01

    For years, scholars have claimed that suicide terrorists are not suicidal, but rather psychologically normal individuals inspired to sacrifice their lives for an ideological cause, due to a range of social and situational factors. I agree that suicide terrorists are shaped by their contexts, as we all are. However, I argue that these scholars went too far. In The Myth of Martyrdom: What Really Drives Suicide Bombers, Rampage Shooters, and Other Self-Destructive Killers, I take the opposing view, based on my in-depth analyses of suicide attackers from Asia, Africa, Europe, the Middle East, and North America; attackers who were male, female, young, old, Islamic, and Christian; attackers who carried out the most deadly and the least deadly strikes. I present evidence that in terms of their behavior and psychology, suicide terrorists are much like others who commit conventional suicides, murder-suicides, or unconventional suicides where mental health problems, personal crises, coercion, fear of an approaching enemy, or hidden self-destructive urges play a major role. I also identify critical differences between suicide terrorists and those who have genuinely sacrificed their lives for a greater good. By better understanding suicide terrorists, experts in the behavioral and brain sciences may be able to pioneer exciting new breakthroughs in security countermeasures and suicide prevention. And even more ambitiously, by examining these profound extremes of the human condition, perhaps we can more accurately grasp the power of the human survival instinct among those who are actually psychologically healthy.

  12. Galaxy bias from galaxy-galaxy lensing in the DES Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prat, J.; et al.

    We present a measurement of galaxy-galaxy lensing around a magnitude-limited (i_AB < 22.5) sample of galaxies selected from the Dark Energy Survey Science Verification (DES-SV) data. We split these lenses into three photometric-redshift bins from 0.2 to 0.8, and determine the product of the galaxy bias b and the cross-correlation coefficient between the galaxy and dark matter overdensity fields r in each bin, using scales above 4 Mpc/h comoving, where we find the linear bias model to be valid given our current uncertainties. We compare our galaxy bias results from galaxy-galaxy lensing with those obtained from galaxy clustering (Crocce et al. 2016) and CMB lensing (Giannantonio et al. 2016) for the same sample of galaxies, and find our measurements to be in good agreement with those in Crocce et al. (2016), while, in the lowest redshift bin (z ~ 0.3), they show some tension with the findings in Giannantonio et al. (2016). Our results are found to be rather insensitive to a large range of systematic effects. We measure b·r to be 0.87 ± 0.11, 1.12 ± 0.16, and 1.24 ± 0.23, respectively, for the three redshift bins of width Δz = 0.2 in the range 0.2 < z < 0.8.

  13. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  14. Miniature X-Ray Solar Spectrometer: A Science-Oriented, University 3U CubeSat

    NASA Technical Reports Server (NTRS)

    Mason, James P.; Woods, Thomas N.; Caspi, Amir; Chamberlin, Phillip C.; Moore, Christopher; Jones, Andrew; Kohnert, Rick; Li, Xinlin; Palo, Scott; Solomon, Stanley C.

    2016-01-01

    The miniature x-ray solar spectrometer is a three-unit CubeSat developed at the Laboratory for Atmospheric and Space Physics at the University of Colorado, Boulder. Over 40 students contributed to the project with professional mentorship and technical contributions from professors in the Aerospace Engineering Sciences Department at University of Colorado, Boulder and from Laboratory for Atmospheric and Space Physics scientists and engineers. The scientific objective of the miniature x-ray solar spectrometer is to study processes in the dynamic sun, from quiet sun to solar flares, and to further understand how these changes in the sun influence the Earth's atmosphere by providing unique spectral measurements of solar soft x-rays. The enabling technology providing the advanced solar soft x-ray spectral measurements is the Amptek X123, a commercial off-the-shelf silicon drift detector. The Amptek X123 has a low mass (approx. 324 g after modification), modest power consumption (approx. 2.50 W), and small volume (6.86 x 9.91 x 2.54 cm), making it ideal for a CubeSat. This paper provides an overview of the miniature x-ray solar spectrometer mission: the science objectives, project history, subsystems, and lessons learned, which can be useful for the small-satellite community.

  15. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, are one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  16. Impulsivity and related neuropsychological features in regular and addictive first person shooter gaming.

    PubMed

    Metcalf, Olivia; Pammer, Kristen

    2014-03-01

    Putative cyber addictions are of significant interest. There remains little experimental research into excessive use of first person shooter (FPS) games, despite their global popularity. Moreover, the role between excessive gaming and impulsivity remains unclear, with previous research showing conflicting findings. The current study investigated performances on a number of neuropsychological tasks (go/no-go, continuous performance task, Iowa gambling task) and a trait measure of impulsivity for a group of regular FPS gamers (n=25), addicted FPS gamers (n=22), and controls (n=22). Gamers were classified using the Addiction-Engagement Questionnaire. Addicted FPS gamers had significantly higher levels of trait impulsivity on the Barratt Impulsiveness Scale compared to controls. Addicted FPS gamers also had significantly higher levels of disinhibition in a go/no-go task and inattention in a continuous performance task compared to controls, whereas the regular FPS gamers had better decision making on the Iowa gambling task compared to controls. The results indicate impulsivity is associated with FPS gaming addiction, comparable to pathological gambling. The relationship between impulsivity and excessive gaming may be unique to the FPS genre. Furthermore, regular FPS gaming may improve decision making ability.

  17. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
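
    Since SeaHorn's intermediate language is Horn clauses, a tiny self-contained illustration of that encoding style may help. The sketch below uses the Z3 solver's Python bindings (not SeaHorn itself) to encode the loop "x = 0; while (*) x += 2;" as Horn clauses over an invariant relation, then asks whether the error state x == 1 is reachable; an unsat answer means the property holds:

      from z3 import And, BoolSort, Fixedpoint, Function, IntSort, Ints

      fp = Fixedpoint()
      inv = Function('inv', IntSort(), BoolSort())   # unknown loop invariant
      fp.register_relation(inv)
      x, xp = Ints('x xp')
      fp.declare_var(x, xp)
      fp.rule(inv(x), x == 0)                        # init: x = 0
      fp.rule(inv(xp), And(inv(x), xp == x + 2))     # step: x -> x + 2
      # Query the error state; "unsat" means x == 1 is unreachable (safe).
      print(fp.query(And(inv(x), x == 1)))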

  18. Determining the lifetime of detectable amounts of gunshot residue on the hands of a shooter using laser-induced breakdown spectroscopy.

    PubMed

    Rosenberg, Matthew B; Dockery, Christopher R

    2008-11-01

    Laser-induced breakdown spectroscopy (LIBS) has been used to determine the period of time over which a shooter will test positive for gunshot residue (GSR) after firing a revolver. Multiple rounds of primer were fired and samples were collected at multiple-hour intervals using an adhesive tape pressed against the skin. Samples were analyzed directly using a commercially available laser-induced breakdown spectrometer, where barium emission (originating from barium nitrate in the primer) was observed. Population statistics were used to compare suspected GSR to a library of blank samples, from which a threshold value was established. Statistically significant results, positive for GSR, were obtained up to 5.27 days after a firearm discharge using these techniques.
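
    The population-statistics step amounts to a detection threshold derived from the blank library. A minimal sketch assuming a mean-plus-k-sigma decision rule (a common choice; the paper's exact statistic may differ, and all numbers below are invented):

      import numpy as np

      def gsr_positive(sample_intensity, blank_intensities, k=3.0):
          """Flag a tape lift as GSR-positive if its Ba emission intensity
          exceeds mean(blanks) + k * std(blanks)."""
          blanks = np.asarray(blank_intensities, float)
          threshold = blanks.mean() + k * blanks.std(ddof=1)
          return sample_intensity > threshold, threshold

      # Toy library of 20 blank lifts vs. one suspected-shooter lift
      rng = np.random.default_rng(7)
      blanks = rng.normal(100.0, 5.0, size=20)       # arbitrary intensity units
      print(gsr_positive(140.0, blanks))             # expect (True, ~115)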

  19. Physics Verification Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
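
    The convergence analysis at the heart of code verification is compactly summarized by the observed order of accuracy, p ≈ log(E_coarse / E_fine) / log(r), computed from errors on two mesh levels with refinement ratio r. A minimal sketch, assuming an exact or manufactured solution is available to measure the errors:

      import numpy as np

      def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
          """Observed order of accuracy from errors on two mesh levels."""
          return np.log(err_coarse / err_fine) / np.log(refinement_ratio)

      # Toy errors from a nominally 2nd-order scheme on meshes h and h/2
      print(observed_order(4.0e-4, 1.0e-4))   # -> 2.0, matching the formal order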

  20. Cleared for Launch - Lessons Learned from the OSIRIS-REx System Requirements Verification Program

    NASA Technical Reports Server (NTRS)

    Stevens, Craig; Adams, Angela; Williams, Bradley; Goodloe, Colby

    2017-01-01

    Requirements verification of a large flight system is a challenge. It is especially challenging for engineers taking on their first role in space systems engineering. This paper describes our approach to verification of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) system requirements. It also captures lessons learned along the way from developing systems engineers embroiled in this process. We begin with an overview of the mission and science objectives as well as the project requirements verification program strategy. A description of the requirements flow down is presented including our implementation for managing the thousands of program and element level requirements and associated verification data. We discuss both successes and methods to improve the managing of this data across multiple organizational interfaces. Our approach to verifying system requirements at multiple levels of assembly is presented using examples from our work at instrument, spacecraft, and ground segment levels. We include a discussion of system end-to-end testing limitations and their impacts to the verification program. Finally, we describe lessons learned that are applicable to all emerging space systems engineers using our unique perspectives across multiple organizations of a large NASA program.

  1. EMC: Verification

    Science.gov Websites

    Web page listing EMC verification resources: verification of NAM, GFS, RAP, HRRR, HIRESW, and SREF mean forecasts against international global models and HPC analyses; precipitation skill scores (1995-present) for NAM, GFS, the NAM CONUS nest, and international models; EMC forecast verification statistics for NAM; and real-time verification of NCEP operational models against observations.

  2. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "Did the product meet the stated specification, performance, or design documentation?" In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts", which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the system engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  3. Verification Test of the SURF and SURFplus Models in xRage: Part III Effect of Mesh Alignment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menikoff, Ralph

    The previous studies used an underdriven detonation wave in 1-dimension (steady ZND reaction zone profile followed by a scale-invariant rarefaction wave) for PBX 9502 as a verification test of the implementation of the SURF and SURFplus models in the xRage code. Since the SURF rate is a function of the lead shock pressure, the question arises as to the effect on accuracy of variations in the detected shock pressure due to the alignment of the shock front with the mesh. To study the effect of mesh alignment we simulate a cylindrically diverging detonation wave using a planar 2-D mesh. The leading issue is the magnitude of azimuthal asymmetries in the numerical solution. The 2-D test case does not have an exact analytic solution. To quantify the accuracy, the 2-D solution along rays through the origin is compared to a highly resolved 1-D simulation in cylindrical geometry.

  4. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    NASA Astrophysics Data System (ADS)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete-time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
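
    For the 2x2 contingency-table assessments mentioned above, the usual scores are simple functions of the four cell counts. A small illustrative sketch (the counts are invented, not MOSWOC statistics):

      def contingency_scores(hits, false_alarms, misses, correct_negatives):
          """Common 2x2 verification scores for yes/no event forecasts."""
          a, b, c, d = hits, false_alarms, misses, correct_negatives
          n = a + b + c + d
          return {"POD": a / (a + c),         # probability of detection
                  "FAR": b / (a + b),         # false alarm ratio
                  "CSI": a / (a + b + c),     # critical success index
                  "PC": (a + d) / n}          # proportion correct

      # Invented example: 30 CME arrivals hit, 10 false alarms, 5 misses, 55 correct nulls
      print(contingency_scores(30, 10, 5, 55))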

  5. Voltage verification unit

    DOEpatents

    Martin, Edward J [Virginia Beach, VA]

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out; once the Lock Out/Tag Out is completed, testing or "trying" can be accomplished by simply reading a display on the voltage verification unit, without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  6. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    NASA Astrophysics Data System (ADS)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

    NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532nm with a laser repetition rate of 10 kHz, and a 6 spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases, and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the onboard DRMs will be used to determine the vertical width of the telemetry band about the signal. University of Texas-Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks have been implemented, including
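
    The onboard signal-finding step lends itself to a compact illustration: histogram the received photon heights (or times), treat the background as Poisson, and keep bins whose counts are statistically improbable under noise alone. A toy sketch with invented numbers and a simple mean-plus-5-sigma cut (the flight algorithm and its database margins are considerably more involved):

      import numpy as np

      rng = np.random.default_rng(0)
      # Toy photon heights: uniform background plus a surface return near 120 m
      noise = rng.uniform(0.0, 500.0, size=2000)
      signal = rng.normal(120.0, 0.5, size=300)
      heights = np.concatenate([noise, signal])

      counts, edges = np.histogram(heights, bins=250)     # ~2 m bins
      lam = counts.mean()                                 # Poisson background level
      hot = counts > lam + 5.0 * np.sqrt(lam)             # statistically significant bins
      print(f"surface found near {edges[:-1][hot].mean():.1f} m")   # ~120 m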

  7. SMAP Verification and Validation Project - Final Report

    NASA Technical Reports Server (NTRS)

    Murry, Michael

    2012-01-01

    In 2007, the National Research Council (NRC) released the Decadal Survey of Earth science, identifying 15 new space missions of significant scientific and application value for the National Aeronautics and Space Administration (NASA) to undertake in the coming decade. One of these missions was the Soil Moisture Active Passive (SMAP) mission, which NASA assigned to the Jet Propulsion Laboratory (JPL) in 2008. The goal of SMAP is to provide global, high resolution mapping of soil moisture and its freeze/thaw states. The SMAP project recently passed its Critical Design Review and is proceeding with its fabrication and testing phase. Verification and Validation (V&V) is widely recognized as a critical component in system engineering and is vital to the success of any space mission. V&V is a process that is used to check that a system meets its design requirements and specifications in order to fulfill its intended purpose. Verification often refers to the question "Have we built the system right?" whereas Validation asks "Have we built the right system?" Currently the SMAP V&V team is verifying design requirements through inspection, demonstration, analysis, or testing. An example of the SMAP V&V process is the verification of the antenna pointing accuracy with mathematical models, since it is not possible to provide the appropriate micro-gravity environment for testing the antenna on Earth before launch.

  8. redMaGiC: selecting luminous red galaxies from the DES Science Verification data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rozo, E.

    We introduce redMaGiC, an automated algorithm for selecting Luminous Red Galaxies (LRGs). The algorithm was developed to minimize photometric redshift uncertainties in photometric large-scale structure studies. redMaGiC achieves this by self-training the color cuts necessary to produce a luminosity-thresholded LRG sample of constant comoving density. Additionally, we demonstrate that redMaGiC photo-zs are very nearly as accurate as the best machine-learning based methods, yet they require minimal spectroscopic training, do not suffer from extrapolation biases, and are very nearly Gaussian. We apply our algorithm to Dark Energy Survey (DES) Science Verification (SV) data to produce a redMaGiC catalog sampling the redshift range z ∈ [0.2, 0.8]. Our fiducial sample has a comoving space density of 10^-3 (h^-1 Mpc)^-3, and a median photo-z bias (z_spec - z_photo) and scatter (σ_z/(1 + z)) of 0.005 and 0.017, respectively. The corresponding 5σ outlier fraction is 1.4%. We also test our algorithm with Sloan Digital Sky Survey (SDSS) Data Release 8 (DR8) and Stripe 82 data, and discuss how spectroscopic training can be used to control photo-z biases at the 0.1% level.
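
    The constant-comoving-density idea is easy to illustrate: in each redshift slice, keep only the most luminous galaxies until a target density is reached. A schematic sketch, with a hypothetical comoving_volume() helper standing in for a proper cosmology integral (nothing here is the actual redMaGiC implementation):

      import numpy as np

      def comoving_volume(z_lo, z_hi, dvdz=1.0e7):
          # Hypothetical stand-in: shell volume in (Mpc/h)^3; a real version
          # would integrate dV/dz for the chosen cosmology and survey area.
          return dvdz * (z_hi - z_lo)

      def constant_density_sample(z, lum, n_target=1.0e-3, dz=0.1):
          """Indices of a luminosity-thresholded sample of ~constant density."""
          keep = []
          for z_lo in np.arange(0.2, 0.8, dz):
              in_bin = np.where((z >= z_lo) & (z < z_lo + dz))[0]
              n_keep = int(n_target * comoving_volume(z_lo, z_lo + dz))
              # Keep only the most luminous galaxies in this slice
              keep.extend(in_bin[np.argsort(lum[in_bin])[::-1][:n_keep]])
          return np.asarray(keep)

      rng = np.random.default_rng(1)
      z = rng.uniform(0.2, 0.8, 50000)
      lum = rng.lognormal(0.0, 0.5, 50000)
      print(len(constant_density_sample(z, lum)), "galaxies selected")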

  9. PREFACE: Buried Interface Sciences with X-rays and Neutrons 2010

    NASA Astrophysics Data System (ADS)

    Sakurai, Kenji

    2011-09-01

    The 2010 summer workshop on buried interface science with x-rays and neutrons was held at Nagoya University, Japan, on 25-27 July 2010. The workshop was organized by the Japan Applied Physics Society, which established a group to develop the research field of studying buried function interfaces with x-rays and neutrons. The workshop was the latest in a series held since 2001; Tsukuba (December 2001), Niigata (September 2002), Nagoya (July 2003), Tsukuba (July 2004), Saitama (March 2005), Yokohama (July 2006), Kusatsu (August 2006), Tokyo (December 2006), Sendai (July 2007), Sapporo (September 2007), Tokyo (December 2007), Tokyo-Akihabara (July 2009) and Hiratsuka (March 2010). The 2010 summer workshop had 64 participants and 34 presentations. Interfaces mark the boundaries of different material systems at which many interesting phenomena take place, thus making it extremely important to design, fabricate and analyse the structures of interfaces at both the atomic and macroscopic scale. For many applications, devices are prepared in the form of multi-layered thin films, with the result that interfaces are not exposed but buried under multiple layers. Because of such buried conditions, it is generally not easy to analyse such interfaces. In certain cases, for example, when the thin surface layer is not a solid but a liquid such as water, scientists can observe the atomic arrangement of the liquid-solid interface directly by using a scanning probe microscope, of which the tip is soaked in water. However, it has become clear that the use of a stylus tip positioned extremely close to the interface might change the structure of the water molecules. Therefore it is absolutely crucial to develop non-contact, non-destructive probes for buried interfaces. It is known that analysis using x-rays and neutrons is one of the most powerful tools for exploring near-surface structures including interfaces buried under several layers. In particular, x-ray analysis using 3rd

  10. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone, owing to the intensive and customized process necessary to verify each individual component model, and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
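
    At its core, checking an FFM of this kind is a graph-reachability question: does each modeled failure mode propagate to the effects the physical system documentation says it should? A minimal sketch with a plain adjacency-list graph (the model content below is invented for illustration):

      from collections import deque

      def reachable(graph, start):
          """All nodes reachable from start in a directed propagation graph."""
          seen, queue = {start}, deque([start])
          while queue:
              for nxt in graph.get(queue.popleft(), []):
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(nxt)
          return seen

      # Invented toy model: a stuck valve should propagate to loss of thrust
      ffm = {"valve_stuck_closed": ["no_fuel_flow"],
             "no_fuel_flow": ["combustion_loss"],
             "combustion_loss": ["loss_of_thrust"]}
      assert "loss_of_thrust" in reachable(ffm, "valve_stuck_closed")
      print("failure-effect propagation path verified")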

  11. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
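
    The flavor of such a translation can be shown in miniature: each structural HDL component becomes a relation over its port values, and a composite device is the conjunction of its parts with internal wires existentially quantified. A toy Python rendering of that HOL style (illustrative only; real HOL definitions live in the theorem prover's logic, not Python):

      # Each gate is a predicate over its port values (relational spec).
      def nand(a, b, out):
          return out == (not (a and b))

      def inverter(a, out):            # a NAND with its inputs tied together
          return nand(a, a, out)

      def and2(a, b, out):
          # Structure: NAND feeding an inverter; the internal wire w is
          # "existentially quantified" by enumerating its possible values.
          return any(nand(a, b, w) and inverter(w, out) for w in (False, True))

      # Exhaustively verify the structure against its behavioral spec
      assert all(and2(a, b, a and b) for a in (False, True) for b in (False, True))
      print("and2 structure meets its specification")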

  12. ESA's X-ray space observatory XMM takes first pictures

    NASA Astrophysics Data System (ADS)

    2000-02-01

    Under the aegis of Prof. Roger Bonnet, ESA Director of Science, the mission's Principal Investigators will be presenting these spectacular first images at a press conference to be held on 9 February at the ESA Vilspa facility at Villafranca/Madrid in Spain, where the XMM Science Operations Centre is located. The event will also be the occasion for several major announcements concerning the XMM mission. In particular Professor Bonnet will launch the third XMM competition "Stargazing" - previously announced in September 1999. This will address European youngsters, 16 to 18 years old, who will be offered the unique opportunity of winning observing time using the X-ray telescope. Commissioning phase starts After a successful launch from Kourou on Ariane 504 on 10 December 1999, XMM was brought to its final operational orbit in the following week. The telescope doors on the X-ray Mirror Modules and on the Optical Monitor telescope were opened on 17/18 December. The Radiation Monitor was activated on 19 December and the spacecraft was put into a quiet mode over the Christmas and New Year period. The mission's scientific data is being received, processed and dispatched to astronomers by the XMM Science Operations Centre in Villafranca. Operations with the spacecraft restarted there on 4 January when, as part of the commissioning phase, all the science payloads were switched on one after the other for initial verifications. By the week of 17 January functional tests had begun on the Optical Monitor, the EPIC pn, the two EPIC MOS and the two RGS instruments. The internal doors of the EPIC cameras were opened whilst keeping the camera filter wheels closed. Astounding first images After a series of engineering exposures, all three EPIC cameras were used in turn, between 19-24 January, to take several views of two different extragalactic regions of the Universe. These views, featuring a variety of extended and X-ray point sources, were chosen to demonstrate the full

  13. Prototyping a Global Soft X-ray Imaging Instrument for Heliophysics, Planetary Science, and Astrophysics Science

    NASA Technical Reports Server (NTRS)

    Collier, Michael R.; Porter, F. Scott; Sibeck, David G.; Carter, Jenny A.; Chiao, Meng P.; Chornay, Dennis J.; Cravens, Thomas; Galeazzi, Massimiliano; Keller, John W.; Koutroumpa, Dimitra; ...

    2012-01-01

    We describe current progress in the development of a prototype wide field-of-view soft X-ray imager that employs Lobster-eye optics and targets heliophysics, planetary, and astrophysics science. The prototype will provide proof-of-concept for a future flight instrument capable of imaging the entire dayside magnetosheath from outside the magnetosphere. Such an instrument was proposed for the ESA AXIOM mission.

  14. Prototyping a Global Soft X-Ray Imaging Instrument for Heliophysics, Planetary Science, and Astrophysics Science

    NASA Technical Reports Server (NTRS)

    Collier, M. R.; Porter, F. S.; Sibeck, D. G.; Carter, J. A.; Chiao, M. P.; Chornay, D. J.; Cravens, T.; Galeazzi, M.; Keller, J. W.; Koutroumpa, D.; ...

    2012-01-01

    We describe current progress in the development of a prototype wide field-of-view soft X-ray imager that employs Lobster-eye optics and targets heliophysics, planetary, and astrophysics science. The prototype will provide proof-of-concept for a future flight instrument capable of imaging the entire dayside magnetosheath from outside the magnetosphere. Such an instrument was proposed for the ESA AXIOM mission.

  15. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Elvatech, Ltd. ElvaX (ElvaX) x-ray fluorescence (XRF) analyzer, distributed in the United States by Xcalibur XRF Services (Xcalibur), was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ElvaX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ElvaX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument's accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as s

  16. Understanding the evolution of anomalous anharmonicity in Bi2Te3-xSex

    DOE PAGES

    Tian, Yao; Jia, Shuang; Cava, R. J.; ...

    2017-03-08

    The anharmonic effect in thermoelectrics has been a central topic for decades in both condensed matter physics and materials science. However, despite the long-believed strong and complex anharmonicity in the Bi2Te3-xSex series, experimental verification of anharmonicity and its evolution with doping remains elusive. We fill this important gap with high-resolution, temperature-dependent Raman spectroscopy in high-quality single crystals of Bi2Te3, Bi2Te2Se, and Bi2Se3 over the temperature range from 4 to 293 K. Klemens's model was employed to explain the renormalization of their phonon linewidths. The phonon energies of Bi2Se3 and Bi2Te3 are analyzed in detail from three aspects: lattice expansion, cubic anharmonicity, and quartic anharmonicity. For the first time, we explain the evolution of anharmonicity in various phonon modes and across the series. Lastly, we find that the interplay between cubic and quartic anharmonicity is governed by their distinct dependence on the phonon density of states, providing insights for the design of new thermoelectrics with tailored anharmonicity.

  17. Six Years of Science with the Chandra X-Ray Observatory

    NASA Technical Reports Server (NTRS)

    Weisskopf, Martin

    2005-01-01

    The Chandra X-ray Observatory had its origins in a 1963 proposal led by Riccardo Giacconi that called for a 1-meter diameter, 1-arcsecond class X-ray telescope for studying the Universe in X-rays. We will briefly discuss the history of the mission, the development of the hardware, its testing, and the launch on July 23, 1999. The remainder of the talk will be an admittedly eclectic review of some of the most exciting scientific highlights. These include the detection and identification of the first source seen with Chandra - an unusual Seyfert 1 we nicknamed Leon X-1 - the detailed study of the Crab Nebula and its pulsar, and spectacular images of other supernova remnants including a 1-million-second exposure of Cas A. We will also summarize some of the major Chandra findings for normal and active galaxies, and we will illustrate the breadth of science enabled by Chandra observations of clusters of galaxies and their implications for cosmology.

  18. Draft Plan for Characterizing Commercial Data Products in Support of Earth Science Research

    NASA Technical Reports Server (NTRS)

    Ryan, Robert E.; Terrie, Greg; Berglund, Judith

    2006-01-01

    This presentation introduces a draft plan for characterizing commercial data products for Earth science research. The general approach to commercial product verification and validation includes the focused selection of readily available commercial remote sensing products that support Earth science research. Ongoing product verification and characterization will question whether a product meets its specifications and will examine its fundamental properties, potential and limitations. Validation will encourage product evaluation for specific science research and applications. The products covered by the characterization plan include high-spatial-resolution multispectral (HSMS) imagery and LIDAR data products. Future efforts in this process will include briefing NASA Headquarters and modifying plans based on feedback, increased engagement with the science community and refinement of details, coordination with commercial vendors and the Joint Agency Commercial Imagery Evaluation (JACIE) for HSMS satellite acquisitions, acquiring waveform LIDAR data, and performing verification and validation.

  19. Novice Shooters With Lower Pre-shooting Alpha Power Have Better Performance During Competition in a Virtual Reality Scenario.

    PubMed

    Pereira, Michael; Argelaguet, Ferran; Millán, José Del R; Lécuyer, Anatole

    2018-01-01

    Competition changes the environment for athletes. The difficulty of training for such stressful events can lead to the well-known effect of "choking" under pressure, which prevents athletes from performing at their best level. To study the effect of competition on the human brain, we recorded pilot electroencephalography (EEG) data while novice shooters were immersed in a realistic virtual environment representing a shooting range. We found a differential between-subject effect of competition on mu (8-12 Hz) oscillatory activity during aiming; compared to training, the more the subject was able to desynchronize his mu rhythm during competition, the better was his shooting performance. Because this differential effect could not be explained by differences in simple measures of the kinematics and muscular activity, nor by the effect of competition or shooting performance per se, we interpret our results as evidence that mu desynchronization has a positive effect on performance during competition.

  20. [Theories of evolution shaping Victorian anthropology. The science-politics of the X-Club, 1860-1872].

    PubMed

    Gondermann, Thomas

    2008-01-01

    This paper discusses the role that a group of evolutionists, the X-Club, played in the epistemic and institutional transformation of Victorian anthropology in the 1860s. It analyses how anthropology has been brought into line with the theory of evolution, which gained currency at the same time. The X-Club was a highly influential pressure group in the Victorian scientific community. It campaigned for the theory of evolution in several fields of the natural sciences and had a considerable influence on the modernization of the sciences. Yet, this club also intervened in the anthropological discourse of these years. The X-Club's meddling with anthropology led to the latter's evolutionary turn. The introduction of an evolutionary agenda into Victorian anthropology depended not only on the X-Club's theoretical contributions but also on the structural reformation of the discipline. Its campaigns also aimed at marginalizing the proponents of pre-evolutionary anthropology in its institutions and led to the foundation of a new organization in anthropology: The Anthropological Institute of Great Britain and Ireland. Thus, evolutionary anthropology emerged in the 1860s also as the result of science-politicking rather than just from the transmission of evolutionary concepts through discourse.

  1. Longer you play, the more hostile you feel: examination of first person shooter video games and aggression during video game play.

    PubMed

    Barlett, Christopher P; Harris, Richard J; Baldassaro, Ross

    2007-01-01

    This study investigated the effects of video game play on aggression. Using the General Aggression Model, as applied to video games by Anderson and Bushman [2002], this study measured physiological arousal, state hostility, and how aggressively participants would respond to three hypothetical scenarios. In addition, this study measured each of these variables multiple times to gauge how aggression would change with increased video game play. Results showed a significant increase from baseline in hostility and aggression (based on two of the three story stems), which is consistent with the General Aggression Model. This study adds to the existing literature on video games and aggression by showing that increased play of a violent first person shooter video game can significantly increase aggression from baseline.

  2. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, consisting of a master data-verification program with multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)

  3. Playing shooter and driving videogames improves top-down guidance in visual search.

    PubMed

    Wu, Sijing; Spence, Ian

    2013-05-01

    Playing action videogames is known to improve visual spatial attention and related skills. Here, we showed that playing action videogames also improves classic visual search, as well as the ability to locate targets in a dual search that mimics certain aspects of an action videogame. In Experiment 1A, first-person shooter (FPS) videogame players were faster than nonplayers in both feature search and conjunction search, and in Experiment 1B, they were faster and more accurate in a peripheral search and identification task while simultaneously performing a central search. In Experiment 2, we showed that 10 h of play could improve the performance of nonplayers on each of these tasks. Three different genres of videogames were used for training: two action games and a 3-D puzzle game. Participants who played an action game (either an FPS or a driving game) achieved greater gains on all search tasks than did those who trained using the puzzle game. Feature searches were faster after playing an action videogame, suggesting that players developed a better target template to guide search in a top-down manner. The results of the dual search suggest that, in addition to enhancing the ability to divide attention, playing an action game improves the top-down guidance of attention to possible target locations. The results have practical implications for the development of training tools to improve perceptual and cognitive skills.

  4. SpaceX CRS-14 What's On Board Science Briefing

    NASA Image and Video Library

    2018-04-01

    From left, Pete Hasbrook, associate program scientist, International Space Station Program at NASA's Johnson Space Center in Houston; Craig Kundrot, director, NASA's Space Life and Physical Science Research and Applications; Marie Lewis, moderator, Kennedy Space Center; and Patrick O'Neill, Marketing and Communications Manager, Center for the Advancement of Science in Space, speak to members of the media in the Kennedy Space Center Press Site auditorium. The briefing focused on research planned for launch to the International Space Station. The scientific materials and supplies will be aboard a Dragon spacecraft scheduled for liftoff from Cape Canaveral Air Force Station's Space Launch Complex 40 at 4:30 p.m. EST on April 2, 2018. The SpaceX Falcon 9 rocket will launch the company's 14th Commercial Resupply Services mission to the space station.

  5. Influence of the Redundant Verification and the Non-Redundant Verification on the Hydraulic Tomography

    NASA Astrophysics Data System (ADS)

    Wei, T. B.; Chen, Y. L.; Lin, H. R.; Huang, S. Y.; Yeh, T. C. J.; Wen, J. C.

    2016-12-01

    In groundwater studies, hydraulic tomography (HT) based on field-site pumping tests has been used by many researchers to estimate the heterogeneous spatial distribution of aquifer hydraulic properties, and their work has shown that most field-site aquifers exhibit heterogeneous spatial distributions of hydrogeological parameters. Huang et al. [2011] applied non-redundant verification analysis, in which the pumping-well locations are changed while the observation-well locations are fixed, to both inverse and forward modeling, demonstrating the feasibility of estimating the heterogeneous spatial distribution of hydraulic properties of a field-site aquifer with a steady-state model. The existing literature, however, covers only steady-state, non-redundant verification with changed pumping wells and fixed observation wells. The various combinations of fixed or changed pumping-well locations with fixed observation wells (redundant verification) or changed observation wells (non-redundant verification), and their influence on hydraulic tomography, have not yet been explored. In this study, we carried out both redundant and non-redundant verification in forward analysis to examine their influence on hydraulic tomography under transient conditions. We apply the methods to an actual case at the NYUST campus site, demonstrating the effectiveness of hydraulic tomography and confirming the feasibility of the inverse and forward analyses. Keywords: Hydraulic Tomography, Redundant Verification, Heterogeneous, Inverse, Forward

  6. redMaGiC: Selecting luminous red galaxies from the DES Science Verification data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rozo, E.; Rykoff, E. S.; Abate, A.

    Here, we introduce redMaGiC, an automated algorithm for selecting luminous red galaxies (LRGs). The algorithm was specifically developed to minimize photometric redshift uncertainties in photometric large-scale structure studies. redMaGiC achieves this by self-training the colour cuts necessary to produce a luminosity-thresholded LRG sample of constant comoving density. We demonstrate that redMaGiC photo-zs are very nearly as accurate as the best machine learning-based methods, yet they require minimal spectroscopic training, do not suffer from extrapolation biases, and are very nearly Gaussian. We apply our algorithm to Dark Energy Survey (DES) Science Verification (SV) data to produce a redMaGiC catalogue sampling the redshift range z ∈ [0.2, 0.8]. Our fiducial sample has a comoving space density of 10^-3 (h^-1 Mpc)^-3, and a median photo-z bias (z_spec - z_photo) and scatter (σ_z/(1 + z)) of 0.005 and 0.017, respectively. The corresponding 5σ outlier fraction is 1.4 per cent. We also test our algorithm with Sloan Digital Sky Survey Data Release 8 and Stripe 82 data, and discuss how spectroscopic training can be used to control photo-z biases at the 0.1 per cent level.

  7. redMaGiC: Selecting luminous red galaxies from the DES Science Verification data

    DOE PAGES

    Rozo, E.; Rykoff, E. S.; Abate, A.; ...

    2016-05-30

    Here, we introduce redMaGiC, an automated algorithm for selecting luminous red galaxies (LRGs). The algorithm was specifically developed to minimize photometric redshift uncertainties in photometric large-scale structure studies. redMaGiC achieves this by self-training the colour cuts necessary to produce a luminosity-thresholded LRG sample of constant comoving density. We demonstrate that redMaGiC photo-zs are very nearly as accurate as the best machine learning-based methods, yet they require minimal spectroscopic training, do not suffer from extrapolation biases, and are very nearly Gaussian. We apply our algorithm to Dark Energy Survey (DES) Science Verification (SV) data to produce a redMaGiC catalogue sampling the redshift range z ∈ [0.2, 0.8]. Our fiducial sample has a comoving space density of 10^-3 (h^-1 Mpc)^-3, and a median photo-z bias (z_spec - z_photo) and scatter (σ_z/(1 + z)) of 0.005 and 0.017, respectively. The corresponding 5σ outlier fraction is 1.4 per cent. We also test our algorithm with Sloan Digital Sky Survey Data Release 8 and Stripe 82 data, and discuss how spectroscopic training can be used to control photo-z biases at the 0.1 per cent level.
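
    The photo-z quality numbers quoted above (median bias, scatter, and 5σ outlier fraction) can be computed along the following lines. This is a minimal sketch: the function name is invented, and the robust (MAD-based) scatter estimator is an assumption, since the abstract does not specify which estimator was used.

        import numpy as np

        def photoz_metrics(z_spec, z_photo):
            """Median bias (z_spec - z_photo), scatter sigma_z/(1+z), and
            5-sigma outlier fraction; MAD-based scatter is an assumption."""
            dz = np.asarray(z_spec) - np.asarray(z_photo)
            bias = np.median(dz)
            scaled = dz / (1.0 + np.asarray(z_spec))
            mad = np.median(np.abs(scaled - np.median(scaled)))
            sigma = 1.4826 * mad  # MAD -> Gaussian-equivalent sigma
            outliers = np.mean(np.abs(scaled - np.median(scaled)) > 5.0 * sigma)
            return bias, sigma, outliers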

  8. Submicron x-ray diffraction and its applications to problems in materials and environmental science

    NASA Astrophysics Data System (ADS)

    Tamura, N.; Celestre, R. S.; MacDowell, A. A.; Padmore, H. A.; Spolenak, R.; Valek, B. C.; Meier Chang, N.; Manceau, A.; Patel, J. R.

    2002-03-01

    The availability of high brilliance third generation synchrotron sources together with progress in achromatic focusing optics allows us to add submicron spatial resolution to the conventional, century-old x-ray diffraction technique. The new capabilities include the possibility to map, in situ, grain orientations, crystalline phase distribution, and full strain/stress tensors at a very local level, by combining white and monochromatic x-ray microbeam diffraction. This is particularly relevant for high technology industry, where the understanding of material properties at a microstructural level becomes increasingly important. After describing the latest advances in the submicron x-ray diffraction techniques at the Advanced Light Source, we will give some examples of its application in materials science for the measurement of strain/stress in metallic thin films and interconnects. Its use in the field of environmental science will also be discussed.

  9. Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.

  10. Verification of a Viscous Computational Aeroacoustics Code Using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
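
    As a point of comparison for the EVA approach described above, here is a minimal sketch of the Method of Manufactured Solutions (MMS) idea that the papers contrast it with: choose a smooth solution, derive the source term that makes it exact, and check that the solver converges at its formal order. The equation and the manufactured solution below are arbitrary illustrative choices, not the papers' test cases.

        import sympy as sp

        # MMS for the 1-D advection equation u_t + a u_x = S(x, t):
        # pick a smooth "manufactured" u, then derive the source S that
        # makes it an exact solution.
        x, t = sp.symbols('x t')
        a = sp.Rational(1, 2)
        u_manufactured = sp.sin(x - t) * sp.exp(-t)
        S = sp.diff(u_manufactured, t) + a * sp.diff(u_manufactured, x)
        print(sp.simplify(S))
        # Adding S to the solver's right-hand side, the discrete solution must
        # converge to u_manufactured at the scheme's formal order of accuracy.
        # EVA, by contrast, verifies the unmodified equations without such
        # added source terms.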

  11. SpaceX CRS-10 "What's On Board" Science Briefing

    NASA Image and Video Library

    2017-02-17

    Speaking to members of the media in the Kennedy Space Center's Press Site auditorium, Dr. Michael Freilich of the Earth Science Division at NASA Headquarters in Washington, D.C., left, and Dr. Richard Blakeslee of NASA's Marshall Space Flight Center in Huntsville, Alabama, discussed instruments to be delivered to the International Space Station on the SpaceX CRS-10 mission. The Lightning Imaging Sensor (LIS) is to measure the amount, rate and energy of lightning around the world. The SAGE III instrument is designed to study ozone in the atmosphere. A Dragon spacecraft is scheduled to be launched from Kennedy's Launch Complex 39A on Feb. 18 atop a SpaceX Falcon 9 rocket on the company's 10th Commercial Resupply Services mission to the space station.

  12. Bringing Hands-on Activities and Real Scientists to Students: Bishop Museum's X-treme Science Exhibit, Holoholo Science Program, and Planned Science Learning Center

    NASA Astrophysics Data System (ADS)

    Hills, D. J.; Fullerton, K.; Hoddick, C.; Ali, N.; Mosher, M. K.

    2002-12-01

    Bishop Museum developed the "X-treme Science: Exploring Oceans, Volcanoes, and Outer Space" museum exhibit in conjunction with NASA as part of their goal to increase educational outreach. A key element of the exhibit was the inclusion of real scientists describing what they do, and the fostering of interaction between scientists and students. Highlights of the exhibit were interviews with local (Hawaii-based) scientists involved in current ocean, volcano, and space research. These interviews were based on questions that students provided, and were available during the exhibit at interactive kiosks. Lesson plans were developed by local teachers and scientists, and provided online to enhance the exhibit. However, one limitation of the museum exhibit was that not all students in the state could visit, or spend enough time with it. To serve more remote schools, and to provide additional enrichment for those who did attend, the education department at Bishop Museum developed a traveling program with the X-treme Science exhibit as the basis. The Holoholo (Hawaiian for "fun outing") Science program brings a scientist into the classroom with a hands-on scientific inquiry activity. The activity is usually a simplified version of a problem that the scientist actually deals with. The students explore the activity, reach conclusions, and discuss their results. They are then given the opportunity to question the scientist about the activity and about what the scientist does. This allows students to understand that science is not something mystical, but rather something attainable. A key element of Holoholo remains the active participation of real-life scientists in the experience. The scientists who have participated in the program have had overwhelmingly positive experiences. Bishop Museum is developing a science learning center, with the objective of meeting local and national science standards using inquiry-based science. The unifying theme of all three of these projects is

  13. Real-Time Verification of a High-Dose-Rate Iridium 192 Source Position Using a Modified C-Arm Fluoroscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nose, Takayuki, E-mail: nose-takayuki@nms.ac.jp; Chatani, Masashi; Otani, Yuki

    Purpose: High-dose-rate (HDR) brachytherapy misdeliveries can occur at any institution, and they can cause disastrous results. Even a patient's death has been reported. Misdeliveries could be avoided with real-time verification methods. In 1996, we developed a modified C-arm fluoroscopic verification of an HDR Iridium 192 source position to prevent these misdeliveries. This method provided excellent image quality sufficient to detect errors, and it has been in clinical use at our institutions for 20 years. The purpose of the current study is to introduce the mechanisms and validity of our straightforward C-arm fluoroscopic verification method. Methods and Materials: Conventional X-ray fluoroscopic images are degraded by spurious signals and quantum noise from Iridium 192 photons, which make source verification impractical. To improve image quality, we quadrupled the C-arm fluoroscopic X-ray dose per pulse. The pulse rate was reduced by a factor of 4 to keep the average exposure compliant with Japanese medical regulations. The images were then displayed with quarter-frame rates. Results: Sufficient quality was obtained to enable observation of the source position relative to both the applicators and the anatomy. With this method, 2 errors were detected among 2031 treatment sessions for 370 patients within a 6-year period. Conclusions: With the use of a modified C-arm fluoroscopic verification method, treatment errors that were otherwise overlooked were detected in real time. This method should be given consideration for widespread use.

  14. Real-Time Verification of a High-Dose-Rate Iridium 192 Source Position Using a Modified C-Arm Fluoroscope.

    PubMed

    Nose, Takayuki; Chatani, Masashi; Otani, Yuki; Teshima, Teruki; Kumita, Shinichirou

    2017-03-15

    High-dose-rate (HDR) brachytherapy misdeliveries can occur at any institution, and they can cause disastrous results. Even a patient's death has been reported. Misdeliveries could be avoided with real-time verification methods. In 1996, we developed a modified C-arm fluoroscopic verification of an HDR Iridium 192 source position to prevent these misdeliveries. This method provided excellent image quality sufficient to detect errors, and it has been in clinical use at our institutions for 20 years. The purpose of the current study is to introduce the mechanisms and validity of our straightforward C-arm fluoroscopic verification method. Conventional X-ray fluoroscopic images are degraded by spurious signals and quantum noise from Iridium 192 photons, which make source verification impractical. To improve image quality, we quadrupled the C-arm fluoroscopic X-ray dose per pulse. The pulse rate was reduced by a factor of 4 to keep the average exposure compliant with Japanese medical regulations. The images were then displayed with quarter-frame rates. Sufficient quality was obtained to enable observation of the source position relative to both the applicators and the anatomy. With this method, 2 errors were detected among 2031 treatment sessions for 370 patients within a 6-year period. With the use of a modified C-arm fluoroscopic verification method, treatment errors that were otherwise overlooked were detected in real time. This method should be given consideration for widespread use.

  15. Designing the Social Context for Easier Verification, Validation, and Uncertainty Quantification of Earth Science Data

    NASA Astrophysics Data System (ADS)

    Barkstrom, B. R.; Loeb, N. G.; Wielicki, B. A.

    2017-12-01

    Verification, Validation, and Uncertainty Quantification (VVUQ) are key actions that support conclusions based on Earth science data. Communities of data producers and users must undertake VVUQ when they create and use their data. The strategies [S] and tools [T] listed below come from successful use on two large NASA projects: the Earth Radiation Budget Experiment (ERBE) and the investigation of Clouds and the Earth's Radiant Energy System (CERES).
    [S] 1. Partition the production system into subsystems that deal with data transformations confined to limited space and time scales. Simplify the subsystems to minimize the number of data transformations in each subsystem.
    [S] 2. Derive algorithms from the fundamental physics and chemistry governing the parameters in each subsystem, including those for instrument calibration.
    [S] 3. Use preliminary uncertainty estimates to detect unexpected discrepancies. Removing these requires diagnostic work as well as development and testing of fixes.
    [S] 4. Make sure there are adequate resources to support multiple end-to-end reprocessings of all data products.
    [T] 1. Create file identifiers that accommodate temporal and spatial sequences of data files and subsystem version changes (see the sketch after this record).
    [T] 2. Create libraries of parameters used in common by different subsystems to reduce errors due to inconsistent values.
    [T] 3. Maintain a list of action items to record progress on resolving discrepancies.
    [T] 4. Plan on VVUQ activities that use independent data sources and peer review before distributing and archiving data.
    The goal of VVUQ is to provide a transparent link between the data and the physics and chemistry governing the measured quantities. The VVUQ effort also involves specialized domain experience and nomenclature. It often requires as much effort as the original system development. ERBE and CERES demonstrated that these strategies and tools can reduce the cost of VVUQ for Earth science data products.
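
    A hypothetical file-identifier scheme in the spirit of tool [T] 1 might look as follows; the field layout, product name, and granule code are invented for illustration and are not the ERBE/CERES convention.

        from datetime import datetime

        # Hypothetical identifier encoding product, spatial granule, time
        # coverage, and subsystem version, so files sort naturally and
        # reprocessed versions never collide (layout is an invented example).
        def make_file_id(product, granule, start, version):
            return f"{product}_{granule}_{start:%Y%m%dT%H%M}_v{version:03d}.nc"

        print(make_file_id("CERES-SSF", "G042", datetime(2017, 12, 1, 6, 0), 4))
        # CERES-SSF_G042_20171201T0600_v004.nc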

  16. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    PubMed

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  17. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations... concentration subcomponents (e.g., THC and CH4 for NMHC) separately. For example, for NMHC measurements, perform drift verification on NMHC; do not verify THC and CH4 separately. (2) Drift verification requires two...

  18. Comparative evaluation of Kodak EDR2 and XV2 films for verification of intensity modulated radiation therapy.

    PubMed

    Dogan, Nesrin; Leybovich, Leonid B; Sethi, Anil

    2002-11-21

    Film dosimetry provides a convenient tool to determine dose distributions, especially for verification of IMRT plans. However, the film response to radiation shows a significant dependence on depth, energy and field size that compromises the accuracy of measurements. Kodak's XV2 film has a low saturation dose (approximately 100 cGy) and, consequently, a relatively short region of linear dose-response. The recently introduced Kodak extended range EDR2 film was reported to have a linear dose-response region extending to 500 cGy. This increased dose range may be particularly useful in the verification of IMRT plans. In this work, the dependence of Kodak EDR2 film's response on depth, field size and energy was evaluated and compared with Kodak XV2 film. Co-60, 6 MV, 10 MV and 18 MV beams were used. Field sizes were 2 x 2, 6 x 6, 10 x 10, 14 x 14, 18 x 18 and 24 x 24 cm2. Doses for XV2 and EDR2 films were 80 cGy and 300 cGy, respectively. Optical density was converted to dose using depth-corrected sensitometric (Hurter and Driffield, or H&D) curves. For each field size, XV2 and EDR2 depth-dose curves were compared with ion chamber depth-dose curves. Both films demonstrated similar (within 1%) field size dependence. The deviation from the ion chamber for both films was small for the fields ranging from 2 x 2 to 10 x 10 cm2: ≤2% for 6, 10 and 18 MV beams. No deviation was observed for the Co-60 beam. As the field size increased to 24 x 24 cm2, the deviation became significant for both films: approximately 7.5% for Co-60, approximately 5% for 6 MV and 10 MV, and approximately 6% for 18 MV. During the verification of IMRT plans, EDR2 film showed better agreement with the calculated dose distributions than the XV2 film.
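
    A depth-corrected sensitometric (H&D) conversion of the kind described above can be sketched as a simple calibration fit; the calibration points and the polynomial form below are invented for illustration, not values from the study.

        import numpy as np

        # Invented H&D calibration points: net optical density vs. dose (cGy).
        cal_od = np.array([0.10, 0.45, 0.90, 1.40, 1.80])
        cal_dose = np.array([20.0, 80.0, 160.0, 300.0, 450.0])

        # Low-order fit of dose as a function of optical density; in practice
        # one such curve would be built per depth and field size.
        coeffs = np.polyfit(cal_od, cal_dose, deg=3)

        def od_to_dose(od):
            """Map measured net optical density to dose (cGy)."""
            return np.polyval(coeffs, od)

        print(round(od_to_dose(1.1), 1))  # dose for a region with net OD = 1.1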

  19. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements raise a number of issues and uncertainties that require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  20. Monthly and seasonally verification of precipitation in Poland

    NASA Astrophysics Data System (ADS)

    Starosta, K.; Linkowska, J.

    2009-04-01

    The national meteorological service of Poland - the Institute of Meteorology and Water Management (IMWM) - joined COSMO, the Consortium for Small-Scale Modelling, in July 2004. In Poland, the COSMO_PL model version 3.5 ran until June 2007; since July 2007, model version 4.0 has been running. The model runs in an operational mode at 14-km grid spacing, twice a day (00 UTC, 12 UTC). For scientific research, a model with 7-km grid spacing is also run. Monthly and seasonal verification of the 24-hour (06 UTC - 06 UTC) accumulated precipitation is presented in this paper. The precipitation field of COSMO_LM was verified against the rain-gauge network (308 points). Verification was performed for every month and all seasons from December 2007 to December 2008, for three forecast days and for selected thresholds: 0.5, 1, 2.5, 5, 10, 20, 25, 30 mm. The following indices from the contingency table were calculated: FBI (frequency bias), POD (probability of detection), PON (probability of detection of non-events), FAR (false alarm ratio), TSS (true skill statistic), HSS (Heidke skill score) and ETS (equitable threat score). Percentile ranks and the relative operating characteristic (ROC) are also presented. The ROC is a graph of the hit rate (Y-axis) against the false alarm rate (X-axis) for different decision thresholds.

  1. Monthly and seasonally verification of precipitation in Poland

    NASA Astrophysics Data System (ADS)

    Starosta, K.; Linkowska, J.

    2009-04-01

    The national meteorological service of Poland - the Institute of Meteorology and Water Management (IMWM) - joined COSMO, the Consortium for Small-Scale Modelling, in July 2004. In Poland, the COSMO_PL model version 3.5 ran until June 2007; since July 2007, model version 4.0 has been running. The model runs in an operational mode at 14-km grid spacing, twice a day (00 UTC, 12 UTC). For scientific research, a model with 7-km grid spacing is also run. Monthly and seasonal verification of the 24-hour (06 UTC - 06 UTC) accumulated precipitation is presented in this paper. The precipitation field of COSMO_LM was verified against the rain-gauge network (308 points). Verification was performed for every month and all seasons from December 2007 to December 2008, for three forecast days and for selected thresholds: 0.5, 1, 2.5, 5, 10, 20, 25, 30 mm. The following indices from the contingency table were calculated: FBI (frequency bias), POD (probability of detection), PON (probability of detection of non-events), FAR (false alarm ratio), TSS (true skill statistic), HSS (Heidke skill score) and ETS (equitable threat score). Percentile ranks and the relative operating characteristic (ROC) are also presented. The ROC is a graph of the hit rate (Y-axis) against the false alarm rate (X-axis) for different decision thresholds.
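
    The contingency-table indices listed above can be computed directly from the four cell counts; a minimal sketch follows. Note that definitions of FAR vary between "ratio" and "rate" in the verification literature; the ratio form is assumed here, and the counts are invented.

        def verification_indices(hits, false_alarms, misses, correct_negatives):
            """Contingency-table scores for one precipitation threshold:
            a = hits, b = false alarms, c = misses, d = correct negatives."""
            a, b, c, d = map(float, (hits, false_alarms, misses, correct_negatives))
            n = a + b + c + d
            a_random = (a + b) * (a + c) / n  # hits expected by chance
            return {
                "FBI": (a + b) / (a + c),      # frequency bias index
                "POD": a / (a + c),            # probability of detection
                "PON": d / (b + d),            # probability of detecting non-events
                "FAR": b / (a + b),            # false alarm ratio (assumed definition)
                "TSS": a / (a + c) - b / (b + d),
                "HSS": 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d)),
                "ETS": (a - a_random) / (a + b + c - a_random),
            }

        print(verification_indices(hits=42, false_alarms=18, misses=12,
                                   correct_negatives=928))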

  2. Optimal sensitometric curves of Kodak EDR2 film for dynamic intensity modulated radiation therapy verification.

    PubMed

    Suriyapee, S; Pitaxtarnin, N; Oonsiri, S; Jumpangern, C; Israngkul Na Ayuthaya, I

    2008-01-01

    To investigate the optimal sensitometric curves of extended dose range (EDR2) radiographic film in terms of depth, field size, dose range and processing conditions for dynamic intensity modulated radiation therapy (IMRT) dosimetry verification with 6 MV X-ray beams. A Varian Clinac 23 EX linear accelerator with 6 MV X-ray beam was used to study the response of Kodak EDR2 film. Measurements were performed at depths of 5, 10 and 15 cm in MedTec virtual water phantom and with field sizes of 2x2, 3x3, 10x10 and 15x15 cm(2). Doses ranging from 20 to 450 cGy were used. The film was developed with the Kodak RP X-OMAT Model M6B automatic film processor. Film response was measured with the Vidar model VXR-16 scanner. Sensitometric curves were applied to the dose profiles measured with film at 5 cm in the virtual water phantom with field sizes of 2x2 and 10x10 cm(2) and compared with ion chamber data. Scanditronix/Wellhofer OmniPro(TM) IMRT software was used for the evaluation of the IMRT plan calculated by Eclipse treatment planning. Investigation of the reproducibility and accuracy of the film responses, which depend mainly on the film processor, was carried out by irradiating one film nine times with doses of 20 to 450 cGy. A maximum standard deviation of 4.9% was found, which decreased to 1.9% for doses between 20 and 200 cGy. The sensitometric curves for various field sizes at fixed depth showed a maximum difference of 4.2% between 2x2 and 15x15 cm(2) at 5 cm depth with a dose of 450 cGy. The shallow depth tended to show a greater field size effect than the deeper depths. The sensitometric curves for various depths at fixed field size showed slightly different film responses; the difference due to depth was within 1.8% for all field sizes studied. Both the field size and depth effects were reduced when doses were lower than 450 cGy. The difference was within 2.5% in the dose range from 20 to 300 cGy for all field sizes and depths studied. Dose profiles

  3. The Mars Science Laboratory Organic Check Material

    NASA Astrophysics Data System (ADS)

    Conrad, Pamela G.; Eigenbrode, Jennifer L.; Von der Heydt, Max O.; Mogensen, Claus T.; Canham, John; Harpold, Dan N.; Johnson, Joel; Errigo, Therese; Glavin, Daniel P.; Mahaffy, Paul R.

    2012-09-01

    Mars Science Laboratory's Curiosity rover carries a set of five external verification standards in hermetically sealed containers that can be sampled as would be a Martian rock, by drilling and then portioning into the solid sample inlet of the Sample Analysis at Mars (SAM) suite. Each organic check material (OCM) canister contains a porous ceramic solid, which has been doped with a fluorinated hydrocarbon marker that can be detected by SAM. The purpose of the OCM is to serve as a verification tool for the organic cleanliness of those parts of the sample chain that cannot be cleaned other than by dilution, i.e., repeated sampling of Martian rock. SAM possesses internal calibrants for verification of both its performance and its internal cleanliness, and the OCM is not used for that purpose. Each OCM unit is designed for one use only, and the choice to do so will be made by the project science group (PSG).

  4. Novice Shooters With Lower Pre-shooting Alpha Power Have Better Performance During Competition in a Virtual Reality Scenario

    PubMed Central

    Pereira, Michael; Argelaguet, Ferran; Millán, José del R.; Lécuyer, Anatole

    2018-01-01

    Competition changes the environment for athletes. The difficulty of training for such stressful events can lead to the well-known effect of “choking” under pressure, which prevents athletes from performing at their best level. To study the effect of competition on the human brain, we recorded pilot electroencephalography (EEG) data while novice shooters were immersed in a realistic virtual environment representing a shooting range. We found a differential between-subject effect of competition on mu (8–12 Hz) oscillatory activity during aiming; compared to training, the more the subject was able to desynchronize his mu rhythm during competition, the better was his shooting performance. Because this differential effect could not be explained by differences in simple measures of the kinematics and muscular activity, nor by the effect of competition or shooting performance per se, we interpret our results as evidence that mu desynchronization has a positive effect on performance during competition.

  5. Science verification of operational aerosol and cloud products for TROPOMI on Sentinel-5 precursor

    NASA Astrophysics Data System (ADS)

    Lelli, Luca; Gimeno-Garcia, Sebastian; Sanders, Abram; Sneep, Maarten; Rozanov, Vladimir V.; Kokhanovsky, Alexander A.; Loyola, Diego; Burrows, John P.

    2016-04-01

    With the approaching launch of the Sentinel-5 precursor (S-5P) satellite, scheduled for mid-2016, one preparatory task of the L2 working group (composed of the Institute of Environmental Physics (IUP) Bremen, the Royal Netherlands Meteorological Institute (KNMI) De Bilt, and the German Aerospace Center (DLR) Oberpfaffenhofen) has been the assessment of biases among the aerosol and cloud products that will be inferred by the respective algorithms from measurements of the platform's payload, the TROPOspheric Monitoring Instrument (TROPOMI). The instrument will measure terrestrial radiance at varying moderate spectral resolutions from the ultraviolet through the shortwave infrared. Specifically, all the operational and verification algorithms involved in this comparison exploit the sensitivity of molecular oxygen absorption (the A-band, 755-775 nm, at a resolution of 0.54 nm) to changes in the optical and geometrical parameters of tropospheric scattering layers. The targeted properties are therefore aerosol layer height (ALH) and thickness (AOT), and cloud top height (CTH), thickness (COT) and albedo (CA). First, the verification of these properties was accomplished upon synchronisation of the respective forward radiative transfer models for a variety of atmospheric scenarios. Then, biases against independent techniques were evaluated with real measurements of selected GOME-2 orbits. A global seasonal bias assessment was carried out for CTH, CA and COT, whereas the verification of ALH and AOT is based on analysis of the ash plume emitted by the Icelandic volcano Eyjafjallajökull in May 2010 and on selected dust scenes off the Saharan west coast sensed by SCIAMACHY in 2009.

  6. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of information...

  7. 40 CFR 1065.370 - CLD CO2 and H2O quench verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... collisional quenching, which inhibits the chemiluminescent reaction that a CLD utilizes to detect NOX. This... x_NOwet and use it in the quench verification calculations in § 1065.675. (f) Corrective action. If... action by repairing or replacing the analyzer. Before running emission tests, verify that the corrective...

  8. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  9. Cross-correlation redshift calibration without spectroscopic calibration samples in DES Science Verification Data

    NASA Astrophysics Data System (ADS)

    Davis, C.; Rozo, E.; Roodman, A.; Alarcon, A.; Cawthon, R.; Gatti, M.; Lin, H.; Miquel, R.; Rykoff, E. S.; Troxel, M. A.; Vielzeuf, P.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Doel, P.; Drlica-Wagner, A.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gaztanaga, E.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; Jain, B.; James, D. J.; Jeltema, T.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Kuropatkin, N.; Lahav, O.; Li, T. S.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Ogando, R. L. C.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.; Wechsler, R. H.

    2018-06-01

    Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogues with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of Δz ∼ ±0.01. We forecast that our proposal can, in principle, control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Our results provide strong motivation to launch a programme to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.

  10. Cosmology constraints from shear peak statistics in Dark Energy Survey Science Verification data

    NASA Astrophysics Data System (ADS)

    Kacprzak, T.; Kirk, D.; Friedrich, O.; Amara, A.; Refregier, A.; Marian, L.; Dietrich, J. P.; Suchyta, E.; Aleksić, J.; Bacon, D.; Becker, M. R.; Bonnett, C.; Bridle, S. L.; Chang, C.; Eifler, T. F.; Hartley, W. G.; Huff, E. M.; Krause, E.; MacCrann, N.; Melchior, P.; Nicola, A.; Samuroff, S.; Sheldon, E.; Troxel, M. A.; Weller, J.; Zuntz, J.; Abbott, T. M. C.; Abdalla, F. B.; Armstrong, R.; Benoit-Lévy, A.; Bernstein, G. M.; Bernstein, R. A.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Crocce, M.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Evrard, A. E.; Neto, A. Fausti; Flaugher, B.; Fosalba, P.; Frieman, J.; Gerdes, D. W.; Goldstein, D. A.; Gruen, D.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; Jain, B.; James, D. J.; Jarvis, M.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Miller, C. J.; Miquel, R.; Mohr, J. J.; Nichol, R. C.; Nord, B.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.; Zhang, Y.; DES Collaboration

    2016-12-01

    Shear peak statistics has gained a lot of attention recently as a practical alternative to the two-point statistics for constraining cosmological parameters. We perform a shear peak statistics analysis of the Dark Energy Survey (DES) Science Verification (SV) data, using weak gravitational lensing measurements from a 139 deg2 field. We measure the abundance of peaks identified in aperture mass maps, as a function of their signal-to-noise ratio, in the signal-to-noise range 0 < S/N < 4; peaks with S/N > 4 would require significant corrections, which is why we do not include them in our analysis. We compare our results to the cosmological constraints from the two-point analysis on the SV field and find them to be in good agreement in both the central value and its uncertainty. We discuss prospects for future peak statistics analysis with upcoming DES data.

  11. Cross-correlation redshift calibration without spectroscopic calibration samples in DES Science Verification Data

    DOE PAGES

    Davis, C.; Rozo, E.; Roodman, A.; ...

    2018-03-26

    Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogs with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of Δz ∼ ±0.01. We forecast that our proposal can in principle control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Here, our results provide strong motivation to launch a program to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.

  12. Cross-correlation redshift calibration without spectroscopic calibration samples in DES Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, C.; Rozo, E.; Roodman, A.

    Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogs with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of Δz ∼ ±0.01. We forecast that our proposal can in principle control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Here, our results provide strong motivation to launch a program to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.
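
    The cross-correlation calibration idea can be illustrated with a deliberately simplified toy: take the unknown sample's cross-correlation amplitude with narrow reference redshift bins as proportional to its redshift distribution, dividing out the reference bias via the reference autocorrelation. All numbers below are invented, and the real analysis additionally marginalizes over clustering-bias evolution, which this sketch ignores.

        import numpy as np

        z_mid = np.array([0.25, 0.35, 0.45, 0.55, 0.65, 0.75])       # reference bin centres
        w_ur = np.array([0.004, 0.009, 0.014, 0.011, 0.006, 0.002])  # cross-correlations (made up)
        w_rr = np.array([0.020, 0.021, 0.022, 0.022, 0.023, 0.024])  # reference autocorr (made up)

        # n_u(z) proportional to w_ur / sqrt(w_rr); still carries the unknown
        # sample's own (unmodelled) bias.
        n_u = w_ur / np.sqrt(w_rr)
        n_u /= n_u.sum() * 0.1   # normalize to unit integral (bin width dz = 0.1)
        print(np.round(n_u, 2))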

  13. Earth Science Activities: A Guide to Effective Elementary School Science Teaching.

    ERIC Educational Resources Information Center

    Kanis, Ira B.; Yasso, Warren E.

    The primary emphasis of this book is on new or revised earth science activities that promote concept development rather than mere verification of concepts learned by passive means. Chapter 2 describes philosophies, strategies, methods, and techniques to guide preservice and inservice teachers, school building administrators, and curriculum…

  14. 40 CFR 1065.355 - H2O and CO2 interference verification for CO NDIR analyzers.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... interference verification as follows: (1) Start, operate, zero, and span the CO NDIR analyzer as you would..., and absolute pressure, p_total, to calculate x_H2O. Verify that the water content meets the...

  15. 40 CFR 1065.355 - H2O and CO2 interference verification for CO NDIR analyzers.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... interference verification as follows: (1) Start, operate, zero, and span the CO NDIR analyzer as you would..., and absolute pressure, p_total, to calculate x_H2O. Verify that the water content meets the...

  16. 40 CFR 1065.355 - H2O and CO2 interference verification for CO NDIR analyzers.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... interference verification as follows: (1) Start, operate, zero, and span the CO NDIR analyzer as you would..., and absolute pressure, p_total, to calculate x_H2O. Verify that the water content meets the...

  17. 40 CFR 1065.355 - H2O and CO2 interference verification for CO NDIR analyzers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... interference verification as follows: (1) Start, operate, zero, and span the CO NDIR analyzer as you would..., and absolute pressure, p_total, to calculate x_H2O. Verify that the water content meets the...

  18. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on-line and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivative of RuleR that adds a simple, user-friendly temporal logic. It was developed in Python, specifically to support testing of spacecraft flight software for NASA's 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, so this approach adds no instrumentation overhead. While post-mortem log analysis prevents the autonomous reaction to problems that traditional runtime verification makes possible, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
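
    To give a flavor of the rule-based idea, here is a toy Python monitor for a single data-parameterized obligation over an off-line trace; it only illustrates the concept of rule activations carrying data and is not RuleR's or LogScope's actual specification language.

```python
def monitor(trace):
    """Toy off-line monitor: every open(f) must eventually be followed
    by close(f). Each pending obligation plays the role of a
    data-parameterized rule activation."""
    pending = set()
    for event, arg in trace:
        if event == "open":
            pending.add(arg)        # activate an obligation for this file
        elif event == "close":
            pending.discard(arg)    # obligation discharged
    return [f"open({f}) was never closed" for f in sorted(pending)]

# Off-line analysis of a log-like trace:
print(monitor([("open", "a.txt"), ("open", "b.txt"), ("close", "a.txt")]))
# -> ['open(b.txt) was never closed']
```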

  19. Concept Verification Test - Evaluation of Spacelab/Payload operation concepts

    NASA Technical Reports Server (NTRS)

    Mcbrayer, R. O.; Watters, H. H.

    1977-01-01

    The Concept Verification Test (CVT) procedure is used to study Spacelab operational concepts by conducting mission simulations in a General Purpose Laboratory (GPL) which represents a possible design of Spacelab. In conjunction with the laboratory, a Mission Development Simulator, a Data Management System Simulator, a Spacelab Simulator, and a Shuttle Interface Simulator have been designed. (The Spacelab Simulator is more functionally and physically representative of the Spacelab than the GPL.) Four simulations of Spacelab mission experimentation were performed: two involving several scientific disciplines, one involving life sciences, and the last involving material sciences. The purpose of the CVT project is to support the pre-design and development of payload carriers and payloads, and to coordinate hardware, software, and operational concepts of different developers and users.

  20. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... § 281.213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  1. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... § 281.213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  2. A Practitioners Perspective on Verification

    NASA Astrophysics Data System (ADS)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  3. Methods for identification and verification using vacuum XRF system

    NASA Technical Reports Server (NTRS)

    Kaiser, Bruce (Inventor); Schramm, Fred (Inventor)

    2005-01-01

    Apparatus and methods in which one or more elemental taggants that are intrinsically located in an object are detected by x-ray fluorescence analysis under vacuum conditions to identify or verify the object's elemental content for elements with lower atomic numbers. By using x-ray fluorescence analysis, the apparatus and methods of the invention are simple and easy to use, and they provide detection by a non-line-of-sight method to establish the origin of objects, their point of manufacture, authenticity, verification, security, and the presence of impurities. The invention is extremely advantageous because it provides the capability to measure lower atomic number elements in the field with a portable instrument.

  4. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  5. VEG-01: Veggie Hardware Verification Testing

    NASA Technical Reports Server (NTRS)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware; microbial samples were taken, and plants were harvested, frozen, stored, and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  6. Exploring Middle School Students' Representational Competence in Science: Development and Verification of a Framework for Learning with Visual Representations

    NASA Astrophysics Data System (ADS)

    Tippett, Christine Diane

    -methods verification study that was conducted to refine and validate the theoretical framework. This study examined middle school students' representational competence and focused on students' creation of visual representations such as labelled diagrams, a form of representation commonly found in science information texts and textbooks. An analysis of the 31 Grade 6 participants' representations and semistructured interviews revealed five themes, each of which supports one or more dimensions of the exploratory framework: participants' use of color, participants' choice of representation (form and function), participants' method of planning for representing, participants' knowledge of conventions, and participants' selection of information to represent. Together, the results of these three projects highlight the need for further research on learning with rather than learning from representations.

  7. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Requirements: Eligibility, Screening, Applications, and Enrollment § 457.380 Eligibility verification. (a) The... State may establish reasonable eligibility verification mechanisms to promote enrollment of eligible...

  8. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
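
    For intuition, here is a hedged Python sketch of a labeled verification condition for the Hoare assignment rule {P} x := e {Q}, whose VC is P ==> Q[e/x]; the naive textual substitution and the label format are illustrative simplifications, not the paper's system.

```python
import re

def vc_assign(pre, var, expr, post, label="assign-rule"):
    """VC for {pre} var := expr {post}, i.e. pre ==> post[expr/var].
    The label records which Hoare rule produced the VC, echoing the
    paper's idea of threading explanatory labels through VC generation.
    Note: whole-word textual substitution is a toy stand-in for proper
    capture-avoiding substitution."""
    substituted = re.sub(rf"\b{re.escape(var)}\b", f"({expr})", post)
    return {"vc": f"{pre} ==> {substituted}", "label": label}

print(vc_assign("x >= 0", "x", "x + 1", "x > 0"))
# {'vc': 'x >= 0 ==> (x + 1) > 0', 'label': 'assign-rule'}
```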

  9. Galaxy bias from galaxy–galaxy lensing in the DES science verification data

    DOE PAGES

    Prat, J.; Sánchez, C.; Miquel, R.; ...

    2017-09-25

    Here, we present a measurement of galaxy–galaxy lensing around a magnitude-limited (i_AB < 22.5) sample of galaxies from the dark energy survey science verification (DES-SV) data. We split these lenses into three photometric-redshift bins from 0.2 to 0.8, and determine the product of the galaxy bias b and cross-correlation coefficient between the galaxy and dark matter overdensity fields r in each bin, using scales above 4 h^-1 Mpc comoving, where we find the linear bias model to be valid given our current uncertainties. We compare our galaxy bias results from galaxy–galaxy lensing with those obtained from galaxy clustering and CMB lensing for the same sample of galaxies, and find our measurements to be in good agreement with those in Crocce et al., while, in the lowest redshift bin (z ~ 0.3), they show some tension with the findings in Giannantonio et al. We measure b · r to be 0.87 ± 0.11, 1.12 ± 0.16 and 1.24 ± 0.23, respectively, for the three redshift bins of width Δz = 0.2 in the range 0.2 < z < 0.8, defined with the photometric-redshift algorithm BPZ. Using a different code to split the lens sample, TPZ, leads to changes in the measured biases at the 10–20 per cent level, but it does not alter the main conclusion of this work: when comparing with Crocce et al. we do not find strong evidence for a cross-correlation parameter significantly below one in this galaxy sample, except possibly at the lowest redshift bin (z ~ 0.3), where we find r = 0.71 ± 0.11 when using TPZ, and 0.83 ± 0.12 with BPZ.

  10. Galaxy bias from galaxy–galaxy lensing in the DES science verification data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prat, J.; Sánchez, C.; Miquel, R.

    Here, we present a measurement of galaxy–galaxy lensing around a magnitude-limited (i_AB < 22.5) sample of galaxies from the dark energy survey science verification (DES-SV) data. We split these lenses into three photometric-redshift bins from 0.2 to 0.8, and determine the product of the galaxy bias b and cross-correlation coefficient between the galaxy and dark matter overdensity fields r in each bin, using scales above 4 h^-1 Mpc comoving, where we find the linear bias model to be valid given our current uncertainties. We compare our galaxy bias results from galaxy–galaxy lensing with those obtained from galaxy clustering and CMB lensing for the same sample of galaxies, and find our measurements to be in good agreement with those in Crocce et al., while, in the lowest redshift bin (z ~ 0.3), they show some tension with the findings in Giannantonio et al. We measure b · r to be 0.87 ± 0.11, 1.12 ± 0.16 and 1.24 ± 0.23, respectively, for the three redshift bins of width Δz = 0.2 in the range 0.2 < z < 0.8, defined with the photometric-redshift algorithm BPZ. Using a different code to split the lens sample, TPZ, leads to changes in the measured biases at the 10–20 per cent level, but it does not alter the main conclusion of this work: when comparing with Crocce et al. we do not find strong evidence for a cross-correlation parameter significantly below one in this galaxy sample, except possibly at the lowest redshift bin (z ~ 0.3), where we find r = 0.71 ± 0.11 when using TPZ, and 0.83 ± 0.12 with BPZ.

  11. Galaxy bias from galaxy-galaxy lensing in the DES science verification data

    NASA Astrophysics Data System (ADS)

    Prat, J.; Sánchez, C.; Miquel, R.; Kwan, J.; Blazek, J.; Bonnett, C.; Amara, A.; Bridle, S. L.; Clampitt, J.; Crocce, M.; Fosalba, P.; Gaztanaga, E.; Giannantonio, T.; Hartley, W. G.; Jarvis, M.; MacCrann, N.; Percival, W. J.; Ross, A. J.; Sheldon, E.; Zuntz, J.; Abbott, T. M. C.; Abdalla, F. B.; Annis, J.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Eifler, T. F.; Evrard, A. E.; Fausti Neto, A.; Flaugher, B.; Frieman, J.; Gerdes, D. W.; Goldstein, D. A.; Gruen, D.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; James, D. J.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; Marshall, J. L.; Melchior, P.; Menanteau, F.; Nord, B.; Plazas, A. A.; Reil, K.; Romer, A. K.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Walker, A. R.

    2018-01-01

    We present a measurement of galaxy-galaxy lensing around a magnitude-limited (i_AB < 22.5) sample of galaxies from the dark energy survey science verification (DES-SV) data. We split these lenses into three photometric-redshift bins from 0.2 to 0.8, and determine the product of the galaxy bias b and cross-correlation coefficient between the galaxy and dark matter overdensity fields r in each bin, using scales above 4 h^-1 Mpc comoving, where we find the linear bias model to be valid given our current uncertainties. We compare our galaxy bias results from galaxy-galaxy lensing with those obtained from galaxy clustering and CMB lensing for the same sample of galaxies, and find our measurements to be in good agreement with those in Crocce et al., while, in the lowest redshift bin (z ∼ 0.3), they show some tension with the findings in Giannantonio et al. We measure b · r to be 0.87 ± 0.11, 1.12 ± 0.16 and 1.24 ± 0.23, respectively, for the three redshift bins of width Δz = 0.2 in the range 0.2 < z < 0.8, defined with the photometric-redshift algorithm BPZ. Using a different code to split the lens sample, TPZ, leads to changes in the measured biases at the 10-20 per cent level, but it does not alter the main conclusion of this work: when comparing with Crocce et al. we do not find strong evidence for a cross-correlation parameter significantly below one in this galaxy sample, except possibly at the lowest redshift bin (z ∼ 0.3), where we find r = 0.71 ± 0.11 when using TPZ, and 0.83 ± 0.12 with BPZ.
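
    The comparison described in the records above amounts to dividing the lensing measurement of b·r by a clustering measurement of b. Below is a sketch with naive independent-Gaussian error propagation; the clustering-bias numbers are hypothetical placeholders, since the abstract does not quote the Crocce et al. values.

```python
import numpy as np

def cross_corr_coefficient(br, br_err, b, b_err):
    """Infer r = (b*r)_lensing / b_clustering with naive error propagation,
    treating the two measurements as independent and Gaussian."""
    r = br / b
    r_err = abs(r) * np.hypot(br_err / br, b_err / b)
    return r, r_err

# Lowest-z bin from the abstract: b*r = 0.87 +/- 0.11; the clustering
# bias b = 1.05 +/- 0.06 is a hypothetical placeholder.
print(cross_corr_coefficient(0.87, 0.11, 1.05, 0.06))
```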

  12. SU-F-J-25: Position Monitoring for Intracranial SRS Using BrainLAB ExacTrac Snap Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jang, S; McCaw, T; Huq, M

    2016-06-15

    Purpose: To determine the accuracy of position monitoring with BrainLAB ExacTrac snap verification following couch rotations during intracranial SRS. Methods: A CT scan of an anthropomorphic head phantom was acquired using 1.25mm slices. The isocenter was positioned near the centroid of the frontal lobe. The head phantom was initially aligned on the treatment couch using cone-beam CT, then repositioned using ExacTrac x-ray verification with residual errors less than 0.2mm and 0.2°. Snap verification was performed over the full range of couch angles in 15° increments with known positioning offsets of 0–3mm applied to the phantom along each axis. At each couch angle, the smallest tolerance was determined for which no positioning deviation was detected. Results: For couch angles 30°–60° from the center position, where the longitudinal axis of the phantom is approximately aligned with the beam axis of one x-ray tube, snap verification consistently detected positioning errors exceeding the maximum 8mm tolerance. Defining localization error as the difference between the known offset and the minimum tolerance for which no deviation was detected, the RMS error is mostly less than 1mm outside of couch angles 30°–60° from the central couch position. Given separate measurements of patient position from the two imagers, whether to proceed with treatment can be determined by the criterion of a reading within tolerance from just one (OR criterion) or both (AND criterion) imagers. Using a positioning tolerance of 1.5mm, snap verification has sensitivity and specificity of 94% and 75%, respectively, with the AND criterion, and 67% and 93%, respectively, with the OR criterion. If readings exceeding maximum tolerance are excluded, the sensitivity and specificity are 88% and 86%, respectively, with the AND criterion. Conclusion: With a positioning tolerance of 1.5mm, ExacTrac snap verification can be used during intracranial SRS with sensitivity and specificity

  13. A year after lift-off, XMM-Newton is impressing the X-ray astronomy community

    NASA Astrophysics Data System (ADS)

    2000-11-01

    XMM-Newton was launched from Kourou on 10 December 1999 on the first Ariane-5 commercial flight. After in-orbit commissioning of the spacecraft, and calibration and performance verification of its science instruments, the observatory entered its routine operations phase on 1 July. At the press conference, ESA's Director of Science Prof. Roger-Maurice Bonnet and XMM-Newton Project Scientist Fred Jansen will present some of the many scientific results from the first eight months of the mission. Also present will be two of Europe's foremost X-ray astronomers, Prof. Johan Bleeker of the Space Research Organisation of the Netherlands, and Prof. Guenther Hasinger of the Astrophysikalisches Institut Potsdam, Germany. Amongst the topics to be illustrated with some remarkably vivid "colour" images of the X-ray Universe, will be XMM-Newton's first examination of a cataclysmic binary star, its first insights into some enigmatic black hole systems, analysis of the morphology of a few supernova remnants, and evidence it has collected to end the long-standing mystery over X-ray cosmic background emission... The press conference will also recap on the spacecraft's operations, the performance of its science instruments, the issue of radiation constraints and future aspects of the mission. Media representatives wishing to attend the press event are kindly invited to complete the attached reply form and fax it back to ESA Media Relations Office +33(0)1.53.69.7690. Note to editors XMM-Newton is ESA's second Cornerstone Mission of the Horizon 2000 programme. The spacecraft was built by a European consortium of companies led by Astrium (formerly Dornier Satellitensysteme), Friedrichshafen, Germany. Its X-ray imaging and spectrographic instruments (EPIC and RGS) and its optical telescope (OM) were provided by large consortia, whose principal investigators are from, respectively, the University of Leicester, UK, SRON University of Utrecht Netherlands, and the Mullard Space Science

  14. Quantum money with classical verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gavinsky, Dmitry

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; while this property is not directly related to the possibility of classical verification, none of the earlier quantum money constructions is known to possess it.

  15. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; while this property is not directly related to the possibility of classical verification, none of the earlier quantum money constructions is known to possess it.

  16. Kleene Algebra and Bytecode Verification

    DTIC Science & Technology

    2016-04-27

    computing the star (Kleene closure) of a matrix of transfer functions. In this paper we show how this general framework applies to the problem of Java... bytecode verification. We show how to specify transfer functions arising in Java bytecode verification in such a way that the Kleene algebra operations... potentially improve the performance over the standard worklist algorithm when a small cutset can be found. Key words: Java, bytecode, verification, static
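
    As a toy instance of the framework, the Kleene star of a matrix over the boolean semiring is just reflexive-transitive closure (reachability); in the bytecode-verification setting the entries would be dataflow transfer functions rather than booleans. A minimal sketch:

```python
def matrix_star(a):
    """Kleene star of a boolean adjacency matrix via Floyd-Warshall:
    r = I + a + a^2 + ... (reflexive-transitive closure)."""
    n = len(a)
    r = [[bool(a[i][j]) or i == j for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r

# A 3-node chain 0 -> 1 -> 2: the star makes every forward path explicit.
print(matrix_star([[0, 1, 0], [0, 0, 1], [0, 0, 0]]))
```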

  17. ESTEST: An Open Science Platform for Electronic Structure Research

    ERIC Educational Resources Information Center

    Yuan, Gary

    2012-01-01

    Open science platforms in support of data generation, analysis, and dissemination are becoming indispensable tools for conducting research. These platforms use informatics and information technologies to address significant problems in open science data interoperability, verification & validation, comparison, analysis, post-processing,…

  18. MARATHON Verification (MARV)

    DTIC Science & Technology

    2017-08-01

    comparable with MARATHON 1 in terms of output. Rather, the MARATHON 2 verification cases were designed to ensure correct implementation of the new algorithms... for employment against demands. This study is a comparative verification of the functionality of MARATHON 4 (our newest implementation of MARATHON

  19. 40 CFR 1065.355 - H2O and CO2 interference verification for CO NDIR analyzers.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... interference verification as follows: (1) Start, operate, zero, and span the CO NDIR analyzer as you would... absolute pressure, p_total, to calculate x_H2O. Verify that the H2O content meets the requirement in...

  20. SpaceX CRS-12 "What's on Board?" Science Briefing

    NASA Image and Video Library

    2017-08-13

    Ken Shields, director of Operations for Center for the Advancement of Science in Space/ISS National Lab, speaks to members of social media in the Kennedy Space Center’s Press Site auditorium. The briefing focused on research planned for launch to the International Space Station. The scientific materials and supplies will be aboard a Dragon spacecraft scheduled for launch from Kennedy’s Launch Complex 39A on Aug. 14 atop a SpaceX Falcon 9 rocket on the company's 12th Commercial Resupply Services mission to the space station.

  1. SpaceX CRS-11 "What's on Board?" Science Briefing

    NASA Image and Video Library

    2017-05-31

    Ken Shields, director of Operations for the Center for the Advancement of Science in Space (CASIS)/ISS National Lab, speaks to members of social media in the Kennedy Space Center’s Press Site auditorium. The briefing focused on research planned for launch to the International Space Station. The scientific materials and supplies will be aboard a Dragon spacecraft scheduled for launch from Kennedy’s Launch Complex 39A on June 1 atop a SpaceX Falcon 9 rocket on the company's 11th Commercial Resupply Services mission to the space station.

  2. SpaceX CRS-14 What's On Board Science Briefing

    NASA Image and Video Library

    2018-04-01

    Craig Kundrot, director, NASA's Space Life and Physical Science Research and Applications, speaks to members of the media in the Kennedy Space Center Press Site auditorium. The briefing focused on research planned for launch to the International Space Station. The scientific materials and supplies will be aboard a Dragon spacecraft scheduled for liftoff from Cape Canaveral Air Force Station's Space Launch Complex 40 at 4:30 p.m. EST, on April 2, 2018. The SpaceX Falcon 9 rocket will launch the company's 14th Commercial Resupply Services mission to the space station.

  3. Optimal sensitometric curves of Kodak EDR2 film for dynamic intensity modulated radiation therapy verification

    PubMed Central

    Suriyapee, S; Pitaxtarnin, N; Oonsiri, S; Jumpangern, C; Israngkul Na Ayuthaya, I

    2008-01-01

    Purpose: To investigate the optimal sensitometric curves of extended dose range (EDR2) radiographic film in terms of depth, field size, dose range and processing conditions for dynamic intensity modulated radiation therapy (IMRT) dosimetry verification with 6 MV X-ray beams. Materials and methods: A Varian Clinac 23 EX linear accelerator with 6 MV X-ray beam was used to study the response of Kodak EDR2 film. Measurements were performed at depths of 5, 10 and 15 cm in a MedTec virtual water phantom and with field sizes of 2x2, 3x3, 10x10 and 15x15 cm2. Doses ranging from 20 to 450 cGy were used. The film was developed with the Kodak RP X-OMAT Model M6B automatic film processor. Film response was measured with the Vidar model VXR-16 scanner. Sensitometric curves were applied to the dose profiles measured with film at 5 cm in the virtual water phantom with field sizes of 2x2 and 10x10 cm2 and compared with ion chamber data. Scanditronix/Wellhofer OmniPro IMRT software was used for the evaluation of the IMRT plan calculated by the Eclipse treatment planning system. Results: Investigation of the reproducibility and accuracy of the film responses, which depend mainly on the film processor, was carried out by irradiating one film nine times with doses of 20 to 450 cGy. A maximum standard deviation of 4.9% was found, which decreased to 1.9% for doses between 20 and 200 cGy. The sensitometric curves for various field sizes at fixed depth showed a maximum difference of 4.2% between 2x2 and 15x15 cm2 at 5 cm depth with a dose of 450 cGy. The shallow depth tended to show a greater effect of field size responses than the deeper depths. The sensitometric curves for various depths at fixed field size showed slightly different film responses; the difference due to depth was within 1.8% for all field sizes studied. Both field size and depth effect were reduced when the doses were lower than 450 cGy. The difference was within 2.5% in the dose range from 20 to 300 cGy for all field sizes and
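
    A sensitometric curve here is the mapping between delivered dose and net optical density. One common practical treatment is to fit a low-order polynomial to the calibration points and use it to convert scanned film readings back to dose, as in this sketch; the calibration numbers are invented placeholders, not the paper's data.

```python
import numpy as np

# Hypothetical calibration points: dose (cGy) vs. net optical density.
dose = np.array([20, 50, 100, 200, 300, 450], dtype=float)
od = np.array([0.12, 0.30, 0.55, 1.00, 1.35, 1.80])

# Fit dose as a third-order polynomial in OD (a common pragmatic choice),
# then use the fit to convert measured OD values back to dose.
od_to_dose = np.poly1d(np.polyfit(od, dose, 3))
print(od_to_dose(0.80))   # dose estimate for a measured net OD of 0.80
```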

  4. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    NASA Astrophysics Data System (ADS)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location and time of hail precipitation and the hailstone size are included in the crowd-sourced data, assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are compared against the crowd-sourced data. The available data and the investigation period span June to August 2015. Filter criteria have been applied in order to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties that result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing a categorical verification to be used. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table, as sketched below. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.
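
    A minimal sketch of the categorical verification step: standard scores computed from the 2x2 contingency table of report/radar agreement (the counts below are hypothetical).

```python
def verification_scores(hits, misses, false_alarms, correct_negatives):
    """Categorical scores from a 2x2 contingency table, as used to compare
    crowd-sourced hail reports against the POH/MESHS radar algorithms."""
    pod = hits / (hits + misses)                 # probability of detection (hit rate)
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    n = hits + misses + false_alarms + correct_negatives
    pc = (hits + correct_negatives) / n          # proportion correct
    return {"POD": pod, "FAR": far, "CSI": csi, "PC": pc}

print(verification_scores(hits=42, misses=8, false_alarms=15, correct_negatives=900))
```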

  5. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 metres, collecting samples, and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e.: the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests

  6. Reusable science tools for analog exploration missions: xGDS Web Tools, VERVE, and Gigapan Voyage

    NASA Astrophysics Data System (ADS)

    Lee, Susan Y.; Lees, David; Cohen, Tamar; Allan, Mark; Deans, Matthew; Morse, Theodore; Park, Eric; Smith, Trey

    2013-10-01

    The Exploration Ground Data Systems (xGDS) project led by the Intelligent Robotics Group (IRG) at NASA Ames Research Center creates software tools to support multiple NASA-led planetary analog field experiments. The two primary tools that fall under the xGDS umbrella are the xGDS Web Tools (xGDS-WT) and Visual Environment for Remote Virtual Exploration (VERVE). IRG has also developed a hardware and software system that is closely integrated with our xGDS tools and is used in multiple field experiments called Gigapan Voyage. xGDS-WT, VERVE, and Gigapan Voyage are examples of IRG projects that improve the ratio of science return versus development effort by creating generic and reusable tools that leverage existing technologies in both hardware and software. xGDS Web Tools provides software for gathering and organizing mission data for science and engineering operations, including tools for planning traverses, monitoring autonomous or piloted vehicles, visualization, documentation, analysis, and search. VERVE provides high performance three dimensional (3D) user interfaces used by scientists, robot operators, and mission planners to visualize robot data in real time. Gigapan Voyage is a gigapixel image capturing and processing tool that improves situational awareness and scientific exploration in human and robotic analog missions. All of these technologies emphasize software reuse and leverage open source and/or commercial-off-the-shelf tools to greatly improve the utility and reduce the development and operational cost of future similar technologies. Over the past several years these technologies have been used in many NASA-led robotic field campaigns including the Desert Research and Technology Studies (DRATS), the Pavilion Lake Research Project (PLRP), the K10 Robotic Follow-Up tests, and most recently we have become involved in the NASA Extreme Environment Mission Operations (NEEMO) field experiments. A major objective of these joint robot and crew experiments is

  7. Applications of “Tender” Energy (1-5 keV) X-ray Absorption Spectroscopy in Life Sciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Northrup, Paul; Leri, Alessandra; Tappero, Ryan

    The “tender” energy range of 1 to 5 keV, between the energy ranges of most “hard” (>5 keV) and “soft” (<1 keV) synchrotron X-ray facilities, offers some unique opportunities for synchrotron-based X-ray absorption fine structure spectroscopy in life sciences. In particular the K absorption edges of Na through Ca offer opportunities to study local structure, speciation, and chemistry of many important biological compounds, structures and processes. This is an area of largely untapped science, in part due to a scarcity of optimized facilities. Such measurements also entail unique experimental challenges. Lastly, this brief review describes the technique, its experimental challenges, recent progress in development of microbeam measurement capabilities, and several highlights illustrating applications in life sciences.

  8. Applications of “Tender” Energy (1-5 keV) X-ray Absorption Spectroscopy in Life Sciences

    DOE PAGES

    Northrup, Paul; Leri, Alessandra; Tappero, Ryan

    2016-02-15

    The “tender” energy range of 1 to 5 keV, between the energy ranges of most “hard” (>5 keV) and “soft” (<1 keV) synchrotron X-ray facilities, offers some unique opportunities for synchrotron-based X-ray absorption fine structure spectroscopy in life sciences. In particular the K absorption edges of Na through Ca offer opportunities to study local structure, speciation, and chemistry of many important biological compounds, structures and processes. This is an area of largely untapped science, in part due to a scarcity of optimized facilities. Such measurements also entail unique experimental challenges. Lastly, this brief review describes the technique, its experimental challenges, recent progress in development of microbeam measurement capabilities, and several highlights illustrating applications in life sciences.

  9. Microcode Verification Project.

    DTIC Science & Technology

    1980-05-01

    numerical constant. The internal syntax for these minimum and maximum values is REALMIN and REALMAX. ISPSSIMP is the file simplifying bitstring... To be fair, it is quite clear that much of the labor of the verification task can be reduced if verification and code development are carried out... basis of and the language we have chosen for both encoding our descriptions of machines and reasoning about the course of computations. Internally, our

  10. Interim Letter Report - Verification Survey of 19 Grids in the Lester Flat Area, David Witherspoon Inc. 1630 Site Knoxville, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P.C. Weaver

    2008-10-17

    Perform verification surveys of 19 available grids located in the Lester Flat Area at the David Witherspoon Site. The survey grids included E11, E12, E13, F11, F12, F13, F14, F15, G15, G16, G17, H16, H17, H18, X16, X17, X18, K16, and J16.

  11. Face verification with balanced thresholds.

    PubMed

    Yan, Shuicheng; Xu, Dong; Tang, Xiaoou

    2007-01-01

    The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved by the following aspects. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction, and, hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification which are directly transplanted from face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.

  12. Cosmology constraints from shear peak statistics in Dark Energy Survey Science Verification data

    DOE PAGES

    Kacprzak, T.; Kirk, D.; Friedrich, O.; ...

    2016-08-19

    Shear peak statistics has gained a lot of attention recently as a practical alternative to the two-point statistics for constraining cosmological parameters. We perform a shear peak statistics analysis of the Dark Energy Survey (DES) Science Verification (SV) data, using weak gravitational lensing measurements from a 139 deg^2 field. We measure the abundance of peaks identified in aperture mass maps, as a function of their signal-to-noise ratio, in the signal-to-noise range 0 < S/N < 4. To predict the peak counts as a function of cosmological parameters we use a suite of N-body simulations spanning 158 models with varying Ω_m and σ_8, fixing w = -1, Ω_b = 0.04, h = 0.7 and n_s = 1, to which we have applied the DES SV mask and redshift distribution. In our fiducial analysis we measure σ_8(Ω_m/0.3)^0.6 = 0.77 ± 0.07, after marginalising over the shear multiplicative bias and the error on the mean redshift of the galaxy sample. We introduce models of intrinsic alignments, blending, and source contamination by cluster members. These models indicate that peaks with S/N > 4 would require significant corrections, which is why we do not include them in our analysis. We compare our results to the cosmological constraints from the two-point analysis on the SV field and find them to be in good agreement in both the central value and its uncertainty. We discuss prospects for future peak statistics analysis with upcoming DES data.
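
    The quoted constraint is on the usual lensing degeneracy combination of σ_8 and Ω_m; a one-line function makes the parameterization explicit (exponent and pivot follow the abstract).

```python
def sigma8_degeneracy(sigma8, omega_m, alpha=0.6, pivot=0.3):
    """The combination sigma_8 * (Omega_m / pivot)^alpha that shear peak
    counts constrain; alpha = 0.6 and pivot = 0.3 follow the abstract."""
    return sigma8 * (omega_m / pivot) ** alpha

# A model with sigma_8 = 0.8 and Omega_m = 0.3 gives 0.8, to be compared
# with the measured 0.77 +/- 0.07.
print(sigma8_degeneracy(0.8, 0.3))
```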

  13. VizieR Online Data Catalog: Spectroscopy of the foreground population in Orion A (Fang+, 2017)

    NASA Astrophysics Data System (ADS)

    Fang, M.; Kim, J. S.; Pascucci, I.; Apai, D.; Zhang, L.; Sicilia-Aguilar, A.; Alonso-Martinez, M.; Eiroa, C.; Wang, H.

    2018-05-01

    We performed a low-resolution spectroscopic survey of the stellar population in NGC 1980 with the Hectospec multi-object spectrograph, capable of taking a maximum of 300 spectra simultaneously. We used the 270 groove/mm grating and obtained spectra in the 3700-9000Å range with a spectral resolution of ~5Å. The data were taken in 2016 February. In Table 4, we list the young stars with X-Shooter spectra. These sources are mainly from the η Cha cluster, the TW Hydra Association, the Lupus star-forming region, the σ Ori cluster, and the Cha I star-forming region. We extract the spectra of these sources from the X-Shooter phase III data archive. (3 data files).

  14. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  15. Formulating face verification with semidefinite programming.

    PubMed

    Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S

    2007-11-01

    This paper presents a unified solution to three unsolved problems existing in face verification with subspace learning techniques: selection of verification threshold, automatic determination of subspace dimension, and deducing feature fusing weights. In contrast to previous algorithms which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). With a certain verification threshold, this matrix is learned by a semidefinite programming approach, along with the constraints of the kindred pairs with similarity larger than the threshold, and inhomogeneous pairs with similarity smaller than the threshold. Then, the subspace dimension and the feature fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, the weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.

  16. Space Weather Models and Their Validation and Verification at the CCMC

    NASA Technical Reports Server (NTRS)

    Hesse, Michael

    2010-01-01

    The Community Coordinated Modeling Center (CCMC) is a US multi-agency activity with a dual mission. With equal emphasis, CCMC strives to provide science support to the international space research community through the execution of advanced space plasma simulations, and it endeavors to support the space weather needs of the US and partners. Space weather support involves a broad spectrum, from designing robust forecasting systems and transitioning them to forecasters, to providing space weather updates and forecasts to NASA's robotic mission operators. All of these activities have to rely on validation and verification of models and their products, so users and forecasters have the means to assign confidence levels to the space weather information. In this presentation, we provide an overview of space weather models resident at CCMC, as well as of validation and verification activities undertaken at CCMC or through the use of CCMC services.

  17. 25 CFR 61.8 - Verification forms.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... using the last address of record. The verification form will be used to ascertain the previous enrollee... death. Name and/or address changes will only be made if the verification form is signed by an adult... § 61.8 Verification forms.

  18. [The Dose Effect of Isocenter Selection during IMRT Dose Verification with the 2D Chamber Array].

    PubMed

    Xie, Chuanbin; Cong, Xiaohu; Xu, Shouping; Dai, Xiangkun; Wang, Yunlai; Han, Lu; Gong, Hanshun; Ju, Zhongjian; Ge, Ruigang; Ma, Lin

    2015-03-01

    To investigate the dose effect of isocenter selection during IMRT dose verification with the 2D chamber array. IMRT plans were designed for samples collected from 10 patients, with the isocenter independently defined as P(o), P(x) and P(y). P(o) was fixed at the target center, and the other points were shifted 8 cm from the target center along the x and y directions, respectively. The PTW 729 was used for 2D dose verification in the three groups, with all plan beams set to 0 degrees. The γ-analysis passing rates for the whole plan and for each beam were obtained using different criteria in the three groups. The results showed that the mean γ-analysis passing rate was highest in the P(o) group, and that the mean passing rate of the whole plan was better than that of individual beams. In addition, the passing rate became worse as dose leakage between the leaves increased in the P(y) group. Therefore, isocenter selection has a visible effect on IMRT dose verification with the 2D chamber array; the planning isocenter should be close to the geometric center of the target.
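
    For reference, a toy one-dimensional version of the γ analysis behind the passing rates discussed above; a real 2D array analysis adds interpolation and search windows, so treat this only as a sketch of the metric itself.

```python
import numpy as np

def gamma_pass_rate(ref_dose, eval_dose, positions, dd=0.03, dta=3.0):
    """Toy 1D gamma analysis with global normalization.

    dd  : dose-difference criterion as a fraction of max dose (3%)
    dta : distance-to-agreement criterion in mm (3 mm)
    A reference point passes if the minimum over evaluated points of
    sqrt((dose diff / DD)^2 + (distance / DTA)^2) is <= 1.
    """
    ref = np.asarray(ref_dose, float)
    ev = np.asarray(eval_dose, float)
    x = np.asarray(positions, float)
    dd_abs = dd * ref.max()
    passed = 0
    for xi, di in zip(x, ref):
        gamma = np.sqrt(((ev - di) / dd_abs) ** 2 + ((x - xi) / dta) ** 2).min()
        passed += gamma <= 1.0
    return passed / ref.size

# Two nearly identical Gaussian profiles should pass everywhere.
x = np.linspace(-50, 50, 101)           # positions in mm
ref = 200 * np.exp(-(x / 30) ** 2)      # reference profile in cGy
print(gamma_pass_rate(ref, ref * 1.01, x))   # -> 1.0
```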

  19. High stakes in INF verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krepon, M.

    1987-06-01

    The stakes involved in negotiating INF verification arrangements are high. While these proposals deal only with intermediate-range ground-launched cruise and mobile missiles, if properly devised they could help pave the way for comprehensive limits on other cruise missiles and strategic mobile missiles. In contrast, poorly drafted monitoring provisions could compromise national industrial security and generate numerous compliance controversies. Any verification regime will require new openness on both sides, but that means significant risks as well as opportunities. US and Soviet negotiators could spend weeks, months, and even years working out in painstaking detail verification provisions for medium-range missiles. Alternatively, if the two sides wished to conclude an INF agreement quickly, they could defer most of the difficult verification issues to the strategic arms negotiations.

  20. Verification of Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written so as to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to obtain a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on the relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, cases of use and implementation are given, and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
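
    The Weibull machinery at the heart of such a guideline reduces to a simple failure-probability expression with a size-scaling (volume-ratio) term; a sketch with hypothetical coupon parameters:

```python
import numpy as np

def weibull_failure_probability(stress, sigma0, m, volume_ratio=1.0):
    """Weibull failure probability for a brittle ceramic part:
    P_f = 1 - exp(-(V/V0) * (sigma/sigma0)^m).
    volume_ratio = V/V0 transfers coupon-level data to a full-scale
    structure, mirroring the guideline's size-scaling step."""
    return 1.0 - np.exp(-volume_ratio * (stress / sigma0) ** m)

# Hypothetical coupon data: sigma0 = 300 MPa, Weibull modulus m = 10;
# a part stressed to 150 MPa with 50x the coupon volume:
print(weibull_failure_probability(150.0, 300.0, 10.0, volume_ratio=50.0))
```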

  1. Consistent Structural Integrity and Efficient Certification with Analysis. Volume 3: Appendices of Verification and Validation Examples, Correlation Factors, and Failure Criteria

    DTIC Science & Technology

    2005-05-01

    [Table-of-contents excerpt: verification examples for a tank wall and for bonded joints (homogeneous isotropic and orthotropic, six examples from the Delale & Erdogan publication), with adhesive stress comparisons between BondJo, ANSYS solid-model FEA, and Delale & Erdogan plate theory.]

  2. The Screaming Boredom of Learning Science

    ERIC Educational Resources Information Center

    Krips, H.

    1977-01-01

    Advocates changing the role of secondary school science from one of theory verification and problem solving to the formulation and acceptance of hypotheses for observed phenomena. Provides an example of the procedure using Hooke's Law. (CP)

  3. Consortium for Verification Technology Fellowship Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadler, Lorraine E.

    2017-06-01

    As one recipient of the Consortium for Verification Technology (CVT) Fellowship, I spent eight days as a visiting scientist at the University of Michigan, Department of Nuclear Engineering and Radiological Sciences (NERS). During this time, I participated in multiple department and research group meetings and presentations, met with individual faculty and students, toured multiple laboratories, and taught one-half of a one-unit class on Risk Analysis in Nuclear Arms Control (six 1.5-hour lectures). The following report describes some of the interactions that I had during my time as well as a brief discussion of the impact of this fellowship on members of the consortium and on me/my laboratory's technical knowledge and network.

  4. Galaxies in X-ray Selected Clusters and Groups in Dark Energy Survey Data: Stellar Mass Growth of Bright Central Galaxies Since z~1.2

    DOE PAGES

    Zhang, Y.; Miller, C.; McKay, T.; ...

    2016-01-10

    Using the science verification data of the Dark Energy Survey for a new sample of 106 X-ray selected clusters and groups, we study the stellar mass growth of bright central galaxies (BCGs) since redshift z ~ 1.2. Compared with the expectation in a semi-analytical model applied to the Millennium Simulation, the observed BCGs become under-massive/under-luminous with decreasing redshift. We incorporate the uncertainties associated with cluster mass, redshift, and BCG stellar mass measurements into analysis of a redshift-dependent BCG-cluster mass relation.

  5. CD volume design and verification

    NASA Technical Reports Server (NTRS)

    Li, Y. P.; Hughes, J. S.

    1993-01-01

    In this paper, we describe a prototype for CD-ROM volume design and verification. This prototype allows users to create their own model of CD volumes by modifying a prototypical model. Rule-based verification of the test volumes can then be performed against the volume definition. This working prototype has proven the concept of model-driven, rule-based design and verification for large quantities of data. The model defined for the CD-ROM volumes becomes a data model as well as an executable specification.

  6. SpaceX CRS-12 "What's on Board?" Science Briefing

    NASA Image and Video Library

    2017-08-13

    Ken Shields, director of Operations for Center for the Advancement of Science in Space/ISS National Lab, left, and Pete Hasbrook, associate program scientist for the International Space Station Program, speak to members of social media in the Kennedy Space Center’s Press Site auditorium. The briefing focused on research planned for launch to the International Space Station. The scientific materials and supplies will be aboard a Dragon spacecraft scheduled for launch from Kennedy’s Launch Complex 39A on Aug. 14 atop a SpaceX Falcon 9 rocket on the company's 12th Commercial Resupply Services mission to the space station.

  7. SpaceX CRS-14 What's On Board Science Briefing

    NASA Image and Video Library

    2018-04-01

    Patrick O'Neill, Marketing and Communications Manager, Center for the Advancement of Science in Space, speaks to members of the media in the Kennedy Space Center Press Site auditorium. The briefing focused on research planned for launch to the International Space Station. The scientific materials and supplies will be aboard a Dragon spacecraft scheduled for liftoff from Cape Canaveral Air Force Station's Space Launch Complex 40 at 4:30 p.m. EST, on April 2, 2018. The SpaceX Falcon 9 rocket will launch the company's 14th Commercial Resupply Services mission to the space station.

  8. SpaceX CRS-10 "What's On Board" Science Briefing

    NASA Image and Video Library

    2017-02-17

    Tara Ruttley, NASA associate scientist for the International Space Station Program, left, and Patrick O'Neill, Marketing and Communications manager for the Center for the Advancement of Science in Space (CASIS), speak to members of social media in the Kennedy Space Center’s Press Site auditorium. The briefing focused on research planned for launch to the International Space Station. The scientific materials and supplies will be aboard a Dragon spacecraft scheduled for launch from Kennedy’s Launch Complex 39A on Feb. 18 atop a SpaceX Falcon 9 rocket on the company's 10th Commercial Resupply Services mission to the space station.

  9. Monitoring and verification R&D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs, and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options for sensitive problems and for addressing other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing that we will face all of these challenges even if disarmament is not achieved, this paper explores possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  10. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the... verifications and analysis. It may also be necessary to limit the range of conditions under which the PEMS can... additional information or analysis to support your conclusions. (b) Overall verification. This paragraph (b...

  11. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between, and the effects of, hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum which demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something which is usually described as nice to have, but which is not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination for erosion or wear of the casings and nozzle. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaur, Ramanpreet; Kaper, Lex; Ellerbroek, Lucas E.

    We present the optical to near-infrared spectrum of MAXI J1659-152 during the onset of its 2010 X-ray outburst. The spectrum was obtained with X-shooter on the ESO Very Large Telescope early in the outburst, simultaneous with high-quality observations at both shorter and longer wavelengths. At the time of the observations, the source was in the low-hard state. The X-shooter spectrum includes many broad (~2000 km s^-1), double-peaked emission profiles of H, He I, and He II, characteristic signatures of a low-mass X-ray binary during outburst. We detect no spectral signatures of the low-mass companion star. The strength of the diffuse interstellar bands results in a lower limit to the total interstellar extinction of A_V ≈ 0.4 mag. Using the neutral hydrogen column density obtained from the X-ray spectrum we estimate A_V ≈ 1 mag. The radial velocity structure of the interstellar Na I D and Ca II H and K lines results in a lower limit to the distance of ~4 ± 1 kpc, consistent with previous estimates. With this distance and A_V, the dereddened spectral energy distribution represents a flat disk spectrum. The two 10 minute X-shooter spectra show significant variability in the red wing of the emission-line profiles, indicating a global change in the density structure of the disk, though on a timescale much shorter than the typical viscous timescale of the disk.
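
    The abstract does not state which column-density-to-extinction conversion was used; one common choice, assumed here purely for illustration, is the Guver & Ozel (2009) relation N_H ≈ 2.21 × 10^21 A_V cm^-2:

      # Illustrative only: estimating optical extinction from an X-ray hydrogen
      # column density with the (assumed) Guver & Ozel (2009) relation.
      N_H = 2.2e21                   # cm^-2, hypothetical value from an X-ray fit
      A_V = N_H / 2.21e21            # mag
      print(f"A_V ~ {A_V:.1f} mag")  # ~1 mag, of the order quoted above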

  13. The Hyper-X Flight Systems Validation Program

    NASA Technical Reports Server (NTRS)

    Redifer, Matthew; Lin, Yohan; Bessent, Courtney Amos; Barklow, Carole

    2007-01-01

    For the Hyper-X/X-43A program, the development of a comprehensive validation test plan played an integral part in the success of the mission. The goal was to demonstrate hypersonic propulsion technologies by flight testing an airframe-integrated scramjet engine. Preparation for flight involved both verification and validation testing. By definition, verification is the process of assuring that the product meets design requirements; whereas validation is the process of assuring that the design meets mission requirements for the intended environment. This report presents an overview of the program with emphasis on the validation efforts. It includes topics such as hardware-in-the-loop, failure modes and effects, aircraft-in-the-loop, plugs-out, power characterization, antenna pattern, integration, combined systems, captive carry, and flight testing. Where applicable, test results are also discussed. The report provides a brief description of the flight systems onboard the X-43A research vehicle and an introduction to the ground support equipment required to execute the validation plan. The intent is to provide validation concepts that are applicable to current, follow-on, and next generation vehicles that share the hybrid spacecraft and aircraft characteristics of the Hyper-X vehicle.

  14. Research in space science and technology. [including X-ray astronomy and interplanetary plasma physics

    NASA Technical Reports Server (NTRS)

    Beckley, L. E.

    1977-01-01

    Progress in various space flight research programs is reported. Emphasis is placed on X-ray astronomy and interplanetary plasma physics. Topics covered include: infrared astronomy, long base line interferometry, geological spectroscopy, space life science experiments, atmospheric physics, and space based materials and structures research. Analysis of galactic and extra-galactic X-ray data from the Small Astronomy Satellite (SAS-3) and HEAO-A and interplanetary plasma data for Mariner 10, Explorers 47 and 50, and Solrad is discussed.

  15. 40 CFR 1066.240 - Torque transducer verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270 and... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066...

  16. Observation and confirmation of six strong-lensing systems in the Dark Energy Survey science verification data

    DOE PAGES

    Nord, B.; Buckley-Geer, E.; Lin, H.; ...

    2016-08-05

    We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey data. Through visual inspection of data from the Science Verification season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-object Spectrograph at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 either were not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy-cluster-scale lenses. The lensed sources range in redshift z ~ 0.80–3.2 and in i-band surface brightness i_SB ~ 23–25 mag arcsec^-2 (2″ aperture). For each of the six systems, we estimate the Einstein radius θ_E and the enclosed mass M_enc, which have ranges θ_E ~ 5″–9″ and M_enc ~ 8 × 10^12 to 6 × 10^13 M_⊙, respectively.
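
    For reference, the enclosed mass of a circularly symmetric lens follows from the Einstein radius through the standard point-lens relation below; that the authors used exactly this estimator is our assumption, since the abstract does not specify one.

      M_{\mathrm{enc}}(\theta_E) = \frac{c^2}{4G}\,\theta_E^2\,\frac{D_l D_s}{D_{ls}}

    where D_l, D_s, and D_ls are the angular diameter distances to the lens, to the source, and between lens and source. With θ_E ~ 5″–9″ and typical cluster-lens distance ratios, this yields masses of order 10^13 M_⊙, consistent with the quoted range.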

  17. Observation and confirmation of six strong-lensing systems in the Dark Energy Survey science verification data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nord, B.; Buckley-Geer, E.; Lin, H.

    We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey data. Through visual inspection of data from the Science Verification season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-object Spectrograph at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 either were not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy-cluster-scale lenses. The lensed sources range in redshift z ~ 0.80–3.2 and in i-band surface brightness i_SB ~ 23–25 mag arcsec^-2 (2″ aperture). For each of the six systems, we estimate the Einstein radius θ_E and the enclosed mass M_enc, which have ranges θ_E ~ 5″–9″ and M_enc ~ 8 × 10^12 to 6 × 10^13 M_⊙, respectively.

  18. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted for gathering information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects were also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focusses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  19. Dosimetry investigation of MOSFET for clinical IMRT dose verification.

    PubMed

    Deshpande, Sudesh; Kumar, Rajesh; Ghadi, Yogesh; Neharu, R M; Kannan, V

    2013-06-01

    In IMRT, patient-specific dose verification is followed regularly at each centre. Simple and efficient dosimetry techniques play a very important role in routine clinical dosimetry QA. The MOSFET dosimeter offers several advantages over conventional dosimeters, such as its small detector size, immediate readout, immediate reuse, and multiple-point dose measurements. To use the MOSFET as a routine clinical dosimetry system for pre-treatment dose verification in IMRT, a comprehensive set of experiments has been conducted to investigate its linearity, reproducibility, dose rate effect and angular dependence for a 6 MV x-ray beam. The MOSFETs show a linear response with a linearity coefficient of 0.992 for a dose range of 35 cGy to 427 cGy. The reproducibility of the MOSFET was measured by irradiating the MOSFET for ten consecutive irradiations in the dose range of 35 cGy to 427 cGy. The measured reproducibility of the MOSFET was found to be within 4% up to 70 cGy and within 1.4% above 70 cGy. The dose rate effect on the MOSFET was investigated in the dose rate range 100 MU/min to 600 MU/min. The response of the MOSFET varies from -1.7% to 2.1%. The angular responses of the MOSFETs were measured at 10 degree intervals from 90 to 270 degrees in an anticlockwise direction, normalized at gantry angle zero, and found to be in the range of 0.98 ± 0.014 to 1.01 ± 0.014. The MOSFETs were calibrated in a phantom which was later used for IMRT verification. The measured calibration coefficients were found to be 1 mV/cGy and 2.995 mV/cGy in standard and high sensitivity mode, respectively. The MOSFETs were used for pre-treatment dose verification in IMRT. Nine dosimeters were used for each patient to measure the dose in different planes. The average variation between calculated and measured dose at any location was within 3%. Dose verification using MOSFET and an IMRT phantom was found to be quick and efficient and well suited for a busy radiotherapy department.
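
    A minimal sketch of how the quoted calibration coefficients convert MOSFET threshold-voltage shifts to dose, together with the ±3% agreement check, is given below; the reading values and the tolerance handling are hypothetical.

      # Hypothetical sketch: convert MOSFET threshold-voltage shifts (mV) to dose
      # using the calibration coefficients quoted above, then apply a 3% check.
      CAL_STANDARD = 1.0      # mV/cGy, standard sensitivity mode
      CAL_HIGH     = 2.995    # mV/cGy, high sensitivity mode

      def mv_to_cgy(delta_mv, high_sensitivity=True):
          coeff = CAL_HIGH if high_sensitivity else CAL_STANDARD
          return delta_mv / coeff

      measured = mv_to_cgy(598.0)   # hypothetical reading -> ~199.7 cGy
      planned  = 200.0              # cGy, from the treatment planning system
      deviation = 100.0 * (measured - planned) / planned
      print(f"{measured:.1f} cGy, deviation {deviation:+.1f}%",
            "OK" if abs(deviation) <= 3.0 else "INVESTIGATE")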

  20. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
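
    Self-composition reduces a two-run property such as "equal public inputs yield equal public outputs, whatever the secrets" to an assertion about a single, doubled program. The sketch below illustrates the idea as a runtime check; in the paper the corresponding verification conditions are discharged statically by a deductive prover, so this is an analogy to the technique, not the authors' toolchain.

      # Sketch of self-composition: run the program twice with the same public
      # input but different secrets; noninterference demands equal outputs.
      def program(pub: int, secret: int) -> int:
          # a leaky implementation would let `secret` influence the result
          return (pub * 31 + 7) % 256

      def self_composed(pub: int, s1: int, s2: int) -> None:
          assert program(pub, s1) == program(pub, s2), "secret propagates to output"

      for s1, s2 in [(0, 255), (1, 42)]:
          self_composed(pub=17, s1=s1, s2=s2)
      print("noninterference holds on the sampled inputs")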

  1. 14 CFR 211.11 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Verification. 211.11 Section 211.11 Aeronautics and Space OFFICE OF THE SECRETARY, DEPARTMENT OF TRANSPORTATION (AVIATION PROCEEDINGS) ECONOMIC REGULATIONS APPLICATIONS FOR PERMITS TO FOREIGN AIR CARRIERS General Requirements § 211.11 Verification...

  2. SpaceX CRS-14 What's On Board Science Briefing

    NASA Image and Video Library

    2018-04-01

    Howard Levine, at left, chief scientist in the Utilization and Life Sciences Office at NASA's Kennedy Space Center, and Dave Reid, a project manager with Techshot, discuss continuing research on growing food in space, as the Veggie Passive Orbital Nutrient Delivery System (PONDS) experiment tests a new way to deliver nutrients to plants. PONDS is one of the experiments that will be aboard a Dragon spacecraft scheduled for liftoff from Cape Canaveral Air Force Station's Space Launch Complex 40 at 4:30 p.m. EST, on April 2, 2018. The SpaceX Falcon 9 rocket will launch the company's 14th Commercial Resupply Services mission to the space station.

  3. X-ray verification of an optically aligned off-plane grating module

    NASA Astrophysics Data System (ADS)

    Donovan, Benjamin D.; McEntaffer, Randall L.; Tutt, James H.; DeRoo, Casey T.; Allured, Ryan; Gaskin, Jessica A.; Kolodziejczak, Jeffery J.

    2018-01-01

    Off-plane x-ray reflection gratings are theoretically capable of achieving high resolution and high diffraction efficiencies over the soft x-ray bandpass, making them an ideal technology to implement on upcoming x-ray spectroscopy missions. To achieve high effective area, these gratings must be aligned into grating modules. X-ray testing was performed on an aligned grating module to assess the current optical alignment methods. Results indicate that the grating module achieved the desired alignment for an upcoming x-ray spectroscopy suborbital rocket payload with modest effective area and resolving power. These tests have also outlined a pathway towards achieving the stricter alignment tolerances of future x-ray spectrometer payloads, which require improvements in alignment metrology, grating fabrication, and testing techniques.

  4. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PERFORMANCE VERIFICATION OF THE W.L. GORE & ASSOCIATES GORE-SORBER SCREENING SURVEY

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  6. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…
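
    As a toy illustration of the extra load, verifying a proportional quantifier requires maintaining two quantities and comparing them, whereas a cardinal sentence needs only one running count; the scene encoding below is hypothetical.

      # Toy verifier: proportional ("more than half ... are blue") vs. cardinal
      # ("there are seven blue ...") sentence verification over a dot scene.
      dots = ["blue", "yellow", "blue", "blue", "yellow"]   # hypothetical scene

      blue = sum(1 for d in dots if d == "blue")
      more_than_half_blue = blue > len(dots) / 2   # needs count AND total
      exactly_seven_blue  = blue == 7              # needs a single count

      print(more_than_half_blue, exactly_seven_blue)   # True False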

  7. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having knowledge...

  8. Application of X-ray topography to USSR and Russian space materials science

    PubMed Central

    Shul’pina, I. L.; Prokhorov, I. A.; Serebryakov, Yu. A.; Bezbakh, I. Zh.

    2016-01-01

    The authors’ experience of the application of X-ray diffraction imaging in carrying out space technological experiments on semiconductor crystal growth for the former USSR and for Russia is reported, from the Apollo–Soyuz programme (1975) up to the present day. X-ray topography was applied to examine defects in crystals in order to obtain information on the crystallization conditions and also on their changes under the influence of factors of orbital flight in space vehicles. The data obtained have promoted a deeper understanding of the conditions and mechanisms of crystallization under both microgravity and terrestrial conditions, and have enabled the elaboration of terrestrial methods of highly perfect crystal growth. The use of X-ray topography in space materials science has enriched its methods in the field of digital image processing of growth striations and expanded its possibilities in investigating the inhomogeneity of crystals. PMID:27158506

  9. Application of X-ray topography to USSR and Russian space materials science.

    PubMed

    Shul'pina, I L; Prokhorov, I A; Serebryakov, Yu A; Bezbakh, I Zh

    2016-05-01

    The authors' experience of the application of X-ray diffraction imaging in carrying out space technological experiments on semiconductor crystal growth for the former USSR and for Russia is reported, from the Apollo-Soyuz programme (1975) up to the present day. X-ray topography was applied to examine defects in crystals in order to obtain information on the crystallization conditions and also on their changes under the influence of factors of orbital flight in space vehicles. The data obtained have promoted a deeper understanding of the conditions and mechanisms of crystallization under both microgravity and terrestrial conditions, and have enabled the elaboration of terrestrial methods of highly perfect crystal growth. The use of X-ray topography in space materials science has enriched its methods in the field of digital image processing of growth striations and expanded its possibilities in investigating the inhomogeneity of crystals.

  10. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named 'Industrial Methodology for Process Verification in Research' or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by 'crowd-sourcing' to an interested community (www.sbvimprover.com). Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  11. Galaxy Clustering, Photometric Redshifts and Diagnosis of Systematics in the DES Science Verification Data

    DOE PAGES

    Crocce, M.

    2015-12-09

    We study the clustering of galaxies detected at i < 22.5 in the Science Verification observations of the Dark Energy Survey (DES). Two-point correlation functions are measured using 2.3 × 10^6 galaxies over a contiguous 116 deg^2 region in five bins of photometric redshift width Δz = 0.2 in the range 0.2 < z < 1.2. The impact of photometric redshift errors is assessed by comparing results using a template-based photo-z algorithm (BPZ) to a machine-learning algorithm (TPZ). A companion paper presents maps of several observational variables (e.g. seeing, sky brightness) which could modulate the galaxy density. Here we characterize and mitigate systematic errors on the measured clustering which arise from these observational variables, in addition to others such as Galactic dust and stellar contamination. After correcting for systematic effects, we then measure galaxy bias over a broad range of linear scales relative to mass clustering predicted from the Planck Λ cold dark matter model, finding agreement with the Canada-France-Hawaii Telescope Legacy Survey (CFHTLS) measurements with χ^2 of 4.0 (8.7) with 5 degrees of freedom for the TPZ (BPZ) redshifts. Furthermore, we test a 'linear bias' model, in which the galaxy clustering is a fixed multiple of the predicted non-linear dark matter clustering. The precision of the data allows us to determine that the linear bias model describes the observed galaxy clustering to 2.5 percent accuracy down to scales at least 4–10 times smaller than those on which linear theory is expected to be sufficient.
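
    A minimal sketch of the linear-bias test described above: model the galaxy correlation function as a constant multiple b^2 of the predicted matter correlation and fit b by weighted least squares. The arrays and error model below are placeholders, not DES measurements.

      # Sketch: fit a scale-independent bias b with w_gal(theta) = b**2 * w_dm(theta).
      import numpy as np

      w_dm  = np.array([0.080, 0.050, 0.030, 0.018])   # predicted matter clustering
      w_gal = np.array([0.160, 0.104, 0.059, 0.038])   # toy "measured" galaxy clustering
      err   = np.array([0.010, 0.007, 0.005, 0.004])   # placeholder uncertainties

      # weighted least-squares amplitude A = b**2 minimizing chi-squared
      A = np.sum(w_gal * w_dm / err**2) / np.sum(w_dm**2 / err**2)
      b = np.sqrt(A)
      chi2 = np.sum(((w_gal - A * w_dm) / err) ** 2)
      print(f"b = {b:.2f}, chi2 = {chi2:.1f} for {len(w_dm) - 1} dof")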

  12. Galaxy Clustering, Photometric Redshifts and Diagnosis of Systematics in the DES Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crocce, M.

    We study the clustering of galaxies detected at i < 22.5 in the Science Verification observations of the Dark Energy Survey (DES). Two-point correlation functions are measured using 2.3 × 10^6 galaxies over a contiguous 116 deg^2 region in five bins of photometric redshift width Δz = 0.2 in the range 0.2 < z < 1.2. The impact of photometric redshift errors is assessed by comparing results using a template-based photo-z algorithm (BPZ) to a machine-learning algorithm (TPZ). A companion paper presents maps of several observational variables (e.g. seeing, sky brightness) which could modulate the galaxy density. Here we characterize and mitigate systematic errors on the measured clustering which arise from these observational variables, in addition to others such as Galactic dust and stellar contamination. After correcting for systematic effects, we then measure galaxy bias over a broad range of linear scales relative to mass clustering predicted from the Planck Λ cold dark matter model, finding agreement with the Canada-France-Hawaii Telescope Legacy Survey (CFHTLS) measurements with χ^2 of 4.0 (8.7) with 5 degrees of freedom for the TPZ (BPZ) redshifts. Furthermore, we test a 'linear bias' model, in which the galaxy clustering is a fixed multiple of the predicted non-linear dark matter clustering. The precision of the data allows us to determine that the linear bias model describes the observed galaxy clustering to 2.5 percent accuracy down to scales at least 4–10 times smaller than those on which linear theory is expected to be sufficient.

  13. Mineralogy by X-ray Diffraction on Mars: The Chemin Instrument on Mars Science Laboratory

    NASA Technical Reports Server (NTRS)

    Vaniman, D. T.; Bristow, T. F.; Bish, D. L.; Ming, D. W.; Blake, D. F.; Morris, R. V.; Rampe, E. B.; Chipera, S. J.; Treiman, A. H.; Morrison, S. M.; hide

    2014-01-01

    To obtain detailed mineralogy information, the Mars Science Laboratory rover Curiosity carries CheMin, the first X-ray diffraction (XRD) instrument used on a planet other than Earth. CheMin has provided the first in situ XRD analyses of full phase assemblages on another planet.

  14. The Impact of Crosstalk in the X-IFU Instrument on Athena Science Cases

    NASA Technical Reports Server (NTRS)

    Hartog, R. Den; Peille, P.; Dauser, T.; Jackson, B.; Bandler, S.; Barret, D.; Brand, T.; Herder, J-W Den; Kiviranta, M.; Kuur, J. Van Der; hide

    2016-01-01

    In this paper we present a first assessment of the impact of various forms of instrumental crosstalk on the science performance of the X-ray Integral Field Unit (X-IFU) on the Athena X-ray mission. This assessment is made using the SIXTE end-to-end simulator in the context of one of the more technically challenging science cases for the X-IFU instrument. Crosstalk considerations may influence or drive various aspects of the design of the array of high-count-rate Transition Edge Sensor (TES) detectors and its Frequency Domain Multiplexed (FDM) readout architecture. The Athena X-ray mission was selected as the second L-class mission in ESA's Cosmic Vision 2015–25 plan, with a launch foreseen in 2028, to address the theme "Hot and Energetic Universe". One of the two instruments on board Athena is the X-ray Integral Field Unit (X-IFU), which is based on an array of 3800 Transition Edge Sensors (TESs) operated at a temperature of 90 mK. The science cases pose an interesting challenge for this instrument, as they require a combination of high energy resolution (2.5 eV FWHM or better), high spatial resolution (5 arcsec or better) and high count rate capability (several tens of counts per second per detector for point sources as bright as 10 mCrab). The performance at the single sensor level has been demonstrated, but the operation of such detectors in an array, using multiplexed readout, brings additional challenges, both for the design of the array in which the sensors are placed and for the readout of the sensors. The readout of the detector array will be based on Frequency Domain Multiplexing (FDM). In this system of detectors and readout, crosstalk can arise through various mechanisms: on the TES array, neighboring sensors can couple through thermal crosstalk. Detectors adjacent in carrier frequency may suffer from electrical crosstalk due to the finite width of the bandpass filters, and shared sources of impedance in their signal lines. The signals from the individual

  15. Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, S R; Bihari, B L; Salari, K

    As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.

  16. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  17. Hyper-X Flight Engine Ground Testing for X-43 Flight Risk Reduction

    NASA Technical Reports Server (NTRS)

    Huebner, Lawrence D.; Rock, Kenneth E.; Ruf, Edward G.; Witte, David W.; Andrews, Earl H., Jr.

    2001-01-01

    Airframe-integrated scramjet engine testing has been completed at Mach 7 flight conditions in the NASA Langley 8-Foot High Temperature Tunnel as part of the NASA Hyper-X program. This test provided engine performance and operability data, as well as design and database verification, for the Mach 7 flight tests of the Hyper-X research vehicle (X-43), which will provide the first-ever airframe-integrated scramjet data in flight. The Hyper-X Flight Engine, a duplicate Mach 7 X-43 scramjet engine, was mounted on an airframe structure that duplicated the entire three-dimensional propulsion flowpath from the vehicle leading edge to the vehicle trailing edge. This model was also tested to verify and validate the complete flight-like engine system. This paper describes the subsystems that were subjected to flight-like conditions and presents supporting data. The results from this test help to reduce risk for the Mach 7 flights of the X-43.

  18. Evaluation of Kodak EDR2 film for dose verification of intensity modulated radiation therapy delivered by a static multileaf collimator.

    PubMed

    Zhu, X R; Jursinic, P A; Grimm, D F; Lopez, F; Rownd, J J; Gillin, M T

    2002-08-01

    A new type of radiographic film, Kodak EDR2 film, was evaluated for dose verification of intensity modulated radiation therapy (IMRT) delivered by a static multileaf collimator (SMLC). A sensitometric curve of EDR2 film irradiated by a 6 MV x-ray beam was compared with that of Kodak X-OMAT V (XV) film. The effects of field size, depth and dose rate on the sensitometric curve were also studied. It is found that EDR2 film is much less sensitive than XV film. In high-energy x-ray beams, the double-hit process is the dominant mechanism that renders the grains on EDR2 films developable. As a result, in the dose range that is commonly used for film dosimetry for IMRT and conventional external beam therapy, the sensitometric curves of EDR2 films cannot be approximated as a linear function, OD = c * D. Within experimental uncertainty, the film sensitivity does not depend on the dose rate (50 vs 300 MU/min) or dose per pulse (from 1.0 × 10^-4 to 4.21 × 10^-4 Gy/pulse). Field sizes and depths (up to a field size of 10 × 10 cm^2 and depth = 10 cm) have little effect on the sensitometric curves. Percent depth doses (PDDs) for both 6 and 23 MV x rays were measured with both EDR2 and XV films and compared with ion chamber data. Film data are within 2.5% of the ion chamber results. Dose profiles measured with EDR2 film are consistent with those measured with an ion chamber. Examples of measured IMRT isodose distributions versus calculated isodoses are presented. We have used EDR2 films for verification of all IMRT patients treated by SMLC in our clinic. In most cases, with EDR2 film, actual clinical daily fraction doses can be used for verification of composite isodose distributions of SMLC-based IMRT.
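
    One simple way to model the nonlinearity attributed above to the double-hit process is a two-hit Poisson target model, OD(D) = OD_sat · [1 − (1 + aD)·exp(−aD)], in which a grain becomes developable only after at least two hits. This specific functional form and the toy readings below are our assumptions for illustration, not necessarily the authors' fit.

      # Sketch: fit a two-hit (Poisson) sensitometric model to OD-vs-dose data.
      import numpy as np
      from scipy.optimize import curve_fit

      def two_hit(D, od_sat, a):
          # P(>= 2 hits) for Poisson mean a*D -- nonlinear at low dose by construction
          return od_sat * (1.0 - (1.0 + a * D) * np.exp(-a * D))

      dose = np.array([35.0, 70.0, 140.0, 280.0, 427.0])   # cGy, from the study's range
      od   = np.array([0.10, 0.32, 0.90, 2.10, 3.20])      # hypothetical readings

      (od_sat, a), _ = curve_fit(two_hit, dose, od, p0=(4.0, 0.005))
      print(f"OD_sat = {od_sat:.2f}, a = {a:.4f} per cGy")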

  19. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
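
    A minimal sketch of the kind of traceability schema such a tool implies is given below, using Python's built-in sqlite3; the table and column names are hypothetical, and the actual RVC was built on networked COTS database software rather than SQLite.

      # Hypothetical sketch of a requirements/verification/compliance schema.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
          CREATE TABLE requirement (
              req_id    TEXT PRIMARY KEY,
              text      TEXT NOT NULL,
              parent_id TEXT REFERENCES requirement(req_id)  -- traceability link
          );
          CREATE TABLE verification (
              ver_id   TEXT PRIMARY KEY,
              req_id   TEXT NOT NULL REFERENCES requirement(req_id),
              method   TEXT CHECK (method IN ('test','analysis','inspection','demo')),
              criteria TEXT,                                 -- success criteria
              status   TEXT DEFAULT 'open'                   -- compliance status
          );
      """)
      con.execute("INSERT INTO requirement VALUES ('R-001','Weld in vacuum',NULL)")
      con.execute("INSERT INTO verification VALUES ('V-001','R-001','test','arc stable','open')")
      open_items = con.execute(
          "SELECT req_id, ver_id FROM verification WHERE status != 'closed'").fetchall()
      print(open_items)   # status report of unverified requirements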

  20. Mathematics learning disabilities in girls with fragile X or Turner syndrome during late elementary school.

    PubMed

    Murphy, Melissa M; Mazzocco, Michèle M M

    2008-01-01

    The present study focuses on math and related skills among 32 girls with fragile X (n = 14) or Turner (n = 18) syndrome during late elementary school. Performance in each syndrome group was assessed relative to Full Scale IQ-matched comparison groups of girls from the general population (n = 32 and n = 89 for fragile X syndrome and Turner syndrome, respectively). Differences between girls with fragile X and their comparison group emerged on untimed arithmetic calculations, mastery of counting skills, and arithmetic problem verification accuracy. Relative to girls in the comparison group, girls with Turner syndrome did not differ on untimed arithmetic calculations or problem verification accuracy, but they had limited mastery of counting skills and longer response times to complete the problem verification task. Girls with fragile X or Turner syndrome also differed from their respective comparison groups on math-related abilities, including visual-spatial, working memory, and reading skills, and the associations between math and those related skills. Together, these findings support the notion that difficulty with math and related skills among girls with fragile X or Turner syndrome continues into late elementary school and that the profile of math and related skill difficulty distinguishes the two syndrome groups from each other.

  1. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  2. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  3. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  4. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  5. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  6. Cognitive neuroscience in forensic science: understanding and utilizing the human element

    PubMed Central

    Dror, Itiel E.

    2015-01-01

    The human element plays a critical role in forensic science. It is not limited only to issues relating to forensic decision-making, such as bias, but also relates to most aspects of forensic work (some of which even take place before a crime is ever committed or long after the verification of the forensic conclusion). In this paper, I explicate many aspects of forensic work that involve the human element and therefore show the relevance (and potential contribution) of cognitive neuroscience to forensic science. The 10 aspects covered in this paper are proactive forensic science, selection during recruitment, training, crime scene investigation, forensic decision-making, verification and conflict resolution, reporting, the role of the forensic examiner, presentation in court and judicial decisions. As the forensic community is taking on the challenges introduced by the realization that the human element is critical for forensic work, new opportunities emerge that allow for considerable improvement and enhancement of the forensic science endeavour. PMID:26101281

  7. In-Space Engine (ISE-100) Development - Design Verification Test

    NASA Technical Reports Server (NTRS)

    Trinh, Huu P.; Popp, Chris; Bullard, Brad

    2017-01-01

    In the past decade, NASA has formulated science mission concepts with an anticipation of landing spacecraft on the lunar surface, meteoroids, and other planets. Advancing thruster technology for spacecraft propulsion systems has been considered for maximizing science payload. Starting in 2010, development of the In-Space Engine (designated ISE-100) has been carried out. The ISE-100 thruster design is based on heritage Missile Defense Agency (MDA) technology and aims for a system that is lightweight and efficient in terms of volume and packaging. It runs on a hypergolic bi-propellant combination: MON-25 (nitrogen tetroxide, N2O4, with 25% nitric oxide, NO) and MMH (monomethylhydrazine, CH6N2) for NASA spacecraft applications. The utilization of this propellant combination provides a propulsion system capable of operating over a wide range of temperatures, from 50 C (122 F) down to -30 C (-22 F), to drastically reduce heater power. The thruster is designed to deliver 100 lb(sub f) of thrust with the capability of pulse-mode operation for a wide range of mission duty cycles (MDCs). Two thrusters were fabricated. As part of the engine development, this test campaign is dedicated to the design verification of the thruster. This presentation will report the efforts of the design verification hot-fire test program of the ISE-100 thruster, a collaboration between NASA Marshall Space Flight Center (MSFC) and Aerojet Rocketdyne (AR) test teams. The hot-fire tests were conducted at the Advanced Mobile Propulsion Test (AMPT) facility in Durango, Colorado, from May 13 to June 10, 2016. This presentation will also provide a summary of key points from the test results.

  8. ADEN ALOS PALSAR Product Verification

    NASA Astrophysics Data System (ADS)

    Wright, P. A.; Meadows, P. J.; Mack, G.; Miranda, N.; Lavalle, M.

    2008-11-01

    Within the ALOS Data European Node (ADEN) the verification of PALSAR products is an important and continuing activity, to ensure data utility for the users. The paper will give a summary of the verification activities, the status of the ADEN PALSAR processor and the current quality issues that are important for users of ADEN PALSAR data.

  9. Biometric verification in dynamic writing

    NASA Astrophysics Data System (ADS)

    George, Susan E.

    2002-03-01

    Pen-tablet devices capable of capturing the dynamics of writing record temporal and pressure information as well as the spatial pattern. This paper explores biometric verification based upon the dynamics of writing, where writers are distinguished not on the basis of what they write (i.e. the signature), but on how they write. We have collected samples of dynamic writing from 38 Chinese writers. Each writer was asked to provide 10 copies of a paragraph of text and the same number of signature samples. From the data we have extracted stroke-based primitives from the sentence data, utilizing pen-up/down information and heuristic rules about the shape of the character. The x, y and pressure values of each primitive were interpolated onto an even temporal range based upon a 20 msec sampling rate. We applied the Daubechies 1 wavelet transform to the x signal, y signal and pressure signal, using the coefficients as inputs to a multi-layer perceptron trained with back-propagation on the sentence data. We found a sensitivity of 0.977 and specificity of 0.990 recognizing writers based on test primitives extracted from sentence data, and measures of 0.916 and 0.961 respectively from test primitives extracted from signature data.
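
    The pipeline described above (resample each primitive to a uniform temporal grid, take a Daubechies-1 wavelet transform of the x, y and pressure signals, and classify the coefficients with a multi-layer perceptron) could be sketched roughly as below. The array shapes, toy strokes, and MLP settings are placeholder assumptions, and PyWavelets/scikit-learn stand in for whatever tooling the authors used.

      # Sketch: Daubechies-1 wavelet features from resampled stroke primitives,
      # fed to an MLP classifier, loosely following the pipeline described above.
      import numpy as np
      import pywt
      from sklearn.neural_network import MLPClassifier

      def features(x, y, p, n=32):
          t_old = np.linspace(0, 1, len(x))
          t_new = np.linspace(0, 1, n)   # uniform resampling (stand-in for 20 ms grid)
          feats = []
          for sig in (x, y, p):
              cA, cD = pywt.dwt(np.interp(t_new, t_old, sig), "db1")
              feats.extend(cA); feats.extend(cD)
          return np.array(feats)

      rng = np.random.default_rng(0)     # toy strokes for two synthetic "writers"
      X = np.array([features(rng.normal(w, 1, 40), rng.normal(0, 1, 40),
                             rng.uniform(0, 1, 40)) for w in (0, 3) for _ in range(20)])
      y = np.repeat([0, 1], 20)

      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
      print("training accuracy:", clf.score(X, y))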

  10. Wide-field lensing mass maps from Dark Energy Survey science verification data: Methodology and detailed analysis

    DOE PAGES

    Vikram, V.

    2015-07-29

    Weak gravitational lensing allows one to reconstruct the spatial distribution of the projected mass density across the sky. These "mass maps" provide a powerful tool for studying cosmology as they probe both luminous and dark matter. In this paper, we present a weak lensing mass map reconstructed from shear measurements in a 139 deg^2 area from the Dark Energy Survey (DES) science verification data. We compare the distribution of mass with that of the foreground distribution of galaxies and clusters. The overdensities in the reconstructed map correlate well with the distribution of optically detected clusters. We demonstrate that candidate superclusters and voids along the line of sight can be identified, exploiting the tight scatter of the cluster photometric redshifts. We cross-correlate the mass map with a foreground magnitude-limited galaxy sample from the same data. Our measurement gives results consistent with mock catalogs from N-body simulations that include the primary sources of statistical uncertainties in the galaxy, lensing, and photo-z catalogs. The statistical significance of the cross-correlation is at the 6.8σ level with 20 arcminute smoothing. We find that the contribution of systematics to the lensing mass maps is generally within measurement uncertainties. In this study, we analyze less than 3% of the final area that will be mapped by the DES; the tools and analysis techniques developed in this paper can be applied to forthcoming larger data sets from the survey.

  11. OBSERVATION AND CONFIRMATION OF SIX STRONG-LENSING SYSTEMS IN THE DARK ENERGY SURVEY SCIENCE VERIFICATION DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nord, B.; Buckley-Geer, E.; Lin, H.

    We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey data. Through visual inspection of data from the Science Verification season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-object Spectrograph at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 either were not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy-cluster-scale lenses. The lensed sources range in redshift z ~ 0.80–3.2 and in i-band surface brightness i_SB ~ 23–25 mag arcsec^-2 (2″ aperture). For each of the six systems, we estimate the Einstein radius θ_E and the enclosed mass M_enc, which have ranges θ_E ~ 5″–9″ and M_enc ~ 8 × 10^12 to 6 × 10^13 M_⊙, respectively.

  12. Observation and Confirmation of Six Strong-lensing Systems in the Dark Energy Survey Science Verification Data

    NASA Astrophysics Data System (ADS)

    Nord, B.; Buckley-Geer, E.; Lin, H.; Diehl, H. T.; Helsby, J.; Kuropatkin, N.; Amara, A.; Collett, T.; Allam, S.; Caminha, G. B.; De Bom, C.; Desai, S.; Dúmet-Montoya, H.; Pereira, M. Elidaiana da S.; Finley, D. A.; Flaugher, B.; Furlanetto, C.; Gaitsch, H.; Gill, M.; Merritt, K. W.; More, A.; Tucker, D.; Saro, A.; Rykoff, E. S.; Rozo, E.; Birrer, S.; Abdalla, F. B.; Agnello, A.; Auger, M.; Brunner, R. J.; Carrasco Kind, M.; Castander, F. J.; Cunha, C. E.; da Costa, L. N.; Foley, R. J.; Gerdes, D. W.; Glazebrook, K.; Gschwend, J.; Hartley, W.; Kessler, R.; Lagattuta, D.; Lewis, G.; Maia, M. A. G.; Makler, M.; Menanteau, F.; Niernberg, A.; Scolnic, D.; Vieira, J. D.; Gramillano, R.; Abbott, T. M. C.; Banerji, M.; Benoit-Lévy, A.; Brooks, D.; Burke, D. L.; Capozzi, D.; Carnero Rosell, A.; Carretero, J.; D'Andrea, C. B.; Dietrich, J. P.; Doel, P.; Evrard, A. E.; Frieman, J.; Gaztanaga, E.; Gruen, D.; Honscheid, K.; James, D. J.; Kuehn, K.; Li, T. S.; Lima, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Miquel, R.; Neilsen, E.; Nichol, R. C.; Ogando, R.; Plazas, A. A.; Romer, A. K.; Sako, M.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Walker, A. R.; Wester, W.; Zhang, Y.; DES Collaboration

    2016-08-01

    We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey data. Through visual inspection of data from the Science Verification season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-object Spectrograph at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 either were not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy-cluster-scale lenses. The lensed sources range in redshift z ~ 0.80–3.2 and in i-band surface brightness i_SB ~ 23–25 mag arcsec^-2 (2″ aperture). For each of the six systems, we estimate the Einstein radius θ_E and the enclosed mass M_enc, which have ranges θ_E ~ 5″–9″ and M_enc ~ 8 × 10^12 to 6 × 10^13 M_⊙, respectively. This paper includes data gathered with the 6.5 m Magellan Telescopes located at Las Campanas Observatory, Chile.

  13. Cosmology from large-scale galaxy clustering and galaxy–galaxy lensing with Dark Energy Survey Science Verification data

    DOE PAGES

    Kwan, J.; Sánchez, C.; Clampitt, J.; ...

    2016-10-05

    We present cosmological constraints from the Dark Energy Survey (DES) using a combined analysis of angular clustering of red galaxies and their cross-correlation with weak gravitational lensing of background galaxies. We use a 139 square degree contiguous patch of DES data from the Science Verification (SV) period of observations. Using large scale measurements, we constrain the matter density of the Universe as Ω_m = 0.31 ± 0.09 and the clustering amplitude of the matter power spectrum as σ_8 = 0.74 ± 0.13 after marginalizing over seven nuisance parameters and three additional cosmological parameters. This translates into S_8 = σ_8(Ω_m/0.3)^0.16 = 0.74 ± 0.12 for our fiducial lens redshift bin at 0.35 < z < 0.5, while S_8 = 0.78 ± 0.09 using two bins over the range 0.2 < z < 0.5. We study the robustness of the results under changes in the data vectors, modelling and systematics treatment, including photometric redshift and shear calibration uncertainties, and find consistency in the derived cosmological parameters. We show that our results are consistent with previous cosmological analyses from DES and other data sets and conclude with a joint analysis of DES angular clustering and galaxy-galaxy lensing with Planck CMB data, Baryon Acoustic Oscillation and type Ia supernova measurements.
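
    As a quick consistency check of the quoted central values (our arithmetic, not the paper's):

      # S_8 = sigma_8 * (Omega_m / 0.3)**0.16, using the central values quoted above
      omega_m, sigma_8 = 0.31, 0.74
      S8 = sigma_8 * (omega_m / 0.3) ** 0.16
      print(f"S_8 = {S8:.3f}")   # ~0.744, matching the quoted 0.74 +/- 0.12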

  14. Cosmology from large-scale galaxy clustering and galaxy–galaxy lensing with Dark Energy Survey Science Verification data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kwan, J.; Sánchez, C.; Clampitt, J.

    We present cosmological constraints from the Dark Energy Survey (DES) using a combined analysis of angular clustering of red galaxies and their cross-correlation with weak gravitational lensing of background galaxies. We use a 139 square degree contiguous patch of DES data from the Science Verification (SV) period of observations. Using large scale measurements, we constrain the matter density of the Universe as Ω_m = 0.31 ± 0.09 and the clustering amplitude of the matter power spectrum as σ_8 = 0.74 ± 0.13 after marginalizing over seven nuisance parameters and three additional cosmological parameters. This translates into S_8 = σ_8(Ω_m/0.3)^0.16 = 0.74 ± 0.12 for our fiducial lens redshift bin at 0.35 < z < 0.5, while S_8 = 0.78 ± 0.09 using two bins over the range 0.2 < z < 0.5. We study the robustness of the results under changes in the data vectors, modelling and systematics treatment, including photometric redshift and shear calibration uncertainties, and find consistency in the derived cosmological parameters. We show that our results are consistent with previous cosmological analyses from DES and other data sets and conclude with a joint analysis of DES angular clustering and galaxy-galaxy lensing with Planck CMB data, Baryon Acoustic Oscillation and type Ia supernova measurements.

  15. A tracking and verification system implemented in a clinical environment for partial HIPAA compliance

    NASA Astrophysics Data System (ADS)

    Guo, Bing; Documet, Jorge; Liu, Brent; King, Nelson; Shrestha, Rasu; Wang, Kevin; Huang, H. K.; Grant, Edward G.

    2006-03-01

    The paper describes the methodology for the clinical design and implementation of a Location Tracking and Verification System (LTVS) that has distinct benefits for the Imaging Department at the Healthcare Consultation Center II (HCCII), an outpatient imaging facility located on the USC Health Science Campus. A novel system for tracking and verification of patients and staff in a clinical environment, using wireless and facial biometric technology to monitor and automatically identify patients and staff, was developed in order to streamline patient workflow, protect against erroneous examinations, and create a security zone to prevent and audit unauthorized access to patient healthcare data under the HIPAA mandate. This paper describes the system design and integration methodology based on initial clinical workflow studies within a clinical environment. An outpatient center was chosen as the first step for the development and implementation of this system.

  16. Post-OPC verification using a full-chip pattern-based simulation verification method

    NASA Astrophysics Data System (ADS)

    Hung, Chi-Yuan; Wang, Ching-Heng; Ma, Cliff; Zhang, Gary

    2005-11-01

    In this paper, we evaluate and investigate techniques for performing fast full-chip post-OPC verification using a commercial product platform. Databases from several technology nodes (0.13um, 0.11um, and 90nm) are used in the investigation. Although our OPC technology has proven robust for most cases, the variety of tape-outs with complicated design styles and technologies makes it difficult to develop a "complete or bullet-proof" OPC algorithm that covers every possible layout pattern. In the evaluation, model-based post-OPC checking found errors in several of the dozens of databases examined; such errors can be costly in manufacturing - reticle, wafer processing, and, more importantly, production delay. From this full-chip OPC database verification we have learned that optimizing OPC models and recipes on a limited set of test chip designs may not provide sufficient coverage across the range of designs to be produced in the process, and fatal errors (such as pinching or bridging), poor CD distribution, and process-sensitive patterns may still occur. As a result, more than one reticle tape-out cycle is not uncommon when proving models and recipes that approach the center of process for a range of designs. We therefore describe a full-chip pattern-based simulation verification flow that serves both OPC model and recipe development and post-OPC verification after production release of the OPC. Lastly, we discuss how the new pattern-based tool differs from conventional edge-based verification tools and summarize the advantages of our new tool and methodology: 1) accuracy: superior inspection algorithms, down to 1nm accuracy with the new pattern-based approach; 2) high-speed performance: pattern-centric algorithms give the best full-chip inspection efficiency; 3) powerful analysis capability: flexible error distribution, grouping, interactive viewing and hierarchical pattern extraction to narrow

  17. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  18. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  19. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  20. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  1. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  2. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  3. Density scaling of phantom materials for a 3D dose verification system.

    PubMed

    Tani, Kensuke; Fujita, Yukio; Wakita, Akihisa; Miyasaka, Ryohei; Uehara, Ryuzo; Kodama, Takumi; Suzuki, Yuya; Aikawa, Ako; Mizuno, Norifumi; Kawamori, Jiro; Saitoh, Hidetoshi

    2018-05-21

    In this study, the optimum density scaling factors of phantom materials for a commercially available three-dimensional (3D) dose verification system (Delta4) were investigated in order to improve the accuracy of the calculated dose distributions in the phantom materials. At field sizes of 10 × 10 and 5 × 5 cm² with the same geometry, tissue-phantom ratios (TPRs) in water, polymethyl methacrylate (PMMA), and Plastic Water Diagnostic Therapy (PWDT) were measured, and TPRs for various density scaling factors of water were calculated by Monte Carlo simulation, Adaptive Convolve (AdC, Pinnacle³), Collapsed Cone Convolution (CCC, RayStation), and AcurosXB (AXB, Eclipse). Effective linear attenuation coefficients (μ_eff) were obtained from the TPRs. The ratios of μ_eff in phantom and water ((μ_eff)_pl,water) were compared between the measurements and calculations. For each phantom material, the density scaling factor proposed in this study (DSF) was set to be the value providing a match between the calculated and measured (μ_eff)_pl,water. The optimum density scaling factor was verified through comparison of the dose distributions measured by Delta4 and calculated with three different density scaling factors: the nominal physical density (PD), the nominal relative electron density (ED), and the DSF. Three plans were used for the verifications: a static field of 10 × 10 cm² and two intensity modulated radiation therapy (IMRT) treatment plans. The DSF was determined to be 1.13 for PMMA and 0.98 for PWDT. The DSF for PMMA showed good agreement for AdC and CCC with 6 MV x-rays, and for AdC with 10 MV x-rays. The DSF for PWDT showed good agreement regardless of the dose calculation algorithm and x-ray energy. The DSF can be considered one of the references for the density scaling factor of Delta4 phantom materials and may help improve the accuracy of IMRT dose verification using Delta4. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley
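
    As a rough illustration of the μ_eff step described above, the sketch below assumes TPR falls off approximately exponentially with depth beyond the depth of dose maximum, so the slope of ln(TPR) versus depth gives an effective linear attenuation coefficient; the function and all numbers are hypothetical, not the paper's data:

    ```python
    import numpy as np

    def mu_eff(tpr_shallow, tpr_deep, d_shallow_cm, d_deep_cm):
        # Assumes TPR(d) ~ exp(-mu_eff * d) beyond the depth of dose maximum,
        # so the slope of ln(TPR) versus depth is the effective attenuation.
        return np.log(tpr_shallow / tpr_deep) / (d_deep_cm - d_shallow_cm)

    # Hypothetical TPR values at 10 and 20 cm depth, in water and in a plastic
    mu_water   = mu_eff(0.90, 0.63, 10.0, 20.0)
    mu_plastic = mu_eff(0.90, 0.60, 10.0, 20.0)

    # The (mu_eff)_pl,water ratio of the abstract; the density scaling factor
    # is then chosen so the calculated ratio matches the measured one.
    print(round(mu_plastic / mu_water, 3))
    ```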

  4. Security Verification of Secure MANET Routing Protocols

    DTIC Science & Technology

    2012-03-22

    Security Verification of Secure MANET Routing Protocols. Thesis presented to the faculty of the Department of Electrical and Computer Engineering, Air Force Institute of Technology, Department of the Air Force (AFIT/GCS/ENG/12-03). Matthew F. Steele, B.S.E.E., Captain, USAF. Distribution unlimited.

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  6. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  7. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program
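
    As a toy illustration of the final equivalence-checking step (not of the impact-summary construction itself), the sketch below asks an off-the-shelf decision procedure, Z3, whether any 32-bit input distinguishes two versions of a pure function; the z3-solver package and the two toy versions are our assumptions:

    ```python
    # Requires the z3-solver package (pip install z3-solver).
    from z3 import BitVec, Solver, unsat

    x = BitVec("x", 32)
    old_version = x * 2      # hypothetical version N of a pure function
    new_version = x + x      # hypothetical version N+1, syntactically different

    s = Solver()
    s.add(old_version != new_version)    # search for a distinguishing input
    if s.check() == unsat:
        print("equivalent: no 32-bit input distinguishes the versions")
    else:
        print("counterexample:", s.model())
    ```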

  8. Action video gaming and cognitive control: playing first person shooter games is associated with improvement in working memory but not action inhibition.

    PubMed

    Colzato, Lorenza S; van den Wildenberg, Wery P M; Zmigrod, Sharon; Hommel, Bernhard

    2013-03-01

    The interest in the influence of videogame experience on our daily life is constantly growing. "First Person Shooter" (FPS) games require players to develop a flexible mindset to rapidly react to and monitor fast-moving visual and auditory stimuli, and to inhibit erroneous actions. This study investigated whether, and to what degree, experience with such videogames generalizes to other cognitive control tasks. Experienced video game players (VGPs) and individuals with little to no videogame experience (NVGPs) performed an N-back task and a stop-signal paradigm, which provide relatively well-established diagnostic measures of the monitoring and updating of working memory (WM) and of response inhibition (an index of behavioral impulsivity), respectively. VGPs were faster and more accurate in the monitoring and updating of WM than NVGPs, and were faster in reacting to go signals, but showed comparable stopping performance. Our findings support the idea that playing FPS games is associated with enhanced flexible updating of task-relevant information without affecting impulsivity.

  9. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  10. Remote Sensing Product Verification and Validation at the NASA Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas M.

    2005-01-01

    Remote sensing data product verification and validation (V&V) is critical to successful science research and applications development. People who use remote sensing products to make policy, economic, or scientific decisions require confidence in and an understanding of the products' characteristics to make informed decisions about the products' use. NASA data products of coarse to moderate spatial resolution are validated by NASA science teams. NASA's Stennis Space Center (SSC) serves as the science validation team lead for validating commercial data products of moderate to high spatial resolution. At SSC, the Applications Research Toolbox simulates sensors and targets, and the Instrument Validation Laboratory validates critical sensors. The SSC V&V Site consists of radiometric tarps, a network of ground control points, a water surface temperature sensor, an atmospheric measurement system, painted concrete radial target and edge targets, and other instrumentation. NASA's Applied Sciences Directorate participates in the Joint Agency Commercial Imagery Evaluation (JACIE) team formed by NASA, the U.S. Geological Survey, and the National Geospatial-Intelligence Agency to characterize commercial systems and imagery.

  11. Circulation of spoof surface plasmon polaritons: Implementation and verification

    NASA Astrophysics Data System (ADS)

    Pan, Junwei; Wang, Jiafu; Qiu, Tianshuo; Pang, Yongqiang; Li, Yongfeng; Zhang, Jieqiu; Qu, Shaobo

    2018-05-01

    In this letter, we present the implementation and experimental verification of a broadband circulator for spoof surface plasmon polaritons (SSPPs). For ease of fabrication, a circulator operating in X band was designed first. Comb-like transmission lines (CL-TLs), a typical SSPP structure, are adopted as the three branches of the Y-junction. To enable broadband coupling of SSPPs, a transition section is added on each end of the CL-TLs. Through such a design, the circulator can operate under the sub-wavelength SSPP mode in a broad band. The simulation results show that the insertion loss is less than 0.5 dB while the isolation and return loss are higher than 20 dB in 9.4-12.0 GHz. A prototype was fabricated and measured. The experimental results are consistent with the simulation results and verify the broadband circulation performance in X band.

  12. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  13. Architectures Toward Reusable Science Data Systems

    NASA Technical Reports Server (NTRS)

    Moses, John Firor

    2014-01-01

    Science Data Systems (SDS) comprise an important class of data processing systems that support product generation from remote sensors and in-situ observations. These systems enable research into new science data products, replication of experiments and verification of results. NASA has been building systems for satellite data processing since the first Earth observing satellites launched and is continuing development of systems to support NASA science research and NOAA's Earth observing satellite operations. The basic data processing workflows and scenarios continue to be valid for remote sensor observations research as well as for the complex multi-instrument operational satellite data systems being built today.

  14. NASA's Future X-ray Missions: From Constellation-X to Generation-X

    NASA Technical Reports Server (NTRS)

    Hornschemeier, A.

    2006-01-01

    Among the most important topics in modern astrophysics are the formation and evolution of supermassive black holes in concert with galaxy bulges, the nature of the dark energy equation of state, and the self-regulating symmetry imposed by both stellar and AGN feedback. All of these topics are readily addressed with observations at X-ray wavelengths. NASA's next major X-ray observatory is Constellation-X, which is being developed to perform spatially resolved high-resolution X-ray spectroscopy. Con-X will directly measure the physical properties of material near black holes' last stable orbits and the absolute element abundances and velocities of hot gas in clusters of galaxies. The Con-X mission will be described, as well as its successor, Generation-X (anticipated to fly approximately one decade after Con-X). After describing these missions and their driving science areas, the talk will focus on areas in which Chandra observing programs may enable science with future X-ray observatories. These areas include a possible ultra-deep Chandra imaging survey as an early-Universe pathfinder, a large program to spatially resolve the hot intracluster medium of massive clusters to aid dark energy measurements, and possible deep spectroscopic observations to aid in preparatory theoretical atomic physics work needed for interpreting Con-X spectra.

  15. Cosmic voids and void lensing in the Dark Energy Survey science verification data

    DOE PAGES

    Sánchez, C.; Clampitt, J.; Kovacs, A.; ...

    2016-10-26

    Galaxies and their dark matter halos populate a complicated filamentary network around large, nearly empty regions known as cosmic voids. Cosmic voids are usually identified in spectroscopic galaxy surveys, where 3D information about the large-scale structure of the Universe is available. Although an increasing amount of photometric data is being produced, its potential for void studies is limited since photometric redshifts induce line-of-sight position errors of ~50 Mpc/h or more that can render many voids undetectable. In this paper we present a new void finder designed for photometric surveys, validate it using simulations, and apply it to the high-quality photo-z redMaGiC galaxy sample of the Dark Energy Survey Science Verification (DES-SV) data. The algorithm works by projecting galaxies into 2D slices and finding voids in the smoothed 2D galaxy density field of the slice. Fixing the line-of-sight size of the slices to be at least twice the photo-z scatter, the number of voids found in these projected slices of simulated spectroscopic and photometric galaxy catalogs is within 20% for all transverse void sizes, and indistinguishable for the largest voids of radius ~70 Mpc/h and larger. The positions, radii, and projected galaxy profiles of photometric voids also accurately match the spectroscopic void sample. Applying the algorithm to the DES-SV data in the redshift range 0.2 < z < 0.8, we identify 87 voids with comoving radii spanning the range 18-120 Mpc/h, and carry out a stacked weak lensing measurement. With a significance of 4.4σ, the lensing measurement confirms that the voids are truly underdense in the matter field and hence not a product of Poisson noise, tracer density effects or systematics in the data. It also demonstrates, for the first time in real data, the viability of void lensing studies in photometric surveys.
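
    A minimal sketch of the projected void-finding idea (one 2D slice, smoothed density field, underdense pixels as void-centre candidates) is given below; the grid size, smoothing scale, and threshold are placeholder assumptions, and the real algorithm's void-radius growth and significance cuts are omitted:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def find_void_centres(x, y, box=200.0, nbins=128, smooth_pix=2.0, thresh=-0.3):
        # Project the slice onto a 2D grid, smooth, and flag underdense pixels
        H, xe, ye = np.histogram2d(x, y, bins=nbins, range=[[0, box], [0, box]])
        density = gaussian_filter(H, smooth_pix)
        delta = density / density.mean() - 1.0      # density contrast
        ix, iy = np.where(delta < thresh)           # void-centre candidates
        xc = 0.5 * (xe[ix] + xe[ix + 1])
        yc = 0.5 * (ye[iy] + ye[iy + 1])
        return np.column_stack([xc, yc])

    rng = np.random.default_rng(0)
    pts = rng.uniform(0.0, 200.0, size=(20000, 2))  # one projected slice
    print(len(find_void_centres(pts[:, 0], pts[:, 1])))
    ```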

  16. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements: (a...

  17. Self-verification and contextualized self-views.

    PubMed

    Chen, Serena; English, Tammy; Peng, Kaiping

    2006-07-01

    Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views - views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.

  18. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  19. The NETL MFiX Suite of multiphase flow models: A brief review and recent applications of MFiX-TFM to fossil energy technologies

    DOE PAGES

    Li, Tingwen; Rogers, William A.; Syamlal, Madhava; ...

    2016-07-29

    Here, the MFiX suite of multiphase computational fluid dynamics (CFD) codes is being developed at the U.S. Department of Energy's National Energy Technology Laboratory (NETL). It includes several different approaches to multiphase simulation: MFiX-TFM, a two-fluid (Eulerian-Eulerian) model; MFiX-DEM, an Eulerian fluid model with a Lagrangian discrete element model for the solids phase; and MFiX-PIC, an Eulerian fluid model with Lagrangian particle 'parcels' representing particle groups. These models are undergoing continuous development and application, with verification, validation, and uncertainty quantification (VV&UQ) as integrated activities. After a brief summary of recent progress in VV&UQ, this article highlights two recent accomplishments in the application of MFiX-TFM to fossil energy technology development. First, recent application of MFiX to the pilot-scale KBR TRIG™ Transport Gasifier located at DOE's National Carbon Capture Center (NCCC) is described. Gasifier performance over a range of operating conditions was modeled and compared to NCCC operational data to validate the ability of the model to predict parametric behavior. Second, comparison of code predictions at a detailed fundamental scale is presented, studying solid sorbents for the post-combustion capture of CO2 from flue gas. Specifically designed NETL experiments are being used to validate hydrodynamics and chemical kinetics for the sorbent-based carbon capture process.

  20. Combining Dark Energy Survey Science Verification data with near-infrared data from the ESO VISTA Hemisphere Survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banerji, M.; Jouvel, S.; Lin, H.

    2014-11-25

    We present the combination of optical data from the Science Verification phase of the Dark Energy Survey (DES) with near-infrared (NIR) data from the European Southern Observatory VISTA Hemisphere Survey (VHS). The deep optical detections from DES are used to extract fluxes and associated errors from the shallower VHS data. Joint seven-band (grizYJK) photometric catalogues are produced in a single 3 sq-deg dedicated camera field centred at 02h26m-04d36m where the availability of ancillary multiwavelength photometry and spectroscopy allows us to test the data quality. Dual photometry increases the number of DES galaxies with measured VHS fluxes by a factor of ~4.5 relative to a simple catalogue-level matching and results in a ~1.5 mag increase in the 80 per cent completeness limit of the NIR data. Almost 70 per cent of DES sources have useful NIR flux measurements in this initial catalogue. Photometric redshifts are estimated for a subset of galaxies with spectroscopic redshifts, and initial results, although currently limited by small number statistics, indicate that the VHS data can help reduce the photometric redshift scatter at both z < 0.5 and z > 1. We present example DES+VHS colour selection criteria for high-redshift luminous red galaxies (LRGs) at z ~ 0.7 as well as luminous quasars. Using spectroscopic observations in this field we show that the additional VHS fluxes enable a cleaner selection of both populations with <10 per cent contamination from galactic stars in the case of spectroscopically confirmed quasars and <0.5 per cent contamination from galactic stars in the case of spectroscopically confirmed LRGs. The combined DES+VHS data set, which will eventually cover almost 5000 sq-deg, will therefore enable a range of new science and be ideally suited for target selection for future wide-field spectroscopic surveys.

  2. Phase correction for ALMA. Investigating water vapour radiometer scaling: The long-baseline science verification data case study

    NASA Astrophysics Data System (ADS)

    Maud, L. T.; Tilanus, R. P. J.; van Kempen, T. A.; Hogerheijde, M. R.; Schmalzl, M.; Yoon, I.; Contreras, Y.; Toribio, M. C.; Asaki, Y.; Dent, W. R. F.; Fomalont, E.; Matsushita, S.

    2017-09-01

    The Atacama Large Millimetre/submillimetre Array (ALMA) makes use of water vapour radiometers (WVR), which monitor the atmospheric water vapour line at 183 GHz along the line of sight above each antenna, to correct for phase delays introduced by the wet component of the troposphere. The application of WVR-derived phase corrections improves the image quality and facilitates successful observations in weather conditions that were classically marginal or poor. We present work indicating that a scaling factor applied to the WVR solutions can act to further improve the phase stability and image quality of ALMA data. We find reduced phase noise statistics for 62 out of 75 datasets from the long-baseline science verification campaign after a WVR scaling factor is applied. The improvement in phase noise translates to an expected coherence improvement in 39 datasets. When imaging the bandpass source, we find that 33 of the 39 datasets show an improvement in the signal-to-noise ratio (S/N) of between a few and 30 percent. There are 23 datasets where the S/N of the science image is improved: 6 by <1%, 11 between 1 and 5%, and 6 above 5%. The higher frequencies studied (band 6 and band 7) are those most improved, specifically datasets with low precipitable water vapour (PWV), <1 mm, where the dominance of the wet component is reduced. Although these improvements are not profound, phase stability improvements via the WVR scaling factor come into play for higher-frequency (>450 GHz) and long-baseline (>5 km) observations. These inherently have poorer phase stability and are taken in low-PWV (<1 mm) conditions, for which we find the scaling to be most effective. A promising explanation for the scaling factor is the mixing of dry and wet air components, although other origins are discussed. We have produced a python code to allow ALMA users to undertake WVR scaling tests and make improvements to their data.
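
    The scaling idea reduces to applying phi_corrected = phi_raw - s * phi_wvr and choosing the factor s that minimises the residual phase scatter on a calibrator. Below is a brute-force sketch under that assumption; the array names and synthetic numbers are ours (the authors distribute their own python tool for the real procedure):

    ```python
    import numpy as np

    def best_wvr_scale(phase_raw, phase_wvr, scales=np.linspace(0.8, 1.3, 51)):
        # Residual phase RMS after applying a scaled WVR correction, per scale
        rms = np.array([np.std(phase_raw - s * phase_wvr) for s in scales])
        return scales[np.argmin(rms)], rms.min()

    rng = np.random.default_rng(1)
    wvr = rng.normal(0.0, 30.0, 1000)              # WVR-predicted phase (deg)
    raw = 1.05 * wvr + rng.normal(0.0, 5.0, 1000)  # synthetic under-corrected data
    scale, resid = best_wvr_scale(raw, wvr)
    print(f"best scale {scale:.2f}, residual rms {resid:.1f} deg")
    ```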

  3. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false What is the Platform Verification Program? 250... Platforms and Structures Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...

  4. Grazing Incidence Wavefront Sensing and Verification of X-Ray Optics Performance

    NASA Technical Reports Server (NTRS)

    Saha, Timo T.; Rohrbach, Scott; Zhang, William W.

    2011-01-01

    Evaluation of interferometrically measured mirror metrology data and characterization of a telescope wavefront can be powerful tools in understanding the image characteristics of an x-ray optical system. In the development of the soft x-ray telescope for the International X-ray Observatory (IXO), we have developed new approaches to support the telescope development process. Interferometric measurement of the optical components over all relevant spatial frequencies can be used to evaluate and predict the performance of an x-ray telescope. Typically, the mirrors are measured using a mount that minimizes mount- and gravity-induced errors. In the assembly and mounting process the shape of the mirror segments can change dramatically. We have developed wavefront sensing techniques suitable for x-ray optical components to aid us in the characterization and evaluation of these changes. Hartmann sensing of a telescope and its components is a simple method that can be used to evaluate low-order mirror surface errors and alignment errors. Phase retrieval techniques can also be used to assess and estimate the low-order axial errors of the primary and secondary mirror segments. In this paper we describe the mathematical foundation of our Hartmann and phase retrieval sensing techniques. We show how these techniques can be used in the evaluation and performance prediction process of x-ray telescopes.
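
    The Hartmann principle referred to above is compact: the local wavefront slope equals the measured spot displacement divided by the plate-to-detector distance, and integrating the slopes recovers the low-order wavefront. Below is a 1D toy version with an assumed geometry and a pure defocus term, not the paper's formulation:

    ```python
    import numpy as np

    L = 500.0                          # plate-to-detector distance in mm (assumed)
    x = np.linspace(-50.0, 50.0, 21)   # Hartmann hole positions across the pupil (mm)
    dx = x[1] - x[0]
    true_wf = 1e-4 * x**2              # a pure defocus wavefront term (mm)

    # What an ideal sensor measures: spot displacement = local slope * distance L
    spot_shift = np.gradient(true_wf, x) * L

    # Reconstruction: slopes from displacements, then trapezoid-rule integration
    slopes = spot_shift / L
    recovered = np.concatenate(([0.0], np.cumsum(0.5 * (slopes[1:] + slopes[:-1])) * dx))
    recovered -= recovered.mean()      # the piston term is unconstrained

    residual = recovered - (true_wf - true_wf.mean())
    print(f"max reconstruction residual: {np.max(np.abs(residual)):.2e} mm")
    ```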

  5. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification... 30 Mineral Resources 2 2010-07-01 2010-07-01 false When must I resubmit Platform Verification...

  6. Built-in-Test Verification Techniques

    DTIC Science & Technology

    1987-02-01

    This report documents the results of the effort for the Rome Air Development Center Contract F30602-84-C-0021, BIT Verification Techniques. The work was...Richard Spillman of Spillman Research Associates. The principal investigators were Mike Partridge and subsequently Jeffrey Albert. The contract was...two-year effort to develop techniques for Built-In Test (BIT) verification. The objective of the contract was to develop specifications and technical

  7. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file.

    PubMed

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-21

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study were to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal imaging device (EPID) images and a log file, and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of a fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were ⩽3 mm and ±3 mm, respectively, in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file, and showed that this treatment is performed with high accuracy in clinical cases.
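
    A minimal sketch of the verification metrics is given below, assuming the log file yields fiducial-marker positions (mm) per fraction in the SI, AP, and LR directions; taking intra-fractional variation as the spread within a fraction and inter-fractional variation as the offset of each fraction's mean is our simplification of the paper's procedure:

    ```python
    import numpy as np

    def gating_stats(positions_by_fraction):
        # Spread of marker positions within each fraction (intra-fractional)
        # and offsets of the per-fraction means (inter-fractional), per axis
        intra = np.array([np.ptp(p, axis=0) for p in positions_by_fraction])
        means = np.array([p.mean(axis=0) for p in positions_by_fraction])
        inter = means - means.mean(axis=0)
        return intra.max(axis=0), np.abs(inter).max(axis=0)

    rng = np.random.default_rng(2)
    # Four fractions of 120 samples in the SI, AP, LR directions (mm, synthetic)
    fractions = [rng.normal(0.0, 0.8, size=(120, 3)) for _ in range(4)]
    intra_max, inter_max = gating_stats(fractions)
    print("intra (mm):", intra_max, "inter (mm):", inter_max)
    ```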

  8. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What is the Platform Verification Program? 250... Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms; platforms of a new or unique design...

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, LEAD IN DUST WIPE MEASUREMENT TECHNOLOGY, NITON LLC, X-RAY FLUORESCENCE SPECTRUM ANALYZER, XLT-700

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies and to design efficient processes for conducting performance tests of innovative techn...

  10. Verification Challenges at Low Numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at 100's of warheads, and then 10's of warheads, before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, 100's, and 10's of warheads. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  11. Model Based Verification of Cyber Range Event Environments

    DTIC Science & Technology

    2015-12-10

    Model Based Verification of Cyber Range Event Environments. Suresh K. Damodaran, MIT Lincoln Laboratory, 244 Wood St., Lexington, MA, USA...apply model-based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment...Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error

  12. Environmental dependence of the galaxy stellar mass function in the Dark Energy Survey Science Verification Data

    DOE PAGES

    Etherington, J.; Thomas, D.; Maraston, C.; ...

    2016-01-04

    Measurements of the galaxy stellar mass function are crucial to understand the formation of galaxies in the Universe. In a hierarchical clustering paradigm it is plausible that there is a connection between the properties of galaxies and their environments. Evidence for environmental trends has been established in the local Universe. The Dark Energy Survey (DES) provides large photometric datasets that enable further investigation of the assembly of mass. In this study we use ~3.2 million galaxies from the South Pole Telescope East (SPT-East) field in the DES science verification (SV) dataset. From grizY photometry we derive galaxy stellar masses and absolute magnitudes, and determine the errors on these properties using Monte Carlo simulations based on the full photometric redshift probability distributions. We compute galaxy environments using a fixed conical aperture for a range of scales. We construct galaxy environment probability distribution functions and investigate the dependence of the environment errors on the aperture parameters. We compute the environment components of the galaxy stellar mass function for the redshift range 0.15 < z < 1.05. For z < 0.75 we find that the fraction of massive galaxies is larger in high-density environments than in low-density environments. We show that the low-density and high-density components converge with increasing redshift up to z ~ 1.0, where the shapes of the mass function components are indistinguishable. Our study thus shows how high-density structures build up around massive galaxies through cosmic time.
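
    The fixed-aperture environment measure can be sketched as counting neighbours inside a cylinder (a stand-in for the cone used in the paper) of transverse radius r_t and line-of-sight half-depth dz; the coordinates, units, and parameter values below are illustrative assumptions only:

    ```python
    import numpy as np

    def local_density(xy, z, r_t=2.0, dz=0.1):
        counts = np.empty(len(z))
        for i in range(len(z)):
            sep = np.hypot(xy[:, 0] - xy[i, 0], xy[:, 1] - xy[i, 1])
            inside = (sep < r_t) & (np.abs(z - z[i]) < dz)
            counts[i] = inside.sum() - 1        # exclude the galaxy itself
        return counts / (np.pi * r_t**2)        # surface density in the aperture

    rng = np.random.default_rng(3)
    xy = rng.uniform(0.0, 100.0, size=(5000, 2))   # transverse positions (toy)
    z = rng.uniform(0.15, 1.05, size=5000)         # photometric redshifts (toy)
    print(local_density(xy, z)[:5])
    ```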

  13. X-ray verification of an optically-aligned off-plane grating module

    NASA Astrophysics Data System (ADS)

    Donovan, Benjamin; McEntaffer, Randall; Tutt, James; DeRoo, Casey; Allured, Ryan; Gaskin, Jessica; Kolodziejczak, Jeffery

    2017-08-01

    The next generation of X-ray spectrometer missions are baselined to have order-of-magnitude improvements in both spectral resolving power and effective area when compared to existing X-ray spectrometer missions. Off-plane X-ray reflection gratings are capable of achieving high resolution and high diffraction efficiencies over the entire X-ray bandpass, making them an ideal technology to implement on these future missions. To achieve the high effective area desired while maintaining high spectral resolution, many off-plane gratings must be precisely aligned such that their diffraction arcs overlap at the focal plane. Methods are under development to align a number of these gratings into a grating module using optical metrology techniques in support of the Off-plane Grating Rocket Experiment (OGRE), a suborbital rocket payload scheduled to launch in late 2018. X-ray testing was performed on an aligned grating module at the Straylight Test Facility (SLTF) at NASA Marshall Space Flight Center (MSFC) to assess the current alignment methodology and its ability to meet the desired performance of OGRE. We report on the results from the test campaign at MSFC, as well as plans for future development.

  14. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board - two extremely small, high-density mating parts that require alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair relates to item (2) above and permits verification of electrical contact between mating parts.

  15. Students' Verification Strategies for Combinatorial Problems

    ERIC Educational Resources Information Center

    Mashiach Eizenberg, Michal; Zaslavsky, Orit

    2004-01-01

    We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…

  16. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Innov-X XT400 Series (XT400) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XT400 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XT400 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was as

  17. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described, incorporating verification tests, validation benchmarks, continuous integration, and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of the 2D Euler equations, the 3D Navier-Stokes equations, and turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results for laminar and turbulent flow past a NACA 0012 airfoil and the ONERA M6 wing are validated against experimental and numerical data.
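
    The Method of Manufactured Solutions is easy to demonstrate on a model problem (the sketch below is a generic MMS example, not the LAVA solver): pick u(x) = sin(x), derive the forcing for -u'' = f, solve with a second-order finite-difference scheme, and check that the observed order of accuracy approaches 2 under grid refinement:

    ```python
    import numpy as np

    def solve_poisson(n):
        # Manufactured solution u(x) = sin(x) on [0, pi] with u(0) = u(pi) = 0,
        # so the forcing for -u'' = f is simply f(x) = sin(x).
        x = np.linspace(0.0, np.pi, n)
        h = x[1] - x[0]
        f = np.sin(x)
        m = n - 2                                   # number of interior nodes
        A = (2.0 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / h**2
        u = np.zeros(n)
        u[1:-1] = np.linalg.solve(A, f[1:-1])       # second-order central scheme
        return np.max(np.abs(u - np.sin(x)))        # error vs. exact solution

    errors = [solve_poisson(n) for n in (17, 33, 65)]
    orders = [np.log2(e1 / e2) for e1, e2 in zip(errors, errors[1:])]
    print(orders)                                   # should approach 2.0
    ```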

  18. Cognitive neuroscience in forensic science: understanding and utilizing the human element.

    PubMed

    Dror, Itiel E

    2015-08-05

    The human element plays a critical role in forensic science. It is not limited only to issues relating to forensic decision-making, such as bias, but also relates to most aspects of forensic work (some of which even take place before a crime is ever committed or long after the verification of the forensic conclusion). In this paper, I explicate many aspects of forensic work that involve the human element and therefore show the relevance (and potential contribution) of cognitive neuroscience to forensic science. The 10 aspects covered in this paper are proactive forensic science, selection during recruitment, training, crime scene investigation, forensic decision-making, verification and conflict resolution, reporting, the role of the forensic examiner, presentation in court and judicial decisions. As the forensic community is taking on the challenges introduced by the realization that the human element is critical for forensic work, new opportunities emerge that allow for considerable improvement and enhancement of the forensic science endeavour. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  19. Enhanced Verification Test Suite for Physics Simulation Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, J R; Brock, J S; Brandon, S T

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and the technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) hydrodynamics; (b) transport processes; and (c) dynamic strength of materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of

  20. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  1. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications, such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effects of participant gender and age and of the kin-relation pair of the stimulus are analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is used to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin relationship accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  2. 22 CFR 123.14 - Import certificate/delivery verification procedure.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Import certificate/delivery verification...

  3. 22 CFR 123.14 - Import certificate/delivery verification procedure.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Import certificate/delivery verification...

  4. Redshift distributions of galaxies in the Dark Energy Survey Science Verification shear catalogue and implications for weak lensing

    DOE PAGES

    Bonnett, C.; Troxel, M. A.; Hartley, W.; ...

    2016-08-30

    Here we present photometric redshift estimates for galaxies used in the weak lensing analysis of the Dark Energy Survey Science Verification (DES SV) data. Four model- or machine learning-based photometric redshift methods—annz2, bpz calibrated against BCC-Ufig simulations, skynet, and tpz—are analyzed. For training, calibration, and testing of these methods, we construct a catalogue of spectroscopically confirmed galaxies matched against DES SV data. The performance of the methods is evaluated against the matched spectroscopic catalogue, focusing on metrics relevant for weak lensing analyses, with additional validation against COSMOS photo-z's. From the galaxies in the DES SV shear catalogue, which have mean redshift 0.72±0.01 over the range 0.38 ... of approximately 3%. This shift is within the one sigma statistical errors on σ8 for the DES SV shear catalogue. We further study the potential impact of systematic differences on the critical surface density, Σ_crit, finding levels of bias safely less than the statistical power of the DES SV data. In conclusion, we recommend a final Gaussian prior for the photo-z bias in the mean of n(z) of width 0.05 for each of the three tomographic bins, and show that this is a sufficient bias model for the corresponding cosmology analysis.

  6. Self-verification motives at the collective level of self-definition.

    PubMed

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  7. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long-term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE, and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency, and qualification on flight hardware.

  8. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  9. SpaceX CRS-13 "What's on Board?" Mission Science Briefing

    NASA Image and Video Library

    2017-12-11

    Cheryl Warner of NASA Communications, left, Patrick O'Neill, Marketing and Communications manager at the Center for the Advancement of Science in Space (CASIS), center, and Rebecca Regan of Boeing Communications speak to members of social media in the Kennedy Space Center's Press Site auditorium. The briefing focused on research planned for launch to the International Space Station. The scientific materials and supplies will be aboard a Dragon spacecraft scheduled for liftoff from Cape Canaveral Air Force Station's Space Launch Complex 40 at 11:46 a.m. EST, on Dec. 12, 2017. The SpaceX Falcon 9 rocket will launch the company's 13th Commercial Resupply Services mission to the space station.

  10. PREFACE: Workshop on 'Buried' Interface Science with X-rays and Neutrons

    NASA Astrophysics Data System (ADS)

    Sakurai, Kenji

    2007-06-01

    The 2007 workshop on 'buried' interface science with X-rays and neutrons was held at the Institute of Materials Research, Tohoku University, in Sendai, Japan, on July 22-24, 2007. The workshop was the latest in a series held since 2001: Tsukuba (December 2001), Niigata (September 2002), Nagoya (July 2003), Tsukuba (July 2004), Saitama (March 2005), Yokohama (July 2006), Kusatsu (August 2006) and Tokyo (December 2006). The 2007 workshop had 64 participants and 34 presentations. There is increasing demand for sophisticated metrology to observe multilayered materials with nano-structures (dots, wires, etc.), which are finding applications in electronic, magnetic, optical and other devices. Unlike many other surface-sensitive methods, X-ray and neutron analysis is known for its ability to see even 'buried' functional interfaces as well as the surface. It is highly reliable in practice, because the information, which ranges from the atomic to the mesoscopic scale, is quantitative and reproducible. The non-destructive nature of this type of analytical method ensures that the same specimen can be measured by other techniques. However, we now realize that the method should be upgraded further to cope with more realistic problems in nano science and technology. For the reflectivity technique and other related methods, which have been the main topics of our workshops over the past 7 years, there are three important directions, as illustrated in the figure. Current X-ray methods can give atomic-scale information for quite a large area, on a scale of mm²-cm². These methods deliver good statistics for an average, but sometimes we need to see a specific part at the nano scale rather than an average structure. In addition, there is a need to see unstable, changing structures and related phenomena in order to understand more about the mechanisms by which nano materials function. Quick measurements are therefore important. Furthermore, in order to apply ...

  11. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of the experiment's structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered, with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (stress corrosion cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verification (tests and analyses) will be outlined, especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  12. CASL Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mousseau, Vincent Andrew; Dinh, Nam

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. It will be a living document that tracks CASL's progress on verification and validation, both for the CASL codes (including MPACT, CTF, BISON, MAMBA) and for the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.

  13. The monitoring and verification of nuclear weapons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garwin, Richard L., E-mail: RLG2@us.ibm.com

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION: ADD-ON NOX CONTROLS

    EPA Science Inventory

    The paper discusses the environmental technology verification (ETV) of add-on nitrogen oxide (NOx) controls. Research Triangle Institute (RTI) is EPA's cooperating partner for the Air Pollution Control Technology (APCT) Program, one of a dozen ETV pilot programs. Verification of ...

  15. Verification and quality control of routine hematology analyzers.

    PubMed

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, including precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. To that end, (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
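
    As a concrete illustration of two of the verification items listed above (precision and linearity), the following Python sketch computes a replicate CV and a dilution-series regression. The sample values and the 3% CV limit are hypothetical; as the paper stresses, actual acceptance limits are at the discretion of the laboratory specialist.

    ```python
    import numpy as np

    # Hypothetical replicate measurements of one control sample (e.g. WBC, 10^9/L).
    replicates = np.array([6.1, 6.0, 6.2, 6.1, 5.9, 6.0, 6.1, 6.2, 6.0, 6.1])
    cv_percent = 100 * replicates.std(ddof=1) / replicates.mean()
    verdict = "PASS" if cv_percent <= 3.0 else "FAIL"  # 3% limit is illustrative
    print(f"precision: CV = {cv_percent:.2f}% -> {verdict}")

    # Linearity: measured results for a dilution series spanning the expected range.
    expected = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
    measured = np.array([1.1, 2.1, 3.9, 8.2, 15.7])
    slope, intercept = np.polyfit(expected, measured, 1)
    r = np.corrcoef(expected, measured)[0, 1]
    print(f"linearity: slope = {slope:.3f}, intercept = {intercept:.3f}, r = {r:.4f}")
    ```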

  16. J-2X Abort System Development

    NASA Technical Reports Server (NTRS)

    Santi, Louis M.; Butas, John P.; Aguilar, Robert B.; Sowers, Thomas S.

    2008-01-01

    The J-2X is an expendable liquid hydrogen (LH2)/liquid oxygen (LOX) gas generator cycle rocket engine that is currently being designed as the primary upper stage propulsion element for the new NASA Ares vehicle family. The J-2X engine will contain abort logic that functions as an integral component of the Ares vehicle abort system. This system is responsible for detecting and responding to conditions indicative of impending Loss of Mission (LOM), Loss of Vehicle (LOV), and/or catastrophic Loss of Crew (LOC) failure events. As an earth orbit ascent phase engine, the J-2X is a high power density propulsion element with non-negligible risk of fast propagation rate failures that can quickly lead to LOM, LOV, and/or LOC events. Aggressive reliability requirements for manned Ares missions and the risk of fast-propagating J-2X failures dictate the need for on-engine abort condition monitoring and autonomous response capability, as well as for traditional abort agents not located on the engine, such as the vehicle computer, flight crew, and ground control. This paper describes the baseline J-2X abort subsystem concept of operations, as well as the development process for this subsystem. A strategy that leverages heritage system experience and responds to an evolving engine design as well as J-2X specific test data to support abort system development is described. The utilization of performance and failure simulation models to support abort system sensor selection, failure detectability and discrimination studies, decision threshold definition, and abort system performance verification and validation is outlined. The basis for abort false positive and false negative performance constraints is described. Development challenges associated with information shortfalls in the design cycle, abort condition coverage and response assessment, engine-vehicle interface definition, and abort system performance verification and validation are also discussed.

  17. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of verification run(s): This software must be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison between the new output files and the old output files. Any difference between any of the files will cause a verification error. Because of the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.

  18. X-ray grating interferometer for materials-science imaging at a low-coherent wiggler source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herzen, Julia; Donath, Tilman

    2011-11-15

    X-ray phase-contrast radiography and tomography make it possible to increase contrast for weakly absorbing materials. Recently, x-ray grating interferometers were developed that extend the possibility of phase-contrast imaging from highly brilliant radiation sources, such as third-generation synchrotron sources, to non-coherent conventional x-ray tube sources. Here, we present the first installation of a three-grating x-ray interferometer at a low-coherence wiggler source at beamline W2 (HARWI II), operated by the Helmholtz-Zentrum Geesthacht at the second-generation synchrotron storage ring DORIS (DESY, Hamburg, Germany). Using this type of wiggler insertion device with a millimeter-sized source allows monochromatic phase-contrast imaging of centimeter-sized objects with high photon flux. Thus, biological and materials-science imaging applications can greatly profit from this imaging modality. The specially designed grating interferometer currently works in the photon energy range from 22 to 30 keV, and the range will be extended by using adapted x-ray optical gratings. Our results of an energy-dependent visibility measurement, in comparison to corresponding simulations, demonstrate the performance of the new setup.
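
    The visibility quoted in such measurements is typically extracted from a phase-stepping scan as the ratio of the first Fourier component to the mean intensity. A minimal numpy sketch; the scan values are synthetic, and this is the standard textbook estimator rather than the authors' specific analysis.

    ```python
    import numpy as np

    # Hypothetical phase-stepping scan: intensity in one detector pixel while one
    # grating is stepped over one fringe period (values are illustrative).
    steps = 8
    phi = 2 * np.pi * np.arange(steps) / steps
    intensity = (100 + 25 * np.sin(phi + 0.4)
                 + np.random.default_rng(0).normal(0, 1, steps))

    # Zeroth Fourier component gives the mean (a0); first gives the modulation (a1).
    c = np.fft.rfft(intensity)
    a0 = np.abs(c[0]) / steps
    a1 = 2 * np.abs(c[1]) / steps
    visibility = a1 / a0
    print(f"fringe visibility V = {visibility:.3f}")  # ~0.25 for these numbers
    ```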

  19. The verification of LANDSAT data in the geographical analysis of wetlands in west Tennessee

    NASA Technical Reports Server (NTRS)

    Rehder, J.; Quattrochi, D. A.

    1978-01-01

    The reliability of LANDSAT imagery as a medium for identifying, delimiting, monitoring, measuring, and mapping wetlands in west Tennessee was assessed to verify LANDSAT as an accurate, efficient cartographic tool that could be employed by a wide range of users to study wetland dynamics. The verification procedure was based on the visual interpretation and measurement of multispectral imagery. The accuracy-testing procedure was predicated on surrogate ground truth data gleaned from medium-altitude imagery of the wetlands. Fourteen sites or case study areas were selected from individual 9 x 9 inch photo frames on the aerial photography. These sites were then used as data control calibration parameters for assessing the cartographic accuracy of the LANDSAT imagery. An analysis of the results obtained from the verification tests indicated that 1:250,000 scale LANDSAT data were the most reliable imagery for visually mapping and measuring wetlands using the area grid technique. The mean areal percentage of accuracy was 93.54 percent (real) and 96.93 percent (absolute). As a test of accuracy, the LANDSAT 1:250,000 scale overall wetland measurements were compared with an area cell mensuration of the swamplands from 1:130,000 scale color infrared U-2 aircraft imagery. The comparative totals substantiated the results from the LANDSAT verification procedure.
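
    As a toy illustration of comparing area-grid measurements against surrogate ground truth, the following sketch computes per-site areal accuracy percentages. The site areas are invented, and the report's exact definitions of 'real' versus 'absolute' accuracy may differ from this simple metric.

    ```python
    import numpy as np

    # Hypothetical wetland areas (hectares): surrogate ground truth from aerial
    # photography vs. LANDSAT area-grid measurements for four sites.
    truth = np.array([1200.0, 850.0, 430.0, 2100.0])
    landsat = np.array([1150.0, 910.0, 400.0, 2180.0])

    per_site = 100 * (1 - np.abs(landsat - truth) / truth)
    print(f"per-site accuracy (%): {np.round(per_site, 2)}")
    print(f"mean areal accuracy:   {per_site.mean():.2f}%")
    ```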

  20. On the Formal Verification of Conflict Detection Algorithms

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Butler, Ricky W.; Carreno, Victor A.; Dowek, Gilles

    2001-01-01

    Safety assessment of new air traffic management systems is a central issue for civil aviation authorities. Standard techniques such as testing and simulation have serious limitations in new systems that are significantly more autonomous than the older ones. In this paper, we present an innovative approach, based on formal verification, for establishing the correctness of conflict detection systems. Fundamental to our approach is the concept of a trajectory, which is a continuous path in the x-y plane constrained by physical laws and operational requirements. From the model of trajectories, we extract, and formally prove, high-level properties that can serve as a framework to analyze conflict scenarios. We use the Airborne Information for Lateral Spacing (AILS) alerting algorithm as a case study of our approach.
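
    The trajectory concept described here can be made concrete with a standard kinematic conflict check for constant-velocity trajectories in the x-y plane. This is a generic closest-approach computation, not the AILS algorithm itself; the states and the 5 nmi threshold are illustrative.

    ```python
    import numpy as np

    def min_separation(p1, v1, p2, v2, horizon):
        """Minimum separation of two constant-velocity 2-D trajectories
        over [0, horizon], and the time at which it occurs."""
        dp = np.asarray(p2, float) - np.asarray(p1, float)
        dv = np.asarray(v2, float) - np.asarray(v1, float)
        denom = dv @ dv
        # Time of closest approach, clipped to the look-ahead window.
        t_star = 0.0 if denom == 0 else float(np.clip(-(dp @ dv) / denom, 0.0, horizon))
        return float(np.linalg.norm(dp + t_star * dv)), t_star

    # Hypothetical aircraft states (positions in nmi, velocities in nmi/min).
    d, t = min_separation(p1=(0, 0), v1=(8, 0), p2=(60, 6), v2=(-7, 0), horizon=10)
    print(f"closest approach {d:.2f} nmi at t = {t:.2f} min ->",
          "CONFLICT" if d < 5.0 else "clear")
    ```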

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  2. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  3. Simulation environment based on the Universal Verification Methodology

    NASA Astrophysics Data System (ADS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The CDV flow differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. The testbench then targets those goals by generating legal stimuli and sending them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment, so that non-exercised functionality can be identified. Moreover, additional scoreboards flag undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification cycle.
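
    UVM itself is a SystemVerilog library, but the CDV loop the abstract describes (constrained-random stimulus, a self-checking scoreboard, and functional coverage bins that gate completion) can be sketched language-agnostically. A toy Python version, with an invented 4-bit saturating adder standing in for the DUT:

    ```python
    import random

    # Toy DUT: a 4-bit saturating adder (stands in for an RTL model).
    def dut_add(a, b):
        return min(a + b, 15)

    coverage = {"low": 0, "mid": 0, "high": 0, "saturate": 0}  # functional bins

    def sample_coverage(a, b, y):
        s = a + b
        if y == 15 and s > 15:
            coverage["saturate"] += 1
        elif s < 5:
            coverage["low"] += 1
        elif s < 11:
            coverage["mid"] += 1
        else:
            coverage["high"] += 1

    random.seed(1)
    mismatches = 0
    while min(coverage.values()) < 10:  # run until every bin is well exercised
        a, b = random.randrange(16), random.randrange(16)  # legal random stimulus
        y = dut_add(a, b)
        expected = min(a + b, 15)  # scoreboard / reference model
        if y != expected:
            mismatches += 1  # undesired DUT behaviour would be flagged here
        sample_coverage(a, b, y)

    print("coverage:", coverage, "| mismatches:", mismatches)
    ```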

  4. Scanning transmission x-ray microscope for materials science spectromicroscopy at the ALS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warwick, T.; Seal, S.; Shin, H.

    1997-04-01

    The brightness of the Advanced Light Source will be exploited by several new instruments for materials science spectromicroscopy over the next year or so. The first of these to become operational is a scanning transmission x-ray microscope with which near-edge x-ray absorption spectra (NEXAFS) can be measured on spatial features of sub-micron size. Here the authors describe the instrument as presently implemented, its capabilities, some studies made to date, and the developments to come. The scanning transmission x-ray microscope uses a zone plate lens to produce a small x-ray spot with which to perform absorption spectroscopy through thin samples. The x-ray beam from ALS undulator beamline 7.0 emerges into the microscope vessel through a silicon nitride vacuum window 160 nm thick and 300 μm square. The vessel is filled with helium at atmospheric pressure. The zone plate lens is illuminated 1 mm downstream from the vacuum window and forms a first-order image of a pinhole 3 m upstream in the beamline. An order-sorting aperture passes the first-order converging light and blocks the unfocused zero order. The sample sits at the focus a few mm downstream of the zone plate, mounted on a scanning piezo stage that rasters in x and y, so that an image is formed pixel by pixel by an intensity detector behind the sample. Absorption spectra are measured point by point as the photon energy is scanned by rotating the diffraction grating in the monochromator and changing the undulator gap.

  5. Glove-based approach to online signature verification.

    PubMed

    Kamel, Nidal S; Sayeed, Shohel; Ellis, Grant A

    2008-06-01

    Utilizing the multiple degrees of freedom offered by the data glove for each finger and the hand, a novel on-line signature verification system using the Singular Value Decomposition (SVD) for signature classification and verification is presented. The proposed technique uses the SVD to find the r singular vectors that sense the maximal energy of the glove data matrix A, called the principal subspace, so that the effective dimensionality of A can be reduced. Having modeled the data glove signature through its r-dimensional principal subspace, signature authentication is performed by finding the angles between the different subspaces. A demonstration of the data glove is presented as an effective high-bandwidth data entry device for signature verification. This SVD-based signature verification technique is tested, and its performance is shown to be able to recognize forged signatures with a false acceptance rate of less than 1.2%.
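
    The subspace machinery the abstract describes can be sketched in a few lines of numpy: take the r leading left singular vectors of each glove data matrix and compare subspaces via principal angles. The matrix sizes, noise levels, and decision by largest angle are illustrative assumptions, not the authors' exact classifier.

    ```python
    import numpy as np

    def principal_subspace(A, r):
        """First r left singular vectors of a glove data matrix A."""
        U, _, _ = np.linalg.svd(A, full_matrices=False)
        return U[:, :r]

    def principal_angles(U1, U2):
        """Principal angles (radians) between two orthonormal subspaces."""
        s = np.linalg.svd(U1.T @ U2, compute_uv=False)
        return np.arccos(np.clip(s, -1.0, 1.0))

    rng = np.random.default_rng(0)
    base = rng.normal(size=(22, 200))  # e.g. 22 glove channels x 200 time samples
    genuine = base + 0.05 * rng.normal(size=base.shape)  # repeat by the same signer
    forgery = rng.normal(size=base.shape)                # unrelated hand dynamics

    r = 3
    U_ref = principal_subspace(base, r)
    for name, X in (("genuine", genuine), ("forgery", forgery)):
        theta = principal_angles(U_ref, principal_subspace(X, r))
        print(f"{name}: largest principal angle = {np.degrees(theta.max()):.1f} deg")
    ```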

  6. Cross-correlation of gravitational lensing from DES Science Verification data with SPT and Planck lensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirk, D.; Omori, Y.; Benoit-Lévy, A.

    We measure the cross-correlation between weak lensing of galaxy images and of the cosmic microwave background (CMB). The effects of gravitational lensing on different sources will be correlated if the lensing is caused by the same mass fluctuations. We use galaxy shape measurements from 139 deg² of the Dark Energy Survey (DES) Science Verification data and overlapping CMB lensing from the South Pole Telescope (SPT) and Planck. The DES source galaxies have a median redshift of z_med ~ 0.7, while the CMB lensing kernel is broad and peaks at z ~ 2. The resulting cross-correlation is maximally sensitive to mass fluctuations at z ~ 0.44. Assuming the Planck 2015 best-fitting cosmology, the amplitude of the DES×SPT cross-power is found to be A_SPT = 0.88 ± 0.30 and that from DES×Planck to be A_Planck = 0.86 ± 0.39, where A = 1 corresponds to the theoretical prediction. These are consistent with the expected signal and correspond to significances of 2.9σ and 2.2σ, respectively. We demonstrate that our results are robust to a number of important systematic effects including the shear measurement method, estimator choice, photo-z uncertainty and CMB lensing systematics. We calculate a value of A = 1.08 ± 0.36 for DES×SPT when we correct the observations with a simple intrinsic alignment model. With three measurements of this cross-correlation now existing in the literature, there is not yet reliable evidence for any deviation from the expected ΛCDM level of cross-correlation. We provide forecasts for the expected signal-to-noise ratio of the combination of the five-year DES survey and SPT-3G.
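
    An amplitude A relative to a theory template, with an error bar like those quoted above, is conventionally obtained from an inverse-variance-weighted least-squares fit over band powers. A small numpy sketch with synthetic band powers (not the DES/SPT pipeline or data):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical cross-spectrum band powers: a theory template, per-band
    # Gaussian errors, and "observed" points drawn with a true amplitude of 0.9.
    theory = np.array([1.8, 1.4, 1.0, 0.7, 0.5])  # arbitrary units
    sigma = np.array([0.6, 0.5, 0.4, 0.4, 0.3])
    data = 0.9 * theory + rng.normal(0, sigma)

    w = 1.0 / sigma**2
    A_hat = np.sum(w * data * theory) / np.sum(w * theory**2)  # best-fit amplitude
    A_err = np.sum(w * theory**2) ** -0.5                      # its 1-sigma error
    print(f"A = {A_hat:.2f} +/- {A_err:.2f}  (A = 1 is the theoretical prediction)")
    print(f"detection significance: {A_hat / A_err:.1f} sigma")
    ```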

  7. Cross-correlation of gravitational lensing from DES Science Verification data with SPT and Planck lensing

    NASA Astrophysics Data System (ADS)

    Kirk, D.; Omori, Y.; Benoit-Lévy, A.; Cawthon, R.; Chang, C.; Larsen, P.; Amara, A.; Bacon, D.; Crawford, T. M.; Dodelson, S.; Fosalba, P.; Giannantonio, T.; Holder, G.; Jain, B.; Kacprzak, T.; Lahav, O.; MacCrann, N.; Nicola, A.; Refregier, A.; Sheldon, E.; Story, K. T.; Troxel, M. A.; Vieira, J. D.; Vikram, V.; Zuntz, J.; Abbott, T. M. C.; Abdalla, F. B.; Becker, M. R.; Benson, B. A.; Bernstein, G. M.; Bernstein, R. A.; Bleem, L. E.; Bonnett, C.; Bridle, S. L.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Capozzi, D.; Carlstrom, J. E.; Rosell, A. Carnero; Kind, M. Carrasco; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Dietrich, J. P.; Doel, P.; Eifler, T. F.; Evrard, A. E.; Flaugher, B.; Frieman, J.; Gerdes, D. W.; Goldstein, D. A.; Gruen, D.; Gruendl, R. A.; Honscheid, K.; James, D. J.; Jarvis, M.; Kent, S.; Kuehn, K.; Kuropatkin, N.; Lima, M.; March, M.; Martini, P.; Melchior, P.; Miller, C. J.; Miquel, R.; Nichol, R. C.; Ogando, R.; Plazas, A. A.; Reichardt, C. L.; Roodman, A.; Rozo, E.; Rykoff, E. S.; Sako, M.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Simard, G.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Wechsler, R. H.; Weller, J.

    2016-06-01

    We measure the cross-correlation between weak lensing of galaxy images and of the cosmic microwave background (CMB). The effects of gravitational lensing on different sources will be correlated if the lensing is caused by the same mass fluctuations. We use galaxy shape measurements from 139 deg2 of the Dark Energy Survey (DES) Science Verification data and overlapping CMB lensing from the South Pole Telescope (SPT) and Planck. The DES source galaxies have a median redshift of zmed ˜ 0.7, while the CMB lensing kernel is broad and peaks at z ˜ 2. The resulting cross-correlation is maximally sensitive to mass fluctuations at z ˜ 0.44. Assuming the Planck 2015 best-fitting cosmology, the amplitude of the DES×SPT cross-power is found to be ASPT = 0.88 ± 0.30 and that from DES×Planck to be APlanck = 0.86 ± 0.39, where A = 1 corresponds to the theoretical prediction. These are consistent with the expected signal and correspond to significances of 2.9σ and 2.2σ, respectively. We demonstrate that our results are robust to a number of important systematic effects including the shear measurement method, estimator choice, photo-z uncertainty and CMB lensing systematics. We calculate a value of A = 1.08 ± 0.36 for DES×SPT when we correct the observations with a simple intrinsic alignment model. With three measurements of this cross-correlation now existing in the literature, there is not yet reliable evidence for any deviation from the expected LCDM level of cross-correlation. We provide forecasts for the expected signal-to-noise ratio of the combination of the five-year DES survey and SPT-3G.

  8. Verification of S&D Solutions for Network Communications and Devices

    NASA Astrophysics Data System (ADS)

    Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen

    This chapter describes the tool-supported verification of S&D Solutions at the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted, and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH Verification tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of network and device S&D Patterns, relevant complementary approaches exist and can be used.

  9. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

    Considerations regarding the Space Transportation System (STS) payload environmental verification are reviewed. It is noted that emphasis is placed on testing at the subassembly level and that the basic objective of structural dynamic payload verification is to ensure reliability in a cost-effective manner. Structural analyses consist of: (1) stress analysis for critical loading conditions, (2) modal analysis for launch and orbital configurations, (3) flight loads analysis, (4) test simulation analysis to verify models, (5) kinematic analysis of deployment/retraction sequences, and (6) structural-thermal-optical program analysis. In addition to these approaches, payload verification programs are being developed in the thermal-vacuum area. These include exposure to extreme temperatures, temperature cycling, thermal-balance testing, and thermal-vacuum testing.

  10. Formal Multilevel Hierarchical Verification of Synchronous MOS VLSI Circuits.

    DTIC Science & Technology

    1987-06-01

    ... depend on whether it is performing flat verification or hierarchical verification. The primary operations of Silica Pithecus when performing flat verification ... signals never arise. The primary operation of Silica Pithecus when performing hierarchical verification is processing constraints to show they hold.

  11. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file

    NASA Astrophysics Data System (ADS)

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-01

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal imaging device (EPID) images and a log file, and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using software developed in house. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were ⩽3 mm and ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file, and showed that this treatment is performed with high accuracy in clinical cases. This work was partly presented at the 58th Annual Meeting of the American Association of Physicists in Medicine.
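
    The intra- and inter-fractional variations reported here reduce, in essence, to summary statistics over the logged marker positions. A minimal sketch under an assumed log layout (marker positions per fraction in one direction; the numbers are invented):

    ```python
    import numpy as np

    # Hypothetical fiducial-marker positions (mm) in the SI direction logged
    # during beam-on for three fractions; the log format is an assumption.
    fractions = [
        np.array([0.4, -0.8, 1.1, -1.2, 0.6, 0.2]),
        np.array([1.0, 0.1, 1.6, -0.3, 0.8, 1.2]),
        np.array([-0.5, 0.3, -1.1, 0.9, -0.2, 0.1]),
    ]

    intra = [f.max() - f.min() for f in fractions]  # gating accuracy per fraction
    means = [f.mean() for f in fractions]
    inter = max(means) - min(means)  # set-up variation across fractions
    print("intra-fractional ranges (mm):", np.round(intra, 2))
    print(f"inter-fractional spread of means (mm): {inter:.2f}")
    ```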

  12. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  13. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  14. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  15. Molecfit: A general tool for telluric absorption correction. II. Quantitative evaluation on ESO-VLT/X-Shooter spectra

    NASA Astrophysics Data System (ADS)

    Kausch, W.; Noll, S.; Smette, A.; Kimeswenger, S.; Barden, M.; Szyszka, C.; Jones, A. M.; Sana, H.; Horst, H.; Kerber, F.

    2015-04-01

    Context. Absorption by molecules in the Earth's atmosphere strongly affects ground-based astronomical observations. The resulting absorption line strength and shape depend on the highly variable physical state of the atmosphere, i.e. pressure, temperature, and mixing ratio of the different molecules involved. Usually, supplementary observations of so-called telluric standard stars (TSS) are needed to correct for this effect, which is expensive in terms of telescope time. We have developed the software package molecfit to provide synthetic transmission spectra based on parameters obtained by fitting narrow ranges of the observed spectra of scientific objects. These spectra are calculated by means of the radiative transfer code LBLRTM and an atmospheric model. In this way, the telluric absorption correction for suitable objects can be performed without any additional calibration observations of TSS. Aims: We evaluate the quality of the telluric absorption correction using molecfit with a set of archival ESO-VLT/X-Shooter visible and near-infrared spectra. Methods: Thanks to the wavelength coverage from the U to the K band, X-Shooter is well suited to investigate the quality of the telluric absorption correction with respect to the observing conditions, the instrumental set-up, input parameters of the code, the signal-to-noise of the input spectrum, and the atmospheric profiles. These investigations are based on two figures of merit, Ioff and Ires, that describe the systematic offsets and the remaining small-scale residuals of the corrections. We also compare the quality of the telluric absorption correction achieved with molecfit to the classical method based on a telluric standard star. Results: The evaluation of the telluric correction with molecfit shows a convincing removal of atmospheric absorption features. The comparison with the classical method reveals that molecfit performs better because it is not prone to the bad continuum reconstruction, noise, and

  16. Practical Formal Verification of MPI and Thread Programs

    NASA Astrophysics Data System (ADS)

    Gopalakrishnan, Ganesh; Kirby, Robert M.

    Large-scale simulation codes in science and engineering are written using the Message Passing Interface (MPI). Shared memory threads are widely used directly, or to implement higher level programming abstractions. Traditional debugging methods for MPI or thread programs are incapable of providing useful formal guarantees about coverage. They get bogged down in the sheer number of interleavings (schedules), often missing shallow bugs. In this tutorial we will introduce two practical formal verification tools: ISP (for MPI C programs) and Inspect (for Pthread C programs). Unlike other formal verification tools, ISP and Inspect run directly on user source codes (much like a debugger). They pursue only the relevant set of process interleavings, using our own customized Dynamic Partial Order Reduction algorithms. For a given test harness, DPOR allows these tools to guarantee the absence of deadlocks, instrumented MPI object leaks and communication races (using ISP), and shared memory races (using Inspect). ISP and Inspect have been used to verify large pieces of code: in excess of 10,000 lines of MPI/C for ISP in under 5 seconds, and about 5,000 lines of Pthread/C code in a few hours (and much faster with the use of a cluster or by exploiting special cases such as symmetry) for Inspect. We will also demonstrate the Microsoft Visual Studio and Eclipse Parallel Tools Platform integrations of ISP (these will be available on the LiveCD).

  17. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect location estimation process from attacks, they are not enough to eliminate the wrong location estimations in some situations. The location verification can be the solution to the situations or be the second-line defense. The problem of most of the location verifications is the explicit involvement of many sensors in the verification process and requirements, such as special hardware, a dedicated verifier and the trusted third party, which causes more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSN called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with the other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect the malicious sensors by over 90% when sensors in the network have five or more neighbors.
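
    While the full MSRLV protocol is more involved, its core geometric idea, checking a location claim against the region mutually covered by claimant and verifier and against an independently measured distance (e.g. from ranging), can be sketched as follows. The coordinates, ranges, and tolerance are illustrative assumptions, not parameters from the paper.

    ```python
    import math

    def in_shared_region(claim, claimant_anchor, verifier, sensing_range):
        """The claimed position must fall in the overlap of both sensing disks."""
        return (math.dist(claim, claimant_anchor) <= sensing_range
                and math.dist(claim, verifier) <= sensing_range)

    def range_consistent(claim, verifier, measured_dist, tol=1.0):
        """Accept only if the claim agrees with the verifier's own distance
        measurement within a tolerance."""
        return abs(math.dist(claim, verifier) - measured_dist) <= tol

    claim, verifier, anchor = (3.0, 4.0), (0.0, 0.0), (6.0, 4.0)
    ok = (in_shared_region(claim, anchor, verifier, sensing_range=10.0)
          and range_consistent(claim, verifier, measured_dist=5.2))
    print("location claim", "ACCEPTED" if ok else "REJECTED")
    ```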

  18. An Efficient Location Verification Scheme for Static Wireless Sensor Networks

    PubMed Central

    Kim, In-hwan; Kim, Bo-sung; Song, JooSeok

    2017-01-01

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect location estimation process from attacks, they are not enough to eliminate the wrong location estimations in some situations. The location verification can be the solution to the situations or be the second-line defense. The problem of most of the location verifications is the explicit involvement of many sensors in the verification process and requirements, such as special hardware, a dedicated verifier and the trusted third party, which causes more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSN called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with the other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect the malicious sensors by over 90% when sensors in the network have five or more neighbors. PMID:28125007

  19. 47 CFR 2.952 - Limitation on verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Limitation on verification. 2.952 Section 2.952 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL... person shall, in any advertising matter, brochure, etc., use or make reference to a verification in a...

  20. Handbook: Design of automated redundancy verification

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1971-01-01

    The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.

  1. SpaceX CRS-13 "What's on Board?" Mission Science Briefing

    NASA Image and Video Library

    2017-12-11

    Cheryl Warner of NASA Communications, left, Kirt Costello, deputy chief scientist for the International Space Station Program at NASA's Johnson Space Center in Houston, center, and Patrick O'Neill, Marketing and Communications manager at the Center for the Advancement of Science in Space (CASIS), speak to members of social media in the Kennedy Space Center's Press Site auditorium. The briefing focused on research planned for launch to the International Space Station. The scientific materials and supplies will be aboard a Dragon spacecraft scheduled for liftoff from Cape Canaveral Air Force Station's Space Launch Complex 40 at 11:46 a.m. EST, on Dec. 12, 2017. The SpaceX Falcon 9 rocket will launch the company's 13th Commercial Resupply Services mission to the space station.

  2. ASCA Observation of the Dipping X-Ray Source X1916-053

    NASA Technical Reports Server (NTRS)

    Ko, Yuan-Kuen; Makai, Koji; Smale, Alan P.; White, Nick E.

    1997-01-01

    We present the results of timing and spectral studies of the dipping X-ray source X1916-053, observed by ASCA during its Performance Verification phase. The detected dipping activity is consistent with previous observations, with a period of 3008 s and an intermittent secondary dip observed roughly 0.4 out of phase with the primary dip. The energy spectra of different intensity states are fitted with a power law with partial-covering-fraction absorption and interstellar absorption. The increase in the hardness ratio during the primary and secondary dips, and the increase in the covering fraction and column density with decreasing X-ray intensity, all imply that the dipping is caused by photo-absorbing material that has been suggested to lie where the accretion flow hits the outer edge of the disk. The spectra at all intensity levels show no apparent evidence for Fe or Ne emission lines. This may be due to a low metal abundance in the accretion flow. Alternatively, the X-ray luminosity of the central source may be too weak to excite emission lines, which are assumed to be produced by X-ray photoionization of the disk material.

  3. Science Goals for an All-sky Viewing Observatory in X-rays

    NASA Astrophysics Data System (ADS)

    Remillard, R. A.; Levine, A. M.; Morgan, E. H.; Bradt, H. V.

    2003-03-01

    We describe a concept for a NASA SMEX Mission that will provide a comprehensive investigation of cosmic explosions. These range from the short flashes at cosmological distances in Gamma-ray bursts, to the moments of relativistic mass ejections in Galactic microquasars, to the panorama of outbursts used to identify the stellar-scale black holes in our Galaxy. With an equatorial launch, an array of 31 cameras can cover 97% of the sky with an average exposure efficiency of 65%. Coded mask cameras with Xe detectors (1.5-12 keV) are chosen for their ability to distinguish thermal and non-thermal processes, while providing high throughput and msec time resolution to capture the detailed evolution of bright events. This mission, with 1' position accuracy, would provide a long-term solution to the critical needs for monitoring services for Chandra and GLAST, with possible overlap into the time frame for Constellation-X. The sky coverage would create additional science opportunities beyond the X-ray missions: "eyes" for LIGO and partnerships for time-variability with LOFAR and dedicated programs at optical observatories. Compared to the RXTE ASM, AVOX offers improvements by a factor of 40 in instantaneous sky coverage and a factor of 10 in sensitivity to faint X-ray sources (i.e. to 0.8 mCrab at 3 sigma in 1 day).

  4. The politics of verification and the control of nuclear tests, 1945-1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallagher, N.W.

    1990-01-01

    This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. A clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.

  5. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications, and Ada code, and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.

  6. Approaches to environmental verification of STS free-flier and pallet payloads

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1982-01-01

    This paper presents an overview of the environmental verification programs followed on an STS-launched free-flier payload, using the Tracking and Data Relay Satellite (TDRS) as an example, and a pallet payload, using the Office of Space Sciences-1 (OSS-1) as an example. Differences are assessed and rationale given as to why the differing programs were used on the two example payloads. It is concluded that the differences between the programs are due to inherent differences in the payload configuration, their respective mission performance objectives and their operational scenarios rather than to any generic distinctions that differentiate between a free-flier and a pallet payload.

  7. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  8. Current status of verification practices in clinical biochemistry in Spain.

    PubMed

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of each verification criterion varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.

  9. Fingerprint changes and verification failure among patients with hand dermatitis.

    PubMed

    Lee, Chew Kek; Chang, Choong Chor; Johar, Asmah; Puwira, Othman; Roshidah, Baba

    2013-03-01

    To determine the prevalence of fingerprint verification failure and to define and quantify the fingerprint changes associated with fingerprint verification failure. Case-control study. Referral public dermatology center. The study included 100 consecutive patients with clinical hand dermatitis involving the palmar distal phalanx of either thumb and 100 age-, sex-, and ethnicity-matched controls. Patients with an altered thumb print due to other causes and palmar hyperhidrosis were excluded. Main outcome measures: fingerprint verification (pass/fail) and hand eczema severity index score. Twenty-seven percent of patients failed fingerprint verification compared with 2% of controls. Fingerprint verification failure was associated with a higher hand eczema severity index score (P < .001). The main fingerprint abnormalities were fingerprint dystrophy (42.0%) and abnormal white lines (79.5%). The number of abnormal white lines was significantly higher among the patients with hand dermatitis compared with controls (P = .001). Among the patients with hand dermatitis, the odds ratio for failing fingerprint verification in the presence of fingerprint dystrophy was 4.01. The presence of broad lines and long lines was associated with greater odds of fingerprint verification failure (odds ratio [OR], 8.04; 95% CI, 3.56-18.17 and OR, 2.37; 95% CI, 1.31-4.27, respectively), while the presence of thin lines was protective against verification failure (OR, 0.45; 95% CI, 0.23-0.89). Fingerprint verification failure is a significant problem among patients with more severe hand dermatitis. It is mainly due to fingerprint dystrophy and abnormal white lines. Malaysian National Medical Research Register identifier: NMRR-11-30-8226.
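
    Odds ratios and 95% confidence intervals of the kind reported above are computed from a 2x2 table via the standard error of the log odds ratio. A short sketch with hypothetical counts (not the study's data):

    ```python
    import math

    # Hypothetical 2x2 table (counts are illustrative only):
    #                     verification fail   verification pass
    # abnormal lines             a = 20             b = 30
    # no abnormal lines          c = 7              d = 43
    a, b, c, d = 20, 30, 7, 43

    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    ci_lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
    ci_hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
    print(f"OR = {odds_ratio:.2f}, 95% CI {ci_lo:.2f}-{ci_hi:.2f}")
    ```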

  10. Identity Verification, Control, and Aggression in Marriage

    ERIC Educational Resources Information Center

    Stets, Jan E.; Burke, Peter J.

    2005-01-01

    In this research we study the identity verification process and its effects in marriage. Drawing on identity control theory, we hypothesize that a lack of verification in the spouse identity (1) threatens stable self-meanings and interaction patterns between spouses, and (2) challenges a (nonverified) spouse's perception of control over the…

  11. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  12. AGN Science with STROBE-X

    NASA Astrophysics Data System (ADS)

    Ballantyne, David; Balokovic, Mislav; Garcia, Javier; Koss, Michael; STROBE-X

    2018-01-01

    The probe concept STROBE-X, with its combination of large collecting area, wide-field monitor, broad bandpass, and rapid timing capability, is a powerful tool for studying many aspects of AGN astrophysics. This unique combination of features opens up the possibility of studying AGNs in ways that current and other future missions cannot. Here, we show a few of the novel investigations made possible by STROBE-X: probing the structure of the BLR and torus with reverberation of the narrow Fe Kα line and line-of-sight column density, tracking changes in coronal parameters, investigating the origin of the soft excess, Fe Kα emission line surveys, and efficient Compton-thick characterization. Additional ideas and suggestions are always welcome and can be communicated to any member of the STROBE-X team.

  13. Science with Constellation-X

    NASA Technical Reports Server (NTRS)

    Hornschemeier, Ann (Editor); Garcia, Michael (Editor)

    2005-01-01

    NASA's upcoming Constellation-X mission, one of two flagship missions in the Beyond Einstein program, will have more than 100 times the collecting area of any previous spectroscopic mission operating in the 0.25-40 keV bandpass and will enable high-throughput, high-spectral-resolution studies of sources ranging from the most luminous accreting supermassive black holes in the Universe to the disks around young stars where planets form. This booklet, assembled during early 2005 from the contributions of a large team of astrophysicists, outlines the important scientific questions for the decade following this one and describes the areas where Constellation-X is going to have a major impact. These areas include the exploration of the space-time geometry of black holes spanning nine orders of magnitude in mass, and the nature of the dark energy and dark matter which govern the expansion and ultimate fate of the Universe. Constellation-X will also explore processes referred to as "cosmic feedback", whereby mechanical energy, radiation, and chemical elements from star formation and black holes are returned to the interstellar and intergalactic medium, profoundly affecting the development of structure in the Universe. It will also probe all the important life cycles of matter, from stellar and planetary birth to stellar death via supernova to stellar endpoints in the form of accreting binaries and supernova remnants.

  14. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  15. INF verification: a guide for the perplexed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendelsohn, J.

    1987-09-01

    The administration has dug itself some deep holes on the verification issue. It will have to conclude an arms control treaty without having resolved earlier (but highly questionable) compliance issues on which it has placed great emphasis. It will probably have to abandon its more sweeping (and unnecessary) on-site inspection (OSI) proposals because of adverse security and political implications for the United States and its allies. And, finally, it will probably have to present to the Congress an INF treaty that provides for a considerably less stringent (but nonetheless adequate) verification regime than it had originally demanded. It is difficult to dispel the impression that, when the likelihood of concluding an INF treaty seemed remote, the administration indulged its penchant for intrusive and non-negotiable verification measures. As the possibility of, and eagerness for, a treaty increased, and as the Soviet Union shifted its policy from one of resistance to OSI to one of indicating that on-site verification involved reciprocal obligations, the administration was forced to scale back its OSI rhetoric. This re-evaluation of OSI by the administration does not make the INF treaty any less verifiable; from the outset the Reagan administration was asking for a far more extensive verification package than was necessary, practicable, acceptable, or negotiable.

  16. Reward system and temporal pole contributions to affective evaluation during a first person shooter video game.

    PubMed

    Mathiak, Krystyna A; Klasen, Martin; Weber, René; Ackermann, Hermann; Shergill, Sukhwinder S; Mathiak, Klaus

    2011-07-12

    Violent content in video games evokes many concerns, but there is little research on its rewarding aspects. Playing a video game has been shown to lead to striatal dopamine release. It is unclear, however, which aspects of the game cause this reward system activation and whether violent content contributes to it. We combined functional magnetic resonance imaging (fMRI) with individual affect measures to address the neuronal correlates of violence in a video game. Thirteen male German volunteers played a first-person shooter game (Tactical Ops: Assault on Terror) during fMRI measurement. We defined success as eliminating opponents and failure as being eliminated oneself. Affect was measured directly before and after game play using the Positive and Negative Affect Schedule (PANAS). Failure and success events both evoked increased activity in visual cortex, but only failure decreased activity in orbitofrontal cortex and caudate nucleus. A negative correlation between negative affect and responses to failure was evident in the right temporal pole (rTP). The deactivation of the caudate nucleus during failure is in accordance with its role in reward-prediction error: it occurred whenever the subject missed an expected reward (being eliminated rather than eliminating the opponent). We found no indication that violent events were directly rewarding for the players. To study the reward system, we examined subjective evaluations of affect change due to gameplay. Subjects reporting greater negative affect after playing the game had less rTP activity associated with failure. The rTP may therefore be involved in evaluating failure events in a social context, to regulate the player's mood.
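
    In reward-prediction-error terms (the standard formulation; this equation is not taken from the paper), the striatal signal tracks δ_t = r_t − E[r_t]. An expected reward that fails to arrive, here being eliminated rather than eliminating the opponent, gives r_t < E[r_t] and hence δ_t < 0, consistent with the observed caudate deactivation.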

  17. Radiation effects on science instruments in Grand Tour type missions

    NASA Technical Reports Server (NTRS)

    Parker, R. H.

    1972-01-01

    The extent of the radiation effects problem is delineated, along with the status of protective designs for 15 representative science instruments. Designs for protecting science instruments from radiation damage are discussed for the various instruments to be employed in Grand Tour type missions. A literature search was undertaken to collect damage/interference data on sensitive science-instrument components such as Si detectors and vidicon tubes. A small experimental effort is underway to provide verification of the radiation effects predictions.

  18. Validation of a Dumbbell Body Sway Test in Olympic Air Pistol Shooting

    PubMed Central

    Mon, Daniel; Zakynthinaki, Maria S.; Cordente, Carlos A.; Monroy Antón, Antonio; López Jiménez, David

    2014-01-01

    We present and validate a test able to provide reliable body sway measurements in air pistol shooting without the use of a gun. Forty-six senior male pistol shooters who had competed in Spanish air pistol championships took part in the study. Body sway data from two static bipodal balance tests were compared: in the first test, shooting was simulated with a dumbbell, while in the second the shooter's own pistol was used. Both tests were performed the day before the competition, during the official training time and at the training stands, to simulate competition conditions. Each participant's performance was determined as the total score of 60 shots at competition. Apart from the commonly used variables describing movements of the shooter's centre of pressure (COP), such as COP displacements on the X and Y axes, maximum and average COP velocities, and total COP area, the analysis also included variables describing the axes of the COP ellipse (length and angle with respect to X). A strong, statistically significant correlation between the two tests was found (with intraclass correlations varying between 0.59 and 0.92). A statistically significant inverse linear correlation was also found between performance and COP movements. The study concludes that dumbbell tests are valid for measuring body sway as a simulation of pistol shooting. PMID:24756067
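
    The COP-ellipse variables used above (axis lengths and the angle of the major axis with respect to X) are conventionally obtained from an eigen-decomposition of the sway covariance. A minimal sketch in Python, assuming a 95% chi-square scaling and synthetic data; none of the names or numbers come from the paper:

        import numpy as np

        def cop_ellipse(x, y):
            """95% confidence ellipse of a centre-of-pressure (COP) trace.

            x, y : 1-D arrays of COP coordinates (mm) on the X and Y axes.
            Returns (major axis length, minor axis length, angle of the
            major axis with respect to X, in degrees).
            """
            cov = np.cov(np.vstack([x, y]))         # 2x2 sway covariance
            eigvals, eigvecs = np.linalg.eigh(cov)  # ascending eigenvalues
            semi = np.sqrt(5.991 * eigvals)         # chi-square, 2 dof, 95%
            angle = np.degrees(np.arctan2(eigvecs[1, 1], eigvecs[0, 1]))
            return 2.0 * semi[1], 2.0 * semi[0], angle

        rng = np.random.default_rng(0)
        x = rng.normal(0.0, 2.0, 3000)   # mediolateral sway, mm
        y = rng.normal(0.0, 3.5, 3000)   # anteroposterior sway, mm
        print(cop_ellipse(x, y))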

  19. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Control Functions), MOD-027-1 (Verification of Models and Data for Turbine/Governor and Load Control or...), MOD-027-1 (Verification of Models and Data for Turbine/Governor and Load Control or Active Power... Category B and C contingencies, as required by wind generators in Order No. 661, or that those generators...

  20. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicators, HUD verification..., HUD verification methods and ratings. This section states the performance indicators that are used to assess PHA Section 8 management. HUD will use the verification method identified for each indicator in...

  1. Enhanced dynamic wedge and independent monitor unit verification.

    PubMed

    Howlett, S J

    2005-03-01

    Some serious radiation accidents have occurred around the world during the delivery of radiotherapy treatment. The regrettable incident in Panama clearly indicated the need for independent monitor unit (MU) verification: the International Atomic Energy Agency (IAEA), after investigating the incident, made specific recommendations for radiotherapy centres that included an independent monitor unit check for all treatments. Independent MU verification is practiced in many radiotherapy centres in developed countries; it is mandatory in the USA but not yet in Australia. This paper describes the development of an independent MU program, concentrating on the implementation of the Enhanced Dynamic Wedge (EDW) component. The difficult case of non-centre-of-field (COF) calculation points under the EDW was studied in some detail. Results of a survey of Australasian centres regarding the use of independent MU check systems are also presented. The system was developed with reference to MU calculations made by the Pinnacle 3D Radiotherapy Treatment Planning (RTP) system (ADAC - Philips) for the 4, 6, and 18 MV X-ray beams used clinically at the Newcastle Mater Misericordiae Hospital (NMMH). A small systematic error was detected in the equation used for the EDW calculations. Results indicate that COF equations may be used in the non-COF situation with accuracy similar to that achieved with profile-corrected methods. Further collaborative work with other centres is planned to extend these findings.
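
    For context, an independent MU check of this kind recomputes the dose per MU from tabulated beam data and compares the resulting MU with the TPS value. A minimal sketch; the factor names and numbers are illustrative, not the NMMH beam model or the paper's EDW formalism:

        def independent_mu(dose_cgy, output_cgy_per_mu, output_factor,
                           wedge_factor, tpr, inverse_square):
            """Recompute monitor units from first principles.

            dose_cgy          : prescribed dose at the calculation point (cGy)
            output_cgy_per_mu : calibrated output at the reference point (cGy/MU)
            output_factor     : collimator and phantom scatter for the field size
            wedge_factor      : EDW factor at the point (off-axis points need a
                                profile or centre-of-field correction)
            tpr               : tissue-phantom ratio for depth and field size
            inverse_square    : (reference distance / actual distance) ** 2
            """
            dose_per_mu = (output_cgy_per_mu * output_factor * wedge_factor *
                           tpr * inverse_square)
            return dose_cgy / dose_per_mu

        tps_mu = 170.0   # hypothetical TPS value for comparison
        check_mu = independent_mu(100.0, 1.0, 0.982, 0.78, 0.761, 1.0)
        print(f"deviation: {100.0 * (check_mu - tps_mu) / tps_mu:+.1f}%")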

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM FOR MONITORING AND CHARACTERIZATION

    EPA Science Inventory

    The Environmental Technology Verification Program is a service of the Environmental Protection Agency designed to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of performance. The goal of ETV i...

  3. Geometric Verification of Dynamic Wave Arc Delivery With the Vero System Using Orthogonal X-ray Fluoroscopic Imaging.

    PubMed

    Burghelea, Manuela; Verellen, Dirk; Poels, Kenneth; Gevaert, Thierry; Depuydt, Tom; Tournel, Koen; Hung, Cecilia; Simon, Viorica; Hiraoka, Masahiro; de Ridder, Mark

    2015-07-15

    The purpose of this study was to define an independent verification method, based on on-board orthogonal fluoroscopy, to determine the geometric accuracy of synchronized gantry/ring (G/R) rotations during dynamic wave arc (DWA) delivery on the Vero system. The method calculates G/R positional information from ball-bearing positions retrieved from fluoroscopic images of a cubic phantom acquired during DWA delivery. Different noncoplanar trajectories were generated to investigate the influence of path complexity on delivery accuracy. The G/R positions detected from the fluoroscopy images (DetPositions) were benchmarked against the G/R angulations retrieved from the control points (CPs) of the DWA RT plan and against the DWA log files recorded by the treatment console during delivery (LogActed). The G/R rotational accuracy was quantified as the mean absolute deviation ± standard deviation. The maximum G/R absolute deviation was calculated as the maximum 3-dimensional distance between a CP and the closest DetPosition. In the CP versus DetPositions comparison, an overall mean G/R deviation of 0.13°/0.16° ± 0.16°/0.16° was obtained, with a maximum G/R deviation of 0.6°/0.2°. For the LogActed versus DetPositions evaluation, the overall mean deviation was 0.08°/0.15° ± 0.10°/0.10°, with a maximum of 0.3°/0.4°. The largest decoupled deviations registered for gantry and ring were 0.6° and 0.4°, respectively. No directional dependence was observed between clockwise and counterclockwise rotations. Doubling the dose resulted in twice the number of detected points around each CP and a reduction of the angular deviation in all cases. An independent geometric quality assurance approach was developed for DWA delivery verification and was successfully applied to diverse trajectories. Results showed that the Vero system is capable of following complex G/R trajectories with maximum deviations during DWA
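
    A sketch of the CP-versus-DetPositions bookkeeping described above: for each planned control point, take the closest detected (gantry, ring) sample and accumulate the deviations. The array shapes, names, and the use of a combined angular distance are assumptions for illustration, not the authors' code:

        import numpy as np

        def gr_deviations(planned, detected):
            """planned, detected: (N, 2) arrays of (gantry, ring) angles in deg.
            Returns per-axis mean absolute deviation, per-axis SD, and the
            maximum combined angular distance to the closest detected sample."""
            closest = []
            for cp in planned:
                d = detected - cp                  # offsets to every sample
                dist = np.hypot(d[:, 0], d[:, 1])  # combined G/R distance
                closest.append(d[np.argmin(dist)])
            dev = np.abs(np.array(closest))
            return dev.mean(axis=0), dev.std(axis=0), np.hypot(*dev.T).max()

        planned = np.array([[0.0, 0.0], [45.0, 10.0], [90.0, 20.0]])
        detected = planned + np.random.default_rng(1).normal(0.0, 0.15, (3, 2))
        print(gr_deviations(planned, detected))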

  4. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interactions of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process for these models is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-4-bit evaluation, and plots of tests where differences occur.
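
    A bit-for-bit evaluation reduces to checking that two model outputs are byte-identical; any difference flags the run for closer numerical comparison. A minimal sketch of the idea (not LIVV's actual implementation):

        import hashlib

        def bit4bit(path_a, path_b, chunk=1 << 20):
            """True iff the two output files are byte-identical."""
            digests = []
            for path in (path_a, path_b):
                h = hashlib.sha256()
                with open(path, "rb") as fh:
                    while block := fh.read(chunk):
                        h.update(block)
                digests.append(h.digest())
            return digests[0] == digests[1]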

  5. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications and difficult or costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge, where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification, and it opens many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
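
    The winning MIT strategy referred to above paid the finder of a balloon a fixed sum and then paid each inviter up the referral chain a geometrically decreasing share. A sketch of that payment structure; the amounts match the publicly reported ones, but the function and agent names are ours:

        def recursive_split(reward, chain, fraction=0.5):
            """Pay the reporter, then each inviter up the chain, halving the
            share at every step. chain runs from the reporter to the root."""
            payments, share = {}, reward * fraction
            for agent in chain:
                payments[agent] = share
                share *= fraction
            return payments

        print(recursive_split(4000.0, ["finder", "inviter", "grand-inviter"]))
        # {'finder': 2000.0, 'inviter': 1000.0, 'grand-inviter': 500.0}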

  6. Status of BOUT fluid turbulence code: improvements and verification

    NASA Astrophysics Data System (ADS)

    Umansky, M. V.; Lodestro, L. L.; Xu, X. Q.

    2006-10-01

    BOUT is an electromagnetic fluid turbulence code for tokamak edge plasma [1]. BOUT performs time integration of reduced Braginskii plasma fluid equations, using spatial discretization in realistic geometry and the standard ODE integration package PVODE. BOUT has been applied to several tokamak experiments, and in some cases the calculated spectra of turbulent fluctuations compared favorably to experimental data. On the other hand, the desire to understand the code results better and to gain more confidence in them motivated investing effort in rigorous verification of BOUT. In parallel with the testing, the code underwent substantial modification, mainly to improve its readability and the tractability of physical terms, with some algorithmic improvements as well. In the verification process, a series of linear and nonlinear test problems was applied to BOUT, targeting different subgroups of physical terms. The tests include reproducing basic electrostatic and electromagnetic plasma modes in simplified geometry, axisymmetric benchmarks against the 2D edge code UEDGE in real divertor geometry, and neutral-fluid benchmarks against the hydrodynamic code LCPFCT. After completion of the testing, the new version of the code is being applied to actual tokamak edge turbulence problems, and the results will be presented. [1] X. Q. Xu et al., Contr. Plas. Phys., 36, 158 (1998). *Work performed for USDOE by Univ. Calif. LLNL under contract W-7405-ENG-48.

  7. 7 CFR 272.8 - State income and eligibility verification system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false State income and eligibility verification system. 272... PARTICIPATING STATE AGENCIES § 272.8 State income and eligibility verification system. (a) General. (1) State agencies may maintain and use an income and eligibility verification system (IEVS), as specified in this...

  8. Guidelines for qualifying cleaning and verification materials

    NASA Technical Reports Server (NTRS)

    Webb, D.

    1995-01-01

    This document is intended to provide guidance in identifying the technical issues that must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. The information presented herein is intended to facilitate development of a definitive checklist that addresses all pertinent materials issues when down-selecting cleaning/verification media.

  9. 21 CFR 21.44 - Verification of identity.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  10. 21 CFR 21.44 - Verification of identity.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  11. 21 CFR 21.44 - Verification of identity.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 1 2014-04-01 2014-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  12. 21 CFR 21.44 - Verification of identity.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  13. Lu₁₋ₓI₃:Ceₓ - a scintillator for gamma-ray spectroscopy and time-of-flight PET

    DOEpatents

    Shah, Kanai S [Newton, MA

    2008-02-12

    The present invention comprises very fast scintillator materials based on lutetium iodide doped with cerium (Lu₁₋ₓI₃:Ceₓ; LuI₃:Ce). The LuI₃:Ce scintillator material has surprisingly good characteristics, including high light output, high gamma-ray stopping efficiency, fast response, low cost, good proportionality, and minimal afterglow, making it useful for gamma-ray spectroscopy, medical imaging, nuclear and high-energy physics research, diffraction, non-destructive testing, nuclear treaty verification and safeguards, and geological exploration.

  14. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  15. X-38 Ship #2 in Free Flight

    NASA Image and Video Library

    1999-07-09

    The X-38, a research vehicle built to help develop technology for an emergency Crew Return Vehicle (CRV), descends under its steerable parachute during a July 1999 test flight at the Dryden Flight Research Center, Edwards, California. It was the fourth free flight of the test vehicles in the X-38 program, and the second free flight test of Vehicle 132 or Ship 2. The goal of this flight was to release the vehicle from a higher altitude -- 31,500 feet -- and to fly the vehicle longer -- 31 seconds -- than any previous X-38 vehicle had yet flown. The project team also conducted aerodynamic verification maneuvers and checked improvements made to the drogue parachute.

  16. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, they also make these systems unpredictable to human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of the behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  17. Formal Verification for a Next-Generation Space Shuttle

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2002-01-01

    This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.

  18. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication... 30 Mineral Resources 2 2011-07-01 2011-07-01 false When must I resubmit Platform Verification...

  19. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

    A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. To enhance the capabilities, effectiveness, and ease of use of the test environment, software verification tools can be applied. The tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility that analyzes a representative testbed of DFCS software. Future work will focus on increasing the number of software test tools and on a cost-effectiveness assessment.

  20. A methodology for the rigorous verification of plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, a mathematical issue aimed at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, targeted at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
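
    In its simplest form, code verification with manufactured solutions reduces to estimating the observed order of accuracy from errors measured against the manufactured solution on successively refined grids. A minimal sketch under the usual assumption of a constant refinement ratio:

        import math

        def observed_order(err_h, err_h_over_r, r=2.0):
            """Observed order of accuracy from discretization errors on two
            grids with spacings h and h/r, errors measured against the
            manufactured (exact) solution."""
            return math.log(err_h / err_h_over_r) / math.log(r)

        # A second-order scheme: halving h cuts the error by four.
        print(observed_order(4.0e-3, 1.0e-3))   # -> 2.0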

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION TEST PROTOCOL, GENERAL VENTILATION FILTERS

    EPA Science Inventory

    The Environmental Technology Verification Test Protocol, General Ventilation Filters provides guidance for verification tests.

    Reference is made in the protocol to the ASHRAE 52.2P "Method of Testing General Ventilation Air-cleaning Devices for Removal Efficiency by P...

  2. SU-E-T-435: Development and Commissioning of a Complete System for In-Vivo Dosimetry and Range Verification in Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samuel, D; Testa, M; Park, Y

    Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which, for example, could be the use of anterior fields for prostate treatment instead of the opposed lateral fields used in current practice. We have developed and commissioned an integrated system of hardware, software, and workflow protocols that provides a complete solution for simultaneous in-vivo dosimetry and range verification in proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, separable into three groups for flexibility in application. A special amplifier was developed to capture the extremely small signals produced at very low proton beam current. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system for in-vivo dosimetry and for range verification was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam's distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLDs and EBT films and were within 3% of treatment planning calculations. For range verification, a precision of 0.5 mm was achieved in homogeneous phantoms, and a precision of 2 mm in the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials with patients.
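
    The ground-truth range measurement mentioned above comes down to locating a reference point on the distal fall-off of the measured depth curve. A sketch of extracting the commonly used 80% distal point (R80) by linear interpolation; this is an illustration, not the authors' algorithm:

        import numpy as np

        def r80(depth_mm, signal):
            """Depth at which the distal fall-off crosses 80% of the maximum,
            assuming the scan extends past the fall-off."""
            s = signal / signal.max()
            i = np.where(s >= 0.8)[0][-1]        # last sample still above 80%
            d0, d1, s0, s1 = depth_mm[i], depth_mm[i + 1], s[i], s[i + 1]
            return d0 + (0.8 - s0) * (d1 - d0) / (s1 - s0)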

  3. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamashita, M; Kokubo, M; Institute of Biomedical Research and Innovation, Kobe, Hyogo

    2016-06-15

    Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been available for a few years. Its treatment planning system (TPS) is dedicated, so measurement has been the only method of dose verification; there have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs, using a modified Clarkson-based algorithm, was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT, and X-ray Voxel Monte Carlo for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which a CT-based dose calculation is performed using a modified Clarkson-based algorithm. In this study, 120 patients' treatment plans were collected in our institute. The treatments comprised conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, mean ± 2 SD, %) were compared to those from general-purpose linacs. Results: The CLs for conventional irradiation (lung, prostate), SBRT (lung), and IMRT (prostate) were 2.2 ± 3.5% (CL for the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%), and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: The independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by Japan Agency for Medical
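
    The confidence-limit statistic used here is simply the mean ± 2 SD of the per-plan percentage deviation between the TPS dose and the independent recalculation. A minimal sketch (array names and numbers assumed for illustration):

        import numpy as np

        def confidence_limits(tps_dose, independent_dose):
            """Mean and 2 SD (%) of the deviation between the independent
            recalculation and the TPS, over a set of plans."""
            dev = 100.0 * (independent_dose - tps_dose) / tps_dose
            return dev.mean(), 2.0 * dev.std(ddof=1)

        tps = np.array([200.0, 180.0, 210.0, 195.0])
        smu = np.array([204.1, 181.2, 215.3, 196.9])
        mean, two_sd = confidence_limits(tps, smu)
        print(f"CL = {mean:.1f} +/- {two_sd:.1f} %")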

  4. Automated System Calibration and Verification of the Position Measurements for the Los Alamos Isotope Production Facility and the Switchyard Kicker Facilities

    NASA Astrophysics Data System (ADS)

    Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.

    2004-11-01

    The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determine the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, verifications are periodically run by a PC hosting a National Instruments LabVIEW virtual instrument (VI) to verify continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day, with the beam explicitly turned off. One of the new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system are the topic of this paper.
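
    The log-ratio technique itself is a one-line calculation: near the axis, the beam position is proportional to the logarithm of the ratio of the signals from two opposing pickup electrodes. A sketch, with the sensitivity constant k assumed to come from calibration:

        import math

        def log_ratio_position(v_a, v_b, k):
            """Beam position from opposing electrode signals:
            x = k * log(V_A / V_B), approximately linear in x near the axis."""
            return k * math.log(v_a / v_b)

        print(log_ratio_position(1.05, 0.95, 10.0))   # -> ~1.0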

  5. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Verification

    NASA Technical Reports Server (NTRS)

    Hanson, John M.; Beard, Bernard B.

    2010-01-01

    This paper is focused on applying Monte Carlo simulation to probabilistic launch vehicle design and requirements verification. The approaches developed in this paper can be applied to other complex design efforts as well. Typically the verification must show that requirement "x" is met for at least "y" % of cases, with, say, 10% consumer risk or 90% confidence. Two particular aspects of making these runs for requirements verification will be explored in this paper. First, there are several types of uncertainties that should be handled in different ways, depending on when they become known (or not). The paper describes how to handle different types of uncertainties and how to develop vehicle models that can be used to examine their characteristics. This includes items that are not known exactly during the design phase but that will be known for each assembled vehicle (can be used to determine the payload capability and overall behavior of that vehicle), other items that become known before or on flight day (can be used for flight day trajectory design and go/no go decision), and items that remain unknown on flight day. Second, this paper explains a method (order statistics) for determining whether certain probabilistic requirements are met or not and enables the user to determine how many Monte Carlo samples are required. Order statistics is not new, but may not be known in general to the GN&C community. The methods also apply to determining the design values of parameters of interest in driving the vehicle design. The paper briefly discusses when it is desirable to fit a distribution to the experimental Monte Carlo results rather than using order statistics.
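
    The order-statistics result the paper relies on has a compact special case: if all n Monte Carlo runs meet the requirement, then the requirement is met for at least a fraction y of cases with confidence 1 − yⁿ. A minimal sketch of the resulting sample-size rule (a standard result, not the paper's derivation):

        import math

        def samples_needed(coverage, confidence):
            """Smallest n such that n successes out of n runs demonstrate the
            requirement holds for `coverage` of cases at `confidence`."""
            return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

        # "met for at least 99% of cases with 90% confidence" (10% consumer risk)
        print(samples_needed(0.99, 0.90))   # -> 230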

  6. 24 CFR 5.659 - Family information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements for...

  7. 24 CFR 5.659 - Family information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements for...

  8. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  9. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  10. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  11. 40 CFR 1066.275 - Daily dynamometer readiness verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.275 Daily... automated process for this verification procedure, perform this evaluation by setting the initial speed and... your dynamometer does not perform this verification with an automated process: (1) With the dynamometer...

  12. Risk and Infrastructure Science Center - Global Security Sciences

    Science.gov Websites

    delivers scientific tools and methodologies to inform decision making regarding the most challenging ...

  13. Software Verification of Orion Cockpit Displays

    NASA Technical Reports Server (NTRS)

    Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee

    2017-01-01

    NASA's latest spacecraft, Orion, is in the development process of taking humans deeper into space. Orion is equipped with three main displays for monitoring and controlling the spacecraft. To ensure that the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center, along with the University of Texas at Tyler, employed a software verification tool, EggPlant Functional by TestPlant. It is an image-based test automation tool that allows users to create scripts to verify the functionality within a program. An edge-key framework and a set of Common EggPlant Functions were developed to enable efficient script creation. This framework standardized the way scripts are coded and user inputs are simulated in the verification process. Moreover, the Common EggPlant Functions can be reused in the verification of different displays.
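
    Image-based display verification of this kind boils down to comparing a captured screen region against a reference image within a tolerance. A generic sketch of that comparison in Python; this is not EggPlant's API, whose scripts are written in its own scripting language:

        import numpy as np
        from PIL import Image

        def display_matches(capture_path, reference_path, tolerance=0):
            """True iff the captured display region matches the reference
            image pixel-for-pixel within a per-channel tolerance."""
            a = np.asarray(Image.open(capture_path).convert("RGB"), dtype=int)
            b = np.asarray(Image.open(reference_path).convert("RGB"), dtype=int)
            return a.shape == b.shape and int(np.abs(a - b).max()) <= tolerance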

  14. 37 CFR 262.7 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR CERTAIN ELIGIBLE...

  15. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  16. 40 CFR 1065.920 - PEMS Calibrations and verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... verification. The verification consists of operating an engine over a duty cycle in the laboratory and... by laboratory equipment as follows: (1) Mount an engine on a dynamometer for laboratory testing...

  17. A framework of multitemplate ensemble for fingerprint verification

    NASA Astrophysics Data System (ADS)

    Yin, Yilong; Ning, Yanbin; Ren, Chunxiao; Liu, Li

    2012-12-01

    How to improve the performance of an automatic fingerprint verification system (AFVS) is a long-standing challenge in the biometric verification field. Recently, it has become popular to improve AFVS performance using ensemble learning to fuse related fingerprint information. In this article, we propose a novel framework for fingerprint verification based on a multitemplate ensemble method. The framework consists of three stages. In the first, enrollment stage, an effective template selection method picks the fingerprints that best represent a finger; a polyhedron is then created from the matching results of the multiple template fingerprints, and a virtual centroid of the polyhedron is defined. In the second, verification stage, we measure the distance between the centroid of the polyhedron and a query image. In the final stage, a fusion rule chooses a proper distance from the distance set. Experimental results on the FVC2004 database demonstrate the effectiveness of the new framework for fingerprint verification. With a minutiae-based matching method, the average EER over the four FVC2004 databases drops from 10.85 to 0.88, and with a ridge-based matching method it decreases from 14.58 to 2.51.
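
    One plausible reading of the enrollment-stage construction: embed each template as its vector of match scores against the other templates, take the mean of these embeddings as the virtual centroid, and verify a query by its distance to that centroid. A sketch under that assumption, not the authors' exact formulation:

        import numpy as np

        def virtual_centroid(score_matrix):
            """score_matrix[i, j]: match score of template i against template j.
            The centroid is the mean template embedding."""
            return score_matrix.mean(axis=0)

        def verify(query_scores, centroid, threshold):
            """Accept iff the query's score vector is close to the centroid."""
            return np.linalg.norm(query_scores - centroid) <= threshold

        templates = np.array([[1.0, 0.8, 0.7],
                              [0.8, 1.0, 0.9],
                              [0.7, 0.9, 1.0]])
        c = virtual_centroid(templates)
        print(verify(np.array([0.8, 0.9, 0.85]), c, threshold=0.2))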

  18. The role of science self-efficacy, science career efficacy, science career interest, and intentions to enroll in nonrequired science courses in the future selection of science-related careers for high school students

    NASA Astrophysics Data System (ADS)

    Ballard, Sherri Patrice

    1998-12-01

    Underrepresentation of non-Asian minority groups and women in science and math related professions has been an area of concern for many years. The purpose of this study was to examine the role of career selection variables for African-American and European-American students on future aspirations of pursuing a science-related career. Other examined variables included gender, academic track and socioeconomic status. A survey was completed by 368 high school students in rural settings in the Southeastern portion of the United States. Gender, race, tracking, and socioeconomic differences in career selection variables and future aspirations of pursuing a science-related career were explored using a 2 x 2 x 2 x 2 MANOVA. Multiple regression was used to examine the predictiveness of career selection variables relative to future career aspirations of pursuing a science-related career. Results indicated that African-Americans reported higher total science career interest, and higher science career efficacy. European-American students reported higher levels of science self-efficacy relative to making a B or better in science courses and solving science-related problems. Also, European-Americans reported higher levels of interest in science-related tasks, a subscale on the science career interest variable. When the effect of gender was examined across the total sample, no differences were found. However, when gender was examined by race, European-American females reported higher levels of science career interest than European-American males. Students from high academic tracking groups reported greater efficacy for completing science-related technical skills. Science career interest was predictive of future career selection for this sample.

  19. Proton Therapy Verification with PET Imaging

    PubMed Central

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton-induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different detection systems (in-beam, in-room, and off-line PET), calculation methods for predicting proton-induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147

  20. Formal verification of an oral messages algorithm for interactive consistency

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
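
    For reference, the Oral Messages algorithm OM(m) itself is short enough to sketch. The fault model below (a traitor simply inverts the value it relays) is a simplification for illustration; the formal analyses discussed above allow traitors to behave arbitrarily:

        from collections import Counter

        DEFAULT = "RETREAT"

        def majority(values):
            """Majority vote with a fixed default on ties."""
            top = Counter(values).most_common(2)
            if len(top) == 2 and top[0][1] == top[1][1]:
                return DEFAULT
            return top[0][0]

        def om(m, value, commander, lieutenants, is_traitor):
            """OM(m): returns {lieutenant: decided value}."""
            flip = {"ATTACK": "RETREAT", "RETREAT": "ATTACK"}
            sent = {l: flip[value] if is_traitor[commander] else value
                    for l in lieutenants}
            if m == 0:
                return sent
            decided = {}
            for l in lieutenants:
                # Each other lieutenant o relays, via OM(m-1), the value it
                # received; lieutenant l then takes a majority vote.
                relayed = [om(m - 1, sent[o], o,
                              [x for x in lieutenants if x != o],
                              is_traitor)[l]
                           for o in lieutenants if o != l]
                decided[l] = majority(relayed + [sent[l]])
            return decided

        # n = 4, m = 1: one traitorous lieutenant; the loyal ones still agree.
        traitor = {"C": False, "L1": True, "L2": False, "L3": False}
        print(om(1, "ATTACK", "C", ["L1", "L2", "L3"], traitor))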