Sample records for prototype LSST CCDs

  1. Delta Doping High Purity CCDs and CMOS for LSST

    NASA Technical Reports Server (NTRS)

    Blacksberg, Jordana; Nikzad, Shouleh; Hoenk, Michael; Elliott, S. Tom; Bebek, Chris; Holland, Steve; Kolbe, Bill

    2006-01-01

    A viewgraph presentation describing delta doping high purity CCDs and CMOS for LSST is shown. The topics include: 1) Overview of JPL's versatile back-surface process for CCDs and CMOS; 2) Application to SNAP and ORION missions; 3) Delta doping as a back-surface electrode for fully depleted LBNL CCDs; 4) Delta doping high purity CCDs for SNAP and ORION; 5) JPL CMP thinning process development; and 6) Antireflection coating process development.

  2. Consequences of CCD imperfections for cosmology determined by weak lensing surveys: from laboratory measurements to cosmological parameter bias

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okura, Yuki; Petri, Andrea; May, Morgan

    Weak gravitational lensing causes subtle changes in the apparent shapes of galaxies due to the bending of light by the gravity of foreground masses. By measuring the shapes of large numbers of galaxies (millions in recent surveys, up to tens of billions in future surveys) we can infer the parameters that determine cosmology. Imperfections in the detectors used to record images of the sky can introduce changes in the apparent shape of galaxies, which in turn can bias the inferred cosmological parameters. In this paper we consider the effect of two widely discussed sensor imperfections: tree-rings, due to impurity gradients which cause transverse electric fields in the Charge-Coupled Devices (CCDs), and pixel-size variation, due to periodic CCD fabrication errors. These imperfections can be observed when the detectors are subject to uniform illumination (flat-field images). We develop methods to determine the spurious shear and convergence (due to the imperfections) from the flat-field images. We calculate how the spurious shear, when added to the lensing shear, will bias the determination of cosmological parameters. We apply our methods to candidate sensors of the Large Synoptic Survey Telescope (LSST) as a timely and important example, analyzing flat-field images recorded with LSST prototype CCDs in the laboratory. We find that the tree-rings and periodic pixel-size variation present in the LSST CCDs will introduce negligible bias to the cosmological parameters determined from the lensing power spectrum, specifically w, Ω_m and σ_8.
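
    As a rough illustration of the kind of flat-field analysis this abstract describes (not the authors' actual shear or convergence estimator), the sketch below azimuthally averages the fractional flat-field deviation around an assumed tree-ring center to recover a ring amplitude profile; the image, center coordinates, and ripple parameters are synthetic placeholders.

    ```python
    # Hedged sketch: azimuthally averaged "tree-ring" profile from a flat-field image.
    # The ring-center coordinates and the image below are synthetic placeholders,
    # not values from the paper.
    import numpy as np

    def tree_ring_profile(flat, center, bin_width=2.0):
        """Fractional flat-field deviation vs. radius (pixels) from the tree-ring center."""
        ny, nx = flat.shape
        y, x = np.mgrid[0:ny, 0:nx]
        r = np.hypot(x - center[0], y - center[1])
        frac = flat / np.median(flat) - 1.0                 # fractional intensity variation
        bins = np.arange(0.0, r.max() + bin_width, bin_width)
        idx = np.digitize(r.ravel(), bins)                  # radial bin index per pixel
        sums = np.bincount(idx, weights=frac.ravel(), minlength=len(bins) + 1)
        counts = np.bincount(idx, minlength=len(bins) + 1)
        profile = sums[1:len(bins)] / np.maximum(counts[1:len(bins)], 1)
        return 0.5 * (bins[1:] + bins[:-1]), profile

    # Toy flat field with a weak circular ripple mimicking tree rings.
    rng = np.random.default_rng(0)
    ny, nx, center = 512, 512, (-100.0, -150.0)             # assumed ring center off the sensor edge
    y, x = np.mgrid[0:ny, 0:nx]
    r = np.hypot(x - center[0], y - center[1])
    flat = 10000.0 * (1.0 + 0.002 * np.sin(2 * np.pi * r / 40.0)) + rng.normal(0, 30, (ny, nx))
    radii, profile = tree_ring_profile(flat, center)
    print("peak-to-peak ring amplitude ~ %.4f" % (profile.max() - profile.min()))
    ```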

  3. Consequences of CCD imperfections for cosmology determined by weak lensing surveys: from laboratory measurements to cosmological parameter bias

    DOE PAGES

    Okura, Yuki; Petri, Andrea; May, Morgan; ...

    2016-06-27

    Weak gravitational lensing causes subtle changes in the apparent shapes of galaxies due to the bending of light by the gravity of foreground masses. By measuring the shapes of large numbers of galaxies (millions in recent surveys, up to tens of billions in future surveys) we can infer the parameters that determine cosmology. Imperfections in the detectors used to record images of the sky can introduce changes in the apparent shape of galaxies, which in turn can bias the inferred cosmological parameters. In this paper we consider the effect of two widely discussed sensor imperfections: tree-rings, due to impurity gradients which cause transverse electric fields in the Charge-Coupled Devices (CCDs), and pixel-size variation, due to periodic CCD fabrication errors. These imperfections can be observed when the detectors are subject to uniform illumination (flat-field images). We develop methods to determine the spurious shear and convergence (due to the imperfections) from the flat-field images. We calculate how the spurious shear, when added to the lensing shear, will bias the determination of cosmological parameters. We apply our methods to candidate sensors of the Large Synoptic Survey Telescope (LSST) as a timely and important example, analyzing flat-field images recorded with LSST prototype CCDs in the laboratory. We find that the tree-rings and periodic pixel-size variation present in the LSST CCDs will introduce negligible bias to the cosmological parameters determined from the lensing power spectrum, specifically w, Ω_m and σ_8.

  4. Characterising CCDs with cosmic rays

    DOE PAGES

    Fisher-Levine, M.; Nomerotski, A.

    2015-08-06

    The properties of cosmic-ray muons make them a useful probe for measuring the properties of thick, fully depleted CCD sensors. The known energy deposition per unit length allows measurement of the gain of the sensor's amplifiers, whilst the straightness of the tracks allows a crude assessment of the static lateral electric fields at the sensor's edges. The small volume in which the muons deposit their energy allows measurement of the contribution to the PSF from the diffusion of charge as it drifts across the sensor. In this work we present a validation of the cosmic-ray gain measurement technique by comparing with radioisotope gain measurements, and calculate the charge diffusion coefficient for prototype LSST sensors.
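
    A minimal sketch of the gain estimate implied by the "known energy deposition per unit length": converting a muon track's summed signal and path length into electrons per ADU. The per-micron ionization constant and the thickness are commonly quoted or assumed values, not numbers from the paper.

    ```python
    # Hedged sketch of a cosmic-ray gain estimate, in the spirit of the technique above.
    # Constants are assumptions (commonly quoted textbook values), not the paper's calibration.
    EH_PAIRS_PER_UM = 80.0      # assumed most-probable e-h pairs per micron for a MIP in silicon
    THICKNESS_UM = 100.0        # assumed sensor thickness for a thick, fully depleted CCD

    def gain_from_muon_track(total_signal_adu, path_length_um):
        """Return gain in electrons per ADU for one reconstructed muon track.

        total_signal_adu : summed pixel signal along the track (ADU)
        path_length_um   : geometric path length of the muon through the silicon (microns)
        """
        expected_electrons = EH_PAIRS_PER_UM * path_length_um
        return expected_electrons / total_signal_adu

    # Example: a near-vertical muon crossing a 100-um-thick sensor and depositing 5300 ADU.
    print("gain ~ %.2f e-/ADU" % gain_from_muon_track(5300.0, THICKNESS_UM))
    ```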

  5. On determination of charge transfer efficiency of thick, fully depleted CCDs with 55 Fe x-rays

    DOE PAGES

    Yates, D.; Kotov, I.; Nomerotski, A.

    2017-07-01

    Charge transfer efficiency (CTE) is one of the most important CCD characteristics. Our paper examines ways to optimize the algorithms used to analyze 55Fe x-ray data on the CCDs, and explores new types of observables for CTE determination that can be used for testing LSST CCDs. Furthermore, the observables are modeled using simple Monte Carlo simulations to determine how charge diffusion in thick, fully depleted silicon affects the measurement. The data are compared to the simulations for one of the observables, the integral flux of the x-ray hit.
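
    For context, the classical 55Fe CTE estimate fits the Mn Kα amplitude against the number of transfers, Q(n) = Q0 · CTE^n; the sketch below illustrates that standard approach on synthetic hits (the new observables discussed in the paper go beyond this).

    ```python
    # Hedged sketch: parallel CTE from 55Fe x-ray hits, illustrating the standard
    # "Kalpha amplitude vs. number of transfers" fit rather than the paper's new observables.
    import numpy as np

    def cte_from_xray_hits(rows, amplitudes_e):
        """Fit Q(row) = Q0 * CTE**row to single-pixel Mn Kalpha amplitudes (electrons)."""
        rows = np.asarray(rows, dtype=float)
        amps = np.asarray(amplitudes_e, dtype=float)
        # Linear fit in log space: log Q = log Q0 + row * log CTE
        slope, intercept = np.polyfit(rows, np.log(amps), 1)
        return np.exp(slope), np.exp(intercept)     # (CTE per transfer, Q0)

    # Toy data: Q0 ~ 1620 e- (5.9 keV / 3.65 eV) degraded by an assumed CTI of 1e-5 per transfer.
    rng = np.random.default_rng(1)
    rows = rng.integers(0, 4000, 2000)
    true_cte = 1.0 - 1e-5
    amps = 1620.0 * true_cte**rows + rng.normal(0.0, 8.0, rows.size)   # plus read noise
    cte, q0 = cte_from_xray_hits(rows, amps)
    print("CTE = %.7f (CTI = %.2e), Q0 = %.0f e-" % (cte, 1.0 - cte, q0))
    ```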

  6. New characterization techniques for LSST sensors

    DOE PAGES

    Nomerotski, A.

    2015-06-18

    Fully depleted, thick CCDs with extended infrared response have become the sensor of choice for modern sky surveys. Charge transport effects in the silicon, and the associated astrometric distortions, could make the mapping between sky coordinates and sensor coordinates non-trivial and limit the ultimate precision achievable with these sensors. Two new characterization techniques for the CCDs, both of which could probe these issues, are discussed: x-ray flat fielding and imaging of pinhole arrays.

  7. A study of astrometric distortions due to “tree rings” in CCD sensors using LSST Photon Simulator

    DOE PAGES

    Beamer, Benjamin; Nomerotski, Andrei; Tsybychev, Dmitri

    2015-05-22

    Imperfections in the production process of thick CCDs lead to circularly symmetric dopant concentration variations, which in turn produce electric fields transverse to the surface of the fully depleted CCD that displace the photogenerated charges. We use PhoSim, a Monte Carlo photon simulator, to explore and examine the likely impacts these dopant concentration variations will have on astrometric measurements in LSST. The scale and behavior of both the astrometric shifts imparted to point sources and the intensity variations in flat-field images that result from these doping imperfections are similar to those previously observed in Dark Energy Camera CCDs, giving initial confirmation of PhoSim's model for these effects. In addition, organized shape distortions were observed as a result of the symmetric nature of these dopant variations: nominally round sources acquire a measurable ellipticity aligned either with or transverse to the radial direction of the dopant variation pattern.

  8. An automated system to measure the quantum efficiency of CCDs for astronomy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coles, R.; Chiang, J.; Cinabro, D.

    We describe a system to measure the Quantum Efficiency, in the wavelength range of 300 nm to 1100 nm, of 40 × 40 mm n-channel CCD sensors for the construction of the 3.2 gigapixel LSST focal plane. The technique uses a series of instruments to create a very uniform flux of photons of controllable intensity, in the wavelength range of interest, across the face of the sensor. This allows the absolute Quantum Efficiency to be measured with an accuracy in the 1% range. This system will be part of a production facility at Brookhaven National Lab for the basic component of the LSST camera.
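
    The absolute QE arithmetic implied here is electrons collected per incident photon, with the photon flux taken from a calibrated reference photodiode; the sketch below shows that calculation with illustrative numbers, not values from the described system.

    ```python
    # Hedged sketch: absolute quantum efficiency as electrons collected per incident photon.
    # All numbers here are illustrative assumptions, not measurements from the system described.
    H = 6.626e-34   # Planck constant, J s
    C = 2.998e8     # speed of light, m/s

    def quantum_efficiency(mean_signal_adu, gain_e_per_adu, exposure_s,
                           diode_power_w_per_m2, wavelength_m, pixel_area_m2):
        """QE = (photoelectrons per pixel per second) / (photons per pixel per second)."""
        electrons_per_s = mean_signal_adu * gain_e_per_adu / exposure_s
        photon_energy = H * C / wavelength_m
        photons_per_s = diode_power_w_per_m2 * pixel_area_m2 / photon_energy
        return electrons_per_s / photons_per_s

    # Example at 500 nm with a 10-micron pixel and assumed flux/gain values.
    qe = quantum_efficiency(mean_signal_adu=20000.0, gain_e_per_adu=1.5, exposure_s=10.0,
                            diode_power_w_per_m2=1.6e-5, wavelength_m=500e-9,
                            pixel_area_m2=(10e-6)**2)
    print("QE ~ %.2f" % qe)
    ```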

  9. An automated system to measure the quantum efficiency of CCDs for astronomy

    DOE PAGES

    Coles, R.; Chiang, J.; Cinabro, D.; ...

    2017-04-18

    We describe a system to measure the Quantum Efficiency, in the wavelength range of 300 nm to 1100 nm, of 40 × 40 mm n-channel CCD sensors for the construction of the 3.2 gigapixel LSST focal plane. The technique uses a series of instruments to create a very uniform flux of photons of controllable intensity, in the wavelength range of interest, across the face of the sensor. This allows the absolute Quantum Efficiency to be measured with an accuracy in the 1% range. This system will be part of a production facility at Brookhaven National Lab for the basic component of the LSST camera.

  10. The Large Synoptic Survey Telescope as a Near-Earth Object discovery machine

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne; Slater, Colin T.; Moeyens, Joachim; Allen, Lori; Axelrod, Tim; Cook, Kem; Ivezić, Željko; Jurić, Mario; Myers, Jonathan; Petry, Catherine E.

    2018-03-01

    Using the most recent prototypes, design, and as-built system information, we test and quantify the capability of the Large Synoptic Survey Telescope (LSST) to discover Potentially Hazardous Asteroids (PHAs) and Near-Earth Objects (NEOs). We empirically estimate an expected upper limit to the false detection rate in LSST image differencing, using measurements on DECam data and prototype LSST software, and find it to be about 450 deg⁻². We show that this rate is already tractable with the current prototype of the LSST Moving Object Processing System (MOPS) by processing a 30-day simulation consistent with measured false detection rates. We proceed to evaluate the performance of the LSST baseline survey strategy for PHAs and NEOs using a high-fidelity simulated survey pointing history. We find that LSST alone, using its baseline survey strategy, will detect 66% of the PHA and 61% of the NEO population objects brighter than H = 22, with an uncertainty in the estimate of ±5 percentage points. By generating and examining variations on the baseline survey strategy, we show it is possible to further improve the discovery yields. In particular, we find that extending the LSST survey by two additional years and doubling the MOPS search window increases the completeness for PHAs to 86% (including those discovered by contemporaneous surveys) without jeopardizing other LSST science goals (77% for NEOs). This equates to reducing the undiscovered population of PHAs by an additional 26% (15% for NEOs), relative to the baseline survey.

  11. A Prototype External Event Broker for LSST

    NASA Astrophysics Data System (ADS)

    Elan Alvarez, Gabriella; Stassun, Keivan; Burger, Dan; Siverd, Robert; Cox, Donald

    2015-01-01

    LSST plans to have an alerts system that will automatically identify various types of "events" appearing in the LSST data stream. These events will include supernovae, moving objects, and many other types, and it is expected that there may be millions to tens of millions of events each night. To help the LSST community parse and take full advantage of the LSST alert stream, we are working to design an external "events alert broker" that will generate real-time notifications of LSST events to users and/or robotic telescope facilities based on user-specified criteria. For example, users will be able to specify that they wish to be notified immediately via text message of urgent events, such as GRB counterparts, or notified only occasionally in digest form of less time-sensitive events, such as eclipsing binaries. This poster will summarize results from a survey of scientists on the most important features that such an alert notification service needs to provide, and will present a preliminary design for our external event broker.

  12. Asteroid Discovery and Characterization with the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne; Jurić, Mario; Ivezić, Željko

    2016-01-01

    The Large Synoptic Survey Telescope (LSST) will be a ground-based, optical, all-sky, rapid-cadence survey project with tremendous potential for discovering and characterizing asteroids. With LSST's large 6.5m diameter primary mirror, a wide 9.6 square degree field of view, 3.2 Gigapixel camera, and rapid observational cadence, LSST will discover more than 5 million asteroids over its ten-year survey lifetime. With a single-visit limiting magnitude of 24.5 in r band, LSST will be able to detect asteroids in the Main Belt down to sub-kilometer sizes. The current strawman for the LSST survey strategy is to obtain two visits (each 'visit' being a pair of back-to-back 15s exposures) per field, separated by about 30 minutes, covering the entire visible sky every 3-4 days throughout the observing season, for ten years. The catalogs generated by LSST will increase the known number of small bodies in the Solar System by a factor of 10-100, among all populations. The median number of observations for Main Belt asteroids will be on the order of 200-300, with Near Earth Objects receiving a median of 90 observations. These observations will be spread among the ugrizy bandpasses, providing photometric colors and allowing sparse lightcurve inversion to determine rotation periods, spin axes, and shape information. These catalogs will be created using automated detection software, the LSST Moving Object Processing System (MOPS), that will take advantage of the carefully characterized LSST optical system, cosmetically clean camera, and recent improvements in difference imaging. Tests with the prototype MOPS software indicate that linking detections (and thus 'discovery') will be possible at LSST depths with our working model for the survey strategy, but evaluation of MOPS and improvements in the survey strategy will continue. All data products and software created by LSST will be publicly available.

  13. LSST camera readout chip ASPIC: test tools

    NASA Astrophysics Data System (ADS)

    Antilogus, P.; Bailly, Ph; Jeglot, J.; Juramy, C.; Lebbolo, H.; Martin, D.; Moniez, M.; Tocut, V.; Wicek, F.

    2012-02-01

    The LSST camera will have more than 3000 video-processing channels. The readout of this large focal plane requires a very compact readout chain. The Correlated Double Sampling technique, which is generally used for the signal readout of CCDs, is also adopted for this application and implemented with the so-called "Dual Slope Integrator" method. We have designed and implemented an ASIC for LSST: the Analog Signal Processing asIC (ASPIC). The goal is to amplify the signal close to the output, in order to maximize the signal-to-noise ratio, and to send differential outputs to the digitization stage. Other requirements are that each chip should process the output of half a CCD, that is 8 channels, and should operate at 173 K. A specific back-end board has been designed especially for lab test purposes. It manages the clock signals, digitizes the analog differential outputs of the ASPIC, and stores the data in memory. It contains 8 ADCs (18 bits), 512 kwords of memory and a USB interface. An FPGA manages all signals from/to the components on the board and generates the timing sequence for the ASPIC. Its firmware is written in the Verilog and VHDL languages. Internal registers define the various test parameters of the ASPIC, and a LabVIEW GUI allows these registers to be loaded or updated and proper operation to be checked. Several series of tests, including linearity, noise and crosstalk, have been performed over the past year to characterize the ASPIC at room and cold temperatures. At present, the ASPIC, back-end board and CCD detectors are being integrated to perform a characterization of the whole readout chain.
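
    A conceptual sketch of correlated double sampling with a dual-slope-style integration, the readout principle named above: average a baseline window and a signal window of the CCD video waveform and take their difference. This models the idea only, not the ASPIC circuit.

    ```python
    # Hedged sketch of correlated double sampling: the difference between the signal-window
    # and reset-window averages cancels the reset offset, as a dual-slope integrator does in
    # hardware. This is a conceptual model only, not the ASPIC implementation described above.
    import numpy as np

    def dual_slope_cds(waveform, baseline_slice, signal_slice):
        """Return the CDS estimate for one pixel: mean(signal window) - mean(baseline window)."""
        return waveform[signal_slice].mean() - waveform[baseline_slice].mean()

    # Toy CCD video waveform for one pixel: reset level, then a video level offset by the charge.
    rng = np.random.default_rng(2)
    reset_level, pixel_signal = 1000.0, -85.0       # arbitrary units; signal swings negative
    wave = np.concatenate([np.full(50, reset_level), np.full(50, reset_level + pixel_signal)])
    wave += rng.normal(0.0, 2.0, wave.size)         # white readout noise
    estimate = dual_slope_cds(wave, baseline_slice=slice(10, 45), signal_slice=slice(60, 95))
    print("CDS estimate: %.1f (true %.1f)" % (estimate, pixel_signal))
    ```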

  14. Is flat fielding safe for precision CCD astronomy?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumer, Michael; Davis, Christopher P.; Roodman, Aaron

    The ambitious goals of precision cosmology with wide-field optical surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST) demand precision CCD astronomy as their foundation. This in turn requires an understanding of previously uncharacterized sources of systematic error in CCD sensors, many of which manifest themselves as static effective variations in pixel area. Such variation renders a critical assumption behind the traditional procedure of flat fielding—that a sensor's pixels comprise a uniform grid—invalid. In this work, we present a method to infer a curl-free model of a sensor's underlying pixel grid from flat-field images, incorporating the superposition of all electrostatic sensor effects—both known and unknown—present in flat-field data. We use these pixel grid models to estimate the overall impact of sensor systematics on photometry, astrometry, and PSF shape measurements in a representative sensor from the Dark Energy Camera (DECam) and a prototype LSST sensor. Applying the method to DECam data recovers known significant sensor effects for which corrections are currently being developed within DES. For an LSST prototype CCD with pixel-response non-uniformity (PRNU) of 0.4%, we find the impact of "improper" flat fielding on these observables is negligible in nominal 0.7'' seeing conditions. Furthermore, these errors scale linearly with the PRNU, so for future LSST production sensors, which may have larger PRNU, our method provides a way to assess whether pixel-level calibration beyond flat fielding will be required.
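
    One way to realize a curl-free pixel-grid model, sketched under stated assumptions rather than as the authors' method: take the pixel displacements to be the gradient of a scalar potential and solve a Poisson equation whose source is the fractional flat-field deviation (a proxy for the pixel-area variation).

    ```python
    # Hedged sketch: a curl-free pixel-displacement model from a flat field. Assume
    # displacements d = grad(phi) (hence curl-free) and that the fractional flat-field
    # deviation approximates div(d); then phi solves a Poisson equation, inverted here
    # with an FFT. This illustrates the idea in the abstract, not the authors' estimator.
    import numpy as np

    def curl_free_displacements(flat):
        delta = flat / flat.mean() - 1.0                 # fractional pixel-area proxy
        ny, nx = delta.shape
        ky = 2 * np.pi * np.fft.fftfreq(ny)
        kx = 2 * np.pi * np.fft.fftfreq(nx)
        kxx, kyy = np.meshgrid(kx, ky)
        k2 = kxx**2 + kyy**2
        k2[0, 0] = 1.0                                   # avoid divide-by-zero for the mean mode
        phi_hat = -np.fft.fft2(delta) / k2               # solve laplacian(phi) = delta
        phi_hat[0, 0] = 0.0
        phi = np.real(np.fft.ifft2(phi_hat))
        dy, dx = np.gradient(phi)                        # displacement field d = grad(phi)
        return dx, dy

    # Toy flat with 0.4% pixel-response non-uniformity, the level quoted for the prototype CCD.
    rng = np.random.default_rng(3)
    flat = 1.0 + 0.004 * rng.standard_normal((256, 256))
    dx, dy = curl_free_displacements(flat)
    print("rms inferred displacement: %.2e pixels" % np.sqrt(np.mean(dx**2 + dy**2)))
    ```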

  15. Is flat fielding safe for precision CCD astronomy?

    DOE PAGES

    Baumer, Michael; Davis, Christopher P.; Roodman, Aaron

    2017-07-06

    The ambitious goals of precision cosmology with wide-field optical surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST) demand precision CCD astronomy as their foundation. This in turn requires an understanding of previously uncharacterized sources of systematic error in CCD sensors, many of which manifest themselves as static effective variations in pixel area. Such variation renders a critical assumption behind the traditional procedure of flat fielding—that a sensor's pixels comprise a uniform grid—invalid. In this work, we present a method to infer a curl-free model of a sensor's underlying pixel grid from flat-field images, incorporating the superposition of all electrostatic sensor effects—both known and unknown—present in flat-field data. We use these pixel grid models to estimate the overall impact of sensor systematics on photometry, astrometry, and PSF shape measurements in a representative sensor from the Dark Energy Camera (DECam) and a prototype LSST sensor. Applying the method to DECam data recovers known significant sensor effects for which corrections are currently being developed within DES. For an LSST prototype CCD with pixel-response non-uniformity (PRNU) of 0.4%, we find the impact of "improper" flat fielding on these observables is negligible in nominal 0.7'' seeing conditions. Furthermore, these errors scale linearly with the PRNU, so for future LSST production sensors, which may have larger PRNU, our method provides a way to assess whether pixel-level calibration beyond flat fielding will be required.

  16. The LSST OCS scheduler design

    NASA Astrophysics Data System (ADS)

    Delgado, Francisco; Schumacher, German

    2014-08-01

    The Large Synoptic Survey Telescope (LSST) is a complex system of systems with demanding performance and operational requirements. The nature of its scientific goals requires a special Observatory Control System (OCS) and particularly a very specialized automatic Scheduler. The OCS Scheduler is an autonomous software component that drives the survey, selecting the detailed sequence of visits in real time, taking into account multiple science programs, the current external and internal conditions, and the history of observations. We have developed a SysML model for the OCS Scheduler that fits coherently in the OCS and LSST integrated model. We have also developed a prototype of the Scheduler that implements the scheduling algorithms in the simulation environment provided by the Operations Simulator, where the environment and the observatory are modeled with real weather data and detailed kinematics parameters. This paper expands on the Scheduler architecture and the proposed algorithms to achieve the survey goals.

  17. CCD developments for particle colliders

    NASA Astrophysics Data System (ADS)

    Stefanov, Konstantin D.

    2006-09-01

    Charge Coupled Devices (CCDs) have been successfully used in several high-energy physics experiments over the last 20 years. Their small pixel size and excellent precision provide a superb tool for studying short-lived particles and understanding nature at a fundamental level. Over the last few years the Linear Collider Flavour Identification (LCFI) collaboration has developed Column-Parallel CCDs (CPCCDs) and CMOS readout chips to be used for the vertex detector at the International Linear Collider (ILC). The CPCCDs are very fast devices capable of satisfying the challenging requirements imposed by the beam structure of the superconducting accelerator. A first set of prototype devices has been designed, manufactured and successfully tested, with second-generation chips on the way. Another CCD-based device, the In-situ Storage Image Sensor (ISIS), is also under development and the first prototype is in production.

  18. CCD-based vertex detector for ILC

    NASA Astrophysics Data System (ADS)

    Stefanov, Konstantin D.

    2006-12-01

    Charge Coupled Devices (CCDs) have been successfully used in several high-energy physics experiments over the last 20 years. Their small pixel size and excellent precision provide a superb tool for studying short-lived particles and understanding nature at a fundamental level. Over the last few years the Linear Collider Flavour Identification (LCFI) collaboration has developed Column-Parallel CCDs (CPCCDs) and CMOS readout chips to be used for the vertex detector at the International Linear Collider (ILC). The CPCCDs are very fast devices capable of satisfying the challenging requirements imposed by the beam structure of the superconducting accelerator. The first set of prototype devices has been successfully designed, manufactured and tested, with second-generation chips on the way. Another CCD-based device, the In-situ Storage Image Sensor (ISIS), is also under development and the first prototype has been manufactured.

  19. The LSST Scheduler from design to construction

    NASA Astrophysics Data System (ADS)

    Delgado, Francisco; Reuter, Michael A.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) will be a highly robotic facility, demanding a very high efficiency during its operation. To achieve this, the LSST Scheduler has been envisioned as an autonomous software component of the Observatory Control System (OCS) that selects the sequence of targets in real time. The Scheduler will drive the survey using optimization of a dynamic cost function of more than 200 parameters. Multiple science programs produce thousands of candidate targets for each observation, and multiple telemetry measurements are received to evaluate the external and the internal conditions of the observatory. The design of the LSST Scheduler started early in the project, supported by Model Based Systems Engineering, detailed prototyping and scientific validation of the survey capabilities required. In order to build such a critical component, an agile development path in incremental releases is presented, integrated with the development plan of the Operations Simulator (OpSim) to allow constant testing, integration and validation in a simulated OCS environment. The final product is a Scheduler that is also capable of running 2000 times faster than real time in simulation mode for survey studies and scientific validation during commissioning and operations.
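
    As a toy illustration of cost-function-driven target selection (the real Scheduler optimizes a dynamic cost function of 200+ parameters and far richer telemetry), the sketch below scores candidates with a handful of invented, weighted terms and picks the highest-ranked one.

    ```python
    # Hedged sketch: ranking candidate targets with a weighted cost function. Terms and
    # weights are invented placeholders, not the LSST Scheduler's actual parameters.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        name: str
        slew_time_s: float       # time to move from the current pointing
        airmass: float           # current airmass of the field
        sky_brightness: float    # mag/arcsec^2 (larger = darker = better)
        science_priority: float  # 0..1, set by the science programs

    WEIGHTS = {"slew": -0.02, "airmass": -1.0, "sky": 0.3, "priority": 5.0}  # assumed weights

    def score(c: Candidate) -> float:
        return (WEIGHTS["slew"] * c.slew_time_s + WEIGHTS["airmass"] * c.airmass
                + WEIGHTS["sky"] * c.sky_brightness + WEIGHTS["priority"] * c.science_priority)

    def choose_next(candidates):
        return max(candidates, key=score)

    targets = [Candidate("DeepDrilling-1", 12.0, 1.6, 21.0, 0.9),
               Candidate("WideFastDeep-A", 35.0, 1.1, 21.4, 0.6),
               Candidate("GalacticPlane-7", 8.0, 1.3, 20.2, 0.4)]
    print("next visit:", choose_next(targets).name)
    ```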

  20. A fringe projector-based study of the Brighter-Fatter Effect in LSST CCDs

    DOE PAGES

    Gilbertson, W.; Nomerotski, A.; Takacs, P.

    2017-09-07

    Achieving the goals of the Large Synoptic Survey Telescope for Dark Energy science requires a detailed understanding of CCD sensor effects. One such sensor effect is the Point Spread Function (PSF) increasing with flux, alternatively called the 'Brighter-Fatter Effect'. Here a novel approach was tested to perform PSF measurements in the context of the Brighter-Fatter Effect, employing a Michelson interferometer to project a sinusoidal fringe pattern onto the CCD. The Brighter-Fatter Effect predicts that the fringe pattern should become asymmetric in intensity, as the brighter peaks corresponding to a larger flux are smeared by a larger PSF. By fitting the data with a model that allows for a changing PSF, the strength of the Brighter-Fatter Effect can be evaluated.
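
    A hedged forward model of the predicted asymmetry: a sinusoidal fringe smeared by a Gaussian PSF whose width grows with the local flux flattens bright peaks more than it fills dark troughs. Parameters are illustrative, and this stands in for, rather than reproduces, the authors' fit.

    ```python
    # Hedged sketch: toy forward model of the fringe asymmetry the abstract predicts.
    # The PSF width grows with local flux (a stand-in for the brighter-fatter effect);
    # all parameters are illustrative only.
    import numpy as np

    def smear_with_flux_dependent_psf(x, fringe, sigma0, alpha):
        """Convolve each sample with a Gaussian whose sigma depends on the local flux."""
        out = np.empty_like(fringe)
        for i, (xi, fi) in enumerate(zip(x, fringe)):
            sigma = sigma0 + alpha * fi                  # PSF width grows with flux
            w = np.exp(-0.5 * ((x - xi) / sigma) ** 2)
            out[i] = np.sum(w * fringe) / np.sum(w)
        return out

    x = np.linspace(0.0, 200.0, 2000)                    # position in pixels
    fringe = 50000.0 * (1.0 + 0.8 * np.sin(2 * np.pi * x / 20.0))
    observed = smear_with_flux_dependent_psf(x, fringe, sigma0=1.0, alpha=2e-5)
    peak_drop = fringe.max() - observed.max()
    trough_rise = observed.min() - fringe.min()
    print("peak flattened by %.0f e-, trough filled by %.0f e-" % (peak_drop, trough_rise))
    ```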

  1. A fringe projector-based study of the Brighter-Fatter Effect in LSST CCDs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbertson, W.; Nomerotski, A.; Takacs, P.

    Achieving the goals of the Large Synoptic Survey Telescope for Dark Energy science requires a detailed understanding of CCD sensor effects. One such sensor effect is the Point Spread Function (PSF) increasing with flux, alternatively called the 'Brighter-Fatter Effect'. Here a novel approach was tested to perform PSF measurements in the context of the Brighter-Fatter Effect, employing a Michelson interferometer to project a sinusoidal fringe pattern onto the CCD. The Brighter-Fatter Effect predicts that the fringe pattern should become asymmetric in intensity, as the brighter peaks corresponding to a larger flux are smeared by a larger PSF. By fitting the data with a model that allows for a changing PSF, the strength of the Brighter-Fatter Effect can be evaluated.

  2. The LSST operations simulator

    NASA Astrophysics Data System (ADS)

    Delgado, Francisco; Saha, Abhijit; Chandrasekharan, Srinivasan; Cook, Kem; Petry, Catherine; Ridgway, Stephen

    2014-08-01

    The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://www.lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions as well as additional scheduled and unscheduled downtime. It has a detailed model to simulate the external conditions with real weather history data from the site, a fully parameterized kinematic model for the internal conditions of the telescope, camera and dome, and serves as a prototype for an automatic scheduler for the real time survey operations with LSST. The Simulator is a critical tool that has been key since very early in the project, to help validate the design parameters of the observatory against the science requirements and the goals from specific science programs. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. Software to efficiently compare the efficacy of different survey strategies for a wide variety of science applications using such a growing set of metrics is under development. A recent restructuring of the code allows us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator is being used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities and assist with performance margin investigations of the LSST system.
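
    For illustration only, the sketch below runs the kind of per-filter digest query one might issue against a simulated observation-history database; the table and column names are assumptions (real OpSim schemas differ between versions, and "band" here stands in for the filter column), and an in-memory SQLite stand-in replaces the MySQL database mentioned above.

    ```python
    # Hedged sketch: querying a toy observation-history table for a per-filter digest.
    # Schema and values are invented for illustration, not the OpSim schema.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE observations
                    (mjd REAL, ra REAL, dec REAL, band TEXT, seeing REAL, sky_mag REAL)""")
    conn.executemany("INSERT INTO observations VALUES (?, ?, ?, ?, ?, ?)",
                     [(59853.1, 10.2, -30.1, 'r', 0.78, 21.2),
                      (59853.2, 11.0, -29.8, 'i', 0.92, 20.4),
                      (59854.1, 10.2, -30.1, 'r', 0.65, 21.3)])

    # Example derivative digest: per-filter visit counts and mean seeing.
    for row in conn.execute("""SELECT band, COUNT(*), AVG(seeing)
                               FROM observations GROUP BY band"""):
        print("band %s: %d visits, mean seeing %.2f arcsec" % row)
    ```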

  3. The LSST Camera 500 watt -130 degC Mixed Refrigerant Cooling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowden, Gordon B.; Langton, Brian J.

    2014-05-28

    The LSST Camera has a higher cryogenic heat load than previous CCD telescope cameras due to its large size (634 mm diameter focal plane, 3.2 Gigapixels) and its close-coupled front-end electronics operating at low temperature inside the cryostat. Various refrigeration technologies were considered for this telescope/camera environment, and MMR-Technology's Mixed Refrigerant technology was chosen; a collaboration with that company was started in 2009. The system, based on a cluster of Joule-Thomson refrigerators running a special blend of mixed refrigerants, is described. Both the advantages and problems of applying this technology to telescope camera refrigeration are discussed. Test results from a prototype refrigerator running in a realistic telescope configuration are reported. Current and future stages of the development program are described.

  4. NOAO and LSST: Illuminating the Path to LSST for All Users

    NASA Astrophysics Data System (ADS)

    Olsen, Knut A.; Matheson, T.; Ridgway, S. T.; Saha, A.; Lauer, T. R.; NOAO LSST Science Working Group

    2013-01-01

    As LSST moves toward construction and survey definition, the burden on the user community to begin planning and preparing for the massive data stream grows. In light of the significant challenge and opportunity that LSST now brings, a critical role for a National Observatory will be to advocate for, respond to, and advise the U.S. community on its use of LSST. NOAO intends to establish an LSST Community Science Center to meet these common needs. Such a Center builds on NOAO's leadership in offering survey-style instruments, proposal opportunities, and data management software over the last decade. This leadership has enabled high-impact scientific results, as evidenced by the award of the 2011 Nobel Prize in Physics for the discovery of Dark Energy, which stemmed directly from survey-style observations taken at NOAO. As steps towards creating an LSST Community Science Center, NOAO is 1) supporting the LSST Science Collaborations through membership calls and collaboration meetings; 2) developing the LSST operations simulator, the tool by which the community's scientific goals are tested against the reality of what LSST's cadence can deliver; 3) embarking on a project to establish metrics for science data quality assessment, which will be critical for establishing faith in LSST results; 4) developing a roadmap and proposal to host and support the capability to help the community manage the expected flood of automated alerts from LSST; and 5) starting a serious discussion of the system capabilities needed for photometric and spectroscopic followup of LSST observations. The fundamental goal is to enable productive, world-class research with LSST by the entire US community-at-large in tight collaboration with the LSST Project, LSST Science Collaborations, and the funding agencies.

  5. LSST: Education and Public Outreach

    NASA Astrophysics Data System (ADS)

    Bauer, Amanda; Herrold, Ardis; LSST Education and Public Outreach Team

    2018-01-01

    The Large Synoptic Survey Telescope (LSST) will conduct a 10-year wide, fast, and deep survey of the night sky starting in 2022. LSST Education and Public Outreach (EPO) will enable public access to a subset of LSST data so anyone can explore the universe and be part of the discovery process. LSST EPO aims to facilitate a pathway from entry-level exploration of astronomical imagery to more sophisticated interaction with LSST data using tools similar to what professional astronomers use. To deliver data to the public, LSST EPO is creating an online Portal to serve as the main hub to EPO activities. The Portal will host an interactive Skyviewer, access to LSST data for educators and the public through online Jupyter notebooks, original multimedia for informal science centers and planetariums, and feature citizen science projects that use LSST data. LSST EPO will engage with the Chilean community through Spanish-language components of the Portal and will partner with organizations serving underrepresented groups in STEM.

  6. The European perspective for LSST

    NASA Astrophysics Data System (ADS)

    Gangler, Emmanuel

    2017-06-01

    LSST is a next-generation telescope that will produce an unprecedented data flow. The project goal is to deliver data products such as images and catalogs, thus enabling scientific analysis for a wide community of users. As a large-scale survey, LSST data will be complementary with data from other facilities in a wide range of scientific domains, including ESA or ESO missions. European countries have invested in LSST since 2007, in the construction of the camera as well as in the computing effort. The latter will be instrumental in designing the next step: how to distribute LSST data to Europe. Astroinformatics challenges for LSST indeed include not only the analysis of LSST big data, but also the practical efficiency of data access.

  7. The Large Synoptic Survey Telescope (LSST) Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Ranked as the top ground-based national priority for the field for the current decade, LSST is currently under construction in Chile. The U.S. Department of Energy’s SLAC National Accelerator Laboratory is leading the construction of the LSST camera – the largest digital camera ever built for astronomy. SLAC Professor Steven M. Kahn is the overall Director of the LSST project, and SLAC personnel are also participating in the data management. The National Science Foundation is the lead agency for construction of the LSST. Additional financial support comes from the Department of Energy and private funding raised by the LSST Corporation.

  8. Transient Alerts in LSST

    NASA Astrophysics Data System (ADS)

    Kantor, J.

    During LSST observing, transient events will be detected and alerts generated at the LSST Archive Center at NCSA in Champaign, Illinois. As a very high rate of alerts is expected, approaching ~10 million per night, we plan for VOEvent-compliant Distributor/Brokers (http://voevent.org) to be the primary end-points of the full LSST alert streams. End users will then use these Distributor/Brokers to classify and filter events on the stream for those fitting their science goals. These Distributor/Brokers are envisioned to be operated as a community service by third parties who will have signed MOUs with LSST. The exact identification of Distributor/Brokers to receive alerts will be determined as LSST approaches full operations and may change over time, but it is in our interest to identify and coordinate with them as early as possible. LSST will also operate a limited Distributor/Broker with a filtering capability at the Archive Center, to allow alerts to be sent directly to a limited number of entities that for some reason need to have a more direct connection to LSST. This might include, for example, observatories with significant follow-up capabilities whose observing may temporarily be more directly tied to LSST observing. It will let astronomers create simple filters that limit what alerts are ultimately forwarded to them. It will be possible to specify these user-defined filters using an SQL-like declarative language, or as short snippets of (likely Python) code. We emphasize that this LSST-provided capability will be limited, and is not intended to satisfy the wide variety of use cases that a full-fledged public Event Distributor/Broker could. End users will not be able to subscribe to full, unfiltered alert streams coming directly from LSST. In this session, we will discuss anticipated LSST data rates, and capabilities for alert processing and distribution/brokering. We will clarify what the LSST Observatory will provide versus what we anticipate will be a community effort.
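
    As a hedged example of the "short snippets of (likely Python) code" envisioned for user-defined filters, the function below keeps only alerts matching simple criteria; the alert field names are invented for this illustration and are not the actual LSST alert schema.

    ```python
    # Hedged illustration of a user-defined alert filter of the kind described above.
    # The alert field names (band, mag, mag_change, classification_score) are invented
    # for this example; they are not the LSST alert packet schema.
    def my_filter(alert: dict) -> bool:
        """Keep fast, bright transients in g or r that look like genuine astrophysical events."""
        return (alert["band"] in ("g", "r")
                and alert["mag"] < 20.0
                and abs(alert["mag_change"]) > 0.3      # magnitudes since the previous visit
                and alert["classification_score"] > 0.9)

    stream = [
        {"band": "g", "mag": 19.2, "mag_change": -0.8, "classification_score": 0.97},
        {"band": "i", "mag": 21.5, "mag_change": -0.1, "classification_score": 0.50},
    ]
    forwarded = [a for a in stream if my_filter(a)]
    print("%d of %d alerts forwarded" % (len(forwarded), len(stream)))
    ```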

  9. The Large Synoptic Survey Telescope (LSST) Camera

    ScienceCinema

    None

    2018-06-13

    Ranked as the top ground-based national priority for the field for the current decade, LSST is currently under construction in Chile. The U.S. Department of Energy’s SLAC National Accelerator Laboratory is leading the construction of the LSST camera – the largest digital camera ever built for astronomy. SLAC Professor Steven M. Kahn is the overall Director of the LSST project, and SLAC personnel are also participating in the data management. The National Science Foundation is the lead agency for construction of the LSST. Additional financial support comes from the Department of Energy and private funding raised by the LSST Corporation.

  10. A comparative study of charge transfer inefficiency value and trap parameter determination techniques making use of an irradiated ESA-Euclid prototype CCD

    NASA Astrophysics Data System (ADS)

    Prod'homme, Thibaut; Verhoeve, P.; Kohley, R.; Short, A.; Boudin, N.

    2014-07-01

    The science objectives of space missions using CCDs to carry out accurate astronomical measurements are put at risk by the radiation-induced increase in charge transfer inefficiency (CTI) that results from trapping sites in the CCD silicon lattice. A variety of techniques are used to obtain CTI values and derive trap parameters; however, they often differ in their results. To identify and understand these differences, we take advantage of an on-going comprehensive characterisation of an irradiated Euclid prototype CCD including the following techniques: X-ray, trap pumping, flat-field extended pixel edge response and first pixel response. We proceed to a comparative analysis of the obtained results.
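
    One of the techniques listed, the extended pixel edge response (EPER), admits a compact estimate: CTI ≈ deferred charge in the first overscan pixels / (last imaging-column signal × number of transfers). The sketch below applies it to a synthetic overscanned flat-field line; all values are invented.

    ```python
    # Hedged sketch of the extended pixel edge response (EPER) estimate mentioned above.
    # The flat-field line below is synthetic; real analyses average many rows/columns.
    import numpy as np

    def cti_from_eper(line, n_imaging, n_overscan_sum=3):
        """Estimate CTI from one serial (or parallel) line of an overscanned flat field."""
        bias = np.median(line[n_imaging + n_overscan_sum + 2:])   # far overscan as bias level
        last_col = line[n_imaging - 1] - bias
        deferred = np.sum(line[n_imaging:n_imaging + n_overscan_sum] - bias)
        return deferred / (last_col * n_imaging)

    # Synthetic line: 512 imaging pixels at ~20000 e- plus a small exponential deferred tail.
    n_img = 512
    line = np.concatenate([np.full(n_img, 20000.0), np.full(32, 0.0)]) + 1000.0   # 1000 = bias
    tail = np.exp(-np.arange(5) / 1.5)
    line[n_img:n_img + 5] += 20000.0 * 5e-6 * n_img * tail / tail.sum()           # assumed CTI ~ 5e-6
    print("CTI ~ %.1e per transfer" % cti_from_eper(line, n_img))
    ```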

  11. Examining the Potential of LSST to Contribute to Exoplanet Discovery

    NASA Astrophysics Data System (ADS)

    Lund, Michael B.; Pepper, Joshua; Jacklin, Savannah; Stassun, Keivan G.

    2018-01-01

    The Large Synoptic Survey Telescope (LSST), currently under construction in Chile with scheduled first light in 2019, will be one of the major sources of data in the next decade and is one of the top priorities expressed in the last Decadal Survey. Because LSST is intended to cover a range of science questions, the LSST community is still working on optimizing the observing strategy of the survey. With a survey area that will cover half the sky in 6 bands, providing photometric data on billions of stars from 16th to 24th magnitude, LSST can be leveraged to contribute to exoplanet science. In particular, LSST has the potential to detect exoplanets around stellar populations that are not normally included in transiting exoplanet searches. This includes searching for exoplanets around red and white dwarfs, stars in the galactic plane and bulge, stellar clusters, and potentially even the Magellanic Clouds. In probing these varied stellar populations, relative exoplanet frequency can be examined, and in turn, LSST may be able to provide fresh insight into how stellar environment can play a role in planetary formation rates. Our initial work on this project has been to demonstrate that even with the limitations of the LSST cadence, exoplanets would be recoverable and detectable in the LSST photometry, and to show that exoplanets are indeed worth including in discussions of the variable sources to which LSST can contribute. We have continued to expand this work to examine exoplanets around stars belonging to various stellar populations, both to show the types of systems that LSST is capable of discovering and to determine the potential exoplanet yields using standard algorithms that have already been implemented in transiting exoplanet searches, as well as how changes to LSST's observing schedule may impact both of these results.
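
    A sketch of the kind of standard transit-search algorithm alluded to, here box least squares via astropy.timeseries.BoxLeastSquares applied to a sparse, irregularly sampled light curve; the cadence, noise, and planet parameters are invented for illustration.

    ```python
    # Hedged sketch: a box least squares (BLS) transit search on a sparse, LSST-like
    # light curve, standing in for the "standard algorithms" referenced above.
    import numpy as np
    from astropy.timeseries import BoxLeastSquares

    rng = np.random.default_rng(4)
    t = np.sort(rng.uniform(0.0, 730.0, 400))          # ~400 sparse visits over two seasons (days)
    period, depth, duration = 3.7, 0.02, 0.12          # toy deep, short-period transit (assumed)
    in_transit = ((t % period) / period) < (duration / period)
    flux = 1.0 + rng.normal(0.0, 0.005, t.size)        # ~5 mmag scatter per visit (assumed)
    flux[in_transit] -= depth

    bls = BoxLeastSquares(t, flux)
    periods = np.arange(3.0, 4.5, 2e-4)                # explicit trial-period grid (days)
    result = bls.power(periods, duration)
    best = np.argmax(result.power)
    print("strongest BLS peak at P = %.3f d (true %.3f d)" % (result.period[best], period))
    ```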

  12. Science Education with the LSST

    NASA Astrophysics Data System (ADS)

    Jacoby, S. H.; Khandro, L. M.; Larson, A. M.; McCarthy, D. W.; Pompea, S. M.; Shara, M. M.

    2004-12-01

    LSST will create the first true celestial cinematography - a revolution in public access to the changing universe. The challenge will be to take advantage of the unique capabilities of the LSST while presenting the data in ways that are manageable, engaging, and supportive of national science education goals. To prepare for this opportunity for exploration, tools and displays will be developed using current deep-sky multi-color imaging data. Education professionals from LSST partners invite input from interested members of the community. Initial LSST science education priorities include: - Fostering authentic student-teacher research projects at all levels, - Exploring methods of visualizing the large and changing datasets in science centers, - Defining Web-based interfaces and tools for access and interaction with the data, - Delivering online instructional materials, and - Developing meaningful interactions between LSST scientists and the public.

  13. LSST active optics system software architecture

    NASA Astrophysics Data System (ADS)

    Thomas, Sandrine J.; Chandrasekharan, Srinivasan; Lotz, Paul; Xin, Bo; Claver, Charles; Angeli, George; Sebag, Jacques; Dubois-Felsmann, Gregory P.

    2016-08-01

    The Large Synoptic Survey Telescope (LSST) is an 8-meter class wide-field telescope now under construction on Cerro Pachon, near La Serena, Chile. This ground-based telescope is designed to conduct a decade-long time domain survey of the optical sky. In order to achieve the LSST scientific goals, the telescope must deliver seeing-limited image quality over the 3.5 degree field-of-view. Like many telescopes, LSST will use an Active Optics System (AOS) to correct in near real-time the system aberrations, primarily introduced by gravity and temperature gradients. The LSST AOS uses a combination of 4 curvature wavefront sensors (CWS) located on the outside of the LSST field-of-view. The information coming from the 4 CWS is combined to calculate the appropriate corrections to be sent to the 3 different mirrors composing LSST. The AOS software incorporates a wavefront sensor estimation pipeline (WEP) and an active optics control system (AOCS). The WEP estimates the wavefront residual error from the CWS images. The AOCS determines the correction to be sent to the different degrees of freedom every 30 seconds. In this paper, we describe the design and implementation of the AOS. More particularly, we will focus on the software architecture as well as the AOS interactions with the various subsystems within LSST.
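
    A minimal sketch of the control step outlined above: average the wavefront-error modes estimated from the four corner sensors and map them to degree-of-freedom corrections via a least-squares solve against an assumed sensitivity matrix. The matrix, mode and DOF counts, and gain are placeholders, not LSST values.

    ```python
    # Hedged sketch of one AOS iteration: sensor-averaged wavefront modes -> DOF corrections.
    # The sensitivity matrix, counts, and gain below are random/assumed placeholders.
    import numpy as np

    rng = np.random.default_rng(5)
    N_MODES, N_DOF = 19, 10                      # assumed: wavefront modes, controlled DOFs

    # Assumed sensitivity matrix A: wavefront modes produced per unit motion of each DOF.
    A = rng.normal(0.0, 1.0, (N_MODES, N_DOF))

    def aos_correction(zernikes_per_sensor, gain=0.3):
        """One 30-second AOS iteration: average the 4 CWS estimates, then solve for corrections."""
        wavefront_error = np.mean(zernikes_per_sensor, axis=0)      # combine the 4 corner sensors
        # Least-squares solve A @ dof = wavefront_error, damped by a feedback gain.
        dof_correction, *_ = np.linalg.lstsq(A, wavefront_error, rcond=None)
        return -gain * dof_correction

    # Toy input: four corner sensors measuring nearly the same aberration plus noise.
    true_dof_error = rng.normal(0.0, 1.0, N_DOF)
    measurements = [A @ true_dof_error + rng.normal(0.0, 0.05, N_MODES) for _ in range(4)]
    print("first three DOF corrections:", np.round(aos_correction(measurements)[:3], 3))
    ```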

  14. LSST Survey Data: Models for EPO Interaction

    NASA Astrophysics Data System (ADS)

    Olsen, J. K.; Borne, K. D.

    2007-12-01

    The potential for education and public outreach with the Large Synoptic Survey Telescope is as far reaching as the telescope itself. LSST data will be available to the public, giving anyone with a web browser a movie-like window on the Universe. The LSST project is unique in designing its data management and data access systems with the public and community users in mind. The enormous volume of data to be generated by LSST is staggering: 30 Terabytes per night, 10 Petabytes per year. The final database of extracted science parameters from the images will also be enormous -- 50-100 Petabytes -- a rich gold mine for data mining and scientific discovery potential. LSST will also generate 100,000 astronomical alerts per night, for 10 years. The LSST EPO team is examining models for EPO interaction with the survey data, particularly in how the community (amateurs, teachers, students, and general public) can participate in the discovery process. We will outline some of our models of community interaction for inquiry-based science using the LSST survey data, and we invite discussion on these topics.

  15. Consensus coding sequence (CCDS) database: a standardized set of human and mouse protein-coding regions supported by expert curation

    PubMed Central

    Pujar, Shashikant; O’Leary, Nuala A; Farrell, Catherine M; Mudge, Jonathan M; Wallin, Craig; Diekhans, Mark; Barnes, If; Bennett, Ruth; Berry, Andrew E; Cox, Eric; Davidson, Claire; Goldfarb, Tamara; Gonzalez, Jose M; Hunt, Toby; Jackson, John; Joardar, Vinita; Kay, Mike P; Kodali, Vamsi K; McAndrews, Monica; McGarvey, Kelly M; Murphy, Michael; Rajput, Bhanu; Rangwala, Sanjida H; Riddick, Lillian D; Seal, Ruth L; Webb, David; Zhu, Sophia; Aken, Bronwen L; Bult, Carol J; Frankish, Adam; Pruitt, Kim D

    2018-01-01

    The Consensus Coding Sequence (CCDS) project provides a dataset of protein-coding regions that are identically annotated on the human and mouse reference genome assembly in genome annotations produced independently by NCBI and the Ensembl group at EMBL-EBI. This dataset is the product of an international collaboration that includes NCBI, Ensembl, HUGO Gene Nomenclature Committee, Mouse Genome Informatics and University of California, Santa Cruz. Identically annotated coding regions, which are generated using an automated pipeline and pass multiple quality assurance checks, are assigned a stable and tracked identifier (CCDS ID). Additionally, coordinated manual review by expert curators from the CCDS collaboration helps in maintaining the integrity and high quality of the dataset. The CCDS data are available through an interactive web page (https://www.ncbi.nlm.nih.gov/CCDS/CcdsBrowse.cgi) and an FTP site (ftp://ftp.ncbi.nlm.nih.gov/pub/CCDS/). In this paper, we outline the ongoing work, growth and stability of the CCDS dataset and provide updates on new collaboration members and new features added to the CCDS user interface. We also present expert curation scenarios, with specific examples highlighting the importance of an accurate reference genome assembly and the crucial role played by input from the research community. PMID:29126148

  16. Big Software for Big Data: Scaling Up Photometry for LSST (Abstract)

    NASA Astrophysics Data System (ADS)

    Rawls, M.

    2017-06-01

    (Abstract only) The Large Synoptic Survey Telescope (LSST) will capture mosaics of the sky every few nights, each containing more data than your computer's hard drive can store. As a result, the software to process these images is as critical to the science as the telescope and the camera. I discuss the algorithms and software being developed by the LSST Data Management team to handle such a large volume of data. All of our work is open source and available to the community. Once LSST comes online, our software will produce catalogs of objects and a stream of alerts. These will bring exciting new opportunities for follow-up observations and collaborations with LSST scientists.

  17. Consensus coding sequence (CCDS) database: a standardized set of human and mouse protein-coding regions supported by expert curation.

    PubMed

    Pujar, Shashikant; O'Leary, Nuala A; Farrell, Catherine M; Loveland, Jane E; Mudge, Jonathan M; Wallin, Craig; Girón, Carlos G; Diekhans, Mark; Barnes, If; Bennett, Ruth; Berry, Andrew E; Cox, Eric; Davidson, Claire; Goldfarb, Tamara; Gonzalez, Jose M; Hunt, Toby; Jackson, John; Joardar, Vinita; Kay, Mike P; Kodali, Vamsi K; Martin, Fergal J; McAndrews, Monica; McGarvey, Kelly M; Murphy, Michael; Rajput, Bhanu; Rangwala, Sanjida H; Riddick, Lillian D; Seal, Ruth L; Suner, Marie-Marthe; Webb, David; Zhu, Sophia; Aken, Bronwen L; Bruford, Elspeth A; Bult, Carol J; Frankish, Adam; Murphy, Terence; Pruitt, Kim D

    2018-01-04

    The Consensus Coding Sequence (CCDS) project provides a dataset of protein-coding regions that are identically annotated on the human and mouse reference genome assembly in genome annotations produced independently by NCBI and the Ensembl group at EMBL-EBI. This dataset is the product of an international collaboration that includes NCBI, Ensembl, HUGO Gene Nomenclature Committee, Mouse Genome Informatics and University of California, Santa Cruz. Identically annotated coding regions, which are generated using an automated pipeline and pass multiple quality assurance checks, are assigned a stable and tracked identifier (CCDS ID). Additionally, coordinated manual review by expert curators from the CCDS collaboration helps in maintaining the integrity and high quality of the dataset. The CCDS data are available through an interactive web page (https://www.ncbi.nlm.nih.gov/CCDS/CcdsBrowse.cgi) and an FTP site (ftp://ftp.ncbi.nlm.nih.gov/pub/CCDS/). In this paper, we outline the ongoing work, growth and stability of the CCDS dataset and provide updates on new collaboration members and new features added to the CCDS user interface. We also present expert curation scenarios, with specific examples highlighting the importance of an accurate reference genome assembly and the crucial role played by input from the research community. Published by Oxford University Press on behalf of Nucleic Acids Research 2017.

  18. The Large Synoptic Survey Telescope Science Requirements

    NASA Astrophysics Data System (ADS)

    Tyson, J. A.; LSST Collaboration

    2004-12-01

    The Large Synoptic Survey Telescope (LSST) is a wide-field telescope facility that will add a qualitatively new capability in astronomy and will address some of the most pressing open questions in astronomy and fundamental physics. The 8.4-meter telescope and 3 billion pixel camera covering ten square degrees will reach sky in less than 10 seconds in each of 5-6 optical bands. This is enabled by advances in microelectronics, software, and large optics fabrication. The unprecedented optical throughput drives LSST's ability to go faint-wide-fast. The LSST will produce time-lapse digital imaging of faint astronomical objects across the entire visible sky with good resolution. For example, the LSST will provide unprecedented 3-dimensional maps of the mass distribution in the Universe, in addition to the traditional images of luminous stars and galaxies. These weak lensing data can be used to better understand the nature of Dark Energy. The LSST will also provide a comprehensive census of our solar system. By surveying deeply the entire accessible sky every few nights, the LSST will provide large samples of events which we now only rarely observe, and will create substantial potential for new discoveries. The LSST will produce the largest non-proprietary data set in the world. Several key science drivers are representative of the LSST system capabilities: Precision Characterization of Dark Energy, Solar System Map, Optical Transients, and a map of our Galaxy and its environs. In addition to enabling all four of these major scientific initiatives, LSST will make it possible to pursue many other research programs. The community has suggested a number of exciting programs using these data, and the long-lived data archives of the LSST will have the astrometric and photometric precision needed to support entirely new research directions which will inevitably develop during the next several decades.

  19. Investigating the Bright End of LSST Photometry

    NASA Astrophysics Data System (ADS)

    Ojala, Elle; Pepper, Joshua; LSST Collaboration

    2018-01-01

    The Large Synoptic Survey Telescope (LSST) will begin operations in 2022, conducting a wide-field, synoptic multiband survey of the southern sky. Some fraction of objects at the bright end of the magnitude regime observed by LSST will overlap with other wide-sky surveys, allowing for calibration and cross-checking between surveys. The LSST is optimized for observations of very faint objects, so much of this data overlap will consist of saturated images. This project provides the first in-depth analysis of saturation in LSST images. Using the PhoSim package to create simulated LSST images, we evaluate the saturation properties of several types of stars to determine the brightness limitations of LSST. We also collect metadata from many wide-field photometric surveys to provide cross-survey accounting and comparison. Additionally, we evaluate the accuracy of the PhoSim modeling parameters to determine the reliability of the software. These efforts will allow us to determine the expected useable data overlap between bright-end LSST images and faint-end images in other wide-sky surveys. Our next steps are developing methods to extract photometry from saturated images. This material is based upon work supported in part by the National Science Foundation through Cooperative Agreement 1258333 managed by the Association of Universities for Research in Astronomy (AURA), and the Department of Energy under Contract No. DE-AC02-76SF00515 with the SLAC National Accelerator Laboratory. Additional LSST funding comes from private donations, grants to universities, and in-kind support from LSSTC Institutional Members. Thanks to NSF grant PHY-135195 and the 2017 LSSTC Grant Award #2017-UG06 for making this project possible.

  20. Investigating interoperability of the LSST data management software stack with Astropy

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15TB per night and the requirements to both issue alerts on transient sources within 60 seconds of observing and create annual data releases mean that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful in the years since it began and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed, and it is clear that by the time LSST is fully operational in the 2020s many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.

  1. The LSST: A System of Systems

    NASA Astrophysics Data System (ADS)

    Claver, Chuck F.; Debois-Felsmann, G. P.; Delgado, F.; Hascall, P.; Marshall, S.; Nordby, M.; Schumacher, G.; Sebag, J.; LSST Collaboration

    2011-01-01

    The Large Synoptic Survey Telescope (LSST) is a complete observing system that acquires and archives images, processes and analyzes them, and publishes reduced images and catalogs of sources and objects. The LSST will operate over a ten-year period, producing a survey of 20,000 square degrees over the entire [Southern] sky in 6 filters (ugrizy), with each field having been visited several hundred times, enabling a wide spectrum of science from fast transients to exploration of dark matter and dark energy. The LSST itself is a complex system of systems consisting of the 8.4m 3-mirror telescope, a 3.2 billion pixel camera, and a peta-scale data management system. The LSST project uses a Model Based Systems Engineering (MBSE) methodology to ensure an integrated approach to system design and rigorous definition of system interfaces and specifications. The MBSE methodology is applied through modeling of the LSST's systems with the System Modeling Language (SysML). The SysML modeling recursively establishes the threefold relationship between requirements, logical and physical functional decomposition and definition, and system and component behavior at successively deeper levels of abstraction and detail. The LSST modeling includes analyzing and documenting the flow of command and control information and data between the suite of systems in the LSST observatory that are needed to carry out the activities of the survey. The MBSE approach is applied throughout all stages of the project, from design to validation and verification, through to commissioning.

  2. Integration and verification testing of the Large Synoptic Survey Telescope camera

    NASA Astrophysics Data System (ADS)

    Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.

    2016-08-01

    We present an overview of the Integration and Verification Testing activities of the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Lab (SLAC). The LSST Camera, the sole instrument for LSST and now under construction, comprises a 3.2 Giga-pixel imager and a three-element corrector with a 3.5 degree diameter field of view. LSST Camera Integration and Test will be taking place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for Integration and Test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of integration and verification hardware systems will be discussed, highlighting some of the technical challenges anticipated.

  3. The Hyper Suprime-Cam software pipeline

    NASA Astrophysics Data System (ADS)

    Bosch, James; Armstrong, Robert; Bickerton, Steven; Furusawa, Hisanori; Ikeda, Hiroyuki; Koike, Michitaro; Lupton, Robert; Mineo, Sogo; Price, Paul; Takata, Tadafumi; Tanaka, Masayuki; Yasuda, Naoki; AlSayyad, Yusra; Becker, Andrew C.; Coulton, William; Coupon, Jean; Garmilla, Jose; Huang, Song; Krughoff, K. Simon; Lang, Dustin; Leauthaud, Alexie; Lim, Kian-Tat; Lust, Nate B.; MacArthur, Lauren A.; Mandelbaum, Rachel; Miyatake, Hironao; Miyazaki, Satoshi; Murata, Ryoma; More, Surhud; Okura, Yuki; Owen, Russell; Swinbank, John D.; Strauss, Michael A.; Yamada, Yoshihiko; Yamanoi, Hitomi

    2018-01-01

    In this paper, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.

  4. It’s about time: How do sky surveys manage uncertainty about scientific needs many years into the future

    NASA Astrophysics Data System (ADS)

    Darch, Peter T.; Sands, Ashley E.

    2016-06-01

    Sky surveys, such as the Sloan Digital Sky Survey (SDSS) and the Large Synoptic Survey Telescope (LSST), generate data on an unprecedented scale. While many scientific projects span a few years from conception to completion, sky surveys are typically on the scale of decades. This paper focuses on critical challenges arising from long timescales, and how sky surveys address these challenges. We present findings from a study of LSST, comprising interviews (n=58) and observation. The LSST was conceived in the 1990s, the LSST Corporation was formed in 2003, and construction began in 2014. LSST will commence data collection operations in 2022 for ten years. One challenge arising from this long timescale is uncertainty about the future needs of the astronomers who will use these data many years hence. Sources of uncertainty include the scientific questions to be posed, the astronomical phenomena to be studied, and the tools and practices these astronomers will have at their disposal. These uncertainties are magnified by the rapid technological and scientific developments anticipated between now and the start of LSST operations. LSST is implementing a range of strategies to address these challenges. Some strategies involve delaying resolution of uncertainty, placing this resolution in the hands of future data users. Other strategies aim to reduce uncertainty by shaping astronomers’ data analysis practices so that these practices will integrate well with LSST once operations begin. One approach that exemplifies both types of strategy is the decision to make LSST data management software open source, even now as it is being developed. This policy will enable future data users to adapt this software to evolving needs. In addition, LSST intends for astronomers to start using this software well in advance of 2022, thereby embedding LSST software and data analysis approaches in the practices of astronomers. These findings strengthen arguments for making the software supporting sky surveys available as open source. Such arguments usually focus on the reuse potential of software and on enhancing the replicability of analyses. In this case, however, open source software also promises to mitigate the critical challenge of anticipating the needs of future data users.

  5. Astrometry with LSST: Objectives and Challenges

    NASA Astrophysics Data System (ADS)

    Casetti-Dinescu, D. I.; Girard, T. M.; Méndez, R. A.; Petronchak, R. M.

    2018-01-01

    The forthcoming Large Synoptic Survey Telescope (LSST) is an optical telescope with an effective aperture of 6.4 m and a field of view of 9.6 square degrees. Thus, LSST will have an étendue larger than any other optical telescope, performing wide-field, deep imaging of the sky. There are four broad categories of science objectives: 1) dark energy and dark matter, 2) transients, 3) the Milky Way and its neighbours, and 4) the Solar System. In particular, for the Milky Way science case, astrometry will make a critical contribution; therefore, special attention must be devoted to extracting the maximum amount of astrometric information from the LSST data. Here, we outline the astrometric challenges posed by such a massive survey. We also present some current examples of ground-based, wide-field, deep imagers used for astrometry, as precursors of the LSST.

  6. Easily Transported CCD Systems for Use in Astronomy Labs

    NASA Astrophysics Data System (ADS)

    Meisel, D.

    1992-12-01

    Relatively inexpensive CCD cameras and portable computers are now easily obtained as commercially available products. I will describe a prototype system that can be used by introductory astronomy students, even in urban environments, to obtain useful observations of the night sky. It is based on the ST-4 CCDs made by Santa Barbara Instruments Group and Macintosh PowerBook 145 computers. Students take outdoor images directly from the college campus, bring the exposures back into the lab, and download the images onto our networked server. These stored images can then be processed (at a later time) using a variety of image processing programs, including a new astronomical version of the popular "freeware" NIH Image package that is currently under development at Geneseo. The prototype of this system will be demonstrated and available for hands-on use during the meeting. This work is supported by NSF ILI Demonstration Grant USE9250493 and grants from SUNY-Geneseo.

  7. Optical Variability and Classification of High Redshift (3.5 < z < 5.5) Quasars on SDSS Stripe 82

    NASA Astrophysics Data System (ADS)

    AlSayyad, Yusra; McGreer, Ian D.; Fan, Xiaohui; Connolly, Andrew J.; Ivezic, Zeljko; Becker, Andrew C.

    2015-01-01

    Recent studies have shown promise in combining optical colors with variability to efficiently select and estimate the redshifts of low- to mid-redshift quasars in upcoming ground-based time-domain surveys. We extend these studies to fainter and less abundant high-redshift quasars using light curves from 235 sq. deg. and 10 years of Stripe 82 imaging reprocessed with the prototype LSST data management stack. Sources are detected on the i-band co-adds (5σ: i ~ 24) but measured on the single-epoch (ugriz) images, generating complete and unbiased lightcurves for sources fainter than the single-epoch detection threshold. Using these forced photometry lightcurves, we explore optical variability characteristics of high redshift quasars and validate classification methods with particular attention to the low signal limit. In this low SNR limit, we quantify the degradation of the uncertainties and biases on variability parameters using simulated light curves. Completeness/efficiency and redshift accuracy are verified with new spectroscopic observations on the MMT and APO 3.5m. These preliminary results are part of a survey to measure the z~4 luminosity function for quasars (i < 23) on Stripe 82 and to validate purely photometric classification techniques for high redshift quasars in LSST.
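
    A minimal sketch of one common way to quantify variability in noisy forced-photometry light curves is given below: the normalized excess variance, i.e. the sample variance with the mean measurement variance subtracted. This is a generic statistic for illustration, not the specific variability parameterization used in the study.

      import numpy as np

      def normalized_excess_variance(flux, flux_err):
          """Sample variance minus the mean measurement variance, normalised by
          the squared mean flux; values consistent with zero indicate a light
          curve dominated by photometric noise."""
          flux = np.asarray(flux, dtype=float)
          flux_err = np.asarray(flux_err, dtype=float)
          return (flux.var(ddof=1) - np.mean(flux_err**2)) / flux.mean()**2

      # Example: a faint, noise-dominated light curve should scatter around zero.
      rng = np.random.default_rng(0)
      flux_err = np.full(60, 0.2)
      flux = 1.0 + rng.normal(0.0, 0.2, 60)
      print(normalized_excess_variance(flux, flux_err))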

  8. Data Mining Research with the LSST

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.; Strauss, M. A.; Tyson, J. A.

    2007-12-01

    The LSST catalog database will exceed 10 petabytes, comprising several hundred attributes for 5 billion galaxies, 10 billion stars, and over 1 billion variable sources (optical variables, transients, or moving objects), extracted from over 20,000 square degrees of deep imaging in 5 passbands with thorough time domain coverage: 1000 visits over the 10-year LSST survey lifetime. The opportunities are enormous for novel scientific discoveries within this rich time-domain ultra-deep multi-band survey database. Data Mining, Machine Learning, and Knowledge Discovery research opportunities with the LSST are now under study, with the potential for new collaborations to develop and contribute to these investigations. We will describe features of the LSST science database that are amenable to scientific data mining, object classification, outlier identification, anomaly detection, image quality assurance, and survey science validation. We also give some illustrative examples of current scientific data mining research in astronomy, and point out where new research is needed. In particular, the data mining research community will need to address several issues in the coming years as we prepare for the LSST data deluge. The data mining research agenda includes: scalability (at petabyte scales) of existing machine learning and data mining algorithms; development of grid-enabled parallel data mining algorithms; designing a robust system for brokering classifications from the LSST event pipeline (which may produce 10,000 or more event alerts per night); multi-resolution methods for exploration of petascale databases; visual data mining algorithms for visual exploration of the data; indexing of multi-attribute multi-dimensional astronomical databases (beyond RA-Dec spatial indexing) for rapid querying of petabyte databases; and more. Finally, we will identify opportunities for synergistic collaboration between the data mining research group and the LSST Data Management and Science Collaboration teams.

  9. The Stellar Populations of the Milky Way and Nearby Galaxies with LSST

    NASA Astrophysics Data System (ADS)

    Olsen, Knut A.; Covey, K.; Saha, A.; Beers, T. C.; Bochanski, J.; Boeshaar, P.; Cargile, P.; Catelan, M.; Burgasser, A.; Cook, K.; Dhital, S.; Figer, D.; Ivezic, Z.; Kalirai, J.; McGehee, P.; Minniti, D.; Pepper, J.; Prsa, A.; Sarajedini, A.; Silva, D.; Smith, J. A.; Stassun, K.; Thorman, P.; Williams, B.; LSST Stellar Populations Collaboration

    2011-01-01

    The LSST will produce a multi-color map and photometric object catalog of half the sky to r=27.6 (AB mag; 5-sigma) when observations at the individual epochs of the standard cadence are stacked. Analyzing the ten years of independent measurements in each field will allow variability, proper motion, and parallax measurements to be derived for objects brighter than r=24.5. These photometric, astrometric, and variability data will enable the construction of a detailed and robust map of the stellar populations of the Milky Way, its satellites, and its nearest extra-galactic neighbors--allowing exploration of their star formation, chemical enrichment, and accretion histories on a grand scale. For example, with geometric parallax accuracy of 1 milli-arc-sec, comparable to HIPPARCOS but reaching more than 10 magnitudes fainter, LSST will allow a complete census of all stars above the hydrogen-burning limit that are closer than 500 pc, including thousands of predicted L and T dwarfs. The LSST time sampling will identify and characterize variable stars of all types, from time scales of 1 hr to several years, a feast for variable star astrophysics; LSST's projected impact on the study of several variable star classes, including eclipsing binaries, is discussed here. We also describe the ongoing efforts of the collaboration to optimize the LSST system for stellar populations science. We are currently investigating the trade-offs associated with the exact wavelength boundaries of the LSST filters, identifying the most scientifically valuable locations for fields that will receive enhanced temporal coverage compared to the standard cadence, and analyzing synthetic LSST outputs to verify that the system's performance will be sufficient to achieve our highest priority science goals.

  10. The development of a cryogenic over-pressure pump

    NASA Astrophysics Data System (ADS)

    Alvarez, M.; Cease, H.; Flaugher, B.; Flores, R.; Garcia, J.; Lathrop, A.; Ruiz, F.

    2014-01-01

    A cryogenic over-pressure pump (OPP) was tested in the prototype telescope liquid nitrogen (LN2) cooling system for the Dark Energy Survey (DES) Project. This OPP consists of a process cylinder (PC), gas generator, and solenoid operated valves (SOVs). It is a positive displacement pump that provided intermittent LN2 flow to an array of charge-coupled devices (CCDs) for the prototype Dark Energy Camera (DECam). In theory, a heater submerged in liquid would generate the drive gas in a closed loop cooling system. The drive gas would be injected into the PC to displace that liquid volume. However, due to limitations of the prototype closed loop nitrogen system (CCD cooling system) for DECam, a quasi-closed-loop nitrogen system was created. During the test of the OPP, the CCD array was cooled to its designed set point temperature of 173 K and maintained at that temperature via electrical heaters. The performance of the OPP was characterized through pressure, temperature, and flow-rate measurements in the CCD LN2 cooling system at Fermi National Accelerator Laboratory (FNAL).

  11. Designing a Multi-Petabyte Database for LSST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becla, Jacek; Hanushevsky, Andrew; Nikolaev, Sergei

    2007-01-10

    The 3.2 giga-pixel LSST camera will produce approximately half a petabyte of archive images every month. These data need to be reduced in under a minute to produce real-time transient alerts, and then added to the cumulative catalog for further analysis. The catalog is expected to grow by about three hundred terabytes per year. The data volume, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require innovative techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they perform at these data rates, data volumes, and access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, results to date from evaluating available database technologies against LSST requirements, and the proposed database architecture to meet the data challenges.

  12. The LSST: A System of Systems

    NASA Astrophysics Data System (ADS)

    Claver, Chuck F.; Dubois-Felsmann, G. P.; Delgado, F.; Hascall, P.; Horn, D.; Marshall, S.; Nordby, M.; Schalk, T. L.; Schumacher, G.; Sebag, J.; LSST Project Team

    2010-01-01

    The LSST is a complete observing system that acquires and archives images, processes and analyzes them, and publishes reduced images and catalogs of sources and objects. The LSST will operate over a ten year period producing a survey of 20,000 square degrees over the entire southern sky in 6 filters (ugrizy) with each field having been visited several hundred times, enabling a wide spectrum of science from fast transients to exploration of dark matter and dark energy. The LSST itself is a complex system of systems consisting of the 8.4m three-mirror telescope, a 3.2 billion pixel camera, and a peta-scale data management system. The LSST project uses a Model Based Systems Engineering (MBSE) methodology to ensure an integrated approach to system design and rigorous definition of system interfaces and specifications. The MBSE methodology is applied through modeling of the LSST's systems with the System Modeling Language (SysML). The SysML modeling recursively establishes the threefold relationship between requirements, logical & physical functional decomposition and definition, and system and component behavior at successively deeper levels of abstraction and detail. The MBSE approach is applied throughout all stages of the project from design, to validation and verification, through to commissioning.

  13. Effect of condensed corn distillers solubles concentration on lactation performance of Holstein cows.

    PubMed

    McCormick, M E; Forbes, S; Moreira, V R; Blouin, D C; Han, K J

    2015-03-01

    Forty-eight Holstein cows (32 multiparous and 16 primiparous) in mid to late lactation averaging 219±71 days in milk and 30.5±6.6 kg/d of 3.5% fat-corrected milk were used in a 56-d completely randomized design experiment to evaluate condensed corn distillers solubles (CCDS) inclusion in high-fiber total mixed rations (TMR). Inclusion rates evaluated were 0, 6.6, 13.2, and 19.8% CCDS as a percentage of dry matter (DM). Distiller solubles substituted for soybean meal, corn grain, and whole cottonseed such that diets were similar in protein (16.6%) and fat (4.50%). Water was added to 0, 6.6, and 13.2% CCDS treatments so that final TMR DM concentrations (47.8%) were similar across diets. The forage portion of the diet was kept constant at 19.6% annual ryegrass hay and 26.0% sorghum baleage. Diet in vitro true digestibility tended to increase as CCDS addition increased, but neutral detergent fiber digestibility trended lower in CCDS diets. Percent P (0.39, 0.55, 0.69, and 0.73%) and S (0.32, 0.35, 0.39, and 0.42%) in TMR increased as CCDS concentration increased. Milk yield (23.5, 24.7, 25.5, and 24.8 kg/d of 3.5% fat-corrected milk) was similar for control and CCDS diets. Milk fat (3.88, 3.73, 3.78, and 3.68%), protein (3.28, 3.27, 3.31, and 3.31%), and lactose (4.61, 4.66, 4.69, and 4.77) percentages were similar across diets. Milk urea nitrogen (16.60, 15.58, 15.43, and 14.75 mg/dL) declined with increasing CCDS addition. Animal activity, body weight, body condition scores, and locomotion scores were not influenced by CCDS. Day 28 poststudy locomotion scores were similar across diets. Ruminal acetate concentrations did not differ among diets, but propionate and butyrate concentrations were elevated in rumen fluid of cows receiving 19.6% CCDS. Although rumen fluid pH values were similar (6.5, 6.4, 6.3, and 6.2), the two highest CCDS diets exhibited depressed acetate:propionate ratios relative to controls. The results from this study indicate that CCDS may be included in high-fiber TMR for lactating dairy cows at up to nearly 20% of diet DM; however, caution is recommended because high CCDS P concentrations may create Ca:P imbalances and excess P may be introduced into the environment. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  14. The Effects of Commercial Airline Traffic on LSST Observing Efficiency

    NASA Astrophysics Data System (ADS)

    Gibson, Rose; Claver, Charles; Stubbs, Christopher

    2016-01-01

    The Large Synoptic Survey Telescope (LSST) is a ten-year survey that will map the southern sky in six different filters 800 times before the end of its run. In this paper, we explore the primary effect of airline traffic on scheduling the LSST observations in addition to the secondary effect of condensation trails, or contrails, created by the presence of the aircraft. The large national investment being made in LSST implies that small improvements in observing efficiency through aircraft and contrail avoidance can result in a significant improvement in the quality of the survey and its science. We have used the Automatic Dependent Surveillance-Broadcast (ADS-B) signals received from commercial aircraft to monitor and record activity over the LSST site. We installed an ADS-B ground station on Cerro Pachón, Chile, consisting of a 1090 MHz antenna on the Andes Lidar Observatory feeding an RTL2832U software-defined radio. We used dump1090 to convert the received ADS-B telemetry into BaseStation format, and we found that during the busiest time of the night there were only 4 signals being received each minute on average, which will have a very small direct effect, if any, on the LSST observing scheduler. As part of future studies we will examine the effects of contrails on LSST observations. Gibson was supported by the NOAO/KPNO Research Experiences for Undergraduates (REU) Program which is funded by the National Science Foundation Research Experience for Undergraduates Program (AST-1262829).
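
    A minimal sketch of the message-rate bookkeeping described above is shown below. It assumes the common BaseStation (SBS-1) CSV layout produced by dump1090, with the generated date and time in the seventh and eighth fields, and simply counts messages per minute; the file name in the usage note is hypothetical.

      import csv
      from collections import Counter

      def messages_per_minute(path):
          """Count ADS-B messages per minute in a BaseStation (SBS-1) log file.

          Assumes the usual SBS-1 comma-separated layout with 'MSG' records and
          the generated date/time in fields 7 and 8 (1-indexed).
          """
          counts = Counter()
          with open(path, newline="") as f:
              for row in csv.reader(f):
                  if len(row) < 8 or row[0] != "MSG":
                      continue
                  date, time = row[6], row[7]
                  counts[f"{date} {time[:5]}"] += 1   # bin on HH:MM
          return counts

      # Example usage (hypothetical file name):
      # rates = messages_per_minute("basestation_2016-01-01.log")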

  15. IAC level "O" program development

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1982-01-01

    The current status of the IAC development activity is summarized. The listed prototype software and documentation were delivered, and details were planned for development of the level 1 operational system. The planned end-product IAC is required to support LSST design analysis and performance evaluation, with emphasis on the coupling of required technical disciplines. The long-term IAC effectively provides two distinct features: a specific set of analysis modules (thermal, structural, controls, antenna radiation performance, and instrument optical performance) that will function together with the IAC supporting software in an integrated and user-friendly manner; and a general framework whereby new analysis modules can readily be incorporated into the IAC or be allowed to communicate with it.

  16. The Hyper Suprime-Cam software pipeline

    DOE PAGES

    Bosch, James; Armstrong, Robert; Bickerton, Steven; ...

    2017-10-12

    Here in this article, we describe the optical imaging data processing pipeline developed for the Subaru Telescope’s Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope’s Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.

  17. The Hyper Suprime-Cam software pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosch, James; Armstrong, Robert; Bickerton, Steven

    Here in this article, we describe the optical imaging data processing pipeline developed for the Subaru Telescope’s Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope’s Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.

  18. Machine Learning for Zwicky Transient Facility

    NASA Astrophysics Data System (ADS)

    Mahabal, Ashish; Zwicky Transient Facility, Catalina Real-Time Transient Survey

    2018-01-01

    The Zwicky Transient Facility (ZTF) will operate from 2018 to 2020, covering the accessible sky with its large 47 square degree camera. The transient detection rate is expected to be about a million per night. ZTF is thus a perfect LSST prototype. The big difference is that all of the ZTF transients can be followed up by 4- to 8-m class telescopes. Given the large numbers, using human scanners to separate the genuine transients from artifacts is out of the question. That first step, as well as classifying the transients with minimal follow-up, requires machine learning. We describe the tools and plans to take on this task using follow-up facilities and knowledge gained from archival datasets.

  19. Designing a multi-petabyte database for LSST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becla, J; Hanushevsky, A

    2005-12-21

    The 3.2 giga-pixel LSST camera will produce over half a petabyte of raw images every month. These data need to be reduced in under a minute to produce real-time transient alerts, and then cataloged and indexed to allow efficient access and simplify further analysis. The indexed catalogs alone are expected to grow at a rate of about 600 terabytes per year. The sheer volume of data, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require cutting-edge techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they will scale and perform at these data volumes and in anticipated LSST access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, and the database architecture that is expected to be adopted in order to meet the data challenges.

  20. TRANSITING PLANETS WITH LSST. II. PERIOD DETECTION OF PLANETS ORBITING 1 M⊙ HOSTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacklin, Savannah; Lund, Michael B.; Stassun, Keivan G.

    2015-07-15

    The Large Synoptic Survey Telescope (LSST) will photometrically monitor ∼10^9 stars for 10 years. The resulting light curves can be used to detect transiting exoplanets. In particular, as demonstrated by Lund et al., LSST will probe stellar populations currently undersampled in most exoplanet transit surveys, including out to extragalactic distances. In this paper we test the efficiency of the box-fitting least-squares (BLS) algorithm for accurately recovering the periods of transiting exoplanets using simulated LSST data. We model planets with a range of radii orbiting a solar-mass star at a distance of 7 kpc, with orbital periods ranging from 0.5 to 20 days. We find that standard-cadence LSST observations will be able to reliably recover the periods of Hot Jupiters with periods shorter than ∼3 days; however, it will remain a challenge to confidently distinguish these transiting planets from false positives. At the same time, we find that the LSST deep-drilling cadence is extremely powerful: the BLS algorithm successfully recovers at least 30% of sub-Saturn-size exoplanets with orbital periods as long as 20 days, and a simple BLS power criterion robustly distinguishes ∼98% of these from photometric (i.e., statistical) false positives.
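
    The box-fitting least-squares search itself is straightforward to reproduce with standard tools. The sketch below recovers the period of a toy transit signal using astropy's BoxLeastSquares on an irregularly sampled light curve; the sampling and noise are illustrative and do not mimic the simulated LSST cadences used in the paper.

      import numpy as np
      from astropy.timeseries import BoxLeastSquares

      # Toy light curve: irregular sampling, a 2.7 d transiting planet, white noise.
      rng = np.random.default_rng(42)
      t = np.sort(rng.uniform(0.0, 365.0, 900))            # days
      period, depth, duration = 2.7, 0.01, 0.1
      in_transit = (t % period) < duration
      y = 1.0 - depth * in_transit + rng.normal(0.0, 0.003, t.size)

      bls = BoxLeastSquares(t, y, dy=0.003)
      result = bls.autopower(duration)                      # scan periods for a 0.1 d box
      best = result.period[np.argmax(result.power)]
      print(f"recovered period: {best:.3f} d")              # ~2.7 d for this toy signal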

  1. LSST and the Epoch of Reionization Experiments

    NASA Astrophysics Data System (ADS)

    Ivezić, Željko

    2018-05-01

    The Large Synoptic Survey Telescope (LSST), a next generation astronomical survey, sited on Cerro Pachon in Chile, will provide an unprecedented amount of imaging data for studies of the faint optical sky. The LSST system includes an 8.4m (6.7m effective) primary mirror and a 3.2 Gigapixel camera with a 9.6 sq. deg. field of view. This system will enable about 10,000 sq. deg. of sky to be covered twice per night, every three to four nights on average, with typical 5-sigma depth for point sources of r = 24.5 (AB). With over 800 observations in the ugrizy bands over a 10-year period, these data will enable coadded images reaching r = 27.5 (about 5 magnitudes deeper than SDSS) as well as studies of faint time-domain astronomy. The measured properties of newly discovered and known astrometric and photometric transients will be publicly reported within 60 sec after closing the shutter. The resulting hundreds of petabytes of imaging data for about 40 billion objects will be used for scientific investigations ranging from the properties of near-Earth asteroids to characterizations of dark matter and dark energy. For example, simulations estimate that LSST will discover about 1,000 quasars at redshifts exceeding 7; this sample will place tight constraints on the cosmic environment at the end of the reionization epoch. In addition to a brief introduction to LSST, I review the value of LSST data in support of epoch of reionization experiments and discuss how international participants can join LSST.

  2. Photometric Redshifts with the LSST: Evaluating Survey Observing Strategies

    NASA Astrophysics Data System (ADS)

    Graham, Melissa L.; Connolly, Andrew J.; Ivezić, Željko; Schmidt, Samuel J.; Jones, R. Lynne; Jurić, Mario; Daniel, Scott F.; Yoachim, Peter

    2018-01-01

    In this paper we present and characterize a nearest-neighbors color-matching photometric redshift estimator that features a direct relationship between the precision and accuracy of the input magnitudes and the output photometric redshifts. This aspect makes our estimator an ideal tool for evaluating the impact of changes to LSST survey parameters that affect the measurement errors of the photometry, which is the main motivation of our work (i.e., it is not intended to provide the “best” photometric redshifts for LSST data). We show how the photometric redshifts will improve with time over the 10 year LSST survey and confirm that the nominal distribution of visits per filter provides the most accurate photo-z results. The LSST survey strategy naturally produces observations over a range of airmass, which offers the opportunity of using an SED- and z-dependent atmospheric effect on the observed photometry as a color-independent redshift indicator. We show that measuring this airmass effect and including it as a prior has the potential to improve the photometric redshifts and can ameliorate extreme outliers, but that it will only be adequately measured for the brightest galaxies, which limits its overall impact on LSST photometric redshifts. We furthermore demonstrate how this airmass effect can induce a bias in the photo-z results, and caution against survey strategies that prioritize high-airmass observations for the purpose of improving this prior. Ultimately, we intend for this work to serve as a guide for the expectations and preparations of the LSST science community with regard to the minimum quality of photo-z as the survey progresses.
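
    As a rough illustration of a nearest-neighbours photo-z estimate built on colours (not the estimator characterized in the paper), the sketch below trains a k-nearest-neighbours regressor on toy ugrizy colours with known redshifts and reports the scatter of (z_phot - z_spec)/(1 + z_spec); all magnitudes and redshifts are synthetic.

      import numpy as np
      from sklearn.neighbors import KNeighborsRegressor

      rng = np.random.default_rng(1)

      def toy_photometry(z, n_bands=6, scatter=0.05):
          """Fake ugrizy magnitudes whose colours trend smoothly with redshift."""
          base = 22.0 + 0.5 * z[:, None] * np.linspace(0.5, 1.5, n_bands)
          return base + rng.normal(0.0, scatter, (z.size, n_bands))

      z_train = rng.uniform(0.0, 3.0, 5000)
      colors_train = -np.diff(toy_photometry(z_train), axis=1)   # u-g, g-r, r-i, i-z, z-y

      model = KNeighborsRegressor(n_neighbors=20, weights="distance")
      model.fit(colors_train, z_train)

      z_test = rng.uniform(0.0, 3.0, 500)
      z_phot = model.predict(-np.diff(toy_photometry(z_test), axis=1))
      print("sigma:", np.std((z_phot - z_test) / (1 + z_test)))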

  3. NASA's commercial space program - Initiatives for the future

    NASA Technical Reports Server (NTRS)

    Rose, James T.; Stone, Barbara A.

    1990-01-01

    NASA's commercial development of the space program aimed at the stimulation and assistance of expanded private sector involvement and investment in civil space activities is discussed, focusing on major new program initiatives and their implementation. NASA's Centers for the Commercial Development of Space (CCDS) program, composed of competitively selected consortia of universities, industries, and government involved in early research and testing phases of potentially commercially viable technologies is described. The 16 centers concentrate on seven different technical areas such as automation and robotics; remote sensing; life sciences; and space power, propulsion, and structures. Private sector participation, CCDS technology development, government and commercially supplied access to space in support of CCDS programs, CCDS hardware development, and CCDS spinoffs are discussed together with various cooperative and reimbursable agreements between NASA and the private sector.

  4. LSST Resources for the Community

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne

    2011-01-01

    LSST will generate 100 petabytes of images and 20 petabytes of catalogs, covering 18,000-20,000 square degrees of area sampled every few days, over a total of ten years -- all publicly available and exquisitely calibrated. The primary access to this data will be through Data Access Centers (DACs). DACs will provide access to catalogs of sources (single detections from individual images) and objects (associations of sources from multiple images). Simple user interfaces or direct SQL queries at the DAC can return user-specified portions of data from catalogs or images. More complex manipulations of the data, such as calculating multi-point correlation functions or creating alternative photo-z measurements on terabyte-scale data, can be completed with the DAC's own resources. Even more data-intensive computations requiring access to large numbers of image pixels on the petabyte scale could also be conducted at the DAC, using compute resources allocated in a similar manner to a TAC. DAC resources will be available to all individuals in member countries or institutes and LSST science collaborations. DACs will also assist investigators with requests for allocations at national facilities such as the Petascale Computing Facility, TeraGrid, and Open Science Grid. Using data on this scale requires new approaches to accessibility and analysis which are being developed through interactions with the LSST Science Collaborations. We are producing simulated images (as might be acquired by LSST) based on models of the universe and generating catalogs from these images (as well as from the base model) using the LSST data management framework in a series of data challenges. The resulting images and catalogs are being made available to the science collaborations to verify the algorithms and develop user interfaces. All LSST software is open source and available online, including preliminary catalog formats. We encourage feedback from the community.
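
    The sketch below mimics the sort of simple catalogue query described above, using an in-memory SQLite table as a stand-in for a DAC catalogue; the table and column names (Object, objectId, ra, decl, r_mag) are hypothetical and not the actual LSST schema.

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("CREATE TABLE Object (objectId INTEGER, ra REAL, decl REAL, r_mag REAL)")
      con.executemany(
          "INSERT INTO Object VALUES (?, ?, ?, ?)",
          [(1, 150.01, 2.01, 23.2), (2, 150.30, 2.40, 21.7), (3, 149.99, 1.98, 24.9)],
      )

      # A user-specified portion of the catalogue: a small box on the sky, brighter than r = 24.
      rows = con.execute(
          """SELECT objectId, ra, decl, r_mag
             FROM Object
             WHERE ra BETWEEN ? AND ? AND decl BETWEEN ? AND ? AND r_mag < ?""",
          (149.9, 150.1, 1.9, 2.1, 24.0),
      ).fetchall()
      print(rows)   # -> [(1, 150.01, 2.01, 23.2)]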

  5. Measuring the Growth Rate of Structure with Type IA Supernovae from LSST

    NASA Astrophysics Data System (ADS)

    Howlett, Cullan; Robotham, Aaron S. G.; Lagos, Claudia D. P.; Kim, Alex G.

    2017-10-01

    We investigate the peculiar motions of galaxies up to z = 0.5 using Type Ia supernovae (SNe Ia) from the Large Synoptic Survey Telescope (LSST) and predict the subsequent constraints on the growth rate of structure. We consider two cases. Our first is based on measurements of the volumetric SNe Ia rate and assumes we can obtain spectroscopic redshifts and light curves for varying fractions of objects that are detected pre-peak luminosity by LSST (some of which may be obtained by LSST itself, and others that would require additional follow-up observations). We find that these measurements could produce growth rate constraints at z < 0.5 that significantly outperform those found using Redshift Space Distortions (RSD) with DESI or 4MOST, even though there are ˜ 4× fewer objects. For our second case, we use semi-analytic simulations and a prescription for the SNe Ia rate as a function of stellar mass and star-formation rate to predict the number of LSST SNe Ia whose host redshifts may already have been obtained with the Taipan+WALLABY surveys or with a future multi-object spectroscopic survey. We find ˜18,000 and ˜160,000 SNe Ia with host redshifts for these cases, respectively. While this is only a fraction of the total LSST-detected SNe Ia, they could be used to significantly augment and improve the growth rate constraints compared to only RSD. Ultimately, we find that combining LSST SNe Ia with large numbers of galaxy redshifts will provide the most powerful probe of large-scale gravity in the z < 0.5 regime over the coming decades.

  6. The Large Synoptic Survey Telescope project management control system

    NASA Astrophysics Data System (ADS)

    Kantor, Jeffrey P.

    2012-09-01

    The Large Synoptic Survey Telescope (LSST) program is jointly funded by the NSF, the DOE, and private institutions and donors. From an NSF funding standpoint, the LSST is a Major Research Equipment and Facilities Construction (MREFC) project. The NSF funding process requires proposals and D&D reviews to include activity-based budgets and schedules; documented basis of estimates; risk-based contingency analysis; and cost escalation and categorization. "Out-of-the-box," the commercial tool Primavera P6 contains approximately 90% of the planning and estimating capability needed to satisfy R&D phase requirements, and it is customizable/configurable for the remainder with relatively little effort. We describe the customization/configuration and use of Primavera for the LSST Project Management Control System (PMCS), assess our experience to date, and describe future directions. Examples in this paper are drawn from the LSST Data Management System (DMS), which is one of three main subsystems of the LSST and is funded by the NSF. By astronomy standards the LSST DMS is a large data management project, processing and archiving over 70 petabytes of image data, producing over 20 petabytes of catalogs annually, and generating 2 million transient alerts per night. Over the 6-year construction and commissioning phase, the DM project is estimated to require 600,000 hours of engineering effort. In total, the DMS cost is approximately 60% hardware/system software and 40% labor.

  7. The moderating role of cognitive control deficits in the link from emotional dissonance to burnout symptoms and absenteeism.

    PubMed

    Diestel, Stefan; Schmidt, Klaus-Helmut

    2011-07-01

    The present study examines whether cognitive control deficits (CCDs) as a personal vulnerability factor amplify the relationship between emotional dissonance (ED; perceived discrepancy between felt and expressed emotions) and burnout symptoms (emotional exhaustion and depersonalization) as well as absenteeism. CCDs refer to daily failures and impairments of attention regulation, impulse control, and memory. The prediction of the moderator effect of CCDs draws on the argument that portraying emotions which are not genuinely felt is a form of self-regulation taxing and depleting a limited resource capacity. Interindividual differences in the resource capacity are reflected by the measure of CCDs. Drawing on two German samples (one cross-sectional and one longitudinal sample; NTOTAL = 645) of service employees, the present study analyzed interactive effects of ED and CCDs on exhaustion, depersonalization, and two indicators of absenteeism. As was hypothesized, latent moderated structural equation modeling revealed that the adverse impacts of ED on both burnout symptoms and absence behavior were amplified as a function of CCDs. Theoretical and practical implications of the present results will be discussed.

  8. Hyper Suprime-Cam: characteristics of 116 fully depleted back-illuminated CCDs

    NASA Astrophysics Data System (ADS)

    Kamata, Yukiko; Miyazaki, Satoshi; Nakaya, Hidehiko; Komiyama, Yutaka; Obuchi, Yoshiyuki; Kawanomoto, Satoshi; Uraguchi, Fumihiro; Utsumi, Yosuke; Suzuki, Hisanori; Miyazaki, Yasuhito; Muramatsu, Masaharu

    2012-07-01

    Hyper Suprime-Cam (HSC) is a wide field imaging camera with a field of view (FOV) 1.5 degrees in diameter, which is to be installed at the prime focus of the Subaru Telescope. The large FOV is realized by 116 2K × 4K-pixel fully depleted back-illuminated CCDs (FDCCDs) with 15 μm square pixels. The acceptance inspection of the CCDs started around the end of 2009 and finished in June 2011. We measured basic characteristics such as charge transfer efficiency (CTE), dark current, readout noise, linearity, and the number of dead columns for all CCDs, and measured the quantum efficiency (QE) of 21 CCDs. As a result, we confirmed exceptional quality and performance for all CCDs and were able to select the best possible 116 CCDs. We also measured the flatness of each CCD at room temperature, and optimally placed them on the focal plane plate. In this paper, we report the results of the acceptance inspection and the installation process into the HSC dewar.

  9. From Science To Design: Systems Engineering For The Lsst

    NASA Astrophysics Data System (ADS)

    Claver, Chuck F.; Axelrod, T.; Fouts, K.; Kantor, J.; Nordby, M.; Sebag, J.; LSST Collaboration

    2009-01-01

    The LSST is a general-purpose survey telescope that will address scores of scientific missions. To help the technical teams converge on a specific engineering design, the LSST Science Requirements Document (SRD) selects four stressing principal scientific missions: 1) constraining dark matter and dark energy; 2) taking an inventory of the Solar System; 3) exploring the transient optical sky; and 4) mapping the Milky Way. From these 4 missions the SRD specifies the needed requirements for single images and the full 10-year survey that enable a wide range of science beyond the 4 principal missions. Through optical design and analysis, operations simulation, and throughput modeling, the systems engineering effort in the LSST has largely focused on taking the SRD specifications and deriving system functional requirements that define the system design. A Model Based Systems Engineering approach with SysML is used to manage the flow down of requirements from science to system function to sub-system. The rigor of requirements flow and management assists the LSST in keeping the overall scope, hence budget and schedule, under control.

  10. Strong Gravitational Lensing with LSST

    NASA Astrophysics Data System (ADS)

    Marshall, Philip J.; Bradac, M.; Chartas, G.; Dobler, G.; Eliasdottir, A.; Falco, E.; Fassnacht, C. D.; Jee, M. J.; Keeton, C. R.; Oguri, M.; Tyson, J. A.; LSST Strong Lensing Science Collaboration

    2010-01-01

    LSST will find more strong gravitational lensing events than any other survey preceding it, and will monitor them all at a cadence of a few days to a few weeks. We can expect the biggest advances in strong lensing science made with LSST to be in those areas that benefit most from the large volume, and the high accuracy multi-filter time series: studies of, and using, several thousand lensed quasars and several hundred supernovae. However, the high quality imaging will allow us to detect and measure large numbers of background galaxies multiply-imaged by galaxies, groups and clusters. In this poster we give an overview of the strong lensing science enabled by LSST, and highlight the particular associated technical challenges that will have to be faced when working with the survey.

  11. Expanding the user base beyond HEP for the Ganga distributed analysis user interface

    NASA Astrophysics Data System (ADS)

    Currie, R.; Egede, U.; Richards, A.; Slater, M.; Williams, M.

    2017-10-01

    This document presents the results of recent developments within the Ganga[1] project to support users from new communities outside of HEP. In particular I will examine the case of users from the Large Synoptic Survey Telescope (LSST) group looking to use resources provided by the UK-based GridPP[2][3] DIRAC[4][5] instance. An example use case is work performed with users from the LSST Virtual Organisation (VO) to distribute the workflow used for galaxy shape identification analyses. This work highlighted some LSST-specific challenges which could be well solved by common tools within the HEP community. As a result of this work the LSST community was able to take advantage of GridPP[2][3] resources to perform large computing tasks within the UK.
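
    A minimal sketch of the kind of Ganga job one might configure for such a workflow is shown below. It is meant to run inside a Ganga session (where Job, Executable, Dirac and GenericSplitter are provided by Ganga itself); the executable name and the per-tile splitter arguments are purely hypothetical.

      # Run inside a Ganga session; Job, Executable, Dirac and GenericSplitter
      # are Ganga built-ins. The script name and tile arguments are hypothetical.
      j = Job(name="lsst-galaxy-shapes")
      j.application = Executable(exe="./measure_shapes.sh")   # user analysis script
      j.backend = Dirac()                                     # GridPP DIRAC backend
      j.splitter = GenericSplitter()
      j.splitter.attribute = "application.args"
      j.splitter.values = [[f"tile_{i:04d}"] for i in range(100)]  # one subjob per sky tile
      j.submit()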

  12. Patient-based and clinical outcomes of implant telescopic attachment-retained mandibular overdentures: a 1-year longitudinal prospective study.

    PubMed

    Yunus, Norsiah; Saub, Roslan; Taiyeb Ali, Tara Bai; Salleh, Nosizana Mohd; Baig, Mirza Rustum

    2014-01-01

    The purpose of this study was to evaluate and compare Oral Health-Related Quality of Life (OHRQoL), denture satisfaction, and masticatory performance in edentulous patients provided with mandibular implant-supported overdentures (ISODs) retained with telescopic attachments and those of conventional complete dentures (CCDs). Peri-implant soft tissue changes were also evaluated at various intervals during a 1-year observation period. Participating patients received new CCDs and later received two mandibular interforaminal implants and had their mandibular CCDs converted into ISODs with telescopic attachments. Questionnaires were used to assess OHRQoL (Shortened Oral Health Impact Profile-14, Malaysian version) and denture satisfaction at different stages of treatment with CCDs and ISODs. Objective masticatory performance with the CCDs and ISODs was recorded with a mixing ability test. Evaluations were carried out at 3 months with the new CCDs, 3 months after mandibular ISOD provision, and 1 year after receiving the ISOD. Peri-implant parameters were additionally assessed at specific intervals during the treatment period. The data obtained were statistically analyzed and compared. In the 17 patients who completed the protocol, significant improvements were observed in OHRQoL and patient satisfaction when CCDs were modified to ISODs, after 3 months, and at 1 year. Significantly better mixing ability with the ISOD was noted, with the highest values observed at 1 year. Statistically insignificant differences were observed for all the peri-implant parameters, except for gingival recession, for which significant changes were observed 6 months after ISOD delivery (values had stabilized by 1 year). Telescopic crown attachment-retained mandibular ISODs improved OHRQoL, dental prosthesis satisfaction, and masticatory performance compared to CCDs. Peri-implant soft tissue response and implant stability were found to be favorable after 1 year.

  13. Managing Radiation Degradation of CCDs on the Chandra X-Ray Observatory--III

    NASA Technical Reports Server (NTRS)

    O'Dell, Stephen L.; Aldcroft, Thomas L.; Blackwell, William C.; Bucher, Sabina L.; Chappell, Jon H.; DePasquale, Joseph M.; Grant, Catherine E.; Juda, Michael; Martin, Eric R.; Minow, Joseph I.

    2007-01-01

    The CCDs on the Chandra X-ray Observatory are vulnerable to radiation damage from low-energy protons scattered off the telescope's mirrors onto the focal plane. Following unexpected damage incurred early in the mission, the Chandra team developed, implemented, and maintains a radiation-protection program. This program--involving scheduled radiation safing during radiation-belt passes, intervention based upon real-time space-weather conditions and radiation-environment modeling, and on-board radiation monitoring with autonomous radiation safing--has successfully managed the radiation damage to the CCDs. Since implementing the program, the charge-transfer inefficiency (CTI) has increased at an average annual rate of only 3.2×10^-6 (2.3 percent) for the front-illuminated CCDs and 1.0×10^-6 (6.7 percent) for the back-illuminated CCDs. This paper describes the current status of the Chandra radiation-management program, emphasizing enhancements implemented since the previous papers.

  14. LSST Astroinformatics And Astrostatistics: Data-oriented Astronomical Research

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.; Stassun, K.; Brunner, R. J.; Djorgovski, S. G.; Graham, M.; Hakkila, J.; Mahabal, A.; Paegert, M.; Pesenson, M.; Ptak, A.; Scargle, J.; Informatics, LSST; Statistics Team

    2011-01-01

    The LSST Informatics and Statistics Science Collaboration (ISSC) focuses on research and scientific discovery challenges posed by the very large and complex data collection that LSST will generate. Application areas include astroinformatics, machine learning, data mining, astrostatistics, visualization, scientific data semantics, time series analysis, and advanced signal processing. Research problems to be addressed with these methodologies include transient event characterization and classification, rare class discovery, correlation mining, outlier/anomaly/surprise detection, improved estimators (e.g., for photometric redshift or early onset supernova classification), exploration of highly dimensional (multivariate) data catalogs, and more. We present sample science results from these data-oriented approaches to large-data astronomical research. We present results from LSST ISSC team members, including the EB (Eclipsing Binary) Factory, the environmental variations in the fundamental plane of elliptical galaxies, and outlier detection in multivariate catalogs.
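
    For one of the methodologies listed above, outlier detection in multivariate catalogues, a generic illustrative approach is sketched below using scikit-learn's IsolationForest on a toy feature table; it is not tied to any specific ISSC analysis.

      import numpy as np
      from sklearn.ensemble import IsolationForest

      # Toy multivariate "catalogue": a few colours plus a variability amplitude per object.
      rng = np.random.default_rng(2)
      features = rng.normal(0.0, 1.0, size=(100_000, 5))
      features[:50] += 6.0                       # plant a handful of obvious outliers

      clf = IsolationForest(contamination=1e-3, random_state=0)
      labels = clf.fit_predict(features)         # -1 marks candidate anomalies
      print("flagged:", int(np.sum(labels == -1)))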

  15. Evaluation of Potential LSST Spatial Indexing Strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikolaev, S; Abdulla, G; Matzke, R

    2006-10-13

    The LSST requirement for producing alerts in near real-time, and the fact that generating an alert depends on knowing the history of light variations for a given sky position, both imply that the clustering information for all detections must be available at any time during the survey. Therefore, any data structure describing the clustering of detections in LSST needs to be continuously updated, even as new detections are arriving from the pipeline. We call this use case ''incremental clustering'', to reflect this continuous updating of clustering information. This document describes the evaluation results for several potential LSST incremental clustering strategies, using: (1) a Neighbors table and zone optimization to store spatial clusters (a.k.a. Jim Gray's, or SDSS, algorithm); (2) the MySQL built-in R-tree implementation; (3) an external spatial index library which supports a query interface.
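
    The core idea of the zone-based strategy (1) is easy to sketch outside of a database: detections are bucketed into declination zones so that a neighbour search only needs to touch the few zones that can contain matches. The NumPy sketch below illustrates that pruning step with a flat-sky distance cut; it is an illustration of the idea, not one of the evaluated implementations, and the zone height is arbitrary.

      import numpy as np

      ZONE_HEIGHT = 1.0 / 60.0   # zone height in degrees (one arcminute, illustrative)

      def zone_id(dec):
          """Map declination (deg) to an integer zone index."""
          return np.floor((np.asarray(dec) + 90.0) / ZONE_HEIGHT).astype(int)

      def neighbors(ra, dec, ra0, dec0, radius):
          """Indices of detections within `radius` deg of (ra0, dec0).

          Candidate detections are first restricted to the zones overlapping the
          search circle, then an exact (flat-sky) distance cut is applied.
          """
          zones = zone_id(dec)
          zmin, zmax = zone_id(dec0 - radius), zone_id(dec0 + radius)
          cand = np.where((zones >= zmin) & (zones <= zmax))[0]
          dra = (ra[cand] - ra0) * np.cos(np.radians(dec0))
          ddec = dec[cand] - dec0
          return cand[dra**2 + ddec**2 <= radius**2]

      rng = np.random.default_rng(3)
      ra, dec = rng.uniform(0, 10, 100_000), rng.uniform(-5, 5, 100_000)
      print(neighbors(ra, dec, 5.0, 0.0, 1.0 / 3600.0))   # 1 arcsec match radius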

  16. Photometric classification and redshift estimation of LSST Supernovae

    NASA Astrophysics Data System (ADS)

    Dai, Mi; Kuhlmann, Steve; Wang, Yun; Kovacs, Eve

    2018-07-01

    Supernova (SN) classification and redshift estimation using photometric data only have become very important for the Large Synoptic Survey Telescope (LSST), given the large number of SNe that LSST will observe and the impossibility of spectroscopically following up all of them. We investigate the performance of an SN classifier that uses SN colours to classify LSST SNe with the Random Forest classification algorithm. Our classifier results in an area under the curve (AUC) of 0.98, which represents excellent classification. We are able to obtain a photometric SN sample containing 99 per cent SNe Ia by choosing a probability threshold. We estimate the photometric redshifts (photo-z) of SNe in our sample by fitting the SN light curves using the SALT2 model with nested sampling. We obtain a mean bias ⟨z_phot - z_spec⟩ of 0.012 with σ((z_phot - z_spec)/(1 + z_spec)) = 0.0294 without using a host-galaxy photo-z prior, and a mean bias ⟨z_phot - z_spec⟩ of 0.0017 with σ((z_phot - z_spec)/(1 + z_spec)) = 0.0116 using a host-galaxy photo-z prior. Assuming a flat ΛCDM model with Ωm = 0.3, we obtain Ωm = 0.305 ± 0.008 (statistical errors only) using the simulated LSST sample of photometric SNe Ia (with intrinsic scatter σint = 0.11) derived using our methodology without a host-galaxy photo-z prior. Our method will help boost the power of SNe from the LSST as cosmological probes.
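
    A stripped-down version of such a colour-based Random Forest classifier is sketched below on synthetic, weakly separable "colours"; it reproduces the general workflow (train, score, threshold on the Ia probability) but none of the paper's simulations or feature engineering.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      # Synthetic SN "colours": label 1 = Ia, 0 = non-Ia, with a small class offset.
      rng = np.random.default_rng(4)
      n = 20_000
      labels = rng.integers(0, 2, n)
      colors = rng.normal(loc=0.3 * labels[:, None], scale=0.5, size=(n, 4))

      X_tr, X_te, y_tr, y_te = train_test_split(colors, labels, test_size=0.3, random_state=0)
      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      clf.fit(X_tr, y_tr)

      p_ia = clf.predict_proba(X_te)[:, 1]
      print("AUC:", roc_auc_score(y_te, p_ia))
      ia_sample = X_te[p_ia > 0.9]   # thresholding trades completeness for purity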

  17. A Cross-Correlated Delay Shift Supervised Learning Method for Spiking Neurons with Application to Interictal Spike Detection in Epilepsy.

    PubMed

    Guo, Lilin; Wang, Zhenzhong; Cabrerizo, Mercedes; Adjouadi, Malek

    2017-05-01

    This study introduces a novel learning algorithm for spiking neurons, called CCDS, which is able to learn and reproduce arbitrary spike patterns in a supervised fashion allowing the processing of spatiotemporal information encoded in the precise timing of spikes. Unlike the Remote Supervised Method (ReSuMe), synapse delays and axonal delays in CCDS are variants which are modulated together with weights during learning. The CCDS rule is both biologically plausible and computationally efficient. The properties of this learning rule are investigated extensively through experimental evaluations in terms of reliability, adaptive learning performance, generality to different neuron models, learning in the presence of noise, effects of its learning parameters and classification performance. Results presented show that the CCDS learning method achieves learning accuracy and learning speed comparable with ReSuMe, but improves classification accuracy when compared to both the Spike Pattern Association Neuron (SPAN) learning rule and the Tempotron learning rule. The merit of CCDS rule is further validated on a practical example involving the automated detection of interictal spikes in EEG records of patients with epilepsy. Results again show that with proper encoding, the CCDS rule achieves good recognition performance.

  18. Systems engineering in the Large Synoptic Survey Telescope project: an application of model based systems engineering

    NASA Astrophysics Data System (ADS)

    Claver, C. F.; Selvy, Brian M.; Angeli, George; Delgado, Francisco; Dubois-Felsmann, Gregory; Hascall, Patrick; Lotz, Paul; Marshall, Stuart; Schumacher, German; Sebag, Jacques

    2014-08-01

    The Large Synoptic Survey Telescope project was an early adopter of SysML and Model Based Systems Engineering practices. The LSST project began using MBSE for requirements engineering beginning in 2006 shortly after the initial release of the first SysML standard. Out of this early work the LSST's MBSE effort has grown to include system requirements, operational use cases, physical system definition, interfaces, and system states along with behavior sequences and activities. In this paper we describe our approach and methodology for cross-linking these system elements over the three classical systems engineering domains - requirement, functional and physical - into the LSST System Architecture model. We also show how this model is used as the central element to the overall project systems engineering effort. More recently we have begun to use the cross-linked modeled system architecture to develop and plan the system verification and test process. In presenting this work we also describe "lessons learned" from several missteps the project has had with MBSE. Lastly, we conclude by summarizing the overall status of the LSST's System Architecture model and our plans for the future as the LSST heads toward construction.

  19. Commentary: Learning About the Sky Through Simulations. Chapter 34

    NASA Technical Reports Server (NTRS)

    Way, Michael J.

    2012-01-01

    The Large Synoptic Survey Telescope (LSST) simulator being built by Andy Connolly and collaborators is an impressive undertaking and should make working with LSST in the beginning stages far easier than it was initially with the Sloan Digital Sky Survey (SDSS). However, I would like to focus on an equally important problem that has not yet been discussed here, but that the community will need to address in the coming years: can we deal with the flood of data from LSST, and will we need to rethink the way we work?

  20. A Search for Optically Faint GEO Debris

    DTIC Science & Technology

    2011-09-01

    Lederer, M. (NASA Orbital Debris Program Office, Johnson Space Center, Houston, TX); Barker, Edwin S. (LZ Technology, Inc., Houston, TX); Heather...

    ...fainter optical limits requires use of larger telescopes. Detectors on all small GEO survey instruments are usually CCDs, with peak quantum...CCDs. There are small gaps between the individual CCDs in the detector mosaic. The telescope can track at non-sidereal rates, allowing tracking

  1. Cosmology with the Large Synoptic Survey Telescope: an overview

    NASA Astrophysics Data System (ADS)

    Zhan, Hu; Tyson, J. Anthony

    2018-06-01

    The Large Synoptic Survey Telescope (LSST) is a high-étendue imaging facility that is being constructed atop Cerro Pachón in northern Chile. It is scheduled to begin science operations in 2022. With an 8.4 m (6.7 m effective) aperture, a novel three-mirror design achieving a seeing-limited 9.6-square-degree field of view, and a 3.2 gigapixel camera, the LSST has the deep-wide-fast imaging capability necessary to carry out an 18,000-square-degree survey in six passbands (ugrizy) to a coadded depth of r ~ 27.5 over 10 years, using roughly 90% of its observational time. The remaining ~10% of the time will be devoted to considerably deeper and faster time-domain observations and smaller surveys. In total, each patch of the sky in the main survey will receive about 800 visits allocated across the six passbands, with 30 s exposure visits. The huge volume of high-quality LSST data will provide a wide range of science opportunities and, in particular, open a new era of precision cosmology with unprecedented statistical power and tight control of systematic errors. In this review, we give a brief account of the LSST cosmology program with an emphasis on dark energy investigations. The LSST will address dark energy physics and cosmology in general by exploiting diverse precision probes including large-scale structure, weak lensing, type Ia supernovae, galaxy clusters, and strong lensing. Combined with the cosmic microwave background data, these probes form interlocking tests on the cosmological model and the nature of dark energy in the presence of various systematics. The LSST data products will be made available to the US and Chilean scientific communities and to international partners with no proprietary period. Close collaborations with contemporaneous imaging and spectroscopy surveys observing at a variety of wavelengths, resolutions, depths, and timescales will be a vital part of the LSST science program, which will not only enhance specific studies but, more importantly, also allow a more complete understanding of the Universe through different windows.

  2. Effect of elastic band-based high-speed power training on cognitive function, physical performance and muscle strength in older women with mild cognitive impairment.

    PubMed

    Yoon, Dong Hyun; Kang, Dongheon; Kim, Hee-Jae; Kim, Jin-Soo; Song, Han Sol; Song, Wook

    2017-05-01

    The effectiveness of resistance training in improving cognitive function in older adults is well demonstrated. In particular, unconventional high-speed resistance training can improve muscle power development. In the present study, the effectiveness of 12 weeks of elastic band-based high-speed power training (HSPT) was examined. Participants were randomly assigned into a HSPT group (n = 14, age 75.0 ± 0.9 years), a low-speed strength training (LSST) group (n = 9, age 76.0 ± 1.3 years) and a control group (CON; n = 7, age 78.0 ± 1.0 years). A 1-h exercise program was provided twice a week for 12 weeks for the HSPT and LSST groups, and balance and tone exercises were carried out by the CON group. Significant increases in levels of cognitive function, physical function, and muscle strength were observed in both the HSPT and LSST groups. In cognitive function, significant improvements in the Mini-Mental State Examination and Montreal Cognitive Assessment were seen in both the HSPT and LSST groups compared with the CON group. In physical functions, Short Physical Performance Battery scores were increased significantly in the HSPT and LSST groups compared with the CON group. In the 12 weeks of elastic band-based training, the HSPT group showed greater improvements in older women with mild cognitive impairment than the LSST group, although both regimens were effective in improving cognitive function, physical function and muscle strength. We conclude that elastic band-based HSPT, as compared with LSST, is more efficient in helping older women with mild cognitive impairment to improve cognitive function, physical performance and muscle strength. Geriatr Gerontol Int 2017; 17: 765-772. © 2016 Japan Geriatrics Society.

  3. LSST Painting Risk Evaluation Memo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, Justin E.

    The optics subsystem is required to paint the edges of optics black where possible. Due to the risks in applying the paint LSST requests a review of the impact of removing this requirement for the filters and L3.

  4. Predicting Constraints on Ultra-Light Axion Parameters due to LSST Observations

    NASA Astrophysics Data System (ADS)

    Given, Gabriel; Grin, Daniel

    2018-01-01

    Ultra-light axions (ULAs) are a type of dark matter or dark energy candidate (depending on the mass) that are predicted to have a mass between $10^{-33}$ and $10^{-18}$ eV. The Large Synoptic Survey Telescope (LSST) is expected to provide a large number of weak lensing observations, which will lower the statistical uncertainty on the convergence power spectrum. I began work with Daniel Grin to predict how accurately the data from the LSST will be able to constrain ULA properties. I wrote Python code that takes a matter power spectrum calculated by axionCAMB and converts it to a convergence power spectrum. My code then takes derivatives of the convergence power spectrum with respect to several cosmological parameters; these derivatives will be used in Fisher matrix analysis to determine the sensitivity of LSST observations to axion parameters.
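    The Fisher-matrix step described above can be illustrated with a short Python sketch. This is a generic single-spectrum Fisher forecast with a simple Gaussian covariance, not the author's actual code; the survey numbers and function names are illustrative assumptions, and the fiducial spectrum and derivatives would in practice come from axionCAMB plus a Limber projection, which are not reproduced here.

      import numpy as np

      # Hypothetical inputs: multipole bins plus assumed survey parameters.
      ell = np.arange(10, 3000)
      f_sky = 0.4                                 # assumed LSST sky fraction
      n_gal = 26.0 * (180.0 * 60.0 / np.pi) ** 2  # 26 galaxies/arcmin^2 expressed per steradian
      sigma_eps = 0.26                            # intrinsic shape noise per component

      def spectrum_errors(c_ell):
          # Gaussian error per multipole for a shear auto-spectrum, including shape noise.
          noise = sigma_eps ** 2 / n_gal
          return np.sqrt(2.0 / ((2.0 * ell + 1.0) * f_sky)) * (c_ell + noise)

      def fisher_matrix(c_ell, dc_dp):
          # dc_dp maps parameter name -> dC_ell/dparameter, sampled on the same ell grid.
          err = spectrum_errors(c_ell)
          names = list(dc_dp)
          fisher = np.zeros((len(names), len(names)))
          for i, a in enumerate(names):
              for j, b in enumerate(names):
                  fisher[i, j] = np.sum(dc_dp[a] * dc_dp[b] / err ** 2)
          return names, fisher

      # Marginalized 1-sigma forecasts follow from np.sqrt(np.diag(np.linalg.inv(fisher))).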

  5. Curvature wavefront sensing performance evaluation for active correction of the Large Synoptic Survey Telescope (LSST).

    PubMed

    Manuel, Anastacia M; Phillion, Donald W; Olivier, Scot S; Baker, Kevin L; Cannon, Brice

    2010-01-18

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary, along with three refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. In order to maintain image quality during operation, the deformations and rigid body motions of the three large mirrors must be actively controlled to minimize optical aberrations, which arise primarily from forces due to gravity and thermal expansion. We describe the methodology for measuring the telescope aberrations using a set of curvature wavefront sensors located in the four corners of the LSST camera focal plane. We present a comprehensive analysis of the wavefront sensing system, including the availability of reference stars, demonstrating that this system will perform to the specifications required to meet the LSST performance goals.
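    As an illustration of the underlying idea only (not the LSST wavefront pipeline itself), Roddier-style curvature sensing treats the normalized difference of intra- and extra-focal images as an estimate of the wavefront Laplacian and recovers the wavefront by solving a Poisson equation. A minimal sketch, with illustrative function names and no pupil-boundary handling, is:

      import numpy as np

      def curvature_signal(intra, extra):
          # Normalized intra/extra-focal difference, used as a proxy for the wavefront Laplacian.
          total = intra + extra
          signal = np.zeros_like(total, dtype=float)
          valid = total > 0
          signal[valid] = (intra[valid] - extra[valid]) / total[valid]
          return signal

      def wavefront_from_laplacian(laplacian):
          # FFT-based Poisson solve: recover W (up to piston and boundary terms) from lap(W).
          ny, nx = laplacian.shape
          fy = np.fft.fftfreq(ny)[:, None]
          fx = np.fft.fftfreq(nx)[None, :]
          k2 = (2.0 * np.pi) ** 2 * (fx ** 2 + fy ** 2)
          k2[0, 0] = 1.0                 # avoid division by zero; piston is unconstrained
          w_hat = -np.fft.fft2(laplacian) / k2
          w_hat[0, 0] = 0.0
          return np.real(np.fft.ifft2(w_hat))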

  6. LSST Data Management

    NASA Astrophysics Data System (ADS)

    O'Mullane, William; LSST Data Management Team

    2018-01-01

    The Large Synoptic Survey Telescope (LSST) is an 8-m optical ground-based telescope being constructed on Cerro Pachon in Chile. LSST will survey half the sky every few nights in six optical bands. The data will be transferred to the data center in North America, where within 60 seconds it will be reduced using difference imaging and an alert list will be generated for the community. Additionally, annual data releases will be constructed from all the data during the 10-year mission, producing catalogs and deep co-added images with unprecedented time resolution for such a large region of sky. In this paper we present the current status of the LSST stack, including the data processing components, Qserv database and data visualization software, describe how to obtain it, and provide a summary of the development road map. We also discuss the move to Python 3 and the timeline for dropping Python 2.
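    The 60-second alert path rests on image differencing. The toy fragment below is illustrative only and omits the PSF matching that a production pipeline such as the LSST stack performs; it simply shows the basic subtract-and-threshold step on two registered images.

      import numpy as np

      def naive_difference_alerts(science, template, nsigma=5.0):
          # Toy differencing: subtract a registered template and flag bright positive residuals.
          # A real pipeline first matches the PSFs of the two images (e.g. Alard-Lupton or ZOGY).
          diff = science - template
          sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))  # robust noise estimate
          rows, cols = np.nonzero(diff > nsigma * sigma)
          return [(int(c), int(r), float(diff[r, c])) for r, c in zip(rows, cols)]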

  7. LSST summit facility construction progress report: reacting to design refinements and field conditions

    NASA Astrophysics Data System (ADS)

    Barr, Jeffrey D.; Gressler, William; Sebag, Jacques; Seriche, Jaime; Serrano, Eduardo

    2016-07-01

    The civil work, site infrastructure and buildings for the summit facility of the Large Synoptic Survey Telescope (LSST) are among the first major elements that need to be designed, bid and constructed to support the subsequent integration of the dome, telescope, optics, camera and supporting systems. As the contracts for those other major subsystems now move forward under the management of the LSST Telescope and Site (T and S) team, there has been inevitable and beneficial evolution in their designs, which has resulted in significant modifications to the facility and infrastructure. The earliest design requirements for the LSST summit facility were first documented in 2005, its contracted full design was initiated in 2010, and construction began in January, 2015. During that entire development period, and extending now roughly halfway through construction, there continue to be necessary modifications to the facility design resulting from the refinement of interfaces to other major elements of the LSST project and now, during construction, due to unanticipated field conditions. Changes from evolving interfaces have principally involved the telescope mount, the dome and mirror handling/coating facilities which have included significant variations in mass, dimensions, heat loads and anchorage conditions. Modifications related to field conditions have included specifying and testing alternative methods of excavation and contending with the lack of competent rock substrate where it was predicted to be. While these and other necessary changes are somewhat specific to the LSST project and site, they also exemplify inherent challenges related to the typical timeline for the design and construction of astronomical observatory support facilities relative to the overall development of the project.

  8. Photometric classification and redshift estimation of LSST Supernovae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Mi; Kuhlmann, Steve; Wang, Yun

    Supernova (SN) classification and redshift estimation using photometric data only have become very important for the Large Synoptic Survey Telescope (LSST), given the large number of SNe that LSST will observe and the impossibility of spectroscopically following up all the SNe. We investigate the performance of an SN classifier that uses SN colours to classify LSST SNe with the Random Forest classification algorithm. Our classifier results in an area-under-the-curve of 0.98, which represents excellent classification. We are able to obtain a photometric SN sample containing 99 percent SNe Ia by choosing a probability threshold. We estimate the photometric redshifts (photo-z) of SNe in our sample by fitting the SN light curves using the SALT2 model with nested sampling. We obtain a mean bias ⟨z_phot − z_spec⟩ of 0.012 with σ((z_phot − z_spec)/(1 + z_spec)) = 0.0294 without using a host-galaxy photo-z prior, and a mean bias ⟨z_phot − z_spec⟩ of 0.0017 with σ((z_phot − z_spec)/(1 + z_spec)) = 0.0116 using a host-galaxy photo-z prior. Assuming a flat ΛCDM model with Ω_m = 0.3, we obtain Ω_m = 0.305 ± 0.008 (statistical errors only), using the simulated LSST sample of photometric SNe Ia (with intrinsic scatter σ_int = 0.11) derived using our methodology without a host-galaxy photo-z prior. Our method will help boost the power of SNe from the LSST as cosmological probes.
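    A minimal, self-contained sketch of the colour-based Random Forest step might look like the following. The feature table and labels are synthetic placeholders (not the paper's training set), and the column meanings are assumed to stand in for colours such as g-r, r-i and i-z near peak.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      # Placeholder features and labels, generated only so the demo runs end to end.
      rng = np.random.default_rng(0)
      X = rng.normal(size=(5000, 6))
      y = (X[:, 0] + 0.5 * rng.normal(size=5000) > 0).astype(int)   # 1 = "SN Ia", 0 = other

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = RandomForestClassifier(n_estimators=400, random_state=0).fit(X_tr, y_tr)

      p_ia = clf.predict_proba(X_te)[:, 1]
      print("AUC:", roc_auc_score(y_te, p_ia))
      # A photometrically pure Ia sample is then obtained by cutting on p_ia above a threshold.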

  9. Research-grade CMOS image sensors for remote sensing applications

    NASA Astrophysics Data System (ADS)

    Saint-Pe, Olivier; Tulet, Michel; Davancens, Robert; Larnaudie, Franck; Magnan, Pierre; Martin-Gonthier, Philippe; Corbiere, Franck; Belliot, Pierre; Estribeau, Magali

    2004-11-01

    Imaging detectors are key elements for optical instruments and sensors on board space missions dedicated to Earth observation (high resolution imaging, atmosphere spectroscopy...), Solar System exploration (micro cameras, guidance for autonomous vehicle...) and Universe observation (space telescope focal planes, guiding sensors...). This market has been dominated by CCD technology for long. Since the mid-90s, CMOS Image Sensors (CIS) have been competing with CCDs for consumer domains (webcams, cell phones, digital cameras...). Featuring significant advantages over CCD sensors for space applications (lower power consumption, smaller system size, better radiations behaviour...), CMOS technology is also expanding in this field, justifying specific R&D and development programs funded by national and European space agencies (mainly CNES, DGA and ESA). All along the 90s and thanks to their increasingly improving performances, CIS have started to be successfully used for more and more demanding space applications, from vision and control functions requiring low-level performances to guidance applications requiring medium-level performances. Recent technology improvements have made possible the manufacturing of research-grade CIS that are able to compete with CCDs in the high-performances arena. After an introduction outlining the growing interest of optical instruments designers for CMOS image sensors, this paper will present the existing and foreseen ways to reach high-level electro-optics performances for CIS. The developments and performances of CIS prototypes built using an imaging CMOS process will be presented in the corresponding section.

  10. Back-illuminated large area frame transfer CCDs for space-based hyper-spectral imaging applications

    NASA Astrophysics Data System (ADS)

    Philbrick, Robert H.; Gilmore, Angelo S.; Schrein, Ronald J.

    2016-07-01

    Standard offerings of large area, back-illuminated full frame CCD sensors are available from multiple suppliers and they continue to be commonly deployed in ground- and space-based applications. By comparison the availability of large area frame transfers CCDs is sparse, with the accompanying 2x increase in die area no doubt being a contributing factor. Modern back-illuminated CCDs yield very high quantum efficiency in the 290 to 400 nm band, a wavelength region of great interest in space-based instruments studying atmospheric phenomenon. In fast framing (e.g. 10 - 20 Hz), space-based applications such as hyper-spectral imaging, the use of a mechanical shutter to block incident photons during readout can prove costly and lower instrument reliability. The emergence of large area, all-digital visible CMOS sensors, with integrate while read functionality, are an alternative solution to CCDs; but, even after factoring in reduced complexity and cost of support electronics, the present cost to implement such novel sensors is prohibitive to cost constrained missions. Hence, there continues to be a niche set of applications where large area, back-illuminated frame transfer CCDs with high UV quantum efficiency, high frame rate, high full well, and low noise provide an advantageous solution. To address this need a family of large area frame transfer CCDs has been developed that includes 2048 (columns) x 256 (rows) (FT4), 2048 x 512 (FT5), and 2048 x 1024 (FT6) full frame transfer CCDs; and a 2048 x 1024 (FT7) split-frame transfer CCD. Each wafer contains 4 FT4, 2 FT5, 2 FT6, and 2 FT7 die. The designs have undergone radiation and accelerated life qualification and the electro-optical performance of these CCDs over the wavelength range of 290 to 900 nm is discussed.

  11. The LSST Data Mining Research Agenda

    NASA Astrophysics Data System (ADS)

    Borne, K.; Becla, J.; Davidson, I.; Szalay, A.; Tyson, J. A.

    2008-12-01

    We describe features of the LSST science database that are amenable to scientific data mining, object classification, outlier identification, anomaly detection, image quality assurance, and survey science validation. The data mining research agenda includes: scalability (at petabyte scales) of existing machine learning and data mining algorithms; development of grid-enabled parallel data mining algorithms; designing a robust system for brokering classifications from the LSST event pipeline (which may produce 10,000 or more event alerts per night); multi-resolution methods for exploration of petascale databases; indexing of multi-attribute, multi-dimensional astronomical databases (beyond spatial indexing) for rapid querying of petabyte databases; and more.

  12. On the Detectability of Planet X with LSST

    NASA Astrophysics Data System (ADS)

    Trilling, David E.; Bellm, Eric C.; Malhotra, Renu

    2018-06-01

    Two planetary mass objects in the far outer solar system—collectively referred to here as Planet X— have recently been hypothesized to explain the orbital distribution of distant Kuiper Belt Objects. Neither planet is thought to be exceptionally faint, but the sky locations of these putative planets are poorly constrained. Therefore, a wide area survey is needed to detect these possible planets. The Large Synoptic Survey Telescope (LSST) will carry out an unbiased, large area (around 18000 deg2), deep (limiting magnitude of individual frames of 24.5) survey (the “wide-fast-deep (WFD)” survey) of the southern sky beginning in 2022, and it will therefore be an important tool in searching for these hypothesized planets. Here, we explore the effectiveness of LSST as a search platform for these possible planets. Assuming the current baseline cadence (which includes the WFD survey plus additional coverage), we estimate that LSST will confidently detect or rule out the existence of Planet X in 61% of the entire sky. At orbital distances up to ∼75 au, Planet X could simply be found in the normal nightly moving object processing; at larger distances, it will require custom data processing. We also discuss the implications of a nondetection of Planet X in LSST data.

  13. Machine Learning-based Transient Brokers for Real-time Classification of the LSST Alert Stream

    NASA Astrophysics Data System (ADS)

    Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika; ANTARES Collaboration

    2018-01-01

    The number of transient events discovered by wide-field time-domain surveys already far outstrips the combined followup resources of the astronomical community. This number will only increase as we progress towards the commissioning of the Large Synoptic Survey Telescope (LSST), breaking the community's current followup paradigm. Transient brokers - software to sift through, characterize, annotate and prioritize events for followup - will be a critical tool for managing alert streams in the LSST era. Developing the algorithms that underlie the brokers, and obtaining simulated LSST-like datasets prior to LSST commissioning, to train and test these algorithms are formidable, though not insurmountable challenges. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is a joint project of the National Optical Astronomy Observatory and the Department of Computer Science at the University of Arizona. We have been developing completely automated methods to characterize and classify variable and transient events from their multiband optical photometry. We describe the hierarchical ensemble machine learning algorithm we are developing, and test its performance on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, as well as our progress towards incorporating these into a real-time event broker working on live alert streams from time-domain surveys.
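    As a rough illustration of what a hierarchical ensemble classifier can look like (illustrative class structure and feature assumptions, not the ANTARES implementation), a two-stage scheme might first separate broad classes and then assign subclasses within each branch:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      class TwoStageClassifier:
          # Toy hierarchical ensemble: stage 1 separates broad classes (e.g. periodic vs
          # transient), stage 2 assigns subclasses within each branch. Feature vectors are
          # assumed to be precomputed light-curve statistics (amplitude, period, colours, ...).

          def __init__(self):
              self.stage1 = RandomForestClassifier(n_estimators=300, random_state=0)
              self.stage2 = {}

          def fit(self, X, broad_labels, fine_labels):
              self.stage1.fit(X, broad_labels)
              for branch in np.unique(broad_labels):
                  mask = broad_labels == branch
                  clf = RandomForestClassifier(n_estimators=300, random_state=0)
                  self.stage2[branch] = clf.fit(X[mask], fine_labels[mask])
              return self

          def predict(self, X):
              branches = self.stage1.predict(X)
              out = np.empty(len(X), dtype=object)
              for branch in np.unique(branches):
                  mask = branches == branch
                  out[mask] = self.stage2[branch].predict(X[mask])
              return out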

  14. LSST: Cadence Design and Simulation

    NASA Astrophysics Data System (ADS)

    Cook, Kem H.; Pinto, P. A.; Delgado, F.; Miller, M.; Petry, C.; Saha, A.; Gee, P. A.; Tyson, J. A.; Ivezic, Z.; Jones, L.; LSST Collaboration

    2009-01-01

    The LSST Project has developed an operations simulator to investigate how best to observe the sky to achieve its multiple science goals. The simulator has a sophisticated model of the telescope and dome to properly constrain potential observing cadences. This model has also proven useful for investigating various engineering issues ranging from sizing of slew motors, to design of cryogen lines to the camera. The simulator is capable of balancing cadence goals from multiple science programs, and attempts to minimize time spent slewing as it carries out these goals. The operations simulator has been used to demonstrate a 'universal' cadence which delivers the science requirements for a deep cosmology survey, a Near Earth Object Survey and good sampling in the time domain. We will present the results of simulating 10 years of LSST operations using realistic seeing distributions, historical weather data, scheduled engineering downtime and current telescope and camera parameters. These simulations demonstrate the capability of the LSST to deliver a 25,000 square degree survey probing the time domain including 20,000 square degrees for a uniform deep, wide, fast survey, while effectively surveying for NEOs over the same area. We will also present our plans for future development of the simulator--better global minimization of slew time and eventual transition to a scheduler for the real LSST.
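    A toy version of the trade-off the simulator makes at each step, balancing science merit against slew cost, can be written in a few lines; the callables and numbers below are assumptions for illustration, and the real simulator handles many more constraints (weather, downtime, filter changes, ...).

      import numpy as np

      def next_field(current_field, candidate_fields, slew_time, science_merit):
          # Greedy cadence step: choose the candidate whose science merit best offsets the
          # time lost to slewing from the current pointing.
          scores = [science_merit(f) - slew_time(current_field, f) for f in candidate_fields]
          return candidate_fields[int(np.argmax(scores))]

      # Example with toy fields laid out on a line: slewing costs 1 unit per degree.
      fields = [0.0, 5.0, 12.0, 30.0]
      pick = next_field(10.0, fields, slew_time=lambda a, b: abs(a - b),
                        science_merit=lambda f: 20.0 if f > 25.0 else 8.0)
      print(pick)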

  15. Discovery of KPS-1b, a Transiting Hot-Jupiter, with an Amateur Telescope Setup (Abstract)

    NASA Astrophysics Data System (ADS)

    Benni, P.; Burdanov, A.; Krushinsky, V.; Sokov, E.

    2018-06-01

    (Abstract only) Using readily available amateur equipment, a wide-field telescope (Celestron RASA, 279 mm f/2.2) coupled with a SBIG ST-8300M camera was set up at a private residence in a fairly light polluted suburban town thirty miles outside of Boston, Massachusetts. This telescope participated in the Kourovka Planet Search (KPS) prototype survey, along with a MASTER-II Ural wide field telescope near Yekaterinburg, Russia. One goal was to determine if higher resolution imaging ( 2 arcsec/pixel) with much lower sky coverage can practically detect exoplanet transits compared to the successful very wide-field exoplanet surveys (KELT, XO, WASP, HATnet, TrES, Qatar, etc.) which used an array of small aperture telescopes coupled to CCDs.

  16. Formal Education with LSST

    NASA Astrophysics Data System (ADS)

    Herrold, Ardis; Bauer, Amanda, Dr.; Peterson, J. Matt; Large Synoptic Survey Telescope Education and Public Outreach Team

    2018-01-01

    The Large Synoptic Survey Telescope will usher in a new age of astronomical data exploration for science educators and students. LSST data sets will be large, deep, and dynamic, and will establish a time-domain record that will extend over a decade. They will be used to provide engaging, relevant learning experiences. The EPO Team will develop online investigations using authentic LSST data that offer varying levels of challenge and depth by the start of telescope operations, slated to begin in 2022. The topics will cover common introductory astronomy concepts, and will align with the four science domains of LSST: the Milky Way, the changing sky (transients), solar system (moving) objects, and dark matter and dark energy. Online Jupyter notebooks will make LSST data easy for students at the advanced middle school through college levels to access and analyze. Using online notebooks will circumvent common obstacles caused by firewalls, bandwidth issues, and the need to download software, as they will be accessible from any computer or tablet with internet access. Although the LSST EPO Jupyter notebooks are Python-based, a knowledge of programming will not be required to use them. Each topical investigation will include teacher and student versions of Jupyter notebooks, instructional videos, and access to a suite of support materials including a forum, and professional development training and tutorial videos. Jupyter notebooks will contain embedded widgets to process data, eliminating the need to use external spreadsheets and plotting software. Students will be able to analyze data by using some of the existing modules already developed for professional astronomers. This will shorten the time needed to conduct investigations and will shift the emphasis to understanding the underlying science themes, which is often lost with novice learners.

  17. Giga-z: A 100,000 Object Superconducting Spectrophotometer for LSST Follow-up

    NASA Astrophysics Data System (ADS)

    Marsden, Danica W.; Mazin, Benjamin A.; O'Brien, Kieran; Hirata, Chris

    2013-09-01

    We simulate the performance of a new type of instrument, a Superconducting Multi-Object Spectrograph (SuperMOS), that uses microwave kinetic inductance detectors (MKIDs). MKIDs, a new detector technology, feature good quantum efficiency in the UVOIR, can count individual photons with microsecond timing accuracy, and, like X-ray calorimeters, determine their energy to several percent. The performance of Giga-z, a SuperMOS designed for wide field imaging follow-up observations, is evaluated using simulated observations of the COSMOS mock catalog with an array of 100,000 MKID pixels with energy resolution R_{423 nm} = E/ΔE = 30. We compare our results against a simultaneous simulation of LSST observations. In 3 yr on a dedicated 4 m class telescope, Giga-z could observe ≈2 billion galaxies, yielding a low-resolution spectral energy distribution spanning 350-1350 nm for each; 1000 times the number measured with any currently proposed LSST spectroscopic follow-up, at a fraction of the cost and time. Giga-z would provide redshifts for galaxies up to z ≈ 6 with magnitudes m_i ≲ 25, with accuracy σ_Δz/(1 + z) ≈ 0.03 for the whole sample, and σ_Δz/(1 + z) ≈ 0.007 for a select subset. We also find catastrophic failure rates and biases that are consistently lower than for LSST. The added constraint on dark energy parameters for WL + CMB by Giga-z using the FoMSWG default model is equivalent to multiplying the LSST Fisher matrix by a factor of α = 1.27 (w_p), 1.53 (w_a), or 1.98 (Δγ). This is equivalent to multiplying both the LSST coverage area and the training sets by α and reducing all systematics by a factor of 1/√α, advantages that are robust to even more extreme models of intrinsic alignment.

  18. The Large Synoptic Survey Telescope: Projected Near-Earth Object Discovery Performance

    NASA Technical Reports Server (NTRS)

    Chesley, Steven R.; Veres, Peter

    2016-01-01

    The Large Synoptic Survey Telescope (LSST) is a large-aperture, wide-field survey that has the potential to detect millions of asteroids. LSST is under construction with survey operations slated to begin in 2022. We describe an independent study to assess the performance of LSST for detecting and cataloging near-Earth objects (NEOs). A significant component of the study will be to assess the survey's ability to link observations of a single object from among the large numbers of false detections and detections of other objects. We also will explore the survey's basic performance in terms of fraction of NEOs discovered and cataloged, both for the planned baseline survey, but also for enhanced surveys that are more carefully tuned for NEO search, generally at the expense of other science drivers. Preliminary results indicate that with successful linkage under the current baseline survey LSST would discover approximately 65% of NEOs with absolute magnitude H is less than 22, which corresponds approximately to 140m diameter.

  19. An optical to IR sky brightness model for the LSST

    NASA Astrophysics Data System (ADS)

    Yoachim, Peter; Coughlin, Michael; Angeli, George Z.; Claver, Charles F.; Connolly, Andrew J.; Cook, Kem; Daniel, Scott; Ivezić, Željko; Jones, R. Lynne; Petry, Catherine; Reuter, Michael; Stubbs, Christopher; Xin, Bo

    2016-07-01

    To optimize the observing strategy of a large survey such as the LSST, one needs an accurate model of the night sky emission spectrum across a range of atmospheric conditions and from the near-UV to the near-IR. We have used the ESO SkyCalc Sky Model Calculator to construct a library of template spectra for the Chilean night sky. The ESO model includes emission from the upper and lower atmosphere, scattered starlight, scattered moonlight, and zodiacal light. We have then extended the ESO templates with an empirical fit to the twilight sky emission as measured by a Canon all-sky camera installed at the LSST site. With the ESO templates and our twilight model we can quickly interpolate to any arbitrary sky position and date and return the full sky spectrum or surface brightness magnitudes in the LSST filter system. Comparing our model to all-sky observations, we find typical residual RMS values of +/-0.2-0.3 magnitudes per square arcsecond.
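    The interpolation step described above can be pictured with a small sketch; the grid axes, band, and numerical values here are purely illustrative stand-ins for the actual template library, which stores full spectra indexed on more dimensions (moon phase, solar activity, azimuth, ...).

      import numpy as np
      from scipy.interpolate import RegularGridInterpolator

      # Stand-in template grid: sky surface brightness (mag/arcsec^2) tabulated versus
      # airmass and moon altitude for a single band, with toy numbers.
      airmass = np.linspace(1.0, 2.5, 16)
      moon_alt = np.linspace(-90.0, 90.0, 37)
      template_mag = (21.0
                      - 0.3 * (airmass[:, None] - 1.0)
                      - 0.02 * np.clip(moon_alt, 0.0, None)[None, :])

      sky_model = RegularGridInterpolator((airmass, moon_alt), template_mag)

      def sky_brightness(airmass_value, moon_alt_deg):
          # Interpolate the template grid to an arbitrary pointing.
          return float(sky_model([[airmass_value, moon_alt_deg]])[0])

      print(sky_brightness(1.2, 30.0))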

  20. Dose rate effects on array CCDs exposed by Co-60 γ rays induce saturation output degradation and annealing tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zujun, E-mail: wangzujun@nint.ac.cn; Chen, Wei; He, Baoping

    The experimental tests of dose rate and annealing effects on array charge-coupled devices (CCDs) are presented. The saturation output voltage (V_S) versus total dose at dose rates of 0.01, 0.1, 1.0, 10.0 and 50 rad(Si)/s is compared. Annealing tests are performed to eliminate the time-dependent effects. The V_S degradation levels depend on the dose rates. The V_S degradation mechanism induced by dose rate and annealing effects is analyzed. Measurements of V_S at 20 krad(Si) with a dose rate of 0.03 rad(Si)/s are supplemented to confirm the degradation curves between the dose rates of 0.1 and 0.01 rad(Si)/s. The CCDs are divided into two groups, with one group biased and the other unbiased during ^60Co γ irradiation. The V_S degradation of the biased CCDs during irradiation is more severe than that of the unbiased CCDs.

  1. Replacing a technology - The Large Space Telescope and CCDs

    NASA Astrophysics Data System (ADS)

    Smith, R. W.; Tatarewicz, J. H.

    1985-07-01

    The technological improvements, design choices and mission goals which led to the inclusion of CCD detectors in the wide field camera of the Large Space Telescope (LST) to be launched by the STS are recounted. Consideration of CCD detectors began before CCDs had seen wide astronomical applications. During planning for the ST, in the 1960s, photographic methods and a vidicon were considered, and seemed feasible provided that periodic manual maintenance could be performed. The invention of CCDs was first reported in 1970 and by 1973 the CCDs were receiving significant attention as potential detectors instead of a vidicon, which retained its own technological challenges. The CCD format gained new emphasis when success was achieved in developments for planetary-imaging spacecraft. The rapidity of progress in CCD capabilities, coupled with the continued shortcomings of the vidicon, resulted in a finalized choice for a CCD device by 1977. The decision was also prompted by continuing commercial and military interest in CCDs, which was spurring the development of the technology and improving the sensitivities and reliability while lowering the costs.

  2. Cosmology with the Large Synoptic Survey Telescope: an overview.

    PubMed

    Zhan, Hu; Anthony Tyson, J

    2018-06-01

    The Large Synoptic Survey Telescope (LSST) is a high étendue imaging facility that is being constructed atop Cerro Pachón in northern Chile. It is scheduled to begin science operations in 2022. With an [Formula: see text] ([Formula: see text] effective) aperture, a novel three-mirror design achieving a seeing-limited [Formula: see text] field of view, and a 3.2 gigapixel camera, the LSST has the deep-wide-fast imaging capability necessary to carry out an [Formula: see text] survey in six passbands (ugrizy) to a coadded depth of [Formula: see text] over 10 years using [Formula: see text] of its observational time. The remaining [Formula: see text] of the time will be devoted to considerably deeper and faster time-domain observations and smaller surveys. In total, each patch of the sky in the main survey will receive 800 visits allocated across the six passbands with [Formula: see text] exposure visits. The huge volume of high-quality LSST data will provide a wide range of science opportunities and, in particular, open a new era of precision cosmology with unprecedented statistical power and tight control of systematic errors. In this review, we give a brief account of the LSST cosmology program with an emphasis on dark energy investigations. The LSST will address dark energy physics and cosmology in general by exploiting diverse precision probes including large-scale structure, weak lensing, type Ia supernovae, galaxy clusters, and strong lensing. Combined with the cosmic microwave background data, these probes form interlocking tests on the cosmological model and the nature of dark energy in the presence of various systematics. The LSST data products will be made available to the US and Chilean scientific communities and to international partners with no proprietary period. Close collaborations with contemporaneous imaging and spectroscopy surveys observing at a variety of wavelengths, resolutions, depths, and timescales will be a vital part of the LSST science program, which will not only enhance specific studies but, more importantly, also allow a more complete understanding of the Universe through different windows.

  3. LSST Camera Optics Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riot, V J; Olivier, S; Bauman, B

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. The optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  4. Advancing the LSST Operations Simulator

    NASA Astrophysics Data System (ADS)

    Saha, Abhijit; Ridgway, S. T.; Cook, K. H.; Delgado, F.; Chandrasekharan, S.; Petry, C. E.; Operations Simulator Group

    2013-01-01

    The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions (including weather and seeing), as well as additional scheduled and unscheduled downtime. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history database are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. This poster reports recent work which has focussed on an architectural restructuring of the code that will allow us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator will be used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities, and assist with performance margin investigations of the LSST system.
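    Derived digests like those produced by SSTAR start from simple queries of the observing-history database. A sketch of such a query is given below, using sqlite3 and an in-memory table purely for illustration; the table and column names are made up, not the simulator's actual schema.

      import sqlite3

      # Tiny in-memory stand-in for an operations-simulator output database.
      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE observations (mjd REAL, ra REAL, dec REAL, filter TEXT, seeing REAL)")
      conn.executemany("INSERT INTO observations VALUES (?, ?, ?, ?, ?)",
                       [(59853.1, 10.2, -30.1, "r", 0.80),
                        (59853.2, 10.4, -30.0, "g", 1.10),
                        (59853.3, 55.0, -12.5, "r", 0.95)])

      # The kind of query a metric or merit function might start from: good-seeing r-band visits.
      rows = conn.execute(
          "SELECT mjd, ra, dec, seeing FROM observations "
          "WHERE filter = 'r' AND seeing < 0.9 ORDER BY mjd").fetchall()
      print(len(rows), "good-seeing r-band visits")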

  5. Scientific Synergy between LSST and Euclid

    NASA Astrophysics Data System (ADS)

    Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric; Bean, Rachel; Boutigny, Dominique; Bremer, Malcolm N.; Capak, Peter; Cardone, Vincenzo; Carry, Benoît; Conselice, Christopher J.; Connolly, Andrew J.; Cuillandre, Jean-Charles; Hatch, N. A.; Helou, George; Hemmati, Shoubaneh; Hildebrandt, Hendrik; Hložek, Renée; Jones, Lynne; Kahn, Steven; Kiessling, Alina; Kitching, Thomas; Lupton, Robert; Mandelbaum, Rachel; Markovic, Katarina; Marshall, Phil; Massey, Richard; Maughan, Ben J.; Melchior, Peter; Mellier, Yannick; Newman, Jeffrey A.; Robertson, Brant; Sauvage, Marc; Schrabback, Tim; Smith, Graham P.; Strauss, Michael A.; Taylor, Andy; Von Der Linden, Anja

    2017-12-01

    Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry, and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. We provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.

  6. Probing LSST's Ability to Detect Planets Around White Dwarfs

    NASA Astrophysics Data System (ADS)

    Cortes, Jorge; Kipping, David

    2018-01-01

    Over the last four years more than 2,000 planets outside our solar system have been discovered, motivating us to search for and characterize potentially habitable worlds. Most planets orbit Sun-like stars, but more exotic stars can also host planets. Debris disks and disintegrating planetary bodies have been detected around white dwarf stars, the inert, Earth-sized cores of once-thriving stars like our Sun. These detections are clues that planets may exist around white dwarfs. Due to the faintness of white dwarfs and the potential rarity of planets around them, a vast survey is required to have a chance at detecting these planetary systems. The Large Synoptic Survey Telescope (LSST), scheduled to commence operations in 2023, will image the entire southern sky every few nights for 10 years, providing our first real opportunity to detect planets around white dwarfs. We characterized LSST’s ability to detect planets around white dwarfs through simulations that incorporate realistic models for LSST’s observing strategy and the white dwarf distribution within the Milky Way galaxy. This was done through the use of LSST's Operations Simulator (OpSim) and Catalog Simulator (CatSim). Our preliminary results indicate that, if all white dwarfs were to possess a planet, LSST would yield a detection for every 100 observed white dwarfs. In the future, a larger set of ongoing simulations will help us quantify the number of planets LSST could potentially find.
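    To see why white dwarfs are promising transit hosts despite their faintness, a back-of-the-envelope depth estimate helps; the radii below are rough assumed values for illustration, not numbers from the study.

      # Transit depth ~ (R_planet / R_star)^2; a white dwarf is roughly Earth-sized, so even
      # small planets block a large fraction of its light.
      R_SUN_KM, R_EARTH_KM = 6.957e5, 6.371e3

      r_wd = 0.012 * R_SUN_KM      # assumed white dwarf radius (~1.3 Earth radii)
      for r_planet, label in [(0.5 * R_EARTH_KM, "Mars-size"), (1.0 * R_EARTH_KM, "Earth-size")]:
          depth = min(1.0, (r_planet / r_wd) ** 2)
          print(f"{label}: transit depth ~ {depth:.0%}")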

  7. Mapping Near-Earth Hazards

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-06-01

    How can we hunt down all the near-Earth asteroids that are capable of posing a threat to us? A new study looks at whether the upcoming Large Synoptic Survey Telescope (LSST) is up to the job.

    Charting Nearby Threats. LSST is an 8.4-m wide-survey telescope currently being built in Chile. When it goes online in 2022, it will spend the next ten years surveying our sky, mapping tens of billions of stars and galaxies, searching for signatures of dark energy and dark matter, and hunting for transient optical events like novae and supernovae. But in its scanning, LSST will also be looking for asteroids that approach near Earth. [Figure: Cumulative number of near-Earth asteroids discovered over time, as of June 16, 2016. NASA/JPL/Chamberlin]

    Near-Earth objects (NEOs) have the potential to be hazardous if they cross Earth's path and are large enough to do significant damage when they impact Earth. Earth's history is riddled with dangerous asteroid encounters, including the recent Chelyabinsk airburst in 2013, the encounter that caused the kilometer-sized Meteor Crater in Arizona, and the impact thought to have contributed to the extinction of the dinosaurs. Recognizing the potential danger that NEOs can pose to Earth, Congress has tasked NASA with tracking down 90% of NEOs larger than 140 meters in diameter. With our current survey capabilities, we believe we've discovered roughly 25% of these NEOs thus far. Now a new study led by Tommy Grav (Planetary Science Institute) examines whether LSST will be able to complete this task. [Figure: Absolute magnitude, H, of a synthetic NEO population. Though these NEOs are all larger than 140 m, they have a large spread in albedos. Grav et al. 2016]

    Can LSST Help? Based on previous observations of NEOs and the resulting predictions for NEO properties and orbits, Grav and collaborators simulate a synthetic population of NEOs, all above 140 m in size. With these improved population models, they demonstrate that the common tactic of using an asteroid's absolute magnitude as a proxy for its size is a poor approximation, due to asteroids' large spread in albedos. Roughly 23% of NEOs larger than 140 m have absolute magnitudes fainter than H = 22 mag, the authors show, which is the value usually assumed as the default absolute magnitude of a 140 m NEO. [Figure: Fraction of NEOs we've detected as a function of time based on the authors' simulations of the current surveys (red), LSST plus the current surveys (black), NEOCam plus the current surveys (blue), and the combined result for all surveys (green). Grav et al. 2016]

    Taking this into account, Grav and collaborators then use information about the planned LSST survey strategies and detection limits to test what fraction of this synthetic NEO population LSST will be able to detect in its proposed 10-year mission. The authors find that, within 10 years, LSST will likely be able to detect only 63% of NEOs larger than 140 m. Luckily, LSST may not have to work alone; in addition to the current surveys in operation, a proposed infrared space-based survey mission called NEOCam is planned for launch in 2021. If NEOCam is funded, it will complement LSST's discovery capabilities, potentially allowing the two surveys to jointly achieve the 90% detection goal within a decade.

    Citation: T. Grav et al. 2016, AJ, 151, 172. doi:10.3847/0004-6256/151/6/172
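    The albedo point above can be made concrete with the standard conversion between absolute magnitude H, geometric albedo p_V, and diameter, D(km) = 1329 * 10^(-H/5) / sqrt(p_V); the albedo values in the short calculation below are illustrative choices spanning dark to bright surfaces.

      import math

      def diameter_km(H, albedo):
          # Standard conversion from absolute magnitude H and geometric albedo to diameter.
          return 1329.0 / math.sqrt(albedo) * 10 ** (-H / 5.0)

      # H = 22 corresponds to ~140 m only for an assumed albedo near 0.14; darker or brighter
      # surfaces move the implied size considerably, which is the point made above.
      for p_v in (0.05, 0.14, 0.25):
          print(f"albedo {p_v:.2f}: D = {diameter_km(22.0, p_v) * 1000:.0f} m")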

  8. Characterization of multiport solid state imagers at megahertz data rates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yates, G.J.; Pena, C.R.; Turko, B.T.

    1994-08-01

    Test results obtained from two recently developed multiport Charge-Coupled Devices (CCDs) operated at pixel rates in the 10-to-100 MHz range will be presented. The CCDs were evaluated in Los Alamos National Laboratory's High Speed Solid State Imager Test Station (HSTS), which features PC-based programmable clock waveform generation (Tektronix DAS 9200) and synchronously clocked Digital Sampling Oscilloscopes (DSOs) (LeCroy 9424/9314 series) for CCD pixel data acquisition, analysis and storage. The HSTS also provided specially designed optical pinhole array test patterns in the 5-to-50 micron diameter range for use with Xenon strobe and pulsed laser light sources to simultaneously provide multiple single-pixel illumination patterns to study CCD point-spread-function (PSF) and pixel smear characteristics. The two CCDs tested, EEV model CCD-13 and EG&G Reticon model HSO512J, are both 512 × 512 pixel arrays with eight (8) and sixteen (16) video output ports respectively. Both devices are generically Frame Transfer CCDs (FT CCDs) designed for parallel bi-directional vertical readout to augment their multiport design for increased pixel rates over common single-port serial readout architecture. Although both CCDs were tested similarly, differences in their designs precluded normalization or any direct comparisons of test results. Rate-dependent parameters investigated include S/N, PSF, and MTF. The performance observed for the two imagers at various pixel rates from selected typical output ports is discussed.

  9. Radiation damage in charge-coupled devices.

    PubMed

    Bassler, Niels

    2010-08-01

    Due to their high sensitivity and signal-to-noise ratio, charge-coupled devices (CCDs) have been the preferred optical photon detectors of astronomers for several decades. CCDs are flown in space as the main detection instrument on several well-known missions, such as the Hubble Space Telescope, XMM-Newton or the Cassini Probe. Also, CCDs are frequently used in satellite star trackers, which provide attitude information to the satellite orientation system. However, one major drawback is their extreme vulnerability to radiation, which is readily abundant in space. Here, we give a brief overview of radiation effects on CCDs, and mention ways to mitigate these effects beyond merely increasing shielding, such as cooling and annealing. As an example, we have investigated the radiation hardness of a particular CCD, the CCD47-20 from Marconi Applied Technologies (now E2V), by exposing it to radiation fields representing the radiation environment found in a highly elliptic orbit crossing the Van Allen radiation belts. Two engineering-grade CCDs were irradiated with proton beams and photons, and effects of increased bulk dark current, surface dark current and inversion threshold voltage shifts were observed and are quantified.

  10. Research-grade CMOS image sensors for demanding space applications

    NASA Astrophysics Data System (ADS)

    Saint-Pé, Olivier; Tulet, Michel; Davancens, Robert; Larnaudie, Franck; Magnan, Pierre; Corbière, Franck; Martin-Gonthier, Philippe; Belliot, Pierre

    2004-06-01

    Imaging detectors are key elements for optical instruments and sensors on board space missions dedicated to Earth observation (high resolution imaging, atmosphere spectroscopy...), Solar System exploration (micro cameras, guidance for autonomous vehicle...) and Universe observation (space telescope focal planes, guiding sensors...). This market has been dominated by CCD technology for long. Since the mid-90s, CMOS Image Sensors (CIS) have been competing with CCDs for more and more consumer domains (webcams, cell phones, digital cameras...). Featuring significant advantages over CCD sensors for space applications (lower power consumption, smaller system size, better radiations behaviour...), CMOS technology is also expanding in this field, justifying specific R&D and development programs funded by national and European space agencies (mainly CNES, DGA, and ESA). All along the 90s and thanks to their increasingly improving performances, CIS have started to be successfully used for more and more demanding applications, from vision and control functions requiring low-level performances to guidance applications requiring medium-level performances. Recent technology improvements have made possible the manufacturing of research-grade CIS that are able to compete with CCDs in the high-performances arena. After an introduction outlining the growing interest of optical instruments designers for CMOS image sensors, this talk will present the existing and foreseen ways to reach high-level electro-optics performances for CIS. The developments of CIS prototypes built using an imaging CMOS process and of devices based on improved designs will be presented.

  11. Research-grade CMOS image sensors for demanding space applications

    NASA Astrophysics Data System (ADS)

    Saint-Pé, Olivier; Tulet, Michel; Davancens, Robert; Larnaudie, Franck; Magnan, Pierre; Corbière, Franck; Martin-Gonthier, Philippe; Belliot, Pierre

    2017-11-01

    Imaging detectors are key elements for optical instruments and sensors on board space missions dedicated to Earth observation (high resolution imaging, atmosphere spectroscopy...), Solar System exploration (micro cameras, guidance for autonomous vehicle...) and Universe observation (space telescope focal planes, guiding sensors...). This market has been dominated by CCD technology for long. Since the mid- 90s, CMOS Image Sensors (CIS) have been competing with CCDs for more and more consumer domains (webcams, cell phones, digital cameras...). Featuring significant advantages over CCD sensors for space applications (lower power consumption, smaller system size, better radiations behaviour...), CMOS technology is also expanding in this field, justifying specific R&D and development programs funded by national and European space agencies (mainly CNES, DGA, and ESA). All along the 90s and thanks to their increasingly improving performances, CIS have started to be successfully used for more and more demanding applications, from vision and control functions requiring low-level performances to guidance applications requiring medium-level performances. Recent technology improvements have made possible the manufacturing of research-grade CIS that are able to compete with CCDs in the high-performances arena. After an introduction outlining the growing interest of optical instruments designers for CMOS image sensors, this talk will present the existing and foreseen ways to reach high-level electro-optics performances for CIS. The developments of CIS prototypes built using an imaging CMOS process and of devices based on improved designs will be presented.

  12. Development of the COMmerical Experiment Transporter (COMET)

    NASA Technical Reports Server (NTRS)

    Pawlick, Joseph F., Jr.

    1990-01-01

    In order to commercialize space, this nation must develop a well-defined path through which the Centers for the Commercial Development of Space (CCDS's) and their industrial partners and counterparts can exploit the advantages of space manufacturing and processing. Such a capability requires systems, a supporting infrastructure, and funding to become a viable component of this nation's economic strength. This paper follows the development of the COMmercial Experiment Transporter (COMET) from inception to its current position as the country's first space program dedicated to satisfying the needs of industry: an industry which must investigate the feasibility of space-based processes, materials, and prototypes. With proposals now being evaluated, much of the COMET story is yet to be written; however, the concepts and events which led to its current status, and the plans for implementation, are presented.

  13. Modeling Charge Collection in Detector Arrays

    NASA Technical Reports Server (NTRS)

    Hardage, Donna (Technical Monitor); Pickel, J. C.

    2003-01-01

    A detector array charge collection model has been developed for use as an engineering tool to aid in the design of optical sensor missions for operation in the space radiation environment. This model is an enhancement of the prototype array charge collection model that was developed for the Next Generation Space Telescope (NGST) program. The primary enhancements were accounting for drift-assisted diffusion by Monte Carlo modeling techniques and implementing the modeling approaches in a Windows-based code. The modeling is concerned with integrated charge collection within discrete pixels in the focal plane array (FPA), with high-fidelity spatial resolution. It is applicable to all detector geometries, including monolithic charge-coupled devices (CCDs), Active Pixel Sensors (APS) and hybrid FPA geometries based on a detector array bump-bonded to a readout integrated circuit (ROIC).
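    The drift-assisted-diffusion idea can be caricatured with a few lines of Monte Carlo; this is a toy lateral-diffusion model with made-up parameters, not the NGST-derived engineering tool described above.

      import numpy as np

      def collect_charge(n_electrons, depth_um, pixel_um=10.0, diffusion_um=4.0, grid=5, seed=1):
          # Toy Monte Carlo: carriers generated at one depth spread laterally by a Gaussian
          # whose width grows crudely with depth, then are binned into a small neighbourhood
          # of pixels around the hit. All parameter values are illustrative only.
          rng = np.random.default_rng(seed)
          sigma = diffusion_um * np.sqrt(depth_um / 10.0)
          x = rng.normal(0.0, sigma, n_electrons)
          y = rng.normal(0.0, sigma, n_electrons)
          ix = np.clip(np.rint(x / pixel_um).astype(int) + grid // 2, 0, grid - 1)
          iy = np.clip(np.rint(y / pixel_um).astype(int) + grid // 2, 0, grid - 1)
          image = np.zeros((grid, grid))
          np.add.at(image, (iy, ix), 1.0)
          return image / n_electrons      # fraction of the generated charge collected per pixel

      print(collect_charge(100_000, depth_um=20.0).round(3))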

  14. Final Technical Report for DE-SC0012297

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dell'Antonio, Ian

    This is the final report on the work performed in award DE-SC0012297, Cosmic Frontier work in support of the LSST Dark Energy Science Collaboration's effort to develop algorithms, simulations, and statistical tests to ensure optimal extraction of the dark energy properties from galaxy clusters observed with LSST. This work focused on effects that could produce a systematic error on the measurement of cluster masses (which will be used to probe the effects of dark energy on the growth of structure). These effects stem from deviations from pure ellipticity of the gravitational lensing signal and from the blending of light of neighboring galaxies. Both these effects are expected to be more significant for LSST than for stage III experiments such as the Dark Energy Survey. We calculate the magnitude of the mass error (or bias) for the first time and demonstrate that it can be treated as a multiplicative correction and calibrated out, allowing mass measurements of clusters from gravitational lensing to meet the requirements of LSST's dark energy investigation.

  15. VizieR Online Data Catalog: Imaging and spectroscopy in Lynx W (Jorgensen+, 2014)

    NASA Astrophysics Data System (ADS)

    Jorgensen, I.; Chiboucas, K.; Toft, S.; Bergmann, M.; Zirm, A.; Schiavon, R. P.; Grutzbauch, R.

    2017-01-01

    Ground-based imaging of RX J0848.6+4453 was obtained primarily to show the performance gain provided by replacing the original E2V charge-coupled devices (E2V CCDs) in the Gemini Multi-Object Spectrograph on Gemini North (GMOS-N) with E2V Deep Depletion CCDs (E2V DD CCDs). This replacement was done in 2011 October. Imaging of RX J0848.6+4453 was obtained with the original E2V CCDs in 2011 October (UT 2011 Oct 1 to 2011 Oct 2; Program ID: GN-2011B-DD-3) and repeated with the E2V DD CCDs in 2011 November. The imaging was done in the z' filter. For the observations with the original E2V CCDs the total exposure time was 60 minutes (obtained as 12 five-minute exposures) and the co-added image had an image quality of FWHM=0.52'' measured from point sources in the field. For the E2V DD CCDs a total exposure time of 55 minutes was obtained and the resulting image quality was FWHM=0.51''. Imaging of RX J0848.6+4453 was also obtained with the Hubble Space Telescope/Advanced Camera for Surveys (HST/ACS, using the filters F775W and F850LP) under the program ID 9919. The spectroscopic observations were obtained in multi-object spectroscopic (MOS) mode with GMOS-N (UT 2011 Nov 24 to 2012 Jan 4, Program ID: GN-2011B-DD-5; UT 2013 Mar 9 to 2013 May 18, Program ID: GN-2013A-Q-65). Table 10 lists the photometric parameters for the spectroscopic sample as derived from the HST/ACS observations in F850LP and F775W. Tables 11 and 12 list the results from the template fitting and the derived line strengths, respectively. (3 data files).

  16. Scientific Synergy between LSST and Euclid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric

    We report that Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry, and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. Also, we provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.

  17. Scientific Synergy between LSST and Euclid

    DOE PAGES

    Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric; ...

    2017-12-07

    We report that Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry, and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. Also, we provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.

  18. Using SysML for MBSE analysis of the LSST system

    NASA Astrophysics Data System (ADS)

    Claver, Charles F.; Dubois-Felsmann, Gregory; Delgado, Francisco; Hascall, Pat; Marshall, Stuart; Nordby, Martin; Schalk, Terry; Schumacher, German; Sebag, Jacques

    2010-07-01

    The Large Synoptic Survey Telescope is a complex hardware-software system of systems, making up a highly automated observatory in the form of an 8.4m wide-field telescope, a 3.2 billion pixel camera, and a peta-scale data processing and archiving system. As a project, the LSST is using model based systems engineering (MBSE) methodology for developing the overall system architecture coded with the Systems Modeling Language (SysML). With SysML we use a recursive process to establish three-fold relationships between requirements, logical & physical structural component definitions, and overall behavior (activities and sequences) at successively deeper levels of abstraction and detail. Using this process we have analyzed and refined the LSST system design, ensuring the consistency and completeness of the full set of requirements and their match to associated system structure and behavior. As the recursion process proceeds to deeper levels we derive more detailed requirements and specifications, and ensure their traceability. We also expose, define, and specify critical system interfaces, physical and information flows, and clarify the logic and control flows governing system behavior. The resulting integrated model database is used to generate documentation and specifications and will evolve to support activities from construction through final integration, test, and commissioning, serving as a living representation of the LSST as designed and built. We discuss the methodology and present several examples of its application to specific systems engineering challenges in the LSST design.

  19. Electron-Induced Displacement Damage Effects in CCDs

    NASA Technical Reports Server (NTRS)

    Becker, Heidi N.; Elliott, Tom; Alexander, James W.

    2006-01-01

    We compare differences in parametric degradation for CCDs irradiated to the same displacement damage dose with 10-MeV and 50-MeV electrons. Charge transfer efficiency degradation was observed to not scale with NIEL for small signals.

  20. Astroinformatics in the Age of LSST: Analyzing the Summer 2012 Data Release

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.; De Lee, N. M.; Stassun, K.; Paegert, M.; Cargile, P.; Burger, D.; Bloom, J. S.; Richards, J.

    2013-01-01

    The Large Synoptic Survey Telescope (LSST) will image the visible southern sky every three nights. This multi-band, multi-epoch survey will produce a torrent of data, which traditional methods of object-by-object data analysis will not be able to accommodate. Thus there is a need for new astroinformatics tools to visualize, simulate, mine, and analyze this quantity of data. The Berkeley Center for Time-Domain Informatics (CTDI) is building the informatics infrastructure for generic light curve classification, including the innovation of new algorithms for feature generation and machine learning. The CTDI portal (http://dotastro.org) contains one of the largest collections of public light curves, with visualization and exploration tools. The group has also published the first calibrated probabilistic classification catalog of 50k variable stars along with a data exploration portal called http://bigmacc.info. Twice a year, the LSST collaboration releases simulated LSST data, in order to aid software development. This poster also showcases a suite of new tools from the Vanderbilt Initiative in Data-intensive Astrophysics (VIDA), designed to take advantage of these large data sets. VIDA's Filtergraph interactive web tool allows one to instantly create an interactive data portal for fast, real-time visualization of large data sets. Filtergraph enables quick selection of interesting objects by easily filtering on many different columns, 2-D and 3-D representations, and on-the-fly arithmetic calculations on the data. It also makes sharing the data and the tool with collaborators very easy. The EB/RRL Factory is a neural-network based variable star classifier, which is designed to quickly identify variable stars in a variety of classes from LSST light curve data (currently tuned to Eclipsing Binaries and RR Lyrae stars), and to provide likelihood-based orbital elements or stellar parameters as appropriate. Finally, the LCsimulator software allows one to create simulated light curves of multiple types of variable stars based on an LSST cadence.

  1. Stellar Populations with the LSST

    NASA Astrophysics Data System (ADS)

    Saha, Abhijit; Olsen, K.; LSST Stellar Populations Collaboration

    2006-12-01

    The LSST will produce a multi-color map and photometric object catalog of half the sky to g ~ 27.5 (5σ). Strategically cadenced time-space sampling of each field spanning ten years will allow variability, proper motion and parallax measurements for objects brighter than g ~ 25. As part of providing an unprecedented map of the Galaxy, the accurate multi-band photometry will permit photometric parallaxes, chemical abundances and a handle on ages via colors at turn-off for main-sequence stars at all distances within the Galaxy, permitting a comprehensive study of star formation histories (SFH) and chemical evolution for field stars. With a geometric parallax accuracy of 1 mas, LSST will produce a robust complete sample of the solar neighborhood stars. While delivering parallax accuracy comparable to HIPPARCOS, LSST will extend the catalog to a limit more than 10 magnitudes fainter, and will be complete to MV ~ 15. In the Magellanic Clouds too, the photometry will reach MV ~ +8, allowing the SFH and chemical signatures in the expansive outer extremities to be gleaned from their main sequence stars. This in turn will trace the detailed interaction of the Clouds with the Galaxy halo. The LSST time sampling will identify and characterize variable stars of all types, from time scales of 1 hr to several years, a feast for variable star astrophysics. Cepheids and LPVs in all galaxies in the Sculptor, M83 and Cen-A groups are obvious data products: comparative studies will reveal systematic differences with galaxy properties, and help to fine tune the rungs of the distance ladder. Dwarf galaxies within 10 Mpc that are too faint to find from surface brightness enhancements will be revealed via over-densities of their red giants: this systematic census will extend the luminosity function of galaxies to the faint limit. Novae discovered by LSST time sampling will trace intergalactic stars out to the Virgo and Fornax clusters.
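
    The photometric parallaxes mentioned above rest on the distance modulus; the short sketch below works one example through, with an assumed absolute magnitude and single-visit depth that are illustrative rather than taken from the abstract.

    ```python
    def photometric_distance_pc(m_apparent, M_absolute):
        """Distance in parsecs from the distance modulus m - M = 5*log10(d / 10 pc)."""
        return 10.0 ** ((m_apparent - M_absolute + 5.0) / 5.0)

    # Illustrative example: a main-sequence star with an assumed absolute magnitude
    # M_g = 5.0 observed near an assumed single-visit depth of g ~ 24.5.
    d = photometric_distance_pc(24.5, 5.0)
    print(f"Implied photometric distance: {d / 1e3:.1f} kpc")
    ```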

  2. The LSST Metrics Analysis Framework (MAF)

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne; Yoachim, Peter; Chandrasekharan, Srinivasan; Connolly, Andrew J.; Cook, Kem H.; Ivezic, Zeljko; Krughoff, K. Simon; Petry, Catherine E.; Ridgway, Stephen T.

    2015-01-01

    Studying potential observing strategies or cadences for the Large Synoptic Survey Telescope (LSST) is a complicated but important problem. To address this, LSST has created an Operations Simulator (OpSim) to create simulated surveys, including realistic weather and sky conditions. Analyzing the results of these simulated surveys for the wide variety of science cases to be considered for LSST is, however, difficult. We have created a Metric Analysis Framework (MAF), an open-source python framework, to be a user-friendly, customizable and easily extensible tool to help analyze the outputs of the OpSim. MAF reads the pointing history of the LSST generated by the OpSim, then enables the subdivision of these pointings based on position on the sky (RA/Dec, etc.) or the characteristics of the observations (e.g. airmass or sky brightness) and a calculation of how well these observations meet a specified science objective (or metric). An example simple metric could be the mean single visit limiting magnitude for each position in the sky; a more complex metric might be the expected astrometric precision. The output of these metrics can be generated for a full survey, for specified time intervals, or for regions of the sky, and can be easily visualized using a web interface. An important goal for MAF is to facilitate analysis of the OpSim outputs for a wide variety of science cases. A user can often write a new metric to evaluate OpSim for new science goals in less than a day once they are familiar with the framework. Some of these new metrics are illustrated in the accompanying poster, "Analyzing Simulated LSST Survey Performance With MAF". While MAF has been developed primarily for application to OpSim outputs, it can be applied to any dataset. The most obvious examples are examining pointing histories of other survey projects or telescopes, such as CFHT.
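
    As an illustration of the kind of metric described above, here is a minimal, self-contained sketch of the "mean single-visit limiting magnitude per sky position" idea, written against a toy pointing table rather than the real MAF classes or the actual OpSim schema.

    ```python
    import numpy as np

    # Toy pointing table: (RA, Dec, 5-sigma limiting magnitude) per simulated visit.
    # Column names and values are illustrative, not the actual OpSim schema.
    rng = np.random.default_rng(42)
    visits = {
        "ra": rng.uniform(0, 360, 10000),
        "dec": rng.uniform(-60, 0, 10000),
        "m5": rng.normal(24.5, 0.3, 10000),
    }

    def mean_m5_metric(visits, ra_bins=36, dec_bins=12):
        """Mean single-visit limiting magnitude in coarse RA/Dec cells
        (a stand-in for a metric evaluated over a spatial slicer)."""
        sums, _, _ = np.histogram2d(visits["ra"], visits["dec"],
                                    bins=[ra_bins, dec_bins], weights=visits["m5"])
        counts, _, _ = np.histogram2d(visits["ra"], visits["dec"],
                                      bins=[ra_bins, dec_bins])
        with np.errstate(divide="ignore", invalid="ignore"):
            return sums / counts  # NaN where a cell has no visits

    print(np.nanmean(mean_m5_metric(visits)))
    ```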

  3. LSST Probes of Dark Energy: New Energy vs New Gravity

    NASA Astrophysics Data System (ADS)

    Bradshaw, Andrew; Tyson, A.; Jee, M. J.; Zhan, H.; Bard, D.; Bean, R.; Bosch, J.; Chang, C.; Clowe, D.; Dell'Antonio, I.; Gawiser, E.; Jain, B.; Jarvis, M.; Kahn, S.; Knox, L.; Newman, J.; Wittman, D.; Weak Lensing, LSST; LSS Science Collaborations

    2012-01-01

    Is the late time acceleration of the universe due to new physics in the form of stress-energy or a departure from General Relativity? LSST will measure the shape, magnitude, and color of 4x10^9 galaxies to high S/N over 18,000 square degrees. These data will be used to separately measure the gravitational growth of mass structure and distance vs redshift to unprecedented precision by combining multiple probes in a joint analysis. Of the five LSST probes of dark energy, weak gravitational lensing (WL) and baryon acoustic oscillation (BAO) probes are particularly effective in combination. By measuring the 2-D BAO scale in ugrizy-band photometric redshift-selected samples, LSST will determine the angular diameter distance to a dozen redshifts with sub-percent-level errors. Reconstruction of the WL shear power spectrum on linear and weakly non-linear scales, and of the cross-correlation of shear measured in different photometric redshift bins provides a constraint on the evolution of dark energy that is complementary to the purely geometric measures provided by supernovae and BAO. Cross-correlation of the WL shear and BAO signal within redshift shells minimizes the sensitivity to systematics. LSST will also detect shear peaks, providing independent constraints. Tomographic study of the shear of background galaxies as a function of redshift allows a geometric test of dark energy. To extract the dark energy signal and distinguish between the two forms of new physics, LSST will rely on accurate stellar point-spread functions (PSF) and unbiased reconstruction of galaxy image shapes from hundreds of exposures. Although a weighted co-added deep image has high S/N, it is a form of lossy compression. Bayesian forward modeling algorithms can in principle use all the information. We explore systematic effects on shape measurements and present tests of an algorithm called Multi-Fit, which appears to avoid PSF-induced shear systematics in a computationally efficient way.

  4. Giga-z: A 100,000 OBJECT SUPERCONDUCTING SPECTROPHOTOMETER FOR LSST FOLLOW-UP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marsden, Danica W.; Mazin, Benjamin A.; O'Brien, Kieran

    2013-09-15

    We simulate the performance of a new type of instrument, a Superconducting Multi-Object Spectrograph (SuperMOS), that uses microwave kinetic inductance detectors (MKIDs). MKIDs, a new detector technology, feature good quantum efficiency in the UVOIR, can count individual photons with microsecond timing accuracy, and, like X-ray calorimeters, determine their energy to several percent. The performance of Giga-z, a SuperMOS designed for wide field imaging follow-up observations, is evaluated using simulated observations of the COSMOS mock catalog with an array of 100,000 R_423nm = E/ΔE = 30 MKID pixels. We compare our results against a simultaneous simulation of LSST observations. In 3 yr on a dedicated 4 m class telescope, Giga-z could observe ≈ 2 billion galaxies, yielding a low-resolution spectral energy distribution spanning 350-1350 nm for each; 1000 times the number measured with any currently proposed LSST spectroscopic follow-up, at a fraction of the cost and time. Giga-z would provide redshifts for galaxies up to z ≈ 6 with magnitudes m_i ≲ 25, with accuracy σ_Δz/(1+z) ≈ 0.03 for the whole sample, and σ_Δz/(1+z) ≈ 0.007 for a select subset. We also find catastrophic failure rates and biases that are consistently lower than for LSST. The added constraint on dark energy parameters for WL + CMB by Giga-z using the FoMSWG default model is equivalent to multiplying the LSST Fisher matrix by a factor of α = 1.27 (w_p), 1.53 (w_a), or 1.98 (Δγ). This is equivalent to multiplying both the LSST coverage area and the training sets by α and reducing all systematics by a factor of 1/√α, advantages that are robust to even more extreme models of intrinsic alignment.
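
    A quick back-of-the-envelope check of what the quoted resolving power means in wavelength terms; the band edges are taken from the abstract, and the first-order relation R = E/ΔE ≈ λ/Δλ is the only physics assumed.

    ```python
    # Energy resolving power R = E/dE quoted at 423 nm; since E = hc/lambda,
    # R = lambda/dlambda to first order, so the resolution element is lambda/R.
    lam_nm = 423.0
    R = 30.0
    print(f"Resolution element at {lam_nm:.0f} nm: ~{lam_nm / R:.1f} nm")   # ~14 nm

    # At the red end of the quoted 350-1350 nm band the same R gives a coarser element.
    print(f"Resolution element at 1350 nm: ~{1350.0 / R:.0f} nm")           # ~45 nm
    ```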

  5. Mechanical Design of the LSST Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nordby, Martin; Bowden, Gordon; Foss, Mike

    2008-06-13

    The LSST camera is a tightly packaged, hermetically-sealed system that is cantilevered into the main beam of the LSST telescope. It comprises three refractive lenses, on-board storage for five large filters, a high-precision shutter, and a cryostat that houses the 3.2 giga-pixel CCD focal plane along with its support electronics. The physically large optics and focal plane demand large structural elements to support them, but the overall size of the camera and its components must be minimized to reduce impact on the image stability. Also, focal plane and optics motions must be minimized to reduce systematic errors in image reconstruction. Design and analysis for the camera body and cryostat will be detailed.

  6. Preliminary study of the reliability of imaging charge coupled devices

    NASA Technical Reports Server (NTRS)

    Beall, J. R.; Borenstein, M. D.; Homan, R. A.; Johnson, D. L.; Wilson, D. D.; Young, V. F.

    1978-01-01

    Imaging CCDs are capable of low light level response and high signal-to-noise ratios. In space applications they offer the user the ability to achieve extremely high resolution imaging with minimum circuitry in the photo sensor array. This work relates the CCD121H Fairchild device to the fundamentals of CCDs and the representative technologies. Several failure modes are described, construction is analyzed and test results are reported. In addition, the relationship of the device reliability to packaging principles is analyzed and test data presented. Finally, a test program is defined for more general reliability evaluation of CCDs.

  7. Big Data Science Cafés: High School Students Experiencing Real Research with Scientists

    NASA Astrophysics Data System (ADS)

    Walker, C. E.; Pompea, S. M.

    2017-12-01

    The Education and Public Outreach group at the National Optical Astronomy Observatory has designed an outside-of-school education program to excite the interest of talented youth in future projects like the Large Synoptic Survey Telescope (LSST) and the NOAO (archival) Data Lab - their data approaches and key science projects. Originally funded by the LSST Corporation, the program cultivates talented youth to enter STEM disciplines and serves as a model to disseminate to the 40+ institutions involved in LSST. One Saturday a month during the academic year, high school students have the opportunity to interact with expert astronomers who work with large astronomical data sets in their scientific work. Students learn about killer asteroids, the birth and death of stars, colliding galaxies, the structure of the universe, gravitational waves, dark energy, dark matter, and more. The format for the Saturday science cafés has been a short presentation, discussion (plus food), computer lab activity and more discussion. They last about 2.5 hours and have been planned by a group of interested local high school students, an undergraduate student coordinator, the presenting astronomers, the program director and an evaluator. High school youth leaders help ensure an enjoyable and successful program for fellow students. They help their fellow students with the activities and help evaluate how well the science café went. Their remarks shape the next science café and improve the program. The experience offers youth leaders ownership of the program, opportunities to take on responsibilities and learn leadership and communication skills, as well as foster their continued interests in STEM. The prototype Big Data Science Academy was implemented successfully in spring 2017 and engaged almost 40 teens from greater Tucson in the fundamentals of astronomy concepts and research. As with any first implementation, there were bumps. However, staff, scientists, and student leaders all stepped up to make the program a success. The project achieved many of its goals with a relatively small budget, providing value not only to the student leaders and student attendees, but to the scientists and staff as well. Staff learned what worked and what needed more fine-tuning to successfully launch and run a big data academy for teens in the years that follow.

  8. Designing for Peta-Scale in the LSST Database

    NASA Astrophysics Data System (ADS)

    Kantor, J.; Axelrod, T.; Becla, J.; Cook, K.; Nikolaev, S.; Gray, J.; Plante, R.; Nieto-Santisteban, M.; Szalay, A.; Thakar, A.

    2007-10-01

    The Large Synoptic Survey Telescope (LSST), a proposed ground-based 8.4 m telescope with a 10 deg^2 field of view, will generate 15 TB of raw images every observing night. When calibration and processed data are added, the image archive, catalogs, and meta-data will grow 15 PB yr^{-1} on average. The LSST Data Management System (DMS) must capture, process, store, index, replicate, and provide open access to this data. Alerts must be triggered within 30 s of data acquisition. To do this in real-time at these data volumes will require advances in data management, database, and file system techniques. This paper describes the design of the LSST DMS and emphasizes features for peta-scale data. The LSST DMS will employ a combination of distributed database and file systems, with schema, partitioning, and indexing oriented for parallel operations. Image files are stored in a distributed file system with references to, and meta-data from, each file stored in the databases. The schema design supports pipeline processing, rapid ingest, and efficient query. Vertical partitioning reduces disk input/output requirements, horizontal partitioning allows parallel data access using arrays of servers and disks. Indexing is extensive, utilizing both conventional RAM-resident indexes and column-narrow, row-deep tag tables/covering indices that are extracted from tables that contain many more attributes. The DMS Data Access Framework is encapsulated in a middleware framework to provide a uniform service interface to all framework capabilities. This framework will provide the automated work-flow, replication, and data analysis capabilities necessary to make data processing and data quality analysis feasible at this scale.
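
    The partitioning and tag-table ideas in this design can be pictured with a small, purely schematic sketch; the chunking function and the column choices below are toy stand-ins, not the actual LSST DMS schema or its spatial partitioning scheme.

    ```python
    def chunk_id(ra_deg, dec_deg, n_ra=64, n_dec=32):
        """Assign a sky position to a coarse spatial chunk, a toy stand-in for the
        horizontal (spatial) partitioning described for the LSST catalogs."""
        i = min(int(ra_deg / 360.0 * n_ra), n_ra - 1)
        j = min(int((dec_deg + 90.0) / 180.0 * n_dec), n_dec - 1)
        return i * n_dec + j

    # A column-narrow "tag table" keeps only the attributes needed for common queries,
    # referencing the full row in the wide object table by id (schema is illustrative).
    objects = [
        {"object_id": 1, "ra": 10.2, "dec": -30.5, "r_mag": 21.3, "extendedness": 0.9},
        {"object_id": 2, "ra": 180.7, "dec": -45.1, "r_mag": 19.8, "extendedness": 0.1},
    ]
    tag_table = [(o["object_id"], chunk_id(o["ra"], o["dec"]), o["r_mag"]) for o in objects]
    print(tag_table)
    ```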

  9. LSST telescope and site status

    NASA Astrophysics Data System (ADS)

    Gressler, William J.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) Project received its construction authorization from the National Science Foundation in August 2014. The Telescope and Site (T and S) group has made considerable progress towards completion in subsystems required to support the scope of the LSST science mission. The LSST goal is to conduct a wide, fast, deep survey via a 3-mirror wide field of view optical design, a 3.2-Gpixel camera, and an automated data processing system. The summit facility is currently under construction on Cerro Pachón in Chile, with major vendor subsystem deliveries and integration planned over the next several years. This paper summarizes the status of the activities of the T and S group, tasked with design, analysis, and construction of the summit and base facilities and infrastructure necessary to control the survey, capture the light, and calibrate the data. All major telescope work package procurements have been awarded to vendors and are in varying stages of design and fabrication maturity and completion. The unique M1M3 primary/tertiary mirror polishing effort is completed and the mirror now resides in storage awaiting future testing. Significant progress has been achieved on all the major telescope subsystems including the summit facility, telescope mount assembly, dome, hexapod and rotator systems, coating plant, base facility, and the calibration telescope. In parallel, in-house efforts, including the software needed to control the observatory such as the scheduler and the active optics control, have also seen substantial advancement. The progress and status of these subsystems and future LSST plans during this construction phase are presented.

  10. Visits to Tololo | CTIO

    Science.gov Websites

    Public visits are limited to two groups of 40 people; one group meets at the gatehouse at 9 AM and the other at 1 PM.

  11. Adaptive Optics for the Thirty Meter Telescope

    NASA Astrophysics Data System (ADS)

    Ellerbroek, Brent

    2013-12-01

    This paper provides an overview of the progress made since the last AO4ELT conference towards developing the first-light AO architecture for the Thirty Meter Telescope (TMT). The Preliminary Design of the facility AO system NFIRAOS has been concluded by the Herzberg Institute of Astrophysics. Work on the client Infrared Imaging Spectrograph (IRIS) has progressed in parallel, including a successful Conceptual Design Review and prototyping of On-Instrument WFS (OIWFS) hardware. Progress on the design for the Laser Guide Star Facility (LGSF) continues at the Institute of Optics and Electronics in Chengdu, China, including the final acceptance of the Conceptual Design and modest revisions for the updated TMT telescope structure. Design and prototyping activities continue for lasers, wavefront sensing detectors, detector readout electronics, real-time control (RTC) processors, and deformable mirrors (DMs) with their associated drive electronics. Highlights include development of a prototype sum frequency guide star laser at the Technical Institute of Physics and Chemistry (Beijing); fabrication/test of prototype natural- and laser-guide star wavefront sensor CCDs for NFIRAOS by MIT Lincoln Laboratory and W.M. Keck Observatory; a trade study of RTC control algorithms and processors, with prototyping of GPU and FPGA architectures by TMT and the Dominion Radio Astrophysical Observatory; and fabrication/test of a 6x60 actuator DM prototype by CILAS. Work with the University of British Columbia LIDAR is continuing, in collaboration with ESO, to measure the spatial/temporal variability of the sodium layer and characterize the sodium coupling efficiency of several guide star laser systems. AO performance budgets have been further detailed. Modeling topics receiving particular attention include performance vs. computational cost tradeoffs for RTC algorithms; optimizing performance of the tip/tilt, plate scale, and sodium focus tracking loops controlled by the NGS on-instrument wavefront sensors, sky coverage, PSF reconstruction for LGS MCAO, and precision astrometry for the galactic center and other observations.

  12. LSST and the Physics of the Dark Universe

    ScienceCinema

    Tyson, Anthony [UC Davis, California, United States

    2017-12-09

    The physics that underlies the accelerating cosmic expansion is unknown. This 'dark energy' and the equally mysterious 'dark matter' comprise most of the mass-energy of the universe and are outside the standard model. Recent advances in optics, detectors, and information technology have led to the design of a facility that will repeatedly image an unprecedented volume of the universe: LSST. For the first time, the sky will be surveyed wide, deep and fast. The history of astronomy has taught us repeatedly that there are surprises whenever we view the sky in a new way. I will review the technology of LSST, and focus on several independent probes of the nature of dark energy and dark matter. These new investigations will rely on the statistical precision obtainable with billions of galaxies.

  13. Transient Go: A Mobile App for Transient Astronomy Outreach

    NASA Astrophysics Data System (ADS)

    Crichton, D.; Mahabal, A.; Djorgovski, S. G.; Drake, A.; Early, J.; Ivezic, Z.; Jacoby, S.; Kanbur, S.

    2016-12-01

    Augmented Reality (AR) is set to revolutionize human interaction with the real world as demonstrated by the phenomenal success of `Pokemon Go'. That very technology can be used to rekindle the interest in science at the school level. We are in the process of developing a prototype app based on sky maps that will use AR to introduce different classes of astronomical transients to students as they are discovered i.e. in real-time. This will involve transient streams from surveys such as the Catalina Real-time Transient Survey (CRTS) today and the Large Synoptic Survey Telescope (LSST) in the near future. The transient streams will be combined with archival and latest image cut-outs and other auxiliary data as well as historical and statistical perspectives on each of the transient types being served. Such an app could easily be adapted to work with various NASA missions and NSF projects to enrich the student experience.

  14. Solar System science with the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Jones, Lynne; Brown, Mike; Ivezić, Zeljko; Jurić, Mario; Malhotra, Renu; Trilling, David

    2015-11-01

    The Large Synoptic Survey Telescope (LSST; http://lsst.org) will be a large-aperture, wide-field, ground-based telescope that will survey half the sky every few nights in six optical bands from 320 to 1050 nm. It will explore a wide range of astrophysical questions, ranging from performing a census of the Solar System, to examining the nature of dark energy. It is currently under construction, slated for first light in 2019 and full operations by 2022. The LSST will survey over 20,000 square degrees with a rapid observational cadence, to typical limiting magnitudes of r~24.5 in each visit (9.6 square degree field of view). Automated software will link the individual detections into orbits; these orbits, as well as precisely calibrated astrometry (~50mas) and photometry (~0.01-0.02 mag) in multiple bandpasses will be available as LSST data products. The resulting data set will have tremendous potential for planetary astronomy; multi-color catalogs of hundreds of thousands of NEOs and Jupiter Trojans, millions of asteroids, tens of thousands of TNOs, as well as thousands of other objects such as comets and irregular satellites of the major planets. LSST catalogs will increase the sample size of objects with well-known orbits 10-100 times for small body populations throughout the Solar System, enabling a major increase in the completeness level of the inventory of most dynamical classes of small bodies and generating new insights into planetary formation and evolution. Precision multi-color photometry will allow determination of lightcurves and colors, as well as spin state and shape modeling through sparse lightcurve inversion. LSST is currently investigating survey strategies to optimize science return across a broad range of goals. To aid in this investigation, we are making a series of realistic simulated survey pointing histories available together with a Python software package to model and evaluate survey detections for a user-defined input population. Preliminary metrics from these simulations are shown here; the community is invited to provide further input.
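
    The simplest detection metric for a user-supplied population is a per-visit comparison of the object's apparent magnitude against that visit's limiting depth; the sketch below illustrates the idea with made-up numbers and is not the survey-simulation package referred to above.

    ```python
    import numpy as np

    def count_detections(obj_mags, visit_m5, snr_margin=0.0):
        """Count visits in which a moving object would be detected, i.e. where its
        apparent magnitude is brighter than the visit's 5-sigma limiting depth.
        obj_mags and visit_m5 are per-visit arrays; values here are illustrative."""
        obj_mags = np.asarray(obj_mags)
        visit_m5 = np.asarray(visit_m5)
        return int(np.sum(obj_mags < visit_m5 - snr_margin))

    rng = np.random.default_rng(0)
    visit_m5 = rng.normal(24.5, 0.3, size=200)   # assumed per-visit r-band depths
    obj_mags = rng.normal(24.0, 0.2, size=200)   # assumed apparent magnitudes of one TNO
    print(count_detections(obj_mags, visit_m5), "detections out of", len(visit_m5), "visits")
    ```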

  15. CCDs in the Mechanics Lab--A Competitive Alternative (Part II).

    ERIC Educational Resources Information Center

    Pinto, Fabrizio

    1995-01-01

    Describes a system of interactive astronomy whereby nonscience students are able to acquire their own images from a room remotely linked to a telescope. Briefly discusses some applications of Charge-Coupled Device cameras (CCDs) in teaching free fall, projectile motion, and the motion of the pendulum. (JRH)

  16. Synthesizing Planetary Nebulae for Large Scale Surveys: Predictions for LSST

    NASA Astrophysics Data System (ADS)

    Vejar, George; Montez, Rodolfo; Morris, Margaret; Stassun, Keivan G.

    2017-01-01

    The short-lived planetary nebula (PN) phase of stellar evolution is characterized by a hot central star and a bright, ionized nebula. The PN phase forms after a low- to intermediate-mass star stops burning hydrogen in its core, ascends the asymptotic giant branch, and expels its outer layers of material into space. The exposed hot core produces ionizing UV photons and a fast stellar wind that sweeps up the surrounding material into a dense shell of ionized gas known as the PN. This fleeting stage of stellar evolution provides insight into rare atomic processes and the nucleosynthesis of elements in stars. The inherent brightness of PNe allows them to be used to obtain distances to nearby stellar systems via the PN luminosity function and as kinematic tracers in other galaxies. However, the prevalence of non-spherical morphologies of PNe challenges the current paradigm of PN formation. The role of binarity in the shaping of the PN has recently gained traction, ultimately suggesting that single stars might not form PNe. Searches for binary central stars have increased the binary fraction but the current PN sample is incomplete. Future wide-field, multi-epoch surveys like the Large Synoptic Survey Telescope (LSST) can impact studies of PNe and improve our understanding of their origin and formation. Using a suite of Cloudy radiative transfer calculations, we study the detectability of PNe in the proposed LSST multiband observations. We compare our synthetic PNe to common sources (stars, galaxies, quasars) and establish discrimination techniques. Finally, we discuss follow-up strategies to verify new LSST-discovered PNe and use limiting distances to estimate the potential sample of PNe enabled by LSST.

  17. Precise Time Delays from Strongly Gravitationally Lensed Type Ia Supernovae with Chromatically Microlensed Images

    NASA Astrophysics Data System (ADS)

    Goldstein, Daniel A.; Nugent, Peter E.; Kasen, Daniel N.; Collett, Thomas E.

    2018-03-01

    Time delays between the multiple images of strongly gravitationally lensed Type Ia supernovae (glSNe Ia) have the potential to deliver precise cosmological constraints, but the effects of microlensing on time delay extraction have not been studied in detail. Here we quantify the effect of microlensing on the glSN Ia yield of the Large Synoptic Survey Telescope (LSST) and the effect of microlensing on the precision and accuracy of time delays that can be extracted from LSST glSNe Ia. Microlensing has a negligible effect on the LSST glSN Ia yield, but it can be increased by a factor of ∼2 over previous predictions to 930 systems using a novel photometric identification technique based on spectral template fitting. Crucially, the microlensing of glSNe Ia is achromatic until three rest-frame weeks after the explosion, making the early-time color curves microlensing-insensitive time delay indicators. By fitting simulated flux and color observations of microlensed glSNe Ia with their underlying, unlensed spectral templates, we forecast the distribution of absolute time delay error due to microlensing for LSST, which is unbiased at the sub-percent level and peaked at 1% for color curve observations in the achromatic phase, while for light-curve observations it is comparable to state-of-the-art mass modeling uncertainties (4%). About 70% of LSST glSN Ia images should be discovered during the achromatic phase, indicating that microlensing time delay uncertainties can be minimized if prompt multicolor follow-up observations are obtained. Accounting for microlensing, the 1–2 day time delay on the recently discovered glSN Ia iPTF16geu can be measured to 40% precision, limiting its cosmological utility.

  18. Precise Time Delays from Strongly Gravitationally Lensed Type Ia Supernovae with Chromatically Microlensed Images

    DOE PAGES

    Goldstein, Daniel A.; Nugent, Peter E.; Kasen, Daniel N.; ...

    2018-03-01

    Time delays between the multiple images of strongly gravitationally lensed Type Ia supernovae (glSNe Ia) have the potential to deliver precise cosmological constraints, but the effects of microlensing on time delay extraction have not been studied in detail. Here we quantify the effect of microlensing on the glSN Ia yield of the Large Synoptic Survey Telescope (LSST) and the effect of microlensing on the precision and accuracy of time delays that can be extracted from LSST glSNe Ia. Microlensing has a negligible effect on the LSST glSN Ia yield, but it can be increased by a factor of ~2 over previous predictions to 930 systems using a novel photometric identification technique based on spectral template fitting. Crucially, the microlensing of glSNe Ia is achromatic until three rest-frame weeks after the explosion, making the early-time color curves microlensing-insensitive time delay indicators. By fitting simulated flux and color observations of microlensed glSNe Ia with their underlying, unlensed spectral templates, we forecast the distribution of absolute time delay error due to microlensing for LSST, which is unbiased at the sub-percent level and peaked at 1% for color curve observations in the achromatic phase, while for light-curve observations it is comparable to state-of-the-art mass modeling uncertainties (4%). About 70% of LSST glSN Ia images should be discovered during the achromatic phase, indicating that microlensing time delay uncertainties can be minimized if prompt multicolor follow-up observations are obtained. Lastly, accounting for microlensing, the 1-2 day time delay on the recently discovered glSN Ia iPTF16geu can be measured to 40% precision, limiting its cosmological utility.

  19. Precise Time Delays from Strongly Gravitationally Lensed Type Ia Supernovae with Chromatically Microlensed Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldstein, Daniel A.; Nugent, Peter E.; Kasen, Daniel N.

    Time delays between the multiple images of strongly gravitationally lensed Type Ia supernovae (glSNe Ia) have the potential to deliver precise cosmological constraints, but the effects of microlensing on time delay extraction have not been studied in detail. Here we quantify the effect of microlensing on the glSN Ia yield of the Large Synoptic Survey Telescope (LSST) and the effect of microlensing on the precision and accuracy of time delays that can be extracted from LSST glSNe Ia. Microlensing has a negligible effect on the LSST glSN Ia yield, but it can be increased by a factor of ~2 over previous predictions to 930 systems using a novel photometric identification technique based on spectral template fitting. Crucially, the microlensing of glSNe Ia is achromatic until three rest-frame weeks after the explosion, making the early-time color curves microlensing-insensitive time delay indicators. By fitting simulated flux and color observations of microlensed glSNe Ia with their underlying, unlensed spectral templates, we forecast the distribution of absolute time delay error due to microlensing for LSST, which is unbiased at the sub-percent level and peaked at 1% for color curve observations in the achromatic phase, while for light-curve observations it is comparable to state-of-the-art mass modeling uncertainties (4%). About 70% of LSST glSN Ia images should be discovered during the achromatic phase, indicating that microlensing time delay uncertainties can be minimized if prompt multicolor follow-up observations are obtained. Lastly, accounting for microlensing, the 1-2 day time delay on the recently discovered glSN Ia iPTF16geu can be measured to 40% precision, limiting its cosmological utility.

  20. Mapping the Solar System with LSST

    NASA Astrophysics Data System (ADS)

    Ivezic, Z.; Juric, M.; Lupton, R.; Connolly, A.; Kubica, J.; Moore, A.; Harris, A.; Bowell, T.; Bernstein, G.; Stubbs, C.; LSST Collaboration

    2004-12-01

    The currently considered LSST cadence, based on two 10 sec exposures, may result in orbital parameters, light curves and accurate colors for over a million main-belt asteroids (MBA), and about 20,000 trans-Neptunian objects (TNO). Compared to the current state-of-the-art, this sample would represent a factor of 5 increase in the number of MBAs with known orbits, a factor of 20 increase in the number of MBAs with known orbits and accurate color measurements, and a factor of 100 increase in the number of MBAs with measured variability properties. The corresponding sample increase for TNOs is 10, 100, and 1000, respectively. The LSST MBA and TNO samples will enable detailed studies of the dynamical and chemical history of the solar system. For example, they will constrain the MBA size distribution for objects larger than 100 m, and TNO size distribution for objects larger than 100 km, their physical state through variability measurements (solid body vs. a rubble pile), as well as their surface chemistry through color measurements. A proposed deep TNO survey, based on 1 hour exposures, may result in a sample of about 100,000 TNOs, while spending only 10% of the LSST observing time. Such a deep TNO survey would be capable of discovering Sedna-like objects at distances beyond 150 AU, thereby increasing the observable Solar System volume by about a factor of 7. The increase in data volume associated with LSST asteroid science will present many computational challenges to how we might extract tracks and orbits of asteroids from the underlying clutter. Tree-based algorithms for multihypothesis testing of asteroid tracks can help solve these challenges by providing the necessary 1000-fold speed-ups over current approaches while recovering 95% of the underlying asteroid populations.
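
    The tree-based linking idea can be sketched in a few lines: spatial indexing makes the "which detections could belong to the same object?" query cheap. The two-epoch toy below uses a k-d tree from scipy and invented positions; real multi-hypothesis track finding adds velocity consistency and many more epochs.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def link_candidates(epoch1_xy, epoch2_xy, max_motion_deg):
        """Toy illustration of tree-based linking: for each detection in the first
        epoch, find detections in the second epoch within a plausible motion radius."""
        tree = cKDTree(epoch2_xy)
        return [tree.query_ball_point(p, max_motion_deg) for p in epoch1_xy]

    rng = np.random.default_rng(1)
    epoch1 = rng.uniform(0, 3.5, size=(500, 2))                 # (RA, Dec) in deg, illustrative
    epoch2 = epoch1 + rng.normal(0.05, 0.02, size=(500, 2))     # small apparent motion
    links = link_candidates(epoch1, epoch2, max_motion_deg=0.15)
    print("mean candidate links per detection:", np.mean([len(l) for l in links]))
    ```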

  1. Management evolution in the LSST project

    NASA Astrophysics Data System (ADS)

    Sweeney, Donald; Claver, Charles; Jacoby, Suzanne; Kantor, Jeffrey; Krabbendam, Victor; Kurita, Nadine

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) project has evolved from just a few staff members in 2003 to about 100 in 2010; the affiliation of four founding institutions has grown to 32 universities, government laboratories, and industry. The public-private collaboration aims to complete the estimated $450 M observatory in the 2017 timeframe. During the design phase of the project, from 2003 to the present, the management structure has been remarkably stable. At the same time, the funding levels, staffing levels and scientific community participation have grown dramatically. The LSSTC has introduced project controls and tools required to manage the LSST's complex funding model, technical structure and distributed work force. Project controls have been configured to comply with the requirements of federal funding agencies. Some of these tools for risk management, configuration control and resource-loaded schedule have been effective and others have not. Technical tasks associated with building the LSST are distributed into three subsystems: Telescope & Site, Camera, and Data Management. Each sub-system has its own experienced Project Manager and System Scientist. Delegation of authority is enabling and effective; it encourages a strong sense of ownership within the project. At the project level, subsystem management follows the principle that there is one Board of Directors, Director, and Project Manager who have overall authority.

  2. Optimizing the LSST Dither Pattern for Survey Uniformity

    NASA Astrophysics Data System (ADS)

    Awan, Humna; Gawiser, Eric J.; Kurczynski, Peter; Carroll, Christopher M.; LSST Dark Energy Science Collaboration

    2015-01-01

    The Large Synoptic Survey Telescope (LSST) will gather detailed data of the southern sky, enabling unprecedented study of Baryonic Acoustic Oscillations, which are an important probe of dark energy. These studies require a survey with highly uniform depth, and we aim to find an observation strategy that optimizes this uniformity. We have shown that in the absence of dithering (large telescope-pointing offsets), the LSST survey will vary significantly in depth. Hence, we implemented various dithering strategies, including random and repulsive random pointing offsets and spiral patterns with the spiral reaching completion in either a few months or the entire ten-year run. We employed three different implementations of dithering strategies: a single offset assigned to all fields observed on each night, offsets assigned to each field independently whenever the field is observed, and offsets assigned to each field only when the field is observed on a new night. Our analysis reveals that large dithers are crucial to guarantee survey uniformity and that assigning dithers to each field independently whenever the field is observed significantly increases this uniformity. These results suggest paths towards an optimal observation strategy that will enable LSST to achieve its science goals. We gratefully acknowledge support from the National Science Foundation REU program at Rutgers, PHY-1263280, and the Department of Energy, DE-SC0011636.
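
    A minimal sketch of the per-visit random dithering strategy described above, assuming offsets drawn uniformly within a circle of roughly the field-of-view radius; the 1.75 degree radius and the visit count are assumptions chosen for illustration.

    ```python
    import numpy as np

    def random_dither(n, max_offset_deg=1.75, rng=None):
        """Draw uniform random telescope-pointing offsets within a circle whose
        radius is of order the field-of-view radius (value here is an assumption)."""
        if rng is None:
            rng = np.random.default_rng()
        r = max_offset_deg * np.sqrt(rng.uniform(size=n))   # sqrt gives uniform areal density
        theta = rng.uniform(0, 2 * np.pi, size=n)
        return r * np.cos(theta), r * np.sin(theta)

    # One offset per field per observation, the per-visit strategy the poster finds
    # most uniform; here for 1000 hypothetical visits of a single field.
    dx, dy = random_dither(1000, rng=np.random.default_rng(7))
    print(dx[:3], dy[:3])
    ```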

  3. The future scientific CCD

    NASA Technical Reports Server (NTRS)

    Janesick, J. R.; Elliott, T.; Collins, S.; Marsh, H.; Blouke, M. M.

    1984-01-01

    Since the first introduction of charge-coupled devices (CCDs) in 1970, CCDs have been considered for applications related to memories, logic circuits, and the detection of visible radiation. It is pointed out, however, that the mass market orientation of CCD development has left largely untapped the enormous potential of these devices for advanced scientific instrumentation. The present paper therefore has the objective of introducing the CCD's characteristics to the scientific community, taking into account prospects for further improvement. Attention is given to evaluation criteria, a summary of current CCDs, CCD performance characteristics, absolute calibration tools, quantum efficiency, aspects of charge collection, charge transfer efficiency, read noise, and predictions regarding the characteristics of the next generation of silicon scientific CCD imagers.

  4. Validity of Childhood Career Development Scale Scores in South Africa

    ERIC Educational Resources Information Center

    Stead, Graham B.; Schultheiss, Donna E. Palladino

    2010-01-01

    The purpose of this study was to provide evidence of the construct and concurrent validity of the Childhood Career Development Scale's (CCDS) scores among South African primary school children. Using a sample of 808 children in grades four through seven, evidence for the CCDS's construct validity was provided using confirmatory factor analysis,…

  5. Performance of an extended dynamic range time delay integration charge coupled device (XDR TDI CCD) for high-intrascene dynamic range scanning

    NASA Astrophysics Data System (ADS)

    Levine, Peter A.; Dawson, Robin M.; Andrews, James T.; Bhaskaran, Mahalingham; Furst, David; Hsueh, Fu-Lung; Meray, Grazyna M.; Sudol, Thomas M.; Swain, Pradyumna K.; Tower, John R.

    2003-05-01

    Many applications, such as industrial inspection and overhead reconnaissance, benefit from line scanning architectures where time delay integration (TDI) significantly improves sensitivity. CCDs are particularly well suited to the TDI architecture since charge is transferred virtually noiselessly down the column. Sarnoff's TDI CCDs have demonstrated extremely high speeds: a 7200 x 64, 8 μm pixel device with 120 output ports achieved a vertical line transfer rate greater than 800 kHz. The most recent addition to Sarnoff's TDI technology is the implementation of extended dynamic range (XDR) in high speed, back illuminated TDI CCDs. The optical, intrascene dynamic range can be adjusted in the design of the imager, with measured dynamic ranges exceeding 2,000,000:1 with no degradation in low light performance. The device provides a piecewise linear response to light where multiple slopes and break points can be set during the CCD design. A description of the device architecture and measured results from fabricated XDR TDI CCDs are presented.
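
    The "piecewise linear response to light" can be pictured with a small numerical model; the break points and slopes below are arbitrary illustration values, not the fabricated device's actual transfer curve.

    ```python
    import numpy as np

    def xdr_response(signal, breakpoints=(0.5, 0.9), slopes=(1.0, 0.2, 0.02)):
        """Piecewise-linear response with multiple slopes and break points, the kind
        of transfer curve described for an XDR TDI CCD (numbers are illustrative)."""
        signal = np.asarray(signal, dtype=float)
        out = np.empty_like(signal)
        b0, b1 = breakpoints
        s0, s1, s2 = slopes
        seg1 = signal <= b0
        seg2 = (signal > b0) & (signal <= b1)
        seg3 = signal > b1
        out[seg1] = s0 * signal[seg1]
        out[seg2] = s0 * b0 + s1 * (signal[seg2] - b0)
        out[seg3] = s0 * b0 + s1 * (b1 - b0) + s2 * (signal[seg3] - b1)
        return out

    light = np.linspace(0, 5, 6)   # input illumination in arbitrary full-well units
    print(xdr_response(light))
    ```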

  6. Soft X-ray radiation damage in EM-CCDs used for Resonant Inelastic X-ray Scattering

    NASA Astrophysics Data System (ADS)

    Gopinath, D.; Soman, M.; Holland, A.; Keelan, J.; Hall, D.; Holland, K.; Colebrook, D.

    2018-02-01

    Advancement in synchrotron and free electron laser facilities means that X-ray beams with higher intensity than ever before are being created. The high brilliance of the X-ray beam, as well as the ability to use a range of X-ray energies, means that they can be used in a wide range of applications. One such application is Resonant Inelastic X-ray Scattering (RIXS). RIXS uses the intense and tuneable X-ray beams in order to investigate the electronic structure of materials. The photons are focused onto a sample material and the scattered X-ray beam is diffracted off a high resolution grating to disperse the X-ray energies onto a position sensitive detector. Whilst several factors affect the total system energy resolution, the performance of RIXS experiments can be limited by the spatial resolution of the detector used. Electron-Multiplying CCDs (EM-CCDs) at high gain in combination with centroiding of the photon charge cloud across several detector pixels can lead to sub-pixel spatial resolution of 2-3 μm. X-ray radiation can damage CCDs through ionisation, resulting in increases in dark current and/or a shift in flat-band voltage. Understanding the effect of radiation damage on EM-CCDs is important in order to predict lifetime as well as the change in performance over time. Two CCD-97s were taken to PTB at BESSY II and irradiated with large doses of soft X-rays in order to probe the front and back surfaces of the device. The dark current was shown to decay over time with two different exponential components. This paper will discuss the use of EM-CCDs for readout of RIXS spectrometers, and limitations on spatial resolution, together with any limitations on instrument use which may arise from X-ray-induced radiation damage.
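
    The sub-pixel centroiding mentioned above is conceptually simple: an intensity-weighted mean over the few pixels that share an event's charge cloud. The sketch below uses a synthetic 3x3 event with assumed electron counts rather than real EM-CCD data.

    ```python
    import numpy as np

    def event_centroid(window):
        """Intensity-weighted centroid of a small pixel window around an X-ray event,
        the kind of multi-pixel centroiding that yields few-micron spatial resolution."""
        window = np.asarray(window, dtype=float)
        ys, xs = np.indices(window.shape)
        total = window.sum()
        return (xs * window).sum() / total, (ys * window).sum() / total

    # Illustrative 3x3 event split across pixels (values in electrons, assumed).
    event = np.array([[ 5, 20, 10],
                      [15, 80, 30],
                      [ 2, 25,  8]])
    print(event_centroid(event))   # sub-pixel (x, y) position within the window
    ```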

  7. LSST Astrometric Science

    NASA Astrophysics Data System (ADS)

    Saha, A.; Monet, D.

    2005-12-01

    Continued acquisition and analysis of short-exposure observations support the preliminary conclusion presented by Monet et al. (BAAS v36, p1531, 2004) that a 10-second exposure in 1.0-arcsecond seeing can provide a differential astrometric accuracy of about 10 milliarcseconds. A single solution for mapping coefficients appears to be valid over spatial scales of up to 10 arcminutes, and this suggests that numerical processing can proceed on a per-sensor basis without the need to further divide the individual fields of view into several astrometric patches. Data from the Subaru public archive as well as from the LSST Cerro Pachon 2005 observing campaign and various CTIO and NOAO 4-meter engineering runs have been considered. Should these results be confirmed, the expected astrometric accuracy after 10 years of LSST observations should be around 1.0 milliarcseconds for parallax and 0.2 milliarcseconds/year for proper motions.
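
    As a rough consistency check on these numbers, naive sqrt(N) averaging of the single-visit accuracy already lands in the right regime; the visit count below is an assumption, and real parallax and proper-motion errors also depend on cadence, parallax factors, and the time baseline.

    ```python
    import numpy as np

    # Naive sqrt(N) scaling of the ~10 mas single-visit differential accuracy quoted
    # above; this is only an order-of-magnitude check, not the full error budget.
    single_visit_mas = 10.0
    n_visits = 800           # assumed number of usable visits over 10 years
    per_object = single_visit_mas / np.sqrt(n_visits)
    print(f"~{per_object:.2f} mas mean-position error after {n_visits} visits")
    ```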

  8. The LSST metrics analysis framework (MAF)

    NASA Astrophysics Data System (ADS)

    Jones, R. L.; Yoachim, Peter; Chandrasekharan, Srinivasan; Connolly, Andrew J.; Cook, Kem H.; Ivezic, Željko; Krughoff, K. S.; Petry, Catherine; Ridgway, Stephen T.

    2014-07-01

    We describe the Metrics Analysis Framework (MAF), an open-source python framework developed to provide a user-friendly, customizable, easily-extensible set of tools for analyzing data sets. MAF is part of the Large Synoptic Survey Telescope (LSST) Simulations effort. Its initial goal is to provide a tool to evaluate LSST Operations Simulation (OpSim) simulated surveys to help understand the effects of telescope scheduling on survey performance; however, MAF can be applied to a much wider range of datasets. The building blocks of the framework are Metrics (algorithms to analyze a given quantity of data), Slicers (subdividing the overall data set into smaller data slices as relevant for each Metric), and Database classes (to access the dataset and read data into memory). We describe how these building blocks work together, and provide an example of using MAF to evaluate different dithering strategies. We also outline how users can write their own custom Metrics and use these within the framework.
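
    The "Slicer" building block is essentially a way of iterating a metric over subsets of the data; the toy helper below slices a synthetic observation table by airmass and is only a schematic stand-in for the real MAF classes.

    ```python
    import numpy as np

    def slice_by(values, data, bin_edges):
        """Minimal slicer-like helper: yield the subset of rows falling in each bin
        of some observation property (e.g. airmass). Not the real MAF interface."""
        for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
            mask = (values >= lo) & (values < hi)
            yield (lo, hi), {k: v[mask] for k, v in data.items()}

    rng = np.random.default_rng(3)
    data = {"airmass": rng.uniform(1.0, 2.0, 5000),
            "seeing": rng.normal(0.8, 0.1, 5000)}      # illustrative columns
    for (lo, hi), chunk in slice_by(data["airmass"], data, np.linspace(1.0, 2.0, 5)):
        print(f"airmass [{lo:.2f}, {hi:.2f}): median seeing = {np.median(chunk['seeing']):.2f}")
    ```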

  9. Detection of Double White Dwarf Binaries with Gaia, LSST and eLISA

    NASA Astrophysics Data System (ADS)

    Korol, V.; Rossi, E. M.; Groot, P. J.

    2017-03-01

    According to simulations, around 10^8 double degenerate white dwarf binaries are expected to be present in the Milky Way. Due to their intrinsic faintness, the detection of these systems is a challenge, and the total number of detected sources so far amounts only to a few tens. This will change in the next two decades with the advent of Gaia, the LSST and eLISA. We present an estimation of how many compact DWDs with orbital periods less than a few hours we will be able to detect 1) through electromagnetic radiation with Gaia and LSST and 2) through gravitational wave radiation with eLISA. We find that the sample of simultaneous electromagnetic and gravitational wave detections is expected to be substantial, and will provide us with a powerful tool for probing white dwarf astrophysics and the structure of the Milky Way, ushering us into the era of multi-messenger astronomy for these sources.
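
    For the compact, circular systems discussed here, the dominant gravitational-wave frequency is simply twice the orbital frequency, which is what places hour-scale double white dwarfs in the eLISA band; the periods below are illustrative.

    ```python
    # For a circular double white dwarf binary the dominant gravitational-wave
    # frequency is twice the orbital frequency, f_GW = 2 / P_orb.
    for p_hours in (0.5, 1.0, 5.0):              # orbital periods of "a few hours" or less
        f_gw_mhz = 2.0 / (p_hours * 3600.0) * 1e3
        print(f"P = {p_hours:>4} hr  ->  f_GW ~ {f_gw_mhz:.2f} mHz")
    ```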

  10. LSST camera control system

    NASA Astrophysics Data System (ADS)

    Marshall, Stuart; Thaler, Jon; Schalk, Terry; Huffer, Michael

    2006-06-01

    The LSST Camera Control System (CCS) will manage the activities of the various camera subsystems and coordinate those activities with the LSST Observatory Control System (OCS). The CCS comprises a set of modules (nominally implemented in software) which are each responsible for managing one camera subsystem. Generally, a control module will be a long lived "server" process running on an embedded computer in the subsystem. Multiple control modules may run on a single computer or a module may be implemented in "firmware" on a subsystem. In any case control modules must exchange messages and status data with a master control module (MCM). The main features of this approach are: (1) control is distributed to the local subsystem level; (2) the systems follow a "Master/Slave" strategy; (3) coordination will be achieved by the exchange of messages through the interfaces between the CCS and its subsystems. The interface between the camera data acquisition system and its downstream clients is also presented.

  11. The design and realization of a three-dimensional video system by means of a CCD array

    NASA Astrophysics Data System (ADS)

    Boizard, J. L.

    1985-12-01

    Design features, principles, and initial tests of a prototype three-dimensional robot vision system based on a laser source and a CCD detector array are described. The use of a laser as a coherent illumination source permits determination of relief with a single emitter, since the location of the source is known with low distortion. The CCD detector array furnishes an acceptable signal/noise ratio and, when wired to an appropriate signal processing system, provides real-time data on the return signals, i.e., the characteristic points of an object being scanned. Signal processing involves integration of 29 kB of data per 100 samples, with sampling occurring at a rate of 5 MHz (the CCDs) and yielding an image every 12 msec. Algorithms for filtering errors from the data stream are discussed.
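
    Range recovery in this kind of laser/CCD system reduces to triangulation; the sketch below uses the textbook geometry (laser beam parallel to the optical axis, offset by a baseline) with assumed pixel pitch, focal length, and baseline rather than the prototype's actual parameters.

    ```python
    def triangulated_range(pixel_offset, pixel_pitch_m, focal_m, baseline_m):
        """Classic laser triangulation: a spot at range z, with the laser offset from
        the camera by baseline_m and imaged through a lens of focal length focal_m,
        lands f*b/z from the image centre, so z = f*b / x."""
        x = pixel_offset * pixel_pitch_m          # image-plane displacement in metres
        return focal_m * baseline_m / x

    # 13-micron CCD pixels, 25 mm lens, 10 cm laser-camera baseline (assumptions).
    print(f"{triangulated_range(120, 13e-6, 0.025, 0.10):.2f} m")
    ```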

  12. Low power, compact charge coupled device signal processing system

    NASA Technical Reports Server (NTRS)

    Bosshart, P. W.; Buss, D. D.; Eversole, W. L.; Hewes, C. R.; Mayer, D. J.

    1980-01-01

    A variety of charge-coupled devices (CCDs) for performing programmable correlation for preprocessing environmental sensor data preparatory to its transmission to the ground were developed. A total of two separate ICs were developed and a third was evaluated. The first IC was a CCD chirp-z-transform IC capable of performing a 32-point DFT at frequencies to 1 MHz. All on-chip circuitry operated as designed with the exception of the limited dynamic range caused by a fixed pattern noise due to interactions between the digital and analog circuits. The second IC developed was a 64-stage CCD analog/analog correlator for performing time domain correlation. Multiplier errors were found to be less than 1 percent at designed signal levels and less than 0.3 percent at the smaller measured levels. A prototype IC for performing time domain correlation was also evaluated.
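
    The analog/analog correlator's operation can be mimicked digitally as a tapped delay line whose contents are multiplied by a stored reference each clock; the 64-tap length matches the abstract, while the waveforms and noise levels below are invented for illustration.

    ```python
    import numpy as np

    def sliding_correlator(signal, reference):
        """Digital analogue of a tapped-delay-line correlator: at each step the newest
        sample enters the delay line and the dot product with a stored reference is
        produced. Stage count and waveforms here are assumptions."""
        delay_line = np.zeros(len(reference))
        out = []
        for s in signal:
            delay_line = np.roll(delay_line, 1)
            delay_line[0] = s
            out.append(float(np.dot(delay_line, reference)))
        return np.array(out)

    rng = np.random.default_rng(9)
    ref = rng.normal(size=64)                       # 64-tap reference, as in the IC
    sig = np.concatenate([rng.normal(size=200), ref[::-1], rng.normal(size=200)])
    out = sliding_correlator(sig, ref)
    print(int(np.argmax(np.abs(out))))              # peak where the reference is buried
    ```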

  13. Sandbox CCDs

    NASA Astrophysics Data System (ADS)

    Janesick, James R.; Elliott, Tom S.; Winzenread, Rusty; Pinter, Jeff H.; Dyck, Rudolph H.

    1995-04-01

    Seven new CCDs are presented. The devices will be used in a variety of applications, ranging from the generation of color cinema movies to adaptive optics camera systems that compensate for atmospheric turbulence at major astronomical observatories. This paper highlights areas of design, fabrication, and operation techniques to achieve state-of-the-art performance. We discuss current limitations of CCD technology for several key parameters.

  14. A rigid and thermally stable all ceramic optical support bench assembly for the LSST Camera

    NASA Astrophysics Data System (ADS)

    Kroedel, Matthias; Langton, J. Brian; Wahl, Bill

    2017-09-01

    This paper will present the ceramic design, fabrication and metrology results and assembly plan of the LSST camera optical bench structure, which uses the unique manufacturing features of HB-Cesic technology. The optical bench assembly consists of a rigid "Grid" supporting individual raft plates that mount sensor assemblies by way of a rigid kinematic support system, to meet extremely stringent requirements for focal-plane planarity and stability.

  15. Fast Solar Polarimeter: First Light Results

    NASA Astrophysics Data System (ADS)

    Krishnappa, N.; Feller, A.; Iglesia, F. A.; Solanki, S.

    2013-12-01

    Accurate measurements of magnetic fields on the Sun are crucial to understand various physical processes that take place in the solar atmosphere, such as solar eruptions, coronal heating, solar wind acceleration, etc. The Fast Solar Polarimeter (FSP) is a new instrument that is being developed to probe magnetic fields on the Sun. One of the main goals of this polarimeter is to carry out high precision spectropolarimetric observations with spatial resolution close to the telescope diffraction limit. The polarimeter is based on pnCCD technology with split frame transfer and simultaneous multi-channel readout, resulting in frame rates up to 1 kHz. The FSP prototype instrument uses a small format pnCCD of 264x264 pixels which has been developed by PNSensor and by the semiconductor lab of the Max Planck Society. The polarization modulator is based on two ferro-electric liquid crystals (FLCs) interlaced between two static retarders. The first solar observations were carried out with this prototype during May-June 2013 at the German Vacuum Tower Telescope (VTT) on Tenerife, Canary Islands, Spain. Here we present the instrument performance assessments and the first results on the magnetic field measurements. Further, we briefly discuss the next phase of FSP, which will be a dual-beam system with 1k x 1k CCDs.
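
    Polarimeters of this kind recover the Stokes vector by inverting the modulation introduced by the FLC/retarder package; the sketch below shows the generic demodulation-matrix step with a toy modulation matrix, not the actual FSP modulation scheme.

    ```python
    import numpy as np

    # Generic polarimetric demodulation: each modulation state k measures
    # I_k = (O @ S)_k, where S = (I, Q, U, V) and O encodes the modulator settings.
    # The demodulation matrix is the pseudo-inverse of O; this O is a toy example.
    O = 0.5 * np.array([[1,  1,  0,  0],
                        [1, -1,  0,  0],
                        [1,  0,  1,  0],
                        [1,  0,  0,  1]])
    D = np.linalg.pinv(O)

    S_true = np.array([1.0, 0.01, -0.005, 0.002])   # weakly polarized input (assumed)
    measured = O @ S_true                            # ideal, noiseless modulation cycle
    print(D @ measured)                              # recovers (I, Q, U, V)
    ```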

  16. Solar XUV Imaging and Non-dispersive Spectroscopy for Solar-C Enabled by Scientific CMOS APS Arrays

    NASA Astrophysics Data System (ADS)

    Stern, Robert A.; Lemen, J. R.; Shing, L.; Janesick, J.; Tower, J.

    2009-05-01

    Monolithic CMOS Advanced Pixel Sensor (APS) arrays are showing great promise as eventual replacements for the current workhorse of solar physics focal planes, the scientific CCD. CMOS APS devices have individually addressable pixels, increased radiation tolerance compared to CCDs, and require lower clock voltages and thus lower power. However, commercially available CMOS chips, while suitable for use with intensifiers or fluorescent coatings, are generally not optimized for direct detection of EUV and X-ray photons. A high-performance scientific CMOS array designed for these wavelengths will have significant new capabilities compared to CCDs, including the ability to read out small regions of the solar disk at high (sub-second) cadence, count single X-ray photons with Fano-limited energy resolution, and even operate at room temperature with good noise performance. Such capabilities will be crucial for future solar X-ray and EUV missions such as Solar-C. Sarnoff Corporation has developed scientific-grade, monolithic CMOS arrays for X-ray imaging and photon counting. One prototype device, the "minimal" array, has 8 μm pixels, is 15 to 25 μm thick, is fabricated on high-resistivity (10 to 20 kohm-cm) Si wafers, and can be back-illuminated. These characteristics yield high quantum efficiency and high spatial resolution with minimal charge sharing among pixels, making it ideal for the detection of keV X-rays. When used with digital correlated double sampling, the array has demonstrated noise performance as low as 2 e-, allowing single-photon counting of X-rays over a range of temperatures. We report test results for this device in X-rays and discuss the implications for future solar space missions.

  17. Spectral analysis using CCDs

    NASA Technical Reports Server (NTRS)

    Hewes, C. R.; Brodersen, R. W.; De Wit, M.; Buss, D. D.

    1976-01-01

    Charge-coupled devices (CCDs) are ideally suited for performing sampled-data transversal filtering operations in the analog domain. Two algorithms have been identified for performing spectral analysis in which the bulk of the computation can be performed in a CCD transversal filter: the chirp z-transform and the prime transform. CCD implementations of both transform algorithms are presented, together with performance data and applications.
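
    The appeal of the chirp z-transform for CCDs is that it recasts the DFT as a chirp pre-multiplication, a fixed convolution (realizable in a transversal filter), and a chirp post-multiplication. A minimal numerical sketch of that Bluestein-style factorization, purely for illustration and not the CCD implementation itself:

      import numpy as np

      def czt_dft(x):
          """N-point DFT via the chirp z-transform (Bluestein): chirp
          pre-multiplication, convolution with the conjugate chirp,
          chirp post-multiplication."""
          x = np.asarray(x, dtype=complex)
          N = len(x)
          n = np.arange(N)
          chirp = np.exp(-1j * np.pi * n**2 / N)
          a = x * chirp                                            # pre-multiplied input
          kernel = np.concatenate([np.conj(chirp)[:0:-1], np.conj(chirp)])
          y = np.convolve(a, kernel)[N - 1:2 * N - 1]              # the N aligned outputs
          return chirp * y                                         # post-multiplication

      # spot check against the direct FFT
      x = np.random.randn(32)
      assert np.allclose(czt_dft(x), np.fft.fft(x))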

  18. A mouse model for creatine transporter deficiency reveals early onset cognitive impairment and neuropathology associated with brain aging.

    PubMed

    Baroncelli, Laura; Molinaro, Angelo; Cacciante, Francesco; Alessandrì, Maria Grazia; Napoli, Debora; Putignano, Elena; Tola, Jonida; Leuzzi, Vincenzo; Cioni, Giovanni; Pizzorusso, Tommaso

    2016-10-01

    Mutations in the creatine (Cr) transporter (CrT) gene lead to cerebral creatine deficiency syndrome-1 (CCDS1), an X-linked metabolic disorder characterized by cerebral Cr deficiency causing intellectual disability, seizures, movement and autistic-like behavioural disturbances, and language and speech impairment. Since no data are available about the neural and molecular underpinnings of this disease, we performed a longitudinal analysis of behavioural and pathological alterations associated with CrT deficiency in a CCDS1 mouse model. We found precocious cognitive and autistic-like defects, mimicking the early key features of human CCDS1. Moreover, mutant mice displayed a progressive impairment of short- and long-term declarative memory, denoting early brain aging. Pathological examination showed a prominent loss of GABAergic synapses, marked activation of microglia, reduction of hippocampal neurogenesis and the accumulation of autofluorescent lipofuscin. Our data suggest that brain Cr depletion causes both early intellectual disability and late progressive cognitive decline, and identify novel targets for designing intervention strategies aimed at overcoming brain CCDS1 alterations. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. THE DARK ENERGY CAMERA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flaugher, B.; Diehl, H. T.; Alvarez, O.

    2015-11-15

    The Dark Energy Camera is a new imager with a 2.2° diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.263″ pixel⁻¹. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
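
    As a quick consistency check on the numbers quoted above, the 15 μm pixel pitch and 0.263″ plate scale imply the effective focal length of the corrected prime focus (simple arithmetic on the abstract's values, not a figure taken from the paper):

      # plate scale ["/pixel] = 206265 * pixel_size / focal_length
      pixel_size = 15e-6            # m
      plate_scale = 0.263           # arcsec per pixel
      focal_length = 206265 * pixel_size / plate_scale
      print(f"effective focal length ~ {focal_length:.1f} m")   # about 11.8 m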

  20. The Dark Energy Camera

    DOE PAGES

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15 μm x 15 μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  1. Measurement of Radioactive Contamination in the High-Resistivity Silicon CCDs of the DAMIC Experiment

    DOE PAGES

    Aguilar-Arevalo, A.

    2015-08-25

    We present measurements of radioactive contamination in the high-resistivity silicon charge-coupled devices (CCDs) used by the DAMIC experiment to search for dark matter particles. Novel analysis methods, which exploit the unique spatial resolution of CCDs, were developed to identify α and β particles. Uranium and thorium contamination in the CCD bulk was measured through α spectroscopy, with an upper limit on the ²³⁸U (²³²Th) decay rate of 5 (15) kg⁻¹ d⁻¹ at 95% CL. We also searched for pairs of spatially correlated electron tracks separated in time by up to tens of days, as expected from ³²Si-³²P or ²¹⁰Pb-²¹⁰Bi sequences of β decays. The decay rate of ³²Si was found to be 80 +110/-65 (95% CI). An upper limit of ~35 kg⁻¹ d⁻¹ (95% CL) on the ²¹⁰Pb decay rate was obtained independently by α spectroscopy and the β decay sequence search. Furthermore, these levels of radioactive contamination are sufficiently low for the successful operation of CCDs in the forthcoming 100 g DAMIC detector.

  2. Development of CCDs for REXIS on OSIRIS-REx

    NASA Astrophysics Data System (ADS)

    Ryu, Kevin K.; Burke, Barry E.; Clark, Harry R.; Lambert, Renee D.; O'Brien, Peter; Suntharalingam, Vyshnavi; Ward, Christopher M.; Warner, Keith; Bautz, Mark W.; Binzel, Richard P.; Kissel, Steven E.; Masterson, Rebecca A.

    2014-07-01

    The Regolith X-ray Imaging Spectrometer (REXIS) is a coded-aperture soft X-ray imaging instrument on the OSIRIS-REx spacecraft to be launched in 2016. The spacecraft will fly to and orbit the near-Earth asteroid Bennu, while REXIS maps the elemental distribution on the asteroid using X-ray fluorescence. The detector consists of a 2×2 array of back-illuminated 1k×1k frame-transfer CCDs with flight heritage from Suzaku and Chandra. The back surface has a thin p+-doped layer deposited by molecular-beam epitaxy (MBE) for maximum quantum efficiency and energy resolution at low X-ray energies. The CCDs also feature an integrated optical-blocking filter (OBF) to suppress visible and near-infrared light. The OBF is an aluminum film deposited directly on the CCD back surface and is mechanically more robust and less absorptive of X-rays than conventional free-standing aluminum-coated polymer films. The CCDs have charge transfer inefficiencies of less than 10⁻⁶ and a dark current of 1 e⁻/pixel/second at the REXIS operating temperature of -60 °C. The resulting spectral resolution is 115 eV at 2 keV. The extinction ratio of the filter is ~10¹² at 625 nm.

  3. Design and performance of the SLD vertex detector: a 307 Mpixel tracking system

    NASA Astrophysics Data System (ADS)

    Abe, K.; Arodzero, A.; Baltay, C.; Brau, J. E.; Breidenbach, M.; Burrows, P. N.; Chou, A. S.; Crawford, G.; Damerell, C. J. S.; Dervan, P. J.; Dong, D. N.; Emmet, W.; English, R. L.; Etzion, E.; Foss, M.; Frey, R.; Haller, G.; Hasuko, K.; Hertzbach, S. S.; Hoeflich, J.; Huffer, M. E.; Jackson, D. J.; Jaros, J. A.; Kelsey, J.; Lee, I.; Lia, V.; Lintern, A. L.; Liu, M. X.; Manly, S. L.; Masuda, H.; McKemey, A. K.; Moore, T. B.; Nichols, A.; Nagamine, T.; Oishi, N.; Osborne, L. S.; Russell, J. J.; Ross, D.; Serbo, V. V.; Sinev, N. B.; Sinnott, J.; Skarpaas, K. Viii; Smy, M. B.; Snyder, J. A.; Strauss, M. G.; Dong, S.; Suekane, F.; Taylor, F. E.; Trandafir, A. I.; Usher, T.; Verdier, R.; Watts, S. J.; Weiss, E. R.; Yashima, J.; Yuta, H.; Zapalac, G.

    1997-02-01

    This paper describes the design, construction, and initial operation of SLD's upgraded vertex detector, which comprises 96 two-dimensional charge-coupled devices (CCDs) with a total of 307 Mpixel. Each pixel functions as an independent particle detecting element, providing space point measurements of charged particle tracks with a typical precision of 4 μm in each co-ordinate. The CCDs are arranged in three concentric cylinders just outside the beam-pipe which surrounds the e⁺e⁻ collision point of the SLAC Linear Collider (SLC). The detector is a powerful tool for distinguishing displaced vertex tracks, produced by decay in flight of heavy flavour hadrons or tau leptons, from tracks produced at the primary event vertex. The requirements for this detector include a very low mass structure (to minimize multiple scattering) both for mechanical support and to provide signal paths for the CCDs; operation at low temperature with a high degree of mechanical stability; and high speed CCD readout, signal processing, and data sparsification. The lessons learned in achieving these goals should be useful for the construction of large arrays of CCDs or active pixel devices in the future in a number of areas of science and technology.

  4. LSST Operations Simulator

    NASA Astrophysics Data System (ADS)

    Cook, K. H.; Delgado, F.; Miller, M.; Saha, A.; Allsman, R.; Pinto, P.; Gee, P. A.

    2005-12-01

    We have developed an operations simulator for LSST and used it to explore the design and operations parameter space for this large-etendue telescope and its ten-year survey mission. The design is modular, with separate science programs coded in separate modules. There is a sophisticated telescope module with all motions parametrized for ease of testing different telescope capabilities, e.g., the effect of the acceleration capabilities of various motors on science output. Sky brightness is calculated as a function of moon phase and separation. A sophisticated exposure time calculator has been developed for LSST and is being incorporated into the simulator to allow specification of S/N requirements. All important parameters for the telescope, the site, and the science programs are easily accessible in configuration files. Seeing and cloud data from the three candidate LSST sites are used for our simulations. The simulator has two broad categories of science proposals: sky coverage and transient events. Sky-coverage proposals base their observing priorities on a required number of observations for each field in a particular filter under specified conditions (maximum seeing, sky brightness, etc.); one such proposal is used for a weak lensing investigation. Transient proposals are highly configurable: a transient proposal can require sequential, multiple exposures in various filters with a specified sequence of filters, and require a particular cadence for multiple revisits to complete an observation sequence. Each science proposal ranks potential observations based upon the internal logic of that proposal. We present the results of a variety of mixed science program observing simulations, showing how varied programs can be carried out simultaneously, with many observations serving multiple science goals. The simulator has shown that LSST can carry out its multiple missions under a variety of conditions. KHC's work was performed under the auspices of the US DOE, NNSA by the Univ. of California, LLNL under contract No. W-7405-Eng-48.
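
    A toy version of the sky-coverage ranking logic described above, with the field records, limits and weights invented purely for illustration, might look like:

      def rank_fields(fields, seeing_now, sky_brightness_now):
          """Toy sky-coverage proposal: prefer fields with the most visits still
          needed, but only if current conditions satisfy the field's limits."""
          ranked = []
          for f in fields:      # f: dict with visit counts and condition limits
              if seeing_now > f["max_seeing"] or sky_brightness_now < f["min_sky_mag"]:
                  continue      # conditions too poor for this field
              ranked.append((f["visits_needed"] - f["visits_done"], f["name"]))
          return [name for score, name in sorted(ranked, reverse=True) if score > 0]

      fields = [
          {"name": "F1", "visits_needed": 30, "visits_done": 12, "max_seeing": 0.9, "min_sky_mag": 21.0},
          {"name": "F2", "visits_needed": 30, "visits_done": 28, "max_seeing": 1.2, "min_sky_mag": 20.0},
      ]
      print(rank_fields(fields, seeing_now=0.8, sky_brightness_now=21.3))   # ['F1', 'F2']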

  5. The Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Axelrod, T. S.

    2006-07-01

    The Large Synoptic Survey Telescope (LSST) is an 8.4 meter telescope with a 10 square degree field of view and a 3 Gigapixel imager, planned to be on-sky in 2012. It is a dedicated all-sky survey instrument with several complementary science missions. These include understanding dark energy through weak lensing and supernovae; exploring transients and variable objects; creating and maintaining a solar system map, with particular emphasis on potentially hazardous objects; and increasing the precision with which we understand the structure of the Milky Way. The instrument operates continuously at a rapid cadence, repetitively scanning the visible sky every few nights. The data flow rates from LSST are larger than those from current surveys by roughly a factor of 1000: a few GB/night are typical today, while LSST will deliver a few TB/night. From a computing hardware perspective, this factor of 1000 can be dealt with easily in 2012. The major issues in designing the LSST data management system arise from the fact that the number of people available to critically examine the data will not grow from current levels. This has a number of implications. For example, every large imaging survey today is resigned to the fact that its image reduction pipelines fail at some significant rate. Many of these failures are dealt with by rerunning the reduction pipeline under human supervision, with carefully "tweaked" parameters to deal with the original problem. For LSST, this will no longer be feasible. The problem is compounded by the fact that the processing must of necessity occur on clusters with large numbers of CPUs and disk drives, with some components connected by long-haul networks. This inevitably results in a significant rate of hardware component failures, which can easily lead to further software failures. Both hardware and software failures must be seen as a routine fact of life rather than rare exceptions to normality.

  6. Atmospheric Dispersion Effects in Weak Lensing Measurements

    DOE PAGES

    Plazas, Andrés Alejandro; Bernstein, Gary

    2012-10-01

    The wavelength dependence of atmospheric refraction causes elongation of finite-bandwidth images along the elevation vector, which produces spurious signals in weak gravitational lensing shear measurements unless this atmospheric dispersion is calibrated and removed to high precision. Because astrometric solutions and PSF characteristics are typically calibrated from stellar images, differences between the reference stars' spectra and the galaxies' spectra will leave residual errors in both the astrometric positions (dr) and the second moment (width) of the wavelength-averaged PSF (dv) for galaxies. We estimate the level of dv that will induce spurious weak lensing signals in PSF-corrected galaxy shapes that exceed the statistical errors of the DES and the LSST cosmic-shear experiments. We also estimate the dr signals that will produce unacceptable spurious distortions after stacking of exposures taken at different airmasses and hour angles. We calculate the errors in the griz bands, and find that dispersion systematics, uncorrected, are up to 6 and 2 times larger in the g and r bands, respectively, than the requirements for the DES error budget, but can be safely ignored in the i and z bands. For the LSST requirements, the factors are about 30, 10, and 3 in the g, r, and i bands, respectively. We find that a simple correction linear in galaxy color is accurate enough to reduce dispersion shear systematics to insignificant levels in the r band for DES and the i band for LSST, but still as much as 5 times larger than the requirements for LSST r-band observations. More complex corrections will likely be able to reduce the systematic cosmic-shear errors below statistical errors for the LSST r band. But g-band effects remain large enough that induced systematics will likely dominate the statistical errors of both surveys, and cosmic-shear measurements should rely on the redder bands.
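
    The color-linear correction the authors describe amounts to predicting the dispersion-induced PSF second-moment shift from a star-calibrated relation and subtracting it. A schematic of that step, with placeholder coefficients (not values from the paper):

      import numpy as np

      def dispersion_correction(dv_measured, galaxy_color, a, b):
          """Remove the atmospheric-dispersion contribution to the PSF second
          moment using a relation linear in color, dv_pred = a + b * color,
          with (a, b) calibrated on stars of known spectra."""
          dv_pred = a + b * np.asarray(galaxy_color, dtype=float)
          return np.asarray(dv_measured, dtype=float) - dv_pred

      # placeholder coefficients fit from stellar calibrators
      corrected = dispersion_correction(dv_measured=[0.012, 0.009],
                                        galaxy_color=[0.6, 0.3], a=0.002, b=0.015)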

  7. A simple screening method using ion chromatography for the diagnosis of cerebral creatine deficiency syndromes.

    PubMed

    Wada, Takahito; Shimbo, Hiroko; Osaka, Hitoshi

    2012-08-01

    Cerebral creatine deficiency syndromes (CCDS) are caused by genetic defects in L-arginine:glycine amidinotransferase, guanidinoacetate methyltransferase or creatine transporter 1. CCDS are characterized by abnormal concentrations of urinary creatine (CR), guanidinoacetic acid (GA), or creatinine (CN). In this study, we describe a simple HPLC method to determine the concentrations of CR, GA, and CN using a weak-acid ion chromatography column with a UV detector without any derivatization. CR, GA, and CN were separated clearly with the retention times (mean ± SD, n = 3) of 5.54 ± 0.0035 min for CR, 6.41 ± 0.0079 min for GA, and 13.53 ± 0.046 min for CN. This new method should provide a simple screening test for the diagnosis of CCDS.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, John Russell

    This grant funded the development and dissemination of the Photon Simulator (PhoSim) for the purpose of studying dark energy at high precision with the upcoming Large Synoptic Survey Telescope (LSST) astronomical survey. The work was in collaboration with the LSST Dark Energy Science Collaboration (DESC). Several detailed physics improvements were made in the optics, atmosphere, and sensor, a number of validation studies were performed, and a significant number of usability features were implemented. Future work in DESC will use PhoSim as the image simulation tool for data challenges used by the analysis groups.

  9. The prototype cameras for trans-Neptunian automatic occultation survey

    NASA Astrophysics Data System (ADS)

    Wang, Shiang-Yu; Ling, Hung-Hsu; Hu, Yen-Sang; Geary, John C.; Chang, Yin-Chang; Chen, Hsin-Yo; Amato, Stephen M.; Huang, Pin-Jie; Pratlong, Jerome; Szentgyorgyi, Andrew; Lehner, Matthew; Norton, Timothy; Jorden, Paul

    2016-08-01

    The Transneptunian Automated Occultation Survey (TAOS II) is a three-robotic-telescope project to detect stellar occultation events generated by Trans-Neptunian Objects (TNOs). The TAOS II project aims to monitor about 10000 stars simultaneously at 20 Hz to achieve a statistically significant event rate. The TAOS II camera is designed to cover the 1.7-degree diameter field of view of the 1.3 m telescope with 10 mosaicked 4.5k×2k CMOS sensors. The new CMOS sensor (CIS 113) has a back-illuminated, thinned structure and high sensitivity, providing performance similar to that of back-illuminated thinned CCDs. Because of the requirements for high performance and high speed, the development of the new CMOS sensor is still in progress. Before the science arrays are delivered, a prototype camera has been developed to help commission the robotic telescope system. The prototype camera uses the small-format e2v CIS 107 device but with the same dewar and similar control electronics as the TAOS II science camera. The sensors, mounted on a single Invar plate, are cooled by a cryogenic cooler to the same operating temperature of about 200 K as the science array. The Invar plate is connected to the dewar body through a supporting ring with three G10 bipods. The control electronics consist of an analog part and a Xilinx FPGA-based digital circuit; one FPGA is needed to control and process the signal from each CMOS sensor for 20 Hz region-of-interest (ROI) readout.

  10. The Whole is Greater than the Sum of the Parts: Optimizing the Joint Science Return from LSST, Euclid and WFIRST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, B.; Spergel, D.; Connolly, A.

    2015-02-02

    The scientific opportunity offered by the combination of data from LSST, WFIRST and Euclid goes well beyond the science enabled by any one of the data sets alone. The range in wavelength, angular resolution and redshift coverage that these missions jointly span is remarkable. With major investments in LSST and WFIRST, and partnership with ESA in Euclid, the US has an outstanding scientific opportunity to carry out a combined analysis of these data sets. It is imperative for us to seize it and, together with our European colleagues, prepare for the defining cosmological pursuit of the 21st century. The main argument for conducting a single, high-quality reference co-analysis exercise and carefully documenting the results is the complexity and subtlety of systematics that define this co-analysis. Falling back on many small efforts by different teams in selected fields and for narrow goals will be inefficient, leading to significant duplication of effort.

  11. The Emerging Infrastructure of Autonomous Astronomy

    NASA Astrophysics Data System (ADS)

    Seaman, R.; Allan, A.; Axelrod, T.; Cook, K.; White, R.; Williams, R.

    2007-10-01

    Advances in the understanding of cosmic processes demand that sky transient events be confronted with statistical techniques honed on static phenomena. Time domain data sets require vast surveys such as LSST {http://www.lsst.org/lsst_home.shtml} and Pan-STARRS {http://www.pan-starrs.ifa.hawaii.edu}. A new autonomous infrastructure must close the loop from the scheduling of survey observations, through data archiving and pipeline processing, to the publication of transient event alerts and automated follow-up, and on to the easy analysis of the resulting data. The IVOA VOEvent {http://voevent.org} working group leads efforts to characterize sky transient alerts published through VOEventNet {http://voeventnet.org}. The Heterogeneous Telescope Networks (HTN {http://www.telescope-networks.org}) consortium is a group of observatories and robotic telescope projects seeking interoperability, with a long-term goal of creating an e-market for telescope time. Two projects relying on VOEvent and HTN are eSTAR {http://www.estar.org.uk} and the Thinking Telescope {http://www.thinkingtelescopes.lanl.gov} Project.

  12. LIMB-DARKENING COEFFICIENTS FOR ECLIPSING WHITE DWARFS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gianninas, A.; Strickland, B. D.; Kilic, Mukremin

    2013-03-20

    We present extensive calculations of linear and nonlinear limb-darkening coefficients, as well as complete intensity profiles, appropriate for modeling the light curves of eclipsing white dwarfs. We compute limb-darkening coefficients in the Johnson-Kron-Cousins UBVRI photometric system as well as the Large Synoptic Survey Telescope (LSST) ugrizy system using the most up-to-date model atmospheres available. In all, we provide coefficients for seven different limb-darkening laws. We describe the variations of these coefficients as a function of the atmospheric parameters, including the effects of convection at low effective temperatures. Finally, we discuss the importance of having readily available limb-darkening coefficients in the context of present and future photometric surveys like the LSST, the Palomar Transient Factory, and the Panoramic Survey Telescope and Rapid Response System (Pan-STARRS). The LSST, for example, may find ~10⁵ eclipsing white dwarfs. The limb-darkening calculations presented here will be an essential part of the detailed analysis of all of these systems.
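
    For reference, the linear and quadratic limb-darkening laws (two of the commonly tabulated forms) are evaluated directly from their coefficients; the coefficient values below are placeholders, not taken from the paper:

      import numpy as np

      def linear_law(mu, u):
          """I(mu)/I(1) = 1 - u*(1 - mu)"""
          return 1.0 - u * (1.0 - mu)

      def quadratic_law(mu, a, b):
          """I(mu)/I(1) = 1 - a*(1 - mu) - b*(1 - mu)**2"""
          return 1.0 - a * (1.0 - mu) - b * (1.0 - mu) ** 2

      mu = np.linspace(0.0, 1.0, 11)                 # mu = cos(angle from disk centre)
      profile = quadratic_law(mu, a=0.15, b=0.25)    # placeholder coefficients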

  13. Calibration methods and performance evaluation for pnCCDs in experiments with FEL radiation

    NASA Astrophysics Data System (ADS)

    Kimmel, N.; Andritschke, R.; Englert, L.; Epp, S.; Hartmann, A.; Hartmann, R.; Hauser, G.; Holl, P.; Ordavo, I.; Richter, R.; Strüder, L.; Ullrich, J.

    2011-06-01

    Measurement campaigns of the Max-Planck Advanced Study Group (ASG) in cooperation with the Center for Free Electron Laser Science (CFEL) at DESY-FLASH and SLAC-LCLS have established pnCCDs as universal photon imaging spectrometers in the energy range from 90 eV to 2 keV. In the CFEL-ASG multi-purpose chamber (CAMP), pnCCD detector modules are an integral part of the design, with the ability to detect photons at very small scattering angles. In order to fully exploit the spectroscopic and intensity-imaging capability of pnCCDs, it is essential to translate the unprocessed raw data into units of photon counts for any given position on the detection area. We have studied the performance of pnCCDs in FEL experiments and laboratory test setups over a range of signal intensities, from a few X-ray photons per signal frame to 100 or more 2 keV photons per pixel. Based on these measurement results, we were able to characterize the response of pnCCDs over the experimentally relevant photon energy and intensity range. The calibration results obtained are directly relevant for the physics data analysis. The accumulated knowledge of detector performance was distilled into guidelines for detector calibration methods suited to the specific requirements of photon science experiments at free electron lasers. We discuss the achievable accuracy of photon energy and photon count measurements before and after the application of calibration data. Charge spreading due to illumination of small spots with high photon rates is discussed with respect to the charge handling capacity of a pixel and the effect of the charge spreading process on the resulting signal patterns.
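
    Translating raw pnCCD data into photon counts, as described above, reduces to an offset subtraction, a gain conversion to electrons, and division by the charge generated per photon. A minimal sketch, assuming monochromatic illumination and the approximate silicon pair-creation energy of 3.65 eV per electron; all calibration constants are placeholders:

      import numpy as np

      W_SI = 3.65          # eV per electron-hole pair in silicon (approximate)

      def adu_to_photons(frame_adu, offset_map, gain_e_per_adu, photon_energy_ev):
          """Convert a raw pnCCD frame (ADU) to photons per pixel for
          monochromatic illumination of known photon energy."""
          electrons = (np.asarray(frame_adu, dtype=float) - offset_map) * gain_e_per_adu
          return electrons * W_SI / photon_energy_ev

      # e.g. 2 keV photons with a placeholder offset and gain
      photons = adu_to_photons(frame_adu=[[1240.0]], offset_map=1000.0,
                               gain_e_per_adu=4.6, photon_energy_ev=2000.0)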

  14. Impact of International Decisions of TAI Generation

    DTIC Science & Technology

    1996-12-01

    in their specified fields as advisers on scientific and technical matters. Among them, the Comité Consultatif pour la Définition de la Seconde (CCDS) deals...Time (UTC). For issues concerning time, the CIPM takes advice from the Comité Consultatif pour la Définition de la Seconde (CCDS). Although the name...tests of new devices and suggestions to manufacturers. CONCLUSIONS The meetings of the Comité Consultatif pour la Définition de la Seconde gather

  15. A CMOS-based large-area high-resolution imaging system for high-energy x-ray applications

    NASA Astrophysics Data System (ADS)

    Rodricks, Brian; Fowler, Boyd; Liu, Chiao; Lowes, John; Haeffner, Dean; Lienert, Ulrich; Almer, John

    2008-08-01

    CCDs have been the primary sensors in imaging systems for X-ray diffraction and imaging applications in recent years. CCDs have met the fundamental requirements of low noise, high sensitivity, high dynamic range and spatial resolution necessary for these scientific applications. State-of-the-art CMOS image sensor (CIS) technology has experienced dramatic improvements recently, and its performance is rivaling or surpassing that of most CCDs. The advancement of CIS technology is proceeding at an ever-accelerating pace, driven by the multi-billion dollar consumer market. There are several advantages of CIS over traditional CCDs and other solid-state imaging devices, including low power, high-speed operation, system-on-chip integration and lower manufacturing costs. The combination of superior imaging performance and system advantages makes CIS a good candidate for high-sensitivity imaging system development. This paper describes a 1344 × 1212 CIS imaging system with a 19.5 μm pitch optimized for X-ray scattering studies at high energies. Fundamental metrics of linearity, dynamic range, spatial resolution, conversion gain and sensitivity are estimated, along with the Detective Quantum Efficiency (DQE). Representative X-ray diffraction images are presented and compared against a CCD-based imaging system.
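
    Of the metrics listed, conversion gain is commonly estimated with the mean-variance (photon transfer) method on a pair of flat-field frames; a compact sketch under the usual shot-noise-limited assumption (not necessarily the exact procedure used by the authors):

      import numpy as np

      def conversion_gain(flat_a, flat_b, dark):
          """Estimate conversion gain (e-/DN) from a flat-field pair and a dark
          frame: for shot-noise-limited flats, gain = mean_signal / variance.
          Differencing the two flats removes fixed-pattern noise."""
          a = np.asarray(flat_a, dtype=float) - dark
          b = np.asarray(flat_b, dtype=float) - dark
          mean_signal = 0.5 * (a.mean() + b.mean())
          var_signal = np.var(a - b) / 2.0      # variance of one frame, FPN removed
          return mean_signal / var_signal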

  16. Guiding the Design of Radiation Imagers with Experimentally Benchmarked Geant4 Simulations for Electron-Tracking Compton Imaging

    NASA Astrophysics Data System (ADS)

    Coffer, Amy Beth

    Radiation imagers are important tools in the modern world for a wide range of applications, spanning the fundamental sciences, astrophysics and medical imaging, all the way to national security, nuclear safeguards, and non-proliferation verification. The radiation imagers studied in this thesis are gamma-ray imagers, which detect emissions from radioactive materials. The goal of a gamma-ray imager is to localize and map the distribution of radiation within its field of view despite complicating background radiation that can be terrestrial, astronomical, and temporal in origin. Compton imaging systems are one type of gamma-ray imager that can map the radiation around the system without the use of collimation. The lack of collimation allows the imaging system to detect radiation from all directions while increasing detection efficiency, since incident radiation is not absorbed in non-sensing materials. Each Compton-scatter event within an imaging system generates a cone surface in space from which the radiation could have originated. Compton imaging is limited in reconstructed-image signal-to-background because source Compton cones overlap with cones from background radiation, which limits the detection sensitivity in image space. Electron-tracking Compton imaging (ETCI) can improve the detection sensitivity by measuring the Compton-scattered electron's initial trajectory. With an estimate of the scattered electron's trajectory, one can reduce the Compton back-projected cone to a cone arc, enabling faster radiation source detection and localization. However, the ability to measure the Compton-scattered electron trajectories adds another layer of complexity to an already complex methodology; for real-world imaging applications, improvements are needed in electron-track detection efficiency and in electron-track reconstruction. One way of measuring Compton-scattered electron trajectories is with high-resolution Charge-Coupled Devices (CCDs). The proof-of-principle CCD-based ETCI experiment demonstrated the CCDs' ability to measure the Compton-scattered electron tracks as a 2-dimensional image, and electron-track-imaging algorithms using that image are able to determine the 3-dimensional electron trajectory to within +/- 20 degrees. The work presented here comprises the physics simulations developed alongside the proof-of-principle experiment. Accurate physics modeling of multi-layer CCD-based ETCI systems allows prediction of future ETCI system performance; the simulations also yield quick development insights for system design and guide the development of electron-track reconstruction methods. The physics simulation effort for this project looked closely at the accuracy of the Geant4 Monte Carlo methods for medium-energy electron transport. In older versions of Geant4 there were some discrepancies between the electron-tracking experimental measurements and the simulation results. It was determined that, when examining electron dynamics at very high resolution, Geant4 simulations must be fine-tuned with careful choices of physics production cuts and electron step sizes. One result of this work is a CCD Monte Carlo model that has been benchmarked against experimental findings and fully characterized for both photon and electron transport.
The CCD physics model now matches experimental results to within 1 percent for scattered-electron energies below 500 keV. Following the improvements to the CCD simulations, the performance of a realistic two-layer CCD-stack system was characterized. The realistic CCD-stack model included the effect of thin passive layers on the CCDs' front face and back contact. The photon interaction efficiency was calculated for the two-layer CCD stack, and we found a 90 percent probability that scattered electrons from a 662 keV source stay within a single active layer. This demonstrates the improved detection efficiency that is one of the strengths of implementing CCDs as an ETCI system. The CCD-stack simulations also established that electron tracks scattering from one CCD layer to another could be reconstructed. The passive regions of the CCD stack mean that these inter-layer electron tracks always lose both angular and energy information. Examining the angular changes of electrons scattering between the CCD layers showed no strong energy dependence; the angular changes are primarily a function of the thickness of the thin back layer of the CCDs. Lastly, an approach using CCD-stack simulations was developed, and its feasibility demonstrated, to reconstruct the energy lost in transport across dead layers. Adding back this lost energy limits the degradation of energy resolution for the scatter interactions; energy-resolution losses would otherwise degrade the achievable image resolution from the reconstruction algorithms. Returning some of this energy to the reconstructed electron track helps retain the expected performance of the electron-track trajectory determination algorithm.

  17. The DAMIC Dark Matter Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Mello Neto, J. R.T.

    The DAMIC (DArk Matter In CCDs) experiment uses high-resistivity, scientific-grade CCDs to search for dark matter. The CCD's low electronic noise allows an unprecedentedly low energy threshold of a few tens of eV; this characteristic makes it possible to detect silicon recoils resulting from interactions of low-mass WIMPs. In addition, the CCD's high spatial resolution and excellent energy response result in very effective background identification techniques. The experiment has a unique sensitivity to dark matter particles with masses below 10 GeV/c². Previous results have motivated the construction of DAMIC100, a 100-gram silicon target detector currently being installed at SNOLAB. The mode of operation and unique imaging capabilities of the CCDs, and how they may be exploited to characterize and suppress backgrounds, are discussed, as well as physics results after one year of data taking.

  18. Visible and Ultraviolet Detectors for High Earth Orbit and Lunar Observatories

    NASA Technical Reports Server (NTRS)

    Woodgate, Bruce E.

    1989-01-01

    The current status of detectors for the visible and UV for future large observatories in Earth orbit and on the Moon is briefly reviewed. For the visible, CCDs have the highest quantum efficiency, but are subject to contamination of the data by cosmic ray hits. On the Moon, the level of hits can be brought down to that at the Earth's surface by shielding below about 20 meters of rock. For high Earth orbits above the geomagnetic shield, CCDs might be usable by combining many short exposures and vetoing the cosmic ray hits; otherwise photoemissive detectors will be necessary. For the UV, photoemissive detectors will be necessary to reject the visible; to use CCDs would require the development of UV-efficient filters which reject the visible by many orders of magnitude. Development of higher count rate capability would be desirable for photoemissive detectors.
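
    The strategy of combining many short exposures while vetoing cosmic-ray hits can be approximated by a per-pixel sigma-clipped combination of the exposure stack; a minimal illustration (a real veto would be considerably more elaborate):

      import numpy as np

      def combine_veto_cosmic_rays(exposures, n_sigma=5.0):
          """Combine a stack of short CCD exposures, rejecting pixels in each
          frame that deviate from the per-pixel median by more than n_sigma
          (a simple stand-in for a full cosmic-ray veto)."""
          stack = np.asarray(exposures, dtype=float)          # shape (n_frames, ny, nx)
          median = np.median(stack, axis=0)
          sigma = 1.4826 * np.median(np.abs(stack - median), axis=0) + 1e-9
          keep = np.abs(stack - median) < n_sigma * sigma
          return np.sum(stack * keep, axis=0) / np.maximum(keep.sum(axis=0), 1)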

  19. Inexpensive Neutron Imaging Cameras Using CCDs for Astronomy

    NASA Astrophysics Data System (ADS)

    Hewat, A. W.

    We have developed inexpensive neutron imaging cameras using CCDs originally designed for amateur astronomical observation. The low-light, high-resolution requirements of such CCDs are similar to those for neutron imaging, except that noise as well as cost is reduced by using slower read-out electronics. For example, we use the same 2048x2048 pixel "Kodak" KAI-4022 CCD as used in the high performance PCO-2000 CCD camera, but our electronics requires ∼5 sec for full-frame read-out, ten times slower than the PCO-2000. Since neutron exposures also require several seconds, this is not seen as a serious disadvantage for many applications. If higher frame rates are needed, the CCD unit on our camera can be easily swapped for a faster readout detector with similar chip size and resolution, such as the PCO-2000 or the sCMOS PCO.edge 4.2.

  20. Constraining neutrinos as background to wimp-nucleon dark matter particle searches for DaMIC: CCD physics analysis and electronics development

    NASA Astrophysics Data System (ADS)

    Butner, Melissa Jean

    The DaMIC (Dark Matter in CCDs) experiment searches for dark matter particles using charge-coupled devices (CCDs) operated at a low detection threshold of ~40 eV electron-equivalent energy (eVee). A multiplexer board is tested for DAMIC100+ which can control up to 16 CCDs at a time, allowing the selection of a single CCD for readout while leaving all others static and maintaining sub-electron noise. A dark matter limit is produced using the results of physics data taken with the DAMIC experiment. Next, the contribution from neutrino-nucleus coherent scattering is investigated using data from the Coherent Neutrino Nucleus Interaction Experiment (CONnuIE), which uses the same CCD technology. The results are used to explore the performance of CCD detectors that will ultimately limit the ability to differentiate incident solar and atmospheric neutrinos from dark matter particles.

  1. Effects of liquid feeding of corn condensed distiller's solubles and whole stillage on growth performance, carcass characteristics, and sensory traits of pigs.

    PubMed

    Yang, Xiaojian; Nath, Carissa; Doering, Alan; Goihl, John; Baidoo, Samuel Kofi

    2017-01-01

    The immense growth in global bioethanol production has greatly increased the supply of by-products such as whole stillage and condensed distiller's solubles, which could potentially be used for animal feeding. The objective of this study was to investigate the effects of liquid feeding of high levels of corn condensed distiller's solubles (CCDS) and whole stillage (CWS) on growth performance, carcass characteristics, belly firmness and meat sensory traits of pigs. A total of 256 pigs were blocked by sex and initial BW (13.5 ± 2.5 kg), and pens of pigs (8 pigs/pen) were randomly allocated to 1 of 4 dietary treatments (8 pens/treatment): 1) corn-soybean meal based diet as control, 2) 25% CWS + 5% CCDS, 3) 19.5% CWS + 10.5% CCDS, and 4) 19.5, 26, and 32.5% CWS + 10.5, 14, and 17.5% CCDS in phases 1 (28 d), 2 (38 d), and 3 (60 d), respectively. Inclusion levels of CCDS and CWS for Treatments 1, 2, and 3 were fixed during all three phases of the experiment. Inclusion levels of CWS and CCDS were on an 88% dry matter basis. The liquid feeding system delivered feed from the mixing tank to feed troughs by high-pressure air, had sensors inside the feed troughs, and recorded daily feed intake on the basis of a reference feed intake curve. The pigs were fed 5 to 10 times per day with increasing frequency during the experiment. Control pigs had greater (P < 0.05) average daily gain (0.91 vs. 0.84, 0.85, 0.85 kg/d) and gain to feed ratio (0.37 vs. 0.33, 0.34, 0.34) than pigs in the other three treatments during the overall period. Compared with the control, the other three groups had (P < 0.05) or tended to have (P < 0.10) lower carcass weight and backfat depth due to lighter (P < 0.05) slaughter body weight, but similar (P > 0.10) dressing percentage, loin muscle depth, and lean percentage were observed among the four treatments. Inclusion of CWS and CCDS reduced (P < 0.05) or tended to reduce (P < 0.10) belly firmness but did not influence (P > 0.10) the overall liking, flavor, tenderness and juiciness of loin chops when compared with the control group. In conclusion, our results indicate that including 30-50% of a mixture of whole stillage and condensed distiller's solubles in growing-finishing diets may reduce growth performance, carcass weight and belly firmness, but does not affect pork sensory traits.

  2. A daytime measurement of the lunar contribution to the night sky brightness in LSST's ugrizy bands-initial results

    NASA Astrophysics Data System (ADS)

    Coughlin, Michael; Stubbs, Christopher; Claver, Chuck

    2016-06-01

    We report measurements from which we determine the spatial structure of the lunar contribution to night sky brightness, taken at the LSST site on Cerro Pachon in Chile. We use an array of six photodiodes with filters that approximate the Large Synoptic Survey Telescope's u, g, r, i, z, and y bands. We use the sun as a proxy for the moon, and measure sky brightness as a function of the zenith angle of the point on the sky, the zenith angle of the sun, and the angular distance between the sun and the point on the sky. We make a correction for the difference between the illumination spectrum of the sun and the moon. Since scattered sunlight totally dominates the daytime sky brightness, this technique allows us to cleanly determine the contribution to the (cloudless) night sky from backscattered moonlight, without contamination from other sources of night sky brightness. We estimate our uncertainty in the relative lunar night sky brightness vs. zenith and lunar angle to be between 0.3 and 0.7 mag, depending on the passband. This information is useful in planning the optimal execution of the LSST survey, and perhaps for other astronomical observations as well. Although our primary objective is to map out the angular structure and spectrum of the scattered light from the atmosphere and particulates, we also make an estimate of the expected number of scattered lunar photons per pixel per second in LSST, and find values that are in overall agreement with previous estimates.

  3. Probing the Solar System with LSST

    NASA Astrophysics Data System (ADS)

    Harris, A.; Ivezic, Z.; Juric, M.; Lupton, R.; Connolly, A.; Kubica, J.; Moore, A.; Bowell, E.; Bernstein, G.; Cook, K.; Stubbs, C.

    2005-12-01

    LSST will catalog small Potentially Hazardous Asteroids (PHAs), survey the main-belt asteroid (MBA) population to extraordinarily small sizes, discover comets far from the sun where their nuclear properties can be discerned without coma, and survey the Centaur and Trans-Neptunian Object (TNO) populations. The present planned observing strategy is to "visit" each field (9.6 deg²) with two back-to-back exposures of ~15 sec, reaching to at least V magnitude 24.5. An intra-night revisit time of order half an hour will distinguish stationary transients from even very distant (~70 AU) solar system bodies. In order to link observations and determine orbits, each sky area will be visited several times during a month, spaced by about a week. This cadence will result in orbital parameters for several million MBAs and about 20,000 TNOs, with light curves and colorimetry for the brighter 10% or so of each population. Compared to the data currently available, this would represent a factor of 10 to 100 increase in the numbers of orbits, colors, and variability measurements for the two classes of objects. The LSST MBA and TNO samples will enable detailed studies of the dynamical and chemical history of the solar system. The increase in data volume associated with LSST asteroid science will present many computational challenges to how we extract tracks and orbits of asteroids from the underlying clutter. Tree-based algorithms for multihypothesis testing of asteroid tracks can help solve these challenges by providing the necessary 1000-fold speed-ups over current approaches while recovering 95% of the underlying moving objects.
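
    The half-hour revisit argument follows from the reflex motion imposed by the Earth's orbital velocity; a back-of-the-envelope estimate assuming circular orbits and observation at opposition (illustrative only, not a calculation from the paper):

      import numpy as np

      AU_KM = 1.496e8
      V_EARTH = 29.8                     # km/s, Earth's orbital speed

      def apparent_motion_arcsec_per_hour(d_au):
          """Approximate retrograde rate at opposition for a circular orbit at
          d_au: relative transverse speed divided by geocentric distance."""
          v_obj = V_EARTH / np.sqrt(d_au)                    # Kepler: v ~ 1/sqrt(a)
          rate_rad_s = (V_EARTH - v_obj) / ((d_au - 1.0) * AU_KM)
          return np.degrees(rate_rad_s) * 3600.0 * 3600.0

      print(apparent_motion_arcsec_per_hour(70.0))   # ~2 arcsec/hr, i.e. ~1" in 30 min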

  4. Scheduling Algorithm for the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Ichharam, Jaimal; Stubbs, Christopher

    2015-01-01

    The Large Synoptic Survey Telescope (LSST) is a wide-field telescope currently under construction, scheduled to be deployed in Chile by 2022 and to operate for a ten-year survey. As a ground-based telescope with the largest etendue ever constructed, and the ability to take images approximately once every eighteen seconds, the LSST will be able to capture the entirety of the observable sky every few nights in six different bandpasses. With these remarkable features, LSST is primed to provide the scientific community with invaluable data in numerous areas of astronomy, including the observation of near-Earth asteroids, the detection of transient optical events such as supernovae, and the study of dark matter and dark energy through weak gravitational lensing. In order to maximize the utility that LSST will provide toward achieving these scientific objectives, it proves necessary to develop a flexible scheduling algorithm for the telescope which both optimizes its observational efficiency and allows for adjustment based on the evolving needs of the astronomical community. This work defines a merit function that incorporates the urgency of observing a particular field in the sky as a function of the time elapsed since it was last observed, dynamic viewing conditions (in particular transparency and sky brightness), and a measure of scientific interest in the field. The problem of maximizing this merit function, summed across the entire observable sky, is then reduced to a classic variant of the dynamic traveling salesman problem. We introduce a new approximation technique that appears particularly well suited to this situation, analyze its effectiveness in resolving this problem, and obtain some promising initial results.
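
    A schematic of the kind of merit function described above, combining time since last visit, current conditions and a science weight, with a greedy next-field choice standing in for the travelling-salesman machinery; all weights and field records are invented for illustration:

      import math

      def merit(field, t_now, transparency, sky_brightness):
          """Toy merit: urgency grows with time since last visit, scaled by the
          field's science weight and the current observing conditions."""
          urgency = math.log1p(t_now - field["last_visit"])     # hours since last visit
          conditions = transparency * max(sky_brightness - field["min_sky_mag"], 0.0)
          return field["science_weight"] * urgency * conditions

      def next_field(fields, t_now, transparency, sky_brightness):
          return max(fields, key=lambda f: merit(f, t_now, transparency, sky_brightness))

      fields = [{"name": "A", "last_visit": 0.0, "science_weight": 1.0, "min_sky_mag": 20.5},
                {"name": "B", "last_visit": 70.0, "science_weight": 0.8, "min_sky_mag": 21.0}]
      print(next_field(fields, t_now=72.0, transparency=0.9, sky_brightness=21.4)["name"])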

  5. Looking through the same lens: Shear calibration for LSST, Euclid, and WFIRST with stage 4 CMB lensing

    NASA Astrophysics Data System (ADS)

    Schaan, Emmanuel; Krause, Elisabeth; Eifler, Tim; Doré, Olivier; Miyatake, Hironao; Rhodes, Jason; Spergel, David N.

    2017-06-01

    The next-generation weak lensing surveys (i.e., LSST, Euclid, and WFIRST) will require exquisite control over systematic effects. In this paper, we address shear calibration and present the most realistic forecast to date for LSST/Euclid/WFIRST and CMB lensing from a stage 4 CMB experiment ("CMB S4"). We use the cosmolike code to simulate a joint analysis of all the two-point functions of galaxy density, galaxy shear, and CMB lensing convergence. We include the full Gaussian and non-Gaussian covariances and explore the resulting joint likelihood with Monte Carlo Markov chains. We constrain shear calibration biases while simultaneously varying cosmological parameters, galaxy biases, and photometric redshift uncertainties. We find that CMB lensing from CMB S4 enables the calibration of the shear biases down to 0.2%-3% in ten tomographic bins for LSST (below the ˜0.5 % requirements in most tomographic bins), down to 0.4%-2.4% in ten bins for Euclid, and 0.6%-3.2% in ten bins for WFIRST. For a given lensing survey, the method works best at high redshift where shear calibration is otherwise most challenging. This self-calibration is robust to Gaussian photometric redshift uncertainties and to a reasonable level of intrinsic alignment. It is also robust to changes in the beam and the effectiveness of the component separation of the CMB experiment, and slowly dependent on its depth, making it possible with third-generation CMB experiments such as AdvACT and SPT-3G, as well as the Simons Observatory.

  6. Optical Design of the LSST Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S; Seppala, L; Gilmore, K

    2008-07-16

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a camera system that includes a set of broad-band filters and refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. Optical design of the camera lenses and filters is integrated with optical design of the telescope mirrors to optimize performance, resulting in excellent image quality over the entire field from ultra-violet to near infra-red wavelengths. The LSST camera optics design consists of three refractive lenses with clear aperture diameters of 1.55 m, 1.10 m and 0.69 m and six interchangeable, broad-band filters with clear aperture diameters of 0.75 m. We describe the methodology for fabricating, coating, mounting and testing these lenses and filters, and we present the results of detailed tolerance analyses, demonstrating that the camera optics will perform to the specifications required to meet their performance goals.

  7. Scientific grade CCDs from EG & G Reticon

    NASA Technical Reports Server (NTRS)

    Cizdziel, Philip J.

    1990-01-01

    The design and performance of three scientific grade CCDs are summarized: a 1200 x 400 astronomical array of 27 x 27 sq micron pixels, a 512 x 512 scientific array of 27 x 27 sq micron pixels and a 404 x 64 VNIR array of 52 x 52 sq micron pixels. Each of the arrays is fabricated using a four phase, double poly, buried n-channel, multi-pinned phase CCD process. Performance data for each sensor is presented.

  8. Evolution of Time Scales

    DTIC Science & Technology

    2006-12-01

    as the fundamental unit of time in the International System of Units. It was defined as (Metrologia, 1968) "the duration of 9 192 631 770 periods of...atomic time equivalent to the second of ET in principle. The Comité Consultatif pour la Définition de la Seconde (CCDS) of the CIPM recommended...with the definition of the second, the unit of time of the International System of Units" (Metrologia, 1971). The CCDS (BIPM Com. Cons. Déf. Seconde

  9. Centers for the commercial development of space

    NASA Technical Reports Server (NTRS)

    Walker, Susan E. (Editor)

    1989-01-01

    In 1985, NASA initiated an innovative effort called Centers for the Commercial Development of Space (CCDS). The CCDS program was designed to increase private-sector interest and investment in space-related activities, while encouraging U.S. economic leadership and stimulating advances in promising areas of research and development. Research conducted in the Centers handling the following areas is summarized: materials processing; life sciences; remote sensing; automation and robotics; space propulsion; space structures and materials; and space power.

  10. Charge shielding in the In-situ Storage Image Sensor for a vertex detector at the ILC

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Stefanov, K. D.; Bailey, D.; Banda, Y.; Buttar, C.; Cheplakov, A.; Cussans, D.; Damerell, C.; Devetak, E.; Fopma, J.; Foster, B.; Gao, R.; Gillman, A.; Goldstein, J.; Greenshaw, T.; Grimes, M.; Halsall, R.; Harder, K.; Hawes, B.; Hayrapetyan, K.; Heath, H.; Hillert, S.; Jackson, D.; Pinto Jayawardena, T.; Jeffery, B.; John, J.; Johnson, E.; Kundu, N.; Laing, A.; Lastovicka, T.; Lau, W.; Li, Y.; Lintern, A.; Lynch, C.; Mandry, S.; Martin, V.; Murray, P.; Nichols, A.; Nomerotski, A.; Page, R.; Parkes, C.; Perry, C.; O'Shea, V.; Sopczak, A.; Tabassam, H.; Thomas, S.; Tikkanen, T.; Velthuis, J.; Walsh, R.; Woolliscroft, T.; Worm, S.

    2009-08-01

    The Linear Collider Flavour Identification (LCFI) collaboration has successfully developed the first prototype of a novel particle detector, the In-situ Storage Image Sensor (ISIS). This device ideally suits the challenging requirements for the vertex detector at the future International Linear Collider (ILC), combining the charge-storing capabilities of Charge-Coupled Devices (CCDs) with the readout commonly used in CMOS imagers. The ISIS avoids the need for high-speed readout and offers low-power operation combined with low noise, high immunity to electromagnetic interference and increased radiation hardness compared to typical CCDs. The ISIS is one of the most promising detector technologies for vertexing at the ILC. In this paper we describe measurements of the charge-shielding properties of the p-well, which is used to protect the storage register from parasitic charge collection and is at the core of the device's operation. We show that the p-well can suppress parasitic charge collection by almost two orders of magnitude, satisfying the requirements for the application.

  11. DAMIC at SNOLAB

    DOE PAGES

    Chavarria, Alvaro E.; Tiffenberg, Javier; Aguilar-Arevalo, Alexis; ...

    2015-03-24

    We introduce the fully-depleted charge-coupled device (CCD) as a particle detector. We demonstrate its low energy threshold operation, capable of detecting ionizing energy depositions in a single pixel down to 50 eVee. We present results of energy calibrations from 0.3 keVee to 60 keVee, showing that the CCD is a fully active detector with uniform energy response throughout the silicon target, good resolution (Fano ~0.16), and remarkably linear response to electron energy depositions. We show the capability of the CCD to localize the depth of particle interactions within the silicon target. We discuss the mode of operation and unique imaging capabilities of the CCD, and how they may be exploited to characterize and suppress backgrounds. We present the first results from the deployment of 250 μm thick CCDs in SNOLAB, a prototype for the upcoming DAMIC100. DAMIC100 will have a target mass of 0.1 kg and should be able to directly test the CDMS-Si signal within a year of operation.

  12. Multiplexed Oversampling Digitizer in 65 nm CMOS for Column-Parallel CCD Readout

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grace, Carl; Walder, Jean-Pierre; von der Lippe, Henrik

    2012-04-10

    A digitizer designed to read out column-parallel charge-coupled devices (CCDs) used for high-speed X-ray imaging is presented. The digitizer is included as part of the High-Speed Image Preprocessor with Oversampling (HIPPO) integrated circuit. The digitizer module comprises a multiplexed, oversampling, 12-bit, 80 MS/s pipelined Analog-to-Digital Converter (ADC) and a bank of four fast-settling sample-and-hold amplifiers to instrument four analog channels. The ADC multiplexes and oversamples to reduce its area to allow integration that is pitch-matched to the columns of the CCD. Novel design techniques are used to enable oversampling and multiplexing with a reduced power penalty. The ADC exhibits 188 μV rms noise, which is less than 1 LSB at the 12-bit level. The prototype is implemented in a commercially available 65 nm CMOS process. The digitizer will lead to a proof-of-principle 2D 10 Gigapixel/s X-ray detector.

  13. Development of Camera Model and Geometric Calibration/validation of Xsat IRIS Imagery

    NASA Astrophysics Data System (ADS)

    Kwoh, L. K.; Huang, X.; Tan, W. J.

    2012-07-01

    XSAT, launched on 20 April 2011, is the first micro-satellite designed and built in Singapore. It orbits the Earth at an altitude of 822 km in a sun-synchronous orbit. The satellite carries a multispectral camera, IRIS, with three spectral bands - 0.52-0.60 μm for green, 0.63-0.69 μm for red and 0.76-0.89 μm for NIR - at 12 m resolution. In the design of the IRIS camera, the three bands are acquired by three lines of CCDs (NIR, red and green). These CCDs are physically separated in the focal plane and their first pixels are not absolutely aligned. The micro-satellite platform is also not stable enough to allow co-registration of the 3 bands with a simple linear transformation. In the camera model developed, this platform instability was compensated with 3rd- to 4th-order polynomials for the satellite's roll, pitch and yaw attitude angles. With the camera model, camera parameters such as the band-to-band separations, the alignment of the CCDs relative to each other, and the focal length of the camera can be validated or calibrated. The results of calibration with more than 20 images showed that the band-to-band along-track separation agreed well with the pre-flight values provided by the vendor (0.093° and 0.046° for the NIR vs. red and green vs. red CCDs, respectively). The cross-track alignments were 0.05 pixel and 5.9 pixel for the NIR vs. red and green vs. red CCDs, respectively. The focal length was found to be shorter by about 0.8%. This was attributed to the lower temperature at which XSAT is currently operating. With the calibrated parameters and the camera model, a geometric level 1 multispectral image with RPCs can be generated and, if required, orthorectified imagery can also be produced.
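
    Absorbing platform instability with low-order polynomials in the attitude angles, as described, is essentially a least-squares fit of each attitude residual against image line (i.e. time); a minimal sketch with a 4th-order fit and synthetic data:

      import numpy as np

      def fit_attitude(line_numbers, angle_residuals, degree=4):
          """Fit a 3rd/4th-order polynomial to an attitude-angle residual
          (roll, pitch or yaw) as a function of image line, as used to absorb
          platform instability in the camera model."""
          coeffs = np.polyfit(line_numbers, angle_residuals, deg=degree)
          return np.poly1d(coeffs)

      lines = np.arange(0, 4000, 100)
      roll_residual = 1e-4 * np.sin(lines / 1500.0)      # synthetic example data
      roll_model = fit_attitude(lines, roll_residual)    # callable: roll_model(line)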

  14. On-ground and in-orbit characterisation plan for the PLATO CCD normal cameras

    NASA Astrophysics Data System (ADS)

    Gow, J. P. D.; Walton, D.; Smith, A.; Hailey, M.; Curry, P.; Kennedy, T.

    2017-11-01

    PLAnetary Transits and Oscillations (PLATO) is the third European Space Agency (ESA) medium-class mission in ESA's Cosmic Vision programme, due for launch in 2026. PLATO will carry out high-precision, uninterrupted photometric monitoring in the visible band of large samples of bright solar-type stars. The primary mission goal is to detect and characterise terrestrial exoplanets and their systems, with emphasis on planets orbiting in the habitable zone; this will be achieved using light curves to detect planetary transits. PLATO uses a novel multi-instrument concept consisting of 26 small wide-field cameras. Each camera is made up of a telescope optical unit and four Teledyne e2v CCD270s mounted on a focal plane array and connected to a set of Front End Electronics (FEE) which provide CCD control and readout. There are 2 fast cameras with a high read-out cadence (2.5 s) for magnitude ~ 4-8 stars, being developed by the German Aerospace Centre, and 24 normal (N) cameras with a cadence of 25 s to monitor stars with a magnitude greater than 8. The N-FEEs are being developed at University College London's Mullard Space Science Laboratory (MSSL) and will be characterised along with the associated CCDs. The CCDs and N-FEEs will undergo rigorous on-ground characterisation and the performance of the CCDs will continue to be monitored in-orbit. This paper discusses the initial development of the experimental arrangement, test procedures and current status of the N-FEE. The parameters explored will include gain, quantum efficiency, pixel response non-uniformity, dark current and Charge Transfer Inefficiency (CTI). The current in-orbit characterisation plan is also discussed, which will enable the performance of the CCDs and their associated N-FEE to be monitored during the mission; this will include measurements of CTI giving an indication of the impact of radiation damage in the CCDs.

  15. Delta-Doped CCDs as Detector Arrays in Mass Spectrometers

    NASA Technical Reports Server (NTRS)

    Nikzad, Shouleh; Jones, Todd; Jewell, April; Sinha, Mahadeva

    2007-01-01

    In a conventional mass spectrometer, charged particles (ions) are dispersed through a magnetic sector onto an MCP at an output (focal) plane. In the MCP, the impinging charged particles excite electron cascades that afford signal gain. Electrons leaving the MCP can be read out by any of a variety of means; most commonly, they are post-accelerated onto a solid-state detector array, wherein the electron pulses are converted to photons, which, in turn, are converted to measurable electric-current pulses by photodetectors. Each step in the conversion from the impinging charged particles to the output current pulses reduces spatial resolution and increases noise, thereby reducing the overall sensitivity and performance of the mass spectrometer. Hence, it would be preferable to make a direct measurement of the spatial distribution of charged particles impinging on the focal plane. The utility of delta-doped CCDs as detectors of charged particles was reported in two articles in NASA Tech Briefs, Vol. 22, No. 7 (July 1998): "Delta-Doped CCDs as Low-Energy-Particle Detectors" (NPO-20178) on page 48 and "Delta-Doped CCDs for Measuring Energies of Positive Ions" (NPO-20253) on page 50. In the present developmental miniature mass spectrometers, the above-mentioned miniaturization and performance advantages contributed by the use of delta-doped CCDs are combined with the advantages afforded by the Mattauch-Herzog design. The Mattauch-Herzog design is a double-focusing spectrometer design involving an electric and a magnetic sector, in which the ions of different masses are spatially separated along the focal plane of the magnetic sector. A delta-doped CCD at the focal plane measures the signals of all the charged-particle species simultaneously at high sensitivity and high resolution, thereby nearly instantaneously providing a complete, high-quality mass spectrum. The simultaneous nature of the measurement of ions stands in contrast to that of a scanning mass spectrometer, in which abundances of different masses are measured at successive times.

  16. An ultrahigh-speed color video camera operating at 1,000,000 fps with 288 frame memories

    NASA Astrophysics Data System (ADS)

    Kitamura, K.; Arai, T.; Yonai, J.; Hayashida, T.; Kurita, T.; Maruyama, H.; Namiki, J.; Yanagi, T.; Yoshida, T.; van Kuijk, H.; Bosiers, Jan T.; Saita, A.; Kanayama, S.; Hatade, K.; Kitagawa, S.; Etoh, T. Goji

    2008-11-01

    We developed an ultrahigh-speed color video camera that operates at 1,000,000 fps (frames per second) and has the capacity to store 288 frame memories. In 2005, we developed an ultrahigh-speed, high-sensitivity portable color camera with a 300,000-pixel single CCD (ISIS-V4: In-situ Storage Image Sensor, Version 4). Its ultrahigh-speed shooting capability of 1,000,000 fps was made possible by directly connecting CCD storages, which record video images, to the photodiodes of individual pixels. The number of consecutive frames was 144. However, longer capture times were demanded when the camera was used during imaging experiments and for some television programs. To increase ultrahigh-speed capture times, we used a beam splitter and two ultrahigh-speed 300,000-pixel CCDs. The beam splitter was placed behind the pickup lens, with one CCD located at each of the two outputs of the beam splitter. A CCD driving unit was developed to drive the two CCDs separately, and the recording period of the two CCDs was switched sequentially. This increased the recording capacity to 288 images, a factor-of-two increase over that of the conventional ultrahigh-speed camera. A drawback of this arrangement was that the incident light on each CCD was reduced by a factor of two by the beam splitter. To improve the light sensitivity, we developed a microlens array for use with the ultrahigh-speed CCDs. We simulated the operation of the microlens array in order to optimize its shape and then fabricated it using stamping technology. Using this microlens array increased the light sensitivity of the CCDs by an approximate factor of two. By using a beam splitter in conjunction with the microlens array, it was possible to make an ultrahigh-speed color video camera that has 288 frame memories without decreasing the camera's light sensitivity.

  17. Scientific, Back-Illuminated CCD Development for the Transiting Exoplanet Survey Satellite

    NASA Technical Reports Server (NTRS)

    Suntharalingam, V.; Ciampi, J.; Cooper, M. J.; Lambert, R. D.; O'Mara, D. M.; Prigozhin, I.; Young, D. J.; Warner, K.; Burke, B. E.

    2015-01-01

    We describe the development of the fully depleted, back-illuminated charge-coupled devices for the Transiting Exoplanet Survey Satellite, which includes a set of four wide-angle telescopes, each having a 2x2 array of CCDs. The devices are fabricated on the newly upgraded 200-mm wafer line at Lincoln Laboratory. We discuss methods used to produce the devices and present early performance results from the 100-micron-thick, 15x15-micron-pixel, 2k x 4k frame-transfer CCDs.

  18. Performance characteristics of CCDs for the ACIS experiment. [Advanced X-ray Astrophysics Facility CCD Imaging Spectrometer]

    NASA Technical Reports Server (NTRS)

    Garmire, Gordon P.; Nousek, John; Burrows, David; Ricker, George; Bautz, Mark; Doty, John; Collins, Stewart; Janesick, James

    1988-01-01

    The search for the optimum CCD to be used at the focal surface of the Advanced X-ray Astrophysics Facility (AXAF) is described. The physics of the interaction of X-rays in silicon through the photoelectric effect is reviewed. CCD technology at the beginning of the AXAF definition phase is summarized, and the results of the CCD enhancement program are discussed. Other sources of optimum CCDs are examined, and CCD enhancements made at MIT Lincoln Laboratory are addressed.

  19. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.

  20. Charge Diffusion Variations in Pan-STARRS1 CCDs

    NASA Astrophysics Data System (ADS)

    Magnier, Eugene A.; Tonry, J. L.; Finkbeiner, D.; Schlafly, E.; Burgett, W. S.; Chambers, K. C.; Flewelling, H. A.; Hodapp, K. W.; Kaiser, N.; Kudritzki, R.-P.; Metcalfe, N.; Wainscoat, R. J.; Waters, C. Z.

    2018-06-01

    Thick back-illuminated deep-depletion CCDs have superior quantum efficiency over previous generations of thinned and traditional thick CCDs. As a result, they are being used for wide-field imaging cameras in several major projects. We use observations from the Pan-STARRS 3π survey to characterize the behavior of the deep-depletion devices used in the Pan-STARRS 1 Gigapixel Camera. We have identified systematic spatial variations in the photometric measurements and stellar profiles that are similar in pattern to the so-called “tree rings” identified in devices used by other wide-field cameras (e.g., DECam and Hypersuprime Camera). The tree-ring features identified in these other cameras result from lateral electric fields that displace the electrons as they are transported in the silicon to the pixel location. In contrast, we show that the photometric and morphological modifications observed in the GPC1 detectors are caused by variations in the vertical charge transportation rate and resulting charge diffusion variations.

  1. Silicon Based Schottky Barrier Infrared Sensors For Power System And Industrial Applications

    NASA Astrophysics Data System (ADS)

    Elabd, Hammam; Kosonocky, Walter F.

    1984-03-01

    Schottky-barrier infrared charge-coupled device sensors (IR-CCDs) have been developed. PtSi Schottky-barrier detectors require cooling to liquid-nitrogen temperature and cover the wavelength range between 1 and 6 μm. The PtSi IR-CCDs can be used in industrial thermography with NEΔT below 0.1°C. Pd2Si Schottky-barrier detectors require cooling to 145 K and cover the spectral range between 1 and 3.5 μm. Pd2Si IR-CCDs can be used in imaging high-temperature scenes with NEΔT around 100°C. Several high-density staring area and line imagers are available. Both interlaced and noninterlaced area imagers can be operated with variable and TV-compatible frame rates as well as various field-of-view angles. The advantages of silicon fabrication technology in terms of cost and high-density structures open the door to the design of special-purpose thermal camera systems for a number of power system and industrial applications.

  2. The GENCODE exome: sequencing the complete human exome

    PubMed Central

    Coffey, Alison J; Kokocinski, Felix; Calafato, Maria S; Scott, Carol E; Palta, Priit; Drury, Eleanor; Joyce, Christopher J; LeProust, Emily M; Harrow, Jen; Hunt, Sarah; Lehesjoki, Anna-Elina; Turner, Daniel J; Hubbard, Tim J; Palotie, Aarno

    2011-01-01

    Sequencing the coding regions, the exome, of the human genome is one of the major current strategies to identify low frequency and rare variants associated with human disease traits. So far, the most widely used commercial exome capture reagents have mainly targeted the consensus coding sequence (CCDS) database. We report the design of an extended set of targets for capturing the complete human exome, based on annotation from the GENCODE consortium. The extended set covers an additional 5594 genes and 10.3 Mb compared with the current CCDS-based sets. The additional regions include potential disease genes previously inaccessible to exome resequencing studies, such as 43 genes linked to ion channel activity and 70 genes linked to protein kinase activity. In total, the new GENCODE exome set developed here covers 47.9 Mb and performed well in sequence capture experiments. In the sample set used in this study, we identified over 5000 additional SNP variants (24% more) in the GENCODE exome target than in the CCDS-based exome sequencing. PMID:21364695

  3. Radiation effects on space-based stellar photometry: theoretical models and empirical results for CoRoT Space Telescope

    NASA Astrophysics Data System (ADS)

    Pinheiro da Silva, L.; Rolland, G.; Lapeyrere, V.; Auvergne, M.

    2008-03-01

    Convection, Rotation and planetary Transits (CoRoT) is a space mission dedicated to stellar seismology and the search for extrasolar planets. Both scientific programs are based on very high precision photometry and require long, uninterrupted observations. The instrument is based on an afocal telescope and a wide-field camera, consisting of four E2V-4280 CCD devices. This set is mounted on a recurrent platform for insertion in low Earth orbit. The CoRoT satellite has been recently launched for a nominal mission duration of three years. In this work, we discuss the impact of space radiation on CoRoT CCDs, in light of the in-flight characterization results obtained during the satellite's commissioning phase, as well as the very first observational data. We start by describing the population of trapped particles at the satellite altitude, and by presenting a theoretical prediction for the incoming radiation fluxes seen by the CCDs behind shielding. Empirical results regarding particle impact rates and their geographical distribution are then presented and discussed. The effect of particle impacts is also statistically characterized, with respect to the ionizing energy imparted to the CCDs and the size of impact trails. Based on these results, we discuss the effects of space radiation on precise and time-resolved stellar photometry from space. Finally, we present preliminary results concerning permanent radiation damage on CoRoT CCDs, as extrapolated from the data available at the beginning of the satellite's lifetime.

  4. University of Arizona High Energy Physics Program at the Cosmic Frontier 2014-2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    abate, alex; cheu, elliott

    This is the final technical report from the University of Arizona High Energy Physics program at the Cosmic Frontier covering the period 2014-2016. The work aims to advance the understanding of dark energy using the Large Synoptic Survey Telescope (LSST). Progress on the engineering design of the power supplies for the LSST camera is discussed. A variety of contributions to photometric redshift measurement uncertainties were studied. The effect of the intergalactic medium on the photometric redshift of very distant galaxies was evaluated. Computer code was developed realizing the full chain of calculations needed to accurately and efficiently run large-scale simulations.

  5. Photometric Redshift Calibration Strategy for WFIRST Cosmology

    NASA Astrophysics Data System (ADS)

    Hemmati, Shoubaneh; WFIRST, WFIRST-HLS-COSMOLOGY

    2018-01-01

    In order for WFIRST and other Stage IV dark energy experiments (e.g. LSST, Euclid) to infer cosmological parameters not limited by systematic errors, accurate redshift measurements are needed. This accuracy can only be met using spectroscopic subsamples to calibrate the full sample. In this poster, we employ the machine learning, SOM-based spectroscopic sampling technique developed in Masters et al. 2015, using the empirical color-redshift relation among galaxies to find the minimum number of spectra required for the WFIRST weak lensing calibration. We use galaxies from the CANDELS survey to build the LSST+WFIRST lensing analog sample of ~36k objects and train the LSST+WFIRST SOM. We show that 26% of the WFIRST lensing sample consists of sources fainter than the Euclid depth in the optical, 91% of which live in color cells already occupied by brighter galaxies. We demonstrate the similarity between faint and bright galaxies as well as the feasibility of redshift measurements at different brightness levels. However, 4% of SOM cells are occupied only by faint galaxies, for which we recommend extra spectroscopy of ~200 new sources. Acquiring the spectra of these sources will enable the comprehensive calibration of the WFIRST color-redshift relation.
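
    A minimal sketch of the color-cell bookkeeping described above, assuming the third-party minisom package and purely synthetic galaxy colors. Cell occupancy by bright versus faint sources then reduces to mapping each object to its best-matching SOM unit; the cell counts and fractions here are illustrative only.

        import numpy as np
        from minisom import MiniSom   # assumed third-party package

        rng = np.random.default_rng(0)
        colors = rng.normal(size=(5000, 8))          # synthetic 8-color photometry
        is_faint = rng.random(5000) < 0.26           # mimic the quoted 26% faint fraction

        som = MiniSom(20, 20, colors.shape[1], sigma=1.0, learning_rate=0.5,
                      random_seed=0)
        som.train_random(colors, 10000)

        # Which SOM cells are occupied only by faint sources?
        bright_cells = {som.winner(c) for c in colors[~is_faint]}
        faint_only = {som.winner(c) for c in colors[is_faint]} - bright_cells
        print(len(faint_only), "cells occupied only by faint galaxies")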

  6. New Self-lensing Models of the Small Magellanic Cloud: Can Gravitational Microlensing Detect Extragalactic Exoplanets?

    NASA Astrophysics Data System (ADS)

    Mróz, Przemek; Poleski, Radosław

    2018-04-01

    We use three-dimensional distributions of classical Cepheids and RR Lyrae stars in the Small Magellanic Cloud (SMC) to model the stellar density distribution of a young and old stellar population in that galaxy. We use these models to estimate the microlensing self-lensing optical depth to the SMC, which is in excellent agreement with the observations. Our models are consistent with the total stellar mass of the SMC of about 1.0 × 10⁹ M⊙ under the assumption that all microlensing events toward this galaxy are caused by self-lensing. We also calculate the expected event rates and estimate that future large-scale surveys, like the Large Synoptic Survey Telescope (LSST), will be able to detect up to a few dozen microlensing events in the SMC annually. If the planet frequency in the SMC is similar to that in the Milky Way, a few extragalactic planets can be detected over the course of the LSST survey, provided significant changes in the SMC observing strategy are devised. A relatively small investment of LSST resources can give us a unique probe of the population of extragalactic exoplanets.

  7. Commissioning of the new GMOS-N Hamamatsu CCDs

    NASA Astrophysics Data System (ADS)

    Scharwaechter, Julia; Chiboucas, Kristin; Gimeno, German; Boucher, Luc; White, John; Tapia, Eduardo; Lundquist, Michael; Rippa, Mathew; Labrie, Kathleen; Murowinski, Richard; Lazo, Manuel; Miller, Jennifer

    2017-06-01

    We report on the commissioning of the new Hamamatsu fully-depleted CCDs for GMOS-N, installed in the instrument in February 2017. The Hamamatsu detectors replace the e2v deep depletion devices which had been in operation since 2011. The new GMOS-N detector array is expected to provide improved red sensitivity compared to the e2v devices at wavelengths longer than ~900 nm. The commissioning of the new detector array for GMOS-N marks the final step in upgrading the two GMOS instruments at Gemini North and South with Hamamatsu detectors.

  8. Support and Assessment for Fall Emergency Referrals (SAFER 1): Cluster Randomised Trial of Computerised Clinical Decision Support for Paramedics

    PubMed Central

    Snooks, Helen Anne; Carter, Ben; Dale, Jeremy; Foster, Theresa; Humphreys, Ioan; Logan, Philippa Anne; Lyons, Ronan Anthony; Mason, Suzanne Margaret; Phillips, Ceri James; Sanchez, Antonio; Wani, Mushtaq; Watkins, Alan; Wells, Bridget Elizabeth; Whitfield, Richard; Russell, Ian Trevor

    2014-01-01

    Objective: To evaluate effectiveness, safety and cost-effectiveness of Computerised Clinical Decision Support (CCDS) for paramedics attending older people who fall. Design: Cluster trial randomised by paramedic; modelling. Setting: 13 ambulance stations in two UK emergency ambulance services. Participants: 42 of 409 eligible paramedics, who attended 779 older patients for a reported fall. Interventions: Intervention paramedics received CCDS on tablet computers to guide patient care. Control paramedics provided care as usual. One service had already installed electronic data capture. Main Outcome Measures: Effectiveness: patients referred to falls service, patient-reported quality of life and satisfaction, processes of care. Safety: further emergency contacts or death within one month. Cost-effectiveness: costs and quality of life. We used findings from the published Community Falls Prevention Trial to model cost-effectiveness. Results: 17 intervention paramedics used CCDS for 54 (12.4%) of 436 participants. They referred 42 (9.6%) to falls services, compared with 17 (5.0%) of 343 participants seen by 19 control paramedics [odds ratio (OR) 2.04, 95% CI 1.12 to 3.72]. No adverse events were related to the intervention. Non-significant differences between groups included: subsequent emergency contacts (34.6% versus 29.1%; OR 1.27, 95% CI 0.93 to 1.72); quality of life (mean SF12 differences: MCS −0.74, 95% CI −2.83 to +1.28; PCS −0.13, 95% CI −1.65 to +1.39) and non-conveyance (42.0% versus 36.7%; OR 1.13, 95% CI 0.84 to 1.52). However, ambulance job cycle time was 8.9 minutes longer for intervention patients (95% CI 2.3 to 15.3). Average net cost of implementing CCDS was £208 per patient with existing electronic data capture, and £308 without. Modelling estimated the cost per quality-adjusted life-year at £15,000 with existing electronic data capture, and £22,200 without. Conclusions: Intervention paramedics referred twice as many participants to falls services with no difference in safety. CCDS is potentially cost-effective, especially with existing electronic data capture. Trial Registration: ISRCTN Register ISRCTN10538608. PMID:25216281

  9. Fabrication of Robust, Flat, Thinned, UV-Imaging CCDs

    NASA Technical Reports Server (NTRS)

    Grunthaner, Paula; Elliott, Stythe; Jones, Todd; Nikzad, Shouleh

    2004-01-01

    An improved process that includes a high-temperature bonding subprocess has been developed to enable the fabrication of robust, flat, silicon-based charge-coupled devices (CCDs) for imaging in ultraviolet (UV) light and/or for detecting low-energy charged particles. The CCDs in question are devices on which CCD circuitry has already been formed and have been thinned for back-surface illumination. These CCDs may be delta doped, and aspects of this type of CCD have been described in several prior articles in NASA Tech Briefs. Unlike prior low-temperature bonding subprocesses based on the use of epoxies or waxes, the high-temperature bonding subprocess is compatible with the delta-doping process as well as with other CCD-fabrication processes. The present improved process and its bonding, thinning, and delta-doping subprocesses are characterized as post-fabrication processes because they are undertaken after the fabrication of CCD circuitry on the front side of a full-thickness silicon substrate. In a typical case, it is necessary to reduce the thickness of the CCD to between 10 and 20 μm in order to take advantage of back-side illumination and in order to perform delta doping and/or other back-side treatment to enhance the quantum efficiency. In the prior approach to the fabrication of back-side-illuminated CCDs, the thinning subprocess turned each CCD into a free-standing membrane that was fragile and tended to become wrinkled. In the present improved process, prior to thinning and delta doping, a CCD is bonded on its front side to a silicon substrate that has been prefabricated to include cutouts to accommodate subsequent electrical connections to bonding pads on the CCD circuitry. The substrate provides structural support to increase ruggedness and maintain flatness. At the beginning of this process, the back side of a CCD as fabricated on a full-thickness substrate is polished. Silicon nitride is deposited on the back side, opposite the bonding pads on the front side, in order to define a relatively thick frame. The portion of the CCD not covered by the frame is the portion to be thinned by etching.

  10. The superiority of L3-CCDs in the high-flux and wide dynamic range regimes

    NASA Astrophysics Data System (ADS)

    Butler, Raymond F.; Sheehan, Brendan J.

    2008-02-01

    Low Light Level CCD (L3-CCD) cameras have received much attention for high-cadence astronomical imaging applications. Efforts to date have concentrated on exploiting them in two scenarios: post-exposure image sharpening and "lucky imaging", and rapid variability in astrophysically interesting sources. We demonstrate their marked superiority in a third distinct scenario: observing in the high-flux and wide dynamic range regimes. We realized that the unique features of L3-CCDs would make them ideal for maximizing signal-to-noise in observations of bright objects (whether variable or not), and for high dynamic range scenarios such as faint targets embedded in a crowded field of bright objects. Conventional CCDs have drawbacks in such regimes due to a poor duty cycle: the combination of short exposure times (for time-series sampling or to avoid saturation) and extended readout times (for minimizing readout noise). For different telescope sizes, we use detailed models to show that a range of conventional imaging systems are photometrically out-performed across a wide range of object brightness, once the operational parameters of the L3-CCD are carefully set. The cross-over fluxes, above which the L3-CCD is operationally superior, are surprisingly faint, even for modest telescope apertures. We also show that the use of L3-CCDs is the optimum strategy for minimizing atmospheric scintillation noise in photometric observations employing a given telescope aperture. This is particularly significant, since scintillation can be the largest source of error in time-series photometry. These results should prompt a new direction in developing imaging instrumentation solutions for observatories.
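
    The photometric comparison described above rests on two standard per-exposure signal-to-noise expressions: a conventional CCD with read noise R, and an electron-multiplying (L3) CCD whose gain suppresses the effective read noise at the cost of a factor-of-two excess-noise variance. The sketch below is a simplified version of that comparison, not the authors' detailed duty-cycle model; the sky, read-noise and gain values are placeholders.

        import numpy as np

        def snr_conventional(src, sky, read_noise):
            """Per-exposure SNR for a conventional CCD (all quantities in electrons)."""
            return src / np.sqrt(src + sky + read_noise**2)

        def snr_l3ccd(src, sky, read_noise, gain):
            """Per-exposure SNR for an L3-CCD: multiplication gain divides the
            effective read noise, but stochastic gain doubles the shot-noise variance."""
            return src / np.sqrt(2.0*(src + sky) + (read_noise/gain)**2)

        src = np.logspace(0, 5, 6)          # source counts per exposure
        print(snr_conventional(src, sky=20.0, read_noise=8.0))
        print(snr_l3ccd(src, sky=20.0, read_noise=8.0, gain=1000.0))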

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilar-Arevalo, A.

    We present measurements of radioactive contamination in the high-resistivity silicon charge-coupled devices (CCDs) used by the DAMIC experiment to search for dark matter particles. Novel analysis methods, which exploit the unique spatial resolution of CCDs, were developed to identify α and β particles. Uranium and thorium contamination in the CCD bulk was measured through α spectroscopy, with an upper limit on the 238U (232Th) decay rate of 5 (15) kg⁻¹ d⁻¹ at 95% CL. We also searched for pairs of spatially correlated electron tracks separated in time by up to tens of days, as expected from 32Si–32P or 210Pb–210Bi sequences of β decays. The decay rate of 32Si was found to be 80 (+110, −65) kg⁻¹ d⁻¹ (95% CI). An upper limit of ~35 kg⁻¹ d⁻¹ (95% CL) on the 210Pb decay rate was obtained independently by α spectroscopy and the β-decay sequence search. Furthermore, these levels of radioactive contamination are sufficiently low for the successful operation of CCDs in the forthcoming 100 g DAMIC detector.

  12. VizieR Online Data Catalog: Coordinates and photometry of stars in Haffner 16 (Davidge, 2017)

    NASA Astrophysics Data System (ADS)

    Davidge, T. J.

    2017-11-01

    The images and spectra that are the basis of this study were recorded with the Gemini Multi-Object Spectrograph (GMOS) on Gemini South as part of program GS-2014A-Q-84 (PI: Davidge). GMOS is the facility visible-light imager and spectrograph. The detector at the time (the CCDs that make up the GMOS detector have since been replaced) was a mosaic of three 2048×4608 EEV CCDs. Each 13.5 μm square pixel subtended 0.073 arcsec on the sky. The three CCDs covered an area larger than that illuminated by the sky, so that spectra could be dispersed outside of the sky field. The images and spectra were both recorded with 2×2 pixel binning. The g' (FWHM=0.55) and i' (FWHM=0.45) images of Haffner 16 were recorded on the night of 2013 December 31. The GMOS spectra were recorded during five nights in 2014 March (Mar 19, 27, and 30) and April (Apr 2 and 3). The spectra were dispersed with the R400 grating (λblaze=7640 Å, 400 lines/mm). (1 data file).

  13. An HMM model for coiled-coil domains and a comparison with PSSM-based predictions.

    PubMed

    Delorenzi, Mauro; Speed, Terry

    2002-04-01

    Large-scale sequence data require methods for the automated annotation of protein domains. Many of the predictive methods are based either on a Position Specific Scoring Matrix (PSSM) of fixed length or on a window-less Hidden Markov Model (HMM). The performance of the two approaches is tested for Coiled-Coil Domains (CCDs). The prediction of CCDs is used frequently, and its optimization seems worthwhile. We have conceived MARCOIL, an HMM for the recognition of proteins with a CCD on a genomic scale. A cross-validated study suggests that MARCOIL improves predictions compared to the traditional PSSM algorithm, especially for some protein families and for short CCDs. The study was designed to reveal differences inherent in the two methods. Potential confounding factors such as differences in the dimension of parameter space and in the parameter values were avoided by using the same amino acid propensities and by keeping the transition probabilities of the HMM constant during cross-validation. The prediction program and the databases are available at http://www.wehi.edu.au/bioweb/Mauro/Marcoil

  14. Fast force actuators for LSST primary/tertiary mirror

    NASA Astrophysics Data System (ADS)

    Hileman, Edward; Warner, Michael; Wiecha, Oliver

    2010-07-01

    The very short slew times and resulting high inertial loads imposed upon the Large Synoptic Survey Telescope (LSST) create new challenges for the primary mirror support actuators. Traditionally, large borosilicate mirrors are supported by pneumatic systems, which is also the case for the LSST. These force-based actuators bear the weight of the mirror and provide active figure correction, but do not define the mirror position. A set of six locating actuators (hardpoints) arranged in a hexapod fashion serve to locate the mirror. The stringent dynamic requirements demand that the force actuators be able to compensate in real time for dynamic forces on the hardpoints during slewing, to prevent excessive hardpoint loads. The support actuators must also maintain the prescribed forces accurately during tracking to maintain acceptable mirror figure. To meet these requirements, candidate pneumatic cylinders incorporating force feedback control and high-speed servo valves are being tested using custom instrumentation with automatic data recording. Comparative charts are produced showing details of friction, hysteresis cycles, operating bandwidth, and temperature dependency. Extremely low power actuator controllers are being developed to avoid heat dissipation in critical portions of the mirror and also to allow for increased control capabilities at the actuator level, thus improving safety, performance, and the flexibility of the support system.

  15. The Search for Transients and Variables in the LSST Pathfinder Survey

    NASA Astrophysics Data System (ADS)

    Gorsuch, Mary Katherine; Kotulla, Ralf

    2018-01-01

    This research was completed during participation in the NSF-REU program at the University of Wisconsin-Madison. Two fields of a few square degrees, close to the galactic plane, were imaged on the WIYN 3.5 meter telescope during the commissioning of the One Degree Imager (ODI) focal plane. These images were taken with repeated, shorter exposures in order to model an LSST-like cadence. This data was taken in order to identify transient and variable light sources. This was done by using Source Extractor to generate a catalog of all sources in each exposure, and inserting this data into a larger photometry database composed of all exposures for each field. A Python code was developed to analyze the data and isolate sources of interest from a large data set. We found that there were some discrepancies in the data, which led to some interesting results that we are looking into further. Variable and transient sources, while relatively well understood, are not numerous in current cataloging systems. This will be a major undertaking of the Large Synoptic Survey Telescope (LSST), to which this project is a precursor. Locating these sources may give us a better understanding of where these sources are located and how they impact their surroundings.
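
    The per-exposure catalog step described above (Source Extractor detections pooled into a photometry database) can be approximated in Python with the sep package, which exposes the core Source Extractor algorithms. The sketch below runs the detection on a synthetic image; the threshold, background model and source parameters are placeholders, not the project's actual configuration.

        import numpy as np
        import sep   # assumed third-party package implementing Source Extractor algorithms

        # Build a small synthetic exposure: flat sky noise plus a few point sources.
        rng = np.random.default_rng(0)
        data = rng.normal(100.0, 5.0, size=(256, 256)).astype(np.float32)
        yy, xx = np.mgrid[0:256, 0:256]
        for x0, y0, flux in [(60, 80, 2000.0), (180, 40, 1500.0), (128, 200, 3000.0)]:
            data += flux/(2*np.pi*4.0) * np.exp(-((xx-x0)**2 + (yy-y0)**2)/(2*2.0**2))

        # Source Extractor-style detection: background model, then thresholded extraction.
        bkg = sep.Background(data)                      # spatially varying background
        objects = sep.extract(data - bkg, 1.5, err=bkg.globalrms)
        print(len(objects), "sources; e.g. x positions:", objects["x"][:3])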

  16. Machine-learning-based Brokers for Real-time Classification of the LSST Alert Stream

    NASA Astrophysics Data System (ADS)

    Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika D.; Wang, Zhe; Lochner, Michelle; Matheson, Thomas; Saha, Abhijit; Yang, Shuo; Zhao, Zhenge; Kececioglu, John; Scheidegger, Carlos; Snodgrass, Richard T.; Axelrod, Tim; Jenness, Tim; Maier, Robert S.; Ridgway, Stephen T.; Seaman, Robert L.; Evans, Eric Michael; Singh, Navdeep; Taylor, Clark; Toeniskoetter, Jackson; Welch, Eric; Zhu, Songzhe; The ANTARES Collaboration

    2018-05-01

    The unprecedented volume and rate of transient events that will be discovered by the Large Synoptic Survey Telescope (LSST) demand that the astronomical community update its follow-up paradigm. Alert-brokers—automated software systems to sift through, characterize, annotate, and prioritize events for follow-up—will be critical tools for managing alert streams in the LSST era. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is one such broker. In this work, we develop a machine learning pipeline to characterize and classify variable and transient sources only using the available multiband optical photometry. We describe three illustrative stages of the pipeline, serving the three goals of early, intermediate, and retrospective classification of alerts. The first takes the form of variable versus transient categorization, the second a multiclass typing of the combined variable and transient data set, and the third a purity-driven subtyping of a transient class. Although several similar algorithms have proven themselves in simulations, we validate their performance on real observations for the first time. We quantitatively evaluate our pipeline on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, and demonstrate very competitive classification performance. We describe our progress toward adapting the pipeline developed in this work into a real-time broker working on live alert streams from time-domain surveys.
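
    As an illustration of the first pipeline stage (variable versus transient categorization), the sketch below trains a random-forest classifier on hypothetical light-curve summary features. The feature names, labels and data are invented; the ANTARES pipeline's actual features and models are those described in the paper itself.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        n = 2000
        # Hypothetical per-object features, e.g. amplitude, magnitude skewness,
        # fraction of epochs brighter than the median, and a color term.
        X = rng.normal(size=(n, 4))
        y = (X[:, 0] + 0.5*X[:, 2] + rng.normal(scale=0.5, size=n) > 0).astype(int)
        # y = 1 -> "transient", y = 0 -> "variable" (labels are synthetic here)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        scores = cross_val_score(clf, X, y, cv=5)
        print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))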

  17. Charge coupled devices vs. microchannel plates in the extreme and far ultraviolet - A comparison based on the latest laboratory measurements

    NASA Technical Reports Server (NTRS)

    Vallerga, J.; Lampton, M.

    1988-01-01

    While microchannel plates (MCPs) have been established as imaging photon counters in the EUV and FUV for some years, CCDs are associated with low light level sensing at visible and near-IR wavelengths. Attention is presently given to recent proposals for CCDs' use as EUV and FUV detectors with quantum efficiencies sometimes exceeding those of MCPs; quantum resolution, format size, dynamic range, and long-term stability are also used as bases of comparison, for the cases of both space-based astronomical and spectroscopic applications.

  18. Delta-Doped Back-Illuminated CMOS Imaging Arrays: Progress and Prospects

    NASA Technical Reports Server (NTRS)

    Hoenk, Michael E.; Jones, Todd J.; Dickie, Matthew R.; Greer, Frank; Cunningham, Thomas J.; Blazejewski, Edward; Nikzad, Shouleh

    2009-01-01

    In this paper, we report the latest results on our development of delta-doped, thinned, back-illuminated CMOS imaging arrays. As with charge-coupled devices, thinning and back-illumination are essential to the development of high performance CMOS imaging arrays. Problems with back surface passivation have emerged as critical to the prospects for incorporating CMOS imaging arrays into high performance scientific instruments, just as they did for CCDs over twenty years ago. In the early 1990's, JPL developed delta-doped CCDs, in which low temperature molecular beam epitaxy was used to form an ideal passivation layer on the silicon back surface. Comprising only a few nanometers of highly-doped epitaxial silicon, delta-doping achieves the stability and uniformity that are essential for high performance imaging and spectroscopy. Delta-doped CCDs were shown to have high, stable, and uniform quantum efficiency across the entire spectral range from the extreme ultraviolet through the near infrared. JPL has recently bump-bonded thinned, delta-doped CMOS imaging arrays to a CMOS readout, and demonstrated imaging. Delta-doped CMOS devices exhibit the high quantum efficiency that has become the standard for scientific-grade CCDs. Together with new circuit designs for low-noise readout currently under development, delta-doping expands the potential scientific applications of CMOS imaging arrays, and brings within reach important new capabilities, such as fast, high-sensitivity imaging with parallel readout and real-time signal processing. It remains to demonstrate manufacturability of delta-doped CMOS imaging arrays. To that end, JPL has acquired a new silicon MBE and ancillary equipment for delta-doping wafers up to 200mm in diameter, and is now developing processes for high-throughput, high yield delta-doping of fully-processed wafers with CCD and CMOS imaging devices.

  19. Modeling the spectral response for the soft X-ray imager onboard the ASTRO-H satellite

    NASA Astrophysics Data System (ADS)

    Inoue, Shota; Hayashida, Kiyoshi; Katada, Shuhei; Nakajima, Hiroshi; Nagino, Ryo; Anabuki, Naohisa; Tsunemi, Hiroshi; Tsuru, Takeshi Go; Tanaka, Takaaki; Uchida, Hiroyuki; Nobukawa, Masayoshi; Nobukawa, Kumiko Kawabata; Washino, Ryosaku; Mori, Koji; Isoda, Eri; Sakata, Miho; Kohmura, Takayoshi; Tamasawa, Koki; Tanno, Shoma; Yoshino, Yuma; Konno, Takahiro; Ueda, Shutaro; ASTRO-H/SXI Team

    2016-09-01

    The ASTRO-H satellite is the 6th Japanese X-ray astronomical observatory, to be launched in early 2016. The satellite carries four kinds of detectors, one of which is an X-ray CCD camera, the Soft X-ray Imager (SXI), installed on the focal plane of an X-ray telescope. The SXI contains four CCD chips, each with an imaging area of 31 mm × 31 mm, arrayed in a mosaic covering a field of view of 38′ × 38′, the widest ever flown in orbit. The CCDs are a P-channel, back-illuminated (BI) type with a depletion-layer thickness of 200 μm. We operate the CCDs in a photon-counting mode in which the position and energy of each photon are measured in the energy band of 0.4-12 keV. To evaluate the X-ray spectra obtained with the SXI, an accurate calibration of its response function is essential. For this purpose, we performed calibration experiments at Kyoto and at the Photon Factory of KEK, each with different X-ray sources at various X-ray energies. We fit the obtained spectra with 5 components: primary peak, secondary peak, constant tail, Si escape and Si fluorescence, and then model their energy dependence using physics-based or empirical formulae. Since this is the first adoption of P-channel BI-type CCDs on an X-ray astronomical satellite, we need to take special care with the constant tail component, which originates from partial charge collection. It is found that we need to assume a trapping layer at the incident surface of the CCD and implement it in the response model. In addition, the Si fluorescence component of the SXI response is significantly weaker than those of front-illuminated CCDs.
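
    A toy version of the five-component line-response model mentioned above (primary peak, secondary peak, constant tail, Si escape and Si fluorescence) can be written as a sum of Gaussians plus a low-energy shelf. The component fractions and widths below are invented for illustration only and are not the calibrated SXI values.

        import numpy as np

        def gauss(e, center, sigma):
            return np.exp(-0.5*((e - center)/sigma)**2) / (sigma*np.sqrt(2.0*np.pi))

        def line_response(e, line_kev, sigma=0.07,
                          f_secondary=0.05, f_tail=0.02, f_escape=0.01, f_fluor=0.005):
            """Toy detector response to a monochromatic line at line_kev (keV).
            Component fractions are illustrative placeholders, not SXI calibration."""
            si_k = 1.740                                       # Si K-alpha energy, keV
            r = gauss(e, line_kev, sigma)                      # primary peak
            r += f_secondary * gauss(e, line_kev - 0.1, sigma) # secondary peak
            r += f_tail * (e < line_kev) / max(line_kev, 1e-6) # constant low-energy tail
            if line_kev > si_k:
                r += f_escape * gauss(e, line_kev - si_k, sigma)   # Si escape peak
            r += f_fluor * gauss(e, si_k, sigma)                   # Si fluorescence
            return r

        e_grid = np.linspace(0.2, 8.0, 2000)
        spectrum = line_response(e_grid, line_kev=5.9)         # e.g. Mn K-alpha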

  20. Implementation of a 4x8 NIR and CCD Mosaic Focal Plane Technology

    NASA Astrophysics Data System (ADS)

    Jelinsky, Patrick; Bebek, C. J.; Besuner, R. W.; Haller, G. M.; Harris, S. E.; Hart, P. A.; Heetderks, H. D.; Levi, M. E.; Maldonado, S. E.; Roe, N. A.; Roodman, A. J.; Sapozhnikov, L.

    2011-01-01

    Mission concepts for NASA's Wide Field Infrared Survey Telescope (WFIRST), ESA's EUCLID mission, as well as for ground based observations, have requirements for large mosaic focal planes to image visible and near infrared (NIR) wavelengths. We have developed detectors, readout electronics and focal plane design techniques that can be used to create very large scalable focal plane mosaic cameras. In our technology, CCDs and HgCdTe detectors can be intermingled on a single, silicon carbide (SiC) cold plate. This enables optimized, wideband observing strategies. The CCDs, developed at Lawrence Berkeley National Laboratory, are fully-depleted, p-channel devices that are backside illuminated capable of operating at temperatures as low as 110K and have been optimized for the weak lensing dark energy technique. The NIR detectors are 1.7µm and 2.0µm wavelength cutoff H2RG® HgCdTe, manufactured by Teledyne Imaging Sensors under contract to LBL. Both the CCDs and NIR detectors are packaged on 4-side abuttable SiC pedestals with a common mounting footprint supporting a 44.16mm mosaic pitch and are coplanar. Both types of detectors have direct-attached, readout electronics that convert the detector signal directly to serial, digital data streams and allow a flexible, low cost data acquisition strategy, despite the large data volume. A mosaic of these detectors can be operated at a common temperature that achieves the required dark current and read noise performance in both types of detectors necessary for dark energy observations. We report here the design and integration for a focal plane designed to accommodate a 4x8 heterogeneous array of CCDs and HgCdTe detectors. Our current implementation contains over 1/4-billion pixels.

  1. Present and future CCDs for UV and X-ray scientific measurements

    NASA Technical Reports Server (NTRS)

    Janesick, J. R.; Elliott, S. T.; Mccarthy, J. K.; Marsh, H. H.; Collins, S. A.; Blouke, M. M.

    1985-01-01

    Interacting quantum efficiencies in excess of 50 percent have been demonstrated with CCDs throughout the spectral range 600-9,000 Å, and comparable sensitivity is expected to continue to wavelengths as short as a few Angstroms. Nondispersive X-ray spectra throughout the 250-8000 eV range have been obtained with an FWHM spectral resolution of 200-250 eV. At present, however, both spectral and spatial resolution are limited at some energies by the diffusion of photogenerated charge into more than one picture element. Progress in reducing charge diffusion is reported, with particular attention given to a theoretical diffusion model and its implications for further improvement.

  2. Measurement of radioactive contamination in the CCD’s of the DAMIC experiment

    NASA Astrophysics Data System (ADS)

    Aguilar-Arevalo, A.; Amidei, D.; Bertou, X.; Bole, D.; Butner, M.; Cancelo, G.; Castañeda Vásquez, A.; Chavarria, A. E.; de Mello Neto, J. R. T.; Dixon, S.; D'Olivo, J. C.; Estrada, J.; Fernandez Moroni, G.; Hernández Torres, K. P.; Izraelevitch, F.; Kavner, A.; Kilminster, B.; Lawson, I.; Liao, J.; López, M.; Molina, J.; Moreno-Granados, G.; Pena, J.; Privitera, P.; Sarkis, Y.; Scarpine, V.; Schwarz, T.; Sofo Haro, M.; Tiffenberg, J.; Torres Machado, D.; Trillaud, F.; Yol, X.; Zhou, J.

    2016-05-01

    DAMIC (Dark Matter in CCDs) is an experiment searching for dark matter particles employing fully-depleted charge-coupled devices. Using the bulk silicon that composes the detector as a target, we expect to observe coherent WIMP-nucleus elastic scattering. Although located in the SNOLAB laboratory, 2 km below the surface, the CCDs are not completely free of radioactive contamination, in particular from radon daughters or from the detector itself. We present novel techniques for the measurement of the radioactive contamination in the bulk silicon and on the surface of the DAMIC CCDs. Limits on the uranium and thorium contamination, as well as on the cosmogenic isotope 32Si intrinsically present in the detector, were obtained. We have obtained upper limits on the 238U (232Th) decay rate of 5 (15) kg⁻¹ d⁻¹ at 95% CL. Pairs of spatially correlated electron tracks expected from 32Si-32P and 210Pb-210Bi beta decays were also measured. We have found a decay rate of 80 (+110, −65) kg⁻¹ d⁻¹ for 32Si and an upper limit of ~35 kg⁻¹ d⁻¹ for 210Pb, both at 95% CL.

  3. Image analysis of single event transient effects on charge coupled devices irradiated by protons

    NASA Astrophysics Data System (ADS)

    Wang, Zujun; Xue, Yuanyuan; Liu, Jing; He, Baoping; Yao, Zhibin; Ma, Wuying

    2016-10-01

    Experiments on single event transient (SET) effects in charge coupled devices (CCDs) irradiated by protons are presented. The radiation experiments were carried out at an accelerator with proton energies of 200 MeV and 60 MeV. The incident angles of the protons were 30° and 90° to the plane of the CCDs, to obtain images for inclined and perpendicular incidence. The experimental results show that the typical characteristic of the SET effects in a CCD induced by protons is the generation of a large number of dark signal spikes (hot pixels) which are randomly distributed in the "pepper" images. The characteristics of the SET effects were investigated by observing the same imaging area at different times during proton irradiation to verify the transient effects. The experimental results also show that the number of dark signal spikes increases with increasing integration time during proton irradiation. The CCDs were tested on-line and off-line to distinguish the radiation damage induced by SET effects from that induced by displacement damage (DD) effects. The mechanisms of dark signal spike generation induced by the SET effects and the DD effects are demonstrated respectively.

  4. Single-silicon CCD-CMOS platform for multi-spectral detection from terahertz to x-rays.

    PubMed

    Shalaby, Mostafa; Vicario, Carlo; Hauri, Christoph P

    2017-11-15

    Charge-coupled devices (CCDs) are a well-established imaging technology in the visible and x-ray frequency ranges. However, the small quantum photon energies of terahertz radiation have hindered the use of this mature semiconductor technological platform in this frequency range, leaving terahertz imaging totally dependent on low-resolution bolometer technologies. Recently, it has been shown that silicon CCDs can detect terahertz photons at a high field, but the detection sensitivity is limited. Here we show that silicon, complementary metal-oxide-semiconductor (CMOS) technology offers enhanced detection sensitivity of almost two orders of magnitude, compared to CCDs. Our findings allow us to extend the low-frequency terahertz cutoff to less than 2 THz, nearly closing the technological gap with electronic imagers operating up to 1 THz. Furthermore, with the silicon CCD/CMOS technology being sensitive to mid-infrared (mid-IR) and the x-ray ranges, we introduce silicon as a single detector platform from 1 EHz to 2 THz. This overcomes the present challenge in spatially overlapping a terahertz/mid-IR pump and x-ray probe radiation at facilities such as free electron lasers, synchrotron, and laser-based x-ray sources.

  5. Evaluation of arteriovenous fistulas and pseudoaneurysms in renal allografts following percutaneous needle biopsy. Color-coded Doppler sonography versus duplex Doppler sonography.

    PubMed

    Hübsch, P J; Mostbeck, G; Barton, P P; Gritzmann, N; Fruehwald, F X; Schurawitzki, H; Kovarik, J

    1990-02-01

    One hundred one patients with renal allografts were studied by two independent observers using duplex Doppler sonography (DDS) and color-coded Doppler sonography (CCDS). In all patients, single or multiple percutaneous needle biopsies of the transplant had been performed 1 to 30 days before. In 6 patients CCDS following the biopsy demonstrated an area of combined red and blue color-coded blood flow within the renal parenchyma (n = 5) or within the sinus (n = 1); the Doppler waveform was abnormal in these areas with signals above and below the zero line indicating turbulent blood flow. Consecutive intraarterial digital subtraction angiography (DSA) revealed the presence of an arteriovenous fistula (n = 4) or of a pseudoaneurysm (n = 2). In one patient, gross hematuria with obstruction of the bladder occurred as a complication of a pseudoaneurysm within the renal sinus; the bleeding could not be stopped by embolization of the lesion and the kidney had to be removed. DDS demonstrated the lesion in only one of the six patients. Thus, CCDS is the method of choice for noninvasive detection of vascular lesions due to percutaneous biopsy.

  6. The need for GPS standardization

    NASA Technical Reports Server (NTRS)

    Lewandowski, Wlodzimierz W.; Petit, Gerard; Thomas, Claudine

    1992-01-01

    A desirable and necessary step for improvement of the accuracy of Global Positioning System (GPS) time comparisons is the establishment of common GPS standards. For this reason, the CCDS proposed the creation of a special group of experts with the objective of recommending procedures and models for operational time transfer by GPS common-view method. Since the announcement of the implementation of Selective Availability at the end of last spring, action has become much more urgent and this CCDS Group on GPS Time Transfer Standards has now been set up. It operates under the auspices of the permanent CCDS Working Group on TAI and works in close cooperation with the Sub-Committee on Time of the Civil GPS Service Interface Committee (CGSIC). Taking as an example the implementation of SA during the first week of July 1991, this paper illustrates the need to develop urgently at least two standardized procedures in GPS receiver software: monitoring GPS tracks with a common time scale and retaining broadcast ephemeris parameters throughout the duration of a track. Other matters requiring action are the adoption of common models for atmospheric delay, a common approach to hardware design and agreement about short-term data processing. Several examples of such deficiencies in standardization are presented.

  7. Robust Period Estimation Using Mutual Information for Multiband Light Curves in the Synoptic Survey Era

    NASA Astrophysics Data System (ADS)

    Huijse, Pablo; Estévez, Pablo A.; Förster, Francisco; Daniel, Scott F.; Connolly, Andrew J.; Protopapas, Pavlos; Carrasco, Rodrigo; Príncipe, José C.

    2018-05-01

    The Large Synoptic Survey Telescope (LSST) will produce an unprecedented amount of light curves using six optical bands. Robust and efficient methods that can aggregate data from multidimensional sparsely sampled time-series are needed. In this paper we present a new method for light curve period estimation based on quadratic mutual information (QMI). The proposed method does not assume a particular model for the light curve nor its underlying probability density and it is robust to non-Gaussian noise and outliers. By combining the QMI from several bands the true period can be estimated even when no single-band QMI yields the period. Period recovery performance as a function of average magnitude and sample size is measured using 30,000 synthetic multiband light curves of RR Lyrae and Cepheid variables generated by the LSST Operations and Catalog simulators. The results show that aggregating information from several bands is highly beneficial in LSST sparsely sampled time-series, obtaining an absolute increase in period recovery rate up to 50%. We also show that the QMI is more robust to noise and light curve length (sample size) than the multiband generalizations of the Lomb–Scargle and AoV periodograms, recovering the true period in 10%–30% more cases than its competitors. A python package containing efficient Cython implementations of the QMI and other methods is provided.
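
    The core idea above (score trial periods in each band, then aggregate the scores across bands) can be illustrated with a much simpler statistic than QMI. The sketch below uses phase-dispersion minimization summed over bands as a stand-in technique on synthetic data; the QMI estimator itself is the one provided in the authors' Python package.

        import numpy as np

        def phase_dispersion(t, y, period, nbins=10):
            """Dispersion of a folded light curve: lower means a better period."""
            phase = (t / period) % 1.0
            bins = np.minimum((phase * nbins).astype(int), nbins - 1)
            var = 0.0
            for b in range(nbins):
                sel = y[bins == b]
                if sel.size > 1:
                    var += sel.var() * sel.size
            return var / y.size

        def multiband_period(times, mags, trial_periods, nbins=10):
            """Aggregate the per-band statistic over bands (stand-in for QMI)."""
            scores = np.zeros_like(trial_periods)
            for i, p in enumerate(trial_periods):
                scores[i] = sum(phase_dispersion(t, m, p, nbins)
                                for t, m in zip(times, mags))
            return trial_periods[np.argmin(scores)]

        # Synthetic two-band RR Lyrae-like example with a 0.55 d period
        rng = np.random.default_rng(2)
        true_p = 0.55
        times = [np.sort(rng.uniform(0, 365, 80)) for _ in range(2)]
        mags = [np.sin(2*np.pi*t/true_p) + rng.normal(0, 0.2, t.size) for t in times]
        grid = np.linspace(0.3, 1.0, 7001)
        print("recovered period:", multiband_period(times, mags, grid))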

  8. A Euclid, LSST and WFIRST Joint Processing Study

    NASA Astrophysics Data System (ADS)

    Chary, Ranga-Ram; Joint Processing Working Group

    2018-01-01

    Euclid, LSST and WFIRST are the flagship cosmological projects of the next decade. By mapping several thousand square degrees of sky and covering the electromagnetic spectrum from the optical to the NIR with (sub-)arcsec resolution, these projects will provide exciting new constraints on the nature of dark energy and dark matter. The ultimate cosmological, astrophysical and time-domain science yield from these missions, which will detect several billions of sources, requires joint processing at the pixel-level. Three U.S. agencies (DOE, NASA and NSF) are supporting an 18-month study which aims to 1) assess the optimal techniques to combine these, and ancillary data sets at the pixel level; 2) investigate options for an interface that will enable community access to the joint data products; and 3) identify the computing and networking infrastructure to properly handle and manipulate these large datasets together. A Joint Processing Working Group (JPWG) is carrying out this study and consists of US-based members from the community and science/data processing centers of each of these projects. Coordination with European partners is envisioned in the future and European Euclid members are involved in the JPWG as observers. The JPWG will scope the effort and resources required to build up the capabilities to support scientific investigations using joint processing in time for the start of science surveys by LSST and Euclid.

  9. Solar System Science with LSST

    NASA Astrophysics Data System (ADS)

    Jones, R. L.; Chesley, S. R.; Connolly, A. J.; Harris, A. W.; Ivezic, Z.; Knezevic, Z.; Kubica, J.; Milani, A.; Trilling, D. E.

    2008-09-01

    The Large Synoptic Survey Telescope (LSST) will provide a unique tool to study moving objects throughout the solar system, creating massive catalogs of Near Earth Objects (NEOs), asteroids, Trojans, TransNeptunian Objects (TNOs), comets and planetary satellites with well-measured orbits and high quality, multi-color photometry accurate to 0.005 magnitudes for the brightest objects. In the baseline LSST observing plan, back-to-back 15-second images will reach a limiting magnitude as faint as r=24.7 in each 9.6 square degree image, twice per night; a total of approximately 15,000 square degrees of the sky will be imaged in multiple filters every 3 nights. This time sampling will continue throughout each lunation, creating a huge database of observations. Fig. 1 Sky coverage of LSST over 10 years; separate panels for each of the 6 LSST filters. Color bars indicate number of observations in filter. The catalogs will include more than 80% of the potentially hazardous asteroids larger than 140m in diameter within the first 10 years of LSST operation, millions of main-belt asteroids and perhaps 20,000 Trans-Neptunian Objects. Objects with diameters as small as 100m in the Main Belt and <100km in the Kuiper Belt can be detected in individual images. Specialized `deep drilling' observing sequences will detect KBOs down to 10s of kilometers in diameter. Long period comets will be detected at larger distances than previously possible, constrainting models of the Oort cloud. With the large number of objects expected in the catalogs, it may be possible to observe a pristine comet start outgassing on its first journey into the inner solar system. By observing fields over a wide range of ecliptic longitudes and latitudes, including large separations from the ecliptic plane, not only will these catalogs greatly increase the numbers of known objects, the characterization of the inclination distributions of these populations will be much improved. Derivation of proper elements for main belt and Trojan asteroids will allow ever more resolution of asteroid families and their size-frequency distribution, as well as the study of the long-term dynamics of the individual asteroids and the asteroid belt as a whole. Fig. 2 Orbital parameters of Main Belt Asteroids, color-coded according to ugriz colors measured by SDSS. The figure to the left shows osculating elements, the figure to the right shows proper elements - note the asteroid families visible as clumps in parameter space [1]. By obtaining multi-color ugrizy data for a substantial fraction of objects, relationships between color and dynamical history can be established. This will also enable taxonomic classification of asteroids, provide further links between diverse populations such as irregular satellites and TNOs or planetary Trojans, and enable estimates of asteroid diameter with rms uncertainty of 30%. With the addition of light-curve information, rotation periods and phase curves can be measured for large fractions of each population, leading to new insight on physical characteristics. Photometric variability information, together with sparse lightcurve inversion, will allow spin state and shape estimation for up to two orders of magnitude more objects than presently known. This will leverage physical studies of asteroids by constraining the size-strength relationship, which has important implications for the internal structure (solid, fractured, rubble pile) and in turn the collisional evolution of the asteroid belt. 
Similar information can be gained for other solar system bodies. [1] Parker, A., Ivezic
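
    The ~30% rms diameter uncertainty quoted above follows from the standard relation between an asteroid's absolute magnitude H, geometric albedo p_V and diameter, D(km) = 1329 / sqrt(p_V) * 10^(-H/5): a color-based (taxonomic) albedo estimate good to roughly a factor of two in p_V maps to a few tens of percent in diameter. A minimal sketch in Python; the H value and albedos below are illustrative assumptions, not LSST results.

        import numpy as np

        def asteroid_diameter_km(H, p_v):
            """Standard relation D(km) = 1329 / sqrt(p_V) * 10^(-H/5)."""
            return 1329.0 / np.sqrt(p_v) * 10.0 ** (-0.2 * H)

        # Illustrative: an H = 18 object under C-like, intermediate and S-like albedos.
        H = 18.0
        for p_v in (0.05, 0.10, 0.20):
            print(f"p_V = {p_v:.2f} -> D = {asteroid_diameter_km(H, p_v):.2f} km")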

  10. Thinning and mounting a Texas Instruments 3-phase CCD

    NASA Technical Reports Server (NTRS)

    Lesser, M. P.; Leach, R. W.; Angel, J. R. P.

    1986-01-01

    Thin CCDs with precise control of thickness and surface quality allow astronomers to optimize chips for specific applications. A means of mechanically thinning a TI 800 x 800 CCD with an abrasive slurry of aluminum oxide is presented. Using the same techniques, the abrasives can be replaced with a chemical solution to eliminate subsurface damage. A technique of mounting the CCD which retains the high quality surface generated during thinning is also demonstrated. This requires the backside of the chip to be bonded to a glass window which closely matches silicon's thermal expansion properties. Thinned CCDs require backside treatment to enhance blue and UV quantum efficiency. Two methods are discussed which may be effective with this mounting system.

  11. Preparing for LSST with the LCOGT NEO Follow-up Network

    NASA Astrophysics Data System (ADS)

    Greenstreet, Sarah; Lister, Tim; Gomez, Edward

    2016-10-01

    The Las Cumbres Observatory Global Telescope Network (LCOGT) provides an ideal platform for follow-up and characterization of Solar System objects (e.g. asteroids, Kuiper Belt Objects, comets, Near-Earth Objects (NEOs)) and ultimately for the discovery of new objects. The LCOGT NEO Follow-up Network uses the LCOGT telescope network, together with a web-based system developed to perform prioritized target selection, scheduling, and data reduction, to confirm NEO candidates and characterize radar-targeted known NEOs. In order to determine how to maximize our NEO follow-up efforts, we must first define our goals for the LCOGT NEO Follow-up Network. This means answering the following questions. Should we follow up all objects brighter than some magnitude limit? Should we only focus on the brightest objects, or push to the limits of our capabilities by observing the faintest objects we think we can see and risk not finding the objects in our data? Do we (and how do we) prioritize objects somewhere in the middle of our observable magnitude range? If we want to push to faint objects, how do we minimize the amount of data in which the signal-to-noise ratio is too low to see the object? And how do we find a balance between performing follow-up and characterization observations? To help answer these questions, we have developed an LCOGT NEO Follow-up Network simulator that allows us to test our prioritization algorithms for target selection, confirm signal-to-noise predictions, and determine ideal block lengths and exposure times for observing NEO candidates. We will present our results from the simulator and progress on our NEO follow-up efforts. In the era of LSST, developing and utilizing infrastructure capable of handling the large number of detections LSST is expected to produce on a daily basis, such as the LCOGT NEO Follow-up Network and our web-based platform for selecting, scheduling, and reducing NEO observations, will be critical to follow-up efforts. We hope our work can act as an example and tool for the community as together we prepare for the age of LSST.
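
    The exposure-time and signal-to-noise trade-offs discussed above can be sketched with the standard CCD signal-to-noise equation; the count rates, read noise and aperture size below are illustrative assumptions, not LCOGT values.

        import numpy as np

        def point_source_snr(t, source_rate, sky_rate, read_noise, npix):
            """Standard CCD SNR for a point source in a fixed aperture.

            source_rate in e-/s, sky_rate in e-/s/pixel, read_noise in e-/pixel.
            """
            signal = source_rate * t
            noise = np.sqrt(signal + npix * (sky_rate * t + read_noise ** 2))
            return signal / noise

        # Illustrative numbers only: find the exposure time that reaches SNR ~ 20.
        t_grid = np.arange(1.0, 600.0, 1.0)
        snr = point_source_snr(t_grid, source_rate=200.0, sky_rate=50.0,
                               read_noise=8.0, npix=50)
        reached = snr >= 20.0
        if reached.any():
            print(f"~{t_grid[np.argmax(reached)]:.0f} s to reach SNR 20 under these assumptions")
        else:
            print("target SNR not reached on this exposure-time grid")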

  12. The Zwicky Transient Facility

    NASA Astrophysics Data System (ADS)

    Kulkarni, Shrinivas R.

    2016-01-01

    The Zwicky Transient Facility (ZTF) has been designed with a singular focus: a systematic exploration of the night sky at a magnitude level well suited for spectral classification and follow-up with the existing class of 4-m to 10-m telescopes. ZTF is the successor to the Palomar Transient Factory (PTF). The discovery engine for ZTF is a 47 square degree camera (realized through 16 e2v monolithic CCDs) that fills the entire focal plane of the 48-inch Oschin telescope of the Palomar Observatory. Single 30-s epoch sensitivity is about 20.5 in the g and R bands. The Infrared Processing & Analysis Center (IPAC) is the data center for ZTF. ZTF is a public-private partnership with equal contributions from a consortium of worldwide partners and an NSF MSIP grant. Forty percent of ZTF time is set aside for two major community surveys: a 3-day cadence survey of high latitudes (to mimic LSST) and a time domain survey of the entire Northern Galactic plane. We expect first light in February 2017 and will begin a 3-year survey in the summer of 2017. The first year will be spent building up deep reference images of the sky (a must for transient surveys). During the second year, IPAC will deliver near archival-quality photometric products within 12 hours of observations. Photometric alerts will be sent out based on comparison to the reference images. Year 3 will see the near real-time release of image differencing products. A Community Science Advisory Committee (CSAC), chaired by S. Ridgway (NOAO), has been set up both to advise the PI and to ensure that the US community's interests are well served. Astronomers interested in getting a head start on ZTF may wish to peruse the data releases from PTF. Young people (or the young at heart) may wish to attend the annual summer school on PTF/ZTF (August, Caltech campus). The Principal Investigator (PI) for the project is S. Kulkarni and the Project Scientist is Eric Bellm. For further details, please consult http://www.ptf.caltech.edu/ztf

  13. The SMILE Soft X-ray Imager (SXI) CCD design and development

    NASA Astrophysics Data System (ADS)

    Soman, M. R.; Hall, D. J.; Holland, A. D.; Burgon, R.; Buggey, T.; Skottfelt, J.; Sembay, S.; Drumm, P.; Thornhill, J.; Read, A.; Sykes, J.; Walton, D.; Branduardi-Raymont, G.; Kennedy, T.; Raab, W.; Verhoeve, P.; Agnolon, D.; Woffinden, C.

    2018-01-01

    SMILE, the Solar wind Magnetosphere Ionosphere Link Explorer, is a joint science mission between the European Space Agency and the Chinese Academy of Sciences. The spacecraft will be uniquely equipped to study the interaction between the Earth's magnetosphere-ionosphere system and the solar wind on a global scale. SMILE's instruments will explore this science through imaging of the solar wind charge exchange soft X-ray emission from the dayside magnetosheath, simultaneous imaging of the UV northern aurora and in-situ monitoring of the solar wind and magnetosheath plasma and magnetic field conditions. The Soft X-ray Imager (SXI) is the instrument being designed to observe X-ray photons emitted by the solar wind charge exchange process at photon energies between 200 eV and 2000 eV. X-rays will be collected using a focal plane array of two custom-designed CCDs, each consisting of 18 μm square pixels in a 4510 by 4510 array. SMILE will be placed in a highly elliptical polar orbit, passing in and out of the Earth's radiation belts every 48 hours. Radiation damage accumulated in the CCDs during the mission's nominal 3-year lifetime will degrade their performance (such as through decreases in charge transfer efficiency), negatively impacting the instrument's ability to detect low energy X-rays incident on the regions of the CCD image area furthest from the detector outputs. The design of the SMILE-SXI CCDs is presented here, including features and operating methods for mitigating the effects of radiation damage and expected end of life CCD performance. Measurements with a PLATO device, which was not designed for soft X-ray signal levels, indicate a temperature-dependent charge transfer inefficiency varying between 5×10^-5 and 9×10^-4 at the expected end of life for 5.9 keV photons, giving an initial set of measurements from which to extrapolate the performance of the SXI CCDs.
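
    The impact of the quoted end-of-life CTI range can be illustrated with the usual first-order charge-loss estimate, surviving fraction = (1 - CTI)^N for N transfers. The 4510 parallel transfers come from the array size above, and the ~1616 electrons per 5.9 keV photon assume the standard ~3.65 eV per electron-hole pair in silicon; this is a toy sketch, not an SXI performance prediction.

        # First-order charge loss after N transfers at a given CTI:
        #   surviving fraction = (1 - CTI) ** N
        N_TRANSFERS = 4510            # full parallel transfer distance (array size above)
        SIGNAL_E = 5898.0 / 3.65      # ~1616 e- from a 5.9 keV (Mn K-alpha) photon

        for cti in (5e-5, 9e-4):      # end-of-life CTI range quoted above
            surviving = (1.0 - cti) ** N_TRANSFERS
            lost_e = SIGNAL_E * (1.0 - surviving)
            print(f"CTI = {cti:.0e}: {surviving:.1%} of the charge survives, "
                  f"~{lost_e:.0f} e- lost from a 5.9 keV event")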

  14. LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 2

    NASA Technical Reports Server (NTRS)

    Sullivan, M. R.

    1982-01-01

    Cable technology is discussed. Manufacturing flow and philosophy are considered. Acceptance, qualification and flight tests are discussed. Fifteen-meter and fifty-meter models are considered. An economic assessment is included.

  15. LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 2

    NASA Astrophysics Data System (ADS)

    Sullivan, M. R.

    1982-06-01

    Cable technology is discussed. Manufacturing flow and philosophy are considered. Acceptance, qualification and flight tests are discussed. Fifteen-meter and fifty-meter models are considered. An economic assessment is included.

  16. In Pursuit of LSST Science Requirements: A Comparison of Photometry Algorithms

    NASA Astrophysics Data System (ADS)

    Becker, Andrew C.; Silvestri, Nicole M.; Owen, Russell E.; Ivezić, Željko; Lupton, Robert H.

    2007-12-01

    We have developed an end-to-end photometric data-processing pipeline to compare current photometric algorithms commonly used on ground-based imaging data. This test bed is exceedingly adaptable and enables us to perform many research and development tasks, including image subtraction and co-addition, object detection and measurements, the production of photometric catalogs, and the creation and stocking of database tables with time-series information. This testing has been undertaken to evaluate existing photometry algorithms for consideration by a next-generation image-processing pipeline for the Large Synoptic Survey Telescope (LSST). We outline the results of our tests for four packages: the Sloan Digital Sky Survey's Photo package, DAOPHOT and ALLFRAME, DOPHOT, and two versions of Source Extractor (SExtractor). The ability of these algorithms to perform point-source photometry, astrometry, shape measurements, and star-galaxy separation and to measure objects at low signal-to-noise ratio is quantified. We also perform a detailed crowded-field comparison of DAOPHOT and ALLFRAME, and profile the speed and memory requirements in detail for SExtractor. We find that both DAOPHOT and Photo are able to perform aperture photometry to high enough precision to meet LSST's science requirements, but they perform less adequately at PSF-fitting photometry. Photo performs the best at simultaneous point- and extended-source shape and brightness measurements. SExtractor is the fastest algorithm, and recent upgrades in the software yield high-quality centroid and shape measurements with little bias toward faint magnitudes. ALLFRAME yields the best photometric results in crowded fields.
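
    As a concrete reference point for what these packages measure, a bare-bones aperture photometry step (source counts inside a radius, background from a surrounding annulus) can be written in a few lines of numpy. This is a generic illustration, not the algorithm of any package named above; the aperture radii and synthetic star are arbitrary.

        import numpy as np

        def aperture_photometry(image, x0, y0, r_ap=5.0, r_in=8.0, r_out=12.0):
            """Sum background-subtracted counts in a circular aperture.

            The background is the median pixel value in an annulus [r_in, r_out].
            """
            yy, xx = np.indices(image.shape)
            r = np.hypot(xx - x0, yy - y0)
            aperture = r <= r_ap
            annulus = (r >= r_in) & (r <= r_out)
            sky = np.median(image[annulus])
            return float(np.sum(image[aperture] - sky))

        # Toy usage: a Gaussian "star" of total flux 5000 on a flat background of 100.
        yy, xx = np.indices((64, 64))
        star = 5000.0 / (2 * np.pi * 2.0 ** 2) * np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / (2 * 2.0 ** 2))
        print(f"recovered flux ~ {aperture_photometry(star + 100.0, 32, 32):.0f} (input 5000)")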

  17. The variable sky of deep synoptic surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ridgway, Stephen T.; Matheson, Thomas; Mighell, Kenneth J.

    2014-11-20

    The discovery of variable and transient sources is an essential product of synoptic surveys. The alert stream will require filtering for personalized criteria, a process managed by a functionality commonly described as a Broker. In order to understand quantitatively the magnitude of the alert generation and Broker tasks, we have undertaken an analysis of the most numerous types of variable targets in the sky: Galactic stars, quasi-stellar objects (QSOs), active galactic nuclei (AGNs), and asteroids. It is found that the Large Synoptic Survey Telescope (LSST) will be capable of discovering ∼10^5 high-latitude (|b| > 20°) variable stars per night at the beginning of the survey. (The corresponding number for |b| < 20° is orders of magnitude larger, but subject to caveats concerning extinction and crowding.) However, the number of new discoveries may well drop below 100 per night within less than one year. The same analysis applied to GAIA clarifies the complementarity of the GAIA and LSST surveys. Discoveries of AGNs and QSOs are each predicted to begin at ∼3000 per night and to decrease by a factor of 50 over four years. Supernovae are expected at ∼1100 per night, and after several survey years will dominate the new variable discovery rate. LSST asteroid discoveries will start at >10^5 per night, and if orbital determination has a 50% success rate per epoch, they will drop below 1000 per night within two years.

  18. Accessing Multi-Dimensional Images and Data Cubes in the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Tody, Douglas; Plante, R. L.; Berriman, G. B.; Cresitello-Dittmar, M.; Good, J.; Graham, M.; Greene, G.; Hanisch, R. J.; Jenness, T.; Lazio, J.; Norris, P.; Pevunova, O.; Rots, A. H.

    2014-01-01

    Telescopes across the spectrum are routinely producing multi-dimensional images and datasets, such as Doppler velocity cubes, polarization datasets, and time-resolved “movies.” Examples of current telescopes producing such multi-dimensional images include the JVLA, ALMA, and the IFU instruments on large optical and near-infrared wavelength telescopes. In the near future, both the LSST and JWST will also produce such multi-dimensional images routinely. High-energy instruments such as Chandra produce event datasets that are also a form of multi-dimensional data, in effect being a very sparse multi-dimensional image. Ensuring that the data sets produced by these telescopes can be both discovered and accessed by the community is essential and is part of the mission of the Virtual Observatory (VO). The Virtual Astronomical Observatory (VAO, http://www.usvao.org/), in conjunction with its international partners in the International Virtual Observatory Alliance (IVOA), has developed a protocol and an initial demonstration service designed for the publication, discovery, and access of arbitrarily large multi-dimensional images. The protocol describing multi-dimensional images is the Simple Image Access Protocol, version 2, which provides the minimal set of metadata required to characterize a multi-dimensional image for its discovery and access. A companion Image Data Model formally defines the semantics and structure of multi-dimensional images independently of how they are serialized, while providing capabilities such as support for sparse data that are essential to deal effectively with large cubes. A prototype data access service has been deployed and tested, using a suite of multi-dimensional images from a variety of telescopes. The prototype has demonstrated the capability to discover and remotely access multi-dimensional data via standard VO protocols. The prototype informs the specification of a protocol that will be submitted to the IVOA for approval, with an operational data cube service to be delivered in mid-2014. An associated user-installable VO data service framework will provide the capabilities required to publish VO-compatible multi-dimensional images or data cubes.
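
    A discovery query against an SIAv2-style service reduces to an HTTP GET carrying positional and spectral constraints and returning a VOTable of matching datasets. The endpoint below is a placeholder and the parameter set is only the generic one (POS, BAND, RESPONSEFORMAT), so treat this as a sketch of the protocol rather than a call to a specific service.

        from urllib.parse import urlencode

        # Placeholder endpoint: substitute a real SIAv2 service URL.
        SIA2_ENDPOINT = "https://example.org/vo/sia2/query"

        params = {
            "POS": "CIRCLE 202.48 47.23 0.1",   # RA, Dec, search radius in degrees
            "BAND": "2.0e-7 3.0e-7",            # wavelength interval in metres
            "RESPONSEFORMAT": "votable",
        }

        query_url = f"{SIA2_ENDPOINT}?{urlencode(params)}"
        print(query_url)
        # An HTTP GET on query_url would return a VOTable whose rows describe the
        # discovered images or cubes, each carrying an access URL for retrieval.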

  19. System Architecture of the Dark Energy Survey Camera Readout Electronics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, Theresa; /FERMILAB; Ballester, Otger

    2010-05-27

    The Dark Energy Survey makes use of a new camera, the Dark Energy Camera (DECam). DECam will be installed in the Blanco 4-m telescope at Cerro Tololo Inter-American Observatory (CTIO). DECam is presently under construction and is expected to be ready for observations in the fall of 2011. The focal plane will make use of 62 2K×4K fully depleted Charge-Coupled Devices (CCDs) for science imaging and 12 2K×2K CCDs for guiding, alignment and focus. This paper will describe design considerations of the system, including the entire signal path used to read out the CCDs, the development of a custom crate and backplane, the overall grounding scheme and early results of system tests.

  20. FY-79 - development of fiber optics connector technology for large space systems

    NASA Technical Reports Server (NTRS)

    Campbell, T. G.

    1980-01-01

    The development of physical concepts for integrating fiber optic connectors and cables with structural concepts proposed for the LSST is discussed. Emphasis is placed on remote connections using integrated cables.

  1. Projected Near-Earth Object Discovery Performance of the Large Synoptic Survey Telescope

    NASA Technical Reports Server (NTRS)

    Chesley, Steven R.; Veres, Peter

    2017-01-01

    This report describes the methodology and results of an assessment study of the performance of the Large Synoptic Survey Telescope (LSST) in its planned efforts to detect and catalog near-Earth objects (NEOs).

  2. Searching for modified growth patterns with tomographic surveys

    NASA Astrophysics Data System (ADS)

    Zhao, Gong-Bo; Pogosian, Levon; Silvestri, Alessandra; Zylberberg, Joel

    2009-04-01

    In alternative theories of gravity, designed to produce cosmic acceleration at the current epoch, the growth of large scale structure can be modified. We study the potential of upcoming and future tomographic surveys such as Dark Energy Survey (DES) and Large Synoptic Survey Telescope (LSST), with the aid of cosmic microwave background (CMB) and supernovae data, to detect departures from the growth of cosmic structure expected within general relativity. We employ parametric forms to quantify the potential time- and scale-dependent variation of the effective gravitational constant and the differences between the two Newtonian potentials. We then apply the Fisher matrix technique to forecast the errors on the modified growth parameters from galaxy clustering, weak lensing, CMB, and their cross correlations across multiple photometric redshift bins. We find that even with conservative assumptions about the data, DES will produce nontrivial constraints on modified growth and that LSST will do significantly better.
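
    The forecasting step described here boils down to summing the Fisher matrices of independent probes, inverting, and reading marginalized errors off the diagonal of the resulting covariance. A minimal numpy sketch with arbitrary two-parameter matrices standing in for the survey and CMB contributions:

        import numpy as np

        # Toy Fisher matrices for two hypothetical modified-growth parameters;
        # the numbers are arbitrary and stand in for a lensing survey and CMB priors.
        F_survey = np.array([[400.0, 120.0],
                             [120.0,  90.0]])
        F_cmb = np.array([[50.0, 10.0],
                          [10.0, 40.0]])

        # Independent experiments combine by adding their Fisher matrices.
        F_total = F_survey + F_cmb
        covariance = np.linalg.inv(F_total)

        # Marginalized 1-sigma errors are the square roots of the diagonal.
        print("marginalized errors:", np.sqrt(np.diag(covariance)))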

  3. Baseline design and requirements for the LSST rotating enclosure (dome)

    NASA Astrophysics Data System (ADS)

    Neill, D. R.; DeVries, J.; Hileman, E.; Sebag, J.; Gressler, W.; Wiecha, O.; Andrew, J.; Schoening, W.

    2014-07-01

    The Large Synoptic Survey Telescope (LSST) is a large (8.4 meter) wide-field (3.5 degree) survey telescope, which will be located on the Cerro Pachón summit in Chile. As a result of the wide field of view, its optical system is unusually susceptible to stray light; consequently, besides protecting the telescope from the environment, the rotating enclosure (Dome) also provides indispensable light baffling. All dome vents are covered with light baffles which simultaneously provide both essential dome flushing and stray light attenuation. The wind screen also (and primarily) functions as a light screen, providing only a minimum clear aperture. Since the dome must operate continuously and the drives produce significant heat, they are located on the fixed lower enclosure to facilitate glycol water cooling. To accommodate daytime thermal control, a duct system channels cooling air provided by the facility when the dome is in its parked position.

  4. LSST communications middleware implementation

    NASA Astrophysics Data System (ADS)

    Mills, Dave; Schumacher, German; Lotz, Paul

    2016-07-01

    The LSST communications middleware is based on a set of software abstractions which provide standard interfaces for common communications services. The observatory requires communication between diverse subsystems, implemented by different contractors, and comprehensive archiving of subsystem status data. The Service Abstraction Layer (SAL) is implemented using open source packages that implement open standards: DDS (Data Distribution Service) for data communication and SQL (Structured Query Language) for database access. For every subsystem, abstractions for each of the Telemetry datastreams, along with Command/Response and Events, have been agreed with the appropriate component vendor (such as Dome, TMA, Hexapod) and captured in ICDs (Interface Control Documents). The OpenSplice (PrismTech) Community Edition of DDS provides an LGPL-licensed distribution which may be freely redistributed. The availability of the full source code provides assurance that the project will be able to maintain it over the full 10-year survey, independent of the fortunes of the original providers.
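
    The three kinds of SAL topic described above (Telemetry, Command/Response, Events) can be pictured as typed messages published per subsystem. The sketch below is purely illustrative Python, not the actual SAL or DDS API; the subsystem and topic names are hypothetical.

        from dataclasses import dataclass, field
        import time

        @dataclass
        class Message:
            """Common envelope for SAL-style topics (illustrative only)."""
            subsystem: str          # e.g. "Dome", "Hexapod" (hypothetical labels)
            topic: str              # e.g. "telemetry_azimuth", "command_moveAz"
            payload: dict = field(default_factory=dict)
            timestamp: float = field(default_factory=time.time)

        def publish(bus, msg):
            """Stand-in for a DDS publish; here the 'bus' is just a list."""
            bus.append(msg)

        bus = []
        publish(bus, Message("Dome", "command_moveAz", {"position_deg": 123.4}))
        publish(bus, Message("Dome", "event_azimuthInPosition", {"inPosition": True}))
        publish(bus, Message("Dome", "telemetry_azimuth", {"position_deg": 123.4}))
        print(f"{len(bus)} messages on the illustrative bus")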

  5. Near-Field Cosmology with Resolved Stellar Populations Around Local Volume LMC Stellar-Mass Galaxies

    NASA Astrophysics Data System (ADS)

    Carlin, Jeffrey L.; Sand, David J.; Willman, Beth; Brodie, Jean P.; Crnojevic, Denija; Forbes, Duncan; Hargis, Jonathan R.; Peter, Annika; Pucha, Ragadeepika; Romanowsky, Aaron J.; Spekkens, Kristine; Strader, Jay

    2018-06-01

    We discuss our ongoing observational program to comprehensively map the entire virial volumes of roughly LMC stellar mass galaxies at distances of ~2-4 Mpc. The MADCASH (Magellanic Analog Dwarf Companions And Stellar Halos) survey will deliver the first census of the dwarf satellite populations and stellar halo properties within LMC-like environments in the Local Volume. Our results will inform our understanding of the recent DES discoveries of dwarf satellites tentatively affiliated with the LMC/SMC system. This program has already yielded the discovery of the faintest known dwarf galaxy satellite of an LMC stellar-mass host beyond the Local Group, based on deep Subaru+HyperSuprimeCam imaging reaching ~2 magnitudes below its TRGB, and at least two additional candidate satellites. We will summarize the survey results and status to date, highlighting some challenges encountered and lessons learned as we process the data for this program through a prototype LSST pipeline. Our program will examine whether LMC stellar mass dwarfs have extended stellar halos, allowing us to assess the relative contributions of in-situ stars vs. merger debris to their stellar populations and halo density profiles. We outline the constraints on galaxy formation models that will be provided by our observations of low-mass galaxy halos and their satellites.

  6. The Palomar Transient Factory: High Quality Realtime Data Processing in a Cost-Constrained Environment

    NASA Astrophysics Data System (ADS)

    Surace, J.; Laher, R.; Masci, F.; Grillmair, C.; Helou, G.

    2015-09-01

    The Palomar Transient Factory (PTF) is a synoptic sky survey in operation since 2009. PTF utilizes a 7.1 square degree camera on the Palomar 48-inch Schmidt telescope to survey the sky primarily at a single wavelength (R-band) at a rate of 1000-3000 square degrees a night. The data are used to detect and study transient and moving objects such as gamma-ray bursts, supernovae and asteroids, as well as variable phenomena such as quasars and Galactic stars. The data processing system at IPAC handles real-time processing and detection of transients, solar system object processing, high photometric precision processing and light curve generation, and long-term archiving and curation. This was developed under an extremely limited budget profile in an unusually agile development environment. Here we discuss the mechanics of this system and our overall development approach. Although a significant scientific installation in and of itself, PTF also serves as the prototype for our next generation project, the Zwicky Transient Facility (ZTF). Beginning operations in 2017, ZTF will feature a 50 square degree camera which will enable scanning of the entire northern visible sky every night. ZTF in turn will serve as a stepping stone to the Large Synoptic Survey Telescope (LSST), a major NSF facility scheduled to begin operations in the early 2020s.

  7. Improved Space Object Observation Techniques Using CMOS Detectors

    NASA Astrophysics Data System (ADS)

    Schildknecht, T.; Hinze, A.; Schlatter, P.; Silha, J.; Peltonen, J.; Santti, T.; Flohrer, T.

    2013-08-01

    CMOS sensors, or in general Active Pixel Sensors (APS), are rapidly replacing CCDs in the consumer camera market. Due to significant technological advances during the past years, these devices have started to compete with CCDs for demanding scientific imaging applications as well, in particular in the astronomy community. CMOS detectors offer a series of inherent advantages compared to CCDs, due to the structure of their basic pixel cells, each of which contains its own amplifier and readout electronics. The most prominent advantages for space object observations are the extremely fast and flexible readout capabilities, the feasibility of electronic shuttering and precise epoch registration, and the potential to perform image processing operations on-chip and in real time. Presently applied and proposed optical observation strategies for space debris surveys and space surveillance applications were analyzed. The major design drivers were identified and the potential benefits of using available and future CMOS sensors were assessed. The major challenges and design drivers for ground-based and space-based optical observation strategies have been analyzed. CMOS detector characteristics were critically evaluated and compared with the established CCD technology, especially with respect to the above-mentioned observations. Similarly, the desirable on-chip processing functionalities which would further enhance object detection and image segmentation were identified. Finally, the characteristics of a particular CMOS sensor available at the Zimmerwald observatory were analyzed by performing laboratory test measurements.

  8. Managing Radiation Degradation of CCDs on the Chandra X-ray Observatory

    NASA Technical Reports Server (NTRS)

    ODell, Stephen L.; Blackwell, William C.; Minow, Joseph I.; Cameron, Robert A.; Morris, David C.; Virani, Shanil N.; Six, N. Frank (Technical Monitor)

    2002-01-01

    The CCDs on the Chandra X-ray Observatory are sensitive to radiation damage, particularly from low-energy protons scattering off the telescope's mirrors onto the focal plane. In its highly elliptical orbit, Chandra passes through a spatially and temporally varying radiation environment, ranging from the radiation belts to the solar wind. Translating the Advanced CCD Imaging Spectrometer (ACIS) out of the focal position during radiation-belt passages has prevented loss of scientific utility and, ultimately, of functionality. However, carefully managing the radiation damage during the remainder of the orbit, without unnecessarily sacrificing observing time, is essential to optimizing the scientific value of this exceptional observatory throughout its planned 10-year mission. In working toward this optimization, the Chandra team developed and applied radiation-management strategies. These strategies include autonomous instrument safing triggered by the on-board radiation monitor, as well as monitoring, alerts, and intervention based upon real-time space-environment data from NOAA and NASA spacecraft. Furthermore, because Chandra often spends much of its orbit out of the solar wind (in the Earth's outer magnetosphere and magnetosheath), the team developed the Chandra Radiation Model to describe the complete low-energy-proton environment. Management of the radiation damage has thus far succeeded in limiting degradation of the charge-transfer inefficiency (CTI) to less than 4.4*10^-6 and 1.4*10^-6 per year for the front-illuminated and back-illuminated CCDs, respectively.

  9. Challenges in photon-starved space astronomy in a harsh radiation environment using CCDs

    NASA Astrophysics Data System (ADS)

    Hall, David J.; Bush, Nathan; Murray, Neil; Gow, Jason; Clarke, Andrew; Burgon, Ross; Holland, Andrew

    2015-09-01

    The Charge Coupled Device (CCD) has a long heritage for imaging and spectroscopy in many space astronomy missions. However, the harsh radiation environment experienced in orbit creates defects in the silicon that capture the signal being transferred through the CCD. This radiation damage has a detrimental impact on the detector performance and requires carefully planned mitigation strategies. The ESA Gaia mission uses 106 CCDs, now orbiting around the second Lagrange point as part of the largest focal-plane ever launched. Following readout, signal electrons will be affected by the traps generated in the devices from the radiation environment and this degradation will be corrected for using a charge distortion model. ESA's Euclid mission will contain a focal plane of 36 CCDs in the VIS instrument. Moving further forwards, the World Space Observatory (WSO) UV spectrographs and the WFIRST-AFTA coronagraph intend to look at very faint sources in which mitigating the impact of traps on the transfer of single electron signals will be of great interest. Following the development of novel experimental and analysis techniques, one is now able to study the impact of radiation on the detector to new levels of detail. Through a combination of TCAD simulations, defect studies and device testing, we are now probing the interaction of single electrons with individual radiation-induced traps to analyse the impact of radiation in photon-starved applications.
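
    The per-trap picture behind such studies is commonly expressed through exponential capture and emission probabilities over a dwell or wait time, P = 1 - exp(-t/tau). The time constants below are placeholders (in reality they depend strongly on temperature and trap species), so this is only an illustration of the functional form.

        import numpy as np

        def p_capture(t_dwell, tau_c):
            """Probability that a free electron is captured by an empty trap within t_dwell."""
            return 1.0 - np.exp(-t_dwell / tau_c)

        def p_release(t_wait, tau_e):
            """Probability that a trapped electron is re-emitted within t_wait."""
            return 1.0 - np.exp(-t_wait / tau_e)

        tau_c, tau_e = 1e-5, 1e-2          # placeholder capture/emission time constants (s)
        for t in (1e-6, 1e-5, 1e-4):       # example dwell times per pixel transfer (s)
            print(f"t = {t:.0e} s: P_capture = {p_capture(t, tau_c):.2f}, "
                  f"P_release within 1 ms = {p_release(1e-3, tau_e):.2f}")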

  10. CCDS concept paper: Delta-DOR

    NASA Technical Reports Server (NTRS)

    Berry, David S.; Broder, James S.

    2005-01-01

    This Concept Paper proposes the development of Consultative Committee for Space Data Systems (CCSDS) standards for the deep space navigation technique known as 'delta-DOR' (Delta Differential One-Way Ranging).

  11. LSST Telescope and Optics Status

    NASA Astrophysics Data System (ADS)

    Krabbendam, Victor; Gressler, W. J.; Andrew, J. R.; Barr, J. D.; DeVries, J.; Hileman, E.; Liang, M.; Neill, D. R.; Sebag, J.; Wiecha, O.; LSST Collaboration

    2011-01-01

    The LSST Project continues to advance the design and development of an observatory system capable of capturing 20,000 deg2 of the sky in six wavebands over ten years. Optical fabrication of the unique M1/M3 monolithic mirror has entered final front-surface optical processing. After substantial grinding to remove 5 tons of excess glass above the M3 surface (a residual of the single spin casting), both distinct optical surfaces are now clearly evident. Loose abrasive grinding has begun; polishing is to occur during 2011, and final optical testing is planned for early 2012. The M1/M3 telescope cell and internal component designs have matured to support on-telescope operational requirements and off-telescope coating needs. The mirror position system (hardpoint actuators) and mirror support system (figure actuators) designs have developed through internal laboratory analysis and testing. A review of thermal requirements has assisted with the definition of a thermal conditioning and control system. Pre-cooling the M1/M3 substrate will enable productive observing during the large temperature swing often seen at twilight. The M2 ULE™ substrate is complete and lies in storage awaiting additional funding to enable final optical polishing. This 3.5 m diameter, 100 mm thick meniscus substrate has been ground to within 40 microns of its final figure. Detailed design of the telescope mount, including subflooring, has been developed. Finally, substantial progress has been achieved on the facility design. In early 2010, LSST contracted with ARCADIS Geotecnica Consultores, a Santiago-based engineering firm, to lead the formal architectural design effort for the summit facility.

  12. Near-Earth Object Orbit Linking with the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Vereš, Peter; Chesley, Steven R.

    2017-07-01

    We have conducted a detailed simulation of the ability of the Large Synoptic Survey Telescope (LSST) to link near-Earth and main belt asteroid detections into orbits. The key elements of the study were a high-fidelity detection model and the presence of false detections in the form of both statistical noise and difference image artifacts. We employed the Moving Object Processing System (MOPS) to generate tracklets, tracks, and orbits with a realistic detection density for one month of the LSST survey. The main goals of the study were to understand whether (a) the linking of near-Earth objects (NEOs) into orbits can succeed in a realistic survey, (b) the number of false tracks and orbits will be manageable, and (c) the accuracy of linked orbits would be sufficient for automated processing of discoveries and attributions. We found that the overall density of asteroids was more than 5000 per LSST field near opposition on the ecliptic, plus up to 3000 false detections per field in good seeing. We achieved 93.6% NEO linking efficiency for H < 22 on tracks composed of tracklets from at least three distinct nights within a 12 day interval. The derived NEO catalog was composed of 96% correct linkages. Less than 0.1% of orbits included false detections, and the remainder of false linkages stemmed from main belt confusion, which was an artifact of the short time span of the simulation. The MOPS linking efficiency can be improved by refined attribution of detections to known objects and by improved tuning of the internal kd-tree linking algorithms.
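
    The tracklet stage of a MOPS-like linker amounts to pairing detections from two exposures whose separation is consistent with a maximum apparent rate of motion, which is naturally done with a kd-tree. The sketch below uses fake detection lists, an arbitrary rate limit, and a flat-sky small-angle approximation; it illustrates the idea rather than the MOPS implementation.

        import numpy as np
        from scipy.spatial import cKDTree

        def make_tracklets(det1, det2, dt_hours, max_rate_deg_per_day):
            """Pair detections (RA, Dec in degrees) from two exposures dt_hours apart.

            A pair is kept if its separation implies an apparent rate below the limit.
            """
            max_sep = max_rate_deg_per_day * dt_hours / 24.0
            tree = cKDTree(det2)
            pairs = []
            for i, p in enumerate(det1):
                for j in tree.query_ball_point(p, max_sep):
                    pairs.append((i, j))
            return pairs

        rng = np.random.default_rng(42)
        det1 = rng.uniform(0.0, 3.0, size=(500, 2))            # fake detections, exposure A
        det2 = det1 + rng.normal(0.0, 0.02, size=det1.shape)   # shifted copies + noise, exposure B
        print(len(make_tracklets(det1, det2, dt_hours=0.5, max_rate_deg_per_day=2.0)), "candidate tracklets")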

  13. Multi-Wavelength Spectroscopy of Tidal Disruption Flares: A Legacy Sample for the LSST Era

    NASA Astrophysics Data System (ADS)

    Cenko, Stephen

    2017-08-01

    When a star passes within the sphere of disruption of a massive black hole, tidal forces will overcome self-gravity and unbind the star. While approximately half of the stellar debris is ejected at high velocities, the remaining material stays bound to the black hole and accretes, resulting in a luminous, long-lived transient known as a tidal disruption flare (TDF). In addition to serving as unique laboratories for accretion physics, TDFs offer the hope of measuring black hole masses in galaxies much too distant for resolved kinematic studies. In order to realize this potential, we must better understand the detailed processes by which the bound debris circularizes and forms an accretion disk. Spectroscopy is critical to this effort, as emission and absorption line diagnostics provide insight into the location and physical state (velocity, density, composition) of the emitting gas (in analogy with quasars). UV spectra are particularly critical, as most strong atomic features fall in this bandpass, and high-redshift TDF discoveries from LSST will sample rest-frame UV wavelengths. Here we propose to obtain a sequence of UV (HST) and optical (Gemini/GMOS) spectra for a sample of 5 TDFs discovered by the Zwicky Transient Facility, doubling the number of TDFs with UV spectra. Our observations will directly test models for the generation of the UV/optical emission (circularization vs reprocessing) by searching for outflows and measuring densities, temperatures, and composition as a function of time. This effort is critical to developing the framework by which we can infer black hole properties (e.g., mass) from LSST TDF discoveries.

  14. Improvement in the light sensitivity of the ultrahigh-speed high-sensitivity CCD with a microlens array

    NASA Astrophysics Data System (ADS)

    Hayashida, T.; Yonai, J.; Kitamura, K.; Arai, T.; Kurita, T.; Tanioka, K.; Maruyama, H.; Etoh, T. Goji; Kitagawa, S.; Hatade, K.; Yamaguchi, T.; Takeuchi, H.; Iida, K.

    2008-02-01

    We are advancing the development of ultrahigh-speed, high-sensitivity CCDs for broadcast use that are capable of capturing smooth slow-motion videos in vivid colors even where lighting is limited, such as at professional baseball games played at night. We have already developed a 300,000-pixel, ultrahigh-speed CCD and a single-CCD color camera based on it that has been used for sports broadcasts and science programs. However, there are cases where even higher sensitivity is required, such as when using a telephoto lens during a baseball broadcast or a high-magnification microscope during science programs. This paper provides a summary of our experimental development aimed at further increasing the sensitivity of CCDs using the light-collecting effects of a microlens array.

  15. Re-visiting the Amplifier Gains of the HST/ACS Wide Field Channel CCDs

    NASA Astrophysics Data System (ADS)

    Desjardins, Tyler D.; Grogin, Norman A.; ACS Team

    2018-06-01

    For the first time since HST Servicing Mission 4 (SM4) in May 2009, we present an analysis of the amplifier gains of the Advanced Camera for Surveys (ACS) Wide Field Channel (WFC) CCDs. Using a series of in-flight flat-field exposures taken in November 2017 with a tungsten calibration lamp, we utilize the photon transfer method to estimate the gains of the WFC1 and WFC2 CCD amplifiers. We find evidence that the gains of the four readout amplifiers have changed by a small, but statistically significant, 1–2% since SM4. We further present a study of historical ACS/WFC observations of the globular cluster NGC 104 (47 Tuc) in an attempt to estimate the time dependence of the gains.
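
    The photon transfer estimate relies on shot noise being Poissonian in electrons, so that for a bias-subtracted flat pair the gain in e-/ADU is approximately the summed mean divided by the variance of the difference image (read noise is neglected in this toy version). A sketch with synthetic flats at a nominal gain of 2 e-/ADU; none of the numbers are ACS values.

        import numpy as np

        def gain_from_flat_pair(flat1, flat2):
            """Photon-transfer gain estimate (e-/ADU) from two bias-subtracted flats.

            Differencing the pair removes fixed-pattern structure; read noise is ignored.
            """
            return (flat1.mean() + flat2.mean()) / np.var(flat1 - flat2)

        # Synthetic example: ~30,000 e- per pixel at a true gain of 2 e-/ADU.
        rng = np.random.default_rng(1)
        true_gain = 2.0
        flat1 = rng.poisson(30000.0, size=(512, 512)) / true_gain
        flat2 = rng.poisson(30000.0, size=(512, 512)) / true_gain
        print(f"recovered gain ~ {gain_from_flat_pair(flat1, flat2):.3f} e-/ADU")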

  16. Diagnostic methods and recommendations for the cerebral creatine deficiency syndromes.

    PubMed

    Clark, Joseph F; Cecil, Kim M

    2015-03-01

    Primary care pediatricians and a variety of specialist physicians strive to define an accurate diagnosis for children presenting with impairment of expressive speech and delay in achieving developmental milestones. Within the past two decades, a group of disorders featuring this presentation have been identified as cerebral creatine deficiency syndromes (CCDS). Patients with these disorders were initially discerned using proton magnetic resonance spectroscopy of the brain within a magnetic resonance imaging (MRI) examination. The objective of this review is to provide the clinician with an overview of the current information available on identifying and treating these conditions. We explain the salient features of creatine metabolism, synthesis, and transport required for normal development. We propose diagnostic approaches for confirming a CCDS diagnosis. Finally, we describe treatment approaches for managing patients with these conditions.

  17. Breast Biopsy System

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Charge Coupled Devices (CCDs) are high technology silicon chips that convert light directly into electronic or digital images, which can be manipulated or enhanced by computers. When Goddard Space Flight Center (GSFC) scientists realized that existing CCD technology could not meet scientific requirements for the Hubble Space Telescope Imaging Spectrograph, GSFC contracted with Scientific Imaging Technologies, Inc. (SITe) to develop an advanced CCD. SITe then applied many of the NASA-driven enhancements to the manufacture of CCDs for digital mammography. The resulting device images breast tissue more clearly and efficiently. The LORAD Stereo Guide Breast Biopsy system incorporates SITe's CCD as part of a digital camera system that is replacing surgical biopsy in many cases. Known as stereotactic needle biopsy, the procedure is performed under local anesthesia with a needle and saves women time, pain, scarring, radiation exposure and money.

  18. Geostationary platform systems concepts definition follow-on study. Volume 2A: Technical Task 2 LSST special emphasis

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The results of the Large Space Systems Technology special emphasis task are presented. The task was an analysis of structural requirements deriving from the initial Phase A Operational Geostationary Platform study.

  19. Thermal design and performance of the REgolith x-ray imaging spectrometer (REXIS) instrument

    NASA Astrophysics Data System (ADS)

    Stout, Kevin D.; Masterson, Rebecca A.

    2014-08-01

    The REgolith X-ray Imaging Spectrometer (REXIS) instrument is a student collaboration instrument on the OSIRIS-REx asteroid sample return mission scheduled for launch in September 2016. The REXIS science mission is to characterize the elemental abundances of the asteroid Bennu on a global scale and to search for regions of enhanced elemental abundance. The thermal design of the REXIS instrument is challenging due to both the science requirements and the thermal environment in which it will operate. The REXIS instrument consists of two assemblies: the spectrometer and the solar X-ray monitor (SXM). The spectrometer houses a 2x2 array of back illuminated CCDs that are protected from the radiation environment by a one-time deployable cover and a collimator assembly with coded aperture mask. Cooling the CCDs during operation is the driving thermal design challenge on the spectrometer. The CCDs operate in the vicinity of the electronics box, but a 130 °C thermal gradient is required between the two components to cool the CCDs to -60 °C in order to reduce noise and obtain science data. This large thermal gradient is achieved passively through the use of a copper thermal strap, a large radiator facing deep space, and a two-stage thermal isolation layer between the electronics box and the DAM. The SXM is mechanically mounted to the sun-facing side of the spacecraft separately from the spectrometer and characterizes the highly variable solar X-ray spectrum to properly interpret the data from the asteroid. The driving thermal design challenge on the SXM is cooling the silicon drift detector (SDD) to below -30 °C when operating. A two-stage thermoelectric cooler (TEC) is located directly beneath the detector to provide active cooling, and spacecraft MLI blankets cover all of the SXM except the detector aperture to radiatively decouple the SXM from the flight thermal environment. This paper describes the REXIS thermal system requirements, thermal design, and analyses, with a focus on the driving thermal design challenges for the instrument. It is shown through both analysis and early testing that the REXIS instrument can perform successfully through all phases of its mission.
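
    To first order, the passive cooling approach described above is a radiator-sizing problem: the rejected heat must equal epsilon * sigma * A * (T_rad^4 - T_sink^4). The heat load and emissivity in the sketch below are placeholders; only the roughly -60 degC operating point comes from the text.

        SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

        def radiator_area_m2(q_watts, t_rad_c, t_sink_c=-270.0, emissivity=0.85):
            """First-order radiator area needed to reject q_watts to deep space."""
            t_rad = t_rad_c + 273.15
            t_sink = t_sink_c + 273.15
            return q_watts / (emissivity * SIGMA * (t_rad ** 4 - t_sink ** 4))

        # Placeholder load of 2 W (parasitics plus dissipation) at a -60 degC radiator.
        print(f"~{radiator_area_m2(2.0, -60.0):.3f} m^2 of radiator under these assumptions")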

  20. Satellite Power Systems (SPS). LSST systems and integration task for SPS flight test article

    NASA Technical Reports Server (NTRS)

    Greenberg, H. S.

    1981-01-01

    This research activity emphasizes the systems definition and resulting structural requirements for the primary structure of two potential SPS large space structure test articles. These test articles represent potential steps in the SPS research and technology development.

  1. Development of a 300,000-pixel ultrahigh-speed high-sensitivity CCD

    NASA Astrophysics Data System (ADS)

    Ohtake, H.; Hayashida, T.; Kitamura, K.; Arai, T.; Yonai, J.; Tanioka, K.; Maruyama, H.; Etoh, T. Goji; Poggemann, D.; Ruckelshausen, A.; van Kuijk, H.; Bosiers, Jan T.

    2006-02-01

    We are developing an ultrahigh-speed, high-sensitivity broadcast camera that is capable of capturing clear, smooth slow-motion videos even where lighting is limited, such as at professional baseball games played at night. In earlier work, we developed an ultrahigh-speed broadcast color camera using three 80,000-pixel ultrahigh-speed, high-sensitivity CCDs. This camera had about ten times the sensitivity of standard high-speed cameras, and enabled an entirely new style of presentation for sports broadcasts and science programs. Most notably, increasing the pixel count is crucially important for applying ultrahigh-speed, high-sensitivity CCDs to HDTV broadcasting. This paper provides a summary of our experimental development aimed at improving the resolution of the CCD even further: a new ultrahigh-speed, high-sensitivity CCD that increases the pixel count four-fold to 300,000 pixels.

  2. A vertex detector for SLD

    NASA Astrophysics Data System (ADS)

    Damerell, C. J. S.; English, R. L.; Gillman, A. R.; Lintern, A. L.; Phillips, D.; Rong, G.; Sutton, C.; Wickens, F. J.; Agnew, G.; Clarke, P.; Hedges, S.; Watts, S. J.

    1989-03-01

    The SLAC Linear Collider is currently being commissioned. A second-generation detector for SLC, known as SLD, is now under construction. In the centre of this 4000 ton detector there will be a vertex detector (VXD) consisting of 4 barrels of 2-dimensional CCDs, approximately 250 CCDs in total. This detector will be used as a tracking microscope, able to pinpoint the outgoing tracks with a precision of about 5 μm, and thus to distinguish between particles produced at the primary vertex and those which result from the decay of heavy-flavour quarks (charm, bottom and possibly others) or from the decay of heavy leptons. This paper describes the present state of the VXD project, with particular emphasis on the signal processing procedures which will reduce the 60 million measurements of pixel contents for each event to a manageable level (some tens of kilobytes).

  3. Modelling radiation damage to ESA's Gaia satellite CCDs

    NASA Astrophysics Data System (ADS)

    Seabroke, George; Holland, Andrew; Cropper, Mark

    2008-07-01

    The Gaia satellite is a high-precision astrometry, photometry and spectroscopic ESA cornerstone mission, currently scheduled for launch in late 2011. Its primary science drivers are the composition, formation and evolution of the Galaxy. Gaia will not achieve its scientific requirements without detailed calibration and correction for radiation damage. Microscopic models of Gaia's CCDs are being developed to simulate the effect of radiation damage, charge trapping, which causes charge transfer inefficiency. The key to calculating the probability of a photoelectron being captured by a trap is the 3D electron density within each CCD pixel. However, this has not been physically modelled for Gaia CCD pixels. In this paper, the first of a series, we motivate the need for such specialised 3D device modelling and outline how its future results will fit into Gaia's overall radiation calibration strategy.

  4. Radioactive Quality Evaluation and Cross Validation of Data from the HJ-1A/B Satellites' CCD Sensors

    PubMed Central

    Zhang, Xin; Zhao, Xiang; Liu, Guodong; Kang, Qian; Wu, Donghai

    2013-01-01

    Data from multiple sensors are frequently used in Earth science to gain a more complete understanding of spatial information changes. Higher quality and mutual consistency are prerequisites when multiple sensors are jointly used. The HJ-1A/B satellites were successfully launched on 6 September 2008. There are four charge-coupled device (CCD) sensors with uniform spatial resolutions and spectral ranges onboard the HJ-1A/B satellites. Whether these data maintain mutual consistency is a major issue that must be addressed before they are used. This research aims to evaluate the data consistency and radioactive quality of the four CCDs. First, images of urban, desert, lake and ocean scenes are chosen as the objects of evaluation. Second, objective evaluation variables, such as the mean, variance and angular second moment, are used to quantify image performance. Finally, a cross-validation method is used to assess the correlation between the data from the four HJ-1A/B CCDs and data gathered by the Moderate Resolution Imaging Spectroradiometer (MODIS). The results show that the image quality of the HJ-1A/B CCDs is stable, and the digital number distribution of the CCD data is relatively low. In the cross validation with MODIS, the root mean square errors of bands 1, 2 and 3 range from 0.055 to 0.065, and for band 4 it is 0.101. The data from the HJ-1A/B CCDs show good mutual consistency. PMID:23881127
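
    The band-by-band cross-validation statistic used here is a plain root-mean-square error between matched CCD and MODIS measurements. A minimal sketch with synthetic reflectances; the scatter is chosen arbitrarily and only illustrates the calculation, not the HJ-1A/B results.

        import numpy as np

        def rmse(a, b):
            """Root-mean-square error between two matched samples."""
            a = np.asarray(a, dtype=float)
            b = np.asarray(b, dtype=float)
            return float(np.sqrt(np.mean((a - b) ** 2)))

        # Synthetic band data: CCD values modelled as MODIS values plus small scatter.
        rng = np.random.default_rng(0)
        modis = rng.uniform(0.05, 0.4, size=1000)
        ccd = modis + rng.normal(0.0, 0.06, size=modis.size)
        print(f"RMSE = {rmse(ccd, modis):.3f}")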

  5. Posttraumatic stress symptoms related to community violence and children's diurnal cortisol response in an urban community-dwelling sample.

    PubMed

    Suglia, Shakira Franco; Staudenmayer, John; Cohen, Sheldon; Wright, Rosalind J

    2010-03-01

    While community violence has been linked to psychological morbidity in urban youth, data on the physiological correlates of violence and associated posttraumatic stress symptoms are sparse. We examined the influence of child posttraumatic stress symptoms, reported in relationship to community violence exposure, on diurnal salivary cortisol response in a population-based sample of 28 girls and 15 boys ages 7-13, 54% self-identified as white and 46% as Hispanic. Mothers reported on the child's exposure to community violence using the Survey of Children's Exposure to Community Violence and completed the Checklist of Children's Distress Symptoms (CCDS), which captures factors related to posttraumatic stress; children who were eight years of age or older reported on their own community violence exposure. Saliva samples were obtained from the children four times a day (after awakening, at lunch, at dinner and at bedtime) over three days. Mixed models were used to assess the influence of posttraumatic stress symptoms on cortisol expression, examined as diurnal slope and area under the curve (AUC) calculated across the day, adjusting for socio-demographics. In adjusted analyses, higher scores on total traumatic stress symptoms (CCDS) were associated both with greater cortisol AUC and with a flatter waking-to-bedtime cortisol rhythm. The associations were primarily attributable to differences on the intrusion, arousal and avoidance CCDS subscales. Posttraumatic stress symptomatology reported in response to community violence exposure was associated with diurnal cortisol disruption in these community-dwelling urban children.

  6. Radioactive quality evaluation and cross validation of data from the HJ-1A/B satellites' CCD sensors.

    PubMed

    Zhang, Xin; Zhao, Xiang; Liu, Guodong; Kang, Qian; Wu, Donghai

    2013-07-05

    Data from multiple sensors are frequently used in Earth science to gain a more complete understanding of spatial information changes. Higher quality and mutual consistency are prerequisites when multiple sensors are jointly used. The HJ-1A/B satellites were successfully launched on 6 September 2008. There are four charge-coupled device (CCD) sensors with uniform spatial resolutions and spectral ranges onboard the HJ-1A/B satellites. Whether these data maintain mutual consistency is a major issue that must be addressed before they are used. This research aims to evaluate the data consistency and radioactive quality of the four CCDs. First, images of urban, desert, lake and ocean scenes are chosen as the objects of evaluation. Second, objective evaluation variables, such as the mean, variance and angular second moment, are used to quantify image performance. Finally, a cross-validation method is used to assess the correlation between the data from the four HJ-1A/B CCDs and data gathered by the Moderate Resolution Imaging Spectroradiometer (MODIS). The results show that the image quality of the HJ-1A/B CCDs is stable, and the digital number distribution of the CCD data is relatively low. In the cross validation with MODIS, the root mean square errors of bands 1, 2 and 3 range from 0.055 to 0.065, and for band 4 it is 0.101. The data from the HJ-1A/B CCDs show good mutual consistency.

  7. How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?

    NASA Astrophysics Data System (ADS)

    Scolnic, D.; Kessler, R.; Brout, D.; Cowperthwaite, P. S.; Soares-Santos, M.; Annis, J.; Herner, K.; Chen, H.-Y.; Sako, M.; Doctor, Z.; Butler, R. E.; Palmese, A.; Diehl, H. T.; Frieman, J.; Holz, D. E.; Berger, E.; Chornock, R.; Villar, V. A.; Nicholl, M.; Biswas, R.; Hounsell, R.; Foley, R. J.; Metzger, J.; Rest, A.; García-Bellido, J.; Möller, A.; Nugent, P.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Cunha, C. E.; D’Andrea, C. B.; da Costa, L. N.; Davis, C.; Doel, P.; Drlica-Wagner, A.; Eifler, T. F.; Flaugher, B.; Fosalba, P.; Gaztanaga, E.; Gerdes, D. W.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; Hartley, W. G.; Honscheid, K.; James, D. J.; Johnson, M. W. G.; Johnson, M. D.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Lahav, O.; Li, T. S.; Lima, M.; Maia, M. A. G.; March, M.; Marshall, J. L.; Menanteau, F.; Miquel, R.; Neilsen, E.; Plazas, A. A.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Smith, R. C.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, R. C.; Tucker, D. L.; Walker, A. R.; DES Collaboration

    2018-01-01

    The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z=0.8 for WFIRST, z=0.25 for LSST, and z=0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. More broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.
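
    As an order-of-magnitude check on these numbers, the all-sky ceiling implied by the assumed rate is simply rate x comoving volume x survey time. The sketch below uses a Euclidean volume approximation and a rough distance to z ~ 0.25; real survey yields are far smaller once sky coverage, cadence, depth and light-curve cuts are applied, which is exactly what the simulations above quantify.

        import math

        RATE = 1.0e3        # assumed BNS/KN rate, events per Gpc^3 per year (from the text)
        D_MAX_GPC = 1.05    # rough comoving distance to z ~ 0.25 (LSST KN horizon above)
        T_YEARS = 1.0

        volume_gpc3 = 4.0 / 3.0 * math.pi * D_MAX_GPC ** 3   # Euclidean approximation
        ceiling = RATE * volume_gpc3 * T_YEARS
        print(f"all-sky ceiling ~ {ceiling:.0f} KNe per year inside z ~ 0.25")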

  8. How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scolnic, D.; Kessler, R.; Brout, D.

    The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z=0.8 for WFIRST, z=0.25 for LSST, and z=0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. Finally, more broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.

  9. How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?

    DOE PAGES

    Scolnic, D.; Kessler, R.; Brout, D.; ...

    2017-12-22

    The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z=0.8 for WFIRST, z=0.25 for LSST, and z=0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. Finally, more broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.

  10. A warm Spitzer survey of the LSST/DES 'Deep drilling' fields

    NASA Astrophysics Data System (ADS)

    Lacy, Mark; Farrah, Duncan; Brandt, Niel; Sako, Masao; Richards, Gordon; Norris, Ray; Ridgway, Susan; Afonso, Jose; Brunner, Robert; Clements, Dave; Cooray, Asantha; Covone, Giovanni; D'Andrea, Chris; Dickinson, Mark; Ferguson, Harry; Frieman, Joshua; Gupta, Ravi; Hatziminaoglou, Evanthia; Jarvis, Matt; Kimball, Amy; Lubin, Lori; Mao, Minnie; Marchetti, Lucia; Mauduit, Jean-Christophe; Mei, Simona; Newman, Jeffrey; Nichol, Robert; Oliver, Seb; Perez-Fournon, Ismael; Pierre, Marguerite; Rottgering, Huub; Seymour, Nick; Smail, Ian; Surace, Jason; Thorman, Paul; Vaccari, Mattia; Verma, Aprajita; Wilson, Gillian; Wood-Vasey, Michael; Cane, Rachel; Wechsler, Risa; Martini, Paul; Evrard, August; McMahon, Richard; Borne, Kirk; Capozzi, Diego; Huang, Jiashang; Lagos, Claudia; Lidman, Chris; Maraston, Claudia; Pforr, Janine; Sajina, Anna; Somerville, Rachel; Strauss, Michael; Jones, Kristen; Barkhouse, Wayne; Cooper, Michael; Ballantyne, David; Jagannathan, Preshanth; Murphy, Eric; Pradoni, Isabella; Suntzeff, Nicholas; Covarrubias, Ricardo; Spitler, Lee

    2014-12-01

    We propose a warm Spitzer survey to microJy depth of the four predefined Deep Drilling Fields (DDFs) for the Large Synoptic Survey Telescope (LSST), three of which are also deep drilling fields for the Dark Energy Survey (DES). Imaging these fields with warm Spitzer is a key component of the overall success of these projects, which address the 'Physics of the Universe' theme of the Astro2010 decadal survey. With deep, accurate, near-infrared photometry from Spitzer in the DDFs, we will generate photometric redshift distributions to apply to the surveys as a whole. The DDFs are also the areas where the supernova searches of DES and LSST are concentrated, and deep Spitzer data are essential to obtain photometric redshifts, stellar masses, and constraints on ages and metallicities for the >10,000 supernova host galaxies these surveys will find. This 'DEEPDRILL' survey will also address the 'Cosmic Dawn' goal of Astro2010 by being deep enough to find all the >10^11 solar mass galaxies within the survey area out to z~6. DEEPDRILL will complete the final 24.4 square degrees of imaging in the DDFs, which, when added to the 14 square degrees already imaged to this depth, will map a volume of 1 Gpc^3 at z>2. It will find ~100 galaxies more massive than 10^11 solar masses at z~5 and ~40 protoclusters at z>2, providing targets for JWST that can be found in no other way. The Spitzer data, in conjunction with the multiwavelength surveys in these fields, ranging from X-ray through far-infrared and cm-radio, will comprise a unique legacy dataset for studies of galaxy evolution.

  11. Stellar Populations and Nearby Galaxies with the LSST

    NASA Astrophysics Data System (ADS)

    Saha, Abhijit; Olsen, K.; Monet, D. G.; LSST Stellar Populations Collaboration

    2009-01-01

    The LSST will produce a multi-color map and photometric object catalog of half the sky to r=27.6 (AB mag; 5-sigma). Time-space sampling of each field spanning ten years will allow variability, proper motion, and parallax measurements for objects brighter than r=24.7. As part of providing an unprecedented map of the Galaxy, the accurate multi-band photometry will permit photometric parallaxes, chemical abundances, and a handle on ages via colors at turn-off for main-sequence (MS) stars at all distances within the Galaxy as well as in the Magellanic Clouds and the dwarf satellites of the Milky Way. This will support comprehensive studies of star formation histories and chemical evolution for field stars. The structures of the Clouds and dwarf spheroidals will be traced with the MS stars, to equivalent surface densities fainter than 35 mag per square arcsecond. With geometric parallax accuracy of 1 milliarcsecond, comparable to HIPPARCOS but reaching more than 10 magnitudes fainter, a robust complete sample of solar neighborhood stars will be obtained. The LSST time sampling will identify and characterize variable stars of all types, on time scales from 1 hr to several years, a feast for variable star astrophysics. The combination of wide coverage, multi-band photometry, time sampling, and parallax taken together will address several key problems: e.g., fine-tuning the extragalactic distance scale by examining properties of RR Lyraes and Cepheids as a function of parent population, extending the faint end of the galaxy luminosity function by discovering galaxies through star-count density enhancements on degree scales, and identifying intergalactic stars through novae and Long Period Variables.

  12. Surveying the Inner Solar System with an Infrared Space Telescope

    NASA Astrophysics Data System (ADS)

    Buie, Marc W.; Reitsema, Harold J.; Linfield, Roger P.

    2016-11-01

    We present an analysis of surveying the inner solar system for objects that may pose some threat to Earth. Most of the analysis is based on understanding the capability provided by Sentinel, a concept for an infrared space-based telescope placed in a heliocentric orbit near the distance of Venus. From this analysis, we show that (1) the size range being targeted can affect the survey design, (2) the orbit distribution of the target sample can affect the survey design, (3) minimum observational arc length during the survey is an important metric of survey performance, and (4) surveys must consider objects as small as D = 15-30 m to meet the goal of identifying objects that have the potential to cause damage on Earth in the next 100 yr. Sentinel will be able to find 50% of all impactors larger than 40 m in a 6.5 yr survey. The Sentinel mission concept is shown to be as effective as any survey in finding objects bigger than D = 140 m but is more effective when applied to finding smaller objects on Earth-impacting orbits. Sentinel is also more effective at finding objects of interest for human exploration that benefit from lower propulsion requirements. To explore the interaction between space and ground search programs, we also study a case where Sentinel is combined with the Large Synoptic Survey Telescope (LSST) and show the benefit of placing a space-based observatory in an orbit that reduces the overlap in search regions with a ground-based telescope. In this case, Sentinel+LSST can find more than 70% of the impactors larger than 40 m assuming a 6.5 yr lifetime for Sentinel and 10 yr for LSST.

  13. Wavelength-Dependent PSFs and their Impact on Weak Lensing Measurements

    NASA Astrophysics Data System (ADS)

    Carlsten, S. G.; Strauss, Michael A.; Lupton, Robert H.; Meyers, Joshua E.; Miyazaki, Satoshi

    2018-06-01

    We measure and model the wavelength dependence of the point spread function (PSF) in the Hyper Suprime-Cam Subaru Strategic Program survey. We find that PSF chromaticity is present: redder stars appear smaller than bluer stars in the g, r, and i-bands at the 1-2 per cent level, and in the z and y-bands at the 0.1-0.2 per cent level. From the color dependence of the PSF, we fit a model relating the monochromatic PSF size based on weighted second moments, R, to wavelength, of the form R(λ) ∝ λ^{-b}. We find values of b between 0.2 and 0.5, depending on the epoch and filter. This is consistent with the expectations of a turbulent atmosphere with an outer scale length of ∼10-100 m, indicating that the atmosphere dominates the chromaticity. In the best seeing data, we find that the optical system and detector also contribute some wavelength dependence. Meyers & Burchat (2015b) showed that b must be measured to an accuracy of ∼0.02 in order not to dominate the systematic error budget of the Large Synoptic Survey Telescope (LSST) weak lensing (WL) survey. Using simple image simulations, we find that b can be inferred with this accuracy in the r and i-bands for all positions in the LSST focal plane, assuming a stellar density of 1 star arcmin^{-2} and that the optical component of the PSF can be accurately modeled. Therefore, it is possible to correct for most, if not all, of the bias that the wavelength-dependent PSF will introduce into an LSST-like WL survey.
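
    A minimal sketch of how the chromatic-seeing index b can be estimated from per-star PSF sizes, assuming synthetic data and a simple log-log least-squares fit; this is not the authors' pipeline, and the input value b = 0.3 and the 1% measurement scatter are assumptions for the demo.

      # Fit b in R(lambda) ∝ lambda^(-b) from star sizes and effective wavelengths.
      import numpy as np

      rng = np.random.default_rng(0)
      b_true, r500 = 0.3, 0.7                       # assumed index and size at 500 nm [arcsec]
      wavelengths = rng.uniform(400.0, 900.0, 200)  # effective wavelengths of stars [nm]
      sizes = r500 * (wavelengths / 500.0) ** (-b_true)
      sizes *= 1.0 + 0.01 * rng.standard_normal(sizes.size)   # 1% measurement scatter

      # log R = log A - b * log(lambda): a straight line in log-log space.
      slope, intercept = np.polyfit(np.log(wavelengths), np.log(sizes), 1)
      b_fit = -slope
      print(f"fitted b = {b_fit:.3f} (input {b_true})")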

  14. Volume phase holographic gratings for the Subaru Prime Focus Spectrograph: performance measurements of the prototype grating set

    NASA Astrophysics Data System (ADS)

    Barkhouser, Robert H.; Arns, James; Gunn, James E.

    2014-08-01

    The Prime Focus Spectrograph (PFS) is a major instrument under development for the 8.2 m Subaru telescope on Mauna Kea. Four identical, fixed spectrograph modules are located in a room above one Nasmyth focus. A 55 m fiber optic cable feeds light into the spectrographs from a robotic fiber positioner mounted at the telescope prime focus, behind the wide field corrector developed for Hyper Suprime-Cam. The positioner contains 2400 fibers and covers a 1.3 degree hexagonal field of view. Each spectrograph module will be capable of simultaneously acquiring 600 spectra. The spectrograph optical design consists of a Schmidt collimator, two dichroic beamsplitters to separate the light into three channels, and, for each channel, a volume phase holographic (VPH) grating and a dual-corrector, modified Schmidt reimaging camera. This design provides a 275 mm collimated beam diameter, wide simultaneous wavelength coverage from 380 nm to 1.26 µm, and good imaging performance at the fast f/1.1 focal ratio required from the cameras to avoid oversampling the fibers. The three channels are designated as the blue, red, and near-infrared (NIR), and cover the bandpasses 380-650 nm (blue), 630-970 nm (red), and 0.94-1.26 µm (NIR). A mosaic of two Hamamatsu 2k×4k, 15 µm pixel CCDs records the spectra in the blue and red channels, while the NIR channel employs a 4k×4k, substrate-removed HAWAII-4RG array from Teledyne, with 15 µm pixels and a 1.7 µm wavelength cutoff. VPH gratings have become the dispersing element of choice for moderate-resolution astronomical spectrographs due to their potential for very high diffraction efficiency, low scattered light, and the more compact instrument designs offered by transmissive dispersers. High quality VPH gratings are now routinely being produced in the sizes required for instruments on large telescopes. These factors made VPH gratings an obvious choice for PFS. In order to reduce risk to the project, as well as to fully exploit the performance potential of this technology, a set of three prototype VPH gratings (one each of the blue, red, and NIR designs) was ordered and has recently been delivered. The goal for these prototype units, though not a requirement, was to meet the specifications for the final gratings in order to serve as spares and also as early demonstration and integration articles. In this paper we present the design and specifications for the PFS gratings, the plan and setups used for testing both the prototype and final gratings, and results from recent optical testing of the prototype grating set.

  15. Multiple Sensor Camera for Enhanced Video Capturing

    NASA Astrophysics Data System (ADS)

    Nagahara, Hajime; Kanki, Yoshinori; Iwai, Yoshio; Yachida, Masahiko

    Camera resolution has improved drastically in response to the demand for high-quality digital images. A digital still camera, for example, has several megapixels; a video camera offers a higher frame rate, but its resolution is lower than that of a still camera. Thus, high resolution and high frame rate are incompatible in ordinary cameras on the market. This problem is difficult to solve with a single sensor, since it stems from the physical limitation of the pixel transfer rate. In this paper, we propose a multi-sensor camera for capturing resolution- and frame-rate-enhanced video. A common multi-CCD camera, such as a 3CCD color camera, uses identical CCDs to capture different spectral information. Our approach is to use sensors with different spatio-temporal resolutions in a single camera cabinet to capture higher-resolution and higher-frame-rate information separately. We built a prototype camera that can capture high-resolution (2588×1958 pixels, 3.75 fps) and high-frame-rate (500×500 pixels, 90 fps) videos. We also propose a calibration method for the camera. As one application of the camera, we demonstrate an enhanced video (2128×1952 pixels, 90 fps) generated from the captured videos, showing the utility of the camera.

  16. Gamma Ray Burst Optical Counterpart Search Experiment (GROCSE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, H.S.; Ables, E.; Bionta, R.M.

    GROCSE (Gamma-Ray Optical Counterpart Search Experiments) is a system of automated telescopes that search for simultaneous optical activity associated with gamma ray bursts in response to real-time burst notifications provided by the BATSE/BACODINE network. The first generation system, GROCSE 1, is sensitive down to Mv ~ 8.5 and requires an average of 12 seconds to obtain the first images of the gamma ray burst error box defined by the BACODINE trigger. The collaboration is now constructing a second generation system which has a 4 second slewing time and can reach Mv ~ 14 with a 5 second exposure. GROCSE 2 consists of 4 cameras on a single mount. Each camera views the night sky through a commercial Canon lens (f/1.8, focal length 200 mm) and utilizes a 2K x 2K Loral CCD. Lightweight, low-noise custom readout electronics were designed and fabricated for these CCDs. The total field of view of the 4 cameras is 17.6 x 17.6°. GROCSE II will be operational by the end of 1995. In this paper, the authors present an overview of the GROCSE system and the results of measurements with a GROCSE 2 prototype unit.
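
    The quoted field of view follows from simple lens-detector geometry. The sketch below assumes a 15 µm pixel pitch for the 2K x 2K Loral CCD (a typical value, not stated in the abstract) and recomputes the per-camera and 2x2 mosaic fields of view.

      # Field-of-view geometry for one GROCSE 2 camera: 200 mm lens onto a 2K x 2K CCD.
      import math

      FOCAL_LENGTH_MM = 200.0
      N_PIXELS = 2048
      PIXEL_PITCH_MM = 0.015           # assumed 15 micron pixels

      detector_side_mm = N_PIXELS * PIXEL_PITCH_MM
      fov_per_camera_deg = 2.0 * math.degrees(math.atan(0.5 * detector_side_mm / FOCAL_LENGTH_MM))
      pixel_scale_arcsec = math.degrees(math.atan(PIXEL_PITCH_MM / FOCAL_LENGTH_MM)) * 3600.0

      print(f"per-camera FOV : {fov_per_camera_deg:.1f} deg")
      print(f"2x2 mosaic FOV : {2 * fov_per_camera_deg:.1f} deg on a side")
      print(f"pixel scale    : {pixel_scale_arcsec:.1f} arcsec/pixel")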

  17. New simulation and measurement results on gateable DEPFET devices

    NASA Astrophysics Data System (ADS)

    Bähr, Alexander; Aschauer, Stefan; Hermenau, Katrin; Herrmann, Sven; Lechner, Peter H.; Lutz, Gerhard; Majewski, Petra; Miessner, Danilo; Porro, Matteo; Richter, Rainer H.; Schaller, Gerhard; Sandow, Christian; Schnecke, Martina; Schopper, Florian; Stefanescu, Alexander; Strüder, Lothar; Treis, Johannes

    2012-07-01

    To improve the signal-to-noise level, devices for optical and X-ray astronomy use techniques to suppress background events. Well-known examples are shutters or frame-store Charge Coupled Devices (CCDs). Based on the DEpleted P-channel Field Effect Transistor (DEPFET) principle, a so-called Gateable DEPFET detector can be built. Such devices combine the DEPFET principle with a fast built-in electronic shutter usable for optical and X-ray applications. The DEPFET itself is the basic cell of an active pixel sensor built on a fully depleted bulk. It combines internal amplification, readout on demand, analog storage of the signal charge and a low readout noise with full sensitivity over the whole bulk thickness. A Gateable DEPFET has all these benefits and obviates the need for an external shutter. Two concepts of Gateable DEPFET layouts providing a built-in shutter will be introduced. Furthermore, proof-of-principle measurements for both concepts are presented. Using recently produced prototypes, a shielding of the collection anode of up to 1 × 10^-4 was achieved. As predicted by simulations, an optimized geometry should result in values of 1 × 10^-5 and better. With the switching electronics currently in use, a timing evaluation of the shutter opening and closing yielded rise and fall times of 100 ns.

  18. Properties of tree rings in LSST sensors

    DOE PAGES

    Park, H. Y.; Nomerotski, A.; Tsybychev, D.

    2017-05-30

    Images of uniformly illuminated sensors for the Large Synoptic Survey Telescope have circular periodic patterns with an appearance similar to tree rings. These patterns are caused by circularly symmetric variations of the dopant concentration in the monocrystal silicon boule induced by the manufacturing process. The non-uniform charge density results in a parasitic electric field inside the silicon sensor, which may distort the shapes of astronomical sources. Here, we analyzed data from fifteen LSST sensors produced by ITL to determine the main parameters of the tree rings, their amplitude and period, and their variability across the sensors tested at Brookhaven National Laboratory. The tree-ring pattern has only a weak dependence on wavelength; the ring amplitude decreases at longer wavelengths, since longer wavelengths penetrate deeper into the silicon. The tree-ring amplitude increases toward the outer part of the wafer, from 0.1 to 1.0%, indicating that the resistivity variation is larger at larger radii.
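
    A minimal sketch of one way to extract a tree-ring profile from a flat field: azimuthally average around the (assumed known) wafer center and quote the fractional residual. The flat field, center position, ring amplitude, and period below are synthetic stand-ins, not the ITL sensor data.

      # Azimuthally averaged ring profile from a synthetic flat field.
      import numpy as np

      ny, nx = 512, 512
      cy, cx = -1200.0, 256.0          # assumed wafer-center location in pixel coordinates
      y, x = np.mgrid[0:ny, 0:nx]
      r = np.hypot(y - cy, x - cx)

      # Synthetic flat field: unit illumination plus a 0.5% ring pattern of ~100 px period.
      flat = 1.0 + 0.005 * np.sin(2.0 * np.pi * r / 100.0)

      # Azimuthal average in 1-pixel radial bins around the wafer center.
      r_bin = r.astype(int).ravel()
      counts = np.bincount(r_bin)
      sums = np.bincount(r_bin, weights=flat.ravel())
      valid = counts > 0
      profile = sums[valid] / counts[valid]

      # Quote the ring strength as the rms of the fractional residual about the median level.
      residual = profile / np.median(profile) - 1.0
      print(f"ring amplitude (rms of fractional residual): {100.0 * residual.std():.2f}%")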

  19. Fringing in MonoCam Y4 filter images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooks, J.; Fisher-Levine, M.; Nomerotski, A.

    Here, we study the fringing patterns observed in MonoCam, a camera with a single Large Synoptic Survey Telescope (LSST) CCD sensor. Images were taken at the U.S. Naval Observatory in Flagstaff, Arizona (NOFS) employing its 1.3 m telescope and an LSST y4 filter. Fringing occurs when infrared light (700 nm or longer) reflected from the bottom surface of the CCD constructively or destructively interferes with the incident light to produce a net "fringe" pattern superimposed on all images taken. Emission lines from the atmosphere, dominated by hydroxyl (OH) spectra, can change in their relative intensities as the night goes on, producing different fringe patterns in the images taken. We found through several methods that the general shape of the fringe patterns remained constant, with only slight changes in the amplitude and phase of the fringes. We also found that a superposition of fringes from two monochromatic lines taken in the lab offered a reasonable description of the sky data.
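
    A sketch of the superposition idea, assuming synthetic one-dimensional fringe frames: model the sky fringe as a linear combination of two lab monochromatic fringe frames plus a constant background and solve for the coefficients by least squares. This is illustrative only, not the MonoCam analysis.

      # Fit a "sky" fringe as a mixture of two lab fringe templates.
      import numpy as np

      rng = np.random.default_rng(1)
      npix = 10_000                                    # flattened pixel values
      lab1 = np.sin(np.linspace(0, 40 * np.pi, npix))  # lab fringe frame, line 1
      lab2 = np.sin(np.linspace(0, 53 * np.pi, npix))  # lab fringe frame, line 2

      # "Sky" fringe: an (unknown) mixture of the two lines plus background and noise.
      sky = 0.7 * lab1 + 0.3 * lab2 + 5.0 + 0.05 * rng.standard_normal(npix)

      # Design matrix: [lab1, lab2, constant]; solve for the mixture coefficients.
      A = np.column_stack([lab1, lab2, np.ones(npix)])
      coeffs, *_ = np.linalg.lstsq(A, sky, rcond=None)
      print("fitted (c1, c2, background):", np.round(coeffs, 3))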

  20. scarlet: Source separation in multi-band images by Constrained Matrix Factorization

    NASA Astrophysics Data System (ADS)

    Melchior, Peter; Moolekamp, Fred; Jerdee, Maximilian; Armstrong, Robert; Sun, Ai-Lei; Bosch, James; Lupton, Robert

    2018-03-01

    SCARLET performs source separation (aka "deblending") on multi-band images. It is geared towards optical astronomy, where scenes are composed of stars and galaxies, but it is straightforward to apply it to other imaging data. Separation is achieved through a constrained matrix factorization, which models each source with a Spectral Energy Distribution (SED) and a non-parametric morphology, or multiple such components per source. The code performs forced photometry (with PSF matching if needed) using an optimal weight function given by the signal-to-noise weighted morphology across bands. The approach works well if the sources in the scene have different colors and can be further strengthened by imposing various additional constraints/priors on each source. Because of its generic utility, this package provides a stand-alone implementation that contains the core components of the source separation algorithm. However, the development of this package is part of the LSST Science Pipeline; the meas_deblender package contains a wrapper to implement the algorithms here for the LSST stack.
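
    The toy below illustrates the underlying factorization idea, not the scarlet package itself: a multi-band data matrix is approximated by per-component SEDs times non-negative morphologies using plain multiplicative NMF updates. The synthetic scene, component count, and optimizer are all assumptions; scarlet itself adds PSF matching, priors, and stronger constraints.

      # Toy constrained (non-negative) matrix factorization of a multi-band scene.
      import numpy as np

      rng = np.random.default_rng(2)
      n_bands, n_pix, k = 5, 400, 2

      # Synthetic "truth": two sources with different colors and blended morphologies.
      x = np.arange(n_pix)
      true_sed = np.array([[1.0, 0.8, 0.6, 0.4, 0.2],
                           [0.2, 0.4, 0.6, 0.8, 1.0]]).T               # (bands, k)
      true_morph = np.vstack([np.exp(-0.5 * ((x - 150) / 20.0) ** 2),
                              np.exp(-0.5 * ((x - 230) / 30.0) ** 2)])  # (k, pixels)
      data = true_sed @ true_morph + 0.01 * rng.random((n_bands, n_pix))

      # Multiplicative NMF updates (Lee & Seung) keep SEDs and morphologies non-negative.
      A = rng.random((n_bands, k))   # SEDs
      S = rng.random((k, n_pix))     # morphologies
      eps = 1e-12
      for _ in range(500):
          A *= (data @ S.T) / (A @ S @ S.T + eps)
          S *= (A.T @ data) / (A.T @ A @ S + eps)

      print("residual rms:", np.sqrt(np.mean((data - A @ S) ** 2)))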

  1. Properties of tree rings in LSST sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, H. Y.; Nomerotski, A.; Tsybychev, D.

    Images of uniformly illuminated sensors for the Large Synoptic Survey Telescope have circular periodic patterns with an appearance similar to tree rings. These patterns are caused by circularly symmetric variations of the dopant concentration in the monocrystal silicon boule induced by the manufacturing process. The non-uniform charge density results in a parasitic electric field inside the silicon sensor, which may distort the shapes of astronomical sources. Here, we analyzed data from fifteen LSST sensors produced by ITL to determine the main parameters of the tree rings, their amplitude and period, and their variability across the sensors tested at Brookhaven National Laboratory. The tree-ring pattern has only a weak dependence on wavelength; the ring amplitude decreases at longer wavelengths, since longer wavelengths penetrate deeper into the silicon. The tree-ring amplitude increases toward the outer part of the wafer, from 0.1 to 1.0%, indicating that the resistivity variation is larger at larger radii.

  2. Fringing in MonoCam Y4 filter images

    DOE PAGES

    Brooks, J.; Fisher-Levine, M.; Nomerotski, A.

    2017-05-05

    Here, we study the fringing patterns observed in MonoCam, a camera with a single Large Synoptic Survey Telescope (LSST) CCD sensor. Images were taken at the U.S. Naval Observatory in Flagstaff, Arizona (NOFS) employing its 1.3 m telescope and an LSST y4 filter. Fringing occurs when infrared light (700 nm or longer) reflected from the bottom surface of the CCD constructively or destructively interferes with the incident light to produce a net "fringe" pattern superimposed on all images taken. Emission lines from the atmosphere, dominated by hydroxyl (OH) spectra, can change in their relative intensities as the night goes on, producing different fringe patterns in the images taken. We found through several methods that the general shape of the fringe patterns remained constant, with only slight changes in the amplitude and phase of the fringes. We also found that a superposition of fringes from two monochromatic lines taken in the lab offered a reasonable description of the sky data.

  3. OCTOCAM: A Workhorse Instrument for the Gemini Telescopes During the Era of LSST

    NASA Astrophysics Data System (ADS)

    Roming, Peter; van der Horst, Alexander; OCTOCAM Team

    2018-01-01

    The 2020s are planned to be an era of large surveys and giant telescopes. A trademark of this era will be the large number of interesting objects observed daily by high-cadence surveys such as the LSST. Because of the sheer numbers, only a very small fraction of these interesting objects will be observed with extremely large telescopes. The follow-up workhorses during this era will be the 8-meter class telescopes and corresponding instruments that are prepared to pursue these interesting objects. One such workhorse instrument is OCTOCAM, a highly efficient instrument designed to probe the time-domain window with simultaneous broad-wavelength coverage. OCTOCAM optimizes the use of Gemini for broadband imaging and spectroscopic single-target observations. The instrument is designed for high temporal resolution, broad spectral coverage, and moderate spectral resolution. OCTOCAM was selected as part of the Gemini instrumentation program in early 2017. Here we provide a description of the science cases to be addressed, the overall instrument design, and the current status.

  4. The Large Synoptic Survey Telescope OCS and TCS models

    NASA Astrophysics Data System (ADS)

    Schumacher, German; Delgado, Francisco

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) is a project envisioned as a system of systems with demanding science, technical, and operational requirements that must perform as a fully integrated unit. The design and implementation of such a system poses significant engineering challenges in requirements analysis, detailed interface definition, and studies of operational modes and control strategies. The OMG Systems Modeling Language (SysML) has been selected as the framework for the systems engineering analysis and documentation for the LSST. Models of the overall system architecture and of the different observatory subsystems have been built, describing requirements, structure, interfaces, and behavior. In this paper we show the models for the Observatory Control System (OCS) and the Telescope Control System (TCS), and how this methodology has helped in the clarification of the design and requirements. In one common language, the relationships of the OCS, TCS, Camera, and Data Management subsystems are captured with models of their structure, behavior, requirements, and the traceability between them.

  5. Dead Sea Scrolls

    NASA Technical Reports Server (NTRS)

    1994-01-01

    A consortium of researchers from the Jet Propulsion Laboratory and three other organizations used charge-coupled devices (CCDs) and other imaging enhancement technology to decipher previously unreadable portions of the Dead Sea Scrolls. The technique has potentially important implications for archeology.

  6. chroma: Chromatic effects for LSST weak lensing

    NASA Astrophysics Data System (ADS)

    Meyers, Joshua E.; Burchat, Patricia R.

    2018-04-01

    Chroma investigates biases originating from two chromatic effects in the atmosphere: differential chromatic refraction (DCR), and wavelength dependence of seeing. These biases arise when using the point spread function (PSF) measured with stars to estimate the shapes of galaxies with different spectral energy distributions (SEDs) than the stars.
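
    For concreteness, the sketch below evaluates the differential chromatic refraction between two illustrative effective wavelengths using an approximate Edlén-type refractive-index formula for air at standard conditions (as quoted by Filippenko 1982); it is a back-of-the-envelope estimate, not the chroma code, and the zenith angle and wavelengths are assumptions.

      # Differential chromatic refraction between two effective wavelengths.
      import math

      def n_air_minus_one(lam_um):
          """Approximate (n - 1) for dry air at ~15 C, 1 atm; lam_um in microns."""
          s2 = (1.0 / lam_um) ** 2
          return 1e-6 * (64.328 + 29498.1 / (146.0 - s2) + 255.4 / (41.0 - s2))

      def refraction_arcsec(lam_um, zenith_deg):
          """Plane-parallel atmosphere: R ~ (n - 1) * tan(z), converted to arcsec."""
          return math.degrees(n_air_minus_one(lam_um) * math.tan(math.radians(zenith_deg))) * 3600.0

      z = 45.0                               # illustrative zenith angle [deg]
      blue, red = 0.40, 0.55                 # two effective wavelengths [microns]
      dcr = refraction_arcsec(blue, z) - refraction_arcsec(red, z)
      print(f"DCR between {blue} and {red} microns at z={z} deg: {dcr:.2f} arcsec")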

  7. Guide to Calculating Environmental Benefits from EPA Enforcement Cases

    EPA Pesticide Factsheets

    The “Guide to Calculating Environmental Benefits from EPA Enforcement Cases” establishes a framework for identifying and characterizing environmental benefits that are reported on the CCDS and entered in the Integrated Compliance Information System (ICIS).

  8. BioServe space technologies: A NASA Center for the Commercial Development of Space

    NASA Technical Reports Server (NTRS)

    1992-01-01

    BioServe Space Technologies, a NASA Center for the Commercial Development of Space (CCDS), was established in 1987. As is characteristic of each CCDS designated by NASA, the goals of this commercial center are aimed at stimulating high technology research that takes advantage of the space environment and at leading in the development of new products and services which have commercial potential or that contribute to possible new commercial ventures. BioServe's efforts in these areas focus upon space life science studies and the development of enabling devices that will facilitate ground-based experiments as well as the conversion of such to the microgravity environment. A direct result of BioServe's hardware development and life sciences studies is the training of the next generation of bioengineers who will be knowledgeable and comfortable working with the challenges of the space frontier.

  9. Using ACIS on the Chandra X-ray Observatory as a Particle Radiation Monitor II

    NASA Technical Reports Server (NTRS)

    Grant, C. E.; Ford, P. G.; Bautz, M. W.; ODell, S. L.

    2012-01-01

    The Advanced CCD Imaging Spectrometer (ACIS) is an instrument on the Chandra X-ray Observatory. CCDs are vulnerable to radiation damage, particularly by soft protons in the radiation belts and during solar storms. The Chandra team has implemented procedures to protect ACIS during high-radiation events, including autonomous protection triggered by an on-board radiation monitor. Elevated temperatures have reduced the effectiveness of the on-board monitor. The ACIS team has developed an algorithm which uses data from the CCDs themselves to detect periods of high radiation, and a flight software patch applying this algorithm is currently active on-board the instrument. In this paper, we explore the ACIS response to particle radiation through comparisons to a number of external measures of the radiation environment. We hope to better understand the efficiency of the algorithm as a function of the flux and spectrum of the particles and the time profile of the radiation event.
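
    The abstract does not describe the on-board algorithm in detail, so the following is only a generic stand-in for the idea of using the detector's own event rate to flag high-radiation periods: a sliding-window mean count rate compared against a threshold. The window length, threshold, and rates are invented for the demo.

      # Generic sliding-window count-rate monitor (illustrative only).
      from collections import deque

      def radiation_trigger(count_rates, window=8, threshold=50.0):
          """Yield (index, mean_rate) whenever the windowed mean exceeds threshold."""
          buf = deque(maxlen=window)
          for i, rate in enumerate(count_rates):
              buf.append(rate)
              if len(buf) == window:
                  mean_rate = sum(buf) / window
                  if mean_rate > threshold:
                      yield i, mean_rate

      # Quiet background, then a simulated radiation-belt passage.
      rates = [12.0] * 20 + [80.0] * 10 + [12.0] * 10
      for index, mean_rate in radiation_trigger(rates):
          print(f"frame {index}: windowed mean {mean_rate:.1f} counts/s above threshold")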

  10. EMCCD calibration for astronomical imaging: Wide FastCam at the Telescopio Carlos Sánchez

    NASA Astrophysics Data System (ADS)

    Velasco, S.; Oscoz, A.; López, R. L.; Puga, M.; Pérez-Garrido, A.; Pallé, E.; Ricci, D.; Ayuso, I.; Hernández-Sánchez, M.; Vázquez-Martín, S.; Protasio, C.; Béjar, V.; Truant, N.

    2017-03-01

    The evident benefits of Electron Multiplying CCDs (EMCCDs), namely speed, high sensitivity, low noise, and the capability of detecting single-photon events whilst maintaining high quantum efficiency, are bringing these kinds of detectors to many state-of-the-art astronomical instruments (Velasco et al. 2016; Oscoz et al. 2008). EMCCDs are the perfect answer to the need for great sensitivity levels, as they are not limited by the readout noise of the output amplifier, while conventional CCDs are, even when operated at high readout frame rates. Here we present a quantitative on-sky method to calibrate EMCCD detectors dedicated to astronomical imaging, developed during the commissioning process (Velasco et al. 2016) and first observations (Ricci et al. 2016, in prep.) with Wide FastCam (Marga et al. 2014) at the Telescopio Carlos Sánchez (TCS) in the Observatorio del Teide.

  11. The Zwicky Transient Facility Camera

    NASA Astrophysics Data System (ADS)

    Dekany, Richard; Smith, Roger M.; Belicki, Justin; Delacroix, Alexandre; Duggan, Gina; Feeney, Michael; Hale, David; Kaye, Stephen; Milburn, Jennifer; Murphy, Patrick; Porter, Michael; Reiley, Daniel J.; Riddle, Reed L.; Rodriguez, Hector; Bellm, Eric C.

    2016-08-01

    The Zwicky Transient Facility Camera (ZTFC) is a key element of the ZTF Observing System, the integrated system of optoelectromechanical instrumentation tasked to acquire the wide-field, high-cadence time-domain astronomical data at the heart of the Zwicky Transient Facility. The ZTFC consists of a compact cryostat with large vacuum window protecting a mosaic of 16 large, wafer-scale science CCDs and 4 smaller guide/focus CCDs, a sophisticated vacuum interface board which carries data as electrical signals out of the cryostat, an electromechanical window frame for securing externally inserted optical filter selections, and associated cryo-thermal/vacuum system support elements. The ZTFC provides an instantaneous 47 deg2 field of view, limited by primary mirror vignetting in its Schmidt telescope prime focus configuration. We report here on the design and performance of the ZTF CCD camera cryostat and report results from extensive Joule-Thompson cryocooler tests that may be of broad interest to the instrumentation community.

  12. Report of the ultraviolet and visible sensors panel

    NASA Technical Reports Server (NTRS)

    Timothy, J. Gethyn; Blouke, M.; Bredthauer, R.; Kimble, R.; Lee, T.-H.; Lesser, M.; Siegmund, O.; Weckler, G.

    1991-01-01

    In order to meet the science objectives of the Astrotech 21 mission set, the Ultraviolet (UV) and Visible Sensors Panel made a number of recommendations. In the UV wavelength range of 0.01 to 0.3 µm, the focus is on the need for large-format, high-quantum-efficiency, radiation-hard 'solar-blind' detectors. Options recommended for support include Si and non-Si charge coupled devices (CCDs) as well as photocathodes with improved microchannel plate readouts. For the 0.3 to 0.9 µm range, it was felt that Si CCDs offer the best option for high quantum efficiencies at these wavelengths. In the 0.9 to 2.5 µm range, the panel recommended support for the investigation of monolithic arrays. Finally, the panel noted that the implementation of very large arrays will require new data transmission, data recording, and data handling technologies.

  13. Development and flight testing of UV optimized Photon Counting CCDs

    NASA Astrophysics Data System (ADS)

    Hamden, Erika T.

    2018-06-01

    I will discuss the latest results from the Hamden UV/Vis Detector Lab and our ongoing work using a UV-optimized EMCCD in flight. Our lab is currently testing the efficiency and performance of delta-doped, anti-reflection coated EMCCDs, in collaboration with JPL. The lab has been set up to test quantum efficiency, dark current, clock-induced charge, and read noise. I will describe our improvements to our circuit boards for lower noise, updates from a new, more flexible NUVU controller, and the integration of an EMCCD in the FIREBall-2 UV spectrograph. I will also briefly describe future plans to conduct radiation testing on delta-doped EMCCDs (both warm, unbiased and cold, biased configurations) this summer, and longer-term plans for testing newer photon counting CCDs as I move the HUVD Lab to the University of Arizona in the Fall of 2018.

  14. Unveiling the extreme nature of the hyper faint galaxy Virgo I

    NASA Astrophysics Data System (ADS)

    Crnojevic, Denija

    2017-08-01

    We request HST/ACS imaging to obtain a deep color-magnitude diagram of the newly discovered candidate Milky Way satellite Virgo I. With an estimated absolute magnitude of only M_V ~ -0.8 and a Galactocentric radius of 90 kpc, Virgo I is one of the faintest and most distant dwarfs ever observed, and could be identified as a prototype "hyper" faint galaxy. The detailed characterization of the smallest inhabited dark matter subhalos is crucial to guide hierarchical galaxy formation models, and in particular to constrain reionization, the nature of the dark matter particle, etc. With the advent of deep, wide-field, ground-based surveys, the potential of uncovering these lowest-mass galaxies is quickly turning into reality, as demonstrated by the discovery in the past two years of tens of new Local Group members in the ultra-faint regime (M_V > -8). Virgo I represents a new record in galaxy physical properties, and urges us to be prepared for the likely emergence of an entirely new class of such objects in the era of future wide-field surveys (e.g., LSST). Only high resolution HST observations can enable us to confirm the nature of Virgo I, providing significantly more accurate estimates for its distance and structural properties when compared to the discovery Subaru/HyperSuprimeCam imaging. Our proposed dataset will constitute a fundamental step in the upcoming hunt for galaxies with similarly extreme properties.

  15. “Big Data” Teen Astronomy Cafes at NOAO

    NASA Astrophysics Data System (ADS)

    Pompea, Stephen; Walker, Constance E.

    2018-01-01

    The National Optical Astronomy Observatory has designed and implemented a prototype educational program to test and understand best practices with high school students for promoting an understanding of modern astronomy research, with its emphasis on large data sets, data tools, and visualization tools. This program, designed to cultivate the interest of talented youth in astronomy, is based on a teen science café model developed at Los Alamos as the Café Scientifique New Mexico. In our program, we provide a free, fun way for teens to explore current research topics in astronomy on Saturday mornings at the NOAO headquarters. The program encourages stimulating conversations with astronomers in an informal and relaxed setting, with free food of course. The café is organized through a leadership team of local high school students and recruits students from all parts of the greater Tucson area. The high school students who attend have the opportunity to interact with expert astronomers working with large astronomical data sets on topics such as killer asteroids, the birth and death of stars, colliding galaxies, the structure of the universe, gravitational waves, gravitational lensing, dark energy, and dark matter. The students also have the opportunity to explore astronomical data sets and data tools using computers provided by the program. The program may serve as a model for educational outreach for the 40+ institutions involved in the LSST.

  16. Posttraumatic stress in emergency settings outside North America and Europe: A review of the emic literature

    PubMed Central

    Rasmussen, Andrew; Keatley, Eva; Joscelyne, Amy

    2014-01-01

    Mental health professionals from North America and Europe have become common participants in postconflict and disaster relief efforts outside of North America and Europe. Consistent with their training, these practitioners focus primarily on posttraumatic stress disorder (PTSD) as their primary diagnostic concern. Most research that has accompanied humanitarian aid efforts has likewise originated in North America and Europe, has focused on PTSD, and in turn has reinforced practitioners’ assumptions about the universality of the diagnosis. In contrast, studies that have attempted to identify how local populations conceptualize posttrauma reactions portray a wide range of psychological states. We review this emic literature in order to examine differences and commonalities across local posttraumatic cultural concepts of distress (CCDs). We focus on symptoms to describe these constructs – i.e., using the dominant neo-Kraepelinian approach used in North American and European psychiatry – as opposed to focusing on explanatory models in order to examine whether positive comparisons of PTSD to CCDs meet criteria for face validity. Hierarchical clustering (Ward’s method) of symptoms within CCDs provides a portrait of the emic literature characterized by traumatic multifinality with several common themes. Global variety within the literature suggests that few disaster-affected populations have mental health nosologies that include PTSD-like syndromes. One reason for this seems to be the almost complete absence of avoidance as pathology. Many nosologies contain depression-like disorders. Relief efforts would benefit from mental health practitioners getting specific training in culture-bound posttrauma constructs when entering settings beyond the boundaries of the culture of their training and practice. PMID:24698712
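
    A minimal sketch of the clustering step described above, assuming a made-up binary symptom-by-CCD presence matrix: Ward hierarchical clustering groups symptoms that co-occur across cultural concepts of distress. The symptom list and matrix are illustrative, not the review's data.

      # Ward hierarchical clustering of symptoms by their presence across CCDs.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      symptoms = ["intrusions", "avoidance", "low mood", "somatic pain", "sleep problems"]
      # Rows: symptoms; columns: presence (1) or absence (0) in five hypothetical CCDs.
      presence = np.array([
          [1, 0, 1, 0, 1],
          [0, 0, 1, 0, 0],
          [1, 1, 1, 1, 0],
          [1, 1, 0, 1, 1],
          [1, 1, 1, 1, 1],
      ])

      Z = linkage(presence, method="ward")          # Ward's minimum-variance linkage
      labels = fcluster(Z, t=2, criterion="maxclust")
      for symptom, label in zip(symptoms, labels):
          print(f"{symptom:>15}: cluster {label}")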

  17. Angiotensin II Inhibits the ROMK-like Small Conductance K Channel in Renal Cortical Collecting Duct during Dietary Potassium Restriction*

    PubMed Central

    Wei, Yuan; Zavilowitz, Beth; Satlin, Lisa M.; Wang, Wen-Hui

    2010-01-01

    Base-line urinary potassium secretion in the distal nephron is mediated by small conductance rat outer medullary K (ROMK)-like channels. We used the patch clamp technique applied to split-open cortical collecting ducts (CCDs) isolated from rats fed a normal potassium (NK) or low potassium (LK) diet to test the hypothesis that AngII directly inhibits ROMK channel activity. We found that AngII inhibited ROMK channel activity in LK but not NK rats in a dose-dependent manner. The AngII-induced reduction in channel activity was mediated by AT1 receptor (AT1R) binding, because pretreatment of CCDs with losartan but not PD123319 AT1 and AT2 receptor antagonists, respectively, blocked the response. Pretreatment of CCDs with U73122 and calphostin C, inhibitors of phospholipase C (PLC) and protein kinase C (PKC), respectively, abolished the AngII-induced decrease in ROMK channel activity, confirming a role of the PLC-PKC pathway in this response. Studies by others suggest that AngII stimulates an Src family protein-tyrosine kinase (PTK) via PKC-NADPH oxidase. PTK has been shown to regulate the ROMK channel. Inhibition of NADPH oxidase with diphenyliodonium abolished the inhibitory effect of AngII or the PKC activator phorbol 12-myristate 13-acetate on ROMK channels. Suppression of PTK by herbimycin A significantly attenuated the inhibitory effect of AngII on ROMK channel activity. We conclude that AngII inhibits ROMK channel activity through PKC-, NADPH oxidase-, and PTK-dependent pathways under conditions of dietary potassium restriction. PMID:17194699

  18. The Matrix Rib Plating System: improving aesthetic outcomes in microvascular breast reconstruction.

    PubMed

    Ahdoot, Michael A; Echo, Anthony; Otake, Leo R; Son, Ji; Zeidler, Kamakshi R; Saadian, Isaac; Lee, Gordon K

    2013-04-01

    During microvascular breast reconstruction, exposure of the internal mammary vessels (IMVs) is facilitated by the removal of a portion of the rib, occasionally resulting in chest contour deformity (CCD). The use of rib plating may reduce CCD and reduce postoperative pain. All patients underwent microvascular breast reconstruction using IMVs. In the retrospective arm, photographs were assessed by a blinded reviewer for CCDs. In the prospective cohort, patients were randomized to rib plating with the Synthes Matrix Rib Plating System or no rib plating. Postoperatively, patients were assessed for CCD and pain. In the retrospective arm, 11 of 98 (11.2%) patients, representing 12 of 130 (9.2%) breast reconstructions, had a noticeable contour deformity. The average body mass index (BMI) of patients with CCDs was 26.6 kg/m². In the prospective arm, there was a 16% (3 of 19) rate of visible and palpable CCDs among controls, compared to a 0% rate of palpable and visible contour deformity in the rib plating group. Pain was decreased in the rib plating group on all postoperative days. The pain reduction was statistically significant at rest by postoperative day 30. The majority of patients (9 of 11) with compromised aesthetic outcomes had a BMI less than 30 kg/m², suggesting that a paucity of overlying soft tissue contributed to the visibility of these bony defects. Rib plating prevented chest contour deformity, reduced postoperative pain, and added limited additional morbidity. We believe that rib plating is a safe, useful adjunct to microvascular breast reconstruction using IMVs, as it may improve aesthetic outcomes and reduce postoperative pain.

  19. Angiotensin II inhibits the ROMK-like small conductance K channel in renal cortical collecting duct during dietary potassium restriction.

    PubMed

    Wei, Yuan; Zavilowitz, Beth; Satlin, Lisa M; Wang, Wen-Hui

    2007-03-02

    Base-line urinary potassium secretion in the distal nephron is mediated by small conductance rat outer medullary K (ROMK)-like channels. We used the patch clamp technique applied to split-open cortical collecting ducts (CCDs) isolated from rats fed a normal potassium (NK) or low potassium (LK) diet to test the hypothesis that AngII directly inhibits ROMK channel activity. We found that AngII inhibited ROMK channel activity in LK but not NK rats in a dose-dependent manner. The AngII-induced reduction in channel activity was mediated by AT1 receptor (AT1R) binding, because pretreatment of CCDs with losartan but not PD123319 AT1 and AT2 receptor antagonists, respectively, blocked the response. Pretreatment of CCDs with U73122 and calphostin C, inhibitors of phospholipase C (PLC) and protein kinase C (PKC), respectively, abolished the AngII-induced decrease in ROMK channel activity, confirming a role of the PLC-PKC pathway in this response. Studies by others suggest that AngII stimulates an Src family protein-tyrosine kinase (PTK) via PKC-NADPH oxidase. PTK has been shown to regulate the ROMK channel. Inhibition of NADPH oxidase with diphenyliodonium abolished the inhibitory effect of AngII or the PKC activator phorbol 12-myristate 13-acetate on ROMK channels. Suppression of PTK by herbimycin A significantly attenuated the inhibitory effect of AngII on ROMK channel activity. We conclude that AngII inhibits ROMK channel activity through PKC-, NADPH oxidase-, and PTK-dependent pathways under conditions of dietary potassium restriction.

  20. Wood-Vasey DOE #SC0011834 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood-Vasey, William Michael

    During the past reporting period (Year 3), this grant has provided partial support for graduate students Daniel Perrefort and Kara Ponder. They have been exploring different aspects of the technical work needed to take full advantage of the potential for cosmological inference using Type Ia supernovae (SNeIa) with LSST.

  1. The simulated spectrum of the OGRE X-ray EM-CCD camera system

    NASA Astrophysics Data System (ADS)

    Lewis, M.; Soman, M.; Holland, A.; Lumb, D.; Tutt, J.; McEntaffer, R.; Schultz, T.; Holland, K.

    2017-12-01

    The X-ray astronomical telescopes in use today, such as Chandra and XMM-Newton, use X-ray grating spectrometers to probe the high-energy physics of the Universe. These instruments typically use reflective optics to focus light onto gratings that disperse incident X-rays across a detector, often a Charge-Coupled Device (CCD). The X-ray energy is determined from the position at which it was detected on the CCD. Improved technology for the next generation of X-ray grating spectrometers has been developed and will be tested on a sounding rocket experiment known as the Off-plane Grating Rocket Experiment (OGRE). OGRE aims to capture the highest resolution soft X-ray spectrum of Capella, a well-known astronomical X-ray source, during an observation period lasting between 3 and 6 minutes, whilst proving the performance and suitability of three key components. These three components consist of a telescope made from silicon mirrors, gold-coated silicon X-ray diffraction gratings, and a camera comprising four Electron-Multiplying (EM)-CCDs arranged to observe the soft X-rays dispersed by the gratings. EM-CCDs have an architecture similar to standard CCDs, with the addition of an EM gain register where the electron signal is amplified so that the effective signal-to-noise ratio of the imager is improved. The devices also have highly favourable quantum efficiency values for detecting soft X-ray photons. On OGRE, this improved detector performance allows for easier identification of low energy X-rays and fast readouts, since the amplified signal charge makes readout noise almost negligible. A simulation that applies the OGRE instrument performance to the Capella soft X-ray spectrum has been developed, allowing the distribution of X-rays onto the EM-CCDs to be predicted. A proposed optical model is also discussed which would give the mission's minimum-success photon-count requirement a high chance of being met with the shortest possible observation time. These results are compared to a Chandra observation to show the overall effectiveness of the new technologies. The current optical module is shown to narrowly meet the minimum success conditions, whilst the proposed model comfortably demonstrates the effectiveness of the technologies if a larger effective area is provided.
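
    The benefit of the EM gain register can be seen in a simple signal-to-noise comparison. The sketch below assumes representative values for read noise, gain, and the multiplication excess-noise factor rather than OGRE's measured parameters.

      # Per-pixel SNR: conventional CCD readout vs. EM-CCD readout for a faint signal.
      import math

      def snr(signal_e, read_noise_e, em_gain=1.0, excess_noise=1.0, dark_e=0.0):
          """SNR with shot noise, dark current, and gain-suppressed read noise."""
          shot_var = excess_noise ** 2 * (signal_e + dark_e)
          read_var = (read_noise_e / em_gain) ** 2
          return signal_e / math.sqrt(shot_var + read_var)

      signal = 5.0                               # electrons per pixel from a faint source
      print("conventional CCD :", round(snr(signal, read_noise_e=8.0), 2))
      print("EM-CCD (G=1000)  :", round(snr(signal, read_noise_e=8.0,
                                            em_gain=1000.0, excess_noise=math.sqrt(2.0)), 2))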

  2. Rochester scientist discovers new comet with Dark Energy Camera (DECam) at

    Science.gov Websites

    David Cameron, a visiting scientist in Eric Mamajek's research group in the Department of

  3. LSST camera grid structure made out of ceramic composite material, HB-Cesic

    NASA Astrophysics Data System (ADS)

    Kroedel, Matthias R.; Langton, J. Bryan

    2016-08-01

    In this paper we present the ceramic design and fabrication of the camera grid structure, which exploits the unique manufacturing features of the HB-Cesic technology together with a dedicated metrology device in order to ensure the challenging flatness requirement of 4 microns over the full array.

  4. Deriving photometric redshifts using fuzzy archetypes and self-organizing maps - II. Implementation

    NASA Astrophysics Data System (ADS)

    Speagle, Joshua S.; Eisenstein, Daniel J.

    2017-07-01

    With an eye towards the computational requirements of future large-scale surveys such as Euclid and Large Synoptic Survey Telescope (LSST) that will require photometric redshifts (photo-z's) for ≳ 10^9 objects, we investigate a variety of ways that 'fuzzy archetypes' can be used to improve photometric redshifts and explore their respective statistical interpretations. We characterize their relative performance using an idealized LSST ugrizY and Euclid YJH mock catalogue of 10 000 objects spanning z = 0-6 at Y = 24 mag. We find most schemes are able to robustly identify redshift probability distribution functions that are multimodal and/or poorly constrained. Once these objects are flagged and removed, the results are generally in good agreement with the strict accuracy requirements necessary to meet Euclid weak lensing goals for most redshifts between 0.8 ≲ z ≲ 2. These results demonstrate the statistical robustness and flexibility that can be gained by combining template-fitting and machine-learning methods and provide useful insights into how astronomers can further exploit the colour-redshift relation.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rasmussen, Andrew P.; Hale, Layton; Kim, Peter

    Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 µm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions is presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.
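
    A highly simplified sketch of the stitching idea, assuming piston-only sensor offsets and a handful of synthetic differential measurements: the relative heights are recovered by linear least squares with one sensor pinned as reference. The real method also handles tip/tilt and far denser metrology data.

      # Recover per-sensor piston offsets from differential height measurements.
      import numpy as np

      n_sensors = 4
      true_piston = np.array([0.0, 3.0, -2.0, 5.0])      # microns, sensor 0 as reference

      # Differential measurements: (sensor_i, sensor_j, measured piston_j - piston_i).
      rng = np.random.default_rng(3)
      pairs = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
      meas = [true_piston[j] - true_piston[i] + 0.1 * rng.standard_normal() for i, j in pairs]

      # Build the design matrix, pinning sensor 0 to zero to remove the global offset.
      A = np.zeros((len(pairs) + 1, n_sensors))
      b = np.zeros(len(pairs) + 1)
      for row, ((i, j), m) in enumerate(zip(pairs, meas)):
          A[row, i], A[row, j], b[row] = -1.0, 1.0, m
      A[-1, 0] = 1.0                                     # constraint: piston_0 = 0

      solution, *_ = np.linalg.lstsq(A, b, rcond=None)
      print("recovered pistons [um]:", np.round(solution, 2))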

  6. Molecular cloning and characterization of cDNAs encoding carotenoid cleavage dioxygenase in bitter melon (Momordica charantia).

    PubMed

    Tuan, Pham Anh; Park, Sang Un

    2013-01-01

    Carotenoid cleavage dioxygenases (CCDs) are a family of enzymes that catalyze the oxidative cleavage of carotenoids at various chain positions to form a broad spectrum of apocarotenoids, including aromatic substances, pigments, and phytohormones. Using the rapid amplification of cDNA ends (RACE) PCR method, we isolated three cDNAs encoding CCDs (McCCD1, McCCD4, and McNCED) from Momordica charantia. Amino acid sequence alignments showed that they share high sequence identity with other orthologous genes. Quantitative real-time RT-PCR (reverse transcriptase PCR) analysis revealed that the expression of McCCD1 and McCCD4 was highest in flowers, and lowest in roots and old leaves (O-leaves). During fruit maturation, the two genes displayed differential expression, with McCCD1 peaking at mid-stage maturation while McCCD4 showed its lowest expression at that stage. The mRNA expression level of McNCED, a key enzyme involved in abscisic acid (ABA) biosynthesis, was high during fruit maturation and further increased at the beginning of seed germination. When first-leaf stage plants of M. charantia were exposed to dehydration stress, McNCED mRNA expression was induced primarily in the leaves and, to a lesser extent, in roots and stems. McNCED expression was also induced by high temperature and salinity, while treatment with exogenous ABA led to a decrease. These results should be helpful in determining the substrates and cleavage sites catalyzed by CCD genes in M. charantia, and also in defining the roles of CCDs in growth and development and in the plant's response to environmental stress.

  7. Deconstructing the traditional Japanese medicine "Kampo": compounds, metabolites and pharmacological profile of maoto, a remedy for flu-like symptoms.

    PubMed

    Nishi, Akinori; Ohbuchi, Katsuya; Kushida, Hirotaka; Matsumoto, Takashi; Lee, Keiko; Kuroki, Haruo; Nabeshima, Shigeki; Shimobori, Chika; Komokata, Nagisa; Kanno, Hitomi; Tsuchiya, Naoko; Zushi, Makoto; Hattori, Tomohisa; Yamamoto, Masahiro; Kase, Yoshio; Matsuoka, Yukiko; Kitano, Hiroaki

    2017-01-01

    Pharmacological activities of the traditional Japanese herbal medicine (Kampo) are putatively mediated by complex interactions between multiple herbal compounds and host factors, which are difficult to characterize via the reductive approach of purifying major bioactive compounds and elucidating their mechanisms by conventional pharmacology. Here, we performed comprehensive compound, pharmacological and metabolomic analyses of maoto, a pharmaceutical-grade Kampo prescribed for flu-like symptoms, in normal and polyI:C-injected rats, the latter suffering from acute inflammation via Toll-like receptor 3 activation. In total, 352 chemical composition-determined compounds (CCDs) were detected in maoto extract by mass spectrometric analysis. After maoto treatment, 113 CCDs were newly detected in rat plasma. Of these CCDs, 19 were present in maoto extract, while 94 were presumed to be metabolites generated from maoto compounds or endogenous substances such as phospholipids. At the phenotypic level, maoto ameliorated the polyI:C-induced decrease in locomotor activity and body weight; however, body weight was not affected by individual maoto components in isolation. In accordance with symptom relief, maoto suppressed TNF-α and IL-1β, increased IL-10, and altered endogenous metabolites related to sympathetic activation and energy expenditure. Furthermore, maoto decreased inflammatory prostaglandins and leukotrienes, and increased anti-inflammatory eicosapentaenoic acid and hydroxyl-eicosapentaenoic acids, suggesting that it has differential effects on eicosanoid metabolic pathways involving cyclooxygenases, lipoxygenases and cytochrome P450s. Collectively, these data indicate that extensive profiling of compounds, metabolites and pharmacological phenotypes is essential for elucidating the mechanisms of herbal medicines, whose vast array of constituents induce a wide range of changes in xenobiotic and endogenous metabolism.

  8. UV-sensitive scientific CCD image sensors

    NASA Astrophysics Data System (ADS)

    Vishnevsky, Grigory I.; Kossov, Vladimir G.; Iblyaminova, A. F.; Lazovsky, Leonid Y.; Vydrevitch, Michail G.

    1997-06-01

    The interaction of probe laser irradiation with substances contained in the environment has long been a recognized technique for contamination detection and identification. For this purpose, near- and mid-range-IR laser irradiation is traditionally used. However, as many works presented at recent ecology-monitoring conferences show, systems using laser irradiation in the near-UV range (250 - 500 nm) are growing rapidly in addition to the traditional ones. The use of CCD imagers is one of the prerequisites for this, allowing the development of multi-channel computer-based spectral research systems. To identify and analyze contaminating impurities in the environment, methods such as laser fluorescence analysis, UV absorption and differential spectroscopy, and Raman scattering are commonly used. These methods are used to identify a large number of impurities (petrol, toluene, xylene isomers, SO2, acetone, methanol), to detect and identify food pathogens in real time, to measure concentrations of NH3, SO2 and NO in combustion outbursts, to detect oil products in water, to analyze contaminations in ground waters, to determine the ozone distribution in the atmospheric profile, and to monitor various chemical processes including radioactive materials manufacturing, heterogeneous catalytic reactions, polymer production, etc. A multi-element image sensor with enhanced UV sensitivity, low optical non-uniformity, low intrinsic noise, and high dynamic range is a key element of all the above systems. Thus, so-called Virtual Phase (VP) CCDs, which possess all these features, seem promising for ecology-monitoring spectral measuring systems. Presently, a family of VP CCDs with different architectures and numbers of pixels has been developed and is being manufactured. All CCDs from this family are supported by a precise slow-scan digital image acquisition system that can be used in various image processing systems in astronomy, biology, medicine, ecology, etc. Images are displayed directly on a PC monitor via supporting software.

  9. Eliminating Health Care Disparities With Mandatory Clinical Decision Support: The Venous Thromboembolism (VTE) Example.

    PubMed

    Lau, Brandyn D; Haider, Adil H; Streiff, Michael B; Lehmann, Christoph U; Kraus, Peggy S; Hobson, Deborah B; Kraenzlin, Franca S; Zeidan, Amer M; Pronovost, Peter J; Haut, Elliott R

    2015-01-01

    All hospitalized patients should be assessed for venous thromboembolism (VTE) risk factors and prescribed appropriate prophylaxis. To improve best-practice VTE prophylaxis prescription for all hospitalized patients, we implemented a mandatory computerized clinical decision support (CCDS) tool. The tool requires completion of checklists to evaluate VTE risk factors and contraindications to pharmacological prophylaxis, and then recommends the risk-appropriate VTE prophylaxis regimen. The objective of the study was to examine the effect of a quality improvement intervention on race-based and sex-based health care disparities across 2 distinct clinical services. This was a retrospective cohort study of a quality improvement intervention. The study included 1942 hospitalized medical patients and 1599 hospitalized adult trauma patients. In this study, the proportion of patients prescribed risk-appropriate, best-practice VTE prophylaxis was evaluated. Racial disparities existed in prescription of best-practice VTE prophylaxis in the preimplementation period between black and white patients on both the trauma (70.1% vs. 56.6%, P=0.025) and medicine (69.5% vs. 61.7%, P=0.015) services. After implementation of the CCDS tool, compliance improved for all patients, and disparities in best-practice prophylaxis prescription between black and white patients were eliminated on both services: trauma (84.5% vs. 85.5%, P=0.99) and medicine (91.8% vs. 88.0%, P=0.082). Similar findings were noted for sex disparities in the trauma cohort. Despite the fact that risk-appropriate prophylaxis should be prescribed equally to all hospitalized patients regardless of race and sex, practice varied widely before our quality improvement intervention. Our CCDS tool eliminated racial disparities in VTE prophylaxis prescription across 2 distinct clinical services. Health information technology approaches to care standardization are effective to eliminate health care disparities.

  10. Effect of aldosterone on BK channel expression in mammalian cortical collecting duct

    PubMed Central

    Estilo, Genevieve; Liu, Wen; Pastor-Soler, Nuria; Mitchell, Phillip; Carattino, Marcelo D.; Kleyman, Thomas R.; Satlin, Lisa M.

    2008-01-01

    Apical large-conductance Ca2+-activated K+ (BK) channels in the cortical collecting duct (CCD) mediate flow-stimulated K+ secretion. Dietary K+ loading for 10–14 days leads to an increase in BK channel mRNA abundance, enhanced flow-stimulated K+ secretion in microperfused CCDs, and a redistribution of immunodetectable channels from an intracellular pool to the apical membrane (Najjar F, Zhou H, Morimoto T, Bruns JB, Li HS, Liu W, Kleyman TR, Satlin LM. Am J Physiol Renal Physiol 289: F922–F932, 2005). To test whether this adaptation was mediated by a K+-induced increase in aldosterone, New Zealand White rabbits were fed a low-Na+ (LS) or high-Na+ (HS) diet for 7–10 days to alter circulating levels of aldosterone but not serum K+ concentration. Single CCDs were isolated for quantitation of BK channel subunit (total, α-splice variants, β-isoforms) mRNA abundance by real-time PCR and measurement of net transepithelial Na+ (JNa) and K+ (JK) transport by microperfusion; kidneys were processed for immunolocalization of BK α-subunit by immunofluorescence microscopy. At the time of death, LS rabbits excreted no urinary Na+ and had higher circulating levels of aldosterone than HS animals. The relative abundance of BK α-, β2-, and β4-subunit mRNA and localization of immunodetectable α-subunit were similar in CCDs from LS and HS animals. In response to an increase in tubular flow rate from ∼1 to 5 nl·min−1·mm−1, the increase in JNa was greater in LS vs. HS rabbits, yet the flow-stimulated increase in JK was similar in both groups. These data suggest that aldosterone does not contribute to the regulation of BK channel expression/activity in response to dietary K+ loading. PMID:18579708

  11. The Binary Offset Effect in CCDs: an Anomalous Readout Artifact Affecting Most Astronomical CCDs in Use

    NASA Astrophysics Data System (ADS)

    Boone, Kyle Robert; Aldering, Gregory; Copin, Yannick; Dixon, Samantha; Domagalski, Rachel; Gangler, Emmanuel; Pecontal, Emmanuel; Perlmutter, Saul; Nearby Supernova Factory Collaboration

    2018-01-01

    We discovered an anomalous behavior of CCD readout electronics that affects their use in many astronomical applications, which we call the “binary offset effect”. Due to feedback in the readout electronics, an offset is introduced in the values read out for each pixel that depends on the binary encoding of the previously read-out pixel values. One consequence of this effect is that a pathological local background offset can be introduced in images that only appears where science data are present on the CCD. The amplitude of this introduced offset does not scale monotonically with the amplitude of the objects in the image, and can be up to 4.5 ADU per pixel for certain instruments. Additionally, this background offset will be shifted by several pixels from the science data, potentially distorting the shape of objects in the image. We tested 22 instruments for signs of the binary offset effect and found evidence of it in 16 of them, including LRIS and DEIMOS on the Keck telescopes, WFC3-UVIS and STIS on HST, MegaCam on CFHT, SNIFS on the UH88 telescope, GMOS on the Gemini telescopes, HSC on Subaru, and FORS on VLT. A large amount of archival data is therefore affected by the binary offset effect, and conventional methods of reducing CCD images do not measure or remove the introduced offsets. As a demonstration of how to correct for the binary offset effect, we have developed a model that can accurately predict and remove the introduced offsets for the SNIFS instrument on the UH88 telescope. Accounting for the binary offset effect is essential for precision low-count astronomical observations with CCDs.
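    As a rough illustration of the kind of data-driven correction the abstract describes (this is not the authors' published SNIFS model), the sketch below estimates, from bias/overscan rows, the mean offset as a function of the bit pattern of the previously read pixel and subtracts it from a science row. The function names, the choice of keying only on the low bits, and the one-pixel shift are simplifying assumptions.

    import numpy as np

    def build_offset_table(blank_rows, shift=1, key_bits=6):
        # Mean residual offset keyed on the low `key_bits` bits of the pixel read
        # `shift` pixels earlier, estimated from bias/overscan rows (read-out
        # order runs along axis 1).
        blank_rows = np.asarray(blank_rows, dtype=np.int64)
        resid = blank_rows - np.median(blank_rows)            # residual around the bias level
        key = blank_rows[:, :-shift] & ((1 << key_bits) - 1)  # bit pattern of the earlier pixel
        val = resid[:, shift:]
        table = np.zeros(1 << key_bits)
        for k in range(1 << key_bits):
            sel = key == k
            if sel.any():
                table[k] = val[sel].mean()
        return table

    def correct_row(row, table, shift=1, key_bits=6):
        # Subtract the estimated binary-offset contribution from one read-out row.
        row = np.asarray(row, dtype=np.int64)
        out = row.astype(float)
        out[shift:] -= table[row[:-shift] & ((1 << key_bits) - 1)]
        return out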

  12. Using model based systems engineering for the development of the Large Synoptic Survey Telescope's operational plan

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Willman, Beth; Petravick, Don; Johnson, Margaret; Reil, Kevin; Marshall, Stuart; Thomas, Sandrine; Lotz, Paul; Schumacher, German; Lim, Kian-Tat; Jenness, Tim; Jacoby, Suzanne; Emmons, Ben; Axelrod, Tim

    2016-08-01

    We provide an overview of the Model Based Systems Engineering (MBSE) language, tool, and methodology being used in our development of the Operational Plan for Large Synoptic Survey Telescope (LSST) operations. LSST's Systems Engineering (SE) team is using a model-based approach to operational plan development to: 1) capture the top-down stakeholders' needs and functional allocations defining the scope, required tasks, and personnel needed for operations, and 2) capture the bottom-up operations and maintenance activities required to conduct the LSST survey across its distributed operations sites for the full ten-year survey duration. To accomplish these complementary goals and ensure that they result in self-consistent results, we have developed a holistic approach using the Sparx Enterprise Architect modeling tool and the Systems Modeling Language (SysML). This approach utilizes SysML Use Cases, Actors, associated relationships, and Activity Diagrams to document and refine all of the major operations and maintenance activities that will be required to successfully operate the observatory and meet stakeholder expectations. We have developed several customized extensions of the SysML language, including a custom stereotyped Use Case element with unique tagged values, as well as unique association connectors and Actor stereotypes. We demonstrate that this customized MBSE methodology enables us to define: 1) the roles each human Actor must take on to successfully carry out the activities associated with the Use Cases; 2) the skills each Actor must possess; 3) the functional allocation of all required stakeholder activities and Use Cases to the organizational entities tasked with carrying them out; and 4) the organizational structure required to successfully execute the operational survey. Our approach allows for continual refinement, utilizing the systems engineering spiral method to expose finer levels of detail as necessary. For example, the bottom-up, Use Case-driven approach will be deployed in the future to develop the detailed work procedures required to successfully execute each operational activity.

  13. Arcus end-to-end simulations

    NASA Astrophysics Data System (ADS)

    Wilms, Joern; Guenther, H. Moritz; Dauser, Thomas; Huenemoerder, David P.; Ptak, Andrew; Smith, Randall; Arcus Team

    2018-01-01

    We present an overview of the end-to-end simulation environment that we are implementing as part of the Arcus phase A study. With the Arcus simulator, we aim to model the imaging, detection, and event reconstruction properties of the spectrometer. The simulator uses a Monte Carlo ray-trace approach, projecting photons onto the Arcus focal plane from the silicon pore optic mirrors and critical-angle transmission gratings. We simulate the detection and read-out of the photons in the focal plane CCDs with software originally written for the eROSITA and Athena-WFI detectors; we include all relevant detector physics, such as charge splitting, and effects of the detector read-out, such as out-of-time events. The output of the simulation chain is an event list that closely resembles the data expected during flight. This event list is processed using a prototype event reconstruction chain for the order separation, wavelength calibration, and effective area calibration. The output is compatible with standard X-ray astronomical analysis software. During phase A, the end-to-end simulation approach is used to demonstrate the overall performance of the mission, including a full simulation of the calibration effort. Continued development during later phases of the mission will ensure that the simulator remains a faithful representation of the true mission capabilities, and will ultimately be used as the Arcus calibration model.

  14. Lower Boundary Forcing related to the Occurrence of Rain in the Tropical Western Pacific

    NASA Astrophysics Data System (ADS)

    Li, Y.; Carbone, R. E.

    2013-12-01

    Global weather and climate models have a long and somewhat tortured history with respect to simulation and prediction of tropical rainfall in the relative absence of balanced flow in the geostrophic sense. An important correlate with tropical rainfall is sea surface temperature (SST). The introduction of SST information to convective rainfall parameterization in global models has improved model climatologies of tropical oceanic rainfall. Nevertheless, large systematic errors have persisted, several of which are common to most atmospheric models. Models have evolved to the point where increased spatial resolution demands representation of the SST field at compatible temporal and spatial scales, leading to common usage of monthly SST fields at scales of 10-100 km. While large systematic errors persist, significant skill has been realized from various atmospheric and coupled ocean models, including assimilation of weekly or even daily SST fields, as tested by the European Centre for Medium-Range Weather Forecasts. A few investigators have explored the role of SST gradients in relation to the occurrence of precipitation. Some of this research has focused on large-scale gradients, mainly associated with surface ocean-atmosphere climatology. These studies conclude that lower-boundary atmospheric convergence, under some conditions, could be substantially enhanced over SST gradients, destabilizing the atmosphere and thereby enabling moist convection. While the concept has a firm theoretical foundation, it has not gained a sizeable following beyond the realm of western boundary currents. Li and Carbone (2012) examined the role of transient mesoscale (~100 km) SST gradients in the western Pacific warm pool by means of GHRSST and CMORPH rainfall data. They found that excitation of deep moist convection was strongly associated with the Laplacian of SST (LSST). Specifically, -LSST is associated with rainfall onset in 75% of 10,000 events over 4 years, whereas the background ocean is symmetric about zero Laplacian. This finding is fully consistent with theory for gradients of order ~1 °C in low mean wind conditions, capable of inducing atmospheric convergence of N × 10⁻⁵ s⁻¹. We will present new findings resulting from the application of a Madden-Julian oscillation (MJO) passband filter to GHRSST/CMORPH data. It shows that the -LSST field organizes at scales of 1000-2000 km and can persist for periods of two weeks to three months. Such -LSST anomalies are in quadrature with MJO rainfall, tracking and leading the wet phase of the MJO by 10-14 days, from the Indian Ocean to the dateline. More generally, an evaluation of SST structure in rainfall production will be presented, which represents a decidedly alternative view to conventional wisdom. Li, Yanping, and R.E. Carbone, 2012: Excitation of Rainfall over the Tropical Western Pacific, J. Atmos. Sci., 69, 2983-2994.
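    A minimal sketch of how the -LSST diagnostic above can be computed from a gridded SST field is given below; the 25 km grid spacing and the periodic boundary handling of np.roll are illustrative assumptions, not the actual GHRSST processing.

    import numpy as np

    def neg_laplacian_sst(sst, dx_km):
        # Negative five-point finite-difference Laplacian of a gridded SST field
        # (degC per km^2); np.roll wraps at the edges, which is adequate away from them.
        lap = (np.roll(sst, 1, axis=0) + np.roll(sst, -1, axis=0) +
               np.roll(sst, 1, axis=1) + np.roll(sst, -1, axis=1) - 4.0 * sst) / dx_km**2
        return -lap

    # Regions favourable to convective onset in the sense of the abstract are where
    # -LSST > 0, i.e. local SST maxima/ridges that can focus low-level convergence:
    # favourable = neg_laplacian_sst(sst_grid, dx_km=25.0) > 0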

  15. Dual-energy micro-CT with a dual-layer, dual-color, single-crystal scintillator.

    PubMed

    Maier, Daniel Simon; Schock, Jonathan; Pfeiffer, Franz

    2017-03-20

    A wide range of X-ray imaging applications demand micrometer spatial resolution. In material science and biology especially, there is a great interest in material determination and material separation methods. Here we present a new detector design that allows the recording of a low- and a high-energy radiography image simultaneously with micrometer spatial resolution. The detector system is composed of a layered scintillator stack, two CCDs and an optical system to image the scintillator responses onto the CCDs. We used the detector system with a standard laboratory microfocus X-ray tube to prove the working principle of the system and derive important design characteristics. With the recorded and registered dual-energy data set, the material separation and determination could be shown at an X-ray tube peak energy of up to 160 keV with a spatial resolution of 12 μm. The detector design shows a great potential for further development and a wide range of possible applications.

  16. The 2-d CCD Data Reduction Cookbook

    NASA Astrophysics Data System (ADS)

    Davenhall, A. C.; Privett, G. J.; Taylor, M. B.

    This cookbook presents simple recipes and scripts for reducing direct images acquired with optical CCD detectors. Using these recipes and scripts you can correct un-processed images obtained from CCDs for various instrumental effects to retrieve an accurate picture of the field of sky observed. The recipes and scripts use standard software available at all Starlink sites. The topics covered include: creating and applying bias and flat-field corrections, registering frames and creating a stack or mosaic of registered frames. Related auxiliary tasks, such as converting between different data formats, displaying images and calculating image statistics are also presented. In addition to the recipes and scripts, sufficient background material is presented to explain the procedures and techniques used. The treatment is deliberately practical rather than theoretical, in keeping with the aim of providing advice on the actual reduction of observations. Additional material outlines some of the differences between using conventional optical CCDs and the similar arrays used to observe at infrared wavelengths.
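    The cookbook's recipes are built on Starlink tasks; purely to illustrate the arithmetic behind the bias and flat-field corrections it describes, a minimal numpy sketch (the function name and the simple median combination are assumptions) might look like this:

    import numpy as np

    def reduce_frame(raw, bias_frames, flat_frames):
        # Subtract a master bias and divide by a normalised master flat.
        # Trimming, dark subtraction and cosmic-ray rejection are omitted.
        master_bias = np.median(np.stack(bias_frames), axis=0)
        debiased_flats = [np.asarray(f, dtype=float) - master_bias for f in flat_frames]
        master_flat = np.median(np.stack(debiased_flats), axis=0)
        master_flat /= np.median(master_flat)      # normalise the flat to unity
        return (np.asarray(raw, dtype=float) - master_bias) / master_flat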

  17. Effects of space-radiation damage and temperature on CCD noise for the Lyman FUSE mission

    NASA Astrophysics Data System (ADS)

    Murowinski, Richard G.; Gao, Linzhuang; Deen, Mohamed J.

    1993-09-01

    Charge coupled device (CCD) imaging arrays are becoming more frequently used in space vehicles and equipment, especially space-based astronomical telescopes. It is important to understand the effects of radiation on a CCD so that its performance degradation during mission lifetime can be predicted, and so that methods to prevent unacceptable performance degradation can be found. Much recent work by various groups has focused on the problems surrounding the loss of charge transfer efficiency and the increase in dark current and dark current spikes in CCDs. The use of a CCD as the fine error sensor in the Lyman Far Ultraviolet Spectroscopic Explorer (FUSE) is limited by its noise performance. In this work we attempt to understand some of the factors surrounding the noise degradation due to radiation in a space environment. Later, we demonstrate how low frequency noise can be used as a characterization tool for studying proton radiation damage in CCDs.

  18. Comparison of Detector Technologies for CAPS

    NASA Technical Reports Server (NTRS)

    Stockum, Jana L.

    2005-01-01

    In this paper, several different detectors are examined for use in a Comet/Asteroid Protection System (CAPS), a conceptual study for a possible future space-based system. Each detector will be examined for its future (25 years or more in the future) ability to find and track near-Earth objects (NEOs) from a space-based detection platform. Within the CAPS study are several teams of people who each focus on different aspects of the system concept. This study's focus is on detection devices. In particular, evaluations of the following devices have been made: charge-coupled devices (CCDs), charge-injected devices (CIDs), superconducting tunneling junctions (STJs), and transition edge sensors (TESs). These devices can be separated into two main categories: the first category includes detectors that are currently being widely utilized, such as CCDs and CIDs; the second category includes experimental detectors, such as STJs and TESs. After the discussion of the detectors themselves, there will be a section devoted to the explicit use of these detectors with CAPS.

  19. Does metacognitive strategy instruction improve impaired receptive cognitive-communication skills following acquired brain injury?

    PubMed

    Copley, Anna; Smith, Kathleen; Savill, Katelyn; Finch, Emma

    2015-01-01

    To investigate if metacognitive strategy instruction (MSI) improves the receptive language skills of adults with cognitive-communication disorders secondary to acquired brain injury (ABI). An ABA intervention programme was implemented with eight adults with ABI, aged 25-70 years. The Measure of Cognitive-Linguistic Abilities (MCLA) was administered at baseline and following treatment. The treatment employed in this study involved three components: individual goal-based therapy, group remediation therapy using self-instruction and home practice. No receptive language sub-tests of the MCLA reached statistical significance. However, participants' raw score improvements in receptive language sub-tests indicated that MSI may be effective at remediating CCDs following ABI. Preliminary findings indicate that MSI may be effective in improving receptive language skills in adults with CCDs following ABI. Further research involving a more rigorous study, a larger sample size and a more reliable outcome measure is necessary and may provide statistically significant evidence for the effectiveness of MSI for remediating receptive language disorders.

  20. NASA Tech Briefs, April 2004

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Topics covered include: Analysis of SSEM Sensor Data Using BEAM; Hairlike Percutaneous Photochemical Sensors; Video Guidance Sensors Using Remotely Activated Targets; Simulating Remote Sensing Systems; EHW Approach to Temperature Compensation of Electronics; Polymorphic Electronic Circuits; Micro-Tubular Fuel Cells; Whispering-Gallery-Mode Tunable Narrow-Band-Pass Filter; PVM Wrapper; Simulation of Hyperspectral Images; Algorithm for Controlling a Centrifugal Compressor; Hybrid Inflatable Pressure Vessel; Double-Acting, Locking Carabiners; Position Sensor Integral with a Linear Actuator; Improved Electromagnetic Brake; Flow Straightener for a Rotating-Drum Liquid Separator; Sensory-Feedback Exoskeletal Arm Controller; Active Suppression of Instabilities in Engine Combustors; Fabrication of Robust, Flat, Thinned, UV-Imaging CCDs; Chemical Thinning Process for Fabricating UV-Imaging CCDs; Pseudoslit Spectrometer; Waste-Heat-Driven Cooling Using Complex Compound Sorbents; Improved Refractometer for Measuring Temperatures of Drops; Semiconductor Lasers Containing Quantum Wells in Junctions; Phytoplankton-Fluorescence-Lifetime Vertical Profiler; Hexagonal Pixels and Indexing Scheme for Binary Images; Finding Minimum-Power Broadcast Trees for Wireless Networks; and Automation of Design Engineering Processes.

  1. Design and development of a fiber optic TDI CCD-based slot-scan digital mammography system

    NASA Astrophysics Data System (ADS)

    Toker, Emre; Piccaro, Michele F.

    1993-12-01

    We previously reported on the development, design, and clinical evaluation of a CCD-based, high performance, filmless imaging system for stereotactic needle biopsy procedures in mammography. The MammoVision system has a limited imaging area of 50 mm X 50 mm, since it is designed specifically for breast biopsy applications. We are currently developing a new filmless imaging system designed to cover the 18 cm X 24 cm imaging area required for screening and diagnostic mammography. The diagnostic mammography system is based on four 1100 X 330 pixel format, full-frame, scientific grade, front illuminated, MPP mode CCDs, with 24 micrometer X 24 micrometer square pixels. Each CCD is coupled to an x-ray intensifying screen via a 1.7:1 fiber optic reducer. The detector assembly (180 mm long and 13.5 mm wide) is scanned across the patient's breast synchronously with the x-ray source, with the CCDs operated in time-delay integration (TDI) mode. The total scan time is 4.0 seconds.

  2. Direct Detection and Imaging of Low-Energy Electrons with Delta-Doped Charge-Coupled Devices

    NASA Technical Reports Server (NTRS)

    Nikzad, S.; Yu, Q.; Smith, A. L.; Jones, T. J.; Tombrello, T. A.; Elliott, S. T.

    1998-01-01

    We report the use of delta-doped charge-coupled devices (CCDs) for direct detection of electrons in the 50-1500 eV energy range. These are the first measurements with a solid state device to detect electrons in this energy range.

  3. CCDs in the Mechanics Lab--A Competitive Alternative? (Part I).

    ERIC Educational Resources Information Center

    Pinto, Fabrizio

    1995-01-01

    Reports on the implementation of a relatively low-cost, versatile, and intuitive system to teach basic mechanics based on the use of a Charge-Coupled Device (CCD) camera and inexpensive image-processing and analysis software. Discusses strengths and limitations of CCD imaging technologies. (JRH)

  4. Focal plane alignment and detector characterization for the Subaru prime focus spectrograph

    NASA Astrophysics Data System (ADS)

    Hart, Murdock; Barkhouser, Robert H.; Carr, Michael; Golebiowski, Mirek; Gunn, James E.; Hope, Stephen C.; Smee, Stephen A.

    2014-07-01

    We describe the infrastructure being developed to align and characterize the detectors for the Subaru Measurement of Images and Redshifts (SuMIRe) Prime Focus Spectrograph (PFS). PFS will employ four three-channel spectrographs with an operating wavelength range of 3800 Å to 12600 Å. Each spectrograph will be comprised of two visible channels and one near infrared (NIR) channel, where each channel will use a separate Schmidt camera to image the captured spectra onto their respective detectors. In the visible channels, Hamamatsu 2k × 4k CCDs will be mounted in pairs to create a single 4k × 4k detector, while the NIR channel will use a single Teledyne 4k × 4k H4RG HgCdTe device. The fast f/1.1 optics of the Schmidt cameras will give a shallow depth of focus, necessitating an optimization of the focal plane array flatness. The minimum departure from flatness of the focal plane array for the visible channels is set by the CCD flatness, typically 10 μm peak-to-valley. We will adjust the coplanarity for a pair of CCDs such that the flatness of the array is consistent with the flatness of the detectors themselves. To achieve this we will use an optical non-contact measurement system to measure surface flatness and coplanarity at both ambient and operating temperatures, and use shims to adjust the coplanarity of the CCDs. We will characterize the performance of the detectors for PFS consistent with the scientific goals of the project. To this end we will measure the gain, linearity, full well, quantum efficiency (QE), charge diffusion, charge transfer inefficiency (CTI), and noise properties of these devices. We also wish to better understand the non-linearity of the photon transfer curve for the CCDs, and the charge persistence/reciprocity problems of the HgCdTe devices. To enable the metrology and characterization of these detectors we are building two test cryostats nearly identical in design. The first test cryostat will primarily be used for the coplanarity measurements and sub-pixel illumination testing, and the second will be dedicated to performance characterization requiring flat-field illumination. In this paper we describe the design of the test cryostats. We also describe the system we have built for measuring focal plane array flatness, and examine the precision and error with which it operates. Finally, we detail the methods by which we plan to characterize the performance of the detectors for PFS, and provide preliminary results.
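    As an illustration of the flatness metric involved (this is not the PFS metrology software), the sketch below fits a best-fit plane to a set of non-contact height measurements and reports the peak-to-valley departure from it together with the fitted tilt, which is the quantity the shimming would be chosen to remove; the inputs are assumed to be 1D arrays of measured positions and heights.

    import numpy as np

    def flatness_and_tilt(x, y, z):
        # Fit the plane z ~ a*x + b*y + c by least squares and return the
        # peak-to-valley departure from it plus the coefficients (a, b, c).
        x, y, z = map(np.asarray, (x, y, z))
        A = np.column_stack([x, y, np.ones_like(x)])
        coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
        resid = z - A @ coeffs
        return resid.max() - resid.min(), coeffs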

  5. The Great Firewall of China: A Critical Analysis

    DTIC Science & Technology

    2008-06-01

    wall was constructed twenty five foot high, twenty foot wide and over 4,000 miles long when complete (Asimov, 1998). The wall was constructed... Asimov, I. (1998). Construction of the great wall. Retrieved May 20, 2008, from Great Wall Web site: http://www.ccds.charlotte.nc.us/History/China/save

  6. The Career Development Needs of Rural Elementary School Students

    ERIC Educational Resources Information Center

    Wood, Chris; Kaszubowski, Yvonne

    2008-01-01

    This exploratory study investigated the career development needs of 150 fourth-grade students from 2 rural school districts in the Midwestern United States. The Childhood Career Development Scale (CCDS) was administered in 6 classrooms at 2 elementary schools to assess Donald Super's 9 dimensions (information, curiosity, exploration, interests,…

  7. CMOS Active Pixel Sensors for Low Power, Highly Miniaturized Imaging Systems

    NASA Technical Reports Server (NTRS)

    Fossum, Eric R.

    1996-01-01

    The complementary metal-oxide-semiconductor (CMOS) active pixel sensor (APS) technology has been developed over the past three years by NASA at the Jet Propulsion Laboratory, and has reached a level of performance comparable to CCDs with greatly increased functionality but at a very reduced power level.

  8. The detection of soft X-rays with charged coupled detectors

    NASA Technical Reports Server (NTRS)

    Burstein, P.; Davis, John M.

    1989-01-01

    The characteristics of an ideal soft X-ray imaging detector are enumerated. Of recent technical developments, the CCD, or charge-coupled device, goes furthest toward meeting these requirements. Several properties of CCDs are described with reference to experimental work, and their application to practical instruments is reviewed.

  9. Validating Phasing and Geometry of Large Focal Plane Arrays

    NASA Technical Reports Server (NTRS)

    Standley, Shaun P.; Gautier, Thomas N.; Caldwell, Douglas A.; Rabbette, Maura

    2011-01-01

    The Kepler Mission is designed to survey our region of the Milky Way galaxy to discover hundreds of Earth-sized and smaller planets in or near the habitable zone. The Kepler photometer is an array of 42 CCDs (charge-coupled devices) in the focal plane of a 95-cm Schmidt camera onboard the Kepler spacecraft. Each 50x25-mm CCD has 2,200 x 1,024 pixels. The CCDs accumulate photons and are read out every six seconds to prevent saturation. The data is integrated for 30 minutes, and then the pixel data is transferred to onboard storage. The data is subsequently encoded and transmitted to the ground. During End-to-End Information System (EEIS) testing of the Kepler Mission System (KMS), there was a need to verify that the pixels requested by the science team operationally were correctly collected, encoded, compressed, stored, and transmitted by the FS, and subsequently received, decoded, uncompressed, and displayed by the Ground Segment (GS) without the outputs of any CCD modules being flipped, mirrored, or otherwise corrupted during the extensive FS and GS processing. This would normally be done by projecting an image on the focal plane array (FPA), collecting the data in a flight-like way, and making a comparison between the original data and the data reconstructed by the science data system. Projecting a focused image onto the FPA through the telescope would normally involve using a collimator suspended over the telescope opening. There were several problems with this approach: the collimation equipment is elaborate and expensive; as conceived, it could only illuminate a limited section of the FPA (.25 percent) during a given test; the telescope cover would have to be deployed during testing to allow the image to be projected into the telescope; the equipment was bulky and difficult to situate in temperature-controlled environments; and given all the above, test setup, execution, and repeatability were significant concerns. Instead of using this complicated approach of projecting an optical image on the FPA, the Kepler project developed a method using known defect features in the CCDs to verify proper collection and reassembly of the pixels, thereby avoiding the costs and risks of the optical projection approach. The CCDs composing the Kepler FPA, as all CCDs, had minor defects. At ambient temperature, some pixels look far brighter than they should. These "hot" pixels have a higher rate of charge leakage than the others due to manufacturing variations. They are usually stable over time, and appear at temperatures above 5 °C. The hot pixels on the Kepler FPA were mapped before photometer assembly during module testing. Selected hot pixels were used as target "stars" for the purposes of EEIS testing. "Dead" pixels are permanently off, producing a permanently black pixel. These can also be used if there is some illumination of the FPA. During EEIS testing, Dark Current Full Frame Images (FFIs) taken at room temperature were used to create the hot pixel maps for all 84 Kepler photometer CCD channels. Data from two separate nights were used to create two hot pixel maps per channel, which were cross-correlated to remove cosmic ray events which appear to be hot pixels. These hot pixel maps obtained during EEIS testing were compared to the maps made during module testing to verify that the end-to-end data flow was correct.
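    The two-night cross-check described above can be illustrated with a short sketch (the actual flight thresholds and procedures are not reproduced here; the 5-sigma cut and the function names are assumptions): a pixel is kept as "hot" only if it is flagged in the dark frames from both nights, since cosmic-ray hits do not repeat at the same location.

    import numpy as np

    def hot_pixel_map(dark_ffi, nsigma=5.0):
        # Flag pixels whose dark signal is a strong outlier relative to a robust
        # (median/MAD) estimate over the frame.
        med = np.median(dark_ffi)
        mad = 1.4826 * np.median(np.abs(dark_ffi - med))
        return dark_ffi > med + nsigma * mad

    def stable_hot_pixels(dark_night1, dark_night2, nsigma=5.0):
        # Intersecting the two nightly maps removes cosmic-ray events that
        # masquerade as hot pixels on a single night.
        return hot_pixel_map(dark_night1, nsigma) & hot_pixel_map(dark_night2, nsigma)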

  10. DESCQA: Synthetic Sky Catalog Validation Framework

    NASA Astrophysics Data System (ADS)

    Mao, Yao-Yuan; Uram, Thomas D.; Zhou, Rongpu; Kovacs, Eve; Ricker, Paul M.; Kalmbach, J. Bryce; Padilla, Nelson; Lanusse, François; Zu, Ying; Tenneti, Ananth; Vikraman, Vinu; DeRose, Joseph

    2018-04-01

    The DESCQA framework provides rigorous validation protocols for assessing the quality of simulated sky catalogs in a straightforward and comprehensive way. DESCQA enables the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. An interactive web interface is also available at portal.nersc.gov/project/lsst/descqa.

  11. Final acceptance testing of the LSST monolithic primary/tertiary mirror

    NASA Astrophysics Data System (ADS)

    Tuell, Michael T.; Burge, James H.; Cuerden, Brian; Gressler, William; Martin, Hubert M.; West, Steven C.; Zhao, Chunyu

    2014-07-01

    The Large Synoptic Survey Telescope (LSST) is a three-mirror wide-field survey telescope with the primary and tertiary mirrors on one monolithic substrate. This substrate is made of Ohara E6 borosilicate glass in a honeycomb sandwich, spin cast at the Steward Observatory Mirror Lab at The University of Arizona. Each surface is aspheric, with the specification given in terms of conic constant error, maximum active bending forces, and finally a structure function specification on the residual errors. High-order deformation terms are present, but since they carry no separate tolerance, any such error is treated as a surface error and is included in the structure function. The radii of curvature are very different, requiring two independent test stations, each with an instantaneous phase-shifting interferometer and a null corrector. The primary null corrector is a standard two-element Offner null lens. The tertiary null corrector is a phase-etched computer-generated hologram (CGH). This paper details the two optical systems and their tolerances, showing that the uncertainty in measuring the figure is a small fraction of the structure function specification. Additional metrology includes the radii of curvature, optical axis locations, and relative surface tilts. The methods for measuring these will also be described along with their tolerances.

  12. On the Detectability of Interstellar Objects Like 1I/'Oumuamua

    NASA Astrophysics Data System (ADS)

    Ragozzine, Darin

    2018-04-01

    Almost since Oort's 1950 hypothesis of a tenuously bound cloud of comets, planetary formation theorists have realized that the process of planet formation must have ejected very large numbers of planetesimals into interstellar space. Unfortunately, these objects are distributed over galactic volumes, while they are only likely to be detectable if they pass within a few AU of Earth, resulting in an incredibly sparse detectable population. Furthermore, hypotheses for the formation and distribution of these bodies allow for uncertainties of orders of magnitude in the expected detection rate: our analysis suggested LSST would discover 0.01-100 objects during its lifetime (Cook et al. 2016). The discovery of 1I/'Oumuamua by a survey less powerful than LSST indicates either a low-probability event and/or that the properties of this population are on the more favorable end of the spectrum. We revisit the detailed detection analysis of Cook et al. 2016 in light of the detection of 1I/'Oumuamua. We use these results to better understand 1I/'Oumuamua and to update our assessment of future detections of interstellar objects. We highlight some key questions that can be answered only by additional discoveries.

  13. Enrollment Forecasting: A Report of the National Dissemination Project for the Community Colleges.

    ERIC Educational Resources Information Center

    Maxie, Francoise

    A systems approach, a multiparameter stochastic model, that will project State vocational education needs is being developed in Washington State by cooperation of the Coordinating Council for Occupational Education with Dr. Samuel Cleff. The model has incorporated the Cleff Career Development Systems (CCDS), a job matching system used for…

  14. Kepler Fine Guidance Sensor Data

    NASA Technical Reports Server (NTRS)

    Van Cleve, Jeffrey; Campbell, Jennifer Roseanna

    2017-01-01

    The Kepler and K2 missions collected Fine Guidance Sensor (FGS) data in addition to the science data, as discussed in the Kepler Instrument Handbook (KIH, Van Cleve and Caldwell 2016). The FGS CCDs are frame transfer devices (KIH Table 7) located in the corners of the Kepler focal plane (KIH Figure 24), which are read out 10 times every second. The FGS data are being made available to the user community for scientific analysis as flux and centroid time series, along with a limited number of FGS full frame images which may be useful for constructing a World Coordinate System (WCS) or otherwise putting the time series data in context. This document will describe the data content and file format, and give example MATLAB scripts to read the time series. There are three file types delivered as the FGS data: 1. Flux and Centroid (FLC) data: time series of star signal and centroid data; 2. Ancillary FGS Reference (AFR) data: catalog of information about the observed stars in the FLC data; 3. FGS Full-Frame Image (FGI) data: full-frame image snapshots of the FGS CCDs.

  15. Earth's Minimoons: Opportunities for Science and Technology.

    NASA Astrophysics Data System (ADS)

    Jedicke, Robert; Bolin, Bryce T.; Bottke, William F.; Chyba, Monique; Fedorets, Grigori; Granvik, Mikael; Jones, Lynne; Urrutxua, Hodei

    2018-05-01

    Twelve years ago the Catalina Sky Survey discovered Earth's first known natural geocentric object other than the Moon, a few-meter diameter asteroid designated 2006 RH120. Despite significant improvements in ground-based asteroid surveying technology in the past decade, surveys have not discovered another temporarily-captured orbiter (TCO; colloquially known as a minimoon), but the all-sky fireball system operated in the Czech Republic as part of the European Fireball Network detected a bright natural meteor that was almost certainly in a geocentric orbit before it struck Earth's atmosphere. Within a few years the Large Synoptic Survey Telescope (LSST) will either begin to regularly detect TCOs or force a re-analysis of the creation and dynamical evolution of small asteroids in the inner solar system. The first studies of the provenance, properties, and dynamics of Earth's minimoons suggested that there should be a steady-state population with about one 1- to 2-meter diameter captured object at any time, with the number of captured meteoroids increasing exponentially for smaller sizes. That model was then improved and extended to include the population of temporarily-captured flybys (TCFs), objects that fail to make an entire revolution around Earth while energetically bound to the Earth-Moon system. Several different techniques for discovering TCOs have been considered, but their small diameters, proximity, and rapid motion make them challenging targets for existing ground-based optical, meteor, and radar surveys. However, the LSST's tremendous light gathering power and short exposure times could allow it to detect and discover many minimoons. We expect that if the TCO population is confirmed, and new objects are frequently discovered, they can provide new opportunities for 1) studying the dynamics of the Earth-Moon system, 2) testing models of the production and dynamical evolution of small asteroids from the asteroid belt, 3) rapid and frequent low delta-v missions to multiple minimoons, and 4) evaluating in-situ resource utilization techniques on asteroidal material. Here we review the past decade of minimoon studies in preparation for capitalizing on the scientific and commercial opportunities of TCOs in the first decade of LSST operations.

  16. Testing fully depleted CCD

    NASA Astrophysics Data System (ADS)

    Casas, Ricard; Cardiel-Sas, Laia; Castander, Francisco J.; Jiménez, Jorge; de Vicente, Juan

    2014-08-01

    The focal plane of the PAU camera is composed of eighteen 2K x 4K CCDs. These devices, plus four spares, were provided by the Japanese company Hamamatsu Photonics K.K. with type no. S10892-04(X). These detectors are 200 μm thick, fully depleted and back illuminated, with an n-type silicon base. They have been built with a specific coating to be sensitive in the range from 300 to 1,100 nm. Their square pixel size is 15 μm. The read-out system consists of a Monsoon controller (NOAO) and the panVIEW software package. The default CCD read-out speed is 133 kpixel/s. This is the value used in the calibration process. Before installing these devices in the camera focal plane, they were characterized using the facilities of the ICE (CSIC-IEEC) and IFAE on the UAB Campus in Bellaterra (Barcelona, Catalonia, Spain). The basic tests performed for all CCDs were to obtain the photon transfer curve (PTC), the charge transfer efficiency (CTE) using X-rays and the EPER method, linearity, read-out noise, dark current, persistence, cosmetics and quantum efficiency. The X-ray images were also used for the analysis of the charge diffusion for different substrate voltages (VSUB). Regarding the cosmetics, and in addition to white and dark pixels, some patterns were also found. The first one, which appears in all devices, is the presence of half circles at the external edges. The origin of this pattern may be related to the assembly process. A second one appears in the dark images and shows bright arcs connecting corners along the vertical axis of the CCD. This feature appears in all CCDs in exactly the same position, so our guess is that the pattern is due to electrical fields. Finally, and in just two devices, there is a spot with a wavelength dependence whose origin could be a defective coating process.
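    As an aside on the photon transfer curve mentioned above, the gain in e-/ADU can be estimated from pairs of flat fields taken over a range of exposure levels; the sketch below shows the basic arithmetic only and is not the PAU calibration pipeline (the function name and the simple linear fit are assumptions).

    import numpy as np

    def ptc_gain(flat_pairs, bias_level=0.0):
        # Photon-transfer relation: variance ~ signal/gain + read-noise term.
        # Differencing each equal-exposure pair removes fixed-pattern noise and
        # doubles the shot-noise variance, hence the factor of 2.
        signals, variances = [], []
        for f1, f2 in flat_pairs:
            f1 = np.asarray(f1, dtype=float)
            f2 = np.asarray(f2, dtype=float)
            signals.append(0.5 * (f1.mean() + f2.mean()) - bias_level)
            variances.append(np.var(f1 - f2) / 2.0)
        slope, _ = np.polyfit(signals, variances, 1)   # slope = 1/gain
        return 1.0 / slope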

  17. Oxidative stress mediates through apoptosis the anticancer effect of phospho-nonsteroidal anti-inflammatory drugs: implications for the role of oxidative stress in the action of anticancer agents.

    PubMed

    Sun, Yu; Huang, Liqun; Mackenzie, Gerardo G; Rigas, Basil

    2011-09-01

    We assessed the relationship between oxidative stress, cytokinetic parameters, and tumor growth in response to novel phospho-nonsteroidal anti-inflammatory drugs (NSAIDs), agents with significant anticancer effects in preclinical models. Compared with controls, in SW480 colon and MCF-7 breast cancer cells, phospho-sulindac, phospho-aspirin, phospho-flurbiprofen, and phospho-ibuprofen (P-I) increased the levels of reactive oxygen and nitrogen species (RONS) and decreased GSH levels and thioredoxin reductase activity, whereas the conventional chemotherapeutic drugs (CCDs), 5-fluorouracil (5-FU), irinotecan, oxaliplatin, chlorambucil, paclitaxel, and vincristine, did not. In both cell lines, phospho-NSAIDs induced apoptosis and inhibited cell proliferation much more potently than CCDs. We then treated nude mice bearing SW480 xenografts with P-I or 5-FU that had an opposite effect on RONS in vitro. Compared with controls, P-I markedly suppressed xenograft growth, induced apoptosis in the xenografts (8.9 ± 2.7 versus 19.5 ± 3.0), inhibited cell proliferation (52.6 ± 5.58 versus 25.8 ± 7.71), and increased urinary F2-isoprostane levels (10.7 ± 3.3 versus 17.9 ± 2.2 ng/mg creatinine, a marker of oxidative stress); all differences were statistically significant. 5-FU's effects on tumor growth, apoptosis, proliferation, and F2-isoprostane were not statistically significant. F2-isoprostane levels correlated with the induction of apoptosis and the inhibition of cell growth. P-I induced oxidative stress only in the tumors, and its apoptotic effect was restricted to xenografts. Our data show that phospho-NSAIDs act against cancer through a mechanism distinct from that of various CCDs, underscore the critical role of oxidative stress in their effect, and indicate that pathways leading to oxidative stress may be useful targets for anticancer strategies.

  18. Predicting Chandra CCD Degradation with the Chandra Radiation Model

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Blackwell, William C.; DePasquale, Joseph M.; Grant, Catherine E.; O'Dell, Stephen L.; Plucinsky, Paul P.; Schwartz, Daniel A.; Spitzbart, Bradley D.; Wolk, Scott J.

    2008-01-01

    Not long after the launch of the Chandra X-Ray Observatory, it was discovered that the Advanced CCD Imaging Spectrometer (ACIS) detector was rapidly degrading due to radiation. Analysis by Chandra personnel showed that this degradation was due to low energy protons (100 - 200 keV) that scattered down the optical path onto the focal plane. In response to this unexpected problem, the Chandra team developed a radiation-protection program that has been used to manage the radiation damage to the CCDs. This program consists of multiple approaches - scheduled safing of the ACIS detector from the radiation environment during passage through the radiation belts, real-time monitoring of space weather conditions, on-board monitoring of radiation environment levels, and the creation of a radiation environment model for use in computing proton flux and fluence at energies that damage the ACIS detector. This radiation mitigation program has been very successful. The initial precipitous increase in the CCDs' charge transfer inefficiency (CTI) resulting from proton damage has been slowed dramatically, with the front-illuminated CCDs having an increase in CTI of only 2.3% per year, allowing the ACIS detector's expected lifetime to exceed requirements. This paper concentrates on one aspect of the Chandra radiation mitigation program, the creation of the Chandra Radiation Model (CRM). Because of Chandra's highly elliptical orbit, the spacecraft spends most of its time outside of the trapped radiation belts that present the severest risks to the ACIS detector. However, there is still a proton flux environment that must be accounted for in all parts of Chandra's orbit. At the time of Chandra's launch there was no engineering model of the radiation environment that could be used in the outer regions of the spacecraft's orbit, so the CRM was developed to provide the flux environment of 100 - 200 keV protons in the outer magnetosphere, magnetosheath, and solar wind regions of geospace. This presentation describes the CRM, its role in Chandra operations, and its prediction of the ACIS CTI increase.

  19. ATP Releasing Connexin 30 Hemichannels Mediate Flow-Induced Calcium Signaling in the Collecting Duct

    PubMed Central

    Svenningsen, Per; Burford, James L.; Peti-Peterdi, János

    2013-01-01

    ATP in the renal tubular fluid is an important regulator of salt and water reabsorption via purinergic calcium signaling that involves the P2Y2 receptor, ENaC, and AQP2. Recently, we have shown that connexin (Cx) 30 hemichannels are localized to the non-junctional apical membrane of cells in the distal nephron-collecting duct (CD) and release ATP into the tubular fluid upon mechanical stimuli, leading to reduced salt and water reabsorption. Cx30−/− mice show salt-dependent elevations in BP and impaired pressure-natriuresis. Thus, we hypothesized that increased tubular flow rate leads to Cx30-dependent purinergic intracellular calcium ([Ca2+]i) signaling in the CD. Cortical CDs (CCDs) from wild type and Cx30−/− mice were freshly dissected and microperfused in vitro. Using confocal fluorescence imaging and the calcium-sensitive fluorophore pair Fluo-4 and Fura Red, we found that increasing tubular flow rate from 2 to 20 nl/min caused a significant 2.1-fold elevation in [Ca2+]i in wild type CCDs. This response was blunted in Cx30−/− CCDs ([Ca2+]i increased only 1.2-fold, p < 0.0001 vs. WT, n = 6 each). To further test our hypothesis we performed CD [Ca2+]i imaging in intact mouse kidneys in vivo using multiphoton microscopy and micropuncture delivery of the calcium-sensitive fluorophore Rhod-2. We found intrinsic, spontaneous [Ca2+]i oscillations in free-flowing CDs of wild type but not Cx30−/− mice. The [Ca2+]i oscillations were also sensitive to P2-receptor inhibition by suramin. Taken together, these data confirm that mechanosensitive Cx30 hemichannels mediate tubular ATP release and purinergic calcium signaling in the CD, a mechanism that plays an important role in the regulation of CD salt and water reabsorption. PMID:24137132

  20. Comparison of a CCD and an APS for soft X-ray diffraction

    NASA Astrophysics Data System (ADS)

    Stewart, Graeme; Bates, R.; Blue, A.; Clark, A.; Dhesi, S. S.; Maneuski, D.; Marchal, J.; Steadman, P.; Tartoni, N.; Turchetta, R.

    2011-12-01

    We compare a new CMOS Active Pixel Sensor (APS) to a Princeton Instruments PIXIS-XO: 2048B Charge Coupled Device (CCD) with soft X-rays tested in a synchrotron beam line at the Diamond Light Source (DLS). Despite CCDs being established in the field of scientific imaging, APS are an innovative technology that offers advantages over CCDs. These include faster readout, higher operational temperature, in-pixel electronics for advanced image processing and reduced manufacturing cost. The APS employed was the Vanilla sensor designed by the MI3 collaboration and funded by an RCUK Basic technology grant. This sensor has 520 x 520 square pixels, of size 25 μm on each side. The sensor can operate at a full frame readout of up to 20 Hz. The sensor had been back-thinned, to the epitaxial layer. This was the first time that a back-thinned APS had been demonstrated at a beam line at DLS. In the synchrotron experiment soft X-rays with an energy of approximately 708 eV were used to produce a diffraction pattern from a permalloy sample. The pattern was imaged at a range of integration times with both sensors. The CCD had to be operated at a temperature of -55°C whereas the Vanilla was operated over a temperature range from 20°C to -10°C. We show that the APS detector can operate with frame rates up to two hundred times faster than the CCD, without excessive degradation of image quality. The signal to noise of the APS is shown to be the same as that of the CCD at identical integration times and the response is shown to be linear, with no charge blooming effects. The experiment has allowed a direct comparison of back thinned APS and CCDs in a real soft x-ray synchrotron experiment.

  1. Gaia Data Release 1. On-orbit performance of the Gaia CCDs at L2

    NASA Astrophysics Data System (ADS)

    Crowley, C.; Kohley, R.; Hambly, N. C.; Davidson, M.; Abreu, A.; van Leeuwen, F.; Fabricius, C.; Seabroke, G.; de Bruijne, J. H. J.; Short, A.; Lindegren, L.; Brown, A. G. A.; Sarri, G.; Gare, P.; Prusti, T.; Prod'homme, T.; Mora, A.; Martín-Fleitas, J.; Raison, F.; Lammers, U.; O'Mullane, W.; Jansen, F.

    2016-11-01

    The European Space Agency's Gaia satellite was launched into orbit around L2 in December 2013 with a payload containing 106 large-format scientific CCDs. The primary goal of the mission is to repeatedly obtain high-precision astrometric and photometric measurements of one thousand million stars over the course of five years. The scientific value of the down-linked data, and the operation of the onboard autonomous detection chain, relies on the high performance of the detectors. As Gaia slowly rotates and scans the sky, the CCDs are continuously operated in a mode where the line clock rate and the satellite rotation spin-rate are in synchronisation. Nominal mission operations began in July 2014 and the first data release is being prepared for release at the end of Summer 2016. In this paper we present an overview of the focal plane, the detector system, and strategies for on-orbit performance monitoring of the system. This is followed by a presentation of the performance results based on analysis of data acquired during a two-year window beginning at payload switch-on. Results for parameters such as readout noise and electronic offset behaviour are presented and we pay particular attention to the effects of the L2 radiation environment on the devices. The radiation-induced degradation in the charge transfer efficiency (CTE) in the (parallel) scan direction is clearly diagnosed; however, an extrapolation shows that charge transfer inefficiency (CTI) effects at end of mission will be approximately an order of magnitude less than predicted pre-flight. It is shown that the CTI in the serial register (horizontal direction) is still dominated by the traps inherent to the manufacturing process and that the radiation-induced degradation so far is only a few per cent. We also present results on the tracking of ionising radiation damage and hot pixel evolution. Finally, we summarise some of the detector effects discovered on-orbit which are still being investigated.

  2. CNES developments of key detection technologies to prepare next generation focal planes for high resolution Earth observation

    NASA Astrophysics Data System (ADS)

    Materne, A.; Virmontois, C.; Bardoux, A.; Gimenez, T.; Biffi, J. M.; Laubier, D.; Delvit, J. M.

    2014-10-01

    This paper describes the activities managed by CNES (French National Space Agency) for the development of focal planes for the next generation of optical high resolution Earth observation satellites in low sun-synchronous orbit. CNES has launched a new programme named OTOS to increase the technology readiness level (TRL) of several key technologies for high resolution Earth observation satellites. The OTOS programme includes several actions in the field of detection and focal planes: a new generation of CCD and CMOS image sensors, updated analog front-end electronics, and analog-to-digital converters. The main features that must be achieved in focal planes for high resolution Earth observation are readout speed, signal-to-noise ratio at low light levels, anti-blooming efficiency, geometric stability, MTF, and line-of-sight stability. The next steps targeted are presented in comparison to the in-flight measured performance of the PLEIADES satellites launched in 2011 and 2012. The high resolution panchromatic channel is still based upon backside-illuminated (BSI) CCDs operated in Time Delay Integration (TDI). For the multispectral channel, the main evolution consists of moving to TDI mode, and the competition is open between the concurrent development of a CCD solution and a CMOS solution. New CCDs will be based upon several process blocks under evaluation on the e2v 6-inch BSI wafer manufacturing line. The OTOS strategy for CMOS image sensors investigates, on the one hand, custom TDI solutions following an approach similar to that used for CCDs and, on the other hand, ways to take advantage of the existing performance of off-the-shelf 2D-array CMOS image sensors. We present the characterization results obtained from test vehicles designed for custom TDI operation on several CIS technologies, and results obtained before and after irradiation on snapshot 2D arrays from the CMOSIS CMV family.

  3. Controller and data acquisition system for SIDECAR ASIC driven HAWAII detectors

    NASA Astrophysics Data System (ADS)

    Ramaprakash, Anamparambu; Burse, Mahesh; Chordia, Pravin; Chillal, Kalpesh; Kohok, Abhay; Mestry, Vilas; Punnadi, Sujit; Sinha, Sakya

    2010-07-01

    SIDECAR is an Application Specific Integrated Circuit (ASIC) which can be used for control and data acquisition from the near-IR HAWAII detectors offered by Teledyne Imaging Sensors (TIS), USA. The standard interfaces provided by Teledyne are a COM API and socket servers running under the MS Windows platform. These interfaces communicate with the ASIC (and the detector) through an intermediate card called the JWST ASIC Drive Electronics (JADE2). As part of an ongoing programme of several years for developing astronomical focal plane array (CCD, CMOS and hybrid) controllers and data acquisition systems (CDAQs), IUCAA is currently developing the next generation of controllers employing Virtex-5 family FPGA devices. We present here the capabilities which are built into these new CDAQs for handling HAWAII detectors. In our system, the computer which hosts the application programme, user interface and device drivers runs on a Linux platform. It communicates through a hot-pluggable USB interface (with an optional optical fibre extender) to the FPGA-based card which replaces the JADE2. The FPGA board, in turn, controls the SIDECAR ASIC and through it a HAWAII-2RG detector, both of which are located in a liquid-nitrogen-cooled cryogenic test Dewar setup. The system can acquire data over 1, 4, or 32 readout channels, with or without binning, at different speeds, can define sub-regions for readout, and offers various readout schemes such as Fowler sampling, up-the-ramp, etc. In this paper, we present the performance results obtained from a prototype system.

  4. KiwiSpec - an advanced spectrograph for high resolution spectroscopy: optical design and variations

    NASA Astrophysics Data System (ADS)

    Barnes, Stuart I.; Gibson, Steve; Nield, Kathryn; Cochrane, Dave

    2012-09-01

    The KiwiSpec R4-100 is an advanced high resolution spectrograph developed by KiwiStar Optics, Industrial Research Ltd, New Zealand. The instrument is based around an R4 echelle grating and a 100 mm collimated beam diameter. The optical design employs a highly asymmetric white pupil design, whereby the transfer collimator has a focal length only 1/3 that of the primary collimator. This allows the cross-dispersers (VPH gratings) and camera optics to be small and low cost while also ensuring a very compact instrument. The KiwiSpec instrument will be fibre-fed and is designed to be contained in both thermal and/or vacuum enclosures. The instrument concept is highly flexible in order to ensure that the same basic design can be used for a wide variety of science cases. Options include the possibility of splitting the wavelength coverage into 2 to 4 separate channels, allowing each channel to be highly optimized for maximum efficiency. CCDs ranging from smaller than 2K x 2K to larger than 4K x 4K can be accommodated. This allows good (3-4 pixel) sampling of resolving powers ranging from below 50,000 to greater than 100,000. Among the specific design options presented here will be a two-channel concept optimized for precision radial velocities, and a four-channel concept developed for the Gemini High-Resolution Optical Spectrograph (GHOST). The design and performance of a single-channel prototype will be presented elsewhere in these proceedings.

  5. Single Particle Damage Events in Candidate Star Camera Sensors

    NASA Technical Reports Server (NTRS)

    Marshall, Paul; Marshall, Cheryl; Polidan, Elizabeth; Wacyznski, Augustyn; Johnson, Scott

    2005-01-01

    Si charge coupled devices (CCDs) are currently the preeminent detector in star cameras as well as in the near ultraviolet (UV) to visible wavelength region for astronomical observations in space and in earth-observing space missions. Unfortunately, the performance of CCDs is permanently degraded by total ionizing dose (TID) and displacement damage effects. TID produces threshold voltage shifts on the CCD gates and displacement damage reduces the charge transfer efficiency (CTE), increases the dark current, produces dark current nonuniformities and creates random telegraph noise in individual pixels. In addition to these long term effects, cosmic ray and trapped proton transients also interfere with device operation on orbit. In the present paper, we investigate the dark current behavior of CCDs - in particular the formation and annealing of hot pixels. Such pixels degrade the ability of a CCD to perform science and also can present problems to the performance of star camera functions (especially if their numbers are not correctly anticipated). To date, most dark current radiation studies have been performed by irradiating the CCDs at room temperature, but this can result in a significantly optimistic picture of the hot pixel count. We know from the Hubble Space Telescope (HST) that high dark current pixels (so-called hot pixels or hot spikes) accumulate as a function of time on orbit. For example, the HST Advanced Camera for Surveys/Wide Field Camera instrument performs monthly anneals despite the loss of observational time, in order to partially anneal the hot pixels. Note that the fact that significant reduction in hot pixel populations occurs for room temperature anneals is not presently understood, since none of the commonly expected defects in Si (e.g. divacancy, E center, and A-center) anneal at such a low temperature. A HST Wide Field Camera 3 (WFC3) CCD manufactured by E2V was irradiated while operating at -83C and the dark current studied as a function of temperature while the CCD was warmed to a sequence of temperatures up to a maximum of +30C. The device was then cooled back down to -83C and re-measured. Hot pixel populations were tracked during the warm-up and cool-down. Hot pixel annealing began below 40C and the anneal process was largely completed before the detector reached +30C. There was no apparent sharp temperature dependence in the annealing. Although a large fraction of the hot pixels fell below the threshold to be counted as a hot pixel, they nevertheless remained warmer than the remaining population. The details of the mechanism for the formation and annealing of hot pixels are not presently understood, but it appears likely that hot pixels are associated with displacement damage occurring in high electric field regions.

  6. SIOExplorer: Managing Data Flow into a Digital Library

    NASA Astrophysics Data System (ADS)

    Clark, D.; Miller, S. P.; Peckman, U.; Chase, A.; Helly, J.

    2002-12-01

    The diversity of the data held by the Geological Data Center at SIO is a tribute to the evolution of oceanography, from echo sounding rolls and hand-drawn charts to modern multibeam bathymetry. However, the changes in sensor technology and organizational approaches since 1903 present real challenges to the archivist, as we struggle to migrate the holdings into a web-accessible digital library for use by the public and scientific community (www.nsdl.org). Automation of the dataflow is an absolute necessity, with millions of files and a terabyte of data, although the biggest challenges come from complexity rather than bulk volume. The problems stem from our diverse archive collection, with its evolving data content, processing practices, naming conventions, and clutter from intermediate or obsolete files. This is not the only or the last data system to suffer from these problems, and an approach has been designed that can be applied to other projects. Instead of writing code to process each individual type of cruise, which varies from vessel to vessel over the fifty years and 795 cruises, we created a single Canonical Cruise Data Structure (CCDS). After some experimentation, the CCDS now consists of 9 basic categories and a reasonable number of sub-categories (directories) that can hold all the essential information. The key to flexibility and scalability comes from a template-driven rules approach that allows a processing script to harvest data from complex original data structures, and store them in the simple CCDS. The template mimics the structure of the CCDS. As the processing script traverses the template it finds rules for each category, instructing it on where to look for likely sources, and how to prioritize the results when multiple sources are detected. Over time as new situations are encountered, changes are simply made to the template, rather than the code. After the first tests, it became apparent that a visual method was needed to monitor the success of the harvesting, to make sure that every category is filled with the correct content. We can run simulated tests on our data staging area and report the results immediately in graphical form on our website. The web report shows data found (blue) and not found (red). A mouse-over operation shows the search string used as a rule for selection, and the list of files actually found. We are also employing visualization as a quality control technique to screen data prior to storage in the digital library. We are outputting a grid per multibeam file that can be viewed with public domain GMT tools, the Fledermaus visualization package, or ESRI Arcgis software, as a rapid check on sound velocity artifacts, noise levels, and editing status.
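
    The template-driven harvesting described above can be illustrated with a short sketch. This is not the SIOExplorer code; the category names, glob patterns, and paths below are hypothetical. The point is only the pattern: each CCDS category carries an ordered list of match rules, and the first rule that finds files wins, so new situations are handled by editing the template rather than the code.

      # Minimal sketch (not the SIOExplorer implementation) of template-driven
      # harvesting: each CCDS category lists glob patterns in priority order,
      # and the first pattern that matches wins. All names here are hypothetical.
      import fnmatch
      import os

      TEMPLATE = {
          "navigation": ["*.nav", "nav/*.txt"],
          "multibeam":  ["mb/*.all", "*.mb*"],
          "metadata":   ["cruise_report*.pdf", "README*"],
      }

      def harvest(source_root, template=TEMPLATE):
          """Return {category: [matching files]} for one cruise directory."""
          found = {cat: [] for cat in template}
          all_files = [os.path.join(dirpath, name)
                       for dirpath, _, names in os.walk(source_root)
                       for name in names]
          for cat, patterns in template.items():
              for pattern in patterns:               # priority order
                  hits = [f for f in all_files
                          if fnmatch.fnmatch(os.path.basename(f), pattern)
                          or fnmatch.fnmatch(os.path.relpath(f, source_root), pattern)]
                  if hits:
                      found[cat] = sorted(hits)
                      break                          # first matching rule wins
          return found

      if __name__ == "__main__":
          report = harvest("/data/staging/cruise_0001")   # hypothetical path
          for cat, files in report.items():
              print(cat, "FOUND" if files else "NOT FOUND", len(files))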

  7. Evaluating the Effectiveness of Nurse-Focused Computerized Clinical Decision Support on Urinary Catheter Practice Guidelines

    ERIC Educational Resources Information Center

    Lang, Robin Lynn Neal

    2012-01-01

    A growing national emphasis has been placed on health information technology (HIT) with robust computerized clinical decision support (CCDS) integration into health care delivery. Catheter-associated urinary tract infection is the most frequent health care-associated infection in the United States and is associated with high cost, high volumes and…

  8. Sensor development at the semiconductor laboratory of the Max-Planck-Society

    NASA Astrophysics Data System (ADS)

    Bähr, A.; Lechner, P.; Ninkovic, J.

    2017-12-01

    For more than twenty years the semiconductor laboratory of the Max-Planck Society (MPG-HLL) has been developing high-performing, specialised, scientific silicon sensors, including the integration of amplifying electronics on the sensor chip. This paper summarises the current status of these devices, such as pnCCDs and DePFET Active Pixel Sensors, and their applications.

  9. Evaluation of a hybrid pixel detector for electron microscopy.

    PubMed

    Faruqi, A R; Cattermole, D M; Henderson, R; Mikulec, B; Raeburn, C

    2003-04-01

    We describe the application of a silicon hybrid pixel detector, containing 64 by 64 pixels, each 170 μm², in electron microscopy. The device offers improved resolution compared to CCDs along with faster and noiseless readout. Evaluation of the detector, carried out on a 120 kV electron microscope, demonstrates the potential of the device.

  10. News and Views: LSST mirror blank; More and better maths; Free telescopes; The hurricane season is starting again Get ready: IYA2009 UK website up and running

    NASA Astrophysics Data System (ADS)

    2008-10-01

    As floods and hurricanes disrupt the lives of people around the world, a new generation of scientific tools is supporting both storm preparedness and recovery. As International Year of Astronomy 2009 approaches, the UK website is developing more features that make it easier to see what's planned for this science extravaganza.

  11. The Impact of Microlensing on the Standardisation of Strongly Lensed Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Foxley-Marrable, Max; Collett, Thomas E.; Vernardos, Georgios; Goldstein, Daniel A.; Bacon, David

    2018-05-01

    We investigate the effect of microlensing on the standardisation of strongly lensed Type Ia supernovae (GLSNe Ia). We present predictions for the amount of scatter induced by microlensing across a range of plausible strong lens macromodels. We find that lensed images in regions of low convergence, shear and stellar density are standardisable, where the microlensing scatter is ≲ 0.15 magnitudes, comparable to the intrinsic dispersion of a typical SN Ia. These standardisable configurations correspond to asymmetric lenses with an image located far outside the Einstein radius of the lens. Symmetric and small Einstein radius lenses (≲ 0.5 arcsec) are not standardisable. We apply our model to the recently discovered GLSN Ia iPTF16geu and find that the large discrepancy between the observed flux and the macromodel predictions from More et al. (2017) cannot be explained by microlensing alone. Using the mock GLSNe Ia catalogue of Goldstein et al. (2017), we predict that ~22% of GLSNe Ia discovered by LSST will be standardisable, with a median Einstein radius of 0.9 arcseconds and a median time-delay of 41 days. By breaking the mass-sheet degeneracy the full LSST GLSNe Ia sample will be able to detect systematics in H0 at the 0.5% level.
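
    For readers comparing the quoted microlensing scatter with the intrinsic SN Ia dispersion, the two combine (assuming independent, roughly Gaussian contributions; the notation below is ours, not the paper's) as

        \sigma_{\rm tot} \simeq \sqrt{\sigma_{\rm int}^{2} + \sigma_{\mu{\rm lens}}^{2}},

    so the microlensing term inflates the total scatter in quadrature rather than linearly.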

  12. The LSSTC Data Science Fellowship Program

    NASA Astrophysics Data System (ADS)

    Miller, Adam; Walkowicz, Lucianne; LSSTC DSFP Leadership Council

    2017-01-01

    The Large Synoptic Survey Telescope Corporation (LSSTC) Data Science Fellowship Program (DSFP) is a unique professional development program for astronomy graduate students. DSFP students complete a series of six, one-week long training sessions over the course of two years. The sessions are cumulative, each building on the last, to allow an in-depth exploration of the topics covered: data science basics, statistics, image processing, machine learning, scalable software, data visualization, time-series analysis, and science communication. The first session was held in Aug 2016 at Northwestern University, with all materials and lectures publicly available via github and YouTube. Each session focuses on a series of technical problems which are written in iPython notebooks. The initial class of fellows includes 16 students selected from across the globe, while an additional 14 fellows will be added to the program in year 2. Future sessions of the DSFP will be hosted by a rotating cast of LSSTC member institutions. The DSFP is designed to supplement graduate education in astronomy by teaching the essential skills necessary for dealing with big data, serving as a resource for all in the LSST era. The LSSTC DSFP is made possible by the generous support of the LSST Corporation, the Data Science Initiative (DSI) at Northwestern, and CIERA.

  13. Prospects for Determining the Mass Distributions of Galaxy Clusters on Large Scales Using Weak Gravitational Lensing

    NASA Astrophysics Data System (ADS)

    Fong, M.; Bowyer, R.; Whitehead, A.; Lee, B.; King, L.; Applegate, D.; McCarthy, I.

    2018-05-01

    For more than two decades, the Navarro, Frenk, and White (NFW) model has stood the test of time; it has been used to describe the distribution of mass in galaxy clusters out to their outskirts. Stacked weak lensing measurements of clusters are now revealing the distribution of mass out to and beyond their virial radii, where the NFW model is no longer applicable. In this study we assess how well the parameterised Diemer & Kravtsov (DK) density profile describes the characteristic mass distribution of galaxy clusters extracted from cosmological simulations. This is determined from stacked synthetic lensing measurements of the 50 most massive clusters extracted from the Cosmo-OWLS simulations, using the Dark Matter Only run and also the run that most closely matches observations. The characteristics of the data reflect the Weighing the Giants survey and data from the future Large Synoptic Survey Telescope (LSST). In comparison with the NFW model, the DK model is favored by the stacked data, in particular for the future LSST data, where the number density of background galaxies is higher. The DK profile depends on the accretion history of clusters, which is specified in the current study. Eventually, however, subsamples of galaxy clusters with qualities indicative of disparate accretion histories could be studied.
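
    For reference, the NFW profile against which the DK profile is compared has the standard two-parameter form

        \rho_{\rm NFW}(r) = \frac{\rho_s}{(r/r_s)\left(1 + r/r_s\right)^{2}},

    with scale density ρ_s and scale radius r_s; the DK profile modifies the behaviour at large radii, where the abstract notes the NFW form no longer applies.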

  14. Progress with the Prime Focus Spectrograph for the Subaru Telescope: a massively multiplexed optical and near-infrared fiber spectrograph

    NASA Astrophysics Data System (ADS)

    Sugai, Hajime; Tamura, Naoyuki; Karoji, Hiroshi; Shimono, Atsushi; Takato, Naruhisa; Kimura, Masahiko; Ohyama, Youichi; Ueda, Akitoshi; Aghazarian, Hrand; de Arruda, Marcio V.; Barkhouser, Robert H.; Bennett, Charles L.; Bickerton, Steve; Bozier, Alexandre; Braun, David F.; Bui, Khanh; Capocasale, Christopher M.; Carr, Michael A.; Castilho, Bruno; Chang, Yin-Chang; Chen, Hsin-Yo; Chou, Richard C. Y.; Dawson, Olivia R.; Dekany, Richard G.; Ek, Eric M.; Ellis, Richard S.; English, Robin J.; Ferrand, Didier; Ferreira, Décio; Fisher, Charles D.; Golebiowski, Mirek; Gunn, James E.; Hart, Murdock; Heckman, Timothy M.; Ho, Paul T. P.; Hope, Stephen; Hovland, Larry E.; Hsu, Shu-Fu; Hu, Yen-Sang; Huang, Pin Jie; Jaquet, Marc; Karr, Jennifer E.; Kempenaar, Jason G.; King, Matthew E.; Le Fèvre, Olivier; Le Mignant, David; Ling, Hung-Hsu; Loomis, Craig; Lupton, Robert H.; Madec, Fabrice; Mao, Peter; Marrara, Lucas S.; Ménard, Brice; Morantz, Chaz; Murayama, Hitoshi; Murray, Graham J.; de Oliveira, Antonio Cesar; de Oliveira, Claudia M.; de Oliveira, Ligia S.; Orndorff, Joe D.; de Paiva Vilaça, Rodrigo; Partos, Eamon J.; Pascal, Sandrine; Pegot-Ogier, Thomas; Reiley, Daniel J.; Riddle, Reed; Santos, Leandro; dos Santos, Jesulino B.; Schwochert, Mark A.; Seiffert, Michael D.; Smee, Stephen A.; Smith, Roger M.; Steinkraus, Ronald E.; Sodré, Laerte; Spergel, David N.; Surace, Christian; Tresse, Laurence; Vidal, Clément; Vives, Sebastien; Wang, Shiang-Yu; Wen, Chih-Yi; Wu, Amy C.; Wyse, Rosie; Yan, Chi-Hung

    2014-07-01

    The Prime Focus Spectrograph (PFS) is an optical/near-infrared multi-fiber spectrograph with 2394 science fibers, which are distributed over a 1.3 degree diameter field of view at the Subaru 8.2-meter telescope. The simultaneous wide wavelength coverage from 0.38 μm to 1.26 μm, with a resolving power of 3000, strengthens its ability to target three main survey programs: cosmology, Galactic archaeology, and galaxy/AGN evolution. A medium resolution mode with resolving power of 5000 for 0.71 μm to 0.89 μm will also be available by simply exchanging dispersers. PFS takes the role of the spectroscopic part of the Subaru Measurement of Images and Redshifts (SuMIRe) project, while Hyper Suprime-Cam (HSC) works on the imaging part. HSC's excellent image qualities have proven the high quality of the Wide Field Corrector (WFC), which PFS shares with HSC. The PFS collaboration has succeeded in the project Preliminary Design Review and is now in a phase of subsystem Critical Design Reviews and construction. To transform the telescope plus WFC focal ratio, a 3-mm thick broad-band coated microlens is glued to each fiber tip. The microlenses are molded glass, providing uniform lens dimensions and a variety of refractive-index selection. After successful production of mechanical and optical samples, mass production is now complete. Following careful investigations including Focal Ratio Degradation (FRD) measurements, a higher transmission fiber is selected for the longest part of the cable system, while one with a better FRD performance is selected for the fiber-positioner and fiber-slit components, given the more frequent fiber movements and tightly curved structure. Each fiber positioner consists of two stages of piezo-electric rotary motors. Its engineering model has been produced and tested. After evaluating the statistics of positioning accuracies, collision avoidance software, and interferences (if any) within/between electronics boards, mass production will commence. Fiber positioning will be performed iteratively by taking an image of artificially back-illuminated fibers with the Metrology camera located in the Cassegrain container. The camera is carefully designed so that fiber position measurements are unaffected by small amounts of high spatial-frequency inaccuracies in WFC lens surface shapes. Target light carried through the fiber system reaches one of four identical fast-Schmidt spectrograph modules, each with three arms. All optical glass blanks are now being polished. Prototype VPH gratings have been optically tested. CCD production is complete, with standard fully-depleted CCDs for red arms and more-challenging thinner fully-depleted CCDs with blue-optimized coating for blue arms. The active damping system against cooler vibration has been proven to work as predicted, and spectrographs have been designed to avoid small possible residual resonances.

  15. Test of CCD Precision Limits for Differential Photometry

    NASA Technical Reports Server (NTRS)

    Borucki, W. J.; Dunham, E. W.; Wei, M. Z.; Robinson, L. B.; Ford, C. H.; Granados, A. F.

    1995-01-01

    Results of tests to demonstrate the very high differential-photometric stability of CCD light sensors are presented. The measurements reported here demonstrate that in a controlled laboratory environment, a front-illuminated CCD can provide differential-photometric measurements with reproducible precision approaching one part in 10^5. Practical limitations to the precision of differential-photometric measurements with CCDs and implications for spaceborne applications are discussed.
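
    A back-of-envelope bound (ours, not a statement about the specific laboratory setup) shows why such precision is demanding: photon shot noise alone requires

        \frac{\sigma_N}{N} = \frac{1}{\sqrt{N}} \le 10^{-5} \quad\Rightarrow\quad N \ge 10^{10}

    detected photoelectrons per measurement, before any detector or environmental effects are considered.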

  16. Shake, Rattle and Roll: James Webb Telescope Components Pass Tests

    NASA Technical Reports Server (NTRS)

    2008-01-01

    This image shows a model of one of three detectors for the Mid-Infrared Instrument on NASA's upcoming James Webb Space Telescope. The detector, which looks green in this picture and is similar to the charge-coupled devices, or 'CCDs,' in digital cameras, is housed in the brick-like unit shown here, called a focal plane module.

  17. A low-power, radiation-resistant, Silicon-Drift-Detector array for extraterrestrial element mapping

    NASA Astrophysics Data System (ADS)

    Ramsey, B. D.; Gaskin, J. A.; Elsner, R. F.; Chen, W.; Carini, G. A.; De Geronimo, G.; Keister, J.; Li, S.; Li, Z.; Siddons, D. P.; Smith, G.

    2012-02-01

    We are developing a modular Silicon Drift Detector (SDD) X-Ray Spectrometer (XRS) for measuring the abundances of light surface elements (C to Fe) fluoresced by ambient radiation on remote airless bodies. The value of fluorescence spectrometry for surface element mapping is demonstrated by its inclusion on three recent lunar missions and by exciting new data that have recently been announced from the Messenger Mission to Mercury. The SDD-XRS instrument that we have been developing offers excellent energy resolution and an order of magnitude lower power requirement than conventional CCDs, making much higher sensitivities possible with modest spacecraft resources. In addition, it is significantly more radiation resistant than x-ray CCDs and therefore will not be subject to the degradation that befell recent lunar instruments. In fact, the intrinsic radiation resistance of the SDD makes it applicable even to the harsh environment of the Jovian system where it can be used to map the light surface elements of Europa. In this paper, we first discuss our element-mapping science-measurement goals. We then derive the necessary instrument requirements to meet these goals and discuss our current instrument development status with respect to these requirements.

  18. New Insight into the Cleavage Reaction of Nostoc sp. Strain PCC 7120 Carotenoid Cleavage Dioxygenase in Natural and Nonnatural Carotenoids

    PubMed Central

    Heo, Jinsol; Kim, Se Hyeuk

    2013-01-01

    Carotenoid cleavage dioxygenases (CCDs) are enzymes that catalyze the oxidative cleavage of carotenoids at a specific double bond to generate apocarotenoids. In this study, we investigated the activity and substrate preferences of NSC3, a CCD of Nostoc sp. strain PCC 7120, in vivo and in vitro using natural and nonnatural carotenoid structures. NSC3 cleaved β-apo-8′-carotenal at 3 positions, C-13-C-14, C-15-C-15′, and C-13′-C-14′, revealing a unique cleavage pattern. NSC3 cleaves the natural structure of carotenoids 4,4′-diaponeurosporene, 4,4′-diaponeurosporen-4′-al, 4,4′-diaponeurosporen-4′-oic acid, 4,4′-diapotorulene, and 4,4′-diapotorulen-4′-al to generate novel cleavage products (apo-14′-diaponeurosporenal, apo-13′-diaponeurosporenal, apo-10′-diaponeurosporenal, apo-14′-diapotorulenal, and apo-10′-diapotorulenal, respectively). The study of carotenoids with natural or nonnatural structures produced by using synthetic modules could provide information valuable for understanding the cleavage reactions or substrate preferences of other CCDs in vivo and in vitro. PMID:23524669

  19. Deep-UV-sensitive high-frame-rate backside-illuminated CCD camera developments

    NASA Astrophysics Data System (ADS)

    Dawson, Robin M.; Andreas, Robert; Andrews, James T.; Bhaskaran, Mahalingham; Farkas, Robert; Furst, David; Gershstein, Sergey; Grygon, Mark S.; Levine, Peter A.; Meray, Grazyna M.; O'Neal, Michael; Perna, Steve N.; Proefrock, Donald; Reale, Michael; Soydan, Ramazan; Sudol, Thomas M.; Swain, Pradyumna K.; Tower, John R.; Zanzucchi, Pete

    2002-04-01

    New applications for ultra-violet imaging are emerging in the fields of drug discovery and industrial inspection. High throughput is critical for these applications where millions of drug combinations are analyzed in secondary screenings or high rate inspection of small feature sizes over large areas is required. Sarnoff demonstrated in 1990 a back-illuminated, 1024 X 1024, 18 um pixel, split-frame-transfer device running at > 150 frames per second with high sensitivity in the visible spectrum. Sarnoff designed, fabricated and delivered cameras based on these CCDs and is now extending this technology to devices with higher pixel counts and higher frame rates through CCD architectural enhancements. The high sensitivities obtained in the visible spectrum are being pushed into the deep UV to support these new medical and industrial inspection applications. Sarnoff has achieved measured quantum efficiencies > 55% at 193 nm, rising to 65% at 300 nm, and remaining almost constant out to 750 nm. Optimization of the sensitivity is being pursued to tailor the quantum efficiency for particular wavelengths. Characteristics of these high frame rate CCDs and cameras will be described and results will be presented demonstrating high UV sensitivity down to 150 nm.

  20. Development of a large blazed transmission grating by effective binary index modulation for the GAIA radial velocity spectrometer

    NASA Astrophysics Data System (ADS)

    Erdmann, M.; Kley, E.-B.; Zeitner, U.

    2017-11-01

    Gaia is an ambitious ESA mission to chart a three-dimensional map of our Galaxy, the Milky Way, in the process revealing the composition, formation and evolution of the Galaxy. Gaia will provide unprecedented positional and radial velocity measurements with the accuracies needed to produce a stereoscopic and cinematic census of about one billion stars in our Galaxy. The payload consists of 2 Three Mirror Anastigmat (TMA) telescopes (aperture size 1.5 m x 0.5 m), 3 instruments (astrometer, photometer and spectrometer) and 106 butted CCDs assembled into a 0.9 Giga-Pixel focal plane. The Radial Velocity Spectrometer (RVS) of Gaia measures the red shift of the stars in the spectral band between 847 nm and 874 nm. The spectrometer is a fully refractive optical system consisting of 2 Fery prisms, 2 prisms, a pass band filter and a blazed transmission grating (instrument mass about 30 kg). It is located in the vicinity of the focal plane and illuminates 12 of the 106 Charge Coupled Devices (CCDs). Gaia is in the implementation phase; the launch of the 2120 kg satellite is planned for Dec. 2012.

  1. A Low-Power, Radiation-Resistant, Silicon-Drift-Detector Array for Extraterrestrial Element Mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsey B. D.; De Geronimo G.; Gaskin, J.A.

    2012-02-08

    We are developing a modular Silicon Drift Detector (SDD) X-Ray Spectrometer (XRS) for measuring the abundances of light surface elements (C to Fe) fluoresced by ambient radiation on remote airless bodies. The value of fluorescence spectrometry for surface element mapping is demonstrated by its inclusion on three recent lunar missions and by exciting new data that have recently been announced from the Messenger Mission to Mercury. The SDD-XRS instrument that we have been developing offers excellent energy resolution and an order of magnitude lower power requirement than conventional CCDs, making much higher sensitivities possible with modest spacecraft resources. In addition, it is significantly more radiation resistant than x-ray CCDs and therefore will not be subject to the degradation that befell recent lunar instruments. In fact, the intrinsic radiation resistance of the SDD makes it applicable even to the harsh environment of the Jovian system where it can be used to map the light surface elements of Europa. In this paper, we first discuss our element-mapping science-measurement goals. We then derive the necessary instrument requirements to meet these goals and discuss our current instrument development status with respect to these requirements.

  2. High resolution optical surface metrology with the slope measuring portable optical test system

    NASA Astrophysics Data System (ADS)

    Maldonado, Alejandro V.

    New optical designs strive to achieve extreme performance, and continually increase the complexity of prescribed optical shapes, which often require wide dynamic range and high resolution. SCOTS, or the Software Configurable Optical Test System, can measure a wide range of optical surfaces with high sensitivity using surface slope. This dissertation introduces a high resolution version of SCOTS called SPOTS, or the Slope measuring Portable Optical Test System. SPOTS improves the metrology of surface features on the order of sub-millimeter to decimeter spatial scales and nanometer to micrometer level height scales. Currently there is no optical surface metrology instrument with the same utility. SCOTS uses a computer controlled display (such as an LCD monitor) and camera to measure surface slopes over the entire surface of a mirror. SPOTS differs in that an additional lens is placed near the surface under test. A small prototype system is discussed in general, providing the support for the design of future SPOTS devices. Then the SCOTS instrument transfer function is addressed, which defines the way the system filters surface heights. Lastly, the calibration and performance of the larger SPOTS device are analyzed with example measurements of the 8.4-m diameter aspheric primary mirror of the Large Synoptic Survey Telescope (LSST). In general, optical systems have a transfer function, which filters data. In the case of optical imaging systems the instrument transfer function (ITF) follows the modulation transfer function (MTF), which causes a reduction of contrast as a function of increasing spatial frequency due to diffraction. In SCOTS, the ITF is shown to decrease the measured height of surface features as their spatial frequency increases, and thus the SCOTS and SPOTS ITF is proportional to their camera system's MTF. Theory and simulations are supported by a SCOTS measurement of a test piece with a set of lithographically written sinusoidal surface topographies. In addition, an example of a simple inverse filtering technique is provided. The success of a small SPOTS proof-of-concept instrument paved the way for a new larger prototype system, which is intended to measure subaperture regions on large optical mirrors. On large optics, the prototype SPOTS is lightweight and rests on the surface being tested. One advantage of this SPOTS is its stability over time in maintaining calibration. Thus the optician can simply place SPOTS on the mirror, perform a simple alignment, collect measurement data, then pick the system up and repeat at a new location. The entire process takes approximately 5 to 10 minutes, of which 3 minutes is spent collecting data. SPOTS' simplicity of design, light weight, robustness, wide dynamic range, and high sensitivity make it a useful tool for optical shop use during the fabrication and testing process of large and small optics.
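
    The inverse-filtering step mentioned above can be sketched as follows. This is an illustrative, regularized (Wiener-style) division by an assumed one-dimensional MTF, not the SCOTS/SPOTS processing code; the roll-off shape and amplitudes are made up for the example.

      # Sketch of regularized inverse filtering of a measured height profile,
      # assuming the instrument transfer function (ITF/MTF) is known.
      # Illustrative only; not the SCOTS/SPOTS processing code.
      import numpy as np

      def inverse_filter(profile, mtf, eps=1e-2):
          """Divide the spectrum by the MTF, with a small floor to avoid
          amplifying noise where the MTF is close to zero."""
          spectrum = np.fft.rfft(profile)
          mtf = np.clip(np.asarray(mtf, dtype=float), eps, None)
          return np.fft.irfft(spectrum / mtf, n=len(profile))

      # Toy example: a sinusoidal surface attenuated by an assumed MTF roll-off.
      n = 512
      x = np.linspace(0.0, 1.0, n, endpoint=False)
      true_surface = 50e-9 * np.sin(2 * np.pi * 40 * x)      # 50 nm amplitude
      freqs = np.fft.rfftfreq(n, d=x[1] - x[0])
      mtf = np.exp(-(freqs / 80.0) ** 2)                      # hypothetical roll-off
      measured = np.fft.irfft(np.fft.rfft(true_surface) * mtf, n=n)
      restored = inverse_filter(measured, mtf)
      print("attenuated peak: %.1f nm, restored peak: %.1f nm"
            % (measured.max() * 1e9, restored.max() * 1e9))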

  3. Scientific charge-coupled devices

    NASA Technical Reports Server (NTRS)

    Janesick, James R.; Elliott, Tom; Collins, Stewart; Blouke, Morley M.; Freeman, Jack

    1987-01-01

    The charge-coupled device dominates an ever-increasing variety of scientific imaging and spectroscopy applications. Recent experience indicates, however, that the full potential of CCD performance lies well beyond that realized in devices currently available. Test data suggest that major improvements are feasible in spectral response, charge collection, charge transfer, and readout noise. These properties, their measurement in existing CCDs, and their potential for future improvement are discussed in this paper.

  4. Test of CCD Precision Limits for Differential Photometry

    NASA Technical Reports Server (NTRS)

    Robinson, L. B.; Wei, M. Z.; Borucki, W. J.; Dunham, E. W.; Ford, C. H.; Granados, A. F.

    1995-01-01

    Results of tests to demonstrate the very high differential-photometric stability of CCD light sensors are presented. The measurements reported here demonstrate that in a controlled laboratory environment, a front-illuminated CCD can provide differential-photometric measurements with reproducible precision approaching one part in 10^5. Practical limitations to the precision of differential-photometric measurements with CCDs and implications for spaceborne applications are discussed.

  5. Performance characterization of UV science cameras developed for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP)

    NASA Astrophysics Data System (ADS)

    Champey, P.; Kobayashi, K.; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, D.; Beabout, B.; Stewart, M.

    2014-07-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-α and to detect the Hanle effect in the line core. Due to the nature of Lyman-α polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1% in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1% polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. The CLASP cameras were designed to operate with ≤ 10 e-/pixel/second dark current, ≤ 25 e- read noise, a gain of 2.0 ± 0.5 and ≤ 1.0% residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera: dark current, read noise, camera gain and residual non-linearity.
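
    Camera gain of the kind quoted above (e-/DN) is commonly estimated from a photon transfer curve, where the shot-noise variance of flat-field pairs scales with the mean signal as mean/gain. The sketch below is a generic illustration of that method with synthetic data, not the CLASP characterization pipeline.

      # Generic photon-transfer sketch for estimating camera gain (e-/DN) from
      # pairs of flat fields; illustrative only, not the CLASP calibration code.
      import numpy as np

      def gain_from_flat_pairs(pairs, read_noise_dn=0.0):
          """pairs: list of (flat1, flat2) arrays in DN at different light levels.
          Returns gain in e-/DN from a fit of shot-noise variance vs. mean signal."""
          means, variances = [], []
          for f1, f2 in pairs:
              means.append(0.5 * (f1.mean() + f2.mean()))
              # Differencing removes fixed-pattern noise; /2 gives per-frame variance.
              variances.append(np.var(f1 - f2) / 2.0 - read_noise_dn ** 2)
          slope = np.polyfit(means, variances, 1)[0]   # var ~ mean / gain
          return 1.0 / slope

      # Synthetic check with a "true" gain of 2 e-/DN.
      rng = np.random.default_rng(0)
      pairs = []
      for electrons in (200, 1000, 5000, 20000):
          f1 = rng.poisson(float(electrons), size=(256, 256)) / 2.0   # e- to DN
          f2 = rng.poisson(float(electrons), size=(256, 256)) / 2.0
          pairs.append((f1, f2))
      print("estimated gain: %.2f e-/DN" % gain_from_flat_pairs(pairs))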

  6. Graphical user interface for a dual-module EMCCD x-ray detector array

    NASA Astrophysics Data System (ADS)

    Wang, Weiyuan; Ionita, Ciprian; Kuhls-Gilcrist, Andrew; Huang, Ying; Qu, Bin; Gupta, Sandesh K.; Bednarek, Daniel R.; Rudin, Stephen

    2011-03-01

    A new Graphical User Interface (GUI) was developed using Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) for a high-resolution, high-sensitivity Solid State X-ray Image Intensifier (SSXII), which is a new x-ray detector for radiographic and fluoroscopic imaging, consisting of an array of Electron-Multiplying CCDs (EMCCDs) each having a variable on-chip electron-multiplication gain of up to 2000x to reduce the effect of readout noise. To enlarge the field-of-view (FOV), each EMCCD sensor is coupled to an x-ray phosphor through a fiberoptic taper. Two EMCCD camera modules are used in our prototype to form a computer-controlled array; however, larger arrays are under development. The new GUI provides patient registration, EMCCD module control, image acquisition, and patient image review. Images from the array are stitched into a 2kx1k pixel image that can be acquired and saved at a rate of 17 Hz (faster with pixel binning). When reviewing the patient's data, the operator can select images from the patient's directory tree listed by the GUI and cycle through the images using a slider bar. Commonly used camera parameters including exposure time, trigger mode, and individual EMCCD gain can be easily adjusted using the GUI. The GUI is designed to accommodate expansion of the EMCCD array to even larger FOVs with more modules. The high-resolution, high-sensitivity EMCCD modular-array SSXII imager with the new user-friendly GUI should enable angiographers and interventionalists to visualize smaller vessels and endovascular devices, helping them to make more accurate diagnoses and to perform more precise image-guided interventions.

  7. A generic readout system for astrophysical detectors

    NASA Astrophysics Data System (ADS)

    Doumayrou, E.; Lortholary, M.

    2012-09-01

    We have developed a generic digital platform to fulfill the needs of new detector development in astrophysics, which is used in the lab, for ground-based telescope instruments, and also in prototype versions for space instrument development. This system is based on an FPGA hardware board (called MISE) together with software on a PC (called BEAR). The MISE board generates the fast clocking that reads the detectors, using a programmable digital sequencer, and performs data acquisition, buffering of the digitized pixel outputs, and interfacing with other boards. The data are then sent to the PC via a SpaceWire or USB link. The BEAR software configures the MISE board, performs data acquisition, and enables online visualization, processing and storage of data. These software tools are written in C++ and LabVIEW (NI) on a Linux OS. MISE and BEAR form a generic acquisition architecture into which dedicated analog boards are plugged to accommodate detector specifics: the number of pixels, the readout channels and frequency, and the analog bias and clock interfaces. We have used this concept to build a camera for the P-ARTEMIS project, including a 256-pixel sub-millimeter bolometer detector read at 10 kpixel/s (SPIE 7741-12 (2010)). For the EUCLID project, a lab camera is now operating for the test of 4-Mpixel CCDs at 4×200 kpixel/s. Another is used for testing new near-infrared detectors (NIR LFSA for the ESA TRP program), 110 kpixels at 2×100 kpixels/s. Other projects are in progress for the space missions PLATO and SPICA.

  8. Office of Commercial Programs' research activities for Space Station Freedom utilization

    NASA Technical Reports Server (NTRS)

    Fountain, James A.

    1992-01-01

    One of the objectives of the Office of Commercial Programs (OCP) is to encourage, enable, and help implement space research which meets the needs of the U.S. industrial sector. This is done mainly through seventeen Centers for the Commercial Development of Space (CCDS's) which are located throughout the United States. The CCDS's are composed of members from U.S. companies, universities, and other government agencies. These Centers are presently engaged in industrial research in space using a variety of carriers to reach low Earth orbit. One of the goals is to produce a body of experience and knowledge that will allow U.S. industrial entities to make informed decisions regarding their participation in commercial space endeavors. A total of 32 items of payload hardware were built to date. These payloads have flown in space a total of 73 times. The carriers range from the KC-135 parabolic aircraft and expendable launch vehicles to the Space Shuttle. This range of carriers allows the experimenter to evolve payloads in complexity and cost by progressively extending the time in microgravity. They can start with a few seconds in the parabolic aircraft and go to several minutes on the rocket flights, before they progress to the complexities of manned flight on the Shuttle. Next year, two new capabilities will become available: COMET, an expendable-vehicle-launched experiment capsule that can carry experiments aloft for thirty days; and SPACEHAB, a new Shuttle borne module which will greatly add to the capability to accommodate small payloads. All of these commercial research activities and carrier capabilities are preparing the OCP to evolve those experiments that prove successful to Space Station Freedom. OCP and the CCDS's are actively involved in Space Station design and utilization planning and have proposed a set of experiments to be launched in 1996 and 1997. These experiments are to be conducted both internal and external to Space Station Freedom and will investigate industrial research topics which range from biotechnology to electronic materials to metallurgy. Some will be designed to make maximum use of the quiescent microgravity conditions in the 'ground-tended' phases during the early years of Space Station Freedom operations.

  9. Synergistic Effects of Phase Folding and Wavelet Denoising with Applications in Light Curve Analysis

    DTIC Science & Technology

    2016-09-15

    future research. 3 II. Astrostatistics Historically, astronomy has been a data-driven science. Larger and more precise data sets have led to the...forthcoming Large Synoptic Survey Telescope (LSST), the human-centric approach to astronomy is becoming strained [13, 24, 25, 63]. More than ever...process. One use of the filtering process is to remove artifacts from the data set. In the context of time domain astronomy , an artifact is an error in
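
    The record above is fragmentary, but the phase-folding operation named in the title is simple to state: observation times are mapped to phase modulo a trial period so that a periodic signal stacks coherently before any denoising step. The sketch below is a generic illustration with synthetic data, not the report's implementation.

      # Minimal phase-folding sketch: map observation times onto phase in [0, 1)
      # for a trial period, so a periodic signal lines up. Illustrative only.
      import numpy as np

      def phase_fold(times, values, period, t0=0.0):
          """Return (phase, values) sorted by phase for the given trial period."""
          phase = ((np.asarray(times) - t0) / period) % 1.0
          order = np.argsort(phase)
          return phase[order], np.asarray(values)[order]

      # Toy example: noisy sinusoid with a 1.7-day period, irregular sampling.
      rng = np.random.default_rng(1)
      t = np.sort(rng.uniform(0.0, 90.0, 400))          # days
      flux = 1.0 + 0.05 * np.sin(2 * np.pi * t / 1.7) + rng.normal(0, 0.02, t.size)
      phase, folded = phase_fold(t, flux, period=1.7)
      print("folded %d points; flux range %.3f to %.3f"
            % (folded.size, folded.min(), folded.max()))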

  10. Large Synoptic Survey Telescope: From Science Drivers to Reference Design

    DTIC Science & Technology

    2008-01-01

    faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter , taking an inventory of the Solar...Energy and Dark Matter (2) Taking an Inventory of the Solar System (3) Exploring the Transient Optical Sky (4) Mapping the Milky Way Each of these four...Constraining Dark Energy and Dark Matter Current models of cosmology require the exis- tence of both dark matter and dark energy to match observational

  11. Responding to the Event Deluge

    NASA Technical Reports Server (NTRS)

    Williams, Roy D.; Barthelmy, Scott D.; Denny, Robert B.; Graham, Matthew J.; Swinbank, John

    2012-01-01

    We present the VOEventNet infrastructure for large-scale rapid follow-up of astronomical events, including selection, annotation, machine intelligence, and coordination of observations. The VOEvent standard is central to this vision, with distributed and replicated services rather than centralized facilities. We also describe some of the event brokers, services, and software that are connected to the network. These technologies will become more important in the coming years, with new event streams from Gaia, LOFAR, LIGO, LSST, and many others.

  12. On the accuracy of modelling the dynamics of large space structures

    NASA Technical Reports Server (NTRS)

    Diarra, C. M.; Bainum, P. M.

    1985-01-01

    Proposed space missions will require large-scale, lightweight, space-based structural systems. Large space structure technology (LSST) systems will have to accommodate (among others): ocean data systems; electronic mail systems; large multibeam antenna systems; and space-based solar power systems. The structures are to be delivered into orbit by the space shuttle. Because of their inherent size, modelling techniques and scaling algorithms must be developed so that system performance can be predicted accurately prior to launch and assembly. When the size and weight-to-area ratio of proposed LSST systems dictate that the entire system be considered flexible, there are two basic modeling methods which can be used. The first is a continuum approach, a mathematical formulation for predicting the motion of a general orbiting flexible body, in which elastic deformations are considered small compared with characteristic body dimensions. This approach is based on a priori knowledge of the frequencies and shape functions of all modes included within the system model. Alternatively, finite element techniques can be used to model the entire structure as a system of lumped masses connected by a series of (restoring) springs and possibly dampers. In addition, a computational algorithm was developed to evaluate the coefficients of the various coupling terms in the equations of motion as applied to the finite element model of the Hoop/Column.
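
    For the lumped-mass alternative described above, the undamped mode shapes and frequencies follow from the generalized eigenvalue problem K φ = ω² M φ. The toy chain below uses arbitrary masses and stiffnesses, not values from any LSST structure, and simply shows the computation.

      # Toy lumped-mass model: three masses in a chain connected by springs.
      # Undamped natural frequencies come from K phi = w^2 M phi.
      # Values are arbitrary placeholders, not a model of any LSST structure.
      import numpy as np

      m = np.array([100.0, 150.0, 100.0])              # kg (lumped masses)
      k = 1.0e4                                        # N/m, per spring
      K = np.array([[ 2*k, -k,   0.0],
                    [-k,   2*k, -k  ],
                    [ 0.0, -k,   2*k]])
      # Reduce to a standard symmetric eigenproblem via M^(-1/2) K M^(-1/2).
      Minv_sqrt = np.diag(1.0 / np.sqrt(m))
      w2, _ = np.linalg.eigh(Minv_sqrt @ K @ Minv_sqrt)
      freqs_hz = np.sqrt(w2) / (2 * np.pi)
      print("natural frequencies [Hz]:", np.round(freqs_hz, 3))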

  13. Firefly: embracing future web technologies

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Goldina, T.; Joliet, E.; Ly, L.; Mi, W.; Wang, C.; Zhang, Lijun; Ciardi, D.; Dubois-Felsmann, G.

    2016-07-01

    At IPAC/Caltech, we have developed the Firefly web archive and visualization system. Used in production for the last eight years in many missions, Firefly gives the scientist significant capabilities to study data. Firefly provided the first completely web-based FITS viewer as well as a growing set of tabular and plotting visualizers. Further, it will be used for the science user interface of the LSST telescope, which goes online in 2021. Firefly must meet the needs of archive access and visualization for the 2021 LSST telescope and must serve astronomers beyond the year 2030. Recently, our team has faced the fact that the technology behind the Firefly software was becoming obsolete. We were searching for ways to utilize the current breakthroughs in maintaining stability, testability, speed, and reliability of large web applications, which Firefly exemplifies. In the last year, we have ported Firefly to cutting-edge web technologies. Embarking on this massive overhaul is no small feat to say the least. Choosing the technologies that will maintain a forward trajectory in a future development project is always hard and often overwhelming. When a team must port 150,000 lines of code for a production-level product there is little room to make poor choices. This paper will give an overview of the most modern web technologies and lessons learned in our conversion from a GWT-based system to a React/Redux-based system.

  14. Variable classification in the LSST era: exploring a model for quasi-periodic light curves

    NASA Astrophysics Data System (ADS)

    Zinn, J. C.; Kochanek, C. S.; Kozłowski, S.; Udalski, A.; Szymański, M. K.; Soszyński, I.; Wyrzykowski, Ł.; Ulaczyk, K.; Poleski, R.; Pietrukowicz, P.; Skowron, J.; Mróz, P.; Pawlak, M.

    2017-06-01

    The Large Synoptic Survey Telescope (LSST) is expected to yield ~10^7 light curves over the course of its mission, which will require a concerted effort in automated classification. Stochastic processes provide one means of quantitatively describing variability with the potential advantage over simple light-curve statistics that the parameters may be physically meaningful. Here, we survey a large sample of periodic, quasi-periodic and stochastic Optical Gravitational Lensing Experiment-III variables using the damped random walk (DRW; CARMA(1,0)) and quasi-periodic oscillation (QPO; CARMA(2,1)) stochastic process models. The QPO model is described by an amplitude, a period and a coherence time-scale, while the DRW has only an amplitude and a time-scale. We find that the periodic and quasi-periodic stellar variables are generally better described by a QPO than a DRW, while quasars are better described by the DRW model. There are ambiguities in interpreting the QPO coherence time due to non-sinusoidal light-curve shapes, signal-to-noise ratio, error mischaracterizations and cadence. Higher order implementations of the QPO model that better capture light-curve shapes are necessary for the coherence time to have its implied physical meaning. Independent of physical meaning, the extra parameter of the QPO model successfully distinguishes most of the classes of periodic and quasi-periodic variables we consider.
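
    For reference, the DRW (CARMA(1,0)) process is fully specified by its amplitude and damping time-scale through an exponential autocovariance; writing the amplitude as σ and the time-scale as τ (our notation, not necessarily the paper's),

        \mathrm{Cov}(\Delta t) = \sigma^{2}\,\exp\!\left(-\frac{|\Delta t|}{\tau}\right),

    while the QPO (CARMA(2,1)) model adds an oscillatory factor at the quoted period.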

  15. Transient survey rates for orphan afterglows from compact merger jets

    NASA Astrophysics Data System (ADS)

    Lamb, Gavin P.; Tanaka, Masaomi; Kobayashi, Shiho

    2018-06-01

    Orphan afterglows from short γ-ray bursts (GRBs) are potential candidates for electromagnetic (EM) counterpart searches to gravitational wave (GW) detected neutron star or neutron star black hole mergers. Various jet dynamical and structure models have been proposed that can be tested by the detection of a large sample of GW-EM counterparts. We make predictions for the expected rate of optical transients from these jet models for future survey telescopes, without a GW or GRB trigger. A sample of merger jets is generated in the redshift limits 0 ≤ z ≤ 3.0, and the expected peak r-band flux and time-scale above the Large Synoptic Survey Telescope (LSST) or Zwicky Transient Facility (ZTF) detection threshold, mr = 24.5 and 20.4, respectively, is calculated. General all-sky rates are shown for mr ≤ 26.0 and mr ≤ 21.0. The detected orphan and GRB afterglow rate depends on jet model, typically 16 ≲ R ≲ 76 yr^-1 for the LSST, and 2 ≲ R ≲ 8 yr^-1 for ZTF. An excess in the rate of orphan afterglows for a survey to a depth of mr ≤ 26 would indicate that merger jets have a dominant low-Lorentz factor population, or the jets exhibit intrinsic jet structure. Careful filtering of transients is required to successfully identify orphan afterglows from either short- or long-GRB progenitors.

  16. The Palomar Transient Factory: Introduction and Data Release

    NASA Astrophysics Data System (ADS)

    Surace, Jason Anthony

    2015-08-01

    The Palomar Transient Factory (PTF) is a synoptic sky survey in operation since 2009. PTF utilizes a 7.1 square degree camera on the Palomar 48-inch Schmidt telescope to survey the sky primarily at a single wavelength (R-band) at a rate of 1000-3000 square degrees a night, to a depth of roughly 20.5. The data are used to detect and study transient and moving objects such as gamma ray bursts, supernovae and asteroids, as well as variable phenomena such as quasars and Galactic stars. The data processing system handles realtime processing and detection of transients, solar system object processing, high photometric precision processing and light curve generation, and long-term archiving and curation. Although a significant scientific installation in and of itself, PTF also serves as the prototype for our next generation project, the Zwicky Transient Facility (ZTF). Beginning operations in 2017, ZTF will feature a 50 square degree camera which will enable scanning of the entire northern visible sky every night. ZTF in turn will serve as a stepping stone to the Large Synoptic Survey Telescope (LSST). We announce the availability of the second PTF public data release, which includes epochal images and catalogs, as well as deep (coadded) reference images and associated catalogs, for the majority of the northern sky. The epochal data span the time period from 2009 through 2012, with various cadences and coverages, typically in the tens or hundreds for most points on the sky. The data are available through both a GUI and software API portal at the Infrared Processing and Analysis Center at Caltech. The PTF and current iPTF projects are multi-partner multi-national collaborations.

  17. Biochemical, molecular, and clinical diagnoses of patients with cerebral creatine deficiency syndromes.

    PubMed

    Comeaux, Matthew S; Wang, Jing; Wang, Guoli; Kleppe, Soledad; Zhang, Victor Wei; Schmitt, Eric S; Craigen, William J; Renaud, Deborah; Sun, Qin; Wong, Lee-Jun

    2013-07-01

    Cerebral creatine deficiency syndromes (CCDS) are a group of inborn errors of creatine metabolism that involve AGAT and GAMT for creatine biosynthesis disorders and SLC6A8 for creatine transporter (CT1) deficiency. Deficiencies in the three enzymes can be distinguished by intermediate metabolite levels, and a definitive diagnosis relies on the presence of deleterious mutations in the causative genes. Mutations and unclassified variants were identified in 41 unrelated patients, and 22 of these mutations were novel. Correlation of sequencing and biochemical data reveals that using plasma guanidinoacetate (GAA) as a biomarker has 100% specificity for both AGAT and GAMT deficiencies, but AGAT deficiency has decreased sensitivity in this assay. Furthermore, the urine creatine:creatinine ratio is an effective screening test with 100% specificity in males suspected of having creatine transporter deficiency. This test has a high false-positive rate due to dietary factors or dilute urine samples and lacks sensitivity in females. We conclude that biochemical screening for plasma GAA and measuring of the urine creatine:creatinine ratio should be performed for suspected CCDS patients prior to sequencing. Also, based on the results of this study, we feel that sequencing should only be considered if a patient has abnormal biochemical results on repeat testing. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. HST/WFC3: Evolution of the UVIS Channel's Charge Transfer Efficiency

    NASA Astrophysics Data System (ADS)

    Gosmeyer, Catherine; Baggett, Sylvia M.; Anderson, Jay; WFC3 Team

    2016-06-01

    The Wide Field Camera 3 (WFC3) on the Hubble Space Telescope (HST) contains both an IR and a UVIS channel. After more than six years on orbit, the UVIS channel performance remains stable; however, on-orbit radiation damage has caused the charge transfer efficiency (CTE) of UVIS's two CCDs to degrade. This degradation is seen as vertical charge 'bleeding' from sources during readout and its effect evolves as the CCDs age. The WFC3 team has developed software to perform corrections that push the charge back to the sources, although it cannot recover faint sources that have been bled out entirely. Observers can mitigate this effect in various ways such as by placing sources near the amplifiers, observing bright targets, and by increasing the total background to at least 12 electrons, either by using a broader filter, lengthening exposure time, or post-flashing. We present results from six years of calibration data to re-evaluate the best level of total background for mitigating CTE loss and to re-verify that the pixel-based CTE correction software is performing optimally over various background levels. In addition, we alert observers that CTE-corrected products are now available for retrieval from MAST as part of the CALWF3 v3.3 pipeline upgrade.
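
    As a rough illustration of the background mitigation described above, the post-flash level needed is simply whatever the expected sky-plus-dark background does not already supply toward the ~12-electron target. The rates and function names in the sketch below are placeholders, not WFC3 reference values or tools.

      # Rough illustration of the "12 e- total background" CTE mitigation:
      # add post-flash only for whatever the sky + dark background does not
      # already provide. Rates below are placeholders, not WFC3 reference values.
      def postflash_needed(sky_rate_e_per_s, dark_rate_e_per_s,
                           exptime_s, target_background_e=12.0):
          natural = (sky_rate_e_per_s + dark_rate_e_per_s) * exptime_s
          return max(0.0, target_background_e - natural)

      for exptime in (350.0, 900.0):
          flash = postflash_needed(sky_rate_e_per_s=0.01,
                                   dark_rate_e_per_s=0.002, exptime_s=exptime)
          print("exptime %4.0f s -> post-flash %.1f e-" % (exptime, flash))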

  19. Direct imaging detectors for electron microscopy

    NASA Astrophysics Data System (ADS)

    Faruqi, A. R.; McMullan, G.

    2018-01-01

    Electronic detectors used for imaging in electron microscopy are reviewed in this paper. Much of the detector technology is based on developments in microelectronics, which have allowed the design of direct detectors with fine pixels, fast readout and sufficient radiation hardness for practical use. Detectors included in this review are hybrid pixel detectors, monolithic active pixel sensors based on CMOS technology, and pnCCDs, which share one important feature: they are all direct imaging detectors, relying on directly converting energy in a semiconductor. Traditional methods of recording images in the electron microscope, such as film and CCDs, are mentioned briefly along with a more detailed description of direct electronic detectors. Many applications benefit from the use of direct electron detectors and a few examples are mentioned in the text. In recent years one of the most dramatic advances in structural biology has been the deployment of the new backthinned CMOS direct detectors to attain near-atomic resolution molecular structures with electron cryo-microscopy (cryo-EM). The development of direct detectors, along with a number of other parallel advances, has seen a very significant amount of new information being recorded in the images, which was not previously possible, and this forms the main emphasis of the review.

  20. Design of Low-Noise Output Amplifiers for P-channel Charge-Coupled Devices Fabricated on High-Resistivity Silicon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haque, S; Frost, F Dion R.; Groulx, R

    2011-12-22

    We describe the design and optimization of low-noise, single-stage output amplifiers for p-channel charge-coupled devices (CCDs) used for scientific applications in astronomy and other fields. The CCDs are fabricated on high-resistivity, 4000–5000 Ω-cm, n-type silicon substrates. Single-stage amplifiers with different output structure designs and technologies have been characterized. The standard output amplifier is designed with an n+ polysilicon gate that has a metal connection to the sense node. In an effort to lower the output amplifier readout noise by minimizing the capacitance seen at the sense node, buried-contact technology has been investigated. In this case, the output transistor has a p+ polysilicon gate that connects directly to the p+ sense node. Output structures with buried-contact areas as small as 2 μm × 2 μm are characterized. In addition, the geometry of the source-follower transistor was varied, and we report test results on the conversion gain and noise of the various amplifier structures. By use of buried-contact technology, better amplifier geometry, optimization of the amplifier biases and improvements in the test electronics design, we obtain a 45% reduction in noise, corresponding to 1.7 e- rms at 70 kpixels/sec.
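
    The motivation for shrinking the sense-node capacitance is the usual charge-to-voltage conversion relation. With sense-node capacitance C_SN and source-follower gain A_SF (symbols chosen here for illustration, not taken from the paper),

        S_V \approx A_{\rm SF}\,\frac{q}{C_{\rm SN}},

    so, for illustrative values of C_SN ≈ 15 fF and A_SF ≈ 0.8, the output is roughly 8.5 μV per electron; lowering C_SN raises the signal per electron and therefore lowers the noise referred to the input.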

  1. In vivo characterization of hair and skin derived carbon quantum dots with high quantum yield as long-term bioprobes in zebrafish

    PubMed Central

    Zhang, Jing-Hui; Niu, Aping; Li, Jing; Fu, Jian-Wei; Xu, Qun; Pei, De-Sheng

    2016-01-01

    Carbon quantum dots (CDs) were widely investigated because of their tunable fluorescence properties and low toxicity. However, so far there have been no reports on in vivo functional studies of hair and skin derived CDs. Here, hair derived CDs (HCDs) and skin derived CDs (SCDs) were produced by using human hair and pig skin as precursors. The quantum yields (QYs) of HCDs and SCDs were quite high, compared to citric acid derived CDs (CCDs). HCDs and SCDs possess optimal photostability, hypotoxicity and biocompatibility in zebrafish, indicating that HCDs and SCDs possess the capacity of being used as fluorescence probes for in vivo biological imaging. The long-time observation for fluorescence alternation of CDs in zebrafish and the quenching assay of CDs by ATP, NADH and Fe3+ ions demonstrated that the decaying process of CDs in vivo might be induced by the synergistic effect of the metabolism process. All results indicated that large batches and high QYs of CDs can be acquired by employing natural and nontoxic hair and skin as precursors. To our knowledge, this is the first time to report SCDs, in vivo comparative studies of HCDs, SCDs and CCDs as bioprobes, and explore their mechanism of photostability in zebrafish. PMID:27886267

  2. In vivo characterization of hair and skin derived carbon quantum dots with high quantum yield as long-term bioprobes in zebrafish.

    PubMed

    Zhang, Jing-Hui; Niu, Aping; Li, Jing; Fu, Jian-Wei; Xu, Qun; Pei, De-Sheng

    2016-11-25

    Carbon quantum dots (CDs) were widely investigated because of their tunable fluorescence properties and low toxicity. However, so far there have been no reports on in vivo functional studies of hair and skin derived CDs. Here, hair derived CDs (HCDs) and skin derived CDs (SCDs) were produced by using human hair and pig skin as precursors. The quantum yields (QYs) of HCDs and SCDs were quite high, compared to citric acid derived CDs (CCDs). HCDs and SCDs possess optimal photostability, hypotoxicity and biocompatibility in zebrafish, indicating that HCDs and SCDs possess the capacity of being used as fluorescence probes for in vivo biological imaging. The long-time observation for fluorescence alternation of CDs in zebrafish and the quenching assay of CDs by ATP, NADH and Fe 3+ ions demonstrated that the decaying process of CDs in vivo might be induced by the synergistic effect of the metabolism process. All results indicated that large batches and high QYs of CDs can be acquired by employing natural and nontoxic hair and skin as precursors. To our knowledge, this is the first time to report SCDs, in vivo comparative studies of HCDs, SCDs and CCDs as bioprobes, and explore their mechanism of photostability in zebrafish.

  3. In vivo characterization of hair and skin derived carbon quantum dots with high quantum yield as long-term bioprobes in zebrafish

    NASA Astrophysics Data System (ADS)

    Zhang, Jing-Hui; Niu, Aping; Li, Jing; Fu, Jian-Wei; Xu, Qun; Pei, De-Sheng

    2016-11-01

    Carbon quantum dots (CDs) were widely investigated because of their tunable fluorescence properties and low toxicity. However, so far there have been no reports on in vivo functional studies of hair and skin derived CDs. Here, hair derived CDs (HCDs) and skin derived CDs (SCDs) were produced by using human hair and pig skin as precursors. The quantum yields (QYs) of HCDs and SCDs were quite high, compared to citric acid derived CDs (CCDs). HCDs and SCDs possess optimal photostability, hypotoxicity and biocompatibility in zebrafish, indicating that HCDs and SCDs possess the capacity of being used as fluorescence probes for in vivo biological imaging. The long-time observation for fluorescence alternation of CDs in zebrafish and the quenching assay of CDs by ATP, NADH and Fe3+ ions demonstrated that the decaying process of CDs in vivo might be induced by the synergistic effect of the metabolism process. All results indicated that large batches and high QYs of CDs can be acquired by employing natural and nontoxic hair and skin as precursors. To our knowledge, this is the first time to report SCDs, in vivo comparative studies of HCDs, SCDs and CCDs as bioprobes, and explore their mechanism of photostability in zebrafish.

  4. Photon collider: a four-channel autoguider solution

    NASA Astrophysics Data System (ADS)

    Hygelund, John C.; Haynes, Rachel; Burleson, Ben; Fulton, Benjamin J.

    2010-07-01

    The "Photon Collider" uses a compact array of four off axis autoguider cameras positioned with independent filtering and focus. The photon collider is two way symmetric and robustly mounted with the off axis light crossing the science field which allows the compact single frame construction to have extremely small relative deflections between guide and science CCDs. The photon collider provides four independent guiding signals with a total of 15 square arc minutes of sky coverage. These signals allow for simultaneous altitude, azimuth, field rotation and focus guiding. Guide cameras read out without exposure overhead increasing the tracking cadence. The independent focus allows the photon collider to maintain in focus guide stars when the main science camera is taking defocused exposures as well as track for telescope focus changes. Independent filters allow auto guiding in the science camera wavelength bandpass. The four cameras are controlled with a custom web services interface from a single Linux based industrial PC, and the autoguider mechanism and telemetry is built around a uCLinux based Analog Devices BlackFin embedded microprocessor. Off axis light is corrected with a custom meniscus correcting lens. Guide CCDs are cooled with ethylene glycol with an advanced leak detection system. The photon collider was built for use on Las Cumbres Observatory's 2 meter Faulks telescopes and currently used to guide the alt-az mount.

  5. Taking the CCDs to the ultimate performance for low threshold experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haro, Miguel; Moroni, Guillermo; Tiffenberg, Javier

    2016-11-14

    Scientific grade CCDs show attractive capabilities for the detection of particles with small energy depositions in matter. Their very low threshold of approximately 40 eV and their good spatial reconstruction of the event are key properties for the currently running CONNIE and DAMIC experiments. Both experiments can benefit from any increase in the detection efficiency for nuclear recoils at low energy. In this work we present two different approaches to increase this efficiency by increasing the SNR of events. The first is based on the reduction of the readout noise of the device, which is the main contribution of uncertainty to the signal measurement. New studies on the electronic noise from the integrated output amplifier and the readout electronics are presented, together with results from a new configuration showing a lower limit on the readout noise that can be implemented on the current setup of the CCD-based experiments. A second approach to increase the SNR of events at low energy is the study of the spatial conformation of nuclear recoil events at different depths in the active volume, through the study of new effects that deviate from the expected non-interacting diffusion model of electrons in the semiconductor.
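
    The numbers above can be connected with a little arithmetic: in silicon, one electron-hole pair is created per roughly 3.8 eV of deposited ionizing energy, so a detection threshold set at a few sigma of the readout noise maps directly into an energy threshold. The sketch below illustrates that relation; the per-pair energy and the 5-sigma choice are illustrative assumptions, not values taken from this talk.

    ```python
    # Rough arithmetic connecting CCD readout noise to an energy threshold in silicon,
    # assuming ~3.8 eV of deposited energy per electron-hole pair and a threshold set
    # at a few sigma of the readout noise. Both numbers are illustrative assumptions,
    # not values taken from this talk.
    EV_PER_PAIR = 3.8          # approximate ionization energy per e-h pair in Si (eV)

    def energy_threshold_ev(read_noise_e, n_sigma=5):
        """Energy threshold (eV) for a cut at n_sigma times the readout noise (e- rms)."""
        return n_sigma * read_noise_e * EV_PER_PAIR

    print(energy_threshold_ev(2.0))   # ~38 eV, of the order of the ~40 eV threshold quoted
    ```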

  6. Cross-Reactivity between Schistosoma mansoni Antigens and the Latex Allergen Hev b 7: Putative Implication of Cross-Reactive Carbohydrate Determinants (CCDs)

    PubMed Central

    Doenhoff, Michael J.; El-Faham, Marwa; Liddell, Susan; Fuller, Heidi R.; Stanley, Ronald G.; Schramm, Gabriele; Igetei, Joseph E.

    2016-01-01

    IgG antibodies produced by rabbits immunized against S. mansoni antigens cross-reacted with aqueous soluble constituents of a variety of allergens. The antibody cross-reactivity was largely sensitive to degradation by treatment of the target antigens with sodium meta-periodate, suggesting the cross-reactivity was due to carbohydrate determinants that were common to both the schistosome and the allergens (CCDs). The reaction between the rabbit antibodies and a 43 kDa molecule in a rubber latex extract was analysed further: tandem mass spectrometry identified the latex molecule as allergen Hev b 7. Rabbit anti-schistosome IgG antibodies purified by acid-elution from solid-phase latex Hev b 7 reacted with the S. mansoni egg antigens IPSE/alpha-1 and kappa-5 and cercarial antigens SPO-1 and a fatty acid-binding protein. Moreover, purified anti-S. mansoni egg, latex cross-reactive antibodies reacted with antigenic constituents of some fruits, a result of potential relevance to the latex-fruit syndrome of allergic reactions. We propose that IgG anti-schistosome antibodies that cross-react with allergens may be able to block IgE-induced allergic reactions and thus provide a possible explanation for the hygiene hypothesis. PMID:27467385

  7. Technology for large space systems: A special bibliography with indexes

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This bibliography lists 460 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1968 and December 31, 1978. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, and flight experiments.

  8. Technology for large space systems: A special bibliography with indexes (supplement 01)

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This bibliography lists 180 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1979 and June 30, 1979. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, and flight experiments.

  9. Control system design for the large space systems technology reference platform

    NASA Technical Reports Server (NTRS)

    Edmunds, R. S.

    1982-01-01

    Structural models and classical frequency domain control system designs were developed for the large space systems technology (LSST) reference platform which consists of a central bus structure, solar panels, and platform arms on which a variety of experiments may be mounted. It is shown that operation of multiple independently articulated payloads on a single platform presents major problems when subarc second pointing stability is required. Experiment compatibility will be an important operational consideration for systems of this type.

  10. Influence of distillers grains resulting from a cellulosic ethanol process utilizing corn kernel fiber on nutrient digestibility of lambs and steer feedlot performance.

    PubMed

    Lundy, E L; Loy, D D; Hansen, S L

    2015-05-01

    Two experiments evaluated the effects on animal performance of traditional wet distillers grains (T-WDG) compared to cellulosic wet distillers grains (C-WDG) from a new process converting corn kernel fiber into cellulosic ethanol. The resulting coproduct had greater CP and decreased starch and ether extract (EE) concentrations (34.0% CP, 1.6% starch, 7.3% EE) compared to T-WDG (32.5% CP, 5.1% starch, 7.7% EE). In Exp. 1, 10 wethers (34.1 ± 2.35 kg, SD) were used in a replicated 5 × 5 Latin square to evaluate digestibility of DM, fiber, EE, and N. Diets included a corn-based control with 7.5% T-WDG and 7.5% C-WDG (CORN), 30% or 45% inclusion of T-WDG, and 30% or 45% inclusion of C-WDG. DMI was not different (P ≥ 0.11) among CORN, 30% T-WDG, 45% T-WDG, and 45% C-WDG, but lambs fed 30% C-WDG had decreased (P ≤ 0.05) DMI compared to the other diets. Compared to CORN and 30% T-WDG, DM digestibility was lesser (P < 0.05) for 45% T-WDG and 30% C-WDG, while 45% C-WDG had lesser (P ≤ 0.05) DM digestibility than all other treatments. Digestibility of NDF was not affected by treatment (P = 0.13), and ADF digestibility was not different (P = 0.21) among CORN, 30% T-WDG, 30% C-WDG, and 45% C-WDG. However, digestibility of ADF tended to differ (P = 0.06) between 30% T-WDG and 45% C-WDG and was greater (P ≤ 0.05) in lambs fed 45% T-WDG compared to the other treatments. In Exp. 2, 168 steers (421 ± 23.9 kg, SD) were used in a randomized complete block design to determine the impact of C-WDG or T-WDG on growth performance and carcass characteristics. Diets included a corn-based control (CON), 30% T-WDG (TRAD), 30% C-WDG (CEL), and 18% C-WDG with 12% condensed corn distillers solubles (CEL+CCDS), with 7 pens of 6 steers per pen per treatment. Steers fed TRAD had improved (P ≤ 0.01) ADG, G:F, and HCW compared to steers fed the CON diet. No differences (P ≥ 0.16) in ADG and HCW were noted for steers fed CEL compared to TRAD; however, steers fed CEL had decreased (P = 0.01) G:F due to increased (P = 0.02) DMI compared to TRAD-fed steers. Steers fed CEL or CEL+CCDS did not differ (P = 0.50) in G:F, but CEL+CCDS-fed steers had lesser (P ≤ 0.01) DMI and ADG, likely due to the greater S content of the CEL+CCDS diet. Overall, while DM digestibility of lambs fed 30% C-WDG was lesser than that of lambs fed 30% T-WDG, performance of steers finished on C-WDG was similar to that of steers fed T-WDG. However, WDG from the secondary fermentation appeared to have lesser energy than T-WDG, while maintaining cattle performance similar to corn-fed controls.

  11. Low-power low-noise mixed-mode VLSI ASIC for infinite dynamic range imaging applications

    NASA Astrophysics Data System (ADS)

    Turchetta, Renato; Hu, Y.; Zinzius, Y.; Colledani, C.; Loge, A.

    1998-11-01

    Solid state solutions for imaging are mainly represented by CCDs and, more recently, by CMOS imagers. Both devices are based on the integration of the total charge generated by the impinging radiation, with no processing of the single-photon information. The dynamic range of these devices is intrinsically limited by the finite value of noise. Here we present the design of an architecture which allows efficient, in-pixel noise reduction to a practically zero level, thus allowing infinite-dynamic-range imaging. A detailed calculation of the dynamic range is worked out, showing that noise is efficiently suppressed. This architecture is based on the concept of single-photon counting. In each pixel we integrate both the front-end, low-noise, low-power analog part and the digital part. The former consists of a charge preamplifier, an active filter for optimal noise bandwidth reduction, a buffer and a threshold comparator; the latter is simply a counter, which can be programmed to act as a normal shift register for the readout of the counters' contents. Two different ASICs based on this concept have been designed for different applications. The first has been optimized for silicon edge-on microstrip detectors, used in a digital mammography R&D project. It is a 32-channel circuit with a 16-bit binary static counter, optimized for a relatively large detector capacitance of 5 pF. Noise has been measured to be 100 + 7*Cd(pF) electrons rms with the digital part in operation, showing no degradation of the noise performance with respect to the design values. The power consumption is 3.8 mW/channel for a peaking time of about 1 μs. The second circuit is a prototype for pixel imaging. The total active area is about (250 μm)². The main differences of the electronic architecture with respect to the first prototype are: i) a different optimization of the analog front-end part for low-capacitance detectors, ii) in-pixel 4-bit comparator-offset compensation, and iii) a 15-bit pseudo-random counter. The power consumption is 255 μW/channel for a peaking time of 300 ns and an equivalent noise charge of 185 + 97*Cd electrons rms. Simulation and experimental results, as well as imaging results, will be presented.
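
    The two equivalent-noise-charge (ENC) figures quoted above are linear in the detector capacitance, so they are easy to tabulate. The snippet below simply evaluates the two noise models given in the abstract; the sample capacitance values are illustrative.

    ```python
    # Evaluating the two equivalent-noise-charge (ENC) models quoted in the abstract.
    # Slopes and intercepts come directly from the text; capacitance values are illustrative.
    def enc_strip_asic(cd_pf):
        """32-channel microstrip readout chip: ENC = 100 + 7*Cd electrons rms (Cd in pF)."""
        return 100.0 + 7.0 * cd_pf

    def enc_pixel_asic(cd_pf):
        """Pixel-imaging prototype: ENC = 185 + 97*Cd electrons rms (Cd in pF)."""
        return 185.0 + 97.0 * cd_pf

    for cd in (0.1, 0.5, 1.0, 5.0):
        print(f"Cd = {cd:4.1f} pF: strip ASIC {enc_strip_asic(cd):6.1f} e-, "
              f"pixel ASIC {enc_pixel_asic(cd):6.1f} e- rms")
    ```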

  12. The CONNIE experiment

    NASA Astrophysics Data System (ADS)

    Aguilar-Arevalo, A.; Bertou, X.; Bonifazi, C.; Butner, M.; Cancelo, G.; Castaneda Vazquez, A.; Cervantes Vergara, B.; Chavez, C. R.; Da Motta, H.; D'Olivo, J. C.; Dos Anjos, J.; Estrada, J.; Fernandez Moroni, G.; Ford, R.; Foguel, A.; Hernandez Torres, K. P.; Izraelevitch, F.; Kavner, A.; Kilminster, B.; Kuk, K.; Lima, H. P., Jr.; Makler, M.; Molina, J.; Moreno-Granados, G.; Moro, J. M.; Paolini, E. E.; Sofo Haro, M.; Tiffenberg, J.; Trillaud, F.; Wagner, S.

    2016-10-01

    The CONNIE experiment uses fully depleted, high resistivity CCDs as particle detectors in an attempt to measure for the first time the Coherent Neutrino-Nucleus Elastic Scattering of antineutrinos from a nuclear reactor with silicon nuclei. This talk, given at the XV Mexican Workshop on Particles and Fields (MWPF), discussed the potential of CONNIE to perform this measurement, the installation progress at the Angra dos Reis nuclear power plant, as well as the plans for future upgrades.

  13. Signal-to-noise limitations in white light holography

    NASA Technical Reports Server (NTRS)

    Ribak, Erez; Breckinridge, James B.; Roddier, Claude; Roddier, Francois

    1988-01-01

    A simple derivation is given for the SNR in images reconstructed from incoherent holograms. Dependence is shown to be on the hologram SNR, object complexity, and the number of pixels in the detector. Reconstruction of involved objects becomes possible with high-dynamic-range detectors such as CCDs. White-light holograms have been produced by means of a rotational shear interferometer combined with a chromatic corrector. A digital inverse transform recreated the object.

  14. Development of a CCD array as an imaging detector for advanced X-ray astrophysics facilities

    NASA Technical Reports Server (NTRS)

    Schwartz, D. A.

    1981-01-01

    The development of a charge coupled device (CCD) X-ray imager for a large aperture, high angular resolution X-ray telescope is discussed. Existing CCDs were surveyed and three candidate concepts were identified. An electronic camera control and computer interface, including software to drive a Fairchild 211 CCD, is described. In addition a vacuum mounting and cooling system is discussed. Performance data for the various components are given.

  15. The CONNIE experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilar-Arevalo, A.; et al.

    2016-10-19

    The CONNIE experiment uses fully depleted, high resistivity CCDs as particle detectors in an attempt to measure for the first time the Coherent Neutrino-Nucleus Elastic Scattering of antineutrinos from a nuclear reactor with silicon nuclei. This talk, given at the XV Mexican Workshop on Particles and Fields (MWPF), discussed the potential of CONNIE to perform this measurement, the installation progress at the Angra dos Reis nuclear power plant, as well as the plans for future upgrades.

  16. CCD Astrometry with Robotic Telescopes

    NASA Astrophysics Data System (ADS)

    AlZaben, Faisal; Li, Dewei; Li, Yongyao; Dennis, Aren; Fene, Michael; Boyce, Grady; Boyce, Pat

    2016-01-01

    CCD images were acquired of three binary star systems: WDS06145+1148, WDS06206+1803, and WDS06224+2640. The astrometric solution, position angle, and separation of each system were calculated with MaximDL v6 and Mira Pro x64 software suites. The results were consistent with historical measurements in the Washington Double Star Catalog. Our analysis found some differences in measurements between single-shot color CCD cameras and traditional monochrome CCDs using a filter wheel.

  17. Center for Mapping, Ohio State University

    NASA Technical Reports Server (NTRS)

    Starr, Lowell

    1991-01-01

    There are many future opportunities for Centers for the Commercial Development of Space (CCDS) activities that are directly linked to industry strategic objectives. In the fields of mapping, remote sensing, and geographic information systems (GIS), the near term opportunities may exceed all that have occurred in the past 10 years. It is strongly believed that a national spatial data infrastructure must be established in this country, if we are to remain a leader in the information age.

  18. The Complete Calibration of the Color–Redshift Relation (C3R2) Survey: Survey Overview and Data Release 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masters, Daniel C.; Stern, Daniel K.; Rhodes, Jason D.

    A key goal of the Stage IV dark energy experiments Euclid, LSST, and WFIRST is to measure the growth of structure with cosmic time from weak lensing analysis over large regions of the sky. Weak lensing cosmology will be challenging: in addition to highly accurate galaxy shape measurements, statistically robust and accurate photometric redshift (photo-z) estimates for billions of faint galaxies will be needed in order to reconstruct the three-dimensional matter distribution. Here we present an overview of and initial results from the Complete Calibration of the Color–Redshift Relation (C3R2) survey, which is designed specifically to calibrate the empirical galaxy color–redshift relation to the Euclid depth. These redshifts will also be important for the calibrations of LSST and WFIRST. The C3R2 survey is obtaining multiplexed observations with Keck (DEIMOS, LRIS, and MOSFIRE), the Gran Telescopio Canarias (GTC; OSIRIS), and the Very Large Telescope (VLT; FORS2 and KMOS) of a targeted sample of galaxies that are most important for the redshift calibration. We focus spectroscopic efforts on undersampled regions of galaxy color space identified in previous work in order to minimize the number of spectroscopic redshifts needed to map the color–redshift relation to the required accuracy. We present the C3R2 survey strategy and initial results, including the 1283 high-confidence redshifts obtained in the 2016A semester and released as Data Release 1.

  19. The LSST Dome final design

    NASA Astrophysics Data System (ADS)

    DeVries, J.; Neill, D. R.; Barr, J.; De Lorenzi, Simone; Marchiori, Gianpietro

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) is a large (8.4 meter) wide-field (3.5 degree) survey telescope, which will be located on the Cerro Pachón summit in Chile. As a result of the Telescope's wide field of view, the optical system is unusually susceptible to stray light. In addition, balancing the effect of wind-induced telescope vibrations with Dome seeing is crucial. The rotating enclosure system (Dome) includes a moving wind screen and light baffle system. All of the Dome vents include hinged light baffles, which provide exceptional Dome flushing and stray light attenuation, and allow for vent maintenance access from inside the Dome. The wind screen also functions as a light screen, and helps define a clear optical aperture for the Telescope. The Dome must operate continuously without rotational travel limits to accommodate the Telescope cadence and travel. Consequently, the Azimuth drives are located on the fixed lower enclosure to accommodate glycol water cooling without the need for a utility cable wrap. An air duct system aligns when the Dome is in its parked position, and this provides air cooling for temperature conditioning of the Dome during the daytime. A bridge crane and a series of ladders, stairs and platforms provide for the inspection, maintenance and repair of all of the Dome mechanical systems. The contract to build the Dome was awarded to European Industrial Engineering in Mestre, Italy in May 2015. In this paper, we present the final design of this telescope and site sub-system.

  20. Performance Characterization of UV Science Cameras Developed for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP)

    NASA Technical Reports Server (NTRS)

    Champey, Patrick; Kobayashi, Ken; Winebarger, Amy; Cirtain, Jonathan; Hyde, David; Robertson, Bryan; Beabout, Brent; Beabout, Dyana; Stewart, Mike

    2014-01-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-alpha and to detect the Hanle effect in the line core. Due to the nature of Lyman-alpha polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1% in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1% polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. Coating the e2v CCD57-10 512x512 detectors with Lumogen-E allows for a relatively high (30%) quantum efficiency at the Lyman-alpha line. The CLASP cameras were designed to operate with ≤10 e-/pixel/second dark current, ≤25 e- read noise, a gain of 2.0 and ≤0.1% residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera: dark current, read noise, camera gain and residual non-linearity.

  1. Performance Characterization of UV Science Cameras Developed for the Chromospheric Lyman-Alpha Spectro-Polarimeter

    NASA Technical Reports Server (NTRS)

    Champey, P.; Kobayashi, K.; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, D.; Beabout, B.; Stewart, M.

    2014-01-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-alpha and to detect the Hanle effect in the line core. Due to the nature of Lyman-alpha polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1 percent in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1 percent polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. Coating the e2v CCD57-10 512x512 detectors with Lumogen-E coating allows for a relatively high (30 percent) quantum efficiency at the Lyman-alpha line. The CLASP cameras were designed to operate with 10 e-/pixel/second dark current, 25 e- read noise, a gain of 2.0 +/- 0.5 and 1.0 percent residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera; dark current, read noise, camera gain and residual non-linearity.
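
    A common way to obtain gain and read-noise numbers like those quoted above is the mean-variance (photon transfer) method, using pairs of bias and flat-field frames at several illumination levels. The sketch below is a minimal illustration of that standard technique; it is not a description of the actual CLASP test procedure, and the function and variable names are hypothetical.

    ```python
    import numpy as np

    # Minimal sketch of the mean-variance (photon transfer) method for measuring camera
    # gain (e-/ADU) and read noise. Frame pairs at several illumination levels are assumed;
    # this is an illustration of the standard technique, not the CLASP test procedure.
    def photon_transfer(flat_pairs, bias_pairs):
        """flat_pairs / bias_pairs: lists of (frame_a, frame_b) 2-D arrays in ADU."""
        bias_level = np.mean([0.5 * (a + b) for a, b in bias_pairs])
        # Read noise in ADU from the difference of two bias frames.
        read_noise_adu = np.mean([np.std(a - b) / np.sqrt(2) for a, b in bias_pairs])
        means, variances = [], []
        for a, b in flat_pairs:
            means.append(0.5 * (a + b).mean() - bias_level)
            variances.append(np.var(a - b) / 2.0)   # pair difference removes fixed pattern
        # Shot-noise regime: variance(ADU) = mean(ADU) / gain + read_noise(ADU)^2
        slope, _ = np.polyfit(means, variances, 1)
        gain = 1.0 / slope                          # e- per ADU
        return gain, gain * read_noise_adu          # (gain, read noise in electrons)
    ```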

  2. TASS - The Amateur Sky Survey

    NASA Astrophysics Data System (ADS)

    Droege, T. F.; Albertson, C.; Gombert, G.; Gutzwiller, M.; Molhant, N. W.; Johnson, H.; Skvarc, J.; Wickersham, R. J.; Richmond, M. W.; Rybski, P.; Henden, A.; Beser, N.; Pittinger, M.; Kluga, B.

    1997-05-01

    As a non-astronomer watching Shoemaker-Levy 9 crash into Jupiter through postings on sci.astro, it occurred to me that it might be fun to build a comet-finding machine. After wild speculations on how such a device might be built - I considered a 26" x 40" fresnel lens and a string of pin diodes - postings to sci.astro brought me down to earth. I quickly made contact with both professionals and amateurs and found that there was interesting science to be done with an all-sky survey. After several prototype drift scan cameras were built using various CCDs, I determined that the real problem was software. How does one get the software written for an all-sky survey? Willie Sutton could tell you: "Go where the programmers are." Our strategy has been to build a bunch of drift scan cameras and just give them away (without software) to programmers found on the Internet. This author reports more success by this technique than when he had a business and hired and paid programmers at a cost of a million or so a year. To date, 22 drift scan cameras have been constructed. Most of these are operated as triplets spaced 15 degrees apart in Right Ascension and with I, V, I filters. The cameras use 135 mm focal length, f/2.8 camera lenses for a plate scale of 14 arc seconds per pixel and reach magnitude 13. With 512 pixels across the drift scan direction and running through the night, a triplet will collect 200 Mb of data on three overlapping areas of 3 x 120 degrees each. To date, four of the triplets and one single camera have taken data. Production has started on 25 second-generation cameras using 2k x 2k devices and a barn-door mount.
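
    The quoted plate scale follows from simple geometry: the scale in arcseconds per pixel is 206265 times the pixel pitch divided by the focal length. The abstract gives only the 135 mm focal length and the ~14"/pixel result, so the pixel pitch in the sketch below is an inference used for illustration, not a value stated in the original.

    ```python
    # Plate-scale arithmetic behind the numbers quoted above:
    # scale ["/pixel] = 206265 * pixel_pitch / focal_length.
    # The ~14"/pixel figure with a 135 mm lens implies a pixel pitch near 9 um;
    # the actual CCD pixel size is not stated in the abstract.
    def plate_scale_arcsec(pixel_pitch_um, focal_length_mm):
        return 206265.0 * (pixel_pitch_um * 1e-3) / focal_length_mm

    print(plate_scale_arcsec(9.0, 135.0))   # ~13.8 arcsec per pixel
    ```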

  3. Graphical User Interface for a Dual-Module EMCCD X-ray Detector Array.

    PubMed

    Wang, Weiyuan; Ionita, Ciprian; Kuhls-Gilcrist, Andrew; Huang, Ying; Qu, Bin; Gupta, Sandesh K; Bednarek, Daniel R; Rudin, Stephen

    2011-03-16

    A new Graphical User Interface (GUI) was developed using Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) for a high-resolution, high-sensitivity Solid State X-ray Image Intensifier (SSXII), which is a new x-ray detector for radiographic and fluoroscopic imaging, consisting of an array of Electron-Multiplying CCDs (EMCCDs) each having a variable on-chip electron-multiplication gain of up to 2000× to reduce the effect of readout noise. To enlarge the field-of-view (FOV), each EMCCD sensor is coupled to an x-ray phosphor through a fiberoptic taper. Two EMCCD camera modules are used in our prototype to form a computer-controlled array; however, larger arrays are under development. The new GUI provides patient registration, EMCCD module control, image acquisition, and patient image review. Images from the array are stitched into a 2k×1k pixel image that can be acquired and saved at a rate of 17 Hz (faster with pixel binning). When reviewing the patient's data, the operator can select images from the patient's directory tree listed by the GUI and cycle through the images using a slider bar. Commonly used camera parameters including exposure time, trigger mode, and individual EMCCD gain can be easily adjusted using the GUI. The GUI is designed to accommodate expansion of the EMCCD array to even larger FOVs with more modules. The high-resolution, high-sensitivity EMCCD modular-array SSXII imager with the new user-friendly GUI should enable angiographers and interventionalists to visualize smaller vessels and endovascular devices, helping them to make more accurate diagnoses and to perform more precise image-guided interventions.

  4. Characterization of a fully depleted CCD on high-resistivity silicon

    NASA Astrophysics Data System (ADS)

    Stover, Richard J.; Wei, Mingzhi; Lee, Y.; Gilmore, David K.; Holland, S. E.; Groom, D. E.; Moses, William W.; Perlmutter, Saul; Goldhaber, G.; Pennypacker, C.; Wang, N. W.; Palaio, N.

    1997-04-01

    Most scientific CCD imagers are fabricated on 30-50 Ω-cm epitaxial silicon. When illuminated from the front side of the device they generally have low quantum efficiency in the blue region of the visible spectrum because of strong absorption in the polycrystalline silicon gates, as well as poor quantum efficiency in the far red and near infrared region of the spectrum because of the shallow depletion depth of the low-resistivity silicon. To enhance the blue response of scientific CCDs they are often thinned and illuminated from the back side. While blue response is greatly enhanced by this process, it is expensive and it introduces additional problems for the red end of the spectrum. A typical thinned CCD is 15 to 25 micrometers thick, and at wavelengths beyond about 800 nm the absorption depth becomes comparable to the thickness of the device, leading to interference fringes from reflected light. Because these interference fringes are of high order, the spatial pattern of the fringes is extremely sensitive to small changes in the optical illumination of the detector. Calibration and removal of the effects of the fringes is one of the primary limitations on the performance of astronomical images taken at wavelengths of 800 nm or more. In this paper we present results from the characterization of a CCD which promises to address many of the problems of typical thinned CCDs. The CCD reported on here was fabricated at Lawrence Berkeley National Laboratory (LBNL) on a 10-12 kΩ-cm n-type silicon substrate. The CCD is a 200 x 200 array of 15-micrometer square pixels, and due to the very high resistivity of the starting material, the entire 300-micrometer substrate is depleted. Full depletion works because of the gettering technology developed at LBNL, which keeps leakage current down. Both front-side illuminated and back-side illuminated devices have been tested. We have measured quantum efficiency, read noise, full well, charge-transfer efficiency, and leakage current. We have also observed the effects of clocking waveform shapes on spurious charge generation. While these new CCDs promise to be a major advance in CCD technology, they too have limitations, such as charge spreading and cosmic-ray effects. These limitations have been characterized and are presented. Examples of astronomical observations obtained with the backside CCD on the 1-meter reflector at Lick Observatory are presented.
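
    For a feel of the fringing argument above, the spectral spacing of interference fringes in a silicon layer of thickness d and refractive index n is approximately lambda^2 / (2 n d). The refractive index and thicknesses in the sketch below are representative values, not numbers taken from the paper.

    ```python
    # Etalon arithmetic for the fringing discussion above: the spectral spacing of
    # interference fringes in a silicon layer of thickness d and refractive index n is
    # roughly lambda^2 / (2 n d). The values used here are representative, not from the paper.
    def fringe_spacing_nm(wavelength_nm, thickness_um, n_si=3.6):
        return wavelength_nm**2 / (2.0 * n_si * thickness_um * 1.0e3)

    print(fringe_spacing_nm(900.0, 20.0))    # ~5.6 nm for a ~20-um thinned CCD
    print(fringe_spacing_nm(900.0, 300.0))   # ~0.4 nm for a 300-um thick device
    ```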

  5. Curved CCD detector devices and arrays for multispectral astrophysical applications and terrestrial stereo panoramic cameras

    NASA Astrophysics Data System (ADS)

    Swain, Pradyumna; Mark, David

    2004-09-01

    The emergence of curved CCD detectors as individual devices or as contoured mosaics assembled to match the curved focal planes of astronomical telescopes and terrestrial stereo panoramic cameras represents a major optical design advancement that greatly enhances the scientific potential of such instruments. In altering the primary detection surface within the telescope's optical instrumentation system from flat to curved, and conforming the applied CCD's shape precisely to the contour of the telescope's curved focal plane, a major increase in the amount of transmittable light at various wavelengths through the system is achieved. This in turn enables multi-spectral ultra-sensitive imaging with the much greater spatial resolution necessary for large and very large telescope applications, including those involving infrared image acquisition and spectroscopy, conducted over very wide fields of view. For earth-based and space-borne optical telescopes, the advent of curved CCDs as the principal detectors provides a simplification of the telescope's adjoining optics, reducing the number of optical elements and the occurrence of optical aberrations associated with large corrective optics used to conform to flat detectors. New astronomical experiments may be devised around curved CCD applications, in conjunction with large format cameras and curved mosaics, including three dimensional imaging spectroscopy conducted over multiple wavelengths simultaneously, wide field real-time stereoscopic tracking of remote objects within the solar system at high resolution, and deep field survey mapping of distant objects such as galaxies with much greater multi-band spatial precision over larger sky regions. Terrestrial stereo panoramic cameras equipped with arrays of curved CCDs joined with associative wide field optics will require less optical glass and no mechanically moving parts to maintain continuous proper stereo convergence over wider perspective viewing fields than their flat CCD counterparts, lightening the cameras and enabling faster scanning and 3D integration of objects moving within a planetary terrain environment. Preliminary experiments conducted at the Sarnoff Corporation indicate the feasibility of curved CCD imagers with acceptable electro-optic integrity. Currently, we are in the process of evaluating the electro-optic performance of a curved wafer scale CCD imager. Detailed ray trace modeling and experimental electro-optical performance data obtained from the curved imager will be presented at the conference.

  6. Characterization of full-length sequenced cDNA inserts (FLIcs) from Atlantic salmon (Salmo salar)

    PubMed Central

    Andreassen, Rune; Lunner, Sigbjørn; Høyheim, Bjørn

    2009-01-01

    Background Sequencing of the Atlantic salmon genome is now being planned by an international research consortium. Full-length sequenced inserts from cDNAs (FLIcs) are an important tool for correct annotation and clustering of the genomic sequence in any species. The large amount of highly similar duplicate sequences caused by the relatively recent genome duplication in the salmonid ancestor represents a particular challenge for the genome project. FLIcs will therefore be an extremely useful resource for the Atlantic salmon sequencing project. In addition to be helpful in order to distinguish between duplicate genome regions and in determining correct gene structures, FLIcs are an important resource for functional genomic studies and for investigation of regulatory elements controlling gene expression. In contrast to the large number of ESTs available, including the ESTs from 23 developmental and tissue specific cDNA libraries contributed by the Salmon Genome Project (SGP), the number of sequences where the full-length of the cDNA insert has been determined has been small. Results High quality full-length insert sequences from 560 pre-smolt white muscle tissue specific cDNAs were generated, accession numbers [GenBank: BT043497 - BT044056]. Five hundred and ten (91%) of the transcripts were annotated using Gene Ontology (GO) terms and 440 of the FLIcs are likely to contain a complete coding sequence (cCDS). The sequence information was used to identify putative paralogs, characterize salmon Kozak motifs, polyadenylation signal variation and to identify motifs likely to be involved in the regulation of particular genes. Finally, conserved 7-mers in the 3'UTRs were identified, of which some were identical to miRNA target sequences. Conclusion This paper describes the first Atlantic salmon FLIcs from a tissue and developmental stage specific cDNA library. We have demonstrated that many FLIcs contained a complete coding sequence (cCDS). This suggests that the remaining cDNA libraries generated by SGP represent a valuable cCDS FLIc source. The conservation of 7-mers in 3'UTRs indicates that these motifs are functionally important. Identity between some of these 7-mers and miRNA target sequences suggests that they are miRNA targets in Salmo salar transcripts as well. PMID:19878547

  7. Direct Search for Low Mass Dark Matter Particles with CCDs

    DOE PAGES

    Barreto, J.; Cease, H.; Diehl, H. T.; ...

    2012-05-15

    A direct dark matter search is performed using fully-depleted high-resistivity CCD detectors. Due to their low electronic readout noise (RMS ~7 eV) these devices operate with a very low detection threshold of 40 eV, making the search for dark matter particles with low masses (~5 GeV) possible. The results of an engineering run performed in a shallow underground site are presented, demonstrating the potential of this technology in the low mass region.

  8. VizieR Online Data Catalog: CONCH-SHELL catalog of nearby M dwarfs (Gaidos+, 2014)

    NASA Astrophysics Data System (ADS)

    Gaidos, E.; Mann, A. W.; Lepine, S.; Buccino, A.; James, D.; Ansdell, M.; Petrucci, R.; Mauas, P.; Hilton, E. J.

    2015-04-01

    Lepine et al. 2011 (J/AJ/142/138) selected candidate M dwarfs as stars that were (i) bright (J<10), (ii) red (V-J>2.7), (iii) had absolute magnitudes, or reduced proper motions as proxies for absolute magnitudes, consistent with the main sequence, and (iv) had infrared Two Micron All-Sky Survey (2MASS; Skrutskie et al. 2006, Cat. II/246) JHKs colours consistent with M dwarfs. In this work, we constructed a revised catalogue of J<9 M dwarfs using modified criteria and new photometry from APASS. Spectroscopic observations with a resolution of ~1000 were obtained with the SuperNova Integral Field Spectrograph (SNIFS) on the University of Hawaii 2.2m telescope on Maunakea, Hawaii, the Mark III spectrograph and Boller & Chivens CCDS spectrograph (CCDS) on the 1.3m McGraw-Hill telescope at the MDM Observatory on Kitt Peak, Arizona, the REOSC spectrograph on the 2.15m Jorge Sahade telescope at the Complejo Astronomico El Leoncito Observatory (CASLEO), Argentina, and the RC spectrograph on the 1.9m Radcliffe telescope at the South African Astronomical Observatory. We obtained a total of 3071 spectra of 2583 stars, or 86% of the catalog, over the span 2002-2014 of more than 11 years. 425 stars were observed twice, 14 stars were observed thrice, and 6 stars had more than four observations. (2 data files).

  9. Composite x-ray image assembly for large-field digital mammography with one- and two-dimensional positioning of a focal plane array

    NASA Technical Reports Server (NTRS)

    Halama, G.; McAdoo, J.; Liu, H.

    1998-01-01

    To demonstrate the feasibility of a novel large-field digital mammography technique, a 1024 x 1024 pixel Loral charge-coupled device (CCD) focal plane array (FPA) was positioned in a mammographic field with one- and two-dimensional scan sequences to obtain 950 x 1800 pixel and 3600 x 3600 pixel composite images, respectively. These experiments verify that precise positioning of FPAs produced seamless composites and that the CCD mosaic concept has potential for high-resolution, large-field imaging. The proposed CCD mosaic concept resembles a checkerboard pattern with spacing left between the CCDs for the driver and readout electronics. To obtain a complete x-ray image, the mosaic must be repositioned four times, with an x-ray exposure at each position. To reduce the patient dose, a lead shield with appropriately patterned holes is placed between the x-ray source and the patient. The high-precision motorized translation stages and the fiber-coupled-scintillating-screen-CCD sensor assembly were placed in the position usually occupied by the film cassette. Because of the high mechanical precision, seamless composites were constructed from the subimages. This paper discusses the positioning, image alignment procedure, and composite image results. The paper only addresses the formation of a seamless composite image from subimages and will not consider the effects of the lead shield, multiple CCDs, or the speed of motion.
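
    Since the sub-images are placed by high-precision stages, assembling the composite reduces to pasting each frame at its known offset and averaging wherever exposures overlap. The sketch below is a generic illustration of that bookkeeping, with hypothetical names and shapes; it is not the authors' software.

    ```python
    import numpy as np

    # Generic sketch of assembling a composite from precisely positioned sub-images,
    # assuming each sub-frame comes with a known integer (row, col) offset from the
    # translation stages. Names, shapes and the overlap handling are illustrative only.
    def assemble_composite(subimages, offsets, full_shape):
        """subimages: list of 2-D arrays; offsets: list of integer (row, col) placements."""
        composite = np.zeros(full_shape, dtype=np.float64)
        coverage = np.zeros(full_shape, dtype=np.float64)
        for img, (r0, c0) in zip(subimages, offsets):
            h, w = img.shape
            composite[r0:r0 + h, c0:c0 + w] += img
            coverage[r0:r0 + h, c0:c0 + w] += 1.0
        return composite / np.maximum(coverage, 1.0)   # average wherever frames overlap
    ```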

  10. Hyper Suprime-Cam: development of the CCD readout electronics

    NASA Astrophysics Data System (ADS)

    Nakaya, Hidehiko; Uchida, Tomohisa; Miyatake, Hironao; Fujimori, Hiroki; Mineo, Sogo; Aihara, Hiroaki; Furusawa, Hisanori; Kamata, Yukiko; Karoji, Hiroshi; Kawanomoto, Satoshi; Komiyama, Yutaka; Miyazaki, Satoshi; Morokuma, Tomoki; Obuchi, Yoshiyuki; Okura, Yuki; Tanaka, Manobu; Tanaka, Yoko; Uraguchi, Fumihiro; Utsumi, Yosuke

    2010-07-01

    Hyper Suprime-Cam (HSC) employs 116 2k×4k CCDs with 464 signal outputs in total. The image size exceeds 2 GBytes, and the data can be read out every 10 seconds, which results in a data rate of 210 Mbytes/sec. The data is digitized to 16 bits. The readout noise of the electronics at a readout time of 20 seconds is ~0.9 ADU, and that with the CCD connected is ~1.5 ADU, which corresponds to ~4.5 e-. The linearity error is within +/-0.5% up to 150,000 e-. The CCD readout electronics for HSC were newly developed based on the electronics for Suprime-Cam. The frontend electronics (FEE) is placed in the vacuum dewar, and the backend electronics (BEE) is mounted on the outside of the dewar on the prime focus unit. The FEE boards were designed to minimize outgassing and to maximize the heat transfer efficiency, to keep the vacuum of the dewar. The BEE boards were designed to be as simple and small as possible while achieving a readout time within 10 seconds. Production of the system has been finished, and the full set of boards is being tested with several CCDs installed in the HSC dewar. We will show the system design, performance, and the current status of the development.
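
    The quoted image size and data rate can be sanity-checked from the pixel counts alone; the back-of-the-envelope sketch below ignores overscan and header overhead, which presumably accounts for the difference from the quoted ~210 Mbytes/sec.

    ```python
    # Back-of-the-envelope check of the image size and data rate quoted above:
    # 116 CCDs of 2048 x 4096 pixels at 16 bits, read out every 10 seconds.
    n_ccd, cols, rows, bytes_per_pix = 116, 2048, 4096, 2
    frame_bytes = n_ccd * cols * rows * bytes_per_pix
    print(frame_bytes / 1e9)         # ~1.95 GB per exposure (before overscan/headers)
    print(frame_bytes / 1e6 / 10.0)  # ~195 MB/s sustained over a 10 s readout
    ```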

  11. CMOS Imaging Sensor Technology for Aerial Mapping Cameras

    NASA Astrophysics Data System (ADS)

    Neumann, Klaus; Welzenbach, Martin; Timm, Martin

    2016-06-01

    In June 2015 Leica Geosystems launched the first large-format aerial mapping camera using CMOS sensor technology, the Leica DMC III. This paper describes the motivation to change from CCD sensor technology to CMOS for the development of this new aerial mapping camera. In 2002 the DMC first generation was developed by Z/I Imaging. It was the first large-format digital frame sensor designed for mapping applications. In 2009 Z/I Imaging designed the DMC II, which was the first digital aerial mapping camera using a single ultra-large CCD sensor to avoid stitching of smaller CCDs. The DMC III is now the third generation of large-format frame sensor developed by Z/I Imaging and Leica Geosystems for the DMC camera family. It is an evolution of the DMC II, using the same system design with one large monolithic PAN sensor and four multispectral camera heads for R, G, B and NIR. For the first time, a 391-megapixel CMOS sensor has been used as the panchromatic sensor, which is an industry record. Along with CMOS technology goes a range of technical benefits. The dynamic range of the CMOS sensor is approximately twice that of a comparable CCD sensor, and the signal-to-noise ratio is significantly better than with CCDs. Finally, results from the first DMC III customer installations and test flights will be presented and compared with other CCD-based aerial sensors.

  12. VizieR Online Data Catalog: Spectroscopic and photometric properties of Tombaugh 1 (Sales+, 2016)

    NASA Astrophysics Data System (ADS)

    Sales Silva, J. V.; Carraro, G.; Anthony-Twarog, B. J.; Moni Bidin, C.; Costa, E.; Twarog, B. A.

    2018-03-01

    Photometry for Tombaugh 1 was secured in 2010 December during a five-night run using the Cerro Tololo Inter-American Observatory 1.0 m telescope operated by the SMARTS consortium (http://www.astro.yale.edu/smarts). The telescope is equipped with an STA 4064x4064 CCD camera (http://www.astronomy.ohio-state.edu/Y4KCam/detector) with 15 μm pixels, yielding a scale of 0.289"/pixel and a field of view (FOV) of 20'x20' at the Cassegrain focus of the telescope. Over the night of 2010 January 5, we observed 10 potential cluster stars (nine clump stars and one Cepheid; see Section 4.1) with the Inamori-Magellan Areal Camera & Spectrograph (IMACS; Dressler et al. 2006SPIE.6269E..0FD) attached to the Magellan telescope (6.5 m) located at Las Campanas, Chile. The spectra were obtained using the Multi-Object Echelle (MOE) mode with two exposures, one of 900 s and the other of 1200 s. Our spectra have a resolution of R~20000, while the spectral coverage depends on the location of the star on the multislit mask, but it generally goes from 4200 to 9100 Å. The detector consists of a mosaic with eight CCDs with gaps of about 0.93 mm between the CCDs, causing small gaps in stellar spectra. (7 data files).

  13. Quantum efficiency measurements of eROSITA pnCCDs

    NASA Astrophysics Data System (ADS)

    Ebermayer, Stefanie; Andritschke, Robert; Elbs, Johannes; Meidinger, Norbert; Strüder, Lothar; Hartmann, Robert; Gottwald, Alexander; Krumrey, Michael; Scholze, Frank

    2010-07-01

    For the eROSITA X-ray telescope, which is planned to be launched in 2012, detectors were developed and fabricated at the MPI Semiconductor Laboratory. The fully depleted, back-illuminated pnCCDs have an ultrathin pn-junction to improve the low-energy X-ray response function and quantum efficiency. The full 450 μm device thickness is sensitive to X-ray photons, yielding a high quantum efficiency of more than 90% at photon energies of 10 keV. An on-chip filter is deposited on top of the entrance window to suppress visible and UV light, which would interfere with the X-ray observations. The pnCCD type developed for the eROSITA telescope was characterized in terms of quantum efficiency and spectral response function. The described measurements were performed in 2009 at the synchrotron radiation sources BESSY II and MLS as a cooperation between the MPI Semiconductor Laboratory and the Physikalisch-Technische Bundesanstalt (PTB). Quantum efficiency measurements over a wide range of photon energies from 3 eV to 11 keV, as well as spectral response measurements, are presented. For X-ray energies from 3 keV to 10 keV the quantum efficiency of the CCD including the on-chip filter is shown to be above 90%, with an attenuation of visible light of more than five orders of magnitude. A detector response model is described and compared to the measurements.
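
    The simplest ingredient of such a response model is photoabsorption in the sensitive thickness: a photon is counted if it is absorbed within the depleted 450 μm, so QE ≈ 1 - exp(-d/l_att), ignoring entrance-window and filter losses. The attenuation length used in the sketch below is only a representative value for hard X-rays, not a number from the paper.

    ```python
    import numpy as np

    # Hedged sketch of the bulk-absorption term of a pnCCD quantum-efficiency model:
    # QE ~ 1 - exp(-thickness / attenuation_length), ignoring entrance-window and
    # on-chip-filter losses. Real attenuation lengths must be taken from tabulated
    # silicon data; the value used below is only representative for ~10 keV photons.
    def qe_bulk_silicon(attenuation_length_um, thickness_um=450.0):
        return 1.0 - np.exp(-thickness_um / attenuation_length_um)

    print(qe_bulk_silicon(130.0))   # ~0.97 for an attenuation length of ~130 um
    ```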

  14. The transition of ground-based space environmental effects testing to the space environment

    NASA Technical Reports Server (NTRS)

    Zaat, Stephen V.; Schaefer, Glen A.; Wallace, John F.

    1991-01-01

    The goal of the space flight program at the Center for Commercial Development of Space (CCDS)--Materials for Space Structures is to provide environmentally stable structural materials to support the continued humanization and commercialization of the space frontier. Information on environmental stability will be obtained through space exposure, evaluation, documentation, and subsequent return to the supplier of the candidate material for internal investigation. This program provides engineering and scientific service to space systems development firms and also exposes CCDS development candidate materials to space environments representative of in-flight conditions. The maintenance of a technological edge in space for NASA suggests the immediate search for space materials that maintain their structural integrity and remain environmentally stable. The materials being considered for long-lived space structures are complex, high strength/weight ratio composites. In order for these new candidate materials to qualify for use in space structures, they must undergo strenuous testing to determine their reliability and stability when subjected to the space environment. Ultraviolet radiation, atomic oxygen, debris/micrometeoroids, charged particles radiation, and thermal fatigue all influence the design of space structural materials. The investigation of these environmental interactions is the key purpose of this center. Some of the topics discussed with respect to the above information include: the Space Transportation System, mission planning, spaceborne experiments, and space flight payloads.

  15. Technology for Large Space Systems: A Special Bibliography with Indexes (Supplement 2)

    NASA Technical Reports Server (NTRS)

    1980-01-01

    This bibliography lists 258 reports, articles, and other documents introduced into the NASA scientific and technical information system between July 1, 1979 and December 31, 1979. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.

  16. Time-Resolved Surveys of Stellar Clusters

    NASA Astrophysics Data System (ADS)

    Eyer, Laurent; Eggenberger, Patrick; Greco, Claudia; Saesen, Sophie; Anderson, Richard I.; Mowlavi, Nami

    We describe the information that can be gained when a survey is carried out over multiple epochs, and its particular impact on open cluster research. We first explain the irreplaceable information that multi-epoch observations provide in astrometry, photometry and spectroscopy. Then we give three examples of results on open clusters from multi-epoch surveys, namely the distance to the Pleiades, the angular momentum evolution of low-mass stars, and asteroseismology. Finally, we mention several very large surveys which are ongoing or planned for the future: Gaia, JASMINE, LSST, and VVV.

  17. Technology for large space systems: A special bibliography with indexes (supplement 05)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    This bibliography lists 298 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1981 and June 30, 1981. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.

  18. Technology for large space systems: A special bibliography with indexes (supplement 06)

    NASA Technical Reports Server (NTRS)

    1982-01-01

    This bibliography lists 220 reports, articles and other documents introduced into the NASA scientific and technical information system between July 1, 1981 and December 31, 1981. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.

  19. Absolute calibration of a charge-coupled device camera with twin beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meda, A.; Ruo-Berchera, I., E-mail: i.ruoberchera@inrim.it; Degiovanni, I. P.

    2014-09-08

    We report on the absolute calibration of a Charge-Coupled Device (CCD) camera by exploiting quantum correlations. The method exploits a number of spatially pairwise quantum-correlated modes produced by spontaneous parametric down-conversion. We develop a measurement model accounting for all the uncertainty contributions, and we reach a relative uncertainty of 0.3% in the low-photon-flux regime. This represents a significant step forward for the characterization of (scientific) CCDs used in the mesoscopic light regime.
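
    The principle can be summarized in a few lines: for ideal twin beams, the variance of the photon-number difference between the two correlated detection regions falls below the shot-noise level by an amount set by the detection efficiency, so for balanced, loss-only channels the efficiency follows from the measured noise reduction factor. The sketch below illustrates that simplified estimator only; the paper's full measurement model treats additional uncertainty contributions.

    ```python
    import numpy as np

    # Simplified illustration of twin-beam absolute calibration: for ideal correlated
    # beams detected by two balanced, loss-only channels, the noise reduction factor
    # NRF = Var(N1 - N2) / (<N1> + <N2>) equals 1 - eta, so the efficiency follows
    # directly from photon-number statistics. Additional uncertainty contributions in
    # the paper's full measurement model are ignored here.
    def efficiency_from_twin_beams(n1, n2):
        """n1, n2: arrays of background-subtracted photon numbers per frame/region."""
        nrf = np.var(n1 - n2) / (np.mean(n1) + np.mean(n2))
        return 1.0 - nrf
    ```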

  20. Mapping Van

    NASA Technical Reports Server (NTRS)

    1994-01-01

    A NASA Center for the Commercial Development of Space (CCDS)-developed system for satellite mapping has been commercialized for the first time. Global Visions, Inc. maps an area while driving along a road in a sophisticated mapping van equipped with satellite signal receivers, video cameras and computer systems for collecting and storing mapping data. Data is fed into a computerized geographic information system (GIS). The resulting maps can be used for tax assessment purposes, by emergency dispatch vehicles and fleet delivery companies, and in other applications.

  1. CMU DeepLens: deep learning for automatic image-based galaxy-galaxy strong lens finding

    NASA Astrophysics Data System (ADS)

    Lanusse, François; Ma, Quanbin; Li, Nan; Collett, Thomas E.; Li, Chun-Liang; Ravanbakhsh, Siamak; Mandelbaum, Rachel; Póczos, Barnabás

    2018-01-01

    Galaxy-scale strong gravitational lensing can not only provide a valuable probe of the dark matter distribution of massive galaxies, but also provide valuable cosmological constraints, either by studying the population of strong lenses or by measuring time delays in lensed quasars. Due to the rarity of galaxy-scale strongly lensed systems, fast and reliable automated lens finding methods will be essential in the era of large surveys such as Large Synoptic Survey Telescope, Euclid and Wide-Field Infrared Survey Telescope. To tackle this challenge, we introduce CMU DeepLens, a new fully automated galaxy-galaxy lens finding method based on deep learning. This supervised machine learning approach does not require any tuning after the training step which only requires realistic image simulations of strongly lensed systems. We train and validate our model on a set of 20 000 LSST-like mock observations including a range of lensed systems of various sizes and signal-to-noise ratios (S/N). We find on our simulated data set that for a rejection rate of non-lenses of 99 per cent, a completeness of 90 per cent can be achieved for lenses with Einstein radii larger than 1.4 arcsec and S/N larger than 20 on individual g-band LSST exposures. Finally, we emphasize the importance of realistically complex simulations for training such machine learning methods by demonstrating that the performance of models of significantly different complexities cannot be distinguished on simpler simulations. We make our code publicly available at https://github.com/McWilliamsCenter/CMUDeepLens.
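
    As a concrete (and deliberately tiny) illustration of the kind of supervised classifier described above, the sketch below defines a generic convolutional network that maps single-band postage stamps to a lens probability. It is emphatically not the CMU DeepLens architecture, which is a much deeper residual network; the real implementation is available at the repository linked in the abstract, and the stamp and layer sizes here are arbitrary.

    ```python
    import torch
    import torch.nn as nn

    # Generic CNN binary classifier for single-band postage stamps, included only to
    # illustrate the kind of supervised lens/non-lens model described above. This is
    # NOT the CMU DeepLens architecture (a much deeper residual network; see the
    # repository linked in the abstract); all sizes here are arbitrary placeholders.
    class ToyLensFinder(nn.Module):
        def __init__(self, stamp_size=45):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            flat = 32 * (stamp_size // 4) ** 2
            self.classifier = nn.Sequential(nn.Flatten(), nn.Linear(flat, 1))

        def forward(self, x):          # x: (batch, 1, stamp, stamp) single-band cutouts
            return torch.sigmoid(self.classifier(self.features(x)))
    ```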

  2. Precision matrix expansion - efficient use of numerical simulations in estimating errors on cosmological parameters

    NASA Astrophysics Data System (ADS)

    Friedrich, Oliver; Eifler, Tim

    2018-01-01

    Computing the inverse covariance matrix (or precision matrix) of large data vectors is crucial in weak lensing (and multiprobe) analyses of the large-scale structure of the Universe. Analytically computed covariances are noise-free and hence straightforward to invert; however, the model approximations might be insufficient for the statistical precision of future cosmological data. Estimating covariances from numerical simulations improves on these approximations, but the sample covariance estimator is inherently noisy, which introduces uncertainties in the error bars on cosmological parameters and also additional scatter in their best-fitting values. For future surveys, reducing both effects to an acceptable level requires an unfeasibly large number of simulations. In this paper we describe a way to expand the precision matrix around a covariance model and show how to estimate the leading order terms of this expansion from simulations. This is especially powerful if the covariance matrix is the sum of two contributions, C = A+B, where A is well understood analytically and can be turned off in simulations (e.g. shape noise for cosmic shear) to yield a direct estimate of B. We test our method in mock experiments resembling tomographic weak lensing data vectors from the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST). For DES we find that 400 N-body simulations are sufficient to achieve negligible statistical uncertainties on parameter constraints. For LSST this is achieved with 2400 simulations. The standard covariance estimator would require >10^5 simulations to reach a similar precision. We extend our analysis to a DES multiprobe case finding a similar performance.
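
    The core idea lends itself to a compact sketch: with C = A + B, the precision matrix can be written as the series C^-1 = A^-1 - A^-1 B A^-1 + A^-1 B A^-1 B A^-1 - ..., and the leading terms can be evaluated with a simulation-based estimate of B. The truncation order and the paper's treatment of estimator noise are not reproduced in this illustrative snippet.

    ```python
    import numpy as np

    # Sketch of the series behind the method: with C = A + B, the precision matrix can
    # be expanded as C^-1 = A^-1 - A^-1 B A^-1 + A^-1 B A^-1 B A^-1 - ..., where A is
    # the analytic part and B_hat a (noisy) simulation-based estimate of B. Truncation
    # order and the paper's noise-propagation treatment are not reproduced here.
    def precision_expansion(A, B_hat, order=2):
        A_inv = np.linalg.inv(A)
        term = A_inv.copy()
        total = A_inv.copy()
        for _ in range(order):
            term = -term @ B_hat @ A_inv   # next term of the alternating series
            total += term
        return total
    ```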

  3. Strong Lens Time Delay Challenge. I. Experimental Design

    NASA Astrophysics Data System (ADS)

    Dobler, Gregory; Fassnacht, Christopher D.; Treu, Tommaso; Marshall, Phil; Liao, Kai; Hojjati, Alireza; Linder, Eric; Rumbaugh, Nicholas

    2015-02-01

    The time delays between point-like images in gravitational lens systems can be used to measure cosmological parameters. The number of lenses with measured time delays is growing rapidly; the upcoming Large Synoptic Survey Telescope (LSST) will monitor ~10^3 strongly lensed quasars. In an effort to assess the present capabilities of the community, to accurately measure the time delays, and to provide input to dedicated monitoring campaigns and future LSST cosmology feasibility studies, we have invited the community to take part in a "Time Delay Challenge" (TDC). The challenge is organized as a set of "ladders," each containing a group of simulated data sets to be analyzed blindly by participating teams. Each rung on a ladder consists of a set of realistic mock observed lensed quasar light curves, with the rungs' data sets increasing in complexity and realism. The initial challenge described here has two ladders, TDC0 and TDC1. TDC0 has a small number of data sets, and is designed to be used as a practice set by the participating teams. The (non-mandatory) deadline for completion of TDC0 was the TDC1 launch date, 2013 December 1. The TDC1 deadline was 2014 July 1. Here we give an overview of the challenge, we introduce a set of metrics that will be used to quantify the goodness of fit, efficiency, precision, and accuracy of the algorithms, and we present the results of TDC0. Thirteen teams participated in TDC0 using 47 different methods. Seven of those teams qualified for TDC1, which is described in the companion paper.

  4. Fabrication of the LSST monolithic primary-tertiary mirror

    NASA Astrophysics Data System (ADS)

    Tuell, Michael T.; Martin, Hubert M.; Burge, James H.; Ketelsen, Dean A.; Law, Kevin; Gressler, William J.; Zhao, Chunyu

    2012-09-01

    As previously reported (at the SPIE Astronomical Instrumentation conference of 2010 in San Diego), the Large Synoptic Survey Telescope (LSST) utilizes a three-mirror design in which the primary (M1) and tertiary (M3) mirrors are two concentric aspheric surfaces on one monolithic substrate. The substrate material is Ohara E6 borosilicate glass, in a honeycomb sandwich configuration, currently in production at The University of Arizona’s Steward Observatory Mirror Lab. We will provide an update to the status of the mirrors and metrology systems, which have advanced from concepts to hardware in the past two years. In addition to the normal requirements for smooth surfaces of the appropriate prescriptions, the alignment of the two surfaces must be accurately measured and controlled in the production lab, reducing the degrees of freedom needed to be controlled in the telescope. The surface specification is described as a structure function, related to seeing in excellent conditions. Both the pointing and centration of the two optical axes are important parameters, in addition to the axial spacing of the two vertices. This paper details the manufacturing process and metrology systems for each surface, including the alignment of the two surfaces. M1 is a hyperboloid and can utilize a standard Offner null corrector, whereas M3 is an oblate ellipsoid, so it has positive spherical aberration. The null corrector is a phase-etched computer-generated hologram (CGH) between the mirror surface and the center-of-curvature. Laser trackers are relied upon to measure the alignment and spacing, as well as to perform rough-surface metrology during loose-abrasive grinding.

  5. PERIODOGRAMS FOR MULTIBAND ASTRONOMICAL TIME SERIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    VanderPlas, Jacob T.; Ivezic, Željko

    This paper introduces the multiband periodogram, a general extension of the well-known Lomb–Scargle approach for detecting periodic signals in time-domain data. In addition to advantages of the Lomb–Scargle method such as treatment of non-uniform sampling and heteroscedastic errors, the multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST). The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. This decrease in the effective model complexity is the main reason for improved performance. After a pedagogical development of the formalism of least-squares spectral analysis, which motivates the essential features of the multiband model, we use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature and find that this method will be able to efficiently determine the correct period in the majority of LSST’s bright RR Lyrae stars with as little as six months of LSST data, a vast improvement over the years of data reported to be required by previous studies. A Python implementation of this method, along with code to fully reproduce the results reported here, is available on GitHub.
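
    As a concrete illustration of the shared-period, regularized multiband fit described above, the following sketch evaluates a multiband periodogram power at a single trial period. The design (per-band offsets, a shared Fourier base model, Tikhonov-suppressed per-band corrections) follows the idea in the abstract, but the function, its parameters, and the normalization are our own simplification rather than the authors' released implementation, which should be preferred in practice.

      import numpy as np

      def multiband_power(t, y, dy, band, period, n_base=2, n_band=1, reg=1e2):
          """Regularized multiband periodogram power at one trial period.

          A shared truncated Fourier 'base model' (n_base harmonics) is fit to
          all bands, plus a small per-band correction (n_band harmonics) whose
          amplitudes are suppressed by a Tikhonov penalty 'reg'.  Power is the
          fractional chi^2 reduction relative to per-band constant models.
          """
          t, y, dy, band = map(np.asarray, (t, y, dy, band))
          bands = np.unique(band)
          omega = 2.0 * np.pi / period

          cols = []
          for b in bands:                                # per-band offsets
              cols.append((band == b).astype(float))
          for k in range(1, n_base + 1):                 # shared base model
              cols.append(np.sin(k * omega * t))
              cols.append(np.cos(k * omega * t))
          for b in bands:                                # per-band corrections
              m = (band == b).astype(float)
              for k in range(1, n_band + 1):
                  cols.append(m * np.sin(k * omega * t))
                  cols.append(m * np.cos(k * omega * t))
          X = np.column_stack(cols) / dy[:, None]        # noise-weighted design
          yw = y / dy

          penalty = np.zeros(X.shape[1])                 # ridge term on the
          penalty[len(bands) + 2 * n_base:] = reg        # band-specific harmonics
          beta = np.linalg.solve(X.T @ X + np.diag(penalty), X.T @ yw)
          chi2 = np.sum((yw - X @ beta) ** 2)

          chi2_ref = 0.0                                 # per-band constant model
          for b in bands:
              sel = band == b
              ybar = np.average(y[sel], weights=dy[sel] ** -2.0)
              chi2_ref += np.sum(((y[sel] - ybar) / dy[sel]) ** 2)
          return 1.0 - chi2 / chi2_ref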

  6. The Santiago-Harvard-Edinburgh-Durham void comparison - I. SHEDding light on chameleon gravity tests

    NASA Astrophysics Data System (ADS)

    Cautun, Marius; Paillas, Enrique; Cai, Yan-Chuan; Bose, Sownak; Armijo, Joaquin; Li, Baojiu; Padilla, Nelson

    2018-05-01

    We present a systematic comparison of several existing and new void-finding algorithms, focusing on their potential power to test a particular class of modified gravity models - chameleon f(R) gravity. These models deviate from standard general relativity (GR) more strongly in low-density regions and thus voids are a promising venue to test them. We use halo occupation distribution (HOD) prescriptions to populate haloes with galaxies, and tune the HOD parameters such that the galaxy two-point correlation functions are the same in both f(R) and GR models. We identify both three-dimensional (3D) voids and two-dimensional (2D) underdensities in the plane of the sky, and find the same void abundance and void galaxy number density profiles across all models, which suggests that they do not contain much information beyond galaxy clustering. However, the underlying void dark matter density profiles are significantly different, with f(R) voids being more underdense than GR ones, which leads to f(R) voids having a larger tangential shear signal than their GR analogues. We investigate the potential of each void finder to test f(R) models with near-future lensing surveys such as EUCLID and LSST. The 2D voids have the largest power to probe f(R) gravity: an LSST analysis of tunnel lensing (a tunnel being a new type of 2D underdensity introduced here) can distinguish f(R) models with |f_R0| = 10^-5 and 10^-6 from GR at 80σ and 11σ (statistical error), respectively.

  7. Effects of Two Different Anesthetic Solutions on Injection Pain, Efficacy, and Duration of Soft-Tissue Anesthesia with Inferior Alveolar Nerve Block for Primary Molars.

    PubMed

    Elbay, Ülkü Şermet; Elbay, Mesut; Kaya, Emine; Yıldırım, Sinem

    The purpose of the study was to compare the efficacy, injection pain, duration of soft tissue anesthesia, and postoperative complications of two different anesthetics (2% lidocaine with 1:80,000 epinephrine and 3% plain mepivacaine) in pediatric patients in inferior alveolar nerve block (IANB) administered by a computer-controlled delivery system (CCDS). The study was conducted as a randomized, controlled-crossover, double-blind clinical trial with 60 children requiring bilateral pulpotomy or extraction of primary mandibular molars. A CCDS was used to deliver 3% mepivacaine to 1 primary tooth and 2% lidocaine to the contralateral tooth with an IANB technique. Severity of pain and efficacy of anesthesia were evaluated using the Face, Legs, Activity, Cry, Consolability Scale, and comfort and side effects were assessed using a questionnaire. Data were analyzed using the Mann-Whitney U, Wilcoxon t, and Fisher exact tests. Patients receiving 2% lidocaine experienced significantly less pain during injection than those receiving 3% mepivacaine, and no significant differences were found in the pain scores during treatments or in postoperative complications between the two anesthetics. The mean durations of anesthesia for 3% mepivacaine and 2% lidocaine were 139.68 minutes and 149.10 minutes, respectively. Plain mepivacaine and 2% lidocaine were similarly effective in pulpotomy and the extraction of primary mandibular molars. Although the use of 3% mepivacaine provided a shorter duration of anesthesia than 2% lidocaine, both solutions showed similar results in terms of postoperative complications.

  8. The TESS camera: modeling and measurements with deep depletion devices

    NASA Astrophysics Data System (ADS)

    Woods, Deborah F.; Vanderspek, Roland; MacDonald, Robert; Morgan, Edward; Villasenor, Joel; Thayer, Carolyn; Burke, Barry; Chesbrough, Christian; Chrisp, Michael; Clark, Kristin; Furesz, Gabor; Gonzales, Alexandria; Nguyen, Tam; Prigozhin, Gregory; Primeau, Brian; Ricker, George; Sauerwein, Timothy; Suntharalingam, Vyshnavi

    2016-07-01

    The Transiting Exoplanet Survey Satellite, a NASA Explorer-class mission in development, will discover planets around nearby stars, most notably Earth-like planets with potential for follow up characterization. The all-sky survey requires a suite of four wide field-of-view cameras with sensitivity across a broad spectrum. Deep depletion CCDs with a silicon layer of 100 μm thickness serve as the camera detectors, providing enhanced performance in the red wavelengths for sensitivity to cooler stars. The performance of the camera is critical for the mission objectives, with both the optical system and the CCD detectors contributing to the realized image quality. Expectations for image quality are studied using a combination of optical ray tracing in Zemax and simulations in Matlab to account for the interaction of the incoming photons with the 100 μm silicon layer. The simulations include a probabilistic model to determine the depth of travel in the silicon before the photons are converted to photo-electrons, and a Monte Carlo approach to charge diffusion. The charge diffusion model varies with the remaining depth for the photo-electron to traverse and the strength of the intermediate electric field. The simulations are compared with laboratory measurements acquired by an engineering unit camera with the TESS optical design and deep depletion CCDs. In this paper we describe the performance simulations and the corresponding measurements taken with the engineering unit camera, and discuss where the models agree well in predicted trends and where there are differences compared to observations.
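
    The following toy Monte Carlo, written for illustration only, captures the two ingredients named above: an exponentially distributed photon conversion depth set by the (wavelength-dependent) absorption length, and a lateral charge diffusion that grows with the remaining drift distance. The pixel pitch, diffusion scale, and the square-root depth dependence are placeholder assumptions, and the sketch ignores the field-strength dependence that the full simulation includes.

      import numpy as np

      rng = np.random.default_rng(42)

      def ensemble_psf(n_photons, absorption_length_um, thickness_um=100.0,
                       pixel_um=15.0, sigma_max_um=4.0):
          """Toy model of conversion depth plus charge diffusion in thick silicon.

          Photons convert at an exponentially distributed depth; electrons
          generated closer to the entrance surface have farther to drift and
          therefore diffuse more.  All scale parameters here are illustrative.
          """
          z = rng.exponential(absorption_length_um, size=n_photons)
          z = z[z < thickness_um]                  # drop photons that pass through

          # simplified diffusion law: sigma grows with remaining drift distance
          sigma = sigma_max_um * np.sqrt((thickness_um - z) / thickness_um)

          dx = rng.normal(0.0, sigma)              # lateral displacement of each
          dy = rng.normal(0.0, sigma)              # photo-electron cloud

          edges = np.arange(-3.5, 4.5) * pixel_um  # 7x7 pixel grid around the spot
          psf, _, _ = np.histogram2d(dx, dy, bins=[edges, edges])
          return psf / psf.sum()

      # compare a short and a long absorption length (e.g. blue vs. red photons)
      print(ensemble_psf(200_000, absorption_length_um=3.0).max())
      print(ensemble_psf(200_000, absorption_length_um=60.0).max())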

  9. Radiation effects on the Gaia CCDs after 30 months at L2

    NASA Astrophysics Data System (ADS)

    Crowley, Cian; Abreu, Asier; Kohley, Ralf; Prod'homme, Thibaut; Beaufort, Thierry

    2016-07-01

    Since the launch of ESA's Gaia satellite in December 2013, the 106 large-format scientific CCDs onboard have been operating at L2. Due to a combination of the high-precision measurement requirements of the mission and the predicted proton environment at L2, the effect of non-ionizing radiation damage on the detectors was identified early, pre-launch, as potentially imposing a major limitation on the scientific value of the data. In this paper we compare pre-flight radiation-induced Charge Transfer Inefficiency (CTI) predictions against in-flight measurements, focusing especially on charge injection diagnostics, as well as correlating these CTI diagnostic results with solar proton event data. We show that L2-directed solar activity has been relatively low since launch, and radiation damage (so far) is less than originally expected. Despite this, there are clear cases of correlation between Earth-directed solar coronal mass ejection events and abrupt changes in CTI diagnostics over time. These sudden jumps sit on top of a rather constant increase in CTI which we show is primarily due to the continuous bombardment of the devices by high-energy Galactic Cosmic Rays. We examine the possible reasons for the lower than expected levels of CTI as well as examining the effect of controlled payload heating events on the CTI diagnostics. Radiation-induced CTI in the CCD serial registers and the effects of ionizing radiation are also correspondingly lower than expected; however, these topics are not examined here in detail.

  10. The Amateurs' Love Affair with Large Datasets

    NASA Astrophysics Data System (ADS)

    Price, Aaron; Jacoby, S. H.; Henden, A.

    2006-12-01

    Amateur astronomers are professionals in other areas. They bring expertise from such varied and technical careers as computer science, mathematics, engineering, and marketing. These skills, coupled with an enthusiasm for astronomy, can be used to help manage the large data sets coming online in the next decade. We will show specific examples where teams of amateurs have been involved in mining large, online data sets and have authored and published their own papers in peer-reviewed astronomical journals. Using the proposed LSST database as an example, we will outline a framework for involving amateurs in data analysis and education with large astronomical surveys.

  11. Data Management challenges in Astronomy and Astroparticle Physics

    NASA Astrophysics Data System (ADS)

    Lamanna, Giovanni

    2015-12-01

    Astronomy and Astroparticle Physics domains are experiencing a deluge of data with the next generation of facilities prioritised in the European Strategy Forum on Research Infrastructures (ESFRI), such as SKA, CTA, KM3Net and with other world-class projects, namely LSST, EUCLID, EGO, etc. The new ASTERICS-H2020 project brings together the concerned scientific communities in Europe to work together to find common solutions to their Big Data challenges, their interoperability, and their data access. The presentation will highlight these new challenges and the work being undertaken also in cooperation with e-infrastructures in Europe.

  12. The Cosmic Evolution Through UV Spectroscopy (CETUS) Probe Mission Concept

    NASA Astrophysics Data System (ADS)

    Danchi, William; Heap, Sara; Woodruff, Robert; Hull, Anthony; Kendrick, Stephen E.; Purves, Lloyd; McCandliss, Stephan; Dodson, Kelly; Mehle, Greg; Burge, James; Valente, Martin; Rhee, Michael; Smith, Walter; Choi, Michael; Stoneking, Eric

    2018-01-01

    CETUS is a mission concept for an all-UV telescope with 3 scientific instruments: a wide-field camera, a wide-field multi-object spectrograph, and a point-source high-resolution and medium-resolution spectrograph. It is primarily intended to work with other survey telescopes in the 2020s (e.g. E-ROSITA (X-ray), LSST, Subaru, WFIRST (optical-near-IR), and SKA (radio)) to solve major, outstanding problems in astrophysics. In this poster presentation, we give an overview of CETUS key science goals and a progress report on the CETUS mission and instrument design.

  13. Improved Space Object Orbit Determination Using CMOS Detectors

    NASA Astrophysics Data System (ADS)

    Schildknecht, T.; Peltonen, J.; Sännti, T.; Silha, J.; Flohrer, T.

    2014-09-01

    CMOS-sensors, or in general Active Pixel Sensors (APS), are rapidly replacing CCDs in the consumer camera market. Due to significant technological advances during the past years these devices start to compete with CCDs also for demanding scientific imaging applications, in particular in the astronomy community. CMOS detectors offer a series of inherent advantages compared to CCDs, due to the structure of their basic pixel cells, which each contain their own amplifier and readout electronics. The most prominent advantages for space object observations are the extremely fast and flexible readout capabilities, feasibility for electronic shuttering and precise epoch registration, and the potential to perform image processing operations on-chip and in real-time. The major challenges and design drivers for ground-based and space-based optical observation strategies have been analyzed. CMOS detector characteristics were critically evaluated and compared with the established CCD technology, especially with respect to the above mentioned observations. Similarly, the desirable on-chip processing functionalities which would further enhance the object detection and image segmentation were identified. Finally, we simulated several observation scenarios for ground- and space-based sensors by assuming different observation and sensor properties. We will introduce the analyzed end-to-end simulations of the ground- and space-based strategies in order to investigate the orbit determination accuracy and its sensitivity which may result from different values for the frame rate, pixel scale, astrometric and epoch registration accuracies. Two cases were simulated, a survey using a ground-based sensor to observe objects in LEO for surveillance applications, and a statistical survey with a space-based sensor orbiting in LEO observing small-size debris in LEO. The ground-based LEO survey uses a dynamical fence close to the Earth shadow a few hours after sunset. For the space-based scenario a sensor in a sun-synchronous LEO orbit, always pointing in the anti-sun direction to achieve optimum illumination conditions for small LEO debris, was simulated. For the space-based scenario the simulations showed a 20-130% improvement of the accuracy of all orbital parameters when varying the frame rate from 1/3 fps, which is the fastest rate for a typical CCD detector, to 50 fps, which represents the highest rate of scientific CMOS cameras. Changing the epoch registration accuracy from a typical 20.0 ms for a mechanical shutter to 0.025 ms, the theoretical value for the electronic shutter of a CMOS camera, improved the orbit accuracy by 4 to 190%. The ground-based scenario also benefits from the specific CMOS characteristics, but to a lesser extent.

  14. Evaluation of large format electron bombarded virtual phase CCDs as ultraviolet imaging detectors

    NASA Technical Reports Server (NTRS)

    Opal, Chet B.; Carruthers, George R.

    1989-01-01

    In conjunction with an external UV-sensitive cathode, an electron-bombarded CCD may be used as a high quantum efficiency/wide dynamic range photon-counting UV detector. Results are presented for the case of a 1024 x 1024, 18-micron square pixel virtual phase CCD used with an electromagnetically focused f/2 Schmidt camera, which yields excellent single-photoevent discrimination and counting efficiency. Attention is given to the vacuum-chamber arrangement used to conduct system tests and the CCD electronics and data-acquisition systems employed.

  15. C2D8: An eight channel CCD readout electronics dedicated to low energy neutron detection

    NASA Astrophysics Data System (ADS)

    Bourrion, O.; Clement, B.; Tourres, D.; Pignol, G.; Xi, Y.; Rebreyend, D.; Nesvizhevsky, V. V.

    2018-02-01

    Position-sensitive detectors for cold and ultra-cold neutrons (UCN) are in use in fundamental research. In particular, measuring the properties of the quantum states of bouncing neutrons requires micro-metric spatial resolution. To this end, a Charge Coupled Device (CCD) coated with a thin conversion layer that allows a real time detection of neutron hits is under development at LPSC. In this paper, we present the design and performance of a dedicated electronic board designed to read out eight CCDs simultaneously while operating under vacuum.

  16. Technical Operations Support (TOPS) II. Delivery Order 0011: Summary Status of MISSE-1 and MISSE-2 Experiments and Details of Estimated Environmental Exposures for MISSE-1 and MISSE-2

    DTIC Science & Technology

    2006-07-01

    Distinct types of devices that respond to radiation differently were used: TLDs (thermoluminescent dosimeters), Charge-Coupled Devices (CCDs), and optocouplers. The TLDs respond to total ionizing dose, most of which is contributed by the trapped electrons for locations with less than 100 mils of ...

  17. InSb arrays with CCD readout for 1.0- to 5.5-microns infrared applications

    NASA Technical Reports Server (NTRS)

    Phillips, J. D.; Scorso, J. B.; Thom, R. D.

    1976-01-01

    Two approaches for fabricating indium antimonide (InSb) arrays with CCD readout are discussed. The hybrid approach integrated InSb detectors and silicon CCDs in a modular assembly via an advanced interconnection technology. In the monolithic approach, the InSb infrared detectors and the CCD readout were integrated on the same InSb chip. Both approaches utilized intrinsic (band-to-band) photodetection with the attendant advantages over extrinsic detectors. The status of each of these detector readout concepts, with pertinent performance characteristics, was presented.

  18. Read-noise characterization of focal plane array detectors via mean-variance analysis.

    PubMed

    Sperline, R P; Knight, A K; Gresham, C A; Koppenaal, D W; Hieftje, G M; Denton, M B

    2005-11-01

    Mean-variance analysis is described as a method for characterization of the read-noise and gain of focal plane array (FPA) detectors, including charge-coupled devices (CCDs), charge-injection devices (CIDs), and complementary metal-oxide-semiconductor (CMOS) multiplexers (infrared arrays). Practical FPA detector characterization is outlined. The nondestructive readout capability available in some CIDs and FPA devices is discussed as a means for signal-to-noise ratio improvement. Derivations of the equations are fully presented to unify understanding of this method by the spectroscopic community.
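
    A minimal photon-transfer sketch of the mean-variance method is shown below. It assumes pairs of flat-field frames taken at several illumination levels plus a known bias (offset) level, and fits the standard relation variance = mean/gain + read_noise^2 in ADU units; the acquisition and fitting procedure recommended in the paper may differ in detail.

      import numpy as np

      def mean_variance_gain(flat_pairs, bias_level_adu=0.0):
          """Photon-transfer (mean-variance) estimate of gain and read noise.

          flat_pairs : list of (frame1, frame2) flat-field exposures spanning a
                       range of illumination levels; differencing each pair
                       removes fixed-pattern noise, leaving shot + read noise.
          Returns (gain in e-/ADU, read noise in electrons), from a linear fit
          of  variance = mean/gain + read_noise_adu**2.
          """
          means, variances = [], []
          for f1, f2 in flat_pairs:
              f1, f2 = f1.astype(float), f2.astype(float)
              means.append(0.5 * (f1.mean() + f2.mean()) - bias_level_adu)
              variances.append((f1 - f2).var() / 2.0)   # per-frame variance

          slope, intercept = np.polyfit(means, variances, 1)
          gain = 1.0 / slope                            # e-/ADU
          read_noise_adu = np.sqrt(max(intercept, 0.0))
          return gain, gain * read_noise_adu            # read noise in electrons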

  19. The OCA CCD Camera Controller

    DTIC Science & Technology

    1996-01-01

    multi-CCD arrays for wide-field telescopes, with an array of 8x8 1K CCDs in use at Las Campanas Observatory in Chile. The same group is also involved ...

  20. Methods for identification of images acquired with digital cameras

    NASA Astrophysics Data System (ADS)

    Geradts, Zeno J.; Bijhold, Jurrien; Kieft, Martijn; Kurosawa, Kenji; Kuroki, Kenro; Saitoh, Naoki

    2001-02-01

    From the court we were asked whether it is possible to determine if an image has been made with a specific digital camera. This question has to be answered in child pornography cases, where evidence is needed that a certain picture has been made with a specific camera. We have looked into different methods of examining the cameras to determine if a specific image has been made with a camera: defects in CCDs, file formats that are used, noise introduced by the pixel arrays and watermarking in images used by the camera manufacturer.

  1. Near-Earth Object Survey Simulation Software

    NASA Astrophysics Data System (ADS)

    Naidu, Shantanu P.; Chesley, Steven R.; Farnocchia, Davide

    2017-10-01

    There is a significant interest in Near-Earth objects (NEOs) because they pose an impact threat to Earth, offer valuable scientific information, and are potential targets for robotic and human exploration. The number of NEO discoveries has been rising rapidly over the last two decades with over 1800 being discovered last year, making the total number of known NEOs >16000. Pan-STARRS and the Catalina Sky Survey are currently the most prolific NEO surveys, having discovered >1600 NEOs between them in 2016. As next generation surveys such as Large Synoptic Survey Telescope (LSST) and the proposed Near-Earth Object Camera (NEOCam) become operational in the next decade, the discovery rate is expected to increase tremendously. Coordination between various survey telescopes will be necessary in order to optimize NEO discoveries and create a unified global NEO discovery network. We are collaborating on a community-based, open-source software project to simulate asteroid surveys to facilitate such coordination and develop strategies for improving discovery efficiency. Our effort so far has focused on development of a fast and efficient tool capable of accepting user-defined asteroid population models and telescope parameters such as a list of pointing angles and camera field-of-view, and generating an output list of detectable asteroids. The software takes advantage of the widely used and tested SPICE library and architecture developed by NASA’s Navigation and Ancillary Information Facility (Acton, 1996) for saving and retrieving asteroid trajectories and camera pointing. Orbit propagation is done using OpenOrb (Granvik et al. 2009) but future versions will allow the user to plug in a propagator of their choice. The software allows the simulation of both ground-based and space-based surveys. Performance is being tested using the Grav et al. (2011) asteroid population model and the LSST simulated survey “enigma_1189”.
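
    The core of such a detectability calculation is a simple geometric and photometric cut applied to the propagated asteroid positions. The sketch below shows only that inner filter, under a flat-sky, square field-of-view approximation; the actual tool handles pointing and ephemerides through SPICE kernels and leaves trailing losses and detection efficiency to later stages, so the parameter names here are illustrative.

      import numpy as np

      def detectable(ra, dec, vmag, fov_ra, fov_dec, half_fov_deg, limiting_mag):
          """Flag asteroids inside a square camera field of view and brighter
          than the survey limiting magnitude.

          ra, dec, vmag : asteroid sky positions (deg) and magnitudes at the
                          exposure epoch (e.g. propagated with an orbit code).
          fov_ra, fov_dec : boresight pointing of the exposure (deg).
          """
          # small-angle offsets from the boresight, correcting RA for cos(dec)
          d_ra = (np.asarray(ra, float) - fov_ra + 180.0) % 360.0 - 180.0
          d_ra *= np.cos(np.radians(fov_dec))
          d_dec = np.asarray(dec, float) - fov_dec

          in_fov = (np.abs(d_ra) <= half_fov_deg) & (np.abs(d_dec) <= half_fov_deg)
          bright = np.asarray(vmag, float) <= limiting_mag
          return in_fov & bright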

  2. Mental Fatigue Impairs Soccer-Specific Physical and Technical Performance.

    PubMed

    Smith, Mitchell R; Coutts, Aaron J; Merlini, Michele; Deprez, Dieter; Lenoir, Matthieu; Marcora, Samuele M

    2016-02-01

    To investigate the effects of mental fatigue on soccer-specific physical and technical performance. This investigation consisted of two separate studies. Study 1 assessed the soccer-specific physical performance of 12 moderately trained soccer players using the Yo-Yo Intermittent Recovery Test, Level 1 (Yo-Yo IR1). Study 2 assessed the soccer-specific technical performance of 14 experienced soccer players using the Loughborough Soccer Passing and Shooting Tests (LSPT, LSST). Each test was performed on two occasions and preceded, in a randomized, counterbalanced order, by 30 min of the Stroop task (mentally fatiguing treatment) or 30 min of reading magazines (control treatment). Subjective ratings of mental fatigue were measured before and after treatment, and mental effort and motivation were measured after treatment. Distance run, heart rate, and ratings of perceived exertion were recorded during the Yo-Yo IR1. LSPT performance time was calculated as original time plus penalty time. LSST performance was assessed using shot speed, shot accuracy, and shot sequence time. Subjective ratings of mental fatigue and effort were higher after the Stroop task in both studies (P < 0.001), whereas motivation was similar between conditions. This mental fatigue significantly reduced running distance in the Yo-Yo IR1 (P < 0.001). No difference in heart rate existed between conditions, whereas ratings of perceived exertion were significantly higher at iso-time in the mental fatigue condition (P < 0.01). LSPT original time and performance time were not different between conditions; however, penalty time significantly increased in the mental fatigue condition (P = 0.015). Mental fatigue also impaired shot speed (P = 0.024) and accuracy (P < 0.01), whereas shot sequence time was similar between conditions. Mental fatigue impairs soccer-specific running, passing, and shooting performance.

  3. Strong Gravitational Lensing as a Probe of Gravity, Dark-Matter and Super-Massive Black Holes

    NASA Astrophysics Data System (ADS)

    Koopmans, L.V.E.; Barnabe, M.; Bolton, A.; Bradac, M.; Ciotti, L.; Congdon, A.; Czoske, O.; Dye, S.; Dutton, A.; Elliasdottir, A.; Evans, E.; Fassnacht, C.D.; Jackson, N.; Keeton, C.; Lasio, J.; Moustakas, L.; Meneghetti, M.; Myers, S.; Nipoti, C.; Suyu, S.; van de Ven, G.; Vegetti, S.; Wucknitz, O.; Zhao, H.-S.

    Whereas considerable effort has been afforded in understanding the properties of galaxies, a full physical picture, connecting their baryonic and dark-matter content, super-massive black holes, and (metric) theories of gravity, is still ill-defined. Strong gravitational lensing furnishes a powerful method to probe gravity in the central regions of galaxies. It can (1) provide a unique detection-channel of dark-matter substructure beyond the local galaxy group, (2) constrain dark-matter physics, complementary to direct-detection experiments, as well as metric theories of gravity, (3) probe central super-massive black holes, and (4) provide crucial insight into galaxy formation processes from the dark matter point of view, independently of the nature and state of dark matter. To seriously address the above questions, a considerable increase in the number of strong gravitational-lens systems is required. In the timeframe 2010-2020, a staged approach with radio (e.g. EVLA, e-MERLIN, LOFAR, SKA phase-I) and optical (e.g. LSST and JDEM) instruments can provide 10^(2-4) new lenses, and up to 10^(4-6) new lens systems from SKA/LSST/JDEM all-sky surveys around ~2020. Follow-up imaging of (radio) lenses is necessary with moderate ground/space-based optical-IR telescopes and with 30-50m telescopes for spectroscopy (e.g. TMT, GMT, ELT). To answer these fundamental questions through strong gravitational lensing, a strong investment in large radio and optical-IR facilities is therefore critical in the coming decade. In particular, only large-scale radio lens surveys (e.g. with SKA) provide the large numbers of high-resolution and high-fidelity images of lenses needed for SMBH and flux-ratio anomaly studies.

  4. STRONG LENS TIME DELAY CHALLENGE. I. EXPERIMENTAL DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobler, Gregory; Fassnacht, Christopher D.; Rumbaugh, Nicholas

    2015-02-01

    The time delays between point-like images in gravitational lens systems can be used to measure cosmological parameters. The number of lenses with measured time delays is growing rapidly; the upcoming Large Synoptic Survey Telescope (LSST) will monitor ~10^3 strongly lensed quasars. In an effort to assess the present capabilities of the community to accurately measure the time delays, and to provide input to dedicated monitoring campaigns and future LSST cosmology feasibility studies, we have invited the community to take part in a "Time Delay Challenge" (TDC). The challenge is organized as a set of "ladders", each containing a group of simulated data sets to be analyzed blindly by participating teams. Each rung on a ladder consists of a set of realistic mock observed lensed quasar light curves, with the rungs' data sets increasing in complexity and realism. The initial challenge described here has two ladders, TDC0 and TDC1. TDC0 has a small number of data sets, and is designed to be used as a practice set by the participating teams. The (non-mandatory) deadline for completion of TDC0 was the TDC1 launch date, 2013 December 1. The TDC1 deadline was 2014 July 1. Here we give an overview of the challenge, we introduce a set of metrics that will be used to quantify the goodness of fit, efficiency, precision, and accuracy of the algorithms, and we present the results of TDC0. Thirteen teams participated in TDC0 using 47 different methods. Seven of those teams qualified for TDC1, which is described in the companion paper.

  5. STRONG LENS TIME DELAY CHALLENGE. II. RESULTS OF TDC1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, Kai; Treu, Tommaso; Marshall, Phil

    2015-02-10

    We present the results of the first strong lens time delay challenge. The motivation, experimental design, and entry level challenge are described in a companion paper. This paper presents the main challenge, TDC1, which consisted of analyzing thousands of simulated light curves blindly. The observational properties of the light curves cover the range in quality obtained for current targeted efforts (e.g., COSMOGRAIL) and expected from future synoptic surveys (e.g., LSST), and include simulated systematic errors. Seven teams participated in TDC1, submitting results from 78 different method variants. After describing each method, we compute and analyze basic statistics measuring accuracy (or bias) A, goodness of fit χ^2, precision P, and success rate f. For some methods we identify outliers as an important issue. Other methods show that outliers can be controlled via visual inspection or conservative quality control. Several methods are competitive, i.e., give |A| < 0.03, P < 0.03, and χ^2 < 1.5, with some of the methods already reaching sub-percent accuracy. The fraction of light curves yielding a time delay measurement is typically in the range f = 20%-40%. It depends strongly on the quality of the data: COSMOGRAIL-quality cadence and light curve lengths yield significantly higher f than does sparser sampling. Taking the results of TDC1 at face value, we estimate that LSST should provide around 400 robust time-delay measurements, each with P < 0.03 and |A| < 0.01, comparable to current lens modeling uncertainties. In terms of observing strategies, we find that A and f depend mostly on season length, while P depends mostly on cadence and campaign duration.
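
    The four summary statistics used above have simple per-submission forms: the success fraction f, the error-normalized goodness of fit χ^2, the relative precision P, and the fractional bias A, each averaged over the light curves for which a team submitted a measurement. The sketch below implements them as we understand the challenge's definitions; normalization conventions should be checked against the papers before quantitative use.

      import numpy as np

      def tdc_metrics(dt_true, dt_est, dt_err):
          """Summary statistics for a set of time-delay submissions.

          dt_true : true delays (days) for all N light curves in a rung.
          dt_est, dt_err : estimated delays and 1-sigma errors; NaN where a
                           team declined to submit a measurement.
          Returns (f, chi2, P, A): success fraction, goodness of fit,
          precision and accuracy (bias), averaged over submitted curves.
          """
          dt_true = np.asarray(dt_true, float)
          dt_est = np.asarray(dt_est, float)
          dt_err = np.asarray(dt_err, float)

          ok = np.isfinite(dt_est) & np.isfinite(dt_err)
          f = ok.mean()                                    # success fraction
          chi2 = np.mean(((dt_est[ok] - dt_true[ok]) / dt_err[ok]) ** 2)
          P = np.mean(dt_err[ok] / np.abs(dt_true[ok]))    # relative precision
          A = np.mean((dt_est[ok] - dt_true[ok]) / dt_true[ok])  # fractional bias
          return f, chi2, P, A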

  6. PROFIT: Bayesian profile fitting of galaxy images

    NASA Astrophysics Data System (ADS)

    Robotham, A. S. G.; Taranu, D. S.; Tobar, R.; Moffett, A.; Driver, S. P.

    2017-04-01

    We present PROFIT, a new code for Bayesian two-dimensional photometric galaxy profile modelling. PROFIT consists of a low-level C++ library (libprofit), accessible via a command-line interface and documented API, along with high-level R (PROFIT) and PYTHON (PyProFit) interfaces (available at github.com/ICRAR/libprofit, github.com/ICRAR/ProFit, and github.com/ICRAR/pyprofit, respectively). R PROFIT is also available pre-built from CRAN; however, this version will be slightly behind the latest GitHub version. libprofit offers fast and accurate two-dimensional integration for a useful number of profiles, including Sérsic, Core-Sérsic, broken-exponential, Ferrer, Moffat, empirical King, point-source, and sky, with a simple mechanism for adding new profiles. We show detailed comparisons between libprofit and GALFIT. libprofit is both faster and more accurate than GALFIT at integrating the ubiquitous Sérsic profile for the most common values of the Sérsic index n (0.5 < n < 8). The high-level fitting code PROFIT is tested on a sample of galaxies with both SDSS and deeper KiDS imaging. We find good agreement in the fit parameters, with larger scatter in best-fitting parameters from fitting images from different sources (SDSS versus KiDS) than from using different codes (PROFIT versus GALFIT). A large suite of Monte Carlo-simulated images are used to assess prospects for automated bulge-disc decomposition with PROFIT on SDSS, KiDS, and future LSST imaging. We find that the biggest increases in fit quality come from moving from SDSS- to KiDS-quality data, with less significant gains moving from KiDS to LSST.
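
    For readers unfamiliar with the profile at the centre of that comparison, the radial form of the Sérsic law and its flux normalization are sketched below. This is only the textbook circular profile; libprofit itself performs careful two-dimensional integration of elliptical profiles over pixels, which is where the speed and accuracy comparisons with GALFIT are made.

      import numpy as np
      from scipy.special import gamma, gammaincinv

      def sersic_profile(r, mag_total, r_e, n):
          """Circular Sersic surface-brightness profile I(r).

          I(r) = I_e * exp(-b_n * ((r/r_e)**(1/n) - 1)), with b_n defined so
          that r_e encloses half of the total flux, and I_e fixed by the total
          magnitude through the analytic flux of the profile.
          """
          b_n = gammaincinv(2 * n, 0.5)        # gamma(2n, b_n) = Gamma(2n) / 2
          flux_total = 10.0 ** (-0.4 * mag_total)
          flux_e = (2.0 * np.pi * n * r_e ** 2 * np.exp(b_n)
                    * b_n ** (-2.0 * n) * gamma(2.0 * n))
          I_e = flux_total / flux_e
          return I_e * np.exp(-b_n * ((r / r_e) ** (1.0 / n) - 1.0))

      # quick check for the exponential-disc case (n = 1): about half of the
      # flux should fall inside r_e
      r = np.linspace(1e-3, 100.0, 200_000)
      I = sersic_profile(r, mag_total=18.0, r_e=5.0, n=1.0)
      cumulative = np.cumsum(2.0 * np.pi * r * I) * (r[1] - r[0])
      print(cumulative[np.searchsorted(r, 5.0)] / cumulative[-1])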

  7. Unveiling the population of orphan γ-ray bursts

    NASA Astrophysics Data System (ADS)

    Ghirlanda, G.; Salvaterra, R.; Campana, S.; Vergani, S. D.; Japelj, J.; Bernardini, M. G.; Burlon, D.; D'Avanzo, P.; Melandri, A.; Gomboc, A.; Nappo, F.; Paladini, R.; Pescalli, A.; Salafia, O. S.; Tagliaferri, G.

    2015-06-01

    Gamma-ray bursts (GRBs) are detectable in the γ-ray band if their jets are oriented toward the observer. However, for each GRB with a typical jet opening angle θ_jet, there should be ~2/θ_jet^2 bursts whose emission cone is oriented elsewhere in space. These off-axis bursts can eventually be detected when, due to the deceleration of their relativistic jets, the beaming angle becomes comparable to the viewing angle. Orphan afterglows (OAs) should outnumber the current population of bursts detected in the γ-ray band, even though they have not been conclusively observed so far at any frequency. We compute the expected flux of the population of orphan afterglows in the mm, optical, and X-ray bands through a population synthesis code of GRBs and the standard afterglow emission model. We estimate the detection rate of OAs with ongoing and forthcoming surveys. The average duration of OAs as transients above a given limiting flux is derived and described with analytical expressions: in general OAs should appear as daily transients in optical surveys and as monthly/yearly transients in the mm/radio band. We find that ~2 OA yr^-1 could already be detected by Gaia and up to 20 OA yr^-1 could be observed by the ZTF survey. A larger number of 50 OA yr^-1 should be detected by LSST in the optical band. In the X-ray band, ~26 OA yr^-1 could be detected by eROSITA. For the large population of OAs detectable by LSST, the X-ray and optical follow-up of the light curve (for the brightest cases) and/or the extensive follow-up of their emission in the mm and radio bands could be the key to disentangling their GRB nature from other extragalactic transients of comparable flux density.

  8. Optical testing of the LSST combined primary/tertiary mirror

    NASA Astrophysics Data System (ADS)

    Tuell, Michael T.; Martin, Hubert M.; Burge, James H.; Gressler, William J.; Zhao, Chunyu

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) utilizes a three-mirror design in which the primary (M1) and tertiary (M3) mirrors are two concentric aspheric surfaces on one monolithic substrate. The substrate material is Ohara E6 borosilicate glass, in a honeycomb sandwich configuration, currently in production at The University of Arizona's Steward Observatory Mirror Lab. In addition to the normal requirements for smooth surfaces of the appropriate prescriptions, the alignment of the two surfaces must be accurately measured and controlled in the production lab. Both the pointing and centration of the two optical axes are important parameters, in addition to the axial spacing of the two vertices. This paper describes the basic metrology systems for each surface, with particular attention to the alignment of the two surfaces. These surfaces are aspheric enough to require null correctors for each wavefront. Both M1 and M3 are concave surfaces with both non-zero conic constants and higher-order terms (6th order for M1 and both 6th and 8th orders for M3). M1 is hyperboloidal and can utilize a standard Offner null corrector. M3 is an oblate ellipsoid, so has positive spherical aberration. We have chosen to place a phase-etched computer-generated hologram (CGH) between the mirror surface and the center-of-curvature (CoC), whereas the M1 null lens is beyond the CoC. One relatively new metrology tool is the laser tracker, which is relied upon to measure the alignment and spacings. A separate laser tracker system will be used to measure both surfaces during loose abrasive grinding and initial polishing.

  9. Computer analysis of digital sky surveys using citizen science and manual classification

    NASA Astrophysics Data System (ADS)

    Kuminski, Evan; Shamir, Lior

    2015-01-01

    As current and future digital sky surveys such as SDSS, LSST, DES, Pan-STARRS and Gaia create increasingly massive databases containing millions of galaxies, there is a growing need to be able to efficiently analyze these data. An effective way to do this is through manual analysis; however, this may be insufficient considering the extremely vast pipelines of astronomical images generated by present and future surveys. Some efforts have been made to use citizen science to classify galaxies by their morphology on a larger scale than individual or small groups of scientists can. While these citizen science efforts such as Zooniverse have helped obtain reasonably accurate morphological information about large numbers of galaxies, they cannot scale to provide complete analysis of the billions of galaxy images that will be collected by future ventures such as LSST. Since current forms of manual classification cannot scale to the masses of data collected by digital sky surveys, it is clear that, in order to keep up with the growing databases, some form of automation of the data analysis will be required, working either independently or in combination with human analysis such as citizen science. Here we describe a computer vision method that can automatically analyze galaxy images and deduce galaxy morphology. Experiments using Galaxy Zoo 2 data show that the performance of the method increases as the degree of agreement between the citizen scientists gets higher, providing a cleaner dataset. For several morphological features, such as the spirality of the galaxy, the algorithm agreed with the citizen scientists on around 95% of the samples. However, the method failed to analyze some of the morphological features, such as the number of spiral arms, and provided an accuracy of just ~36%.

  10. voevent-parse: Parse, manipulate, and generate VOEvent XML packets

    NASA Astrophysics Data System (ADS)

    Staley, Tim D.

    2014-11-01

    voevent-parse, written in Python, parses, manipulates, and generates VOEvent XML packets; it is built atop lxml.objectify. Details of transients detected by many projects, including Fermi, Swift, and the Catalina Sky Survey, are currently made available as VOEvents, which is also the standard alert format adopted by future facilities such as LSST and SKA. However, working with XML and adhering to the sometimes lengthy VOEvent schema can be a tricky process. voevent-parse provides convenience routines for common tasks, while allowing the user to utilise the full power of the lxml library when required. An earlier version of voevent-parse was part of the pysovo (ascl:1411.002) library.
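
    A minimal example of the kind of access the package makes convenient is shown below, written directly against lxml.objectify (the library voevent-parse wraps). The field names reflect the typical layout of a transient alert packet and are not guaranteed to be present in every VOEvent.

      from lxml import objectify

      def summarize_voevent(xml_bytes):
          """Pull a few common fields out of a VOEvent packet with lxml.objectify.

          Attribute access mirrors the XML tree, so the exact paths depend on
          what the packet actually contains; a robust reader should guard the
          lookups or use a dedicated library such as voevent-parse.
          """
          v = objectify.fromstring(xml_bytes)
          ivorn = v.attrib.get("ivorn")
          role = v.attrib.get("role")           # e.g. "observation" or "test"
          # <What> typically carries flat <Param name="..." value="..."/> entries
          params = {p.attrib["name"]: p.attrib.get("value")
                    for p in getattr(v.What, "Param", [])}
          return ivorn, role, params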

  11. The Follow-up Crisis: Optimizing Science in an Opportunity Rich Environment

    NASA Astrophysics Data System (ADS)

    Vestrand, T.

    Rapid follow-up tasking for robotic telescopes has been dominated by a one-dimensional uncoordinated response strategy developed for gamma-ray burst studies. However, this second-grade soccer approach is increasingly showing its limitations even when there are only a few events per night. And it will certainly fail when faced with the denial-of-service attack generated by the nightly flood of new transients from massive variability surveys like LSST. We discuss approaches for optimizing the scientific return from autonomous robotic telescopes in the high event-rate limit and explore the potential of a coordinated telescope ecosystem employing heterogeneous telescopes.

  12. PN-CCD camera for XMM: performance of high time resolution/bright source operating modes

    NASA Astrophysics Data System (ADS)

    Kendziorra, Eckhard; Bihler, Edgar; Grubmiller, Willy; Kretschmar, Baerbel; Kuster, Markus; Pflueger, Bernhard; Staubert, Ruediger; Braeuninger, Heinrich W.; Briel, Ulrich G.; Meidinger, Norbert; Pfeffermann, Elmar; Reppin, Claus; Stoetter, Diana; Strueder, Lothar; Holl, Peter; Kemmer, Josef; Soltau, Heike; von Zanthier, Christoph

    1997-10-01

    The pn-CCD camera is developed as one of the focal plane instruments for the European photon imaging camera (EPIC) on board the x-ray multi mirror (XMM) mission to be launched in 1999. The detector consists of four quadrants of three pn-CCDs each, which are integrated on one silicon wafer. Each CCD has 200 by 64 pixels (150 micrometers by 150 micrometers) with 280 micrometers depletion depth. One CCD of a quadrant is read out at a time, while the four quadrants can be processed independently of each other. In standard imaging mode the CCDs are read out sequentially every 70 ms. Observations of point sources brighter than 1 mCrab will be affected by photon pile-up. However, special operating modes can be used to observe bright sources up to 150 mCrab in timing mode with 30 microseconds time resolution and very bright sources up to several Crab in burst mode with 7 microseconds time resolution. We have tested one quadrant of the EPIC pn-CCD camera at line energies from 0.52 keV to 17.4 keV at the long beam test facility Panter in the focus of the qualification mirror module for XMM. In order to test the time resolution of the system, a mechanical chopper was used to periodically modulate the beam intensity. Pulse periods down to 0.7 ms were generated. This paper describes the performance of the pn-CCD detector in timing and burst readout modes with special emphasis on energy and time resolution.

  13. Simultaneous determination of guanidinoacetate, creatine and creatinine in urine and plasma by un-derivatized liquid chromatography-tandem mass spectrometry.

    PubMed

    Carling, R S; Hogg, S L; Wood, T C; Calvin, J

    2008-11-01

    Creatine plays an important role in the storage and transmission of phosphate-bound energy. The cerebral creatine deficiency syndromes (CCDS) comprise three inherited defects in creatine biosynthesis and transport. They are characterized by mental retardation, speech and language delay and epilepsy. All three disorders cause low-creatine signal on brain magnetic resonance spectroscopy (MRS); however, MRS may not be readily available and even when it is, biochemical tests are required to determine the underlying disorder. Analysis was performed by liquid chromatography-tandem mass spectrometry in positive ionization mode. Samples were analysed underivatized using a rapid 'dilute and shoot' approach. Chromatographic separation of the three compounds was achieved. Stable isotope internal standards were used for quantification. Creatine, creatinine and guanidinoacetate were measured with a 2.5-minute run time. For guanidinoacetate, the standard curve was linear to at least 5000 μmol/L and for creatine and creatinine it was linear to at least 25 mmol/L. The lower limit of quantitation was 0.4 μmol/L for creatine and guanidinoacetate and 0.8 μmol/L for creatinine. Recoveries ranged from 86% to 106% for the three analytes. Intra- and inter-assay variation for each analyte was <10% in both urine and plasma. A tandem mass spectrometric method has been developed and validated for the underivatized determination of guanidinoacetate, creatine and creatinine in human urine and plasma. Minimal sample preparation coupled with a rapid run time make the method applicable to the routine screening of patients with suspected CCDS.

  14. Modelling electron distributions within ESA's Gaia satellite CCD pixels to mitigate radiation damage

    NASA Astrophysics Data System (ADS)

    Seabroke, G. M.; Holland, A. D.; Burt, D.; Robbins, M. S.

    2009-08-01

    The Gaia satellite is a high-precision astrometry, photometry and spectroscopic ESA cornerstone mission, currently scheduled for launch in 2012. Its primary science drivers are the composition, formation and evolution of the Galaxy. Gaia will achieve its unprecedented positional accuracy requirements with detailed calibration and correction for radiation damage. At L2, protons cause displacement damage in the silicon of CCDs. The resulting traps capture and emit electrons from passing charge packets in the CCD pixel, distorting the image PSF and biasing its centroid. Microscopic models of Gaia's CCDs are being developed to simulate this effect. The key to calculating the probability of an electron being captured by a trap is the 3D electron density within each CCD pixel. However, this has not been physically modelled for the Gaia CCD pixels. In Seabroke, Holland & Cropper (2008), the first paper of this series, we motivated the need for such specialised 3D device modelling and outlined how its future results will fit into Gaia's overall radiation calibration strategy. In this paper, the second of the series, we present our first results using Silvaco's physics-based, engineering software: the ATLAS device simulation framework. Inputting a doping profile, pixel geometry and materials into ATLAS and comparing the results to other simulations reveals that ATLAS has a free parameter, fixed oxide charge, that needs to be calibrated. ATLAS is successfully benchmarked against other simulations and measurements of a test device, identifying how to use it to model Gaia pixels and highlighting the effect of different doping approximations.

  15. Reconciling disparate information in continuity of care documents: Piloting a system to consolidate structured clinical documents.

    PubMed

    Hosseini, Masoud; Jones, Josette; Faiola, Anthony; Vreeman, Daniel J; Wu, Huanmei; Dixon, Brian E

    2017-10-01

    Due to the nature of information generation in health care, clinical documents contain duplicate and sometimes conflicting information. Recent implementation of Health Information Exchange (HIE) mechanisms in which clinical summary documents are exchanged among disparate health care organizations can proliferate duplicate and conflicting information. To reduce information overload, a system to automatically consolidate information across multiple clinical summary documents was developed for an HIE network. The system receives any number of Continuity of Care Documents (CCDs) and outputs a single, consolidated record. To test the system, a randomly sampled corpus of 522 CCDs representing 50 unique patients was extracted from a large HIE network. The automated methods were compared to manual consolidation of information for three key sections of the CCD: problems, allergies, and medications. Manual consolidation of 11,631 entries was completed in approximately 150 hours. The same data were automatically consolidated in 3.3 minutes. The system successfully consolidated 99.1% of problems, 87.0% of allergies, and 91.7% of medications. Almost all of the inaccuracies were caused by issues involving the use of standardized terminologies within the documents to represent individual information entries. This study represents a novel, tested tool for de-duplication and consolidation of CDA documents, which is a major step toward improving information access and interoperability among information systems. While more work is necessary, automated systems like the one evaluated in this study will be necessary to meet the informatics needs of providers and health systems in the future.
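
    At its core, the consolidation step reduces to keying each coded entry on its terminology system and code and keeping one copy per key. The toy sketch below shows only that core on pre-parsed entries; the deployed system additionally parses the CDA XML, normalizes terminologies, and reconciles conflicting attributes, which is where most of the residual inaccuracies reported above arise. All field names here are illustrative.

      def consolidate(documents):
          """Merge the problems/allergies/medications sections of several CCDs
          for one patient into a single de-duplicated record.

          'documents' is a list of dicts shaped like
              {"problems": [{"system": "SNOMED CT", "code": "44054006",
                             "display": "Diabetes mellitus type 2"}, ...],
               "allergies": [...], "medications": [...]}
          i.e. entries already reduced to coded form; duplicates are recognised
          by their (terminology system, code) pair.
          """
          merged = {"problems": {}, "allergies": {}, "medications": {}}
          for doc in documents:
              for section, entries in doc.items():
                  for entry in entries:
                      key = (entry.get("system"), entry.get("code"))
                      # keep the first occurrence; later copies are duplicates
                      merged[section].setdefault(key, entry)
          return {section: list(entries.values())
                  for section, entries in merged.items()}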

  16. Typical effects of laser dazzling CCD camera

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Zhang, Jianmin; Shao, Bibo; Cheng, Deyan; Ye, Xisheng; Feng, Guobin

    2015-05-01

    In this article, an overview of laser dazzling effects on buried-channel CCD cameras is given. CCDs can be sorted into staring and scanning types; the former includes frame-transfer and interline-transfer devices, the latter linear and time-delay-integration devices. All CCDs must perform four primary tasks in generating an image: charge generation, charge collection, charge transfer and charge measurement. In a camera, lenses are needed to deliver the optical signal to the CCD sensor, and techniques for suppressing stray light are used in their design; electronic circuits are needed to process the CCD output signal, drawing on many standard electronic techniques. Dazzling effects are the combined result of light-distribution distortion and charge-distribution distortion, which derive from the lens and the sensor, respectively. Strictly speaking, the light distribution is not distorted in the lens: lenses are generally so well designed and fabricated that their stray light can be neglected, but a laser is intense enough to make its stray light obvious. In the CCD image sensor, a laser can induce a very large charge generation; charge-transfer inefficiency and blooming then distort the charge distribution. Commonly, the largest signal output from the CCD sensor is limited by the capacity of its collection well and does not exceed the dynamic range within which the subsequent electronic circuits operate normally, so the signal is not distorted in the post-processing circuits. However, some techniques in those circuits can make dazzling effects appear as different phenomena in the final image.

  17. Analysis of smear in high-resolution remote sensing satellites

    NASA Astrophysics Data System (ADS)

    Wahballah, Walid A.; Bazan, Taher M.; El-Tohamy, Fawzy; Fathy, Mahmoud

    2016-10-01

    High-resolution remote sensing satellites (HRRSS) that use time delay and integration (TDI) CCDs have the potential to introduce large amounts of image smear. Clocking smear and velocity-mismatch smear are two of the key factors inducing image smear. Clocking smear is caused by the discrete manner in which the charge is clocked in the TDI-CCDs. The relative motion between the HRRSS and the observed object requires that the image motion velocity be strictly synchronized with the velocity of the charge packet transfer (line rate) throughout the integration time. When imaging an object off-nadir, the image motion velocity changes, resulting in a mismatch between the image velocity and the CCD's line rate. A model for estimating the image motion velocity in HRRSS is derived. The influence of this velocity mismatch combined with clocking smear on the modulation transfer function (MTF) is investigated using Matlab simulation. The analysis is performed for cross-track and along-track imaging with different satellite attitude angles and TDI steps. The results reveal that the velocity mismatch ratio and the number of TDI steps have a serious impact on the smear MTF; a velocity mismatch ratio of 2% degrades the smear MTF by 32% at the Nyquist frequency when the TDI steps change from 32 to 96. In addition, the results show that to achieve the requirement of MTF_smear >= 0.95, for TDI steps of 16 and 64, the allowable roll angles are 13.7° and 6.85° and the permissible pitch angles are no more than 9.6° and 4.8°, respectively.
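
    A common first-order way to quantify the velocity-mismatch effect treats the accumulated smear as a uniform linear blur of length mismatch ratio x TDI steps (in pixels), whose MTF is a sinc function of spatial frequency. The sketch below implements only that approximation; the paper's model also includes the discrete clocking smear of the TDI transfer, so its numerical results differ from this simplified one.

      import numpy as np

      def smear_mtf(freq_cyc_per_pixel, mismatch_ratio, tdi_steps):
          """First-order MTF degradation from image-motion/line-rate mismatch.

          The accumulated smear over the integration is roughly
              d = |mismatch_ratio| * tdi_steps   (in pixels),
          and a uniform linear smear of length d has MTF(f) = |sinc(d * f)|
          (numpy's sinc convention, sin(pi x)/(pi x)).
          """
          d = abs(mismatch_ratio) * tdi_steps
          return np.abs(np.sinc(d * np.asarray(freq_cyc_per_pixel)))

      # example: MTF at the Nyquist frequency (0.5 cyc/pixel) for a 1% mismatch
      for steps in (16, 32, 64, 96):
          print(steps, smear_mtf(0.5, 0.01, steps))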

  18. Electronic cameras for low-light microscopy.

    PubMed

    Rasnik, Ivan; French, Todd; Jacobson, Ken; Berland, Keith

    2013-01-01

    This chapter introduces electronic cameras, discusses the various parameters considered in evaluating their performance, and describes some of the key features of different camera formats. The chapter also presents a basic understanding of how electronic cameras function and how their properties can be exploited to optimize image quality under low-light conditions. Although there are many types of cameras available for microscopy, the most reliable type is the charge-coupled device (CCD) camera, which remains preferred for high-performance systems. If time resolution and frame rate are of no concern, slow-scan CCDs certainly offer the best available performance, both in terms of the signal-to-noise ratio and their spatial resolution. Slow-scan cameras are thus the first choice for experiments using fixed specimens, such as measurements using immunofluorescence and fluorescence in situ hybridization. However, if video-rate imaging is required, one need not evaluate slow-scan CCD cameras. A very basic video CCD may suffice if samples are heavily labeled or are not perturbed by high-intensity illumination. When video-rate imaging is required for very dim specimens, the electron-multiplying CCD camera is probably the most appropriate at this technological stage. Intensified CCDs provide a unique tool for applications in which high-speed gating is required. Variable-integration-time video cameras are very attractive options if one needs to acquire images at video rate, as well as with longer integration times for less bright samples. This flexibility can facilitate many diverse applications with highly varied light levels.

  19. Flagging and Correction of Pattern Noise in the Kepler Focal Plane Array

    NASA Technical Reports Server (NTRS)

    Kolodziejczak, Jeffery J.; Caldwell, Douglas A.; VanCleve, Jeffrey E.; Clarke, Bruce D.; Jenkins, Jon M.; Cote, Miles T.; Klaus, Todd C.; Argabright, Vic S.

    2010-01-01

    In order for Kepler to achieve its required photometric precision of less than 20 ppm for magnitude 12 and brighter stars, instrument-induced variations in the CCD readout bias pattern (our "2D black image"), which are either fixed or slowly varying in time, must be identified and the corresponding pixels either corrected or removed from further data processing. The two principal sources of these readout bias variations are crosstalk between the 84 science CCDs and the 4 fine guidance sensor (FGS) CCDs, and a high-frequency amplifier oscillation on less than 40% of the CCD readout channels. The crosstalk produces a synchronous pattern in the 2D black image, with time variation observed in less than 10% of individual pixel bias histories. We will describe a method of removing the crosstalk signal using continuously collected data from masked and over-clocked image regions (our "collateral data"), and occasionally collected full-frame images and reverse-clocked readout signals. We use this same set to detect regions affected by the oscillating amplifiers. The oscillations manifest as time-varying moiré patterns and rolling bands in the affected channels. Because this effect reduces the performance in only a small fraction of the array at any given time, we have developed an approach for flagging suspect data. The flags will provide the necessary means to resolve any potential ambiguity between instrument-induced variations and real photometric variations in a target time series. We will also evaluate the effectiveness of these techniques using flight data from background and selected target pixels.

  20. Performance Characterization of UV Science Cameras Developed for the Chromospheric Lyman-Alpha Spectro-Polarimeter

    NASA Technical Reports Server (NTRS)

    Champey, Patrick; Kobayashi, Ken; Winebarger, Amy; Cirtain, Jonathan; Hyde, David; Robertson, Bryan; Beabout, Brent; Beabout, Dyana; Stewart, Mike

    2014-01-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-alpha and to detect the Hanle effect in the line core. Due to the nature of Lyman-alpha polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1 percent in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1 percent polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. Coating the e2v CCD57-10 512x512 detectors with Lumogen-E coating allows for a relatively high (30 percent) quantum efficiency at the Lyman-alpha line. The CLASP cameras were designed to operate with a gain of 2.0 +/- 0.5, less than or equal to 25 e- readout noise, less than or equal to 10 e-/second/pixel dark current, and less than 0.1 percent residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera: system gain, dark current, read noise, and residual non-linearity.

  1. Contour Mapping

    NASA Technical Reports Server (NTRS)

    1995-01-01

    In the early 1990s, the Ohio State University Center for Mapping, a NASA Center for the Commercial Development of Space (CCDS), developed a system for mobile mapping called the GPSVan. While driving, the users can map an area from the sophisticated mapping van equipped with satellite signal receivers, video cameras and computer systems for collecting and storing mapping data. George J. Igel and Company and the Ohio State University Center for Mapping advanced the technology for use in determining the contours of a construction site. The new system reduces the time required for mapping and staking, and can monitor the amount of soil moved.

  2. Generalized approach to cooling charge-coupled devices using thermoelectric coolers

    NASA Technical Reports Server (NTRS)

    Petrick, S. Walter

    1987-01-01

    This paper is concerned with the use of thermoelectric coolers (TECs) to cool charge-coupled devices (CCDs). Heat inputs to the CCD from the warmer environment are identified, and generalized graphs are used to approximate the major heat inputs. A method of choosing and estimating the power consumption of the TEC is discussed. This method includes the use of TEC performance information supplied by the manufacturer and equations derived from this information. Parameters of the equations are tabulated to enable the reader to use the TEC performance equations for choosing and estimating the power needed for specific TEC applications.
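
    A first-order sketch of the sizing calculation the paper formalizes: sum the radiative and conductive heat loads reaching the cooled CCD, then divide by an assumed TEC coefficient of performance (COP) to estimate the electrical input power. The emissivity, geometry, and COP values below are illustrative, not the manufacturer-derived parameters tabulated in the paper.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiative_load(area_m2, t_env_k, t_ccd_k, emissivity=0.8):
    """Radiative heat input from the warm surroundings to the cold CCD."""
    return emissivity * SIGMA * area_m2 * (t_env_k**4 - t_ccd_k**4)

def conductive_load(k_w_mk, area_m2, length_m, t_env_k, t_ccd_k):
    """Conduction through mounts/wiring of conductivity k, area A, length L."""
    return k_w_mk * area_m2 * (t_env_k - t_ccd_k) / length_m

def tec_input_power(q_load_w, cop=0.3):
    """Electrical power drawn by the TEC for a given heat load and COP."""
    return q_load_w / cop

# Example: a 3 cm x 3 cm CCD held at -40 C in a +20 C environment.
t_env, t_ccd = 293.0, 233.0
q = (radiative_load(0.03 * 0.03, t_env, t_ccd)
     + conductive_load(k_w_mk=0.2, area_m2=1e-5, length_m=0.02,
                       t_env_k=t_env, t_ccd_k=t_ccd))
print(f"heat load ~ {q:.2f} W")
print(f"TEC power ~ {tec_input_power(q):.2f} W")
```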

  3. Development of CMOS Active Pixel Image Sensors for Low Cost Commercial Applications

    NASA Technical Reports Server (NTRS)

    Gee, R.; Kemeny, S.; Kim, Q.; Mendis, S.; Nakamura, J.; Nixon, R.; Ortiz, M.; Pain, B.; Staller, C.; Zhou, Z.

    1994-01-01

    JPL, under sponsorship from the NASA Office of Advanced Concepts and Technology, has been developing a second-generation solid-state image sensor technology. Charge-coupled devices (CCD) are a well-established first generation image sensor technology. For both commercial and NASA applications, CCDs have numerous shortcomings. In response, the active pixel sensor (APS) technology has been under research. The major advantages of APS technology are the ability to integrate on-chip timing, control, signal-processing and analog-to-digital converter functions, reduced sensitivity to radiation effects, low power operation, and random access readout.

  4. Evaluation of RCA thinned buried channel charge-coupled devices /CCDs/ for scientific applications

    NASA Technical Reports Server (NTRS)

    Zucchino, P.; Long, D.; Lowrance, J. L.; Renda, G.; Crawshaw, D. D.; Battson, D. F.

    1981-01-01

    An experimental version of a thinned illuminated buried-channel 512 x 320 pixel CCD with reduced amplifier input capacitance has been produced which is characterized by lower readout noise. Changes made to the amplifier are discussed, and readout noise measurements obtained by several different techniques are presented. The single energetic electron response of the CCD in the electron-bombarded mode and the single 5.9 keV X-ray pulse height distribution are reported. Results are also given on the dark current versus temperature and the spatial frequency response as a function of signal level.

  5. Selecting Pixels for Kepler Downlink

    NASA Technical Reports Server (NTRS)

    Bryson, Stephen T.; Jenkins, Jon M.; Klaus, Todd C.; Cote, Miles T.; Quintana, Elisa V.; Hall, Jennifer R.; Ibrahim, Khadeejah; Chandrasekaran, Hema; Caldwell, Douglas A.; Van Cleve, Jeffrey E.

    2010-01-01

    The Kepler mission monitors > 100,000 stellar targets using 42 2200 x 1024 pixel CCDs. Bandwidth constraints prevent the downlink of all 96 million pixels per 30-minute cadence, so the Kepler spacecraft downlinks a specified collection of pixels for each target. These pixels are selected by considering the object brightness, background and the signal-to-noise of each pixel, and are optimized to maximize the signal-to-noise ratio of the target. This paper describes pixel selection, creation of spacecraft apertures that efficiently capture selected pixels, and aperture assignment to a target. Diagnostic apertures, short-cadence targets and custom specified shapes are discussed.
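
    A minimal sketch of the selection idea summarized above: rank candidate pixels by their individual signal-to-noise and keep adding them while the signal-to-noise of the summed aperture still improves. The noise model and array names are simplified assumptions, not the Kepler target-management code.

```python
import numpy as np

def select_pixels(signal_e, background_e, read_noise_e=100.0):
    """Greedy aperture construction maximizing the aggregate SNR.

    signal_e, background_e: expected stellar signal and background
    (electrons per cadence) for each candidate pixel.  Pixels are ranked
    by individual SNR and added until the summed-aperture SNR peaks.
    Returns a boolean mask of the selected pixels.
    """
    sig = signal_e.ravel()
    var = sig + background_e.ravel() + read_noise_e**2
    order = np.argsort(sig / np.sqrt(var))[::-1]
    snr_curve = np.cumsum(sig[order]) / np.sqrt(np.cumsum(var[order]))
    n_keep = int(np.argmax(snr_curve)) + 1        # SNR rises, peaks, declines
    mask = np.zeros(sig.size, dtype=bool)
    mask[order[:n_keep]] = True
    return mask.reshape(signal_e.shape)

# Example: a star with a Gaussian PSF on a uniform background.
y, x = np.mgrid[0:11, 0:11]
psf = np.exp(-((x - 5.0)**2 + (y - 5.0)**2) / (2.0 * 1.5**2))
star = 5e4 * psf / psf.sum()                      # 50,000 e- total per cadence
sky = np.full_like(star, 500.0)
aperture = select_pixels(star, sky)
print("pixels selected:", int(aperture.sum()))
```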

  6. ESA's CCD test bench for the PLATO mission

    NASA Astrophysics Data System (ADS)

    Beaufort, Thierry; Duvet, Ludovic; Bloemmaert, Sander; Lemmel, Frederic; Prod'homme, Thibaut; Verhoeve, Peter; Smit, Hans; Butler, Bart; van der Luijt, Cornelis; Heijnen, Jerko; Visser, Ivo

    2016-08-01

    PLATO (PLAnetary Transits and Oscillations of stars) is the third medium-class mission to be selected in the European Space Agency (ESA) Science and Robotic Exploration Cosmic Vision programme. Due for launch in 2025, the payload makes use of large-format (8 cm x 8 cm) Charge-Coupled Devices (CCDs), the e2v CCD270, operated at 4 MHz and at -70 °C. To de-risk the PLATO CCD qualification programme initiated in 2014 and support the mission definition process, ESA's Payload Technology Validation section from the Future Missions Office has developed a dedicated test bench.

  7. CCD correlation techniques

    NASA Technical Reports Server (NTRS)

    Hewes, C. R.; Bosshart, P. W.; Eversole, W. L.; Dewit, M.; Buss, D. D.

    1976-01-01

    Two CCD techniques were discussed for performing an N-point sampled data correlation between an input signal and an electronically programmable reference function. The design and experimental performance of an implementation of the direct time correlator utilizing two analog CCDs and MOS multipliers on a single IC were evaluated. The performance of a CCD implementation of the chirp z transform was described, and the design of a new CCD integrated circuit for performing correlation by multiplication in the frequency domain was presented. This chip provides a discrete Fourier transform (DFT) or inverse DFT, multipliers, and complete support circuitry for the CCD CZT. The two correlation techniques are compared.
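
    The frequency-domain route the abstract contrasts with the direct time correlator can be written in a few lines of numpy: transform both sequences, multiply one by the conjugate of the other, and transform back. This digital sketch only illustrates the principle that the CCD chirp-z-transform hardware implements in analog form.

```python
import numpy as np

def circular_correlation(signal, reference):
    """N-point circular correlation via the frequency domain.

    Equivalent to sum_k signal[k + lag] * reference[k] for each lag,
    with indices taken modulo N.
    """
    spec = np.fft.fft(signal) * np.conj(np.fft.fft(reference))
    return np.real(np.fft.ifft(spec))

# Example: locate a known chirp-like reference buried in noise.
rng = np.random.default_rng(2)
n = 256
reference = np.sin(0.02 * np.arange(64)**2)          # short chirp
signal = rng.normal(0.0, 0.5, n)
signal[100:164] += reference                         # embed at lag 100
corr = circular_correlation(signal, np.pad(reference, (0, n - 64)))
print("peak at lag", int(np.argmax(corr)))           # ~100
```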

  8. Modeling Contamination Migration on the Chandra X-Ray Observatory - III

    NASA Technical Reports Server (NTRS)

    O'Dell, Stephen L.; Swartz, Douglas A.; Tice, Neil W.; Plucinsky, Paul P.; Grant, Catherine E.; Marshall, Herman L.; Vikhlinin, Alexy A.; Tennant, Allyn F.; Dahmer, Matthew T.

    2015-01-01

    During its first 16 years of operation, the cold (about -60 °C) optical blocking filter of the Advanced CCD Imaging Spectrometer (ACIS), aboard the Chandra X-ray Observatory, has accumulated a growing layer of molecular contamination that attenuates low-energy x rays. Over the past few years, the accumulation rate, spatial distribution, and composition have changed. This evolution has motivated further analysis of contamination migration within and near the ACIS cavity, in part to evaluate potential bake-out scenarios intended to reduce the level of contamination. Keywords: X-ray astronomy, CCDs, contamination, modeling and simulation, spacecraft operations

  9. Gastric adenocarcinoma with chief cell differentiation: a proposal for reclassification as oxyntic gland polyp/adenoma.

    PubMed

    Singhi, Aatur D; Lazenby, Audrey J; Montgomery, Elizabeth A

    2012-07-01

    Gastric adenocarcinoma with chief cell differentiation (GA-CCD) has been reported as a new, rare variant of gastric adenocarcinoma. Only 12 cases in Japanese patients have been described to date, but they demonstrate distinct clinicopathologic features. To further characterize these lesions, we have collected 10 additional cases. Patients ranged in age from 44 to 79 years (mean, 64.2 y) with a relatively equal sex distribution (6 women and 4 men). Stratified by race, 4 patients were Hispanic, 2 were White, 2 were African American, 1 was Asian (Chinese), and the race was unknown for 1 patient. All patients presented with gastroesophageal reflux that prompted an endoscopic examination. The majority of GA-CCDs were identified in the fundus (7 of 10, 70%) and the remaining in the cardia (n=3). Grossly, they were solitary and polypoid, ranging in size from 0.2 to 0.8 cm (mean, 0.4 cm). Histologically, all cases were centered in the deep mucosa, with focal involvement of surface foveolar epithelium in 3 (30%) cases but not the submucosa. The tumors consisted of clustered glands and irregular branching cords of oxyntic epithelium. Thin wisps of radiating smooth muscle separated the epithelium, but desmoplasia was distinctly absent in all cases. The oxyntic mucosa was 1 to 2 cells thick and composed of a mixture of mucous neck, parietal, and chief cells. In 7 of 10 (70%) cases, chief cells were the predominant cell type, whereas the remaining 3 cases consisted primarily of mucous neck cells. The nuclei were mildly enlarged with slight nuclear pleomorphism, but no mitotic figures were identified. In addition, necrosis, lymphovascular invasion, and perineural invasion were absent. Immunohistochemically, GA-CCDs were diffusely positive for MUC6 (10 of 10, 100%) and negative for MUC5AC (0%) and MUC2 (0%). Ki-67 immunolabeling demonstrated variable expression, with the highest areas ranging from 0.2% to 10%. Clinical follow-up was available for 9 of 10 (90%) patients and ranged from 6 to 39 months. One patient had persistence of lesion at 6 months because of incomplete removal, whereas the other 8 were disease free. In summary, GA-CCDs are solitary, mucosal lesions of the gastric cardia/fundus that arise in patients from multiple ethnic backgrounds. Considering that patients within this study and those reported previously have had neither true recurrence nor progression of disease, these lesions are best regarded as benign. Consequently, the term GA-CCD is contradictory and we prefer the descriptive term "oxyntic gland polyp/adenoma" until further studies can clarify the pathogenesis of these lesions and their natural history.

  10. High-resolution EEG techniques for brain-computer interface applications.

    PubMed

    Cincotti, Febo; Mattia, Donatella; Aloise, Fabio; Bufalari, Simona; Astolfi, Laura; De Vico Fallani, Fabrizio; Tocci, Andrea; Bianchi, Luigi; Marciani, Maria Grazia; Gao, Shangkai; Millan, Jose; Babiloni, Fabio

    2008-01-15

    High-resolution electroencephalographic (HREEG) techniques allow estimation of cortical activity based on non-invasive scalp potential measurements, using appropriate models of volume conduction and of neuroelectrical sources. In this study we propose an application of this body of technologies, originally developed to obtain functional images of the brain's electrical activity, in the context of brain-computer interfaces (BCI). Our working hypothesis predicted that, since HREEG pre-processing removes spatial correlation introduced by current conduction in the head structures, by providing the BCI with waveforms that are mostly due to the unmixed activity of a small cortical region, a more reliable classification would be obtained, at least when the activity to detect has a limited generator, which is the case in motor related tasks. HREEG techniques employed in this study rely on (i) individual head models derived from anatomical magnetic resonance images, (ii) a distributed source model, composed of a layer of current dipoles, geometrically constrained to the cortical mantle, (iii) a depth-weighted minimum L2-norm constraint and Tikhonov regularization for the linear inverse problem solution and (iv) estimation of electrical activity in cortical regions of interest corresponding to relevant Brodmann areas. Six subjects were trained to learn self modulation of sensorimotor EEG rhythms, related to the imagination of limb movements. Off-line EEG data was used to estimate waveforms of cortical activity (cortical current density, CCD) on selected regions of interest. CCD waveforms were fed into the BCI computational pipeline as an alternative to raw EEG signals; spectral features are evaluated through statistical tests (r^2 analysis) to quantify their reliability for BCI control. These results are compared, within subjects, to analogous results obtained without HREEG techniques. The processing procedure was designed in such a way that computations could be split into a setup phase (which includes most of the computational burden) and the actual EEG processing phase, which was limited to a single matrix multiplication. This separation made the procedure suitable for on-line utilization, and a pilot experiment was performed. Results show that lateralization of electrical activity, which is expected to be contralateral to the imagined movement, is more evident on the estimated CCDs than in the scalp potentials. CCDs produce a pattern of relevant spectral features that is more spatially focused and has a higher statistical significance (EEG: 0.20 +/- 0.114 S.D.; CCD: 0.55 +/- 0.16 S.D.; p = 10^-5). A pilot experiment showed that a trained subject could utilize voluntary modulation of estimated CCDs for accurate (eight targets) on-line control of a cursor. This study showed that it is practically feasible to utilize HREEG techniques for on-line operation of a BCI system; off-line analysis suggests that accuracy of BCI control is enhanced by the proposed method.
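
    The split into a heavy setup phase and an on-line phase that reduces to a single matrix multiplication can be sketched as precomputing a regularized minimum-norm inverse operator and applying it to each incoming EEG sample. The random lead-field matrix and the regularization heuristic below are placeholders standing in for the subject-specific head model used in the study.

```python
import numpy as np

def minimum_norm_operator(leadfield, snr=3.0):
    """Tikhonov-regularized minimum-norm inverse operator K (setup phase).

    leadfield: (n_channels, n_sources) forward model from the head model.
    Returns K of shape (n_sources, n_channels) so that sources = K @ eeg.
    """
    n_ch = leadfield.shape[0]
    lam = np.trace(leadfield @ leadfield.T) / (n_ch * snr**2)   # reg. strength
    gram = leadfield @ leadfield.T + lam * np.eye(n_ch)
    return leadfield.T @ np.linalg.inv(gram)

rng = np.random.default_rng(4)
L = rng.normal(size=(64, 3000))          # 64 electrodes, 3000 cortical dipoles
K = minimum_norm_operator(L)             # expensive setup, done once
eeg_sample = rng.normal(size=64)
ccd_estimate = K @ eeg_sample            # on-line phase: one matrix multiply
print(ccd_estimate.shape)                # (3000,) cortical current estimates
```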

  11. Protecting Dark Skies in Chile

    NASA Astrophysics Data System (ADS)

    Smith, R. Chris; Sanhueza, Pedro; Phillips, Mark

    2018-01-01

    Current projections indicate that Chile will host approximately 70% of the astronomical collecting area on Earth by 2030, augmenting the enormous area of ALMA with that of three next-generation optical telescopes: LSST, GMTO, and E-ELT. These cutting-edge facilities represent billions of dollars of investment in the astronomical facilities hosted in Chile. The Chilean government, Chilean astronomical community, and the international observatories in Chile have recognized that these investments are threatened by light pollution, and have formed a strong collaboration to work at managing the threats. We will provide an update on the work being done in Chile, ranging from training municipalities about new lighting regulations to exploring international recognition of the dark sky sites of Northern Chile.

  12. Agile software development in an earned value world: a survival guide

    NASA Astrophysics Data System (ADS)

    Kantor, Jeffrey; Long, Kevin; Becla, Jacek; Economou, Frossie; Gelman, Margaret; Juric, Mario; Lambert, Ron; Krughoff, Simon; Swinbank, John D.; Wu, Xiuqin

    2016-08-01

    Agile methodologies are current best practice in software development. They are favored for, among other reasons, preventing premature optimization by taking a somewhat short-term focus, and allowing frequent replans/reprioritizations of upcoming development work based on recent results and current backlog. At the same time, funding agencies prescribe earned value management accounting for large projects which, these days, inevitably include substantial software components. Earned Value approaches emphasize a more comprehensive and typically longer-range plan, and tend to characterize frequent replans and reprioritizations as indicative of problems. Here we describe the planning, execution and reporting framework used by the LSST Data Management team, that navigates these opposite tensions.

  13. Pendrin protein abundance in the kidney is regulated by nitric oxide and cAMP.

    PubMed

    Thumova, Monika; Pech, Vladimir; Froehlich, Otto; Agazatian, Diana; Wang, Xiaonan; Verlander, Jill W; Kim, Young Hee; Wall, Susan M

    2012-09-15

    Pendrin is a Cl(-)/HCO(3)(-) exchanger, expressed in the apical regions of some intercalated cell subtypes, and is critical in the pressor response to angiotensin II. Since angiotensin type 1 receptor inhibitors reduce renal pendrin protein abundance in mice in vivo through a mechanism that is dependent on nitric oxide (NO), we asked if NO modulates renal pendrin expression in vitro and explored the mechanism by which it occurs. Thus we quantified pendrin protein abundance by confocal fluorescent microscopy in cultured mouse cortical collecting ducts (CCDs) and connecting tubules (CNTs). After overnight culture, CCDs maintain their tubular structure and maintain a solute gradient when perfused in vitro. Pendrin protein abundance increased 67% in CNT and 53% in CCD when NO synthase was inhibited (N(G)-nitro-L-arginine methyl ester, 100 μM), while NO donor (DETA NONOate, 200 μM) application reduced pendrin protein by ∼33% in the CCD and CNT. When CNTs were cultured in the presence of the guanylyl cyclase inhibitor 1H-[1,2,4] oxadiazolo[4,3-a]quinoxalin-1-one (10 μM), NO donors did not alter pendrin abundance. Conversely, pendrin protein abundance rose when cAMP content was increased by the application of an adenylyl cyclase agonist (forskolin, 10 μM), a cAMP analog (8-bromo-cAMP, 1 mM), or a phosphodiesterase inhibitor (BAY60-7550, 50 μM). Since NO reduces cellular cAMP in the CNT, we asked if NO reduces pendrin abundance by reducing cAMP. With blockade of cGMP-stimulated phosphodiesterase II, NO did not alter pendrin protein abundance. We conclude that NO acts through cAMP to reduce pendrin total protein abundance by enhancing cAMP degradation.

  14. Pendrin protein abundance in the kidney is regulated by nitric oxide and cAMP

    PubMed Central

    Thumova, Monika; Pech, Vladimir; Froehlich, Otto; Agazatian, Diana; Wang, Xiaonan; Verlander, Jill W.; Kim, Young Hee

    2012-01-01

    Pendrin is a Cl−/HCO3− exchanger, expressed in the apical regions of some intercalated cell subtypes, and is critical in the pressor response to angiotensin II. Since angiotensin type 1 receptor inhibitors reduce renal pendrin protein abundance in mice in vivo through a mechanism that is dependent on nitric oxide (NO), we asked if NO modulates renal pendrin expression in vitro and explored the mechanism by which it occurs. Thus we quantified pendrin protein abundance by confocal fluorescent microscopy in cultured mouse cortical collecting ducts (CCDs) and connecting tubules (CNTs). After overnight culture, CCDs maintain their tubular structure and maintain a solute gradient when perfused in vitro. Pendrin protein abundance increased 67% in CNT and 53% in CCD when NO synthase was inhibited (NG-nitro-l-arginine methyl ester, 100 μM), while NO donor (DETA NONOate, 200 μM) application reduced pendrin protein by ∼33% in the CCD and CNT. When CNTs were cultured in the presence of the guanylyl cyclase inhibitor 1H-[1,2,4] oxadiazolo[4,3-a]quinoxalin-1-one (10 μM), NO donors did not alter pendrin abundance. Conversely, pendrin protein abundance rose when cAMP content was increased by the application of an adenylyl cyclase agonist (forskolin, 10 μM), a cAMP analog (8-bromo-cAMP, 1 mM), or a phosphodiesterase inhibitor (BAY60-7550, 50 μM). Since NO reduces cellular cAMP in the CNT, we asked if NO reduces pendrin abundance by reducing cAMP. With blockade of cGMP-stimulated phosphodiesterase II, NO did not alter pendrin protein abundance. We conclude that NO acts through cAMP to reduce pendrin total protein abundance by enhancing cAMP degradation. PMID:22811483

  15. Bioinformatic and expression analyses on carotenoid dioxygenase genes in fruit development and abiotic stress responses in Fragaria vesca.

    PubMed

    Wang, Yong; Ding, Guanqun; Gu, Tingting; Ding, Jing; Li, Yi

    2017-08-01

    Carotenoid dioxygenases, including 9-cis-epoxycarotenoid dioxygenases (NCEDs) and carotenoid cleavage dioxygenases (CCDs), can selectively cleave carotenoids into various apocarotenoid products that play important roles in fleshy fruit development and abiotic stress response. In this study, we identified 12 carotenoid dioxygenase genes in diploid strawberry Fragaria vesca, and explored their evolution with orthologous genes from nine other species. Phylogenetic analyses suggested that the NCED and CCDL groups moderately expanded during their evolution, whereas gene numbers of the CCD1, CCD4, CCD7, and CCD8 groups maintained conserved. We characterized the expression profiles of FveNCED and FveCCD genes during flower and fruit development, and in response to several abiotic stresses. FveNCED1 expression positively responded to osmotic, cold, and heat stresses, whereas FveNCED2 was only induced under cold stress. In contrast, FveNCED2 was the unique gene highly and continuously increasing in receptacle during fruit ripening, which co-occurred with the increase in endogenous abscisic acid (ABA) content previously reported in octoploid strawberry. The differential expression patterns suggested that FveNCED1 and FveNCED2 were key genes for ABA biosynthesis in abiotic stress responses and fruit ripening, respectively. FveCCD1 exhibited the highest expression in most stages of flower and fruit development, while the other FveCCDs were expressed in a subset of stages and tissues. Our study suggests distinct functions of FveNCED and FveCCD genes in fruit development and stress responses and lays a foundation for future study to understand the roles of these genes and their metabolites, including ABA and other apocarotenoid products, in the growth and development of strawberry.

  16. HiPERCAM: a high-speed quintuple-beam CCD camera for the study of rapid variability in the universe

    NASA Astrophysics Data System (ADS)

    Dhillon, Vikram S.; Marsh, Thomas R.; Bezawada, Naidu; Black, Martin; Dixon, Simon; Gamble, Trevor; Henry, David; Kerry, Paul; Littlefair, Stuart; Lunney, David W.; Morris, Timothy; Osborn, James; Wilson, Richard W.

    2016-08-01

    HiPERCAM is a high-speed camera for the study of rapid variability in the Universe. The project is funded by a €3.5M European Research Council Advanced Grant. HiPERCAM builds on the success of our previous instrument, ULTRACAM, with very significant improvements in performance thanks to the use of the latest technologies. HiPERCAM will use 4 dichroic beamsplitters to image simultaneously in 5 optical channels covering the u'g'r'i'z' bands. Frame rates of over 1000 per second will be achievable using an ESO CCD controller (NGC), with every frame GPS timestamped. The detectors are custom-made, frame-transfer CCDs from e2v, with 4 low noise (2.5e-) outputs, mounted in small thermoelectrically-cooled heads operated at 180 K, resulting in virtually no dark current. The two reddest CCDs will be deep-depletion devices with anti-etaloning, providing high quantum efficiencies across the red part of the spectrum with no fringing. The instrument will also incorporate scintillation noise correction via the conjugate-plane photometry technique. The opto-mechanical chassis will make use of additive manufacturing techniques in metal to make a light-weight, rigid and temperature-invariant structure. First light is expected on the 4.2m William Herschel Telescope on La Palma in 2017 (on which the field of view will be 10' with a 0.3"/pixel scale), with subsequent use planned on the 10.4m Gran Telescopio Canarias on La Palma (on which the field of view will be 4' with a 0.11"/pixel scale) and the 3.5m New Technology Telescope in Chile.

  17. Health Care Expenditures and Utilization for Children With Noncomplex Chronic Disease.

    PubMed

    Hoefgen, Erik R; Andrews, Annie L; Richardson, Troy; Hall, Matthew; Neff, John M; Macy, Michelle L; Bettenhausen, Jessica L; Shah, Samir S; Auger, Katherine A

    2017-09-01

    Pediatric health care expenditures and use vary by level of complexity and chronic illness. We sought to determine expenditures and use for children with noncomplex chronic diseases (NC-CDs). We performed a retrospective, cross-sectional analysis of Medicaid enrollees (ages 0-18 years) from January 1, 2012, through December 31, 2013, using administrative claims (the Truven MarketScan Medicaid Database). Patients were categorized by chronicity of illness by using 3M Health Information System's Clinical Risk Groups (CRGs) as follows: without chronic diseases (WO-CDs) (CRG 1-2), NC-CDs (CRG 3-5), and complex chronic diseases (C-CDs) (CRG 6-9). Primary outcomes were medical expenditures, including total annualized population expenditure and per-member per-year expenditure (PMPY). Secondary outcomes included the number of health care encounters over the 2-year period. There were 2 424 946 children who met inclusion criteria, 53% were WO-CD; 36% had an NC-CD; and 11% had a C-CD. Children with NC-CDs accounted for 33% ($2801 PMPY) of the annual spending compared with 20% ($1151 PMPY) accounted for by children WO-CDs and 47% ($12 569 PMPY) by children with C-CDs. The median outpatient visit count by group over the 2-year period was 15 (interquartile range [IQR] 10-25) for NC-CD, 8 (IQR 5-13) WO-CD, and 34 (IQR 19-72) for C-CD. Children with NC-CDs accounted for 33% of pediatric Medicaid expenditures and have significantly higher PMPY and aggregate annual expenditures than children WO-CDs. The annual aggregate expenditures of the NC-CD group represent a significant societal cost because of the high volume of children, extrapolated to ∼$34.9 billion annually in national Medicaid expenditures. Copyright © 2017 by the American Academy of Pediatrics.

  18. Antigenic cross-reactivity between Schistosoma mansoni and peanut: a role for cross-reactive carbohydrate determinants (CCDs) and implications for the hygiene hypothesis.

    PubMed

    Igetei, Joseph E; El-Faham, Marwa; Liddell, Susan; Doenhoff, Michael J

    2017-04-01

    The antigenic reactivity of constituents of Schistosoma mansoni and peanut (Arachis hypogaea) was investigated to determine whether identical antigenic epitopes possessed by both organisms provided a possible explanation for the negative correlation between chronic schistosome infection and atopy to allergens. Aqueous extracts of peanuts were probed in Western immunoblots with rabbit IgG antibodies raised against the egg, cercarial and adult worm stages of S. mansoni. Several molecules in the peanut extract were antigenically reactive with antibodies from the various rabbit anti-schistosome sera. A pair of cross-reactive peanut molecules at ~30 000-33 000 molecular weight was purified and both proteins were identified by mass spectrometric analysis as the peanut allergen Ara h 1. Anti-S. mansoni soluble egg antigen antibodies that were eluted off the peanut molecules reacted with two S. mansoni egg antigens identified by mass spectrometry as IPSE/α-1 and κ-5. Alignments of the amino acid sequences of Ara h 1 and either IPSE/α-1 or κ-5 revealed a low level of peptide sequence identity. Incubation of nitrocellulose paper carrying electrophoresed peanut molecules, six constituents of other allergic plants and S. mansoni egg antigens in a mild solution of sodium metaperiodate before probing with antibodies, inhibited most of the cross-reactivities. The results are consistent with the antigenic cross-reactive epitopes of S. mansoni egg antigens, peanut and other allergic plants being cross-reactive carbohydrate determinants (CCDs). These findings are novel and an explanation based on 'blocking antibodies' could provide an insight for the inverse relationship observed between schistosome infection and allergies. © 2017 John Wiley & Sons Ltd.

  19. Comparing simulations and test data of a radiation damaged CCD for the Euclid mission

    NASA Astrophysics Data System (ADS)

    Skottfelt, Jesper; Hall, David; Gow, Jason; Murray, Neil; Holland, Andrew; Prod'homme, Thibaut

    2016-07-01

    The radiation damage effects from the harsh radiative environment outside the Earth's atmosphere can be a cause for concern for most space missions. With the science goals becoming ever more demanding, the requirements on the precision of the instruments on board these missions also increase, and it is therefore important to investigate how the radiation-induced damage affects the Charge-Coupled Devices (CCDs) that most of these instruments rely on. The primary goal of the Euclid mission is to study the nature of dark matter and dark energy using weak lensing and baryonic acoustic oscillation techniques. The weak lensing technique depends on very precise shape measurements of distant galaxies obtained by a large CCD array. It is anticipated that over the 6-year nominal lifetime of the mission, the CCDs will be degraded to an extent that these measurements will not be possible unless the radiation damage effects are corrected. We have therefore created a Monte Carlo model that simulates the physical processes taking place when transferring signal through a radiation-damaged CCD. The software is based on Shockley-Read-Hall theory and is made to mimic the physical properties in the CCD as closely as possible. The code runs on a single electrode level and takes charge cloud size and density, three-dimensional trap position, and multi-level clocking into account. A key element of the model is that it takes device-specific simulations of electron density as a direct input, thereby avoiding any analytical assumptions about the size and density of the charge cloud. This paper illustrates how test data and simulated data can be compared in order to further our understanding of the positions and properties of the individual radiation-induced traps.
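
    A deliberately simplified, expectation-value sketch of the Shockley-Read-Hall bookkeeping such a model is built on: during each transfer the packet dwells over a set of empty traps and loses charge with capture probability 1 - exp(-t/τ_c), while electrons emitted later rejoin trailing pixels rather than the packet itself. The trap density, time constants, and single-packet geometry are illustrative assumptions, not the Euclid model's device-specific inputs.

```python
import numpy as np

def clock_packet_through_traps(n_electrons, n_transfers, tau_c, t_clock,
                               traps_per_pixel):
    """Expectation-value sketch of SRH capture losses during readout.

    On every transfer the packet dwells for t_clock over a fresh set of
    empty traps and loses charge with capture probability
    1 - exp(-t_clock / tau_c), up to the number of traps available.
    Released electrons end up in trailing pixels, so only capture
    matters for the leading-packet loss modelled here.
    """
    p_capture = 1.0 - np.exp(-t_clock / tau_c)
    packet = float(n_electrons)
    for _ in range(n_transfers):
        packet -= min(packet, traps_per_pixel * p_capture)
    return packet

signal, transfers = 1000.0, 2000
out = clock_packet_through_traps(signal, transfers, tau_c=1e-4,
                                 t_clock=1e-5, traps_per_pixel=0.02)
cti = 1.0 - (out / signal) ** (1.0 / transfers)     # effective per-transfer CTI
print(f"charge out: {out:.1f} e-, effective CTI ~ {cti:.2e}")
```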

  20. Functional Implication of β-Carotene Hydroxylases in Soybean Nodulation

    PubMed Central

    Kim, Yun-Kyoung; Kim, Sunghan; Um, Ji-Hyun; Kim, Kyunga; Choi, Sun-Kang; Um, Byung-Hun; Kang, Suk-Woo; Kim, Jee-Woong; Takaichi, Shinichi; Song, Seok-Bo; Lee, Choon-Hwan; Kim, Ho-Seung; Kim, Ki Woo; Nam, Kyoung Hee; Lee, Suk-Ha; Kim, Yul-Ho; Park, Hyang-Mi; Ha, Sun-Hwa; Verma, Desh Pal S.; Cheon, Choong-Ill

    2013-01-01

    Legume-Rhizobium spp. symbiosis requires signaling between the symbiotic partners and differential expression of plant genes during nodule development. Previously, we cloned a gene encoding a putative β-carotene hydroxylase (GmBCH1) from soybean (Glycine max) whose expression increased during nodulation with Bradyrhizobium japonicum. In this work, we extended our study to three GmBCHs to examine their possible role(s) in nodule development, as they were additionally identified as nodule specific, along with the completion of the soybean genome. In situ hybridization revealed the expression of three GmBCHs (GmBCH1, GmBCH2, and GmBCH3) in the infected cells of root nodules, and their enzymatic activities were confirmed by functional assays in Escherichia coli. Localization of GmBCHs by transfecting Arabidopsis (Arabidopsis thaliana) protoplasts with green fluorescent protein fusions and by electron microscopic immunogold detection in soybean nodules indicated that GmBCH2 and GmBCH3 were present in plastids, while GmBCH1 appeared to be cytosolic. RNA interference of the GmBCHs severely impaired nitrogen fixation as well as nodule development. Surprisingly, we failed to detect zeaxanthin, a product of GmBCH, or any other carotenoids in nodules. Therefore, we examined the possibility that most of the carotenoids in nodules are converted or cleaved to other compounds. We detected the expression of some carotenoid cleavage dioxygenases (GmCCDs) in wild-type nodules and also a reduced amount of zeaxanthin in GmCCD8-expressing E. coli, suggesting cleavage of the carotenoid. In view of these findings, we propose that carotenoids such as zeaxanthin synthesized in root nodules are cleaved by GmCCDs, and we discuss the possible roles of the carotenoid cleavage products in nodulation. PMID:23700351

  1. Consortium for materials development in space interaction with Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Lundquist, Charles A.; Seaquist, Valerie

    1992-01-01

    The Consortium for Materials Development in Space (CMDS) is one of seventeen Centers for the Commercial Development of Space (CCDS) sponsored by the Office of Commercial Programs of NASA. The CMDS formed at the University of Alabama in Huntsville in the fall of 1985. The Consortium activities therefore will have progressed for over a decade by the time Space Station Freedom (SSF) begins operation. The topic to be addressed here is: what are the natural, mutually productive relationships between the CMDS and SSF? For management and planning purposes, the Consortium organizes its activities into a number of individual projects. Normally, each project has a team of personnel from industry, university, and often government organizations. This is true for both product-oriented materials projects and for infrastructure projects. For various projects Space Station offers specific mutually productive relationships. First, SSF can provide a site for commercial operations that have evolved as a natural stage in the life cycle of individual projects. Efficiency and associated cost control lead to another important option. With SSF in place, there is the possibility to leave major parts of processing equipment in SSF, and only bring materials to SSF to be processed and return to earth the treated materials. This saves the transportation costs of repeatedly carrying heavy equipment to orbit and back to the ground. Another generic feature of commercial viability can be the general need to accomplish large through-put or large scale operations. The size of SSF lends itself to such needs. Also in addition to processing equipment, some of the other infrastructure capabilities developed in CCDS projects may be applied on SSF to support product activities. The larger SSF program may derive mutual benefits from these infrastructure abilities.

  2. Three-dimensional gadolinium-enhanced magnetic resonance venography in suspected thrombo-occlusive disease of the central chest veins.

    PubMed

    Kroencke, T J; Taupitz, M; Arnold, R; Fritsche, L; Hamm, B

    2001-11-01

    To determine the usefulness of high-resolution three-dimensional (3D) gadolinium-enhanced magnetic resonance venography (MRV) in the evaluation of central venous thrombo-occlusive disease of the chest. Prospective study. University hospital. Sixteen consecutive patients with clinically suspected thrombosis of the superior vena cava, subclavian, brachiocephalic/innominate, internal jugular, or axillary veins. Thirteen patients had a neoplasm, two patients had a connective tissue disease, and one patient had a history of strenuous exercise. Twelve of 16 patients had prior central venous catheter placement. MRI was correlated with color-coded duplex sonography (CCDS) in 7 of 16 patients, digital subtraction angiography (DSA) in 3 of 16 patients, and CT in 2 of 16 patients. Contrast-enhanced MRV was performed in a total of 20 examinations. A 3D data set (gradient echo; time to repeat, 4.6 ms; time to echo, 1.8 ms; flip angle, 30 degrees; time of acquisition, 23 s; 512 matrix/64 partitions; slice thickness, 1.5 mm) was acquired in the arterial and venous phase. Overall image quality was assessed on a 5-point scale. The presence, site, and extent of thrombus, as well as presence of an intravascular device, were determined. Overall image quality was rated very good (1 point) in 7 of 16 cases (44%) and good (2 points) in 9 of 16 cases (56%). Thrombus was detected in 16 of 16 patients, and complete extent of disease could be determined in 15 of 16 patients (94%). MRV did not miss any finding obtained by CCDS, DSA, or CT, and provided additional information in 6 of 16 examinations (38%). Contrast-enhanced MRV is a fast and reliable noninvasive procedure with excellent results regarding detection and determination of the extent of thrombo-occlusive disease of the chest veins.

  3. Dominant KCNA2 mutation causes episodic ataxia and pharmacoresponsive epilepsy.

    PubMed

    Corbett, Mark A; Bellows, Susannah T; Li, Melody; Carroll, Renée; Micallef, Silvana; Carvill, Gemma L; Myers, Candace T; Howell, Katherine B; Maljevic, Snezana; Lerche, Holger; Gazina, Elena V; Mefford, Heather C; Bahlo, Melanie; Berkovic, Samuel F; Petrou, Steven; Scheffer, Ingrid E; Gecz, Jozef

    2016-11-08

    To identify the genetic basis of a family segregating episodic ataxia, infantile seizures, and heterogeneous epilepsies and to study the phenotypic spectrum of KCNA2 mutations. A family with 7 affected individuals over 3 generations underwent detailed phenotyping. Whole genome sequencing was performed on a mildly affected grandmother and her grandson with epileptic encephalopathy (EE). Segregating variants were filtered and prioritized based on functional annotations. The effects of the mutation on channel function were analyzed in vitro by voltage clamp assay and in silico by molecular modeling. KCNA2 was sequenced in 35 probands with heterogeneous phenotypes. The 7 family members had episodic ataxia (5), self-limited infantile seizures (5), evolving to genetic generalized epilepsy (4), focal seizures (2), and EE (1). They had a segregating novel mutation in the shaker type voltage-gated potassium channel KCNA2 (CCDS_827.1: c.765_773del; p.255_257del). A rare missense SCN2A (rs200884216) variant was also found in 2 affected siblings and their unaffected mother. The p.255_257del mutation caused dominant negative loss of channel function. Molecular modeling predicted repositioning of critical arginine residues in the voltage-sensing domain. KCNA2 sequencing revealed 1 de novo mutation (CCDS_827.1: c.890G>A; p.Arg297Gln) in a girl with EE, ataxia, and tremor. A KCNA2 mutation caused dominantly inherited episodic ataxia, mild infantile-onset seizures, and later generalized and focal epilepsies in the setting of normal intellect. This observation expands the KCNA2 phenotypic spectrum from EE often associated with chronic ataxia, reflecting the marked variation in severity observed in many ion channel disorders. © 2016 American Academy of Neurology.

  4. Hyper Suprime-Cam: Camera dewar design

    NASA Astrophysics Data System (ADS)

    Komiyama, Yutaka; Obuchi, Yoshiyuki; Nakaya, Hidehiko; Kamata, Yukiko; Kawanomoto, Satoshi; Utsumi, Yousuke; Miyazaki, Satoshi; Uraguchi, Fumihiro; Furusawa, Hisanori; Morokuma, Tomoki; Uchida, Tomohisa; Miyatake, Hironao; Mineo, Sogo; Fujimori, Hiroki; Aihara, Hiroaki; Karoji, Hiroshi; Gunn, James E.; Wang, Shiang-Yu

    2018-01-01

    This paper describes the detailed design of the CCD dewar and the camera system which is a part of the wide-field imager Hyper Suprime-Cam (HSC) on the 8.2 m Subaru Telescope. On the 1.°5 diameter focal plane (497 mm in physical size), 116 four-side buttable 2 k × 4 k fully depleted CCDs are tiled with 0.3 mm gaps between adjacent chips, which are cooled down to -100°C by two pulse tube coolers with a capability to exhaust 100 W heat at -100°C. The design of the dewar is basically a natural extension of Suprime-Cam, incorporating some improvements such as (1) a detailed CCD positioning strategy to avoid any collision between CCDs while maximizing the filling factor of the focal plane, (2) a spherical washers mechanism adopted for the interface points to avoid any deformation caused by the tilt of the interface surface to be transferred to the focal plane, (3) the employment of a truncated-cone-shaped window, made of synthetic silica, to save the back focal space, and (4) a passive heat transfer mechanism to exhaust efficiently the heat generated from the CCD readout electronics which are accommodated inside the dewar. Extensive simulations using a finite-element analysis (FEA) method are carried out to verify that the design of the dewar is sufficient to satisfy the assigned errors. We also perform verification tests using the actually assembled CCD dewar to supplement the FEA and demonstrate that the design is adequate to ensure an excellent image quality which is key to the HSC. The details of the camera system, including the control computer system, are described as well as the assembling process of the dewar and the process of installation on the telescope.

  5. Implementation of a scalable, web-based, automated clinical decision support risk-prediction tool for chronic kidney disease using C-CDA and application programming interfaces.

    PubMed

    Samal, Lipika; D'Amore, John D; Bates, David W; Wright, Adam

    2017-11-01

    Clinical decision support tools for risk prediction are readily available, but typically require workflow interruptions and manual data entry so are rarely used. Due to new data interoperability standards for electronic health records (EHRs), other options are available. As a clinical case study, we sought to build a scalable, web-based system that would automate calculation of kidney failure risk and display clinical decision support to users in primary care practices. We developed a single-page application, web server, database, and application programming interface to calculate and display kidney failure risk. Data were extracted from the EHR using the Consolidated Clinical Document Architecture interoperability standard for Continuity of Care Documents (CCDs). EHR users were presented with a noninterruptive alert on the patient's summary screen and a hyperlink to details and recommendations provided through a web application. Clinic schedules and CCDs were retrieved using existing application programming interfaces to the EHR, and we provided a clinical decision support hyperlink to the EHR as a service. We debugged a series of terminology and technical issues. The application was validated with data from 255 patients and subsequently deployed to 10 primary care clinics where, over the course of 1 year, 569 533 CCD documents were processed. We validated the use of interoperable documents and open-source components to develop a low-cost tool for automated clinical decision support. Since Consolidated Clinical Document Architecture-based data extraction extends to any certified EHR, this demonstrates a successful modular approach to clinical decision support. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
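
    A minimal sketch of the extraction-plus-scoring pattern described here: pull a lab value out of a CCD/C-CDA document by its LOINC code and feed it to a risk calculation. The XML shape assumed below is the common result-observation layout, and the logistic "risk" weights are made-up placeholders, not the validated kidney failure risk equation used by the deployed tool.

```python
import math
import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}     # C-CDA documents use the HL7 v3 namespace

def latest_lab_value(ccd_xml, loinc_code):
    """Return the last observation value in the document for a LOINC code.

    Assumes results appear as <observation><code code=...><value value=...>,
    a common shape in CCD result sections; real documents need far more
    defensive parsing than this.
    """
    root = ET.fromstring(ccd_xml)
    values = []
    for obs in root.iter("{urn:hl7-org:v3}observation"):
        code = obs.find("hl7:code", NS)
        val = obs.find("hl7:value", NS)
        if code is not None and val is not None and code.get("code") == loinc_code:
            values.append(float(val.get("value")))
    return values[-1] if values else None

def kidney_risk_placeholder(egfr, age, male):
    """Logistic toy model standing in for a validated risk equation."""
    z = 2.0 - 0.08 * egfr + 0.02 * age + 0.3 * (1 if male else 0)  # made-up weights
    return 1.0 / (1.0 + math.exp(-z))

SAMPLE = """<ClinicalDocument xmlns="urn:hl7-org:v3">
  <observation><code code="33914-3"/><value value="48" unit="mL/min"/></observation>
</ClinicalDocument>"""

egfr = latest_lab_value(SAMPLE, "33914-3")   # 33914-3: an eGFR LOINC code
print("eGFR:", egfr, "risk:", round(kidney_risk_placeholder(egfr, 66, True), 2))
```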

  6. Constraining neutrino masses with the integrated-Sachs-Wolfe-galaxy correlation function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lesgourgues, Julien; Valkenburg, Wessel; Gaztanaga, Enrique

    2008-03-15

    Temperature anisotropies in the cosmic microwave background (CMB) are affected by the late integrated Sachs-Wolfe (lISW) effect caused by any time variation of the gravitational potential on linear scales. Dark energy is not the only source of lISW, since massive neutrinos induce a small decay of the potential on small scales during both matter and dark energy domination. In this work, we study the prospect of using the cross correlation between CMB and galaxy-density maps as a tool for constraining the neutrino mass. On the one hand massive neutrinos reduce the cross-correlation spectrum because free-streaming slows down structure formation; on the other hand, they enhance it through their change in the effective linear growth. We show that in the observable range of scales and redshifts, the first effect dominates, but the second one is not negligible. We carry out an error forecast analysis by fitting some mock data inspired by the Planck satellite, Dark Energy Survey (DES) and Large Synoptic Survey Telescope (LSST). The inclusion of the cross correlation data from Planck and LSST increases the sensitivity to the neutrino mass m_ν by 38% (and to the dark energy equation of state w by 83%) with respect to Planck alone. The correlation between Planck and DES brings a far less significant improvement. This method is not potentially as good for detecting m_ν as the measurement of galaxy, cluster, or cosmic shear power spectra, but since it is independent and affected by different systematics, it remains potentially interesting if the total neutrino mass is of the order of 0.2 eV; if instead it is close to the lower bound from atmospheric oscillations, m_ν ≈ 0.05 eV, we do not expect the ISW-galaxy correlation to be ever sensitive to m_ν.

  7. The Future of Astrometric Education

    NASA Astrophysics Data System (ADS)

    van Altena, W.; Stavinschi, M.

    2005-10-01

    Astrometry is poised to enter an era of unparalleled growth and relevance due to the wealth of highly accurate data expected from the SIM and GAIA space missions. Innovative ground-based telescopes, such as the LSST, are planned which will provide less precise data, but for many more stars. The potential for studies of the structure, kinematics and dynamics of our Galaxy as well as for the physical nature of stars and the cosmological distance scale is without equal in the history of astronomy. It is therefore ironic that in two years not one course in astrometry will be taught in the US, leaving all astrometric education to Europe, China and Latin America. Who will ensure the astrometric quality control for the JWT, SIM, GAIA, LSST, to say nothing about the current large ground-based facilities, such as the VLT, Gemini, Keck, NOAO, Magellan, LBT, etc.? Hipparcos and the HST were astrometric successes due only to the dedicated work of specialists in astrometry who fought to maintain the astrometric characteristics of those satellites and their data pipelines. We propose a renewal of astrometric education in the universities to prepare qualified scientists so that the scientific returns from the investment of billions of dollars in these unique facilities will be maximized. The funding agencies are providing outstanding facilities. The universities, national and international observatories and agencies should acknowledge their responsibility to hire qualified full-time astrometric scientists to teach students, and to supervise existing and planned astronomical facilities so that quality data will be obtained and analyzed. A temporary solution to this problem is proposed in the form of a series of international summer schools in Astrometry. The Michelson Science Center of the SIM project has offered to hold an astrometry summer school in 2005 to begin this process. A one-semester syllabus is suggested as a means of meeting the needs of Astronomy by educating students in astrometric techniques that might be most valuable for careers associated with modern astrophysics.

  8. Managing Astronomy Research Data: Case Studies of Big and Small Research Projects

    NASA Astrophysics Data System (ADS)

    Sands, Ashley E.

    2015-01-01

    Astronomy data management refers to all actions taken upon data over the course of the entire research process. It includes activities involving the collection, organization, analysis, release, storage, archiving, preservation, and curation of research data. Astronomers have cultivated data management tools, infrastructures, and local practices to ensure the use and future reuse of their data. However, new sky surveys will soon amass petabytes of data requiring new data management strategies.The goal of this dissertation, to be completed in 2015, is to identify and understand data management practices and the infrastructure and expertise required to support best practices. This will benefit the astronomy community in efforts toward an integrated scholarly communication framework.This dissertation employs qualitative, social science research methods (including interviews, observations, and document analysis) to conduct case studies of data management practices, covering the entire data lifecycle, amongst three populations: Sloan Digital Sky Survey (SDSS) collaboration team members; Individual and small-group users of SDSS data; and Large Synoptic Survey Telescope (LSST) collaboration team members. I have been observing the collection, release, and archiving of data by the SDSS collaboration, the data practices of individuals and small groups using SDSS data in journal articles, and the LSST collaboration's planning and building of infrastructure to produce data.Preliminary results demonstrate that current data management practices in astronomy are complex, situational, and heterogeneous. Astronomers often have different management repertoires for working on sky surveys and for their own data collections, varying their data practices as they move between projects. The multitude of practices complicates coordinated efforts to maintain data.While astronomy expertise proves critical to managing astronomy data in the short, medium, and long term, the larger astronomy data workforce encompasses a greater breadth of educational backgrounds. Results show that teams of individuals with distinct expertise are key to ensuring the long-term preservation and usability of astronomy datasets.

  9. Color Me Intrigued: The Discovery of iPTF 16fnm, an SN 2002cx-like Object

    NASA Astrophysics Data System (ADS)

    Miller, A. A.; Kasliwal, M. M.; Cao, Y.; Adams, S. M.; Goobar, A.; Knežević, S.; Laher, R. R.; Lunnan, R.; Masci, F. J.; Nugent, P. E.; Perley, D. A.; Petrushevska, T.; Quimby, R. M.; Rebbapragada, U. D.; Sollerman, J.; Taddia, F.; Kulkarni, S. R.

    2017-10-01

    Modern wide-field, optical time-domain surveys must solve a basic optimization problem: maximize the number of transient discoveries or minimize the follow-up needed for the new discoveries. Here, we describe the Color Me Intrigued experiment, the first from the intermediate Palomar Transient Factory (iPTF) to search for transients simultaneously in the gPTF and RPTF bands. During the course of this experiment, we discovered iPTF 16fnm, a new member of the 02cx-like subclass of Type Ia supernovae (SNe). iPTF 16fnm peaked at M_gPTF = -15.09 +/- 0.17 mag, making it the second-least-luminous known SN Ia. iPTF 16fnm exhibits all the hallmarks of the 02cx-like class: (i) low luminosity at peak, (ii) low ejecta velocities, and (iii) a non-nebular spectrum several months after peak. Spectroscopically, iPTF 16fnm exhibits a striking resemblance to two other low-luminosity 02cx-like SNe: SN 2007qd and SN 2010ae. iPTF 16fnm and SN 2005hk decline at nearly the same rate, despite a 3 mag difference in brightness at peak. When considering the full subclass of 02cx-like SNe, we do not find evidence for a tight correlation between peak luminosity and decline rate in either the g' or r' band. We measure the relative rate of 02cx-like SNe to normal SNe Ia and find N_02cx/N_Ia = 33 (+158, -25)%. We further examine the g' - r' evolution of 02cx-like SNe and find that their unique color evolution can be used to separate them from 91bg-like and normal SNe Ia. This selection function will be especially important in the spectroscopically incomplete Zwicky Transient Facility/Large Synoptic Survey Telescope (LSST) era. Finally, we close by recommending that LSST periodically evaluate, and possibly update, its observing cadence to maximize transient science.

  10. Liverpool telescope 2: a new robotic facility for rapid transient follow-up

    NASA Astrophysics Data System (ADS)

    Copperwheat, C. M.; Steele, I. A.; Barnsley, R. M.; Bates, S. D.; Bersier, D.; Bode, M. F.; Carter, D.; Clay, N. R.; Collins, C. A.; Darnley, M. J.; Davis, C. J.; Gutierrez, C. M.; Harman, D. J.; James, P. A.; Knapen, J. H.; Kobayashi, S.; Marchant, J. M.; Mazzali, P. A.; Mottram, C. J.; Mundell, C. G.; Newsam, A.; Oscoz, A.; Palle, E.; Piascik, A.; Rebolo, R.; Smith, R. J.

    2015-03-01

    The Liverpool Telescope is one of the world's premier facilities for time domain astronomy. The time domain landscape is set to radically change in the coming decade, with synoptic all-sky surveys such as LSST providing huge numbers of transient detections on a nightly basis; transient detections across the electromagnetic spectrum from other major facilities such as SVOM, SKA and CTA; and the era of `multi-messenger astronomy', wherein astrophysical events are detected via non-electromagnetic means, such as neutrino or gravitational wave emission. We describe here our plans for the Liverpool Telescope 2: a new robotic telescope designed to capitalise on this new era of time domain astronomy. LT2 will be a 4-metre class facility co-located with the Liverpool Telescope at the Observatorio del Roque de Los Muchachos on the Canary island of La Palma. The telescope will be designed for extremely rapid response: the aim is that the telescope will take data within 30 seconds of the receipt of a trigger from another facility. The motivation for this is twofold: firstly it will make it a world-leading facility for the study of fast fading transients and explosive phenomena discovered at early times. Secondly, it will enable large-scale programmes of low-to-intermediate resolution spectral classification of transients to be performed with great efficiency. In the target-rich environment of the LSST era, minimising acquisition overheads will be key to maximising the science gains from any follow-up programme. The telescope will have a diverse instrument suite which is simultaneously mounted for automatic changes, but it is envisaged that the primary instrument will be an intermediate resolution, optical/infrared spectrograph for scientific exploitation of transients discovered with the next generation of synoptic survey facilities. In this paper we outline the core science drivers for the telescope, and the requirements for the optical and mechanical design.

  11. Potential of CCDs for the study of sterile neutrino oscillations via Coherent Neutrino-Nucleus Elastic Scattering

    NASA Astrophysics Data System (ADS)

    Chávez-Estrada, Marisol; Aguilar-Arevalo, Alexis A.

    2017-10-01

    We study the potential of a detector based on CCD sensors (CONNIE experiment) to study neutrino oscillations to sterile states using reactor neutrinos. We calculate the number of events expected in a 1 kg detector and determine the sensitivity to oscillations νe → νs in the Δm^2_41 vs. sin^2(θ_es) parameter space for various exposures. The sensitivity is compared with the regions excluded by the Daya Bay experiment under the assumption θ_24 = θ_34 = 0. This work was carried out independently of the CONNIE Collaboration using published information, and its results are not official.
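
    A minimal sketch of the two-flavor survival probability underlying such a sensitivity study, P(νe → νe) = 1 - sin^2(2θ_es)·sin^2(1.27 Δm^2_41 L/E), with L in metres and E in MeV when Δm^2 is in eV^2. The baseline, energies, and mixing values are illustrative, not the CONNIE configuration.

```python
import numpy as np

def survival_probability(energy_mev, baseline_m, dm2_ev2, sin2_2theta):
    """Two-flavor electron-neutrino survival probability (vacuum)."""
    return 1.0 - sin2_2theta * np.sin(1.27 * dm2_ev2 * baseline_m / energy_mev)**2

# Suppression of a reactor-neutrino spectrum at a 30 m baseline.
energies = np.linspace(1.0, 8.0, 8)            # MeV
p = survival_probability(energies, baseline_m=30.0, dm2_ev2=1.0, sin2_2theta=0.1)
for e, pi in zip(energies, p):
    print(f"E = {e:.1f} MeV  P(nu_e -> nu_e) = {pi:.3f}")
```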

  12. CCD radiation damage in ESA Cosmic Visions missions: assessment and mitigation

    NASA Astrophysics Data System (ADS)

    Lumb, David H.

    2009-08-01

    Charge Coupled Device (CCD) imagers have been widely used in space-borne astronomical instruments. A frequent concern has been the radiation damage effects on the CCD charge transfer properties. We review some methods for assessing the Charge Transfer Inefficiency (CTI) in CCDs. Techniques to minimise degradation using background charge injection and p-channel CCD architectures are discussed. A critical review of the claims for p-channel architectures is presented. The performance advantage for p-channel CCD performance is shown to be lower than claimed previously. Finally we present some projections for the performance in the context of some future ESA missions.
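
    As a numerical reminder of why CTI matters for the architectures compared here: the fraction of a charge packet surviving N transfers is (1 - CTI)^N, so a per-transfer inefficiency of a few times 10^-5 already removes a sizeable fraction of the signal in a large-format device. A short worked example with illustrative numbers:

```python
# Fraction of charge surviving n transfers for a given per-transfer CTI.
def surviving_fraction(cti, n_transfers):
    return (1.0 - cti) ** n_transfers

for cti in (1e-6, 1e-5, 5e-5):
    # e.g. ~4500 transfers to cross the column of a large-format device
    print(f"CTI={cti:.0e}: {100 * surviving_fraction(cti, 4500):.1f}% survives")
```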

  13. Ultrahigh-frame CCD imagers

    NASA Astrophysics Data System (ADS)

    Lowrance, John L.; Mastrocola, V. J.; Renda, George F.; Swain, Pradyumna K.; Kabra, R.; Bhaskaran, Mahalingham; Tower, John R.; Levine, Peter A.

    2004-02-01

    This paper describes the architecture, process technology, and performance of a family of high burst rate CCDs. These imagers employ high speed, low lag photo-detectors with local storage at each photo-detector to achieve image capture at rates greater than 10^6 frames per second. One imager has a 64 x 64 pixel array with 12 frames of storage. A second imager has an 80 x 160 array with 28 frames of storage, and the third imager has a 64 x 64 pixel array with 300 frames of storage. Application areas include capture of rapid mechanical motion, optical wavefront sensing, fluid cavitation research, combustion studies, plasma research and wind-tunnel-based gas dynamics research.

  14. A robotic reflective Schmidt telescope for Dome C

    NASA Astrophysics Data System (ADS)

    Strassmeier, K. G.; Andersen, M. I.; Steinbach, M.

    2004-10-01

    This paper lays out a wide-field robotic Schmidt telescope (RST) for the Antarctic site Dome C. The telescope is based on 80/120 cm reflective Schmidt optics, built originally for a space project, and a mosaic of four 7.5k × 7.5k 8-μm thinned CCDs from the PEPSI/LBT wafer run. The telescope's total field of view (FOV) would be 5° circular (minimum 3° × 3° square) with a plate scale of 0.7 arcsec per pixel. Limiting magnitude is expected to be V = 21.5 mag in 60 sec for a field of 9 square degrees.

  15. CCD charge collection efficiency and the photon transfer technique

    NASA Technical Reports Server (NTRS)

    Janesick, J.; Klaasen, K.; Elliott, T.

    1985-01-01

    The charge-coupled device (CCD) has shown unprecedented performance as a photon detector in the areas of spectral response, charge transfer, and readout noise. Recent experience indicates, however, that the full potential of the CCD's charge collection efficiency (CCE) lies well beyond that realized in currently available devices. A definition of CCE performance is presented and a standard test tool (the photon transfer technique) for measuring and optimizing this important CCD parameter is introduced. CCE characteristics for different types of CCDs are compared; the primary limitations in achieving high CCE performance are discussed, and the prospects for future improvement are outlined.
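
    The essence of the photon transfer technique is that, for shot-noise-limited flat fields, the signal variance grows linearly with the mean signal and the slope fixes the camera gain. The sketch below demonstrates this on synthetic flats with an assumed gain; it is a toy illustration of the principle, not the authors' measurement procedure.

        # Photon-transfer toy example on synthetic flat fields (assumed gain of 4 e-/DN).
        import numpy as np

        rng = np.random.default_rng(0)
        gain = 4.0                                      # assumed electrons per DN
        means, variances = [], []
        for electrons in (500, 2000, 8000, 32000):      # illustrative exposure levels
            flat1 = rng.poisson(electrons, (256, 256)) / gain
            flat2 = rng.poisson(electrons, (256, 256)) / gain
            diff = flat1 - flat2                        # differencing removes fixed pattern
            means.append(0.5 * (flat1.mean() + flat2.mean()))
            variances.append(diff.var() / 2.0)          # per-frame shot-noise variance

        slope = np.polyfit(means, variances, 1)[0]      # variance vs. mean slope = 1/gain
        print(f"Recovered gain ~ {1.0 / slope:.2f} e-/DN (true value {gain})")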

  16. Ground/bonding for Large Space System Technology (LSST). [of metallic and nonmetallic structures]

    NASA Technical Reports Server (NTRS)

    Dunbar, W. G.

    1980-01-01

    The influence of the environment and extravehicular activity/remote assembly operations on the grounding and bonding of metallic and nonmetallic structures is discussed. Grounding and bonding philosophy is outlined for the electrical systems and electronic compartments which contain high-voltage, high-power electrical and electronic equipment. The influence of plasma and particulates on the system is analyzed and the effect of static buildup on the spacecraft electrical system is discussed. Conceptual grounding and bonding designs are assessed for capability to withstand high-current arcs to ground from a high-voltage conductor and electromagnetic interference. The extravehicular activities required of the space station and/or supply spacecraft crew members to join and inspect the ground system, using manual or remote assembly construction, are also described.

  17. Cables and connectors for Large Space System Technology (LSST)

    NASA Technical Reports Server (NTRS)

    Dunbar, W. G.

    1980-01-01

    The effect of the environment and extravehicular activity/remote assembly operations on the cables and connectors for spacecraft with metallic and/or nonmetallic structures was examined. Cable and connector philosophy was outlined for the electrical systems and electronic compartments which contain high-voltage, high-power electrical and electronic equipment. The influence of plasma and particulates on the system is analyzed and the effect of static buildup on the spacecraft electrical system is discussed. Conceptual cable and connector designs are assessed for capability to withstand high current and high voltage without danger of arcs and electromagnetic interference. The extravehicular activities required of the space station and/or supply spacecraft crew members to join and inspect the electrical system, using manual or remote assembly construction, are also considered.

  18. Toroid Joining Gun. [thermoplastic welding system using induction heating]

    NASA Technical Reports Server (NTRS)

    Buckley, J. D.; Fox, R. L.; Swaim, R. J.

    1985-01-01

    The Toroid Joining Gun is a low-cost, self-contained, portable, low-powered (100-400 watts) thermoplastic welding system developed at Langley Research Center for joining plastic and composite parts using an induction heating technique. The device, developed for use in the fabrication of large space structures (LSST Program), can be used in any atmosphere or in a vacuum. Components can be joined in situ, whether on Earth or on a space platform. The expanded application of this welding gun is in the joining of thermoplastic composites, thermosetting composites, metals, and combinations of these materials. Its low power requirements, light weight, rapid response, low cost, portability, and effective joining make it a candidate for solving many varied and unique bonding tasks.

  19. Automated software configuration in the MONSOON system

    NASA Astrophysics Data System (ADS)

    Daly, Philip N.; Buchholz, Nick C.; Moore, Peter C.

    2004-09-01

    MONSOON is the next-generation OUV-IR controller project being developed at NOAO. The design is flexible, emphasizing code re-use, maintainability and scalability as key factors. The software needs to support widely divergent detector systems, ranging from multi-chip mosaics (for LSST, QUOTA, ODI and NEWFIRM) down to large single- or multi-detector laboratory development systems. In order for this flexibility to be effective and safe, the software must be able to configure itself to the requirements of the attached detector system at startup. The basic building block of all MONSOON systems is the PAN-DHE pair, which makes up a single data acquisition node. In this paper we discuss the software solutions used in the automatic PAN configuration system.
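
    The self-configuration idea can be captured generically: at startup the PAN probes the attached hardware for an identifier and loads the matching configuration. The sketch below is a generic illustration of that pattern, not the MONSOON code base; the detector identifier, configuration directory and probe function are hypothetical.

        # Generic startup self-configuration sketch (names and paths are hypothetical).
        import json
        from pathlib import Path

        CONFIG_DIR = Path("/etc/pan/detectors")         # hypothetical per-detector config store

        def probe_detector_id() -> str:
            """Stand-in for querying the attached DHE for its detector identifier."""
            return "newfirm_4x2k"                       # placeholder value for illustration

        def configure_pan() -> dict:
            """Load whatever configuration matches the detector attached at startup."""
            det_id = probe_detector_id()
            cfg_file = CONFIG_DIR / f"{det_id}.json"
            if not cfg_file.exists():
                raise RuntimeError(f"no configuration found for detector '{det_id}'")
            return json.loads(cfg_file.read_text())

        if __name__ == "__main__":
            print(configure_pan())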

  20. The Dynamics of the Local Group in the Era of Precision Astrometry

    NASA Astrophysics Data System (ADS)

    Besla, Gurtina; Garavito-Camargo, Nicolas; Patel, Ekta

    2018-06-01

    Our understanding of the dynamics of our Local Group of galaxies has changed dramatically over the past few years owing to significant advancements in astrometry and our theoretical understanding of galaxy structure. New surveys now enable us to map the 3D structure of our Milky Way and the dynamics of tracers of its dark matter distribution, like globular clusters, satellite galaxies and streams, with unprecedented precision. Some results have met with controversy, challenging preconceived notions of the orbital dynamics of key components of the Local Group. I will provide an overview of this evolving picture of our Local Group and outline how we can test the cold dark matter paradigm in the era of Gaia, LSST and JWST.
